Methods of Downloading

Downloads can be made in several ways. The most common is over the web's default HTTP connection, transferring a file from a website in the same way as the web pages themselves. This works, but for larger files it is not the most reliable method; things have improved since the arrival of broadband, yet there is still room for error. That is why vendors offer download managers that can handle connection problems which would simply fail a download made in a browser.
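At its simplest, an HTTP download is just reading the response stream and writing it to disk. The sketch below does this in fixed-size chunks so a large file never sits in memory; note there is no retry or resume, which is exactly the gap download managers fill. The URL here would be whatever file you want to fetch.

```python
import urllib.request

CHUNK = 64 * 1024  # read 64 KiB at a time so large files never sit in memory

def copy_in_chunks(src, dst, chunk_size=CHUNK):
    """Copy a readable stream to a writable one, chunk by chunk."""
    total = 0
    while True:
        block = src.read(chunk_size)
        if not block:
            break
        dst.write(block)
        total += len(block)
    return total

def download(url, dest):
    """Fetch url over HTTP and save it to dest.

    A dropped connection here fails outright -- there is no retry or
    resume, which is the weakness described above."""
    with urllib.request.urlopen(url) as resp, open(dest, "wb") as out:
        return copy_in_chunks(resp, out)
```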

Beyond plain HTTP, there are other popular methods of transferring large files, with torrents and Usenet (or newsgroups) being the two discussed here.


Torrent downloading, or torrent sharing to be more correct, is a peer-to-peer network where parts of a file are downloaded from different users on the network and pieced together to form the complete file, with the downloaded file (or the parts of it already received) offered in turn to other users wanting the same download.
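The piecing-together works because each fixed-size piece is verified against a hash published in the torrent file (SHA-1 in the original BitTorrent format), so a client only keeps and re-shares pieces that check out. A simplified sketch of that idea, not the real wire protocol:

```python
import hashlib

PIECE_SIZE = 256 * 1024  # real torrents use a power-of-two piece size

def piece_hashes(data, piece_size=PIECE_SIZE):
    """Hash each fixed-size piece, as a .torrent file lists them (SHA-1 in v1)."""
    return [hashlib.sha1(data[i:i + piece_size]).hexdigest()
            for i in range(0, len(data), piece_size)]

def accept_piece(piece, expected_hash):
    """A client only keeps (and re-shares) a piece whose hash matches."""
    return hashlib.sha1(piece).hexdigest() == expected_hash
```

Because every piece is independently verifiable, pieces can safely arrive from many different peers in any order.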

Benefits start with the service being free to use, partly because it is supported entirely by its users, with no central authority in control. And since there are many users, there is a great deal of content available that may have long since been removed from the original hosting websites.

Download speeds can saturate a downstream connection, but this depends wholly on the number of users (peers) that have the file you want and on their connections, specifically their upstream bandwidth. This is also the disadvantage of torrents: because the network is based entirely on other users, the content available is always changing and download speeds vary greatly with the popularity of the content. For example, the latest release of Ubuntu Linux will have peers aplenty and downloads will run at maximum speed, but looking for a classic version such as Mandriva Linux 9.2 may return only a handful of peers, if any, and getting the complete file could take days. Note also that if you run a web server on the same connection, torrents use the upstream bandwidth, which could affect visitors to your sites.


Usenet is a sort of peer-to-peer system, but the sharing of files happens between the Usenet providers themselves. Consumers access and download from servers that hold, or can retrieve, the files as fast as the consumer's connection can handle. This speed comes at a price, however: while internet providers long ago included Usenet access as part of their service, the rocketing amount of data being hosted meant it was no longer feasible to continue, leaving specialist providers offering subscriptions for Usenet downloading.

It's amazing to think that Usenet is still going strong, as its origins date back to before the modern web was invented. Before websites, the Internet was mainly made up of forum-like bulletin boards; each service provider had their own, but a few were available more widely: Arpanet, which was available to an elite, and Usenet, which was considered a poor man's alternative to Arpanet. Content is arranged into categories, or discussion groups, which can be created by anyone who requests that service providers add them, after which users can post messages to the group and the conversations begin.

During the 90s, service providers started offering binary groups where, instead of text discussions, users could upload binary data and so share media and software with others. Because of Usenet's legacy limits on post sizes, files larger than a single post had to be split over a number of posts and reattached by the end user. Compression and splitting were generally handled with RAR files, and that method is still the standard. To cope with missing parts (when servers don't transfer all posts between themselves) there are Par files, a great technology that seems to create data out of thin air to patch the missing parts and give a complete download.
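The "data out of thin air" is parity. PAR2 uses Reed-Solomon codes, which can rebuild several missing parts, but the core idea can be shown with a single XOR parity block: store one extra block, and any one lost part can be recomputed from the survivors. A toy sketch, far simpler than real Par files:

```python
def xor_blocks(blocks):
    """XOR equal-length blocks together into a single parity block."""
    parity = bytearray(len(blocks[0]))
    for block in blocks:
        for i, b in enumerate(block):
            parity[i] ^= b
    return bytes(parity)

def recover_missing(surviving_blocks, parity):
    """Rebuild the one missing block: XOR the parity with every survivor.

    Works because XOR is its own inverse -- everything present cancels
    out, leaving only the absent block."""
    return xor_blocks(surviving_blocks + [parity])
```

One parity block protects against any single lost part; real Par files generalise this so that N recovery blocks can repair any N missing posts.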

Before getting too nostalgic, it's best to summarise: Usenet offers downloading at full speed at the cost of a subscription to a dedicated provider. The availability of content is limited to what others upload, and files are only kept for a limited time; however, providers' retention times are increasing all the time and are used as a marketing tool.

It may sound complicated, but the how-to guide will give you a good start in Usenet downloading.


The standard for downloading from websites is HTTP, and less commonly FTP. This is fine for smaller files that download quickly, but for larger files that may need to run overnight it is not the best, as an error during the download causes an unrecoverable failure. Download managers step in to prevent this by keeping track of download progress so they can resume from the last good point when an error occurs. Many can also set download speed limits to stop downloads hogging all the bandwidth. This software was popular in the days of dial-up connections, when slow speeds and unreliable links made downloading tricky, but it is still useful today when download sizes are reaching extraordinary amounts.
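Resuming works because HTTP has a Range request header: the client checks how much of the file is already on disk and asks the server for just the remaining bytes. A minimal sketch, assuming the server honours Range requests (replying 206 Partial Content); servers that don't will restart from zero:

```python
import os
import urllib.request

def range_header(dest):
    """Work out the resume offset from what is already on disk,
    and build the matching Range header (empty if starting fresh)."""
    offset = os.path.getsize(dest) if os.path.exists(dest) else 0
    return offset, ({"Range": f"bytes={offset}-"} if offset else {})

def resume_download(url, dest):
    """Append the remainder of url to dest, picking up mid-file.

    Assumes the server supports Range requests; a download manager
    would also verify the 206 status before appending."""
    offset, headers = range_header(dest)
    req = urllib.request.Request(url, headers=headers)
    with urllib.request.urlopen(req) as resp, open(dest, "ab") as out:
        while True:
            block = resp.read(64 * 1024)
            if not block:
                break
            out.write(block)
```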

FTP handled some of the issues dial-up users had when downloading (it supports resuming interrupted transfers), but it is almost unheard of these days, partly because of Internet Explorer's terrible implementation of it during its monopoly on the web. Even so, it is still in use, and there may be times when FTP is needed. Luckily, download managers also incorporate FTP downloads and make things a lot simpler than they would otherwise be.
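Should FTP be needed without a download manager to hand, Python's standard library covers it. The host and path below are placeholders for illustration, not a real server:

```python
from ftplib import FTP
from urllib.parse import urlparse

def split_ftp_url(url):
    """Pull the host and remote path out of an ftp:// URL."""
    parts = urlparse(url)
    return parts.hostname, parts.path

def ftp_download(url, dest):
    """Fetch a file over anonymous FTP using the standard library."""
    host, path = split_ftp_url(url)
    with FTP(host) as ftp, open(dest, "wb") as out:
        ftp.login()  # anonymous login
        ftp.retrbinary(f"RETR {path}", out.write)
```

Usage would be something like `ftp_download("ftp://ftp.example.org/pub/file.iso", "file.iso")`, where the server name is purely hypothetical.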