Discussion in 'Support' started by fpefpe, Sep 4, 2013.
has ftp.jpsoft.com been turned off?
Yes. Attacks were outnumbering valid accesses about 12,000 to 1.
There isn't anything that was on the ftp site that isn't available from the web site.
Is there a catalog file available that we can compare with a saved copy to know what is new? I mean something similar to the http://jpsoft.com/downloads/v##/tcmd###.aiu files, but instead of two separate files for each current version, a single all-encompassing file so the search could be automated. For example, I have effectively replaced one of my machines with a 64-bit machine, but never downloaded the older 64-bit installers...
No, nor is any planned.
Moving away from automatable procedures... Many years ago JPsoft support published a batch file to create such a catalog.
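That catalog-building batch file is long gone, but the idea is simple to reconstruct. Here is a minimal sketch in Python (not the original batch file; the function names and the choice of SHA-256 are my own, purely illustrative) that builds a filename-to-hash catalog and reports what changed against a saved one:

```python
import hashlib
from pathlib import Path

def build_catalog(directory):
    """Walk a directory tree and map each relative file path to its SHA-256 digest."""
    catalog = {}
    root = Path(directory)
    for path in sorted(root.rglob("*")):
        if path.is_file():
            catalog[path.relative_to(root).as_posix()] = hashlib.sha256(
                path.read_bytes()
            ).hexdigest()
    return catalog

def diff_catalogs(old, new):
    """Return the entries in `new` that are missing from, or changed since, `old`."""
    return {name for name, digest in new.items() if old.get(name) != digest}
```

Saving `build_catalog()` output between runs and feeding both snapshots to `diff_catalogs()` gives exactly the "what is new since last time" answer asked about above.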
Many years ago, JP Software distributed software on its FTP site. But that was many years ago; FTP is largely obsolete (and wholly dangerous), and Take Command has had a built-in updater for many years, which renders the point entirely moot.
Is FTP deprecated by W3C, and if so, what is the suggested replacement? And why is File Transfer Protocol more dangerous for file transfers than hypertext transfer protocol? Does HTTP provide a method to compare a downloaded file with the original as is possible with FTP? Or is FTP more dangerous than HTTP because the security of an FTP server against unauthorized intrusion modifying its downloadable files is inherently less than that of an HTTP server against an analogous attack?
TC's built-in updater is fine for those who have an unmetered broadband internet connection, and for those who only have a single-copy license, but if you have copies on multiple machines and a slow or metered internet connection, it requires multiple downloads of the same file, an obvious waste of bandwidth and download volume. Thanks to the update information (.aiu) files, I can minimize the download volume even with HTTP, but I must depend on the hash codes in the .aiu files as the sole means to verify download accuracy; I cannot compare with the originals.
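For what it's worth, that hash check is easy to automate once the expected digest has been pulled out of the .aiu file. A minimal sketch (the function name and the SHA-256 default are illustrative assumptions, not actual .aiu field names):

```python
import hashlib

def verify_download(path, expected_hex, algorithm="sha256"):
    """Hash a local file in chunks and compare against the expected hex digest."""
    h = hashlib.new(algorithm)
    with open(path, "rb") as f:
        # Read in 64 KiB chunks so large installers don't need to fit in memory.
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest().lower() == expected_hex.lower()
```

If the .aiu file lists an MD5 instead, passing `algorithm="md5"` covers that case too.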
Er... does W3C have anything to do with the FTP protocol?
While we're at it, could SCP be the answer?
My mistake - I referenced W3C (World Wide Web Consortium), which issues web standards, but the FTP standards were actually issued by the IETF (Internet Engineering Task Force) as RFC 959, also known as STD 9, and many others. The latest modification (a registry for FTP extensions) is RFC 5797 of March 2010. FTP is alive and well! There is also FTPS (built on the security extensions of RFC 2228), already supported by TCMD. SCP could also work; I do not recall ever using it.
My wild guess is that the real issue with FTP is that a server can have a virtually unlimited number of concurrent HTTP connections, and thus concurrently deliver downloads to all requestors (albeit slower), whereas the number of concurrent FTP connections is very limited, so concentrated denial-of-service attacks can totally prevent legitimate users from accessing the FTP server.
If you have an extra 40-50 hours a week (more like 80-100 once the hackers find you), I suggest you actually run a public FTP server for a while. You'll quickly learn some painful truths about the holes in FTP security.
I could write an additional new version of Take Command every year with the time I have spent supporting < 10 (mostly non-paying) users on FTP. It is a ridiculously extravagant waste of my time for an infinitesimal benefit -- the public JP Software FTP server is gone, and it is never going to return. End of discussion.