Anyone can test this. It's a Perl script: plug in values for $ftp_hostname, $ftp_port, and $ftp_passive, then run it from cron or the command line.
Code:
#!/usr/local/bin/perl
use strict;
use warnings;
use Net::FTP;

my $ftp_hostname = ''; # ftp host name
my $ftp_port = '21'; # typical value
my $ftp_passive = 0; # change to 1 for passive mode

my $ftp = Net::FTP->new($ftp_hostname, Port => $ftp_port, Passive => $ftp_passive);
print "Content-type: text/html\n\n";
if (!$ftp) {
print "FTP connection failed: $@";
} else {
print "FTP connection successful.";
}
exit;
That's just a snippet pulled from my original code, which produced a "bad hostname" error about half the time when run in the wee hours of the morning. Again, for clarity: this script runs on a different host and connects via FTP to my Linode.
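If the failure really is a transient name-lookup timeout, one workaround is to retry the connection a few times before giving up. Below is a sketch of a generic retry helper (the `with_retry` name and its parameters are my own invention, not part of the original script); it wraps the `Net::FTP->new` call so a single slow lookup doesn't sink the nightly cron run. Note also that `Net::FTP->new` accepts a `Timeout` option if the default wait isn't long enough.

```perl
#!/usr/local/bin/perl
use strict;
use warnings;

# Hypothetical helper: call a code ref until it returns a defined
# value, sleeping between attempts. Returns undef if every attempt
# fails.
sub with_retry {
    my ($code, $tries, $delay) = @_;
    for my $attempt (1 .. $tries) {
        my $result = $code->();
        return $result if defined $result;
        warn "attempt $attempt failed: $@\n";
        sleep $delay if $attempt < $tries;
    }
    return undef;
}

# Usage with the script above (3 tries, 30 seconds apart):
# my $ftp = with_retry(
#     sub { Net::FTP->new($ftp_hostname, Port => $ftp_port, Passive => $ftp_passive) },
#     3, 30,
# );
```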
The script only runs once per day, so I suspect this relates to DNS caching. That would explain why it works all day long when I run it from the command line: the lookup has already happened and is cached for the day, whereas the cron run may not be waiting long enough for the query to finish.
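One way to test that theory is a quick pre-flight resolution check, so the log distinguishes a DNS failure from an FTP failure. This is a diagnostic sketch using the core Socket module, not part of the original script; substitute the real hostname for the placeholder.

```perl
#!/usr/local/bin/perl
use strict;
use warnings;
use Socket qw(inet_ntoa);

my $ftp_hostname = 'localhost'; # placeholder: use the real FTP host

# gethostbyname returns undef when resolution fails, which is the
# same condition that makes Net::FTP->new report "bad hostname".
my $addr = gethostbyname($ftp_hostname);
if (defined $addr) {
    print "resolved $ftp_hostname to ", inet_ntoa($addr), "\n";
} else {
    print "DNS lookup failed for $ftp_hostname\n";
}
```

Running this from the same cron slot as the failing script would show whether the lookup itself is what's timing out.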