That lynx trick is cute. Of course, it doesn't get you very far if you need to parameterize the session in any way.
A more robust tool for screen-scraping automation is curl (apt-get install curl). It gives you as much flexibility and parameterization in the script as you need. I have found two practical ways of using it:
1. If you just need to poke/grab one web page, wrap the curl command in a shell script that maps its command-line arguments onto curl parameters/options.
2. If the automation script needs to "go places", or generally maintain state (cookies etc.) across multiple PUTs/GETs, the easiest route is the PHP bindings for curl, run under the command-line version of PHP, also known as "PHP CLI".
Code:
apt-get install php5-cli php5-curl
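For the first case, a minimal sketch of such a wrapper might look like this. The script name, the URL, and the query parameter are all invented for illustration; the point is just that the caller supplies the variable parts and the script fills in the fixed curl options:

```shell
#!/bin/sh
# fetch_page -- hypothetical wrapper mapping script arguments onto curl options.
# The host/URL/field names are made up; adapt them to the page you are poking.
fetch_page() {
    host="${1:?usage: fetch_page host [query]}"
    query="${2:-}"
    # -s: silent, -f: fail on HTTP errors, -G: send --data-urlencode as GET params
    ${CURL:-curl} -sfG --data-urlencode "q=$query" "https://$host/search"
}

# Dry-run: substitute echo for curl to see the composed command line
CURL=echo fetch_page example.com "linode status"
```

Setting CURL=echo prints the assembled command instead of hitting the network, which is handy when debugging the wrapper itself.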
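For the second case, note that even before reaching for PHP, curl alone can carry cookies across invocations via its cookie-jar options (-c saves cookies, -b replays them). Here is a minimal shell-only sketch of a two-step session; the URLs and form field names are entirely invented:

```shell
#!/bin/sh
# Hypothetical session: log in once, then fetch a page behind the login.
# -c writes Set-Cookie responses into the jar; -b sends them back.
session_demo() {
    jar="$(mktemp)"
    # Step 1: POST credentials; the server's cookies land in $jar
    ${CURL:-curl} -sf -c "$jar" -d user=alice -d pass=secret "https://example.com/login"
    # Step 2: reuse the cookies for a protected page
    ${CURL:-curl} -sf -b "$jar" "https://example.com/account"
    rm -f "$jar"
}

# Dry-run with echo standing in for curl:
CURL=echo session_demo
```

Once the session logic grows beyond a couple of steps (conditional navigation, parsing responses to decide the next request), that is where the PHP CLI route pays off.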
Of course, now that a Linode API is on its way, there will hopefully be no more need for this kind of automation against Linode itself.