Spiders to gather information from the internet

Jan 8, 2009 at 9:07pm
Hello,
I am looking to make a tedious task fast. My friends and I listen to "A State of Trance" by Armin Van Buuren. It's a weekly radio show that features many different artists in each two-hour mix. You can find the track listings at http://www.arminvanbuuren.com/asot/ . I want to make a spider where you can just type in (either on the command line or in a GUI) 283, and it will save the track listings of episode 283 (http://www.arminvanbuuren.com/asot/98/) to a text file. Is this easy to do? Would it be easier to do in another language? Thanks again for your past help, and for this as well,

enduser000
Jan 8, 2009 at 9:11pm
You can open a simple TCP connection, send a basic HTTP request, and parse the results yourself. Fairly simple.

If you can find an HTTP library, that'd make it even easier. But overall it's not too hard.
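
This isn't from the thread, just a rough sketch of that raw-connection idea. Since the later replies head toward Linux, it uses bash's built-in /dev/tcp device rather than C++ sockets; the host and path are simply the ones quoted in the first post, and there is no error handling or header parsing.

    # Open a TCP connection to the web server on file descriptor 3.
    exec 3<>/dev/tcp/www.arminvanbuuren.com/80
    # Send a minimal HTTP/1.0 request for the track-listing page.
    printf 'GET /asot/98/ HTTP/1.0\r\nHost: www.arminvanbuuren.com\r\nConnection: close\r\n\r\n' >&3
    # Read everything the server sends back (headers plus page source) into a file.
    cat <&3 > asot_page.html
    # Close the connection.
    exec 3>&-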
Jan 9, 2009 at 5:52am
wget and grep can do this job.
If you are using Linux, you can just write a bash script.
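
Not part of the thread either, but roughly what such a bash script might look like, assuming wget is installed. The URL and output filename in the usage line are just the ones from the first post, and the sed/grep step is only a crude guess at stripping HTML tags; a real track listing may need more careful parsing.

    #!/bin/bash
    # Rough sketch: fetch a track-listing page and keep only the text.
    # Usage: ./asot.sh http://www.arminvanbuuren.com/asot/98/ asot283.txt
    page=$(mktemp)
    # Download the page quietly into a temporary file.
    wget -q -O "$page" "$1"
    # Crude clean-up: strip HTML tags, then drop the blank lines left behind.
    sed -e 's/<[^>]*>//g' "$page" | grep -v '^[[:space:]]*$' > "$2"
    rm -f "$page"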
Jan 9, 2009 at 7:14pm
A script to download the source of the page? I'm running Ubuntu 8.10; I'll look into it. Thanks,

enduser
Feb 19, 2009 at 8:08pm
Used wget and it worked like a charm. Thanks again,

enduser