After working with .NET's HttpWebRequest/HttpWebResponse objects, I'd rather shoot myself than use them to crawl through web sites. I'm looking for an existing .NET library that can fetch URLs and give you the ability to follow links, extract/fill in/submit forms on the page, etc. Perl's LWP and WWW::Mechanize modules do this very well, but I'm working on a .NET project.
I've come across the HTML Agility Pack, which looks awesome, but it stops short of simulating link navigation and form submission.
Does such a tool already exist?