I just came across a very cool and useful tool called HTTrack. As a website copier / offline browser utility, HTTrack lets you download a specified site to a local directory. It recursively rebuilds the site's directory structure, grabbing HTML, images, and other files from the server and placing them on your computer.
After the website is copied, HTTrack writes a log file identifying crawl errors (403s, 404s, pages blocked by robots.txt, and so on) and the directories where they occurred. As you can imagine, this is quite handy for testing websites. Besides finding broken links, testers can take the downloaded site and modify the code locally to track down server-side problems.
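HTTrack also ships as a command-line tool, so the copy step above can be scripted. Here is a minimal sketch; `example.com`, the `./mirror` output directory, and the domain filter are placeholders, and the `httrack` binary is assumed to be on your PATH.

```shell
#!/bin/sh
# Sketch: mirror a site with the HTTrack CLI (hypothetical target URL).
# Bail out gracefully if httrack is not installed on this machine.
if ! command -v httrack >/dev/null 2>&1; then
  echo "httrack not installed; see http://www.httrack.com/"
  exit 0
fi

# -O sets the local output directory for the copied site;
# the "+" filter keeps the crawl from wandering off the target domain.
httrack "https://example.com/" -O ./mirror "+*.example.com/*"
```

The log file mentioned above (`hts-log.txt`) ends up inside the output directory, which is where you would look for the 403/404 entries after a run.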
Download HTTrack (32- or 64-bit) here: http://www.httrack.com/page/2/en/index.html