If you are unsatisfied with how wget handles recursive website downloads, you may find web-dumper a great alternative.
When you browse with a web browser, it fetches everything a page needs regardless of whether the files come from the same domain. With wget you have to decide up front whether to allow or deny spanning across hosts (-H/--span-hosts, optionally restricted with -D/--domains), and that rule then holds for the entire run.
web-dumper avoids this inconvenience: when it parses a page, it downloads all of that page's related files no matter which host serves them; the domain rules are applied only when choosing the next page to parse.
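
To make the policy concrete, here is a minimal sketch of that crawl logic in Python, using only the standard library. It is illustrative only, not web-dumper's actual code: the function names and the 50-page limit are assumptions for the example.

    # Sketch of the crawl policy described above (NOT web-dumper's real code):
    # assets are fetched from any host, but only same-domain links are followed.
    from collections import deque
    from html.parser import HTMLParser
    from urllib.parse import urljoin, urlparse
    from urllib.request import urlopen

    ASSET_TAGS = {"img": "src", "script": "src", "link": "href"}

    class PageScanner(HTMLParser):
        """Collect asset URLs and followable links from one HTML page."""
        def __init__(self, base_url):
            super().__init__()
            self.base, self.assets, self.links = base_url, [], []

        def handle_starttag(self, tag, attrs):
            attrs = dict(attrs)
            if tag in ASSET_TAGS and attrs.get(ASSET_TAGS[tag]):
                self.assets.append(urljoin(self.base, attrs[ASSET_TAGS[tag]]))
            elif tag == "a" and attrs.get("href"):
                self.links.append(urljoin(self.base, attrs["href"]))

    def dump(start_url, max_pages=50):
        allowed_host = urlparse(start_url).netloc
        queue, seen = deque([start_url]), {start_url}
        while queue and max_pages > 0:
            page = queue.popleft()
            max_pages -= 1
            html = urlopen(page).read().decode("utf-8", "replace")
            scanner = PageScanner(page)
            scanner.feed(html)
            # Related files are fetched unconditionally, whatever host serves
            # them, just as a browser would when rendering the page.
            for url in scanner.assets:
                urlopen(url).read()  # a real tool would save this to disk
            # The domain rule applies only when picking the NEXT page to parse.
            for url in scanner.links:
                if urlparse(url).netloc == allowed_host and url not in seen:
                    seen.add(url)
                    queue.append(url)

    dump("https://example.org/")

The key design point is in the last loop: host filtering happens at the "follow this link?" decision, not at the "download this file?" decision, which is exactly how the behavior described above differs from a session-wide wget rule.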