I want to download websites so I can access them offline, with navigation preserved. Let's say a few hundred articles/pages.
Any suggested tool/method? To be done on a Linux machine, preferably with an open-source tool.
@wydow Thank you for the suggestion, I hadn't heard of it before. From the description, it doesn't seem to be the best fit for the use case I have in mind: an offline copy of a complete website, with its navigation and all. I have tested one other suggestion, HTTrack, and in my trial it worked well, though other tools may work better on some sites; that remains to be explored. Thanks again!
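In case it helps anyone with the same use case, here is a minimal sketch of the kind of HTTrack invocation I tried, along the lines of the example in the HTTrack manual. The URL, the output directory, and the filter pattern are placeholders to adapt to the site being mirrored:

```sh
# Mirror the site into ./mysite, staying within the example.com domain.
# "https://www.example.com/", "./mysite" and the "+*.example.com/*"
# filter are placeholders; adjust them for the actual site.
httrack "https://www.example.com/" -O "./mysite" "+*.example.com/*" -v
```

The `-O` option sets the output directory, the `+...` pattern limits which links are followed, and `-v` prints verbose progress to the terminal.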