I want to download websites so I can access them offline, with navigation preserved. Let's say a few hundred articles/pages.

Any suggested tool/method? To be done on a Linux machine, preferably with an open-source tool.


@wydow Thank you for the suggestion, I hadn't heard of it before. From the description, it doesn't seem to be the best fit for the use case I have in mind: an offline copy of a complete website, with its navigation and all. I have tested one other suggestion, HTTrack, and in my trial it worked well, but other tools may work better on some sites, which is worth exploring further. Thanks for your suggestion!
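For anyone comparing tools: GNU wget can produce the same kind of recursive offline mirror as HTTrack, with links rewritten for local browsing. A minimal sketch, assuming a placeholder site at https://example.com/ (substitute the real URL):

```shell
# Mirror a site for offline browsing with wget (placeholder URL).
# --mirror           : recursive download with timestamping, for mirroring
# --convert-links    : rewrite links so navigation works offline
# --adjust-extension : save pages with .html so they open locally
# --page-requisites  : also fetch CSS/JS/images needed to render pages
# --no-parent        : never ascend above the starting directory
# --wait=1           : be polite; pause one second between requests
wget --mirror --convert-links --adjust-extension \
     --page-requisites --no-parent --wait=1 \
     https://example.com/

# Roughly equivalent HTTrack invocation (the tool that worked in the trial);
# the "+example.com/*" filter keeps the crawl on the same site:
# httrack "https://example.com/" -O ./example-mirror "+example.com/*"
```

For a few hundred pages this finishes in minutes; the `--wait` delay mainly matters for not hammering smaller servers.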

Librem Social

Librem Social is an opt-in public network. Messages are shared under Creative Commons BY-SA 4.0 license terms.
