How can I download an entire website?



























How can I download all pages from a website?



Any platform is fine.






























  • Check out serverfault.com/questions/45096/website-backup-and-download on Server Fault.

    – Marko Carter
    Jul 28 '09 at 13:55


  • @tnorthcutt, I'm surprised too. If I don't recall awfully wrong, my Wget answer used to be the accepted one, and this looked like a settled thing. I'm not complaining though — all of a sudden the renewed attention gave me more than the bounty's worth of rep. :P

    – Jonik
    Sep 17 '09 at 6:05


  • Did you try IDM? superuser.com/questions/14403/… My post is buried down there. What did you find missing in IDM?

    – Lazer
    Sep 21 '09 at 10:30


  • @joe: Might help if you'd give details about what the missing features are...

    – Ilari Kajaste
    Sep 23 '09 at 11:06


  • browse-offline.com can download the complete tree of the website so you can browse it offline.

    – Menelaos Vergis
    Mar 5 '14 at 13:11



















19 Answers



























HTTrack works like a champ for copying the contents of an entire site. This tool can even grab the pieces needed to make a website with active code content work offline. I am amazed at the stuff it can replicate offline.



This program will do all you require of it.



Happy hunting!
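
HTTrack also has a command-line interface if you'd rather script it. A minimal sketch, assuming the httrack package is installed (the URL, output folder and filter here are placeholders - run httrack --help to confirm the options for your version):

httrack "https://www.example.com/" -O ./example-mirror "+*.example.com/*" -v

Here -O sets the local output directory, the "+*.example.com/*" scan rule keeps the crawl on that domain, and -v gives verbose progress output.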



























  • Been using this for years - highly recommended.

    – Umber Ferrule
    Aug 9 '09 at 20:38


  • You can also limit the speed of download so you don't use too much bandwidth to the detriment of everyone else.

    – Umber Ferrule
    Aug 21 '09 at 22:18


  • Would this copy the actual ASP code that runs on the server though?

    – Taptronic
    Mar 19 '10 at 13:02


  • @Optimal Solutions: No, that's not possible. You'd need access to the servers or the source code for that.

    – Sasha Chedygov
    Mar 31 '10 at 7:08


  • After trying both httrack and wget for sites with authorization, I have to lean in favor of wget. Could not get httrack to work in those cases.

    – Leo
    May 18 '12 at 11:55

































Wget is a classic command-line tool for this kind of task. It comes with most Unix/Linux systems, and you can get it for Windows too. On a Mac, Homebrew is the easiest way to install it (brew install wget).



You'd do something like:



wget -r --no-parent http://site.com/songs/


For more details, see Wget Manual and its examples, or e.g. these:




  • wget: Download entire websites easy


  • Wget examples and scripts
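
Building on that, a slightly fuller sketch for mirroring a section for offline browsing (the URL is a placeholder and the --wait/--limit-rate values are just polite examples):

wget --mirror --no-parent --convert-links --page-requisites --wait=1 --limit-rate=200k http://site.com/songs/

--mirror turns on recursion with time-stamping, --convert-links rewrites links so the local copy browses offline, --page-requisites also fetches the images/CSS/JS each page needs, and --wait/--limit-rate throttle the crawl so you don't hammer the server.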




























  • There's no better answer than this - wget can do anything :3

    – Phoshi
    Sep 16 '09 at 22:30


  • +1 for including the --no-parent. Definitely use --mirror instead of -r, and you might want to include -L/--relative to not follow links to other servers.

    – quack quixote
    Oct 9 '09 at 12:43


  • As I also asked for httrack.com - would this command-line tool get the ASP code, or would it just get the rendering of the HTML? I have to try this. This could be a bit worrisome for developers if it does...

    – Taptronic
    Mar 19 '10 at 13:04


  • @optimal, the HTML output of course - it would get the code only if the server was badly misconfigured.

    – Jonik
    Mar 19 '10 at 15:17


  • Unfortunately it does not work for me - there is a problem with links to CSS files: they are not changed to relative, i.e. you can see something like this in the files: <link rel="stylesheet" type="text/css" href="/static/css/reset.css" media="screen" />, which does not work well locally, unless there is a way to trick Firefox into thinking that a certain dir is the root.

    – gorn
    Jul 27 '12 at 0:42

































Use wget:



wget -m -p -E -k www.example.com


The options explained:



-m, --mirror             Turns on recursion and time-stamping, sets infinite
                         recursion depth, and keeps FTP directory listings.
-p, --page-requisites    Get all images, etc. needed to display HTML page.
-E, --adjust-extension   Save HTML/CSS files with .html/.css extensions.
-k, --convert-links      Make links in downloaded HTML point to local files.
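
If the site requires HTTP authentication, or you want to go easy on the server, the same command extends naturally; a minimal sketch (the credentials, rate and domain are placeholders):

wget -m -p -E -k --wait=2 --limit-rate=200k --user=USERNAME --password=PASSWORD www.example.com

For cookie-based logins, the usual workaround is to export your browser's cookies and pass them with --load-cookies cookies.txt instead of --user/--password.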


























  • +1 for providing the explanations for the suggested options. (Although I don't think --mirror is very self-explanatory. Here's from the man page: "This option turns on recursion and time-stamping, sets infinite recursion depth and keeps FTP directory listings. It is currently equivalent to -r -N -l inf --no-remove-listing")

    – Ilari Kajaste
    Sep 23 '09 at 11:04


  • If you don’t want to download everything into a folder with the name of the domain you want to mirror, create your own folder and use the -nH option (which skips the host part).

    – Rafael Bugajewski
    Jan 3 '12 at 15:33


  • What about if Auth is required?

    – Val
    May 13 '13 at 16:04


  • I tried using your wget --mirror -p --html-extension --convert-links www.example.com and it just downloaded the index. I think you need the -r to download the entire site.

    – Eric Brotto
    Jul 14 '14 at 10:49


  • For those concerned about killing a site due to traffic / too many requests, use -w seconds (to wait a number of seconds between requests) or --limit-rate=amount (to specify the maximum bandwidth to use while downloading).

    – vlad-ardelean
    Jul 14 '14 at 18:33

































You should take a look at ScrapBook, a Firefox extension. It has an in-depth capture mode.






























  • No longer compatible with Firefox after version 57 (Quantum).

    – Yay295
    Apr 16 '18 at 22:31

































Internet Download Manager has a Site Grabber utility with a lot of options - which lets you completely download any website you want, the way you want it.




  1. You can set the limit on the size of the pages/files to download


  2. You can set the number of branch sites to visit


  3. You can change the way scripts/popups/duplicates behave


  4. You can specify a domain, and only pages/files under that domain that meet the required settings will be downloaded


  5. The links can be converted to offline links for browsing


  6. There are templates which preset the above settings for you



The software is not free, however - to see if it suits your needs, use the evaluation version.





















































    itsucks - that's the name of the program!





















































      I'll address the online buffering that browsers use...



      Typically most browsers use a browsing cache to keep the files you download from a website around for a bit so that you do not have to download static images and content over and over again. This can speed up things quite a bit under some circumstances. Generally speaking, most browser caches are limited to a fixed size and when it hits that limit, it will delete the oldest files in the cache.



      ISPs tend to have caching servers that keep copies of commonly accessed websites like ESPN and CNN. This saves them the trouble of hitting these sites every time someone on their network goes there. For the ISP, this can amount to significant savings in the number of duplicated requests made to external sites.





















































        I like Offline Explorer.

        It's shareware, but it's very good and easy to use.





















































          WebZip is a good product as well.





















































            I have not done this in many years, but there are still a few utilities out there.
            You might want to try Web Snake.
            I believe I used it years ago. I remembered the name right away when I read your question.



            I agree with Stecy. Please do not hammer their site. Very Bad.





















































              Try BackStreet Browser.




              It is a free, powerful offline browser. A high-speed, multi-threading
              website download and viewing program. By making multiple simultaneous
              server requests, BackStreet Browser can quickly download an entire
              website or part of a site including HTML, graphics, Java Applets,
              sound and other user-definable files, and saves all the files on your
              hard drive, either in their native format or as a compressed ZIP file,
              for offline viewing.

























































                Teleport Pro is another free solution that will copy down any and all files from whatever your target is (also has a paid version which will allow you to pull more pages of content).





















































                  DownThemAll is a Firefox add-on that will download all the content (audio or video files, for example) for a particular web page in a single click. This doesn't download the entire site, but this may be the sort of thing the question was looking for.
































                  • It's only capable of downloading links (HTML) and media (images).

                    – Ain
                    Sep 26 '17 at 17:07

































                  For Linux and OS X: I wrote grab-site for archiving entire websites to WARC files. These WARC files can be browsed or extracted. grab-site lets you control which URLs to skip using regular expressions, and these can be changed when the crawl is running. It also comes with an extensive set of defaults for ignoring junk URLs.



                  There is a web dashboard for monitoring crawls, as well as additional options for skipping video content or responses over a certain size.
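
                  A rough sketch of how a run typically looks, assuming grab-site is installed per its README (the command names and the flag below follow that README as I recall it - verify them against the current docs):

                  gs-server &
                  grab-site --no-offsite-links http://example.com/

                  The first command starts the web dashboard mentioned above; the second starts a crawl that stays on example.com and writes its output, including the WARC file, into a per-crawl directory.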





















































                    The venerable FreeDownloadManager.org has this feature too.



                    Free Download Manager has it in two forms: Site Explorer and Site Spider:




                    Site Explorer

                    Site Explorer lets you view the folder structure of a
                    web site and easily download necessary files or folders.
                    HTML Spider

                    You can download whole web pages or even whole web sites with HTML
                    Spider. The tool can be adjusted to download files with specified
                    extensions only.




                    I find Site Explorer useful for seeing which folders to include/exclude before you attempt to download the whole site - especially when there is an entire forum hiding in the site that you don't want to download, for example.





















































                      Power wget



                      While wget was already mentioned, this resource and command line was so seamless that I thought it deserved a mention:
                      wget -P /path/to/destination/directory/ -mpck --user-agent="" -e robots=off --wait 1 -E https://www.example.com/



                      See this code explained on explainshell
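
                      For quick reference, here are the same command's flags unpacked (no new behaviour, just annotations):

                      -P /path/to/destination/directory/   save everything under this directory prefix
                      -m                                    mirror: recursion, time-stamping, infinite depth
                      -p                                    also fetch page requisites (images, CSS, JS)
                      -c                                    resume partially downloaded files
                      -k                                    convert links for offline browsing
                      --user-agent=""                       send an empty User-Agent header
                      -e robots=off                         ignore robots.txt (use responsibly)
                      --wait 1                              pause one second between requests
                      -E                                    save pages with an .html extension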





















































                        Download HTTrack - it will download websites with very easy steps to follow.

                        Download link: http://www.httrack.com/page/2/

                        A video that may help you: https://www.youtube.com/watch?v=7IHIGf6lcL4
































                        • -1 duplicate of top answer

                          – wjandrea
                          Sep 24 '17 at 7:28



































                        I believe Google Chrome can do this on desktop devices - just go to the browser menu and click Save webpage.


                        Also note that services like Pocket may not actually save the website, and are thus susceptible to link rot.



                        Lastly note that copying the contents of a website may infringe on copyright, if it applies.



























                        • A web page in your browser is just one out of many pages of a web site.

                          – Arjan
                          May 16 '15 at 20:05











                        • @Arjan I guess that makes my option labor intensive. I believe it is more common for people to just want to save one page, so this answer may be better for those people who come here for that.

                          – jiggunjer
                          May 17 '15 at 10:10

































                        Firefox can do it natively (at least FF 42 can). Just use "Save Page"






























                        • Wrong! The question asks how to save an entire web site. Firefox cannot do that.

                          – user477799
                          Jul 26 '16 at 6:24


                        • Your method works only if it's a one-page site, but what if the site has 699 pages? That would be very tiring...

                          – Quidam
                          Dec 15 '16 at 7:03
























                        19 Answers
                        19






                        active

                        oldest

                        votes








                        19 Answers
                        19






                        active

                        oldest

                        votes









                        active

                        oldest

                        votes






                        active

                        oldest

                        votes









                        317





                        +100









                        HTTRACK works like a champ for copying the contents of an entire site. This tool can even grab the pieces needed to make a website with active code content work offline. I am amazed at the stuff it can replicate offline.



                        This program will do all you require of it.



                        Happy hunting!






                        share|improve this answer





















                        • 6





                          Been using this for years - highly recommended.

                          – Umber Ferrule
                          Aug 9 '09 at 20:38











                        • You can also limit the speed of download so you don't use too much bandwidth to the detriment of everyone else.

                          – Umber Ferrule
                          Aug 21 '09 at 22:18






                        • 3





                          Would this copy the actual ASP code that runs on the server though?

                          – Taptronic
                          Mar 19 '10 at 13:02






                        • 7





                          @Optimal Solutions: No, that's not possible. You'd need access to the servers or the source code for that.

                          – Sasha Chedygov
                          Mar 31 '10 at 7:08






                        • 1





                          After trying both httrack and wget for sites with authorization, I have to lean in favor of wget. Could not get httrack to work in those cases.

                          – Leo
                          May 18 '12 at 11:55
















                        317





                        +100









                        HTTRACK works like a champ for copying the contents of an entire site. This tool can even grab the pieces needed to make a website with active code content work offline. I am amazed at the stuff it can replicate offline.



                        This program will do all you require of it.



                        Happy hunting!






                        share|improve this answer





















                        • 6





                          Been using this for years - highly recommended.

                          – Umber Ferrule
                          Aug 9 '09 at 20:38











                        • You can also limit the speed of download so you don't use too much bandwidth to the detriment of everyone else.

                          – Umber Ferrule
                          Aug 21 '09 at 22:18






                        • 3





                          Would this copy the actual ASP code that runs on the server though?

                          – Taptronic
                          Mar 19 '10 at 13:02






                        • 7





                          @Optimal Solutions: No, that's not possible. You'd need access to the servers or the source code for that.

                          – Sasha Chedygov
                          Mar 31 '10 at 7:08






                        • 1





                          After trying both httrack and wget for sites with authorization, I have to lean in favor of wget. Could not get httrack to work in those cases.

                          – Leo
                          May 18 '12 at 11:55














                        317





                        +100







                        317





                        +100



                        317




                        +100





                        HTTRACK works like a champ for copying the contents of an entire site. This tool can even grab the pieces needed to make a website with active code content work offline. I am amazed at the stuff it can replicate offline.



                        This program will do all you require of it.



                        Happy hunting!






                        share|improve this answer















                        HTTRACK works like a champ for copying the contents of an entire site. This tool can even grab the pieces needed to make a website with active code content work offline. I am amazed at the stuff it can replicate offline.



                        This program will do all you require of it.



                        Happy hunting!







                        share|improve this answer














                        share|improve this answer



                        share|improve this answer








                        edited Oct 23 '13 at 12:56


























                        community wiki





                        2 revs, 2 users 71%
                        Axxmasterr










                        • 6





                          Been using this for years - highly recommended.

                          – Umber Ferrule
                          Aug 9 '09 at 20:38











                        • You can also limit the speed of download so you don't use too much bandwidth to the detriment of everyone else.

                          – Umber Ferrule
                          Aug 21 '09 at 22:18






                        • 3





                          Would this copy the actual ASP code that runs on the server though?

                          – Taptronic
                          Mar 19 '10 at 13:02






                        • 7





                          @Optimal Solutions: No, that's not possible. You'd need access to the servers or the source code for that.

                          – Sasha Chedygov
                          Mar 31 '10 at 7:08






                        • 1





                          After trying both httrack and wget for sites with authorization, I have to lean in favor of wget. Could not get httrack to work in those cases.

                          – Leo
                          May 18 '12 at 11:55














                        • 6





                          Been using this for years - highly recommended.

                          – Umber Ferrule
                          Aug 9 '09 at 20:38











                        • You can also limit the speed of download so you don't use too much bandwidth to the detriment of everyone else.

                          – Umber Ferrule
                          Aug 21 '09 at 22:18






                        • 3





                          Would this copy the actual ASP code that runs on the server though?

                          – Taptronic
                          Mar 19 '10 at 13:02






                        • 7





                          @Optimal Solutions: No, that's not possible. You'd need access to the servers or the source code for that.

                          – Sasha Chedygov
                          Mar 31 '10 at 7:08






                        • 1





                          After trying both httrack and wget for sites with authorization, I have to lean in favor of wget. Could not get httrack to work in those cases.

                          – Leo
                          May 18 '12 at 11:55








                        6




                        6





                        Been using this for years - highly recommended.

                        – Umber Ferrule
                        Aug 9 '09 at 20:38





                        Been using this for years - highly recommended.

                        – Umber Ferrule
                        Aug 9 '09 at 20:38













                        You can also limit the speed of download so you don't use too much bandwidth to the detriment of everyone else.

                        – Umber Ferrule
                        Aug 21 '09 at 22:18





                        You can also limit the speed of download so you don't use too much bandwidth to the detriment of everyone else.

                        – Umber Ferrule
                        Aug 21 '09 at 22:18




                        3




                        3





                        Would this copy the actual ASP code that runs on the server though?

                        – Taptronic
                        Mar 19 '10 at 13:02





                        Would this copy the actual ASP code that runs on the server though?

                        – Taptronic
                        Mar 19 '10 at 13:02




                        7




                        7





                        @Optimal Solutions: No, that's not possible. You'd need access to the servers or the source code for that.

                        – Sasha Chedygov
                        Mar 31 '10 at 7:08





                        @Optimal Solutions: No, that's not possible. You'd need access to the servers or the source code for that.

                        – Sasha Chedygov
                        Mar 31 '10 at 7:08




                        1




                        1





                        After trying both httrack and wget for sites with authorization, I have to lean in favor of wget. Could not get httrack to work in those cases.

                        – Leo
                        May 18 '12 at 11:55





                        After trying both httrack and wget for sites with authorization, I have to lean in favor of wget. Could not get httrack to work in those cases.

                        – Leo
                        May 18 '12 at 11:55













                        253














                        Wget is a classic command-line tool for this kind of task. It comes with most Unix/Linux systems, and you can get it for Windows too. On a Mac, Homebrew is the easiest way to install it (brew install wget).



                        You'd do something like:



                        wget -r --no-parent http://site.com/songs/


                        For more details, see Wget Manual and its examples, or e.g. these:




                        • wget: Download entire websites easy


                        • Wget examples and scripts







                        share|improve this answer





















                        • 11





                          There's no better answer than this - wget can do anything :3

                          – Phoshi
                          Sep 16 '09 at 22:30






                        • 4





                          +1 for including the --no-parent. definitely use --mirror instead of -r. and you might want to include -L/--relative to not follow links to other servers.

                          – quack quixote
                          Oct 9 '09 at 12:43






                        • 2





                          As I also asked for httrack.com - would this cmd line tool get the ASP code or would it just get the rendering of the HTML? I have to try this. This could be a bit worrisome for developers if it does...

                          – Taptronic
                          Mar 19 '10 at 13:04






                        • 5





                          @optimal, the HTML output of course - it would get the code only if the server was badly misconfigured

                          – Jonik
                          Mar 19 '10 at 15:17






                        • 2





                          unfortunately it does not work for me - there is a problem with links to css files, they are not changed to relative i.e., you can see something like this in files: <link rel="stylesheet" type="text/css" href="/static/css/reset.css" media="screen" /> which does not work locally well, unless there is a waz to trick firefox to think that certain dir is a root.

                          – gorn
                          Jul 27 '12 at 0:42
















                        253














                        Wget is a classic command-line tool for this kind of task. It comes with most Unix/Linux systems, and you can get it for Windows too. On a Mac, Homebrew is the easiest way to install it (brew install wget).



                        You'd do something like:



                        wget -r --no-parent http://site.com/songs/


                        For more details, see Wget Manual and its examples, or e.g. these:




                        • wget: Download entire websites easy


                        • Wget examples and scripts







                        share|improve this answer





















                        • 11





                          There's no better answer than this - wget can do anything :3

                          – Phoshi
                          Sep 16 '09 at 22:30






                        • 4





                          +1 for including the --no-parent. definitely use --mirror instead of -r. and you might want to include -L/--relative to not follow links to other servers.

                          – quack quixote
                          Oct 9 '09 at 12:43






                        • 2





                          As I also asked for httrack.com - would this cmd line tool get the ASP code or would it just get the rendering of the HTML? I have to try this. This could be a bit worrisome for developers if it does...

                          – Taptronic
                          Mar 19 '10 at 13:04






                        • 5





                          @optimal, the HTML output of course - it would get the code only if the server was badly misconfigured

                          – Jonik
                          Mar 19 '10 at 15:17






                        • 2





                          unfortunately it does not work for me - there is a problem with links to css files, they are not changed to relative i.e., you can see something like this in files: <link rel="stylesheet" type="text/css" href="/static/css/reset.css" media="screen" /> which does not work locally well, unless there is a waz to trick firefox to think that certain dir is a root.

                          – gorn
                          Jul 27 '12 at 0:42














                        253












                        253








                        253







                        Wget is a classic command-line tool for this kind of task. It comes with most Unix/Linux systems, and you can get it for Windows too. On a Mac, Homebrew is the easiest way to install it (brew install wget).



                        You'd do something like:



                        wget -r --no-parent http://site.com/songs/


                        For more details, see Wget Manual and its examples, or e.g. these:




                        • wget: Download entire websites easy


                        • Wget examples and scripts







                        share|improve this answer















                        Wget is a classic command-line tool for this kind of task. It comes with most Unix/Linux systems, and you can get it for Windows too. On a Mac, Homebrew is the easiest way to install it (brew install wget).



                        You'd do something like:



                        wget -r --no-parent http://site.com/songs/


                        For more details, see Wget Manual and its examples, or e.g. these:




                        • wget: Download entire websites easy


                        • Wget examples and scripts








                        share|improve this answer














                        share|improve this answer



                        share|improve this answer








                        edited Mar 20 '17 at 10:17


























                        community wiki





                        7 revs, 3 users 71%
                        Jonik









                        • 11





                          There's no better answer than this - wget can do anything :3

                          – Phoshi
                          Sep 16 '09 at 22:30






                        • 4





                          +1 for including the --no-parent. definitely use --mirror instead of -r. and you might want to include -L/--relative to not follow links to other servers.

                          – quack quixote
                          Oct 9 '09 at 12:43






                        • 2





                          As I also asked for httrack.com - would this cmd line tool get the ASP code or would it just get the rendering of the HTML? I have to try this. This could be a bit worrisome for developers if it does...

                          – Taptronic
                          Mar 19 '10 at 13:04






                        • 5





                          @optimal, the HTML output of course - it would get the code only if the server was badly misconfigured

                          – Jonik
                          Mar 19 '10 at 15:17






                        • 2





                          unfortunately it does not work for me - there is a problem with links to css files, they are not changed to relative i.e., you can see something like this in files: <link rel="stylesheet" type="text/css" href="/static/css/reset.css" media="screen" /> which does not work locally well, unless there is a waz to trick firefox to think that certain dir is a root.

                          – gorn
                          Jul 27 '12 at 0:42














                        • 11





                          There's no better answer than this - wget can do anything :3

                          – Phoshi
                          Sep 16 '09 at 22:30






                        • 4





                          +1 for including the --no-parent. definitely use --mirror instead of -r. and you might want to include -L/--relative to not follow links to other servers.

                          – quack quixote
                          Oct 9 '09 at 12:43






                        • 2





                          As I also asked for httrack.com - would this cmd line tool get the ASP code or would it just get the rendering of the HTML? I have to try this. This could be a bit worrisome for developers if it does...

                          – Taptronic
                          Mar 19 '10 at 13:04






                        • 5





                          @optimal, the HTML output of course - it would get the code only if the server was badly misconfigured

                          – Jonik
                          Mar 19 '10 at 15:17






                        • 2





                          unfortunately it does not work for me - there is a problem with links to css files, they are not changed to relative i.e., you can see something like this in files: <link rel="stylesheet" type="text/css" href="/static/css/reset.css" media="screen" /> which does not work locally well, unless there is a waz to trick firefox to think that certain dir is a root.

                          – gorn
                          Jul 27 '12 at 0:42








                        11




                        11





                        There's no better answer than this - wget can do anything :3

                        – Phoshi
                        Sep 16 '09 at 22:30





                        There's no better answer than this - wget can do anything :3

                        – Phoshi
                        Sep 16 '09 at 22:30




                        4




                        4





                        +1 for including the --no-parent. definitely use --mirror instead of -r. and you might want to include -L/--relative to not follow links to other servers.

                        – quack quixote
                        Oct 9 '09 at 12:43





                        +1 for including the --no-parent. definitely use --mirror instead of -r. and you might want to include -L/--relative to not follow links to other servers.

                        – quack quixote
                        Oct 9 '09 at 12:43




                        2




                        2





                        As I also asked for httrack.com - would this cmd line tool get the ASP code or would it just get the rendering of the HTML? I have to try this. This could be a bit worrisome for developers if it does...

                        – Taptronic
                        Mar 19 '10 at 13:04





                        As I also asked for httrack.com - would this cmd line tool get the ASP code or would it just get the rendering of the HTML? I have to try this. This could be a bit worrisome for developers if it does...

                        – Taptronic
                        Mar 19 '10 at 13:04




                        5




                        5





                        @optimal, the HTML output of course - it would get the code only if the server was badly misconfigured

                        – Jonik
                        Mar 19 '10 at 15:17





                        @optimal, the HTML output of course - it would get the code only if the server was badly misconfigured

                        – Jonik
                        Mar 19 '10 at 15:17




                        2




                        2





                        unfortunately it does not work for me - there is a problem with links to css files, they are not changed to relative i.e., you can see something like this in files: <link rel="stylesheet" type="text/css" href="/static/css/reset.css" media="screen" /> which does not work locally well, unless there is a waz to trick firefox to think that certain dir is a root.

                        – gorn
                        Jul 27 '12 at 0:42





                        unfortunately it does not work for me - there is a problem with links to css files, they are not changed to relative i.e., you can see something like this in files: <link rel="stylesheet" type="text/css" href="/static/css/reset.css" media="screen" /> which does not work locally well, unless there is a waz to trick firefox to think that certain dir is a root.

                        – gorn
                        Jul 27 '12 at 0:42











                        137














                        Use wget:



                        wget -m -p -E -k www.example.com


                        The options explained:



                        -m, --mirror            Turns on recursion and time-stamping, sets infinite 
                        recursion depth, and keeps FTP directory listings.
                        -p, --page-requisites Get all images, etc. needed to display HTML page.
                        -E, --adjust-extension Save HTML/CSS files with .html/.css extensions.
                        -k, --convert-links Make links in downloaded HTML point to local files.





                        share|improve this answer





















                        • 6





                          +1 for providing the explanations for the suggested options. (Although I don't think --mirror is very self-explanatory. Here's from the man page: "This option turns on recursion and time-stamping, sets infinite recursion depth and keeps FTP directory listings. It is currently equivalent to -r -N -l inf --no-remove-listing")

                          – Ilari Kajaste
                          Sep 23 '09 at 11:04






                        • 2





                          If you don’t want to download everything into a folder with the name of the domain you want to mirror, create your own folder and use the -nH option (which skips the host part).

                          – Rafael Bugajewski
                          Jan 3 '12 at 15:33






                        • 2





                          What about if the Auth is required?

                          – Val
                          May 13 '13 at 16:04






                        • 4





                          I tried using your wget --mirror -p --html-extension --convert-links www.example.com and it just downloaded the index. I think you need the -r to download the entire site.

                          – Eric Brotto
                          Jul 14 '14 at 10:49






                        • 3





                          for those concerned about killing a site due to traffic / too many requests, use the -w seconds (to wait a number of secconds between the requests, or the --limit-rate=amount, to specify the maximum bandwidth to use while downloading

                          – vlad-ardelean
                          Jul 14 '14 at 18:33
















                        137














                        Use wget:



                        wget -m -p -E -k www.example.com


                        The options explained:



                        -m, --mirror            Turns on recursion and time-stamping, sets infinite 
                        recursion depth, and keeps FTP directory listings.
                        -p, --page-requisites Get all images, etc. needed to display HTML page.
                        -E, --adjust-extension Save HTML/CSS files with .html/.css extensions.
                        -k, --convert-links Make links in downloaded HTML point to local files.





                        share|improve this answer





















                        • 6





                          +1 for providing the explanations for the suggested options. (Although I don't think --mirror is very self-explanatory. Here's from the man page: "This option turns on recursion and time-stamping, sets infinite recursion depth and keeps FTP directory listings. It is currently equivalent to -r -N -l inf --no-remove-listing")

                          – Ilari Kajaste
                          Sep 23 '09 at 11:04






                        • 2





                          If you don’t want to download everything into a folder with the name of the domain you want to mirror, create your own folder and use the -nH option (which skips the host part).

                          – Rafael Bugajewski
                          Jan 3 '12 at 15:33






                        • 2





                          What about if the Auth is required?

                          – Val
                          May 13 '13 at 16:04






                        • 4





                          I tried using your wget --mirror -p --html-extension --convert-links www.example.com and it just downloaded the index. I think you need the -r to download the entire site.

                          – Eric Brotto
                          Jul 14 '14 at 10:49






                        • 3





                          for those concerned about killing a site due to traffic / too many requests, use the -w seconds (to wait a number of secconds between the requests, or the --limit-rate=amount, to specify the maximum bandwidth to use while downloading

                          – vlad-ardelean
                          Jul 14 '14 at 18:33














                        137












                        137








                        137







                        Use wget:



                        wget -m -p -E -k www.example.com


                        The options explained:



                        -m, --mirror            Turns on recursion and time-stamping, sets infinite 
                        recursion depth, and keeps FTP directory listings.
                        -p, --page-requisites Get all images, etc. needed to display HTML page.
                        -E, --adjust-extension Save HTML/CSS files with .html/.css extensions.
                        -k, --convert-links Make links in downloaded HTML point to local files.





                        share|improve this answer















                        Use wget:



                        wget -m -p -E -k www.example.com


                        The options explained:



                        -m, --mirror            Turns on recursion and time-stamping, sets infinite 
                        recursion depth, and keeps FTP directory listings.
                        -p, --page-requisites Get all images, etc. needed to display HTML page.
                        -E, --adjust-extension Save HTML/CSS files with .html/.css extensions.
                        -k, --convert-links Make links in downloaded HTML point to local files.






                        share|improve this answer














                        share|improve this answer



                        share|improve this answer








                        edited Apr 16 '18 at 22:59


























                        community wiki





                        3 revs, 3 users 71%
                        user9437









                        • 6





                          +1 for providing the explanations for the suggested options. (Although I don't think --mirror is very self-explanatory. Here's from the man page: "This option turns on recursion and time-stamping, sets infinite recursion depth and keeps FTP directory listings. It is currently equivalent to -r -N -l inf --no-remove-listing")

                          – Ilari Kajaste
                          Sep 23 '09 at 11:04






                        • 2





                          If you don’t want to download everything into a folder with the name of the domain you want to mirror, create your own folder and use the -nH option (which skips the host part).

                          – Rafael Bugajewski
                          Jan 3 '12 at 15:33






                        • 2





                          What about if the Auth is required?

                          – Val
                          May 13 '13 at 16:04






                        • 4





                          I tried using your wget --mirror -p --html-extension --convert-links www.example.com and it just downloaded the index. I think you need the -r to download the entire site.

                          – Eric Brotto
                          Jul 14 '14 at 10:49






                        • 3





                          for those concerned about killing a site due to traffic / too many requests, use the -w seconds (to wait a number of secconds between the requests, or the --limit-rate=amount, to specify the maximum bandwidth to use while downloading

                          – vlad-ardelean
                          Jul 14 '14 at 18:33














                        • 6





                          +1 for providing the explanations for the suggested options. (Although I don't think --mirror is very self-explanatory. Here's from the man page: "This option turns on recursion and time-stamping, sets infinite recursion depth and keeps FTP directory listings. It is currently equivalent to -r -N -l inf --no-remove-listing")

                          – Ilari Kajaste
                          Sep 23 '09 at 11:04






                        • 2





                          If you don’t want to download everything into a folder with the name of the domain you want to mirror, create your own folder and use the -nH option (which skips the host part).

                          – Rafael Bugajewski
                          Jan 3 '12 at 15:33






                        • 2





                          What about if the Auth is required?

                          – Val
                          May 13 '13 at 16:04






                        • 4





                          I tried using your wget --mirror -p --html-extension --convert-links www.example.com and it just downloaded the index. I think you need the -r to download the entire site.

                          – Eric Brotto
                          Jul 14 '14 at 10:49






                        • 3





                          for those concerned about killing a site due to traffic / too many requests, use the -w seconds (to wait a number of secconds between the requests, or the --limit-rate=amount, to specify the maximum bandwidth to use while downloading

                          – vlad-ardelean
                          Jul 14 '14 at 18:33








                        6




                        6





                        +1 for providing the explanations for the suggested options. (Although I don't think --mirror is very self-explanatory. Here's from the man page: "This option turns on recursion and time-stamping, sets infinite recursion depth and keeps FTP directory listings. It is currently equivalent to -r -N -l inf --no-remove-listing")

                        – Ilari Kajaste
                        Sep 23 '09 at 11:04





                        +1 for providing the explanations for the suggested options. (Although I don't think --mirror is very self-explanatory. Here's from the man page: "This option turns on recursion and time-stamping, sets infinite recursion depth and keeps FTP directory listings. It is currently equivalent to -r -N -l inf --no-remove-listing")

                        – Ilari Kajaste
                        Sep 23 '09 at 11:04




                        2




                        2





                        If you don’t want to download everything into a folder with the name of the domain you want to mirror, create your own folder and use the -nH option (which skips the host part).

                        – Rafael Bugajewski
                        Jan 3 '12 at 15:33





                        If you don’t want to download everything into a folder with the name of the domain you want to mirror, create your own folder and use the -nH option (which skips the host part).

                        – Rafael Bugajewski
                        Jan 3 '12 at 15:33




                        2




                        2





                        What about if the Auth is required?

                        – Val
                        May 13 '13 at 16:04





                        What about if the Auth is required?

                        – Val
                        May 13 '13 at 16:04




                        4




                        4





                        I tried using your wget --mirror -p --html-extension --convert-links www.example.com and it just downloaded the index. I think you need the -r to download the entire site.

                        – Eric Brotto
                        Jul 14 '14 at 10:49





                        I tried using your wget --mirror -p --html-extension --convert-links www.example.com and it just downloaded the index. I think you need the -r to download the entire site.

                        – Eric Brotto
                        Jul 14 '14 at 10:49




                        3




                        3





                        for those concerned about killing a site due to traffic / too many requests, use the -w seconds (to wait a number of secconds between the requests, or the --limit-rate=amount, to specify the maximum bandwidth to use while downloading

                        – vlad-ardelean
                        Jul 14 '14 at 18:33





                        for those concerned about killing a site due to traffic / too many requests, use the -w seconds (to wait a number of secconds between the requests, or the --limit-rate=amount, to specify the maximum bandwidth to use while downloading

                        – vlad-ardelean
                        Jul 14 '14 at 18:33











8

You should take a look at ScrapBook, a Firefox extension. It has an in-depth capture mode.

edited Aug 16 '11 at 8:07
community wiki
2 revs, 2 users 71%
webjunkie

  • 3

    No longer compatible with Firefox after version 57 (Quantum).

    – Yay295
    Apr 16 '18 at 22:31











8

Internet Download Manager has a Site Grabber utility with a lot of options - it lets you completely download any website you want, the way you want it.

1. You can set the limit on the size of the pages/files to download

2. You can set the number of branch sites to visit

3. You can change the way scripts/popups/duplicates behave

4. You can specify a domain; only pages/files under that domain which meet the required settings will be downloaded

5. The links can be converted to offline links for browsing

6. You have templates which let you choose the above settings for you

The software is not free, however - see if it suits your needs by trying the evaluation version.

edited Aug 16 '11 at 8:09
community wiki
2 revs, 2 users 79%
Lazer

























7

itsucks - that's the name of the program!

edited Aug 23 '11 at 21:24
community wiki
2 revs, 2 users 67%
kmarsh

























5

I'll address the online buffering that browsers use...

Typically most browsers use a browsing cache to keep the files you download from a website around for a bit, so that you do not have to download static images and content over and over again. This can speed things up quite a bit under some circumstances. Generally speaking, most browser caches are limited to a fixed size, and when that limit is hit, the oldest files in the cache are deleted.

ISPs tend to have caching servers that keep copies of commonly accessed websites like ESPN and CNN. This saves them the trouble of hitting these sites every time someone on their network goes there, and it can amount to significant savings for the ISP in duplicated requests to external sites.

answered Jul 28 '09 at 14:03
community wiki
Axxmasterr
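
Whether a given response may be reused by the browser or an ISP proxy is driven by the HTTP headers the server sends. As a quick illustration (the URL is just a placeholder), you can inspect those headers yourself:

    curl -sI https://www.example.com/logo.png | grep -iE 'cache-control|expires|etag|last-modified'

Headers like Cache-Control and Expires tell caches how long a copy may be reused; ETag and Last-Modified let them revalidate it cheaply instead of re-downloading.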
























5

I like Offline Explorer.
It's shareware, but it's very good and easy to use.

answered Sep 17 '09 at 2:08
community wiki
Eran
























4

WebZip is a good product as well.

edited Oct 23 '13 at 13:03
community wiki
2 revs, 2 users 67%
Ash

























4

I have not done this in many years, but there are still a few utilities out there.
You might want to try Web Snake.
I believe I used it years ago. I remembered the name right away when I read your question.

I agree with Stecy. Please do not hammer their site. Very Bad.

edited Oct 23 '13 at 13:05
community wiki
2 revs, 2 users 80%
Bobby Ortiz

























3

Try BackStreet Browser.

It is a free, powerful offline browser - a high-speed, multi-threading
website download and viewing program. By making multiple simultaneous
server requests, BackStreet Browser can quickly download an entire
website or part of a site, including HTML, graphics, Java applets,
sound and other user-definable files, and saves all the files on your
hard drive, either in their native format or as a compressed ZIP file,
for offline viewing.

edited Aug 16 '11 at 8:06
community wiki
2 revs, 2 users 56%
joe

























3

Teleport Pro is another free solution that will copy down any and all files from whatever your target is (also has a paid version which will allow you to pull more pages of content).

edited Oct 23 '13 at 12:57
community wiki
2 revs, 2 users 83%
Pretzel

























3

DownThemAll is a Firefox add-on that will download all the content (audio or video files, for example) for a particular web page in a single click. This doesn't download the entire site, but it may be the sort of thing the question was looking for.

edited Jun 13 '15 at 5:38
community wiki
2 revs
Will M

  • It's only capable of downloading links (HTML) and media (images).

    – Ain
    Sep 26 '17 at 17:07











3

For Linux and OS X: I wrote grab-site for archiving entire websites to WARC files. These WARC files can be browsed or extracted. grab-site lets you control which URLs to skip using regular expressions, and these can be changed when the crawl is running. It also comes with an extensive set of defaults for ignoring junk URLs.

There is a web dashboard for monitoring crawls, as well as additional options for skipping video content or responses over a certain size.

edited May 27 '16 at 13:45
community wiki
2 revs
Ivan Kozik
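
As a rough sketch of how this is used (assuming grab-site has been installed per its README, and with example.com standing in for the real site): start the dashboard in one terminal, then kick off a crawl in another.

    # terminal 1: serve the monitoring dashboard (local web UI)
    gs-server

    # terminal 2: crawl the site into a new directory under the current one
    grab-site https://www.example.com/

The crawl writes a WARC file plus logs into that directory, and the ignore patterns can be edited on the fly while it runs.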
























1

The venerable FreeDownloadManager.org has this feature too.

Free Download Manager has it in two forms: Site Explorer and Site Spider:

Site Explorer
Site Explorer lets you view the folder structure of a web site and easily download necessary files or folders.

HTML Spider
You can download whole web pages or even whole web sites with HTML Spider. The tool can be adjusted to download files with specified extensions only.

I find Site Explorer useful for seeing which folders to include/exclude before you attempt to download the whole site - especially when there is an entire forum hiding in the site that you don't want to download, for example.

answered Sep 27 '15 at 8:49
community wiki
David d C e Freitas
























0

Power wget

While wget was already mentioned, this resource and command line were so seamless I thought it deserved mention:

    wget -P /path/to/destination/directory/ -mpck --user-agent="" -e robots=off --wait 1 -E https://www.example.com/

See this code explained on explainshell

answered Nov 3 '17 at 18:13
community wiki
Shwaydogg
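
For reference, the flags in that one-liner break down roughly as follows (the destination path and URL are placeholders):

    # -P DIR             save everything under the given directory prefix
    # -m                 mirror: recursion and time-stamping with infinite depth
    # -p                 page requisites: also fetch the images/CSS each page needs
    # -c                 continue partially downloaded files
    # -k                 convert links in saved pages so they work offline
    # --user-agent=""    send an empty User-Agent string
    # -e robots=off      ignore robots.txt (use responsibly)
    # --wait 1           pause one second between requests
    # -E                 adjust extensions so HTML/CSS files get matching suffixes
    wget -P /path/to/destination/directory/ -mpck --user-agent="" -e robots=off --wait 1 -E https://www.example.com/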
























                                                                                                                -1














                                                                                                                download HTTracker it will download websites very easy steps to follows.



                                                                                                                download link:http://www.httrack.com/page/2/



                                                                                                                video that help may help you :https://www.youtube.com/watch?v=7IHIGf6lcL4






                                                                                                                share|improve this answer


























                                                                                                                • -1 duplicate of top answer

                                                                                                                  – wjandrea
                                                                                                                  Sep 24 '17 at 7:28


















                                                                                                                -1














                                                                                                                download HTTracker it will download websites very easy steps to follows.



                                                                                                                download link:http://www.httrack.com/page/2/



                                                                                                                video that help may help you :https://www.youtube.com/watch?v=7IHIGf6lcL4






                                                                                                                share|improve this answer


























                                                                                                                answered Sep 21 '15 at 16:02


























                                                                                                                community wiki





                                                                                                                ALI SHEKH














                                                                                                                • -1 duplicate of top answer

                                                                                                                  – wjandrea
                                                                                                                  Sep 24 '17 at 7:28





















                                                                                                                -3














I believe Google Chrome can do this on desktop devices: just go to the browser menu and click "Save page as".

Also note that services like Pocket may not actually save the website, and are thus susceptible to link rot.

Lastly, note that copying the contents of a website may infringe copyright, if it applies.



























                                                                                                                answered May 16 '15 at 18:05


























                                                                                                                community wiki





                                                                                                                jiggunjer









                                                                                                                • 3





                                                                                                                  A web page in your browser is just one out of many of a web site.

                                                                                                                  – Arjan
                                                                                                                  May 16 '15 at 20:05











                                                                                                                • @Arjan I guess that makes my option labor intensive. I believe it is more common for people to just want to save one page, so this answer may be better for those people who come here for that.

                                                                                                                  – jiggunjer
                                                                                                                  May 17 '15 at 10:10

























                                                                                                                -3














Firefox can do it natively (at least FF 42 can). Just use "Save Page".

(screenshot of Firefox's "Save Page" menu item)



























                                                                                                                answered Dec 2 '15 at 13:59


























                                                                                                                community wiki





                                                                                                                user1032531









                                                                                                                • 5





                                                                                                                  Wrong! The question asks how to save an entire web site. Firefox cannot do that.

                                                                                                                  – user477799
                                                                                                                  Jul 26 '16 at 6:24






                                                                                                                • 1





                                                                                                                  Your method works only if it's a one-page site, but if the site has 699 pages? Would be very tiring...

                                                                                                                  – Quidam
                                                                                                                  Dec 15 '16 at 7:03



















                                                                                                                protected by Community Apr 16 '13 at 10:22


