What is the fastest compression method for a large number of files?











I need to compress a directory of around 350,000 fairly small files that amount to about 100 GB in total. I am using OS X and currently use the standard "Compress" tool, which converts the directory into a .zip file. Is there a faster way to do this?










macos compression zip tar gzip






asked Jun 19 '11 at 4:55 by Spike












  • You probably cannot beat tar, since without specific options it doesn't actually compress, it only archives. In answers, I'd love to see proof, not opinion...
    – Daniel Beck
    Jun 19 '11 at 5:28






  • Depends how much compression you want.
    – ta.speot.is
    Jun 19 '11 at 6:49






  • I did end up using tar and, for speed reasons, did not try compressing it yet. It was able to complete in time for what I needed it for. Thanks!
    – Spike
    Jun 20 '11 at 3:16










  • @DanielBeck, the problem with tar is that it doesn't show the directory tree, so to even get a "view" we need to unpack the whole tar. Are there alternatives to tar that show a directory view?
    – Pacerier
    May 16 '15 at 22:33




















3 Answers

















13 votes (accepted)










For directories I'd use tar piped to bzip2 with max compression.



A simple way to go is:




tar cfj archive.tar.bz2 dir-to-be-archived/


This works great if you don't intend to fetch small sets of files out of the archive and are just planning to extract the whole thing whenever/wherever required. Yet, if you do want to get a small set of files out, it's not too bad.



I prefer to call such archives filename.tar.bz2 and extract with the 'xfj' option.
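
For example, extracting the whole archive with the option string just described would be:

tar xfj archive.tar.bz2    # extract everything into the current directory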



The max-compression pipe looks like this:




tar cf - dir-to-be-archived/ | bzip2 -9 - > archive.tar.bz2
# ^pipe tarball from here to zip-in^ into the archive file.


Note: the 'bzip2' route, and higher compression in general, tends to be slower than regular gzip from 'tar cfz'.
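
For comparison, the plain-gzip route that note refers to would look roughly like this (a sketch: gzip -1 is gzip's fastest level and trades compression ratio for speed, as the comments below also suggest):

tar czf archive.tar.gz dir-to-be-archived/
# or, with the fastest gzip setting via an explicit pipe:
tar cf - dir-to-be-archived/ | gzip -1 > archive.tar.gz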



If you have a fast network and the archive is going to be placed on a different machine, you can speed things up with a pipe across the network (effectively using two machines together).




tar cf - dir/ | ssh user@server "bzip2 -9 - > /target-path/archive.tar.bz2"
# ^ pipe tarball over network to zip ^ and archive on remote machine.


Some references:




  1. Linux Journal: Compression Tools Compared, Jul 28, 2005


    • this also refers to the MaximumCompression site mentioned by Dennis




  2. gzip vs. bzip2, Aug 26, 2003


  3. A Quick Benchmark: Gzip vs. Bzip2 vs. LZMA, May 31, 2005






answered Jun 19 '11 at 5:58 by nik (edited Jun 19 '11 at 6:38)



















  • The questioner asked for the fastest method; bzipping a 100 GB tar would take a lifetime! There comes a point, with disk space being so cheap, where taking aeons to squeeze out every last possible bit of redundancy is just a senseless waste of resources, unless absolutely necessary. With most of the disk usage taken up in slack space, gzipping the tar with -1 would probably do the job well enough and allow moving on to the next task a few months earlier!
    – Andy Lee Robinson
    Jul 30 '11 at 12:11










  • While I agree that a 100 GB file is probably not worth compressing in its entirety, I don't think that bzip2 will take linearly more time for 100 GB as compared to, say, 1 GB. Would love to see some theory or data to show it either way.
    – nik
    Jul 30 '11 at 16:57










  • I understand that bzip2's dictionary is adaptive, so it is constantly looking for new redundancies within its search window up to the end of the file. Subject to the homogeneity of the file's entropy, it should be relatively linear. It would be a bad compressor that assumed it had all it needed from the beginning of the file to compress the rest quickly, though in some cases that may be all that is needed; still, there are better ways to grow old than working it out empirically with 100 GB datasets!
    – Andy Lee Robinson
    Jul 31 '11 at 2:40


















6 votes













This guy did some research on that. It appears that .zip will compress larger files faster; however, it yields one of the largest compressed sizes. It also looks like he was using Windows utilities, but I'm betting OS X's utility is almost as optimized.



Here is an excellent website where numerous compression utilities have been benchmarked for speed over many files. There are many other tests on that site you could look at to determine the best utility for you.



Much of the speed has to do with the program you use. I've used 7-Zip's utility for Windows and find it to be very fast. However, compressing many files takes a long time no matter what, so I would just let it run overnight. Or you could just tar the whole thing and not compress it... Personally, I hate unzipping large archives, so I would be careful if that's what you want to do.
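
A minimal sketch of the tar-without-compression route just mentioned (assuming the bsdtar that ships with OS X; the directory and archive names are placeholders):

tar cf archive.tar dir-to-be-archived/    # archive only, no compression step
tar tf archive.tar                        # list the contents without extracting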






answered Jun 19 '11 at 5:27 by Dennis Hodapp




























0 votes













I prefer using

tar cf - dir-to-be-archived/ | bzip2 -9 - > archive.tar.bz2

for moving files to another server and converting them at the same time.
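
If the goal is only to move the files rather than keep a compressed archive on the far side, a related (hypothetical) variant pipes tar straight into tar on the destination; user@server and /target-path are placeholders:

tar cf - dir-to-be-archived/ | ssh user@server "tar xf - -C /target-path"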






answered Nov 21 at 16:31 by oussama fahd

















  • Which is already suggested in the top answer by @nik. No need to duplicate for emphasis; just upvote the other answer, or add a comment if you have something substantive but don't want to write an involved answer. ;o)
    – pbhj
    Nov 21 at 16:40










