Communications of the ACM

ACM TechNews

Putting the Squeeze on Data


Credit: iStockPhoto.com

Massachusetts Institute of Technology professor Piotr Indyk and graduate student Radu Berinde last year introduced two versions of a linear data-compression algorithm that equaled, and in some applications exceeded, the performance of other linear-compression systems. However, both versions of the algorithm would stop working under certain extreme conditions. Indyk and Berinde recently presented a new version of the algorithm that combines the advantages of the earlier versions without their limitations.

Indyk says that the difference between two very different files of similar size has a geometric interpretation: there is a way to describe the difference between the files mathematically in terms of distance. The researchers found a way to analyze the difference between compressed files using a different mathematical notion of geometric distance, and showed through that analysis that some fast compression algorithms still preserve the distance between files. By taking advantage of this new perspective, Indyk and Berinde were able to create a decompression algorithm that recovers more information from the original file without sacrificing speed.
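One reason a linear compression scheme admits this geometric view is that linearity makes the compressed version of the difference between two files equal to the difference between their compressed versions. The sketch below illustrates only that general property of linear sketching; the matrix, vectors, and `measure` function are hypothetical toy examples, not the researchers' actual algorithm.

```python
# Toy illustration of linear sketching: y = Phi x.
# Phi and the "file" vectors below are made-up toy data, chosen only
# to demonstrate that Phi(x - y) = Phi(x) - Phi(y).

def measure(phi, x):
    """Compress vector x with measurement matrix phi (matrix-vector product)."""
    return [sum(row[i] * x[i] for i in range(len(x))) for row in phi]

# Hypothetical 3x6 sparse 0/1 measurement matrix.
phi = [
    [1, 0, 1, 0, 0, 1],
    [0, 1, 0, 1, 0, 0],
    [1, 0, 0, 0, 1, 0],
]

x = [3, 0, 0, 5, 0, 0]   # "file" A as a sparse vector
y = [3, 0, 2, 0, 0, 0]   # "file" B

# Because compression is linear, sketching the difference gives the
# same result as differencing the sketches.
diff = [a - b for a, b in zip(x, y)]
lhs = measure(phi, diff)
rhs = [a - b for a, b in zip(measure(phi, x), measure(phi, y))]
assert lhs == rhs  # the two 3-entry sketches agree
```

Because of this identity, comparing two compressed files amounts to examining a compressed version of their difference, which is what makes a distance-based analysis of the compressed representations possible.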

From MIT News


Abstracts Copyright © 2009 Information Inc., Bethesda, Maryland, USA

