Compress any file to 1KB and BACK...

ok karan, I read your theory on this compression when applied to text...

what explanation would you give for compressing other file types?

what's your explanation for

barf amillionrandomdigits.bin ...???
the file is available on the same page...

in one pass it compressed it down to 1 byte...

and what is your explanation for neuro's claim that he once compressed a zip file using BARF and got it down to 50%?
 
NEurodrOne said:
Is it BARF!??

does this really work to the specified extent? Theoretically it seems applicable, but has it been implemented on all types of files?

Tried it on one of my already-compressed zip files [50MB]. The first pass cut it in half, and after that the gains kept evening out [got bored after it reached 13MB! :p]. Yup, it would be really awesome to see any practically realized test that reaches that 1KB barrier. Any compressed examples of it, thixkull?!

I'm sure it didn't store 25MB in just a 3-letter file name... and at that, it uses just 2 letters other than the x...
 
thixkull said:
I'm sure it didn't store 25MB in just a 3-letter file name... and at that, it uses just 2 letters other than the x...

Study the code, buddy...

He premodels the files... it's simply amazing how he fools people WITH the source code on the front page. Just shows that if you sell the stuff well enough, it WILL sell...
@thixkull: hint hint: look at the size of the .exe :)
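Here's roughly what that "premodeling" trick would look like in Python (my own sketch of the idea, not BARF's actual code; the table and function names are hypothetical). The decompressor ships with full copies of the well-known test files, which is also why the .exe is so big:

[code]
import hashlib

# Hypothetical table of files embedded in the decompressor binary.
KNOWN_FILES = [
    # (sha1 hex digest of the original file, its full contents), e.g.:
    # ("...", open("amillionrandomdigits.bin", "rb").read()),
]

def compress(data: bytes) -> bytes:
    digest = hashlib.sha1(data).hexdigest()
    for index, (known_digest, _contents) in enumerate(KNOWN_FILES):
        if digest == known_digest:
            # A recognized file "compresses" to a 1-byte index.
            return bytes([index])
    raise ValueError("not a premodeled file, no magic available")

def decompress(token: bytes) -> bytes:
    # "Decompression" just copies the embedded original back out.
    return KNOWN_FILES[token[0]][1]
[/code]

So amillionrandomdigits.bin "compresses" to 1 byte only because a copy of it is already sitting inside the program.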
 
About BARF: the program is a joke. A lot of people have claimed (especially on comp.compression) that they have found a way to compress every input file above a certain size, or to compress random data. If such a program were possible, then you could recompress the compressed output recursively until the output was as small as you liked.
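The counting argument against that is easy to verify; here is a quick check (my own illustration, not part of the original post):

[code]
# Pigeonhole check: there are 2**n distinct n-bit files, but only
# 2**n - 1 files strictly shorter than n bits. So no lossless
# compressor can shrink EVERY file: two inputs would have to share
# an output, and decompression could not tell them apart.
n = 16
files_of_length_n = 2 ** n
shorter_files = sum(2 ** k for k in range(n))
assert shorter_files == files_of_length_n - 1
print(files_of_length_n, "inputs, only", shorter_files, "shorter outputs")
[/code]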
I wrote the program to show how it is possible for a compressor to cheat. You have to design the rules carefully to avoid this. For example, if you look at the Calgary challenge (the Compression/SHA-1 Challenge), BARF would not pass. First, the size has to include the size of the decompressor, and the BARF decompressor has to have a copy of the Calgary corpus embedded in it to work (which it does). Second, the recursive part would not work, because the whole thing has to be packed into an archive and the archive has to store the name of the compressed file.
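That last point is the filename cheat. A rough sketch of how such a pass could work (my reconstruction of the idea, not the actual BARF source): each pass steals a byte from the file and hides it in the file's name, so the file really does shrink, while the file plus its name carry exactly as much information as before.

[code]
import os

def compress_pass(path: str) -> str:
    """One fake 'compression' pass: strip the last byte of the file
    and encode it as two hex characters appended to the filename."""
    with open(path, "rb") as f:
        data = f.read()
    assert data, "nothing left to steal"
    new_path = path + format(data[-1], "02x")
    with open(new_path, "wb") as f:
        f.write(data[:-1])
    os.remove(path)
    return new_path

def decompress_pass(path: str) -> str:
    """Reverse: pull the byte back out of the name and re-append it."""
    new_path, hexbyte = path[:-2], path[-2:]
    with open(path, "rb") as f:
        data = f.read()
    with open(new_path, "wb") as f:
        f.write(data + bytes([int(hexbyte, 16)]))
    os.remove(path)
    return new_path
[/code]

Run enough passes and the file itself hits 0 bytes, but the name has grown to hold everything that was removed; the moment you pack the result into an archive that stores filenames, all those bytes come right back.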

[url=http://www.maximumcompression.com/guestbook/gb.php?offset=156&poffset=1]Maximum Compression[/url]
 