Compress any file to 1KB and BACK..

There is no so-called "future of data compression". I don't see any big changes in compression ratios coming. Newer compression technology will definitely not cut file sizes in half.

The software posted here already gets very close to the maximum possible compression of a file, which depends on the amount of information it contains; beyond that it cannot be compressed further.

So the claim to "compress any 500 MB file to 1 MB" is just ridiculous.

I'm sure a software pro could create a 500 MB file that this thing won't compress to even 499 MB.
 
max_ds said:
There is no so-called "future of data compression". I don't see any big changes in compression ratios coming. Newer compression technology will definitely not cut file sizes in half.

The software posted here already gets very close to the maximum possible compression of a file, which depends on the amount of information it contains; beyond that it cannot be compressed further.

So the claim to "compress any 500 MB file to 1 MB" is just ridiculous.

I'm sure a software pro could create a 500 MB file that this thing won't compress to even 499 MB.

What to say!!!
 
Most types of computer files are fairly redundant -- they have the same information listed over and over again. File-compression programs simply get rid of the redundancy. Instead of listing a piece of information over and over again, a file-compression program lists that information once and then refers back to it whenever it appears in the original program.
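
To make that concrete, here's a toy sketch of my own, a run-length encoder that stores each repeated byte once together with a count instead of listing it over and over (real archivers like ZIP use dictionary and statistical coding, which is far more capable, but the principle of removing repetition is the same):

Code:
#include <iostream>
#include <string>

// Toy run-length encoder: each run of identical bytes is stored once,
// followed by a one-byte repeat count, instead of being listed repeatedly.
std::string rle_encode(const std::string& in) {
    std::string out;
    for (size_t i = 0; i < in.size();) {
        size_t j = i;
        while (j < in.size() && in[j] == in[i] && j - i < 255) ++j;
        out += in[i];                         // the byte, once
        out += static_cast<char>(j - i);      // how many times it repeated
        i = j;
    }
    return out;
}

int main() {
    std::string redundant(1000, 'A');         // highly repetitive data
    std::cout << redundant.size() << " -> "
              << rle_encode(redundant).size() << " bytes\n";   // 1000 -> 8
    // Data with no repetition would actually grow under this scheme.
}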

Read how compression works here: Howstuffworks "How File Compression Works".

There is a limit to how much any file can be compressed, and I don't think it's possible to reduce just any file to 1 KB. How can one reduce a video file or an audio file to just 1 KB? It's simply not possible, in my opinion, unless proved otherwise...

Hope it doesn't turn out to be BS... :bshit:
 
..we aren't talking about this -> KGB Archiver ..are we now?!

KGB Archiver is a compression tool with an unbelievably high compression ratio. It surpasses even efficient tools like 7-Zip and UHARC in its abilities. Unfortunately, despite its powerful compression, it has high hardware requirements (a 1.5 GHz processor and 256 MB of RAM are recommended as an essential minimum). One of the advantages of KGB Archiver is AES-256 encryption, which is used to encrypt the archives and is one of the strongest encryption schemes known.

[Attachment: test2.jpg]

The following article throws more light on it.

KGB, the best compressor - GameDev.Net Discussion Forums

Is it really worth it?! Something smells fishy somewhere..
 
There's another hour to go... and till then let me answer some questions...
1. This is not KGB Archiver.

2. I found this off Google...

3. It's some kind of experiment... it's still in the works, but the theory and implementation are working...

4. Has anyone heard of PAQ compression?

5. This program works on the principle of compressing already-compressed data... it's iterative...

6. It's free, i.e. open source.

7. It's a command-line utility.
 
WinUDA, Emilcont and KGB Archiver are all based on PAQ81 :p

And this program is a command-line utility, remember...

It's not one of them...
 
Is it BARF!?

Does this really work to the specified extent? Theoretically it seems applicable, but has it been tried on all types of files?

Tried it on one of my already-compressed ZIP files [50 MB]. The first attempt cut it in half, and after that the ratio kept levelling off [got bored after it reached 13 MB! :p]. Yup, it would be really awesome to see any practical test that has reached that 1 KB barrier to date. Any compressed examples of it, thixkull?!
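
For comparison, this is what genuinely recompressing already-compressed data looks like with a real compressor: almost all the gain comes from the first pass, and later passes barely move. A quick sketch of my own using zlib (assumes zlib is installed; link with -lz):

Code:
#include <cstdio>
#include <vector>
#include <zlib.h>

// Compress a buffer once with zlib and return the compressed bytes.
static std::vector<unsigned char> deflate_once(const std::vector<unsigned char>& in) {
    uLongf outLen = compressBound(in.size());
    std::vector<unsigned char> out(outLen);
    compress(out.data(), &outLen, in.data(), in.size());
    out.resize(outLen);
    return out;
}

int main() {
    // 1 MB of very redundant data: shrinks dramatically on the first pass only.
    std::vector<unsigned char> data(1 << 20, 'A');
    for (int pass = 1; pass <= 4; ++pass) {
        data = deflate_once(data);
        std::printf("after pass %d: %zu bytes\n", pass, data.size());
    }
    // Passes 2-4 barely change the size (or grow it slightly), because the
    // first pass already left the data looking nearly random.
}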
 
Congrats...

Yes, it's BARF... now I request a mod to change the title of this thread so that everyone knows it's out... and they can come here and get their answers.

Neuro... it's supposed to work 1 KB at a time after that... you'll have to keep trying to get the file down to 1 KB...

I've been working on that for the last 24 hrs... but the basic program is BARF...

First I'm trying a basic batch script to see if I can keep compressing the compressed data, and if this works, I'll work upwards from there...
BTW, Neuro, try the digits file on the BARF homepage... it works fine with that... down to 1 KB in one pass... I tried compressing the same file with WinRAR... WinRAR couldn't even compress it...
 
The BARF Principle

Note that we can combine different compression algorithms to achieve results better than either one alone. For example, a program could try both COMPRESS and GZIP and choose the smaller file (either file.Z or file.gz) and never do worse than either program alone. BARF extends this idea by trying 257 different algorithms and choosing the best one, with remarkable results.
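
The "pick whichever output is smaller" part is easy to sketch (a toy example of my own, not BARF's actual code): try each candidate encoding, keep the shortest, and prepend a tag byte saying which one was used so it can be undone. The catch is that the tag itself costs space:

Code:
#include <iostream>
#include <string>
#include <vector>

// Candidate 1: store the data unchanged.
std::string enc_raw(const std::string& in) { return in; }

// Candidate 2: toy run-length encoding (byte, repeat-count pairs).
std::string enc_rle(const std::string& in) {
    std::string out;
    for (size_t i = 0; i < in.size();) {
        size_t j = i;
        while (j < in.size() && in[j] == in[i] && j - i < 255) ++j;
        out += in[i];
        out += static_cast<char>(j - i);
        i = j;
    }
    return out;
}

// Run every candidate, keep the smallest, and prepend a one-byte tag that
// records which encoder was used so the result can be decoded later.
std::string compress_best(const std::string& in) {
    std::vector<std::string> candidates = {enc_raw(in), enc_rle(in)};
    size_t best = 0;
    for (size_t k = 1; k < candidates.size(); ++k)
        if (candidates[k].size() < candidates[best].size()) best = k;
    return static_cast<char>(best) + candidates[best];   // tag + payload
}

int main() {
    std::cout << compress_best(std::string(1000, 'A')).size() << "\n";  // 9
    std::cout << compress_best("abcdefgh").size() << "\n";              // 9 > 8
    // Never worse than the best candidate plus one byte; that extra byte is
    // exactly the cost the pigeonhole principle imposes on every such scheme.
}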

Recursive compression has long been thought to be impractical. The pigeonhole principle states that it is impossible for a single algorithm to compress all messages of n bits or more, no matter what n is. This is because there are 2^n possible messages of n bits, but only 2^n - 1 possible encodings of n-1 or fewer bits. This means that at least two messages must code to the same encoding. Such an encoding could not be decompressed unambiguously. To avoid this, some messages must inevitably "compress" to an equal or larger size than the original. This limitation has stymied the development of recursive file compression technology because we eventually reach a point at which the "compressed" file is not any smaller.
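
To spell out the counting step behind that claim: the number of distinct bit strings of fewer than n bits is

\[
\sum_{k=0}^{n-1} 2^k = 2^n - 1 < 2^n,
\]

so the 2^n possible n-bit inputs cannot all map to distinct shorter outputs; at least two of them must share an encoding.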

However, the pigeonhole principle does not apply to a set of algorithms. BARF solves the recursive compression problem by using multiple algorithms arranged in such a way that every nonempty file can be compressed by at least one of them. By choosing the best algorithm, it is guaranteed that every nonempty file can be compressed by at least one byte. By repeated compression, any file can eventually be compressed to 0 bytes.

Amazing funda! And we thought the pigeonhole principle was as absolute as governing dynamics. Einstein was right... everything is relative! :p

Great find thixkull! Hope to see more developments on the same. :)
 
ZZZZZZZZZZZZZZZZZZZZZZZZZZZZZ

What a load of crap...............

The source code: http://www.cs.fit.edu/~mmahoney/compression/barf.cpp

Code:
// Test for .x?? extension.  Remove it, and prepend byte ?? to file
// where ?? is a base 26 [0-9][a-z] representation of the byte.
const int n=file.size();
if (n>4 && file[n-4]=='.' && (file[n-3]=='x' || file[n-3]=='X')
    && isdigit(file[n-2])
    && isalpha(file[n-1])) {
  int c=26*(file[n-2]-'0')+file[n-1]-'a';
  if (isupper(file[n-1]))
    c=c+'a'-'A';
  if (c<0 || c>255) {
    fclose(f1);
    printf("%s was not compressed by barf\n", file.c_str());
    return;
  }
  string outfile=file.substr(0, n-4);
  f2=fopen(outfile.c_str(), "wb");
  if (!f2) {
    fclose(f1);
    perror(outfile.c_str());
    return;
  }
  putc(c, f2);
  while ((c=getc(f1))!=EOF)
    putc(c, f2);
  long len=ftell(f1);
  long len2=ftell(f2);
  fclose(f2);
  fclose(f1);
  remove(file.c_str());
  printf("%s (%ld bytes) -> %s (%ld bytes)\n", file.c_str(), len,
         outfile.c_str(), len2);
  return;
}

For anyone with a little knowledge of C: you will notice that it is storing the data in the file names...

@All: Ever wondered why, for such small source code, the .exe is so large? lol, he is storing the "so-called" premodelled files in the .exe itself.

That's why some files go to .x (1 byte) immediately and others keep reducing by 1 byte while gaining an extension.

What a load of crap. @@ :mad:

It is NOT even trying to apply the most primitive of compressions.

See for yourself:



First the "H" was cut off, then the "o"...
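
To make the point concrete, here is roughly what the "compression" step has to be doing for the decompressor snippet above to undo it (a reconstruction of my own, not the author's code): remove the first byte of the file and hide it in a .x?? extension tacked onto the file name. Nothing is compressed; one byte just moves from the file into the directory entry, while the name grows by four characters.

Code:
#include <cstdio>
#include <string>

// Reconstruction of the trick the posted decompressor undoes: strip the
// first byte of the file and encode it as a ".x??" extension, where the
// digit/letter pair uses the same base-26 scheme the decompressor decodes
// (c = 26*digit + letter). The byte is not gone; it lives in the file name.
void fake_compress(const std::string& file) {
    FILE* in = std::fopen(file.c_str(), "rb");
    if (!in) { std::perror(file.c_str()); return; }
    int first = std::getc(in);
    if (first == EOF) { std::fclose(in); return; }           // empty file
    std::string outfile = file + ".x";
    outfile += static_cast<char>('0' + first / 26);          // digit part
    outfile += static_cast<char>('a' + first % 26);          // letter part
    FILE* out = std::fopen(outfile.c_str(), "wb");
    if (!out) { std::perror(outfile.c_str()); std::fclose(in); return; }
    for (int c; (c = std::getc(in)) != EOF;)                 // copy the rest
        std::putc(c, out);
    std::fclose(in);
    std::fclose(out);
    std::remove(file.c_str());
    std::printf("%s -> %s (file 1 byte shorter, name 4 characters longer)\n",
                file.c_str(), outfile.c_str());
}

int main(int argc, char** argv) {
    if (argc > 1) fake_compress(argv[1]);
    return 0;
}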
 
^^ Nice find, karan, but the fact remains that anything that claims to do better than what is theoretically possible and violates Shannon entropy is a guaranteed scam... No use even looking at such junk; I avoid posting in such threads :p.
 
rofl... nice find there, karan... wish I could fool my profs like that :(

BTW karan, mail this to the author and let's see what replies you get :D

Edit: just saw the person's profile; he seems to be quite educated and has also worked on the PAQ series of compressors... so I am a little confused here :S
 
lol... so does that exe increase in size?

So we need that exe, which has the lost info in it... which means no compression at all... is that the inference?
 
lol... this is ridiculous... I expected at least some compression, but storing each char in the file name & claiming 1 KB compression is insanely insane... :rofl:
 