The term data compression refers to reducing the number of bits that need to be stored or transmitted. This can be done with or without loss of information: in the first case only redundant data is removed, so when the data is later uncompressed the content and quality are identical to the original; in the second case less important data is discarded permanently, so the quality of the uncompressed data is lower. Different compression algorithms work better for different types of information. Compressing and uncompressing data usually takes considerable processing time, so the server performing the operation must have sufficient resources to process the data quickly enough. One simple example of how information can be compressed is run-length encoding: instead of storing each individual 1 and 0 in a binary sequence, you store how many consecutive positions hold a 1 and how many hold a 0.
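The run-length idea described above can be sketched in a few lines of Python. This is a minimal illustration, not a production codec; the function names are our own:

```python
def rle_encode(bits: str) -> list[tuple[str, int]]:
    """Run-length encode a string of '0'/'1' characters into (bit, count) pairs."""
    runs: list[tuple[str, int]] = []
    for bit in bits:
        if runs and runs[-1][0] == bit:
            # Same bit as the previous position: extend the current run.
            runs[-1] = (bit, runs[-1][1] + 1)
        else:
            # Bit changed: start a new run of length 1.
            runs.append((bit, 1))
    return runs

def rle_decode(runs: list[tuple[str, int]]) -> str:
    """Expand the (bit, count) pairs back into the original bit string."""
    return "".join(bit * count for bit, count in runs)

data = "0000001111100000"
encoded = rle_encode(data)
# encoded is [('0', 6), ('1', 5), ('0', 5)] - three pairs instead of 16 bits
assert rle_decode(encoded) == data
```

Note that this scheme only saves space when the input contains long runs; for data that alternates frequently, the encoded form can actually be larger than the original, which is one reason different algorithms suit different types of information.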

Data Compression in Cloud Website Hosting

The ZFS file system that runs on our cloud web hosting platform uses a compression algorithm called LZ4. LZ4 is considerably faster than most alternative algorithms, particularly at compressing and uncompressing non-binary data such as web content. In fact, LZ4 can uncompress data faster than it can be read from a hard disk, which improves the overall performance of websites hosted on ZFS-based platforms. Because the algorithm compresses data well and does so quickly, we are able to generate several daily backups of all the content kept in the cloud website hosting accounts on our servers. Both your content and its backups take up less space, and since both ZFS and LZ4 work very fast, the backup generation does not affect the performance of the servers where your content is kept.
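For readers curious how LZ4 is applied at the file-system level, ZFS exposes compression as a per-dataset property. The commands below are the standard OpenZFS administration syntax; the pool/dataset name "tank/webhosting" is an illustrative placeholder, not a real path on our platform:

```shell
# Enable LZ4 compression on a ZFS dataset (hypothetical dataset name).
zfs set compression=lz4 tank/webhosting

# Verify the setting and check the achieved compression ratio.
zfs get compression,compressratio tank/webhosting
```

Because compression happens transparently inside the file system, applications and websites read and write files normally; the space savings reported by `compressratio` come for free from their point of view.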