Data compression reduces the number of bits needed to store or transmit information. Compressed data takes up considerably less disk space than the original, so more content can be stored in the same amount of space. Compression algorithms work in different ways: lossless algorithms remove only redundant bits, so when the data is decompressed it is restored with no loss of quality, while lossy algorithms discard bits deemed unnecessary, so the decompressed data is of lower quality than the original. Compressing and decompressing content consumes a significant amount of system resources, particularly CPU time, so any hosting platform that compresses data in real time must have enough processing power to support the feature. A simple example of compression is replacing a binary sequence such as 111111 with 6x1, i.e. recording how many consecutive 1s or 0s occur instead of storing the sequence itself.
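The 6x1 example above is a form of run-length encoding. A minimal sketch in Python (the function names and the "6x1" count-x-bit notation follow the example in the text, not any particular library):

```python
def rle_encode(bits: str) -> str:
    """Run-length encode a bit string, e.g. '111111' -> '6x1'."""
    if not bits:
        return ""
    runs = []
    count = 1
    for prev, cur in zip(bits, bits[1:]):
        if cur == prev:
            count += 1
        else:
            runs.append(f"{count}x{prev}")  # close the finished run
            count = 1
    runs.append(f"{count}x{bits[-1]}")      # close the final run
    return ",".join(runs)


def rle_decode(encoded: str) -> str:
    """Reverse the encoding, e.g. '6x1,2x0' -> '11111100'."""
    if not encoded:
        return ""
    return "".join(
        bit * int(count)
        for count, bit in (run.split("x") for run in encoded.split(","))
    )


print(rle_encode("111111"))                  # 6x1
print(rle_decode(rle_encode("1111110011")))  # 1111110011
```

Because decoding restores the input exactly, this is a lossless scheme; it only pays off when the data contains long runs, which is why real-world algorithms combine it with other techniques.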

Data Compression in Website Hosting

The ZFS file system that runs on our cloud web hosting platform uses a compression algorithm called LZ4. It is significantly faster than most widely used alternatives, especially at compressing and decompressing non-binary data such as web content. LZ4 can even decompress data faster than it can be read from a hard disk drive, which improves the performance of websites hosted on ZFS-based platforms. Because the algorithm compresses data effectively and quickly, we are able to generate several backup copies of all the content stored in the website hosting accounts on our servers every day. Both your content and its backups require less space, and since both ZFS and LZ4 work very quickly, generating the backups does not affect the performance of the web hosting servers where your content is kept.
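The lossless round trip described above can be demonstrated in a few lines. LZ4 itself has no binding in Python's standard library, so this sketch substitutes the stdlib zlib module (a different lossless algorithm) purely to illustrate the principle: repetitive web content compresses well, and decompression restores it byte for byte.

```python
import zlib

# Repetitive HTML stands in for typical web content, which
# compresses well because of its redundancy.
html = b"<html><body>" + b"<p>Hello, world!</p>" * 200 + b"</body></html>"

compressed = zlib.compress(html)
restored = zlib.decompress(compressed)

assert restored == html  # lossless: the round trip is byte-identical
print(f"original: {len(html)} bytes, compressed: {len(compressed)} bytes")
```

The same round-trip property is what lets a ZFS platform compress live data transparently: applications always read back exactly the bytes they wrote.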