Data compression is the process of reducing the number of bits needed to store or transmit information. Compressed data takes up substantially less disk space than the original, so much more content can be kept in the same amount of space. Different compression algorithms work in different ways: some remove only redundant bits, so no quality is lost when the information is uncompressed (lossless compression), while others discard bits deemed unnecessary, so uncompressing the data later yields lower quality than the original (lossy compression). Compressing and uncompressing content consumes a significant amount of system resources, in particular CPU processing time, so any web hosting platform that compresses data in real time needs adequate power to support the feature. A simple example of how information can be compressed is to replace a binary sequence such as 111111 with 6x1, i.e. "remembering" how many consecutive 1s or 0s there are instead of storing the entire sequence.
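The technique in that last example is known as run-length encoding. As a rough illustration only (not the exact scheme any particular file system uses), a minimal Python sketch of the idea could look like this:

    from itertools import groupby

    def rle_encode(bits: str) -> list[tuple[int, str]]:
        """Collapse runs of repeated symbols into (count, symbol) pairs."""
        return [(len(list(run)), symbol) for symbol, run in groupby(bits)]

    def rle_decode(pairs: list[tuple[int, str]]) -> str:
        """Expand (count, symbol) pairs back into the original string."""
        return "".join(symbol * count for count, symbol in pairs)

    original = "111111000011"
    encoded = rle_encode(original)          # [(6, '1'), (4, '0'), (2, '1')]
    assert rle_decode(encoded) == original  # lossless: the round trip is exact

Because decoding reproduces the input exactly, this is a lossless method; it saves space only when the data actually contains long runs of repeated symbols.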
Data Compression in Shared Hosting
The ZFS file system that runs on our cloud web hosting platform uses a compression algorithm called LZ4. It can improve the performance of any site hosted in a shared hosting account on our end, as it not only compresses data better than the algorithms used by various other file systems, but also uncompresses it faster than a hard drive can read. This comes at the cost of a lot of CPU processing time, which is not a problem for our platform since it uses clusters of powerful servers working together. Another advantage of LZ4 is that it enables us to generate backups much faster and on less disk space, so we can keep multiple daily backups of your databases and files without their generation affecting the performance of the servers. This way, we can always recover any content that you may have deleted by accident.
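To illustrate the lossless round trip described above, the short sketch below compresses and uncompresses a buffer with LZ4. It assumes the third-party python-lz4 package (installable with "pip install lz4"); this is a standalone demonstration, not part of the hosting platform itself:

    import lz4.frame  # third-party package: pip install lz4

    # A repetitive payload, which compresses well.
    original = b"hosted content " * 10_000

    compressed = lz4.frame.compress(original)
    restored = lz4.frame.decompress(compressed)

    assert restored == original  # LZ4 is lossless: the round trip is exact
    print(f"ratio: {len(original) / len(compressed):.1f}x "
          f"({len(original)} -> {len(compressed)} bytes)")

On a ZFS system itself, no application code is needed: compression is typically switched on per dataset with a command such as "zfs set compression=lz4 <dataset>", after which the file system compresses and uncompresses data transparently.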