The term data compression refers to reducing the number of bits of information that need to be stored or transmitted. Compression can be either lossless or lossy: lossless compression removes only redundant data, so when the data is decompressed it is bit-for-bit identical to the original, while lossy compression discards less important data, so the restored version is of lower quality. Different compression algorithms work better for different types of data. Compressing and decompressing data often takes considerable processing time, so the server performing the operation needs ample resources to process the data quickly enough. A simple example of how information can be compressed is to store how many consecutive positions contain a 1 and how many contain a 0 in a binary sequence, instead of storing each individual 1 and 0.
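The counting scheme described above is known as run-length encoding. A minimal sketch in Python (the function names are illustrative, not part of any particular library):

```python
def rle_encode(bits: str) -> list:
    """Run-length encode a string of '0'/'1' characters into (char, count) pairs."""
    runs = []
    for ch in bits:
        if runs and runs[-1][0] == ch:
            # Same symbol as the previous run: extend its count
            runs[-1] = (ch, runs[-1][1] + 1)
        else:
            # New symbol: start a new run of length 1
            runs.append((ch, 1))
    return runs

def rle_decode(runs: list) -> str:
    """Expand (char, count) pairs back into the original bit string."""
    return "".join(ch * count for ch, count in runs)

data = "1111111100000011"
encoded = rle_encode(data)
print(encoded)                        # [('1', 8), ('0', 6), ('1', 2)]
print(rle_decode(encoded) == data)    # True: lossless round trip
```

Sixteen stored symbols become three (symbol, count) pairs, and decoding reproduces the original exactly, which is what makes this a lossless scheme.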

Data Compression in Shared Hosting

The ZFS file system that runs on our cloud hosting platform uses a compression algorithm called LZ4. It can boost the performance of any website hosted in a shared hosting account with us: not only does it compress data more effectively than the algorithms used by other file systems, it also decompresses data faster than a hard disk drive can read it. This comes at the cost of considerable CPU time, which is not a problem on our platform because it uses clusters of powerful servers working together. A further advantage of LZ4 is that it allows us to generate backups faster and store them in less disk space, so we keep several daily backups of your files and databases without affecting server performance. That way, we can always restore any content you may have deleted by accident.
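The compress/decompress round trip that ZFS performs transparently can be sketched in a few lines. LZ4 itself is not in the Python standard library, so the stdlib's zlib is used here purely as a stand-in to illustrate the same lossless principle; the speed and ratio numbers would differ with real LZ4:

```python
import zlib

# Highly repetitive data, which compresses very well under any
# dictionary- or run-based scheme (illustrative sample only)
original = b"shared hosting " * 1000

# A low compression level trades ratio for speed, which is the
# general design point of LZ4 as well
compressed = zlib.compress(original, level=1)
restored = zlib.decompress(compressed)

print(restored == original)            # True: the round trip is lossless
print(len(original), len(compressed))  # the compressed copy is far smaller
```

The same property is what makes compressed backups trustworthy: the restored copy is bit-identical to what was backed up, only the on-disk footprint shrinks.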