The term data compression refers to reducing the number of bits of information that must be stored or transmitted. Compression can be done with or without losing information: what is removed during compression is either redundant data or data deemed unneeded. When the data is subsequently uncompressed, in the first case the content and its quality are identical to the original, while in the second case the quality is lower. Different compression algorithms are better suited to different types of information. Compressing and uncompressing data usually takes considerable processing time, so the server carrying out the operation must have sufficient resources to process your information quickly enough. One simple example of compression is run-length encoding: instead of storing the individual 1s and 0s of the binary code, store how many consecutive positions hold a 1 and how many hold a 0.
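The run-length idea above can be sketched in a few lines of Python. This is a minimal illustration, not any particular product's algorithm: it turns a string of bits into (bit, count) pairs and expands them back.

```python
from itertools import groupby

def rle_encode(bits):
    """Run-length encode a string of '0'/'1' characters into (bit, count) pairs."""
    return [(bit, len(list(run))) for bit, run in groupby(bits)]

def rle_decode(pairs):
    """Reverse the encoding: expand each (bit, count) pair back into its run."""
    return "".join(bit * count for bit, count in pairs)

data = "111111110000001111"
encoded = rle_encode(data)
print(encoded)                       # [('1', 8), ('0', 6), ('1', 4)]
print(rle_decode(encoded) == data)   # True - nothing was lost
```

Because decoding restores the exact original string, this is a lossless scheme: storing three pairs instead of eighteen characters saves space only when the data contains long runs, which is why real algorithms are matched to the type of information they compress.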

Data Compression in Cloud Hosting

The compression algorithm we use on the cloud web hosting platform where your new cloud hosting account will be created is called LZ4, and it is applied by the state-of-the-art ZFS file system that powers the platform. LZ4 outperforms the algorithms other file systems employ, as its compression ratio is higher and it processes data considerably faster. The speed is most noticeable during decompression, which happens faster than data can be read from a hard disk. As a result, LZ4 improves the performance of every site stored on a server that uses the algorithm. We take advantage of LZ4 in one more way: its speed and compression ratio allow us to generate a number of daily backups of the entire content of all accounts and keep them for one month. Not only do these backups take less space, but generating them doesn't slow the servers down, as often happens with other file systems.
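The key property described above, that compressed data decompresses back to an exact copy while occupying less space, can be demonstrated with a short sketch. LZ4 itself requires a third-party Python binding, so this example uses the standard library's zlib purely as a stand-in to illustrate the same lossless roundtrip; the sample text and sizes are illustrative only.

```python
import zlib

# Repetitive data compresses well under any lossless algorithm.
original = b"The quick brown fox jumps over the lazy dog. " * 200

compressed = zlib.compress(original)   # shrink the byte stream
restored = zlib.decompress(compressed) # expand it back

print(len(original), len(compressed))  # the compressed copy is far smaller
print(restored == original)            # True - lossless: content is unchanged
```

The same principle is what makes compressed backups attractive: each copy takes less disk space, yet restoring one yields the account's content byte for byte.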