Data compression means reducing the number of bits of information that need to be stored or transmitted. Compression can be performed with or without the loss of information, so the data removed during the process is either redundant or unnecessary. When the data is uncompressed afterwards, in the first case it is identical to the original, while in the second case the quality is lower. Different compression algorithms work better for different types of data. Compressing and uncompressing data usually takes a lot of processing time, so the server performing the operation should have sufficient resources to process your data fast enough. One example of how information can be compressed is to store how many sequential positions contain a 1 and how many contain a 0 in the binary code, instead of storing the actual 1s and 0s.
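The counting idea described above is known as run-length encoding. The sketch below is a minimal illustration in Python; the function names are hypothetical, not part of any particular product.

```python
def rle_encode(bits):
    """Compress a string of '0'/'1' characters into (bit, run_length) pairs."""
    runs = []
    for bit in bits:
        if runs and runs[-1][0] == bit:
            # Same bit as the previous position: extend the current run.
            runs[-1] = (bit, runs[-1][1] + 1)
        else:
            # A different bit starts a new run of length 1.
            runs.append((bit, 1))
    return runs

def rle_decode(runs):
    """Reverse the encoding: expand each (bit, count) pair back into bits."""
    return "".join(bit * count for bit, count in runs)

data = "0000001111100000"
encoded = rle_encode(data)
print(encoded)  # [('0', 6), ('1', 5), ('0', 5)]

# Lossless: decoding restores the original exactly.
assert rle_decode(encoded) == data
```

Sixteen stored bits become three short pairs, and nothing is lost, which is why this kind of scheme works so well on data with long repeated stretches.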
Data Compression in Cloud Hosting
The ZFS file system that runs on our cloud hosting platform uses a compression algorithm called LZ4. It is considerably faster than other widely used algorithms, particularly at compressing and uncompressing non-binary data such as web content. LZ4 can even uncompress data faster than it can be read from a hard drive, which improves the performance of sites hosted on ZFS-based platforms. Because the algorithm compresses data well and does so very quickly, we are able to generate several backup copies of all the content kept in the cloud hosting accounts on our servers every day. Both your content and its backups take up less space, and since both ZFS and LZ4 work extremely fast, the backup generation does not affect the performance of the hosting servers where your content is kept.
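The compression ZFS applies is transparent and lossless, which you can illustrate from any scripting language. LZ4 itself is not in Python's standard library, so the sketch below uses the stdlib zlib module as a stand-in to show the same principle: repetitive web content shrinks substantially, and decompressing it recovers the original byte for byte.

```python
import zlib

# Hypothetical sample: a repetitive HTML fragment standing in for web content.
html = b"<div class='post'><p>Hello, world</p></div>" * 100

compressed = zlib.compress(html)
restored = zlib.decompress(compressed)

# Lossless compression: the original data is recovered exactly.
assert restored == html
print(f"original: {len(html)} bytes, compressed: {len(compressed)} bytes")
```

On ZFS this happens inside the file system itself, per block, so applications read and write plain files while the disk stores the compressed form.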