Data compression – why should you compress your backups?

With the growing amount of data that businesses create every day (or week, or month), an easy way to shrink the size of backups would go a long way toward better storage management. If only… But there is a fairly smart solution to this problem, and in this article we will talk about it. As you have probably guessed, today’s topic is backup compression and how it helps you build a better backup strategy.

What is data compression?

Data compression is one of the most efficient ways to reduce the size of backup copies. During this process, file size is reduced by encoding the data more efficiently. There are two main types of compression: lossless and lossy. As the name suggests, lossless compression lets you shrink data without losing the ability to recover the original. With lossy compression, part of the original data is permanently discarded in the process. Here we will focus only on lossless compression, as it is the option organizations prefer for backups.
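
To make the “lossless” part concrete, here is a minimal sketch using Python’s built-in zlib module (not the engine any particular backup product uses, and with made-up sample data): the decompressed output is byte-for-byte identical to the original input.

```python
import zlib

# Hypothetical, highly repetitive sample data standing in for a backup file.
original = b"Backup report 2023-01-01: status OK\n" * 1_000

compressed = zlib.compress(original)     # lossless encoding
restored = zlib.decompress(compressed)   # exact reconstruction

assert restored == original              # nothing was lost in the round trip
print(f"original: {len(original)} bytes, compressed: {len(compressed)} bytes")
```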

How does lossless compression work?

When we look closely at the data being backed up, it turns out that most of it is not random: repeating patterns occur throughout. Lossless compression exploits this redundancy extremely efficiently by encoding the repeated patterns more compactly, while still allowing accurate reconstruction of the data from the compressed file. But as you might have guessed, compression requires resources, and the main burden falls on the CPU and RAM. In a way, data compression is a trade-off: you save storage space and network bandwidth at the cost of processor time and memory.
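
A rough sketch of why this works, again using Python’s zlib (the sample data is invented for illustration): patterned data shrinks dramatically, random data barely shrinks at all, and both cost CPU time.

```python
import os
import time
import zlib

def ratio_and_time(data: bytes) -> tuple[float, float]:
    """Compress data and report the compression ratio and elapsed time."""
    start = time.perf_counter()
    compressed = zlib.compress(data, level=6)
    elapsed = time.perf_counter() - start
    return len(data) / len(compressed), elapsed

patterned = b"2023-01-01;backup;OK;server-01\n" * 100_000   # repetitive, log-like data
random_like = os.urandom(len(patterned))                     # no patterns to exploit

for name, payload in [("patterned", patterned), ("random", random_like)]:
    ratio, seconds = ratio_and_time(payload)
    print(f"{name}: ratio {ratio:.1f}x in {seconds * 1000:.1f} ms")
```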

Benefits of backup compression

To settle the question of whether you should compress your backups, we need to talk a little more about the specific benefits your company could gain.

Backup compression helps you save network bandwidth. If you work in a bigger company, your network is probably under a lot of… pressure. During backups, the created copies are sent to the backup server, which adds extra load to the network. By compressing your backups, you can greatly decrease their size and take some of that load off the network. There is also the fact that compressed data transfers much faster.
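
As a back-of-the-envelope illustration (the backup size, link speed, and compression ratio below are all hypothetical):

```python
# Hypothetical numbers: a 100 GB backup sent over a 1 Gbit/s link.
backup_gb = 100
link_gbit_per_s = 1
compression_ratio = 2.5  # assumed ratio; real savings depend on the data

def transfer_minutes(size_gb: float, gbit_per_s: float) -> float:
    """Transfer time in minutes: size in gigabits divided by link speed."""
    return (size_gb * 8) / gbit_per_s / 60

print(f"uncompressed: {transfer_minutes(backup_gb, link_gbit_per_s):.1f} min")
print(f"compressed:   {transfer_minutes(backup_gb / compression_ratio, link_gbit_per_s):.1f} min")
```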

The more measurable impact that backup compression has on your business is freeing up storage space. Keep in mind, though, that while you can significantly reduce the amount of storage you use, the savings depend on the type of data being compressed. Text documents and Excel sheets compress extremely well; music and movie formats, which are already compressed, not so much.
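
A quick sketch of that difference, again with zlib (here a zlib-compressed payload stands in for an already-compressed media file such as an MP3 or JPEG):

```python
import zlib

# Text-like data compresses well; data that is already compressed
# gains almost nothing from a second pass.
text_like = ("invoice;2023-05-01;ACME Corp;1200.00 EUR\n" * 50_000).encode()
already_compressed = zlib.compress(text_like)   # stands in for media formats

for name, payload in [("text-like", text_like), ("already compressed", already_compressed)]:
    second_pass = zlib.compress(payload)
    print(f"{name}: {len(payload)} -> {len(second_pass)} bytes")
```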

Still, by compressing your backups you can significantly decrease storage costs while saving a lot of your company network’s bandwidth.

Backup compression in Xopero ONE

Now it’s time to take a look at how Xopero ONE – which you can test free for 30 days – compresses data during the backup process.

Xopero ONE allows users to customize a backup plan the way they want, and that also applies to backup compression settings. Users can choose between two algorithms:

  • LZ4: a stream compression algorithm characterized by very high compression speed and even faster decompression.
  • ZStandard: a real-time compression algorithm that provides a high compression ratio. It offers a wide range of compression levels, so you can trade compression speed for a better ratio, while still keeping a fast decoder.

Additionally, users have three compression levels at their disposal (a rough sketch of the speed-versus-size trade-off follows the list):

  • Normal: the fastest backup, but at the cost of more storage. This option prioritizes the speed of the compression algorithm, so the storage space required for the backup will be larger than with the other levels.
  • Medium: the most balanced backup compression. This level is a compromise between compression speed and compression ratio.
  • High: this one takes longer but reduces the backup size the most. Users get the full benefit of the compression algorithm, keeping in mind that compression speed suffers in exchange for the better compression ratio.
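
Xopero ONE’s implementation is internal, but you can get a feel for the same trade-off with the open-source lz4 and zstandard Python packages (third-party libraries installed with pip, not part of the product; the sample data and levels are chosen only for illustration).

```python
import time

import lz4.frame    # pip install lz4
import zstandard    # pip install zstandard

# Log-like sample data; real backups will behave differently.
payload = b"2023-05-01 12:00:00 INFO backup of C:/Users completed\n" * 200_000

def measure(name: str, compress) -> None:
    """Time one compression pass and report the resulting size."""
    start = time.perf_counter()
    out = compress(payload)
    elapsed = time.perf_counter() - start
    print(f"{name:>13}: {len(payload)} -> {len(out)} bytes in {elapsed * 1000:.0f} ms")

measure("LZ4", lz4.frame.compress)                                     # very fast, lighter compression
measure("zstd level 3", zstandard.ZstdCompressor(level=3).compress)    # balanced speed and ratio
measure("zstd level 19", zstandard.ZstdCompressor(level=19).compress)  # slow, smallest output
```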