Sometimes when you want to store a backup or another large set of files online, or share them with someone else, you need a way to compress the files and split them into chunks of a fixed size. I ran into this recently when I wanted to store my backups online and the storage service had a cap of 100 MB per file. I found a really neat solution based on the tar and split commands. Using this method I split my backup of about 1 GB into ten chunks of 100 MB each, with incremental filenames.
The 1 GB file I wanted to split was called dbbackup.db. Here’s the command I ran to create multiple tar files of 100 MB each out of it:
# tar -cf - dbbackup.db | split -b 100M - db_backup.tar
This command took a while to run. Once it finished, I was left with ten files of 100 MB each, named db_backup.taraa, db_backup.tarab, db_backup.tarac, and so on. The `-` tells tar to write the archive to standard output, and split reads it from standard input (the second `-`), cutting the stream into 100 MB pieces with the prefix db_backup.tar.
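If you want to see the mechanics without waiting on a 1 GB backup, here is a minimal sketch of the same split step. It uses a small dummy file and 1 MB chunks purely for illustration; the filenames mirror the ones above, but the sizes and the temporary directory are assumptions for the demo.

```shell
set -e
workdir=$(mktemp -d)
cd "$workdir"

# Stand-in for the real dbbackup.db: 3 MB of zeros
dd if=/dev/zero of=dbbackup.db bs=1M count=3 2>/dev/null

# Archive to stdout, then split the stream into 1 MB pieces
tar -cf - dbbackup.db | split -b 1M - db_backup.tar

# Resulting pieces: db_backup.taraa, db_backup.tarab, ...
ls db_backup.tar*
```

The tar stream is slightly larger than the file itself (headers plus block padding), so a 3 MB file produces four 1 MB-or-smaller pieces rather than three.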
Now I can copy these files to my external storage or ship them with ease. To stitch the 1 GB file back together, all I need to do is run the following command:
# cat db_backup.tara* | tar -xf -
And voila, I get my original file again.
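A quick way to convince yourself the round trip is lossless is to compare a checksum before and after. This sketch reuses the same dummy-file setup as above (the temporary directory and 1 MB chunk size are illustrative assumptions, not part of the original workflow):

```shell
set -e
workdir=$(mktemp -d)
cd "$workdir"

# Create a random dummy file and record its checksum
dd if=/dev/urandom of=dbbackup.db bs=1M count=2 2>/dev/null
before=$(cksum < dbbackup.db)

# Split the archive stream into pieces
tar -cf - dbbackup.db | split -b 1M - db_backup.tar

# Reassemble and extract in a separate directory
mkdir restore && cd restore
cat ../db_backup.tar* | tar -xf -

# The extracted file should checksum identically to the original
after=$(cksum < dbbackup.db)
[ "$before" = "$after" ] && echo "checksums match"
```

Because cat concatenates the pieces in shell-glob (alphabetical) order, the aa, ab, ac suffixes that split generates guarantee the stream is reassembled in the right sequence.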