
Possibility of choosing zip compression method #1718

Merged · 7 commits into spatie:main · Oct 17, 2023

Conversation

boryn
Contributor

@boryn boryn commented Oct 2, 2023

This PR adds the possibility of choosing the zip compression method and its level through these config options:

  • ['destination']['compression_method']
  • ['destination']['compression_level']

Now it's possible to select a specific compression method, such as ZipArchive::CM_DEFLATE or ZipArchive::CM_XZ, or to skip compression entirely with ZipArchive::CM_STORE.
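For illustration, a minimal sketch of how these options might look in config/backup.php (the key names are the ones added by this PR; the values and the abbreviated nesting are just examples):

```php
// config/backup.php (fragment, nesting abbreviated)
'destination' => [
    // ZipArchive compression method, e.g. CM_DEFLATE (standard zip), CM_STORE (none)
    'compression_method' => ZipArchive::CM_DEFLATE,

    // compression level passed to ZipArchive (for deflate: 1 = fastest, 9 = smallest)
    'compression_level' => 9,
],
```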

This makes it possible to experiment and choose the optimal compression for one's needs, balancing memory/CPU usage, run time, and final file size.

Since I back up only DB dumps, I have decided to compress the .sql directly with GzipCompressor::class and not compress the .zip archive, which gives me lower CPU usage and a still acceptable file size.

@boryn
Contributor Author

boryn commented Oct 2, 2023

@freekmurze, I've added the 'can use different compression methods for backup file' test.

@freekmurze
Member

Seems like the tests are failing, could you look at that?

@erikn69
Contributor

erikn69 commented Oct 10, 2023

@boryn
Contributor Author

boryn commented Oct 10, 2023

@erikn69 Thank you!

PS. Why might $nameInZip not be present at all?

@erikn69
Contributor

erikn69 commented Oct 10, 2023

$filepath is used if $entryname is empty
https://www.php.net/manual/en/ziparchive.addfile.php

public ZipArchive::addFile(
    string $filepath,
    string $entryname = "",
    int $start = 0,
    int $length = 0,
    int $flags = ZipArchive::FL_OVERWRITE
): bool
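To illustrate the fallback behaviour, a hypothetical standalone sketch (not code from this PR; paths are made up):

```php
$zip = new ZipArchive();
$zip->open('/tmp/example.zip', ZipArchive::CREATE | ZipArchive::OVERWRITE);

// entry name given: stored inside the zip as 'db/dump.sql'
$zip->addFile('/var/dumps/dump.sql', 'db/dump.sql');

// entry name omitted: the full path '/var/dumps/dump.sql' becomes the entry name
$zip->addFile('/var/dumps/dump.sql');

$zip->close();
```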

(inline review comments on src/Tasks/Backup/Zip.php — resolved)
@boryn
Contributor Author

boryn commented Oct 12, 2023

Ok, @freekmurze, I have fixed this issue.

It was connected to not using $nameInZip. Compression now works properly with or without the $nameInZip parameter.

@freekmurze freekmurze merged commit 64f4b81 into spatie:main Oct 17, 2023
5 checks passed
@freekmurze
Member

Thanks!

@siarheipashkevich
Contributor

siarheipashkevich commented Oct 17, 2023

Hi @boryn, thanks for your PRs. Could you please suggest which type of compression is best when backing up only a DB dump?

Server with database:
MySQL 8.0.34
[screenshot: database server specs]

Server with app where the backup runs:
PHP 8.2.11
[screenshot: app server specs]

Right now I have scheduler:
$schedule->command('backup:run --only-db')->dailyAt('04:30');

and it takes almost 1 hour (app server):

[screenshot from 2023-10-17 21:01: app server CPU usage]

database server:
[screenshot: database server CPU usage]

The archive is 3.67 GB and mysql-forge.sql is ~30 GB.

The backup.php config is the default, but I added one option to the database.php config:
[screenshot: database.php option]

Is there a way to decrease the dump time, reduce the size, or improve anything else?

@boryn
Contributor Author

boryn commented Oct 17, 2023

Hi @siarheipashkevich!

For a DB-dump-only backup, I compress with gzip directly at dump time, so in backup.php I have:
'database_dump_compressor' => Spatie\DbDumper\Compressors\GzipCompressor::class,
(the default compression level for gzip is 6, which for me is a good compromise between speed and compression ratio)

and I don't use any ZIP compression on top, so I also have:
'compression_method' => ZipArchive::CM_STORE,

This configuration gives me an acceptable compression ratio without hogging the CPU.

It really depends on what you need. For better ZIP compression (smaller files, but definitely more compression time, memory, and CPU) you may try ZipArchive::CM_XZ or ZipArchive::CM_BZIP2, or experiment with CM_LZMA or CM_LZ77.
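Putting the two settings together, the relevant backup.php fragment from my setup would look roughly like this (a sketch using the keys from this PR; nesting abbreviated):

```php
// config/backup.php (fragment, nesting abbreviated)

// compress the .sql dump itself with gzip at dump time
'database_dump_compressor' => Spatie\DbDumper\Compressors\GzipCompressor::class,

// ...the .sql is already gzipped, so just store it in the zip without recompressing
'compression_method' => ZipArchive::CM_STORE,
```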

@siarheipashkevich
Contributor

Is this what you suggest?
[screenshot: proposed config]

I just want to be able to back up my DB every day, restore the backup using mysqldump if something goes wrong, and not affect users while the backup runs. I don't have any size limitations.

@boryn
Contributor Author

boryn commented Oct 17, 2023

Yes, though compression_level for zip is irrelevant when using ZipArchive::CM_STORE, which just stores files in the zip without compression.

If you want to play with the gzip compression level, you'd need to extend Spatie\DbDumper\Compressors\GzipCompressor::class with your own class (the class name below is just an example) and override:

class Gzip3Compressor extends Spatie\DbDumper\Compressors\GzipCompressor
{
    public function useCommand(): string
    {
        return 'gzip -3';
    }
}

and you can try different compression levels by changing the gzip flag: -1, -2, -3, -6, -9, etc.
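To get a feel for the speed/size trade-off before touching the PHP side, you can compare gzip levels directly on a sample file (a generic shell sketch, unrelated to this PR's code):

```shell
# build a compressible sample file of repeated log-like lines
yes "2023-10-17 12:00:00 INFO some moderately compressible log line" | head -n 25000 > sample.txt

# compress at the fastest and the best level
gzip -1 -c sample.txt > fast.gz
gzip -9 -c sample.txt > best.gz

# compare sizes; both should be far smaller than the input
wc -c sample.txt fast.gz best.gz
```

Prefixing either command with `time` shows how much longer the higher level takes on your data.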

My tests show that using gzip at its default compression level (6) does not hog the CPU, and users can easily use the system. Gzip should also be quicker than the default zip options with compression at level 9.

@siarheipashkevich
Contributor

@boryn thanks, I trust your configuration 😄

Sorry for the stupid question: is this OK, and does it match your recommendation?
[screenshot: config]

@boryn
Contributor Author

boryn commented Oct 17, 2023

Yes, but please run the backup on the development / local machine first :)

Let us know how long it took and what the CPU usage looked like.

PS. You may even remove the compression_level key

@siarheipashkevich
Contributor

@boryn

app server:
[screenshot: app server CPU usage]

database server:
[screenshot: database server CPU usage]

As I can see, CPU usage is now higher on the app server.

@boryn
Contributor Author

boryn commented Oct 17, 2023

Yes, interesting. Maybe experiment with extending the gzip class and using a level like -3?

@siarheipashkevich
Contributor

siarheipashkevich commented Oct 18, 2023

Hi @boryn, here are screenshots from my real production:

database server:
[screenshot: database server CPU usage]

app server:
[screenshot: app server CPU usage]

I see a CPU usage spike on the app server, but I don't think it's critical.
The run time also decreased by half. Thanks for your recommendations 👍

Do you have any experience tuning MySQL or setting up replication for a production database?

@boryn
Contributor Author

boryn commented Oct 26, 2023

Hi @siarheipashkevich

It's great that the time was cut in half! Unfortunately, I have no experience with MySQL replication.
