Hi all,
I have the problem that the archiver (run via cron or manually on the CLI) always runs out of memory. I searched the forum and found many similar topics, but where they were solved, the solutions didn't work for me. So here I am:
Hardware:
- AMD Ryzen 5 3600 (6/12 Core)
- 64GB RAM
- 512 GB SSD
- 16GB Swap
Software:
- Debian 11.4
- Matomo 4.11.0
- MariaDB 10.5.15-MariaDB-0+deb11u1
- PHP 7.4.30 (CLI) 8.0.22 (Web)
- Matomo DB 15.3GB
I set the memory limit via `/cli/php.ini` or via `--php-cli-options` to many different values (8, 16, 20, 24, 32, 40, 42, 48, 56 GB). PHP 7.4 had no `memory_limit` at all, and the archiving processes consumed all RAM and swap until the OOM killer visited. I tried both PHP versions with different php.ini files in combination with those different `memory_limit` values.
At the moment I track four websites: three very small ones without any problems (and no segments), and one website with a few thousand visitors per day (currently two segments). The memory issue seems to occur when archiving some days. Right now I run the archiving command manually with `--force-periods=day --concurrent-requests-per-website=1` set (I also tried small ranges via `--force-date-range`). In the error message `Allowed memory size of 34359738368 bytes exhausted (tried to allocate 29105160768 bytes)`, the "tried to allocate" value is always different, no matter how high the `memory_limit` is.
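For reference, the two byte counts in that error message convert to GiB like this (plain awk arithmetic, nothing Matomo-specific):

```shell
# Convert the byte counts from the PHP error message into GiB
awk 'BEGIN {
  limit = 34359738368;   # "Allowed memory size" (the memory_limit in effect)
  tried = 29105160768;   # "tried to allocate" (the failing single allocation)
  printf "limit: %.1f GiB, tried: %.1f GiB\n", limit / 2^30, tried / 2^30
}'
# → limit: 32.0 GiB, tried: 27.1 GiB
```

So with a 32 GiB limit, a single allocation of roughly 27 GiB fails, which is why raising the limit only moves the numbers around.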
What is using so much RAM, even though the entire database fits more than twice into the `memory_limit`?
Can I somehow estimate how much RAM is needed for the archiving to be successful?
Does someone have a similar setup with this problem (and maybe a solution)?
If more information is needed, I will provide that.
Hopefully someone can help me, please!