Hi, we are using Fabrik to display some tables on a public-facing website. We noticed that when bots come to index our site, the number of sessions balloons (not a Fabrik issue), but we also noticed that those sessions tend to grow larger and larger, up to several megabytes, effectively DDoSing our session storage.
We decoded a few of those session vars and noticed that the bulk of their size is due to the Fabrik head script cache, i.e.:
https://github.com/Fabrik/fabrik/blob/master/plugins/system/fabrik/fabrik.php#L217-L251
The issue is that there is no upper bound on the number of URLs whose scripts get cached. Ideally, only a handful of cache entries should be kept in the session; the latest 10 or 20 would be plenty (see the sketch below), but on our site we easily reach 900+ cached URLs per session!
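Something along these lines is what I have in mind — a minimal sketch, not the actual Fabrik code: the session key `fabrik.js.head.cache`, the helper name `cacheHeadScripts`, and the cap of 20 are made up for illustration; only `JFactory::getSession()` / `get()` / `set()` are the real Joomla 3 session API.

```php
<?php
// Sketch only: cap the number of per-URL head-script cache entries kept in
// the Joomla session. Key name, helper name and cap are hypothetical; the
// real caching code lives in plugins/system/fabrik/fabrik.php.

const FABRIK_HEAD_CACHE_MAX = 20; // keep only the most recent N URLs

/**
 * Store the head scripts for the current URI, evicting the oldest
 * entries once the cap is exceeded (simple FIFO eviction).
 */
function cacheHeadScripts($uri, array $scripts)
{
	$session = JFactory::getSession();
	$cache   = $session->get('fabrik.js.head.cache', array());

	// Re-inserting an existing URI moves it to the "newest" end,
	// since PHP arrays preserve insertion order.
	unset($cache[$uri]);
	$cache[$uri] = $scripts;

	// Drop the oldest entries until we are back under the cap.
	while (count($cache) > FABRIK_HEAD_CACHE_MAX)
	{
		array_shift($cache);
	}

	$session->set('fabrik.js.head.cache', $cache);
}
```

With something like this, even a bot hammering thousands of distinct URLs in one session would only ever keep the last 20 entries in session storage.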
The problem becomes especially visible when bots show up, since they do not throttle their requests the way a human would and can easily submit thousands of requests in a single session.
Any idea about how to fix this problem?
Thanks in advance
Yep, I know. Unfortunately we'll have to keep Joomla 3 for the foreseeable future (not my choice). Would Fabrik for J!4 be affected by this issue as of now?