Cache locks are the main protection against the dogpile effect.
For the core functionality (independent of any particular backend implementation), I propose the following simple structure.
Use something like a HashMap<&str, AsyncRwLock<CachedValue>>,
where the key is the cache key of the current request and AsyncRwLock is an async version of the classic RwLock (or some lightweight pub-sub mechanism such as tokio watch or oneshot channels).
How it should work:
1. The first request with cache key A creates a record in the hash table and sends the request to Upstream.
2. Subsequent requests with cache key A find the existing record in the hash table and subscribe to value changes (or asynchronously wait for them).
3. Once the first request is resolved by Upstream, the cache publishes the result to all waiting consumers and removes the record from the HashMap.
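The steps above can be sketched as a per-key in-flight table. This is only an illustrative sketch, not the crate's actual code: for simplicity it uses std's Mutex/Condvar as a synchronous stand-in for the async wait (the real implementation would use tokio watch or oneshot channels), and the names (Slot, Coalescer, get_or_fetch) are made up here.

```rust
use std::collections::HashMap;
use std::sync::{Arc, Condvar, Mutex};

type Value = String;

// One slot per in-flight cache key: the leader publishes into `done`,
// waiters block on the Condvar until it is filled.
struct Slot {
    done: Mutex<Option<Value>>,
    cv: Condvar,
}

pub struct Coalescer {
    inflight: Mutex<HashMap<String, Arc<Slot>>>,
}

impl Coalescer {
    pub fn new() -> Self {
        Coalescer { inflight: Mutex::new(HashMap::new()) }
    }

    pub fn get_or_fetch(&self, key: &str, fetch: impl FnOnce() -> Value) -> Value {
        // Step 1/2: either create a record (leader) or find the existing one.
        let (slot, leader) = {
            let mut map = self.inflight.lock().unwrap();
            match map.get(key) {
                Some(slot) => (slot.clone(), false),
                None => {
                    let slot = Arc::new(Slot {
                        done: Mutex::new(None),
                        cv: Condvar::new(),
                    });
                    map.insert(key.to_string(), slot.clone());
                    (slot, true)
                }
            }
        };
        if leader {
            // Step 3 (leader): go to Upstream, publish to waiters, remove the record.
            let value = fetch();
            *slot.done.lock().unwrap() = Some(value.clone());
            slot.cv.notify_all();
            self.inflight.lock().unwrap().remove(key);
            value
        } else {
            // Step 2 (waiters): wait until the leader publishes the result.
            let mut done = slot.done.lock().unwrap();
            while done.is_none() {
                done = slot.cv.wait(done).unwrap();
            }
            (*done).clone().unwrap()
        }
    }
}
```

With this shape, N concurrent requests for key A produce a single call to Upstream; a real version would also check the cache itself before consulting the in-flight table.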
We should extend the Backend trait with cache-lock-related methods that have default implementations.
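A rough sketch of what that trait extension could look like; the method names (lock, release), the LockStatus type, and the signatures are assumptions for illustration, not the crate's actual API. The defaults always "acquire", so existing backends keep working unchanged.

```rust
use std::time::Duration;

#[derive(Debug, PartialEq)]
pub enum LockStatus {
    Acquired,
    AlreadyLocked,
}

pub trait Backend {
    // ... existing cache get/set methods would live here ...

    /// Try to take the dogpile lock for `key`. Default implementation
    /// always acquires, so backends without lock support still compile
    /// and behave as before (no dogpile protection).
    fn lock(&self, _key: &str, _ttl: Duration) -> LockStatus {
        LockStatus::Acquired
    }

    /// Release the lock for `key`. Default is a no-op.
    fn release(&self, _key: &str) {}
}
```

A backend that can enforce locks (e.g. Redis via SET NX) would override both methods; an in-memory backend could keep the defaults.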
I think this implementation should also integrate with the stale-cache mechanic: return stale data while the lock is not yet released (if the stale cache is enabled).
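The interaction with the stale cache could be reduced to a small decision table; again a hypothetical sketch with made-up names (CacheEntry, Action, decide), just to pin down the intended behavior: a holder of the lock refreshes from Upstream, while other requests serve stale data instead of waiting, when stale is enabled.

```rust
#[derive(Debug, PartialEq)]
enum CacheEntry {
    Fresh(String),
    Stale(String),
    Miss,
}

#[derive(Debug, PartialEq)]
enum Action {
    Serve(String),   // answer from cache immediately
    FetchUpstream,   // we hold the lock: go to Upstream
    WaitForLock,     // subscribe and wait for the lock holder's result
}

fn decide(entry: CacheEntry, lock_acquired: bool, stale_enabled: bool) -> Action {
    match (entry, lock_acquired) {
        // Fresh data short-circuits everything.
        (CacheEntry::Fresh(v), _) => Action::Serve(v),
        // We won the lock: refresh from Upstream.
        (_, true) => Action::FetchUpstream,
        // Lock held elsewhere, but stale data is acceptable: serve it.
        (CacheEntry::Stale(v), false) if stale_enabled => Action::Serve(v),
        // Otherwise wait for the lock holder to publish.
        _ => Action::WaitForLock,
    }
}
```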