Allow the use of ReentrantLock instead of SpinLock #39
Comments
Can you explain your use case more: why do you want to hold the lock over multiple gets/sets?
ReentrantLock seems preferred in general over SpinLock reading through the Julia issue tracker / PRs, so it would make sense to me to switch. But I assume @Jutho had a reason for choosing SpinLock in the first place. Maybe we can try to do some benchmarking to see what the tradeoffs are.
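A quick way to get such numbers is a contention microbenchmark like the one below; the `hammer` helper and the iteration counts are my own illustrative assumptions, not LRUCache.jl code:

```julia
using Base.Threads

# Illustrative helper (hypothetical, not part of LRUCache.jl): increment a
# shared counter n times, guarding each increment with the given lock.
function hammer(l, n)
    x = Ref(0)
    @threads for i in 1:n
        lock(l)
        try
            x[] += 1
        finally
            unlock(l)
        end
    end
    return x[]
end

n = 100_000
# SpinLock busy-waits: fast for very short critical sections, but it wastes
# CPU when the holder is descheduled. ReentrantLock puts waiters to sleep
# and also allows nested acquisition from the same task.
@time hammer(Threads.SpinLock(), n)
@time hammer(ReentrantLock(), n)
```

Run with several values of `JULIA_NUM_THREADS` to see how each lock degrades under contention.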
I think I chose the SpinLock initially; in any case, I am interested in the use case here.
Please excuse the delayed reply, I missed this thread. Regarding the question of what the use case is for taking a lock over multiple gets/sets: one simple example is where one would like to implement a method which acts as

```julia
function get_or_new(cache::LRU{K,V}, key::K, args...) where {K,V}
    lock(cache) do
        if haskey(cache, key)
            cache[key]
        else
            created = V(args...)
            cache[key] = created
            return created
        end
    end
end
```

Now such user-level locks become easiest if one uses a ReentrantLock.
I think this specific API is provided already: https://github.com/JuliaCollections/LRUCache.jl#getdefaultcallable-lrulru-key
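For reference, that `get!(callable, lru, key)` API follows the same compute-on-miss-under-one-lock pattern. A minimal stdlib-only sketch of the idea, using a hypothetical `LockedDict` type rather than the actual `LRU`:

```julia
# Hypothetical LockedDict, not part of LRUCache.jl: a Dict guarded by one
# ReentrantLock so the haskey/insert sequence inside get! is atomic.
struct LockedDict{K,V}
    data::Dict{K,V}
    lock::ReentrantLock
end
LockedDict{K,V}() where {K,V} = LockedDict{K,V}(Dict{K,V}(), ReentrantLock())

# Mirror Base.get!(f, dict, key): compute the default only on a miss,
# entirely inside a single critical section.
function Base.get!(f, d::LockedDict{K,V}, key::K) where {K,V}
    lock(d.lock) do
        get!(f, d.data, key)
    end
end

d = LockedDict{String,Int}()
x = get!(() -> 41 + 1, d, "answer")   # miss: computes and stores 42
y = get!(() -> 0, d, "answer")        # hit: returns the cached 42
```

Because check and insert happen under one lock, two tasks racing on the same key cannot both run the expensive default computation and clobber each other.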
Thanks, I missed that. In any case, I think it might be worth considering changing that lock in case performance gains can be made.
In the current implementation, if a user needs to hold the cache's lock over multiple gets/sets, one cannot use the cache's internal lock, since it is a SpinLock and is not reentrant: re-acquiring it from the same task blocks forever.

Suggestion: use a ReentrantLock, or give the user the option to use such a lock instead of a SpinLock. Moreover, implement lock (likewise for unlock) on the cache to allow users to take that lock.