[Polyfill] Fix - Deduplicate watched signals #191
base: main
Conversation
Thanks for fixing this important issue. Let’s make sure to optimize the data structures used before landing though.
node.dirty = false; // Give the watcher a chance to trigger again
const prev = setActiveConsumer(node);

const producerNodeSet = new Set(node.producerNode);
Seems a bit expensive to reconstruct this set on each watch call. Is there any way to reduce this cost, whether by maintaining the set across several method calls, or somehow avoiding constructing a set and using the producerNode data structure in some other way instead?
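One possible reading of "maintaining the set across several method calls" is caching a companion Set next to producerNode on the watcher node, so it is built once and updated incrementally. The producerNodeSet field and the helper below are hypothetical sketches, not part of the polyfill:

```ts
// Hypothetical node shapes; the real graph types live in the polyfill internals.
type SignalNode = object;

interface WatcherNode {
  producerNode: SignalNode[];        // existing array maintained by the graph
  producerNodeSet?: Set<SignalNode>; // hypothetical cached companion set
}

// Returns true when the producer is new and should be passed to producerAccessed,
// keeping the cached set in sync so it is never rebuilt from scratch.
function shouldTrack(node: WatcherNode, producer: SignalNode): boolean {
  node.producerNodeSet ??= new Set(node.producerNode);
  if (node.producerNodeSet.has(producer)) return false;
  node.producerNodeSet.add(producer);
  return true;
}
```

The catch is that producerNode is also pruned elsewhere in the graph, so any cached set would have to be invalidated or updated at those points as well.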
+1 - the current implementation is specifically the result of profiling / perf optimizations and the observation that arrays are much faster collections as compared to sets / maps.
... are much faster collections as compared to sets / maps.
Makes sense. Does that generally include traversing an array though? If so, we can just filter out duplicates first instead of constructing a set. What do you think?
Does that generally include traversing an array though? If so, we can just filter out duplicates first instead of constructing a set. What do you think?
Traversing isn't a problem. Filtering would be, though, as it would require creating a new collection or shifting elements in an array. More importantly, how would you eliminate duplicates without a set or sorting?
Something like signals.filter(s => !node.producerNode.includes(s)).forEach(producerAccessed) should work (or even just a for loop, without creating a new array with the filter). We are at O(n·m) though, since we "traverse" producerNode for each signal.
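A sketch of the loop variant mentioned above, under the same assumptions (producerNode holds the already-tracked producers and producerAccessed registers a new one); it keeps the O(n·m) scan but avoids allocating the intermediate array that filter would create:

```ts
// Hypothetical loop form of the .includes-based dedup check; parameter shapes
// are illustrative, not the polyfill's real types.
function watchDeduped(
  node: { producerNode: object[] },
  signals: object[],
  producerAccessed: (signal: object) => void,
): void {
  for (const signal of signals) {
    // Array.prototype.includes is a linear scan, hence O(n·m) overall,
    // but no temporary collection is constructed.
    if (!node.producerNode.includes(signal)) {
      producerAccessed(signal);
    }
  }
}
```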
Updated in 0684e8a to use .includes instead. It's not the ideal option perf-wise though. If we want better performance, I think we should rethink the producerNode data structure altogether. @littledan @pkozlowski-opensource what do you think?
I'd expect it to be a bit faster if we checked the signal's consumers rather than the other way around, right? But it'd be nice if we had actual benchmarks.
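A sketch of what checking from the signal's side might look like, assuming each producer node keeps a liveConsumerNode array of its live consumers (as in the Angular-derived graph this polyfill builds on); a watched signal typically has far fewer consumers than a watcher has producers, but as noted, real benchmarks would be needed:

```ts
// Hypothetical producer-side membership test; field names are assumptions
// about the graph internals, not a confirmed polyfill API.
interface ReactiveNode {
  producerNode?: ReactiveNode[];     // producers this node depends on
  liveConsumerNode?: ReactiveNode[]; // live consumers of this producer (assumed)
}

function alreadyWatchedBy(signalNode: ReactiveNode, watcherNode: ReactiveNode): boolean {
  return signalNode.liveConsumerNode?.includes(watcherNode) ?? false;
}
```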
The polyfill has moved to https://github.com/proposal-signals/signal-polyfill so let's pick up this PR from there.
This PR skips calling producerAccessed if the signal is already tracked.