Improve database model conversion performance by caching extra data and keeping row cache #3534
@@ -222,11 +222,6 @@ class DatabaseContainer: NSPersistentContainer, @unchecked Sendable {
            try actions(self.writableContext)
            FetchCache.clear()

-           // Refresh the state by merging persistent state and local state for avoiding optimistic locking failure
-           for object in self.writableContext.updatedObjects {
-               self.writableContext.refresh(object, mergeChanges: true)
-           }
-
Comment on lines -225 to -229:

Reviewer: Can you explain why this was needed before, and why it is not needed now? 🤔 What are the chances of this producing regressions? 🤔

Reviewer: Isn't there a chance that Data will be missing or the FRC won't trigger updates?

Reviewer: Yes, we need more details on why this was removed.

Author: I looked into the history of this, and it goes like this:
// If you touch a ManagedObject and update one of its properties to the same value,
// the object will be marked as `updated` even though it hasn't changed.
// By resetting such objects we remove updates that are not real updates.
for object in self.writableContext.updatedObjects {
    if object.changedValues().isEmpty {
        self.writableContext.refresh(object, mergeChanges: false)
    }
}

refresh(_:mergeChanges:) with false = "Updates the persistent properties of a managed object to use the latest values from the persistent store. If flag is false, the context discards pending changes and the managed object becomes a fault. Upon next access, the context reloads the object's values from the persistent store or last cached state."
if let currentUser = currentUser, !currentUser.hasChanges {
    let fakeNewUnread = currentUser.unreadChannelsCount
    currentUser.unreadChannelsCount = fakeNewUnread
}

Since change number 2, some things have happened: we do not use viewContext anymore (except for reading the state in DataStore; DB observers do not use it anymore), and all the DB mutations go through the …

Reviewer: Didn't we also introduce a state layer context? Will this have an impact on it, have you checked that?

Author: For the state layer, with a similar test, the change is 1.60 s to 0.77 s (as expected).
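For readers less familiar with that setup, here is a minimal sketch of the single-writer pattern being described, using placeholder names rather than the SDK's actual DatabaseContainer API:

import CoreData

// Every mutation funnels through one background context, so there is no second
// writing context whose stale snapshot could cause an optimistic-locking
// failure on save. Other contexts only read and merge saved changes.
final class SingleWriterStore {
    private let writableContext: NSManagedObjectContext

    init(container: NSPersistentContainer) {
        writableContext = container.newBackgroundContext()
        writableContext.automaticallyMergesChangesFromParent = true
    }

    func write(_ actions: @escaping (NSManagedObjectContext) throws -> Void) {
        let context = writableContext
        context.perform {
            do {
                try actions(context)
                if context.hasChanges {
                    try context.save()
                }
            } catch {
                // Discard the failed batch so the next write starts clean.
                context.reset()
            }
        }
    }
}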
            if self.writableContext.hasChanges {
                log.debug("Context has changes. Saving.", subsystems: .database)
                try self.writableContext.save()
@@ -9,6 +9,7 @@ import Foundation
final class StreamJSONDecoder: JSONDecoder, @unchecked Sendable {
    let iso8601formatter: ISO8601DateFormatter
    let dateCache: NSCache<NSString, NSDate>
+   let rawJSONCache: RawJSONCache

    override convenience init() {
        let iso8601formatter = ISO8601DateFormatter()
@@ -17,12 +18,21 @@ final class StreamJSONDecoder: JSONDecoder, @unchecked Sendable {
        let dateCache = NSCache<NSString, NSDate>()
        dateCache.countLimit = 5000 // We cache at most 5000 dates, which gives good enough performance

-       self.init(dateFormatter: iso8601formatter, dateCache: dateCache)
+       self.init(
+           dateFormatter: iso8601formatter,
+           dateCache: dateCache,
+           rawJSONCache: RawJSONCache(countLimit: 500)
Reviewer: In the performance benchmark table we are not checking memory usage. Is there any significant increase in memory from caching all the extra data?

Reviewer: I don't expect it, but maybe we should clean it up on memory warning notifications, just in case?

Author: Internally it is an NSCache, which clears itself on a low-memory warning. I used a wrapping class because of unit tests: NSCache can't be mocked from the Swift side (it is an Objective-C generic class, which brings restrictions).
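To illustrate the testability point, here is a minimal sketch of the kind of spy such a wrapper enables, assuming a test target with @testable access to the SDK module; the spy type and its recorded properties are made up for illustration, not the SDK's actual test doubles:

@testable import StreamChat // assumed module name

// RawJSONCache is a plain Swift class wrapping NSCache, so a test double can
// subclass it and record accesses. NSCache itself is an Objective-C generic
// class and cannot be subclassed this cleanly from Swift.
final class RawJSONCache_Spy: StreamJSONDecoder.RawJSONCache {
    private(set) var readKeys = [Int]()
    private(set) var writtenKeys = [Int]()

    override func rawJSON(forKey key: Int) -> [String: RawJSON]? {
        readKeys.append(key)
        return super.rawJSON(forKey: key)
    }

    override func setRawJSON(_ value: [String: RawJSON], forKey key: Int) {
        writtenKeys.append(key)
        super.setRawJSON(value, forKey: key)
    }
}

A test could then pass RawJSONCache_Spy(countLimit: 500) to the initializer added in this diff and assert on readKeys and writtenKeys.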
+       )
    }

-   init(dateFormatter: ISO8601DateFormatter, dateCache: NSCache<NSString, NSDate>) {
+   init(
+       dateFormatter: ISO8601DateFormatter,
+       dateCache: NSCache<NSString, NSDate>,
+       rawJSONCache: RawJSONCache
+   ) {
        iso8601formatter = dateFormatter
        self.dateCache = dateCache
+       self.rawJSONCache = rawJSONCache

        super.init()
@@ -50,6 +60,49 @@ final class StreamJSONDecoder: JSONDecoder, @unchecked Sendable {
    }
}

+extension StreamJSONDecoder {
+   class RawJSONCache {
+       private let storage: NSCache<NSNumber, BoxedRawJSON>
+
+       init(countLimit: Int) {
+           storage = NSCache()
+           storage.countLimit = countLimit
+       }
+
+       func rawJSON(forKey key: Int) -> [String: RawJSON]? {
+           storage.object(forKey: key as NSNumber)?.value
+       }
+
+       func setRawJSON(_ value: [String: RawJSON], forKey key: Int) {
+           storage.setObject(BoxedRawJSON(value: value), forKey: key as NSNumber)
+       }
+
+       final class BoxedRawJSON {
+           let value: [String: RawJSON]
+
+           init(value: [String: RawJSON]) {
+               self.value = value
+           }
+       }
+   }
+
+   /// A convenience method returning a decoded RawJSON dictionary with caching enabled.
+   ///
+   /// Extra data stored in models can be large, which can significantly slow
+   /// down DTO to model conversions. This function is a convenient way of
+   /// caching some of that data during DTO to model conversions.
+   func decodeCachedRawJSON(from data: Data?) throws -> [String: RawJSON] {
+       guard let data, !data.isEmpty else { return [:] }
+       let key = data.hashValue
+       if let value = rawJSONCache.rawJSON(forKey: key) {
+           return value
+       }
+       let rawJSON = try decode([String: RawJSON].self, from: data)
+       rawJSONCache.setRawJSON(rawJSON, forKey: key)
+       return rawJSON
+   }
+}

extension JSONDecoder {
    /// A default `JSONDecoder`.
    static let `default`: JSONDecoder = stream
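For reference, a quick sketch of how the new helper behaves when called from code that has access to the internal StreamJSONDecoder type; the payload below is made up:

let decoder = StreamJSONDecoder()
let payload = Data(#"{"favorite_color":"blue"}"#.utf8)

// The first call decodes the data and caches the result keyed by `payload.hashValue`.
let first = try decoder.decodeCachedRawJSON(from: payload)

// A later call with byte-identical data returns the cached dictionary without decoding again.
let second = try decoder.decodeCachedRawJSON(from: payload)

// nil or empty data short-circuits to an empty dictionary.
let empty = try decoder.decodeCachedRawJSON(from: nil)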
Reviewer: Another interesting thing that we could try to improve is that ChatMember and ChatUser are both classes. I think we should be able to convert them to structs without API changes. It could be interesting to see whether there are any performance improvements from doing so 🤔

Author: Why do you think structs would improve performance? Probably only the SDK size would increase.

Author: ChatUser.init itself is visible in the time profiler, but it is about 1 ms, which is nothing compared to the other things happening. At some point it would be nice to use only structs for models, for consistency.

Reviewer: Usually structs are lightweight to create and destroy, so it should improve things, though maybe not significantly. @laevandus We not only need to check the init, but also how much time is spent destroying existing objects 🤔

Author: Based on the time profiler I don't see this popping up, really; a couple of ms for ChatUser.__deallocating_deinit and ChatChannelMember.__deallocating_deinit. We should do that for consistency, but in a separate ticket. There is no real performance improvement here.

Reviewer: Okay, nice 👍
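For reference, a rough sketch of the kind of conversion being discussed; the fields below are placeholders, not ChatUser's real property list:

// Today (simplified): a reference type.
// class ChatUser {
//     let id: String
//     let name: String?
//     init(id: String, name: String?) { self.id = id; self.name = name }
// }

// The same public surface as a value type. Call sites keep compiling as long as
// nothing relies on reference identity (===) or on subclassing the model.
struct ChatUser {
    let id: String
    let name: String?

    init(id: String, name: String?) {
        self.id = id
        self.name = name
    }
}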