I use MemoryPack to save building information in the game world, and it usually works fine, with hundreds of people downloading and saving their builds at the same time. But when I tried to migrate and add a new field manually (deserialize the old format -> serialize to the new format), I started getting a length header error:
Unhandled exception. MemoryPack.MemoryPackSerializationException: Length header size is larger than buffer size, length: 218103807.
Serializing:
List<EntityData> entitiesData = ...
using var compressor = new BrotliCompressor();
MemoryPackSerializer.Serialize(compressor, entitiesData);
Deserializing:
byte[]? data = ...
using var decompressor = new BrotliDecompressor();
MemoryPackSerializer.Deserialize(decompressor.Decompress(data), ref entitiesData!);
I don't have the option of using streaming serialization.
And during the migration I need to deserialize about 10,000 files (~150 KB each) and serialize them again.
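Roughly, the migration pass looks like the sketch below (the directory path and file pattern are placeholders, and the fixup for the new field is elided):

using System.Collections.Generic;
using System.IO;
using MemoryPack;
using MemoryPack.Compression;

// Placeholder for the real storage location.
var oldSavesDirectory = "saves";

foreach (var path in Directory.EnumerateFiles(oldSavesDirectory, "*.bin"))
{
    byte[] data = File.ReadAllBytes(path);

    // Deserialize the old format.
    List<EntityData>? entitiesData = null;
    using (var decompressor = new BrotliDecompressor())
    {
        MemoryPackSerializer.Deserialize(decompressor.Decompress(data), ref entitiesData);
    }

    // ... set the new field on each entity here ...

    // Serialize in the new format and overwrite the file.
    using (var compressor = new BrotliCompressor())
    {
        MemoryPackSerializer.Serialize(compressor, entitiesData);
        File.WriteAllBytes(path, compressor.ToArray());
    }
}

It is somewhere in this loop that the "Length header size is larger than buffer size" exception is thrown.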
I don't know if this will help you fix the error, but I managed to avoid it by checking whether the array is empty, after first marking Payload and Textures as nullable. Apparently at some point an array with size 0 was created.
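Something along these lines (simplified; the actual element types of Payload and Textures in my model are different, byte[] is just a placeholder here):

using MemoryPack;

[MemoryPackable]
public partial class EntityData
{
    // Marked nullable so a missing value round-trips as null
    // instead of a zero-length array.
    public byte[]? Payload { get; set; }
    public byte[]? Textures { get; set; }

    // Called before serializing: drop zero-length arrays.
    public void NormalizeEmptyArrays()
    {
        if (Payload is { Length: 0 }) Payload = null;
        if (Textures is { Length: 0 }) Textures = null;
    }
}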