I would basically like to be able to use the CBOR parser as a tokenizer, so that I can parse a `Stream` (which gets its data from an underlying socket) into CBOR "tokens". The reason is that the document could contain some very large data structures (e.g. a very long array), and from what I can tell the current API would require reading the full array into memory as a corresponding `CBORObject`. In my case I would then copy the data from there into the application's own data structures, which would temporarily allocate a lot of additional data (and also put more pressure on the GC).
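To make the concern concrete, this is roughly what the current workflow looks like (just a sketch; the element type and the `AsInt32()` conversion are only an example, the real data is more varied):

```csharp
using System.Collections.Generic;
using System.IO;
using PeterO.Cbor;

static List<int> ReadAllAtOnce(Stream stream)
{
    // The whole array is materialized as a CBORObject tree before any
    // element can be copied into the application's own structures.
    CBORObject root = CBORObject.Read(stream);

    var result = new List<int>(root.Count);
    foreach (CBORObject item in root.Values)
    {
        result.Add(item.AsInt32()); // second copy of every element
    }

    // The entire CBORObject tree is now garbage for the GC to collect.
    return result;
}
```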
.NET's own CBOR implementation provides a low-level CBOR reader that can be used directly in this way. Unfortunately, it doesn't allow me to read data synchronously from a `Stream`, which is why I prefer to use this library. Now I am wondering how I should proceed when parsing very large arrays.
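For reference, this is roughly the kind of token-level API I have in mind, sketched with .NET's `System.Formats.Cbor` (again assuming an array of small integers). Note that `CborReader` only accepts a buffer that is already in memory (`ReadOnlyMemory<byte>`), not a `Stream`, which is exactly the limitation described above:

```csharp
using System;
using System.Formats.Cbor;

static void ConsumeArray(ReadOnlyMemory<byte> encoded)
{
    var reader = new CborReader(encoded);

    // Returns null for indefinite-length arrays.
    int? length = reader.ReadStartArray();

    while (reader.PeekState() != CborReaderState.EndArray)
    {
        // One element at a time; nothing but the current token is held in memory.
        int value = reader.ReadInt32();
        // ... feed `value` into the application's own data structure here ...
    }

    reader.ReadEndArray();
}
```

Something equivalent that pulls its input incrementally from a `Stream` would let me handle arbitrarily large arrays without building the intermediate `CBORObject`.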