azos
Async Json
The benefits of a fully ASYNC json writer (and reader) are very questionable. There are many calls like TextWriter.Write(":") with a single char, and context switching via Task would make this far less performant than the sync version, especially when the server responds with content in a memory-backed buffer.
We will need to use something like PooledByteBufferWriter instead of a text writer with a generic Stream; this way it always writes to memory using fully synchronous code, while the buffer then gets written using WriteAsync()....
The trick is to do that in 100% sync code using chunks, say of 2 kb in size. The chunks get handled asynchronously then.
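A minimal sketch of the sync-write/async-flush idea described above (the class name ChunkedJsonFlusher and the exact buffer handling are illustrative assumptions, not the actual Azos API):

```csharp
using System;
using System.Buffers;
using System.IO;
using System.Text;
using System.Threading.Tasks;

// Illustrative sketch: serialize synchronously into a pooled in-memory
// buffer, then flush it to the target stream in ~2 kb chunks asynchronously.
public static class ChunkedJsonFlusher
{
  public const int ChunkSize = 2 * 1024; // ~2 kb chunks, as discussed above

  public static async Task WriteAsync(Stream target, string json)
  {
    var bytes = ArrayPool<byte>.Shared.Rent(Encoding.UTF8.GetMaxByteCount(json.Length));
    try
    {
      // 100% synchronous encoding into memory - no context switching here
      var len = Encoding.UTF8.GetBytes(json, 0, json.Length, bytes, 0);

      // only the per-chunk flushes are asynchronous
      for (var i = 0; i < len; i += ChunkSize)
      {
        var count = Math.Min(ChunkSize, len - i);
        await target.WriteAsync(bytes, i, count).ConfigureAwait(false);
      }
    }
    finally
    {
      ArrayPool<byte>.Shared.Return(bytes);
    }
  }
}
```

The point is that no await ever happens per token; the only asynchronous operations are the per-chunk flushes.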
I already started the refactoring. The ISourceText needs a new property like .ChunkEof; when true, the deserializer should break out in a co-routine style so the reader can perform an ASYNC read of the next chunk. The trick would be implementing a fully alloc-free state machine to enable LOGICAL co-routines without relying on the call stack. For that we can use readonly ref structs to capture a "point of execution suspension".
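The ChunkEof break-out can be sketched roughly as follows. All names here (IChunkedSource, FetchNextChunkAsync, StringChunksSource) are hypothetical illustrations; the real design would use readonly ref structs to stay alloc-free, which this simplified sketch does not show:

```csharp
using System.Threading.Tasks;

// The synchronous "co-routine" body consumes characters until the current
// chunk is exhausted (ChunkEof), then yields control so the caller can
// await the next chunk.
public interface IChunkedSource
{
  bool ChunkEof { get; }        // true when the current chunk is consumed
  bool Eof      { get; }        // true when the whole source is consumed
  char ReadChar();              // synchronous read within the chunk
  Task FetchNextChunkAsync();   // async acquisition of the next chunk
}

// Tiny in-memory implementation for demonstration only
public sealed class StringChunksSource : IChunkedSource
{
  private readonly string[] m_Chunks;
  private int m_Chunk, m_Pos;
  public StringChunksSource(params string[] chunks) => m_Chunks = chunks;
  public bool ChunkEof => m_Chunk >= m_Chunks.Length || m_Pos >= m_Chunks[m_Chunk].Length;
  public bool Eof => m_Chunk >= m_Chunks.Length;
  public char ReadChar() => m_Chunks[m_Chunk][m_Pos++];
  public Task FetchNextChunkAsync() { m_Chunk++; m_Pos = 0; return Task.CompletedTask; }
}

public static class ChunkedScanner
{
  // Sync inner loop per chunk; async only at chunk boundaries
  public static async Task<int> CountCharsAsync(IChunkedSource source)
  {
    var total = 0;
    while (!source.Eof)
    {
      while (!source.ChunkEof) { source.ReadChar(); total++; } // sync body
      if (!source.Eof) await source.FetchNextChunkAsync().ConfigureAwait(false);
    }
    return total;
  }
}
```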
To conclude: one definitely cannot implement this efficiently using the regular await Task approach due to exorbitant context switching.
This deals with #837.
Microsoft Kestrel + StreamReader have a race condition in sync API mode.
We will need to provide ASYNC-first JSON read APIs. JazonLexer relies on StreamSourceText, which will have to NOT inherit from ~~StreamReader~~; instead we will read a chunk of chars into a buffer asynchronously and partially release control back to the caller for synchronous parsing of that chunk. The chunk size may be, say, 8 kb, so it will be very efficient, yet it processes input synchronously using async-acquired chunks.
See Azos.CodeAnalysis.Source.StreamSource.cs
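A rough illustration of the async-acquired-chunk loop (this is not the actual StreamSource code; the 8 kb buffer size follows the text above, and the helper name is an assumption):

```csharp
using System.IO;
using System.Text;
using System.Threading.Tasks;

// Acquire 8 kb character chunks from the stream asynchronously, then hand
// each chunk to fully synchronous per-chunk processing (e.g. lexing).
public static class AsyncChunkedReader
{
  public static async Task<int> ProcessAsync(Stream stream)
  {
    var buffer = new char[8 * 1024]; // ~8 kb chunk as suggested above
    var total = 0;
    using (var reader = new StreamReader(stream, Encoding.UTF8))
    {
      int got;
      while ((got = await reader.ReadAsync(buffer, 0, buffer.Length).ConfigureAwait(false)) > 0)
      {
        // synchronous per-chunk work happens here; we just count chars
        for (var i = 0; i < got; i++) total++;
      }
    }
    return total;
  }
}
```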
JsonIngestOptions
As a part of the requirements we need to add JsonIngestOptions (as opposed to JsonWritingOptions/reading options, which are higher-level) to limit:
a. Total datagram size in characters
b. Maximum datagram depth, e.g. default 64
c. Maximum datagram size in JSON values (property/array member count)
Abort ASYNC reading upon exceeding these limits, which are always present and defaulted if not passed.
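A hypothetical shape for such an options class; the depth default of 64 follows the text above, while the other defaults and member names are assumptions for illustration:

```csharp
using System;

// Sketch of the limit options described above; reading aborts via throw
// once any limit is exceeded.
public sealed class JsonIngestOptions
{
  public int MaxTotalChars { get; set; } = 32 * 1024 * 1024; // a. total size in chars (assumed default)
  public int MaxDepth      { get; set; } = 64;               // b. maximum depth, default 64
  public int MaxValueCount { get; set; } = 1024 * 1024;      // c. max json value count (assumed default)

  public void CheckChars(int totalChars)
  {
    if (totalChars > MaxTotalChars) throw new InvalidOperationException("Total character limit exceeded");
  }
  public void CheckDepth(int depth)
  {
    if (depth > MaxDepth) throw new InvalidOperationException("Depth limit exceeded");
  }
  public void CheckValueCount(int count)
  {
    if (count > MaxValueCount) throw new InvalidOperationException("Value count limit exceeded");
  }
}
```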
ISourceText changes
IAsyncSourceText : ISourceText - provides async processing hooks. This is needed for StreamSourceText and FileSourceText, as ~~StringSource~~ does not need async processing.
Continue subsequent datagram processing from the same stream
When JSON datagrams follow one after another in the same stream, for example:
{a: 1}{b: "I am another datagram"}{c: true}
- 3 datagrams following one after another.
- Ability to CONTINUE processing of ITextSource with a different name/language AS OF the current position.
- Ability to rename the source, keep the stream, and possibly change the Language property as well.
- This needs to be added to ISourceText as it applies to ANY source (string, stream), etc...
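To illustrate the back-to-back datagram scenario, here is a simplified splitter that detects datagram boundaries by tracking brace depth (illustrative only; string-escape handling is simplified, and the real feature would continue the same ISourceText from its current position rather than pre-split a string):

```csharp
using System.Collections.Generic;

// Splits a character stream of consecutive JSON datagrams such as
// {a: 1}{b: "x"}{c: true} into individual datagram strings.
public static class DatagramSplitter
{
  public static List<string> Split(string input)
  {
    var result = new List<string>();
    var depth = 0; var start = 0; var inString = false;
    for (var i = 0; i < input.Length; i++)
    {
      var c = input[i];
      if (inString) { if (c == '"' && input[i - 1] != '\\') inString = false; continue; }
      if (c == '"') { inString = true; continue; }
      if (c == '{') { if (depth == 0) start = i; depth++; }
      else if (c == '}')
      {
        depth--;
        if (depth == 0) result.Add(input.Substring(start, i - start + 1)); // datagram boundary
      }
    }
    return result;
  }
}
```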
Need to Test
- Old tests - the benchmarks should be around the same (they use the new ISourceText)
- StreamSource with sync
- StreamSource with async
According to the latest test results, the addition of the new StreamSource implementation has NOT caused any delays in SYNC processing paths, which is a very good outcome, as expected.
The asynchronous JSON parsing using StreamHookUse.CaseOfRandomAsyncBufferReading is definitely slower than SYNC; however, in real applications the data is never fully present in memory, hence using the ASYNC JSON parser in the HTTP WAVE framework by default may be very beneficial WHEN the content-length header is not specified, or the content is larger than a few kb.
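That heuristic could be expressed as a small decision helper (the helper name and the 8 kb threshold are assumptions, not measured Azos values):

```csharp
// Use the async JSON parser when Content-Length is absent or the body is
// larger than a few kb; otherwise the sync parser is faster.
public static class JsonParserStrategy
{
  public const long AsyncThresholdBytes = 8 * 1024; // assumed "few kb" cutoff

  public static bool UseAsyncParser(long? contentLength)
    => contentLength == null || contentLength.Value > AsyncThresholdBytes;
}
```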
Feb 27, 2023 RELEASE .NET 6
RELEASE .NET 6 runtime debug (notice the 2x+ performance over .Net 4 Fx), ERROR SNIPPETS ENABLED
Started 02/27/2023 15:48:09
Starting Azos.Tests.Nub::Azos.Tests.Nub.Serialization.JsonBenchmarkTests ...
- Test_Primitives {cnt=250000 par=false} Did 250,000 in 3.8 sec at 66,041 ops/sec
[OK]
- Test_Primitives {cnt=250000 par=true} [1] Did 250,000 in 0.5 sec at 532,190 ops/sec
[OK]
- Test_SimpleObject {cnt=250000 par=false} Did 250,000 in 0.6 sec at 400,614 ops/sec
[OK]
- Test_SimpleObject {cnt=250000 par=true} [1] Did 250,000 in 0.1 sec at 3,139,903 ops/sec
[OK]
- Test_ModerateObject {cnt=150000 par=false} Did 150,000 in 1.0 sec at 150,023 ops/sec
[OK]
- Test_ModerateObject {cnt=150000 par=true} [1] Did 150,000 in 0.1 sec at 1,222,514 ops/sec
[OK]
- Test_ComplexObject {cnt=95000 par=false} Did 95,000 in 4.3 sec at 21,949 ops/sec
[OK]
- Test_ComplexObject {cnt=95000 par=true} [1] Did 95,000 in 0.5 sec at 181,708 ops/sec
[OK]
Mar 21, 2023 RELEASE .NET6 (with ASYNC parser)
RELEASE .Net 6
Started 03/21/2023 16:16:02
Starting Azos.Tests.Nub::Azos.Tests.Nub.Serialization.JsonBenchmarkTests ...
- Test_Primitives {cnt=250000 par=false} Did 250,000 in 3.3 sec at 76,104 ops/sec
[OK]
- Test_Primitives {cnt=250000 par=true} [1] Did 250,000 in 0.4 sec at 588,220 ops/sec
[OK]
- Test_SimpleObject {cnt=250000 par=false} Did 250,000 in 0.6 sec at 427,322 ops/sec
[OK]
- Test_SimpleObject {cnt=250000 par=true} [1] Did 250,000 in 0.1 sec at 2,569,999 ops/sec
[OK]
- Test_ModerateObject {cnt=150000 par=false} Did 150,000 in 0.8 sec at 182,646 ops/sec
[OK]
- Test_ModerateObject {cnt=150000 par=true} [1] Did 150,000 in 0.1 sec at 1,199,381 ops/sec
[OK]
- Test_ComplexObject {cnt=95000 par=false} Did 95,000 in 4.1 sec at 23,235 ops/sec
[OK]
- Test_ComplexObject {cnt=95000 par=true} [1] Did 95,000 in 0.5 sec at 177,183 ops/sec
[OK]
- Test_ComplexObject_Async {cnt=95000 par=false} Did 95,000 in 24.7 sec at 3,852 ops/sec
[OK]
- Test_ComplexObject_Async {cnt=95000 par=true} [1] Did 95,000 in 1.3 sec at 71,206 ops/sec
[OK]
... done JsonBenchmarkTests
Mar 30, 2023 Perf test run after JsonReadingOptions introduction
Just re-ran all JsonBenchmarkTests; the results are the same as before the JsonReadingOptions introduction.
RELEASE .Net 6
Started 03/30/2023 19:51:23
Starting Azos.Tests.Nub::Azos.Tests.Nub.Serialization.JsonBenchmarkTests ...
- Test_Primitives {cnt=250000 par=false} Did 250,000 in 3.3 sec at 76,206 ops/sec
[OK]
- Test_Primitives {cnt=250000 par=true} [1] Did 250,000 in 0.4 sec at 575,449 ops/sec
[OK]
- Test_SimpleObject {cnt=250000 par=false} Did 250,000 in 0.6 sec at 411,943 ops/sec
[OK]
- Test_SimpleObject {cnt=250000 par=true} [1] Did 250,000 in 0.1 sec at 2,883,131 ops/sec
[OK]
- Test_ModerateObject {cnt=150000 par=false} Did 150,000 in 0.8 sec at 176,881 ops/sec
[OK]
- Test_ModerateObject {cnt=150000 par=true} [1] Did 150,000 in 0.1 sec at 1,164,434 ops/sec
[OK]
- Test_ComplexObject {cnt=95000 par=false} Did 95,000 in 4.1 sec at 23,286 ops/sec
[OK]
- Test_ComplexObject {cnt=95000 par=true} [1] Did 95,000 in 0.5 sec at 180,411 ops/sec
[OK]
- Test_ComplexObject_Async {cnt=95000 par=false} Did 95,000 in 24.9 sec at 3,822 ops/sec
[OK]
- Test_ComplexObject_Async {cnt=95000 par=true} [1] Did 95,000 in 1.3 sec at 72,440 ops/sec
... done JsonBenchmarkTests
Doc Benchmarks 03/31/2023
Payload used: json.txt
RELEASE .Net 6 Started 03/31/2023 12:58:46
Starting Azos.Tests.Nub::Azos.Tests.Nub.Serialization.JsonBenchmarkDocTests ...
- Test_TypicalPerson {cnt=50000 par=false} Did 50,000 in 1.3 sec at 38,576 ops/sec
[OK]
- Test_TypicalPerson {cnt=50000 par=true} [1] Did 50,000 in 0.2 sec at 327,856 ops/sec
[OK]
- Test_TypicalFamilyWithPolymorphicShapes {cnt=50000 par=false} Did 50,000 in 4.0 sec at 12,628 ops/sec
[OK]
- Test_TypicalFamilyWithPolymorphicShapes {cnt=50000 par=true} [1] Did 50,000 in 0.5 sec at 101,583 ops/sec
[OK]
... done JsonBenchmarkDocTests
Async JSON writing now....