Gluber | 1 year ago
The higher level ASP.NET Core stack is also quite efficient and optimized.
BUT: as soon as you go above the basic middleware pipeline it tends to get bloated and slow. ASP.NET Core MVC is particularly bad.
System.Text.Json is also quite nice, and is often allocation-free.
We basically just use the middleware pipeline and nothing else, and can get millions of requests per second on basic hardware.
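A middleware-only setup of the kind described might look like this minimal sketch (the endpoint path and response payload here are illustrative, not from the original comment):

```csharp
// Minimal ASP.NET Core app using only the middleware pipeline:
// no MVC, no controllers, no endpoint routing.
var builder = WebApplication.CreateSlimBuilder(args);
var app = builder.Build();

app.Use(async (context, next) =>
{
    if (context.Request.Path == "/ping")
    {
        context.Response.ContentType = "text/plain";
        await context.Response.WriteAsync("pong");
        return; // short-circuit: the request never touches any higher-level framework
    }
    await next(context);
});

app.Run();
```

Short-circuiting in the first middleware like this keeps the hot path to little more than Kestrel plus one delegate invocation, which is where the low per-request overhead comes from.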
neonsunset | 1 year ago
As you noted, the problems happen later in the handling pipeline. There are also choices that ASP.NET Core has to make as an extremely general-purpose web framework like that.
System.Text.Json is pretty good for a default serializer, but it's still far from the fastest JSON serialization .NET is capable of.
Both of these end up reallocating and transcoding data, and STJ can also take a hit if the deserialized classes/structs have converters on them, as that can disable the fast path.
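Both caveats can be worked around in current STJ; a hedged sketch (the DTO and context names are made up for illustration) using the source generator, which emits serialization metadata at compile time, plus direct UTF-8 output to skip the UTF-16 transcoding step:

```csharp
using System.Text.Json;
using System.Text.Json.Serialization;

// Illustrative DTO. No custom converters, so the generated fast path stays enabled.
public record Ping(string Status, int Count);

// Source-generated context: metadata is produced at compile time instead of
// being built via reflection on first use.
[JsonSourceGenerationOptions(PropertyNamingPolicy = JsonKnownNamingPolicy.CamelCase)]
[JsonSerializable(typeof(Ping))]
public partial class AppJsonContext : JsonSerializerContext { }

public static class Demo
{
    public static byte[] Serialize(Ping p) =>
        // Serializing straight to UTF-8 bytes avoids a string round-trip
        // and the transcoding it implies.
        JsonSerializer.SerializeToUtf8Bytes(p, AppJsonContext.Default.Ping);
}
```

This is roughly what the framework itself nudges you toward in trimmed/AOT scenarios; the point in the comment stands, though, that the reflection-based default path is what most users actually hit.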
My idea here is that a new implementation can benefit from hindsight about the strengths/weaknesses of ASP.NET Core and how its design influences the implementation choices of the user (and whether those end up being optimal or not).
It would also not be constrained by backwards compatibility or certain organizational restrictions - for example, you would not need to use the out-of-box QuicConnection and QuicStream that rely on MsQuic; you could opt to statically link against parts of the stack implemented in Rust, or bring more of that logic over to C# instead. There is a certain conventional way you are expected to approach this in dotnet/* repositories, and it might be a bit restrictive in achieving the end goals of such a design.
It would be able to approach the problem as a library that expects a more advanced user, closer to how e.g. Axum does, or how actix-web did back in the day (and by advanced I mean effective use of (ref) structs and generics, not that it would need boilerplate).
p.s.: does this organization with millions of RPS have a name?