I am trying to serialize an object into a MemoryStream using System.Text.Json's JsonSerializer, but I can't find a method for that in the documentation. Can someone share a sample implementation of serialization and deserialization using System.Text.Json?
UPDATE
.NET 6 added JsonSerializer.Serialize overloads that write to a stream. It's now possible to write just:
JsonSerializer.Serialize(stream, myObject);
This produces unindented JSON encoded as UTF-8 without a BOM.
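For a full round trip, a minimal sketch using the .NET 6 stream overloads (MyObject and myObject are placeholders for your own type and instance) looks like this:
using var stream = new MemoryStream();
JsonSerializer.Serialize(stream, myObject);

stream.Position = 0; // rewind before reading the JSON back
MyObject roundTripped = JsonSerializer.Deserialize<MyObject>(stream);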
Original Answer
It's unclear what the problem is, or what documentation and examples are missing, as there are multiple sections on learn.microsoft.com and hundreds of blog posts and articles. In the docs, JSON serialization and deserialization is a good place to start, and How to serialize and deserialize (marshal and unmarshal) JSON in .NET includes the section Serialize to UTF8.
A MemoryStream is just a Stream wrapper over a byte[] array anyway, so serializing to a MemoryStream is the same as serializing to a byte[] array directly. This can be done with JsonSerializer.SerializeToUtf8Bytes:
byte[] jsonUtf8Bytes = JsonSerializer.SerializeToUtf8Bytes(weatherForecast);
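Reading it back is symmetrical; this one-liner is just a sketch assuming WeatherForecast is the type that was serialized:
WeatherForecast forecast = JsonSerializer.Deserialize<WeatherForecast>(jsonUtf8Bytes);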
Finally, anything in .NET that needs to serialize to something works through reader and writer objects, like TextReader/TextWriter and StreamReader/StreamWriter. In System.Text.Json's case, this is done through the Utf8JsonWriter object. JsonSerializer.Serialize has an overload that writes to a Utf8JsonWriter:
using var stream = File.OpenWrite(somePath);
using var writer = new Utf8JsonWriter(stream);
JsonSerializer.Serialize(writer, myObject);
That's the slow way of using System.Text.Json though. Using buffers means allocating them and cleaning them up, which is costly, especially in web applications. For this reason, ASP.NET Core uses IO pipelines instead of streams to receive and send data to sockets, using reusable buffers leased from buffer pools and passed along each step in the ASP.NET Core pipeline. Passing byte[] buffers around copies their contents, so .NET Core introduced the Span<> and Memory<> types, which represent a view over an existing (possibly pooled) buffer. This way, ASP.NET Core passes those "views" of the buffers around, not the buffers themselves.
System.Text.Json was built to use pipelines and reusable memory instead of streams, allowing ASP.NET Core to use minimal memory and as few allocations as possible in high traffic web sites. ASP.NET Core uses the Utf8JsonWriter(IBufferWriter) constructor to write to the output pipeline through a PipeWriter.
We can use the same overload to write to a reusable buffer with an ArrayBufferWriter. That's the equivalent of using a MemoryStream, but the output is accessed through either a ReadOnlySpan<byte> or Memory<byte>, so it doesn't have to be copied around:
var buffer = new ArrayBufferWriter<byte>(65536); // ArrayBufferWriter<T> is not IDisposable, so no using here
using var writer = new Utf8JsonWriter(buffer);
JsonSerializer.Serialize(writer, myObject);
writer.Flush(); // make sure everything has been written to the buffer
ReadOnlySpan<byte> data = buffer.WrittenSpan;
The better option is to use Newtonsoft.Json.
It has a lot of examples.
Related
I have a Web API method in ASP.NET Core 3.1 that returns an enumerable list of objects:
public async Task<IEnumerable<MyObject>> Get()
The Web API returns JSON by default. This has worked fine, until I added a property of type Dictionary<int, object> to MyObject, not realizing that whatever serializer ASP.NET Core 3.1 is using to build the response can't serialize a Dictionary (error is "type Dictionary<int, object> is not supported"). I can reproduce the same error by trying to serialize the Dictionary using the new System.Text.Json library, which is what I'm guessing is being used by the web API to build the JSON response.
Since JsonConvert still serializes Dictionary just fine, it wouldn't be hard to do the serialization manually in the method. But that means making my own JSON response and returning it as a content string, which just seems ... not great.
Is there another associative array type I could be using that the web API can serialize into JSON correctly? Or is there a way to configure the web API to use a JSON serialization library that can handle Dictionary? Or am I just stuck making my own JSON response for this method?
Edit: To be clear, I am curious which of these are possible, what the advantages or disadvantages are to each, so that I might choose a good solution for my circumstance.
This answer comes from @dbc in the comments. I hadn't realized that System.Text.Json's difficulty with serializing Dictionary wasn't with Dictionary generally, but specifically with Dictionary using a non-string key. It was trivial for us to write a converter to translate the necessary Dictionary to a string-keyed type, and then the built-in JSON serialization handled it just fine.
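For reference, here is a rough sketch of that kind of converter (not our exact code; the class name, the generic TValue parameter, and the registration call are purely illustrative, and on .NET 5+ non-string dictionary keys are supported out of the box):
using System;
using System.Collections.Generic;
using System.Text.Json;
using System.Text.Json.Serialization;

// Illustrative converter: writes int keys as JSON property names (strings)
// and parses them back to int when reading.
public class IntKeyDictionaryConverter<TValue> : JsonConverter<Dictionary<int, TValue>>
{
    public override Dictionary<int, TValue> Read(ref Utf8JsonReader reader, Type typeToConvert, JsonSerializerOptions options)
    {
        var stringKeyed = JsonSerializer.Deserialize<Dictionary<string, TValue>>(ref reader, options);
        var result = new Dictionary<int, TValue>();
        foreach (var pair in stringKeyed)
        {
            result.Add(int.Parse(pair.Key), pair.Value);
        }
        return result;
    }

    public override void Write(Utf8JsonWriter writer, Dictionary<int, TValue> value, JsonSerializerOptions options)
    {
        writer.WriteStartObject();
        foreach (var pair in value)
        {
            writer.WritePropertyName(pair.Key.ToString());
            JsonSerializer.Serialize(writer, pair.Value, options);
        }
        writer.WriteEndObject();
    }
}
Registering it in Startup.ConfigureServices would look something like:
services.AddControllers().AddJsonOptions(o =>
    o.JsonSerializerOptions.Converters.Add(new IntKeyDictionaryConverter<object>()));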
Thanks, #dbc!
I encountered a question here and need your help.
I had a .NET Web API project that used PushStreamContent to do asynchronous downloading, something like:
new PushStreamContent(async (outputStream, httpContent, transportContext) => { /* write to outputStream */ });
In this way, I can stream multiple parts of the download from within the action.
However, now I want to move the project to .NET Core, and I cannot find a replacement for PushStreamContent there.
Could you please let me know whether there is something like PushStreamContent in .NET Core, or any other way to implement it?
Thanks a lot.
PushStreamContent works by essentially setting a callback to be invoked when the output stream is being processed. This was necessary in ASP.NET Web API because there was no direct access to the output stream.
However, in ASP.NET Core, you do have direct access to the output stream via HttpContext.Response.Body. As a result, you can just directly write to that stream without needing something like PushStreamContent.
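As a rough sketch (the route, content type, and GetParts chunk source are made up for illustration), writing directly to the response body inside a controller looks something like this:
[HttpGet("download")]
public async Task Download()
{
    Response.ContentType = "application/octet-stream";

    // Write each chunk to the response body as it becomes available,
    // instead of buffering the whole payload in memory first.
    foreach (byte[] part in GetParts()) // GetParts() is a hypothetical chunk source
    {
        await Response.Body.WriteAsync(part, 0, part.Length);
        await Response.Body.FlushAsync();
    }
}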
FWIW, while this can save some server resources, it's actually pretty bad, especially with something like an API. Since you're writing directly to the output stream, the headers are already sent, including the response status code. This means that if any exception is raised while you're writing, there's no opportunity to handle it appropriately. The response is simply aborted, and the client is left hanging with a supposedly "successful" response and only a partial body. Personally, I would say avoid this practice altogether and take the resource hit. Unless you're streaming gigs of data, any good server setup will have a plentiful amount of RAM to handle what you're doing. Even then, there are probably better methods for delivering that data.
I'm very new at WCF (and .NET in general), so I apologize if this is common knowledge.
I'm designing a WCF solution (currently using Entity Framework to access the database). I want to grab a (possibly very large) set of data from the database, and return it to the client, but I don't want to serialize the entire set of data over the wire all at once, due to performance concerns.
I'd like the operation to return some sort of object to the client that represents the resulting data, and I'd like to deal with that data on the client, being able to navigate through it backwards and forwards and retrieve the actual data over the wire as needed.
I don't want to write a lot of client code to individually find out what rows meet my search criteria and then make separate calls to get each record if I can help it. I'm trying to keep the client as simple as possible.
Ideally, I'd like to write the client code similar to the pseudocode below:
Reference1.Service1Client MyService = new Reference1.Service1Client("Service1");
DelayedDataSet<MyRecordType> MyResultSet = MyService.GetAllCustomers();
MyResultSet.First();
while (!MyResultSet.Eof)
{
Console.WriteLine(MyResultSet.CurrentRecord().CUSTFNAME + " " + MyResultSet.CurrentRecord().CUSTLNAME);
Console.WriteLine("Press Enter to see the next customer");
Console.ReadLine();
MyResultSet.Next();
}
Of course, DelayedDataSet is something I just made up, and I'm hoping something like it exists in .NET.
The call to MyService.GetAllCustomers() would return this DelayedDataSet object, which would not actually contain the records. The actual data wouldn't come over the wire until CurrentRecord() is called. Next() and Previous() would simply update a cursor on the server side to point to the appropriate record. I don't want the client to have any direct visibility of the database or Entity Framework.
I'm guessing that the way I wrote the code probably won't work over WCF, and that the functions like CurrentRecord(), Next(), First(), etc. would have to be separate service contract operations. I guess I'm just looking for a way to do this without having to write all my own code to cache the results on the server, somehow persist the data sets server side, write all the retrieval and navigation code in my service library, etc. I'm hoping most of this is already done for me.
It seems like this would be a very commonly needed function. So, does something like this exist?
-Joe
No, that's not what WCF is designed to do.
In WCF, the very basic core architecture is that you have a client and a server, and nothing but (XML-)serialized data going between the two over the wire.
WCF is not a remote-procedure call method, or some sort of remote object mechanism - there is no connection between the client and the server except the serialized message that conforms to the service (and data) contracts defined between the two.
WCF is not designed to handle huge data volumes - it's designed to handle individual messages (GetCustomerByID(42) and such). Since WCF is designed from the ground up to be interoperable with other platforms (non-.NET ones, too, like Java, Ruby, etc.), you should definitely not be using heavyweight .NET-specific types like DataSet anyway - use proper objects.
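If you stay with WCF, the message-based alternative is to let the client page through the data one call at a time using plain data contracts. This is only a sketch; the contract, type, and member names are made up:
using System.Runtime.Serialization;
using System.ServiceModel;

[DataContract]
public class Customer
{
    [DataMember] public int Id { get; set; }
    [DataMember] public string FirstName { get; set; }
    [DataMember] public string LastName { get; set; }
}

[ServiceContract]
public interface ICustomerService
{
    // One self-contained request/response per call; the server keeps no cursor state.
    [OperationContract]
    Customer[] GetCustomersPage(int pageIndex, int pageSize);
}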
Also, since WCF ultimately serializes everything to XML and sends it across the wire, all the data being passed must be expressible in XML schema - which excludes interfaces and/or generics.
From what I'm reading in your post, what you're looking for is more of a "in-proc" data access layer - not a service level. So if you want to keep going down this path, you should investigate the repository and unit-of-work patterns in conjunction with Entity Framework.
More info:
MSDN: What is Windows Communication Foundation?
WCF Essentials—A Developer's Primer
Picture of the very basic WCF architecture from that primer - there's only a wire with a serialized message connecting client and server, nothing more; but serialization will always happen.
What I would like to do is stream the request to a file store asynchronously so that the incoming request does not take up a lot of memory and so that handling thread is not held up for IO.
I see that there is an asynchronous HTTP handler that I can implement. This looks like it would help with the thread usage, but it looks like the request has already been fully copied into memory by this point by IIS/ASPNET.
Is there a way to keep ASP.NET from reading the entire request in before handling it?
There is a new method added to the HttpRequest in .NET 4 called GetBufferlessInputStream(), which gives you synchronous access to the request stream.
From the MSDN article:
This method provides an alternative to using the InputStream property. The InputStream property waits until the whole request has been received before it returns a Stream object. In contrast, the GetBufferlessInputStream method returns the Stream object immediately. You can use the method to begin processing the entity body before the complete contents of the body have been received.
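A rough sketch of how this could be used in a plain HTTP handler (the handler name, buffer size, and target path are placeholders), reading the body in chunks and writing it to disk as it arrives:
using System.IO;
using System.Web;

public class UploadHandler : IHttpHandler
{
    public bool IsReusable { get { return false; } }

    public void ProcessRequest(HttpContext context)
    {
        // Returns immediately, before the whole request body has been received.
        Stream input = context.Request.GetBufferlessInputStream();

        using (FileStream output = File.Create(@"C:\uploads\incoming.dat")) // placeholder path
        {
            byte[] buffer = new byte[64 * 1024];
            int read;
            while ((read = input.Read(buffer, 0, buffer.Length)) > 0)
            {
                output.Write(buffer, 0, read);
            }
        }

        context.Response.StatusCode = 200;
    }
}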
The ability to access the request stream asynchronously will be available in .NET 4.5. See the What's New in ASP.NET 4.5 article for more information. It looks like there will be several nice ASP.NET performance improvements in 4.5.
You are not searching SO enough.
The solution you need is explained here, step by step, in great detail: Efficiently Uploading Large Files via Streaming
Check this one; your question is a duplicate: Streaming uploaded files directly into a database table
While this SO question is specifically about MVC, the answers should work for ASP.NET generally. Specifically, people appear to have had a good experience with Darren Johnstone's UploadModule.
I've implemented a caching interface and memcached provider for our website using enyim. It works great in testing, until we get to load testing, where it spikes the CPU of w3wp.exe to near 100%. We have a configuration property to switch the caching provider back to the built-in .NET cache, and the CPU goes back to 5-7%. Has anyone experienced anything similar?
Every time you store something in memcached through enyim, the .NET runtime will perform binary serialization on the stored object, and deserialization when you retrieve it. For some types (string, byte[] and a few more), enyim implements a more specific and lightweight serialization, but most types are serialized by the standard BinaryFormatter. This is processor intensive.
It especially hurts when your code was written against the in-memory cache in ASP.NET. You will probably have code that assumes getting something from cache is free, so you may get it from cache again and again and again. We had comparable problems when we switched to memcached. If you do some profiling, you'll probably find that you do an enormous number of reads from cache.
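To illustrate (the ICache interface, ProductService class, and cache key below are made up; the real client is enyim's), the usual fix is to fetch once and reuse the deserialized object rather than calling the cache repeatedly:
using System.Collections.Generic;

// Hypothetical cache abstraction standing in for the enyim client.
public interface ICache
{
    T Get<T>(string key);
}

public class ProductService
{
    private readonly ICache _cache;
    private IList<string> _categories; // local copy for the lifetime of this service/request

    public ProductService(ICache cache)
    {
        _cache = cache;
    }

    public IList<string> GetCategories()
    {
        // With an out-of-process cache, every Get deserializes the stored object again,
        // so fetch it once and reuse the result instead of re-reading it everywhere.
        return _categories ?? (_categories = _cache.Get<IList<string>>("categories"));
    }
}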
Our experiences with the enyim client have been very positive. We run memcached in an ASP.NET server farm on around 10 nodes and it is very stable. For some forms of data (very often accessed), we prefer the in-memory in-process caching of ASP.NET.
Be sure to also check your serialization and deserialization code for proper object or stream disposal.
I had the exact same w3wp.exe spiking to 99% symptoms, and thought for sure it was an Enyim/Membase driver bug, but it wasn't. It was ours, and it was because we forgot to Dispose() of the MemoryStream after deserializing every JSON object in our JSON helper class:
public static T DeserializeToObject<T>(this string json)
{
byte[] byteArray = Encoding.ASCII.GetBytes( json );
MemoryStream stream = new MemoryStream( byteArray );
DataContractJsonSerializer serializer = new DataContractJsonSerializer(typeof(T));
T returnObject = (T)serializer.ReadObject(stream);
stream.Close();
stream.Dispose(); // we forgot this line!
return returnObject;
}
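For what it's worth, an equivalent sketch with a using block disposes the stream even if ReadObject throws, which makes this kind of leak harder to reintroduce:
public static T DeserializeToObject<T>(this string json)
{
    byte[] byteArray = Encoding.ASCII.GetBytes(json);
    using (MemoryStream stream = new MemoryStream(byteArray))
    {
        DataContractJsonSerializer serializer = new DataContractJsonSerializer(typeof(T));
        return (T)serializer.ReadObject(stream); // stream is disposed even on exceptions
    }
}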