I need to send a large amount of data using a web service. The size of the data would be between 300 MB and 700 MB. The web service generates the data from a SQL database and sends it to the client in the form of a DataSet with around 20 to 25 tables.
I tried the solution from the article "How to: Enable a Web Service to Send and Receive Large Amounts of Data" and the sample from Microsoft WSE 3.0, but mostly it gives me a "System.OutOfMemoryException".
I think the problem is that the web service buffers the data in memory on the server and it exceeds the limit.
I have thought of two alternatives:
(1) Send the DataTables one by one, but sometimes a single DataTable can hold around 100 MB to 150 MB of data.
(2) Write a file on the server and transfer it using HttpWebRequest (FTP would be possible, but an FTP server is not accessible currently).
Can anyone suggest a workaround for this problem using a web service?
Thanks,
A DataSet loads all of its data into memory, so it is not suited to transferring such huge amounts of data. DataSets also carry a lot of extra information when they are serialized.
If you know the structure of the tables you need to transfer, creating a set of serializable objects and sending an array of those would reduce your payload significantly.
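For example, a minimal sketch of such a serializable object (the class and property names below are only illustrative, not taken from your schema):

[Serializable]
public class OrderRow
{
    public int OrderId { get; set; }
    public DateTime OrderDate { get; set; }
    public decimal Total { get; set; }
}

// the web method then returns OrderRow[] (or List<OrderRow>) instead of a DataSet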
If you must use a DataSet, look into enabling the binary RemotingFormat:
BinaryFormatter bf = new BinaryFormatter();
myDataSet.RemotingFormat = SerializationFormat.Binary;   // binary is far more compact than the default XML format
bf.Serialize(outputStream, myDataSet);                   // outputStream: any writable Stream (a file, the response stream, ...)
After reducing your payload by such means, it would be best to write the files to a publicly accessible location on your HTTP server. Hosting them over HTTP lets clients download the files far more easily than FTP, and you can control access to these folders by giving the appropriate permissions to your users.
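As a rough sketch of the client side of that download (the URL and local path are hypothetical; requires System.Net, System.IO, System.Data and System.Runtime.Serialization.Formatters.Binary):

HttpWebRequest request = (HttpWebRequest)WebRequest.Create("http://yourserver/exports/data.bin");
using (WebResponse response = request.GetResponse())
using (Stream source = response.GetResponseStream())
using (FileStream target = File.Create(@"C:\temp\data.bin"))
{
    byte[] buffer = new byte[64 * 1024];
    int read;
    while ((read = source.Read(buffer, 0, buffer.Length)) > 0)
        target.Write(buffer, 0, read);   // stream to disk in chunks, never the whole payload in memory
}

// later, load the DataSet from the downloaded file
DataSet downloaded;
using (FileStream file = File.OpenRead(@"C:\temp\data.bin"))
{
    BinaryFormatter bf = new BinaryFormatter();
    downloaded = (DataSet)bf.Deserialize(file);
}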
I'm trying to implement a video-streaming service. I use ASP.NET Web API, and from what I've researched, PushStreamContent is exactly what I want. It works very well: it sends an HTTP 206 (Partial Content) response to the client, keeps the connection alive, and pushes (writes) streams of bytes to the output.
However, this doesn't scale, because I can't retrieve partial binary data from the database. For example, consider a 300 MB video stored in a SQL Server table (a varbinary field): I use Entity Framework to get the record and then push it to the client using PushStreamContent.
This hugely impacts RAM, and for each seek the client performs, another 600 MB of memory gets used. Look at it in action:
1) First request for video
2) Second request (seeking to the middle of the video)
3) Third request (seeking into the last quarter of the video)
This cannot scale at all. Ten users watching this movie, and our server is down.
What should I do? How can I stream the video directly from the SQL Server table, without loading the entire video into RAM with Entity Framework, and push it to the client via PushStreamContent?
You could combine the SUBSTRING function with VARBINARY fields to return portions of your data, although I suspect you'd prefer a solution that doesn't require jumping from one chunk to the next.
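Still, here is a minimal sketch of the SUBSTRING approach inside the PushStreamContent callback. The Videos table, the VideoData and Id columns, and the connectionString/videoId variables are assumptions; requires System.Data and System.Data.SqlClient.

const int chunkSize = 1024 * 1024;                         // 1 MB per round trip
using (var connection = new SqlConnection(connectionString))
using (var command = new SqlCommand(
    "SELECT SUBSTRING(VideoData, @offset, @size) FROM Videos WHERE Id = @id", connection))
{
    command.Parameters.AddWithValue("@size", chunkSize);
    command.Parameters.AddWithValue("@id", videoId);
    SqlParameter offsetParam = command.Parameters.Add("@offset", SqlDbType.BigInt);
    connection.Open();

    long offset = 1;                                       // SUBSTRING is 1-based
    while (true)
    {
        offsetParam.Value = offset;
        byte[] chunk = command.ExecuteScalar() as byte[];
        if (chunk == null || chunk.Length == 0)
            break;                                         // past the end of the data (or no such row)
        outputStream.Write(chunk, 0, chunk.Length);        // push only this chunk to the client
        offset += chunk.Length;
    }
}
outputStream.Close();                                      // signals PushStreamContent that we're done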
You may also want to review this similar question.
I am planning to create a SQLite table in my Android app. The data comes from the server via a web service.
I would like to know the best way to do this.
Should I transfer the data from the web service as a SQLite db file and merge it, should I get all the data as a SOAP request and parse it into the table, or should I use a REST call?
The general size of the data is 2 MB with 100 columns.
Please advise on the best approach to get this data quickly, with the least load on the device.
My workflow is:
Download a set of 20,000 addresses and save them to the device's SQLite database. This operation happens only once, when the app runs for the first time or when you want to refresh the whole app's data.
Update these records whenever there is a change on the server.
Now, I can get this data from the server either as JSON, as XML, or as a plain SQLite file. I want to know the fastest way to store this data in the Android database.
I tried all of the above methods and found that getting the database file from the server and copying its data into the database is faster than getting the data as XML or JSON and parsing it. Please advise whether I am right or wrong.
If you are planning to use sync adapters, then you will need to implement a content provider (or at least a stub) and an authenticator. Here is a good example that you can follow.
Also, you have not explained the use case of the web service well enough to decide which web service architecture to suggest. But REST is a good style for writing your services, and using JSON over XML is advisable because of its data-format efficiency (or, better yet, give Protocol Buffers a shot).
And yes, sync adapters are better to use, as they already provide a great set of features (e.g., periodic sync, auto sync, exponential backoff, etc.) that you would otherwise have to implement yourself in a background service.
To put less load on the device, you can implement a sync adapter backed by a content provider. You serialize/deserialize data when you upload/download it from the server. When you need to persist data from the server, you can use the bulkInsert() method of the content provider and persist all your data in a single transaction.
I have some data obtained from an API, which I display via a master-detail web page. The data I receive from the API is in JSON format, and I currently cache a serialised version of it to disk. All files are stored in a single folder. Each file is used for a maximum of one week, as new content is released every week. There can be up to 40,000 files; each file is about 12 KB, and a GUID is used as the filename.
What is the best caching strategy?
Keep as is.
Store the raw JSON instead of serialised data.
Replace the disk caching solution with a NoSQL solution like Redis.
Organise the files into folders
Use faster serialization / deserialization techniques
If you have plenty of RAM, then to retrieve the data faster you can avoid serialization and deserialization and keep the data directly in Redis as key-value pairs.
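A minimal sketch of that idea, assuming the StackExchange.Redis client; itemGuid and rawJson stand in for the GUID you already use as the filename and the raw API response:

// the ConnectionMultiplexer is intended to be created once and shared
ConnectionMultiplexer redis = ConnectionMultiplexer.Connect("localhost");
IDatabase cache = redis.GetDatabase();

// write path: store the raw JSON under the guid and let Redis expire it after a week
cache.StringSet(itemGuid.ToString(), rawJson, TimeSpan.FromDays(7));

// read path: a null result means a cache miss, so fall back to the API
string cachedJson = cache.StringGet(itemGuid.ToString());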
I've been having nothing but problems with BlackBerry development, and with SQLite for BlackBerry in general.
I'm thinking of alternatives for storing data on the device.
First of all, the data stored on the device comes from web service calls 99% of the time. The web service response can range from less than 0.5 KB up to 10 or maybe even 20 KB.
A lot of the trouble I've been having revolves around the fact that I use threads to make my web service calls asynchronous, and many conflicts arise between database connections. I've also been having trouble with a DatabaseOutOfMemoryException, which I haven't even found in the documentation.
Is storing the web service response as raw XML (in an .xml or .txt file on the device) and just reading it from there every time I want to load something in the UI a good idea? Right now I just get the raw XML as a string and parse it (using DocumentBuilder, etc.), storing the contents into different tables of my SQLite database.
Would doing away with SQLite and using XML exclusively be faster? Would it be easier? Would there be conflicts with read/write access to open files? My app has a lot of reading and writing going on, so I'd like to make it as easy as possible to manage.
Any ideas would be great, thanks!
You can use the persistent store instead of SQLite. One big advantage of the persistent store is that it is always available: no worries about missing SD cards or the filesystem being mounted while the device is connected over USB. By "big", I mean this is absolutely huge from a support perspective; explaining all the edge cases around when a SQLite database is usable on BlackBerry is a huge pain.
The biggest disadvantage of the persistent store is the 64 KB limit per object. If you know your XML fragments never exceed that, you're fine. However, if you might exceed 64 KB, then you'll need to come up with a persistable object that intentionally splits any large stream into components under 64 KB each.
We generate reports in our web application by querying our SQL Server for data returned as XML and then processing it with XSLT to create the final output.
To speed up the system, we removed all the static information from the returned SQL XML and cached a large XDocument with all the static info in it. Right before performing the XSL transform, we append the XDocument with the static info to the end of the XML that came from SQL Server. The static XDocument is about 50 MB and takes many seconds to build from SQL Server.
Our problem is that once we started caching a few of these large XDocuments, we hit the cache's private bytes limit and the cache was cleared. Rebuilding these XDocuments is too time-consuming to do while people are running reports. I have not tried saving these XDocuments to a physical file, because they are needed for every report run, which happens constantly throughout the day.
I've thought of installing AppFabric Cache, but I'm not sure it's a great idea to store 5 to 10 of these large items in it.
Any ideas? If I install more memory on the web server, will it automatically be available to ASP.NET for a larger cache? I've tried compressing the data before storing it in the cache (it shrank by a factor of 5), but decompressing it and reparsing the XDocument slowed the server way down.
Finally, just save it to a file as it is and then reload it as is, because it is already serialized.
protobuf-net is super fast and lightweight, and I have tested and used it, but it won't do any good here because the data is already serialized.
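A minimal sketch of the save-and-reload idea; the path and BuildStaticXDocumentFromSql() are placeholders for your own code (requires System.IO and System.Xml.Linq):

string cachePath = @"D:\cache\static-report-data.xml";

if (!File.Exists(cachePath))
{
    XDocument built = BuildStaticXDocumentFromSql();   // your existing, many-second build step
    built.Save(cachePath);
}

XDocument staticDoc = XDocument.Load(cachePath);       // a single disk read instead of rebuilding from SQL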
You can serialize the XML object in a binary format and store it in the database using a varbinary(max) column. I'm not sure about the performance of that, but it might be worth trying, since it won't take very long to implement.
Something else you might want to address is the performance penalty for the first user accessing the report. To avoid this, you could pre-generate the reports so they are cached for everyone.
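If you do try the varbinary(max) route, a minimal sketch might look like this; the ReportCache table and its CacheKey/StaticXml columns are assumptions (requires System.Text, System.Data.SqlClient and System.Xml.Linq):

byte[] xmlBytes = Encoding.UTF8.GetBytes(staticDoc.ToString(SaveOptions.DisableFormatting));
using (var connection = new SqlConnection(connectionString))
using (var command = new SqlCommand(
    "UPDATE ReportCache SET StaticXml = @data WHERE CacheKey = @key", connection))
{
    command.Parameters.AddWithValue("@data", xmlBytes);
    command.Parameters.AddWithValue("@key", "static-report-data");
    connection.Open();
    command.ExecuteNonQuery();
}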