Using SignalR to display row-by-row processing of an Excel file upload - signalr

I am trying to figure out how I can use FileUpload along with SignalR so that I can start processing the uploaded Excel file row by row (without waiting for the file to be fully uploaded).
I have a large Excel file being uploaded (it could be up to 2 GB, but on average about 100 MB). I want to display the progress as a percentage, as well as show each row that was processed and any error that occurred while processing that row.
Any links to an article would be appreciated.

I have created a decoupled message bus proxy (event aggregator proxy) for SignalR.
This fits your use case perfectly: in your case I would fire events while processing the file. These are automatically forwarded to the clients, and you can also constrain them so that only the user who uploaded the file sees the events generated by that upload.
Please check this blog post I wrote for an insight into the library:
http://andersmalmgren.com/2014/05/27/client-server-event-aggregation-with-signalr/
Demo:
https://github.com/AndersMalmgren/SignalR.EventAggregatorProxy/tree/master/SignalR.EventAggregatorProxy.Demo.MVC4
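
For context, here is a minimal sketch of the same idea using a plain SignalR 2 hub instead of the event aggregation library above; ProgressHub, reportProgress and the row-processing loop are illustrative names, not part of that library's API.

using System;
using System.Collections.Generic;
using Microsoft.AspNet.SignalR;

// Empty hub; the server pushes to its clients from outside via GetHubContext.
public class ProgressHub : Hub { }

public class ExcelProcessor
{
    // connectionId identifies the uploading client so only it receives updates.
    public void ProcessRows(IEnumerable<object[]> rows, int totalRows, string connectionId)
    {
        var hub = GlobalHost.ConnectionManager.GetHubContext<ProgressHub>();
        var processed = 0;

        foreach (var row in rows)
        {
            string error = null;
            try
            {
                // ... per-row processing of the Excel data goes here ...
            }
            catch (Exception ex)
            {
                error = ex.Message;
            }

            processed++;
            // The client-side JS defines reportProgress on the hub's client proxy
            // and updates the percentage, the row log, and any error in the UI.
            hub.Clients.Client(connectionId).reportProgress(
                processed, 100.0 * processed / totalRows, error);
        }
    }
}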

Network error triggering the download report (report generation) action in server.R twice

I have a Shiny application deployed on the RShiny Pro server. The main aim of the application is to process input Excel files and produce a report in the form of a Word document, which has a couple of tables and around 15 graphs rendered using ggplot.
This application works perfectly for input Excel files with fewer than approximately 3500-4500 rows for around 10 metrics.
Now I am trying to process an Excel file with around 4000-4500 rows for around 20 metrics. While processing this file, during report generation (R Markdown processing), a network error is shown on the UI only. Despite this error on the UI, the report file is still generated in the back end, but the generated report is never downloaded. After this error, the report generation action is triggered again automatically, resulting in two reports, which again are not downloaded.
From these observations, I came to the conclusion that on getting the network error, the download report (report generation and downloading) action is triggered again by server.R.
Has anyone been through such a strange situation? I am looking for guidance regarding the following problems:
What can be the reason for getting the network error only sometimes?
What is triggering the download report action twice?
Is there any option to specify the maximum session timeout period?
I have found answers to the above questions and have already answered them here.
Still, I would like to quickly answer the questions in the context explained above.
Reason for getting the network error: the user is presented with the network error only if the computation (in this case report generation) does not complete within 45 seconds. This is because the http_keepalive_timeout parameter is not defined in the server configuration, and its default value is 45 seconds.
Why was the download report action getting triggered twice? It was because the user's session with the server was being terminated during the computations that happen after clicking the Download action button. There is a parameter called reconnect in the Shiny Server configuration which is enabled by default. When a user's connection to the server is interrupted, Shiny Server will offer them a dialog that allows them to reconnect to their existing Shiny session for 15 seconds. This means the server keeps the Shiny session active for an extra 15 seconds after a user disconnects, in case they reconnect. After the 15 seconds, the user's session is reaped and they are notified and offered an opportunity to refresh the page. If this setting is false, the server will immediately reap the session of any user who is disconnected.
You can read about it in the Shiny Server documentation.
Option to specify the maximum session timeout period: yes, there is a parameter called http_keepalive_timeout. It allows you to specify the maximum session timeout period. You need to add the http_keepalive_timeout parameter to shiny-server.conf at the top level, with the timeout period you want in seconds, as shown below.
http_keepalive_timeout 120;
Read more about http_keepalive_timeout here.
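
For placement, a minimal shiny-server.conf sketch might look like the following; the run_as/server/location directives are the stock defaults, and only http_keepalive_timeout is the setting discussed above.

run_as shiny;

# Top level: allow long-running computations (e.g. report generation) up to 120 seconds
http_keepalive_timeout 120;

server {
  listen 3838;

  location / {
    site_dir /srv/shiny-server;
    log_dir /var/log/shiny-server;
  }
}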

Failing to upload JSON file through Chrome to Firebase Database

This is really frustrating. I have a 104 MB JSON file that I want to upload to my Firebase database through the web front end, but after a random period of time (I've timed it; it's not constant, anywhere from 2 to 20 seconds) I get the error:
There was a problem contacting the server. Try uploading your file again.
So I do try again, and it just keeps failing. I've uploaded files nearly this big before, and the limit for stored data in the Realtime Database is 1 GB, so I'm not even close to that. Why does it keep failing to upload?
This is the error I get in Chrome dev tools:
Failed to load resource: net::ERR_CONNECTION_ABORTED
https://project.firebaseio.com/.upload?auth=eyJhbGciOiJIUzI1NiIsInR5cCI6…Q3NiwiYWRtaW4iOnRydWUsInYiOjB9.CihvjvLSlx43nOBynAJeyibkBRtygeRlG4Yo1t3jKVA
Failed to load resource: net::ERR_CONNECTION_ABORTED
If I click on the link that shows up in the error, it's a page with the words POST request required.
It turns out the answer is to ignore the web importer entirely and use firebase-import. It worked perfectly the first time, and it only took a minute to upload the whole JSON. It also has merging capabilities.
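For illustration only, an invocation could look roughly like this; the exact flag names vary between firebase-import versions (newer ones authenticate with a service account key), so check firebase-import --help before relying on them.

# Assumed flag names; verify against your installed version of firebase-import
firebase-import \
  --database_url https://project.firebaseio.com \
  --path / \
  --json data.json \
  --service_account serviceAccountKey.json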
Using firebase-import as the accepted answer suggests, I get this error:
Error: WRITE_TOO_BIG: Data to write exceeds the maximum size that can be modified with a single request.
However, with the Firebase CLI I was able to delete my entire database:
firebase database:remove /
It seems to automatically traverse down your database tree to find requests that are under the size limit, then issues multiple delete requests. It takes some time, but it definitely works.
You can also import from a JSON file:
firebase database:set / data.json
I'm unsure if firebase database:set supports merging.

Getting Firebase timestamp before pushing data

I have a chat app powered by Firebase, and I'd like to get a timestamp from Firebase before pushing any data.
Specifically, I'd like to get the time at which a user presses the send button for a voice message. I don't actually push the message to Firebase until the upload has succeeded (so that the audio file is guaranteed to be there when a recipient receives the message). If I were to simply use Firebase.ServerValue.TIMESTAMP, there could be an ordering issue due to different upload durations. (A very short message following a very long one, for example.)
Is there any way to ask Firebase for a timestamp that I'm not seeing in the docs? Thank you!
If you want to separate the click from the actual writing of the data:
var newItemRef = ref.push();
uploadAudioAndThen(audioFile, function(downloadURL) {
  newItemRef.set({
    url: downloadURL,
    savedTimestamp: Firebase.ServerValue.TIMESTAMP
  });
});
This does a few things:
it creates a reference for the new item before uploading. This reference will have a push ID based on when the upload started. Nothing is written to the database at this point, but the key of the new location is determined.
it then does the upload and waits for it to complete.
in the completion handler of the upload, it writes to the new location determined in step 1.
it writes the server timestamp at that moment, which is when the upload finished.
So you now have two timestamps. One is when the upload started and is encoded into the key/push ID of the new item; the other is when the upload completed and is in the savedTimestamp property.
To get the 3 most recently started uploads that have already completed:
ref.orderByKey().limitToLast(3).on(...
To get the 3 most recently finished uploads:
ref.orderByChild('savedTimestamp').limitToLast(3).on(...

QNetworkAccessManager timeout

Presently I am working on an application which sends files to and receives files from a remote server. For the network operations I am using QNetworkAccessManager.
To upload a file I use QNetworkAccessManager::put(), and to download I use QNetworkAccessManager::get().
While uploading a file I start a timer with a timeout of 15 seconds. If I upload a small file, it completes within the timeout period, but if I upload a very large file I get a timeout. So how do I decide the timeout for uploading a large file?
The same happens when downloading a large file. I receive the file chunk by chunk via the readyRead() signal; here too, if I download a large file I get a timeout. So how do I decide the timeout for downloading a large file?
Use the QNetworkReply::uploadProgress() (or downloadProgress()) signal to tell you that the operation is making progress, and restart a 15-second timer on every uploadProgress/downloadProgress notification (starting the timer when the upload/download commences). If the transfer ever stalls, you can cancel the operation 15 seconds after the last update, as in the sketch below.
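
A minimal sketch of that idea in Qt/C++; IdleWatchdog is an illustrative name, while QNetworkReply::uploadProgress(), QNetworkReply::downloadProgress(), QNetworkReply::abort() and QTimer are standard Qt APIs.

#include <QNetworkAccessManager>
#include <QNetworkReply>
#include <QTimer>

// Aborts the reply if no progress signal arrives for 15 seconds,
// instead of limiting the total transfer time.
class IdleWatchdog : public QObject
{
public:
    explicit IdleWatchdog(QNetworkReply *reply) : QObject(reply)
    {
        m_timer.setSingleShot(true);
        m_timer.setInterval(15000); // 15 s of inactivity

        // Restart the inactivity timer whenever any bytes move.
        connect(reply, &QNetworkReply::uploadProgress,
                this, [this](qint64, qint64) { m_timer.start(); });
        connect(reply, &QNetworkReply::downloadProgress,
                this, [this](qint64, qint64) { m_timer.start(); });

        // Stop watching once the transfer finishes normally.
        connect(reply, &QNetworkReply::finished, &m_timer, &QTimer::stop);

        // No progress for 15 s: treat the transfer as stalled and abort it.
        connect(&m_timer, &QTimer::timeout, reply, &QNetworkReply::abort);

        m_timer.start(); // start counting when the transfer begins
    }

private:
    QTimer m_timer;
};

// Usage:
// QNetworkReply *reply = manager.put(request, file);
// new IdleWatchdog(reply); // parented to the reply, so it is deleted with it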

Seek in remotely hosted FLV problem with Flash Video component

I'm trying to open a video file (FLV) that is hosted remotely.
When I seek in the video to a start point other than 0, the player turns black and then nothing happens.
I see the progress bar (in Firebug) loading, so data is being received, but nothing is displayed in the video component.
Am I missing something?
The server has to support this.
When serving the file from the middle, the server has to regenerate it on the fly: it has to read the original header (to get the size, duration and so on), locate the closest keyframe, then write a new header and stream the file starting at the identified keyframe.
If the server doesn't support this, your player either loads the complete file and waits until it has buffered enough, or reads data from the middle and misses the header.
Typically this is solved by using lighttpd as the web server with mod_flv_streaming. See http://jan.kneschke.de/projects/flv-streaming
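
A minimal lighttpd configuration sketch for that setup, assuming the FLV files have keyframe metadata injected (e.g. with flvtool2 or yamdi) so byte-offset seeking is possible:

server.modules += ( "mod_flv_streaming" )

# Answer ?start=<byte offset> requests for .flv files with a regenerated header
flv-streaming.extensions = ( ".flv" )

The player then requests the file as video.flv?start=<byte offset of the nearest keyframe>, and the module streams a valid FLV from that point onward.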
