Seek in remotely hosted FLV problem with Flash Video component - FLVPlayback

I'm trying to open a video file (FLV) that is hosted remotely.
When I seek in the video to any start point other than 0, the player turns black and then nothing happens.
I can see the progress bar loading (in Firebug), so data is being received, but nothing is displayed in the video component.
Am I missing something?

The server has to support this.
When loading the file from the middle, the server has to regenerate the file on the fly: it has to read the original header (to get the size, duration, and so on), locate the closest keyframe, then write a new header and stream the file starting at that keyframe.
If the server doesn't support this, your player either loads the complete file and waits until enough of it has buffered, or reads data from the middle and misses the header.
Typically this is solved by using lighttpd as the web server with mod_flv_streaming. See http://jan.kneschke.de/projects/flv-streaming
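For illustration, here is a minimal Python sketch of what such a pseudo-streaming server does (in the style of mod_flv_streaming). The file name is a placeholder, and it assumes the player passes ?start= with the byte offset of a keyframe taken from the FLV's onMetaData keyframe index:

from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import urlparse, parse_qs
import os

VIDEO_PATH = 'video.flv'  # placeholder local file
# Minimal FLV header (signature, version 1, audio+video flags, header size 9)
# followed by the 4-byte PreviousTagSize0 field, as per the FLV spec
FLV_HEADER = b'FLV\x01\x05\x00\x00\x00\x09' + b'\x00\x00\x00\x00'

class FlvHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        start = int(parse_qs(urlparse(self.path).query).get('start', ['0'])[0])
        size = os.path.getsize(VIDEO_PATH)
        body = len(FLV_HEADER) + (size - start) if start > 0 else size
        self.send_response(200)
        self.send_header('Content-Type', 'video/x-flv')
        self.send_header('Content-Length', str(body))
        self.end_headers()
        with open(VIDEO_PATH, 'rb') as f:
            if start > 0:
                self.wfile.write(FLV_HEADER)  # fresh header so the player can parse mid-file data
                f.seek(start)                 # jump to the requested keyframe offset
            self.wfile.write(f.read())        # stream the rest of the file

HTTPServer(('', 8080), FlvHandler).serve_forever()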

Related

Uploading larger files with User-Agent python-requests/2.2.1 results in RemoteDisconnected

Using the Python requests library to upload larger files, I get the error RemoteDisconnected('Remote end closed connection without response').
However, it works if I change the library's default User-Agent to something like "Mozilla/5.0".
Does anybody know the reason for this behaviour?
Edit: It only happens with the property X-Explode-Archive: true
Is there any specific timeout pattern you could highlight in this case?
For example, does it time out after 60 seconds every time (something of that sort)?
I would suggest checking the logs from every component configured with the Artifactory instance, such as the reverse proxy and the embedded Tomcat. Since the issue is specific to large files, correlate the timeout pattern with the timeouts configured on each of these entities; that should give a hint towards the cause.
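For reference, overriding the default User-Agent on the upload looks like this with requests (the URL, credentials, and file name are placeholders):

import requests

url = 'https://artifactory.example.com/artifactory/repo/archive.zip'  # placeholder
headers = {
    'User-Agent': 'Mozilla/5.0',   # replaces the default python-requests/2.2.1
    'X-Explode-Archive': 'true',   # the property the failure seems tied to
}

with open('archive.zip', 'rb') as f:
    # Passing the file object streams the upload instead of reading it into memory
    resp = requests.put(url, data=f, headers=headers, auth=('user', 'password'))
resp.raise_for_status()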

Trouble writing to HttpServletResponse ServletOutputStream

I have a Spring application running on Tomcat 8.5 that queries our database, generates a PDF file, and then serves it to the user via the ServletOutputStream:
// Save the generated PDF (PDDocument from org.apache.pdfbox.pdmodel) into memory
ByteArrayOutputStream byteArrayOutputStream = new ByteArrayOutputStream();
this.document.save(byteArrayOutputStream);
// Tell the client exactly how many bytes to expect
int pdfDocumentSize = byteArrayOutputStream.size();
response.setContentLength(pdfDocumentSize);
// Copy the buffered PDF into the response and flush it to the client
ServletOutputStream resOutputStream = response.getOutputStream();
byteArrayOutputStream.writeTo(resOutputStream);
response.flushBuffer();
// Release the buffers and the source document
byteArrayOutputStream.close();
resOutputStream.close();
this.document.close();
For the vast majority of users, this works fine. However, we have received reports that some users are having a lot of trouble downloading this file. They click the download link, and the page sits for about three minutes before failing with an ERR_EMPTY_RESPONSE message. Occasionally, the file opens properly after this wait period.
Unfortunately, we are personally unable to replicate this issue.
According to our logs, the file is generated correctly. The file size is relatively small (105631 bytes, or about 0.1 MB), so I don't think it's due to size.
The logs also show a
org.apache.catalina.connector.ClientAbortException: java.io.IOException: Broken pipe
which would normally indicate that the user aborted the download. However, we watched the user reproduce this issue over a screencast, and no action was taken that would have aborted it. This user was also on a Mac. We don't have any Macs here to test on, but we do have an iPad, and the iPad was able to download the file successfully.
What could be causing this?
Update: We've heard from another user that they are also experiencing this problem. They are on Windows and using Chrome. Different browsers have been tried, but they all behave the same.
In addition, the exception appears in the log about 10 minutes after the time the error was reported.

Failing to upload JSON file through Chrome to Firebase Database

This is really frustrating. I have a 104 MB JSON file that I want to upload to my Firebase database through the web front end, but after a random period of time (I've timed it; it's not constant, anywhere from 2 to 20 seconds) I get the error:
There was a problem contacting the server. Try uploading your file again.
So I do try again, and it just keeps failing. I've uploaded files nearly this big before, and the limit for stored data in the Realtime Database is 1 GB;
I'm not even close to that. Why does it keep failing to upload?
This is the error I get in chrome dev tools:
Failed to load resource: net::ERR_CONNECTION_ABORTED
https://project.firebaseio.com/.upload?auth=eyJhbGciOiJIUzI1NiIsInR5cCI6…Q3NiwiYWRtaW4iOnRydWUsInYiOjB9.CihvjvLSlx43nOBynAJeyibkBRtygeRlG4Yo1t3jKVA
Failed to load resource: net::ERR_CONNECTION_ABORTED
If I click the link that shows up in the error, it's a page with the words POST request required.
Turns out the answer is to ignore the web importer entirely and use firebase-import. It worked perfectly the first time, and only took a minute to upload the whole JSON. It also has merging capabilities.
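For reference, an invocation looks roughly like this (flag names as I recall them from the firebase-import README; the database URL and file are placeholders, so check firebase-import --help for your version):

firebase-import --database_url https://project.firebaseio.com --path / --json data.json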
Using firebase-import as the accepted answer suggests, I get the error:
Error: WRITE_TOO_BIG: Data to write exceeds the maximum size that can be modified with a single request.
However, with the firebase-cli I was successful in deleting my entire database:
firebase database:remove /
It seems to traverse down your database tree automatically to find requests that are under the size limit, then issues multiple delete requests. It takes some time, but it definitely works.
You can also import via a json file:
firebase database:set / data.json
I'm unsure if firebase database:set supports merging.
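To illustrate the chunking idea, here is a rough Python sketch of what such a tool does against the Firebase REST API. The database URL, the size limit, and the omission of auth are all assumptions for the example:

import json
import requests

DB_URL = 'https://project.firebaseio.com'  # placeholder database URL
LIMIT = 10 * 1024 * 1024                   # assumed per-request size limit

def upload(path, value):
    # Write small values in one request; descend into large dicts and
    # write each child separately, the way firebase-import chunks data
    payload = json.dumps(value)
    if len(payload) <= LIMIT or not isinstance(value, dict):
        requests.put(f'{DB_URL}/{path}.json', data=payload).raise_for_status()
        return
    for key, child in value.items():
        upload(f'{path}/{key}' if path else key, child)

with open('data.json') as f:
    upload('', json.load(f))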

Using SignalR to display row by row processing of Excel fileUpload

I am trying to figure out how I can use FileUpload along with SignalR so that I can start processing the uploaded Excel file row by row (without waiting for the file to finish uploading).
I have a large Excel file being uploaded (it could be up to 2 GB, but consider 100 MB to be the average); I want to display the progress as a percentage, as well as show every row that was processed and any error that occurred while processing that row.
Any links to an article would be appreciated.
I have created a decoupled message bus proxy (EventAggregator proxy) for SignalR.
This fits your use case perfectly. In your case I would fire events while processing the file; these are automatically forwarded to the clients, and you can also add a constraint so that only the user who uploaded the file sees the events generated by that upload.
Please check this blog post I wrote for an insight into the library:
http://andersmalmgren.com/2014/05/27/client-server-event-aggregation-with-signalr/
Demo
https://github.com/AndersMalmgren/SignalR.EventAggregatorProxy/tree/master/SignalR.EventAggregatorProxy.Demo.MVC4

"size of remote file is not known" at fetch run time

When I ran a fetch command, the following message was output:
"size of remote file is not known"
Is this an error? Is there an option I can specify to make it go away? Or is it fine to just ignore it?
This is the FreeBSD fetch command, right?
That is not an error, just a warning. I don't think there's an option to suppress that warning, though.
This is because the HTTP server doesn't send the Content-Length header with the response. That way, the client doesn't know in advance how long the file is, and has to assume that it ends when the connection is closed by the server, with the side effect that if the connection drops prematurely, you'll end up with an incomplete download without knowing it.
This doesn't sound very good, but is in fact quite usual practice on the web, especially for dynamic content generated by scripts.
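You can check this yourself with a HEAD request; here is a quick sketch using Python's standard library (the URL is just an example):

from urllib.request import Request, urlopen

# Ask for the response headers only, without downloading the body
req = Request('http://www.example.com/file.bin', method='HEAD')
with urlopen(req) as resp:
    length = resp.headers.get('Content-Length')

if length is None:
    print('server did not send Content-Length')  # fetch prints its warning in this case
else:
    print('remote file is %s bytes' % length)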
Here is one way to check the size in PHP (getRemoteFileSize is a helper function, not a PHP built-in):
// File size of a remote image
echo "Weberdev Logo Size : " . getRemoteFileSize('http://www.weberdev.com/images/BlueLogo150x45.gif');
