Send fodder bytes until actual response data is ready? - asp.net

I need to serve MP3 content that is generated dynamically during the request. My clients (podcatchers I can't configure) are timing out before I'm able to generate the first byte of the response data.
Is there a way to send fodder/throwaway data while I'm generating the real data, to prevent the timeout, but in a way that allows me to instruct the client to ignore/discard the fodder data once I'm ready to start sending the "real" data?

If the first few bytes of the encoded content are always the same then you could very slowly send back those bytes. I'm not familiar with the MP3 file format, but if the first few bytes are always some magic (and constant) header, this technique could work.
Once the file encoding gets started you could then skip the first few bytes (since you already sent them) and continue from there.
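A minimal sketch of how that might look in an ASP.NET handler, assuming the first bytes of every encoded file really are constant. GenerateMp3 and KnownHeaderBytes are placeholders for your own encoder and its fixed header, not a real API:

using System.Threading;
using System.Threading.Tasks;
using System.Web;

public class Mp3StreamHandler : IHttpHandler
{
    // Placeholder: the constant bytes every encoded file is assumed to start with.
    private static readonly byte[] KnownHeaderBytes = { 0xFF, 0xFB, 0x90, 0x64 };

    public void ProcessRequest(HttpContext context)
    {
        context.Response.BufferOutput = false;        // push bytes as soon as they're written
        context.Response.ContentType = "audio/mpeg";

        // Start the slow encode in the background (GenerateMp3 is a placeholder).
        Task<byte[]> encode = Task.Run(() => GenerateMp3());

        // Trickle out the known header bytes, as slowly as the client will tolerate.
        int sent = 0;
        while (!encode.IsCompleted && sent < KnownHeaderBytes.Length)
        {
            context.Response.OutputStream.WriteByte(KnownHeaderBytes[sent++]);
            context.Response.Flush();
            Thread.Sleep(1000);
        }

        // Send the real data, skipping whatever was already trickled out.
        byte[] mp3 = encode.Result;
        if (mp3.Length > sent)
            context.Response.OutputStream.Write(mp3, sent, mp3.Length - sent);
        context.Response.Flush();
    }

    public bool IsReusable { get { return false; } }

    private byte[] GenerateMp3() { /* slow, dynamic encoding goes here */ return new byte[0]; }
}

The obvious limitation is that you can only stall for as many bytes as are genuinely constant; if the encoder's header varies at all, the client ends up with corrupt audio.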

You could have a default, static "hi, welcome to Lance's stream!" stream go out while you're generating the real deal.

Related

Implications of Encoding/Decoding and Encrypting/Decrypting with variable buffers

I have a program that encrypts and decrypts data using a symmetric key.
During the encryption process I:
Encrypt the data
Base64 Encode it
During decryption:
Base64 decode it
Decrypt the data
It works fine. Now I'm trying to do the process on a streaming buffer. Let's assume the encryption is done with the above-mentioned program on the bulk of the data, and only the decryption happens while the data is streamed.
In this scenario does the buffer size/chunk-size with which I encoded the data matter when I decode it?
As in, if I encoded the data in buffers of 3000 bytes, should I also read up to 3000 bytes at a time and decode? Or does this not matter?
Also when decrypting, should I decrypt using the same buffer-size I used to pass the data into the Cipher?
I tried with varying values with the standalone program and it works fine. However, when I try to do it as a stream:
Get some bytes
Decode it
Decrypt it
Save to file
For each subsequent set of decrypted bytes, I keep appending to the same file.
This way it seems to work for some sizes of data and not for others, and the final size of the data is short by 2-4 bytes.
Am I missing some important principle here? Or might I have made a mistake in the logic or a loop somewhere which causes some bytes to be left out?
If it's the latter I will dig deeper to check it.
Thank You
Shabir
Thanks for the hints above. I was able to solve the issue I had.
As mentioned in the comments above the buffer size itself did not matter much when decoding and decrypting the data as a stream.
However, the reason for the problem was that I was initializing the CipherOutputStream for every new chunk of incoming data.
Instead, when I initialized it only once at the beginning and maintained it across all the chunks of a single encrypted-data package, the flow worked without issue.
CipherOutputStream cipherOutputStream = new CipherOutputStream(byteArrOutputStream, cipher);
This was done once for all the chunks in the stream and it worked.
Thanks
Shabir

Reconstructing data sent over the network in UWP apps via streams

I'm experimenting with streams in UWP to send data from one machine to another over the network.
On the sending machine, I create a DatagramSocket, serialize the data I want to send into bytes, and write that to the output stream.
On the receiving machine, I create another DatagramSocket and handle the MessageReceived event to collect the sent data.
This appears to be working in the sense that when I send data from one machine, I do receive it on the other.
However, the data I'm serializing on the sender is, say, 8150 bytes in size, which I write to the stream.
On the receiving end, I'm only getting about 13 bytes of data instead of the full load I expected...
So it appears that I'm responsible on the receiving end for reconstructing the full data object by waiting for all the data to come in over what might be multiple streams...
However, it appears that I'm getting packets 1:1 -- that is, if I set a breakpoint right before the send and right after the receive, then when I write and flush the data to the output stream and send it, the receiving end triggers and I get what seems to be partial data, but I never get anything else.
So while I send 8150 bytes from the sending machine, the receiving end only gets a single packet about 13 bytes in length...
Am I losing packets? It seems to be a consistent 13 bytes, so perhaps it's a buffer setting, but the problem is the 8150 bytes is arbitrary; sometimes it's larger or smaller...
I'm obviously doing this wrong, but I'm so new to network programming I'm not really sure where to start fixing this; on a high level what's the proper way to send a complete memory object from one machine to another so that I can reconstruct an exact copy of it on the receiving end?
Okay so it turns out that the problem was that when I was writing to the output stream on the sender machine, I was using a regular StreamWriter and sending it the array as an object:
using (StreamWriter writer = new StreamWriter(stream))
{
    writer.Write(output);
    writer.Flush();
}
I believe this ends up writing the object itself using ToString() so what I was actually writing to the stream was "byte[]" or whatever the type was...
Instead I replaced this with a BinaryWriter and wrote the full output array, and the complete contents are now received on the other end!
using (BinaryWriter writer = new BinaryWriter(stream))
{
    writer.Write(output);
    writer.Flush();
}
I know this wasn't very well put together, but I barely know what I'm doing here :) Still, I hope this might be helpful to others.
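For anyone trying to reconstruct the data on the receiving end, here is a hedged sketch of what the MessageReceived handler might look like (using Windows.Networking.Sockets and Windows.Storage.Streams; OnMessageReceived is just an assumed handler name wired up with socket.MessageReceived += OnMessageReceived):

private void OnMessageReceived(DatagramSocket sender,
    DatagramSocketMessageReceivedEventArgs args)
{
    using (DataReader reader = args.GetDataReader())
    {
        // Read every byte of this datagram, however large the sender made it.
        byte[] payload = new byte[reader.UnconsumedBufferLength];
        reader.ReadBytes(payload);

        // Deserialize payload back into the original object here.
    }
}

Keep in mind that a single UDP datagram tops out at roughly 64KB, so anything larger still has to be split and reassembled by hand, or sent over a StreamSocket (TCP) instead.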

Beep Sound when Decoding DSP TrueSpeech To PCM

I'm trying to decode array of bytes from DSP TrueSpeech to PCM.
When we convert this array as part of streaming (dividing it into packets), we can hear some strange "Beep" tones after the decoding.
We tried to decode the entire WAV file in one piece and we didn't get those Beeps.
Currently we are using Alvas.net for it, but we also tried NAudio and got the same results.
My questions:
1) Is anyone familiar with this kind of behavior?
2) Do you have an idea what we can do?
Thanks
Ziv
How are you performing the decode? Often codecs maintain internal state, so it's important that you don't keep closing and re-opening the codec for each block of audio that you receive. In NAudio, that means just one AcmStream/WaveFormatConversionStream that everything you receive is passed through.
Also, make sure it is only compressed audio that is being passed into the codec. Sometimes when you receive audio over the network it is contained within some kind of larger packet that contains timing or encoding metadata (e.g. RTP).
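A rough sketch of that idea with NAudio's AcmStream (NAudio.Wave and NAudio.Wave.Compression), assuming TrueSpeech at 8kHz mono; trueSpeechFormat, ReceivePackets and pcmOutput are placeholders for your own source format, packet source and output stream:

// One conversion stream for the whole session, so the codec keeps its internal state.
WaveFormat pcmFormat = new WaveFormat(8000, 16, 1);
using (var acm = new AcmStream(trueSpeechFormat, pcmFormat))
{
    foreach (byte[] packet in ReceivePackets())           // placeholder packet source
    {
        Buffer.BlockCopy(packet, 0, acm.SourceBuffer, 0, packet.Length);
        int sourceBytesConverted;
        int converted = acm.Convert(packet.Length, out sourceBytesConverted);
        pcmOutput.Write(acm.DestBuffer, 0, converted);    // append the decoded PCM
    }
}

The key point is that the AcmStream is created once, outside the per-packet loop, and disposed only when the whole stream is finished.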
Bottom line: we have the packet data (an array of bytes) which we send to be decoded (returned as PCM), and then we write the newly decoded bytes into a new WAV file.
We're definitely going to try your suggestion regarding the stream with NAudio.
Regarding the bytes we're working on, they don't contain any garbage. We wrote a tester that streams the file directly (without the network) and got the same beep results.
Our solution works well with many other codecs (GSM, etc.) and only with TrueSpeech are we having this problem.
Therefore it seems to be some behavior of the TrueSpeech codec, but we didn't find any documentation about it.
Thanks Again
Ziv

J2ME multipart/form-data sending java.lang.OutOfMemoryError - help needed

I am trying to send an image from my MIDlet to an HTTP server. Images are converted into bytes and sent to the server using the HTTP multipart/form-data request format.
ByteArrayOutputStream bos = new ByteArrayOutputStream();
bos.write(boundaryMessage.getBytes());
bos.write(fileBytes);
bos.write(endBoundary.getBytes());
When the image size is less than around 500KB the code works fine, but when the size is greater it shows: Uncaught exception java.lang.OutOfMemoryError. I tried using Java ME SDK 3.0 and Nokia S40 5th edition FP1. Any help is greatly appreciated. Thanks for looking
I used the following class file: click here
Being forced to read the whole file into memory with the first getFileBytes() call, in order to transmit it in one piece, is most likely what's running the system out of memory.
Find a way to read about 100K, transmit it, then read another 100K, and so on until the whole file is done.
The HttpMultipartRequest class's constructor as written allows only for the transmission of the file as one single object. Even though it's an implementation of the MIME multipart content protocol, it is limited to the case of transmitting just one part.
The class can be modified to allow sending multiple parts. Have a look at the protocol specification RFC1341, especially the example half-way through.
With these three lines together as they are in the constructor, the whole file is sent in one part:
bos.write(boundaryMessage.getBytes());
bos.write(fileBytes);
bos.write(endBoundary.getBytes());
But in the multipart case, there needs to be multiple boundaries, before the endBoundary:
for (byte[] bytes = getMoreFileBytes(); bytes.length > 0; bytes = getMoreFileBytes()) {
    // write a boundary before each chunk of the file
    bos.write(boundaryMessage.getBytes());
    bos.write(bytes);
}
bos.write(endBoundary.getBytes());
As a quick fix, let the constructor open the file and read it 100k at a time. It already receives a fileName parameter.
The PHP script on the other end should reassemble the original file from the pieces.
I am not very familiar with the forum rules; I tried to comment on your answer but it shows negative.
Okay..
Now I am getting java.io.IOException: Persistent connection dropped after first chunk sent, cannot retry
Previously I tried to use the application/x-www-form-urlencoded request type with Base64 encoding, using kidcandy's code here: http://forums.sun.com/thread.jspa?threadID=538500
This code divides the image data into chunks to avoid the 'Persistent connection dropped' problem and creates a connection with the server using a 'for' loop. The problem is that the maximum chunk size may be only 500-700 bytes, so to send a 100KB image the code needs to create and close the connection about 200 times. I tried to run this on a Nokia 5310 phone and it behaves like it is hibernating... so it is not useful.
Now should I use that for loop for 'multipart/form-data' request?
What is the maximum chunk size for this type of request?
Or any other idea?
Regards

Base64 string Compression

I have an ActiveX control that gets an image from a fingerprint device as a base64 string. The control works great and I can transfer the returned base64 string to the server, where it is converted to binary data and saved to a database. I use ASP.NET as the server-side technology and JavaScript on the client side. The problem is that the base64 string is too large, and it takes 30 to 40 seconds for the string to be transferred to the server. My question is: is there any way to compress this base64 string on the client (browser) and decompress it back on the server?
If the base64 image is really a jpeg or some other compressed image format, I wouldn't expect you to be able to get much extra compression out of it in the first place. You'd also need to work out a way of posting the binary compressed data afterwards in a portable way, which may not be terribly easy. (You may be able to pretend it's a file or something like that.)
Just how large is this image? 30 seconds is a very long time...
On my Linux system, using the bzip2 utility (which applies the Burrows-Wheeler transform and then compresses), I reduce a JPEG encoded in Base64 from 259.6KB to 194.5KB.
Using gzip (which uses an LZ algorithm of some variety), I reduce it to 194.4KB.
So, yes, you can compress it. The question is why you would want to; it sounds as though your problems really lie elsewhere.
Base64 is a format that is usually only used to get around technical limitations of sending binary data. For example, a base64 string could be used in a GET request, like "website.com/page?q=BASE64ENCODED".
You have two options:
Figure out how to send/receive binary data in a POST request, then send the data in a different form and convert it appropriately on the server.
-OR-
Since this is a fingerprint device, I assume you're using it as a password, so you actually don't have to send the raw fingerprint data every time. Just make an SHA-1 hash of it and compare it to a stored hash on the server (see the sketch below). This is just as secure and will take a fraction of a second to upload.
Hope I helped!
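If you go the hashing route, the server-side comparison is small. A hedged sketch of the ASP.NET side (receivedImageBytes and storedHash are assumed to come from your request handling and your database):

using System.Linq;
using System.Security.Cryptography;

// Hash the received fingerprint data and compare it to the stored hash.
byte[] computedHash;
using (SHA1 sha1 = SHA1.Create())
{
    computedHash = sha1.ComputeHash(receivedImageBytes);   // receivedImageBytes: placeholder
}
bool matches = computedHash.SequenceEqual(storedHash);      // storedHash: placeholder

On the client you would need a JavaScript SHA-1 implementation producing the same digest, so that only the small hash, not the full image, ever goes over the wire.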
