Beep Sound when Decoding DSP TrueSpeech To PCM

I'm trying to decode an array of bytes from DSP TrueSpeech to PCM.
When we convert this array as part of streaming (dividing it into packets), we can hear some strange "beep" tones after decoding.
We tried decoding the entire WAV file in one piece and didn't get those beeps.
We're currently using Alvas.net for this, but we also tried NAudio and got the same results.
My questions:
1) Is anyone familiar with this kind of behavior?
2) Do you have an idea of what we can do?
Thanks
Ziv

How are you performing the decode? Often codecs maintain internal state, so it's important that you don't keep closing and re-opening the codec for each block of audio that you receive. In NAudio, that means just one AcmStream/WaveFormatConversionStream that everything you receive is passed through.
Also, make sure it is only compressed audio that is being passed into the codec. Sometimes when you receive audio over the network it is contained within some kind of larger packet that contains timing or encoding metadata (e.g. RTP).
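The exact receive path isn't shown, but as an illustration of that last point, here is a minimal, purely hypothetical sketch (in plain Java) of stripping a fixed-size RTP header before handing the payload to the decoder. It assumes the basic 12-byte header with no CSRC entries and no header extension; real code would inspect the CC and X bits of the first byte to compute the correct offset.
public final class RtpPayload {
    // Hypothetical helper: extract the compressed-audio payload from a basic RTP packet.
    // Assumes the minimal 12-byte header (no CSRC list, no header extension).
    static byte[] stripRtpHeader(byte[] packet) {
        final int RTP_HEADER_SIZE = 12;
        if (packet == null || packet.length <= RTP_HEADER_SIZE) {
            return new byte[0]; // header only: no audio payload to decode
        }
        byte[] payload = new byte[packet.length - RTP_HEADER_SIZE];
        System.arraycopy(packet, RTP_HEADER_SIZE, payload, 0, payload.length);
        return payload; // only this should be fed to the TrueSpeech decoder
    }
}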

Bottom line: we have the packet data (an array of bytes) which we send to be decoded (returned as PCM), and we then write the decoded bytes into a new WAV file.
We're definitely going to try your suggestion regarding the stream with NAudio.
Regarding the bytes we're working on, they don't contain any garbage. We wrote a tester that streams the file directly (without the network) and got the same beep results.
Our solution works well with many other codecs (GSM, etc.) and only with TrueSpeech do we have this problem.
It therefore seems to be some behavior of the TrueSpeech codec, but we didn't find any documentation about it.
Thanks again
Ziv

Related

How to get raw (un-decoded) byte arrays from a TCPSocket in Ruby

I've been using Ruby to set up a TCPSocket connection with a server and I've hit a snag. When receiving data from the socket (with either socket.gets or socket.recv) I get something like this:
x00\x03!\xB2\x00\x00*\xCF
What I get when I capture the packets in Wireshark is
x00\x03\x21\xB2\x00\x00\x2a\xCF
As you can see, the \x21 is decoded into the ASCII equivalent ! and the \x2a is decoded into the ASCII equivalent *.
I've checked and googled a ton of times and have not yet found a solution for getting the raw, un-decoded information. I have a parser built that will search for the relevant data in the stream and grab what I need, but I don't want to waste time re-encoding it before I have to decode it. Alternatively, I could incorporate ASCII into my parser, but that would be a huge pain. There are a lot of bytes in this stream and re-encoding them all would be time consuming. I also see that netcat returns the same output from the TCP stream that Ruby does; I could not figure out how to get netcat to output the un-decoded byte arrays either.
Code:
require 'socket'
s = TCPSocket.new "10.0.0.3", 27000
while true do
  item = s.recv(5000)  # read up to 5000 bytes from the socket
  puts item            # prints the bytes as a raw string
  puts item.inspect    # escaped representation, as shown above
end
This is my first foray into socket programming, so I apologize if I missed something very obvious.
I kind of invented the problem in my head, I am dumb.
To solve this, all you need to do is take the string of TCP information and call unpack("H*") on it like this:
"x00\x03!\xB2\x00\x00*\xCF".unpack("H*")
=> ["7830300321b200002acf"]
This is the same sequence of bytes as x00\x03\x21\xB2\x00\x00\x2a\xCF, just rendered as a single hex string.
Now I just need to adjust my parser to split it up, or to deal with the big clump of bytes.
More info on unpack

Pharo: read byte from Arduino serial port

I have an Arduino hanging off /dev/ttyUSB1, communicating at 115kbaud. The statements below work fine up to the 's next' method call, where Pharo hangs. The Arduino responds to the '99' command by sending a single character $1 back to the computer. If I pull out the cable, the program continues and s contains the character $1 just like it should, but not until I pull out the cable. So it's my impression that 's next' does not return after it reads just a single byte (ok, sure, there's nothing that says it should return after reading a single byte). How do I read a single byte from a stream in Pharo? Or how do I open a read/write byte stream? I haven't found anything in the source classes that seem to do this. I've tried setting the stream to ascii, to binary, to text, and it doesn't change the behavior.
s := FileStream oldFileNamed: '/dev/ttyUSB1'.
s readWrite.
s nextPutAll: '99'. "'99' is successfully received by Arduino"
s next. "hangs here"
s close.
Thanks for your help.
Take a look at the class side of FileStream. There you'll notice that you are getting a MultiByteStream (the concreteStream) when asking FileStream for an oldFileNamed:.
There can be a TextConverter or buffer involved. open:forWrite: of MultiByteStream is called, and that calls super. StandardFileStream>>open:forWrite: calls enableReadBuffering.
You probably want to call disableReadBuffering on your stream.
There is an Arduino package that has all these issues solved, take a look at this repo:
http://ss3.gemstone.com/ss/Arduino.html

How to send chunks of video for streaming using HTTP protocol?

I am creating an app which uses sockets to send data to other devices. I am using the HTTP protocol to send and receive data. Now the problem is, I have to stream a video and I don't know how to send (or stream) a video.
If the user jumps directly to the middle of the video, how should I send the data?
Thanks...
HTTP wasn't really designed with streaming in mind. Honestly the best protocol is something UDP-based (SCTP is even better in some ways, but support is sketchy). However, I appreciate you may be constrained to HTTP so I'll answer your question as written.
I should also point out that streaming video is actually quite a deep topic and all I can do here is try to touch on some of the approaches that you might want to investigate. If you have control of the end-to-end solution then you have some choices to make - if you only control one end, then your choices are more or less dictated by what's available at the other end.
If you only want to play from the start of the file then it's fairly straightforward: make a standard HTTP request and start playing as soon as you've buffered enough video that, at your current download rate, the rest of the file will finish downloading before playback catches up with it. You don't need any special server support for this, and any video format will work.
Seeking is trickier. You could take the approach that sites like YouTube used to take which is to simply not allow the user to seek until the file has downloaded enough to reach that point in the video (or just leave them looking at a spinner until that point is reached). This is not the user experience that most people will expect these days, however.
To do better you need to be in control of the streaming client. I would suggest treating the file in chunks and making byte range requests for one chunk at a time. When the user seeks into the middle of the file, you can work out the byte offset into the file and start making byte range requests from that point.
If the video format contains some sort of index at the start then you can use this to work out file offsets - so, your video client would have to request at least enough to get the index before doing any seeking.
If the format doesn't have any form of index but it's encoded at a constant bit rate (CBR) then you can do an initial HEAD request and look at the Content-Length header to find the size of the file. Then, if the user seeks 40% of the way through the video, for example, you just seek to 40% of the way through the encoded bytes. This relies on knowing enough about the file format that you can calculate an appropriate seek point and identify framing information and the like (or at least on an encoding format which allows you to resynchronise with both the audio and video streams even if you jump in at an arbitrary point in the file). This approach might also work with variable bit rate (VBR) encoding, as long as the format is such that you can recover from an arbitrary seek.
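To make the HEAD-plus-byte-range idea concrete, here is a minimal sketch in Java (the URL and the 40% seek position are placeholders; it assumes the server honours Range headers and answers with 206 Partial Content):
import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;

public class RangeSeekSketch {
    public static void main(String[] args) throws Exception {
        URL url = new URL("http://example.com/video.mp4"); // placeholder

        // 1) HEAD request to learn the total size of the file.
        HttpURLConnection head = (HttpURLConnection) url.openConnection();
        head.setRequestMethod("HEAD");
        long totalBytes = head.getContentLengthLong();
        head.disconnect();

        // 2) For a CBR file, 40% of the playing time is roughly 40% of the bytes.
        long offset = (long) (totalBytes * 0.4);

        // 3) Byte-range GET from that offset to the end of the file.
        HttpURLConnection get = (HttpURLConnection) url.openConnection();
        get.setRequestProperty("Range", "bytes=" + offset + "-");
        try (InputStream in = get.getInputStream()) {
            System.out.println("Status: " + get.getResponseCode()); // expect 206
            // feed 'in' to the decoder, resynchronising on the next frame boundary
        }
    }
}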
It's not ideal but as I said, HTTP wasn't really designed for streaming.
If you have control of the file format and the server, you could make life easier by making each chunk a separate resource. This is how Apple HTTP live streaming and Microsoft smooth streaming both work. They need tool support to pre-process the video, and I don't know if you have control of the server end. Might be worth looking into, however. These also do more clever tricks such as allowing a client to switch between multiple versions of the stream encoded at different bit rates to cope with differences in bandwidth.

Qt mp3 file to datastream

My primary objective is to send an mp3 file over the network using QDataStream, QTcpServer and QTcpSocket. But I have broken this task into smaller pieces. First I need to get the mp3 file into the correct format so that it can be "fed" to the data stream.
How am I supposed to accomplish this? I figured it would be easiest to use Phonon, but the MediaObject doesn't seem to offer any sort of getData method.
Any help on how I'm supposed to do this would be much appreciated. If needed I can explain more about this.
There is no "correct format". Also, your problem is not MP3-specific. You do the same for all files, regardless of what kind of data they contain. You open the file, read bytes from it and send those bytes until there's nothing left to send.
You don't need Phonon or anything MP3-related. You only need to open the file and read bytes from it. You then write those bytes to the socket using the write() function of your QTcpSocket object. You don't even need a QDataStream, since you're only dealing with data that you don't need to parse.
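The question is about Qt (QFile and QTcpSocket::write in C++), but just to illustrate the read-a-chunk, write-a-chunk loop described above, here is the same pattern sketched in plain Java; the file name, host and port are placeholders:
import java.io.FileInputStream;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.Socket;

public class SendFileSketch {
    public static void main(String[] args) throws Exception {
        try (InputStream file = new FileInputStream("song.mp3");   // placeholder path
             Socket socket = new Socket("192.168.0.10", 5000);     // placeholder host/port
             OutputStream out = socket.getOutputStream()) {
            byte[] buffer = new byte[4096];
            int read;
            // Read the file a chunk at a time and push each chunk to the socket.
            // The receiver writes the bytes back out in order to rebuild the file.
            while ((read = file.read(buffer)) != -1) {
                out.write(buffer, 0, read);
            }
            out.flush();
        }
    }
}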

J2me multipart/form-data sending java.lang.exception Out Of Memory Error - help needed

I am trying to send an image from my MIDlet to an HTTP server. Images are converted into bytes and sent to the server using the HTTP multipart/form-data request format.
ByteArrayOutputStream bos = new ByteArrayOutputStream();
bos.write(boundaryMessage.getBytes()); // opening boundary and part headers
bos.write(fileBytes);                  // the entire image, already loaded into memory
bos.write(endBoundary.getBytes());     // closing boundary
When the image size is less than around 500 KB the code works fine, but when the size is greater than that it throws: Uncaught exception java.lang.OutOfMemoryError. I tried using Java ME SDK 3.0 and Nokia S40 5th edition FP1. Any help is greatly appreciated. Thanks for looking.
I used the following class file: click here
Being forced to read the whole file into memory with getFileBytes(), in order to transmit it in one piece, is most likely what's running the system out of memory.
Find a way to read about 100K, transmit it, then read another 100K, and so on until the whole file is done.
The HttpMultipartRequest class's constructor as written allows only for the transmission of the file as one single object. Even though it's an implementation of the MIME multipart content protocol, it is limited to the case of transmitting just one part:
The class can be modified to allow sending multiple parts. Have a look at the protocol specification RFC1341, especially the example half-way through.
With these three lines together as they are in the constructor, the whole file is sent in one part:
bos.write(boundaryMessage.getBytes());
bos.write(fileBytes);
bos.write(endBoundary.getBytes());
But in the multipart case, there need to be multiple boundaries before the endBoundary:
// Read the file a chunk at a time; each chunk becomes its own part.
for (byte[] bytes = getMoreFileBytes(); bytes.length > 0; bytes = getMoreFileBytes()) {
    bos.write(boundaryMessage.getBytes()); // a boundary before every part
    bos.write(bytes);                      // one chunk of the file
}
bos.write(endBoundary.getBytes());         // closing boundary after the last part
As a quick fix, let the constructor open the file and read it 100k at a time. It already receives a fileName parameter.
The PHP script on the other end should reassemble the original file from the pieces.
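To make the quick fix concrete, here is a rough sketch of writing the body one ~100K part at a time, using plain Java I/O for brevity (on a MIDP device the input stream would come from a FileConnection and the output stream from the HTTP connection; the boundary string and part headers here are just illustrative). Because each chunk is written straight to the output stream, only about 100K needs to be in memory at once, and the server just concatenates the parts in order.
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

public class ChunkedMultipartSketch {
    static final int CHUNK_SIZE = 100 * 1024; // ~100K per part

    // Stream the file to 'out' as a series of multipart/form-data parts.
    static void writeBody(InputStream file, OutputStream out, String boundary)
            throws IOException {
        byte[] chunk = new byte[CHUNK_SIZE];
        int read;
        int part = 0;
        while ((read = file.read(chunk)) != -1) {
            String partHeader = "--" + boundary + "\r\n"
                    + "Content-Disposition: form-data; name=\"part" + part
                    + "\"; filename=\"part" + part + "\"\r\n"
                    + "Content-Type: application/octet-stream\r\n\r\n";
            out.write(partHeader.getBytes()); // a boundary and headers before each part
            out.write(chunk, 0, read);        // one ~100K chunk of the file
            out.write("\r\n".getBytes());
            part++;
        }
        out.write(("--" + boundary + "--\r\n").getBytes()); // closing boundary
        out.flush();
    }
}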
I am not very familiar with the forum rules; I tried to comment on your answer but it shows negative.
Okay..
Now I am getting java.io.IOException: Persistent connection dropped after first chunk sent, cannot retry
Previously I tried to use the application/x-www-form-urlencoded request type with Base64 encoding, using kidcandy's code here: http://forums.sun.com/thread.jspa?threadID=538500
This code divides the image data into chunks to avoid the 'persistent connection drop' problem and creates the connection with the server in a 'for' loop. The problem is that the maximum chunk size may be only 500-700 bytes, so to send a 100 KB image the code needs to create and close the connection 200 times. I tried to run this on a Nokia 5310 phone and it behaves like it is hibernating, so it is not useful.
Now should I use that for loop for 'multipart/form-data' request?
What is the maximum chunk size for this type of request?
Or any other idea?
Regards
