I am using Flex 3 and FMS3, from which I am sending a video stream. I want the user to be able to pause the stream, then resume it.
For this I am using the methods pause() and resume(). The problem is, when I call pause() the bufferLength is released and equals zero. Accordingly, when I resume, the NetStream needs to start buffering all over again, which means I lose all video from the second I paused until I press resume. That rather defeats the purpose of pause and resume.
Any help?
Please see my other question for this.
Create server-side DVR application to be able to record DVR in FMS
If possible, please close this one.
Regards Niclas
I'm using ReplyingKafkaTemplate to make a synchronous call with a reply. From what I've found so far, every time I'm going to use the template I call start(), and after receiving the response I call the stop() method. However, I came across a problem with the message commit: the offset of my consumer was not increasing. I assumed that is because the consumer did not have time to make a commit, since the default commit interval (property "auto.commit.interval.ms") is set to 5 seconds in the ConsumerConfig class and I'm stopping it immediately after receiving a message. So I changed this interval to 1 ms, to commit immediately after receiving a message. This way it works, but I would like to understand it better.
My question is: how should the start() and stop() methods be used properly? Is there a reason to start the container before every call and stop it afterwards? And what is the right way to make sure the commit was made?
Btw. I would be honored if Gary answered the question
You should not start and stop the container each time; just leave the reply container running all the time.
In any case, you should leave enable.auto.commit alone; while its default is true, Spring will set it to false unless you explicitly set it to true.
The container will commit the offset in a more deterministic manner than the built-in auto commit mechanism.
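For reference, here is a minimal sketch of how the reply container is usually wired up so it lives for the whole application. String keys/values and the topic/group names "requests", "replies" and "replies-group" are just illustrative, not taken from your code:

    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;
    import org.springframework.kafka.config.ConcurrentKafkaListenerContainerFactory;
    import org.springframework.kafka.core.ProducerFactory;
    import org.springframework.kafka.listener.ConcurrentMessageListenerContainer;
    import org.springframework.kafka.requestreply.ReplyingKafkaTemplate;

    @Configuration
    public class ReplyingTemplateConfig {

        // Declared as beans, the template and its reply container are started once
        // with the application context and simply left running.
        @Bean
        public ReplyingKafkaTemplate<String, String, String> replyingTemplate(
                ProducerFactory<String, String> pf,
                ConcurrentMessageListenerContainer<String, String> repliesContainer) {
            return new ReplyingKafkaTemplate<>(pf, repliesContainer);
        }

        @Bean
        public ConcurrentMessageListenerContainer<String, String> repliesContainer(
                ConcurrentKafkaListenerContainerFactory<String, String> factory) {
            ConcurrentMessageListenerContainer<String, String> container =
                    factory.createContainer("replies");
            container.getContainerProperties().setGroupId("replies-group");
            return container;
        }
    }

Each request/reply then just uses the template, with no lifecycle calls around it; the container commits the offset for you once the reply record has been processed:

    // "requests" is the request topic; replyingTemplate is the bean from above.
    ProducerRecord<String, String> record = new ProducerRecord<>("requests", "payload");
    RequestReplyFuture<String, String, String> future = replyingTemplate.sendAndReceive(record);
    ConsumerRecord<String, String> reply = future.get(10, TimeUnit.SECONDS);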
I'm creating a music app using only QML, and it's going really well. I'm now working on the track queue. I'm using QtMultimedia to play the tracks, and there is a property that could be used to play the next track when the current one has ended, but I don't understand how to get the signal.
Here is the doc I'm using: https://qt-project.org/doc/qt-5.0/qtmultimedia/qml-qtmultimedia5-audio.html
There's an EndOfMedia status that I was planning on using, but I don't understand how.
It seems reasonable to connect a Slot to the playbackStateChanged() or stopped() signal that checks the status to see if it is EndOfMedia and then plays the next track.
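For example, something along these lines (untested sketch; the queue handling is just illustrative, only the Audio element with its stopped() signal and EndOfMedia status comes from the QtMultimedia docs):

    import QtQuick 2.0
    import QtMultimedia 5.0

    Item {
        // Illustrative queue; replace with however you store your tracks.
        property var queue: ["track1.mp3", "track2.mp3", "track3.mp3"]
        property int currentIndex: 0

        Audio {
            id: player
            source: queue[currentIndex]

            onStopped: {
                // status is Audio.EndOfMedia only when playback reached the end,
                // not when the user explicitly stopped the track.
                if (status === Audio.EndOfMedia && currentIndex < queue.length - 1) {
                    currentIndex++   // source is bound to queue[currentIndex]
                    play()
                }
            }
        }

        Component.onCompleted: player.play()
    }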
I am using OpenAL to play sounds, and I have noticed that the sounds stop working after a call comes in and I press Decline.
I was able to trace it to the endInterruption not being called.
The problem is that this happens only about one time out of five when I try to reproduce it.
This suggests that my code is OK, because most of the time endInterruption does get called, but every so often iOS decides not to call it and I have no idea why.
Check if you are calling the "play" functions from a background thread instead of the main thread. If sometimes you initiate the play/playAtTime call from a background thread, you will not receive the endInterruption callback.
Interestingly though, the system does call beginInterruption even when the play call was initiated from a background thread.
Hope this helps.
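For example, a rough sketch (here `player` stands in for whatever object you use to start playback):

    // Make sure playback always starts on the main thread, so the interruption
    // callbacks (beginInterruption/endInterruption) behave consistently.
    - (void)playSoundFromAnyThread
    {
        if ([NSThread isMainThread]) {
            [self.player play];
        } else {
            dispatch_async(dispatch_get_main_queue(), ^{
                [self.player play];
            });
        }
    }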
By unresponsive I mean that after the first three successful connections, the fourth connection is initiated and nothing happens: no crashes, no delegate methods called, no data sent out (according to Wireshark)... it just sits there?!
I've been beating my head against this for a day and a half...
iOS 4.3.3
Latest Xcode; it happens the same way on a real device as in the simulator.
I've read all the NSURLConnection posts in the Developer Forums... I'm at a loss.
From my application delegate, I kick off an async NSURLConnection according to Apple docs, using the App Delegate as the delegate for the NSURLConnection.
From my applicationDidFinishLaunching... I trigger the initial two queries, which successfully return XML that I then pass off to an NSOperationQueue to be parsed.
I can even loop, repeating these queries with no issues; I repeated them 10 times and they worked just fine.
The next series of five queries is triggered via user input. The first query runs successfully and returns the correct response; then the next query is created and, when used to create an NSURLConnection (just like all the others), it just sits there?!
The delegate calls I normally see on all the other queries never happen.
Nothing goes over the wire, according to Wireshark.
I've reordered the queries, and regardless of the query, after the first one the next one fails (fails as in does nothing: no errors or aborts, it just sits there).
It's obviously in my code, but I am blind to it.
So what other tools can I use to debug the async NSURLConnection? How can I tell what it's doing, if anything?
Any suggestions for debugging an NSURLConnection, or other ways to accomplish the same thing an NSURLConnection does?
Thanks for any help you can offer...
OK tracked it down...
I was watching the stack dump in each thread as I was about to kick off each NSURLConnection. The first three were all on the main thread as expected... the fourth one ended up on a new thread?! In one of my NSOperation threads?!?!
As it turns out, I inadvertently added logic(?) that started one of my NSURLConnections in the last NSOperation's call to didFinishParsing:, so the NSURLConnection was started asynchronously and then the NSOperation terminated... >.<
So I'll move the NSURLConnection out of didFinishParsing: so it stays on the main thread's run loop, and I should be good!
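In case it helps anyone else, the fix amounts to hopping back to the main thread before starting the connection. A rough sketch (the signature of didFinishParsing: is made up here, and in real code you'd keep a reference to the connection):

    // Called from the NSOperation (a worker thread) when parsing finishes.
    // Start the next NSURLConnection on the main thread so its delegate callbacks
    // are scheduled on the main run loop, which keeps running after the operation ends.
    - (void)didFinishParsing:(NSURLRequest *)nextRequest
    {
        dispatch_async(dispatch_get_main_queue(), ^{
            (void)[[NSURLConnection alloc] initWithRequest:nextRequest
                                                  delegate:self
                                          startImmediately:YES];
        });
    }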
I have to detect the activityLevel of the microphone in Flex. I am using the activityLevel property of the Microphone class, but as I found out it always returns -1 even after I have called Microphone.getMicrophone().
To detect the activity level, it seems we have to call microphone.setLoopBack(true);
Does anybody know how to do this without using loopback? I do not want to hear my own sound back, just monitor the activity level.
Thanks
The microphone won't show any activity until it is attached to a NetStream connection. You can use a MockNetStream to fake the connection using OSMF - see my answer here.
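If you'd rather not pull in OSMF, here is a rough ActionScript sketch of the same idea: publish the microphone on a dummy NetStream so activityLevel starts updating, then poll it with a timer. The RTMP URL and stream name below are placeholders.

    import flash.events.NetStatusEvent;
    import flash.events.TimerEvent;
    import flash.media.Microphone;
    import flash.net.NetConnection;
    import flash.net.NetStream;
    import flash.utils.Timer;

    var mic:Microphone = Microphone.getMicrophone();
    mic.setSilenceLevel(0); // report all activity, however quiet

    var nc:NetConnection = new NetConnection();
    nc.addEventListener(NetStatusEvent.NET_STATUS, onStatus);
    nc.connect("rtmp://localhost/yourApp"); // placeholder FMS application

    function onStatus(e:NetStatusEvent):void {
        if (e.info.code == "NetConnection.Connect.Success") {
            var ns:NetStream = new NetStream(nc);
            ns.attachAudio(mic);                   // no setLoopBack(true), so you hear nothing
            ns.publish("activity-probe", "live");  // placeholder stream name

            var poll:Timer = new Timer(250);
            poll.addEventListener(TimerEvent.TIMER, function(te:TimerEvent):void {
                trace("activityLevel: " + mic.activityLevel); // stays -1 only until the mic is in use
            });
            poll.start();
        }
    }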