I'm trying to stop a QML video and show its last frame when playback has finished. Does anybody know how to do this? (Sorry, this seems not to be as trivial as it sounds...)
At the moment, my problem is that the Video element simply becomes invisible/hidden after playback is done. (onVisibleChanged is never called.)
When I use the hack in onStatusChanged in my code, the video disappears for a moment after the end and then shows the end of the video.
What I'm doing is simply:
Video {
    anchors.fill: parent
    fillMode: VideoOutput.PreserveAspectFit
    source: "path/to/file"
    autoPlay: true
    onStatusChanged: {
        console.warn("StatusChanged: " + status + " | " + MediaPlayer.Loaded)
        if (status == MediaPlayer.EndOfMedia) {
            // seek a bit before the end of the video since the last frames
            // are the same here, anyway
            seek(metaData.duration - 200)
            play()
            pause()
        }
    }
    onVisibleChanged: {
        console.log(visible)
    }
}
It's possible that I'm missing something, but I could not find anything on this topic in the docs. Also, using separate MediaPlayer and VideoOutput does not change the behavior.
For the record, I'm using the latest Qt 5.2 on Windows (msvc2010+OpenGL-build).
I'm still looking for a better solution to this. What I've come up with is to pause the video one second before it's over:
MediaPlayer {
    id: video
    autoLoad: true
    onPositionChanged: {
        if (video.position > 1000 && video.duration - video.position < 1000) {
            video.pause();
        }
    }
}
Why one second? On my machine if you try to pause it about 500ms before the end, the video manages to run to completion and disappear from view without even registering the pause() call. Thus 1 second is sort of a good safe value for me.
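The guard in onPositionChanged boils down to a pure predicate; here it is in plain JavaScript (hypothetical helper name, same arithmetic as the QML handler, all times in milliseconds):

```javascript
// True once playback is past the first second and within `marginMs` of the end,
// mirroring the onPositionChanged guard above. All times are in milliseconds.
function shouldPauseNearEnd(position, duration, marginMs = 1000) {
    return position > marginMs && duration - position < marginMs;
}

console.log(shouldPauseNearEnd(9500, 10000)); // true: near the end, pause now
console.log(shouldPauseNearEnd(500, 10000));  // false: still in the first second
```

The `position > marginMs` part guards against pausing immediately at startup, when position briefly reads close to zero.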
Frankly, I'd prefer a more explicit way to tell the MediaPlayer what to do at the end of the video. I know for a fact that GStreamer, which is what Qt uses on Linux, notifies you when the video is almost over so that you can decide what to do next - e.g. pause the video or loop it seamlessly.
I share your pain (mainly interested in OSX and iOS, where the same problem occurs). The only solution I have is to pair each video (which at least are "canned" app resources, not dynamic content off the net) with a png image of their final frame. When the video starts, enable display of the image under it (although it's not actually visible at that point). When the video ends abruptly, the image is left visible.
This works perfectly on iOS, but on (some?) Macs there may be a slight brightness jump between the video and the image (guessing: something to do with OSX display preferences' screen calibration not affecting video?)
An option to the MediaPlayer or VideoOutput element types to freeze on the last frame would indeed be much simpler.
One other possibility I've considered but not tried would be stacking two videos. The one on top is the main player, but the one underneath would just be seeked to, say, the last millisecond of the video and paused. Then when the main video finishes and disappears... there's as-good-as the final frame of the video showing there underneath. This'd basically be the same as the image-based solution, but with the image underneath being dynamically created using a video player. That said, I've found mobile hardware video and Qt's wrapping of it temperamental enough (admittedly more so in the earlier days of Qt 5) that I really don't want to try anything too clever with it.
A few years later and I am facing the same issue. However, I found a workaround (for Qt >= 5.9) that allows me to pause the video within 100 ms of the end:
It seems that the issue is related to the notifyInterval property (introduced in Qt 5.9). It is by default set to 1000ms (same as the 1000ms observed in other answers, not a coincidence I believe). Thus, changing it to a very small value when the video is almost done allows for catching a position very close to the end and pausing the video there:
Video {
    id: videoelem
    source: "file:///my/video.mp4"
    notifyInterval: videoelem.duration > 2000 ? 1000 : 50
    onPositionChanged: {
        if (position > videoelem.duration - 2 * notifyInterval) {
            if (notifyInterval == 1000)
                notifyInterval = 50
            else if (notifyInterval == 50)
                videoelem.pause()
        }
    }
    onStatusChanged: {
        if (status == MediaPlayer.Loaded)
            videoelem.play()
    }
}
Hope this helps somebody!
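The two-step interval logic in that handler can be factored out as a tiny state machine; a plain-JavaScript sketch (hypothetical function name, same arithmetic as the QML above, times in milliseconds):

```javascript
// Decide what onPositionChanged should do near the end of the video:
// first tighten notifyInterval from 1000 ms to 50 ms, then pause.
function nearEndAction(position, duration, notifyInterval) {
    if (position > duration - 2 * notifyInterval) {
        return notifyInterval === 1000 ? "tighten" : "pause";
    }
    return "none";
}

console.log(nearEndAction(5000, 10000, 1000)); // "none": nowhere near the end
console.log(nearEndAction(8500, 10000, 1000)); // "tighten": switch to 50 ms updates
console.log(nearEndAction(9950, 10000, 50));   // "pause": within 100 ms of the end
```

The `2 * notifyInterval` margin guarantees at least one more position update arrives before the end, whichever interval is in force.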
In the MediaPlayer element: pause the video 100 ms before the end, based on its duration, using a Timer:
MediaPlayer {
    id: video1
    onPlaybackStateChanged: {
        if (playbackState == MediaPlayer.PlayingState) {
            durTrig.stop();
            durTrig.interval = duration - 100;
            durTrig.restart();
        }
    }
}

Timer {
    id: durTrig
    running: false
    repeat: false
    onTriggered: {
        video1.pause();
    }
}
An image often being the easiest way to explain something, here is a little screengrab of the problem I'm having:
If you look at the right side of the window, you can see that the content is resized with a visible lag / delay. It's a problem that happens in quite a lot of applications, but I was wondering if there is a way to fix this in a Qt application using QQuickView and QML content.
Basically my application is created like this:
QQuickView view;
view.resize(400, 200);
view.setResizeMode(QQuickView::ResizeMode::SizeRootObjectToView);
view.setSource(...);
The QML's content is just an item with 2 rectangles to highlight the problem.
Edit: here is a simplified version of the QML file (yes, the simplified version also suffers from the same problem ;p)
import QtQuick 2.12

Item {
    Rectangle {
        color: "black"
        anchors { fill: parent; margins: 10 }
    }
}
Edit2: Running this small QML snippet through the qmlscene executable also shows the same delay / lag.
Edit3: The same problem occurs on some Linux distros but not on others: on my Ubuntu it works fine, but on my CentOS 7 it shows the same delay / glitches as on Windows. Both Qt versions were 5.12.3. On an old OSX it works fine (tested on Qt 5.9). I'm really lost now ^^
Is there any way to prevent this kind of delay? The solution will probably be platform-specific, since the problem seems to come from the fact that the native frame is resized before Qt has a chance to handle the event, so the content gets resized with a one-frame delay... but I'd like to know if anyone has an idea on how to handle this.
Any help or pointer appreciated :)
Regards,
Damien
As you mentioned in your update, the content gets resized with a one-frame delay.
There is a quite simple hack to handle this: use a nativeEventFilter, handle WM_NCCALCSIZE with pMsg->wParam == TRUE, and remove 1 px from the top or from the bottom.
if (pMsg->message == WM_NCCALCSIZE && pMsg->wParam == TRUE)
{
    LPNCCALCSIZE_PARAMS szr = reinterpret_cast<LPNCCALCSIZE_PARAMS>(pMsg->lParam);
    if (szr->rgrc[0].top != 0)
    {
        szr->rgrc[0].top -= 1;
    }
}
Regards, Anton
Try setting one of the following render-backend attributes (before constructing the application object):

QCoreApplication::setAttribute(Qt::AA_UseDesktopOpenGL);
QCoreApplication::setAttribute(Qt::AA_UseOpenGLES);
QCoreApplication::setAttribute(Qt::AA_UseSoftwareOpenGL);

There is no obvious lag after I use desktop OpenGL. If the lag persists, try the following code:
if (msg == WM_NCCALCSIZE) {
    *result = WVR_REDRAW;
    return true;
}
I am trying to implement custom animations on a ng-repeat list. When an element is removed, all the elements that are below it go up.
This is done using a CSS animation on the transform attribute. At the beginning the element is not actually removed from the ng-repeat list (there is an animation on the opacity attribute).
At one point, I should actually delete the element from the list. And, at the same time, I need to wind back the animation on the other elements, that have been artificially put too high.
Here is the HTML:
<div ng-repeat="card in cards" ng-style="shouldBeUpped ? uppedStyle : ''">...</div>
And here is the JS:
$timeout(function() {
    $scope.shouldBeUpped = false;
    $scope.cards.splice(index, 1);
}, 1000);
The problem is that $scope.cards.splice(index, 1); and $scope.shouldBeUpped = false; are not simultaneous. There is a small but noticeable delay (maybe 20 or 30 milliseconds) that looks very bad, because in the meantime there is a blank space on the screen ($scope.shouldBeUpped = false; is rendered before $scope.cards.splice(index, 1);).
Do you know what I can do please?
I heard about ng-leave and ng-move classes, but the examples I found on the web don't work for me... (I am using Angular 1.4).
I think your problem is $timeout. You're adding a 1-second delay there.
I'm not sure how the view could know about $scope.shouldBeUpped = false; earlier than $scope.cards.splice(index, 1);, since all the changes are reflected when $scope.$apply() runs.
I want to fade out the volume of an audio file while fading in the sound of a video in QML.
This should be no problem with animations, but I'm hitting a wall here.
It seems like the volume property is somehow shared between all instances of all media elements in QML. See for example the following code:
Rectangle {
    id: mainScreen
    focus: true

    Video {
        id: video
        anchors.fill: parent
        source: "path/to/file.mp4"
        volume: 1
        onVolumeChanged: console.warn("video: " + volume)
        autoPlay: true
    }

    Audio {
        id: audio
        source: "path/to/file.mp3"
        volume: 1
        onVolumeChanged: console.warn("audio: " + volume)
        autoPlay: true
    }

    Keys.onPressed: {
        audio.volume = Math.random();
    }
}
When I press a key, the onVolumeChanged-Handlers of both video and audio are called.
Is there a way to control the volume of the elements independently?
Or should I file a Qt bug report? This is the OpenGL MSVC2010 build of Qt 5.2.0 in case it matters.
Kakadu, you were right. I hit this bug and the provided patch fixes it!
With that patch, everything works as-is.
I'm implementing WebRTC video chat. I want to implement the following case:
By default video element has background-image via css and if there are no video input then user see his (or interlocutor's) avatar:
No video expected result:
No video actual result:
As you can see from the screenshots I have black rectangles above my fancy backgrounds. I want to make this ugly black rectangle transparent and keep my video's backgrounds visible.
Actually it will be awesome to resolve the problem without introducing any additional markup.
Appreciate your help =)
Update:
"No video" means that user/users don't have web cams and stream has only audio track.
Bingo!
Reading the documentation in depth gave some results =) It was as easy as:
<video poster="image.jpg">
One simple attribute made me happy
Try Alpha transparency in Chrome video or waitUntilRemoteStreamStartsFlowing.
function onaddstream(event) {
    remote_video.src = webkitURL.createObjectURL(event.stream);
    // remote_video.mozSrcObject = event.stream;
    waitUntilRemoteStreamStartsFlowing();
}

function waitUntilRemoteStreamStartsFlowing() {
    if (!(remote_video.readyState <= HTMLMediaElement.HAVE_CURRENT_DATA
          || remote_video.paused || remote_video.currentTime <= 0)) {
        // remote stream started flowing!
    } else {
        setTimeout(waitUntilRemoteStreamStartsFlowing, 50);
    }
}
I'm trying to get styles applied to a page only when the page is projected on the wall (by a projector, when someone is giving a presentation). As the moment, I can only get this in Opera in fullscreen mode.
Is there any way to get @media projection to take effect in other browsers? Also, is there a way to make it apply only to the projection, and not the laptop it's projecting from?
If not, are there any viable workarounds to this? I am trying to create a slideshow in css, but also offer a "presenter view" with extra controls on the laptop of the presenter.
Any help in any surrounding area is much appreciated.
@media projection is an abstract concept. In practice, projection can be 'on' only on special kinds of devices with custom browser builds.
On desktop/laptop with projector attached as an external monitor there is no way for the browser to know what kind of additional monitor is used (if any) for viewing.
The only option for you is to put <button>"Fullscreen" mode</button> and to use something like:
$(button).click( function() { $(document.body).toggleClass("fullscreen") } );
And use styles:
body { ... }
body.fullscreen { ... }
If the projector's output is a different resolution than your laptop monitor, you can use a CSS media query to control the display of an extra element inside each slide, with notes for the presenter.
For example, let's say the laptop is 1024x768, the projected screen is 1280x800, and the notes are inside an element with the class name "notes" -- you'd do something like this:
.slide > .notes
{
    display: none;
}

@media projection and (width: 1280px)
{
    .slide > .notes
    {
        display: block;
    }
}
It would still require the projector and the laptop to be different screens (like using two monitors), but with that as a given, it totally works -- I've done this for real.
I use Opera in fullscreen mode whenever I give presentations; I also use a Mac OS X app called "Mira", which allows you to configure the Apple Remote to send keystrokes to applications. By mapping the "Fwd" and "Back" keys on the remote to "page-up" and "page-down" in Opera, I can use the remote to step through the slides :-D