Recording Video using Qt 5.4

I am building a cross-platform application to record multimedia files for ongoing processing. This is based on an inherited application and I am not able to re-write using alternative libraries.
My current issue is that QMediaRecorder does not appear to save the video file onto the local drive. I have temporarily hard-coded the file to be saved as banana.mov in the user's root folder.
When executed, the output file is not being saved.
I have tried forcing the resolution as suggested here, and have seen that others have had issues recording on Windows while OS X was fine.
Development environment: OS X 10.10 with Qt 5.4 (the same issue is also happening on a Windows 8.1 machine using Qt 5.3).
This code on Github is based on the camera example, with additional debug code added when trying to identify and reproduce the issue.
While investigating, QMediaRecorder::supportedAudioCodecs and QMediaRecorder::supportedVideoCodecs both return empty lists. This happens in both the OS X and Windows environments.
The debug output is as follows:
Status change SIGNAL 'The recorder is initializing.'
Output location file:~/banana.mov
2015 01 05 14:59:58.111 Number of supported AUDIO Codecs 0
2015 01 05 14:59:58.111 Number of Audio sample rates 0
2015 01 05 14:59:58.111 Number of Video Codecs 0
2015 01 05 14:59:58.111 Number of Video Frame Rates 0
2015 01 05 14:59:58.111 Number of Containers 0
Location Changed SIGNAL 'file:~/banana.mov'
State change SIGNAL 'The recording is requested.'
Recording should have started
2015 01 05 14:59:58.111 Number of supported AUDIO Codecs 0
2015 01 05 14:59:58.111 Number of Audio sample rates 0
2015 01 05 14:59:58.111 Number of Video Codecs 0
2015 01 05 14:59:58.111 Number of Video Frame Rates 0
2015 01 05 14:59:58.111 Number of Containers 0
Status change SIGNAL 'Recording is requested but not active yet.'
I have a feeling that I'm missing something really obvious, I just haven't spotted it quite yet!
edit 1 The obvious thing is that the status is 'Recording is requested but not active yet' and not 'Recording is active'. I'm currently trying to work out why the recording has not started.
edit 2 The audio recorder example does record and save an audio file. It looks like QMediaRecorder does not return a list of available audio codecs, but QAudioRecorder does. I am getting the same results on both Windows 8.1 using Qt 5.3 and OS X using Qt 5.4.
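For reference, a minimal sketch (not the actual code from the linked repository; the object and function names are mine) of the kind of capability query that produces the debug output above, assuming a Qt 5 QCamera-backed QMediaRecorder alongside a plain QAudioRecorder:
// Hedged sketch only (Qt 5); "dumpCapabilities" and the object names are illustrative.
#include <QCamera>
#include <QMediaRecorder>
#include <QAudioRecorder>
#include <QDebug>

void dumpCapabilities()
{
    QCamera camera;                   // default camera device
    QMediaRecorder recorder(&camera); // video recorder bound to that camera

    // These are the lists that come back empty in the debug output above.
    qDebug() << "Video codecs:" << recorder.supportedVideoCodecs();
    qDebug() << "Audio codecs:" << recorder.supportedAudioCodecs();
    qDebug() << "Containers:" << recorder.supportedContainers();

    // QAudioRecorder, by contrast, does report its audio codecs (see edit 2).
    QAudioRecorder audioRecorder;
    qDebug() << "QAudioRecorder codecs:" << audioRecorder.supportedAudioCodecs();
}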

It's highly likely you're looking at an OS-specific artifact rather than a core problem with Qt.
I've seen this issue so many times in so many toolkits and frameworks that it scares me that no one has yet figured out an elegant solution.
The Basis of the Problem
Most operating systems implement some kind of protection for critical system files.
Under *nix, this takes the form of the user/group permissions system; under Windows it's similar, but with the UAC (User Account Control) subsystem.
What this boils down to is that you often can't just pick an arbitrary location to write files to, not without first seeking permission from the various security APIs and mechanisms in the OS to do so.
The second half of the problem then, comes from variable expansion.
In *nix particularly, the tilde character '~' is expanded by the shell to mean the user's home directory.
When we say shell, we mean bash, tcsh, csh or whatever environment you're running the app in.
Into this mix we'll also put the desktop environment, as most things like KDE, GNOME, Unity or whatever else is being used have some kind of operating system call that, when a '~' is passed to it, knows to convert it to '/home/neil/' or whatever it needs to be expanded to.
Windows has a similar thing, whereby you can make an OS call and say 'Hey Mr Operating System, where are my user folders stored?', to which it will happily reply with something like 'c:\users\'.
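As an aside, Qt itself wraps those "where do my user folders live" OS calls, so you rarely need to make them directly. A quick illustrative sketch (the function name is mine):
// Illustrative only: Qt's wrappers around the user-folder lookups described above.
#include <QDir>
#include <QStandardPaths>
#include <QDebug>

void showUserPaths()
{
    // Resolves to /home/neil, /Users/neil or C:/Users/neil depending on the OS.
    qDebug() << QDir::homePath();

    // The per-user Movies/Videos folder, again resolved by the OS for you (Qt 5).
    qDebug() << QStandardPaths::writableLocation(QStandardPaths::MoviesLocation);
}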
Why The Problem Manifests The Way It Does
Simply because when you're in charge of creating the path strings yourself, in your own application, things are often not passed to these various OS calls for expansion and security clearance; you pretty much have to make sure you're the one that calls them.
The exception to this rule is if you're using the *nix philosophy of small tools combined to do one job. In this case, you would often pass a result to a shell-based program, which, because it's running via the shell, knows that if it sees '~' it has to expand it.
Because you're using direct file access and managing your own file paths, when you thought you were writing to '/home/neil/file.mov' you were in actual fact trying to write a file called 'file.mov' into a folder called '~' under the current folder your app is running from, and I'm willing to bet that folder doesn't exist.
Add to this the fact that a lot of these frameworks (Qt is no exception) are designed to hide and abstract away all the ugly details, and there's a good chance it consumed the OS error that was generated when you actually tried to write the file; had it not, your app would most likely have crashed with some kind of exception dialog.
How to Solve The Problem
There are 3 approaches you can take to mitigating this.
1) You can hard-code the paths; that is, you can explicitly tell the application to always store your file at '/home/neil/videos/blah.mov'. This has the disadvantage, however, that every user of the app will need a custom build, as it's unlikely 'person2' will have write permissions on 'neil's home directory.
2) You can build in functionality that gives the user a dialog box and asks them where they would like to save the file. Since you're using something like Qt this should be very easy; most of these UI toolkits have built-in functionality to present the user with such an experience.
3) You can find out whether your framework or the underlying OS has any calls that let you ask who the current user is and where their home directory is; you can then use the returned information to dynamically build a fixed path similar to that in option 1. Doing things this way ensures that the application adapts automatically to its environment irrespective of the user.
Personally, I generally adopt option 1 during development, then switch to option 2 once development is complete; I very rarely use option 3 in desktop-based software.
Option 3 for me is mostly used when the application in question is designed to do a single job, such as converting a file for another process to work with, or generating some output for a server to display in a web page, etc.
For you, right now, as a solution to this question, option 1 is your best bet.
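To tie that back to the original symptom, here is a hedged sketch of how option 3's lookup could be combined with the questioner's QMediaRecorder so that the output location is an absolute path rather than a literal '~' (the function and variable names are mine):
// Sketch only: build an absolute, already-expanded output path instead of "~/banana.mov".
// Assumes "recorder" is the QMediaRecorder from the question (Qt 5).
#include <QMediaRecorder>
#include <QStandardPaths>
#include <QUrl>
#include <QDir>

void setOutputPath(QMediaRecorder *recorder)
{
    // Ask the OS for the current user's Movies folder (option 3 above).
    const QString moviesDir =
        QStandardPaths::writableLocation(QStandardPaths::MoviesLocation);

    // Hand the recorder a fully resolved local-file URL.
    recorder->setOutputLocation(QUrl::fromLocalFile(QDir(moviesDir).filePath("banana.mov")));
}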

To add to Shawty's great answer, I found that Qt does not support video recording on Windows: http://doc.qt.io/qt-5/qtmultimedia-windows.html
This might be helpful: https://github.com/kibsoft/QtMEL

Related

Qt 5.13.2.0 possible malware Variant.Adware.Kazy.795337 in qwebp.dll

Today we received info from one of our customers about this malware detection:
Gen:Variant.Adware.Kazy.795337
It's only inside the qwebp.dll file, which is attached to our project by the qtdeploy process.
We're building 32-bit Qt (5.13.2.0) from source, and the same issue is reported for the same DLL no matter where it was built. We're using the latest VS 2019.
https://www.virustotal.com/gui/file/9f09c05803ad4ffcd99454c420a840e17549ee711690fb1f11fd1b59bccc3b23/detection
https://www.virustotal.com/gui/file/80c4c747d781a27c72de71c0900ccc045aefd2b4e4f17c949aaeeb3d0b7973b1/detection
When I scanned the older version (5.13.0.0), everything was OK; previous versions seem to be clean:
https://www.virustotal.com/gui/file/b7b7cacaef0e76439ef8c367c401524e93dfa00c9ca67a20290e829fec325a5a/detection
Also, any debug build and 64-bit builds are clean too.
Any idea what can cause this? Can anyone else please try to scan this file?
Thanks
TL;DR: It is probably nothing, but notify Qt anyway (and check your own systems).
Are you using the prebuilt Qt binaries or are you compiling the sources yourself?
If you are using the official prebuilt binaries, I'd of course expect the Qt dev team to scan them and verify that they don't accidentally spread malware, but there is always the minuscule chance of something slipping through.
The same goes for the sources: while their review process should be thorough enough to avoid malicious code being slipped in, there is still the outside chance of either a key account being compromised or (even more unlikely) bad code being added slice-by-slice over a longer time period to avoid detection (along the lines of the Underhanded C Contest). Still, either case seems rather unlikely.
Bottom line: while this does sound like (and probably is) a false positive, you may still want to raise an issue with Qt, e.g. on their bug tracker or directly with Qt support (if you have a commercial license), to be sure. Also (if you haven't done so already) verify that the problem is not on your end, e.g. that your computers are clean and that you aren't just randomly catching/detecting your own infection in that file.
Update:
A ticket concerning this issue was opened (I assume by Ludek Vodicka) on the Qt bug tracker. It was opened on Nov 19th and categorized as P1: Critical, but unfortunately there is no indication that it is actually being worked on (at least as of Dec 18th).

Obtain data from remote computers using AutoIt

I have been maintaining a program written in batch. I want to write a replacement program using AutoIt.
The program is downloaded to the desktop of remote computers and prints out a log of the scan results in Notepad on the desktop.
I want it to cover Windows XP, Vista, 7, 8, 8.1 and 10. At the moment it does not cover 8, 8.1 or 10.
This is the printout:
Results of my test version 001
Windows 7 x86 Service Pack 1 ---- (shows in brackets if service pack is out of date)
(UAC) --- shows if UAC is on or disabled.
Internet Explorer----(shows if out of date)
Antivirus/Firewall Check:
Windows Firewall Enabled!
Panda Free Antivirus
WMI entry may not exist for antivirus; attempting automatic update.
Anti-malware/Other Utilities Check:
CCleaner
Java 8 Update 31 (Java version out of Date!)
Adobe Flash Player 17.0.0.188
Adobe Reader XI
Mozilla Firefox (38.0.5)
Thunderbird (38)
System Health check
Total Fragmentation on Drive C: 2%
````````````````````End of Log``````````````````````
So this is possible. To get the versions of files (like Java and Firefox) I think you can use FileGetVersion.
To check whether the Windows Firewall is enabled you have to read the registry. The key might be slightly different depending on your system, but for me it was this one:
RegRead("HKLM\SYSTEM\CurrentControlSet\Services\SharedAccess\Parameters\FirewallPolicy\DomainProfile\", "EnableFirewall")
These two macros should be useful for determining the OS-specific information that you request:
@OSType
@OSVersion
UAC can also be read from the registry; as with the firewall it might depend on your system, but for me this was the key:
RegRead("HKLM\SOFTWARE\Microsoft\Windows\CurrentVersion\Policies\System", "EnableLUA")
I'm not quite sure what Total Fragmentation means, so I am not sure how you can get it.
You should be able to compose a text file with all this information, and you should be able to find examples of AutoIt code that transfers text files just by searching here on Stack Overflow or on Google.

Visual Studio + Qt Add-in, PC freezes when building

This is a more unusual question, so give me a hint if Stack Overflow isn't the right place for it. ;)
I have a problem with Visual Studio 2012 where it freezes every so often when I compile my project.
I am currently working on a Qt project, so the Qt add-in is installed. I am sure you can't remote-fix my problem, but I would like to ask what could cause such freezes.
Here is some important info:
the PC doesn't freeze every time I compile (it seems to be a bit random)
the freeze lasts from 5 to 15 minutes. In most cases, it ends with the screen switching to black and then back to "normal"
I often try to open the Task Manager, which after the freeze returns an exit code saying the Task Manager couldn't start
the PC comes back to life after 5-15 minutes, but many applications (incl. VS) don't respond for an additional ~5 minutes
the hardware components are fine as far as I can tell (I tested the HDD and RAM; temps are fine)
I hope you can give me a hint where the cause of the freezes could be hiding. ;)
You could start by analyzing what is unique about your system.
Perhaps you are using an unusual source control system, anti-virus, network connections, mapped drives or some weird form of integration that nobody else uses. My guess is that this may be your source control integration or some server connection that is triggering an unusual locking condition.

Monitor unlimited files/folders under Mac OS X with Qt

I am working on an application targeted at Mac OS X 10.6+ using Qt 4.7.4.
I have a folder with as many as 1000+ files, and some, many, or even all of these files may be renamed, moved or deleted, so I want my application to be told if:
A file is renamed (report the original and new filename)
A folder is renamed (report the original and new folder name)
A file/folder is deleted (just report it as deleted) or moved (report the new location)
PROBLEM: the underlying system may (it MAY) only allow 256 descriptors to be monitored, so at most 256 files! How can I overcome this?
Note: I used the QFileSystemWatcher interface (it has the above-stated problem).
ALSO: How do I handle the case of versions lower than OS X 10.5?
Please do mention how I get the renamed filename/folder name.
From the QFileSystemWatcher docs:
On Mac OS X 10.4 and all BSD variants, for example, an open file descriptor is required for each monitored file. Some system limits the number of open file descriptors to 256 by default. This means that addPath() and addPaths() will fail if your process tries to add more than 256 files or directories to the file system monitor. Also note that your process may have other file descriptors open in addition to the ones for files being monitored, and these other open descriptors also count in the total. Mac OS X 10.5 and up use a different backend and do not suffer from this issue.
So you should not need to worry about this at all in your case.
QFileSystemWatcher doesn't provide the information you requested in your edit. It will emit signals when one of the paths it monitors changes, but in case of a rename, you won't get the new name. It's intended more for things like file manager programs that will just update/reload their current view on receipt of such events.
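As a concrete illustration, here is a minimal sketch (Qt 4.7 style; the Watcher class and slot names are mine) of everything QFileSystemWatcher will tell you. Note that both signals carry only the path you registered, never the new name after a rename:
// Illustrative sketch only: the full extent of what QFileSystemWatcher reports.
#include <QFileSystemWatcher>
#include <QObject>
#include <QStringList>
#include <QDebug>

class Watcher : public QObject
{
    Q_OBJECT
public slots:
    // Fired when a watched file is modified, renamed or removed; "path" is
    // always the path that was registered, not the file's new name.
    void onFileChanged(const QString &path) { qDebug() << "file changed:" << path; }
    // Fired when the contents of a watched directory change.
    void onDirectoryChanged(const QString &path) { qDebug() << "dir changed:" << path; }
};

void watch(Watcher *receiver, const QStringList &paths)
{
    QFileSystemWatcher *watcher = new QFileSystemWatcher(receiver);
    watcher->addPaths(paths); // on OS X 10.5+ this is not limited to 256 descriptors
    QObject::connect(watcher, SIGNAL(fileChanged(QString)),
                     receiver, SLOT(onFileChanged(QString)));
    QObject::connect(watcher, SIGNAL(directoryChanged(QString)),
                     receiver, SLOT(onDirectoryChanged(QString)));
}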
If you need more information than that, you'll need to use OS-specific APIs. You can look at the code Qt uses for different platforms in the Qt source; it's in src/corelib/io/qfilesystemwatcher_*.[h|cpp].
For Mac OS X 10.5 or greater, the underlying API used is the FSEvents API. You can read in the Technology Overview page:
The important point to take away is that the granularity of notifications is at a directory level. It tells you only that something in the directory has changed, but does not tell you what changed.
So that OS-level API doesn't provide what you want either directly.
For older versions of Mac OS X and FreeBSD, Qt uses the kqueue API, with the EVFILT_VNODE event filter. That API doesn't provide the new name of a renamed file either.
In short, either you'll need to code something yourself based on one of those APIs, find a library that does it (with guarantees that meet your needs), or you'll need to redesign your application. "Watching" a directory in a portable manner is at best very tricky, and generally race- and error-prone. If I were you, I wouldn't be too optimistic especially if your design requires that no "event" be missed.

What artifacts to save for a nightly build?

Assume that I set up an automatic nightly build. What artifacts of the build should I save?
For example:
Input source code
Output binaries
Also, how long should I save them, and where?
Do your answers change if I do Continuous Integration?
You shouldn't save anything for the sake of saving it; you should save it because you need it (e.g., QA uses nightly builds to test). At that point, "how long to save it" becomes however long QA wants them.
I wouldn't "save" source code so much as tag/label it. I don't know what source control you're using, but tagging is trivial (in performance and disk space) for any quality source control system. Once your build is tagged, unless you need the binaries, there really isn't any benefit to keeping them around, because you can simply re-compile from source when necessary.
Most CI tools let you tag on each successful build. This can become problematic for some systems as you can easily have 100+ tags a day. For such cases I recommend still running a nightly build and only tagging that.
Here are some artifacts and pieces of information that I'm used to keeping for each build:
The tag name of the snapshot you are building (tag and do a clean checkout before you build)
The build scripts themselves or their version number (if you treat them as a separate project with its own version control)
The output of the build script: logs and final product
A snapshot of your environment:
compiler version
build tool version
libraries and dll/libs versions
database version (client & server)
ide version
script interpreter version
OS version
source control version (client and server)
versions of other tools used in the process and everything else that might influence the content of your build products. I usually do this with a script that queries all this information and logs it to a text file that should be stored with the other build artifacts.
Ask yourself this question: "if something destroys entirely my build/development environment what information would I need to create a new one so I can redo my build #6547 and end up with the exact same result I got the first time?"
Your answer is what you should keep at each build and it will be a subset or superset of the things I already mentioned.
You can store everything in your SCM (I'd recommend a separate repository), but in that case your question of how long to keep the items loses its meaning. Alternatively, you can store it in zipped folders or burn a CD/DVD with the build result and artifacts. Whatever you choose, have a backup copy.
You should store them for as long as you might need them. How long that is will depend on your development team's pace and your release cycle.
And no, I don't think it changes if you do continuous integration.
This isn't a direct answer to your question, but don't forget to version control the nightly build setup itself. When the project structure changes, you may have to change the build process, which will break older builds from that point on.
In addition to the binaries, as everyone else has mentioned, I would recommend setting up a symbol server and a source server and making sure you get the correct information into them. It will aid debugging tremendously.
We save the binaries, stripped and unstripped (so we have exactly the same binary, once with and once without debug symbols). Further, we build everything twice, once with debug output enabled and once without (again stripped and unstripped, so every build results in 4 binaries). Each build is stored in a directory named after the SVN revision number. That way we can always retrieve the source from the SVN repository by simply checking out that very revision (so the source is archived as well).
A surprising one I learned about recently: If you're in an environment that might be audited you'll want to save all the output of your build, the script output, the compiler output, etc.
That's the only way you can verify your compiler settings, build steps, etc.
Also, how long to save them for, and where to save them?
Save them until you know that build won't be going to production; in other words, as long as you have the compiled bits around.
One logical place to save them is your SCM system. Another option is to use a tool that will automatically save them for you, like AnthillPro and its ilk.
We're doing something close to "embedded" development here, and I can tell you what we save:
the SVN revision number and timestamp, as well as the machine it was built on and by whom (also burned into the build binaries)
a full build log, showing whether it was a full/incremental build, any interesting (STDERR) output the data baking tools produced, a list of files compiled and any compiler warnings (this compresses very well, being text)
the actual binaries (for anywhere from 1-8 build configurations)
files produced as a side effect of linking: a linker command file, address map, and a sort of "manifest" file indicating what was burned into the final binaries (CRC and size for each), as well as the debugging database (.pdb equivalent)
We also mail out the result of running some tools over the "side-effect" files to interested users. We don't actually archive these since we can reproduce them later, but these reports include:
total and delta of filesystem size, broken down by file type and/or directory
total and delta of code section sizes (.text, .data, .rodata, .bss, .sinit, etc)
When we have unit tests or functional tests (e.g. smoke tests) running, those results show up in the build log.
We've not thrown anything out yet -- granted, our target builds usually end up at ~16 or 32 MiB per configuration, and they're fairly compressible.
We do keep uncompressed copies of the binaries around for 1 week for ease of access; after that we keep only the lightly compressed version. About once a month we have a script that extracts each .zip that the build process produces and 7-zips a whole month of build outputs together (which takes advantage of only having small differences per build).
An average day might have a dozen or two builds per project... The buildserver wakes up about every 5 minutes to check for relevant differences and builds. A full .7z on a large very active project for one month might be 7-10GiB, but it's certainly affordable.
For the most part, we've been able to diagnose everything this way. Occasionally there's a hiccup on the build system and a file isn't actually at the revision it's supposed to be when a build happens, but there's usually enough evidence of this in the logs. Sometimes we have to dig out a tool that understands the debugging database format and feed it a few addresses to diagnose a crash (we have automatic stackdumps built into the product). But usually all the information needed is there.
We haven't had to crack open the .7z archives yet, I should mention. But we have the info there, and I have some interesting ideas on how to mine bits of useful data from them.
Save what can't be reproduced easily. I work on FPGAs, where only the FPGA team have the tools and some cores (libraries) of the design are licensed to compile on only one machine, so we save the output bitstreams. But try to check them against one another rather than relying on a date/time/version stamp.
"Save" as in check in to source code control, or just keep on disk? Save nothing to source code control. All derived files should be visible in the file system and available to developers. Don't check in binaries, code generated from XML files, message digests, etc.; a separate packaging step will make these end products available. As you have the change number, you can always reproduce the build if necessary, assuming of course that everything you need to do a build is completely in the tree and available to all builds by syncing.
I would save your built binaries for exactly as long as they have a chance to go into production or be used by some other team (like a QA group). Once something has left production, what you do with it can vary a lot. For a lot of teams, they'll keep just their most recent prior build around (for rollback) and otherwise discard their builds.
Others have regulatory requirements to keep anything that went into production around for as long as seven years (banks). If you are a product company, I'd keep around any binary a customer might have installed in case a tech support guy wants to install the same version.
