QFileSystemWatcher locks directory on Windows - qt

I am watching a directory recursively using QFileSystemWatcher, and I am not able to rename or delete the parent directory, either programmatically or manually, while its subdirectories are being watched.
When I try to rename it manually through Explorer, I get a message box saying "The action cannot be completed because the folder or a file in it is open in another program", and renaming it programmatically simply fails.
I found these similar bug reports, but no resolution:
http://qt-project.org/forums/viewthread/10530
https://bugreports.qt-project.org/browse/QTBUG-7905
I am not watching "." and ".." as mentioned in the links above, but the directory is still locked.
For the programmatic rename, I tried a workaround:
1. Remove all the subdirectory paths from watcher before renaming the parent.
2. Rename parent.
3. Add subdirectory paths again.
But my program fails at the first step: QFileSystemWatcher::removePath() returns false when trying to remove the subdirectory path, and QFileSystemWatcher::directories() still shows that directory among the watched paths. This is the same problem as posted here: https://bugreports.qt-project.org/browse/QTBUG-10846
Since step 1 fails, step 2 fails as well, and I cannot rename the parent directory.
I am using Qt 5.2.1 on Windows 7.
Kindly help me with a resolution.

This is a bug in QFileSystemWatcher, as discussed in the reports linked above.
After days of trying, I was finally able to solve my problem by using the Win32 API to watch directories on Windows. I wrote a blog post on how to use the Win32 API to monitor directory changes, and I am sharing the link so it may help others who land here looking for a solution to the same problem.
Win32 API to monitor Directory Changes
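The core of that approach, as a minimal hedged sketch (this is not the blog post's code; "C:\watched" is a placeholder path, and the example compiles only on Windows): opening the directory handle with FILE_SHARE_DELETE in the share mode is what lets other programs rename or delete the watched tree, and passing TRUE for bWatchSubtree makes a single handle cover all subdirectories, so no per-subdirectory handles hold anything open:

```cpp
// Hedged sketch: watch a directory tree with ReadDirectoryChangesW.
#include <windows.h>
#include <cstdio>

int main() {
    // FILE_FLAG_BACKUP_SEMANTICS is required to open a directory handle.
    // Including FILE_SHARE_DELETE in the share mode is the key difference
    // here: it allows others to rename/delete the watched tree.
    HANDLE dir = CreateFileW(L"C:\\watched", FILE_LIST_DIRECTORY,
        FILE_SHARE_READ | FILE_SHARE_WRITE | FILE_SHARE_DELETE,
        nullptr, OPEN_EXISTING, FILE_FLAG_BACKUP_SEMANTICS, nullptr);
    if (dir == INVALID_HANDLE_VALUE) return 1;

    alignas(DWORD) char buffer[4096];
    DWORD bytes = 0;
    // TRUE = watch the whole subtree with this single handle.
    while (ReadDirectoryChangesW(dir, buffer, sizeof(buffer), TRUE,
            FILE_NOTIFY_CHANGE_FILE_NAME | FILE_NOTIFY_CHANGE_DIR_NAME |
            FILE_NOTIFY_CHANGE_LAST_WRITE, &bytes, nullptr, nullptr)) {
        auto* info = reinterpret_cast<FILE_NOTIFY_INFORMATION*>(buffer);
        for (;;) {
            // FileNameLength is in bytes, not characters.
            wprintf(L"change: %.*s\n",
                    int(info->FileNameLength / sizeof(WCHAR)), info->FileName);
            if (info->NextEntryOffset == 0) break;
            info = reinterpret_cast<FILE_NOTIFY_INFORMATION*>(
                reinterpret_cast<char*>(info) + info->NextEntryOffset);
        }
    }
    CloseHandle(dir);
    return 0;
}
```

The synchronous (blocking) form is used here for brevity; the blog post may well use the overlapped/asynchronous form instead.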

Unable to use correct file paths in R/RStudio

Disclaimer: I am very new here.
I am trying to learn R via RStudio through a tutorial and very early on have encountered an extremely frustrating issue: when I use the read.table function, the program consistently reads my files (written as "~/Desktop/R/FILENAME") as going through the path "C:/Users/Chris/Documents/Desktop/R/FILENAME". Note that the program considers my Desktop folder to be inside my Documents folder, which prevents me from reading any files. I have already set and re-set my working directory multiple times and even re-downloaded R and RStudio, and I still encounter this error.
When I enter the entire file path instead of using the "~" shortcut, the program is successfully able to access the files, but I don't want to have to type out the full file path every single time I need to access a file.
Does anyone know how to fix this issue? Is there any further internal issue with how my computer is viewing the desktop in relation to my other files?
Best,
Chris L.
The ~ tells R to look in your default directory, which on Windows is your Documents folder; this is why you are getting this error. You can change the default directory in the RStudio settings or in your R profile. It just depends on how you want to set up your project. For example:
Put all the files in the working directory (getwd() will tell you the working directory for the project). Then you can call the files by filename alone, and you will get tab completion (awesome!). You can change the working directory with setwd(), but remember to use the full path, not just ~/XX. This might be the easiest option for you if you want to minimise typing.
If you use a lot of scripts, or work on multiple computers or cross-platform, the above solution isn't quite as good. In that situation, you can keep all your files in a base directory and then, in your script, use the file.path function to construct the paths:
base_dir <- 'C:/Desktop/R'
read.table(file.path(base_dir, "FILENAME"))
I actually keep the base_dir assignment as a code snippet in RStudio, so I can easily insert it into scripts and know explicitly what is going on, as opposed to configuring it in RStudio or my R profile. There is a conditional in the snippet which detects the platform and assigns the directory correctly.
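For illustration, such a conditional might look like this (a sketch only; the two paths are placeholders, not the snippet described above):

```r
# Pick the base directory by platform; adjust both paths to your machines.
base_dir <- if (.Platform$OS.type == "windows") {
  "C:/Desktop/R"
} else {
  "~/Desktop/R"
}
read.table(file.path(base_dir, "FILENAME"))
```

.Platform$OS.type is built into base R and returns "windows" or "unix", so the same script runs unchanged on both platforms.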
When R reports "cannot open the connection", it means one of two things:
The file does not exist at that location. You can verify whether the file is there by pasting the full path echoed back in the error message into the Windows file manager. Sometimes the error is as simple as an extra subdirectory. (This seems to be the problem with your current code: the Windows Desktop is never nested inside Documents.)
If the file exists at the location, then R does not have permission to access the folder. This requires changing Windows folder permissions to grant R read and write permission on the folder.
In Windows, if you launch RStudio from the folder you consider the "project workspace home", then all path references can use the dot as "relative to the workspace home", e.g. "./data/inputfile.csv".

Converted WinForm app can't write-access WindowsApps folder

Hi everybody. Sorry if I'm asking something trivial. I looked into previously asked questions, to no avail. If this has already been asked, I beg your pardon; please point me in the right direction.
I have a number of single form WinForms apps that I'm in the process of converting to appx for the store.
So far so good. No issues with that.
However, one of my apps uses the application folder to save some data to a temporary file:
Dim sw As New StreamWriter(Application.StartupPath + "\somefile.csv", True)
The .exe program of course works with no issue.
The converted .appx program complains that I have no write permission to the WindowsApps folder and its subfolders. I quickly solved it by taking ownership of the folder and giving myself full control over it.
Do I have any chance of preventing the error message from appearing on a generic machine, other than trivially changing the source code to point the temporary folder to some other path?
Clearly I don't want the admin to give the user full control over WindowsApps folder.
Writing to the package folder is not allowed. You need to change your code to write to a location that is writable for the app/user, for example the AppData folder.
This is documented in the Desktop Bridge preparation guide:
https://learn.microsoft.com/en-us/windows/uwp/porting/desktop-to-uwp-prepare
(it's the eighth bullet point)
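For example, the StreamWriter line could target a per-user AppData subfolder instead of Application.StartupPath (a sketch; "MyApp" is a placeholder for your own application name):

```vb
' Write somefile.csv under %APPDATA%\MyApp instead of the install folder.
' Assumes Imports System.IO at the top of the file; "MyApp" is a placeholder.
Dim dataDir As String = Path.Combine(
    Environment.GetFolderPath(Environment.SpecialFolder.ApplicationData), "MyApp")
Directory.CreateDirectory(dataDir)   ' no-op if the folder already exists
Dim sw As New StreamWriter(Path.Combine(dataDir, "somefile.csv"), True)
```

This keeps the rest of the code unchanged and works both as a plain .exe and as a converted .appx, since AppData is always writable for the current user.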

Error: ENOTEMPTY, directory not empty in Meteorjs

Error: ENOTEMPTY, directory not empty '/path/disk/folder/.meteor/local/build-garbage-qb4wp0/programs/ctl/packages'
I have already searched this site for this problem, know the likely causes of this error, and tried the suggested solutions; I can always manage to reset the project.
The problem is that whenever the project is reset, the first run goes smoothly with no errors, but after some changes to my project (error checking, adding packages, changing some stuff...) the error occurs again.
I have no idea how to fix this problem, and my temporary solution is to always create another Meteor project, move all my project files into it, and reinstall all the packages I used.
Badly need help.
I had this error when running Meteor.js on a Vagrant machine. For additional background, I had created a symbolic link for MongoDB's db folder, since I had faced a locking issue (the solution I used for that is described elsewhere).
Following that, my setup was as follows:
/vagrant/.meteor/local/db -> /home/vagrant/my_project_db (symbolic link)
That solved the problem I had with MongoDB's lock, but every time any source file changed, Meteor would crash with the same exception you faced. Deleting files didn't help, and neither did meteor reset.
Fortunately enough it was remedied by changing the folder structure to this:
/vagrant/.meteor/local -> /home/vagrant/my_project_local (symbolic link)
What I did was as simple as moving Meteor's local folder out of the shared folder and referencing it with a symbolic link:
cd /vagrant/.meteor
mv local /home/vagrant/my_project_local
ln -s /home/vagrant/my_project_local local
In the end all is good. The error is long gone and the feedback cycle is much shorter.
Try deleting the folder it tells you there are issues with. I think Meteor is trying to clean it up, but there's an unhandled situation (the directory has files in it and the cleanup uses a plain rm instead of a recursive one).
Remove
/media/Meteor/hash/.meteor/local/build-garbage-**
(anything with build-garbage in the name). You might also want to check whether your permissions are right; this might initially have been caused by incorrectly set permissions. Maybe you ran as sudo once? If you're on a Mac, you could use Repair Disk Permissions.
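From the project root, that cleanup is a single command (the .meteor/local location is assumed; adjust it for your project layout):

```shell
# Delete Meteor's leftover build-garbage directories recursively;
# -f makes this a silent no-op when none exist.
rm -rf .meteor/local/build-garbage-*
```

Unlike deleting the whole local folder, this only removes the stale garbage directories, so your build cache and local database are untouched.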

cleartool update error in Solaris Unix

I am working on a view created from the main code repository on a Solaris server. I have modified part of the code in my view, and now I wish to update my view to the latest code from the repository. However, when I do
cleartool update .
from the current directory to update all the files in it, some (not all) of the files do not get updated, and the message I get is
Keeping hijacked object <filePath> - base no longer known.
I am quite sure that I have not modified the directory structure in my view, nor has it been modified in the server repository. One hack I discovered is to move the files that could not be updated to a different filename (so files with the original filename no longer exist in my view) and then run the update command. But I do not want to do this one by one for all the files, and it also means I will have to perform the merge myself.
Has someone encountered this problem before? Any advice will be highly appreciated.
Thanks in advance.
You should try "cleartool update -overwrite" (see cleartool update), as it should force the update of all files, hijacked or not.
But this message, according to the IBM technote swg1PK94061, is the result of:
When you rename a directory in a snapshot view, updating the view will cause files in the renamed directory to become hijacked.
Problem conclusion
Closing this APAR as No Plans To Fix (NPTF) because:
(a) due to the simple workaround of deleting the local copy of renamed directories, which mitigates the snapshot view update problem, and
(b) because of this issue's low relative priority compared with higher-impact defects
So simply delete (or move) the directory you have renamed, relaunch your update, and said directory (and its updated content) will be restored.
Thanks for your comment VonC. I did check out the link you mentioned, but I did not find it very useful, since I had not renamed any directory. After spending the whole day yesterday, I figured out that I had previously modified some of the files without checking them out first. Because they were not checked out, they were read-only, so I had modified them forcefully, and this caused those files to become hijacked. When I later tried to update my view, cleartool could not merge my modified files with those on the server: since the files had never been checked out, the update believed they were unmodified, when in fact they were. That was the fuss!! :)

Showing page based on operating system with NSIS

I'm working on an NSIS script in which I have two directory pages. One gets the directory for the program install, and one gets the directory where data will be stored.
The reason for this is that, given the access-control issues in Windows 7 and Vista involving the Program Files folder, I want the data to be placed outside Program Files while still giving the user the option to put it where they want.
I have the version plugin for NSIS, and I understand how to use it. My issue is that when someone is installing on XP or earlier, I don't want to give them the option for the data directory.
How can I show a directory page based on what OS the user is running?
To skip a page, call the Abort instruction in the pre callback function of the page you want to skip.
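A hedged sketch of that in NSIS (the function and page names here are illustrative, and it uses the WinVer.nsh header that ships with modern NSIS rather than a particular version plugin):

```nsis
!include "WinVer.nsh"   ; provides ${AtLeastWinVista} and pulls in LogicLib

; Pre callback for the data-directory page: calling Abort here skips the page.
Function DataDirPagePre
  ${IfNot} ${AtLeastWinVista}
    Abort   ; XP or earlier: don't show the separate data-directory page
  ${EndIf}
FunctionEnd

PageEx directory
  PageCallbacks DataDirPagePre
PageExEnd
```

On XP the page is skipped entirely, so you would fall back to a default data location (for example, the install directory) in that branch of the installer.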
I'd also like to point out that even though most users are admins on 2000/XP, the same permission issue exists on any NT-based platform, not just Vista+.
