Showing a directory page based on operating system with NSIS

I'm working on an NSIS script that has two directory pages: one gets the directory for the program install, and one gets the directory for the program's data.
The reason for this is that, with the access-control issues in Windows 7 and Vista involving the Program Files folder, I want the data to be placed outside of Program Files while still giving the user the option to put it where they want.
I have the version plugin for NSIS, and I understand how to use it. My issue is that when someone is installing on XP or earlier, I don't want to show them the data directory page at all.
How can I show a directory page based on which OS the user is running?

To skip a page, call the Abort instruction in the pre callback function of the page you want to skip.
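For example, here is a minimal sketch using the stock WinVer.nsh header for the version check (the version plugin you already have would work just as well; the variable and function names are illustrative):

!include "LogicLib.nsh"
!include "WinVer.nsh"

Var DataDir

; First directory page: program install location (uses $INSTDIR)
Page directory

; Second directory page: data location, with a pre callback that can skip it
PageEx directory
  DirVar $DataDir
  PageCallbacks DataDirPre
PageExEnd

Function DataDirPre
  ; On XP or earlier there is no separate data folder to ask about,
  ; so skip this page entirely
  ${IfNot} ${AtLeastWinVista}
    Abort
  ${EndIf}
FunctionEnd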
I'd also like to point out that even though most users are admins on 2000/XP, the same permission issue exists on any NT-based platform, not just Vista+.

Related

Unable to use correct file paths in R/RStudio

Disclaimer: I am very new here.
I am trying to learn R via RStudio through a tutorial, and very early on I have encountered an extremely frustrating issue: when I try to use the read.table function, the program consistently resolves my files (written as "~/Desktop/R/FILENAME") through the path "C:/Users/Chris/Documents/Desktop/R/FILENAME". Note that the program is treating my Desktop folder as if it were inside my Documents folder, which is preventing me from reading any files. I have already set and re-set my working directory multiple times, and even re-downloaded R and RStudio, and I still get this error.
When I enter the entire file path instead of using the "~" shortcut, the program is successfully able to access the files, but I don't want to have to type out the full file path every single time I need to access a file.
Does anyone know how to fix this issue? Is there any further internal issue with how my computer is viewing the desktop in relation to my other files?
I've attached a pic.
Best,
Chris L.
The ~ tells R to look in your home directory, which on Windows is your Documents folder; that is why you are getting this error. You can change the default directory in the RStudio settings or in your R profile. It just depends on how you want to set up your project. For example:
Put all the files in the working directory (getwd() will tell you the working directory for the project). Then you can call the files with just the filename, and you get tab completion (awesome!). You can change the working directory with setwd(), but remember to use the full path, not just ~/XX. This is probably the easiest option if you want to minimise typing.
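A quick illustration (the Desktop path here is just a guess at your layout):

getwd()                            # prints the current working directory
setwd("C:/Users/Chris/Desktop/R")  # full path, not ~/Desktop/R (which would expand under Documents)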
If you use a lot of scripts, or work on multiple computers or cross-platform, the above solution isn't quite as good. In that situation, you can keep all your files in a base directory and then use the file.path function in your script to construct the paths:
base_dir <- "C:/Users/Chris/Desktop/R"  # defined once per script; adjust per machine
read.table(file.path(base_dir, "FILENAME"))
I actually keep the base_dir assignment as a code snippet in RStudio, so I can easily insert it into scripts and know explicitly what is going on, as opposed to configuring it in RStudio or an R profile. The snippet contains a conditional that detects the platform and assigns the directory correctly.
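A sketch of what such a conditional can look like (the paths are illustrative):

# Pick the base directory by platform; adjust the paths to your machines
base_dir <- if (.Platform$OS.type == "windows") {
  "C:/Users/Chris/Desktop/R"
} else {
  "~/Desktop/R"
}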
When R reports "cannot open the connection", it means one of two things:
The file does not exist at that location. You can verify whether the file is there by pasting the full path echoed back in the error message into Windows Explorer. Sometimes the error is as simple as an extra subdirectory. (This seems to be the problem with your current code: the Windows Desktop is never nested inside Documents.)
If the file exists at the location, then R does not have permission to access the folder. This requires changing Windows folder permissions to grant R read and write permission to the folder.
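As a quick sanity check from within R, file.exists() reports whether R can see a file at a given path (path illustrative):

file.exists("C:/Users/Chris/Documents/Desktop/R/FILENAME")  # FALSE here: Desktop is not under Documents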
On Windows, if you launch RStudio from the folder you consider the "project workspace home", then all path references can use the dot as "relative to the workspace home", e.g. "./data/inputfile.csv".

Converted WinForm app can't write-access WindowsApps folder

Hi everybody. Sorry if I'm asking something trivial. I looked into the previously asked questions, to no avail. If this has already been asked, I beg your pardon, and please point me in the right direction.
I have a number of single form WinForms apps that I'm in the process of converting to appx for the store.
So far so good. No issues with that.
However, one of my apps uses the application folder to save some data to a temporary file:
Dim sw As New StreamWriter(Application.StartupPath + "\somefile.csv", True)
The .exe program of course works with no issues.
The converted .appx program complains that I have no write permission to the WindowsApps folder and its subfolders. I quickly solved this on my own machine by taking ownership of the folder and giving myself full control over it.
Is there any way to prevent the error from appearing on a generic machine, other than trivially changing the source code to point the temporary file to some other path?
Clearly I don't want the admin to give users full control over the WindowsApps folder.
Writing to the package folder is not allowed. You need to change your code to write to a location that is writable for the app/user, for example the AppData folder.
This is documented in the Desktop Bridge preparation guide:
https://learn.microsoft.com/en-us/windows/uwp/porting/desktop-to-uwp-prepare
(it's the eighth bullet point)
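As a minimal sketch of the code change, assuming a per-user AppData location is acceptable (the "MyApp" subfolder name is illustrative):

Imports System.IO

' Per-user, always-writable location, e.g. C:\Users\<name>\AppData\Roaming\MyApp
Dim dataDir As String = Path.Combine(
    Environment.GetFolderPath(Environment.SpecialFolder.ApplicationData), "MyApp")
Directory.CreateDirectory(dataDir)  ' no-op if the folder already exists

Dim sw As New StreamWriter(Path.Combine(dataDir, "somefile.csv"), True)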

QFileSystemWatcher locks directory on Windows

I am watching a directory recursively using QFileSystemWatcher, and I am not able to rename/delete the parent directory, either programmatically or manually, while its subdirectories are being watched.
When I try to rename it manually through Explorer, I get a message box saying "The action cannot be completed because the folder or a file in it is open in another program", and renaming it programmatically simply fails.
I found these similar bug reports, but no resolution:
http://qt-project.org/forums/viewthread/10530
https://bugreports.qt-project.org/browse/QTBUG-7905
I am not watching . and .. as mentioned in the above links, but the directory is still locked.
For the programmatic renaming, I tried a workaround:
1. Remove all the subdirectory paths from the watcher before renaming the parent.
2. Rename parent.
3. Add subdirectory paths again.
But here my program fails at the first step: QFileSystemWatcher::removePath() returns false when trying to remove the subdirectory path, and QFileSystemWatcher::directories() still shows that directory among the paths being watched. This is the same as posted here: https://bugreports.qt-project.org/browse/QTBUG-10846
Since step 1 fails, step 2 also fails, and I cannot rename the parent directory.
I am using Qt 5.2.1 and Windows 7.
Kindly help me with a resolution.
This is a bug in QFileSystemWatcher, as discussed here.
After days of trying, I was finally able to solve the problem by using the Win32 API to watch directories on Windows. I wrote a blog post on how to use the Win32 API to monitor directory changes, and I'd like to share the link in case it helps others who land here looking for a solution to the same problem:
Win32 API to monitor Directory Changes
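The crux, as a minimal sketch (error handling trimmed, path illustrative): open the directory handle yourself with CreateFile and include FILE_SHARE_DELETE in the share mode, which is what leaves the watched directory renamable and deletable, then wait for changes with ReadDirectoryChangesW:

#include <windows.h>
#include <cstdio>

int main() {
    // FILE_FLAG_BACKUP_SEMANTICS is required to open a directory handle;
    // FILE_SHARE_DELETE lets others rename/delete the tree while we watch.
    HANDLE hDir = CreateFileW(
        L"C:\\watched", FILE_LIST_DIRECTORY,
        FILE_SHARE_READ | FILE_SHARE_WRITE | FILE_SHARE_DELETE,
        nullptr, OPEN_EXISTING, FILE_FLAG_BACKUP_SEMANTICS, nullptr);
    if (hDir == INVALID_HANDLE_VALUE) return 1;

    alignas(DWORD) BYTE buffer[4096];
    DWORD bytes = 0;
    // TRUE = watch the whole subtree, like a recursive QFileSystemWatcher
    while (ReadDirectoryChangesW(hDir, buffer, sizeof(buffer), TRUE,
                                 FILE_NOTIFY_CHANGE_FILE_NAME |
                                 FILE_NOTIFY_CHANGE_DIR_NAME |
                                 FILE_NOTIFY_CHANGE_LAST_WRITE,
                                 &bytes, nullptr, nullptr)) {
        auto* info = reinterpret_cast<FILE_NOTIFY_INFORMATION*>(buffer);
        for (;;) {
            wprintf(L"change: %.*s\n",
                    static_cast<int>(info->FileNameLength / sizeof(WCHAR)),
                    info->FileName);
            if (info->NextEntryOffset == 0) break;
            info = reinterpret_cast<FILE_NOTIFY_INFORMATION*>(
                reinterpret_cast<BYTE*>(info) + info->NextEntryOffset);
        }
    }
    CloseHandle(hDir);
    return 0;
}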

What is the best location for a "read me" file on the target machine when deploying an ASP.NET application using an .MSI package?

For an ASP.NET web application that is packaged and sold to customers for deployment, what would be the best location for a "read me" file with notes about setup and configuration on the target system?
Requirements:
1. The file should not be accessible by users of the web application, only the person doing setup and configuration.
2. The file should be consumable by the MSI installer program, so that it can be displayed as part of the setup wizard UI.
3. The solution should be simple and very low cost. (I don't want an elaborate solution for just a simple text file.)
Some thoughts I have are to copy the file to App_Data or to bin, as those are protected folders by default, and then pull the file in from one of those locations in the setup program.
The readme should be a separate file that sits beside the MSI on the media you distribute the web app on. This is a standard practice dating back to the dark ages. If you distribute it as a download from the web, then have a link for the MSI and a link for the readme.
You could also include the same file in the MSI, but arguably that is the wrong place for it: the user has yet to reach the configuration stage, and unless they print it they won't be able to refer to it later in the MSI process (if you have any configuration steps in the MSI).
Having the instructions available via the web app is also arguably wrong, as the user may have to do some initial configuration in order to reach the page telling them how to configure the app...
So ship the instructions separately from the MSI, and make sure they look okay and are easily readable when printed out. Remember these pointers:
Instructions are not always read
Instructions are not always read at the time of installation
Instructions are not always read by the same person that does the installation
Instructions are not always read from the screen
Instructions are not always read correctly, even when they are simple
Instructions are not always read (I know that is a duplicate of the first point...)
Don't forget to clearly distinguish between pre-install and post-install configuration instructions (even if they are in the same document) - you want to minimize the risk of the end user getting it wrong (which some of them will do no matter how hard you try).
Build the important message into your application. Do it like Apache, where it says "this is a new installation of...", and don't allow that screen to go away until the user goes in and does all the things you consider important.
This isn't a problem for your installer to solve.

Problem with workflow on SharePoint email-enabled document library

So, here is the scenario: I have a workflow on a document library that copies a file to a Windows directory. The workflow is set to start when a new item is added to the document library, and everything works fine when you manually upload files to the library. The problem occurs when we use email to populate the library instead of uploading files manually.
When an email is received, the workflow starts successfully and runs properly (I have added workflow history entries to check whether each section of code executes), but it stops at the section where the file is copied to the Windows folder.
I basically think this is a permissions or access issue, because when we upload the file manually (i.e. from doc library > upload) everything works fine. Maybe some other permission set is used when an email is received by the doc library. I have tried assigning permissions to "Everyone" on the Windows folder, but no luck...
Can someone let me know which Windows user account is used when an email is received by a document library? (I think it's the IIS default account, but isn't that included in Everyone?)
One solution I can think of is to use temporary impersonation for the specific code segment that writes the doc library file to the Windows folder, but any suggestions are welcome.
P.S. I don't have access to the server right now, so I can only devise approaches in my mind and can't test them yet. It would be good to have all the suggestions you can offer, so that once I get access I can try everything :D
This is a well-known situation. The system does not know who sent the email, so it cannot impersonate a user it has no knowledge of.
Depending on which version of SharePoint you are running, the workflow may not start at all, or it may start under the account that published the workflow.
For details see this Microsoft Support Article.
