Desktop Windows apps created using Qt need admin rights - qt

I created a desktop app for Windows (running mostly on Win 10) using the Qt libraries. Explicitly in my code, I don't perform any operations that require administrator rights, in particular writing to "Program Files" etc. - the application uses the local app data folder structure (I double-checked this, going deeper and deeper into the matter).
In my manifest file the application also doesn't request admin privileges (it's set to asInvoker).
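(For reference, this is roughly how I verified what the embedded manifest actually requests - mt.exe comes from the Windows SDK, and the executable name below is just an example:)

    # Extract the embedded manifest (resource #1) from the exe with mt.exe from the Windows SDK.
    & mt.exe -nologo "-inputresource:MyApp.exe;#1" -out:extracted.manifest

    # For a non-elevated app the requested execution level should be "asInvoker".
    Select-String -Path extracted.manifest -Pattern 'requestedExecutionLevel'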
However, my application still requires admin rights to run.
My question is not about how to solve my specific case, because I established that the cause is deep dependencies hidden in the Qt libs on the Windows API, and these calls often require admin rights for operations that don't seem to need them, like drag & drop or a network connection to a specific IP address.
I traced this using the Microsoft Standard User Analyzer (SUA) tool on my executable.
I'm putting here an example log from the SUA investigation:
In the detailed info for positions 1-2 I can see it's because:
However, the 3rd position is an even more complex problem, related to PROCESS_QUERY_INFORMATION access being allowed only for elevated processes. Example stack trace (one of many, many more):
Summarizing - my question:
You can believe me that I don't perform any operations that require admin rights from a "normal", common-sense point of view. Moreover, my customer has an old application written in .NET that doesn't need admin rights and generally does the same things (I mean nothing "special").
What is a general way to overcome such problems with the Qt development environment?
Or does everyone using Qt take the risk that the application will most likely require admin rights?

Related

Automatically unblocking executables downloaded from the web site

I have a web site (intranet), written in ASP.NET and using HTTPS, that allows you to download an executable (currently a .NET console application).
However, on many machines I can't run it right away after download - I need to right-click it, go to Properties and click Unblock, which makes using this app uncomfortable (users will often have to download this executable and run it - every time it is a new one, as it is code-generated).
Is there any way to have this executable automatically unblocked? Modifying the client machines is not an option, but I can do anything on the server.
From the beginning I thought this was impossible, as it is a security protection, but Chrome somehow does it. If I take a new PC with IE installed, type Chrome into Bing and install it, I don't have to unblock the executable.
So far I've tested this only with Chrome and IE on W10, but I am pretty sure older Windows versions have this problem as well.
The mechanism for showing the untrusted-executable dialog is based on alternate data streams. The metadata gets added by Windows or the browser when you download something from a network source, so it is not possible for your file/web server to influence this behaviour. Windows, on the other hand, has a ruleset which it uses to apply the flags, and that ruleset can be found in the security zone settings of your Internet Options.
NTFS has a neat little feature which allows a file to have multiple contents, also known as alternate data streams. This is an NTFS-only feature, so you won't find it on other partition types. It basically allows you to store more data in your file that is not per se visible to the user and cannot easily be found by a standard Windows user. Windows uses those alternate data streams to mark the origin of a file, especially when it is downloaded from the inter- or intranet. The alternate data stream used for this is called "Zone.Identifier" and holds an ID for the zone the file was copied from. When you decide to trust a file, you basically tell Windows to remove that data stream.
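For illustration, a minimal PowerShell sketch of inspecting and removing that stream might look like this (the file name is just an example):

    # List the alternate data streams attached to a downloaded file.
    Get-Item .\MyTool.exe -Stream *

    # Show the Zone.Identifier contents; ZoneId=3 means the file came from the Internet zone.
    Get-Content .\MyTool.exe -Stream Zone.Identifier

    # Removing the stream is what "Unblock" does; PowerShell 3+ ships a cmdlet for exactly that.
    Unblock-File .\MyTool.exe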
Windows uses the concept of different zones to classify those files. Windows knows four zones in total: Internet, Intranet, Trusted Sites and Restricted Sites. You can alter the settings and rules for those in the Internet Options dialog, on the Security tab.
Security remark: before changing your settings for the trust zones in the company, consider the security risks of this thrice, as it will allow any executable from those trusted sources to be executed, potentially opening the way for malicious executables which can then be started by already-infected PCs or by users themselves.
The correct way to resolve that issue is to sign the executable with a trusted and valid code-signing certificate, ideally one with EV (Extended Validation). Windows will check the certificate when you run the file and will allow it to run without further action, as it is signed with a trusted cert.
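As a rough sketch (the certificate subject and timestamp URL are placeholders), signing with signtool.exe from the Windows SDK looks something like:

    # Sign the executable with a code-signing certificate from the local store,
    # using SHA-256 digests and an RFC 3161 timestamp server (the URL is an example).
    & signtool.exe sign /fd SHA256 /n "My Company Ltd" /tr http://timestamp.example.com /td SHA256 .\MyTool.exe

    # Verify the signature chain the same way Windows will.
    & signtool.exe verify /pa .\MyTool.exe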

Deploying multiple MSIs into the same BizTalk Application

During our development of schemas, orchestrations, ports, etc., we've been exporting MSIs and binding files for deployment into our test and ultimately production environments.
So, for example, we set up a series of receive ports/locations in a single BizTalk app for the purpose of receiving all HL7 v2 messages from our HCIS. We then exported that to a bindings file and imported it into test.
Then, as we developed new schemas, we exported each schema into its own MSI file and deployed that into the same BizTalk application in our test environment. We did that because the schemas are specific to the inbound messages from our HCIS.
So now, in test, we've ended up with a BizTalk application with the receive ports and schemas we need to receive messages from our HCIS. The issue I discovered is that, if I look at the installed programs list in the Control Panel, I only see one application. So if I want to uninstall and re-install a particular schema, I'm not sure what will happen. For some reason, I half expected to see an entry for every MSI I installed, but I suppose that because they're all going into the same BizTalk application, they are all registered in Windows as the same application. I'm betting there is a better way to do this, any suggestions?
You can, and probably should, create different applications for each logical grouping of code. If you examine the 'Deploy' section of the project properties you'll see a text box to enter your application name. When you trigger a deploy, the artifacts will be placed into a separate application with the name you provide. You'll see it in the BizTalk management console.
We deploy to dev using the framework mentioned below. Then, to deploy to QA, right-click on the application and create an MSI from that point. It will create an MSI for only one application.
NOTE: the deploy setting is NOT saved globally. If another developer opens the project, his copy will not inherit the application name you've set.
We use the BizTalk Deployment Framework to help manage changes when we do development.
So now, in test, we've ended up with a BizTalk application with the receive ports and schemas we need to receive messages from our HCIS. The issue I discovered is that, if I look at the installed programs list in the control panel, I only see 1 application.
I can only think of two scenarios where you might observe this behaviour:
You have multiple different MSIs (one for each schema) which you are importing into BizTalk (and hence they are appearing in the BizTalk Admin Console), but you are not running the MSI on the local machine (and so it is not appearing in 'Installed Programs'); or
Your MSIs are all named the same, in which case, after the import into BizTalk and the local install, you only have a single program visible in 'Installed Programs'. (A quick way to check what is actually registered locally is sketched just below.)
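To see which of the two it is, you can list what Windows itself has registered as installed MSI products - slow, but fine for a one-off check (PowerShell sketch):

    # Enumerate MSI-installed products as Windows sees them; if every schema MSI was
    # installed locally under its own name, each one should appear in this list.
    Get-WmiObject -Class Win32_Product |
        Sort-Object Name |
        Select-Object Name, Version, InstallDate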
I'm betting there is a better way to do this, any suggestions?
With regard to approach, you are certainly along the correct lines. I tend to advise clients to group logical artifacts into a single logical bucket - either project or application - that can be deployed (and redeployed) without affecting other parts of the system.
In an HL7 scenario, one logical bucket might be Patient artifacts (schemas and supporting maps) and a second may be Financial artifacts (schemas and supporting maps). These logical buckets can be deployed either to different BizTalk Applications or to the same BizTalk Application, depending on your requirements. However, the main benefit here is that they are separate, and therefore all artifacts do not need to be redeployed if you need to make a small modification to, for example, the A19 Patient Query/Response schema.
How to deploy is another question entirely. I'm a massive fan of MSBuild and have written comprehensive build scripts that I tweak and reuse for each project I work on. These deployment scripts will tear down an existing environment and rebuild it from the ground up, creating Applications, deploying Resources, importing Bindings, creating Hosts and Host Instances etc. before finally starting the application. This approach removes all human error from the process and tends to be favoured by clients, who often have their infrastructure teams perform the deployment rather than their development teams.
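For example, the core import steps can be scripted against BTSTask.exe - whether you drive it from MSBuild or a plain PowerShell wrapper - along these lines (the application, file and path names below are made up for illustration):

    # Sketch of an automated BizTalk import using BTSTask.exe (all names are examples).
    $app = 'HCIS.HL7.Inbound'

    # Import the exported MSI into the target BizTalk application...
    & BTSTask.exe ImportApp /Package:'C:\Drops\HCIS.HL7.Inbound.msi' /ApplicationName:$app /Overwrite

    # ...add an individual schema assembly as a resource (and GAC it)...
    & BTSTask.exe AddResource /ApplicationName:$app /Type:System.BizTalk:BizTalkAssembly /Overwrite /Source:'C:\Drops\HCIS.HL7.Schemas.dll' /Options:GacOnAdd

    # ...and import the environment-specific bindings.
    & BTSTask.exe ImportBindings /ApplicationName:$app /Source:'C:\Drops\HCIS.HL7.Inbound.Test.BindingInfo.xml'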
I notice that Jay mentioned the use of the BizTalk Deployment Framework. I personally struggle with this tool, partly because I need to maintain my configuration in Excel which I can't check in to source control easily.

Installing third-party Drupal modules on Azure

I've just started playing around with the new "Website" feature in Azure that allows you to create websites with just one step - and also allows you to create websites from a "Gallery", including Drupal. And I can get my Drupal site up and running, no problem. But if I try to add a third-party module (for instance, Mindtree's ODataDrupal), then I get this error message:
Installation failed! See the log below for more information.
odata_support
Error installing / updating
File Transfer failed, reason: Cannot chmod /DWASFiles/Sites/theparentsunion/VirtualDirectory0/site/wwwroot/sites/all/modules/odata_support.
More-or-less the same thing happens if I try to update some of the existing modules (which Drupal warns, with big red flashing letters, are out of date), except then my Drupal install is left crippled, with no way to fix it that I've been able to find.
Is this as-designed, or some limitation of the beta website integration? (Because a Drupal installation is kinda worthless if you can't add new modules to it, or update existing ones.) Or am I doing something wrong?
If you are trying to add plugins and third-party modules to a Drupal-based Windows Azure Website, the results may vary from person to person. This is mainly because the kind of configuration needed by a specific module or plugin may or may not be supported by the Windows Azure Websites model; not every kind of custom configuration will work on Windows Azure Websites, and in those cases you would need to move to Windows Azure Virtual Machines.
As for the application-specific structure, you can open the website's FTP folder; whatever you can see there is user-configurable, so you can configure it the way you want. However, if your application tries to make changes outside that limited scope, you will hit errors like the one above.
Here is a case study where an Azure VM was used for a Drupal-based migration, which shows that for a complex application you may need to use an Azure VM rather than Azure Websites.

InstallShield 2010 with license - no license for automated build system (CI) running as a Windows service

I really need help here.
We are using a CI build process (Hudson) as an automated build system, building with MSBuild.
The CI runs in Apache Tomcat 6, which runs under the credentials of a domain user (not a local Windows user).
Every time the CI tries to build an InstallShield project (using .isproj files) we get a license error message:
" C:\Program Files\MSBuild\InstallShield\2010\InstallShield.targets(62,3): error : -7159: The product license has expired or has not yet been initialized. You must launch the IDE to configure the product license in order to proceed.
C:\Program Files\MSBuild\InstallShield\2010\InstallShield.targets(62,3): error : Exception Caught".
If I log in to the same machine with the same domain user credentials and build the InstallShield project there is a license and it is working well.
Adding the user to the local Users group doesn't help (no license).
Adding the user to the local Administrators group helps and it is working.
We do not want the user to be in the local Administrators group - for various reasons.
What do I need to do to make it work?
Do I need to add permissions to the user?
Help will be highly appreciated.
Gilad
Is your build calling isSaBld.exe or isCmdBld.exe? InstallShield changed their policy in 2010 so that the standalone build functionality (isSaBld) is only available with a top-tier license. In previous versions it was usable in Pro too. Maybe this has something to do with it?
We have a similar build system - Hudson in Tomcat 6, IS2010, but with Ant scripts - and calling IsCmdBld.exe is working for us.
If you are using Hudson as a service, try running the service as an administrator. But first you need to make sure that the administrator account can successfully build the project from the InstallShield IDE.
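If Hudson lives inside the Tomcat Windows service, you can switch the account the service runs under from an elevated prompt; the service name and credentials below are assumptions and may differ on your machine:

    # Point the Tomcat service at an account that is in the local Administrators group
    # (service name and credentials are examples), then restart it so the change takes effect.
    & sc.exe config "Tomcat6" obj= "MYDOMAIN\buildsvc" password= "P@ssw0rd"
    & sc.exe stop "Tomcat6"
    & sc.exe start "Tomcat6"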
We do not want the user to be in the local Administrators group
To my knowledge there is no way around this requirement. InstallShield's product licensing runs low-level system checks that require that the running user be in the Administrators group to succeed. That's why when you start the InstallShield IDE the UAC prompt appears. That way they can verify that the license they granted you hasn't been moved to a different machine. Without being privy to exactly how they do this, imagine e.g. direct disk sector access, CPU serial number reads, hard drive firmware access, etc. You just can't do those things without Admin rights.
However to ensure that every build can be reproduced, a build machine should be sacrosanct, and access to it should only be granted to trusted build users. It's standard for them to be Administrators on the build machine.
Can you give more details about why you need to keep the user from being an Administrator? That would enable us to give you better input.

Does anybody create installers to deploy internal asp.net web applications?

I've always deployed my web applications via FTP (sometimes even xcopy), and then run the database scripts manually myself.
I started deploying this way in the 90s, but lately I've seen a few web apps with installers. I'm starting to question whether I'm locked into an outdated process. I'm a consultant and my apps are usually internal, so I don't worry about distributing them and having others install them.
But I'm curious; does anybody create installers to deploy internal asp.net web applications?
If so, why? (Voluntarily, mandated, or part of an automation process)
And have you had any problems doing it this way?
Absolutely. We use installers for all of our apps. That way we create the installer and run it in the QA and UAT environments to test, so we know exactly what is going to happen in production. There are no guesses as to what order someone might do something in, or whether they missed a step. It makes things a lot easier.
Oh, I forgot about the automated process too. We have systems in place (Ant Hill Pro) which automatically deploy it to the proper environments. The QA people don't have to wait for something to be done, because it's all done at 2 am. If they need to rerun the build with updates, the devs check the code in, we push a button, and it's automatically deployed. No waiting for the build engineer because he's in a meeting or sick or whatever.
You always want to have an automated way to build and deploy - it greatly reduces the chances of a one-off error if you forget a certain step. Also, it allows you to offload the deploy to someone else easily without having to teach them 100 customized steps. Whether the project is internal or not, all applications should follow best practices.
Personally I'm a bit like the OP; generally I just deploy using FTP, but in saying that typically my applications are internal, or in the case of other projects, 100% managed by me.
I've also been thinking about this lately however, and have started to think about how using proper deployment may improve the process - having to document a detailed install process can be a real pain.
I use PowerShell and found it really easy to automate lots of tasks. You will probably find it a bit different at the very beginning, but in the end you will see that it's all about the power of the .NET libraries!
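As a small illustration of the kind of task this covers (the site, server, paths and script names are all made up):

    # Minimal deployment sketch: stop the app pool, sync the files, run the DB script, restart.
    Import-Module WebAdministration

    Stop-WebAppPool -Name 'IntranetAppPool'

    # Mirror the build output to the web root (robocopy exit codes below 8 mean success).
    robocopy '\\buildserver\drops\IntranetApp\latest' 'D:\Sites\IntranetApp' /MIR /NP

    # Apply the database changes that used to be run by hand.
    & sqlcmd -S 'SQLSERVER01' -d 'IntranetDb' -i 'D:\Sites\IntranetApp\deploy\schema-changes.sql'

    Start-WebAppPool -Name 'IntranetAppPool'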
I have used the "Web Setup Project" to create an MSI that installed the output of a "Web Deployment Project" for an internal app. Our server admin wasn't up to the task of doing a 50-step manual install. For my current app, my server admin doesn't like the 'black box' feel of MSI installers and prefers getting a pile of files and a 50-step deployment manual. (See a pattern here? Ask your server admin what he wants.)
The Web Setup Project doesn't make it immediately obvious how to install to anything other than the "Default Website"; other than that, it made the installation process repeatable and created a built-in way to roll back (by just running the installer from one version ago).
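If it helps, the generated MSI can also be driven unattended from the command line; the TARGETSITE/TARGETVDIR properties below are the ones Visual Studio web setup projects typically expose, so treat them as an assumption and check them against your own MSI:

    # Unattended install of the web setup MSI into a specific IIS site and virtual directory
    # (property names and values are assumptions - verify them against your own MSI).
    & msiexec.exe /i 'C:\Drops\InternalApp.Setup.msi' /qn /l*v 'C:\Drops\install.log' TARGETSITE='/LM/W3SVC/2' TARGETVDIR='InternalApp'

    # Rolling back is just installing the previous version's MSI the same way.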
This of course assumes that your virtual directory doesn't hold any user-modified content - I wouldn't trust an MSI to properly merge user-created and new files.
We use the "XCopy" deploy model here, since the Ops folks have their own method of setting up security on a new web application on the server.
However, we did need to use an installer when we had to install a web application that was using a newer version of Crystal Reports, since it had to do something special with a key and we didn't have a full-blown version of CR on the server itself. So keep that in mind when working with third-party apps; they may need some kind of merge module, which an MSI handles easily.
Yep... we have an app that needs a lot of prerequisites set up: web service, Windows service, user accounts, security, folder creation, GAC bits, etc. I rolled it all up into a nice MSI with custom actions that can install and uninstall cleanly. It saved about an hour's worth of work to deploy on a new box.
A lot of the other smaller apps are just deployed by doing Publish Website to a local folder then ftp'ing the contents to the target.
It greatly depends upon the scale of your project, your environment and your internal user base. I rarely deploy with an MSI because we are too small an operation to have multiple environments (except for SharePoint, which is different altogether). We develop and use VS to deploy web apps to a development box; assuming they are approved, we then use VS again to deploy to the live box.
The only proviso is that we have multiple copies of the web.config (appended with test, dev and live), and we then delete the suffix off the relevant file depending upon where it's been deployed.
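A tiny sketch of that suffix-stripping step (the folder layout and the "web.config.<env>" naming are assumptions), which is easy to fold into whatever copies the files:

    # Promote the config that matches the target environment and drop the rest.
    $environment = 'test'          # or 'dev' / 'live'
    $siteRoot    = 'D:\Sites\IntranetApp'

    Move-Item (Join-Path $siteRoot "web.config.$environment") (Join-Path $siteRoot 'web.config') -Force

    # Clean up the remaining suffixed copies so only the active config is left behind.
    Get-ChildItem $siteRoot -Filter 'web.config.*' |
        Where-Object { $_.Name -ne 'web.config' } |
        Remove-Item -Force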
It's probably not the best methodology (I know it's not), but it works and it aids rapid deployment of small to medium sized solutions in a small-scale user environment.
F5ToDebug...
You're saying it's OK to take shortcuts if you don't have time to do it properly?
"who's going to test the code on the test environment?" You said it yourself that you have config files for _test - why would that not be a suitable test?
