How to use QReadWriteLock properly - Qt

I have a server program that reads its settings from a file. I tried to lock the file while reading it, using a QReadWriteLock allocated dynamically in my own class and released in the destructor of the object that lives in main(). My goal was to lock it so that if the .exe were launched a second time, the second instance could not get access - effectively giving me a single-instance application. It compiles without errors, but when I launch the server from a second .exe (while the first is still running), the second instance still has access to the settings. What is my mistake, or am I not supposed to use QReadWriteLock in this case because it only applies to threads, not to separate executables?

QReadWriteLock is used to synchronize producer/consumer-like threads within a single application. If you want to prevent the same executable from being started more than once, you'd use a PID file or something similar. Another option is to use the QtSingleApplication code from the QtSolutions components: Qt: Best practice for a single instance app protection.

Related

SQLite reader.read() always returning false when executed from Task Scheduler

I have a C# program that uses a SQLite database to read/write data. The program requires UAC elevation and I need it to be running at all times. When I run it manually, which I have to 'Run as admin', the SQLite database functions normally and I am able to read/write data to the database file. However, my issue is when I try to have the program execute automatically when the computer starts.
As I mentioned earlier, I need this program to run at all times. So, I have put a couple of things in place that re-execute the program in the event it crashes (which works great). However, I also need it to start when the computer restarts. Normally this wouldn't be a problem, but the program requires UAC and I will rarely be around to click Yes on the UAC dialog, so I read around and it seems the only way to do this is to set up a task in Task Scheduler. So, I have set up a task to run the program at startup. Upon testing, the program does execute but does not function correctly. Upon further debugging, I've found that each time my code reaches a SQLiteDataReader.Read() call, it returns false even though I know there are records there, but this only happens when the program is executed through Task Scheduler. No errors seem to be coming from SQLite. I suspect file permissions are the issue, but I don't know how to resolve it.
A couple of things to note about what I've tried already:
1) In the Task Scheduler, I've set this up to execute using the same user account as I've been using to run it manually, which is also a Domain Admin, Admin, and a local Admin account.
2) The task is set to "run with highest privileges"
3) I've changed the security permissions to Full Control for just about every object I can think of (Admins, Domain Admins, Users, Everyone, etc.) on both the root folder of the program AND the SQLite database file.
4) I've even tried moving the entire application outside of the Program Files folder in case there was some sort of restricted access involved there as well.
I'm at my wits' end trying to figure this out. Any ideas on what to try next? Or other solutions to get this to execute correctly at startup without user interaction?
I'm a bit late reporting back on this issue. Stupid on my part... The scheduled task simply needed the application's folder set as its 'Start in' (startup) path. The program wasn't finding the database file because my code references it with a relative path. I personally don't understand why this doesn't just default to the app's folder, but you live and you learn and bang your head on everything in between.
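For anyone hitting the same thing: another way to make the program independent of the scheduled task's working directory is to resolve the database path against the executable's own folder instead of relying on a relative path. A minimal sketch, assuming System.Data.SQLite and a placeholder file name of settings.db:

// Resolve the database file relative to the .exe's folder, not the current
// working directory, so it works no matter what "Start in" is set to.
// "settings.db" is just a placeholder name.
using System;
using System.Data.SQLite;
using System.IO;

static class Database
{
    public static SQLiteConnection Open()
    {
        string baseDir = AppDomain.CurrentDomain.BaseDirectory;   // folder containing the .exe
        string dbPath = Path.Combine(baseDir, "settings.db");
        var connection = new SQLiteConnection("Data Source=" + dbPath + ";Version=3;");
        connection.Open();
        return connection;
    }
}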

Best practice: where to put a URL that configures my app?

We have a Qt app that, when it starts, tries to connect to a servlet to get the config parameters it needs to keep running.
The URL may change frequently because we have to test the application in several environments. Right now (as a temporary solution) the URL is a constant in source code, but it is a little bit ugly.
Where is the best place to maintain this URL, so that we do not need to change the source code every time we want to change the target environment?
In a database table maybe (my application uses a SQLite DB), in a settings file, or in some other way?
Thank you for your replies.
You have a number of options:
Hard coded (like you have already)
Run-time user input
Command line arguments
QSettings
Read from a bespoke file as text.
I would think option 3 (command line arguments) would be the simplest to implement without being intrusive, but it does depend on what kind of application you have.
I would keep the list of URLs in a document, e.g. an XML file, stored in a central, well-known place such as a known web server, and hard-code the URL of that place in the app.
The list could then be edited externally without recompiling your app.
At startup the app would download and parse the list, then point to the right servlet based on an environment name specified as a command-line parameter.

Editing an Excel document with macros in ASP.NET

Is it possible in any way to edit an Excel sheet that contains macros through an ASP.NET page? I have tried to open the Excel sheet and it seems to just hang rather than load. Testing with a sheet without macros works perfectly fine.
Disclaimer: I don't know the Excel license agreement and I don't know if utilizing Excel in a server process violates it or not. This is purely a technical description of how to get it working. The reader is advised to check the license agreement to see if it's allowed to do so or not. Different Office versions may have different license agreements. I used this method at several Fortune 100/500 companies and they didn't seem to care. Go figure.
This solution works, but it has some limitations and requires a fair amount of control over the server where it runs. The server also needs to have lots of memory.
To start, make sure that you perform a complete installation of every single Office feature on the server so that Excel won't try to install something if you attempt to use a feature that's not present.
You also need to create a dedicated user account on the server that has the right privileges. I can't tell you exactly which privileges, because in my case we controlled the server and we gave admin rights to this user.
When you have the user account, you need to log in as that user and run Excel (preferably all Office applications) at least once so that it can create its settings.
You also need to configure Excel to run under this user account when it's created as a COM object. For this, you need to go into DCOM Config on the server and configure Launch and Activation Permissions for the Excel.Application object to use your new user account. I'm not sure if I remember correctly, but I think after this step, running Excel as an interactive user was slightly problematic.
By default, Office applications try to display various messages on the screen: warnings, questions, etc. These must be turned off because when you utilize an Office application from a web application, it runs on the server so a human user won't be around to dismiss these messages - the Office program will just sit around indefinitely, waiting for the message to be dismissed.
You need to set (at the minimum) these properties:
DisplayAlerts = false
AskToUpdateLinks = false
AlertBeforeOverwriting = false
Interactive = false
Visible = false
FeatureInstall = 0 'msoFeatureInstallNone
to disable UI messages from Excel. If you use Excel 2010, there may be more, but I'm not familiar with that.
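As a rough illustration, the startup code in C# interop might look something like the following sketch (assuming the Microsoft.Office.Interop.Excel primary interop assembly and office.dll are referenced):

// Minimal sketch: create Excel and turn off everything that could pop up a dialog.
using Microsoft.Office.Core;                      // for MsoFeatureInstall
using Excel = Microsoft.Office.Interop.Excel;

static class ExcelFactory
{
    public static Excel.Application CreateSilentInstance()
    {
        var app = new Excel.Application();

        app.DisplayAlerts = false;
        app.AskToUpdateLinks = false;
        app.AlertBeforeOverwriting = false;
        app.Interactive = false;
        app.Visible = false;
        app.FeatureInstall = MsoFeatureInstall.msoFeatureInstallNone;

        return app;
    }
}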
If you have Excel files with macros in them, you may have to disable macro security in Excel - that can't be done programmatically, for obvious reasons.
To access Excel services, implement a manager object that will actually hold the Excel reference - don't try to hold the Excel.Application object in the page because your page code will become very complicated and you may not be able to properly clean things up.
The object that holds the Excel reference may be a separate DLL or an out-of-process server. You must make sure, however, that whenever you acquire an instance of Excel on a given thread, you create a new Excel instance. The default behavior is that an already running Excel instance will also serve other requests, but that won't work for you because the same Excel instance cannot be shared among multiple threads. Each request-processing thread in IIS must have its own Excel instance - if you share instances, you'll have all kinds of problems. This means that your server will need quite a bit of memory to have many instances of Excel running. This was not an issue for me because we controlled the server.
If you can, try to create an out-of-proc (.exe) COM server because this way you can hold the Excel reference in a separate process. It's possible to get it working using an in-proc (.dll) COM object but it'll be more risky to your application pool - if Excel crashes, it'll crash your app pool as well.
When you have an .exe server, you can pass parameters in several possible ways:
Make your manager object a COM object and pass parameters as properties.
Pass parameters on the command line to the .exe as it starts up.
Pass parameters in a text/binary file; pass the name of the file on the command-line.
I used all these and found the COM object option the cleanest.
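For illustration only, a COM-visible manager object with parameters exposed as properties could look roughly like this; the class, interface and property names are invented, and the GUIDs are placeholders you'd generate yourself:

// Hypothetical sketch of the "COM object with properties" option.
using System;
using System.Runtime.InteropServices;

[ComVisible(true)]
[Guid("11111111-2222-3333-4444-555555555555")]   // placeholder GUID - generate your own
public interface IExcelJobManager
{
    string WorkbookPath { get; set; }    // parameters passed as properties
    string OutputPath { get; set; }
    void Run();                          // performs the actual Excel work
}

[ComVisible(true)]
[Guid("66666666-7777-8888-9999-aaaaaaaaaaaa")]   // placeholder GUID - generate your own
[ClassInterface(ClassInterfaceType.None)]
public class ExcelJobManager : IExcelJobManager
{
    public string WorkbookPath { get; set; }
    public string OutputPath { get; set; }

    public void Run()
    {
        // ...create the Excel instance and process WorkbookPath here...
    }
}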
In your manager object, follow these guidelines:
Wrap every single function that uses Excel in a try..catch block to capture any possible exception.
Always explicitly release all Excel objects by calling Marshal.ReleaseComObject() and then setting your variables to null as soon as you don't need them. Always release these objects in a finally block to make sure that a failed Excel method call won't result in a dangling COM object.
If you try to use any formatting features in Excel (page header, margins, etc.) you must have a printer installed and accessible to the user account that you use to run Excel. If you don't have an active printer (preferably attached to the server), formatting-related features may not work.
When an error happens, close the Excel instance that you're using. It's not likely that you can recover from Excel-related errors and the longer you keep the instance, the longer it uses resources.
When you quit Excel, make sure that you guard that code against recursive calls - if your exception handlers try to shut down Excel while your code is already in the process of shutting down Excel, you'll end up with a dead Excel instance.
Call GC.Collect() and GC.WaitForPendingFinalizers() right after calling the Application.Quit() method to make sure that the .NET Framework releases all Excel COM objects immediately.
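A rough sketch of that release pattern, assuming the Excel interop assembly is referenced (the workbook path parameter is just illustrative):

// Sketch: open a workbook, do some work, and release every COM object explicitly.
using System;
using System.Runtime.InteropServices;
using Excel = Microsoft.Office.Interop.Excel;

static class ExcelJob
{
    public static void Process(string workbookPath)
    {
        Excel.Application app = null;
        Excel.Workbooks workbooks = null;
        Excel.Workbook workbook = null;
        try
        {
            app = new Excel.Application();
            app.DisplayAlerts = false;       // see the property list above
            app.Visible = false;

            workbooks = app.Workbooks;
            workbook = workbooks.Open(workbookPath);

            // ...do the actual work here...

            workbook.Save();
        }
        finally
        {
            // Release innermost objects first, even if something above failed.
            if (workbook != null) { workbook.Close(false); Marshal.ReleaseComObject(workbook); workbook = null; }
            if (workbooks != null) { Marshal.ReleaseComObject(workbooks); workbooks = null; }
            if (app != null)
            {
                app.Quit();
                Marshal.ReleaseComObject(app);
                app = null;
            }
            // Force the runtime to run finalizers so no stray RCWs keep Excel alive.
            GC.Collect();
            GC.WaitForPendingFinalizers();
        }
    }
}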
Edit: John Saunders may have a point regarding the license agreement - I can't advise about that. The projects that I did using Word/Excel were all intranet applications at large clients and the requirement to use Word/Excel was a given.
The link he provided also has some tools that may be useful, although those libraries won't have full Excel functionality and if that's what you need, you don't have a choice. If you don't need full Excel functionality, check out those libraries - they may be much simpler to use.
A few links that may be useful to people trying to implement this approach:
StackOverflow question
Possible alternate products
COM server activation and window stations
The story changed a little while ago, with HPC Services for Excel.
With that, you can do Office Automation on a web server. I'm still trying to determine how it fits my situation, but you may want to check it out.

Identify what process is using DirectShow filter and kill it?

Is there a way to identify what process is using a particular DirectShow filter? Specifically a video capture filter.
If our application throws an exception trying to use a DirectShow filter because it's already in use, we would like to identify the process that is using the filter and kill it. Of course this is not a general purpose or distributed application but one installed on a dedicated computer whose sole purpose is to run our application.
Thanks,
Ideally, killing a process should be avoided if at all possible... many bad things can happen as a result. That said, my proposal consists of 5 parts:
Locating the filter DLL file in the file system.
Enumerating all processes.
Enumerating all loaded modules of each process.
Identifying which process is using the filter.
Killing the process.
Since you did not specify any language or programming framework, I will assume C#/.net just for convenience.
1- DirectShow filters are just COM objects, so they are registered in the system as such. You need to figure out the GUID of your filter; using this GUID, you can locate the registry key where the object's information is stored, and from there retrieve the location of the DLL in the file system. Microsoft.Win32.Registry can be used to access the registry.
2- System.Diagnostics.Process.GetProcesses() can be used to enumerate all running processes.
3- System.Diagnostics.Process.Modules can be used to enumerate all the modules (DLLs) loaded by a process.
The rest of the steps should be trivial.
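Putting those steps together, a rough C# sketch might look like this. The CLSID is a placeholder for your capture filter's GUID, and note that a loaded module only tells you the filter DLL is mapped into the process, not necessarily that the capture device is actually in use:

// Rough sketch, not production code; access-denied processes are simply skipped.
using System;
using System.Diagnostics;
using Microsoft.Win32;

class FilterKiller
{
    static void Main()
    {
        const string clsid = "{00000000-0000-0000-0000-000000000000}"; // placeholder GUID

        // 1) Find the filter DLL path from HKCR\CLSID\<guid>\InprocServer32.
        string dllPath;
        using (var key = Registry.ClassesRoot.OpenSubKey(@"CLSID\" + clsid + @"\InprocServer32"))
        {
            dllPath = key == null ? null : (string)key.GetValue(null);
        }
        if (dllPath == null) return;
        // (in practice the value may need quote-trimming or environment-variable expansion)

        // 2) + 3) Enumerate processes and their loaded modules.
        foreach (var process in Process.GetProcesses())
        {
            try
            {
                foreach (ProcessModule module in process.Modules)
                {
                    // 4) This process has the filter DLL loaded.
                    if (string.Equals(module.FileName, dllPath, StringComparison.OrdinalIgnoreCase))
                    {
                        // 5) Kill it (but never our own process).
                        if (process.Id != Process.GetCurrentProcess().Id)
                            process.Kill();
                        break;
                    }
                }
            }
            catch (Exception)
            {
                // Access denied or 32/64-bit mismatch on some processes; skip them.
            }
        }
    }
}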

What would be a good approach to using Thread or ThreadPool in ASP.NET?

I am developing a component to provide bespoke bulk-import functionality in ASP.NET. Under the hood this component will be using the SqlBulkCopy class. There will be different file formats. The file is imported into an intermediate table and is then transformed into the required tables. The uploaded file can be big and might take a couple of minutes to process. I would like to use a Thread or the ThreadPool to do the processing asynchronously. Can you please suggest a good approach to handle this problem?
Note: this is an internal application which would be used by at most 2-5 people at any given time.
The main problem with firing up additional threads in ASP.NET is that the framework can rip the AppDomain out from under you (for example, if someone edits the web.config or IIS decides to recycle the worker process). If that happens, your worker thread is also terminated and you can't really control it.
If you don't think that'll be a problem, then it doesn't really matter, but I would suggest that a better solution would probably be to fire up the work in a separate process that you can then monitor from your web application.
That way, if someone edits the web.config or IIS recycles the worker process, the import keeps running independently and you don't have to worry.
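A minimal sketch of that idea; the worker executable name, its location, and the argument format are placeholders, and the worker itself would do the actual SqlBulkCopy work:

// Sketch: run the import in its own process so an app-pool recycle can't kill it.
using System.Diagnostics;

public static class ImportLauncher
{
    public static int Start(string uploadedFilePath)
    {
        var startInfo = new ProcessStartInfo
        {
            FileName = @"C:\ImportWorker\BulkImportWorker.exe",   // hypothetical worker exe
            Arguments = "\"" + uploadedFilePath + "\"",
            UseShellExecute = false,
            CreateNoWindow = true
        };

        Process process = Process.Start(startInfo);
        return process.Id;   // store the PID so the web app can check on the worker later
    }
}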
Here is my approach:
Ask the user to paste in the UNC path to the file. Save this path into a table in SQL.
Write a Windows service that checks for new entries in the path table. When it finds a new entry, it starts processing the file, periodically updating the table with its progress and checking the flags (below).
Have an AJAX callback in the browser that checks the table for progress and returns it as a percentage to the client. Allow the client to stop the process by setting some flags in the table.
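A rough sketch of the service's polling loop; the connection string and the table and column names (ImportQueue, FilePath, Status, Progress) are invented for illustration:

// Sketch of the loop a Windows service might run.
using System;
using System.Data.SqlClient;
using System.Threading;

class ImportPoller
{
    const string ConnectionString =
        "Data Source=.;Initial Catalog=BulkImport;Integrated Security=True";   // placeholder

    public void Run(CancellationToken token)
    {
        while (!token.IsCancellationRequested)
        {
            using (var connection = new SqlConnection(ConnectionString))
            {
                connection.Open();

                int id = 0;
                string path = null;

                // Pick up the next pending file, if any.
                using (var command = new SqlCommand(
                    "SELECT TOP 1 Id, FilePath FROM ImportQueue WHERE Status = 'Pending'", connection))
                using (var reader = command.ExecuteReader())
                {
                    if (reader.Read())
                    {
                        id = reader.GetInt32(0);
                        path = reader.GetString(1);
                    }
                }

                if (path != null)
                {
                    // Parse the file, SqlBulkCopy it into the intermediate table, transform it,
                    // and periodically update Progress / check the stop flag in ImportQueue.
                    ProcessFile(connection, id, path);
                }
            }

            Thread.Sleep(TimeSpan.FromSeconds(10));   // poll interval
        }
    }

    private void ProcessFile(SqlConnection connection, int id, string path)
    {
        // ...import logic goes here...
    }
}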

Resources