Including patches with an installer using a Basic MSI InstallShield project - ASP.NET

I'm currently stuck with an InstallShield project for installing our ASP.NET application and need to implement upgrading. From my initial investigation it seems extremely complicated for what is essentially copying over a number of files.
Of the options available (patches, small updates, minor upgrades, and major upgrades), a patch seems to suit our needs best, but it is delivered as a separate .exe.
Is there a way to include patches in the full setup.exe, or is there another recommendation that makes the whole process less complicated?
EDIT
Any alternative recommendation still needs to work as part of an installer.

No, there is no way to include patches in the installer setup.exe. Patches, as well as small and minor updates, are applied to an already installed application; that is, users have already used the original installation package to install your application, and the patch contains only the small set of files that were modified.
What you want is a major upgrade. This kind of package contains all the required files and can be used to install the application for the first time. Where the application is already installed, it will automatically remove the old version and install the new one.
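For illustration, here is a minimal sketch of what a major upgrade looks like in practice (the file name and command-line options below are assumptions, not taken from your project). The rebuilt MSI keeps the same UpgradeCode as the original but gets a new ProductCode and a higher ProductVersion; installing it is then just an ordinary install:

    :: Hypothetical example: the new build is installed like any other MSI.
    :: Windows Installer finds the older product through the shared UpgradeCode,
    :: removes it, then installs the new version.
    msiexec /i OurAspNetApp-2.0.0.msi /qb /l*v upgrade.log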

If it involves only copying files then, IMO, the best option is to ship the files in the required directory structure and ask users to overwrite the existing copies. A slightly more user-friendly measure would be to zip up the directory structure along with a batch file, ask users to unzip it into a designated folder under the app directory, and then run the batch file to overwrite the files.
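As a rough illustration of that batch-file approach (all paths here are made up), the script only needs to copy the unzipped tree over the deployed site and overwrite what is already there:

    :: Hypothetical paths: copy the unzipped update over the existing site.
    :: /E copies subdirectories (including empty ones), /Y suppresses overwrite prompts.
    xcopy "C:\inetpub\wwwroot\MyApp\update\*" "C:\inetpub\wwwroot\MyApp\" /E /Y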

Related

How to include a full R distribution in my GitHub repository

I build transport models for various government agencies. My model is managed through GitHub, and it depends on R to perform certain calculations. I currently have my entire R installation folder in the repository. This can't be the right solution, but here are some of my constraints:
My clients are usually even less sophisticated programmers than I am. When they download/clone the model, it just needs to work.
This needs to be the case 10 years from now - regardless of what the current build of R and all the package dependencies are.
Placing my entire R folder in the repo solves these two problems, but creates some new ones:
The repository is much larger than it needs to be / longer download time.
If the transport model is updated to a new version (say v2.0), I'd want to update R and its packages to the latest versions. I'm afraid this would increase the size of the repo even further.
One solution I understand is submodules. I could place the full R folder in a separate repo and bring it in as a submodule. This, at the very least, cleans up the model repository.
What about zipping the R folder? Some early testing showed that git can diff the zip file, but I don't know if it is treating it as a flat file or reading the contents. Also, is GitHub going to complain about a 100 MB+ zip file? I'd like to avoid Git LFS if I can, but asking my clients to unzip that file wouldn't be a problem.
I also looked at packrat, but as far as I can tell, that only works for R projects.
Lastly, I don't entirely understand makefiles / recipes, but it would be nice if there was a script I could run that would download specific versions of R and its libraries. One complicating thing is that some of the R packages are private GitHub repos.
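For what it's worth, the kind of pinned-version bootstrap script described here might look roughly like the following sketch. The CRAN archive URL pattern is the published one for old Windows builds, but the version numbers, package names, and private repo name are placeholders only:

    # Sketch only: download and silently install a specific R release (Windows build),
    # then install pinned package versions. A GitHub token is assumed to be available
    # locally for the private packages.
    R_VERSION=4.2.3
    curl -fLO "https://cran.r-project.org/bin/windows/base/old/${R_VERSION}/R-${R_VERSION}-win.exe"
    ./R-${R_VERSION}-win.exe /SILENT
    Rscript -e 'remotes::install_version("dplyr", version = "1.1.0", repos = "https://cloud.r-project.org")'
    Rscript -e 'remotes::install_github("my-org/private-pkg@v0.3.1")'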
Anyway, I'm happy to provide more info if needed. Thank you for your help!

Qt online installers list all repos

I'm learning the Qt Installer Framework and creating repositories on a web server to be used by the online installers. My problem is that even though each repository and installer is treated separately in the creation process, when I run any of the installers they list ALL the programs/repos on our server and check each of them for installation. It is of course possible to manually uncheck them, but really I want an installer specific to each program, as different departments at my work use different programs and shouldn't have to go through the full list.
I don't understand why the installers are listing everything: each program/package has an individual repository on the server and an individual installer created using a config.xml, installscript.qs, and package.xml specific to that program. Nowhere in any of those files is there a reference to any other program or repository, and I've used the repogen and binarycreator for each individual program rather than as a batch. The only thing I can think of that might be affecting it is that the individual packages are subfolders under the same "package" folder in one unified installer framework folder. I just point the repogen.exe and binarycreator.exe at the desired subfolder. Could this be causing my problem? Do I really need to have a separate installer folder with config and package subfolders for each program?
There's obviously a lot of moving parts to this so I'm not sure what specific code/info I should post, but please feel free to ask me for something that may be helpful and I will provide.
I'm not sure of the details of why, but the answer appears to be that every program needs its own installer folder. It seems that everything under a single "packages" directory is considered a component of the same program, regardless of subfolders or arguments passed to the binarycreator. So if you have multiple programs, they each need their own installer folder with config and packages directories. That is, unless you want a full list of available programs associated with each installer; and then what's the point of separate installers?
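For reference, a sketch of the per-program layout described above (the names are invented) and the corresponding tool invocations:

    # Hypothetical layout: one installer folder per program, each with its own
    # config/ and packages/ directories.
    #   installer-appA/config/config.xml
    #   installer-appA/packages/com.example.appA/...
    #   installer-appB/config/config.xml
    #   installer-appB/packages/com.example.appB/...

    # Generate the online repository and the installer for appA only:
    repogen -p installer-appA/packages repositories/appA
    binarycreator --online-only -c installer-appA/config/config.xml -p installer-appA/packages AppAInstaller.exe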

What is the recommended procedure for upgrading to a new maintenance build of oXygen XML?

Periodically we receive announcements of new maintenance builds of Oxygen XML Editor. It's easy to locate documentation on installing new versions, but I was unable to find any instructions on installing maintenance builds.
In the past I've renamed the downloaded folder, e.g., "17-1", which completely duplicates all the files in Applications (I'm using OS X), and then later deleted the older folders when it seemed safe to do so.
I would like to know the best-practice, most efficient way to routinely install these frequently released maintenance builds.
Since there is no Oxygen installer for OS X (it's just an archive), there is no straightforward way of upgrading (installing in the same folder), like there is for Windows or Linux.
The official upgrade procedure for maintenance builds (it's the same for minor version updates) goes like this:
To upgrade:
For Windows and Linux you can install the new build in the same folder as the previous installation; it will automatically upgrade it.
Before you upgrade, if you have added files or made changes to any of the files from the Oxygen installation folder (especially the frameworks folder), you may want to create a backup of them because they will be overwritten during the upgrade procedure. Custom frameworks will be preserved but we recommend backing them up anyway, just to be safe.
For Mac OS X you will have to either move the old folder from Applications to a different location and put the new version of Oxygen in its place, or install in a different folder. You can then copy any files you may have changed from the old folder (if any) to the new folder.
The Oxygen preferences will be preserved since they are located elsewhere (user home folder).
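As a rough sketch of that OS X procedure (the folder names depend on how the archive is named and are just examples):

    # Keep the old copy aside, drop the new build in its place, then copy back
    # anything you had customised (e.g. a custom framework). Paths are hypothetical.
    mv "/Applications/Oxygen XML Editor" "/Applications/Oxygen XML Editor-old"
    cp -R "$HOME/Downloads/Oxygen XML Editor" /Applications/
    cp -R "/Applications/Oxygen XML Editor-old/frameworks/myFramework" \
          "/Applications/Oxygen XML Editor/frameworks/"

Once you've confirmed everything works, the old folder can be removed.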
What I'd like to add is that, if you have custom frameworks and want to keep Oxygen up to date, it's a good idea to keep the custom frameworks in a different folder (from your user home) than the Oxygen installation folder and simply configure Oxygen to load them from that folder (Options > Preferences, Document Type Association > Locations, Additional frameworks directories). This greatly simplifies the upgrade procedure.
Regards,
Adrian
According to a colleague, his way of doing it, FWIW:
I keep all oXygen stuff in the directory /Applications/oxygen
When I get a new oxygen.zip download, I put it there, unzip it, and rename the directory to the oXygen version name. So right now I have /Applications/oxygen/17.0
I usually compress the previous version and delete the directory for it, but keep the zipfile for a while in case I need to revert to the old version
I keep the related jarfiles in /Applications/oxygen/lib so that they don't live in the same directory as an oxygen version that might get upgraded
I create an alias under /Applications named "oxygen" that points to whatever current version of oXygen Editor I'm using (and it needs to be updated whenever the current directory changes)
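For reference, a minimal sketch of that layout with a symbolic link playing the role of the alias (the version numbers and the link name are just examples):

    # Hypothetical: unpack each release into a versioned folder and keep a link
    # that always points at the version currently in use.
    cd /Applications/oxygen
    unzip ~/Downloads/oxygen.zip        # assumed to produce a single top-level folder
    mv oxygen 17.0                      # rename it to the version number
    ln -sfn /Applications/oxygen/17.0 /Applications/oxygen-current
    # After an upgrade, repoint the link:
    # ln -sfn /Applications/oxygen/17.1 /Applications/oxygen-current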
I can't accept this as the best answer unless I receive confirmation that this is the ideal method on Mac OS X. If there is another proposed procedure that is conventionally accepted as the best practice, or a definitive answer from an authoritative source, then I will accept that answer.

Use Fossil for system files?

As a new user of Fossil, I'm curious if there are any negative implications of using Fossil to store things like the /etc and /usr/local/etc files from Unix-like systems such as FreeBSD and OpenBSD. If I'm doing this for multiple systems, I think I'd create a branch for each hostname to track those files.
Q1: Have you done this? Do you prefer a different VCS to handle the system files?
Q2: Lots of changes have happened in Fossil over the years and I'm curious whether it's possible to restrict who can merge branches with trunk. From reading earlier threads it wasn't possible, but there are two workarounds:
a) tell people not to merge to trunk
b) have people clone and trunk maintainer pick up changes from their repo
System configuration files stored in /etc, /var or /usr/local/etc can generally only be edited by the root user. But since root has complete access to the whole system, a mistaken command there can have dire consequences.
For that reason I generally use another location to keep edited configuration files: a directory in my home directory that I call setup, which is under the control of git. Since I have multiple machines running FreeBSD, each machine gets its own subdirectory. There is a special subdirectory of setup called shared for those configuration files that are used on multiple machines. Maintaining multiple copies of identical files in separate repositories or even branches can be a lot of extra work.
My workflow is the following:
Edit a configuration file in my repository.
Copy it to its proper location.
Test the changes. If problems occur, go back to step 1.
Commit the changes to the revision control system. Copy the committed files to their proper location.
Initially I had a shell script (basically a list of install commands) to install the files for me. But I also wanted to see the differences between the working tree and the installed files.
So for my convenience, I wrote a script called deploy to help me with this. It can tell me which files in the repo are different from the installed files and can show me the differences. It can also install files to their proper locations.
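A stripped-down sketch of what such a deploy script can do (the setup layout and the repo-to-system mapping here are assumptions; the real script is more elaborate):

    #!/bin/sh
    # Hypothetical layout: ~/setup/<hostname>/etc/... mirrors /etc/... on this machine.
    SRC="$HOME/setup/$(hostname -s)"
    cd "$SRC" || exit 1
    for f in $(find etc usr/local/etc -type f 2>/dev/null); do
        if ! cmp -s "$f" "/$f"; then
            echo "== /$f differs from the repo copy:"
            diff -u "/$f" "$f"
            # To install the repo version over the live file (run as root):
            # install -m 644 "$f" "/$f"
        fi
    done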

Single git repo setup tracking multiple locations on hard drive

I'm very new to the world of git (done some svn in the past) and would like some advice on trying to accomplish the following.
My current workflow is that I set up the static HTML files using Middleman to get the base HTML structure and styles before porting them over to a WordPress template. These static files are located at C:/git/project-name/HTMLTemplates.
My WordPress setup uses XAMPP, so the theme files are kept in C:/Xampp/wordpress/wp-content/themes/project-theme.
What I would like to do is have a single git repo that tracks the changes in the two different locations (HTMLTemplates and project-theme).
Is this at all possible, or do I simply create two individual repos (e.g. project-static and project-wordpress)?
No, there is no mechanism in git for this. Git assumes that all the files it manages (the "working copy") live in a single directory (and its subdirectories); there is no support for managing two separate directories in one repo.
So you'll have to somehow keep everything in one directory, probably as subdirectories HTMLTemplates and theme or similar.
You could use two git repos, but I'd strongly advise against this. A single repo should contain a whole "project", i.e. everything needed to build one piece of software (excluding things like external libraries). If you split your project across two repositories, you cannot usefully branch and merge (because you'd have to do it in both repos simultaneously), you cannot easily check out old versions, etc.
To solve your problem, I see a few possible solutions:
Have some build / deployment script that copies everything to the right places. You probably already have a script that invokes Middleman, and possibly tells Wordpress to refresh its cache, so you could add it there.
Set up a symbolic link for the wordpress directory. On UNIX-like systems this is easy and commonly done. On Windows, you can create "junction points", which I believe work similarly (see the sketch below).
Configure Wordpress / Apache to read the directory directly from your git working copy. The path should be configurable.
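As a small illustration of the second option (the paths are only examples), a directory junction on Windows can make the theme folder inside the git working copy appear at the location WordPress expects:

    :: Hypothetical paths: expose the theme subdirectory of the working copy at the
    :: location XAMPP's WordPress install looks for themes.
    mklink /J "C:\Xampp\wordpress\wp-content\themes\project-theme" "C:\git\project-name\project-theme"

The theme folder inside the repo (project-theme here) is an assumed name; adjust it to wherever the theme actually lives in your working copy.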
I would prefer the first solution; this has the added advantage that it will decouple your development environment from the server configuration. This will make it easier if your setup later changes or your project needs to run in a different environment (development on a different machine, someone else also wants to work on your project, you want to deploy to a hosted server somewhere etc.).
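To make the first option concrete, a minimal sketch of such a copy step (paths are examples; in practice it would live in whatever script already runs Middleman):

    :: Hypothetical deployment step: copy the theme from the git working copy
    :: into the XAMPP WordPress themes folder. /E copies all subdirectories.
    robocopy "C:\git\project-name\project-theme" "C:\Xampp\wordpress\wp-content\themes\project-theme" /E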
Note: The problem is, I believe, that you are trying to use git as a deployment tool. While many people do this, git is not really suitable for this purpose. Deployment should usually be a separate step.

Resources