SQLite database not working after creating a Windows installer using Excelsior JET - sqlite

Trying to build an installer using Excelsior JET
I am trying to create an installer for an Eclipse RCP application product.
The product works fine; my only concern is that when I try to make a Windows installer (using Excelsior JET and Install Creator), the database does not update.

I didn't look at the tutorial (since it is Flash), but is the problem that the MSI does not overwrite an existing database file on installation? If so, this normally has to do with MSI's default file-versioning rules and how it preserves modified, unversioned files - in essence, non-versioned files whose create and modify date stamps differ. This is a common source of confusion with MSI deployments.
I will check back to see if this is the problem. In the meantime, here is a link to an answer describing ways to deploy data files and per-user files and settings: Create folder and file on Current user profile, from Admin Profile. You might want to install a read-only database file to a per-machine location, and then copy it to the user profile upon application launch.
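To illustrate that last approach, here is a minimal sketch of the copy-on-first-run step (all paths and file names below are hypothetical; the same logic could equally live in the application's own startup code):

```powershell
# Hypothetical paths - adjust to the real install location and database file name.
$template = Join-Path $env:ProgramFiles 'MyRcpApp\data\app.sqlite'   # read-only copy installed per-machine
$userDb   = Join-Path $env:LOCALAPPDATA 'MyRcpApp\app.sqlite'        # writable per-user copy

if (-not (Test-Path $userDb)) {
    New-Item -ItemType Directory -Path (Split-Path $userDb) -Force | Out-Null
    Copy-Item $template $userDb
    # Clear the read-only flag so the application can update the per-user copy.
    Set-ItemProperty $userDb -Name IsReadOnly -Value $false
}
```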

Related

How can I work on an existing Firebase (functions) project on a different machine?

I'm relatively new to Firebase and want to work on one Firebase project from multiple machines. When setting up a new project locally via the Firebase CLI and attaching it to an existing project in the cloud, a full project folder is created in my local directory.
Is there any chance of sort of "downloading"/updating an existing project to a second machine?
The workaround I'd have chosen would be to manually copy the whole directory to the new environment and then log in to Firebase.
But given the lack of source control, this would risk overwriting changes made on machine 1 yesterday when I run firebase deploy from machine 2 today, wouldn't it?
Sorry for maybe not expressing myself in a decent IT-guy way, but I'm far from being a full-blooded programmer.
Thanks!
You have to manage your source code yourself, typically using a source control mechanism such as git or svn. Firebase does not provide a source control system for the code and configuration that you deploy to Cloud Functions.
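For example, a minimal workflow for picking the project up on a second machine might look like this (the repository URL, branch name and project ID are placeholders):

```powershell
# On machine 1: put the functions folder under version control and push it to a remote.
git init
git add .
git commit -m "Initial functions code"
git remote add origin https://github.com/you/my-functions.git
git push -u origin main

# On machine 2: clone the repository instead of copying the folder by hand,
# then point the Firebase CLI at the existing cloud project.
git clone https://github.com/you/my-functions.git
cd my-functions
firebase login
firebase use my-project-id        # associate this checkout with the existing project
firebase deploy --only functions
```

With the code in a shared repository, a pull before deploying avoids clobbering yesterday's changes from the other machine.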

How to use 51Degrees via NuGet with Azure?

I'm trying to use 51Degrees in a .NET project that I deploy to Azure. In August 2011, they released v1.2.1.3, marked as "Azure Compatible":
Foundation can now be deployed on to the Windows Azure Cloud service. See the release note for full details on requirements and how to setup. Azure related changes include:
Instead of a log file, log entries are written to a log table.
Instead of a devices file, previous device requests are written to a device table.
A new conditional compilation symbol - 'AZURE'. AZURE enabled builds will not work in traditional ASP.NET.
Since then there have been a dozen releases and they are up to v2.1.4.9. However, their documentation is super light on how to use it with Azure. In fact, there was originally a bug, because the v1.2.1.3 notes stated
To make use of the changes you must create a storage account called 'fiftyonedegrees'. The foundation will then create two tables, one for previous devices, and another for logs.
This isn't possible because Azure storage account names need to be unique across all of Azure, so not everyone can create one named 'fiftyonedegrees'.
Their response was:
After rereading the blog it seems I've made an oversight in this regard, and will update shortly.
The storage account that the Foundation looks for can be changed in the Foundation source code. Go to Foundation/Properties/Constants.cs and change the string 'AZURE_STORAGE_NAME' to the name of your storage account.
However, I'm still at a loss as to how to use it within my project. Here are my issues:
I'm not clear whether v1.2.1.3 is the only Azure compatible release, or every release after is Azure compatible. Their documentation doesn't say.
When I install 51Degrees via NuGet, my project doesn't get an App_Data folder created which contradicts their documentation. The web.config file even has entries in it that reference the App_Data folder such as <log logFile="~/App_Data/Log.txt" logLevel="Info"/>.
Based on the response to the Azure storage account bug I quoted earlier, they are saying I need to edit the file Foundation/Properties/Constants.cs. However, since I'm installing via NuGet and it's a DLL, NuGet is presumably the wrong approach? Do I need to download the source, compile it myself, and wire it up to my project manually?
I'm generally new to .NET, NuGet, VS, etc so appreciate the help.
All versions are Azure compatible from 1.2.1.3 onwards. I'm assuming this is the blog post you were talking about. After you've created your Azure storage account, you'll have to edit the Constants.cs file in the source code and add in your account name. It's my understanding that this means you'll have to get access to the source code and edit it directly. Once you have done this, you'll need to recompile for the software to work correctly. I'm not sure if there is a way to perform the same task using NuGet, but I'll look into it. Hope this helps.

Looking for a good web application deployment strategy (ASP.NET MVC3)

I’m looking for a good deployment strategy for deploying a ASP.NET MVC3 application. What I imagine is that each deployment would be some kind of commit to a Source Management System in the sense that a deployment tool could automatically do the following:
1) Upon generating a deployment package (a commit), the tool would remember the state of my Web.Config file, the state of a folder of auto-generated scripts containing new database changes, the state of a folder of batch files that contain new tasks to be run on the server, the state of files specifying IIS settings changes, etc.
2) When I build a package the next time, the tool would know to only package the new script files, web.config changes, new batch files, and new IIS settings since my last package
3) Apply the package onto my web application
I started looking into MS Deploy but it only seems to do number 3. I've been searching around for either an application that does what I imagine or a strategy to combine a source management system and MS Deploy. I'm hoping that someone has already solved the problem I feel I have here. Building the tool myself is, of course, my last resort.
Are you using Team Foundation Server? If so, TFS comes with tools to automate builds (including labeling code, running unit tests, deploying, et cetera.) Take a look at http://msdn.microsoft.com/en-us/library/ms181710(v=vs.80).aspx
TFS is not exactly easy to configure and get going but it's free if you are already using TFS.
If you are not using TFS, look for continuous integration tools like NAnt or TeamCity.
Have you used Web Deploy and the "Publish" feature under Build in Visual Studio?
You can set options for things like leaving the previous files on the server.
As for your web.config file: do you mean the main one, or one that already exists elsewhere on the server? Your web.config file should be copied from your project to the server. Or are there settings that differ between running locally and on the server? If so, look at using transforms to modify web.config.
This is only a partial answer to #1 for you, but we searched for a long time for a migration tool that we liked... We ultimately found Migrator.Net: http://code.google.com/p/migratordotnet/
Using this, you can turn DB migrations into a batch command.

MSBuild: automate collecting of db migration scripts?

Summary of environment.
ASP.NET web application (source stored in SVN)
SQL Server database (schema (tables/sprocs) stored in SVN)
DB version is synced with the web application assembly version (stored in table 'CurrentVersion')
CI Hudson server that checks out the web app from the repo and runs a custom MSBuild file to publish/package the app.
My MSBuild script updates the assembly version of the web app (Major.Minor.Revision.Build) on each build. The 'Revision' is set to the currently checked-out SVN revision and the 'Build' to the Hudson build number (incremented on each automated build).
This way I can match the app to a specific trunk revision and also get other build stats from the Hudson build number.
I'd like to automate the collecting of migration scripts (updated sprocs etc) to add to the zip package.
I guess that by comparing the SVN revision of the database that has yet to be deployed to with the revision being deployed, I can find which DB files have changed in the trunk since the last deployment to that database/environment.
This could easily be achieved by manually calling the svn diff -r REVNO:REVNO command to list changed .sql files. These files would then have to be added to the package manually.
It would be great if this could be automated.
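For reference, that manual step could be scripted roughly as follows (the revision numbers and repository URL are placeholders); an MSBuild Exec task could shell out to the same thing:

```powershell
# Placeholders: $deployed is the revision currently on the target database,
# $current is the revision being packaged.
$deployed = 1200
$current  = 1234
$repo     = 'http://svn.example.com/myapp/trunk/Database'

New-Item -ItemType Directory -Path .\package\migrations -Force | Out-Null

# 'svn diff --summarize' lists only the changed paths; keep just the .sql files.
svn diff --summarize -r "${deployed}:${current}" $repo |
    ForEach-Object { ($_.Trim() -split '\s+', 2)[1] } |
    Where-Object { $_ -like '*.sql' } |
    ForEach-Object {
        # Each entry is a repository URL; export it into the package staging folder.
        svn export $_ (Join-Path '.\package\migrations' (Split-Path $_ -Leaf))
    }
```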
Firstly, I'd imagine I'll have to write a custom task to check the version of the database that has yet to be deployed to. After that I'm quite unsure.
Does anyone have any suggestion on how this would be achieved through an msbuild task either existing or custom?
Finally, I'll have to auto-generate a script, added to the package, that updates the database version table so it stays in sync with the application.
Integrating SQL changes into an automated build/deploy process is HARD. I know, because I've tried to do it a couple of times with limited success. What you're trying to do is roughly on the right track, but I would argue that it's actually a bit too complicated. In your proposal, you suggest collecting the specific SQL scripts that need to be applied to your DB at build/package time. Instead, you should package all your delta scripts (for the entire history of your database) with your project, and calculate the deltas that actually need to be applied when you deploy -- that way, your deployable package can be deployed to environments with databases of differing versions. There are two implementation pieces you need to achieve this:
1) You need to package your deltas into your deployable package. Note that you should package deltas -- not static files that create the schema in its current state. These delta scripts should be in source control. It's okay to keep the static schema in source control as well, but you will have to keep it in sync with the deltas. You can actually use a tool like Red Gate's SQLCompare or the VS Database version to generate (most) deltas from the static schema. To get the deltas into your deployable package, and given that you're using svn -- you may want to look into svn:externals as a way to "soft link" the delta scripts into your web project. Your build script can then simply copy them into your deployable package.
2) You need a system that can read the list of delta files, compare them to an existing database, determine which deltas need to be applied to that database, and then apply the deltas (and update the bookkeeping information, like the database version). There is an open-source project (sponsored by ThoughtWorks) called dbdeploy that accomplishes this. I've had some success with that tool personally.
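To make the bookkeeping concrete, here is a rough sketch of that idea (this is not dbdeploy's actual interface; the script naming convention, table and column names are assumptions), using numbered delta scripts applied via sqlcmd:

```powershell
# Assumed convention: delta scripts are named 001_xxx.sql, 002_xxx.sql, ... in .\deltas,
# and a ChangeLog table records the numbers of the scripts already applied.
$server = 'myserver'
$db     = 'MyAppDb'

$raw = sqlcmd -S $server -d $db -h -1 -W -Q "SET NOCOUNT ON; SELECT ISNULL(MAX(ChangeNumber), 0) FROM ChangeLog"
$applied = [int]($raw | Select-Object -First 1)

Get-ChildItem .\deltas\*.sql | Sort-Object Name | ForEach-Object {
    $number = [int]($_.Name.Split('_')[0])
    if ($number -gt $applied) {
        Write-Host "Applying delta $($_.Name)"
        sqlcmd -S $server -d $db -i $_.FullName
        sqlcmd -S $server -d $db -Q "INSERT INTO ChangeLog (ChangeNumber, AppliedOn) VALUES ($number, GETDATE())"
    }
}
```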
Good luck -- this is a tough nut to crack (correctly).
Have a look at SQL database projects. In VS 2010 they have been enhanced quite a bit and have built-in deployment capabilities that can sync your DEV database to other environments.
Here are a few good links about DB projects in vs 2010:
http://msmvps.com/blogs/deborahk/archive/2010/05/02/vs-2010-database-project-building-and-deployment.aspx
http://weblogs.asp.net/gunnarpeipman/archive/2009/07/29/visual-studio-2010-database-projects.aspx
Try SQL Examiner:
http://www.sqlaccessories.com/Howto/Version_Control.aspx
You can automate script collection with the SQL Examiner command-line tool.
The solutions available today that target a .NET/SQL Server stack are:
DbUp (open source)
ReadyRoll (deeper Visual Studio integration, auto-generation of scripts)
The latter product is one that we're actively developing here at Redgate.

How do you deploy your ASP.NET applications to live servers?

I am looking for different techniques/tools you use to deploy an ASP.NET web application project (NOT ASP.NET web site) to production?
I am particularly interested of the workflow happening between the time your Continuous Integration Build server drops the binaries at some location and the time the first user request hits these binaries.
Are you using some specific tools or just XCOPY? How is the application packaged (ZIP, MSI, ...)?
When an application is deployed for the first time how do you setup the App Pool and Virtual Directory (do you create them manually or with some tool)?
When a static resource changes (CSS, JS or image file) do you redeploy the whole application or only the modified resource? How about when an assembly/ASPX page changes?
Do you keep track of all deployed versions for a given application and in case something goes wrong do you have procedures of restoring the application to a previous known working state?
Feel free to complete the previous list.
And here's what we use to deploy our ASP.NET applications:
We add a Web Deployment Project to the solution and set it up to build the ASP.NET web application
We add a Setup Project (NOT Web Setup Project) to the solution and set it to take the output of the Web Deployment Project
We add a custom install action and in the OnInstall event we run a custom-built .NET assembly that creates an App Pool and a Virtual Directory in IIS using System.DirectoryServices.DirectoryEntry (this task is performed only the first time an application is deployed). We support multiple Web Sites in IIS, authentication for Virtual Directories, and setting identities for App Pools.
We add a custom task in TFS to build the Setup Project (TFS does not support Setup Projects so we had to use devenv.exe to build the MSI)
The MSI is installed on the live server (if there's a previous version of the MSI it is first uninstalled)
We have all of our code deployed in MSIs using Setup Factory. If something has to change, we redeploy the entire solution. This sounds like overkill for a CSS file, but it absolutely keeps all environments in sync, and we know exactly what is in production (we deploy to all test and UAT environments the same way).
We do rolling deployment to the live servers, so we don't use installer projects; we have something more like CI:
"live" build-server builds from the approved source (not the "HEAD" of the repo)
(after it has taken a backup ;-p)
robocopy publishes to a staging server ("live", but not in the F5 cluster)
final validation done on the staging server, often with "hosts" hacks to emulate the entire thing as closely as possible
robocopy /L is used automatically to distribute a list of the changes in the next "push", to alert of any goofs
as part of a scheduled process, the cluster is cycled, deploying to the nodes in the cluster via robocopy (while they are out of the cluster)
robocopy automatically ensures that only changes are deployed.
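Roughly, those robocopy steps come down to something like this (server names and paths are placeholders):

```powershell
# Preview: with /L nothing is copied, robocopy just lists what the next push would change.
robocopy \\buildserver\drops\mysite \\web01\d$\sites\mysite /MIR /L /NJH /NJS

# Actual push, run while the node is out of the load-balancer pool:
# /MIR mirrors the source, so only changed files are copied and orphaned files are removed.
robocopy \\buildserver\drops\mysite \\web01\d$\sites\mysite /MIR
```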
Re the App Pool etc; I would love this to be automated (see this question), but at the moment it is manual. I really want to change that, though.
(it probably helps that we have our own data-centre and server-farm "on-site", so we don't have to cross many hurdles)
Website
Deployer:
http://www.codeproject.com/KB/install/deployer.aspx
I publish the website to a local folder, zip it, then upload it over FTP. Deployer on the server then extracts the zip, replaces config values (in Web.Config and other files), and that's it.
Of course, for the first run you need to connect to the server and set up the IIS website and database, but after that, publishing updates is a piece of cake.
Database
For keeping databases in sync I use http://www.red-gate.com/products/sql-development/sql-compare/
If the server is behind a bunch of routers and you can't connect directly (which SQL Compare requires), use https://secure.logmein.com/products/hamachi2/ to create a VPN.
I deploy mostly ASP.NET apps to Linux servers and redeploy everything for even the smallest change. Here is my standard workflow:
I use a source code repository (like Subversion)
On the server, I have a bash script that does the following:
Checks out the latest code
Does a build (creates the DLLs)
Filters the files down to the essentials (removes code files for example)
Backs up the database
Deploys the files to the web server in a directory named with the current date
Updates the database if a new schema is included in the deployment
Makes the new installation the default one so it will be served with the next hit
Checkout is done with the command-line version of Subversion and building is done with xbuild (msbuild work-alike from the Mono project). Most of the magic is done in ReleaseIt.
On my dev server I essentially have continuous integration but on the production side I actually SSH into the server and initiate the deployment manually by running the script. My script is cleverly called 'deploy' so that is what I type at the bash prompt. I am very creative. Not.
In production, I have to type 'deploy' twice: once to check-out, build, and deploy to a dated directory and once to make that directory the default instance. Since the directories are dated, I can revert to any previous deployment simply by typing 'deploy' from within the relevant directory.
Initial deployment takes a couple of minutes and reversion to a prior version takes a few seconds.
It has been a nice solution for me and relies only on the three command-line utilities (svn, xbuild, and releaseit), the DB client, SSH, and Bash.
I really need to update the copy of ReleaseIt on CodePlex sometime:
http://releaseit.codeplex.com/
Simple XCopy for ASP.NET. Zip it up, SFTP to the server, extract into the right location. For the first deployment, manual setup of IIS.
Answering your questions:
XCopy
Manually
For static resources, we only deploy the changed resource.
For DLLs, we deploy the changed DLLs and ASPX pages.
Yes, and yes.
Keeping it nice and simple has saved us a lot of headaches so far.
Are you using some specific tools or just XCOPY? How is the application packaged (ZIP, MSI, ...)?
As a developer for BuildMaster, this is naturally what I use. All applications are built and packaged within the tool as artifacts, which are stored internally as ZIP files.
When an application is deployed for the first time how do you setup the App Pool and Virtual Directory (do you create them manually or with some tool)?
Manually - we create a change control within the tool that reminds us of the exact steps to perform in future environments as the application moves through its testing environments. This could also be automated with a simple PowerShell script, but we do not add new applications very often, so it's just as easy to spend the one minute it takes to create the site manually.
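For reference, the kind of PowerShell script mentioned above could look roughly like this (the names and path are made up), using the WebAdministration module that ships with IIS 7 and later:

```powershell
Import-Module WebAdministration

$appPool = 'MyAppPool'
$site    = 'MyApp'
$path    = 'D:\sites\MyApp'

# Create the application pool if it does not already exist.
if (-not (Test-Path "IIS:\AppPools\$appPool")) {
    New-WebAppPool -Name $appPool
    Set-ItemProperty "IIS:\AppPools\$appPool" -Name managedRuntimeVersion -Value 'v4.0'
}

# Create the site (this could just as well be New-WebApplication under an existing site).
if (-not (Test-Path "IIS:\Sites\$site")) {
    New-Website -Name $site -PhysicalPath $path -ApplicationPool $appPool -Port 80
}
```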
When a static resource changes (CSS, JS or image file) do you redeploy the whole application or only the modified resource? How about when an assembly/ASPX page changes?
By default, the process of deploying artifacts is set-up such that only files that are modified are transferred to the target server - this includes everything from CSS files, JavaScript files, ASPX pages, and linked assemblies.
Do you keep track of all deployed versions for a given application and in case something goes wrong do you have procedures of restoring the application to a previous known working state?
Yes, BuildMaster handles all of this for us. Restoring is mostly as simple as re-executing an old build promotion, but sometimes database changes need to be manually restored, and data loss can occur. The basic rollback process is detailed here: http://inedo.com/support/tutorials/performing-a-deployment-rollback-with-buildmaster
web setup/install projects - so you can easily uninstall it if something goes wrong
Unfold is a capistrano-like deployment solution I wrote for .net applications. It is what we use on all of our projects and it's a very flexible solution. It solves most of the typical problems for .net applications as explained in this blog post by Rob Conery.
it comes with a good "default" behavior, in the sense that it does a lot of standard stuff for you: getting the code from source control, building, creating the application pool, setting up IIS, etc.
releases based on what's in source control
it has task hooks, so the default behaviour can be easily extended or altered
it has rollback
it's all PowerShell, so there aren't any external dependencies
it uses PowerShell remoting to access remote machines
Here's an introduction and some other blog posts.
So to answer the questions above:
How is the application packaged (ZIP, MSI, ...)?
Git (or another SCM) is the default way to get the application onto the target machine. Alternatively, you can perform a local build and copy the result over the PowerShell remoting connection.
When an application is deployed for the first time how do you setup the App Pool and Virtual Directory (do you create them manually or with some tool)?
Unfold configures the application pool and website application using PowerShell's WebAdministration module. It allows us (and you) to modify any aspect of the application pool or website.
When a static resource changes (CSS, JS or image file) do you redeploy the whole application or only the modified resource? How about when an assembly/ASPX page changes?
Yes, Unfold does this: each deploy is installed next to the others. That way we can easily roll back when something goes wrong. It also allows us to easily trace a deployed version back to a source control revision.
Do you keep track of all deployed versions for a given application?
Yes, Unfold keeps old versions around. Not all versions, but a number of them. It makes rolling back almost trivial.
We've been improving our release process for the past year and now we've got it down pat. I'm using Jenkins to manage all of our automated builds and releases, but I'm sure you could use TeamCity or CruiseControl.
So upon checkin, our "normal" build does the following:
Jenkins does a SVN update to fetch the latest version of the code
A NuGet package restore is done running against our own local NuGet repository
The application is compiled using MSBuild; a sample invocation is sketched after this list. Setting this up is an adventure, because you need to install the correct MSBuild and then the ASP.NET and MVC DLLs on your build box. (As a side note, when I had <MvcBuildViews>true</MvcBuildViews> in my .csproj files to compile the views, MSBuild was randomly crashing, so I had to disable it)
Once the code is compiled the unit tests are run (I'm using nunit for this, but you can use anything you want)
If all the unit tests pass, I stop the IIS app pool, deploy the app locally (just a few basic XCOPY commands to copy over the necessary files) and then restart IIS (I've had problems with IIS locking files, and this solved it)
I have separate web.config files for each environment; dev, uat, prod. (I tried using the web transformation stuff with little success). So the right web.config file is also copied across
I then use PhantomJS to execute a bunch of UI tests. It also takes a bunch of screenshots at different resolutions (mobile, desktop) and stamps each screenshot with some information (page title, resolution). Jenkins has great support for handling these screenshots and they are saved as part of the build
Once the integration UI tests pass the build is successful
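As referenced in the compile step above, the MSBuild invocation itself is nothing exotic; a rough sketch (the solution name, package source and switches are assumptions, not the poster's exact setup):

```powershell
# Restore NuGet packages from the local repository, then build.
# MvcBuildViews is left off because, as noted above, enabling it caused random MSBuild crashes.
nuget restore .\MyApp.sln -Source http://nuget.internal.example.com/api/v2
msbuild .\MyApp.sln /p:Configuration=Release /p:MvcBuildViews=false /m /verbosity:minimal
```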
If someone clicks "Deploy to UAT":
If the last build was successful, Jenkins does another SVN update
The application is compiled using a RELEASE configuration
A "www" directory is created and the application is copied into it
I then use winscp to synchronise the filesystem between the build box and UAT
I send an HTTP request to the UAT server and make sure I get back a 200
This revision is tagged in SVN as UAT-datetime (a sketch of the tag command follows this list)
If we've got this far, build is successful!
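The UAT tagging step mentioned above is a plain server-side copy, roughly like this (the repository URL is a placeholder):

```powershell
# Tag the revision that was just deployed to UAT; 'svn copy' between URLs is a cheap, server-side operation.
$stamp = Get-Date -Format 'yyyyMMdd-HHmm'
svn copy "http://svn.example.com/myapp/trunk" `
         "http://svn.example.com/myapp/tags/UAT-$stamp" `
         -m "Tagging UAT release $stamp"
```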
When we click "Deploy to Prod":
The user selects a UAT Tag that was previously created
The tag is "switched" to
Code is compiled and synced with Prod server
HTTP request to the Prod server
This revision is tagged in SVN as Prod-datetime
The release is zipped and stored
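The zip-and-store step can be as simple as the following sketch (paths are assumptions; Compress-Archive needs PowerShell 5+, so an older build box might shell out to 7-Zip instead):

```powershell
$stamp = Get-Date -Format 'yyyyMMdd-HHmm'
Compress-Archive -Path .\www\* -DestinationPath "\\fileserver\releases\MyApp\Prod-$stamp.zip"
```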
All up a full build to production takes about 30 secs which I'm very, very happy with.
Upsides to this solution:
It's fast
Unit tests should catch logic errors
When a UI bug gets into production, the screenshots will hopefully show which revision # caused it
UAT and Prod are kept in sync
Jenkins shows you a great release history to UAT and Prod with all of the commit messages
UAT and Prod releases are all tagged automatically
You can see when releases happen and who did them
The main downsides to this solution are:
Whenever you do a release to Prod you need to do a release to UAT. This was a conscious decision we made because we wanted to ensure that UAT is always up to date with Prod. Still, it's a pain.
There are quite a few configuration files floating around. I've attempted to have it all in Jenkins, but a few supporting batch files are needed as part of the process. (These are also checked in.)
DB upgrade and downgrade scripts are part of the app and run at app startup. It works (mostly), but it's a pain.
I'd love to hear any other possible improvements!
Back in 2009, where this answer hails from, we used CruiseControl.NET for our Continuous Integration builds, which also output the release media.
From there we used Smart Sync software to compare against a production server that was out of the load balanced pool, and moved the changes up.
Finally, after validating the release, we ran a DOS script that primarily used RoboCopy to sync the code over to the live servers, stopping/starting IIS as it went.
At the last company I worked for, we used to deploy using an rsync batch file that uploaded only the changes since the last upload. The beauty of rsync is that you can add exclude lists to exclude specific files or filename patterns, so excluding all of our .cs files, solution files and project files, for instance, is really easy.
We were using TortoiseSVN for version control, and so it was nice to be able to write in several SVN commands to accomplish the following:
First off, check the user has the latest revision. If not, either prompt them to update or run the update right there and then.
Download a text file from the server called "synclog.txt" that details who the SVN user is, what revision number they are uploading and the date and time of the upload. Append a new line for the current upload and then send it back to the server along with the changed files. This makes it extremely easy to find out what version of the site to roll back to on the off chance that an upload causes problems.
In addition to this there is a second batch file that just checks for file differences on the live server. This can highlight the common problem where someone would upload but not commit their changes to SVN. Combined with the sync log mentioned above we could find out who the likely culprit was and ask them to commit their work.
And lastly, rsync allows you to take a backup of the files that were replaced during the upload. We had it move them into a backup folder, so if you suddenly realised that some of the files should not have been overwritten, you could find the last backed-up version of every file in that folder.
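Put together, the rsync call in such a batch file might look something like this (paths, host and exclude patterns are placeholders, not the original script):

```powershell
# -r recurses, -t preserves timestamps, -v is verbose, -z compresses over the wire.
# --exclude keeps source files and project/solution files off the live server.
# --backup/--backup-dir move any file about to be overwritten into a dated backup folder first.
rsync -rtvz `
    --exclude='*.cs' --exclude='*.csproj' --exclude='*.sln' --exclude='.svn/' `
    --backup --backup-dir="/backups/site-$(Get-Date -Format yyyyMMdd)" `
    ./publish/ deployuser@liveserver:/sites/mysite/
```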
While the solution felt a little clunky at the time I have since come to appreciate it a whole lot more when working in environments where the upload method is a lot less elegant or easy (remote desktop, copy and paste the entire site, for instance).
I'd recommend NOT just overwriting existing application files, but instead creating a directory per version and repointing the IIS application to the new path.
This has several benefits:
Quick to revert if needed
No need to stop IIS or the app pool to avoid locking issues
No risk of old files causing problems
More or less zero downtime (usually just a pause as the new appdomain initialises)
The only issue we've had is resources being cached if you don't restart the app pool and rely on the automatic appdomain switch.
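A minimal sketch of the repointing step (the site name, share and paths are assumptions), again using the WebAdministration module:

```powershell
Import-Module WebAdministration

$version = '1.4.2'                          # hypothetical version number
$newPath = "D:\apps\MySite\$version"

# Deploy the new build side by side with the previous versions,
# then flip the site's physical path; reverting is just pointing it back at the old folder.
robocopy "\\buildserver\drops\MySite\$version" $newPath /MIR
Set-ItemProperty 'IIS:\Sites\MySite' -Name physicalPath -Value $newPath
```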

Resources