I'm trying to use 51Degrees in a .NET project that I deploy to Azure. In August 2011, they released v1.2.1.3, marked as "Azure Compatible":
Foundation can now be deployed on to the Windows Azure Cloud service. See the release note for full details on requirements and how to set up. Azure-related changes include:
- Instead of a log file, log entries are written to a log table.
- Instead of a devices file, previous device requests are written to a device table.
- A new conditional compilation symbol - 'AZURE'. AZURE-enabled builds will not work in traditional ASP.NET.
Since then there have been a dozen releases, and they are up to v2.1.4.9. However, their documentation is very light on how to use it with Azure. In fact, there was a bug originally, because v1.2.1.3 stated:
To make use of the changes you must create a storage account called
‘fiftyonedegrees’. The foundation will then create two tables, one for
previous devices, and another for logs.
This isn't possible because Azure storage account names must be globally unique, so everyone can't create one named 'fiftyonedegrees'.
Their response was:
After rereading the blog it seems I've made an oversight in this
regard, and will update shortly.
The storage account that the Foundation looks for can be changed in
the Foundation source code. Go to Foundation/Properties/Constants.cs
and change the string 'AZURE_STORAGE_NAME' to the name of your storage
account.
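If I understand that correctly, the change would look something like this (the surrounding class structure is my guess; only the constant name comes from their reply, and the account name is a placeholder):

    // Foundation/Properties/Constants.cs (structure is illustrative)
    public static class Constants
    {
        // Replace this value with the name of your own storage account.
        public const string AZURE_STORAGE_NAME = "yourstorageaccount";
    }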
However, I'm still at a loss as to how to use it within my project. Here are my issues:
1. I'm not clear whether v1.2.1.3 is the only Azure-compatible release, or whether every release since is also Azure compatible. Their documentation doesn't say.
2. When I install 51Degrees via NuGet, my project doesn't get an App_Data folder created, which contradicts their documentation. The web.config file even has entries that reference the App_Data folder, such as <log logFile="~/App_Data/Log.txt" logLevel="Info"/>.
3. Based on the response to the Azure storage account bug I quoted earlier, they are saying I need to edit the file Foundation/Properties/Constants.cs. However, since I'm installing via NuGet and only get a DLL, is NuGet the wrong approach here? Do I need to download the source, compile it myself, and wire it up to my project manually?
I'm generally new to .NET, NuGet, VS, etc., so I appreciate the help.
All versions are Azure compatible from 1.2.1.3 onwards. I'm assuming this is the blog post you were talking about. After you've created your Azure storage account, you'll have to edit the Constants.cs file in the source code and add in your account name. It's my understanding that this means you'll have to get access to the source code and edit it directly. Once you have done this, you'll need to recompile for the software to work correctly. I'm not sure if there is a way to perform the same task using NuGet, but I'll look into it. Hope this helps.
Related
I'm relatively new to Firebase and want to work on one Firebase project from multiple machines. When setting up a new project locally via the Firebase CLI and attaching it to an existing project in the cloud, a full project folder is created in my local directory.
Is there any way of sort of "downloading"/updating an existing project onto a second machine?
The workaround I'd have chosen would be to manually copy the whole directory to the new environment and then log in to Firebase there.
But given the lack of source control, this would risk overwriting changes made on machine 1 yesterday when I run firebase deploy from machine 2 today, wouldn't it?
Sorry if I'm not expressing myself in a decent IT-guy way; I'm far from being a full-blooded programmer.
Thanks!
You have to manage your source code yourself, typically using a source control mechanism such as git or svn. Firebase does not provide a source control system for the code and configuration that you deploy to Cloud Functions.
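A minimal sketch of that workflow, assuming a Git remote and the standard Firebase CLI (the remote URL and project ID are placeholders):

    # machine 1: put the project folder under version control and push it
    git init
    git add .
    git commit -m "Firebase project"
    git remote add origin <your-remote-url>
    git push -u origin master

    # machine 2: clone instead of copying the folder by hand
    git clone <your-remote-url>
    cd <project-folder>
    firebase login
    firebase use <your-project-id>

With both machines pushing and pulling through the remote, Git will flag conflicting changes instead of silently overwriting yesterday's work.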
I am working on an ASP.NET MVC4 project where the same project needs to be deployed to many clients on a daily basis. Each client has its own domain/subdomain, a separate app pool, and a separate database (MSSQL).
Doing each deployment manually could take at least 1-2 hours if everything goes well. Is there any way I can automate this?
Moreover, we also need to update all of the apps when a new version is released, maybe one by one or all of them at the same time. However, doing this manually could take weeks, and once we have more clients it will not be possible to do these updates by hand.
The update involves suspending the app for some time, taking a full backup of the files and database, updating the application code/files in the app folder, upgrading the database with a script, starting the app again, and then running a diagnostic script to check whether the update was successful. If it wasn't, we need to check what went wrong.
How can we automate these updates? Any ideas on how to approach this issue would be great.
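For concreteness, one round of the manual process described above boils down to something like this sketch (all paths, the database name, and the health-check URL are invented for illustration):

    // Rough sketch of one client update as a console tool.
    using System;
    using System.Diagnostics;
    using System.IO;
    using System.IO.Compression; // .NET 4.5, reference System.IO.Compression.FileSystem
    using System.Net;

    class ClientUpdater
    {
        static void Main()
        {
            string appPath = @"C:\Sites\client1";
            string releasePath = @"D:\Releases\latest";
            string backupZip = @"D:\Backups\client1-" + DateTime.Now.ToString("yyyyMMddHHmmss") + ".zip";

            // 1. Suspend: app_offline.htm makes ASP.NET take the site offline.
            string offline = Path.Combine(appPath, "app_offline.htm");
            File.WriteAllText(offline, "<html><body>Updating...</body></html>");

            // 2. Full file backup (a BACKUP DATABASE step would go here as well).
            ZipFile.CreateFromDirectory(appPath, backupZip);

            // 3. Copy the new build over the old files.
            foreach (string file in Directory.GetFiles(releasePath, "*", SearchOption.AllDirectories))
            {
                string dest = Path.Combine(appPath, file.Substring(releasePath.Length + 1));
                Directory.CreateDirectory(Path.GetDirectoryName(dest));
                File.Copy(file, dest, true);
            }

            // 4. Upgrade the database with the release's script.
            Process.Start("sqlcmd", @"-S . -d Client1Db -i D:\Releases\latest\upgrade.sql").WaitForExit();

            // 5. Restart: removing app_offline.htm brings the site back online.
            File.Delete(offline);

            // 6. Diagnosis: any exception here means the update needs attention.
            new WebClient().DownloadString("http://client1.example.com/");
        }
    }

Run per client, driven by a list of client paths and databases, the 1-2 hours becomes minutes.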
As a developer for BuildMaster, I can say that this scenario, known as the "Core Version" pattern, is a common one. If you're OK with a paid solution, you can set up deployment plans within the tool that do exactly what you described.
As a more concrete example, we experience this exact situation in a slightly different way. BuildMaster has a set of 60+ extensions that rely on a specific SDK version. In our recent 4.0 release, we had to re-deploy every extension because of breaking API changes within the SDK. This is essentially equivalent to having a bunch of customers and deploying to them all at once. We have set up our deployment plans such that any time we create a new release of the SDK application, we have the option to set a variable that says to build every extension that relies on the SDK.
In BuildMaster, the idea is to promote a build (i.e. an immutable object that travels through various environments like Dev, Test, Staging, Prod) to its final environment, where it becomes the deployed build for the release. In your case, this would be pushing your MVC application to its final environment, and that would then trigger the deployments of all dependent applications (i.e. your customers' instances of your application).
For your scenario, you would only need the single action, "Promote Build". As I mentioned before, any dependents would then be promoted to their final environments, so all your customer deployments would kick off once that action is run during deployment.
Our extensions' deployment plans are marked "Shared", which means every extension we have uses the exact same deployment plan but utilizes different variables to handle the minor differences like names, paths, etc.
Since this is such an enormous topic I could go on for ages, but I think that should be sufficient for your use-case if you wanted to try it out.
There are other options, but you could set up Team Foundation Server to deploy automated builds.
http://msdn.microsoft.com/en-us/library/ff650529.aspx
I find the easiest way to do this from an MVC project is to create a publish profile.
This is done by right-clicking your project, selecting Publish, and then configuring it to your needs.
Then from TFS you create a new build definition; this kicks off a wizard which takes you through it.
There are quite a few options which would be too long to go into for every scenario.
The main change I usually find the most important is to set an MSBuild Argument to deploy with the publish profile.
This can be found at Process > Advanced > MSBuild Arguments.
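For reference, the arguments usually look something like this (the profile name is whatever you called your publish profile):

    /p:DeployOnBuild=true /p:PublishProfile=MyProfile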
Once this is configured correctly, it's a simple case of right-clicking and queuing a new build to build and deploy.
You will need a different publish profile/build configuration per deployment environment.
For backups I use a PowerShell script which can be called manually or from TFS.
You also have a drop folder in TFS which keeps a backup of the last x releases.
The databases are automatically configured via SQL Server to back up; TBH I didn't set that up, it was a DB admin guy who is also involved with releases.
From a dev testing side I use jMeter (http://jmeter.apache.org/) to run some automated scripts that check that users can log in and view certain screens, just to confirm nothing major has gone wrong. However, there is usually a testing team to run more detailed tests; again, not set up by me.
All of the above will probably take you some time to set up, but in the long run it will literally save you weeks of time over a year.
A free alternative to TFS is http://www.cruisecontrolnet.org/; I have used this in the past too, and it is pretty good.
You can automate your .NET deployments with Beanstalk, which will give you a way to trigger deployments with a single click, watch progress, manage permissions, and see the history of deployments. Check out this guide on the topic:
http://guides.beanstalkapp.com/deployments/deploy-dotnet.html
I hope you will find it useful.
P.S. - I work at Beanstalk.
I have recently deployed my web role to Windows Azure. In the properties of my WebRole I have set Enable Diagnostics.
I can also see that it correctly maps to a storage account once deployed by viewing the configuration file of the hosted service.
I have not set up anything else for diagnostics; I am unaware that I need to do anything else.
I am now setting up AzureWatch (by paraleap) to monitor my instances however it reports that WADPerformanceCountersTable does not exist.
I am very new to Azure, don't have a clue how the diagnostics work, and can't find anything on Google that shows me how. Could someone please show me the way?
OK, I figured it out and will leave this here for others to follow.
Step 1
If you follow http://dunnry.com/blog/2012/02/27/SettingUpDiagnosticsMonitoringInWindowsAzure.aspx, Windows Azure Diagnostics will start saving data, full of diagnostic information, into your attached blob storage.
Special Note: these writes count towards your storage transactions, which is why you will see that number go up.
Step 2
However, I needed the WADPerformanceCountersTable, which should have been located in the tables section of the storage account, but it was never created. I needed this to use services like AzureWatch to monitor instances and spin them up or down.
Special Note: these are performance counters, a specific subset of diagnostic information, and they aren't stored in the blob section by default.
Step 3
In your project you need to specify which performance counters to monitor in WebRole.cs.
Special Note: You won't have this file if you just added an existing project to an Azure deployment project. Unless you specifically started the project from scratch and chose the Azure templates, you will need to create it manually. You will also need to add Microsoft.WindowsAzure.Diagnostics, Microsoft.WindowsAzure.ServiceRuntime and Microsoft.WindowsAzure.StorageClient as references. The best way to see how it all works is to create a blank project from an Azure template and copy over the necessary items.
Step 4
Next, you need to define which performance counters to monitor. Here is a great sample: http://code.msdn.microsoft.com/windowsazure/Windows-Azure-PerformanceCo-7d80ebf9
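As a minimal sketch of the idea (the counter specifier and intervals below are just common examples; the sample above covers much more):

    // WebRole.cs - requires references to Microsoft.WindowsAzure.Diagnostics
    // and Microsoft.WindowsAzure.ServiceRuntime.
    using System;
    using Microsoft.WindowsAzure.Diagnostics;
    using Microsoft.WindowsAzure.ServiceRuntime;

    public class WebRole : RoleEntryPoint
    {
        public override bool OnStart()
        {
            DiagnosticMonitorConfiguration config =
                DiagnosticMonitor.GetDefaultInitialConfiguration();

            // Sample CPU usage every 30 seconds (an example counter).
            config.PerformanceCounters.DataSources.Add(new PerformanceCounterConfiguration
            {
                CounterSpecifier = @"\Processor(_Total)\% Processor Time",
                SampleRate = TimeSpan.FromSeconds(30)
            });

            // Transfer the collected samples to the WADPerformanceCountersTable
            // in the configured storage account once a minute.
            config.PerformanceCounters.ScheduledTransferPeriod = TimeSpan.FromMinutes(1);

            DiagnosticMonitor.Start(
                "Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString", config);

            return base.OnStart();
        }
    }

Once a role instance starts with this in place, the table should appear in your storage account after the first transfer period.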
Extra Reference
Microsoft also has a few steps you can follow here that might help out if things still aren't working: http://msdn.microsoft.com/en-us/library/windowsazure/hh411521.aspx
Take a look at:
http://dunnry.com/blog/2012/02/27/SettingUpDiagnosticsMonitoringInWindowsAzure.aspx
There is also a lot of information on:
http://msdn.microsoft.com/en-us/library/windowsazure/gg433048.aspx
I can't think of a better way to word the question than the subject line, but I'll try my best.
We're comfortable copying and handling the AX instances themselves, but how do I create a duplicate of an existing Enterprise Portal? For example, we have one that is working well, and we'd like to spin off a copy to point at a different copy of the AOS (for development purposes).
Is there a simple process? Is it as easy as copying the inetpub folder from IIS to another site instance or server? Or can you re-run the Enterprise Portal setup on another server and copy over the files after the fact?
Links to any documentation on the process would be appreciated, although my Google searches yielded very few results.
Given that your site only contains Enterprise Portal content (pages and web parts), you should make sure that the latest copy/version of everything is persisted in the AOT.
Then it is a matter of making a project with everything in it, exporting it, and importing it. You can use the AxUpdatePortal utility to deploy it on the second environment.
This procedure implies installing EP from scratch on the new server.
The trick with EP development is to make sure you have a solid project within AX that holds the most recent version of your custom EP applications.
It seems like there are so many different ways of automating one's build/deployment that it becomes difficult to parse through all the different scenarios that people support in tutorials on the web. So I wanted to present the question to the Stack Overflow crowd ... what would be the best way to set up an automated build and deployment system using the following configuration:
Visual Studio 2008
Web Application Project
CruiseControl.NET
One of the first things I tried was to have CCNet automatically zip the output and copy it to the server, but that requires manual work to unzip at the destination. However, if we try to copy all the files individually, it could potentially take a long time for a large application (the build server lives outside of the datacenter, in our office ... I know).
Also of particular interest is how we would support multiple environments as we have dev, qa, uat, and then of course prod.
MSDeploy seems really interesting, but unless I'm interpreting the literature incorrectly, it doesn't help in the scenario of deploying from the output of a build server. If anything, it seems like it'll be useful for deploying one build across a build farm ... but even for deploying from one environment to another, one would have to manually change config settings, web service URLs, etc.
I recently spent a few days working on automating deployments at my company.
We use a combination of CruiseControl, NAnt, and MSBuild to generate a release version of the app. Then a separate script uses MSDeploy and XCopy to back up the live site and transfer the new files over.
Our solution is briefly described in an answer to this question Automate Deployment for Web Applications?
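The msdeploy side of it is along these lines (paths are examples; check msdeploy -? for the switches your version supports):

    rem back up the live site into a package, then sync the new build over it
    msdeploy -verb:sync -source:contentPath="C:\inetpub\wwwroot\MyApp" -dest:package="D:\Backups\MyApp.zip"
    msdeploy -verb:sync -source:contentPath="D:\Builds\MyApp\Release" -dest:contentPath="C:\inetpub\wwwroot\MyApp"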
You might be interested in MSDeploy. Here's a Scott Hanselman post on this. It's only available as a technical preview at the moment (September 2008) but is worth evaluating against your requirements.
There is another new build tool (a very intelligent wrapper) called NUBuild. It's lightweight, open source, and extremely easy to set up, and it provides almost no-touch maintenance. I really like this new tool, and we have made it the standard tool for our continuous build and integration process (we have about 400 projects across 75 developers). Try it out.
http://nubuild.codeplex.com/
- Easy-to-use command line interface
- Ability to target all .NET Framework versions, i.e. 1.1, 2.0, 3.0 and 3.5
- Supports XML-based configuration
- Supports both project and file references
- Automatically generates the "complete ordered build list" for a given project - no-touch maintenance
- Ability to detect and display circular dependencies
- Performs parallel builds - automatically decides which of the projects in the generated build list can be built independently
- Ability to handle proxy assemblies
- Provides visual clues to the build process, e.g. showing "% completed", "current status", etc.
- Generates a detailed execution log in both XML and text format
- Easily integrated with the CruiseControl.NET continuous integration system
- Can use a custom logger like XMLLogger when targeting .NET 2.0+
- Ability to parse error logs
- Ability to deploy built assemblies to a user-specified location
- Ability to synchronize source code with the source control system
- Version management capability
Do you have the ability to run commands remotely? The PsExec utility from Sysinternals would let you run a command-line unzip program on the remote machine. If you have a script that copies the build as a .zip file to the remote site, you would just need one more line for the PsExec call to unzip the files.
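For example, something like this (the server name, credentials, and paths are made up, and it assumes an unzip tool such as 7-Zip is installed on the remote machine):

    psexec \\webserver01 -u DOMAIN\deploy -p secret "C:\Program Files\7-Zip\7z.exe" x C:\Drops\build.zip -oC:\inetpub\wwwroot\MyApp -y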
I had a related question about getting a deployable set of files from an automated build. I found Web Deployment Projects (links and all in the old question) did what I needed - they're a VS and MSBuild add-on.
This is a common problem (and I wish I had read it sooner) for all development, not just ASP.NET. As one of its developers, I can say my team naturally uses BuildMaster internally for the entire release process, and for most scenarios it's free. Within the tool, we are able to perform all the standard CI builds to create artifacts and then set up an automation process to deploy these artifacts to any one of the 40+ servers we have internally or externally hosted, depending on the specific application or environment.
Since you specifically mentioned deployment to different testing environments, this is a fundamental aspect of the tool. The idea is to model the environment workflow (e.g. Integration -> QA -> Production) you already have in place and essentially promote a build all the way from source control to production. Most times it's as simple as adding a deployment action that deploys an artifact to the environment; other times it can be much more complex.
You also casually mentioned that configuration file changes are part of deployment, which is another built-in component of BuildMaster. The idea we had was to use the tool itself as the central hub for all configuration files and deployments, thus ensuring the latest changes are applied automatically with a simple "deploy configuration files" action in your deployment plan.
One thing you didn't mention with regard to this process is the database deployment aspect. Most ASP.NET applications require an associated database, otherwise they could just be static HTML files. It is crucial that the database schema gets updated to the appropriate database version with every deployment. There is, not surprisingly, a module within BuildMaster that handles this for you as well. The idea is to store DDL-DML scripts within the tool itself, and by executing scripts only once per environment, it ensures that all of your databases across each environment are up-to-date as your builds are deployed through them. Other scripts (e.g. stored procedures, views, triggers, etc.) are essentially code files and therefore belong in source control. These DROP-CREATE-CONFIGURE type scripts can be run each and every time in most cases with a simple deployment action.
Another piece of the deployment puzzle that most developers do not think about is process automation. Many developers need to perform sign-offs or fill out change request forms in order to manually perform these processes. Again, this is all available as part of the automated workflow setup within BuildMaster. You can set up blockers that do not allow promotion to, say, the QA environment unless all unit tests have passed, or that block promotion to the Staging environment unless someone from the QA team approves the build and all issues in your issue tracking tool are resolved/closed for that particular release.
While I realize I left CC.NET out of the answer, our applications are all built and deployed through BuildMaster so we no longer need it, though we could just as easily pick up the artifacts from a drop location and deploy them in later environments.
I see that many people use CC.NET for their .NET projects, but why not use Jenkins and SonarQube? They have everything you need. I set all this up in 3 days on a Windows Server 2008 R2 box with MSSQL, Jenkins, VisualSVN and SonarQube.
It all works great, and you get full metrics on your project. SonarQube uses Gallio, Gendarme, FxCop, StyleCop, NDeps and PartCover to get your metrics, and all this is pretty straightforward since SonarQube does it automatically without much configuration.
I posted some pictures so you can get a feeling for it. Here is Jenkins, which builds and gets the Sonar metrics, plus another job for deploying automatically to IIS.
And SonarQube, with all the metrics for my project. This is a simple MVC4 app, but it works great!
If you want more information I can be more specific, but I think you should at least consider Jenkins. If CC.NET suits you better, at least you will have looked at a good alternative before you chose.
This whole setup uses MSBuild to build and deploy the apps.