My team is doing web development (ASP.NET, WCF), and we are at an early stage where everyone needs to make DB changes and use their own sample data.
We use a dedicated DB server, and we want each developer to develop against a separate DB.
What we appear to need is the ability to configure the connection string on a per-developer basis in a source-controlled way. Obviously, we may have other configuration settings that need per-developer values, and finally, we'll need to maintain a set of configuration settings that are common to all developers.
Can anyone suggest a best practice here?
PS: A similar issue appears when we want to deploy a built application to different environments (test, stage, production) without having to manually tweak configuration (except perhaps setting the environment name).
You can use config transforms for your deployment to different environments. That's easy enough. Scott Hanselman did a pretty awesome video on it here.
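For instance, a minimal transform might look like the sketch below; the file name Web.Staging.config and the connection string name "MainDb" are placeholder assumptions, not something from the original post.

```xml
<!-- Web.Staging.config: applied over web.config when building/publishing
     with a (hypothetical) Staging build configuration. -->
<configuration xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform">
  <connectionStrings>
    <add name="MainDb"
         connectionString="Data Source=staging-sql;Initial Catalog=AppDb;Integrated Security=True"
         xdt:Transform="SetAttributes" xdt:Locator="Match(name)" />
  </connectionStrings>
</configuration>
```

The transform only rewrites the attributes of the matching `add` element, so the rest of web.config stays identical across environments.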
For your individual developer db problem, there isn't any particularly elegant solution I can think of. Letting each developer have a unique configuration isn't really a "best practice" to begin with. Once everyone starts integrating their code, you could have a very ugly situation on your hands if everyone wrote their code against a unique db and configuration set. It almost guarantees that code won't perform the same way for two developers.
Here is what I would recommend, and have done in the past:
1. Create a basic framework for your database, in one database on your test DB server.
2. Create a Database Project as part of your solution.
3. Use .NET's built-in Schema Compare to write your existing database to the database project.
4. When someone needs to change the database, they should first get latest on the database project, then make their changes, and then repeat step 3 to write their changes back to the project.
Using this method, it is also very easy for developers to deploy a local instance of the database that matches the "main" database, make changes, and write those changes back to the project.
OK, maybe not the most elegant solution, but we've chosen to read the connection string from a different place when the project is built using the Debug configuration.
We are using the registry, and it has to be maintained manually on each developer's machine.
It requires some extra coding, but the code that reads the registry is only compiled in debug builds (#if DEBUG), so there is no performance hit in production.
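A minimal sketch of that approach, assuming a key at HKCU\Software\MyApp and a value named ConnectionString; the actual registry layout and the fallback name "MainDb" are whatever your team agrees on, not anything prescribed here:

```csharp
using System.Configuration;
using Microsoft.Win32;

public static class ConnectionStringProvider
{
    public static string Get()
    {
#if DEBUG
        // Each developer sets this value on their own machine by hand.
        using (RegistryKey key = Registry.CurrentUser.OpenSubKey(@"Software\MyApp"))
        {
            if (key != null)
            {
                string value = key.GetValue("ConnectionString") as string;
                if (value != null)
                    return value;
            }
        }
#endif
        // Release builds (and a missing key) fall back to web.config.
        return ConfigurationManager.ConnectionStrings["MainDb"].ConnectionString;
    }
}
```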
Hope this helps as well.
Related
We have an ASP.NET web application and need to maintain the database creation and initialization script.
Are there any industry best practices for maintaining database creation and initialization scripts? I can think of two main approaches:
1. Maintain a T-SQL creation script directly by hand.
2. Maintain a master database and generate the script from it, which is then checked into SourceSafe.
The script should also be trackable through source control, i.e. the table order should be controllable.
If possible, it should also be able to track initialization data, either in the same or a separate script.
Currently we generate the script from Management Studio, but the order of the tables seems to be random.
And the more automated the solution, the better.
The problem is not maintaining the script, nor maintaining a 'master' copy of the database. The real problem is upgrading existing database(s). You make your modification in the developer environment, it is then propagated to the test environment, and finally pushed into the production environment. While in the developer and test environments it is possible to start from scratch, in production you always have to upgrade the existing deployment.
In my experience the best practice is to use upgrade scripts. This practice is useful even with a single deployed site, but it becomes invaluable with multiple locations that may be at different versions. Even with one single operational site it is still useful to be able to test the upgrade repeatedly (starting from backups of the current version), keep the changes in source control, and have a well-formalized, peer-reviewed change procedure (the upgrade script). And upgrade scripts can be tailored to the specific needs of the operational site, like handling a large table with special care, dealing with encrypted data, or any of the myriad details that diff-based tools neglect or ignore. The main disadvantage is that the scripts have to be written, which requires real T-SQL knowledge (forget all the 'designers' in your favorite management tool).
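To make the mechanics concrete, here is a minimal sketch of an upgrade-script runner; the one-row SchemaVersion table and the numbered file naming (0001_add_orders.sql, ...) are illustrative assumptions, not something the answer above prescribes:

```csharp
using System.Data.SqlClient;
using System.IO;
using System.Linq;

public static class Upgrader
{
    public static void Run(string connectionString, string scriptDir)
    {
        using (SqlConnection conn = new SqlConnection(connectionString))
        {
            conn.Open();
            int current = GetVersion(conn);

            // Apply, in order, every script newer than the deployed version.
            var pending = Directory.GetFiles(scriptDir, "*.sql")
                                   .OrderBy(f => ScriptNumber(f))
                                   .Where(f => ScriptNumber(f) > current);

            foreach (string file in pending)
            {
                using (SqlTransaction tx = conn.BeginTransaction())
                {
                    // Caveat: plain SqlCommand cannot run scripts containing
                    // GO batch separators; split those first or use SMO/sqlcmd.
                    using (SqlCommand cmd = new SqlCommand(File.ReadAllText(file), conn, tx))
                        cmd.ExecuteNonQuery();

                    using (SqlCommand bump = new SqlCommand(
                        "UPDATE SchemaVersion SET Version = @v", conn, tx))
                    {
                        bump.Parameters.AddWithValue("@v", ScriptNumber(file));
                        bump.ExecuteNonQuery();
                    }
                    tx.Commit();
                }
            }
        }
    }

    private static int GetVersion(SqlConnection conn)
    {
        using (SqlCommand cmd = new SqlCommand("SELECT Version FROM SchemaVersion", conn))
            return (int)cmd.ExecuteScalar();
    }

    private static int ScriptNumber(string path)
    {
        // "0001_add_orders.sql" -> 1
        return int.Parse(Path.GetFileName(path).Split('_')[0]);
    }
}
```

Keeping each script and its version bump in one transaction means a failed upgrade leaves the database at a known version, which is what makes the repeated test-from-backup cycle practical.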
You might want to check out RedGate SQL Source Control.
Are you looking for Visual Studio Database Projects?
I use database projects to store all database objects (tables, views, functions, keys, triggers, and indexes across schemas) and keep them versioned in TFS. You can build the database project to ensure that everything is valid, and you can deploy to a fresh database or do a schema comparison with an existing database.
I also keep all reference and setup data in post-deployment scripts, which are run automatically after deployment.
I'm working on a large Drupal website with two environments, Development and Stage. While I work in Development, my client enters content in Stage.
My work in Development modifies the database. I then need to be really careful when taking my work to Stage; otherwise I could affect my client's content.
This is painful and inefficient. Do you know of other options for this particular scenario? Perhaps a database merging tool? Thank you for your advice.
This is an inherent issue with Drupal: the storing of configuration and content in the same database. There are methods that help mitigate the issue (like the Features module, which helps you compartmentalize configuration changes), but they are very dependent on module support.
On our last site we tried using the Features module and the Deployment module, but so many of the modules we wanted to use didn't have support for Deployment that we ended up not going that route and just duplicated configuration changes by hand.
Depending on what your client is entering, you might be able to solve the issue with some handy MySQL. Can you tell us a bit more about your scenario?
There are two kinds of data, configuration and user content. For user content, set MySQL's auto_increment_increment to 2 and give Development and Stage different auto_increment_offset values, so one site generates even IDs and the other odd ones and the rows can later be merged without key collisions. For configuration, write update hooks. Easy.
Try looking at these two earlier threads on the same issue:
Drupal DATABASE deployment strategies?
How to merge Drupal database changes
You could turn it around and keep copying your staging site's information to new instances of your development site's platform. Miguel Jacq has done a nice write-up on achieving this setup. After testing things through, you can then set up a thoroughly tested production platform at the production address and copy the staging site over.
Miguel's article: http://greenbeedigital.com.au/content/drupal-deployments-workflows-version-control-drushmake-and-aegir
Aegir: http://community.aegirproject.org
@jhuebsch: that sounds like a frustrating experience. Can you add a list of the affected modules? And did you make sure to use UUID & Strongarm?
Ok, so here's the thing.
I'm developing an existing web application (it started out as an ASP Classic app, so you can imagine :P) under ASP.NET 4.0 and SQL Server 2005. We are four developers using local instances of SQL Server 2005 Express, sharing the source code and the Visual Studio database project.
This web app has several "universes" (that's what we call them). Every universe has its own database (currently on the same server), but they all share the same schema (tables, sprocs, etc.) and the same source/site code.
So deploying manually is really annoying, because I have to deploy the source code and then run the SQL scripts by hand against each database. I know that manual deployment can cause problems, so I'm looking for a way to automate it.
We've recently created a Visual Studio Database Project to manage the schema and generate the diff-schema scripts with different targets.
I have no idea how to put the pieces together.
I would like to:
Have a way to make a "sync" deploy to a target server (thankfully I have full RDC access to the servers, so I can install things if required). By "sync" deploy I mean that I don't want to redeploy the whole application, because it has lots of files; I just want to deploy the files that are new or changed.
Generate diff SQL update scripts for every database target and combine them into a single script. For this I'd need a list of the database names somewhere.
Copy the site files and execute the generated SQL script in an easy, automated way.
I've read about MSBuild, MS WebDeploy, NAnt, etc., but I don't really know where to start, and I really want to get rid of this manual deployment.
If there is a better and easier way of doing it than what I enumerated, I'll be pleased to read your suggestion.
I know this is not a very specific question, but I've googled a lot about it and I can't seem to figure out how to do this. I've never used any automation tool to deploy.
Any help will be really appreciated.
Have you heard of the term multi-tenancy? It might be worth looking it up to see if it applies to your "multiverse", especially if one universe is never accessed by another...
See:
http://en.wikipedia.org/wiki/Multitenancy
http://msdn.microsoft.com/en-us/library/aa479086.aspx
UPDATE:
If the application and database are the same for each client (or tenant), I believe there are platforms that may help you provide the same code/DB as a SaaS application, i.e. another application/configuration layer on top that can handle the deployments etc.
I think these are called Platform as a Service (PaaS) offerings:
see: http://en.wikipedia.org/wiki/Platform_as_a_service
Multi-tenancy may be possible in your case, depending on client security requirements, with a bit of work (or a lot of work):
Option 1:
You could use one instance of the application, i.e. deploy the site once and connect to a different database for each client. You would need to differentiate each client by URL and isolate content/data by setting a connection string for each, etc. (This would reduce your site deployments to one deployment.)
Option 2:
You could use a single instance of the application and a single database. You would need to add a "TenantID" column to each table and adjust all your code to accept a TenantID, to ensure data security/isolation. Again, you would need to detect/differentiate the tenant based on the URL and set the TenantID in the session used for every database call. (This would reduce your site and database deployments to one of each.)
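Either way, both options hinge on resolving the tenant from the URL. A hypothetical sketch, assuming a subdomain-per-client convention and per-tenant connection strings in web.config (none of these names come from the question):

```csharp
using System;
using System.Configuration;
using System.Web;

public static class TenantResolver
{
    // "client1.example.com" -> connection string named "client1" in web.config.
    // Called as: TenantResolver.CurrentConnectionString(HttpContext.Current.Request)
    public static string CurrentConnectionString(HttpRequest request)
    {
        string tenant = request.Url.Host.Split('.')[0];
        ConnectionStringSettings settings =
            ConfigurationManager.ConnectionStrings[tenant];
        if (settings == null)
            throw new InvalidOperationException("Unknown tenant: " + tenant);
        return settings.ConnectionString;
    }
}
```

Failing loudly on an unknown host is deliberate: you never want an unrecognized URL silently landing in another client's database.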
I have been using two third-party components for PDF document generation (in .NET, but I think this is a platform-independent topic). I will leave the companies' names out of it for now, but I will say they are not extremely well-known vendors.
I have found that both products make undocumented use of the filesystem (i.e. putting temp files on disk). This has created a problem for me in my ASP.NET web application, as I now have to identify the file locations and set permissions on them as appropriate. Since my web application is set up for impersonation using Windows authentication, this essentially means I have to assign write permissions to a few file locations on my web server.
Not that big a deal once I figured out why the components were failing, but I see this as a maintenance issue. What happens when we upgrade our servers to an OS that changes one of the temporary file locations? What happens if the vendor decides to change the temporary file location? Our application will "break" without a single line of our code changing. Relatedly, if we have to stand this application up on a fresh machine (regardless of environment), we have to know about this issue and set permissions appropriately.
Unfortunately, the components do not provide a way to make this temporary file path configurable, which would at least make what is going on under the covers more explicit.
This isn't really a question that I need answered, but more of a kick-off for a conversation about whether what these component vendors are doing is appropriate, how it should be documented/communicated to users, etc.
Thoughts? Opinions? Comments?
First, I'd ask whether these PDF generation tools are designed to be run within ASP.NET apps. Do they claim to support this scenario? If so, they should provide documentation on how they use the file system and what permissions they need.
If not, you're probably using an inappropriate tool set. I've been there and done that: I worked on a project where a "well known address lookup tool" was used, but the version we used was designed for desktop apps. As such, it wasn't written to cope with hundreds of requests, many of them simultaneous, and it caused all sorts of hard-to-repro errors.
Commonplace? Yes. Appropriate? Usually not.
Temp files are one of the appropriate uses IMHO, as long as the component uses the proper %TEMP% folder or, even better, the built-in Path.GetTempPath/Path.GetTempFileName functions.
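For what it's worth, a short illustration of that pattern (nothing vendor-specific, just the framework calls named above):

```csharp
using System.IO;

public static class ScratchFiles
{
    public static string Create(byte[] content)
    {
        // GetTempFileName creates a zero-byte file under %TEMP% and returns
        // its path, so write permission is only ever needed in one
        // well-known, per-user location.
        string path = Path.GetTempFileName();
        File.WriteAllBytes(path, content);
        return path;
    }
}
```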
In an ideal world, every third-party component would come with a Code Access Security description, listing in detail what is needed (and for what purpose), but CAS is possibly one of the most ignored features of .NET...
Writing temporary files would not be considered outside the normal functioning of any piece of software. Unless it is writing temp files to a really bizarre place, this seems more like something they never thought to document than something they did to cause you trouble. I would simply contact the vendor, explain what you are doing, and ask if they can provide documentation.
Also, Martin makes a good point about whether the component is meant to run under ASP.NET or in a desktop app.
What is the simplest way to distribute an ASP.NET web application? I tried looking at some of the open-source ASP.NET projects out there to see how they distribute their apps and handle updates, and they seem rather complicated to me (not for me to perform, but for non-technical users). A lot of them entail backing up the entire installed project, deleting specific folders, and saving parts of web.config. I am hoping to find a solution that makes the update process in particular as simple as possible.
I am working on a project with a similar requirement right now. We decided to use WiX to create an installer that can be run on the server or machine where the site is installed. WiX is incredibly powerful, but takes a while to get the hang of.
There are plenty of other open-source and paid installer technologies as well. Here is a post with some info on a few.
CommunityServer provides a setup MSI that will create a virtual directory, generate the SQL database, and populate it with default data. Updating for point releases, though, is still a manual process involving an update.sql file and having everyone download and then merge binary and static file changes.
They probably could have created an update MSI too, but because so many people customize CommunityServer, it is probably better to let people merge changes themselves.
Do you mean in terms of breaking the functionality up into tiers that could be handled on separate machines, e.g. having three servers for a three-tier architecture where one is the DB server, one handles the middleware, and the other handles the requests in ASP.NET? Another angle here would be going from one web server to multiple web servers in terms of scaling up.
Or are you referring to deployment?
It's a web application, man. Serve it publicly, require registration, and move on. Isn't that the point of a web application?