Publishing Targets Tridion 2011

We are using Tridion 2011 SP1 and DD4T framework.
We have websites on both Staging and Live servers, and we have published to both from Tridion 2011 SP1 using different targets (Staging and Live). Now I am planning to add the staging servers to the Live target, so that publishing to Live also publishes to Staging. I am not going to use the Staging target after this.
My question is: will this cause any problems or issues? Does it have any disadvantages?
Thanks,
Jagadeesh.

So, all you want to do is to publish to Staging also when you publish to Live?
If this is all you want to do, then the easiest approach is to chain both Publication Targets to the same Target Type. Open the Staging Publication Target and, in the Advanced tab, link it to the Live Target Type (and unlink it from the Staging Target Type).
You should probably also remove the Staging Target Type so as to not confuse your editors.
PS - I am answering here, but your question is not a Stack Overflow/programming question. You should have asked this on Server Fault instead.
2nd part of the question: Will it cause issues?
Think about why you had Staging to begin with, since you're going to lose that now. You are probably removing the ability to implement Experience Manager (ex-SiteEdit), but maybe that's not a requirement. Staging is also typically a smaller environment. If you're doing this because you need more capacity on your Live server, then you should have considered buying a new server instead and linking it to the same database (since you're using DD4T, there's no nastiness related to file system replication or multiple deployers).

Related

Migrating structure and content between instances in Drupal 9

I'm very new to Drupal, so please don't be too mad in case I have any major misunderstandings :) I've tried searching for a similar problem, but I just couldn't find a suitable solution for my case.
We're currently setting up a Drupal 9 project, which will eventually have a shared development environment and a production environment, as well as a local instance to develop on. I'd like to have a way to synchronize those instances so they share the same configuration, content types, and optionally even content.
At the moment, I'm developing a theme locally, which means I have installed a Drupal instance inside a XAMPP server. That theme is versioned by git, so it is migratable to another developer without a problem.
For migrating the structure and content (which is obviously saved in the database), I tried using Backup & Migrate, but I faced two issues: the D9 version is not fully supported yet, so an installation via Composer fails with default security settings, and there seems to be a bug, already reported multiple times, when trying to back up the entire site. You can work around it by backing up the database and the files separately, but this is pretty inconvenient due to other issues (but let's keep it a little short...).
I also tried exporting the whole database, which actually works (after this little fix), but the overhead seems a little high to me, especially when I just want to copy new content types from the dev to the prod environment without users, content, and so on.
So, to finally come to an end: is there any best practice for this case? Or should I even consider going a whole other way?
Thanks in advance!
I definitely wouldn't recommend using Backup & Migrate for this - that's so Drupal 7! Drupal 9 has better tools that are baked into core!
There are many possible ways to import/export Config and Content entities across environments, but I'll share what I believe to be the current best practices.
For Configuration, Drupal 9 has a built-in Configuration Management system that makes it quite easy to migrate Config across environments. It can be used within the Drupal UI, and also integrates with Drush (a command-line tool for Drupal).
Essentially, the Config system exports all Config settings as standardized YAML files, which can easily be included in your Git repository. This will make it incredibly easy to set up another Drupal environment that is identical in its Config settings.
More info on Configuration Management can be found here.
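As a concrete illustration of that workflow, assuming Drush is installed and a config sync directory is configured in `settings.php`, the round trip looks roughly like this (the directory path and commit message are placeholders, not part of any project):

```shell
# On the source environment: export all active configuration as YAML
# into the directory named by $settings['config_sync_directory'].
drush config:export

# Commit the exported YAML so other environments can pull it.
git add config/sync
git commit -m "Export config: new content type"

# On the target environment, after pulling the repository:
drush config:import
```

Because the YAML files are plain text under version control, a `git diff` before importing shows exactly which settings will change on the target.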
For Content, Drupal 9 has a built-in Migrate API, which facilitates migrations from any data source into a Drupal 9 environment. That means you could set up a migration that would allow you to migrate your Content entities across environments.
I've only ever used this for migrating Content (migrated from Drupal 7), but I believe it's also possible to use it to migrate Config as well.
If you decide to use the Migrate API, you may (depending on the setup of your content) need to install Migrate Tools and Migrate Plus.
More info on the Migrate API can be found here.
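To give a sense of what a migration definition looks like, here is a minimal, entirely hypothetical Migrate Plus configuration entity that pulls article nodes from a JSON export on another environment; the ID, URL, and field names are all invented for illustration:

```yaml
# migrate_plus.migration.article_import.yml (hypothetical example)
id: article_import
label: Import articles from the dev environment
source:
  plugin: url
  data_fetcher_plugin: http
  data_parser_plugin: json
  urls:
    - https://dev.example.com/export/articles.json
  item_selector: /articles
  fields:
    - name: title
      selector: title
    - name: body
      selector: body
  ids:
    title:
      type: string
process:
  title: title
  body/value: body
  type:
    plugin: default_value
    default_value: article
destination:
  plugin: 'entity:node'
```

With Migrate Tools installed, such a migration would be run with `drush migrate:import article_import` and rolled back with `drush migrate:rollback article_import`.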

Mimic Azure Staging on Custom Server

In Azure Websites we have a Staging feature, so we can deploy to a staging site, test it, fill all caches, and then swap production with staging.
Now how could I do this on a normal Windows server with IIS?
Possible Solution
One strategy I was thinking about is having a script which copies content from one folder to another. But there can be file locks, and since this is not transactional, the websites are at times in a kind of invalid state.
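One way around both the lock problem and the invalid-state problem is to never copy into the live folder at all: deploy each build into its own versioned folder and atomically repoint a link at it (on Windows the rough equivalents are `mklink /D` or repointing the IIS site's physical path). A minimal POSIX sketch of the idea, with made-up folder names:

```shell
# Deploy each build into its own versioned folder,
# then flip a symlink so "current" changes in one atomic step.
mkdir -p releases/v1 releases/v2
echo "old site" > releases/v1/index.html
echo "new site" > releases/v2/index.html

# "current" starts out pointing at v1 (what the web server serves).
ln -s releases/v1 current

# Release: build a new link, then rename it over the old one.
# rename() is atomic, so readers see either v1 or v2, never a half-copy,
# and no file locks on the live content are ever touched.
ln -sfn releases/v2 tmp_link && mv -T tmp_link current

cat current/index.html
```

Rolling back is the same operation with the link pointed at the previous release folder.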
First problem: I have an external load balancer, but it is hosted externally and unfortunately currently not able to handle this scenario.
Second problem: as I want my scripts to always deploy to staging, I want a fixed name in IIS for the staging site that I can use in the build-server scripts. So I would also have to rename the sites.
Third problem: the sites are synced between multiple servers for load balancing. If I rebuilt the bindings on a site (to have a consistent staging server), I could get timing issues because not all servers would be set to the same folder at the same time.
Are there any extensions / best practices on how to do that?
You have multiple servers so you are running a distributed system. It is impossible by principle to have an atomic release of the latest code version. Even if you made the load-balancer atomically direct traffic to new sites some old requests are still in flight. You need to be able to run both code versions at the same time for a small amount of time. This capability is a requirement for your application. It is also handy to be able to roll back bad versions.
Given that requirement you can implement it like this:
Create a staging site in IIS.
Warm it up.
Swap bindings and site names on all servers. This does not need to be atomic because as I explained it is impossible to have this be atomic.
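As a rough sketch of that swap step, assuming two IIS sites named `MySite-Blue` (currently live) and `MySite-Green` (the warmed-up staging copy), the binding exchange could be scripted per server with `appcmd`; the site names, ports, and host names here are all invented:

```shell
REM Point the production binding at the freshly warmed-up site,
REM and move the staging binding to the old one. Run on every server;
REM as noted above, this cannot be atomic across the farm.
%windir%\system32\inetsrv\appcmd set site "MySite-Green" /bindings:http/*:80:www.example.com
%windir%\system32\inetsrv\appcmd set site "MySite-Blue" /bindings:http/*:8080:staging.example.com
```

After the swap, the build server keeps deploying to whichever site carries the staging binding, so no site renaming is needed.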
As explained via Skype, you might like to have a look at "reverse proxy IIS". The following article looks very promising:
http://weblogs.asp.net/owscott/creating-a-reverse-proxy-with-url-rewrite-for-iis
This way you could set up a public-facing "frontend" website which can easily be switched between two (or more) private/protected sites, even if they reside on the same machine. Furthermore, this would also allow you to have two public-facing URLs that are simply swapped depending on your requirements and deployment.
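Assuming the Application Request Routing (ARR) module is installed and proxying is enabled, the front-end site's rewrite rule might look like the fragment below (the internal host name and port are placeholders); switching environments is then a one-attribute change:

```xml
<!-- web.config of the public-facing "frontend" site -->
<system.webServer>
  <rewrite>
    <rules>
      <rule name="ReverseProxyToStaging" stopProcessing="true">
        <match url="(.*)" />
        <!-- Swap this host to point the public URL at another private site -->
        <action type="Rewrite" url="http://staging.internal:8080/{R:1}" />
      </rule>
    </rules>
  </rewrite>
</system.webServer>
```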
Just an idea... I haven't tested it in this scenario, but I'm running a reverse proxy on Apache publicly and serving a private IIS website through VPN as content.

Team Foundation Server Environments

We are a small team of 3 developers. We have a mix of classic ASP code and ASPX pages. All the code is contained in one solution with multiple projects. We are currently not using any version control software; we have just installed TFS 2013 and want to move to using its version control. Our current environment is set up as follows.
Development environment - new code or changes to existing code.
Test Environment - once the code from development passes unit testing, it is moved here to allow users to test changes.
Staging Environment - this is a mirror of production. Once the users have accepted the changes in test, we migrate the code here to test and make sure it works against the mirror copy of the database (SQL).
Production Environment - code is not modified in this environment.
All of this is done manually, and now that our staff has grown from 1 to 2 to 3 developers over the last 6 months, we need to make use of version control. What we are not sure of is how to implement this same environment using TFVC. Do we need to install TFS in each environment and keep 4 separate copies of the code? And how do we migrate the code between each environment using TFS? We need help and suggestions on how to set this up. We want to keep it simple since there are only 3 of us.
Normally you would have one TFS server that holds the sources for all of your environments. Many people implement a branching strategy to support different versions of source code deployed as part of different releases or in different staging environments.
Many people treat TFS as a development tool, and as such it ends up in the development "network". We recommend treating TFS as a production server though: it contains your source code (intellectual property and a large investment in knowledge and time), and you might also use it to hold your Product Backlog (which could contain sensitive information on where your company wants to move in the future). If you were to lose either of them it would be a great loss. So make sure you treat the TFS server as something holding value, and implement proper backup & restore and disaster recovery procedures.
Helpful links:
ALM Rangers Planning Guide
ALM Rangers Version Control Guide (aka Branching Guide)

Considering WebDeploy for internal cluster sites. Experiences?

We have recently started to use cluster servers in our company. I have done some reading on MS WebDeploy and the technology looks promising. Our requirements:
Create backups before deployment
Deploy to different servers
Test server
Two live clusters
Ability to stop application pools for specific web applications before publish and start them again afterward
Allowing limited access: In other words a developer may only publish to sites that they are responsible for
Possible customisation: we would like to disallow publishes if related bugs have not been solved in our bug tracker, and possibly more, like approvals from management. Can external customisations be done without losing VS integration?
Visual Studio integration and the use of Web.config transforms
SQL Schema changes and especially stored procedures without affecting data
Our environment
IIS 7
Windows Server 2008
SQL Server 2005 (Planned move to 2008)
Visual Studio 2010
Based on my research it does seem that many of the above requirements are met. What I would like to know is how reliable the solution is in practice. More importantly, I would like to hear about your personal experiences with WebDeploy, and whether you would recommend it or whether there are better alternatives.
At the moment we are using file copying which proves to be unreliable (due to human error) and tedious.
We do about 80% of what you're asking for using WebDeploy packaging and ThoughtWorks Go for orchestration of our release pipeline. It works really well. We have over 100 websites/services and deploy something to production every four hours. The following post describes how we perform the deployment and links to related information:
http://www.dotnetcatch.com/2016/12/28/zero-downtime-clustered-deployment-of-webdeploy-packages-via-powershell/
One note: config transforms happen at build time, which is problematic when you want to deploy one package to multiple environments. WebDeploy parameterization accomplishes the same result but is applied at deploy time. Check it out:
http://www.dotnetcatch.com/2014/09/08/parameterizationpreview-visual-studio-extension/
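For example, a hypothetical `Parameters.xml` in the project root tells WebDeploy to replace a connection string at deploy time instead of baking it in at build time; the parameter name, default value, and match expression below are invented for illustration:

```xml
<!-- Parameters.xml (hypothetical) declares deploy-time replacements -->
<parameters>
  <parameter name="DefaultConnection"
             description="Connection string for the target environment"
             defaultValue="Server=.;Database=App;Integrated Security=true">
    <parameterEntry kind="XmlFile"
                    scope="\\web.config$"
                    match="/configuration/connectionStrings/add[@name='DefaultConnection']/@connectionString" />
  </parameter>
</parameters>
```

Each environment then gets its own `SetParameters.xml` with the real value, applied by `msdeploy` (or the generated `.deploy.cmd`) when the package is deployed, so one build serves every environment.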

ASP.NET integration Environment

All,
My dev team and I would like to set up a development environment for our ASP.NET projects. By development environment I do not mean Visual Studio. I mean that we have a database server, an application server, and a web server in a 'Development Environment'.
We want to use this as our integration environment, where the developers all work on their parts of the ASP.NET applications and we can then push our new changes up to test them as a whole.
My question is: what is the best way to deploy our code together without stepping on each other's toes?
Thanks.
Team Foundation Server is a good candidate for this.
You need a source code control methodology and with it you'll get the benefits you're searching for. SVN and other solutions in this space offer "conflict resolution" to avoid inadvertent overwriting/toe squashing.
Set up a Subversion repository and get all of the developers up to speed on SVN and using it.
Once you have your source under control you can consider setting up a continuous integration server which can build your code and deploy to your target environment in batch. Organizing your project code properly into trunk, tags and branches per solution will make it very easy to control what is deployed or redeployed to your dev environment at any given time.
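Concretely, the conventional layout mentioned above can be created like this (the repository URL and project name are placeholders):

```shell
# Create the conventional trunk/branches/tags layout in one commit.
svn mkdir -m "Project skeleton" \
  http://svn.example.com/repos/myapp/trunk \
  http://svn.example.com/repos/myapp/branches \
  http://svn.example.com/repos/myapp/tags

# Developers work against trunk...
svn checkout http://svn.example.com/repos/myapp/trunk myapp

# ...and a stable state is tagged so the CI server can deploy
# a known revision to the dev environment.
svn copy -m "Tag release 1.0" \
  http://svn.example.com/repos/myapp/trunk \
  http://svn.example.com/repos/myapp/tags/1.0
```

Because tags are cheap copies in SVN, the CI server can redeploy any tagged state to the integration environment at any time.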
There are other options for source control (Git, TFS, and many others), but they all offer close to the same features... SVN is one of the nicer options because it's open source, free, and stable.
Another thing to consider is keeping your database schema changes in sync with your code changes. Consider using migrator.net or similar solution to enable your team to keep everything in sync through revisions, including database state.
