Content Porter package reversing in 2011 SP1 in a production environment - Tridion

Is there a need (is it advisable) to have a Content Porter (CP) server license in the Production environment?
Does reversing a package require a CP Server license?
How much does it cost?

It can be useful if you need to adjust a BluePrint or copy some of your production content to lower environments for testing purposes. It is more common to see the entire CM database backed up and restored to lower environments, but this may not be possible in all cases due to staggered work streams etc.
As for cost - you would need to contact your SDL Tridion sales representative.


What is the correct deployment process for AX 2009

I was wondering what would be the correct deployment workflow for customizations under AX 2009. For AX 2012 I found a nice whitepaper, Deploying Customizations Across Microsoft Dynamics AX 2012 Environments.
But this doesn't help much because with AX 2009 there is no concept of model store deployment (unless I'm mistaken).
Are XPOs the way to go with AX 2009? If someone could point me in the right direction this would be great.
XPOs can and do work, but there are some issues using them. From what I've researched, when it comes to SOX compliance the best way to migrate code changes is by moving the binary server files - the .aod, .ahd, etc. files stored in the application directory of the server. Since these files are the compiled versions of the application code, it is easier to prove that the modifications created in a development environment are the same modifications deployed to the production environment. XPOs are plain text and can be manipulated in a text editor, making this more difficult to prove, though not impossible.
I actually did a writeup of what we have done to manage our code deployments if you are interested. It covers XPO vs Layer file migrations, and ultimately describes our process for automated builds and deployments. Since we put it in place our auditors have been very happy when it comes to auditing our system.

How to Choose a Microsoft SQL Server 2012 Edition as a Developer?

I hope this question isn't too obtuse; however, I couldn't find anything specific. I'm a web developer and I have an MSDN Subscription that gives me access to any SQL Server edition I want. As a developer, I would like to know what I should choose to install on a dev machine based on these criteria (which other developers may relate to):
I need access to all the tools for SQL and T-SQL programming (I think all editions come with this?)
I want it to be efficient - I don't want it to take up too much RAM/CPU time. My queries will not be very heavy, so I'd rather trade off longer-running queries than have the server taking up valuable resources.
I am programming against an enterprise SQL edition hosted somewhere else, but I don't need more than 1 GB of space or 1 CPU core's worth of support.
I never really worked with reporting tools, but as a developer (aka non-DBA) would I ever need them on a dev machine?
Best integration with VS2013
I know that the SQL Server Developer edition is basically Enterprise, but without the license to use it for non-dev purposes. Based on the above criteria, is there any sense in my installing it? Or should I choose SQL Express with Advanced Services? Perhaps Web?
Thanks for all your help,
All editions come with all the tools (unless you get into the BI side of things, in which case I think Express won't come with all of those).
In general, the edition won't make your local development environment any different in terms of resource usage. There are a few things that Enterprise / Developer have (like online index rebuild, certain optimizations, etc.) that can make some operations more efficient, but these are highly unlikely to impact your day-to-day work or really change the amount of resources SQL Server uses (these are very easy to cap through configuration anyway, e.g. if you don't want SQL Server to use more than x GB of memory, you can set that).
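For example, capping the instance's memory is a one-time configuration step; a minimal sketch (the 2048 MB value is purely illustrative - pick whatever suits your dev box):

```sql
-- Make the advanced setting visible (requires sysadmin rights).
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;

-- Cap the instance at 2 GB; the value is in MB.
EXEC sp_configure 'max server memory (MB)', 2048;
RECONFIGURE;
```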
If you don't need more than 1 GB / 1 CPU in the ultimate deployment, you should probably develop on Express. This will prevent you from using Enterprise features inadvertently (which can happen if you use Developer). The down-side is that if you later do need features that aren't in Express (say you have another project where you will be deploying to Enterprise), you'll need to add an instance (with or without removing the old one). Given that you have access to MSDN, maybe the best solution is to install two instances - one Express, and one Developer, and then you can target the edition you want by using the appropriate instance locally.
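If you do end up developing on a Developer instance, you can at least check after the fact whether any edition-specific features have crept into a database (available on SQL Server 2008 and later):

```sql
-- Lists edition-specific features persisted in the current database,
-- e.g. Compression or Partitioning. An empty result means the database
-- is not using anything that would block a restore onto Express/Standard.
SELECT feature_name
FROM sys.dm_db_persisted_sku_features;
```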
I think that Express with Advanced Services comes with these things, but I'm not an SSRS guy, so I'm not sure.
No single aspect of integration with Visual Studio should be edition-dependent.
Also, Web is not an edition that is suitable for your workstation, even if you could find a license somewhere. That edition is exclusively for web hosts and resellers who offer SQL Server as part of their hosted offerings.

What are the criteria for choosing between SiteEdit and the Experience Manager?

On a Tridion 2011 SP1 system, you have a choice between implementing SiteEdit 2009 SP3 and the more recent "User Interface update for SDL Tridion 2011 SP1" (also known as Experience Manager). What criteria are important in making this choice, and why?
For example:
Ease/cost of implementation
Infrastructure
License costs
Future support
Improved functionality
Both SiteEdit 2009 SP3 and Experience Manager are currently supported products. But it's clear that SDL's focus going forward is to further extend Experience Manager and not SiteEdit 2009 anymore.
In simple scenarios SiteEdit 2009 may be a bit easier to implement, because Experience Manager has a bigger impact on the Content Delivery system due to the prerequisites for its Session Preview mechanism. When I install Experience Manager without Session Preview, however, I find that it takes me no more time than setting up SiteEdit 2009 - a product that I've installed considerably more often.
But Experience Manager should typically cause fewer integration problems, since it doesn't use a server-side proxy. Many of the more advanced authentication scenarios simply work with Experience Manager, where they've proven challenging with SiteEdit 2009.
I think the above covers points 1 through 4 of your question. I'll leave number 5 to others, although I already mentioned "Session Preview" as one of the big new features in Experience Manager.
My thoughts (I'm still calling the new product UI btw):
UI will only work on 2011 SP1 HR1
SiteEdit is quite old now, UI is the later product... why would you choose to install something that isn't the latest software?
To your points:
the cost of installing 2009 will be a waste when you have to install UI shortly after :)
UI doesn't have the proxy anymore; it's part of the CM machine. Setting up sites is much, much easier.
3/4. No idea on license cost, I'd imagine SE2009 isn't supported by SDL though, so I'd ask SDL.
UI is really great. I'm not going to write an essay on the new stuff, but I think the one point that isn't on your list should be:
'What do the end users think of both systems?'
This is a user editing tool. The infrastructure and technical implementation details I'm sure you can work around for either product (should you have to), but if you're putting in a tool that will be used by editors and you have a choice to make, shouldn't it be the one that they agree works best?
I will add my 2 cents on costs - in most of my implementations, getting SiteEdit 2009 installed and tested has typically taken less than half a day per environment. Make sure you apply all the hot fixes from SDL Tridion World to get it to work with the latest browsers.
The new UI can optionally use something called 'Session Preview' (to enable fast publishing), which makes use of several Content Delivery technologies such as OData. If you are not already using these in your implementation, then there is likely to be a considerable investment in infrastructure/application design to get it designed, installed, working, and tested (I have heard of cases where this has taken over a month), which will make the Experience Manager considerably more expensive to implement in the short term. If you don't use the 'Session Preview' feature, then (as Frank has already said) the implementation time/cost is similar, but you will not benefit from the fast publishing features of the newer product.
As for functionality - the two environments look and feel very different. Experience Manager is clearly the direction the products are moving towards and provides a much slicker interface. So if your client is new to SDL Tridion, I would suggest using it, however if they are a long time SDL Tridion customer who has experience with SiteEdit 1.3 or 2009, and you don't plan to take advantage of some of the newer features in the short term, I would be tempted to stay with SE 2009, and make the shift to Experience Manager when they upgrade to SDL Tridion 2013.

Proper DTAP setup for Content Delivery

I've had this setup, but it didn't seem quite right.
How would you improve Content Delivery (CD) development across multiple .NET (customer) development teams?
CMS Server -> Presentation Server Environments
CMS Production -> Live and Preview websites
CMS Combined Test + Acceptance (internally called "Staging") -> Live ("Staging")
CMS Development (DEV) -> Live (Dev website) and sometimes Developer local machines (laptops)
Expectations and restrictions:
Multiple teams and multiple websites
Single DEV CMS license (typical for customers, I believe?)
Enough CD licenses for each developer
Preferably developer could program and run changes locally--was this a reasonable expectation?
Worked
We developed ASP.NET pages using the Content Delivery API against the same broker database for local machines and CD DEV. Local machines had CD dlls, their own license files, and ran/debug fine with queries and component presentation calls.
Bad
We occasionally published to both the Dev presentation server and developer machines, which doesn't seem right in hindsight, but I think it was to get schema files onto our local machines. And yes, we didn't trust the Dev broker database.
Problematic:
Local machines sometimes needed Tridion-published pages but we couldn't reliably publish to local machines:
Setting multiple publication destinations for a single "Local Machine" publication target wouldn't work--we'd often take these "servers" home.
VPN blocked access to laptops offsite (used "incoming" folder at the time).
Managing publication targets for each developer and setting up CD for each new laptop was good practice (as in exercise, not necessarily as a good idea) but just a little tedious.
Would these hindsight approaches apply?
Synchronize physical files from Dev to local machines on our own?
Don't run presentation sites locally (localhost) but rather build, upload dll, and test from Dev?
We were simply missing a fourth CMS environment? As much as we liked our Sales Guy, we weren't interested in purchasing another CM license.
How could you better setup .NET CD for several developers in an organization?
Edit: #DominicCronin pointed out this is only a subset of a proper DTAP setup. I updated my terms and created a separate question to clarify DTAP with Tridion.
The answer to this one depends heavily on the publishing model you choose.
When using a dynamic model with a framework like DD4T, a single dev environment will suffice. There is one CMS and one CD server in that environment, and everything is published to a broker database. The CD environment can double as an auto-build system: the developers work purely locally on a localhost website (which gets its data from the dev broker database), and their changes are checked into a VCS (from which the auto-build can be done).
This solution can make do with only a single CMS because hardly any code is developed on the CMS side (templates are standardized and all work is done on the CD side).
It gets more complex if you are using a static or broker publishing model. Then I think the solution is to split Dev up in Unit-Dev and Dev indeed as indicated by Nuno and Chris.
This solution requires coding on both the CMS and the CD side, so every developer benefits hugely from having their own local CMS and CD environment.
Talk to your Tridion account manager and agree a license package that suits the development model you want to have. Of course, they want to maximise their income, but the various things that get counted are all really meant to ensure that big customers pay accordingly, and smaller customers get something they can afford at a price that reflects the benefits they get. In fact, setting up a well-thought-out development street with a focus on quality is the very thing that will ensure good customer satisfaction and a long-running engagement.
OK - so the account managers still have internal rules to follow, but they also have a fair amount of autonomy in coming to a sensible deal with a customer. I'm not saying this will always work, but it's way better than blindly assuming that they are going to insist on counting every server the same way.
On the technical side - sure, try to have local developer setups and a common master dev server, à la Chris's 5th point. These days, your common dev environment should probably be seen as a build/integration server: the first place where the team guarantees all the tests will run.
Requirements for CM and CD development aren't very different, although you may be able to publish to multiple developer targets from one CM if there's not much CM development going on. (This is somewhat true of MVC-ish approaches, but it's no silver bullet.)

How to develop, test, version, and sync SQL Server 2000, 2005, 2008, and 2008 R2 databases?

A development shop has a range of ASP.NET projects using SQL Server 2000, 2005, 2008, and 2008 R2 databases.
How would you design, develop, maintain, version-control, fill with test data, stress-load, test, automate, and keep in sync with production such a range of databases?
Does the recent Visual Studio 2010 Ultimate or Database Edition support SQL Server 2000 databases?
Update: The question is not confined to VS2010 or even to MS-only products.
Even if it were confined, how should one organize the development infrastructure and environment?
Also, variants that cut some of the functionality in order to minimize or optimize time and expenses should be considered.
So far I have been reading on this (with sublinks and related links):
Different Development environment than Test & Production environments?
Keeping testing and production server environments clean, in sync, and consistent
How to keep track of performance testing
Get Your Database Under Version Control
http://www.codinghorror.com/blog/2008/02/get-your-database-under-version-control.html
Verify database changes (version-control)
Is Your Database Under Version Control?
http://www.codinghorror.com/blog/2006/12/is-your-database-under-version-control.html
How do you stress load dev database (server) locally?
I suggest you develop against the lowest common denominator (i.e. the SQL 2000 database).
You can then backup and restore this database to the other version of SQL Server in your testing and staging environments to give you the range of database servers you need.
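Note that this only works "upward": an older backup restores onto a newer server, never the reverse. A sketch of the round trip (the database name and file paths are illustrative):

```sql
-- On the SQL Server 2000 development server:
BACKUP DATABASE AppDb TO DISK = 'D:\Backups\AppDb.bak';

-- On a 2005/2008/2008 R2 test or staging server (file locations will differ):
RESTORE DATABASE AppDb FROM DISK = 'D:\Backups\AppDb.bak'
WITH MOVE 'AppDb'     TO 'E:\Data\AppDb.mdf',
     MOVE 'AppDb_log' TO 'E:\Data\AppDb_log.ldf';
```

The restored database keeps its original compatibility level, which helps surface version-specific behavior on the newer servers.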
First have your developers load the client tools of all three versions on their machines. You have to install them starting with 2000 and working upward for them to work correctly. Then have them work in Query Analyzer for projects supporting 2000 and in SSMS for projects supporting 2005 or 2008. Insist that they always work only against the lowest version of the database the client will be using. Most things that work in 2000 will work in 2008 (not so true of the next version, so customers on 2000 should be strongly encouraged to upgrade).
Have them do all work in scripts (even database changes and inserts to lookup type tables) and check the scripts into source control just like any other code.
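A typical versioned change script might look like the sketch below - written to be safe to re-run and using only SQL 2000-compatible syntax (the file name and all object names are made up for illustration):

```sql
-- 0042_add_customer_status.sql: checked into source control like any other code.

-- Schema change, guarded so the script can be run more than once.
IF NOT EXISTS (SELECT * FROM syscolumns
               WHERE id = OBJECT_ID('dbo.Customer') AND name = 'Status')
BEGIN
    ALTER TABLE dbo.Customer ADD Status int NOT NULL DEFAULT 0;
END

-- Lookup-table data as repeatable inserts.
IF NOT EXISTS (SELECT * FROM dbo.CustomerStatus WHERE StatusId = 1)
    INSERT INTO dbo.CustomerStatus (StatusId, Name) VALUES (1, 'Active');
```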
If you have testers, make sure they are connected to the correct version of the database and that they do tests against that and not some higher version.
I also would have a cheat sheet made up for your developers concerning what T-SQL code will work on which version. Best way to do this is to look in Books Online for 2005 and 2008 to see what new features were added.
But it is critical that they only work in the database version the particular project will support, or you will have to rewrite large swaths of code when it goes to prod. Newer devs don't know 2000 and are used to using things like CTEs that are not supported there. It is best that they find out the code won't work immediately when they write it, not in test or, worse, in prod.
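For instance, a CTE compiles fine on 2005 and later but fails on a 2000 target, where it has to be rewritten, e.g. as a derived table (table and column names are illustrative):

```sql
-- SQL Server 2005+ only:
WITH LatestOrder AS (
    SELECT CustomerId, MAX(OrderDate) AS LastOrderDate
    FROM dbo.Orders
    GROUP BY CustomerId
)
SELECT c.Name, lo.LastOrderDate
FROM dbo.Customer c
JOIN LatestOrder lo ON lo.CustomerId = c.CustomerId;

-- SQL Server 2000-compatible equivalent:
SELECT c.Name, lo.LastOrderDate
FROM dbo.Customer c
JOIN (SELECT CustomerId, MAX(OrderDate) AS LastOrderDate
      FROM dbo.Orders
      GROUP BY CustomerId) lo ON lo.CustomerId = c.CustomerId;
```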
