We are looking to turn an internal tool we have developed into a Visual Studio package that we would sell to other developers. The tool affects the code editor and/or support for custom languages.
Visual Studio 2010 has heavily redesigned its APIs to simplify much of the work involved in these kinds of integration, but the key question we have is:
What is the typical adoption pace of new Visual Studio versions? Is there any information out there on adoption rates based on history? How many shops are still using 2005?
This will help us decide whether to target just 2010 using the new APIs, or whether to go back and support 2008 (maybe 2005) and test it forward.
The short answer:
I'd primarily target VS2005 (as you shouldn't have much trouble getting a 2005-specific add-in to work in 2005/2008/2010 and thus will maximise your potential market).
The longer answer:
As you move from 2005 to 2008 to 2010, it gets progressively easier to write add-ins. In particular, the new extensibility features in 2010 make it much easier to build and deploy extensions (the older add-in and package systems used in 2005 and 2008 are much more painful to get working).
However, quite a large percentage of users still use 2005 (indeed, there are still a lot of people using 2003 and VS6), but I'd guess that most are now on 2008. Don't expect particularly high percentages of 2010 users until at least SP1, as a lot of companies won't even look at it until it's been out there for at least 6 months and any teething troubles are sorted out. So at present if you want a large market, I think you have no choice but to target 2005 and 2008.
As a general rule, if your add-in works in 2005 it is likely to work well in 2008 and 2010, so targeting 2005 is the best bet if you want a large market. Unless there is a specific feature of 2008 that you need, there is little difference between 2005 and 2008 in most areas, so I'd advise starting with 2005 and only jumping up to 2008 if you hit a problem that can't easily be solved without the 2008 APIs. This should work fine in 2010 as well, since add-ins are well supported there, but there is no guarantee that future versions of Visual Studio will continue to support them.
The alternative, as you say, is to ditch the old add-in interfaces and use the new 2010 extensibility APIs. This will make development easier, give you much more access to the internals of 2010, and be more future-proof... but it will take months or years for the market size to build.
Well, the larger the project, the more time it will take to move from 2005 to 2008 and 2010.
I know of many projects that are still on 2005, so if you can afford it, make 2005, 2008, and 2010 versions. Large projects usually have the funds to buy tools...
If you can afford only one version of the product, go with 2010; in the long term this is the best option.
(2010 will start gaining market share within a few weeks. If you can ship the product in less than 6 months, you should target the older versions first, as they will rule the market for at least one more year.)
We are supposed to benchmark the performance of a Dynamics AX 2012 application.
I have no prior experience with Dynamics AX 2012 or with load testing desktop applications.
If anyone has worked on the same, please tell me the best available options.
From what I have been reading, I've gathered there is nothing like the Application Benchmark Toolkit (which existed for AX 2009) for AX 2012.
Currently Microsoft has released some benchmarking white papers, specifically the 'Microsoft Dynamics AX 2012 Day in the Life Benchmark' which gives some guidance on the sizing of environments. If you want to do your own load testing there is no easy way to get there currently. The closest you could reasonably get would be:
Writing a number of routines or jobs, either in X++ or C#, that call AX services and perform operations. This would let you do things like enter a large number of customers and orders and time the operations. It does not benchmark client performance, though.
Visual Studio 2010 Ultimate has UI automation testing tools that let you attach to an application and create UI tests that perform certain actions. You could use these to record manual tests in the Dynamics AX client and then run them multiple times. Obviously this only helps if you need to test client performance.
According to recent posts from Convergence 2013, Microsoft is supposed to be releasing a load testing tool that seemingly meets your requirements in April/May 2013, so you may in fact luck out from a timing perspective unless you have a very tight implementation deadline.
A few quick rules of thumb from a performance perspective:
Don't virtualize SQL Server. Microsoft says that in the best case (you have a really good SQL admin) you'll take a 15% performance hit, and in the worst case it's closer to 60%.
Use dedicated AOS instances to handle things like batch jobs, since those tend to get more and more involved as the system matures.
I'll reply to an old question; maybe it'll help people who land here through Google in the future.
In the meantime an Application Benchmark SDK has been released for Dynamics AX 2012.
You can find the full documentation here.
Basically it's a set of tests you run from Visual Studio; some standard tests are available, and the SDK lets you write your own.
On a Tridion 2011 SP1 system, you have a choice between implementing SiteEdit 2009 SP3 and the more recent "User Interface update for SDL Tridion 2011 SP1" (also known as Experience Manager). What criteria are important in making this choice, and why?
For example:
Ease/cost of implementation
Infrastructure
License costs
Future support
Improved functionality
Both SiteEdit 2009 SP3 and Experience Manager are currently supported products. But it's clear that SDL's focus going forward is to further extend Experience Manager and not SiteEdit 2009 anymore.
In simple scenarios SiteEdit 2009 may be a bit easier to implement, because Experience Manager has a bigger impact on the Content Delivery system through the prerequisites for its Session Preview mechanism. When I install Experience Manager without Session Preview, however, I find that it takes me no more time than setting up SiteEdit 2009 - a product that I've installed considerably more often.
But Experience Manager should typically cause fewer integration problems, because it doesn't use a server-side proxy. A lot of the more advanced authentication scenarios simply work with Experience Manager, where they've proven challenging with SiteEdit 2009.
I think the above covers points 1 through 4 of your question. I'll leave number 5 to others, although I already mentioned "Session Preview" as one of the big new features in Experience Manager.
My thoughts (I'm still calling the new product UI btw):
UI will only work on 2011 SP1 HR1.
SiteEdit is quite old now and UI is the later product... why would you choose to install something that isn't the latest software?
To your points:
The cost of installing 2009 will be a waste when you have to install UI shortly after :)
UI doesn't have the proxy anymore; it's part of the CM machine. Setting up sites is much, much easier.
3/4. No idea on license costs. I'd imagine SE 2009 isn't supported by SDL though, so I'd ask SDL.
UI is really great. I'm not going to write an essay on the new stuff, but I think the point that isn't on your list, and should be, is:
'What do the end users think of both systems?'
This is a user editing tool. The infrastructure and technical implementation details I'm sure you can work around for either product (should you have to). If you're putting in a tool that will be used by editors and you have a choice to make, shouldn't it be the one that they agree works best?
I will add my 2 cents on costs - in most of my implementations, getting SiteEdit 2009 installed and tested has typically taken less than half a day per environment. Make sure you apply all the hotfixes from SDL Tridion World to get it to work with the latest browsers.
The new UI can optionally use something called 'Session Preview' (to enable fast publishing), which makes use of several Content Delivery technologies such as OData. If you are not already using these in your implementation, there is likely to be a considerable investment in infrastructure/application design to get them designed/installed/working/tested (I have heard of cases where this took over a month), which will make Experience Manager considerably more expensive to implement in the short term. If you don't use the 'Session Preview' feature, the implementation time/cost is similar (as Frank has already said), but you will not benefit from the fast-publishing features of the newer product.
As for functionality - the two environments look and feel very different. Experience Manager is clearly the direction the products are moving towards and provides a much slicker interface. So if your client is new to SDL Tridion, I would suggest using it, however if they are a long time SDL Tridion customer who has experience with SiteEdit 1.3 or 2009, and you don't plan to take advantage of some of the newer features in the short term, I would be tempted to stay with SE 2009, and make the shift to Experience Manager when they upgrade to SDL Tridion 2013.
A development shop has a range of ASP.NET projects using SQL Server 2000, 2005, 2008, and 2008 R2 databases.
How would you design, develop, maintain, version-control, fill with test data, stress-load, test, automate, and keep in sync with production such a range of databases?
Does the recent Visual Studio 2010 Ultimate or Database Edition support SQL Server 2000 databases?
Update: The question is not confined to VS 2010, or even to Microsoft-only products.
Even if it were, how should the development infrastructure and environment be organized?
Variants that cut some functionality in order to minimize or optimize time and expense are also worth considering.
This is what I have been reading so far (with sublinks and related links):
Different Development environment than Test & Production environments?
Keeping testing and production server environments clean, in sync, and consistent
How to keep track of performance testing
Get Your Database Under Version Control
http://www.codinghorror.com/blog/2008/02/get-your-database-under-version-control.html
Verify database changes (version-control)
Is Your Database Under Version Control?
http://www.codinghorror.com/blog/2006/12/is-your-database-under-version-control.html
How do you stress load dev database (server) locally?
I suggest you develop against the lowest common denominator (i.e. the SQL Server 2000 database).
You can then back up and restore this database to the other versions of SQL Server in your testing and staging environments to give you the range of database servers you need.
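As a rough sketch of that backup-and-restore step (database, path, and logical file names here are made up for illustration): a SQL 2000 backup restores forward onto 2005/2008, and pinning the compatibility level back to 80 helps catch code that relies on newer behaviour, though it does not block every newer feature.

```sql
-- On the SQL 2000 development server: take a full backup.
BACKUP DATABASE AppDb
TO DISK = 'C:\Backups\AppDb.bak'
WITH INIT;

-- On the SQL 2005/2008 test server: restore it.
-- Restoring a 2000 backup on a newer server upgrades the database
-- in place, so the restore is one-way (you cannot go back to 2000).
RESTORE DATABASE AppDb
FROM DISK = 'C:\Backups\AppDb.bak'
WITH MOVE 'AppDb_Data' TO 'D:\Data\AppDb.mdf',
     MOVE 'AppDb_Log'  TO 'D:\Data\AppDb.ldf';

-- Keep SQL 2000 (level 80) compatibility behaviour on the newer server.
EXEC sp_dbcmptlevel 'AppDb', 80;
```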
First have your developers install the client tools of all three versions on their machines; they have to be installed starting from 2000 and working upward for everything to function correctly. Then have them work in Query Analyzer for projects supporting 2000 and in SSMS for projects supporting 2005 or 2008. Insist that they always work only against the lowest version of the database the client will be using. Most things that work in 2000 will work in 2008 (not so true of the next version, so customers on 2000 should be strongly encouraged to upgrade).
Have them do all work in scripts (even database changes and inserts into lookup-type tables) and check the scripts into source control just like any other code.
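A sketch of what such a checked-in change script might look like (table and column names are invented for the example): writing scripts to be safe to re-run, using only constructs that SQL 2000 understands, means the same file can be applied to every environment.

```sql
-- Schema change, guarded so the script can be re-run safely.
-- syscolumns is used rather than INFORMATION_SCHEMA quirks or
-- sys.columns so the script also runs on SQL Server 2000.
IF NOT EXISTS (SELECT * FROM dbo.syscolumns
               WHERE id = OBJECT_ID('dbo.Customer')
                 AND name = 'PhoneNumber')
BEGIN
    ALTER TABLE dbo.Customer ADD PhoneNumber varchar(20) NULL;
END
GO

-- Lookup-table data as a repeatable insert rather than an ad-hoc edit.
IF NOT EXISTS (SELECT * FROM dbo.OrderStatus WHERE StatusCode = 'SHIP')
    INSERT INTO dbo.OrderStatus (StatusCode, Description)
    VALUES ('SHIP', 'Shipped');
GO
```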
If you have testers, make sure they are connected to the correct version of the database and that they do tests against that and not some higher version.
I would also have a cheat sheet made up for your developers covering which T-SQL constructs work on which version. The best way to build one is to look in Books Online for 2005 and 2008 to see what new features were added.
But it is critical that they work only in the database version the particular project will support, or you will have to rewrite large swaths of code when it goes to prod. Newer devs don't know 2000 and are used to using things like CTEs that it does not support. It is best that they find out code won't work immediately, as they write it, not in test, or worse, on prod.
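A concrete entry for such a cheat sheet (table and column names are made up): the CTE form compiles on 2005 and later but fails on SQL 2000, while the derived-table rewrite returns the same result on every version.

```sql
-- SQL 2005 and later only: common table expression.
WITH RecentOrders AS (
    SELECT CustomerId, COUNT(*) AS OrderCount
    FROM dbo.Orders
    WHERE OrderDate >= '20100101'
    GROUP BY CustomerId
)
SELECT c.Name, r.OrderCount
FROM dbo.Customer c
JOIN RecentOrders r ON r.CustomerId = c.CustomerId;

-- SQL 2000-compatible rewrite using a derived table.
SELECT c.Name, r.OrderCount
FROM dbo.Customer c
JOIN (SELECT CustomerId, COUNT(*) AS OrderCount
      FROM dbo.Orders
      WHERE OrderDate >= '20100101'
      GROUP BY CustomerId) r
  ON r.CustomerId = c.CustomerId;
```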
I've been working with ASP.NET for about 5 years now and I'm looking to get into SharePoint development.
Would it make more sense to get up to speed on SP 2007 first or just dive straight into SP 2010?
Seems like learning SP 2007 would give me a better understanding of the "story" and broaden my work opportunities.
What do you think? Is a grasp of SP 2007 a must for any SP developer at this point?
Thanks
Thanks for the helpful and encouraging answers. Seems the unanimous recommendation is to skip SP 2007 and dive straight into SP 2010 as the dev tools are much better, so I'll probably do that :)
As I wrote in a previous answer I think people starting should focus on 2010. Depending on your work situation, it may limit you in the types of contract jobs you can take, but the development environment is significantly better in 2010. The only reason to start with 2007 is so you can appreciate how much easier 2010 makes it :)
SharePoint 2010 is a completely new architecture and is fundamentally different in many ways. For example, the Shared Services Provider is no more and the sandbox has been added.
I started with 2007 because that's where we are at work. I think the version being used in your workplace should dictate your decision (unless you're a consultant).
IMHO, starting with SP 2010 would be easier - the support for SP 2010 development comes built in with VS 2010.
I don't think you would gain that much from starting with SharePoint 2007. As the others have said, the development environment with 2010 is much friendlier and doesn't require as many 3rd party tools.
More importantly though, the object model is almost the same. While it is true that the Shared Services Provider is gone, it is not something you play with a lot when you develop (in my experience, anyway). The important objects and concepts (content types, web parts, lists, list items, etc.) are pretty much the same, which is why I don't think there is much of a "story" to get.
Make your life easier and go with 2010.
SharePoint 2010 is a much easier place to start, as others have already suggested. By analogy, there isn't a lot of reason today to learn COM+ programming, for instance; you can work with .NET and be happy most of the time.
A couple of really good resources that will help bring you up to speed with SharePoint 2010 are listed below:
http://technet.microsoft.com/en-us/sharepoint/ee518660.aspx
http://technet.microsoft.com/hi-in/sharepoint/ff420396(en-us).aspx
I have a client running a half dozen or so orchestrations on BizTalk 2004 (that I wrote) that they use to exchange cXML documents (mostly to send orders) with their suppliers. It has an ASP.NET 1.1 front end. It uses the SQL adapter to store the parsed cXML, and it gets and sends the documents via HTTPS.
My question: is the upgrade to BizTalk 2006 R2 as straightforward as MS says? Any advice or things I should watch out for?
We finished a similar upgrade last year with little effort beyond importing the projects into Visual Studio 2005. The project upgrades went without issue. The biggest problem we had was with the various deployment scripts we used; there was a bit of rewriting to work with some of the new features of 2006. We also had to adjust to the multiple-host model for our apps. But all in all, no problems - just more features and some API changes around deployment.
Best of luck.
At some point you will want to review the recommended tuning parameters for BizTalk 2006 R2. I've prepared a list of the relevant resource links that may be helpful:
http://intltechventures.blogspot.com/2008/11/2008-11-01-saturday-biztalk-2006-r2.html