What kinds of challenges are faced when we migrate from versions of ATG Commerce (< 10) that were not using Endeca Experience Manager to versions that use it? For example, would all the JSPs undergo a change in the way they are rendered, given that the pages will now have to be template driven?
What would be some best practices here to minimize the impact of the move on the UI and maximize reuse of the JSPs?
I have read the migration docs but they do not seem to cover this aspect.
As you know, ATG and Endeca only really started integrating in ATG 10.2.x, so in older versions of ATG the integration requires a lot more work from the developer. I've worked on an ATG 9.2 and Endeca 3.1.2 implementation that does exactly that. Your question should really be: how far off are you from migrating to a later version of ATG that does integrate nicely with Endeca, and how much of your current system would you want to retain after such a migration? This is important, as it will mean you either need to build a solution that mimics the ATG Assembler Pipeline functionality (giving you the most control over your templates and cartridges when integrating with Experience Manager) or a less intrusive approach based on the InvokeAssembler droplet.
The other thing to consider is how much you want to render through Experience Manager. Typically you would do the homepage and category pages. The product detail page would call some components from Experience Manager (for example, breadcrumbs), but the data in the index isn't usually as accurate as the data in the database (for example, inventory levels), so for the PDP you go directly to the repository. You are also unlikely to build your checkout flow in Experience Manager. This should give you an indication that you are likely to retain a large number of your existing pages.
Your quickest approach would be to build a droplet that will retrieve your contentItems from Experience Manager and then start to render them. Keep in mind that the content items are just glorified JSON responses so you can easily parse them when you get hold of them.
I'm thinking about attempting to design a new framework architecture aimed at allowing a web app to later be easily ported into a system such as Drupal or Joomla while maintaining the independence of the original app such that updates to core functionality would require only one release or otherwise minimal extra work.
Before I start on this however, I would like to see what work has previously been done that comes closest to what I am proposing. So an answer to this question would come in the form of a reference to the most similar work or if possible a definitive 'no' that this has not been done before.
Clarification by example: MediaWiki is a common web app that has become one of the most highly recommended of its kind. However, site admins building their sites with Drupal would be required to hack MediaWiki in order for it to play nicely with Drupal, in terms of sharing a user base for example. Imagine that MediaWiki decided to do a complete rebuild of its system: what design could be used to make this interaction simply require a Drupal module or Joomla component, and thus make MediaWiki available to more users?
I'm using MediaWiki as just an example; I think modules and components already exist that solve this particular problem, but I hope I am able to get my idea across. It is a problem I have encountered many times during web development, now that CMS systems are appearing more and more enterprise-like.
Thanks!
godwin
Content Management Interoperability Services (CMIS) is an OASIS specification that you can use to improve the data portability and interoperability of a CMS. If your system has (or you provide) a CMIS interface, you can move content to/from other CMS systems that also provide CMIS interfaces.
See:
http://en.wikipedia.org/wiki/Content_Management_Interoperability_Services
http://www.oasis-open.org/committees/tc_home.php?wg_abbrev=cmis
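As a sketch of what that looks like in practice from .NET, here is roughly how you would open a CMIS session with the DotCMIS client library and walk a repository in a CMS-neutral way. I'm writing the API names from memory of the OpenCMIS/DotCMIS client, so treat them as approximate, and the URL, credentials, and repository ID are placeholders.

```csharp
// Rough sketch (assumed DotCMIS API; placeholder endpoint/credentials):
// open a CMIS session against a repository's AtomPub binding and list the
// root folder, regardless of which CMS is behind it.
using System;
using System.Collections.Generic;
using DotCMIS;
using DotCMIS.Client;
using DotCMIS.Client.Impl;

class CmisDemo
{
    static void Main()
    {
        var parameters = new Dictionary<string, string>
        {
            { SessionParameter.BindingType, BindingType.AtomPub },
            { SessionParameter.AtomPubUrl, "http://cms.example.com/cmis/atom" }, // placeholder
            { SessionParameter.User, "admin" },            // placeholder
            { SessionParameter.Password, "secret" },       // placeholder
            { SessionParameter.RepositoryId, "default" }   // placeholder
        };

        ISession session = SessionFactory.NewInstance().CreateSession(parameters);

        // Enumerate content without knowing anything about the underlying CMS.
        foreach (ICmisObject child in session.GetRootFolder().GetChildren())
            Console.WriteLine(child.Name);
    }
}
```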
We're currently evaluating development with Sitecore 6 for a project. The client already bought it, so using another CMS isn't an option. The proposed setup would have Sitecore as our site's content data provider, which would be consumed by a site built in ASP.NET MVC 3. We'd use Sitecore's libraries to retrieve data from the Sitecore database on the server side.
In some cases, we may also want to consume content data via client-side AJAX calls. I've been working on prototypes for this to see what data I can get back from a custom proxy service. This service calls GetOuterXml on the item, converts the XML to JSON, and sends the JSON back to the calling script. So far, I'm finding this method limiting, as it appears GetOuterXml only returns fields and values for fields that were set on the specific item, ignoring the template's standard value fields and their default values, for example. I tried Item.Fields.ReadAll(), but it still wouldn't return the standard values. Also, there are circular references in the Item graph (item.Fields[0].Item.Fields[0]...), which has made serialization quite difficult without having to write something totally custom.
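For reference, the direction I've been prototyping instead of GetOuterXml is to project the item's fields into a flat dictionary and serialize that, which at least sidesteps the circular references; whether Field.Value reliably picks up the template's standard values after ReadAll() is exactly the part I'm still unsure about. The names and shape below are illustrative only.

```csharp
// Illustrative sketch only: flatten the item into a dictionary before serializing,
// instead of serializing the Item graph (which has circular references).
// ReadAll() forces all template-defined fields to load; Field.Value should fall
// back to the template's standard values (the part I'm still verifying).
using System.Collections.Generic;
using System.Web.Script.Serialization;
using Sitecore.Data.Fields;
using Sitecore.Data.Items;

public static class ItemJson
{
    public static string ToJson(Item item)
    {
        item.Fields.ReadAll();

        var fields = new Dictionary<string, string>();
        foreach (Field field in item.Fields)
        {
            if (!field.Name.StartsWith("__"))      // skip Sitecore system fields
                fields[field.Name] = field.Value;  // should include standard values
        }

        var dto = new Dictionary<string, object>
        {
            { "id", item.ID.ToString() },
            { "name", item.Name },
            { "template", item.TemplateName },
            { "fields", fields }
        };

        return new JavaScriptSerializer().Serialize(dto);
    }
}
```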
Needless to say, I've been running into many roadblocks on my path down this particular road and am definitely leaning toward doing things the Sitecore way. However, my team really wants to use MVC for this project, so before I push back on this, I feel it's my responsibility to do some due diligence and reach out to the community to see if anyone else has tried this.
So my question is: as a Sitecore developer, have you ever used Sitecore purely as a content data provider on the client side and/or server side? If you have, did you encounter similar issues, and were you able to resolve them? I know that by using Sitecore in this way, you lose a lot of features such as content routing/aliasing, OMS, and the rendering and layout engine, among other features. I'm not saying we're definitely going down this path; we're just at the R&D phase of using Sitecore and determining how it would best be utilized by our team and our development practices. Any constructive input is greatly appreciated.
Cheers,
Frank
I don't have experience with trying to use Sitecore solely as a data provider, but my first reaction to what you're suggesting is DON'T!
Sitecore offers extremely rich functionality which is directly integrated into ASP.NET and configured from within the Sitecore UI. Stripping that off and rebuilding it in MVC is not so much reinventing the wheel as reinventing the car.
I think that in 6.4 you can use some MVC alongside Sitecore, so you may be able to provide a sop to your colleagues with that.
There is a requirement to write a portal-like ASP.NET-based web application.
There should be a lightweight central application which implements the primary navigation and the authentication. The design is achieved with master pages.
Then there are several more or less independent applications (old and new ones!), which should be easily and independently integrated into this central application (which should be the entry point for these applications).
Which approaches, architectures, patterns, techniques and possibilities can help achieve these aims? For example, does it make sense to run the (sub)applications in an iframe?
Are there lightweight, easy-to-learn portal frameworks which can be used (not big things like DotNetNuke)?
Many thanks in advance for your hints, tips and help!
DON'T REINVENT THE WHEEL! The thing about DotNetNuke is that it can be as big or as small as you make it. If you use it properly, you will find that you can limit it to what you need. Don't put yourself through the same pain that others have already put themselves through. Unless of course you are only interested in learning from your pain.
I'm not saying that DNN is the right one for you. It may not be, but do spend the time to investigate a number of open source portals before you decide to write your own one. The features that you describe will take 1000s of hours to develop and test if you write them all from scratch.
@Michael Shimmins makes some good suggestions about what to use to implement a portal app with some of the newer technology and best practice patterns. I would say, yes, these are very good recommendations, but I would encourage you to either find someone who has already done it this way or start a new open source project on CodePlex and get others to help you.
Daniel Dyson makes a fine point, but if you really want to implement it yourself (there may be a reason), I would consider the following components:
MVC 2.0
Inversion of Control/Dependency Injection (StructureMap for instance)
Managed Extensibility Framework
NHibernate (either directly or through a library such as S#arp Architecture or Spring.NET)
A service bus (NServiceBus for instance).
This combination gives you a flexible user interface through MVC, which can easily be extended via plugins (exposed and consumed via MEF), a standard data access library (NHibernate) which can easily be configured by the individual plugins to connect to specific databases, and the ability to publish events and 'pick them up' in components composed at runtime (NServiceBus).
Using IoC and DI you can pass around interfaces which are resolved at runtime based on your required configuration. MEF gives you the flexibility of defining 'what' each plugin can do, and then leaves it up to the plugins to do so, whilst your central application controls cross-cutting concerns such as authentication, logging, etc.
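To make the MEF part of that concrete, here is a minimal sketch of how the shell/plugin composition might look. The contract and module names (IPortalModule, NewsModule, the "plugins" folder) are illustrative, not from any particular product; the point is just the export/import/compose pattern.

```csharp
// Minimal MEF sketch (illustrative names): the shell defines a plugin contract,
// modules export implementations, and the shell composes whatever it finds in a
// plugins directory at runtime.
using System.Collections.Generic;
using System.ComponentModel.Composition;
using System.ComponentModel.Composition.Hosting;

public interface IPortalModule
{
    string Name { get; }
    void RegisterRoutes();   // e.g. hook this module's MVC routes into the shell
}

[Export(typeof(IPortalModule))]
public class NewsModule : IPortalModule
{
    public string Name { get { return "News"; } }
    public void RegisterRoutes() { /* register this module's controllers/routes */ }
}

public class Shell
{
    [ImportMany]
    public IEnumerable<IPortalModule> Modules { get; set; }

    public void Compose()
    {
        // Scan the host assembly plus a plugins folder for exported modules.
        var catalog = new AggregateCatalog(
            new AssemblyCatalog(typeof(Shell).Assembly),
            new DirectoryCatalog("plugins"));
        new CompositionContainer(catalog).ComposeParts(this);

        foreach (var module in Modules)
            module.RegisterRoutes();
    }
}
```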
This may sound a bit general, but I have a startup that is working on a greenfield suite of ASP.NET software applications. We are aiming to spend a substantial amount of time in the architecture phase to develop a strong foundation for our software. I was wondering if anyone has any advice, or any suggestions for areas we should focus on to build a better suite.
Some things we are focusing on right now:
1. Session state requirements - should sessions be sticky, or should we take server clustering and session migration into account?
2. User login authentication - what are the major concerns in this space: LDAP, AD, custom SQL authentication systems, etc.?
3. The DAL - ORM vs. stored procedures.
4. Integrating multiple ASP.NET applications into a single software suite - how it should look and feel, how it should be architected, etc.
I would appreciate any advice from any architects out there that have built similar systems from the ground up.
I know there are lots of solutions for session state, but if you can design your framework to be session-free, you will avoid a lot of potential headaches. (There are lots of session-free options, but an obvious one is a hidden form field, somewhat like ViewState.)
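As a rough illustration of the hidden-field idea, the sketch below carries a small piece of state in the page itself and signs it with an HMAC so the client can't tamper with it. The key handling and the state format ("userId|cartId" or similar) are placeholders, not a recommendation for production key management.

```csharp
// Sketch: carry state in a hidden form field instead of session, with an HMAC
// so it can't be modified client-side. Key management here is a placeholder.
using System;
using System.Security.Cryptography;
using System.Text;

public static class StatelessToken
{
    private static readonly byte[] Key = Encoding.UTF8.GetBytes("replace-with-a-real-secret-key");

    // Emit the result as the value of a hidden <input> on the page.
    public static string Protect(string state)
    {
        byte[] data = Encoding.UTF8.GetBytes(state);
        using (var hmac = new HMACSHA256(Key))
        {
            string sig = Convert.ToBase64String(hmac.ComputeHash(data));
            return Convert.ToBase64String(data) + "." + sig;
        }
    }

    // Call on postback; returns null if the field was tampered with.
    public static string Unprotect(string token)
    {
        var parts = token.Split('.');
        if (parts.Length != 2) return null;
        byte[] data = Convert.FromBase64String(parts[0]);
        using (var hmac = new HMACSHA256(Key))
        {
            string expected = Convert.ToBase64String(hmac.ComputeHash(data));
            return expected == parts[1] ? Encoding.UTF8.GetString(data) : null;
        }
    }
}
```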
Just some quick notes. I can't get too detailed since we went through this exercise where I worked last year - and I don't work there any more!
Start from the beginning using Enterprise Library, especially the Logging and Exception Handling application blocks. I've also found their Unity dependency injection library to be very useful.
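For the Unity piece, a tiny sketch of the registration/resolution pattern is below; the repository types are made up for illustration, and the Logging and Exception Handling blocks would be configured separately (typically in config).

```csharp
// Minimal Unity sketch (illustrative types): register an interface mapping at
// startup and resolve it wherever it's needed.
using Microsoft.Practices.Unity;

public class Order { }
public interface IOrderRepository { void Save(Order order); }
public class SqlOrderRepository : IOrderRepository { public void Save(Order order) { /* ... */ } }

public static class Bootstrapper
{
    public static IUnityContainer Configure()
    {
        var container = new UnityContainer();
        container.RegisterType<IOrderRepository, SqlOrderRepository>();
        return container;
    }
}

// Usage: var repo = Bootstrapper.Configure().Resolve<IOrderRepository>();
```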
Consider using Visual Studio Team Foundation Server. It's not just for source control, but can create you a complete continuous integration solution, complete with integrated bug tracking, code quality tracking, etc. If you've got the time and people, it's well worth a man-month to learn how to do an initial deployment.
You may want to buy one or more licenses of one of the Visual Studio Team System editions. You don't need these versions in order to use TFS, but they work well with it.
Consider globalization right from the start. Same with customization, if your suite will run on customer premises and be customizable by them.
You haven't said how large your team is, or is expected to be. If it's large enough, you'll want to spend at least a man-week learning a bit about what's available to you in terms of Visual Studio Extensibility. Your developers (and maybe also your QA folks) will all but live in Visual Studio, so the ability to customize it to meet your needs can be a big win. Whether it's just some macros and maybe some customized project or item templates, or whether you want to do add-ins or more, Visual Studio is very extensible.
Be certain to use WCF for any web services work. The older ASMX web service technology is now considered by Microsoft to be "legacy software".
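For context, a minimal WCF service is sketched below (the contract and implementation names are illustrative); the binding and endpoint configuration would normally live in web.config rather than code.

```csharp
// Minimal WCF sketch: an explicit service contract plus implementation, as
// opposed to an .asmx web service. Endpoint/binding go in web.config.
using System.ServiceModel;

[ServiceContract]
public interface ICatalogService
{
    [OperationContract]
    string GetProductName(int productId);
}

public class CatalogService : ICatalogService
{
    public string GetProductName(int productId)
    {
        // Look the product up in your data layer; hard-coded for the sketch.
        return "Sample product " + productId;
    }
}
```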
Finally, be sure to find out whether you qualify for BizSpark, "A program that provides Software, Support and Visibility for Software Startups." And does so almost for free.
I saw a demo of Silverlight 3 at the PhillyDotNet User Group last night - WOW. Wow for business applications, not graphic applications. There is a learning curve, but you get a lot for it. For example, the demo showed a grid being bound to a table without needing to write any code.
Right out of the box you had sorting, editing, paging, etc. But it wasn't the lame stuff you normally get and then have to rework. For example, the paging was smart enough to write the SQL that would bring back only the 20 rows you needed for the page.
The demo continued with him putting a detail form on the page for editing. Again no code, but it was smart enough to know that it had the same datasource as the grid on the page. So as you were moving row to row on the grid - the detail form was showing the current row (and it was very fast).
Both the grid and the detail form were editable and as you changed a field in one the other would reflect the new value. The editing was smart enough to validate the field on its own. So you couldn't put a letter in a field that was an integer type, etc. It also limited the number of characters that could be entered based on the column size found in the database. All the date fields on the detail form automatically had a calendar next to them. You get the idea - no coding for any of this.
If this weren't enough, it can be used to build occasionally connected applications. So he showed how he updated a few records on a few different pages, had the option to revert back a field later (ctrl-Z), and then at the end submitted all the changed records to be saved.
Also, they said it works with LINQ to SQL and the Entity Framework.
So if I were building a new product now, I would really look into this as a way of differentiating my product. And I suspect that if you don't do it with Silverlight now, you will be rewriting it in a few years anyway.
Here is a link to a demo (not the one I saw.)
Some general thoughts. If you'd like me to expound on any of these, let me know.
Inheriting from a custom subclass of Page instead of Page itself is a great way to share functionality across your site (see the sketch at the end of these notes).
Nested MasterPages are good.
Charting: I've tried DevExpress, Syncfusion, and the MSChart control, and all have their own issues.
The built-in forms authentication is pretty good. Building a site that allows both integrated authentication and forms authentication is tricky, but it can be done.
I've tried using cross-page postbacks and I'm still not sure if I like them.
Localization takes a lot of time.
Come up with a good structure for your App_Themes and css.
Use Elmah to track unhandled exceptions
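On the first note (a custom Page subclass), here is a rough sketch of the idea; the property and the OnLoad hook are just illustrative placeholders for whatever cross-cutting logic your site actually needs.

```csharp
// Sketch: pages inherit from SitePage instead of Page, so shared behavior
// (auth helpers, logging, theme/culture setup, etc.) lives in one place.
using System;
using System.Web.UI;

public class SitePage : Page
{
    protected string CurrentUserName
    {
        get { return User != null && User.Identity.IsAuthenticated ? User.Identity.Name : null; }
    }

    protected override void OnLoad(EventArgs e)
    {
        base.OnLoad(e);
        // Shared per-page logic goes here.
    }
}

// In a page's code-behind:
// public partial class Default : SitePage { ... }
```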
I'm in the middle of developing a product that I will hopefully be attempting to sell towards the end of the year, and I was wondering what the best way to handle licensing is.
My product is going to be a downloadable ASP.NET web application, and at the moment it looks like there will be a free version and a premium version.
I'm thinking about using serials that I can keep track of on my end, but the question is: what's the best way to restrict the free version and have the application 'know' it's premium? Or should I just have two branches of the same product instead of trying to do it all in one?
I'm planning on making a web installer for the product where the user can enter the serial, and it will determine what version they have.
You could have an encrypted license file that your system checks every so often, which tells it what "version" it is. You can dynamically restrict functionality based on that.
This allows you to keep a single code base, and it also makes it impossible for the users of your system to simply change a setting in a config file and get your entire system for free.
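One sketch of a license file the app can trust is shown below. It uses signing rather than symmetric encryption (a deliberate swap from "encrypted file"): you sign the license payload with your private key when issuing it, and the app ships only the public key, so users can read the file but can't forge a "premium" one. The key XML, payload format, and hash choice are placeholders.

```csharp
// Sketch: verify a signed license payload such as "premium|2015-12-31".
// The public key XML and payload format are placeholders; a modern app should
// prefer a stronger hash than SHA-1, used here only to keep the sketch simple.
using System;
using System.Security.Cryptography;
using System.Text;

public static class LicenseChecker
{
    private const string PublicKeyXml = "<RSAKeyValue>...your public key...</RSAKeyValue>";

    // licensePayload and signatureBase64 would be read from the license file.
    public static bool IsValid(string licensePayload, string signatureBase64)
    {
        using (var rsa = new RSACryptoServiceProvider())
        {
            rsa.FromXmlString(PublicKeyXml);
            byte[] data = Encoding.UTF8.GetBytes(licensePayload);
            byte[] signature = Convert.FromBase64String(signatureBase64);
            return rsa.VerifyData(data, "SHA1", signature);
        }
    }

    public static bool IsPremium(string licensePayload, string signatureBase64)
    {
        return IsValid(licensePayload, signatureBase64)
               && licensePayload.StartsWith("premium|", StringComparison.OrdinalIgnoreCase);
    }
}
```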
This is going to be hard to do with a web app.