We are upgrading to Tridion 2011 SP1, and as part of our Tridion search implementation we are using FS4SP (FAST Search for SharePoint 2010).
In the proposed implementation, the search environment consists of the following servers:
FS4SP
FISE
Can someone guide us on how to push content to FAST from Tridion, and how to retrieve it?
(For various reasons we are not considering having FAST crawl the website.)
Which APIs can be used for this implementation?
If you don't want to use the crawling approach, you will need to create a custom deployer; please take a look at this other article:
How can we integrate Microsoft FAST with SDL Tridion 2011 SP1?
Alternatively, if you don't have a development team who is familiar with Java, you might consider creating a .NET application which updates your FAST index based on either a File System or Database trigger when your pages or components are published, updated or deleted from your broker repository.
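For the File System trigger variant, a minimal sketch (the path and the PushToFast/RemoveFromFast helpers are hypothetical):

    using System.IO;

    // Watch the broker's file-system storage and update the FAST index
    // as pages are published, updated or deleted.
    var watcher = new FileSystemWatcher(@"D:\tridion\broker\pages", "*.html")
    {
        IncludeSubdirectories = true,
        EnableRaisingEvents = true
    };
    watcher.Created += (s, e) => PushToFast(e.FullPath);     // your FAST update code
    watcher.Changed += (s, e) => PushToFast(e.FullPath);
    watcher.Deleted += (s, e) => RemoveFromFast(e.FullPath);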
You will probably want to create XML for FAST and have the Custom Deployer (or Event System) send the content to FAST.
First, create FAST XML that works, and write a sample app that inserts it into the FAST index from either a .NET or Java application. This does not yet involve Tridion.
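For instance, a minimal sketch of building such XML in C# (the element and attribute names here are placeholders, not the actual FS4SP content schema):

    using System.Xml.Linq;

    // Placeholder element/attribute names; shape this to whatever your
    // FAST content collection actually expects.
    var doc = new XElement("document",
        new XElement("field", new XAttribute("name", "title"), "Page title"),
        new XElement("field", new XAttribute("name", "url"), "/en/products/index.html"),
        new XElement("field", new XAttribute("name", "body"), "Indexable page text"));

    doc.Save("fast-feed.xml"); // then push it with the sample app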
Then write your Custom Deployer or Event System and pass the XML to FAST.
If you are taking the Custom Deployer approach, I would suggest contacting Tridion Professional Services if you have not built one before or are not a Java programmer. The new Tridion 2011 Storage API provides new opportunities for the Custom Deployer. In the meantime, I would suggest appending the FAST XML to the end of the normal page content, surrounded by some markers, and having your custom deployer pull it out of the page output, send it to FAST, then remove it from the output before continuing.
This is a fairly difficult challenge for those who do not have serious Content Delivery / Deployer / Java skills. However, if you want to tackle it yourself, I would suggest setting aside at least two weeks to research existing solutions and experiment with the API.
Using the Event System might be a little easier, but your success or failure message will not appear in the Publish Queue, and if the search index fails to update you can only log the failure, not pass the information back to users.
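If you go the Event System route, the subscription side might look roughly like this; a sketch against the 2011 TOM.NET extensibility API as I recall it (in particular, verify the PublishOrUnPublishEventArgs class name against the API reference):

    using Tridion.ContentManager.CommunicationManagement;
    using Tridion.ContentManager.Extensibility;
    using Tridion.ContentManager.Extensibility.Events;

    [TcmExtension("FastIndexingEvents")]
    public class FastIndexingEvents : TcmExtension
    {
        public FastIndexingEvents()
        {
            // React only after the publish transaction has committed
            EventSystem.Subscribe<Page, PublishOrUnPublishEventArgs>(
                OnPagePublished, EventPhases.TransactionCommitted);
        }

        private void OnPagePublished(Page page, PublishOrUnPublishEventArgs args, EventPhases phase)
        {
            // Build the FAST XML for this page and push it to the index here.
            // Failures can only be logged; they won't show in the Publish Queue.
        }
    }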
I am trying to understand different artifacts around implementing Tridion XPM. Unfortunately I did not find any article that has answers to my questions.
For example, the Content Delivery server (on WebSphere) can have four different applications: one for Content Delivery (which handles publishing), one for the web service (OData implementation), one for XPM (with Session Preview), and the actual preview application. I am assuming we can merge one or more of these applications into one for a simple implementation (with one or two consuming websites).
My questions are:
How can we simplify the number of applications (WARs) to be made? (Though I always prefer to keep the preview application separate from Tridion's framework.) What are the benefits of running the Content Delivery Session Preview web service as a standalone Java/JSP web application versus adding it to an existing application?
How can I associate/integrate the web service (OData) with XPM, or with other common basic functionality like component linking?
For example, if I publish a page without enabling OData it will output a tridion:ComponentLink tag, whereas if I enable it, it will output a tcdl:Link tag. So do I need to come up with a custom class to read these tags? That would eventually require adding the Tridion framework to the preview application. (Assuming I did not include any Tridion framework in my preview, to keep the application dependencies clean.)
I looked at Tridion's LiveContent site for more information, but I could not find much useful info for my questions.
That's a LOT of questions right there; let's see what I can do here.
You will need 3 applications:
A Deployer (standard deployment configuration)
A Staging Website (with Tridion stack + XPM Filters + Ambient Framework)
A WebService app (OData + Ambient Framework)
XPM itself runs in the Content Manager Explorer, so it's not per se a separate app, it's an extension of the Tridion Content Manager.
If you want to have your staging site separate from Tridion, then you will not be able to use "Session Preview", which in turn means you do not need OData. This, however, reverts to the SiteEdit days, where every change you make to a page requires the page to be republished (with Session Preview, changes are immediate).
You do NOT need to use OData for your Website in any way (unless you really want to). The WebService is there to support Session Preview only. I wrote down the interactions between XPM/CME/Staging site here and they're documented here.
If you publish tcdl:Link code to OData, then just use TCDL/REL; Tridion will render it for you, and you don't need to parse anything yourself.
Hope this helps...
I would like to create a workflow using SDL Tridion 2011 SP1, and I am going through the documentation in the LiveContent portal.
I have a few questions as I go through the documentation:
Can I use C# (TOM.NET) for automated activities/decisions, or should I use only VBScript (TOM)? Is there any sample code in the LiveContent portal for an automated activity/decision?
If C# (TOM.NET) is not allowed in workflows, why are its namespace/class/member references given in the TOM.NET API file?
If only VBScript is allowed in workflow, where can I get the code/TOM API reference in SDL LiveContent? As of now I don't have access to an SDL Tridion server to get the documentation from the installer package.
Can I use C# (TOM.NET) for automated activities? Or should I use only VBScript (TOM)?
You can use the TOM within your C# code to write automated activities. There is a primary interop assembly provided for that purpose (IIRC).
Is use of TOM.NET allowed in workflows?
Accessing workflow items from within existing TOM.NET code (i.e. a TBB or DataExtender) is supported. So you can query items that are in workflow, kick off workflows, etc. But using TOM.NET for writing automated workflow activities is not supported.
The reason for this has something to do with incompatible threading models from what I recall. But I mostly just took the word of the developers for it; they are bound to know better than me.
Where can I get code/TOM API reference?
API reference documentation for Tridion is not in LiveContent, but instead is delivered in CHM (or zipped JavaDoc) files. The latest documentation for the TOM API can be found in the "SDL Tridion 2009 full documentation" zip on the Tridion 2009 documentation page on SDL Tridion World (login required).
Thanks to Quirijn and Alvin for pointing this out in the comments.
You can use C# for automated tasks. Create a class and decorate it with:
[ProgId("[Namespace].[Class Name]")]
[ComVisible(true)]
Then register the assembly with RegAsm.exe using the /codebase parameter.
Then, in the script of the automated activity, you can use this object.
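Putting that together, a minimal sketch (all names hypothetical; the VBScript lines in the comment show one way the activity script might call it):

    using System;
    using System.Runtime.InteropServices;

    [ProgId("MyCompany.WorkflowActivities")]
    [ComVisible(true)]
    public class WorkflowActivities
    {
        // Called from the VBScript of the automatic activity, e.g.
        // (assuming CurrentWorkItem is available in the activity script):
        //   Set o = CreateObject("MyCompany.WorkflowActivities")
        //   o.Process(CurrentWorkItem.ID)
        public void Process(string workItemUri)
        {
            // Use the TOM interop or the Core Service here to act on the work item.
        }
    }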
Create the workflow diagram (based on the requirements) using Visio, after installing the Visio plugin.
Upload the workflow into the SDL Content Manager by providing the credentials and choosing the relevant publication.
For automated activities: create a C# class library, reference the Tridion DLLs, and use the ProgId and ComVisible(true) attributes in the solution.
Create the necessary functions for your workflow.
Register the assembly on the SDL Tridion Content Manager server.
In the workflow's "Edit script", use VBScript code to get the C# object and call its methods.
This keeps the VBScript code simple and gives the developer the flexibility to work in C#.
It is fine to use the TOM.NET API. However, consider that you will need to create Session instances, since TOM.NET for workflow won't allow you to pass a WorkItem instance from VBScript (the Code tab in Visio for automatic activities); you are forced to pass the TcmUri of that WorkItem. Creating a session is mandatory in order to get Tridion objects instantiated, since you only have a TcmUri. The recommendation here is to use a C# class registered as a COM class (using the ComVisible and ProgId attributes), but to use the Core Service for all the processing in your COM-visible class.
If you use the Core Service for processing you won't need to take care of session creation, and your code will be much faster and more scalable. You might be interested in using a TCP binding or a named pipes binding for performance, obviously.
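For illustration, a minimal sketch assuming the standard generated Core Service client and the conventional "netTcp_2011" endpoint name from a 2011 install:

    using Tridion.ContentManager.CoreService.Client;

    // Session-aware client over the TCP binding, per the performance note above
    using (var client = new SessionAwareCoreServiceClient("netTcp_2011"))
    {
        // The TcmUri of the work item's subject, passed in from the VBScript
        var item = client.Read("tcm:5-678", new ReadOptions());
        // ... inspect or update the item via the client here ...
    }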
I've not dug into the details of what and how SDL Tridion is storing data in its internal search engine (SOLR), but I need to build a GUI extension that needs to perform searching on component/metadata fields across publications.
I can't see any reason not to have a look into SOLR, but before I invest the time, does anyone know any reason why this would be a bad idea?
Thanks in advance!
It's a bad idea in general to bypass the API and directly query SOLR.
From your question, I see no reason to do so.
Do you need to index more data than what is already indexed by Tridion?
If not, surely you can just search using the API?
If you do, you could consider implementing a custom Search Indexing Handler for the additional data. Although this is not very well documented at the moment, it seems rather straightforward to create (implement ISearchIndexingHandler and update your CM and SOLR configuration). The benefit would be that your data can also be found using the standard Tridion search.
It really depends on your search requirements. If it's just about simple search, then it's probably fine; but if you want to make some Tridion-specific searches, it will be quite difficult, as SDL Tridion does a lot of post-processing on SOLR results. Why not just use the Core Service and have a convenient, supported search interface?
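For example, a minimal search sketch via the Core Service (method and data-contract names as I recall them from 2011 SP1; verify against your generated client proxy):

    using System;
    using Tridion.ContentManager.CoreService.Client;

    using (var client = new SessionAwareCoreServiceClient("netTcp_2011"))
    {
        var query = new SearchQueryData
        {
            FullTextQuery = "product specification"
            // scope by publication, item type, metadata fields, etc. as needed
        };

        foreach (var result in client.GetSearchResults(query))
        {
            Console.WriteLine(result.Title + " (" + result.Id + ")");
        }
    }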
As Peter said, it's really a bad idea to interact with the SOLR instance that comes with Tridion. Tridion has an abstraction layer to hide the complexity of the SOLR query; for example, Tridion hides the case sensitivity of the search keyword.
I strongly recommend using the Tridion search API to build your interface. The Tridion search API also supports executing a SOLR query directly, but that is not recommended.
For indexing additional data you can implement ISearchIndexingHandler. It has some complexity around the SOLR config files (adding new fields).
I am exploring whether Workflow Foundation 4.0 is stable enough to start developing on, but the documentation I've seen so far is mysteriously silent about why there are no built-in Transaction and SQL Tracking services. They were available in WF 3.5 and seemed reasonably stable. Any clues? Did MS run out of time to release WF 4.0 on schedule, or was the whole concept broken in 3.5 so they decided to scrap the services? I know there are lots of links and hints pointing to writing a custom (SQL) tracking participant, but then what is the point of a "framework"? Moreover, there's no way to query the tracked data. And nothing about a Transaction service! So how do we keep the WF persistence data and application data consistent? Am I missing something here?
Some unsatisfactory answers on "missing" SQL tracking in WF4:
- http://social.msdn.microsoft.com/Forums/en-US/wfprerelease/thread/8cfe598a-a400-4804-92ad-d68aa444d8f3
[I have a few more links, but couldn't post them here because new users can post only one hyperlink per question :( ]
Any help will be greatly appreciated :)
SQL tracking is missing; however, AppFabric does include tracking if you go the workflow services route.
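Outside AppFabric, tracking in WF4 means writing a custom tracking participant; a minimal sketch, with console output standing in for your own SQL writes:

    using System;
    using System.Activities.Tracking;

    class SqlLikeTrackingParticipant : TrackingParticipant
    {
        protected override void Track(TrackingRecord record, TimeSpan timeout)
        {
            // Replace with inserts into your own tracking tables
            Console.WriteLine(record);
        }
    }

You register it before running, e.g. workflowApplication.Extensions.Add(new SqlLikeTrackingParticipant());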
Transactions are supported. There is the TransactionScope activity for short-running transactions and a CompensatableTransaction for long-running transactions. There is also the option of creating activity extensions based on PersistenceIOParticipant, where you can save extra data during the transaction used to save the workflow.
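For the short-running case, a minimal sketch using the TransactionScope activity:

    using System.Activities;
    using System.Activities.Statements;

    // Child activities run inside a single transaction that commits
    // when the scope completes.
    Activity workflow = new TransactionScope
    {
        Body = new Sequence
        {
            Activities =
            {
                new WriteLine { Text = "inside the transaction" }
                // custom activities doing database work would go here
            }
        }
    };

    WorkflowInvoker.Invoke(workflow);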
According to MSDN, the SqlTrackingService is still supported (see the bottom of the article below):
http://msdn.microsoft.com/en-us/library/system.workflow.runtime.tracking.sqltrackingservice.aspx
You will have to add references to System.Workflow.Runtime.dll (and probably System.Workflow.ComponentModel.dll) to your project. Make sure you are targeting the full .NET 4 framework in your project properties (i.e., not the .NET 4 Client Profile). Both DLLs can be found in the v4 framework directory.
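A minimal sketch of wiring it up, assuming the WF 3.x tracking schema is already installed in your tracking database:

    using System.Workflow.Runtime;
    using System.Workflow.Runtime.Tracking;

    var runtime = new WorkflowRuntime();
    // WF 3.x-style tracking, running on .NET 4 (full framework)
    runtime.AddService(new SqlTrackingService(
        "Initial Catalog=Tracking;Data Source=.;Integrated Security=SSPI;"));
    runtime.StartRuntime();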
Background:
I am an intermediate web app developer working on the .Net Platform. Most of my work has been defined pretty well for me by my peers or superiors and I have no problem following instructions and getting the job done.
The task at hand:
I was recently asked by an old friend to redo his web app from scratch. His app is extremely antiquated and he is getting overwhelmed by it breaking all the time. The app in question is an inventory / CRM application and currently each customer requires a new install of the app (usually accomplished by deploying it on a different domain on the same server and pointing to a new database).
Currently, if any client wants modifications to the forms, such as additional fields or new features, my friend goes in and manually adds those fields to the forms, scripts, database, etc. As a result, every install of this application is unique. There is no single source repository and no single version of this app. New features are generally rolled into the other sites over time, but still on an individual, site-by-site basis.
I will be approaching this on a very modular basis. Initially I will be coding a module that will query an external web service for some data, display and store it, and periodically update it automatically. The next module will likely be for storing and displaying inventory data. In this way I want to duplicate the current feature set of his app 100% over time, but do it incrementally.
The Million Dollar Questions
I want the app to have user-configurable form fields. The user should be able to go to an admin page, create a new forms page of a certain category, and then specify which fields he wants in there. He could say "create a new text field called Item # and make it required", and that will get stored somewhere. All forms will be dynamically rendered to screen based on what the user has configured. Is this a good way to go about the problem of having no idea what a customer could want in a form, and thus be able to store and display form data of any sort? What sort of design pattern should I follow here? (See the sketch after these questions.)

I am familiar with ASP.NET and the .NET framework in general, and have decent knowledge of JavaScript, HTML, Silverlight, jQuery, C#, etc. I can work my way around web apps, but I am not sure what sort of framework or tech I should use to accomplish this task. Would ASP.NET 3.5 WebForms be the way to go, or should I look into ASP.NET MVC? Do I use jQuery and AJAX for complete decoupling of frontend and backend, or will a normal ASP.NET page with a smattering of AJAX thrown in, working with a code-behind, be the order of the day?
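To make the first question concrete, a minimal sketch of the metadata-driven approach in WebForms (FieldDefinition, LoadFieldDefinitions and FormPlaceHolder are all hypothetical names):

    using System;
    using System.Collections.Generic;
    using System.Web.UI.WebControls;

    // Hypothetical record the admin page would store for each configured field
    public class FieldDefinition
    {
        public string Name { get; set; }
        public bool Required { get; set; }
    }

    // In the form page's code-behind. Dynamic controls must be rebuilt on
    // every request, so do this in Page_Init, not just on first load.
    protected void Page_Init(object sender, EventArgs e)
    {
        foreach (FieldDefinition field in LoadFieldDefinitions()) // your data access
        {
            var box = new TextBox { ID = "fld_" + field.Name }; // sanitize names for IDs
            FormPlaceHolder.Controls.Add(box);

            if (field.Required)
            {
                FormPlaceHolder.Controls.Add(new RequiredFieldValidator
                {
                    ControlToValidate = box.ID,
                    ErrorMessage = field.Name + " is required"
                });
            }
        }
    }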
Just looking for general advice before I start.
I am currently thinking of using ASP.NET 3.5 WebForms; jQuery for client-side animation, UI, manipulation, and data validation; and SQL Server plus a .NET or WCF web service for the backend.
Your advice is much appreciated as always.
I've recently implemented a white-label ecommerce system for an insurance company that allowed each partner to choose their own set of input fields and screens, and to order the flow of the application to suit their individual needs.
Although it wasn't rocket science, it added complexity and increased development time.
Consider the user-configuration aspect very carefully. In hindsight, both my client and their clients in turn would have been happy with a more rigid system.
As for the tech side of your question: I developed my project in VS2005, using ASP.NET WebForms and web services with a SQL Server back end, so the stack you're looking at is definitely capable of delivering a working product. ASP.NET MVC will almost certainly help as far as testability goes.
The biggest thing I would change if I were starting again would be to replace the intermediate web services with message-based services using NServiceBus, MassTransit or the like. While the web services worked fine, message-based communication should be quicker and more reliable.
Finally, before you start to code, make sure that you understand the current system's functionality inside and out. If the new system doesn't do something that the old system did, it will be pretty obvious to the end users straight away.