Not directly an implementation question but hopefully someone can offer a few pointers.
I wanted to ask if by integrating SDL SmartTarget into Tridion you are effectively getting SDL Fredhopper as well?
Do you still get Fredhopper Business Manager etc?
Could you feed more traditional product data straight into Fredhopper, outside of Tridion, and then build a site that mixes personalisation/targeting via SmartTarget with a product catalogue driven by Fredhopper and all the cross-sell/up-sell/recommendation features it offers?
Cheers
Yes, SDL SmartTarget includes SDL Fredhopper, and you can indeed add data to it via the Data Manager.
The main differences between the two offerings are the connector and Ambient Data Framework support for ST queries, a set of Java taglibs/.NET controls you can use to communicate with Fredhopper, and a few other things like session preview support for FH queries (the ability to change the variables on the fly for testing).
I have to implement full-text search for a website based on the SDL Tridion WCMS. Any suggestions or ideas on how to implement full-text search using Tridion Query?
The SDL Tridion Content Delivery API is designed for retrieval of content based on system or custom metadata and/or taxonomy. The full text is not available via the API for searching. To implement a full text site search on a Tridion site it is normal to use/integrate a separate search engine, such as Google Site Search or one of the Lucene based solutions. The best integrations usually use a storage extension to notify the search indexer when content has changed.
See How can we integrate Microsoft FAST with SDL Tridion 2011 SP1? and Extending Content Delivery Storage in SDL Tridion 2011 for some ideas/examples.
If your site is accessible to a Google bot, Google Site Search is easy.
You might also look to your application stack for full text search (for instance, SQL Server full-text indexing if you are in a .NET/SQL environment).
If you want an enterprise search platform, check out the open source Solr. With Java, .NET and JavaScript APIs and a REST-based server/service, it is worth a long look.
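To give an idea of how little plumbing that takes, here is a rough .NET sketch of hitting Solr's standard select handler over HTTP. The core name "sitesearch" and the field names "body" and "title" are made up for the example; adjust them to your own schema.

    using System;
    using System.Net;

    class SolrSearchExample
    {
        static void Main()
        {
            // "sitesearch" core and "body"/"title" fields are hypothetical.
            string q = Uri.EscapeDataString("body:(tridion content delivery)");
            string url = "http://localhost:8983/solr/sitesearch/select"
                       + "?q=" + q + "&fl=title,score&wt=json";

            using (var client = new WebClient())
            {
                // Solr's select handler returns the results as JSON here.
                string json = client.DownloadString(url);
                Console.WriteLine(json);
            }
        }
    }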
Not to go too far off topic, but this helped me visualize when I was answering the same question for the first time: site search means three things. One, a search engine; two, a search schema/index (decide what the beast eats and feed it); three, a search user interface.
We are upgrading to Tridion 2011 SP1, and as part of the Tridion search implementation we are using FS4SP (FAST Search for SharePoint 2010).
In the proposed implementation, the search environment consists of the following servers:
FS4SP
FISE
Can someone guide us on how to push content to FAST from Tridion and how to retrieve it again?
(For various reasons we are not considering having FAST crawl the website.)
Which APIs can be used for this implementation?
If you don't want to use the crawling approach, you will need to create a custom deployer, please take a look at this other article:
How can we integrate Microsoft FAST with SDL Tridion 2011 SP1?
Alternatively, if you don't have a development team familiar with Java, you might consider creating a .NET application which updates your FAST index based on either a File System or Database trigger when your pages or components are published, updated or deleted from your broker repository.
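To illustrate the File System trigger idea, here is a rough .NET sketch using FileSystemWatcher. The path and filter are examples only, and the FAST hand-off itself is left as a stub, since the content API you use for it will depend on your FS4SP setup.

    using System;
    using System.IO;

    class PublishWatcher
    {
        static void Main()
        {
            // Watch the folder your deployer writes published pages into.
            var watcher = new FileSystemWatcher(@"D:\inetpub\website", "*.html")
            {
                IncludeSubdirectories = true,
                NotifyFilter = NotifyFilters.FileName | NotifyFilters.LastWrite
            };

            // Replace the Console calls with your FAST index update logic.
            watcher.Created += (s, e) => Console.WriteLine("Index: " + e.FullPath);
            watcher.Changed += (s, e) => Console.WriteLine("Re-index: " + e.FullPath);
            watcher.Deleted += (s, e) => Console.WriteLine("Remove: " + e.FullPath);

            watcher.EnableRaisingEvents = true;
            Console.WriteLine("Watching... press Enter to stop.");
            Console.ReadLine();
        }
    }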
You will probably want to create XML for FAST and have the Custom Deployer (or Event System) send the content to FAST.
First create FAST XML that works, and write a sample app so you can insert it into the FAST index from either a .NET or Java application. This does not yet involve Tridion.
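As a starting point, here is a minimal sketch of assembling such a document with LINQ to XML. The element names are placeholders; the real schema depends on how your FAST content feed is configured.

    using System.Xml.Linq;

    class FastXmlBuilder
    {
        // One indexing document per published page; element names are
        // placeholders for whatever schema your FAST feed accepts.
        static XDocument BuildDocument(string tcmUri, string title,
                                       string url, string body)
        {
            return new XDocument(
                new XElement("document",
                    new XAttribute("id", tcmUri),
                    new XElement("title", title),
                    new XElement("url", url),
                    new XElement("body", body)));
        }
    }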
Then write your Custom Deployer or Event System and pass the XML to FAST.
If you are using a Custom Deployer approach, I would suggest contacting Tridion Professional Services if you have not done this before or are not a Java programmer. The new Tridion 2011 Storage API provides new opportunities for the Custom Deployer. In the meantime, I would suggest appending the FAST XML to the end of the normal page content, surrounded by some markers, and having your custom deployer pull it out of the page output, send it to FAST, then remove it from the output before continuing.
This is a fairly difficult challenge for those who do not have serious Content Delivery / Deployer / Java skills. However, if you want to go for it yourself, I would suggest allowing at least two weeks to research existing solutions and experiment with the API.
Using the Event System might be a little easier, but your success or failure message will not appear in the Publish Queue; if the search index fails to update, you can only log the failure and cannot pass the information back to users.
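For reference, the CM-side subscription for such an Event System handler looks roughly like this in TOM.NET. This is a sketch only; verify the exact class and event-argument names against the TOM.NET API reference for your version.

    using Tridion.ContentManager.CommunicationManagement;
    using Tridion.ContentManager.Extensibility;
    using Tridion.ContentManager.Extensibility.Events;

    [TcmExtension("FastIndexNotifier")]
    public class FastIndexNotifier : TcmExtension
    {
        public FastIndexNotifier()
        {
            // Only act once the publish transaction has actually committed.
            EventSystem.Subscribe<Page, PublishOrUnPublishEventArgs>(
                OnPagePublished, EventPhases.TransactionCommitted);
        }

        private void OnPagePublished(Page page,
            PublishOrUnPublishEventArgs args, EventPhases phase)
        {
            // Hand off to your FAST feeding code here, and log failures:
            // nothing you do here shows up in the Publish Queue.
        }
    }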
I would like to create a workflow using SDL Tridion 2011 SP1, and I am going through the documentation in the LiveContent portal.
I have a few questions as I go through the documentation:
Can I use C# (TOM.NET) for automated activities/decisions, or should I use only VBScript (TOM)? Is there any sample code in the LiveContent portal for an automated activity/decision?
If C# (TOM.NET) is not allowed in workflows, why are its namespace/class/member references given in the TOM.NET API file?
If only VBScript is allowed in workflow, where can I find the code/TOM API reference in SDL LiveContent? As of now I don't have access to an SDL Tridion server to get the documentation from the installer package.
Can I use C# (TOM.NET) for automated activities? Or should I use only VBScript (TOM)?
You can use the TOM within your C# code to write automated activities. There is a primary interop assembly provided for that purpose (IIRC).
Is use of TOM.NET allowed in workflows?
Accessing workflow items from within existing TOM.NET code (i.e. a TBB or DataExtender) is supported, so you can query items that are in workflow, kick off workflows, etc. But using TOM.NET for writing automated workflow activities is not supported.
The reason for this has something to do with incompatible threading models from what I recall. But I mostly just took the word of the developers for it; they are bound to know better than me.
Where can I get code/TOM API reference?
API reference documentation for Tridion is not in LiveContent, but instead is delivered in CHM (or zipped JavaDoc) files. The latest documentation for the TOM API can be found in the "SDL Tridion 2009 full documentation" zip on the Tridion 2009 documentation page on SDL Tridion World (login required).
Thanks to Quirijn and Alvin for pointing this out in the comments.
You can use C# for automated tasks. Create a class and decorate it with:
[ProgId("[Namespace].[Class Name]")]
[ComVisible(true)]
Then register the assembly with RegAsm.exe, using the /codebase parameter.
Then, in the script of the automated activity, you can use this object.
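Putting those pieces together, a minimal sketch of such a class might look like this (the namespace, ProgId and method name are just examples):

    using System;
    using System.Runtime.InteropServices;

    namespace MyCompany.Workflow          // example namespace
    {
        [ProgId("MyCompany.Workflow.ActivityHelper")]
        [ComVisible(true)]
        public class ActivityHelper
        {
            // VBScript can only hand us the TcmUri of the work item.
            public void Process(string workItemUri)
            {
                // Do the real work here, e.g. through the Core Service.
            }
        }
    }

After building and running RegAsm.exe with /codebase, the activity script can do something like Set helper = CreateObject("MyCompany.Workflow.ActivityHelper") and then call helper.Process with the work item's TcmUri.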
1. Create the workflow diagram (based on the requirements) using Visio, after installing the Visio plugin.
2. Upload the workflow into the SDL Content Manager by providing the credentials and choosing the relevant publication.
3. For automated activities: create a C# class library, reference the Tridion DLLs, and apply the ProgId and ComVisible(true) attributes in the solution.
4. Create the necessary functions for your workflow.
5. Register the assembly on the SDL Tridion Content Manager server.
6. In the workflow "Edit script", use VBScript code to get the C# object and call its methods.
This simplifies the VBScript code and gives the developer the flexibility to work in C#.
It is fine to use the TOM.NET API. However, bear in mind that you will need to create Session instances: TOM.NET for workflow won't let you pass a WorkItem instance from VBScript (the Code tab in Visio for Automatic Activities), so you are forced to pass the TcmUri of that WorkItem. Creating a session is mandatory in order to instantiate Tridion objects when all you have is a TcmUri. The recommendation here is to use a C# class registered as a COM class (using the ComVisible and ProgId attributes), but to use the Core Service for all the processing in your COM-visible class.
If you use the Core Service for processing, you won't need to take care of session creation, and your code will be much faster and more scalable. You may want to use a TCP binding or a named pipes binding for performance, obviously.
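As a rough sketch of that pattern, assuming the standard 2011 Core Service client assembly (Tridion.ContentManager.CoreService.Client) and the netTcp endpoint name from its sample config:

    using Tridion.ContentManager.CoreService.Client;

    class WorkflowProcessor
    {
        public void Process(string workItemUri)
        {
            // Endpoint name must match your app.config; netTcp is the
            // faster option mentioned above.
            using (var client = new SessionAwareCoreServiceClient("netTcp_2011"))
            {
                var item = client.Read(workItemUri, new ReadOptions());
                // ... inspect or update the item, save via client.Update(...)
            }
        }
    }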
The MorphX report designer in AX 2009 does not seem to be the best report designer. I don't know if it is my fault, or if the MorphX report designer is too buggy to do its job.
I'm wondering if there are alternatives for building reports for AX 2009: maybe Crystal Reports and Visual Studio? Or ...?
Thanks
SSRS is the main alternative for AX 2009. You can deploy the reporting extensions and analysis cubes for some good reporting data. Analysis cubes will need to be configured to match your individual license file.
If you just want to be able to create SSRS reports, I believe you can just go to Admin > Setup > Business analysis > Reporting Services > Reporting Servers, point to your SSRS instance, and create the "Dynamics AX" data source.
You might need to do Kerberos setup too depending on your environment topology.
The best alternative option is by far Reporting Services. It is supported by Dynamics AX 2009 in that there are tools and platforms to develop reports that honor the security from within AX, plus the important ability to persist the report design back to the Application Object Tree (under Report Libraries).
How to set up and configure SSRS for Dynamics AX 2009 is a topic of its own, but there should be plenty of good resources out there to help you.
Good luck!
In addition to the previous answers, you can use any report designer you like if you are going to work with the database directly.
But be prepared that some Axapta features will not work automatically, for example labels for enum values.
A great place to start with SSRS using Visual Studio 2008 is the set of screencasts available on YouTube; just go onto YouTube and search for "AX2009 SSRS".
SSRS is fine for periodic reporting; however, "online" reporting, such as invoices, pick lists, etc. (anything printed when posting), is better handled by external software. You may wish to print to file or to a DB and use third-party software to pick up the design/formatting.
Bottomline's Create Forms is one example I have seen used. You also have workflow options, which is great when you have different companies/customers/suppliers with different requirements, and even better if you have multiple brands within the same company.
I've not dug into the details of what and how SDL Tridion stores data in its internal search engine (SOLR), but I need to build a GUI extension that performs searching on component/metadata fields across publications.
I can't see any reason not to have a look into SOLR, but before I invest the time, does anyone know any reason why this would be a bad idea?
Thanks in advance!
It's a bad idea in general to bypass the API and directly query SOLR.
From your question, I see no reason to do so.
Do you need to index more data than what is already indexed by Tridion?
If not, surely you can just search using the API?
If you do, you could consider implementing a custom Search Indexing Handler for the additional data. Although this is not very well documented at the moment, it seems rather straightforward to create (implement ISearchIndexingHandler and update your CM and SOLR configuration). The benefit would be that your data can also be found using the standard Tridion search.
It really depends on your search requirements. If it's just simple search, then it's probably fine, but if you want to do Tridion-specific searches it will be quite difficult, as SDL Tridion does a lot of post-processing on SOLR results. Why can't you just use the Core Service and have a convenient, supported search interface?
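For example, a simple full-text search through the Core Service looks roughly like this. This assumes the 2011 Core Service client; check the exact SearchQueryData properties against the API reference for your version.

    using System;
    using Tridion.ContentManager.CoreService.Client;

    class CmSearchExample
    {
        static void Main()
        {
            using (var client = new SessionAwareCoreServiceClient("netTcp_2011"))
            {
                var query = new SearchQueryData
                {
                    FullTextQuery = "product catalogue"
                    // Narrow the scope (e.g. to one publication) via the
                    // SearchIn property if needed.
                };

                foreach (var result in client.GetSearchResults(query))
                {
                    Console.WriteLine(result.Id + " : " + result.Title);
                }
            }
        }
    }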
As Peter said, it's really a bad idea to interact directly with the SOLR instance that comes with Tridion. Tridion has an abstraction layer to hide the complexity of the SOLR query; for example, Tridion hides the case sensitivity of the search keyword.
I strongly recommend using the Tridion search API to build your interface. The Tridion search API also supports executing a SOLR query directly, but that is not recommended.
For indexing additional data you can implement ISearchIndexingHandler. It has some complexity around the SOLR config files (adding new fields).