Better way to handle page that links to hundreds of binaries? - tridion

I've struggled to find a better solution for the following setup. I'm not actively working on this, but I know some who might appreciate other ways of handling it.
Setup:
Tridion-managed page has a single "linked list" component
Single component has component links to other components in Tridion
Linked-to components often link to multimedia component (mm)
An XSLT component template (XSLT CT) renders XML with above content and with links to PDF
XSL document() function used to grab embedded (linked-to) content, all content converted to XML nodes and attributes
TCMScriptAssistant namespace with publishBinary() publishes related PDF and other media
Page template just outputs the result of the CT
Business requirements:
improved publishing (last I worked on this, some of these files created a 2GB publishing transaction because of the PDFs)
published XML content file must reference the associated PDFs; hyperlinks work but identifiers might not help because of...
no Tridion content delivery APIs, mainly for independence from the storage database but also to avoid Tridion-specific code on the presentation server (loosely coupled setup and less training for developers)
The biggest issue is the huge transport package during publishing. The second problem is publishing any of the linked-to PDFs will cause the page to republish.
How could this setup be improved or re-engineered, preferably without too many changes to the existing templates? Modular templating could be considered.
Dynamic component presentations could possibly work, but would need to be published to the file system and not use dynamic linking or broker objects (e.g. no criteria filters, binary metadata, etc).

There are indeed 2 questions. I will handle them in reverse order.
To prevent the page from being republished when you publish a binary, you can use the event system in older versions of Tridion (pre-2011) to turn off link resolving, or with newer versions you can use a custom resolver. There is an article by Nuno which explains this: http://nunolinhares.blogspot.com/2011/10/tridion-publisher-and-custom-resolvers.html
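A minimal sketch of such a custom resolver (TOM.NET, Tridion 2011+), assuming the goal is to drop any pages that link resolving pulls in when a multimedia component (the PDF) is published; verify the exact API against your version, and register the class in Tridion.ContentManager.config as described in Nuno's article:

    using System.Collections.Generic;
    using Tridion.Collections;
    using Tridion.ContentManager;
    using Tridion.ContentManager.CommunicationManagement;
    using Tridion.ContentManager.ContentManagement;
    using Tridion.ContentManager.Publishing;
    using Tridion.ContentManager.Publishing.Resolving;

    public class SkipPageRepublishResolver : IResolver
    {
        public void Resolve(IdentifiableObject item,
                            ResolveInstruction instruction,
                            PublishContext context,
                            ISet<ResolvedItem> resolvedItems)
        {
            // Only intervene when the published item is a multimedia component (e.g. a PDF).
            Component component = item as Component;
            if (component == null || component.ComponentType != ComponentType.Multimedia)
                return;

            // Collect any pages that link resolving added, then remove them;
            // the binary itself still gets published.
            List<ResolvedItem> pagesToRemove = new List<ResolvedItem>();
            foreach (ResolvedItem resolved in resolvedItems)
            {
                if (resolved.Item is Page)
                {
                    pagesToRemove.Add(resolved);
                }
            }
            foreach (ResolvedItem page in pagesToRemove)
            {
                resolvedItems.Remove(page);
            }
        }
    }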
Your second one is a bit tougher, in no small part because of your criteria of not using the SDL Tridion Content Delivery APIs. I would have suggested publishing the binaries separately (this would keep the size of your transaction package down) and using binary linking to resolve the paths at request time.
Given this is not an option, I think the only way I would approach it would be to still use dynamic component presentations, use predictable, unique file names for the PDFs (e.g. something like 317-12345.pdf, based on the item's URI), and use one directory for all the binaries. That way you can write the paths to the binaries from your XSLT template, as you know where the binaries will be located later. You could then use a custom resolver to publish the binaries when you publish the main list component or page.
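A hypothetical helper to illustrate the predictable-name idea (the directory and naming scheme here are assumptions, not anything Tridion mandates):

    // Derive a stable binary path from a multimedia component's TCM URI,
    // e.g. "tcm:317-12345" -> "/binaries/317-12345.pdf", so the XSLT can
    // emit the link before the binary is actually published.
    static string PredictableBinaryPath(string tcmUri, string extension)
    {
        string id = tcmUri.Replace("tcm:", string.Empty);
        return string.Format("/binaries/{0}.{1}", id, extension);
    }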
Hope that helps
Chris

Related

Business Central Identifying Object form vs Extension Form

I'm also one of many who have begun development work in Business Central. I'm currently in charge of migrating from C/SIDE to AL. My question is: is there a way to identify whether something is in Object form or Extension form? The documentation I have from a third-party vendor says:
"All of company XYZ's products are available in both Object form and in Extension form. Existing customers who want to migrate from the Object version of a solution to the App version will need to go through a migration...."
First a little clarification:
Object form means that the modifications have been done through C/Side.
Extension form means that the modifications are isolated within their own package with one or more dependencies to other extensions. These are not visible in the C/Side Object Designer.
When modifications are done through C/Side the system generates symbols to simulate the extension interface. This provides the needed features to extend C/Side objects.
The easiest way to determine whether a modification is in Object or Extension form is to check which extensions are installed on the system. This can be done in two ways:
In the Business Central client, go to the Extension Management page. Here all installed extensions will be listed (apart from a few hidden Microsoft extensions that you need not worry about).
Run the command Get-NAVAppInfo through PowerShell. This will list all installed extensions on the requested tenant.
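For example, something like Get-NAVAppInfo -ServerInstance BC -Tenant default (the server instance and tenant names here are placeholders for your own) lists the apps installed for that tenant.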

Update Resx files at runtime in .Net Core

My goal is to make resource values changeable and give the admin the ability to maintain the languages through the portal. In order to do that, I need to be able to change resx files at runtime, because all the values are stored in the resx files. I have 3 resx files, one for each language. In my case I want the translations to be maintainable by an admin at runtime; for example, the admin can add, edit or delete language entries at runtime.
As @Xerillio mentioned in his comment, this is a lot of effort.
Recently I've created a NuGet package that may save you time and effort. Have a look at XLocalizer: it creates resources, uses online translation services to auto-translate the missing resources, saves them in XML or a DB, and provides an easy interface to export them to RESX. Finally, you may look at XLocalizer.Samples, which contains sample setups for different scenarios.
If you need another file/DB type to store the resources, you may create your custom resource provider and register it in startup.
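A minimal sketch of that idea, assuming nothing about XLocalizer's actual extension API (the class name, file name and XML shape here are purely illustrative): keep the translations in a plain XML file that an admin-facing page can read and update at runtime, which is the part a compiled resx cannot do.

    using System.Linq;
    using System.Xml.Linq;

    // Illustrative only: a tiny XML-backed store that a custom resource
    // provider could sit on top of.
    public class XmlResourceStore
    {
        private readonly string _path;

        public XmlResourceStore(string path)
        {
            _path = path; // e.g. "Resources/LocSource.en.xml" (hypothetical name)
        }

        public string Get(string key)
        {
            var doc = XDocument.Load(_path);
            return doc.Root?
                .Elements("entry")
                .FirstOrDefault(e => (string)e.Attribute("key") == key)?
                .Value;
        }

        // Admins can add or edit an entry at runtime; the file is saved immediately.
        public void Set(string key, string value)
        {
            var doc = XDocument.Load(_path);
            var entry = doc.Root
                .Elements("entry")
                .FirstOrDefault(e => (string)e.Attribute("key") == key);
            if (entry == null)
            {
                entry = new XElement("entry", new XAttribute("key", key));
                doc.Root.Add(entry);
            }
            entry.Value = value;
            doc.Save(_path);
        }
    }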
The DB sample provides a UI to edit resources. With the XML sample I didn't create a UI for editing resources; instead you can download the XML, make any corrections, then upload it and use the built-in exporter to export to RESX.
With this NuGet package, all I have to do to add a new culture, even at runtime, is add the culture name to the supported cultures list and then correct the automatic translations; all the rest is handled by XLocalizer.
Notice: it was not possible to put all this in a comment, that's why I posted it as an answer :)

Is there a way to link ASP.NET Web Forms aspx files transparently? (using the file system or some other way)

We have a legacy ASP.NET 2.0 Webforms Web Site that we need to extend. It is poorly designed, and the architecture forces many files to be duplicated.
Whenever we add new functionality we are forced to physically copy our pages and code-behind for the "Weekly" versions of our products into other folders for our "Daily" products. For obvious reasons this makes updates very annoying, since we need to remember to update 2 copies of each file. Although the Weekly and Daily versions can differ, this rarely happens, so they usually have to be identical and exactly in sync.
Is there a way to create links/shortcuts in Windows or Visual Studio, so that we only need to create pages for our "Weekly" products, and if a page is requested from the corresponding "Daily" product, ASP.NET would transparently serve the "Weekly" page unless we physically substitute a modified "Daily" page? Bonus points if we can fool VS 2012 as well.
Clarification 1: We have folders like /Products/ProductAMonthly/Price.aspx and /Products/ProductADaily/Price.aspx. The products are set up in a config file, and a framework does the routing. Unfortunately, the config file forces each product into a separate folder on the server, so we can't get the config file to reuse pages.
We have also refactored into base classes, and could perhaps refactor some more, but this doesn't get rid of the need for identical pages to exist in the folders defined in the config file.
If a daily product is supposed to display the same data as the weekly one, you could do a Server.Transfer("Url for Weekly Product") in the code-behind for the Daily product.
In the event the daily product is different from the weekly, you don't use a Server.Transfer, and you implement the desired daily business logic.
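A minimal sketch of the Server.Transfer case, using the example paths from the question (the page and folder names are the ones given there, not a prescription):

    using System;
    using System.Web.UI;

    // Code-behind of /Products/ProductADaily/Price.aspx
    public partial class Price : Page
    {
        protected void Page_Load(object sender, EventArgs e)
        {
            // Hand the request to the master copy of the page on the server side,
            // without a client-visible redirect.
            Server.Transfer("~/Products/ProductAMonthly/Price.aspx");
        }
    }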
I eventually used a Windows feature called Symbolic Links (or Junctions). ASP.NET/IIS seems to serve these up without problems. We have a batch file that uses mklink to create these on each developer's machines. We also added the junctioned files to our source control's ignore file. It seems to work well.
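For anyone repeating this: a junction can be created with something like mklink /J "C:\inetpub\wwwroot\Products\ProductADaily" "C:\inetpub\wwwroot\Products\ProductAMonthly" from an elevated command prompt (the paths are illustrative, based on the folders in the question). /J creates a directory junction; /D would create a true directory symbolic link instead.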

Tridion 2011 SP1 HR1 - which extension to use?

We have a requirement that on a page publish, we need to:
Find a component presentation that has a component based upon a particular schema.
Extract certain field values from that component and store them in a custom database table that's available to our .NET application (on the Content Delivery side).
I think this is a good candidate for either a Deployer extension or a Storage extension, but I'm a little unclear which to use and why, having never written either.
I've ruled out the Event System as this kind of code would be located on the CM, which seems like the wrong "side" to me - my focus is on extending what happens on the CD-side after a page is published.
I've read a few articles on Tridion World (this, this, this and this) and I think a storage extension would be the better choice?
Mihai's article seems to be very close to what we need, where he uses a new item type mapping:
<ItemTypes defaultStorageId="brokerdb" cached="true">
    <Item typeMapping="PublishAction" cached="false" storageId="searchdb" />
</ItemTypes>
But how does Tridion "know" to use this new item type when content is published? (It's not one of the defined TYPE_NAMEs, which is kind of the point.)
I should clarify I'm a .NET/C# dev not a Java dev so this is probably really obvious to Java people - apologies if it is!
Cheers
Tridion will not know by default how to deploy your new entity. My advice is to create a Deployer Module (your links should give you enough information about how to do that) that executes in the post-processing phase of the deployment process, processes all components from the deployment/transport package, extracts the needed information, and uses a custom Storage Extension to store it.
Be careful: you need to set up your new type in the configuration, but you also need to invoke it yourself from that Deployer Module.
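A skeletal sketch of such a Deployer Module, based on the commonly documented Tridion 2011 extension pattern (verify the class and method signatures against your Content Delivery version; "PublishActionDAO" is a hypothetical name for your custom storage extension, and the extraction logic is left as comments):

    import com.tridion.configuration.Configuration;
    import com.tridion.configuration.ConfigurationException;
    import com.tridion.deployer.Module;
    import com.tridion.deployer.ProcessingException;
    import com.tridion.deployer.Processor;
    import com.tridion.transport.transportpackage.TransportPackage;

    public class PublishActionModule extends Module {

        public PublishActionModule(Configuration config, Processor processor)
                throws ConfigurationException {
            super(config, processor);
        }

        @Override
        public void process(TransportPackage pkg) throws ProcessingException {
            // 1. Walk the transport package and find the component presentations
            //    whose component is based on the schema you care about.
            // 2. Extract the required field values from the component XML.
            // 3. Pass them to your custom storage extension (e.g. a hypothetical
            //    PublishActionDAO registered against the "PublishAction" type
            //    mapping shown above) so they end up in your custom table.
        }
    }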
Hope this helps.

Need to get the XML of the version of a component that is published

We are iterating over the components in a folder in Tridion 2011 and creating our own custom XML, to be used on the CDS, based on the publishing status of each component. The example below should make the problem clear.
Suppose we have 10 components in a folder which are all published, and we publish our XML; the XML gets generated for the 10 items.
Now we make a change in one of the components and don't publish it.
After modifying the component, we publish the XML again. The XML then gets updated for the modified component as well, which creates a difference between the published version of that component and the version in our XML.
So I want to publish the custom XML in such a way that it only contains data that is in sync with the published version of each component.
So you want to:
determine the XML of the Component that was last published
determine the changes between that XML and the current XML of the Component
only publish the changes
Tridion doesn't keep track of the version that was published (on the Content Manager at least). So the closest you can do is find out when the Component was last published and retrieve the XML of that time. This question is a great starting point for more information on that approach. Based on that XML you can then do steps 2 and 3 above.
Alternatively you can keep a snapshot of the XML that you published "somewhere" (for example in Application Data) when you're rendering the Component. Then when the Component gets published next time, you can retrieve that XML and do steps 2 and 3 above.
Note that with any of those solutions you should really wonder if you should be implementing it to begin with. You are overriding some of Tridion's default rendering behavior and circumventing part of its architecture (a clear, explicit disconnect between Content Management and Content Delivery, with the former knowing "nothing" about the latter) and anything you do will come back to haunt you in time. In this use-case you have to wonder what will happen when the CDS and TCM get out of sync. Simply republishing the content suddenly won't be good enough anymore, since your code will be in there deciding that "nothing changed since last publish, so we'll publish nothing".
Please forgive me if I jump to conclusions, but I strongly feel this question has arisen from a lack of understanding of Tridion. Publishing in Tridion does more than just raise a flag to indicate the item is 'published', in other words ready to be shown to the outside world. I know this is how some (many) content management systems operate (which may explain why you are asking this question).
In Tridion, however, publishing means that the item is actually - physically - transferred from the content management environment to the content delivery environment. This environment always contains versions of your content that represent the state when the item was last published - simply because it was the very act of publishing that created them.
In my opinion, what you are really asking is how to rebuild this publishing functionality. This is never a good idea. Instead, you should take Bart's comment seriously and look at one of the content delivery APIs that Tridion has on offer (the broker API or the OData web service). Optionally you might want to look into DD4T, which is built on top of the broker and exposes the full Tridion data model.
Then your solution is to
Write an event handler on the Publish Transaction Save event
Which saves the publish info (version data) to Application Data of the published Component
I'm mentioning the Publish Transaction Save event because from there you can ensure that the publish info is only saved when the transaction is successful.
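A rough sketch of that event handler (TOM.NET event system; the subscription pattern is the standard one, but exactly what you store and how you locate the published Components from the transaction is left as comments since it depends on your setup):

    using Tridion.ContentManager.Extensibility;
    using Tridion.ContentManager.Extensibility.Events;
    using Tridion.ContentManager.Publishing;

    [TcmExtension("PublishInfoRecorder")]
    public class PublishInfoRecorder : TcmExtension
    {
        public PublishInfoRecorder()
        {
            // TransactionCommitted ensures we only react once the save has really happened.
            EventSystem.Subscribe<PublishTransaction, SaveEventArgs>(
                OnPublishTransactionSave, EventPhases.TransactionCommitted);
        }

        private void OnPublishTransactionSave(PublishTransaction transaction,
                                              SaveEventArgs args,
                                              EventPhases phase)
        {
            // Only record publish info for successful transactions.
            if (transaction.State != PublishTransactionState.Success)
                return;

            // For each published Component in the transaction, save the version
            // that was rendered to its Application Data (or, as suggested below,
            // to a separate database if this information is critical).
        }
    }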
Also be aware that this publish info can go out of sync when the event handler fails to execute, and you might lose all of the application data when moving to another environment.
So when this information is absolutely crucial I would save it to a separate database, and not to Application Data.
