Publishing many items uses a lot of memory - tridion

I have a problem similar to this, although hopefully I've narrowed it down a little.
I am currently testing the publishing part of my Tridion 2011 installation (Content Manager and Content Delivery on separate servers). When I try to publish many pages at once (1000 in my case), the dllhst3g.exe *32 process starts to acquire a large amount of memory. This is understandable, as there is a lot of work to do, but it never releases the memory, which causes the Content Manager to run slowly and eventually crash.
I was getting out-of-memory exceptions thrown by the Tridion Content Manager when the dllhst3g.exe *32 process had around 3.6 GB of memory (I assume because it is a 32-bit process). To prevent these, I limited the SDL Tridion Content Manager COM+ application to an arbitrary 500 MB per process, after which the process is forcibly terminated 15 minutes after the limit is reached. This has stopped the out-of-memory exceptions for now, but I still have performance issues, and the potential to run out of memory if more than my server's 8 GB is allocated within the 15-minute window. More about this here.
I have ruled out the underlying page template code, as the associated page template has no code at all; these pages are blank.
The issue also seems to be much more prominent when using the Core Service API. The code I am using is:
using (var client = new Tridion2011CoreService.CoreServiceClient())
{
    foreach (var id in ids) // ids is a collection of 500 page ids
    {
        // publishing to staging and live
        var targets = new string[] { "tcm:0-7-65538", "tcm:0-8-65538" };
        var publishInstructionData = new PublishInstructionData();
        publishInstructionData.ResolveInstruction = new ResolveInstructionData();
        publishInstructionData.RenderInstruction = new RenderInstructionData();
        var readOptions = new ReadOptions();
        client.Publish(new string[] { id }, publishInstructionData, targets, PublishPriority.Normal, readOptions);
    }
}
(I realise I could send all the IDs through in a single call to Publish, but then I hit a message-size limit error, and as far as I am aware the result of multiple calls and a single call is effectively the same.)
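For reference, the middle ground I have in mind is batching (untested sketch, reusing the client, targets and instruction objects from the loop above; the batch size of 50 is an arbitrary guess to stay under the message-size limit):

// Rough sketch: publish in batches instead of one-by-one.
// Requires: using System.Linq; assumes ids is an IList<string>.
const int batchSize = 50; // arbitrary; tune against the WCF binding's message-size limits
for (int offset = 0; offset < ids.Count; offset += batchSize)
{
    var batch = ids.Skip(offset).Take(batchSize).ToArray();
    client.Publish(batch, publishInstructionData, targets, PublishPriority.Normal, readOptions);
}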
Any ideas?
(let me know if I have missed out any details and I'll update the question accordingly)
Forgot to add: the Content Manager is installed on a virtualised Windows Server 2008 machine with 4 CPUs, 8 GB RAM and 50 GB disk space. Content Delivery is installed on a separate server with the same specs.

After contacting SDL customer support I was told that the only solution is to recycle the dllhost process when it takes up a pre-determined amount of memory.
To do this for SDL Tridion Content Manager 2011 running on Windows Server 2008:
Open Component Services (either search for it or Control Panel -> Administrative Tools)
Expand the tree Component Services -> Computers -> My Computer -> COM+ Applications
Right click SDL Tridion Content Manager and select Properties
Select the Pooling & Recycling tab
Set Pool Size to 1
Set Lifetime limit to 0
Set Memory Limit to 524288 KB (512 MB), or choose your own limit (I'd go for a value less than 1 GB)
Set Call Limit to 0
Set Activation Limit to 0
If you can, restart your system. Otherwise, restart all the Tridion Services (Control Panel -> Administrative Tools -> Services and restart everything that starts with "Tridion")
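If you want to script these settings rather than click through Component Services (e.g. for repeatable installs), the COM+ catalog can be driven from code. Below is a rough C# sketch using the COMAdmin type library; the catalog property names (ConcurrentApps, RecycleMemoryLimit, and so on) are my assumption based on the COM+ admin documentation, so verify them before relying on this:

// Hedged sketch: apply the recycling settings above programmatically.
// Add a COM reference to "COM+ 1.0 Admin Type Library" (COMAdmin).
using COMAdmin;

class RecyclingConfigurator
{
    static void Main()
    {
        var catalog = new COMAdminCatalog();
        var applications = (COMAdminCatalogCollection)catalog.GetCollection("Applications");
        applications.Populate();
        foreach (COMAdminCatalogObject app in applications)
        {
            if ((string)app.Name != "SDL Tridion Content Manager") continue;
            app.Value["ConcurrentApps"] = 1;          // Pool Size
            app.Value["RecycleLifetimeLimit"] = 0;    // Lifetime Limit (minutes)
            app.Value["RecycleMemoryLimit"] = 524288; // Memory Limit (KB) = 512 MB
            app.Value["RecycleCallLimit"] = 0;
            app.Value["RecycleActivationLimit"] = 0;
        }
        applications.SaveChanges();
    }
}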
Maybe the "real" fix is but a patch away...

Try monitoring the transport package root folder. This is set in the SDL Tridion MMC snap-in; the default value is C:\temp on the CMS server. Also check the transport's work area folder, which is set in the cd_transport configuration XML under the Tridion installation's Config folder, and the deployer's incoming and work folders. A large volume of files accumulating in these folders can cause poor performance; if you have errored-out folders or files, clean them up to prevent Tridion from retrying the failed transactions. Also, if your deployer is running as a website (httpupload.asp), verify the deployer's application domain recycle settings, just to make sure you're not resetting the AppDomain every few minutes.
You could also look into the related items of the items you publish. If you have a lot of versions or unwanted related items, you could consider custom resolving of the published items, to avoid including unwanted components with a particular schema, etc. Nuno Linhares has provided a very good article on this: http://nunolinhares.blogspot.nl/2011/10/tridion-publisher-and-custom-resolvers.html
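For a flavour of what such a custom resolver looks like, here is a rough sketch that drops components of one schema from the resolved set. The IResolver interface, namespaces and ResolvedItem members are quoted from memory of the Tridion 2011 API, and the schema title is a hypothetical placeholder; treat Nuno's article as the authoritative reference:

// Hedged sketch of a Tridion 2011 custom resolver; verify the API against the article above.
// The resolver is registered in Tridion.ContentManager.config (resolving/mappings).
using System.Collections.Generic;
using System.Linq;
using Tridion.ContentManager;
using Tridion.ContentManager.ContentManagement;
using Tridion.ContentManager.Publishing;
using Tridion.ContentManager.Publishing.Resolving;

public class SkipSchemaResolver : IResolver
{
    public void Resolve(IdentifiableObject item, ResolveInstruction instruction,
                        PublishContext context, ISet<ResolvedItem> resolvedItems)
    {
        // "UnwantedSchema" is a placeholder for the schema you want to exclude.
        var unwanted = resolvedItems
            .Where(r => r.Item is Component
                        && ((Component)r.Item).Schema.Title == "UnwantedSchema")
            .ToList();
        foreach (var resolvedItem in unwanted)
        {
            resolvedItems.Remove(resolvedItem);
        }
    }
}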

Related

autorun a program on client in LAN

I am doing a project on the LAN of an internet cafe. I have a scenario in which nine nodes and a server are attached to a switch. I have a "reboot and restore" application hosted on the server, so when a client node reboots, the application on the server auto-runs and restores the settings on the client node.
I want to know which application server should be used for this particular restore-and-reboot application with auto-start settings?
As far as the auto-start feature is concerned, in my experience you can add any application to the startup calls of Windows (assuming you are a Windows user), and the procedure is not that tough:
Go to Start -> Programs -> Startup (the idea is to open this folder for the required user).
Paste a shortcut to your application executable into it.
The next time Windows logs in, it will load all the applications present in this folder (and others coming from various Registry entries), and thus your application will auto-start.
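If you would rather wire this up in code (the Registry entries route just mentioned), here is a minimal C# sketch using the per-user Run key; the value name and application path are hypothetical placeholders:

// Minimal sketch: register an application to auto-start at login via the HKCU Run key.
using Microsoft.Win32;

class AutoStartInstaller
{
    static void Main()
    {
        // Windows reads this key at login for per-user auto-start entries.
        using (RegistryKey run = Registry.CurrentUser.OpenSubKey(
            @"Software\Microsoft\Windows\CurrentVersion\Run", writable: true))
        {
            // Value name and path are placeholders for your own application.
            run.SetValue("RebootRestore", @"C:\Program Files\RebootRestore\client.exe");
        }
    }
}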
As for the restore part: I have seen a few cafes/offices using an application named "Freeze". What it does is create a snapshot of the user settings; each time the system logs in, the application loads and restores that snapshot. Thus you always have predefined settings applied, which fits scenarios such as yours.
Hope this helps.

Timeout when uploading images

I am currently testing Tridion 2011 and am having problems creating multimedia components with uploaded content (as opposed to external).
I fill out the title, schema and multimedia type, select a file from my system, then click save. I get a "Saving item..." information message, then approximately 30 seconds later I receive a "The wait operation timed out" message.
There don't appear to be any error messages in the C:\Program Files (x86)\Tridion\log directory. Looking at the Event Viewer, I see the following information relating to the save action:
Unable to save Component (tcm:4-738361).
The wait operation timed out
Error Code:
0x8004033F (-2147220673)
Call stack:
System.Data.SqlClient.SqlConnection.OnError(SqlException,Boolean,Action`1)
System.Data.SqlClient.SqlInternalConnection.OnError(SqlException,Boolean,Action`1)
System.Data.SqlClient.TdsParser.ThrowExceptionAndWarning(TdsParserStateObject,Boolean,Boolean)
System.Data.SqlClient.TdsParser.TryRun(RunBehavior,SqlCommand,SqlDataReader,BulkCopySimpleResultSet,TdsParserStateObject,Boolean&)
System.Data.SqlClient.SqlCommand.FinishExecuteReader(SqlDataReader,RunBehavior,String)
System.Data.SqlClient.SqlCommand.RunExecuteReaderTds(CommandBehavior,RunBehavior,Boolean,Boolean,Int32,Task&,Boolean)
System.Data.SqlClient.SqlCommand.RunExecuteReader(CommandBehavior,RunBehavior,Boolean,String,TaskCompletionSource`1,Int32,Task&,Boolean)
System.Data.SqlClient.SqlCommand.InternalExecuteNonQuery(TaskCompletionSource`1,String,Boolean,Int32,Boolean)
System.Data.SqlClient.SqlCommand.ExecuteNonQuery()
Tridion.ContentManager.Data.AdoNet.Sql.SqlDatabaseUtilities.SetBinaryContent(Int32,Stream)
Tridion.ContentManager.Data.AdoNet.ContentManagement.ItemDataMapper.Tridion.ContentManager.Data.ContentManagement.IItemDataMapper.SetBinaryContent(Stream,TcmUri)
Tridion.ContentManager.ContentManagement.RepositoryLocalObject.SetBinaryContent(BinaryContent)
Tridion.ContentManager.ContentManagement.Component.OnSaved(SaveEventArgs)
Tridion.ContentManager.IdentifiableObject.Save(SaveEventArgs)
Tridion.ContentManager.ContentManagement.VersionedItem.Save(Boolean)
Tridion.ContentManager.ContentManagement.VersionedItem.Save()
Tridion.ContentManager.BLFacade.ContentManagement.VersionedItemFacade.UpdateAndCheckIn(UserContext,String,Boolean,Boolean)
XMLState.Save
Component.Save
I already have my timeout settings in the Content Manager Snap-In set to high values (more than 10 minutes) due to another issue.
The BINARIES table in the Content Manager database is 25 GB, if that helps.
Any ideas? Thanks.
Edit 1
Following suggestions from Bart Koopman, my DBA has rebuilt the indexes but does not reckon the transaction log has any impact on performance. The problem persists.
Edit 2
I have just found more details of the error
Unable to save Component (tcm:0-0-0).
Timeout expired.
The timeout period elapsed prior to completion of the operation or the server is not responding.
A database error occurred while executing Stored Procedure "EDA_ITEMS_UPDATEBINARYCONTENT".EDA_ITEMS_UPDATEBINARYCONTENT
After taking a look at this procedure, it looks like the following statement could be the root cause:
SELECT 1 FROM BINARIES WHERE ID = @iBINARY_ID AND CONTENT IS NULL
I executed it manually with @iBINARY_ID as -1 and after 2 minutes it still hadn't completed. I assume that when I insert a new multimedia component the query will be something similar (i.e. the ID will not exist in the table).
The BINARIES table currently has a NON-CLUSTERED Primary Key. Maybe the solution would be to change this to a CLUSTERED Primary Key? However, I assume it is NON-CLUSTERED for a reason.
Just had a response from SDL customer support. Apparently this is a known issue related to stale statistics and the chosen query plan.
Running the following statement manually from SQL Server Management Studio fixes the problem (it didn't even need to complete for me), presumably because compiling it forces a fresh query plan or triggers an automatic statistics update:
SELECT 1 FROM BINARIES WHERE ID = -1 AND CONTENT IS NULL
Hope this helps someone else out!
Timeouts on database operations are usually an indication of a misconfiguration or a lack of maintenance. By increasing the timeout you are just working around the problem rather than solving it.
With a BINARIES table that big, you will want to make sure you have a proper database setup, with data files separated from your log files (on different physical partitions/disks), and possibly even multiple data files on multiple physical partitions to take advantage of performance gains.
Next to that, you will want to ensure that standard database maintenance is performed daily/hourly. Things like backing up and truncating the transaction log every hour will greatly improve your database performance (on MS SQL Server, a transaction log of more than 1 GB slows the database down drastically; you should always try to keep it below that size through timely backup/truncation). Updating statistics and rebuilding indexes are also things you should not forget to do on a regular basis.

Plugging another relational DB to OpenDS

Currently I'm working on a project with OpenDS. I have to upload more than 200k entries into OpenDS, but unfortunately it fails at random times once the import exceeds 10k-15k entries.
When I google that particular error (alert ID 9896233: JE Database Environment corresponding to backend id userRoot is corrupt. Restart the Directory Server to reopen the Environment), it seems like the OpenDS backend DB (Berkeley DB) is not that reliable when adding a massive number of entries. How can I plug a reliable commercial or open-source relational DB (Oracle/H2) into OpenDS? Is there some configuration for this, or do I have to change the OpenDS code?
First you should be aware that Oracle has pulled the plug on the OpenDS project and it is now completely stalled. Development continues as open source in the OpenDJ project: http://opendj.forgerock.org.
That said, I believe there is a problem with your environment. When I was still working on OpenDS, our basic stress test was importing and running very high load against 10 million users. 200K entries is not a massive number. My daily OpenDJ tests on my laptop are done with 100K to 1M entries. We have customers running OpenDJ in production with more than 20M entries, growing 40% every 6 months!
Berkeley DB has proved to be very scalable and reliable.
Things you might want to check: what is the maximum number of files that can be opened by a single process on your machine (ulimit -n on Linux)? Linux defaults to 1024, and that limit is easy to hit with OpenDS or OpenDJ. Are you using a local filesystem? Berkeley DB is not supported on networked filesystems such as NFS or other NAS.
Finally, check the logs/errors file and your system logs. Chances are that one of them contains a message with the root cause of the problem (most likely logs/errors).
Kind regards,
Ludovic Poitou
ForgeRock - Product Manager for OpenDJ

Adobe AIR HTTP Connection Limit

I'm working on an Adobe AIR application which can upload files to a web server running Apache and PHP. Several files can be uploaded at the same time, and the application also calls the web server for various API requests.
The problem I'm having is that if I start two file uploads, any other HTTP requests made while the uploads are in progress will time out, which is a problem both for the application and from the user's point of view.
Are Adobe AIR applications limited to 2 HTTP connections, or is something else probably the issue?
From searching about this issue I've not found much, but one article did indicate that it isn't limited to just two connections.
The file uploads are performed by calling the File class's upload method, and the API calls are done using the HTTPService class. The development web server I am using is a WAMP server; when the application is released it will be talking to a LAMP server.
Thanks,
Grant
Here is the code I'm using to upload the file:
protected function btnAddFile_clickHandler(event:MouseEvent):void
{
    // Create a new File object and display the browse file dialog
    var uploadFile:File = new File();
    uploadFile.browseForOpen("Select File to Upload");
    uploadFile.addEventListener(Event.SELECT, uploadFile_SelectedHandler);
}

private function uploadFile_SelectedHandler(event:Event):void
{
    // Get the File object which was used to select the file
    var uploadFile:File = event.target as File;
    uploadFile.addEventListener(ProgressEvent.PROGRESS, file_progressHandler);
    uploadFile.addEventListener(IOErrorEvent.IO_ERROR, file_ioErrorHandler);
    uploadFile.addEventListener(Event.COMPLETE, file_completeHandler);

    // Create the request URL based on the download URL
    var requestURL:URLRequest = new URLRequest(AppEnvironment.instance.serverHostname + "upload.php");
    requestURL.method = URLRequestMethod.POST;

    // Set the post parameters
    var params:URLVariables = new URLVariables();
    params.name = "filename.ext";
    requestURL.data = params;

    // Start uploading the file to the server
    uploadFile.upload(requestURL, "file");
}
Here is the code for the API calls:
private function sendHTTPPost(apiFile:String, postParams:Object, resultCallback:Function, initialCallerResultCallback:Function):void
{
    var httpService:mx.rpc.http.HTTPService = new mx.rpc.http.HTTPService();
    httpService.url = AppEnvironment.instance.serverHostname + apiFile;
    httpService.method = "POST";
    httpService.requestTimeout = 10;
    httpService.resultFormat = HTTPService.RESULT_FORMAT_TEXT;
    httpService.addEventListener("result", resultCallback);
    httpService.addEventListener("fault", httpFault);

    var token:AsyncToken = httpService.send(postParams);

    // Add the initial caller's result callback function to the token
    token.initialCallerResultCallback = initialCallerResultCallback;
}
If you are on a Windows system, Adobe AIR uses Microsoft's WinINet library to access the web. By default this library limits the number of simultaneous connections to a single server to two:
WinInet limits the number of simultaneous connections that it makes to a single HTTP server. If you exceed this limit, the requests block until one of the current connections has completed. This is by design and is in agreement with the HTTP specification and industry standards.
... Connections to a single HTTP 1.1 server are limited to two simultaneous connections
There is an API to change the value of this limit, but I don't know whether it is accessible from AIR.
Since this limit also affects page-loading speed for websites, some sites use multiple DNS names for artifacts such as images, JavaScript files and stylesheets, allowing the browser to open more parallel connections.
So if you control the server side, a workaround could be to create DNS aliases, e.g. www.example.com for uploads and api.example.com for API requests.
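For completeness: on the WinINet side the limit can be raised per user through the well-known Internet Settings registry values (MaxConnectionsPerServer / MaxConnectionsPer1_0Server). Whether AIR actually honours these values is something I have not verified, so treat this C# sketch as an experiment rather than a fix:

// Hedged sketch: raise WinINet's per-server connection limits for the current user.
using Microsoft.Win32;

class WinInetConnectionLimit
{
    static void Main()
    {
        using (RegistryKey key = Registry.CurrentUser.CreateSubKey(
            @"Software\Microsoft\Windows\CurrentVersion\Internet Settings"))
        {
            key.SetValue("MaxConnectionsPerServer", 8, RegistryValueKind.DWord);
            key.SetValue("MaxConnectionsPer1_0Server", 8, RegistryValueKind.DWord);
        }
    }
}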
So as I was looking into this, I came across this info about using File.upload() in the documentation:
Starts the upload of the file to a remote server. Although Flash Player has no restriction on the size of files you can upload or download, the player officially supports uploads or downloads of up to 100 MB. You must call the FileReference.browse() or FileReferenceList.browse() method before you call this method.
Listeners receive events to indicate the progress, success, or failure of the upload. Although you can use the FileReferenceList object to let users select multiple files for upload, you must upload the files one by one; to do so, iterate through the FileReferenceList.fileList array of FileReference objects.
The FileReference.upload() and FileReference.download() functions are nonblocking. These functions return after they are called, before the file transmission is complete. In addition, if the FileReference object goes out of scope, any upload or download that is not yet completed on that object is canceled upon leaving the scope. Be sure that your FileReference object remains in scope for as long as the upload or download is expected to continue.
I wonder if something there could be giving you issues with uploading multiple files. I see that you are using browseForOpen() instead of browse(). It seems like they probably do the same thing... but maybe not.
I also saw this in the File class documentation
Note that because of new functionality added to the Flash Player, when publishing to Flash Player 10, you can have only one of the following operations active at one time: FileReference.browse(), FileReference.upload(), FileReference.download(), FileReference.load(), FileReference.save(). Otherwise, Flash Player throws a runtime error (code 2174). Use FileReference.cancel() to stop an operation in progress. This restriction applies only to Flash Player 10. Previous versions of Flash Player are unaffected by this restriction on simultaneous multiple operations.
When you say that you let users upload multiple files, do you mean subsequent calls to browse() and upload(), or one call that includes multiple files? If you are making multiple separate calls, that may be the issue.
Anyway, I don't know if this is much help. It definitely seems that what you are trying to do should be possible. I can only guess that what is going wrong is perhaps a problem with implementation. Good luck :)
Reference: http://help.adobe.com/en_US/FlashPlatform/reference/actionscript/3/flash/net/FileReference.html#upload()
http://help.adobe.com/en_US/FlashPlatform/reference/actionscript/3/flash/net/FileReference.html#browse()
Just because I was thinking about a very similar question due to an error in one of my own apps, I decided to write down the answer I found.
I instantiated 11 HTTPConnections and was wondering why my Flex 4 application stopped working and threw an HTTP error, although it had been working fine before with just 5 simultaneous HTTPConnections to the same server.
I tested this myself because I did not find anything regarding it in the Flex docs or on the internet.
I found that using more than 5 HTTPConnections was the reason for the Flex application to throw the runtime error.
As a temporary workaround I decided to instantiate the connections one after another: load the next one after the previous one has received its data, and so on.
That is of course only temporary, since one of the next steps will be to alter the server code so that a single response contains the results of queries against more than one table. Of course the client application logic needs to be altered too.

best way to get a volume serial number in an asp.net application?

I work on an ASP.NET application which is a web version of an existing WinForms desktop application. The desktop application reads a license file defining the enabled features and adapts its behavior accordingly. To lock the application to a specific machine, one of the settings in the license file is the serial number of the first system volume. At startup the application checks that the serial number in the license file matches the volume serial number. Getting the volume serial number is done by P/Invoking kernel32.dll's GetVolumeInformation function.
However, in the ASP.NET version, a standard application pool running under a Local Service or Network Service identity does not have permission to P/Invoke, making it impossible to check that the license file is valid. How can I check the license at application startup?
I can think of the following alternatives:
replacing the P/Invoke with a method which does not require special permissions to get the volume serial number (is that the case with WMI?)
putting all the license-checking code in a separate assembly and installing it in the GAC to have it executed with elevated trust
creating an application pool with an administrator identity just for my application during the installation process
The first solution would be the best, but I don't know if it is possible.
Do I have other possibilities? What are the pros and cons of each method? Which one is the best?
I suggest using WMI. The Win32_Volume class has the serial number of the volume. You can use something like this:
using System;
using System.Management; // add a reference to System.Management.dll

SelectQuery query = new SelectQuery("Win32_Volume");
ManagementObjectSearcher searcher = new ManagementObjectSearcher(query);
foreach (ManagementObject obj in searcher.Get())
{
    Console.WriteLine("{0} {1}", obj["DriveLetter"], obj["SerialNumber"]);
}
