JPedal JPEG 2000 error - JAI

I convert PDFs to images using JPedal. This works fine for most PDFs, but for some containing JPEG 2000 images I keep receiving the following error:
java.lang.RuntimeException: JPeg 2000 Images needs the VM parameter -Dorg.jpedal.jai=true switch turned on
at org.jpedal.parser.PdfStreamDecoder.decodeStreamIntoObjects(Unknown Source)
at org.jpedal.parser.PdfStreamDecoder.decodePageContent(Unknown Source)
at org.jpedal.PDFtoImageConvertor.convert(Unknown Source)
at org.jpedal.PdfDecoder.getPageAsImage(Unknown Source)
at org.jpedal.PdfDecoder.getPageAsImage(Unknown Source)
at com.....
I have already set the JVM parameter in JAVA_OPTS, in the run configuration of my Tomcat, and also in program code using:
System.setProperty("org.jpedal.jai", "true");
PdfDecoder decode_pdf = new PdfDecoder(true);
FontMappings.setFontReplacements();
decode_pdf.openPdfArray(pdf_file);
The three JAI libraries are also on my build path.
So I don't know what else I have to do.
My complete code for the conversion is:
List<BufferedImage> images = new LinkedList<BufferedImage>();
System.setProperty("org.jpedal.jai", "true");
PdfDecoder decode_pdf = new PdfDecoder(true);
FontMappings.setFontReplacements();
decode_pdf.openPdfArray(pdf_file);
decode_pdf.setExtractionMode(0, 1f); // do not save images
for (int i = 1; i <= decode_pdf.getPageCount(); i++) {
    images.add(decode_pdf.getPageAsImage(i));
}
decode_pdf.closePdfFile();
Any suggestions?

Activate JAI for JPedal:
System.setProperty("org.jpedal.jai", "true");
A better solution (than the one from Mark Stephens' blog) is to re-register the providers, because this only has to be done once:
import javax.imageio.spi.IIORegistry;

// Re-register the JPEG 2000 reader/writer service providers once at startup.
IIORegistry registry = IIORegistry.getDefaultInstance();
registry.registerServiceProvider(new com.sun.media.imageioimpl.plugins.jpeg2000.J2KImageWriterSpi());
registry.registerServiceProvider(new com.sun.media.imageioimpl.plugins.jpeg2000.J2KImageReaderSpi());
Of course, the JAI libraries need to be on the classpath for this to work.

I found the answer to this problem here.
When you are in a Tomcat environment, you also have to disable the JreLeakPreventionListener in server.xml; then it works just fine.
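For reference, the listener in question is registered in Tomcat's conf/server.xml (in the Tomcat versions I have seen it is named JreMemoryLeakPreventionListener, so verify the exact class name for your version); among other things it pre-initializes ImageIO with the system class loader, which can hide the webapp's JAI service providers. Disabling it means commenting out one line:
<!-- Disabled so the JAI ImageIO JPEG 2000 providers stay visible to the webapp -->
<!-- <Listener className="org.apache.catalina.core.JreMemoryLeakPreventionListener" /> -->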

Related

How can I connect to a remote Oracle Coherence cluster using Java?

There is a Coherence cluster (with a cache named mycache) that is running on IP address xxx.xxx.xxx.xxx (not localhost). I am trying to connect to it and read from the cache using Java.
This is my Reader class:
import com.tangosol.net.CacheFactory;
import com.tangosol.net.NamedCache;
public class Reader {
    public static void main(String[] args) {
        NamedCache cache = CacheFactory.getCache("mycache");
        System.out.println("Value in cache is: " + cache.get("key1"));
    }
}
I am using IntelliJ IDEA; in the VM options for Reader I added this line:
-Dtangosol.coherence.cacheconfig=mycache.xml
and this is the mycache.xml file:
<?xml version='1.0'?>
<coherence xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
           xmlns="http://xmlns.oracle.com/coherence/coherence-operational-config"
           xsi:schemaLocation="http://xmlns.oracle.com/coherence/coherence-operational-config
                               coherence-operational-config.xsd"
           xml-override="{tangosol.coherence.override /tangosol-coherence-override-{mode}.xml}">
  <cluster-config>
    <member-identity>
      <cluster-name>RemoteCluster</cluster-name>
    </member-identity>
    <unicast-listener>
      <well-known-addresses>
        <socket-address id="1">
          <address>192.168.104.160</address>
          <port>8088</port>
        </socket-address>
      </well-known-addresses>
    </unicast-listener>
  </cluster-config>
</coherence>
When I run Reader.main() I get this exception:
Problem : An ElementProcessor could not be located for the element [coherence]
Advice : The specified element is unknown to the NamespaceHandler implementation. Perhaps the xml element is foreign to the Xml Namespace?
at com.tangosol.util.Base.ensureRuntimeException(Base.java:286)
at com.tangosol.net.ScopedCacheFactoryBuilder.instantiateFactory(ScopedCacheFactoryBuilder.java:433)
at com.tangosol.net.ScopedCacheFactoryBuilder.buildFactory(ScopedCacheFactoryBuilder.java:385)
at com.tangosol.net.ScopedCacheFactoryBuilder.getFactory(ScopedCacheFactoryBuilder.java:267)
at com.tangosol.net.ScopedCacheFactoryBuilder.getConfigurableCacheFactory(ScopedCacheFactoryBuilder.java:119)
at com.tangosol.net.CacheFactory.getConfigurableCacheFactory(CacheFactory.java:127)
at com.tangosol.net.CacheFactory.getCache(CacheFactory.java:205)
at com.tangosol.net.CacheFactory.getCache(CacheFactory.java:182)
at Reader.main(Reader.java:11)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:134)
Caused by: com.tangosol.config.ConfigurationException: Configuration Exception
-----------------------
Problem : An ElementProcessor could not be located for the element [coherence]
Advice : The specified element is unknown to the NamespaceHandler implementation. Perhaps the xml element is foreign to the Xml Namespace?
It looks like the problem is in mycache.xml. Those elements are used when you want to set up a cluster member, whereas you want to connect as a client.
Assuming that the "mycache" schema exists on the remote cluster, try changing mycache.xml to the following:
<?xml version="1.0"?>
<!DOCTYPE cache-config SYSTEM "cache-config.dtd">
<cache-config xmlns="http://schemas.tangosol.com/cache">
  <caching-scheme-mapping>
    <cache-mapping>
      <cache-name>mycache</cache-name>
      <scheme-name>extend-dist</scheme-name>
    </cache-mapping>
  </caching-scheme-mapping>
  <caching-schemes>
    <remote-cache-scheme>
      <scheme-name>extend-dist</scheme-name>
      <service-name>ExtendTcpCacheService</service-name>
      <initiator-config>
        <tcp-initiator>
          <remote-addresses>
            <socket-address>
              <address>192.168.104.160</address>
              <port>8088</port>
            </socket-address>
          </remote-addresses>
        </tcp-initiator>
        <outgoing-message-handler>
          <request-timeout>20s</request-timeout>
        </outgoing-message-handler>
      </initiator-config>
    </remote-cache-scheme>
  </caching-schemes>
</cache-config>
Note: if the remote cluster uses POF serialization for mycache, you will also have to add the POF mapping and configuration and run with -Dtangosol.pof.enabled=true.
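With that cache config in place, the Reader class from the question stays unchanged and is started with the same flag as before, now pointing at the rewritten file. A usage sketch (the classpath entries are placeholders for your Coherence jar and compiled classes):
java -cp coherence.jar:. -Dtangosol.coherence.cacheconfig=mycache.xml Reader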
Your XML file is an operational config rather than a cache config. To use this configuration, run your program with:
-Dtangosol.coherence.override=mycache.xml
instead of:
-Dtangosol.coherence.cacheconfig=mycache.xml
By the way, you should rename mycache.xml to e.g. operational-config.xml so that it is not confused with a cache configuration.

Is there any way to test-run the EDIDisassembler component with a pipeline testing tool?

First, let me explain why I need to do this.
I have an inbound port configured with the EDIReceive pipeline. It receives EDI X12 837I files and disassembles them into 837I messages.
One file failed with the error description below:
The following elements are not closed: ns0:X12_00501_837_I. Line 1, position 829925.
It looks like the incoming file has some structural issue that prevents the disassembler from producing the message correctly, but the error itself doesn't help to locate the problem. Also, no TA1 or 999 is generated to help us locate the issue.
So I created a little console application using the Pipeline Component Test Library to run this file through the EdiDisassembler pipeline component and see what causes the error.
The code is pretty straightforward:
namespace TestEDIDasm
{
    using System;
    using System.IO;
    using Microsoft.BizTalk.Edi.Pipelines;
    using Microsoft.BizTalk.Message.Interop;
    using Winterdom.BizTalk.PipelineTesting;
    using Microsoft.BizTalk.Edi.BatchMarker;

    class Program
    {
        static void Main(string[] args)
        {
            var ediDasmComp = new EdiDisassembler();
            ediDasmComp.UseIsa11AsRepetitionSeparator = true;
            ediDasmComp.XmlSchemaValidation = true;

            var batchMaker = new PartyBatchMarker();

            IBaseMessage testingMessage = MessageHelper.LoadMessage(@"c:\temp\{1C9420EB-5C54-43E5-9D9D-7297DE65B36C}_context.xml");

            ReceivePipelineWrapper testPipelineWrapper = PipelineFactory.CreateEmptyReceivePipeline();
            testPipelineWrapper.AddComponent(ediDasmComp, PipelineStage.Disassemble);
            testPipelineWrapper.AddComponent(batchMaker, PipelineStage.ResolveParty);

            var outputMessages = testPipelineWrapper.Execute(testingMessage);
            if (outputMessages.Count <= 0)
            {
                Console.WriteLine("No output message");
                Console.ReadKey();
                return;
            }

            var msg = outputMessages[0];
            StreamReader sr = new StreamReader(msg.BodyPart.Data);
            Console.WriteLine(sr.ReadToEnd());
            Console.ReadKey();
        }
    }
}
I added some breakpoints but ended up with the following error in the message context:
"X12 service schema not found"
Clearly, the EdiDisassembler component relies on some other infrastructure to do its job.
Now to my questions:
Is there any way to make the EdiDisassembler work in a testing environment?
Is there any other way to debug/trace the disassembler component processing a file, other than the Pipeline Component Test Library?
Theoretically, sure, but you have to replicate a lot of engine context that exists during Pipeline execution. The EDI components have issues running inside Orchestrations so it's likely a pretty tall order.
Have you tried a Preserve Interchange Pipeline with the Fallback Settings? That's about as simple as you can get with the EDI Disassembler.

JSoup randomly throws java.io.IOException: stream is closed when running from browser

I'm having a weird JSoup problem when running my JavaFX application from the browser (or via Web Start).
When I run it from inside the IDE (Eclipse or NetBeans) or as a standalone app, it works normally. When I run it via Web Start or from the browser (Chrome), JSoup randomly throws a "java.io.IOException: stream is closed".
The site I'm trying to parse is thepiratebay.sx. When I first run the application (from the browser), I get this error. With the application still running, if I try to parse again, then it works... sometimes.
The JSoup code:
try {
    // TODO: Change to HttpFetcher. This method is reporting "stream is closed" when running in a browser.
    Connection con = Jsoup.connect(url)
            .timeout(HTTP_TIMEOUT)
            .userAgent(UserAgentGenerator.getUserAgent())
            .followRedirects(false);
    doc = con.get();
    System.out.println("Fetching... " + url);
} catch (IOException e) {
    e.printStackTrace();
    System.out.println("Parser connect must have timed out, no results. " + url);
    fetchFailed[i] = true;
    continue;
} finally {
    i++;
    if (CommonTFUtils.isAllTrue(fetchFailed)) {
        throw new HttpException("Fetcher failed on every URL of " + response.getSite_name());
    }
}
And the exception thrown:
CacheEntry[http://thepiratebay.sx/browse/207/0/7]: updateAvailable=true,lastModified=Tue May 14 14:28:16 BRT 2013,length=-1
java.io.IOException: stream is closed
at sun.net.www.http.ChunkedInputStream.ensureOpen(Unknown Source)
at sun.net.www.http.ChunkedInputStream.read(Unknown Source)
at java.io.FilterInputStream.read(Unknown Source)
at sun.net.www.protocol.http.HttpURLConnection$HttpInputStream.read(Unknown Source)
at sun.net.www.protocol.http.HttpURLConnection$HttpInputStream.read(Unknown Source)
at sun.net.www.protocol.http.HttpURLConnection$HttpInputStream.read(Unknown Source)
at sun.net.www.protocol.http.HttpURLConnection$HttpInputStream.close(Unknown Source)
at org.jsoup.helper.HttpConnection$Response.execute(HttpConnection.java:468)
at org.jsoup.helper.HttpConnection$Response.execute(HttpConnection.java:410)
at org.jsoup.helper.HttpConnection.execute(HttpConnection.java:164)
at org.jsoup.helper.HttpConnection.get(HttpConnection.java:153)
at com.package.torrent.parser.GenericParser.search(GenericParser.java:147)
at com.package.torrent.parser.GenericParser.browse(GenericParser.java:82)
at com.package.search.TrackerSearch.searchTracker(TrackerSearch.java:69)
at com.package.search.TrackerSearch.searchAllTrackers(TrackerSearch.java:40)
at com.package.search.TrackerSearch.searchAllTrackers(TrackerSearch.java:23)
at com.package.search.MovieBrowser.browseTrackers(MovieBrowser.java:49)
at com.package.ui.browse.BrowseController$MovieBrowserTask.call(BrowseController.java:237)
at com.package.ui.browse.BrowseController$MovieBrowserTask.call(BrowseController.java:213)
at javafx.concurrent.Task$TaskCallable.call(Task.java:1259)
at java.util.concurrent.FutureTask$Sync.innerRun(Unknown Source)
at java.util.concurrent.FutureTask.run(Unknown Source)
at java.lang.Thread.run(Unknown Source)
Does anyone have an idea of what might be causing this?
Thanks in advance.
I think I found a solution. Place this code before you ever call JSoup. Apparently, applets and web start set this value to true. Now, I wonder why Sun forces you to access a static variable non-statically.
new URL("jar:file://dummy.jar!/").openConnection().setDefaultUseCaches(false);
JSoup doesn't handle cached URL connections well and ends up treating them as an exception.
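For example, the call can be wrapped in a small helper that runs once at startup, before the first Jsoup.connect(...). This is a sketch (the class and method names are mine); note that openConnection() throws a checked IOException, which also covers the MalformedURLException from the URL constructor:
import java.io.IOException;
import java.net.URL;

public final class JsoupCacheFix {

    /** Call once at startup, before the first Jsoup.connect(...). */
    public static void disableUrlConnectionCaching() {
        try {
            // setDefaultUseCaches is an instance method, but it flips a
            // JVM-wide default, so any URLConnection instance will do.
            new URL("jar:file://dummy.jar!/").openConnection().setDefaultUseCaches(false);
        } catch (IOException e) {
            // If this fails, connection caching stays enabled and the
            // intermittent "stream is closed" errors may persist.
            e.printStackTrace();
        }
    }

    private JsoupCacheFix() {
    }
}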

SDL Tridion 2009: Creating components through TOM API (via Interop) fails

I am facing a problem while creating components through the TOM API using .NET/COM Interop.
Actual Issue:
I have 550 components to create through a custom page. I am able to create between 400 and 470 components, but after that it fails with an error message saying:
Error: Thread was being aborted.
Any idea or suggestion why it is failing?
Or is there any restriction in Tridion 2009?
UPDATE 1:
As per @user978511's request, below is the error from the application event log:
Event code: 3001
Event message: The request has been aborted.
...
...
Process information:
Process ID: 1016
Process name: w3wp.exe
Account name: NT AUTHORITY\NETWORK SERVICE
Exception information:
Exception type: HttpException
Exception message: Request timed out.
...
...
...
UPDATE 2:
@Chris: This is my common function, which is called in a loop, passing a list of params. Here I am using the Interop DLLs.
public static bool CreateFareComponent(.... list of params ...)
{
    TDSE mTDSE = null;
    Folder mFolder = null;
    Component mComponent = null;
    bool flag = false;
    try
    {
        mTDSE = TDSEInitialize();
        mComponent = (Component)mTDSE.GetNewObject(ItemType.ItemTypeComponent, folderID, null);
        mComponent.Schema = (Schema)mTDSE.GetObject(constants.SCHEMA_ID, EnumOpenMode.OpenModeView, null, XMLReadFilter.XMLReadAll);
        mComponent.Title = compTitle;
        ...
        mComponent.Save(true);
        flag = true;
    }
    catch (Exception ex)
    {
        CustomLogger.Error(String.Format("Logged User: {0} \r\n Error: {1}", GetRemoteUser(), ex.Message));
    }
    return flag;
}
Thanks in advance.
Sounds like a timeout, most likely in IIS which is hosting your custom page.
Are you creating them all in one synchronous request? Because that is indeed likely to time out.
You could instead create them in batches, or make sure your operations are done asynchronously and then poll the status regularly.
The easiest would just be to only create say 10 Components in one request, wait for it to finish, and then create another 10 (perhaps with a nice progress bar? :))
How do you call the TDSE object? I would like to mention the "Marshal.ReleaseComObject" procedure here: not releasing COM objects can lead to enormous memory leaks.
Here is code for creating a component:
private Component NewComponent(string componentName, string publicationID, string parentID, string schemaID)
{
    Publication publication = (Publication)mTdse.GetObject(publicationID, EnumOpenMode.OpenModeView, null, XMLReadFilter.XMLReadContext);
    Folder folder = (Folder)mTdse.GetObject(parentID, EnumOpenMode.OpenModeView, null, XMLReadFilter.XMLReadContext);
    Schema schema = (Schema)mTdse.GetObject(schemaID, EnumOpenMode.OpenModeView, publicationID, XMLReadFilter.XMLReadContext);
    Component component = (Component)mTdse.GetNewObject(ItemType.ItemTypeComponent, folder, publication);
    component.Title = componentName;
    component.Schema = schema;
    return component;
}
After that, please don't forget to release mTdse (in my case a previously created TDSE object). Disposing Component objects after you finish working with them can also be useful.
For large Tridion batch operations I always use a Console Application and run it directly on the server.
Use Console.WriteLine to write to the output window and Console.ReadLine as the last line of code in the app (so the window stays open). I also use Log4Net as the logger.
This is by far the best approach if you have access to a remote session on the server - or can ask an admin to run it for you and give you access to the log folder via a network share.
As per @Chris's suggestion, and as an immediate fix, I have changed the execution timeout in my web.config to 8000 seconds:
<httpRuntime executionTimeout="8000"/>
With this change, the custom page is able to cope for now.
If anyone has a better suggestion, please post it.

Flyway output to SQL File

Is it possible to output the DB migration to an SQL file instead of directly invoking database changes in Flyway?
Most times this will not be needed as with Flyway the DB migrations themselves will already be written in SQL.
Yes, it's possible, and as far as I am concerned the feature is an absolute must for DBAs who don't want to allow Flyway in prod.
I made do by modifying code from here; it's a dry-run command for Flyway, and you can add a FileWriter and write out the migrationDetails:
https://github.com/killbill/killbill/commit/996a3d5fd096525689dced825eac7a95a8a7817e
I did it like so... Project structure (just copied out of Kill Bill's project, with the package renamed to flywaydr):
.
./main
./main/java
./main/java/com
./main/java/com/flywaydr
./main/java/com/flywaydr/CapturingMetaDataTable.java
./main/java/com/flywaydr/CapturingSqlMigrationExecutor.java
./main/java/com/flywaydr/DbMigrateWithDryRun.java
./main/java/com/flywaydr/MigrationInfoCallback.java
./main/java/com/flywaydr/Migrator.java
./main/java/org
./main/java/org/flywaydb
./main/java/org/flywaydb/core
./main/java/org/flywaydb/core/FlywayWithDryRun.java
In Migrator.java add the following (implement the callback and put it in DbMigrateWithDryRun.java):
} else if ("dryRunMigrate".equals(operation)) {
    MigrationInfoCallback mcb = new MigrationInfoCallback();
    flyway.dryRunMigrate();
    MigrationInfoImpl[] migrationDetails = mcb.getPendingMigrationDetails();
    if (migrationDetails.length > 0) {
        writeMasterScriptToFile(migrationDetails);
    }
}
Then, to write the output to a file, something like:
private static void writeMasterScriptToFile(MigrationInfoImpl[] migrationDetails) {
    // Requires java.io.FileWriter, java.nio.file.* and
    // static java.nio.file.StandardCopyOption.REPLACE_EXISTING.
    FileWriter fw = null;
    try {
        String masterScriptLoc = "path/to/file";
        fw = new FileWriter(masterScriptLoc);
        LOG.info("Writing output to " + masterScriptLoc);
        for (final MigrationInfoImpl migration : migrationDetails) {
            Path file = Paths.get(migration.getResolvedMigration().getPhysicalLocation());
            // ...if you want to copy the actual script files parsed by Flyway
            // (scriptspathloc is a placeholder for the target directory):
            Files.copy(file,
                    Paths.get(new StringBuilder(scriptspathloc)
                            .append(File.separator)
                            .append(file.getFileName().toString())
                            .toString()),
                    REPLACE_EXISTING);
        }
        // ...or just get the SQL itself (sqlStatements is a placeholder for
        // the statements resolved from the migrations):
        for (final SqlStatement sqlStatement : sqlStatements) {
            // sqlStatement.getSql();
        }
        // "stuff" is a placeholder for whatever buffer you accumulated above.
        fw.write(stuff.toString());
    } catch (Exception e) {
        LOG.error("Could not write to file, io exception was thrown.", e);
    } finally {
        // Guard against fw never having been assigned.
        try {
            if (fw != null) {
                fw.close();
            }
        } catch (Exception e) {
            LOG.error("Could not close file writer.", e);
        }
    }
}
One last thing to mention: I compile and package this into a jar "with dependencies" (aka a fat jar) via Maven (Google "assembly plugin" + "jar with dependencies") and run it via a command like the one below. Alternatively, you can include it as a dependency and call it via the mvn exec:exec goal, which I also had success with.
$ java -jar /path/to/flywaydr-fatjar.jar dryRunMigrate -regular.flyway.configs -etc -etc
I didn't find a way, so I switched to MyBatis Migrations instead. It looks quite nice.
