Flex WSDL to ActionScript as Ant task? - apache-flex

Flex Builder 3 can generate ActionScript from WSDL via the GUI (Data->Import Web Service (WSDL)), but that approach requires checking the generated source into version control. This is not desirable for us (we understand both sides of the 'should generated source be checked in' debate and have decided that it should not be), so we would like a way to generate the ActionScript classes from an Ant task. In this case, the WSDL would live on the file system.
Any ideas?

You could spend some time digging through Flex Builder's JARs to find the libraries it uses to do this, then invoke them from a very thin custom Ant task you write yourself. The likelihood of this succeeding is small, but it might be worth investigating just in case, since it could save you a ton of work.
Short of that, I'd start with WSDL2Java to generate Java classes that represent your WSDL entities. The results won't necessarily be beautiful, but you should get classes that adhere to the JavaBean spec. Then you could use one of the open source Java-to-ActionScript generators, which include:
Granite Data Services' Gas3
Spicefactory's Pimento, which has Java->AS3 generation
I'm almost positive that Gas3 has an Ant Task you can use; not sure about Pimento.
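For the "very thin custom Ant task" route, the Ant side at least is simple. A minimal sketch, with my own class/attribute names and a placeholder where the generator call would go; only the Ant Task API itself is standard:

import org.apache.tools.ant.BuildException;
import org.apache.tools.ant.Task;

// Thin custom Ant task skeleton. The attribute names and the generator call
// are placeholders -- swap in whatever generator you extract from Flex
// Builder's JARs, or Axis2's WSDL2Java, once you find a usable entry point.
public class Wsdl2AsTask extends Task {

    private String wsdl;    // path to the WSDL file on disk
    private String destDir; // where generated sources should be written

    public void setWsdl(String wsdl) { this.wsdl = wsdl; }
    public void setDestDir(String destDir) { this.destDir = destDir; }

    @Override
    public void execute() throws BuildException {
        if (wsdl == null || destDir == null) {
            throw new BuildException("Both 'wsdl' and 'destdir' attributes are required");
        }
        log("Generating classes from " + wsdl + " into " + destDir);
        try {
            // Placeholder: invoke the code generator here, e.g. Axis2's
            // org.apache.axis2.wsdl.WSDL2Java.main(new String[] {"-uri", wsdl, "-o", destDir});
        } catch (Exception e) {
            throw new BuildException("Code generation failed", e);
        }
    }
}

You would then register it in your build file with a <taskdef> and call it like any other task.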

From the comments in the generated code, Flex Builder uses Apache Axis2:
/**
* BaseBlaBlahService.as
* This file was auto-generated from WSDL by the Apache Axis2 generator modified by Adobe
* Any change made to this file will be overwritten when the code is re-generated.
*/
I've also found this thread on the Adobe forum: http://forums.adobe.com/thread/96006.
I'm also trying to solve this issue. I guess we need to create a feature request on the Adobe Flex website. Let me see if I can find my adobe.com user ID....

Related

The transaction currently built is missing an attachment for class - Attempted to find a suitable attachment but could not find any in the storage

Full Error:
transactions.TransactionBuilder. - The transaction currently built is missing an attachment for class: com/gibtn/corda/printutilities/PrintLedgerTransaction. Attempted to find a suitable attachment but could not find any in the storage.
This has been asked here and here but I hope to get better clarification.
Problem:
I have built a set of libraries to perform common tasks in my Flows that I include in all my CorDapps. For now I just copy the JARs into each project, make some changes to the Gradle files, and everything works great.
I recently put together a small library for performing common tasks in Contracts and added the JAR the same way.
This works fine with MockNodes. But when I test with real nodes, I get this error in the CRaSH shell and the transaction fails with a NoClassDefFoundError exception.
Question:
Is what I am doing even possible? Or do I always have to keep my utility classes inside the Contracts module in IntelliJ so they are bundled together with the Contracts into a single JAR? That way, when the node starts, the JAR (containing the Contracts and any utilities) is added to attachment storage as a single attachment.
I found a way to solve this. It's a bit dirty, but initial testing seems to work. I just created a blank class in my utilities JAR that implements Contract. Its verify() method is empty. Now when the Corda node starts, it sees this Contract and adds the JAR to attachment storage. So from the CRaSH shell, if I run:
attachments trustInfo
...my utility JAR is listed (it wasn't before). I can see that when I use one of the utility methods in a Contract, the utility JAR is included as a separate attachment in the WireTransaction.
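For reference, a minimal sketch of that placeholder contract (the package and class names here are my own):

import net.corda.core.contracts.Contract;
import net.corda.core.transactions.LedgerTransaction;

// Placeholder contract whose only job is to make the node treat the utility
// JAR as a contract JAR and load it into attachment storage at startup.
// It is never referenced by any state or transaction.
public class PlaceholderContract implements Contract {
    @Override
    public void verify(LedgerTransaction tx) {
        // intentionally empty
    }
}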
I'm not crazy about this solution and will probably stop using a utility JAR for Contracts. I'll go back to copying the classes into each project. Nevertheless, there is a way to do it. I would just want a more experienced Corda developer to give it their blessing before going forward into production with it.

How to develop a custom connector in SailPoint

I am a novice in the field of Identity and Access Management.
So far I know that SailPoint provides direct connectors to integrate well-known systems like LDAP, HR systems, OIM, and databases.
SailPoint also supports disconnected applications through custom connectors.
My question is: how do I develop a custom connector?
I do not have the JAR file provided by SailPoint that contains the AbstractConnector class, which I would extend to write my own class.
I also do not understand what to do with that class (if I had the JAR):
How will SailPoint refer to that class?
Do we need to deploy that class somewhere?
I am looking for the complete flow to develop and deploy a custom connector.
If anyone has worked on this, please help.
If you unzip your identityiq.war, you'll find a JAR file called WEB-INF/lib/connector-bundle.jar. This is the JAR where you'll find AbstractConnector. Once you've written your connector code, you will need to compile it and bundle it into a JAR file, which you will place into WEB-INF/lib.
Finally, you will need to update the ConnectorRegistry object (under Configuration on the debug screen) to reference the new class, which will make it available as an Application type. If it has custom connection parameters (as most do), you will also need an XHTML page that will be embedded into the SailPoint UI to prompt the user configuring the Application.
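For orientation only, a connector skeleton might look roughly like the sketch below. I'm writing the class and method signatures from memory, so treat them as assumptions and verify them against the Javadoc/classes in your version's connector-bundle.jar (there may be further abstract methods to implement):

import java.util.Map;

import sailpoint.connector.AbstractConnector;
import sailpoint.connector.ConnectorException;
import sailpoint.object.Application;
import sailpoint.object.Filter;
import sailpoint.object.ResourceObject;
import sailpoint.tools.CloseableIterator;

// Skeleton only -- names and signatures are assumptions based on memory of the
// connector SDK and may differ between IdentityIQ versions.
public class MyCustomConnector extends AbstractConnector {

    public MyCustomConnector(Application application) {
        super(application);
    }

    public void testConfiguration() throws ConnectorException {
        // Verify the target system is reachable using the Application's
        // configuration attributes (host, credentials, etc.)
    }

    public CloseableIterator<ResourceObject> iterateObjects(String objectType, Filter filter,
            Map<String, Object> options) throws ConnectorException {
        // Read accounts/groups from the target system and return them as
        // ResourceObjects for aggregation
        return null; // placeholder
    }
}

The compiled JAR then goes into WEB-INF/lib as described above, and the ConnectorRegistry entry references the fully qualified class name.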
If you have Compass access, they have a whitepaper called Custom Connectors that you will find helpful.
All that said, I encourage you to try to find a way to use an out-of-box connector if possible.
Most of the time it will be better to use the DelimitedFile connector: you can import a CSV of identity data and make it work within SailPoint's workflow. You will be able to map fields, correlate accounts, and create multi-valued group memberships quickly. Of course, this means that SailPoint will not be connected directly to the application, and you will have to develop a process to extract the identities and upload them. But at least you can integrate without going the custom connector route.

Adobe CQ: Client Library Manager

I'm attempting to modify/override functionality of the CQ client library manager, and I was wondering if anyone is familiar with where the code lives.
I've found some JS that controls channel detection in DefaultChannelDetector.js and CQClientLibraryManager.js, which seemingly only deals with channels, not dependencies or embedding. These are served as a clientlib, etc/clientlibs/foundation/librarymanager.js, which I assume can be overridden by pointing the htmllibmanager.clientmanager property in apps/system/config/com.day.cq.widget.impl.HtmlLibraryManagerImpl.config somewhere else.
So for modifying, I would need to know where the code lives. For overriding, I assume I point htmllibmanager.clientmanager at something else, but I would still need to know how to access the dependencies/categories/embed properties of clientlibs.
Additionally, any low-level insight into how the cq:includeClientLib tag works would be appreciated (low-level as in: point me to the code that implements it).
The vast majority of the Client Library functionality is in the HtmlLibraryManager component implemented OOB by the HtmlLibraryManagerImpl class in the com.day.cq.cq-widgets bundle. You can look up this component in Felix to see what bundle it is in and then decompile that bundle if you need to look at the guts of what the implementation does.
At a high level this component handles both the generation of the results of the cq:includeClientLib tag and the concatenation and compilation of libraries when a library URL is requested. Speaking specifically to the cq:includeClientLib tag, the HtmlLibraryManager's writeIncludes method will determine, based on parameters of the request and parameters provided in the cq:includeClientLib tag, how to write includes to the page for the existing libraries.
In the case of dynamic libraries (libraries which are channel based) it will write calls to the library manager JavaScript mechanisms which will dynamically include libraries based on the user's channel. Otherwise appropriate script and link tags for JavaScript and CSS respectively will be written for the requested libraries and their dependencies.
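If you want to poke at this from your own code, the same service can be called directly; a rough sketch of what the tag boils down to (the category name here is made up, and how you obtain the service depends on your context):

import java.io.IOException;

import org.apache.sling.api.SlingHttpServletRequest;
import org.apache.sling.api.SlingHttpServletResponse;

import com.day.cq.widget.HtmlLibraryManager;

// Rough sketch: resolve the HtmlLibraryManager service (e.g. via sling.getService()
// in a JSP or an @Reference in an OSGi component) and ask it to write the
// <script>/<link> includes for a clientlib category, much like cq:includeClientLib does.
public class ClientLibIncludeExample {

    public void writeClientLibs(SlingHttpServletRequest request,
            SlingHttpServletResponse response,
            HtmlLibraryManager htmlLibraryManager) throws IOException {
        // JS and CSS includes for the category plus its dependencies/embeds
        htmlLibraryManager.writeIncludes(request, response.getWriter(), "my.example.category");

        // Or JS and CSS separately
        htmlLibraryManager.writeJsInclude(request, response.getWriter(), "my.example.category");
        htmlLibraryManager.writeCssInclude(request, response.getWriter(), "my.example.category");
    }
}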

Better way to handle page that links to hundreds of binaries?

I've struggled to find a better solution for the following setup. I'm not actively working on this, but I know some who might appreciate other ways of handling it.
Setup:
A Tridion-managed page has a single "linked list" component
The single component has component links to other components in Tridion
The linked-to components often link to multimedia (MM) components
An XSLT component template (XSLT CT) renders XML with the above content and with links to PDFs
The XSLT document() function is used to grab embedded (linked-to) content; all content is converted to XML nodes and attributes
The TCMScriptAssistant namespace's publishBinary() publishes the related PDFs and other media
The page template just outputs the result of the CT
Business requirements:
improved publishing (the last time I worked on this, some of these files created a 2GB publishing transaction because of the PDFs)
the published XML content file must reference the associated PDFs; hyperlinks work, but identifiers might not help because of...
no Tridion Content Delivery APIs, mainly for independence from the storage database but also to avoid Tridion-specific code on the presentation server (a loosely coupled setup and less training for developers)
The biggest issue is the huge transport package during publishing. The second problem is that publishing any of the linked-to PDFs causes the page to republish.
How could this setup be improved or re-engineered, preferably without too many changes to the existing templates? Modular templating could be considered, though.
Dynamic component presentations could possibly work, but they would need to be published to the file system and not use dynamic linking or broker objects (e.g. no criteria filters, binary metadata, etc.).
There are indeed 2 questions. I will handle them in reverse order.
To prevent the page from being republished when you publish a binary, you can use the event system in older versions of Tridion (pre-2011) to turn off link resolving, or with newer versions you can use a custom resolver. There is an article by Nuno which explains this: http://nunolinhares.blogspot.com/2011/10/tridion-publisher-and-custom-resolvers.html
Your second one is a bit tougher, in no small part because of your criterion of not using the SDL Tridion Content Delivery APIs. I would have suggested publishing the binaries separately (this would keep the size of your transaction package down) and using binary linking to resolve the paths at request time.
Given that this is not an option, I think the only way I would approach it would be to still use dynamic component presentations, and then use predictable, unique file names for the PDFs (i.e. something like 317-12345.pdf based on the URI), with one directory for all the binaries. That way you could write the paths to the binaries from your XSLT template, since you know where the binaries will be located later. You could then use a custom resolver to publish the binaries when you publish the main list component or page.
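As an illustration of the predictable-name idea, here is a small sketch; the mapping from a TCM URI to a file name is just one assumption of how you might do it:

// Hypothetical helper: derive a stable binary file name from a Tridion TCM URI
// (tcm:<pubId>-<itemId>[-<itemType>]), so the XSLT can compute the same path
// at render time without any Content Delivery API calls.
public class BinaryFileNames {

    public static String fileNameFor(String tcmUri, String extension) {
        // "tcm:317-12345" -> "317-12345.pdf"
        String id = tcmUri.startsWith("tcm:") ? tcmUri.substring(4) : tcmUri;
        // Drop an item-type suffix if present, e.g. "tcm:317-12345-16"
        String[] parts = id.split("-");
        return parts[0] + "-" + parts[1] + "." + extension;
    }

    public static void main(String[] args) {
        System.out.println(fileNameFor("tcm:317-12345", "pdf")); // 317-12345.pdf
    }
}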
Hope that helps
Chris

In Flex, is it possible to identify whether the code is running on the Web or in AIR?

I'm coding an app that runs both on the web and on AIR. To avoid copying code around, I figured I should create three kinds of projects in Flex Builder: a Library project, a Web project, and an AIR project.
So all my code is in the Library project.
I have accessData.as, which extends EventDispatcher to fetch web services and return the results as an event. I plan on using this class to also fetch SQLite data for the desktop version, but to do so I need it to decide which source to get the data from, depending on whether it's running on the Web or in AIR.
Anyone know how to do this?
Please refer to this link: Detect AIR versus Flash Player from an ActionScript library. It's more detailed.
You really should have two build targets, one for Web and one for AIR. And your code should be designed so that the rest of the system doesn't care what the implementing part is doing, only that it conforms to a certain interface. This way, each build simply swaps in the implementing code for the desired platform.
You may find something useful under System or Capabilities in the docs.
Create two projects, AIR and standalone, and create two conditional compilation constants, for example "standalone" and "air" (more here).
Go to Project->Properties->Flex Compiler and add the following.
For the AIR project:
-define=CONFIG::standalone,false -define=CONFIG::air,true
and for the standalone (web) project:
-define=CONFIG::standalone,true -define=CONFIG::air,false
In your code set:
CONFIG::standalone {
trace("this code will be compiled only when air=false and standalone=true");
}
CONFIG::air {
trace("this code will be compiled only when air=true and standalone=false");
}
Umm... I just found a way:
var appName:String = Application.application.name;
This works since the web version's name is "" and the desktop version's name is " desktop", but if anyone has a better way, please go ahead.
Thanks.
