I often develop applications in an IDE and then distribute them as an executable JAR, so I am looking for library functions that behave the same in both cases. I have a partial solution that uses an InputStream to robustly read data from a file located either on the file system or inside a Jar file, but I would like to make use of the robust Kotlin function Closeable.use() to handle whether an error is thrown or not.
After some struggle I found this post that explains how compression inside the Jar file makes accessing a file there different from accessing one on the file system. In particular, the handy Kotlin extension function File.readBytes() will not work for a file inside a Jar. That is a shame, because the source of readBytes() contains a clever use of the use() function to handle the stream contingencies. It says...
closes it down correctly whether an exception is thrown or not.
Here is the partial solution which returns a ByteArray. These functions work the same whether running the application from within the IDE or from an executable Jar.
inline fun <reified T> getResourceAsStream(filename: String): InputStream {
    // it is the calling function's responsibility to close the stream
    // force the first character to be a slash ('/'), which indicates the resource root
    val fnameWithPath = if (filename[0] != '/') "/$filename" else filename
    // extract the stream if it exists
    return T::class.java.getResourceAsStream(fnameWithPath)
}
inline fun <reified T> getResourceByteArray(filename: String): ByteArray {
    // will open a stream, read the bytes and then close the stream
    val stream = getResourceAsStream<T>(filename)
    val byteArray = stream.readBytes()
    stream.close()
    return byteArray
}
You're almost there. What you might have missed is that Kotlin already defines an extension function use() on the Closeable interface, which InputStream implements. So your second function can be rewritten as a one-liner:
inline fun <reified T> getResourceByteArray(filename: String) =
    getResourceAsStream<T>(filename).use { it.readBytes() }
I incorporated @gidds' solution to make use of the Kotlin library function Closeable.use() as part of loading the ByteArray from a resource file through an InputStream. I'm expecting the use() function to take care of closing the InputStream and dealing with any errors, whether thrown or not.
In the updated code, the calling class, provided as a reified type, gives the context that determines which resources directory will contain the file. In a multi-project build there may be several resources directories, which is the situation I am operating under, with different projects requiring different input data.
inline fun <reified T> getResourceAsStream(filename: String): InputStream {
    // it is the calling function's responsibility to close the stream
    // force the first character to be a slash ('/'), which indicates the resource root
    val fnameWithPath = if (filename[0] != '/') "/$filename" else filename
    // extract the stream if it exists
    return T::class.java.getResourceAsStream(fnameWithPath)
}
inline fun <reified T> getResourceByteArray(filename: String) =
    getResourceAsStream<T>(filename).use { it.readBytes() }
class ReadByteData {
    init {
        val byteArray = getResourceByteArray<ReadByteData>("fileName")
    }
}
Using BizTalk 2013r2 CU1, I have created a property schema for my inbound xsd and deployed the application.
When I receive a sample xml document using a standard "xml receive" pipeline, I can see that the required element is promoted into the context as expected.
I then created a custom pipeline which contains the "XML disassembler" component in the "Disassemble" stage and a custom component in the "Validate" stage. This custom component needs to read the promoted property from the context. However, I find that when I switch the Receive Location from the "xml receive" pipeline to my custom pipeline, my property does not get promoted. I am using the following code within my custom component to write out a list of the items in the message context:
// contextList holds the message context (IBaseMessageContext)
string name, nspace;
string contextItems = string.Empty;
bool promotedPropFound = false;
for (int x = 0; x < contextList.CountProperties; x++)
{
    contextList.ReadAt(x, out name, out nspace);
    string value = contextList.Read(name, nspace).ToString();
    contextItems += "Name: " + name + " - " + "Namespace: " + nspace + " - " + value + "\r\n";
    if (name == _ContextPropertyName && nspace == _ContextPropertyNamespace)
        promotedPropFound = true;
}
Helpers.EventLogHelper eventHelper = new EventLogHelper();
eventHelper.LogEvent(string.Format("Context items: {0}", contextItems));
if (promotedPropFound == false)
    throw new Exception(string.Format("Unable to find promoted property with name [{0}] and namespace [{1}]", _ContextPropertyName, _ContextPropertyNamespace));
From the output in the event log I can see that certain properties such as MessageType have been promoted but my custom property has not. Again, if I change the receive location back to use a standard "xml receive" pipeline then the property will be promoted from a copy of the same xml document (I check this by stopping the subscribing send port and viewing the context from the admin console).
I find this very strange since the same "XML disassembler" component is present in the same "Disassemble" stage of both pipelines, with the same (default) configuration. I'm starting to think perhaps there's a problem with 2013r2 CU1 - has anyone else encountered the same?
By the time the XML Disassembler has executed in your custom pipeline, there is no guarantee that your properties have been promoted.
The incoming message arrives in the pipeline as a stream with the data pointer set at the start of the stream.
I think the XML Disassembler does not read the stream; it wraps it in a stream wrapper class that will populate the promoted properties when the stream actually gets read.
The stream will have to be read at least once: when the message gets inserted into the message box. So there is a guarantee that the properties will get promoted, but you cannot assume it will be done before the "Validate" stage executes.
To make sure this is really the problem you are encountering: check your message AFTER it has been imported into the message box.
If your promoted property is there, what I described is probably what is happening.
Solutions:
To make your custom pipeline component work, the best solution would be to do just as the XML Disassembler does: get the incoming stream and wrap it in a stream wrapper class that can trigger whatever functionality you need.
The assembly Microsoft.BizTalk.Streaming.dll has a wrapper class that might interest you: ForwardOnlyEventingReadStream.
This class has an AfterLastReadEvent event. You can create an EventHandler and have it subscribe to this event to trigger your custom functionality only after the stream has been fully read and all properties have been promoted.
Your custom component would look like this:
public IBaseMessage Execute(IPipelineContext context, IBaseMessage message)
{
Stream stream = message.BodyPart.GetOriginalDataStream();
CForwardOnlyEventingReadStream eventingReadStream = new CForwardOnlyEventingReadStream(stream);
eventingReadStream.AfterLastReadEvent += new AfterLastReadEventHandler(DoSomething);
message.BodyPart.Data = eventingReadStream;
return message;
}
private static void DoSomething(object src, EventArgs args)
{
    // at this point the stream has been fully read and the promoted properties are available
}
A less efficient way to solve your problem would be to read the stream fully in your custom component at the "Validate" stage and put the stream pointer back to the start of the stream.
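A minimal sketch of that Execute method is shown below. It is untested and assumes the stream returned at the "Validate" stage is the disassembler's wrapper and is seekable; if it is not seekable, it would first have to be buffered into a seekable stream. The _ContextPropertyName and _ContextPropertyNamespace fields are the same ones used in the question's code.
public IBaseMessage Execute(IPipelineContext context, IBaseMessage message)
{
    Stream stream = message.BodyPart.GetOriginalDataStream();
    // drain the stream so the disassembler's wrapper populates the promoted properties
    byte[] buffer = new byte[4096];
    while (stream.Read(buffer, 0, buffer.Length) > 0)
    {
    }
    // rewind so the message body is still intact for the message box
    stream.Seek(0, SeekOrigin.Begin);
    message.BodyPart.Data = stream;
    // the promoted property should now be readable from the context
    object value = message.Context.Read(_ContextPropertyName, _ContextPropertyNamespace);
    return message;
}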
Microsoft has some guidelines for when you're manipulating the message stream in pipeline components:
https://msdn.microsoft.com/en-us/library/aa577699.aspx
Update:
OP needs to pass the message context to the Event Handler.
It is possible using a Lambda expression:
public IBaseMessage Execute(IPipelineContext context, IBaseMessage message)
{
Stream stream = message.BodyPart.GetOriginalDataStream();
CForwardOnlyEventingReadStream eventingReadStream = new CForwardOnlyEventingReadStream(stream);
eventingReadStream.AfterLastReadEvent += new AfterLastReadEventHandler((src, args) => DoSomething(src, args, message.Context));
message.BodyPart.Data = eventingReadStream;
return message;
}
private static void DoSomething(object src, EventArgs args, IBaseMessageContext messageContext)
{
    // the message context, with its promoted properties, is available here once the stream has been fully read
}
This SO question may be a useful reference for passing the additional parameter:
Pass parameter to EventHandler
Can you do whatever you had planned for the Validate Stage in an Orchestration? That would be much easier.
If not, the most common solution to this specific problem is an intermediate Pipeline Component that forces a full read on the stream, though technically, you'd only have to read until the Promoted node is hit.
We've seen a number of posts about returning JSON data via WCF, but they all cover converting an object to JSON and then returning that converted object via the magic of attributes.
We've got a number of preformatted JSON files that we want to return via a WCF service. Essentially all we need to do is read the files in (or a cached copy of the file) and then return the data as a string, I think... It seems wasteful to read in the JSON file, serialize it to an object, and then deserialize it back to JSON. Any help on this?
When using the WebHttpBinding, this is as simple as creating a WebGet annotated method with a Stream return type:
[WebGet]
public Stream GetFile(Int32 someId)
{
//your logic to look up or create the file here
//open the file (a MemoryStream would be acceptable as well if you were creating the data on the fly)
Stream stream = File.OpenRead(yourFilePath);
//register an event to clean up the temporary file (if necessary)
OperationContext.Current.OperationCompleted += (s, e) =>
{
File.Delete(yourFilePath);
};
return stream;
}
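One detail worth noting: a raw Stream returned this way is served as application/octet-stream by default. If the client expects a JSON content type, it can be set on the outgoing response (via WebOperationContext from System.ServiceModel.Web) before returning the stream. A small variation of the method above, with yourFilePath being the same placeholder:
[WebGet]
public Stream GetFile(Int32 someId)
{
    //advertise the payload as JSON instead of the default application/octet-stream
    WebOperationContext.Current.OutgoingResponse.ContentType = "application/json";
    return File.OpenRead(yourFilePath);
}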
I am attempting to develop a generic BizTalk application for configuring dynamic ports. I have an orchestration that pulls back all the configuration settings for each port, and I want to loop through these settings and configure the ports. The settings are held in MSSQL and, for instance, two of the properties are PortName and Address. From within the orchestration I would like to reference the port by the string variable PortName. So is there some way to get a collection of all the ports in an orchestration, or to reference a port via a string variable, i.e.
Port['MyPortName'](Microsoft.XLANGs.BaseTypes.Address) = "file://c:\test\out\%MessageId%.xml"
Thanks
In order to dynamically configure Dynamic Logical Send Ports from within an orchestration, one has to store the settings into a persistent datastore (e.g. a database or configuration file) and implement a way to assign those properties dynamically at runtime.
But first, we need to understand what is happening when configuring a Dynamic Send Port.
How to Configure a Dynamic Logical Send Port
Configuring the properties of a dynamic logical send port from within an orchestration involves two steps:
First, the TransportType and target Address properties must be specified on the Send Port. This is usually done in an Expression Shape with code similar to this:
DynamicSendPort(Microsoft.XLANGs.BaseTypes.TransportType) = "FILE";
DynamicSendPort(Microsoft.XLANGs.BaseTypes.Address) = "C:\Temp\Folder\%SourceFileName%";
Second, any additional transport properties must be specified on the context of the outgoing message itself. Virtually all BizTalk adapters have additional properties that are used for the communication between the Messaging Engine and the XLANG/s Orchestration Engine. For instance, the ReceivedFileName context property is used to dynamically set a specific name for when the FILE adapter will save the outgoing message at its target location. This is best performed inside an Assignment Shape, as part of constructing the outgoing message:
OutgoingMessage(FILE.ReceivedFileName) = "HardCodedFileName.xml";
You'll notice that most configuration properties must be specified on the context of the outgoing messages, giving a namespace prefix (e.g. FILE), a property name (e.g. ReceivedFileName) and, obviously, the value that gets assigned to the corresponding property.
In fact, the context properties are classes that live inside the well-known Microsoft.BizTalk.GlobalPropertySchemas.dll assembly. This can be confirmed by looking up the assembly in Visual Studio's Object Browser.
Even though most of the context properties that are necessary to configure Dynamic Logical Send Ports live inside this specific assembly, not all of them do. For instance, the MSMQ BizTalk adapter uses a separate assembly to store its context properties. Obviously, third-party or custom adapters come with additional assemblies as well.
Therefore, in order to set up a context property on a Dynamic Send Port using a flexible approach like the one described below, four pieces of information are necessary:
The fully qualified name of the assembly containing the context property classes.
The namespace prefix.
The property name.
The property value.
Storing Port Settings in a Persistent Medium
An .XSD schema (ContextProperties.xsd, used with the command below) can describe one possible structure for serializing port settings: essentially a list of context properties, each carrying the four pieces of information above.
Once serialized, the specified context properties can then be stored in a SQL database or a configuration file very easily.
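As a rough idea, the object graph obtained after deserializing such settings is assumed to look like this. This is a hypothetical sketch; the exact types are whatever xsd.exe generates from the actual schema, and the member names here simply mirror those used by the helper code further down.
namespace Helper.Schemas
{
    // hypothetical sketch of the types generated from ContextProperties.xsd
    public partial class ContextProperties
    {
        public ContextPropertiesContextProperty[] ContextProperty;
    }
    public partial class ContextPropertiesContextProperty
    {
        public string assembly;    // e.g. "Microsoft.BizTalk.GlobalPropertySchemas"
        public string @namespace;  // e.g. "FILE"
        public string name;        // e.g. "ReceivedFileName"
        public string Value;       // e.g. "HardCodedFileName.xml"
    }
}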
A Flexible Approach to Configuring Dynamic Logical Send Ports
With a simple helper library, setting up the dynamic port configuration is very easy. First, you have to retrieve the serialized settings from the persistent medium. This can easily be achieved using the WCF-SQL Adapter and a simple stored procedure.
Once retrieved, those properties can then be deserialized into a strongly-typed C# object graph. For this, first create a C# representation of the ContextProperties schema described above, using the following command-line utility:
xsd.exe /classes /language:cs /namespace:Helper.Schemas .\ContextProperties.xsd
This generates a partial class that can be improved with the following method:
namespace Helper.Schemas
{
    using System;
    using System.IO;
    using System.Text;
    using System.Xml.Serialization;
public partial class ContextProperties
{
public static ContextProperties Deserialize(string text)
{
using (MemoryStream stream = new MemoryStream())
{
byte[] buffer = Encoding.UTF8.GetBytes(text);
stream.Write(buffer, 0, buffer.Length);
stream.Seek(0, SeekOrigin.Begin);
return (ContextProperties)Deserialize(stream, typeof(ContextProperties));
}
}
public static Object Deserialize(Stream stream, Type type)
{
XmlSerializer xmlSerializer = new XmlSerializer(type);
return xmlSerializer.Deserialize(stream);
}
}
}
Second, applying this configuration involves creating an XLANG/s message from code and setting up the context properties dynamically using reflection, based upon the description of the context property classes specified in the deserialized ContextProperties object graph.
For this, I use a technique borrowed from Paolo Salvatori's series of articles regarding dynamic transformations, which consists of creating a custom BTXMessage-derived class, used internally by the BizTalk XLANG/s engine.
namespace Helper.Schemas
{
using Microsoft.BizTalk.XLANGs.BTXEngine; // Found in Microsoft.XLANGs.BizTalk.Engine
using Microsoft.XLANGs.Core; // Found in Microsoft.XLANGs.Engine
[Serializable]
public sealed class CustomBTXMessage : BTXMessage
{
public CustomBTXMessage(string messageName, Context context)
: base(messageName, context)
{
context.RefMessage(this);
}
public void SetContextProperty(string assembly, string ns, string name, object value)
{
if (String.IsNullOrEmpty(ns))
ns = "Microsoft.XLANGs.BaseTypes";
if (String.IsNullOrEmpty(assembly))
assembly = "Microsoft.BizTalk.GlobalPropertySchemas";
StringBuilder assemblyQualifiedName = new StringBuilder();
assemblyQualifiedName.AppendFormat("{0}.{1}, {2}", ns, name, assembly);
Type type = Type.GetType(assemblyQualifiedName.ToString(), true, true);
SetContextProperty(type, value);
}
internal void SetContextProperty(string property, object value)
{
int index = property.IndexOf('.');
if (index != -1)
SetContextProperty(String.Empty, property.Substring(0, index), property.Substring(index + 1), value);
else
SetContextProperty(String.Empty, String.Empty, property, value);
}
}
}
Now, the last piece of the puzzle is how to make use of this custom class from within an Orchestration. This is easily done in an Assignment Shape using the following helper code:
namespace Helper.Schemas
{
using Microsoft.XLANGs.BaseTypes;
using Microsoft.XLANGs.Core; // Found in Microsoft.XLANGs.Engine
public static class Message
{
public static XLANGMessage SetContext(XLANGMessage message, ContextProperties properties)
{
try
{
// create a new XLANGMessage
CustomBTXMessage customBTXMessage = new CustomBTXMessage(message.Name, Service.RootService.XlangStore.OwningContext);
// add parts of the original message to it
for (int index = 0; index < message.Count; index++)
customBTXMessage.AddPart(message[index]);
// set the specified context properties
foreach (ContextPropertiesContextProperty property in properties.ContextProperty)
customBTXMessage.SetContextProperty(property.assembly, property.@namespace, property.name, property.Value);
return customBTXMessage.GetMessageWrapperForUserCode();
}
finally
{
message.Dispose();
}
}
}
}
You can use this static method inside your Assignment Shape like the code shown hereafter, where OutboundMessage represents the message on which you want to set the context:
OutboundMessage = Helper.Schemas.Message.SetContext(OutboundMessage, contextProperties);
In the first place you shouldn't attempt to do configuration changes like this using an Orchestration. Technically it's feasible to do what you are attempting to do, but as a practice you shouldn't mix up your business process with administration.
The best way to do such things is with ordinary scripts or PowerShell.
To answer your question, you can get the data you want from the BtsOrchestration class in ExplorerOM:
http://msdn.microsoft.com/en-us/library/microsoft.biztalk.explorerom.btsorchestration_members(v=bts.20)
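If you do go down that route anyway, a rough, untested sketch of enumerating an orchestration's ports with ExplorerOM might look like the following. "MyApplication" and the connection string are placeholders, and the exact member names should be checked against the documentation linked above.
using System;
using Microsoft.BizTalk.ExplorerOM;

class ListOrchestrationPorts
{
    static void Main()
    {
        BtsCatalogExplorer explorer = new BtsCatalogExplorer();
        explorer.ConnectionString = "Server=.;Initial Catalog=BizTalkMgmtDb;Integrated Security=SSPI;";
        // "MyApplication" is a placeholder for the BizTalk application name
        Application app = explorer.Applications["MyApplication"];
        foreach (BtsOrchestration orchestration in app.Orchestrations)
        {
            foreach (OrchestrationPort port in orchestration.Ports)
            {
                // list each port defined on each orchestration in the application
                Console.WriteLine("{0} : {1}", orchestration.FullName, port.Name);
            }
        }
    }
}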
I am incredibly new to .NET and Mono. I have a .NET 4 application that I am trying to run locally, and I'm getting a compilation error when I try to run it (using xsp4 on Ubuntu). At the end of the stack trace it says:
/tmp/jeremy-temp-aspnet-0/3b8f3547/App_Web_635c7158_48.cs(32,21): error CS0246: The type or namespace name `bool' could not be found. Are you missing a using directive or an assembly reference?
Does that mean that it doesn't recognize the boolean type? A Google search wasn't much help.
Update - Here's the code:
public virtual @bool ShowRecentPlans {
get {
return ((@bool)(this.GetPropertyValue("ShowRecentPlans")));
When you prefix an identifier (like a type name) with @, you're telling the compiler that, even though it looks like a reserved word, it refers to something defined in your program.
Unless you have something defined somewhere like
public class @bool
{
...
}
then this isn't going to work.
Try
public virtual bool ShowRecentPlans {
get {
return (bool)(this.GetPropertyValue("ShowRecentPlans"));
}
}
For instance, if you wanted to use the keyword new as an identifier, you could:
int new = 5; // error!
int @new = 5; // compiles
@ is of course also used to tell the compiler how a string should be interpreted.
// throws an error because \p and \m look like escape sequences
var path = "c:\pub\myFile.txt";
// compiles
var path = @"c:\pub\myFile.txt";
Also, I just have to ask: what made you use @bool instead of bool to start with?
(And, for the record, using a keyword as an identifier is a very, very bad idea.)
Your return type should be bool instead of @bool.
Suppose I have the following simple wrapper of a NativeClass instance.
public ref class Wrapper
{
private:
    NativeClass *_wrapped;
public:
    Wrapper()
    {
        _wrapped = new NativeClass();
    }
    ~Wrapper()
    {
        delete _wrapped;
    }
    operator NativeClass*()
    {
        return _wrapped;
    }
};
Now, I want to create an instance of Wrapper from C# with Wrapper wrapper = new Wrapper() and use it, via Helper.Foo(wrapper), in another wrapper of native functionality that resides in another assembly (nothing strange about having other functionality not directly related to the wrapped classes in another assembly, IMO):
// Utilities is in another Assembly
public ref class Helper
{
public:
    static void Foo(Wrapper ^wrapper)
    {
        // Do something in native code with wrapper->_wrapped
    }
};
The result with the implicit user conversion is:
candidate function(s) not accessible
If I make _wrapped public it is:
cannot access private member declared in class ...
Now, I've learnt that native type visibility is private outside of the assembly. So how am I supposed to use the wrapped entity in native code outside the assembly where it's defined? I've read about make_public, but you can't use it with template types, so it seems very limiting in the general case. Am I missing something? Is there a more correct solution?
I haven't been able to successfully expose native types using make_public; however, a solution I have used is to put NativeClass in its own native DLL and then (a) reference the native DLL from both assemblies and (b) pass the pointer to the native class around as an IntPtr.
Under the above scenario, instead of having an operator NativeClass* you might use a property such as
property IntPtr WrappedObject {
IntPtr get() { return IntPtr(_wrapped); }
}
You then retrieve the NativeClass pointer in your helper assembly by:
static void Foo(Wrapper ^wrapper)
{
    NativeClass *_wrapped
        = static_cast<NativeClass*>(wrapper->WrappedObject.ToPointer());
    // ... do something ...
}
If you use make_public, your solution of making _wrapped public should work (it would obviously be best to make a public accessor instead). Regarding your comment "I've read about make_public, but you can't use it with template types, so it seems very limiting in the general case": I agree; read here for the workaround I used:
http://social.msdn.microsoft.com/Forums/en-US/vclanguage/thread/b43cca63-b0bf-451e-b8fe-74e9c618b8c4/
More related info:
Best workaround for compiler error C2158: make_public does not support native template types
Good luck!