Does the EDI Tradacoms standard provide a test flag?

I'm currently looking into coding a B2B/EDI integration, using the TRADACOMS standard, with a UK-based company. I have lots of experience with EDIFACT; however, TRADACOMS is very new to me.
In EDIFACT (EU-based) there is something called a test flag (UNB11).
In X12 (US-based), there is ISA15 (I/T/P).
Is there a similar field in the TRADACOMS spec to use? I have received some information on the TRADACOMS standards, but can't find any mention of this field.
If no such thing is present within the standard, then how would this typically be done?

No, there is no explicit Test/Production flag as there is with X12 or EDIFACT.
This requirement can be met by simply using different Application Codes for each purpose.
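For illustration, a minimal, hypothetical sketch of the receiving side branching on such a code is shown below; the "-TEST" suffix and the choice of field to read it from (for example the application reference in the STX header) are conventions agreed with your trading partner, not anything defined by the TRADACOMS standard.

```csharp
using System;

static class TradacomsRouting
{
    // Hypothetical convention: test interchanges carry an application code
    // ending in "-TEST" (e.g. "ORDHDR-TEST"); production interchanges do not.
    // Neither the suffix nor the field it lives in comes from the standard.
    public static bool IsTestInterchange(string applicationCode)
    {
        return applicationCode != null &&
               applicationCode.EndsWith("-TEST", StringComparison.OrdinalIgnoreCase);
    }
}
```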

SOA: Use SDO (Service Data Object)? [duplicate]

I've been programming in Delphi with Midas/DataSnap for quite a long time and have been quite happy with it. Moving to .NET, I'm more than happy with the ADO.NET DataSet. For CRUD applications, I'm highly uncomfortable with any kind of ORM. A generic data structure with automatic diff/delta handling gets the job done better for me, an average database application developer.
I tried to study Java years ago and could not find a similar idea implemented. The closest I could find is SDO (Service Data Objects). I thought it would be widely adopted when I saw it, but I was wrong. Even though the spec is rather old now, I can hardly find anyone discussing it or using it extensively. Judging from the information I can find on the internet, SDO usage is very limited.
I'm wondering if it's dying. Any experience with SDO you want to share? Is manual DTO coding always better?
OK, I see. The answer is "no" ;)
Same for me when I tried SDO for the first time. An old spec, little feedback... definitely no.
I wouldn't recommend using SDO unless it's imposed on you by some other part of the project.
WebSphere Process Server uses SDO. It's not really a bad API once you learn it. But the spec and the documentation are vague. They don't spell out what happens if you ask for a field that doesn't exist, or whether type conversions happen while getting or setting fields, to name two gripes.
I don't think the API defines how to define new types, so that part will be implementation-specific. Type definitions are based on XSD, so you'll be working with those and all of the associated standards.
As others have implied, the API isn't widely used. This means it'll be hard to find people experienced with it, or help using it.

Which tincan verbs to use

For data normalisation of standard Tin Can verbs, is it best to use verbs from the Tin Can Registry (https://registry.tincanapi.com/#home/verbs), e.g.
completed http://activitystrea.ms/schema/1.0/complete
or to use the ADL verbs like those defined:
in the 1.0 spec at https://github.com/adlnet/xAPI-Spec/blob/master/xAPI.md
in this article http://tincanapi.com/2013/06/20/deep-dive-verb/
and listed at https://github.com/RusticiSoftware/tin-can-verbs/tree/master/verbs
e.g.
completed http://adlnet.gov/expapi/verbs/completed
I'm confused as to why those in the registry differ from every other example I can find. Is one of these out of date?
It really depends on which "profile" you want to target with your Statements. If you are trying to stick to e-learning practices that most closely resemble SCORM or some other standard, then the ADL verbs may be the most fitting. It is a very limited set, and really only the "voided" verb is provided for by the specification. The other verbs were related to those found in 0.9 and have become the de facto set, but they aren't any more "standard" than any other URI. If you are targeting statements to be used in an Activity Streams way, specifically with a social application, then you may want to stick with their set. Note that there are verbs in the Registry that are neither ADL-coined nor provided by the Activity Streams specification.
If you aren't targeting any specific profile (or existing profile) then you should use the terms that best capture the experiences which you are trying to record. And we ask that you either coin those terms at our Registry so that they are well formed and publicly available, or if you coin them under a different domain then at least get them catalogued in our Registry so others may find them. Registering a particular term in one or more registries will hopefully help keep the list of terms from exploding as people search for reusable items. This will ultimately make reporting tools more interoperable with different content providers.
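As a concrete illustration of where the chosen URI ends up, here is a minimal sketch that builds a simplified statement; the actor and activity values are placeholders, and either of the two "completed" URIs from the question could be used depending on the profile you target.

```csharp
using System;
using System.Collections.Generic;
using System.Text.Json;

class VerbExample
{
    static void Main()
    {
        // Simplified xAPI statement; actor and object values are placeholders.
        var statement = new
        {
            actor = new { mbox = "mailto:learner@example.com", name = "Example Learner" },
            verb = new
            {
                // ADL verb from the question; swap in
                // "http://activitystrea.ms/schema/1.0/complete" to follow the
                // Activity Streams / Registry usage instead.
                id = "http://adlnet.gov/expapi/verbs/completed",
                display = new Dictionary<string, string> { ["en-US"] = "completed" }
            },
            @object = new { id = "http://example.com/activities/course-101" }
        };

        Console.WriteLine(JsonSerializer.Serialize(statement,
            new JsonSerializerOptions { WriteIndented = true }));
    }
}
```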

Reference in understanding NCPDP D.0

I am working on a healthcare project and have come across the NCPDP D.0 standard. Although I have googled and found some basic information on wikis and other sites, I was looking for a simple reference or an open source example of software handling pharmacy transactions in NCPDP D.0 format.
Has anyone in the community worked or is currently working on this? If you can share some information, it would be a great help.
Thanks,
HSR.
I think that you will be hard pressed to find any public documentation or open source software following this standard. I haven't been able to find it myself, and to gain access to the standard it's my understanding that you will need to be a member of NCPDP. That's a $675 cost, and you are not allowed to share the standard outside of your organization (see their Standards Purchase page).
In addition you'll also need the X12 5010 standard. On the X12 store this could set you back a few thousand dollars.
In any event, the documentation on the NCPDP.org site, though incomplete, is decent; you should check it out.

Is XForms still a live standard?

The XForms standard page seemed to indicate that it was no longer live, and that HTML5 kinda sorta does the job now. Is this the case? I'm looking at storing heterogeneous data nuggets as XML fragments and generating an editor page according to the datatype.
To add to Phil's answer:
The XForms Working Group at W3C is active and currently working on XForms 2.0. See in particular the proposed 2.0 features on the wiki and the in-progress draft of the spec as of February 2012.
Also I don't think it's fair to say that HTML 5 "does the job". HTML 5 forms bring small and welcome improvements over HTML 4 forms, but they don't bridge the gap with XForms.
XForms on the other hand provides:
MVC architecture
XML data model (you like it or you don't, of course)
a powerful repeat model with xf:repeat
declarative properties and calculations
declarative event handlers
integration between the data model and REST services with xf:submission
built-in notion of hint, help, and alert messages
And I am probably missing some.
UPDATE 2016-11-28: For an answer up to date as of the end of 2016, please see this newer question.
The standard definitely isn't dead, although it's perhaps true to say that it hasn't gained much traction within the standard web-browsing ecosystem.
I have worked on a number of projects where XForms has been used as the forms layer in some bespoke application; in my cases the XForms parts have been handled by either Backplane BX or Ubiquity XForms, both of which may be worth taking a look at depending on your requirements (full disclosure: I've worked in the past as an implementer on both projects). Backplane BX is Windows/IE specific; Ubiquity XForms is a cross-browser, client-side javascript library; both are open source.
There are also a number of other libraries that I've not worked with but which I've heard good things about: Orbeon and XSLTForms spring to mind, but a more complete, albeit slightly outdated, list can be found here.

What is the difference between AntiXss.HtmlEncode and HttpUtility.HtmlEncode?

I just ran across a question with an answer suggesting the AntiXss library to avoid cross-site scripting. It sounded interesting, but reading the MSDN blog, it appears to just provide an HtmlEncode() method. But I already use HttpUtility.HtmlEncode().
Why would I want to use AntiXss.HtmlEncode over HttpUtility.HtmlEncode?
Indeed, I am not the first to ask this question. And, indeed, Google turns up some answers, mainly:
A white-list instead of black-list approach
A 0.1ms performance improvement
Well, that's nice, but what does it mean for me? I don't care so much about the performance of 0.1ms and I don't really feel like downloading and adding another library dependency for functionality that I already have.
Are there examples of cases where the AntiXss implementation would prevent an attack that the HttpUtility implementation would not?
If I continue to use the HttpUtility implementation, am I at risk? What about this 'bug'?
I don't have an answer specifically to your question, but I would like to point out that the white-list vs. black-list approach is not just "nice". It's important. Very important. When it comes to security, every little thing is important. Remember that with cross-site scripting and cross-site request forgery, even if your site is not showing sensitive data, a hacker could infect your site by injecting JavaScript and use it to get sensitive data from another site. So doing it right is critical.
OWASP guidelines specify using a white-list approach. PCI compliance guidelines also specify this in coding standards (since they refer to the OWASP guidelines).
Also, the newer version of the AntiXss library has a new function, .GetSafeHtmlFragment(), which is handy for those cases where you want to store HTML in the database and have it displayed to the user as HTML.
Also, as for the "bug", if you're coding properly and following all the security guidelines, you're using parameterized stored procedures, so the single quotes will be handled correctly. If you're not coding properly, no off-the-shelf library is going to protect you fully. The AntiXss library is meant to be a tool to be used, not a substitute for knowledge. Relying on the library to do it right for you would be like expecting a really good paintbrush to turn out good paintings without a good artist.
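In case a less experienced reader wonders what "parameterized" buys you here, below is a minimal sketch; the connection string, stored procedure name, and parameter definition are placeholders, not part of any particular project.

```csharp
using System.Data;
using System.Data.SqlClient;

static class CommentStore
{
    // The comment is passed as a parameter rather than concatenated into SQL,
    // so embedded single quotes arrive as data and cannot alter the statement.
    public static void SaveComment(string connectionString, string comment)
    {
        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand("dbo.InsertComment", conn))
        {
            cmd.CommandType = CommandType.StoredProcedure;
            cmd.Parameters.Add("@Comment", SqlDbType.NVarChar, 4000).Value = comment;

            conn.Open();
            cmd.ExecuteNonQuery();
        }
    }
}
```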
Edit - Added
As asked in the question, here is an example of where AntiXss will protect you and HttpUtility will not:
HttpUtility.HtmlEncode and Server.HtmlEncode do not prevent Cross Site Scripting
That's according to the author, though. I haven't tested it personally.
It sounds like you're up on your security guidelines, so this may not be something I need to tell you, but just in case a less experienced developer is out there reading this, the reason I say that the white-list approach is critical is this.
Right now, today, HttpUtility.HtmlEncode may successfully block every attack out there, simply by removing/encoding < and >, plus a few other "known potentially unsafe" characters, but someone is always trying to think of new ways of breaking in. Allowing only known-safe (white-list) content is a lot easier than trying to think of every possible unsafe bit of input an attacker could possibly throw at you (the black-list approach).
In terms of why you'd use one over the other, consider that the AntiXSS library gets released more often than the ASP.NET framework. Since, as David Stratton says, 'someone is always trying to think of new ways of breaking in', when someone does come up with one, the AntiXSS library is much more likely to get an updated release to defend against it.
The following are the differences between Microsoft.Security.Application.AntiXss.HtmlEncode and System.Web.HttpUtility.HtmlEncode methods:
Anti-XSS uses the white-listing technique, sometimes referred to as the principle of inclusions, to provide protection against Cross-Site Scripting (XSS) attacks. This approach works by first defining a valid or allowable set of characters, and encoding anything outside this set (invalid characters or potential attacks). System.Web.HttpUtility.HtmlEncode and other encoding methods in that namespace use the principle of exclusions and encode only certain characters designated as potentially dangerous, such as the <, >, & and ' characters (a short sketch contrasting the two calls follows this list).
The Anti-XSS library's list of white (or safe) characters supports more than a dozen languages (Greek and Coptic, Cyrillic, Cyrillic Supplement, Armenian, Hebrew, Arabic, Syriac, Arabic Supplement, Thaana, NKo and more).
The Anti-XSS library has been designed specifically to mitigate XSS attacks, whereas the HttpUtility encoding methods are created to ensure that ASP.NET output does not break HTML.
Performance - the average delta between AntiXss.HtmlEncode() and HttpUtility.HtmlEncode() is +0.1 milliseconds per transaction.
Anti-XSS Version 3.0 provides a test harness which allows developers to run both XSS validation and performance tests.
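To see the two approaches side by side, here is a small sketch contrasting the calls compared above. It assumes a project reference to the AntiXSS library, with the namespace and class names as given in the list; the input string is just an example.

```csharp
using System;
using System.Web;                       // HttpUtility.HtmlEncode
using Microsoft.Security.Application;   // AntiXss.HtmlEncode (AntiXSS library)

class EncoderComparison
{
    static void Main()
    {
        string input = "<script>alert('XSS')</script>";

        // Principle of exclusions: encodes only a fixed set of characters
        // designated as potentially dangerous.
        Console.WriteLine(HttpUtility.HtmlEncode(input));

        // Principle of inclusions (white list): encodes everything outside
        // the allowed character set.
        Console.WriteLine(AntiXss.HtmlEncode(input));
    }
}
```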
Most XSS vulnerabilities (any type of vulnerability, actually) are based purely on the fact that existing security did not "expect" certain things to happen. Whitelist-only approaches are more apt to handle these scenarios by default.
We use the white-list approach for Microsoft's Windows Live sites. I'm sure that there are any number of security attacks that we haven't thought of yet, so I'm more comfortable with the paranoid approach. I suspect there have been cases where the black-list exposed vulnerabilities that the white-list did not, but I couldn't tell you the details.
