What are the technical differences between implementing SCORM vs xAPI?

I want to integrate eLearning into an existing system that I already have. I have been reading a lot about the two standards, SCORM and xAPI, but everything I read covered theoretical differences and the pros and cons of each standard. What I want are the technical differences, from a developer's perspective, of implementing those standards in the system. What are the differences when implementing these standards from a development point of view? I just want headlines of the process of developing against each standard. Any references or documentation about that would also be very helpful.
Also, would it be doable or logical to integrate one standard in the system and later integrate the other one? For example, SCORM first, and later, if needed, xAPI?

Let me start with the integration part. Yes, you can do SCORM and later integrate xAPI, though that might require retooling the SCORM course, or the LMS, to do the xAPI part. This is done in practice; a lot of what I do is integrate existing SCORM ecosystems with xAPI and LRSs.
As for differences in SCORM and xAPI, here's some high-level info.
SCORM is a set of specifications that defines the way to package content, the way to have your content report data, and the way an LMS launches and manages SCORM content and data. xAPI is a specification that defines a REST style API and JSON data format to track interactions/activity that happened in content.
As Andrew said, SCORM content finds an embedded API object in the browser DOM and uses that to communicate very structured and specific data to an LMS. xAPI uses a REST HTTP API to communicate various data in a well defined format.
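To make that first point concrete, here is a minimal sketch of the usual API discovery walk that SCORM content performs. SCORM 1.2 exposes the adapter as `API` and SCORM 2004 as `API_1484_11`; the 7-level limit and the `opener` fallback follow the commonly used ADL pseudocode, but treat this as an illustration rather than a drop-in implementation:

```javascript
// Walk up the frame hierarchy looking for the SCORM runtime API
// that the LMS injects into one of the ancestor windows.
function findAPI(win) {
  var attempts = 0;
  // SCORM 1.2 exposes window.API, SCORM 2004 exposes window.API_1484_11
  while (!win.API && !win.API_1484_11 &&
         win.parent && win.parent !== win && attempts < 7) {
    attempts++;
    win = win.parent;
  }
  return win.API_1484_11 || win.API || null;
}

// Check the current window chain first, then the opener (popup launches)
var api = findAPI(window) || (window.opener ? findAPI(window.opener) : null);
if (!api) {
  console.error("Unable to locate the LMS-provided SCORM API adapter");
}
```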
Without some creative programming, SCORM content is typically delivered via a browser from a Learning Management System to the client, usually in the form of HTML, images, and videos. xAPI can track those types of things as well, but since the API is HTTP/REST-like, supporting unmanaged content - simulators, phone apps, games - is a little easier.
SCORM's data model is very clearly defined in the specification and not normally extended. xAPI's data format is defined, but the actual data is much looser and open to the needs of the developer.
SCORM has been around since about the year 2000 in various versions. It is well supported in LMSs and content development tools. And many in the eLearning space know it. xAPI is newer. Support is growing as well as the number of folks who understand it, but it is still less supported than SCORM.
One final thing of note: the SCORM specs never defined a way to get data out of the LMS once the SCO attempt has ended. This made reporting and metrics difficult without getting the LMS vendor to build those features in. xAPI defines a GET endpoint to retrieve data (this might not be performant when you have hundreds of thousands to millions of data points, but you can get the data back out, caveats about permissions aside). Some LRS vendors add reporting and analytics platforms, as well as ways to get your data into BI or data analysis tools.
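As a rough sketch of pulling statements back out over that GET endpoint: the LRS base URL and credentials below are made up for illustration, while the `agent`/`verb` query parameters and the `X-Experience-API-Version` header come from the xAPI spec.

```javascript
// Hypothetical LRS endpoint and credentials for illustration only
const LRS = "https://lrs.example.com/xapi";
const AUTH = "Basic " + btoa("key:secret");

// Pull statements for one actor/verb pair; the LRS pages results
// via the "more" URL returned in the response body.
async function getStatements() {
  const params = new URLSearchParams({
    agent: JSON.stringify({ mbox: "mailto:learner@example.com" }),
    verb: "http://adlnet.gov/expapi/verbs/completed",
    limit: "50"
  });
  const res = await fetch(`${LRS}/statements?${params}`, {
    headers: {
      "Authorization": AUTH,
      "X-Experience-API-Version": "1.0.3"
    }
  });
  const body = await res.json();
  return body.statements; // follow body.more for the next page
}
```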
There's more you'll find as you get into the space but that's some of the things off the top of my head.
I would recommend you read the xAPI spec first, mainly because it is more easily consumed. Then look at SCORM - it has different versions (1.2, and 2004 2nd through 4th editions).
As for implementing content,
SCORM: figure out the version to build to, create the SCOs (the content that reports data to the LMS), have that content find the API embedded in the HTML DOM, use the defined methods (Initialize, Terminate, SetValue, GetValue) to communicate with the LMS, then package it all up in a zip with an XML manifest and supporting XML schemas, and deploy to the LMS.
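As a rough illustration of that runtime conversation, reusing the `findAPI` helper sketched earlier (SCORM 2004 method and data-model names shown; 1.2 uses LMSInitialize/LMSSetValue and the cmi.core.* equivalents):

```javascript
// Typical SCORM 2004 session using the API_1484_11 adapter.
var api = findAPI(window); // helper from the earlier sketch

api.Initialize("");

// Report progress and score as the learner works through the SCO
api.SetValue("cmi.completion_status", "completed");
api.SetValue("cmi.success_status", "passed");
api.SetValue("cmi.score.scaled", "0.85");
api.Commit(""); // ask the LMS to persist what has been set so far

// Read something back, e.g. the learner's name for personalization
var learnerName = api.GetValue("cmi.learner_name");

api.Terminate(""); // end of the attempt/session
```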
xAPI: create your content, preferably supporting an xAPI launch mechanism like TinCan Launch, make REST calls to the LRS's xAPI endpoint using something like Fetch or Requests, and host/package/deploy as you determine.
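And a minimal sketch of the xAPI side: posting one statement to the LRS's statements endpoint with fetch. Again, the endpoint, credentials, and activity id are placeholders; the statement shape (actor/verb/object/result) is from the spec.

```javascript
// Hypothetical LRS endpoint and credentials for illustration only
const LRS = "https://lrs.example.com/xapi";
const AUTH = "Basic " + btoa("key:secret");

// Send a single "completed" statement to the LRS
async function sendStatement() {
  const statement = {
    actor: { mbox: "mailto:learner@example.com", name: "Test Learner" },
    verb: {
      id: "http://adlnet.gov/expapi/verbs/completed",
      display: { "en-US": "completed" }
    },
    object: {
      id: "https://example.com/courses/intro-module",
      definition: { name: { "en-US": "Intro Module" } }
    },
    result: { score: { scaled: 0.85 }, completion: true }
  };

  const res = await fetch(`${LRS}/statements`, {
    method: "POST",
    headers: {
      "Authorization": AUTH,
      "X-Experience-API-Version": "1.0.3",
      "Content-Type": "application/json"
    },
    body: JSON.stringify(statement)
  });
  return res.json(); // the LRS returns the statement id(s)
}
```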
Reading the specs is the driest, but most authoritative, way to learn about them. There are also some very good articles and videos out there by various vendors and implementers.

The main technical difference is that SCORM uses a JavaScript API to communicate between a course, in a window or frame wrapped by the LMS, and the LMS itself. xAPI uses a RESTful API to communicate with an LRS over HTTP.

Related

AJAX in SCORM content

I'm new to SCORM, and I'm planning to implement an export-to-SCORM feature.
Currently, playing the content (which does not use a SCORM player) works like a small HTML5 web app, and as part of its implementation it uses features like AJAX, for example to lazy-load some files (JS files, CSS files).
I was thinking of just reusing the same player code when creating the SCORM course.
Are SCORM courses required to work offline or from disk?
If that's the case, techniques like the lazy loading I described above won't work.
I imagine there could be SCORM player mobile apps which store and load the SCORM course on the device, where a web server is not available and so AJAX wouldn't work. (Or do these mobile apps actually implement a local web server within themselves in order to play the SCORM content?)
AFAIK, SCORM 1.2 and 2004 don't specifically restrict whether AJAX can be used in the presentation layer of the SCORM content, but in practice, when the content is played in a player, it obviously matters.
SCORM courses are not required to work offline or from disk (depending on what that means), which isn't to say they can't. SCORM courses are expected to load the initial resource from the location they've been imported to, so while AJAX to a different location could work, there are issues with security (credentials can't be secured or trusted), third-party loading, etc. The exception is AJAXing back to the loading host to retrieve content shipped with the package, which should work, and there are mechanisms that can be used to increase security for content retrieval.
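For the lazy-loading case specifically, keeping requests relative to the launch page keeps them on the origin the LMS served the package from. A rough sketch (the file paths are made up for illustration):

```javascript
// Lazy-load an asset that ships inside the SCORM package, using a URL
// relative to the launch page so the request goes back to the same
// origin the LMS served the content from.
function loadPackageAsset(relativePath) {
  if (relativePath.endsWith(".css")) {
    var link = document.createElement("link");
    link.rel = "stylesheet";
    link.href = relativePath; // e.g. "assets/styles/quiz.css"
    document.head.appendChild(link);
    return Promise.resolve();
  }
  // Anything else: fetch it relative to the current page
  return fetch(relativePath).then(function (res) {
    if (!res.ok) throw new Error("Failed to load " + relativePath);
    return res.text();
  });
}

// Example usage: loadPackageAsset("data/questions.json").then(JSON.parse)
```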
The one offline implementation of a SCORM player that I'm familiar with (Rustici Software's) requires the packages to be all inclusive because there is no web server available. Again, that isn't to say that it couldn't be implemented with one, just that I'm not aware that such a thing has been implemented.

How can I reference a training course from another LMS?

I've been asked to make a course that is available in one cloud-based learning management system (Brightspace) available in other learning management systems. The intent is that someone would open the course in a third-party learning management system (LMS), like Moodle, and then, from what I've read, an iframe would load containing the course as hosted by the original LMS (Brightspace).
I've been researching this all day and I haven't made any headway. It seems like there is OAuth between the LMSs, but I can't work it out.
How can I create a SCORM package that will contain an iframe to a central LMS? And, is there any standardised LMS/SCORM protocol that handles authentication or something like that?
Thanks!
Embedding a SCORM package inside another SCORM package is not the way to go. The solution to this problem as intended by the makers of SCORM would be to export the SCORM package and import it into the third-party LMS, because that's what SCORM is all about. However, this is obviously not what you want to achieve.
In general, a SCORM package is simply a packaged website (with a manifest) which requires a JS API to be provided by the embedding LMS. So basically, you can do "anything" inside a SCORM package, e.g. creating an iframe, calling functions in the parent browsing context, opening a popup, etc., as long as it is not prohibited by web security mechanisms such as the same-origin policy.
In theory, if your LMS served the content of the package "as is" and without authentication, i.e. you had a deep link to the start page (think index.html) inside the SCORM package (and the LMS did not send protective headers such as X-Frame-Options), you would generally be able to embed this page in any iframe on the web, and thus within another SCORM package. The remaining problem would be the same-origin policy, which would prevent the package sitting in the child frame from calling the API in the parent frame. There might be some tricks to work around this, e.g. using a reverse proxy under the same origin that forwards to the other domain, but this will most likely not be practical, or will be prohibited by other mechanisms. Even if you can work around it, you would still have to manually pass through/forward the API calls from the embedded package up to your LMS's API adapter. Overall, this approach is not really practical or feasible.
In general, SCORM does not deal with authentication. Have a look at the IMS Learning Tools Interoperability (LTI) specification for that purpose. It allows you to launch a tool/content hosted by another party and provides back-channels for, e.g., grades.
I think the folks at Rustici Software provide a hosted SCORM RTE that can be launched via LTI; you may want to have a look at that, too...
Cross domain fun with SCORM
Another quick workaround to "my content is on another domain" is to reference the JS/CSS on the media/content server, but include an index (player or launch) HTML file on the intended LMS, which can bring those in without the cross-domain issues.
So your re-packaged SCO would just have the necessary launch file, but inside it, instead of "css/styles.css", you'd point to "//domain.com/path/to/css/styles.css". Repeat the same for JavaScript files.
The content may be a little more complicated than just statically defined assets in an HTML document. If that is the case, it may take some further adjustments.
Wiki here has some extra tips https://github.com/cybercussion/SCOBot/wiki
I'll also note that I added a cross-domain feature to the SCOBot RTE which uses the iframe postMessage API to enable cross-domain communication from domain A to domain B. You would have to be able to place the controller on domain B with the content.
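That SCOBot feature is its own implementation, but the general postMessage pattern looks roughly like this. The domains, message shape, and 2004-style API calls below are illustrative, not SCOBot's actual code, and assume the real API adapter is reachable from the launch page on domain A:

```javascript
// Domain A (the launch page on the LMS): forward SCORM calls coming
// from the content iframe on domain B to the real API adapter.
window.addEventListener("message", function (event) {
  if (event.origin !== "https://content.domain-b.example") return; // trust only domain B
  var msg = event.data; // e.g. { id: 1, method: "SetValue", args: ["cmi.score.scaled", "0.9"] }
  var adapter = window.API_1484_11; // assumed to live on this window; otherwise locate it first
  var result = adapter[msg.method].apply(adapter, msg.args);
  event.source.postMessage({ id: msg.id, result: result }, event.origin);
});

// Domain B (the content): send calls up instead of touching the API directly.
function remoteCall(method, args) {
  window.parent.postMessage({ id: Date.now(), method: method, args: args },
                            "https://lms.domain-a.example");
}
remoteCall("SetValue", ["cmi.completion_status", "completed"]);
```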
GL

What is the use of GateWayScripts in DataPower?

Could you please let me know any useful online resources to learn from, and some scenarios to implement to explore it further? Thanks.
DataPower appliances have historically been, in order:
XML transformation acceleration devices (that used to be a thing; XSLT was too slow to process)
SSL offloading devices (again, that used to be a thing, for the same reason)
Web site and application gateways: both web site and web service security, concentrated around HTTP and SOAP/XML application-layer mechanisms and standards (SSL/TLS, WS-S, SAML, etc.), but also token management, security conversion... think "super SSO" plus application security gateway
More specialized integration tools: transformation of XML (with XSLT), transformation to/from non-XML formats (like CSV), database connections, integration patterns (routing, composing, and a LOT more). Some called DataPower a lightweight ESB.
More specialized uses: B2B (EDI), JSON processing, REST/JSON support, API management (when used as the deployment point for API Connect)
Notice that the later features all build on the earlier ones (ESB is based on WS security, etc.).
As you may know, most DataPower development is done with transformations. The default, established language for them is XSLT (XQuery is also a historic, less popular option).
XSLT is both one of the most powerful and one of the most horrible languages to work with. Kind of like the Perl+regex of the XML world...
...but there is another problem with XSLT: it was not designed to work with JSON, which had the DataPower of 10 years ago heading for a fast retirement.
At first, IBM designed pseudo-XML ways of dealing with JSON. You could convert inbound JSON to XML and work with the JSON as XML in XSLT. The inverse operation was to use XSLT to generate JSON... it worked perfectly but kind of looked like old-school HTML/PHP merging code.
So IBM came up with a good idea: GatewayScript.
(Mostly based on many other good ideas)
GatewayScript is basically ECMAScript 2015 (ES6) + CommonJS 1.0 + Many super popular JS crypto libraries.
ECMAScript is obviously more known as JavaScript.
Pertaining to your question, the main advantage of GatewayScript is that it enables easier JSON web service development of all the features in the list above, for modern REST/JSON APIs, instead of the older (but still good) SOAP/XML web services.
GatewayScript has now been around for years and is no longer a "beta" option.
Here are some other neat GatewayScript features:
Access to a DOM model, representing the incoming and outgoing version of the document, in simple JS notation.
Better errors in the logs when something does not work (you get the .js line number, unlike with the XSLT errors)
Better debugging options (you can enable a line-by-line debugger)
Some examples from the web written in Node.js and other JS frameworks can work... which is amazing
A very useful IBM site (DataPower Playground) where you can learn and test GatewayScript examples without your own DataPower, à la w3schools
And more.
I hope this helps.
GhislainCote's answer is very complete, but basically GatewayScript is Node.js with an added framework for handling the session object, which will contain your data/payload.
There are also some special objects, e.g. service-metadata and header-metadata, that give access to DataPower variables and headers.
Sample scripts are available in the store:///gatewayscript/ directory, and as store:///healthcheck.js, for example.
Also review the Knowledge Center; it contains a lot of help and information about GatewayScript:
https://www.ibm.com/support/knowledgecenter/SS9H2Y_7.7.0/com.ibm.dp.doc/gatewayscript_model.html
GatewayScript is very powerful. I've coded support for AS2 de-/enveloping (for customers not having the B2B Module option) and RosettaNet handling in GatewayScript, so there is pretty much no limit to what you can achieve!
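To give a feel for the basics, here is a minimal sketch of a GatewayScript transform action along the lines described above: read the inbound JSON from the session object, adjust it, set a header, and write it out. Double-check the exact module APIs in the Knowledge Center for your firmware level; the added field name is just an example.

```javascript
// Minimal GatewayScript transform: read the inbound JSON payload,
// add a field, set a response header, and write the result out.
var hm = require('header-metadata');

session.input.readAsJSON(function (error, json) {
  if (error) {
    // Signal an error to the processing policy
    session.reject('Could not parse inbound JSON: ' + error);
    return;
  }
  json.processedBy = 'gatewayscript-demo'; // illustrative field name
  hm.current.set('Content-Type', 'application/json');
  session.output.write(json);
});
```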

Web Player Scorm previewer?

I'm very new to SCORM and I'm not entirely sure I'm even asking this question correctly, so please pardon my newness. I have been tasked with implementing SCORM "previewer" functionality in a website we're building.
I won't need any of the extended features that I understand are provided by the SCORM wrapper, such as LMS integration and testing, but simply the ability to preview the images and Flash files as they were created by the author of the SCORM package.
We have additional requirements that prevent us from using an external cloud-based solution.
Is this possible? Am I completely misunderstanding the way this works?
The SCORM specification defines a Content Aggregation Model (CAM) and a Run-Time Environment (RTE). The CAM isn't relevant to your question, but the RTE might be. The RTE defines how SCORM content communicates with an LMS.
It sounds like in your 'previewer' application you don't want to store any data in the LMS. For some SCORM packages, this will mean there is no work to do at all, since you don't need to implement anything in order to store no data!
Other SCORM packages will expect a response from the LMS and will error if they don't receive one. I suspect this is what is happening in your case, and why you have been tasked with creating a previewer application. You will therefore need to work out which SCORM data the package is sending, catch those requests, and return the expected responses. See scorm.com for an overview of the Run-Time Environment.
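As a rough sketch of what that can look like for SCORM 1.2 content (SCORM 2004 content looks for `API_1484_11` with Initialize/SetValue/Terminate instead), the previewer page can expose a stub adapter that accepts every call and persists nothing. It has to live in a window the content can reach by walking `parent`/`opener`:

```javascript
// Stub SCORM 1.2 adapter for a previewer: accept every call,
// persist nothing, always report success.
var cache = {}; // throwaway in-memory store so GetValue echoes SetValue
window.API = {
  LMSInitialize: function () { return "true"; },
  LMSFinish: function () { return "true"; },
  LMSCommit: function () { return "true"; },
  LMSSetValue: function (element, value) { cache[element] = value; return "true"; },
  LMSGetValue: function (element) { return cache[element] || ""; },
  LMSGetLastError: function () { return "0"; },
  LMSGetErrorString: function () { return "No error"; },
  LMSGetDiagnostic: function () { return ""; }
};
```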
If you need a more generic solution that will work with any SCORM package, I fear you will need a complete SCORM implementation. This will be a LOT of work to do yourself. I would normally recommend SCORM Cloud but you say you need an internally hosted solution. If this preview application is likely to be well used and/or customer facing, you should take a look at SCORM Engine. If not, perhaps consider hosting an Open Source LMS such as Moodle?

SCORM Content on our web server talking to LMS using LTI

Hope all are doing well,
I wanted to know if the scenario below can be achieved.
We have a SCORM package that we want to host on our own web server, and we want to specify the link to it in an LMS (Blackboard, Moodle).
When a user logs into the LMS, it should perform single sign-on (with LTI) and show the SCORM content from our web server.
Can the SCORM content on our web server access details of the logged-in user (user ID, score details, etc.)?
I have searched and found some details below:
http://scorm.com/scorm-solved/scorm-cloud-developers/how-to-get-started-with-the-scorm-cloud-api/
but this API is not free.
Having your own web server talk to another system, and then dealing with user credentials, assignments, and launching of courseware, would be a tough one. These systems essentially have a runtime API, which manages the student attempt, that SCORM interacts with.
There are a few parts of SCORM support that you'd obtain from Rustici's SCORM Engine that are actually worth paying for.
100% SCORM Compatibility (1.2, 2004)
I believe they have a .NET and Java implementation (uncertain about PHP) that you can plug into your platform. If you don't use those languages I'm sure they'd be grateful to answer any questions you have on further support.
You're covered on importing PIF/ZIP Content Aggregation Model packages. Even the robust ones.
SCORM Cloud hosting could negate the need for the SCORM Engine (#2)
The main reason you can't find much for free in this space (with the exception of Moodle) is that it's an epic amount of work, which is also why you find many platforms with mixed support for SCORM. There is also the legacy space and the aged, antiquated stuff that comes with it.
In the end you have the runtime, content package parsing, and, if you're using SCORM 2004, all the sequencing and navigation rules. Those three things don't sound like much, but they are an exhausting amount of work from scratch.
Hope that all made sense,
Mark
