Making a FHIR server using C# and SQL Server (RDBMS) - dstu2-fhir

I am excited about FHIR's promise, and I have spent the last couple of days getting my head around the subject.
We have an existing SQL Server database containing health-related records. We are trying to communicate using FHIR-compliant messages.
Sending data: Based on the specification at http://hl7.org/fhir/, and using the object model from https://www.nuget.org/packages/Hl7.Fhir.DSTU2, I can transform my relational data into Hl7.Fhir.Model objects. Then it's a matter of serializing that data to either JSON or XML, along the lines of the sketch below.
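For example, a minimal sketch of the sending side, assuming a hypothetical Patients table (the column-to-property mapping is made up; the serializer calls are from the DSTU2-era .NET API):

    using Hl7.Fhir.Model;
    using Hl7.Fhir.Serialization;

    // Map a row from the (hypothetical) Patients table to a FHIR resource...
    var patient = new Patient();
    patient.Name.Add(new HumanName().WithGiven("John").AndFamily("Doe"));
    patient.BirthDate = "1970-01-01";

    // ...then serialize it to JSON or XML.
    string json = FhirSerializer.SerializeResourceToJson(patient);
    string xml  = FhirSerializer.SerializeResourceToXml(patient);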
Consuming data: We can map the incoming data to Hl7.Fhir.Model. But I find it difficult to map extensions (i.e., anything that is not a direct property of the model) to our columns. Is there a way I can do this easily?
Is SQL Server a poor choice for building a FHIR server? Do I have to consider using MongoDB / DocumentDB instead?

You can add tables to support extensions directly, if you want. Of course, you would not understand those extensions internally or make use of the content in them, but that would be just like using Mongo etc.
But you do not have to round-trip extensions. Many, many FHIR implementations are exactly what you describe: a FHIR facade over an existing schema, usually a relational database. They support the specific extensions they have decided to support by building them into their schema (or the columns already existed). A sketch of that approach follows.
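For instance, a hedged sketch of mapping one known extension URL to a dedicated column (the URL, the EyeColour column, and the row object are all made up for illustration; SetExtension/GetExtension are helpers in the Hl7.Fhir .NET API):

    using Hl7.Fhir.Model;

    // Hypothetical extension URL we have decided to support,
    // backed by a dedicated column (e.g. Patients.EyeColour).
    const string EyeColourUrl = "http://example.org/fhir/StructureDefinition/eye-colour";

    // Outgoing: column value -> extension on the resource.
    patient.SetExtension(EyeColourUrl, new FhirString(row.EyeColour));

    // Incoming: extension on the resource -> column value.
    var ext = patient.GetExtension(EyeColourUrl);
    string eyeColour = (ext?.Value as FhirString)?.Value;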

Related

How should I specify the resource database via HTTP requests?

I have a REST API that will be facilitating CRUD against multiple databases. These databases all represent the same data for different locations within the organization (i.e., we have 20 or so implementations of a software package and we want to read from all of the supporting databases via one API).
I was wondering what the best practice would be for specifying which database a request should access resources from.
For example, right now in my request headers I have a custom "X-" header that would represent the database id. Unfortunately, this sort of thing feels a bit like a workaround.
I was thinking of a few other options:
I could bake the Database Id into the URI (/:db_id/resource/...)
I could modify the Accept header, as is sometimes done for API versioning
I could split up the API to be one service per database
Would one of the aforementioned options be considered "better" than the others, and if not what is considered the "best" option for this sort of architecture?
I am, at the moment, using ASP.NET Web API 2.
These databases all represent the same data for different locations within the organization
I think this is the key to your answer: you don't want to expose internal implementation details (like database IDs) outside your API. What if you consolidate, or change your internal implementation one day?
However, this sentence reveals a distinction that is meaningful to the business: the location.
So I'd make the location part of the URI:
/api/location/{locationId}/resource...
Then map the locationId internally to a database ID. The locationId could also be a name, a code, or anything unique that would be meaningful to the API client.
Then, if you later consolidate multiple locations into the same database or otherwise change your internal implementation, the clients don't have to change.
In addition, whoever configures the client applications can do so in terms of something meaningful to the business: the location they are interested in. A minimal sketch of this routing follows.
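Since you mention ASP.NET Web API 2, here is a hedged sketch using attribute routing (the controller, the LocationMap helper, and the route names are hypothetical; attribute routing requires config.MapHttpAttributeRoutes() at startup):

    using System.Web.Http;

    // The location is part of the URI and is resolved internally,
    // so clients never see database IDs.
    [RoutePrefix("api/location/{locationId}")]
    public class PatientsController : ApiController
    {
        [HttpGet]
        [Route("patients/{id:int}")]
        public IHttpActionResult GetPatient(string locationId, int id)
        {
            // Hypothetical internal lookup: location -> connection string.
            string connectionString = LocationMap.ResolveConnectionString(locationId);
            // ... query the resolved database and return the resource ...
            return Ok();
        }
    }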

Best UI tool / language to query MarkLogic data

We will be moving from Oracle to MarkLogic 8 as our datastore and will be using MarkLogic's Java API to work with the data.
I am looking for a UI tool (like SQL Developer for Oracle) that can be used with MarkLogic. I found that MarkLogic's Query Manager can be used for accessing data, but I see multiple options with respect to language:
SQL
SPARQL
XQuery
JavaScript
We need to perform CRUD operations and search the data, and our testing team knows SQL (from Oracle), so I am unsure which route to follow and on what basis to decide which one or two languages are worth exploring. We will most likely use the JSON document type.
Any help/suggestions would be appreciated.
You already mention you will be using the MarkLogic Java Client API. That should cover most of the common needs you will have, including search, CRUD, facets, and lexicon values, and it allows custom functionality through REST extensions, since the Client API is layered on the MarkLogic REST API. To a large extent it saves you from having to write code inside MarkLogic.
Apart from that you can run ad hoc commands from the Query Console, using any of the above-mentioned languages. SQL will require the presence of a so-called SQL view (see also your earlier question Using SQL in Query Manager in MarkLogic). SPARQL will require enabling the triple index and ingesting RDF data.
That leaves XQuery and JavaScript, which have pretty much identical expressive power and performance. If you are unfamiliar with XQuery and XML languages in general, JavaScript might be more appealing. You can also hit the underlying REST API directly for quick tests, as sketched below.
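A hedged C# sketch against the MarkLogic REST API (host, port, and credentials are assumptions; a REST instance typically listens on port 8000 with digest authentication):

    using System;
    using System.Net;
    using System.Net.Http;
    using System.Text;
    using System.Threading.Tasks;

    class MarkLogicRestDemo
    {
        static async Task Main()
        {
            // Assumed REST instance and credentials.
            var handler = new HttpClientHandler
            {
                Credentials = new NetworkCredential("admin", "admin")
            };
            using (var client = new HttpClient(handler) { BaseAddress = new Uri("http://localhost:8000") })
            {
                // Create/update a JSON document (C and U of CRUD).
                var body = new StringContent("{\"name\":\"test\"}", Encoding.UTF8, "application/json");
                await client.PutAsync("/v1/documents?uri=/demo/doc1.json", body);

                // Read it back (R).
                Console.WriteLine(await client.GetStringAsync("/v1/documents?uri=/demo/doc1.json"));

                // Search with a string query, JSON results.
                Console.WriteLine(await client.GetStringAsync("/v1/search?q=test&format=json"));

                // Delete (D).
                await client.DeleteAsync("/v1/documents?uri=/demo/doc1.json");
            }
        }
    }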
HTH!

Bi-Directional Sync on Android Using SyncAdapter

I am planning to create a SQLite table in my Android app. The data comes from the server via a web service.
I would like to know the best way to do this.
Should I transfer the data from the web service as a SQLite DB file and merge it, get all the data as a SOAP request and parse it into the table, or use a REST call?
The data is roughly 2 MB across 100 columns.
Please advise on the best approach to get this data quickly, with the least load on the device.
My workflow is:
Download a set of 20,000 addresses and save them to the device's SQLite database. This happens only once, when the app runs for the first time or when the whole app's data is refreshed.
Update these records whenever there is a change on the server.
I can get this data from the server as JSON, XML, or a raw SQLite file. I want to know the fastest way to store this data in the Android database.
I tried all of the above and found that fetching the database file from the server and copying its data into the database is faster than fetching XML or JSON and parsing it. Please advise whether I am right or wrong.
If you are planning to use sync adapters then you will need to implement a content provider (or at least a stub) and an authenticator. Here is a good example that you can follow.
Also, you have not explained the use case of the web service in enough detail to suggest a specific architecture, but REST is a good style for writing your services, and JSON is advisable over XML because of its more compact format (or, better yet, give Protocol Buffers a shot).
And yes, sync adapters are the better choice, as they already provide features you would otherwise have to implement yourself in a background service (e.g., periodic sync, auto sync, exponential backoff, etc.).
To reduce load on the device you can implement a sync adapter backed by a content provider. You serialize/deserialize data when you upload to or download from the server. When you need to persist data from the server you can use the content provider's bulkInsert() method and write all your data in a single transaction, as sketched below.
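Keeping with C# for consistency, here is a hedged Xamarin.Android sketch of a bulkInsert() override (the provider class, helper, and table name are hypothetical; the class is abstract only so the sketch compiles without the other required ContentProvider overrides):

    using Android.Content;
    using Android.Database.Sqlite;

    public abstract class AddressProvider : ContentProvider
    {
        // Hypothetical SQLiteOpenHelper for the addresses table.
        AddressDbHelper helper;

        public override int BulkInsert(Android.Net.Uri uri, ContentValues[] values)
        {
            SQLiteDatabase db = helper.WritableDatabase;
            db.BeginTransaction(); // one transaction for all rows
            try
            {
                foreach (ContentValues row in values)
                    db.Insert("addresses", null, row);
                db.SetTransactionSuccessful(); // commit when EndTransaction runs
            }
            finally
            {
                db.EndTransaction();
            }
            Context.ContentResolver.NotifyChange(uri, null);
            return values.Length;
        }
    }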

Lightweight method for adding persistent data to ASP.NET website?

Aside from creating SQL Server tables, is there a lightweight technology or method for adding persistent data to an ASP.NET website that works with LINQ and preferably doesn't require installing many packages into the project or learning a large framework?
Session state is one option, but only if it runs out of process and is configured for SQL Server, which doesn't fit my needs.
Options to satisfy the question:
1. Session state: only if configured for out-of-process storage in SQL Server.
2. NoSQL database solutions: MongoDB, RavenDB, Sqlite.org.
3. SQL Server key/value singleton: create a table and store key/value pairs as a single entry in it, or create a generic key/value table. Keys will need to be unique, and values will need to be scalars only, or multiple values crammed into one value using a delimiter. A generic key/value table will need to store all values as strings and rely on type conversion, either implicit in the program or driven by a type stored in an extra column. See:
http://en.wikipedia.org/wiki/Entity-attribute-value_model
How to design a product table for many kinds of product where each product has many parameters
4. Create an XML file or other flat file and store/write key/values to it. May require special permissions.
I will likely go with option #3 because it satisfies my current requirements best, but I will explore the NoSQL solutions for future projects. A rough sketch of option #3 is below.
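A hedged sketch of the generic key/value table from option #3 (the table name, schema, and KeyValueStore class are made up; all values are stored as strings, with conversion on the way out):

    using System;
    using System.Data.SqlClient;

    // Assumed backing table:
    //   CREATE TABLE KeyValueStore (
    //       [Key]   NVARCHAR(128) NOT NULL PRIMARY KEY,
    //       [Value] NVARCHAR(MAX) NULL
    //   );
    public class KeyValueStore
    {
        readonly string connectionString;

        public KeyValueStore(string connectionString)
        {
            this.connectionString = connectionString;
        }

        public void Set(string key, object value)
        {
            using (var conn = new SqlConnection(connectionString))
            using (var cmd = new SqlCommand(
                @"MERGE KeyValueStore AS t
                  USING (SELECT @key AS [Key]) AS s ON t.[Key] = s.[Key]
                  WHEN MATCHED THEN UPDATE SET [Value] = @value
                  WHEN NOT MATCHED THEN INSERT ([Key], [Value]) VALUES (@key, @value);", conn))
            {
                cmd.Parameters.AddWithValue("@key", key);
                // Everything is stored as a string.
                cmd.Parameters.AddWithValue("@value", Convert.ToString(value));
                conn.Open();
                cmd.ExecuteNonQuery();
            }
        }

        public T Get<T>(string key)
        {
            using (var conn = new SqlConnection(connectionString))
            using (var cmd = new SqlCommand(
                "SELECT [Value] FROM KeyValueStore WHERE [Key] = @key", conn))
            {
                cmd.Parameters.AddWithValue("@key", key);
                conn.Open();
                object result = cmd.ExecuteScalar();
                // Type conversion is driven by the caller's requested type.
                return result == null ? default(T) : (T)Convert.ChangeType(result, typeof(T));
            }
        }
    }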

Which is fastest to transmit: XML or DataTables?

I would like to know which is faster. Let me give you the scenario: I'm on a LAN and have a report to build using data from a SQL Server database (if the version matters, let's say 2005), and I have these ways of getting the report done:
Have a web service on the server, where the data is read from the database and serialized into XML. The client uses this XML as the source for a report built on the client machine. The client would be a Windows Forms app.
From the client side, connect to the database using ADO.NET, get a DataTable, and use it as the source for the report built in the client.
The same as (2), but using a DataReader.
Also, is there a better way to do this?
The serialization to XML is going to cost you in the time it takes to serialize, the overhead of the XML structure on the wire, and the time to deserialize. It will, however, provide a format that is consumable by more technologies. If you are using .NET end-to-end, and that isn't likely to change, I would not use XML; I would use the framework-provided data access methods. Personally, I would probably use LINQ over DataTables or a DataReader, but that is more for ease of use and readability on the client side than for any performance advantage.
The best practice is to not use .NET-specific types in the interface of a web service. Even if you are certain today that your service will never be called by anything other than a .NET program, things change, and tomorrow you may be told that the service will be called by a Perl program.
Perl programs don't understand DataSet. Nor do Java programs, nor anything other than .NET.
The best practice is to create a Data Transfer Object containing just the data you need to transfer, in simple properties with primitive types, or collections or arrays of primitive types, or collections or arrays of Data Transfer Objects, etc. These will be understandable by any client.
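For example, a minimal hedged sketch of such a Data Transfer Object for the report data (the type names and columns are made up):

    using System;
    using System.Collections.Generic;

    // Only primitive properties, so any client (Java, Perl, ...)
    // can consume the serialized form.
    public class ReportRowDto
    {
        public int OrderId { get; set; }
        public string CustomerName { get; set; }
        public decimal Total { get; set; }
        public DateTime OrderDate { get; set; }
    }

    public class ReportService
    {
        // Instead of returning a DataSet/DataTable from the web service,
        // project the query results into DTOs.
        public List<ReportRowDto> GetReportRows(System.Data.IDataReader reader)
        {
            var rows = new List<ReportRowDto>();
            while (reader.Read())
            {
                rows.Add(new ReportRowDto
                {
                    OrderId = reader.GetInt32(0),
                    CustomerName = reader.GetString(1),
                    Total = reader.GetDecimal(2),
                    OrderDate = reader.GetDateTime(3)
                });
            }
            return rows;
        }
    }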
