Dynamic return values based on user - wso2-data-services-server

I am looking for a way to use authentication (a certificate, for example) as a key to determine which return values a service operation exposes.
For example:
- presume the underlying source has 5 columns that can be returned, 1 - 5
- authenticated user A invokes operation 'GetData', and user B invokes 'GetData'
I would like user A to receive columns 1, 2, 3 and user B columns 1, 4, 5.
Why is this important?
For privacy reasons, we are required to return only the data that a consumer needs to see, not all of it. The alternative to what I am asking is to create one operation per consumer.
Many thanks for your thoughts.
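Not an answer from WSO2 itself, but a sketch of the approach: keep a per-consumer column allowlist keyed by the authenticated identity (the certificate's subject, say) and filter each row against it in a single operation. Everything below, class and names included, is illustrative and not a DSS API:

```java
import java.util.*;

public class ColumnFilter {
    // Per-consumer column allowlist; in practice this would be keyed by the
    // authenticated identity (e.g. the certificate's subject DN).
    private final Map<String, Set<String>> allowed;

    public ColumnFilter(Map<String, Set<String>> allowed) {
        this.allowed = allowed;
    }

    // Keep only the columns the given user is permitted to see.
    public Map<String, Object> filter(String user, Map<String, Object> row) {
        Set<String> cols = allowed.getOrDefault(user, Set.of());
        Map<String, Object> out = new LinkedHashMap<>();
        for (Map.Entry<String, Object> e : row.entrySet()) {
            if (cols.contains(e.getKey())) {
                out.put(e.getKey(), e.getValue());
            }
        }
        return out;
    }
}
```

With this, 'GetData' stays a single operation; user A's allowlist is {col1, col2, col3} and user B's is {col1, col4, col5}, and each caller sees only their columns.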

Related

Should REST APIs return a message for an enum besides its value?

Say we have an API which returns a list of employee records. Each record has a gender field whose value may be 'MALE' or 'FEMALE'. We want to show each employee's gender with a message, which may be 'Male' or 'Female'.
To achieve this, we have the following options:
- Return both the value and the message, so the client doesn't have to bother
- Return only the value and let the client determine the appropriate message
Which one is better?
It depends. If you plan to present a uniform view of your backend data, and there will potentially be N API consumers using your service, I'd prefer the first option. It also comes in handy if you have to tackle internationalization.
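The first option can be as simple as carrying the display message on the enum itself, so both fields are serialized together. A framework-neutral sketch (the toJson method is illustrative; in practice a serializer such as Jackson would produce the payload):

```java
public enum Gender {
    MALE("Male"),
    FEMALE("Female");

    private final String message;

    Gender(String message) {
        this.message = message;
    }

    public String getMessage() {
        return message;
    }

    // Emit both the stable machine value and the human-readable message,
    // e.g. {"value": "MALE", "message": "Male"}
    public String toJson() {
        return "{\"value\": \"" + name() + "\", \"message\": \"" + message + "\"}";
    }
}
```

The client keys its logic off `value`, which never changes, while `message` can later vary by locale without breaking anyone.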

Multi-user login scenario with multiple data sets in JMeter not working

I need to test a multi-user login scenario with a different data set on a single operation. Let's say that for 10 users I want to open 10 different documents at a time and perform some operations on them. I created the plan below to test this case.
TestPlan
  Thread Group
    While Controller
      CSVLoginUserDataConfig
      LoginRequestRecordingController
        HTTPLoginRequest
      DocumentOperationRecordingController
        While Controller
          DocIDCSVDataList
          HttpSaveRequest
My problem is that all 10 users log in successfully, but for the document operation all DocIDs are passed to a single thread (user), so only a single user performs the operation at a time. I want 10 users to get 10 different DocIds and perform the operation on them simultaneously.
Where do I need to change my test plan, or the settings of any sampler/config, to achieve my scenario?
Thank you in advance.
You need to remove the While Controller under DocumentOperationRecordingController.
Currently every user/thread is looping over all 10 DocIds instead of using one DocId per user.

Fetch process instances by passing multiple business key [processBusinessKey] values

I have a process which has a task and form fields, and one field, "location", which I am treating as the business key. For example, I have 2 locations: India and UK. I want to fetch the process instances for these two locations, which means I need to pass multiple business key values. Is it possible to pass multiple business key values and fetch the process instances for both of them?
Thanks & Regards
Shilpa Kulkarni
There's no out-of-the-box functionality for this, but you can always query process instances based on variable values: e.g. create a service which takes multiple keys as an argument and queries them separately with runtimeService.createProcessInstanceQuery().variableValueLike("location", "yourKey").list(); this will return all process instances that have location set to yourKey.
An Activiti process instance can only have a single associated business key. However, you can retrieve a list of instances based on a list of business keys and other properties by using a process instance query.
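The query-per-key-and-merge idea from both answers can be sketched as below. The `queryByKey` function is a stand-in for whichever Activiti call you settle on, e.g. `runtimeService.createProcessInstanceQuery().processInstanceBusinessKey(key).list()`; the class and method names here are illustrative:

```java
import java.util.*;
import java.util.function.Function;

public class BusinessKeyLookup {
    // Issue one query per business key and merge the results into one list.
    // queryByKey stands in for an Activiti call such as
    // runtimeService.createProcessInstanceQuery().processInstanceBusinessKey(key).list()
    public static <T> List<T> fetchForKeys(Collection<String> keys,
                                           Function<String, List<T>> queryByKey) {
        List<T> merged = new ArrayList<>();
        for (String key : keys) {
            merged.addAll(queryByKey.apply(key));
        }
        return merged;
    }
}
```

Calling it with `List.of("India", "UK")` gives the combined instances for both locations in a single service call to your own wrapper, even though Activiti itself only accepts one business key per query.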

Firebase database: High number of calls with minimal data

I built my Firebase database as flat as possible.
This is my structure: https://stackoverflow.com/a/40115527/3669981
with a little adjustment: instead of role values in my projectsRoles node, I'm using the role keys assigned to the /roles node (so I can add and edit roles more easily).
The pain starts when I need to make 1 + (numUsers * 2) calls in order to get a project's member list, assuming I already have the project ID:
1. Call projectsRoles/$projectID to get all user IDs of the current project.
2. For each userID + roleID pair received:
   - get the role name from roles/$roleID
   - get the user information from users/$userID
That means that if a project has 30 members, the app will make 61 calls to the Firebase database.
My question is: although the number of calls is high, the data received per call is minimal. I followed the advice to make the structure as flat as possible, but is it common to make this many calls to Firebase?
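Not from the thread, but one way to trim the count: since /roles is a small node shared by every member, fetching it once and caching it drops the cost from 1 + 2*numUsers to 2 + numUsers calls (32 instead of 61 for 30 members). A self-contained sketch, with a fake in-memory map standing in for Firebase and every name illustrative:

```java
import java.util.*;

public class MemberListFetcher {
    // Fake in-memory "database"; each get() counts as one network call.
    private final Map<String, Object> db;
    public int calls = 0;

    public MemberListFetcher(Map<String, Object> db) {
        this.db = db;
    }

    @SuppressWarnings("unchecked")
    private <T> T get(String path) {
        calls++;
        return (T) db.get(path);
    }

    // 1 call for the whole role map + 1 call for the project's user->role map
    // + 1 call per user, instead of 2 calls per user.
    public List<String> members(String projectId) {
        Map<String, String> roleNames = get("roles");                     // fetched once, cached
        Map<String, String> userRoles = get("projectsRoles/" + projectId);
        List<String> result = new ArrayList<>();
        for (Map.Entry<String, String> e : userRoles.entrySet()) {
            String userName = get("users/" + e.getKey());
            result.add(userName + " (" + roleNames.get(e.getValue()) + ")");
        }
        return result;
    }
}
```

The remaining per-user calls are hard to avoid with a flat structure, but Firebase pipelines requests over a single connection, so many small reads are cheaper than they look.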

Transmit data to client on specific page, based on SQL Server column or row update

I want to achieve something specific using ASP.NET and SQL Server. Let's say, for example, that I have several pages, each with its own identifier (i.e. id=1, id=5). Furthermore, let's assume that for each of those ids I have a row in the database.
So in short, what I want to achieve is: pushing database changes indirectly to specific clients on specific pages, while taking advantage of web sockets (a persistent connection).
for example:
Row 1:
id = 1
name = myname1
Row 2:
id = 2
name = myname2
What I want is that when a specific row, or even a specific column value, changes, it triggers an event that sends data ONLY to those clients visiting the page with the id that was changed.
For example: if row 1's name column changed from 'name1' to 'name2', and the primary key id is 5, I want everyone who is visiting the page with id=5 to receive an event on the client side.
I want to avoid writing client code that continuously sends requests to a web service and queries that specific row by id to see whether it was updated or a specific column value was changed.
One solution I thought of is to keep a key/value pair in memory (e.g. memcached), where the key represents the id and the value is the datetime of the last update. Then I can query the memory, and if the entry is, for example, [5, 05/11/2012 12:03:45], I can tell whether the data was updated by saving the last time I queried the memory on the client side and comparing the dates. If the client's datetime is older than the one in the in-memory key/value store, I would query the database again.
However, it's still a passive approach.
Let me draw it how it should work:
1] Client and server have a persistent connection [this can be done using ASP.NET 4.5 socket protocols]
2] The server knows how to differentiate between connections coming from different pages, i.e. those with different query strings, for example id=1, id=2, etc. One option I thought of is to keep an in-memory map that stores the connection IDs for each query-string id value, for example: {1: [2346, 6767, 87878], 2: [876, 8765, 3455]}. 1 and 2 are the pages' identifiers (i.e. id=1, id=2), and the other values are the connection IDs of the persistent connections that I get using ASP.NET 4.5
3] A column value in the row with primary key id=5 has its 'count' column updated from '1' to '2'.
4] A trigger calls a function and passes the id (let's assume value X) of the changed row. I'd prefer to also be able to send specific columns' values (some columns of my choice) [this can be done using CLR triggers]
5] The function has a list of connections for the clients who are visiting the page with id value X (a number)
6] The server sends the client the column values, or, if that's not possible, just sends true or false, notifying the client that a change to that row has taken place.
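The bookkeeping in step 2] can be sketched independently of the socket framework. Java stands in for the ASP.NET code here, and the class and method names are illustrative:

```java
import java.util.*;
import java.util.concurrent.*;

public class PageSubscriptions {
    // pageId -> set of connection ids currently viewing that page,
    // e.g. {1: [2346, 6767, 87878], 2: [876, 8765, 3455]}
    private final ConcurrentMap<Integer, Set<String>> byPage = new ConcurrentHashMap<>();

    // Called when a client opens a persistent connection for a given page id.
    public void subscribe(int pageId, String connectionId) {
        byPage.computeIfAbsent(pageId, id -> ConcurrentHashMap.newKeySet())
              .add(connectionId);
    }

    // Called when the connection closes.
    public void unsubscribe(int pageId, String connectionId) {
        Set<String> conns = byPage.get(pageId);
        if (conns != null) {
            conns.remove(connectionId);
        }
    }

    // The connections to notify when the row with primary key pageId changes.
    public Set<String> subscribers(int pageId) {
        return byPage.getOrDefault(pageId, Set.of());
    }
}
```

When the trigger in step 4] reports that row X changed, the server looks up `subscribers(X)` and pushes the new column values (or a simple true/false) down each of those connections only.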
Solved until now:
1] Can be done using ASP.NET 4.5 socket protocols
4] Using CLR triggers, I can have a function that receives the column data and id of the specific row that was altered.
I am developing my app using ASP.NET 4.5.
Thanks
SQL Server Service Broker can accomplish a good portion of your requirements.
Service Broker allows for asynchronous messaging in SQL Server. Since it's asynchronous, let's split the functionality into 2 parts.
The first part is a trigger on the table that writes a message to the Service Broker queue. This is just straight T-SQL, and fairly straightforward. The payload of the message is anything you can convert to varbinary(max): it could be XML, a varchar(100) that contains comma-separated values, or some other representation.
The second part is the handling of the message. You issue a Transact-SQL RECEIVE statement to get the next message from the queue. This statement blocks until something arrives. A queue can have multiple conversations, so each client gets its own notifications.
Conceptually, it could work like this (Client is asp.net code):
Client opens a Service Broker conversation.
Client sends a message which says "I'm interested in Page=3"
Client does a RECEIVE which blocks indefinitely
UPDATE changes data for page=3
Trigger on table sends message to every conversation that is interested in Page=3
Client receives the message, and sends updated data to web browser.
No CLR required, no periodic polling of the database.
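A minimal T-SQL sketch of the two parts described above. All table, column, and Service Broker object names here are illustrative, and details such as conversation reuse, error handling, and per-client routing are omitted:

```sql
-- One-time setup: message type, contract, queue, and service.
CREATE MESSAGE TYPE PageChangedMsg VALIDATION = WELL_FORMED_XML;
CREATE CONTRACT PageChangedContract (PageChangedMsg SENT BY INITIATOR);
CREATE QUEUE PageChangedQueue;
CREATE SERVICE PageChangedService ON QUEUE PageChangedQueue (PageChangedContract);
GO

-- Part 1: the trigger writes a message describing the changed row.
CREATE TRIGGER trgPagesUpdate ON Pages AFTER UPDATE AS
BEGIN
    DECLARE @h UNIQUEIDENTIFIER, @body XML;
    SELECT @body = (SELECT id, name FROM inserted FOR XML PATH('page'));
    BEGIN DIALOG @h
        FROM SERVICE PageChangedService
        TO SERVICE 'PageChangedService'
        ON CONTRACT PageChangedContract
        WITH ENCRYPTION = OFF;
    SEND ON CONVERSATION @h MESSAGE TYPE PageChangedMsg (@body);
END;
GO

-- Part 2: a reader (your asp.net code) blocks until a message arrives.
WAITFOR (
    RECEIVE TOP (1) CAST(message_body AS XML) AS payload
    FROM PageChangedQueue
), TIMEOUT 60000;
```

The asp.net side runs the WAITFOR (RECEIVE ...) on a background connection and, when it returns, pushes the payload to the web-socket clients subscribed to that page id.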
