Connecting EEG device to Azure Machine Learning studio - microsoft-cognitive

I have done some research and watched videos to figure out how to connect an EEG device (an Emotiv Insight) in real time to Microsoft Azure Machine Learning Studio.
I have considered various approaches; perhaps I need to connect to some other service before the data reaches Azure ML Studio.
My aim is to build an app that captures brainwaves, uses Azure ML Studio to analyze them, saves the data to Firebase, and returns the response to my app.
However, I am stuck on finding a way to get my EEG data into Azure. Any help would be appreciated.

Emotiv uses a service running as the Cortex process. You need a WebSocket to talk to it, and all communication to and from the service uses JSON objects. From there you need to transport that data into Azure.
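As a rough sketch of the receiving side, the snippet below reshapes one Cortex-style JSON message into the input format a classic Azure ML Studio web service expects. The field names (`time`, `eeg`) and the message shape are assumptions based on the Cortex API's JSON streams, not something given in the question; adjust them to the actual messages your WebSocket client receives.

```python
import json

def cortex_to_azure_payload(raw_message: str) -> dict:
    """Turn one Cortex WebSocket message (a JSON string) into the
    "Inputs" row format a classic Azure ML Studio web service expects.
    Field names here ("time", "eeg") are assumed, not guaranteed."""
    msg = json.loads(raw_message)
    row = {"time": msg["time"]}
    # Spread each EEG channel value into its own named column.
    for i, value in enumerate(msg["eeg"]):
        row[f"ch{i}"] = value
    return {"Inputs": {"input1": [row]}}

# Forwarding it on would then be an authenticated HTTP POST to the
# web service endpoint (URL and API key come from the Azure portal),
# e.g. with the requests library:
#   requests.post(ENDPOINT_URL, json=payload,
#                 headers={"Authorization": f"Bearer {API_KEY}"})
```

The WebSocket loop itself (subscribe to a stream, receive messages, call a function like this per message) is left out since it depends on which client library you pick.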

Related

Send messages to and receive messages from Azure Service Bus queues

I have to send and receive messages between R and Azure Service Bus. This is possible with Python, Java, and .NET, but there is no guidance for R scripts. As I'm limited to using only R to achieve this, is there any resource or documentation available to refer to?
According to this documentation, R currently does not support the Azure Service Bus service. As of now, Data Science Virtual Machine, ML Services on HDInsight, Azure Databricks, Azure Machine Learning, Azure Batch, and Azure SQL Managed Instance are the only Azure services compatible with R programming.
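While there is no R SDK, Service Bus also exposes a plain HTTPS REST API, so any language that can do HMAC-SHA256 and HTTP can reach it. The sketch below shows the Shared Access Signature (SAS) token construction in Python; the same logic can be reproduced in R with packages such as `openssl` and `httr` (those R package names are a suggestion, not from the question). Sending a message is then a POST to `https://{namespace}.servicebus.windows.net/{queue}/messages` with the token in the `Authorization` header.

```python
import base64
import hashlib
import hmac
import time
import urllib.parse

def make_sas_token(resource_uri: str, key_name: str, key: str,
                   ttl_seconds: int = 300) -> str:
    """Build a Shared Access Signature for the Service Bus REST API.
    The string-to-sign is the URL-encoded resource URI, a newline,
    and the Unix expiry time."""
    expiry = str(int(time.time()) + ttl_seconds)
    encoded_uri = urllib.parse.quote_plus(resource_uri)
    string_to_sign = (encoded_uri + "\n" + expiry).encode("utf-8")
    signature = base64.b64encode(
        hmac.new(key.encode("utf-8"), string_to_sign, hashlib.sha256).digest()
    ).decode("ascii")
    return ("SharedAccessSignature sr={}&sig={}&se={}&skn={}"
            .format(encoded_uri, urllib.parse.quote_plus(signature),
                    expiry, key_name))
```

The key name and key come from a Shared Access Policy on the namespace or queue in the Azure portal.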

Connection options to Azure Analysis Services vs Power BI vs Power BI embedded vs Power BI Premium

I'm looking for options to connect to and query the "Model"/database of Azure Analysis Services (AAS)/Power BI. I've found multiple options for connecting AAS to .NET Core, but nothing for Power BI. Can I use any of the following connection types to connect to Power BI? And if so, which flavor: Power BI Pro, Power BI Premium, or Power BI Embedded?
I can connect to Azure Analysis Services using the following:
ADOMD <- This is my preferred connection method.
AMO
MSOLAP
REST API with Bearer token
I'm not looking to embed my report in a .Net Core application. I'm looking to actually query different models so everyone is reporting off the same data.
I don't want to shell out for AAS if I can do this with Power BI Pro!
As a short answer, I would say you can most likely do what you are asking with data sets hosted within a Power BI Premium instance or by users with a Premium per user (PPU) license.
My reasoning is simple: access to the XMLA endpoint is only available for datasets hosted within Power BI Premium. Microsoft describes that in a bit more detail here in the Power BI documentation. Power BI Embedded (in a roundabout way) also ends up requiring Power BI Premium, so I believe the same applies there.
As a reminder on why the XMLA endpoint matters, Power BI Premium encapsulates an AAS instance (with some limitations). Per Microsoft (from here):
XMLA is the same communication protocol used by the Microsoft Analysis Services engine, which under the hood, runs Power BI's semantic modeling, governance, lifecycle, and data management.
So the XMLA endpoint is what allows connectivity to the AAS instance behind Power BI.
To answer your question regarding the different connection methods:
ADOMD/AMO
Microsoft provides client libraries for .NET Framework and .NET Core for both ADOMD and AMO, which should be able to connect to the XMLA endpoint of Power BI. You can browse the libraries and Microsoft's documentation for them here. There are several open-source tools out there (recommended by Microsoft) that make use of these libraries, so if you are looking for examples, look into Tabular Editor 2 or DAX Studio.
MSOLAP
Per Microsoft (in same link about client libraries):
Analysis Services OLE DB Provider (MSOLAP) is the native client library for Analysis Services database connections. It's used indirectly by both ADOMD.NET and AMO, delegating connection requests to the data provider. You can also call the OLE DB Provider directly from application code.
So unless you have some very specific needs regarding MSOLAP, I would probably rely on Microsoft's AMO/ADOMD client libraries.
REST API
Assuming we are talking about the actual Power BI REST API (like this link), then it depends. The API exposes certain functionality that might cover the use case behind your wanting a direct connection to the XMLA endpoint. For example, it does allow you to execute DAX queries or initiate dataset refreshes (each with its limitations). So I would advise you to review the API's documentation. It seems to be a good tool so far, and my guess is that it will only expand.
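To make the REST option concrete, the sketch below builds the request for the documented `datasets/{id}/executeQueries` endpoint, which runs a DAX query against a published dataset. The helper function and names here are illustrative; the caller still has to obtain an Azure AD bearer token separately (e.g. via MSAL) and send the request with a library such as `requests`.

```python
import json

# Documented root of the Power BI REST API ("myorg" = the caller's tenant).
API_ROOT = "https://api.powerbi.com/v1.0/myorg"

def build_execute_queries_request(dataset_id: str, dax_query: str):
    """Return (url, json_body) for the executeQueries REST call.
    An Azure AD bearer token must be supplied separately in the
    Authorization header when the request is actually sent."""
    url = f"{API_ROOT}/datasets/{dataset_id}/executeQueries"
    body = {
        "queries": [{"query": dax_query}],
        "serializerSettings": {"includeNulls": True},
    }
    return url, json.dumps(body)

# Usage (not executed here):
#   url, body = build_execute_queries_request(DATASET_ID,
#       "EVALUATE TOPN(10, 'Sales')")
#   requests.post(url, data=body, headers={
#       "Authorization": f"Bearer {TOKEN}",
#       "Content-Type": "application/json"})
```

This avoids the XMLA client libraries entirely, at the cost of the API's row and query limits.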

Work with ROS data on the server side

I need to implement a social network in a mobile application. In the app I will work with the local database and then synchronize the data with the Realm Object Server (ROS). To handle some user activity, I will use a neural network written in Python. How can I work, on the server side, with the data stored on the Realm Object Server for the needs of the neural network? Is this possible with the free Developer Edition of the Realm Platform?
That is unfortunately not possible in the Developer Edition. The server side access functionality you need is only available in the Professional and EE versions.
You can read more here: https://realm.io/pricing

R script on premise data gateway

I have created a report that reads data from an OData source, SQL Server, and R.
The R script reads the data from an OData source.
Refresh works fine on my computer.
I want to share my work with my colleague, so I published the report and tried to use our on-premises data gateway, but I keep getting an error that the data gateway is not configured correctly. If I use my personal gateway on my computer, everything works fine.
Any idea why the on-premises gateway is not working?
I'm happy to stand corrected, but it's my understanding that R scripts are not a supported data source for the enterprise on-premises data gateway.
I imagine Microsoft is wary of taking on the intense compute demands generated by R within their cloud. The personal gateway keeps your own machine doing all the R processing.

Monitoring Integration points

Our company is working on integrating Guidewire (a claims processing system) into the existing claims system. We will be executing performance tests on the integrated system shortly. I wanted to know if there is some way to monitor the integration points specific to Guidewire.
The system is connected through web services. We have access to LoadRunner and SiteScope, and are comfortable using other open-source tools as well.
I realize monitoring WSDL files is an option. Could you suggest additional methods to monitor the integration points?
Look at the architecture of Guidewire. You have OS monitoring points and application monitoring points. The OS side is straightforward: use SiteScope, SNMP (with SiteScope or LoadRunner), Hyperic, native OS tools, or a tool like Splunk.
You likely have a database involved; that monitoring case is well known and understood.
Monitoring the services? Ask the application experts inside your organization what they look at to determine whether the application is healthy and running well. You might end up implementing a set of terminal users (RTE) with datapoints, log monitoring through SiteScope, or custom monitors scheduled to run on the host, piping the output through sed into a standard form that can be imported into Analysis at the end of the test.
Think architecturally. Decompose each host in the stack into OS and services. Map your known monitors to the hosts and layers. Where you run into gaps, grab the application experts and have them write down the monitors they use (they will have more faith in your results and analysis as a result).
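As an illustration of the "pipe the output into a standard form" step, here is a small normalizer that reduces free-form service log lines to a timestamp/service/elapsed-time CSV suitable for post-test import. The log line format in the regex is entirely hypothetical; replace the pattern with whatever your integration hosts actually emit.

```python
import csv
import io
import re

# Hypothetical log line shape, e.g.:
#   2024-05-01T10:00:00 call=ClaimService elapsed=120ms
# Adjust this pattern to the real log format on your hosts.
LINE = re.compile(r"(?P<ts>\S+) call=(?P<svc>\S+) elapsed=(?P<ms>\d+)ms")

def normalize_log(lines):
    """Reduce service logs to a timestamp,service,elapsed_ms CSV string
    that can be imported into analysis tooling after the test run.
    Non-matching lines are silently skipped."""
    out = io.StringIO()
    writer = csv.writer(out)
    writer.writerow(["timestamp", "service", "elapsed_ms"])
    for line in lines:
        match = LINE.search(line)
        if match:
            writer.writerow([match.group("ts"), match.group("svc"),
                             match.group("ms")])
    return out.getvalue()
```

The same idea works as a sed/awk one-liner on the host; a script like this is just easier to extend when the experts hand you three more log formats.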
