Connection options for Azure Analysis Services vs. Power BI vs. Power BI Embedded vs. Power BI Premium (.NET Core)

I'm looking for options to connect to and query the "Model"/database of Azure Analysis Services (AAS)/Power BI. I've found multiple options for connecting AAS to .NET Core, but nothing for Power BI. Can I use any of the following connection types to connect to Power BI? And if so, which flavor: Power BI Pro, Power BI Premium, or Power BI Embedded?
I can connect to Azure Analysis Services using the following:
ADOMD <- This is my preferred connection method.
AMO
MSOLAP
REST API with Bearer token
I'm not looking to embed my report in a .NET Core application. I'm looking to actually query the different models so everyone is reporting off the same data.
I don't want to shell out for AAS if I can do this with Power BI Pro!

As a short answer, I would say you can most likely do what you are asking with datasets hosted within a Power BI Premium instance or by users with a Premium Per User (PPU) license.
My reasoning is simple: access to the XMLA endpoint is only available for datasets hosted within Power BI Premium. Microsoft describes that in a bit more detail here within the Power BI documentation. Power BI Embedded also ends up (in a roundabout way) requiring Power BI Premium, so I believe the same would apply there.
As a reminder on why the XMLA endpoint matters, Power BI Premium encapsulates an AAS instance (with some limitations). Per Microsoft (from here):
XMLA is the same communication protocol used by the Microsoft Analysis Services engine, which under the hood, runs Power BI's semantic modeling, governance, lifecycle, and data management.
So the XMLA endpoint is required to allow connectivity to the Analysis Services instance behind Power BI.
To answer your question regarding the different connection methods:
ADOMD/AMO
Microsoft provides client libraries for both .NET Framework and .NET Core for ADOMD and AMO, which should be able to connect to the XMLA endpoint of Power BI. You can browse those libraries and Microsoft's documentation on them here. There are several open-source tools (recommended by Microsoft) that make use of these libraries, so if you are looking for examples, look into Tabular Editor 2 or DAX Studio. A sketch of the ADOMD route follows below.
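For illustration, here is a minimal sketch of querying a Premium-hosted dataset over the XMLA endpoint with the ADOMD.NET client library. The workspace name, dataset name, 'Sales' table, and service principal credentials are all placeholders, so treat this as a shape rather than a drop-in implementation:

    using System;
    using Microsoft.AnalysisServices.AdomdClient;

    class XmlaQuerySample
    {
        static void Main()
        {
            // The powerbi:// Data Source is the workspace's XMLA endpoint.
            // Workspace, dataset, and credentials below are placeholders.
            var connectionString =
                "Data Source=powerbi://api.powerbi.com/v1.0/myorg/YourWorkspace;" +
                "Initial Catalog=YourDataset;" +
                "User ID=app:<clientId>@<tenantId>;Password=<clientSecret>;";

            using var connection = new AdomdConnection(connectionString);
            connection.Open();

            // Run a DAX query against the model and print the first column.
            using var command = new AdomdCommand("EVALUATE TOPN(10, 'Sales')", connection);
            using var reader = command.ExecuteReader();
            while (reader.Read())
            {
                Console.WriteLine(reader.GetValue(0));
            }
        }
    }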
MSOLAP
Per Microsoft (in the same link about client libraries):
Analysis Services OLE DB Provider (MSOLAP) is the native client library for Analysis Services database connections. It's used indirectly by both ADOMD.NET and AMO, delegating connection requests to the data provider. You can also call the OLE DB Provider directly from application code.
So unless you have some very specific needs regarding MSOLAP, I would probably rely on Microsoft's AMO/ADOMD client libraries.
REST API
Assuming we are talking about the actual Power BI REST API (like this link), then it depends. The API exposes certain functionality that may cover the use cases for which you wanted a direct connection to the XMLA endpoint. For example, it allows you to execute DAX queries or initiate dataset refreshes (each with its limitations). So I would advise you to review the API's documentation. It seems to be a good tool so far, and my guess is that it will only expand.
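As a rough sketch of the DAX-query capability (the Datasets - Execute Queries endpoint), assuming you have already acquired an Azure AD access token (e.g. via MSAL) and know the dataset ID, both placeholders here:

    using System;
    using System.Net.Http;
    using System.Net.Http.Headers;
    using System.Text;
    using System.Threading.Tasks;

    class ExecuteQueriesSample
    {
        static async Task Main()
        {
            // Placeholders: acquire a real token for the Power BI service
            // (e.g. via MSAL) and substitute your dataset's ID.
            string accessToken = "<access-token>";
            string datasetId = "<dataset-id>";

            using var client = new HttpClient();
            client.DefaultRequestHeaders.Authorization =
                new AuthenticationHeaderValue("Bearer", accessToken);

            var body = new StringContent(
                "{ \"queries\": [ { \"query\": \"EVALUATE TOPN(10, 'Sales')\" } ] }",
                Encoding.UTF8, "application/json");

            var response = await client.PostAsync(
                $"https://api.powerbi.com/v1.0/myorg/datasets/{datasetId}/executeQueries",
                body);
            Console.WriteLine(await response.Content.ReadAsStringAsync());
        }
    }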

Related

Analysis Services access configuration challenge

Looking for an access-related solution that involves:
a) Azure Analysis Services
b) Reports in the Power BI Service
c) restricting connection access to Azure Analysis Services from Excel, Power BI Desktop, and other tools
The picture below illustrates the problem I'm trying to solve (marked in red).
We want certain Azure AD group(s) (e.g. Salespersons) to have access to Azure Analysis Services only through published Power BI reports, so they can use the reports but can't connect to the Azure Analysis Services model with Excel, Power BI Desktop, or other tools.
At the same time, other Azure AD group(s) (e.g. Controllers) should be able to use Excel and other tools to explore the whole Azure Analysis Services model. They can also use the reports.
I see that the current Azure Analysis Services firewall setup operates on IP addresses, so it looks like it can't be used in my case, as I can identify users by AD group(s), not IP(s).
Does anybody know if it's somehow possible to solve this, perhaps in combination with some other Azure services?
[Diagram: https://i.stack.imgur.com/gNeRm.png]
You can create an Azure VM, give the user group that is allowed to use AAS from Excel access to that VM, and whitelist the IP of that VM in the firewall of the AAS server.
That way, anyone accessing AAS needs to access it via the Azure VM only, and not via any local IP.
As for other users, even if they have read access on the AAS cube, they won't be able to access it due to the firewall restrictions.
Note: all of the above assumes that the Power BI reports are in live connection mode and not import mode.
If the reports are in import mode, the users don't need read access on the AAS cube, just access to the reports.

How to copy data from SSAS (on-premise) to Azure Analysis Services

My company plans to copy all data from on-premises SQL Server Analysis Services (2017 tabular) to Azure Analysis Services on a periodic basis. We want to do this at least once a day, and then use the Azure Analysis Services instance for Power BI reporting only. The idea is to reduce load on the on-premises cube, and to improve response times in Power BI.
Is this a recommended design for reporting?
What are the methods available for the periodic copy of data (and pros and cons for each)?
In addition to Nandan's approach, you could continue to refresh the model on-premises, then back up and restore to Azure Analysis Services. I shared a PowerShell script which automates this operation.
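For reference, the restore step itself can be scripted as a TMSL command sent over the XMLA endpoint. Here is a minimal sketch assuming the .abf backup already sits in the storage container attached to the AAS server; the server, credentials, and model name are placeholders:

    using Microsoft.AnalysisServices.AdomdClient;

    class RestoreSample
    {
        static void Main()
        {
            // Placeholder AAS server and service principal credentials.
            var connectionString =
                "Data Source=asazure://<region>.asazure.windows.net/<server>;" +
                "User ID=app:<clientId>@<tenantId>;Password=<clientSecret>;";

            // TMSL restore; the .abf file must be in the blob container
            // configured for the server.
            var tmsl = @"{
              ""restore"": {
                ""database"": ""SalesModel"",
                ""file"": ""SalesModel.abf"",
                ""allowOverwrite"": true
              }
            }";

            using var connection = new AdomdConnection(connectionString);
            connection.Open();
            using var command = new AdomdCommand(tmsl, connection);
            command.ExecuteNonQuery();
        }
    }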
Can you tell us what the data source for the on-prem SSAS cube is?
If it is a SQL Server, then rather than syncing data from SSAS to AAS, you can refresh AAS directly with the on-prem SQL Server as the source, via the on-premises data gateway.
And if the cube is only used for reporting (Power BI), then having AAS alone is enough, rather than maintaining both SSAS and AAS.
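To illustrate the direct-refresh option, a scheduled job could trigger such a refresh by sending a TMSL refresh command to AAS (the on-premises data gateway handles connectivity back to the SQL Server source). A minimal sketch with placeholder server, credential, and model names:

    using Microsoft.AnalysisServices.AdomdClient;

    class RefreshSample
    {
        static void Main()
        {
            // Placeholder AAS server and service principal credentials.
            using var connection = new AdomdConnection(
                "Data Source=asazure://<region>.asazure.windows.net/<server>;" +
                "User ID=app:<clientId>@<tenantId>;Password=<clientSecret>;");
            connection.Open();

            // TMSL full refresh of the whole database.
            var tmsl = @"{
              ""refresh"": {
                ""type"": ""full"",
                ""objects"": [ { ""database"": ""SalesModel"" } ]
              }
            }";
            using var command = new AdomdCommand(tmsl, connection);
            command.ExecuteNonQuery();
        }
    }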

Difference between DaaS and EAI

What is the difference between DaaS (Data as a Service) and EAI (Enterprise Application Integration)?
I understand that EAI is a framework designed to overcome the complexities of enterprise software integration (between ERP, SCM, CRM, etc.) using an ESB (Enterprise Service Bus).
I would like to know where DaaS fits in the picture.
I would also like to understand the difference between EAI and SOA.
Data as a Service (DaaS) gives access to data or content collected and provided by some external service provider. Examples include post area codes, geospatial data, customer address data, market prices, economic trends, exchange rates, stock quotes and bank codes.
Enterprise Application Integration (EAI) is a general term which describes that applications in a (big) company are connected via a centralized facility rather than via a variety of proprietary point-to-point interfaces. To access a DaaS service, an application could use an EAI platform.
Applications can benefit from standardized EAI integration functions (connectivity, routing, data transformation, logging, monitoring, security, error handling, ...) and do not have to implement these themselves. Support and operation might also be more economical compared to point-to-point integration.
SOA (Service-Oriented Architecture) is one particular architecture for integration, based on providing and consuming services. Rather than providing and consuming services, an integration could also send and receive messages or use a database as a central information hub.

Does Dynatrace monitor Oracle EBS (11i) completely?

I want to monitor Oracle EBS (11i) and Oracle DB (11g) simultaneously during a load test through Dynatrace.
[Diagram: Oracle EBS architecture]
I know we can monitor Oracle DB using Dynatrace, but I did not find how to identify which areas or modules (e.g. Order Management, Sales, Finance, Shipping) a particular workflow/user request touches during the load test.
I found that using DC RUM we can capture metrics for the Forms server. Apart from this, I also want to monitor the Concurrent Processing server. Is this possible using Dynatrace or not?
With Dynatrace DC RUM you may choose one of two approaches to monitoring EBS performance.
First, DC RUM, using agentless technology, captures network traffic between all servers and clients, and as a result provides you with information on performance, usage, and availability. Additionally, for the most popular network protocols, including the ones used for communication with the Oracle database, Oracle Forms servers, and web servers, it's possible to use analyzers that provide deeper performance insights. For example, with the Oracle Forms analyzer applied to EBS monitoring, DC RUM decodes all interactions between the user and Oracle Forms, reporting user names, form names, and control names, and identifying the EBS module name. For Oracle Database traffic it reports performance down to a single query execution, including the SQL text, database schema, and user name. So, answering your question: yes, it allows monitoring Oracle EBS and Oracle DB simultaneously.
Second, Enterprise Synthetic allows you to create synthetic tests for key transactions in EBS. This way, for example, you can track the performance of the whole create-sales-order transaction.
DC RUM is intended for constant, systematic application performance monitoring. However, if you already have it in your company, it's also a perfect tool for evaluating the results of load tests performed on EBS.

Monitoring Integration points

Our company is working on integrating Guidewire (a claims processing system) into the existing claims system. We will be executing performance tests on the integrated system shortly. I wanted to know if there is some way to monitor the integration points specific to Guidewire.
The systems are connected through web services. We have access to LoadRunner and SiteScope, and we are comfortable using other open-source tools as well.
I realize monitoring the WSDL files is an option. Could you suggest additional methods to monitor the integration points?
Look at the architecture of Guidewire. You have OS monitoring points and application monitoring points. The OS is straightforward, using SiteScope, SNMP (with SiteScope or LoadRunner), Hyperic, native OS tools, or a tool like Splunk.
You likely have a database involved: that monitoring case is well known and understood.
Monitoring the services? Ask the application experts inside your organization what they look at to determine whether the application is healthy and running well. You might be implementing a set of terminal users (RTE) with data points, log monitoring through SiteScope, or custom monitors scheduled to run on the host, piping the output through sed into a standard form that can be imported into Analysis at the end of the test.
Think architecturally: decompose each host in the stack into OS and services, and map your known monitors to the hosts and layers. Where you run into issues, grab the application experts and have them write down the monitors they use (they will have more faith in your results and analysis as a result).
