Analysis Services access configuration challenge - azure-analysis-services

Looking for access related solution that involves:
a) Azure Analysis Services
b) Reports in Power BI Services
c) and restricting connection access to Azure Analysis Services from Excel, Power BI Desktop, other tools
The picture below illustrates the problem I'm trying to solve (marked in red).
We want certain Azure AD group(s) (e.g. Salespersons) to have access to Azure Analysis Services only through published Power BI reports, so they can use the reports but can't connect to the Azure Analysis Services model with Excel, Power BI Desktop, or other tools.
At the same time, other Azure AD group(s) (e.g. Controllers) should be able to use Excel and other tools to explore the whole Azure Analysis Services model. They can also use the reports.
I see that the current Azure Analysis Services firewall operates with IP addresses, so it looks like it can't be used in my case, as I can identify users by AD group(s), not by IP(s).
Does anybody know if it's somehow possible to solve this, perhaps in combination with some other Azure services?
https://i.stack.imgur.com/gNeRm.png

You can create an Azure VM, give the user group that is allowed to use AAS from Excel access to that VM, and whitelist the VM's IP in the firewall of the AAS server.
Then anyone accessing AAS needs to access it via the Azure VM only, not via any local IP.
As for the other users, even if they have read access on the AAS cube, they won't be able to access it due to the firewall restrictions.
Note: all of the above assumes that the Power BI reports are in live connection mode, not import mode.
If the reports are in import mode, the users don't need read access on the AAS cube, just access to the reports.
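For reference, a minimal sketch of that firewall setup with the Az.AnalysisServices PowerShell module. This is untested; the resource group, server name, and IP below are placeholders, so adjust them to your environment:

```powershell
# Sketch (untested): allow only the jump VM's public IP through the AAS
# firewall, while keeping the Power BI service allowed so published
# reports can still connect live. All names/IPs are placeholders.
Connect-AzAccount

$vmIp = "52.170.10.20"   # public IP of the Azure VM (placeholder)

$rule   = New-AzAnalysisServicesFirewallRule -FirewallRuleName "jump-vm" `
            -RangeStart $vmIp -RangeEnd $vmIp
$config = New-AzAnalysisServicesFirewallConfig -EnablePowerBIService `
            -FirewallRule $rule

Set-AzAnalysisServicesServer -ResourceGroupName "my-rg" -Name "myaasserver" `
    -FirewallConfig $config
```

With `-EnablePowerBIService` set, the Salespersons group keeps working through published reports (live connection), while any tool outside the whitelisted VM is blocked at the firewall regardless of model permissions.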

Related

How to copy data from SSAS (on-premise) to Azure Analysis Services

My company plans to copy all data from on-premise SQL Services Analysis Services (2017 tabular) to Azure Analysis Services on a periodic basis. We want to do this at least once a day, and then use the Azure Analysis Services version for Power BI reporting only. The idea is to reduce load on the on-premise cube, and to improve response in Power BI.
Is this a recommended design for reporting?
What are the methods available for the periodic copy of data (and pros and cons for each)?
In addition to Nandan's approach, you could continue to refresh the model on-premises, then back up and restore to Azure Analysis Services. I shared a PowerShell script which automates this operation.
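The backup/restore approach looks roughly like this untested sketch, using the Backup-ASDatabase / Restore-ASDatabase cmdlets from the SqlServer PowerShell module; server names, the model name, and file paths are placeholders:

```powershell
# Sketch (untested): back up the processed on-prem model and restore it
# on the AAS server. All names and paths below are placeholders.
Import-Module SqlServer

# 1. Back up the model on the on-premises SSAS instance
Backup-ASDatabase -Server "onprem-ssas01" -Name "SalesModel" `
    -BackupFile "C:\Backups\SalesModel.abf" -AllowOverwrite

# 2. Upload SalesModel.abf to the blob container configured as the AAS
#    server's backup storage (e.g. with AzCopy).

# 3. Restore it on the Azure Analysis Services server
Restore-ASDatabase -Server "asazure://westeurope.asazure.windows.net/myaas" `
    -Name "SalesModel" -RestoreFile "SalesModel.abf" -AllowOverwrite
```

Scheduling the three steps once a day gives you the periodic copy without touching the on-prem cube's data sources.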
Can you tell us what the data source for the on-prem SSAS cube is?
If it's a SQL Server, then rather than syncing data from SSAS to AAS, you can refresh AAS directly with the on-prem SQL Server as the source, via the on-premises data gateway.
And if the cube is only used for reporting (Power BI), then having AAS alone is enough, rather than maintaining both SSAS and AAS.
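If you go the direct-refresh route, the periodic refresh can be automated by sending a TMSL refresh command to the AAS server. A minimal untested sketch with Invoke-ASCmd from the SqlServer PowerShell module (server and model names are placeholders):

```powershell
# Sketch (untested): trigger a full refresh of an AAS model whose data
# source is the on-prem SQL Server, reached through the on-premises data
# gateway. Server URL and database name are placeholders.
Import-Module SqlServer

$tmsl = @"
{
  "refresh": {
    "type": "full",
    "objects": [ { "database": "SalesModel" } ]
  }
}
"@

Invoke-ASCmd -Server "asazure://westeurope.asazure.windows.net/myaas" `
    -Query $tmsl
```

The same TMSL command can also be scoped to individual tables or partitions if a full refresh is too heavy for a daily schedule.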

Connection options to Azure Analysis Services vs Power BI vs Power BI embedded vs Power BI Premium

I'm looking for options to connect to and query the "Model"/database of Azure Analysis Services (AAS)/Power BI. I've found multiple options for connecting AAS to .NET Core, but nothing for Power BI. Can I use any of the following connection types to connect to Power BI? And if so, which flavor: Power BI Pro, Power BI Premium, or Power BI Embedded?
I can connect to Azure Analysis Services using the following:
ADOMD <- This is my preferred connection method.
AMO
MSOLAP
REST API with Bearer token
I'm not looking to embed my report in a .Net Core application. I'm looking to actually query different models so everyone is reporting off the same data.
I don't want to shell out for AAS if I can do this with Power BI Pro!
As a short answer, I would say you can most likely do what you are asking with data sets hosted within a Power BI Premium instance or by users with a Premium per user (PPU) license.
My reasoning is simple. Access to the XMLA endpoint is only available for datasets hosted within Power BI Premium. Microsoft describes that in a bit more detail here within the Power BI documentation. Power BI Embedded (in a roundabout way) also ends up requiring Power BI Premium, so I believe the same applies to Power BI Embedded.
As a reminder on why the XMLA endpoint matters, Power BI Premium encapsulates an AAS instance (with some limitations). Per Microsoft (from here):
XMLA is the same communication protocol used by the Microsoft Analysis Services engine, which under the hood, runs Power BI's semantic modeling, governance, lifecycle, and data management.
So the XMLA endpoint is required in order to allow connectivity to the AAS instance behind Power BI.
To answer your question regarding the different connection methods:
ADOMD/AMO
Microsoft provides client libraries for .NET Framework and .NET Core for both ADOMD and AMO, which should be able to connect to the XMLA endpoint of Power BI. You can browse those, and the information Microsoft provides on them, here. There are several open-source tools out there (recommended by Microsoft) that make use of these libraries, so if you are looking for examples, look into Tabular Editor 2 or DAX Studio.
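As a rough illustration of the ADOMD route, here is an untested PowerShell sketch that connects to a Premium workspace's XMLA endpoint and runs a DAX query. The DLL path, workspace name, dataset name, and table name are placeholders, and authentication setup (e.g. a service principal) is omitted:

```powershell
# Sketch (untested): query a Premium dataset over the XMLA endpoint with
# the ADOMD.NET client library. All names and paths are placeholders.
Add-Type -Path "C:\libs\Microsoft.AnalysisServices.AdomdClient.dll"

$connStr = "Data Source=powerbi://api.powerbi.com/v1.0/myorg/MyWorkspace;" +
           "Initial Catalog=SalesModel"

$conn = New-Object Microsoft.AnalysisServices.AdomdClient.AdomdConnection($connStr)
$conn.Open()   # interactive Azure AD sign-in unless credentials are supplied

$cmd = $conn.CreateCommand()
$cmd.CommandText = "EVALUATE TOPN(10, 'Sales')"   # a DAX query

$reader = $cmd.ExecuteReader()
while ($reader.Read()) {
    $reader.GetValue(0)   # first column of each result row
}
$conn.Close()
```

The same connection string shape works from a .NET Core application using the ADOMD.NET client library directly.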
MSOLAP
Per Microsoft (in same link about client libraries):
Analysis Services OLE DB Provider (MSOLAP) is the native client library for Analysis Services database connections. It's used indirectly by both ADOMD.NET and AMO, delegating connection requests to the data provider. You can also call the OLE DB Provider directly from application code.
So unless you have some very specific needs regarding MSOLAP, I would probably rely on Microsoft's AMO/ADOMD client libraries.
REST API
Assuming we are talking about the actual Power BI REST API (like this link), then it depends. The API exposes certain functionality that might cover your use case for wanting a direct connection to the XMLA endpoint. For example, it allows you to execute DAX queries or initiate dataset refreshes (each with its limitations). So I would advise you to review the API's documentation. It seems to be a good tool so far, and my guess is that it will only expand.
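For example, a hedged sketch of calling the REST API's executeQueries endpoint from PowerShell; the access token and dataset ID are placeholders you would obtain via Azure AD and the API respectively:

```powershell
# Sketch (untested): run a DAX query against a dataset through the
# Power BI REST API's executeQueries endpoint. Token and dataset ID
# are placeholders.
$token     = "<access token>"
$datasetId = "<dataset id>"

$body = @{
    queries = @( @{ query = "EVALUATE TOPN(10, 'Sales')" } )
} | ConvertTo-Json -Depth 4

Invoke-RestMethod -Method Post `
    -Uri "https://api.powerbi.com/v1.0/myorg/datasets/$datasetId/executeQueries" `
    -Headers @{ Authorization = "Bearer $token" } `
    -ContentType "application/json" `
    -Body $body
```

Note the documented limits on this endpoint (rows returned per query, queries per request), so for heavy exploration the XMLA endpoint remains the better fit.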

In GCP share a VPN gateway with other projects

I'm starting the design of the networks (VPCs, subnetworks, and such) as part of the process of moving a rather complex organization's on-premise structure to the cloud.
The chosen provider is GCP, and I have read up and taken the courses to become an associate engineer. However, the courses I've followed don't go into the technical details of doing something like this; they just present the possible options.
My background is as a senior backend, then fullstack, developer, so unfortunately I lack some of the very interesting and useful knowledge of a sysadmin.
Our case is as follows:
On premise VMs on several racks, reachable only inside a VPN
Several projects on the GCP Cloud
Two of them need to connect to the on-premise VPN, but there could be more
Some projects see each other's resources (VMs, SQL, etc.) using VPC peering
Gradually we will abandon the on-premise setup, unless we find some legacy application that is really messed up
Now, I could just create a new VPN connection for every project from Hybrid Connectivity -> VPN, but I'd rather create a project dedicated to hosting the VPN gateway and allow other projects to use that resource.
Is this a possible configuration? Is it a valid design? As far as I've explored the VPN creation, it seems that I'll have to create a VM that exposes an IP acting as a gateway; if that's the case, I was thinking of using VPC peering to allow other projects to exit into the on-premise VPN. No idea if I'm talking gibberish here. I'm still waiting for some information (IKE shared key, etc.) before attempting anything, so I'm rather lost at this point.
You have to take several aspects into consideration:
Cost: if you set up a VPN in each project, and you have to double your connectivity for HA, it will be expensive. If you have only one gateway project, it's cheaper.
Cheaper implies a trade-off: a VPN tunnel has limited bandwidth, around 3 Gbps (Cloud Interconnect as well, though higher and more expensive). If all your projects share the same VPN through this mutualization, take care with this bottleneck.
If you want to mutualise, at least for DEV/UAT projects, I recommend using VPC peering: one VPN project, with the others connected by VPC peering. Take care with the IP ranges assigned for peering. If you are interested, I wrote an article on this.
It's also possible to use a Shared VPC, which is great! But it is less compatible with several products (for example, the serverless VPC connector for Cloud Functions and App Engine isn't yet compliant with Shared VPC).
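To make the "one VPN project plus peering" layout concrete, here is an untested gcloud sketch (written for PowerShell, hence the backtick line continuations). Project, network, and region names are placeholders, and note that peered VPCs only see the on-prem routes learned over the VPN if custom route export/import is enabled on the peering:

```powershell
# Sketch (untested): a "net-hub" project owns the HA VPN gateway; a
# workload project reaches on-prem through it via VPC peering. All
# project/network/region names are placeholders.

# In the hub project: an HA VPN gateway on the hub VPC
gcloud compute vpn-gateways create onprem-gw `
    --project=net-hub --network=hub-vpc --region=europe-west1

# (The Cloud Router, external peer gateway, and vpn-tunnels create
# commands, which take the IKE shared secret, would follow here.)

# Peer a workload project's VPC with the hub VPC; both directions are
# required, and the hub must export its custom (VPN-learned) routes.
gcloud compute networks peerings create to-hub `
    --project=app-project --network=app-vpc `
    --peer-project=net-hub --peer-network=hub-vpc `
    --import-custom-routes

gcloud compute networks peerings create to-app `
    --project=net-hub --network=hub-vpc `
    --peer-project=app-project --peer-network=app-vpc `
    --export-custom-routes
```

Without the custom-route flags, VPC peering only exchanges subnet routes, so the workload project would not reach the on-premise ranges; that limitation is one reason Shared VPC is often preferred for this topology.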

Does dynatrace monitor oracle ebs(11i) completely?

I want to monitor Oracle EBS (11i) and Oracle DB (11g) simultaneously during a load test through Dynatrace.
Oracle EBS architecture
I know we can monitor an Oracle DB using Dynatrace, but I did not find how to identify which areas or modules (e.g. Order Management, Sales, Finance, Shipping) a particular workflow/user request touches during the load test.
I found that using DC RUM we can capture the metrics for the Forms server. Apart from this, I also want to monitor the concurrent processing server. Is this possible using Dynatrace or not?
With Dynatrace DC RUM you may choose one of two approaches to monitoring EBS performance.
First: DC RUM, using agentless technology, captures network traffic between all servers and clients, and as a result provides you with performance, usage, and availability details. Additionally, for the most popular network protocols, including the ones used for communication with the Oracle database, Oracle Forms servers, and web servers, it's possible to use analyzers that provide deeper performance insights. For example, with the Oracle Forms analyzer applied to EBS monitoring, DC RUM decodes all interactions between the user and Oracle Forms, reporting user names, form names, and control names, and identifying the EBS module name. For Oracle database traffic it reports performance down to single query executions, including the SQL, database schema, and user name. So, answering your question: it allows monitoring Oracle EBS and the Oracle DB simultaneously.
Second: Enterprise Synthetic allows you to create synthetic tests for key transactions in EBS. This way, for example, you may track the performance of the whole "create sales order" transaction.
DC RUM is intended for constant, systematic application performance monitoring. However, if you have it in your company, it's also a perfect tool to evaluate the results of load tests performed on EBS.

Network Traffic Emulator

I am testing an application which monitors traffic and identifies the various hosts/machines present in the network, along with the list of services/applications running on them.
How will I simulate traffic from multiple applications (e.g. HTTP, HTTPS, Oracle, SQL Server, SAP, etc.)?
I don't think there is any type of simulator doing what you have asked.
We have load testers, load-balance testers, etc.
But simulating all devices and technologies, like SAP and so on, is really difficult.
You would need an individual application installed on each device that uses the network and does what is needed.
