Dynamics AX 2009 Business Connector Permissions

In direct mode, what permissions does the Business Connector use?
In AX 2009, the Business Connector can run in direct or indirect mode.
In indirect mode, you use LogonAs to impersonate an AX user, and you inherit all of that user's permissions. I understand that; it makes sense, and I'm good with it.
Now... in direct mode, the Business Connector runs under the proxy account, which (by the installation checklist) cannot be associated with a user account in AX. So, what permissions do you have in AX while in direct mode? Unlimited access to all tables and classes?
Two more items:
The AX documentation lists four security keys for controlling the Business Connector: SysCom, SysComData, SysComExecution, and SysComIIS. However, these keys aren't assigned to any objects, user groups, or tables in AX. How do they come into play? You can't assign more than one key to an object in the AOT, and I definitely won't be removing my standard keys to add in Business Connector keys.
I also have a reference book, Inside Dynamics AX 2009. A great book, but the explanation of direct mode makes even less sense: "The direct approach uses the credentials of the current Dynamics AX User." WHICH USER? We have a client application server using the Business Connector to connect to an AX server with hundreds of users. In direct mode, does the Business Connector just pick rights from any logged-in user at will? What if no users are logged in?
So, if anyone understands it, I'd really like to understand too.
Thanks!

"The direct approach uses the credentials of the current Dynamics AX User." - When using AX BC, it is usually for connections that exist outside of AX (SSRS, EP, Rolecenters, Workflow). BC acts as a proxy for the user. Meaning if you log into the SSRS Website and attempt to run a report, the BC will act as your account and will have the same access to the data and tables within AX that you would have.


Add users for ASP.NET Core from internal website

Sorry, no code here: I am looking for a better idea, or confirmation that I am on the right track.
I have two websites; let's call them A and B.
A is a website exposed to the internet, and only users with a valid account can access it.
B is an internal (intranet) website using Windows authentication against Active Directory. I want application B (intranet) to create users for application A.
Application A uses the built-in ASP.NET JWT token authentication.
My idea is to expose an API on the extranet website (A) and let B access this API. I can use CORS to make sure only B has access to the endpoint, but I am not sure whether that is good enough protection. A third-party company will perform security penetration tests, so this approach might fail the test.
Or
I could use Entity Framework to update the AspNetUsers table manually. No idea if this is feasible or the right way of doing things.
Any other solution?
In my opinion, don't expose your internal operations through external solutions like public APIs.
Just make the database accessible to B. That way, server administration is the only security concern, and nobody outside knows how you work. In addition, it doesn't matter how each application implements user authentication (Windows Authentication or JWT); each keeps its own independent infrastructure.
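If you do go the shared-database route, one caution: don't insert rows into AspNetUsers by hand, because Identity's password hashing and security stamps make hand-rolled rows unusable. A minimal sketch of B creating a user through UserManager instead, assuming A uses the default ASP.NET Core Identity schema (the connection string and names are placeholders):

```csharp
// Minimal sketch: application B creating a user in application A's
// Identity store. Assumes A uses the default IdentityDbContext schema.
using System;
using System.Linq;
using Microsoft.AspNetCore.Identity;
using Microsoft.AspNetCore.Identity.EntityFrameworkCore;
using Microsoft.EntityFrameworkCore;
using Microsoft.Extensions.DependencyInjection;

var services = new ServiceCollection();

// Point B at the same database that A's Identity tables live in.
services.AddDbContext<IdentityDbContext>(o =>
    o.UseSqlServer("Server=...;Database=AppA;Integrated Security=SSPI;"));

services.AddIdentityCore<IdentityUser>()
        .AddEntityFrameworkStores<IdentityDbContext>();

var provider = services.BuildServiceProvider();
var userManager = provider.GetRequiredService<UserManager<IdentityUser>>();

// CreateAsync hashes the password the same way A does, so A can log
// the new user in afterwards.
var result = await userManager.CreateAsync(
    new IdentityUser { UserName = "jdoe", Email = "jdoe@example.com" },
    "S0me-Strong-P@ssw0rd");

Console.WriteLine(result.Succeeded
    ? "User created."
    : string.Join(", ", result.Errors.Select(e => e.Description)));
```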
There are multiple solutions to this one problem. In the end, it really depends on your specific criteria.
You could go with:
B (intranet) website reaching into the database and creating users as needed.
A (internet) website having an API that exposes the necessary endpoint to create users.
A (internet) website having a data migration run every now and then to insert users.
But they all come with their ups and downs; I'll try to break them down for you.
API solution
Ups:
Single responsibility: you have only one piece of code touching this database, which makes it easier to mitigate side effects.
It is "future proof": you could easily have more services using this API.
Downs:
Increased attack surface: the API is public, so it is subject to third parties trying to play with it.
The API must be maintained as the database model changes (one more piece to maintain).
Not the fastest solution to implement.
Database direct access
Ups:
Attack surface minimal.
Very quick to develop
Downs:
The database model has to be maintained twice.
Migration and deployment have to be coordinated, which is hard to maintain.
Makes the system more error-prone.
Migration on release
Ups:
Cheapest to develop
Highest performance on inserts
Downs:
Not flexible
Very slow for the user (they wait for the next migration run).
Many deployments.
Manual work (will be costly over time)
In my opinion, go for the API and secure access to it with an OAuth mechanism. If OAuth is too time-consuming to put in place, maybe you can try some simpler auth protocols. Bear in mind that CORS alone is not protection: it is enforced by browsers, not by your server, so a non-browser client can simply ignore it.
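For the API route, a minimal sketch of what the protected endpoint on A could look like. The authority URL, audience, scope name, and payload type are illustrative assumptions; B would obtain a token via the client-credentials flow and call the endpoint with it:

```csharp
// Hedged sketch: a bearer-token-protected user-creation endpoint on A.
using Microsoft.AspNetCore.Authentication.JwtBearer;
using Microsoft.AspNetCore.Builder;
using Microsoft.AspNetCore.Http;
using Microsoft.Extensions.DependencyInjection;

var builder = WebApplication.CreateBuilder(args);

builder.Services
    .AddAuthentication(JwtBearerDefaults.AuthenticationScheme)
    .AddJwtBearer(o =>
    {
        // B gets its tokens from this (assumed) OAuth authority.
        o.Authority = "https://auth.example.internal";
        o.Audience = "app-a-user-api";
    });

// Require a dedicated scope rather than relying on CORS, which only
// constrains browsers and does nothing against direct HTTP clients.
builder.Services.AddAuthorization(o =>
    o.AddPolicy("CreateUsers", p => p.RequireClaim("scope", "users.create")));

var app = builder.Build();
app.UseAuthentication();
app.UseAuthorization();

app.MapPost("/internal/users",
        (NewUser user) => Results.Created($"/users/{user.UserName}", user))
   .RequireAuthorization("CreateUsers");

app.Run();

// Illustrative payload; wire this to your Identity store in real code.
record NewUser(string UserName, string Email);
```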

Securing SQL Server 2008 R2 on a Customer's Network

We have an ASP.NET web application with SQL Server 2008 R2 as the back end.
Our client wants the application hosted on their servers to which they will have full administrative access.
I have 2 questions:
1. Is there any good way of restricting their access to the back-end database?
2. Are there any tools (preferably free or cheap) to monitor whether anyone has logged into the database from outside the application?
Many Thanks.
Regards
In answer to your first question:
If they have full admin access to the server, they're going to be able to do whatever they want with the databases on it. However, you can still add auditing to the server, if you can trust them not to tamper with it. I'd suggest making it a condition of the support you provide that they not make changes to the database directly.
In answer to your second question:
SQL Server Auditing - can be used for instance and database level auditing.
For more information, this is a pretty good guide with examples of how to set it up:
http://bradmcgehee.com/2010/03/30/an-introduction-to-sql-server-2008-audit/
This also gives even more information on how it works and examples:
http://msdn.microsoft.com/en-us/library/dd392015%28v=sql.100%29.aspx
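For reference, a hedged sketch of what setting up a SQL Server 2008 R2 login audit looks like. The audit names and file path are placeholders, and the same T-SQL can simply be run from Management Studio instead of C#:

```csharp
// Hedged sketch: create a server-level audit that records logins.
using System.Data.SqlClient;

string[] batches =
{
    // Write audit records to files in this (assumed) folder.
    @"CREATE SERVER AUDIT LoginAudit TO FILE (FILEPATH = 'D:\Audits\')",
    @"ALTER SERVER AUDIT LoginAudit WITH (STATE = ON)",
    // Capture successful and failed logins at the server level.
    @"CREATE SERVER AUDIT SPECIFICATION LoginAuditSpec
          FOR SERVER AUDIT LoginAudit
          ADD (SUCCESSFUL_LOGIN_GROUP),
          ADD (FAILED_LOGIN_GROUP)
          WITH (STATE = ON)"
};

using var conn = new SqlConnection(
    "Server=...;Database=master;Integrated Security=SSPI;");
conn.Open();

foreach (var sql in batches)
{
    using var cmd = new SqlCommand(sql, conn);
    cmd.ExecuteNonQuery();
}
```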

Is there a direct way to query and update App data from within a proxy or do I have to use the management API?

I need to change attributes of an app, and I understand I can do that with management server API calls.
The two issues with using the management server APIs are:
Performance: it means making calls to the management server when the work might be possible directly in the message processor. Performance issues can probably be mitigated with caching.
Availability: having to use the management server APIs makes the system dependent on the management server being available, whereas doing it directly in the proxy itself would reduce the number of failure points.
Any recommended alternatives?
Ultimately, all entities are stored in Cassandra (for the runtime).
Your best choice is using an AccessEntity policy to get information about an entity; that does not hit the management server. But just for your information, most of the time you do not even need an AccessEntity policy: when you use a VerifyAPIKey or VerifyAccessToken policy, all the related entity details are made available as flow variables by the message processor, so no additional AccessEntity calls should be required.
When you are updating an entity (like a developer or an application), I assume it is a management-type use case and not a runtime use case. Hence, using the management APIs should be fine.
If your use case requires a runtime API call to in turn update an attribute on the application, then possibly that attribute should not be part of the application. Think about how you can move it out to a cache, a KVM, or some other place where you can access it from the message processor (just a thought, without completely knowing the use cases).
The design of the system is that all entity editing goes through the Management Server, which in turn is responsible for performing the edits in a performant and scalable way. The Management Server is also responsible for knowing which message processors need to be informed of the changes via zookeeper registration. This also ensures that if a given Message Processor is unavailable because it, for example, is being upgraded, it will get the updates whenever it becomes available. The Management Server is the source of truth.
In the case of Developer App attributes (or really any app metadata), the values are cached for 3 minutes (I think), so the Message Processor may not see new values for up to 3 minutes.
As far as availability, the Management Server is designed to be highly available, relying on the same underlying architecture as the message processor design.

Access SSAS cube from across domains without direct database connection

I'm working with SQL Server Analysis Services for the first time and have the dilemma of working on a project in which users must be able to access SSAS Cubes (via a custom web dashboard) that live across different servers and domains, but without having access to the other server's SSAS database connection strings. So Organization A and Organization B will have their own cubes on their own servers, but Organization A users must be able to view Organization B's cubes, and Organization B users must be able to view Organization A's cubes, but neither organization should have access to the connection string.
I've read about allowing HTTP access to the SSAS server and cube from the link below, but that requires setting up users for authentication or allowing anonymous access to one organization's server for users of another organization, and I'm not sure this would be acceptable for this situation, or if this is the preferred way to do this. Is performance acceptable here?
http://technet.microsoft.com/en-us/library/cc917711.aspx
I also wonder if perhaps it makes sense to run a nightly/weekly process that accesses the other organization's SSAS database via a web service or something, pulls that data into a database on the local organization's server, and then rebuilds the cube. Then that cube could be queried without having to connect to the other organization's server when viewing it.
Has anyone else attempted to accomplish something similar? Is HTTP access the standard way to go for this? Or are there any other possible options? Thanks, and please let me know if you need more info; I'm still unclear on how some of this works.
HTTP access is probably the best option for what it sounds like you are trying to do. If the two machines are on the same network but not the same domain, using ipaddress\username on each (with the same user/password) will work, like old-school Windows networking in workgroups.
You could also just back up the cube, FTP it, and restore it on the other machine; that might work for what you are doing.
As suggested by ScaleOvenStove, HTTP is the best solution for your case, but users need to be synced on both servers to get access via HTTP. Users across both organizations' networks can be synced with an AD sync tool. A user has to be created in the other organization's network with bare-minimum rights, and you can define role-based security for what they can access in the cube.
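Once HTTP access via msmdpump.dll is in place (per the TechNet link above), querying the remote cube is an ordinary ADOMD.NET connection. A hedged sketch, with placeholder server, catalog, cube, and dimension names:

```csharp
// Hedged sketch: querying a cube over HTTP with ADOMD.NET.
using System;
using Microsoft.AnalysisServices.AdomdClient;

var connectionString =
    "Data Source=https://orgb.example.com/olap/msmdpump.dll;" +
    "Initial Catalog=SalesCube;";

using var conn = new AdomdConnection(connectionString);
conn.Open();

using var cmd = conn.CreateCommand();
cmd.CommandText = @"
    SELECT [Measures].[Sales Amount] ON COLUMNS,
           [Date].[Calendar Year].MEMBERS ON ROWS
    FROM [Sales]";

using var reader = cmd.ExecuteReader();
while (reader.Read())
    Console.WriteLine($"{reader.GetValue(0)}: {reader.GetValue(1)}");
```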

Database Authentication for Intranet Applications

I am looking for a best practice for End to End Authentication for internal Web Applications to the Database layer.
The most common scenario I have seen is to use a single SQL account with its permissions set to what the application requires. This account is used for all application calls. Then, when people need access to the database via query tools and the like, a separate group is created with query access, and people are added to that group.
The other scenario I have seen is complete Windows authentication end to end. Users themselves are added to groups that carry the permissions, so a user is able to update and change data outside the parameters of the application. This normally involves restricting people to the appropriate stored procedures so they aren't updating the tables directly.
The first scenario seems relatively easy to maintain, but it raises the concern that if there is a security hole in the application, the whole database is compromised.
The second scenario seems more secure but has the opposite concern: too much business logic ends up in stored procedures on the database, which seems to limit the use of some really cool technologies like NHibernate and LINQ. However, in this day and age, where people can use data in so many ways we don't foresee (mash-ups, etc.), is this the best approach?
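For concreteness, the two scenarios boil down to two different connections (a sketch; server and database names are placeholders):

```csharp
// Illustrative only: the two authentication scenarios as connection strings.
using System.Data.SqlClient;

// Scenario 1: one shared SQL account whose rights mirror the application's.
var appAccount = new SqlConnection(
    "Server=dbserver;Database=AppDb;User Id=app_svc;Password=...;");

// Scenario 2: Windows authentication end to end; the connection runs as
// the calling domain user, whose group memberships decide what is allowed.
var perUser = new SqlConnection(
    "Server=dbserver;Database=AppDb;Integrated Security=SSPI;");
```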
Dale - That's it exactly. If you want to provide access to the underlying data store to those users, then do it via services. And in my experience, it is those experienced computer users coming out of uni/college that damage things the most. As the saying goes, they know just enough to be dangerous.
If they want to automate part of their job, and they can demonstrate they have the requisite knowledge, then go ahead: grant their domain account access to the back end. That way anything they do via their little VBA automation is tied to their account, and you know exactly whom to look at when the data gets hosed.
My basic point is that the database is the proverbial holy grail of the application. You want as few fingers in that particular pie as possible.
As a consultant, whenever I hear that someone has allowed normal users into the database, my eyes light up because I know it's going to end up being a big paycheck for me when I get called to fix it.
Personally, I don't want normal end users in the database. For an intranet application (especially one which resides on a Domain) I would provide a single account for application access to the database which only has those rights which are needed for the application to function.
Access to the application would then be controlled via the user's domain account (turn off anonymous access in IIS, etc.).
IF a user needs, and can justify, direct access to the database, then their domain account would be given access to the database, and they can log into the DBMS using the appropriate tools.
I've been responsible for developing several internal web applications over the past year.
Our solution was using Windows Authentication (Active Directory or LDAP).
Our purpose was merely to allow a simple login using an existing company ID/password. We also wanted to make sure that the existing department would still be responsible for verifying and managing access permissions.
While I can't answer the argument concerning NHibernate or LINQ, unless you have a specific killer feature those technologies provide, Active Directory or LDAP is simple enough to implement and maintain that it's worth trying.
I agree with Stephen Wrighton. Domain security is the way to go. If you would like to use mashups and what-not, you can expose parts of the database via a machine-readable RESTful interface. SubSonic has one built in.
Stephen - Keeping normal end users out of the database is nice, but I am wondering, in this day and age, with so many experienced computer users coming out of university/college, if this is the right path. If someone wants to automate part of their job, including a VBA update to a database, which I allow them to do via the normal application, are we losing gains by restricting their access in this way?
I guess the other path implied here is that you could open up the application via services, secure those services via groups, and still keep users separated from the database.
Then, via delegation, you could allow departments to control access to their own accounts via those groups, as per Jonathan's post.
