When I try to generate X++ reports in batch, they fail to generate and say that the report is empty.
I was told that in AX2012 it is impossible to create X++ reports that run in batch. Is this true?
If not, how can I make sure my reports will generate in batch?
Is it possible to run an R script with Pentaho but, instead of exporting the result as a CSV file, insert the result directly into a table in a database?
Using the Community Edition of Pentaho, you could use a script executor step to run a shell script in your OS that does all the work, including inserting into the database. That is not very Pentaho-related: all the work is done by the shell script, and you just use Pentaho to trigger the execution of that script.
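If the work itself is written in R, a minimal sketch of what such a script could do is shown below; it assumes the DBI and odbc packages and uses a made-up DSN name and target table, neither of which comes from the question:

# transform_and_load.R - called from the Pentaho shell/script step, e.g. Rscript transform_and_load.R
library(DBI)    # generic database interface
library(odbc)   # ODBC backend (an assumption; RJDBC, RPostgres, etc. work the same way)

# Placeholder for whatever R work produces the result set
result <- data.frame(id = 1:3, score = c(0.42, 0.87, 0.13))

# Connect through a hypothetical DSN and append the result directly to a table
con <- dbConnect(odbc::odbc(), dsn = "warehouse")
dbWriteTable(con, "r_results", result, append = TRUE)  # inserts the rows, no CSV in between
dbDisconnect(con)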
There's also a very old plugin on GitHub, which I don't know would work with modern versions of Pentaho and R, that executes R code within Pentaho and then continues the stream of data to "normal" steps, such as a Table output step that inserts the data into a table.
Here are the developers' details on configuring that plugin:
http://dekarlab.de/wp/?p=5
The issue is I have a catalogue of reports that have security policies assigned to them, e.g. 'manager' can access this report but not that one, and so on throughout the catalogue. I have been told that a large chunk of the reports need these security policies updated. The only thing is, it seems very tedious to go through each and every report and update access manually.
Is there a quick way of doing this for say 500+ reports, some nested within others?
Use the command line:
runcat.cmd -cmd setItemPermissions
Run the command with the -help suffix to get the documentation, then build the call you need.
I'm working on a project to create an external data warehouse using some data from Microsoft Dynamics AX 7. I am using the new BYOD approach which allows you to define an external database, and then use that database as the target when exporting one or more Dynamics entities.
See: https://blogs.msdn.microsoft.com/dynamicsaxbi/2016/07/27/export-dynamics-ax7-entities-to-your-own-azure-sql-database/
At the time of writing, this export mechanism is able to do incremental exports (only inserted or updated records) but it lacks support for record deletions.
With AX7 you cannot directly access the AX database from external systems, so what I want to do is run a post-export SQL script that examines the MSSQL Change Tracking tables and, based on that data for deleted rows, executes a series of delete statements against the same external database.
If possible, I'm hoping to use this more generic, SQL-centric approach rather than get involved in writing custom AX entities and export code.
How might one best approach this?
I want to execute R code from an SSIS package. How can I add a control or data flow step that executes R code? SSIS scripting supports only VB.NET and C#.
SSIS has many data transformations available, but R is very friendly when it comes to data manipulation.
I want to run R code from SSIS scripts or some other way. Basically, I'm trying to integrate R into the ETL process.
I want to extract (E) data from a CSV file, transform (T) it in R, and load (L) it into a Microsoft database.
Is it possible to get this workflow done in an SSIS package by executing an R script from SSIS control or data flow items? Thanks!
Here are a couple of ways you could integrate R into your ETL process.
Crude, fast and dirty - Execute Process Task in the Control Flow. This would be similar to calling RScript from the command line. You would likely make your transformation, save it to a file on disk, and get that filename from your Execute Process Task so you can feed it into a Data Flow task. Upside is you're keeping your R clean and separate from your C#/VB.
Integrated via Rdotnet - You could use the RDotNet library (I believe, haven't tried to integrate it). You would need to register the DLLs in the GAC, and then you can either work with .NET objects in your SSIS scripts or call R scripts directly.
Integrated in SQL Server 2016 - Microsoft has added R support via extended stored procedures. You call the R script via a stored procedure, use a SQL query for the input data, and can store the output. See more detail here. This would mean utilizing an Execute SQL task in SSIS.
I hope this helps you or someone else. Since you want data processing, you might write your dataset out to a CSV file (through a Data Flow task) and execute your R file with Rscript (it can be run as a command with the Execute Process Task). Inside the file you load the dataset into a data frame (e.g. with read.csv(), or readLines() if you want the raw lines), do all the math/calculations you need, write the data or calculation results to a CSV file, and read that back in from SSIS.
It is not an elegant solution, but it works :), at least until Microsoft integrates R as a control/data flow step.
CYA
PS: here is how to execute files from the command line: Run R script from command line
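As a rough illustration of that loop, here is a minimal sketch of what the R file could look like; the file names, column names, and the calculation are made-up placeholders rather than anything from the question:

# transform.R - called by the Execute Process Task as: Rscript transform.R input.csv output.csv
args <- commandArgs(trailingOnly = TRUE)   # input and output paths passed in by SSIS
df <- read.csv(args[1])                    # load the exported dataset into a data frame
df$total <- df$quantity * df$unit_price    # placeholder calculation/transformation
write.csv(df, args[2], row.names = FALSE)  # write the results for SSIS to read back in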
I have a dataset in a Visual Studio 2010 Web App project which accesses the DB with a complex SQL statement. If I run the statement in SQL Management Studio directly, it loads in less than a second. If, however, I run it using the "Preview Data" button in the dataset designer, or I try to access it on a page (with a gridview for example), it takes over 40 seconds!
What steps should I take to track down what's causing this huge delay when working with the dataset?
There are two cases:
The problem is at the application level
The problem is at the database level, i.e. in the SQL query itself
So as a first step, try to exclude one of the cases; in my view it is much easier to start by debugging the SQL side:
Run SQL Profiler
Run the query from Management Studio
Save the profiler logs
Clear the profiler logs
Run "Preview Data" on the dataset
Compare the execution logs and see whether there is any difference in the SQL
Some steps to follow
Attach SQL Profiler to the database server to be sure what SQL command the app is actually executing
The DataSet class has very poor performance; you should try using a DataReader instead
You can use a Stopwatch instance to take timings at different points and determine which lines of code are slow