icCube Excel reporting - changing data with the same schema

I am able to connect to an existing cube from Excel. I am then trying to connect this same Excel file to a different Cube that has the same schema but a completely different set of dimensions and measures.
However, when I try to refresh the file against the new cube, Excel stays in the "Executing OLAP query" state indefinitely, and on the icCube server I see COMMON_GENERAL_ERROR with a null message. The problem seems to be that icCube tries to pull data based on old dimension values that are still in my PivotTable filters; since these values don't exist in the new cube, the query fails. The dimensions in question have an All member, and I was hoping that in such cases the query would fall back to the All member. Is this possible, and/or am I missing some setting in my schema definition?
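To illustrate the suspected failure mode, here is a hedged MDX sketch (cube, dimension, and member names are made up for illustration) of the kind of query a PivotTable filter emits, versus the All-member fallback the question hopes for:

```mdx
-- Hypothetical names: the PivotTable filter emits an explicit member reference
SELECT [Measures].[Sales] ON COLUMNS
FROM [SalesCube]
WHERE ([Product].[Category].&[OldValue])  -- this member does not exist in the new cube

-- The hoped-for fallback would be equivalent to filtering on the All member:
-- WHERE ([Product].[Category].[All])
```

Whether the engine can be configured to resolve unknown members to the All member (rather than erroring) is exactly what the question is asking.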

Related

Column pruning on parquet files defined as an external table

Context: We store historical data in Azure Data Lake as versioned parquet files from our existing Databricks pipeline, where we write to different Delta tables. One particular log source amounts to about 18 GB of parquet a day. I have read through the documentation and executed some queries, using Kusto.Explorer, against the external table I defined for that log source. The query summary window of Kusto.Explorer shows that the entire folder is downloaded when I query it, even when using the project operator. The only exception seems to be when I use the take operator.
Question: Is it possible to prune columns to reduce the amount of data being fetched from external storage? Whether during external table creation or using an operator at query time.
Background: The reason I ask is that in Databricks it is possible to use a SELECT statement to fetch only the columns I'm interested in, which reduces query time significantly.
As David wrote above, the optimization does happen on the Kusto side, but there is a bug in the "Downloaded Size" metric: it reports the total data size regardless of the selected columns. We'll fix it. Thanks for reporting.
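For reference, a minimal KQL sketch of the pattern being discussed (the external table name and column names here are placeholders, not from the original post):

```kusto
// Query an external table over parquet; per the answer above, column pruning
// is applied server-side even though the "Downloaded Size" metric suggests otherwise
external_table("LogSource")
| where Timestamp > ago(1d)
| project Timestamp, Level, Message  // only these columns should be read
| take 100
```

The project operator restricts the columns fetched from external storage; the misleading part was only the reported metric, not the actual data movement.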

AnalysisServices: Cannot query internal supporting structures for column because they are not processed. Please refresh or recalculate the table

I'm getting the following error when trying to connect Power BI to my tabular model in AS:
AnalysisServices: Cannot query internal supporting structures for column 'table'[column] because they are not processed. Please refresh or recalculate the table 'table'
It is not a calculated column and the connection seems to work fine on the local copy. I would appreciate any help with this!
This would depend on how you are processing the data within your model. If you have just done a Process Data, then the accompanying meta objects such as relationships have not yet been built.
Every column of data that you load needs to also be processed in this way regardless of whether it is a calculated column or not.
This can be achieved by running a Process Recalc on the Database or by loading your tables or table partitions with a Process Full/Process Default rather than just a Process Data, which automatically runs the Process Recalc once the data is loaded.
If you have a lot of calculated columns and tables that make a Process Recalc take a long time, you will need to factor this into your refreshes and model design.
If you run a Process Recalc on your database or a Process Full/Process Default on your table now, you will no longer have those errors in Power BI.
More in depth discussion on this can be found here: http://bifuture.blogspot.com/2017/02/ssas-processing-tabular-model.html
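As a concrete sketch of the fix described above, a Process Full on a table can be issued as a TMSL refresh command over the XMLA endpoint (database and table names below are placeholders; "full" corresponds to Process Full, "calculate" to Process Recalc, "dataOnly" to Process Data):

```json
{
  "refresh": {
    "type": "full",
    "objects": [
      { "database": "MyTabularDB", "table": "Sales" }
    ]
  }
}
```

Running this (or a database-level refresh with "type": "calculate") rebuilds the supporting structures and should clear the error in Power BI.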

How to recover the data missing from internal tables(CMPSCOMAPCLASSES) of OWB

We are using Oracle Warehouse Builder in our project. Accidentally, some of the internal tables' data got deleted. The impact is that when I open the map in OWB, the canvas is completely blank; I cannot see the tables and the transformations applied. However, when I right-click on the map and execute it, it runs perfectly fine. But the code is not visible, and neither can I deploy that map. The table whose data deletion caused this is CMPSCOMAPCLASSES. We do not have a regular backup of the database, hence cannot recover the data.
Can anybody please help me get the data back?
I appreciate your help.

Insert Transactional Data to Cube

I am new to data warehousing and am currently working on this project.
Is there any way to insert new transactional data into an existing cube? With tools, or with an MDX query maybe?
MDX is usually just a read-only language.
With an OLAP cube you have two options to change the data:
UPDATE/INSERT into the underlying SQL data mart yourself, then rebuild the cube
Use something called WRITEBACK, where you set numbers directly in the cube and it decides how to save them back to the data mart (which is tricky if you set a number at the top level, since it has to decide how to split that value among all the members down to the bottom level)
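The writeback option can be sketched with an MDX UPDATE CUBE statement (cube, measure, and member names here are hypothetical; this assumes an engine that supports writeback, such as SSAS with a writeback partition enabled):

```mdx
-- Write a value directly into a cube cell; USE_EQUAL_ALLOCATION spreads the
-- value evenly across the leaf members below the targeted (non-leaf) cell
UPDATE CUBE [Sales]
SET ([Measures].[Amount], [Product].[Category].[Bikes]) = 10000
USE_EQUAL_ALLOCATION
```

The allocation clause is exactly the "decide how to split that value" part mentioned above; other clauses such as USE_WEIGHTED_ALLOCATION distribute the value differently.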
Usually an ETL (Extract, Transform, Load) tool like Pentaho (open source) or Informatica populates the data warehouse. The data warehouse itself may use a regular database, with a product like Mondrian used to hold data in cubes. JasperReports Server, for example, has Mondrian packaged with it. Data from the transactional system is loaded into the data warehouse, and then the cube is 'refreshed'. There may be other possible approaches.

Triggering transformations from report results in BI server

Is there a capability or function in BI server to manually run a transformation that would act on the results of a report?
For example, you would retrieve records that are tagged as 'NEW' using a Pentaho Report Designer file deployed on the server. Once the results are returned, I would like to act on those 'NEW' records, do some processing with them, and then tag them as 'OLD' using a transformation triggered by a button.
Is this possible? Someone told me about xactions, but I have only the faintest idea about them.
Just wondering, why would you want the BI tool to perform the action. Why not do it in the database itself?
Vijay.
Rather than creating a set of instructions using xactions that would call a transformation and work on the user input, I just created the procedure in the database and called it from a prpt file. http://forums.pentaho.com/showthread.php?77311-Call-procedure-with-report-designer-3-6
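The database-side approach described above might look roughly like this (a PL/SQL-style sketch; the table, column, and procedure names are invented for illustration, since the original thread does not show them):

```sql
-- Hypothetical procedure: process the 'NEW' rows, then retag them as 'OLD';
-- the report (prpt) then simply calls this procedure instead of an xaction
CREATE OR REPLACE PROCEDURE process_new_records AS
BEGIN
  -- ... per-record processing would go here ...
  UPDATE records SET tag = 'OLD' WHERE tag = 'NEW';
  COMMIT;
END;
```

This keeps the tagging logic in the database, which is also what the earlier reply suggested.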