Where can I set and retrieve the environment variables used by icCube?

Within icCube there are references to several variables, such as $install and ${EXAMPLE}.
Where can I retrieve the values of these variables, and where can I set them?
Is it also possible for me to introduce new variables?

Related

Use of dynamic MDX Categories in Global Filters in the icCube application (dashboard)

Situation
I've defined a couple of MDX++ Categories in icCube and I want to use these as Global Filters in the application. An example of an MDX++ Category is
create category member [Stats].[Top 100 Leveranciers].[Totaal].[top 100] as
order(TopCount([leverancier].[leverancier].[leverancier],100, [measures].[bedrag]),[measures].[bedrag], bdesc),
add_children=true
If I use this as a Global Filter in the application, all my dashboards are filtered, showing the data for the "Top 100 Leveranciers". Perfect, so far.
Now comes the problem/question.
Some users have a security setting that allows them to see only a subset of the data. The "Top 100 Leveranciers" should therefore be different for them than for users who can view all the data. But it is not: the "Top 100 Leveranciers" returns exactly the same members for users with access to a subset as for users with access to everything.
--> How can I achieve the desired functionality in icCube?
My analysis
This is what I believe is happening 'under-the-hood':
To include a CATEGORY as a global filter option, it has to be defined in the SCHEMA DEFINITION as a SCRIPT. So far, only STATIC categories are allowed in the script. So I guess I am looking for ways to create CATEGORIES that can be used as global filters, but are DYNAMIC for dashboard users.
I am not sure why the Top members are not filtered by the actual user's access rights: that should be the case. It is better to contact icCube support directly.
Here are some quick details about STATIC vs DYNAMIC evaluation of the categories from the online documentation:
STATIC | DYNAMIC : an optional modifier to specify the evaluation context. The default value is DYNAMIC:
in that case, when evaluating the formula the list of members is filtered by the slicer
and/or sub-select content. In STATIC mode, slicer and sub-select are ignored.

Is there a way to compare file vs table record with creating new mapping using Informatica?

I'm working on a scenario where I have to compare a data record coming from a file with the data in a table, as a validation check before loading the data file into the staging table. I have come up with a couple of possible approaches that involve changing something within the load mapping, but my team suggested making the change somewhere easy to notice, since this is a non-standard approach.
Is there any approach that we can handle within the workflow manager using any of the workflow tasks or session properties?
Create a mapping that will read the file, join data with the table, do the required validation and will write nothing out (use a filter with FALSE condition) and set a variable to 0/1 to indicate if the loading should start.
Next, run the loading session if the validation passed.
This can be improved a bit if you want to store the validation errors in some audit table. Then you don't need a variable - the condition can refer to the $PMTargetName#numAffectedRows built-in variable. If it's more than zero - meaning there were some errors - don't start the load.
Create a workflow with a Command task in which you write a script that pulls the data from the table using a JDBC connection, compares it with the data present in the file, and then sets a flag indicating whether to load or not.
Based on this command-line output, you either proceed with the staging workflow or not.
Use awk for the comparison of the data; it gives you the flexibility to compare date parts within a column.
FYR : http://www.cs.unibo.it/~renzo/doc/awk/nawkA4.pdf
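As a rough illustration of the script-based approach above, here is a minimal Python sketch (using sqlite3 as a stand-in for a JDBC connection; the file, table, and column names are hypothetical). The script exits 0 when every file record matches the table and non-zero otherwise, so a Command task can gate the staging session on its exit code:

```python
import csv
import sqlite3
import sys

def validate(file_path: str, db_path: str, table: str, key_col: str) -> bool:
    """Return True when every record in the CSV file has a matching key in the table."""
    con = sqlite3.connect(db_path)
    try:
        # Load the reference keys from the table into a set for fast lookup.
        table_keys = {str(row[0]) for row in con.execute(f"SELECT {key_col} FROM {table}")}
    finally:
        con.close()
    # Collect the file keys that have no match in the table.
    with open(file_path, newline="") as f:
        reader = csv.DictReader(f)
        missing = [rec[key_col] for rec in reader if rec[key_col] not in table_keys]
    if missing:
        print(f"validation failed, unmatched keys: {missing}")
        return False
    return True

if __name__ == "__main__" and len(sys.argv) >= 5:
    # e.g. python validate.py input.csv stage.db customers customer_id
    ok = validate(sys.argv[1], sys.argv[2], sys.argv[3], sys.argv[4])
    sys.exit(0 if ok else 1)
```

In the workflow, the link condition after the Command task would then check the task's exit status before starting the staging session.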

DynamoDB Set order

From DynamoDB docs:
An attribute of type String Set. For example:
"SS": ["Giraffe", "Hippo" ,"Zebra"]
Type: Array of strings
Required: No
This is all I could find. I did some testing, but that's clearly not enough for production environments, and I would like to get a confirmation or refutation from people who have actually worked with these Sets.
Do DynamoDB Sets maintain insertion order? Can I count on that fact & build logic around that?
I'm mainly interested in String Sets, but the question probably applies to all of them (String, Number, Binary).
Here is the documentation. The SET data type doesn't preserve the order:
SET : The order of the values within a set are not preserved;
therefore, your applications must not rely on any particular order of
elements within the set.
LIST - A list type attribute can store an ordered collection of values
Similar discussion on AWS forum
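To make the SET vs LIST distinction concrete, here is a small Python sketch of DynamoDB's low-level attribute-value format (attribute contents are illustrative): a string set ("SS") carries no order guarantee, so when order matters the data should be written as a list ("L") instead, or sorted on the client side after reading.

```python
# DynamoDB string sets ("SS") behave like Python sets: fast membership
# checks, but no guaranteed element order. Lists ("L") preserve order.
animals = ["Giraffe", "Hippo", "Zebra"]

# Low-level attribute value for a string set: element order on read
# may differ from the order written - do NOT rely on it.
set_attr = {"SS": animals}

# Low-level attribute value for a list: insertion order is preserved.
list_attr = {"L": [{"S": a} for a in animals]}

# If a canonical ordering is needed from a set, impose it client-side.
ordered = sorted(set_attr["SS"])
print(ordered)  # ['Giraffe', 'Hippo', 'Zebra']
```

The same reasoning applies to Number and Binary sets: only the LIST type is the right choice when the application logic depends on element order.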

Only Allow Schema Change from Backendless Console

I want to restrict creating or modifying the data model schema for Backendless. Specifically, I want the data model schema to be created or modified only through the console; it should not be modifiable through the API.
How can this be achieved?
At this point it can be achieved only by making sure that your code stays consistent and does not introduce new fields/properties. Adding a new field to a class and then saving an instance of that class with the API will result in a new column being created. So to avoid that, make sure that your classes at any point in time represent the schema of the backend.

Programmatically populating the ValidValues property of ReportParameters in .NET for SSRS

I am trying to populate an SSRS parameter's ValidValues property at run-time without writing custom stored procedures. The issue is that the available values for a certain parameter change depending on a user's security level. I'd like to keep the logic for this in code rather than in stored procedures. Is there a way to populate ValidValues for ReportParameters in .NET for SSRS?
Have you tried creating a dataset (either a SELECT statement or a stored procedure) to pull the possible values? You can then set the parameter to use that dataset.
