Query to fetch table names from AX takes too long - axapta

I am using the following code in X++ to get table names:
client server public static container tableNames()
{
    tableId   tableId;
    int       tablecounter;
    Dictionary dict = new Dictionary();
    container tableNamesList;

    for (tablecounter = 1; tablecounter <= dict.tableCnt(); tablecounter++)
    {
        tableId = dict.tableCnt2Id(tablecounter);
        tableNamesList = conIns(tableNamesList, 1, dict.tableName(tableId));
    }
    return tableNamesList;
}
Business Connector code:
tablesList = (AxaptaContainer)Global.ax.CallStaticClassMethod("Code_Generator", "tableNames");
for (int i = 1; i <= tablesList.Count; i++)
{
    tableName = tablesList.get_Item(i).ToString();
    tables.Add(tableName);
}
The application hangs for 2-3 minutes while fetching the data. What could be the cause? Are there any optimizations?

Rather than using conIns(), use +=; it will be faster:
tableNamesList += dict.tableName(tableId);
conIns() has to work out where in the container to place the insert; += just adds to the end.
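A minimal sketch of the question's loop rewritten that way (same variables as in the code above):
// Append to the end of the container instead of inserting at position 1 on every iteration.
for (tablecounter = 1; tablecounter <= dict.tableCnt(); tablecounter++)
{
    tableId = dict.tableCnt2Id(tablecounter);
    tableNamesList += dict.tableName(tableId);
}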

As mentioned before, avoid conIns() when appending elements to a container, because it makes a new copy of the container. Use += instead to append in place.
Also, you may want to check for permissions and leave out temporary tables, table maps, and other special cases. Standard AX has a method to build a table name lookup form that takes these things into account; check the method Global::pickTable() for details.
You could also avoid some calls through the Business Connector by building the entire list in AX in a similar way and returning it in a single function call.

If you are using Dynamics AX 2012, you could skip the TreeNode stuff and use the SysModelElement table to fetch the data, returning it directly as a .NET ArrayList to make things easier on the other side.
public static System.Collections.ArrayList FetchTableNames_ModelElementTables()
{
    SysModelElement              element;
    SysModelElementType          elementType;
    System.Collections.ArrayList tableNames = new System.Collections.ArrayList();
    ;
    // The SysModelElementType table contains the element types,
    // and we need the RecId for the next selection.
    select firstonly RecId
        from elementType
        where elementType.Name == 'Table';

    // With the RecId of the table element type, select all of the
    // elements with that type (hence, select all of the tables).
    while select Name
        from element
        where element.ElementType == elementType.RecId
    {
        tableNames.Add(element.Name);
    }

    return tableNames;
}

Alright, I have tried a lot of things, and in the end I decided to create a table that holds all the table names. A job populates this table, and I fetch the records from it.
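For reference, the populating job could look roughly like this. This is only a sketch: TableNamesCache and its Name field are a hypothetical custom table, not standard AX.
// Sketch only: rebuilds a hypothetical TableNamesCache table with one record per AOT table.
static void PopulateTableNamesCache(Args _args)
{
    Dictionary      dict = new Dictionary();
    TableNamesCache cache;   // hypothetical custom table with a Name field
    int             i;

    ttsBegin;
    delete_from cache;       // start from an empty cache

    for (i = 1; i <= dict.tableCnt(); i++)
    {
        cache.Name = dict.tableName(dict.tableCnt2Id(i));
        cache.insert();
    }
    ttsCommit;
}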

Related

Dynamics AX - Adding tables to DatabaseLog programmatically in AX 2009

I'm looking for a way to enable logging of changes for certain tables.
I have tried and tested adding tables to the database log programmatically, but with varying success so far: sometimes it works, sometimes it doesn't (mostly it does not). It seems that simply inserting rows into the DatabaseLog table doesn't quite do the trick.
What I have tried:
Adding a row with the proper tableId, fieldId, logType, and domainId. The domain has been assigned as 'Admin', the main company, an empty value, and subcompanies, all with the same result.
I have created a class that handles the inserts; its two main methods are:
public static void InsertBase(str tableName, domainId _domain = 'Admin')
{
    // base logging for insert, delete, update on fieldId = 0
    DatabaseLog     dbDict;
    TableId         _tableId;
    DatabaseLogType _logType;
    fieldId         _fieldId = 0;
    List            logTypes;
    int             i;
    ListEnumerator  enumerator;
    ;
    _tableId = tableName2id(tableName);
    logTypes = new List(Types::Enum);
    logTypes.addEnd(DatabaseLogType::Insert);
    logTypes.addEnd(DatabaseLogType::Update);
    logTypes.addEnd(DatabaseLogType::Delete);
    logTypes.addEnd(DatabaseLogType::EventInsert);
    logTypes.addEnd(DatabaseLogType::EventUpdate);
    logTypes.addEnd(DatabaseLogType::EventDelete);

    enumerator = logTypes.getEnumerator();
    while (enumerator.moveNext())
    {
        _logType = enumerator.current();
        select * from dbDict
            where dbDict.logTable == _tableId
               && dbDict.logField == _fieldId
               && dbDict.logType  == _logType;
        if (!dbDict)    // no setup record exists yet
        {
            dbDict.logTable = _tableId;
            dbDict.logField = _fieldId;
            dbDict.logType  = _logType;
            dbDict.domainId = _domain;
            dbDict.insert();
        }
    }
    info("Success");
}
and the method that lists every field of the table and adds it with DatabaseLogType::Update:
public static void init(str tableName, DomainId domain = 'Admin')
{
    DatabaseLogType logType;
    int             i;
    container       kk, ll;
    DatabaseLog     dbLog;
    TableId         _tableId;
    FieldId         _fieldId;
    ;
    logType = DatabaseLogType::Update;
    // buildFieldList returns a container of table fields not yet added to the database log
    kk = BLX_AddTableToDatabaseLog::buildFieldList(logType, tableName);
    for (i = 1; i <= conLen(kk); i++)
    {
        ll = conPeek(kk, i);
        _tableId = tableName2id(tableName);
        _fieldId = conPeek(ll, 1);
        info(strFmt("%1 %2", conPeek(ll, 1), conPeek(ll, 2)));

        dbLog.logType  = logType;
        dbLog.logTable = _tableId;
        dbLog.domainId = domain;
        dbLog.logField = _fieldId;
        dbLog.insert();
    }
}
What am I missing?
EDIT with some additional info:
It does not work for SalesTable, SalesLine, or WMSBillOfLading.
I couldn't add logging for SalesTable and SalesLine using the wizard in the administration panel, but my colleague somehow did (she performed exactly the same steps as I did). We also tried to add logging to various other tables and often found that she could while I could not, and vice versa (and sometimes neither of us managed, as with the WMSBillOfLading table).
The inconsistency of this mechanism is what drove me to write this code, which I hoped would solve all the problems.
After making your setup changes, you probably have to call
SysFlushDatabaseLogSetup::main();
to flush any caches.
This method is also called by the standard AX code in the form method SysDatabaseLogTableSetup\Methods\close and in the class method SysDatabaseLogWizard\doRun.
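For example, a job that applies the setup from the question's class and then flushes the cache could look like this (a sketch, reusing the class and method names from the question):
static void AddTableLoggingAndFlush(Args _args)
{
    // Set up base insert/update/delete logging and per-field update logging for one table...
    BLX_AddTableToDatabaseLog::InsertBase('SalesTable');
    BLX_AddTableToDatabaseLog::init('SalesTable');

    // ...then flush the cached database log setup so the new rows take effect.
    SysFlushDatabaseLogSetup::main();
}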

How to migrate dynamodb data on major table change?

During development, structures and requirements change. Key and index settings need to be changed, which might break an incremental table update. So my solution so far is to delete the table and recreate it from the CloudFormation stack.
But how do you solve this problem for a production deployment? Is it possible to automate DynamoDB deployment as follows?
Create new table
Migrate data from old table to new table
Delete old table
Yes, it is perfectly possible to automate such a deployment. As long as you have code to create a table, it should be fairly straightforward to get all of the data from the old table, change the data, and then upload it all to a new table without any drop in up-time. If you say which language you would like to do this in, I can help a bit more.
I've done this before, and I've added below a small, generified code sample showing how you could do it in Java.
Java method for creating a table given the class of the object type stored in dynamo:
/**
 * Creates a single table with its appropriate configuration (CreateTableRequest).
 */
public void createTable(Class<?> tableClass) {
    DynamoDBMapper mapper = createMapper(); // you'll need your own function to do this
    ProvisionedThroughput pt = new ProvisionedThroughput(1L, 1L);
    CreateTableRequest ctr = mapper.generateCreateTableRequest(tableClass);
    ctr.withProvisionedThroughput(pt);

    // Provision throughput and configure the projection for secondary indices.
    if (ctr.getGlobalSecondaryIndexes() != null) {
        for (GlobalSecondaryIndex idx : ctr.getGlobalSecondaryIndexes()) {
            if (idx != null) {
                idx.withProvisionedThroughput(pt).withProjection(new Projection().withProjectionType("ALL"));
            }
        }
    }

    TableUtils.createTableIfNotExists(client, ctr);
}
Java method to delete table:
private static void deleteTable(String tableName) {
    AmazonDynamoDB client = AmazonDynamoDBClientBuilder.standard().build();
    DynamoDB dynamoDB = new DynamoDB(client);
    Table table = dynamoDB.getTable(tableName);
    try {
        System.out.println("Issuing DeleteTable request for " + tableName);
        table.delete();
        System.out.println("Waiting for " + tableName + " to be deleted...this may take a while...");
        table.waitForDelete();
    }
    catch (Exception e) {
        System.err.println("DeleteTable request failed for " + tableName);
        System.err.println(e.getMessage());
    }
}
I would scan the whole table and put all of its content into a List, then map over that list, converting the objects into your new type. Then create a new table of that type with a different name, push all of your new objects, and delete the old table after switching any references you might have from the old table to the new one. Unfortunately, this does mean that everything consuming your tables needs to be able to switch between the two staging tables. A sketch of the migration step is below.
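The scan-and-copy step could look roughly like this. OldItem, NewItem, and convert() are hypothetical placeholders for your own annotated model classes and mapping logic; the DynamoDBMapper is the same kind of mapper used in createTable above.
// Sketch of the migration step: scan the old table, convert each record, batch-write to the new table.
public void migrateData(DynamoDBMapper mapper) {
    List<NewItem> newItems = new ArrayList<>();

    // Scan everything out of the old table (the mapper pages through results lazily).
    for (OldItem oldItem : mapper.scan(OldItem.class, new DynamoDBScanExpression())) {
        newItems.add(convert(oldItem)); // convert() holds your old-schema -> new-schema mapping
    }

    // Write the converted records to the new table in batches.
    mapper.batchSave(newItems);

    // After consumers have been switched over, delete the old table (see deleteTable above).
}
Note that batchSave splits the writes into BatchWriteItem requests of up to 25 items for you, but this sketch still collects the whole table in memory; for very large tables you would want to process the scan page by page instead.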

How to identify advanced query or dynamic joins from query window?

In the query window that pops up, if a user right-clicks, chooses "1:n", and selects a table, how can one detect and use that table? I have a good sample job and screenshots that should demonstrate what I'm trying to accomplish.
I wrote this sample job that dumps out the AOT query objects, but not the dynamically joined table/range/value.
static void InventSumQuery(Args _args)
{
    Query           query = new Query(queryStr(InventDimPhys));
    QueryRun        qr    = new QueryRun(query);
    QueryBuildRange queryRange;
    DictField       dictField;
    int             i, n;

    if (qr.prompt())
    {
        for (n = 1; n <= query.dataSourceCount(); n++)
        {
            for (i = 1; i <= query.dataSourceNo(n).rangeCount(); i++)
            {
                queryRange = query.dataSourceNo(n).range(i);
                dictField  = new DictField(query.dataSourceNo(n).table(),
                                           fieldName2id(query.dataSourceNo(n).table(), queryRange.AOTname()));
                info(strFmt("%1.%2", tableId2name(dictField.tableid()), dictField.name()));
            }
        }
    }
    info("Done");
}
Of course, I figured out my own answer. Query objects are static, and the query form actually just modifies the query when you make the change.
So you need to modify the code above to:
if (qr.prompt())
{
    query = qr.query();
This gets the modified query. The advanced querying really is just a function of the form itself, which ultimately modifies the query.

Spring JDBC Dynamic Query into Map of objects

I have to dynamically execute queries that come from the database. Each query has dynamic fields, which need to be converted into a map of key-value pairs and sent to the view. For example, one query may return only one field while another may return more than two fields over multiple rows. I have to write the code in such a way that it works for any number of fields and returns the result as a map, using Spring JDBC.
Spring offers two ways to solve your problem.
Approach 1: use the queryForList method from the JdbcTemplate class. This returns a List of Maps keyed by column name, with the database values as map values. You have to manually iterate over the list; each map inside the list represents a single row of the result set.
Example:
List<Map<String, Object>> result = jdbcTemplate.queryForList(query, new Object[]{123});
Iterator<Map<String, Object>> items = result.iterator();
while (items.hasNext()) {
    Map<String, Object> row = items.next();
    System.out.println(row);
}
Approach 2: this doesn't exactly match your requirements, but it is a little faster than the first approach, though more coding is involved. You can use the queryForRowSet method.
SqlRowSet rowSet = jdbcTemplate.queryForRowSet(query, new Object[]{3576});
int columnCount = rowSet.getMetaData().getColumnCount();
System.out.println(columnCount);
while (rowSet.next()) {
    for (int id = 1; id <= columnCount; id++) {
        System.out.println(rowSet.getString(id));
        // your custom logic goes here
    }
}
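If you want the key/value shape the question asks for, you can build the map yourself from the row set's metadata. A small sketch based on the snippet above (jdbcTemplate and query are assumed to be the same as before):
// Build a list of maps (column name -> value) from a SqlRowSet, for any number of columns.
SqlRowSet rowSet = jdbcTemplate.queryForRowSet(query, new Object[]{3576});
SqlRowSetMetaData meta = rowSet.getMetaData();
int columnCount = meta.getColumnCount();

List<Map<String, Object>> rows = new ArrayList<>();
while (rowSet.next()) {
    Map<String, Object> row = new LinkedHashMap<>(); // keeps the column order of the query
    for (int i = 1; i <= columnCount; i++) {         // column indices are 1-based
        row.put(meta.getColumnName(i), rowSet.getObject(i));
    }
    rows.add(row);
}
// rows can now be sent to the view; each map is one record, keyed by column name.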

Linq to entity delete a specific column from a table

A LINQ to Entities query to delete a specific column from a table by matching a condition:
public ActionResult deleteChecks(string checkValue)
{
    check_master checks = (from table in db.check_master
                           where table.check_code == checkValue
                           select table).First();
    // now how to delete/remove checks.mcheck?
    return View("Edit");
}
I only want to update a single column (of the selected row) in the check_master table to a null value.
You can set a single property (which maps to a column) to null and save the change to the database (here assuming checks is a collection of entities):
foreach (check_master check in checks)
{
    check.mcheck = null;
}
db.SaveChanges();
using (NorthwindDataContext db = new NorthwindDataContext())
{
    // Retrieve the existing entity. Database call 1.
    Product product = db.Products.First(p => p.ProductID == 1);

    // Change the properties. LINQ to SQL knows
    // these specific properties have changed.
    product.UnitsInStock = 14;

    // Flush the changes. Database call 2.
    db.SubmitChanges();
}
Entity Framework works with a fixed table schema only.
Please tell us what your overall aim is; maybe there is a more suitable way to do it.
Updated:
foreach (var chm in db.check_master)
{
    chm.mcheck = null;
}
db.SaveChanges();
I believe that LINQ to Entities only supports DML; it does not support DDL operations.
So you would have to use a stored procedure or a raw ADO.NET query.
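If you really do need to remove the column itself, a raw SQL call is one option. This is only a sketch, assuming db is an Entity Framework 6 DbContext (on an ObjectContext the equivalent call is ExecuteStoreCommand), and it reuses the table and column names from the question:
// Sketch only: DDL through a raw SQL command, bypassing LINQ to Entities.
db.Database.ExecuteSqlCommand("ALTER TABLE check_master DROP COLUMN mcheck");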
EDIT
You can do a simple update like this:
public ActionResult deleteChecks(string checkValue)
{
    check_master checks = (from table in db.check_master
                           where table.check_code == checkValue
                           select table).First();
    checks.mcheck = null;
    db.SaveChanges();
    return View("Edit");
}
