Save values globally in Node-RED - global variables

We are working with MQTT messages: we process these messages and call a REST service to send the information to another system.
Now we need to save some values from an MQTT message so we can compare them with values from later messages. Is it possible to create an array outside the flow that can be accessed from other flows? Currently we are saving the values to a file, but this is not the way we want to do it.

Sounds like you need either an external database or a key-value store.
There are lots of database nodes for Node-RED that could do this, or for a key-value store you can use something like Redis.
You can search https://flows.nodered.org for database and Redis nodes.

You can set global variables; however, they won't be retained when you restart Node-RED. Here's an example for you.
[{"id":"5a6c6b8.2487294","type":"inject","z":"98c20df4.95abc","name":"","topic":"","payload":"val1","payloadType":"str","repeat":"","crontab":"","once":false,"onceDelay":0.1,"x":590,"y":260,"wires":[["e62b621d.37897"]]},{"id":"f83255b0.19aa48","type":"inject","z":"98c20df4.95abc","name":"","topic":"","payload":"val2","payloadType":"str","repeat":"","crontab":"","once":false,"onceDelay":0.1,"x":590,"y":300,"wires":[["e62b621d.37897"]]},{"id":"e62b621d.37897","type":"change","z":"98c20df4.95abc","name":"","rules":[{"t":"set","p":"testvar","pt":"global","to":"payload","tot":"msg"}],"action":"","property":"","from":"","to":"","reg":false,"x":810,"y":280,"wires":[[]]},{"id":"c99e3c90.ae63d","type":"inject","z":"98c20df4.95abc","name":"","topic":"","payload":"val1","payloadType":"str","repeat":"","crontab":"","once":false,"onceDelay":0.1,"x":590,"y":400,"wires":[["a72eed82.28ddd"]]},{"id":"940128d3.c5d0a8","type":"inject","z":"98c20df4.95abc","name":"","topic":"","payload":"val2","payloadType":"str","repeat":"","crontab":"","once":false,"onceDelay":0.1,"x":590,"y":440,"wires":[["a72eed82.28ddd"]]},{"id":"a72eed82.28ddd","type":"function","z":"98c20df4.95abc","name":"","func":"var compare = global.get(\"testvar\");\n\nif(typeof(compare)=='undefined'){\n //Good idea to check if it's been set so you don't get 'undefined' errors\n node.status({text:\"Global var has not been set yet\"});\n}else if(msg.payload == compare){\n node.status({text:\"Same\"});\n}else{\n node.status({text:\"NOT the same\"});\n}\n\nreturn msg;","outputs":1,"noerr":0,"x":767.01953125,"y":415.00390625,"wires":[[]]},{"id":"997cc65f.c8d238","type":"comment","z":"98c20df4.95abc","name":"Set the Global var here","info":"","x":560,"y":220,"wires":[]},{"id":"34e55934.227c16","type":"comment","z":"98c20df4.95abc","name":"Test the Global var here","info":"","x":560,"y":360,"wires":[]},{"id":"81bed25f.6022a","type":"inject","z":"98c20df4.95abc","name":"","topic":"","payload":"val1","payloadType":"str","repeat":"","crontab":"","once":false,"onceDelay":0.1,"x":590,"y":560,"wires":[["164c525e.b4f6ce"]]},{"id":"d03c41c8.088a","type":"inject","z":"98c20df4.95abc","name":"","topic":"","payload":"val2","payloadType":"str","repeat":"","crontab":"","once":false,"onceDelay":0.1,"x":590,"y":600,"wires":[["164c525e.b4f6ce"]]},{"id":"a4cd5575.a4cef8","type":"comment","z":"98c20df4.95abc","name":"Another way to set the Global var","info":"","x":590,"y":520,"wires":[]},{"id":"164c525e.b4f6ce","type":"function","z":"98c20df4.95abc","name":"","func":"global.set(\"testvar\",msg.payload);\n\nreturn msg;","outputs":1,"noerr":0,"x":770,"y":580,"wires":[[]]}]

Try node-red-contrib-state: it persists state across Node-RED restarts and has a few other tools for state management.
MQTT is also good for saving state if you have access to an MQTT broker: publish the value with the retained flag set, and any flow that subscribes to the topic gets the last value as soon as it connects.

Related

Hideous performance using Azure mobile services MobileServiceSyncTable

I have a mobile service sync table that is giving me absolutely HORRENDOUS performance.
The table is declared as:
IMobileServiceSyncTable<Myclass> myclassTable;
this.client = new MobileServiceClient("my url here");
var store = new MobileServiceSQLiteStore("localdb.db");
store.DefineTable<Myclass>();
this.client.SyncContext.InitializeAsync(store);
this.myclassTable = client.GetSyncTable<Myclass>();
Then, later in a button handler, I'm calling into:
this.myclassTable.ToCollectionAsync();
The problem is, the performance is horrific. It takes at best minutes and most times just sits there indefinitely.
Is there anything in the above that I’ve done that would explain why performance is so absolutely terrible?
this.myclassTable.ToCollectionAsync();
For an IMobileServiceSyncTable, the above method executes a SELECT * FROM [Myclass] SQL statement against your local SQLite DB.
The problem is, the performance is horrific. It takes at best minutes and most times just sits there indefinitely.
AFAIK, when working with offline sync, you invoke the pull operation to retrieve a subset of the server data, which is then inserted into the local store table. A call to await this.myclassTable.PullAsync() sends a request that retrieves the server data with a MaxPageSize of 50, and the client SDK then sends further requests to check whether there is more data and pulls it automatically.
In summary, I would recommend going through your code to locate the specific call that causes this poor performance. You could also add diagnostic logging and capture network traces via Fiddler to troubleshoot the issue.
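As a rough, untested sketch of that pull-based approach (Myclass and myclassTable come from the question; the query ID and page size are made up):

// Make sure the local store is initialized before the first query.
await this.client.SyncContext.InitializeAsync(store);

// Passing a query ID makes the SDK do an incremental sync, so later pulls
// only fetch records changed since the last one, instead of the whole table.
await this.myclassTable.PullAsync("myclassIncrementalSync", this.myclassTable.CreateQuery());

// Query the local store with a page instead of loading every row.
var items = await this.myclassTable
    .Take(50)             // illustrative page size
    .ToCollectionAsync();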

Transaction Scope in Pub/Sub + Message Label in Rebus

Currently I'm using WCF as a service bus, but I want to switch to a more powerful service bus, so I chose Rebus.
I'm somewhat new to Rebus and have some questions:
1) My data is persisted in a DB table. I want the publisher to read all persisted data every n seconds, publish it to subscribers, and then set a sent flag on the data in the DB.
Is there some timing mechanism for publishing?
Reading, publishing, and changing (flagging) the data must be done in a transaction scope. Is there a ready-made solution for this in Rebus?
2) In the consumer, I want to save the published data in some table. Reading the message from the message queue and saving it to the DB (in my handler) must be done in a transaction scope. How does Rebus do this?
3) The message label for published messages is set to a random unique string. I want to set a custom label on the created MSMQ message. Is there any solution?
1) You are on your own when it comes to querying database tables at regular intervals; there's no built-in mechanism in Rebus that does this.
I can recommend you take a look at System.Timers.Timer or something similar (see the sketch after this list).
2) You can enable automatic transaction scopes in your Rebus handlers by using the Rebus.TransactionScopes package (also shown in the sketch below).
3) Out of the box, it is not possible to specify the label to be used on the MSMQ message. It will be set by Rebus to a string consisting of the message type and ID as indicated by this extension method.
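To make (1) and (2) concrete, here is a minimal, untested sketch. It assumes an existing IBus instance called bus, a handler activator called activator, the hypothetical helpers LoadUnsentRows and MarkRowAsSent, and the hypothetical event type RowExported:

using System.Timers;
using Rebus.Config;
using Rebus.TransactionScopes;

// (1) Rebus has no scheduler for this, so drive the polling with a plain timer.
var timer = new Timer(10000); // "every n seconds", here n = 10
timer.Elapsed += async (sender, args) =>
{
    foreach (var row in LoadUnsentRows())           // hypothetical: read rows without the sent flag
    {
        await bus.Publish(new RowExported(row.Id)); // hypothetical event type
        MarkRowAsSent(row.Id);                      // hypothetical: set the sent flag
    }
};
timer.Start();

// (2) On the consumer side, let the Rebus.TransactionScopes package wrap each
// handler in an ambient System.Transactions scope, so the queue read and your
// DB insert commit together.
Configure.With(activator)
    .Transport(t => t.UseMsmq("consumer-queue"))
    .Options(o => o.HandleMessagesInsideTransactionScope())
    .Start();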

Microsoft AX Dynamics Process Integration through Outbound Ports

I would like to know the process integration steps using outbound ports.
If any such event occurs in AX Dynamics, we just want to be notified of the event in the form of XML (process integration).
Examples: sales order creation, customer creation, purchase order creation.
Outbound ports are only useful for asynchronous communication.
See AX 2012 Export Data with Outbound ports for an example (using the file system).
The steps to initiate sending data are in AIF_SendCustomer.
As this is no lightweight operation, you may consider logging the records which need integration in a custom integration table, then doing the processing in batch.
This logging is done in the insert and/or update (and maybe delete) methods.
Deletes require you to store the RecId field value in the external system so it can be used for delete requests. The following does not cover this.
For the logged table, make the following method:
void syncRecord()
{
    XXXRecordLog log;

    // Remember which record changed so the batch transfer can pick it up later
    log.RefTableId = this.TableId;
    log.RefRecId   = this.RecId;
    log.insert();
}
Then call this.syncRecord() in the insert and update methods.
In the query for the outbound service, be sure to exists-join your table and the log table. This way only changed records are exported.
Make a batch job to do the transfer using the AIF_SendCustomer as a template.
After a synchronous (AifSendMode::Sync) transfer of the records, delete the log records (or mark them transferred).
Finally call AIFoutboundProcessingService to flush the file:
new AIFoutboundProcessingService().run();
Try to keep things simple. It might be simpler to do a comma-separated file export of the changed records!

How can I use multiple databases in a sitescope monitor?

We have 2 databases: one is an Oracle 11g DB and the other is a DB2 database. I need to run a query against the Oracle database to get a list of accounts and feed that as parameters into another DB2 query. If the DB2 query returns any results, I need to send an alert. Is this in any way possible with SiteScope (I am fairly new to SiteScope, so be gentle)? It looks like there is only room for one connection string in the SiteScope monitors. Can I create two monitors (one for DB2 and one for Oracle) and use the results of one query as a parameter into the other monitor? It looks like there are some monitor-to-monitor capabilities, but I'm still trying to understand what is possible. Thanks!
It is possible to extract the results from the first query using a custom script alert, but it is not possible to then reuse the data in another monitor.
The SiteScope (SiS) Database Query monitor has no feature for including dynamically changing data in the query it runs. Generally speaking, monitor configurations are static and can only be updated by an external agent (a user or an integration).
Staying inside one vendor's box, HP Operations Orchestration (OO) would be an option to achieve your goal. You could either use OO to run the checks and send you alerts in case of a problem, or run the checks and dump the result to a file somewhere, which can then be read by SiS using a Script monitor or Logfile monitor.

BizTalk 2013 - execute stored procedure on send port without orchestration?

A while back I set up BizTalk to pick up a file via FTP and drop it into a network directory. It's all passthru, so I didn't use an orchestration.
Now I've been asked to execute a stored procedure once the file is picked up. The procedure contains no parameters and I do not need the contents of the file.
It seems like such a simple request, but I can't figure it out. Is there any way to do this without overcomplicating things?
This can be accomplished with either the WCF-SQL adapter or the WCF-Custom adapter with a SQL binding. You can do this using messaging only, with just a send port with a filter/map on it, so no orchestration is needed.
For the SOAP action header, use TypedProcedure/dbo/name_of_your_stored_procedure, and on the Messages tab you can specify the parameters to the stored procedure as well as add a payload in the following manner:
<name_of_your_stored_procedure xmlns="http://schemas.microsoft.com/Sql/2008/05/TypedProcedures/dbo">
  <parameter1>XXXX</parameter1>
  <xml_parameter>
    <bts-msg-body xmlns="http://www.microsoft.com/schemas/bts2007" encoding="string"/>
  </xml_parameter>
</name_of_your_stored_procedure>
In the above case xml_parameter will have the contents of the message payload passed to it.
The stored procedure should look something like:
CREATE PROCEDURE [dbo].[name_of_your_stored_procedure]
    @parameter1 int,
    @xml_parameter nvarchar(max)
AS
BEGIN
    -- your code goes here
END
More details can be found here.
Regards, Hasse
This MSDN page describes the process and has this to say: "You must create a BizTalk orchestration to use BizTalk Server for performing an operation on SQL Server."
However, if you're really desperate not to use an orchestration, I believe you have the option of setting the Operation context property in a custom pipeline component. Then you can initialize the message in a map on a port. In theory this should work, but I can't guarantee it; a rough sketch of the pipeline-component part follows.
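This untested sketch only shows the Execute method of such a component; the operation name is hypothetical and must match what your send port's action mapping expects:

using Microsoft.BizTalk.Component.Interop;
using Microsoft.BizTalk.Message.Interop;

// Promote BTS.Operation from a custom pipeline component so a messaging-only
// send port can resolve the WCF-SQL action without an orchestration.
public IBaseMessage Execute(IPipelineContext pContext, IBaseMessage pInMsg)
{
    pInMsg.Context.Promote(
        "Operation",                                                    // BTS.Operation
        "http://schemas.microsoft.com/BizTalk/2003/system-properties",  // system-properties namespace
        "name_of_your_stored_procedure");                               // hypothetical operation name
    return pInMsg;
}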
