BizTalk Scripting - Receive Location Schedule

I'm tasked with two things that I don't see support for:
1. Writing a script to enable a receive location with a schedule (no schedule is currently set).
2. Writing a script to remove a schedule from a receive location.
I can see how to enable and disable a receive location, but I don't see how to manipulate the schedule via script.

Check the article below; it may help you create the script.
http://msdn.microsoft.com/en-us/library/aa559496.aspx

Create two bindings files, one representing the receive port with the schedule and one without. Then use the BTSTask ImportBindings command to import whichever bindings file you want.
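For example (a sketch only; the application name and file paths are placeholders), you could export the current bindings once, save a second copy with the schedule added, and then switch between the two files from a script:

    REM Export the current bindings as a starting point
    BTSTask ExportBindings /Destination:"C:\Bindings\NoSchedule.xml" /ApplicationName:"MyApp"

    REM Import the bindings file that contains the scheduled receive location
    BTSTask ImportBindings /Source:"C:\Bindings\WithSchedule.xml" /ApplicationName:"MyApp"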

There is no support for scripting the schedule of a receive location that I'm aware of.
You can achieve the same effect in other ways, though. The simplest is to script the delivery
of the files to the receive location using a shell script or similar, as in the sketch below.
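A minimal sketch of that idea in Python (the paths and time window are assumptions; you would schedule it with Task Scheduler or cron): files are staged elsewhere and only moved into the receive folder during the allowed hours.

    # Move staged *.txt files into the receive folder only between 08:00 and 12:00.
    # Paths are placeholders; run periodically via Task Scheduler/cron.
    import shutil
    from datetime import datetime
    from pathlib import Path

    STAGING = Path(r"C:\Staging")
    RECEIVE = Path(r"C:\BizTalk\ReceiveFolder")

    if 8 <= datetime.now().hour < 12:
        for f in STAGING.glob("*.txt"):
            shutil.move(str(f), str(RECEIVE / f.name))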

Before going down your own scripting route, I'd consider two options:
1. Check the schedule/service window capability on the receive location and see whether it solves your problem; otherwise,
2. there is a community Scheduled Task adapter which is suitable for such activities:
http://biztalkscheduledtask.codeplex.com/

Related

Airflow python client

We have some applications running and we want to start using Airflow. From the documentation it seems that the only way to start a DAG is via the command line. Is this true?
For example, we have a Flask server running and we want to start some workflow controlled by Airflow. How can we achieve this? Is there an API to trigger, e.g., "run DAG now with parameters x, y, h"?
There are a couple of ways to achieve this with Airflow. Which of them, if any, is suitable depends on your situation. Two suggestions come to mind:
1. Use triggered DAGs. Python jobs running in the background can trigger a DAG to be executed when an event happens. Have a look at example_trigger_controller_dag.py and example_trigger_target_dag.py in the repository: GitHub Airflow (see the sketch after this list).
2. Use sensor tasks. There are predefined sensors available which you can use to listen for specific events in a data source, for example. If the existing ones do not satisfy your need, Airflow should be adaptable enough to let you implement your own sensor: Airflow Sensor
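A minimal sketch of the triggered-DAG approach, in the style of example_trigger_controller_dag.py (Airflow 1.x API; the DAG ids and payload here are placeholders):

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.dagrun_operator import TriggerDagRunOperator

    def conditionally_trigger(context, dag_run_obj):
        # Attach parameters for the target DAG; returning the object fires the run.
        dag_run_obj.payload = {"x": 1, "y": 2, "h": 3}
        return dag_run_obj

    dag = DAG("trigger_controller_dag", start_date=datetime(2016, 1, 1),
              schedule_interval="@once")

    TriggerDagRunOperator(
        task_id="trigger_target",
        trigger_dag_id="trigger_target_dag",  # id of the DAG to start
        python_callable=conditionally_trigger,
        dag=dag,
    )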
After reading your question, I understand your use case as: you wish to run/trigger a DAG from an HTTP server.
You can just use the provided Airflow webserver (localhost:8080/), from which you can trigger/run the DAG manually.
You can also use the experimental REST API, which is still in experimental mode, as documented in the Airflow docs.
Please elaborate more so the question can be understood better.
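For instance, with the experimental REST API enabled, a DAG run with parameters can be requested over plain HTTP (a sketch; the host, DAG id, and parameters are assumptions):

    # POST a new DAG run to Airflow's experimental REST API (Airflow 1.x).
    import requests

    resp = requests.post(
        "http://localhost:8080/api/experimental/dags/my_dag/dag_runs",
        json={"conf": {"x": 1, "y": 2, "h": 3}},  # available as dag_run.conf
    )
    resp.raise_for_status()
    print(resp.json())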

Advance MQ FTE scheduler option

We are using FTE agents for transferring files, and we want to configure a scheduled transfer to run only during certain hours of the day.
For example, if *.txt files are in the folder, transfer those files between 08:00 AM and 12:00 PM.
So far we have tried several design patterns (such as using Ant to determine the current hour, and using a trigger file that is different from the *.txt files) to solve the issue, but with no success.
Any suggestions?
I do not believe there is currently an option in WebSphere MQ FTE/MFT that provides exactly what you are looking for. From my understanding, what you are basically requesting is the Resource Monitor functionality (see the link below), but with an extra option to keep the Resource Monitor active only between two times of day.
http://www-01.ibm.com/support/knowledgecenter/SSEP7X_7.0.4/com.ibm.wmqfte.doc/resource_monitoring.htm
Currently, a Resource Monitor is active when the FTE/MFT agent hosting the Resource Monitor is running.
You would need a system that requests these transfers itself at the times you want them to be processed, for example a scheduled job along the lines of the sketch below.
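A sketch of that workaround using cron and the fteCreateTransfer command (the agent names, queue managers, and paths are placeholders):

    # transfer_txt.sh - request the transfer of any waiting *.txt files
    fteCreateTransfer -sa AGENT1 -sm QM1 -da AGENT2 -dm QM2 \
        -dd /dest/dir "/src/dir/*.txt"

    # crontab entry: run the script every 15 minutes between 08:00 and 11:59
    */15 8-11 * * * /opt/scripts/transfer_txt.sh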
Perhaps you would like to consider raising a Request For Enhancement (RFE) against the product:
https://www.ibm.com/developerworks/rfe/?BRAND_ID=181

Asynchronous Update in Web2py of a single instance

I am currently writing a program in web2py to control a dynamometer. It essentially mimics functionality provided by LabVIEW (i.e. set the mode, direction, and speed/torque of the dyno, and query the dyno speed). I want to keep one instance of the dyno alive and update a printed readout of the dyno's speed several times a second without the user doing anything. Is this possible, and is there a way I can do this with the scheduler, or is there a better way to go about it? Thanks in advance.
Yes, it is possible. You need to look into gluon/contrib/websocket_messaging.py, which contains an example in its docstring. You run it as a background process with Tornado and connect it to the instrument. It will push data to the page via a websocket and trigger execution of custom JS.
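A rough sketch of the push side, assuming websocket_messaging.py is already running as a background process (e.g. python gluon/contrib/websocket_messaging.py -k mykey -p 8888) and that read_speed() is a placeholder for your own instrument-polling code:

    import time
    from gluon.contrib.websocket_messaging import websocket_send

    while True:
        speed = read_speed()  # hypothetical call to the dynamometer
        # Push the reading to every browser subscribed to the "dyno" group.
        websocket_send("http://127.0.0.1:8888", str(speed), "mykey", "dyno")
        time.sleep(0.2)  # several updates per second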

Generating SNMP traps

I am trying to implement just half of the SNMP functionality: on certain events, I want to create a trap corresponding to each event. I am using C and Linux.
What would be the simplest way to achieve this? Do I need to use any open source utilities? Some of the events that I want to notify are very specific to my application. How do I go about implementing this case?
I am new to SNMP and have a couple of basic questions: how do the agent and manager figure out what property, i.e. object, is being referred to? Do they both parse the MIB? How is the MIB shared between agent and manager?
The easiest way is to execute the Net-SNMP executable called snmptrap:
http://www.net-snmp.org/tutorial/tutorial-5/commands/snmptrap.html
Of course, you can also link to its underlying library so as to call the C functions directly.
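As an illustration, in the spirit of the tutorial above (the manager address, community string, and OIDs are placeholders you would replace with objects from your own MIB):

    # Send an SNMPv2c trap with one attached integer varbind
    snmptrap -v 2c -c public 192.0.2.1 '' \
        NET-SNMP-EXAMPLES-MIB::netSnmpExampleHeartbeatNotification \
        netSnmpExampleHeartbeatRate i 123456

The empty '' argument tells snmptrap to fill in the sysUpTime field with the current uptime.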
About your basic questions on SNMP, you should start from a book, such as Essential SNMP or Understanding SNMP MIBs.

Drupal Rules Scheduler sends out duplicate emails (Drupal 7 Views-Rules integration)

I am sending out a nightly email through Rules Scheduler. When I execute it manually, it sends out one email as it should; however, when it runs on the schedule, it sends me 10 duplicate emails. I've looked all over and can't seem to find any solution to the problem.
Thanks in advance for any suggestions.
Use the Job Scheduler module. With this module, you first insert the data into job_schedule and create a queue for each schedule. When cron runs, it starts executing each queue, sends the mails, and then deletes the entry from the job_scheduler table; hence it will not send the same mail again and again to the same person. There is proper documentation in the job_scheduler module for Drupal 7; just go through it.
This sounds like a bug in the Rules module; it has its quirks. I see you have reported this issue in the Rules issue queue: http://drupal.org/node/1314916, which is what I was first going to suggest. So now I know your issue is with Rules 7.x-2.x dev integration with Views 7... both of which have more than a few bugs. I strongly suspect this issue has as much to do with Views as with Rules. (The 10x repetition seems unlikely to be a coincidence, since 10 is a default value for results-per-page in Views, etc.)
When you report an issue, it's helpful to include all pertinent information (Drupal version, steps to replicate, what's written to the log, etc.). I'd personally suggest seeing if you can replicate your issue in a clean installation of Drupal with just the modules necessary to run your test. If you can replicate it that way, it's easier to provide enough information for the developers to identify and resolve the issue. (For example, use Devel generate to create some nodes and dummy users, then create a very simple view, e.g. just the titles of the five most recent nodes, and use that view as the source for your email content. Does it send 5 copies? You may need to configure a localhost mail server to test this.)
