How to create a package that will import jobs into Control-M?

I am very new to Control-M and have been trying to find an answer for this question for a while now.
All of my jobs are on GitHub (JSON or XML) and I want to send them to Control-M, but Control-M is not installed on my computer (it is installed on the computer of our team's scheduling admin).
From my understanding so far, I need to export my files from GitHub in XML or JSON format for them to be 'readable' by Control-M.
I cannot install the Control-M Automation API workbench nor a Control-M instance on my computer (I am not allowed to install anything from the internet on my computer). How can I send all of my jobs to Control-M and import/load them into a Control-M that is not on my computer? Do I need to write a package to import the jobs into the scheduler?
Apologies if my question is very stupid. Thank you very much for your help.

You should be able to install the Control-M fat client on your desktop. If your admin doesn't allow you to do that, they really aren't allowing you to do your job. Alternatively, they should give you access to a server where the fat client is installed.
The fat client has options to import and export the scheduling tables and calendars in XML format.
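If your scheduling admin has the Control-M Automation API enabled on the Enterprise Manager, another option that requires nothing to be installed on your machine is to push the exported JSON definitions to its REST endpoint. Below is a minimal sketch, assuming the standard /automation-api login and deploy endpoints are reachable from your network; the host, credentials, and file name are placeholders.

```python
# Minimal sketch: deploy JSON job definitions through the Control-M Automation API.
# Host, credentials, and file name are placeholders; verify=False is only for
# environments with self-signed certificates.
import requests

BASE = "https://ctm-em.example.com:8443/automation-api"

# 1. Log in and obtain a session token.
login = requests.post(
    f"{BASE}/session/login",
    json={"username": "myuser", "password": "mypassword"},
    verify=False,
)
login.raise_for_status()
token = login.json()["token"]

# 2. Deploy the job definitions exported from GitHub (JSON format).
with open("MyJobs.json", "rb") as f:
    deploy = requests.post(
        f"{BASE}/deploy",
        headers={"Authorization": f"Bearer {token}"},
        files={"definitionsFile": f},
        verify=False,
    )
print(deploy.status_code, deploy.text)
```

The same calls can be made with curl, so a CI step in your GitHub repository could promote the definitions without anyone installing a client, provided the admin enables the API and gives you credentials.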

Related

How to download latest files from S3 bucket into local machine using Airflow

Is there a way to download the latest files from an S3 bucket to my local system using Airflow?
Since I am a newbie to Airflow I don't have much idea of how to proceed. Please assist.
Short answer: You could use S3KeySensor to detect when a certain key appears in an S3 bucket and then use S3Hook.read_key() to get the content of the key.
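A minimal sketch of how those two pieces fit together, assuming a recent Airflow 2.x with the amazon provider installed and an aws_default connection configured; the bucket name, key, and DAG id are placeholders:

```python
# Hedged sketch: wait for a key in S3, then read its content in a Python task.
# Bucket name, key, and connection id are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator
from airflow.providers.amazon.aws.hooks.s3 import S3Hook
from airflow.providers.amazon.aws.sensors.s3 import S3KeySensor


def fetch_latest_file(**context):
    """Read the object's content once the sensor has confirmed it exists."""
    hook = S3Hook(aws_conn_id="aws_default")
    content = hook.read_key(key="incoming/latest.csv", bucket_name="my-bucket")
    # ... write `content` to a local file or hand it to the next task ...
    return len(content)


with DAG(
    dag_id="s3_download_example",
    start_date=datetime(2024, 1, 1),
    schedule_interval=None,
    catchup=False,
) as dag:
    wait_for_file = S3KeySensor(
        task_id="wait_for_file",
        bucket_name="my-bucket",
        bucket_key="incoming/latest.csv",
        aws_conn_id="aws_default",
        poke_interval=60,
    )

    download = PythonOperator(
        task_id="download_file",
        python_callable=fetch_latest_file,
    )

    wait_for_file >> download
```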
Assuming you are completely new to Airflow, I would suggest:
Start with the tutorial
Read up on Connections, Hooks, and Sensors
Use this example as a starting point for your own DAG
As a followup:
Browse the amazon provider package docs to see what else there is for working with AWS services
Look through other examples

Airflow cannot open .cache file

I am trying to automate the flow of calling Spotify. I am nearly there but I am held up at one point.
I am using Airflow (ETL tool) to run my python scripts.
I am able to connect using the spotipy module, but it cannot read/access the .cache-'username' file that it needs. I am running on a Debian Google Cloud Compute instance.
The error is
Couldn't write token to cache at: .cache-'username'
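When a scheduler such as Airflow launches the script, the working directory is usually not the script's directory and may not be writable, so spotipy cannot create its relative .cache-<username> file. A hedged sketch of pointing spotipy at an absolute, writable cache path; the path, credentials, and scope are placeholders, and CacheFileHandler assumes a reasonably recent spotipy version:

```python
# Hedged sketch: give spotipy an absolute, writable cache path so the token cache
# still works when the script runs under Airflow. Paths and credentials are placeholders.
import spotipy
from spotipy.oauth2 import SpotifyOAuth
from spotipy.cache_handler import CacheFileHandler

# Must be writable by the user the Airflow worker runs as.
CACHE_PATH = "/home/airflow/.spotipy/.cache-myusername"

auth = SpotifyOAuth(
    client_id="YOUR_CLIENT_ID",
    client_secret="YOUR_CLIENT_SECRET",
    redirect_uri="http://localhost:8888/callback",
    scope="user-read-recently-played",
    open_browser=False,  # no browser on a headless Compute Engine instance
    cache_handler=CacheFileHandler(cache_path=CACHE_PATH),
)

sp = spotipy.Spotify(auth_manager=auth)
print(sp.current_user_recently_played(limit=5))
```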

Using Airflow to Run .bat file or PowerShell program located in remote Windows Box

Currently some of the jobs are running in different Windows VMs, for example:
Task Scheduler to run:
PowerShell files
.bat files
Python files
SQL Agent jobs to run SSIS packages
We are planning to use Airflow to trigger all these jobs to have better visibility and manage dependencies.
Our Airflow runs on Ubuntu.
I would like to know if there is any way to trigger the above-mentioned jobs on Windows via Airflow.
Can I get some examples on how to achieve my objectives? Please suggest what packages/libraries/plugins/operators I can use.
Yes there is. I would start by looking into the WinRM operator and hook that you can find under Microsoft in the providers:
http://airflow.apache.org/docs/apache-airflow-providers-microsoft-winrm/stable/index.html
and maybe also:
https://github.com/diyan/pywinrm
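A minimal sketch of what that might look like, assuming apache-airflow-providers-microsoft-winrm is installed and an Airflow connection (here called winrm_windows_vm) points at the Windows box; the connection id and the remote paths are placeholders:

```python
# Hedged sketch: run a .bat file and a PowerShell script on a remote Windows VM
# from an Ubuntu-hosted Airflow via WinRM. Connection id and paths are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.microsoft.winrm.hooks.winrm import WinRMHook
from airflow.providers.microsoft.winrm.operators.winrm import WinRMOperator

# Host, username, and password come from the Airflow connection.
winrm_hook = WinRMHook(ssh_conn_id="winrm_windows_vm")

with DAG(
    dag_id="windows_jobs_example",
    start_date=datetime(2024, 1, 1),
    schedule_interval=None,
    catchup=False,
) as dag:
    run_bat = WinRMOperator(
        task_id="run_bat_file",
        winrm_hook=winrm_hook,
        command="C:\\jobs\\nightly_load.bat",
    )

    run_powershell = WinRMOperator(
        task_id="run_powershell_script",
        winrm_hook=winrm_hook,
        command="powershell.exe -ExecutionPolicy Bypass -File C:\\jobs\\refresh.ps1",
    )

    run_bat >> run_powershell
```

SQL Agent jobs could be triggered the same way, for example by having the command invoke sqlcmd with msdb.dbo.sp_start_job.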

Trigger mainframe job from Airflow

May I know if Airflow supports mainframe jobs? Can we schedule mainframe jobs using Airflow?
Thanks in advance.
I do not know Airflow specifically, but we have used Ansible, Jenkins, and IBM UrbanCode Deploy for orchestration that includes distributed and mainframe process parts.
You can SSH into z/OS and use Bash, Python, cURL, Node.js, or Groovy. You could submit JCL via REST APIs. There is a command line processor for Db2 to execute SQL and stored procedures via bash terminal. There is the new Zowe CLI that brings a modern command line interface to z/OS.
I would ask the question - what is the nature of what you want to be scheduled? What language is it written in, or what language do you want it written in? If something exists today, what is the process and how is it scheduled today?
While I haven't used Airflow, you can use modern interfaces to do things on z/OS, and frequently that is what is actually needed to integrate with orchestration tools.
Elaborating on Patrick Bossman's good summary, Apache Airflow definitely supports SSH connections to run commands and/or transfer files:
https://airflow.apache.org/howto/connection/ssh.html
z/OS includes OpenSSH as a standard, IBM supported feature in the base operating system at no additional charge, although it's possible it's not running in your particular z/OS installation. Dovetailed Technologies has published a helpful "Quick Install Guide" that explains how to configure and start OpenSSH on z/OS if it isn't configured already:
http://dovetail.com/docs/pt-quick-inst/pt-quick-inst-doc.pdf
Their reference points to IBM's official z/OS documentation if you need more information.
You may decide to have other connections to z/OS from Apache Airflow, but SSH is certainly an available option.
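For instance, here is a minimal sketch of submitting a batch job over that SSH connection with Airflow's SSHOperator. The connection id, the data set name, and the use of the z/OS UNIX submit command are illustrative assumptions; they assume OpenSSH is running on the z/OS side and a zos_ssh connection is defined in Airflow.

```python
# Hedged sketch: submit JCL on z/OS over SSH from Airflow.
# The connection id, data set name, and use of the z/OS UNIX 'submit' command
# are illustrative assumptions.
from datetime import datetime

from airflow import DAG
from airflow.providers.ssh.operators.ssh import SSHOperator

with DAG(
    dag_id="zos_job_example",
    start_date=datetime(2024, 1, 1),
    schedule_interval=None,
    catchup=False,
) as dag:
    submit_jcl = SSHOperator(
        task_id="submit_nightly_jcl",
        ssh_conn_id="zos_ssh",
        # Submit a job from an MVS data set via the z/OS UNIX shell.
        command='submit "//\'MYUSER.JCL(NIGHTLY)\'"',
    )
```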
FYI, it appears possible to run Apache Airflow directly on z/OS 2.4 itself. I haven't personally tried it, but it looks good to go. The recipe to do that would be as follows:
Configure and fire up the z/OS Container Extensions ("zCX"), a standard, included, IBM supported, no additional charge feature in z/OS 2.4 that's compatible with IBM z14 and higher model IBM Z machines.
Install and run a Python container (Docker/OCI format) on zCX, for example a Python container from DockerHub. You'll need a Python container image that includes "s390x" architecture support, either on its own or in a multi-architecture container. (No problem with DockerHub's image.)
Use pip to install Apache Airflow within your Python container, per normal.
Configure your SSH (and perhaps other) connection(s) from Airflow to the rest of z/OS, as described above.
You can also run Apache Airflow on Linux on Z/LinuxONE, either on the same IBM Z machine where z/OS runs or on a different machine. You can test Apache Airflow using the free (for up to 120 days) IBM LinuxONE Community Cloud, and you could even create your own custom Docker/OCI container on the LinuxONE Community Cloud for deployment to zCX.
It might even be possible to run Airflow on Python for z/OS, without zCX, although if so there'd be some more work involved. Python for z/OS is available from Rocket Software here:
https://www.rocketsoftware.com/product-categories/mainframe/python-for-zos

Export emails from thunderbird to sql server

I need to load all of my locally stored email messages into SQL Server. Currently those are Thunderbird-based, but if I need some sort of export-to-Outlook utility, fine, just say the word. I could probably adapt some ASP.NET (C#) code to access the local messages, but Googling:
export emails from thunderbird to sql server
import to sqlserver from outlook
and a vast variation thereof is not getting me any closer to either a utility or someone's CodePlex project.
It's probably trivial local file access stuff, so it's probably been done a few thousand times and has to have been presented as utility code a few hundred... but how do I find it?
thx
A solution is to set up a virtual machine or a server running Linux (Debian) or FreeBSD (my favorite), install PostgreSQL and DBmail, and connect to DBmail through IMAP.
With DBmail it is possible to store all email in SQL.
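If you would rather stay with the "trivial local file access" route the question mentions, Thunderbird stores each folder as an mbox file, which Python's standard mailbox module can read. A hedged sketch of loading basic fields into SQL Server with pyodbc; the profile path, connection string, and table name are placeholders, and the target table is assumed to already exist:

```python
# Hedged sketch: read a Thunderbird mbox folder and insert basic fields into SQL Server.
# The profile path, connection string, and table name are placeholders; the table
# (Subject, Sender, Recipient, Body columns) is assumed to already exist.
import mailbox
import pyodbc

MBOX_PATH = r"C:\Users\me\AppData\Roaming\Thunderbird\Profiles\xxxx.default\Mail\Local Folders\Inbox"

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};SERVER=localhost;DATABASE=MailArchive;Trusted_Connection=yes"
)
cursor = conn.cursor()

for msg in mailbox.mbox(MBOX_PATH):
    # Take the first text/plain part, or the whole payload for non-multipart messages.
    if msg.is_multipart():
        parts = [p.get_payload(decode=True) for p in msg.walk()
                 if p.get_content_type() == "text/plain"]
        body = (parts[0] or b"").decode("utf-8", errors="replace") if parts else ""
    else:
        body = (msg.get_payload(decode=True) or b"").decode("utf-8", errors="replace")

    cursor.execute(
        "INSERT INTO dbo.Emails (Subject, Sender, Recipient, Body) VALUES (?, ?, ?, ?)",
        msg.get("Subject", ""), msg.get("From", ""), msg.get("To", ""), body,
    )

conn.commit()
conn.close()
```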
