Ansible async module returning "task did not complete within the requested time" though it completes well in time - asynchronous

I have written this playbook in Ansible 2.9.9:
sample.yml
---
- name: async
  hosts: [test-servers]
  tasks:
    - name: sleep for 30 seconds
      command: sleep 30
      async: 50
      poll: 5

    - name: side by side task
      command: touch /tmp/sideTask
but when running ansible-playbook sample.yml it returns:
PLAY [async] *******************************************************************
TASK [Gathering Facts] *********************************************************
ok: [target1]
TASK [sleep for 30 seconds] ****************************************************
fatal: [target1]: FAILED! => {
"changed": false
}
MSG:
async task did not complete within the requested time - 50s
PLAY RECAP *********************************************************************
target1 : ok=1 changed=0 unreachable=0 failed=1 skipped=0 rescued=0 ignored=0
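
For comparison, here is a minimal sketch of the documented fire-and-forget pattern: with poll: 0 the play moves on immediately and a later async_status task waits for the job, so the second task genuinely runs alongside the sleep (the retry/delay values below are illustrative):
---
- name: async fire-and-forget
  hosts: [test-servers]
  tasks:
    - name: sleep for 30 seconds
      command: sleep 30
      async: 50
      poll: 0               # do not block; return a job id immediately
      register: sleeper

    - name: side by side task
      command: touch /tmp/sideTask

    - name: wait for the sleep to finish
      async_status:
        jid: "{{ sleeper.ansible_job_id }}"
      register: job_result
      until: job_result.finished
      retries: 12           # 12 x 5 s comfortably covers the 30 s sleep
      delay: 5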

Related

When I run the playbook it shows "Variable not defined"

- hosts: switch
  connection: network_cli
  become_method: enable
  gather_facts: no
  vars_prompt:
    - name: vlan_id
      prompt: enter the vlan_id
      private: no
  vars:
    cli:
      username: admin
      password: int123$%^
    vlans:
      100: "CORE"
      200: "MONITORING"
      300: "ACCESS"
      400: "GUEST_WIFI"
    ansible_buffer_read_timeout: 2
  tasks:
    - name: "creating the vlans"
      ios_vlans:
        config:
          - vlan_id: "{{ vlan_id }}"
            mtu: 700
            state: active
            shutdown: disabled
      register: show_vlan

    - debug:
        var: show_vlan.stdout_lines
Output:
enter the vlan_id: 11

PLAY [switch] ****************************************************************************************************************************************************

TASK [creating the vlans] ****************************************************************************************************************************************
changed: [172.16.1.252]

TASK [debug] *****************************************************************************************************************************************************
ok: [172.16.1.252] => show_vlan.stdout_lines: VARIABLE IS NOT DEFINED!

PLAY RECAP *******************************************************************************************************************************************************
172.16.1.252 : ok=2 changed=1 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
The ios_vlans module does not have a stdout_lines key in its return values. Please check the documentation here. So debug show_vlan instead:
- debug:
    var: show_vlan

Apache Airflow : Dag task marked zombie, with background process running on remote server

**Apache Airflow version:** 1.10.9-composer
Kubernetes version: Client Version: version.Info{Major:"1", Minor:"15+", GitVersion:"v1.15.12-gke.6002", GitCommit:"035184604aff4de66f7db7fddadb8e7be76b6717", GitTreeState:"clean", BuildDate:"2020-12-01T23:13:35Z", GoVersion:"go1.12.17b4", Compiler:"gc", Platform:"linux/amd64"}
Environment: Airflow, running on top of Kubernetes - Linux version 4.19.112
OS: Linux version 4.19.112+ (builder@7fc5cdead624) (Chromium OS 9.0_pre361749_p20190714-r4 clang version 9.0.0 (/var/cache/chromeos-cache/distfiles/host/egit-src/llvm-project c11de5eada2decd0a495ea02676b6f4838cd54fb) (based on LLVM 9.0.0svn)) #1 SMP Fri Sep 4 12:00:04 PDT 2020
Kernel: Linux gke-europe-west2-asset-c-default-pool-dc35e2f2-0vgz 4.19.112+ #1 SMP Fri Sep 4 12:00:04 PDT 2020 x86_64 Intel(R) Xeon(R) CPU @ 2.20GHz GenuineIntel GNU/Linux
What happened?
A running task is marked as a zombie after the execution time crosses the latest heartbeat + 5 minutes. The task runs in the background on another application server, triggered using SSHOperator.
[2021-01-18 11:53:37,491] {taskinstance.py:888} INFO - Executing <Task(SSHOperator): load_trds_option_composite_file> on 2021-01-17T11:40:00+00:00
[2021-01-18 11:53:37,495] {base_task_runner.py:131} INFO - Running on host: airflow-worker-6f6fd78665-lm98m
[2021-01-18 11:53:37,495] {base_task_runner.py:132} INFO - Running: ['airflow', 'run', 'dsp_etrade_process_trds_option_composite_0530', 'load_trds_option_composite_file', '2021-01-17T11:40:00+00:00', '--job_id', '282759', '--pool', 'default_pool', '--raw', '-sd', 'DAGS_FOLDER/dsp_etrade_trds_option_composite_0530.py', '--cfg_path', '/tmp/tmpge4_nva0']
Task execution time:
dag_id      dsp_etrade_process_trds_option_composite_0530
duration    7270.47
start_date  2021-01-18 11:53:37,491
end_date    2021-01-18 13:54:47.799728+00:00
Scheduler Logs during that time:
[2021-01-18 13:54:54,432] {taskinstance.py:1135} ERROR - <TaskInstance: dsp_etrade_process_etrd.push_run_date 2021-01-18 13:30:00+00:00 [running]> detected as zombie
{
textPayload: "[2021-01-18 13:54:54,432] {taskinstance.py:1135} ERROR - <TaskInstance: dsp_etrade_process_etrd.push_run_date 2021-01-18 13:30:00+00:00 [running]> detected as zombie"
insertId: "1ca8zyfg3zvma66"
resource: {
type: "cloud_composer_environment"
labels: {3}
}
timestamp: "2021-01-18T13:54:54.432862699Z"
severity: "ERROR"
logName: "projects/asset-control-composer-prod/logs/airflow-scheduler"
receiveTimestamp: "2021-01-18T13:54:55.714437665Z"
}
Airflow webserver log:
X.X.X.X - - [18/Jan/2021:13:54:39 +0000] "GET /_ah/health HTTP/1.1" 200 187 "-" "GoogleHC/1.0"
{
textPayload: "172.17.0.5 - - [18/Jan/2021:13:54:39 +0000] "GET /_ah/health HTTP/1.1" 200 187 "-" "GoogleHC/1.0"
"
insertId: "1sne0gqg43o95n3"
resource: {2}
timestamp: "2021-01-18T13:54:45.401670481Z"
logName: "projects/asset-control-composer-prod/logs/airflow-webserver"
receiveTimestamp: "2021-01-18T13:54:50.598807514Z"
}
Airflow info logs:
2021-01-18 08:54:47.799 EST
{
textPayload: "NoneType: None
"
insertId: "1ne3hqgg47yzrpf"
resource: {2}
timestamp: "2021-01-18T13:54:47.799661030Z"
severity: "INFO"
logName: "projects/asset-control-composer-prod/logs/airflow-scheduler"
receiveTimestamp: "2021-01-18T13:54:50.914461159Z"
}
[2021-01-18 13:54:47,800] {taskinstance.py:1192} INFO - Marking task as FAILED.dag_id=dsp_etrade_process_trds_option_composite_0530, task_id=load_trds_option_composite_file, execution_date=20210117T114000, start_date=20210118T115337, end_date=20210118T135447
{
textPayload: "[2021-01-18 13:54:47,800] {taskinstance.py:1192} INFO - Marking task as FAILED.dag_id=dsp_etrade_process_trds_option_composite_0530, task_id=load_trds_option_composite_file, execution_date=20210117T114000, start_date=20210118T115337, end_date=20210118T135447"
insertId: "1ne3hqgg47yzrpg"
resource: {2}
timestamp: "2021-01-18T13:54:47.800605248Z"
severity: "INFO"
logName: "projects/asset-control-composer-prod/logs/airflow-scheduler"
receiveTimestamp: "2021-01-18T13:54:50.914461159Z"
}
Airflow Database shows the latest heartbeat as:
select state, latest_heartbeat from job where id=282759
--------------------------------------
state | latest_heartbeat
running | 2021-01-18 13:48:41.891934
Airflow configurations:
[celery]
worker_concurrency = 6

[scheduler]
scheduler_health_check_threshold = 60
scheduler_zombie_task_threshold = 300
max_threads = 2

[core]
dag_concurrency = 6
Kubernetes cluster:
Worker nodes: 6
What was expected to happen?
The backend process takes around 2 hours 30 minutes to finish. During such long-running jobs the task is detected as a zombie, even though the worker node is still processing the task and the state of the job is still marked as 'running'. The state of the task is not known during the run time.
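
The 5-minute window matches scheduler_zombie_task_threshold=300 in the configuration above: a task whose job heartbeat is older than that threshold gets flagged. Not a fix confirmed in this thread, but a commonly suggested mitigation for long-running remote jobs is raising that threshold in airflow.cfg; the value below is purely illustrative:
[scheduler]
# Seconds a task's heartbeat may lag before the scheduler declares it a zombie.
# Default is 300 s (the 5 minutes observed above); raise it to give long
# SSHOperator jobs more headroom. Value is illustrative, not a recommendation.
scheduler_zombie_task_threshold = 7200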

How to check if an encrypted variable is decrypted?

I have an Ansible encrypted variable. Now I'd like to be able to run my playbook even when I don't unlock the variable (with --ask-vault-pass) and just skip the tasks that depend on it. Ideally with a warning saying that the task was skipped.
Now when I run my playbook without --ask-vault-pass, it fails with an error:
fatal: [...]: FAILED! => {"changed": false, "msg": "AnsibleError: An unhandled exception occurred while templating '{{ (samba_passwords | string | from_yaml)[samba_username] }}'. Error was a <class 'ansible.parsing.vault.AnsibleVaultError'>, original message: Attempting to decrypt but no vault secrets found"}
Is there a way to check in the when: clause that an encrypted variable is not decrypted and thus inaccessible?
Q: "Check if an encrypted variable is decrypted. Skip the tasks that depend on it. Ideally with a warning saying that the task was skipped."
A: For example, given the file with the variable
shell> cat vars-test.yml
test_var1: test var1
Encrypt the file
shell> ansible-vault encrypt vars-test.yml
New Vault password:
Confirm New Vault password:
Encryption successful
shell> cat vars-test.yml
$ANSIBLE_VAULT;1.1;AES256
61373230346437306135303463393166323063656561623863306333313837666561653466393835
3738666532303836376139613766343930346263633032330a323336643061373039613330653237
30666364376266396633613162626536383161306262613062373239343232663935376364383431
6335623366613834360a336531656537626662376166323766376433653232633139383636613963
64356632633863353534323636313231633866613635343962383463636565303032
Then the playbook
shell> cat pb.yml
- hosts: test_01
  tasks:
    - include_vars: vars-test.yml
      ignore_errors: true

    - set_fact:
        test_var1: "{{ test_var1 | default('default') }}"

    - name: Execute tasks if test_var1 was decrypted
      block:
        - debug:
            msg: Execute task1
        - debug:
            msg: Execute task2
      when: test_var1 != 'default'
gives (abridged)
shell> ansible-playbook pb.yml --ask-vault-pass
TASK [include_vars] ****
ok: [test_01]
TASK [set_fact] ****
ok: [test_01]
TASK [debug] ****
ok: [test_01] =>
msg: Execute task1
TASK [debug] ****
ok: [test_01] =>
msg: Execute task2
If you don't provide the password, the playbook gives (abridged)
shell> ansible-playbook pb.yml
PLAY [test_01] ****
TASK [include_vars] ****
fatal: [test_01]: FAILED! => changed=false
ansible_facts: {}
ansible_included_var_files: []
message: Attempting to decrypt but no vault secrets found
...ignoring
TASK [set_fact] ****
ok: [test_01]
TASK [debug] ****
skipping: [test_01]
TASK [debug] ****
skipping: [test_01]
I've researched but haven't found a built-in way to test this. The easy way to solve this case is to use ignore_errors: true on the include_vars task and key off the variable afterwards; see the shorter variant sketched below.
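
A shorter variant of the same idea, assuming test_var1 is not defined anywhere else, is to skip on is defined directly instead of a sentinel default:
- hosts: test_01
  tasks:
    - include_vars: vars-test.yml
      ignore_errors: true

    - name: Execute task1 only if the vault file was decrypted
      debug:
        msg: Execute task1
      when: test_var1 is defined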

Ansible datetime timezone conversion

Is there a way to convert the Ansible date into a different timezone in a "debug" statement in my playbook? I don't want a global timezone setting at the playbook level. I have this:
- debug:
    mssg: "{{ '%Y-%m-%d %H:%M:%S' | strftime(ansible_date_time.epoch) }}"
This works fine but displays the time in UTC. I need the time displayed in EDT without setting the timezone at the global playbook level. How do I accomplish this?
If you use a command task to run date rather than relying on the ansible_date_time variable, you can set the timezone via an environment variable. E.g. the following playbook:
- hosts: localhost
  vars:
    ansible_python_interpreter: /usr/bin/python
  tasks:
    - command: "date '+%Y-%m-%d %H:%M:%S'"
      register: date_utc
      environment:
        TZ: UTC

    - command: "date '+%Y-%m-%d %H:%M:%S'"
      register: date_us_eastern
      environment:
        TZ: US/Eastern

    - debug:
        msg:
          - "{{ date_utc.stdout }}"
          - "{{ date_us_eastern.stdout }}"
Results in this output:
PLAY [localhost] *****************************************************************************
TASK [Gathering Facts] ***********************************************************************
ok: [localhost]
TASK [command] *******************************************************************************
changed: [localhost]
TASK [command] *******************************************************************************
changed: [localhost]
TASK [debug] *********************************************************************************
ok: [localhost] => {
"msg": [
"2020-05-12 15:21:05",
"2020-05-12 11:21:06"
]
}
PLAY RECAP ***********************************************************************************
localhost : ok=4 changed=2 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
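
If you'd rather not add extra tasks, the pipe lookup can do the same in one expression; note that lookups run on the controller during templating, not on the target host (a sketch of the same idea):
- debug:
    msg: >-
      {{ lookup('pipe', 'TZ=US/Eastern date "+%Y-%m-%d %H:%M:%S"') }}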

How to define an Ansible playbook environment with passed in environment dictionary

My problem is that I'm unable to set the environment for the entire playbook by passing in a dict to be set as the environment. Is that possible?
For example, here is my sample ansible playbook:
- hosts: localhost
  vars:
    env_vars: "{{ PLAY_ENVS }}"
  environment: "{{ env_vars }}"
  tasks:
    - name: Here is what you passed in
      debug: msg="env_vars == {{ env_vars }}"

    - name: What is FAKE_ENV
      debug: msg="FAKE_ENV == {{ lookup('env', 'FAKE_ENV') }}"
And I'm running this command:
/bin/ansible-playbook sample_playbook.yml --extra-vars '{PLAY_ENVS: {"FAKE_ENV":"/path/to/fake/destination"}}'
The response I'm getting is the following:
PLAY [localhost] ***************************************************************
TASK [setup] *******************************************************************
ok: [localhost]
TASK [Here is what you passed in] **********************************************
ok: [localhost] => {
"msg": "env_vars == {u'FAKE_ENV': u'/path/to/fake/destination'}"
}
TASK [What is FAKE_ENV] ********************************************************
ok: [localhost] => {
"msg": "FAKE_ENV == "
}
PLAY RECAP *********************************************************************
localhost : ok=3 changed=0 unreachable=0 failed=0
As you can see 'FAKE_ENV' is not being set in the environment. What am I doing wrong?
Lookups in Ansible are executed in the context of the parent Ansible process.
You should check your environment with a spawned process, like this:
- hosts: localhost
  vars:
    env_vars:
      FAKE_ENV: foobar
  environment: "{{ env_vars }}"
  tasks:
    - name: Test with spawned process
      shell: echo $FAKE_ENV
And get the expected result: "stdout": "foobar".
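
Applied to the question's playbook, a minimal sketch that registers the spawned process's output so the value can be printed; run it with the same --extra-vars JSON as above:
- hosts: localhost
  vars:
    env_vars: "{{ PLAY_ENVS }}"
  environment: "{{ env_vars }}"
  tasks:
    - name: Read FAKE_ENV from a spawned process
      command: printenv FAKE_ENV
      register: fake_env_out

    - name: What is FAKE_ENV
      debug:
        msg: "FAKE_ENV == {{ fake_env_out.stdout }}"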
