Why don't the apache-airflow logs show values?

I am running some simple code to check logging from an Airflow task. Why can't I see the value of the parameters variable? I see only a copy of the executed code, without the values of the variables.
This is the output:
[2020-05-21 21:25:34,766] {base_task_runner.py:115} INFO - Job 40: Subtask rna-step1-gsm-srr-task-id [2020-05-21 21:25:34,765] {bash_operator.py:129} INFO - import json
[2020-05-21 21:25:34,767] {base_task_runner.py:115} INFO - Job 40: Subtask rna-step1-gsm-srr-task-id [2020-05-21 21:25:34,765] {bash_operator.py:129} INFO - logging.info("!!!in test - nb")
[2020-05-21 21:25:34,767] {base_task_runner.py:115} INFO - Job 40: Subtask rna-step1-gsm-srr-task-id [2020-05-21 21:25:34,765] {bash_operator.py:129} INFO - logging.info("after json import")
[2020-05-21 21:25:34,767] {base_task_runner.py:115} INFO - Job 40: Subtask rna-step1-gsm-srr-task-id [2020-05-21 21:25:34,765] {bash_operator.py:129} INFO - parameters = {}
[2020-05-21 21:25:34,767] {base_task_runner.py:115} INFO - Job 40: Subtask rna-step1-gsm-srr-task-id [2020-05-21 21:25:34,765] {bash_operator.py:129} INFO - #/home/airflow/gcs/dags/dependencies/
[2020-05-21 21:25:34,767] {base_task_runner.py:115} INFO - Job 40: Subtask rna-step1-gsm-srr-task-id [2020-05-21 21:25:34,765] {bash_operator.py:129} INFO - with open('parameters.json') as json_file:
[2020-05-21 21:25:34,768] {base_task_runner.py:115} INFO - Job 40: Subtask rna-step1-gsm-srr-task-id [2020-05-21 21:25:34,765] {bash_operator.py:129} INFO - parameters = json.load(json_file)
[2020-05-21 21:25:34,768] {base_task_runner.py:115} INFO - Job 40: Subtask rna-step1-gsm-srr-task-id [2020-05-21 21:25:34,765] {bash_operator.py:129} INFO - logging.info("parameters is {}".format(parameters))
Are there any settings that need to be configured? Is it possible to have the logs in Stackdriver?
Thanks,
eilalan
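The excerpt suggests the BashOperator command is printing the script source to stdout rather than executing it with a Python interpreter, which would explain why the raw source lines (and not the rendered values) end up in the log. For comparison, a minimal sketch of the same logic inside a PythonOperator callable, where logging.info writes the actual values to the task log (the wiring and names are illustrative, not the asker's exact code):

import json
import logging

def load_parameters():
    # Runs inside the task process, so logging.info emits rendered values,
    # not source text.
    logging.info("!!!in test - nb")
    with open('parameters.json') as json_file:
        parameters = json.load(json_file)
    logging.info("parameters is %s", parameters)
    return parameters

This would be wired up with PythonOperator(task_id='load-parameters', python_callable=load_parameters).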

Related

Airflow task randomly exited with return code 1 [Local Executor / PythonOperator]

To give some context, I am using Airflow 2.3.0 on Kubernetes with the Local Executor (which may sound weird, but it works for us for now) with one pod for the webserver and two for the scheduler.
I have a DAG consisting of a single task (PythonOperator) that makes many API calls (200K) using requests.
Every 15 calls, the data is loaded into a DataFrame and stored on AWS S3 (using boto3) to reduce RAM usage, roughly following the pattern sketched below.
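A simplified sketch of that pattern (illustrative only: urls, the bucket name, and the object keys are placeholders, not the exact code):

import boto3
import pandas as pd
import requests

s3 = boto3.client("s3")
urls = []  # placeholder for the ~200K endpoint URLs
buffer, batch_num = [], 0
for i, url in enumerate(urls, start=1):
    buffer.append(requests.get(url, timeout=30).json())
    if i % 15 == 0:  # flush every 15 calls to keep RAM usage low
        df = pd.DataFrame(buffer)
        s3.put_object(
            Bucket="my-bucket",
            Key=f"ingestion/batch_{batch_num}.csv",
            Body=df.to_csv(index=False),
        )
        buffer, batch_num = [], batch_num + 1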
The problem is that I can't get to the end of this task, because it fails randomly (after 1, 10, or 120 minutes).
I have made more than 50 attempts with no success, and the only logs on the task are:
[2022-09-01, 14:45:44 UTC] {taskinstance.py:1159} INFO - Dependencies all met for <TaskInstance: INGESTION-DAILY-dag.extract_task scheduled__2022-08-30T00:00:00+00:00 [queued]>
[2022-09-01, 14:45:44 UTC] {taskinstance.py:1159} INFO - Dependencies all met for <TaskInstance: INGESTION-DAILY-dag.extract_task scheduled__2022-08-30T00:00:00+00:00 [queued]>
[2022-09-01, 14:45:44 UTC] {taskinstance.py:1356} INFO -
--------------------------------------------------------------------------------
[2022-09-01, 14:45:44 UTC] {taskinstance.py:1357} INFO - Starting attempt 23 of 24
[2022-09-01, 14:45:44 UTC] {taskinstance.py:1358} INFO -
--------------------------------------------------------------------------------
[2022-09-01, 14:45:44 UTC] {taskinstance.py:1377} INFO - Executing <Task(_PythonDecoratedOperator): extract_task> on 2022-08-30 00:00:00+00:00
[2022-09-01, 14:45:44 UTC] {standard_task_runner.py:52} INFO - Started process 942 to run task
[2022-09-01, 14:45:44 UTC] {standard_task_runner.py:79} INFO - Running: ['airflow', 'tasks', 'run', 'INGESTION-DAILY-dag', 'extract_task', 'scheduled__2022-08-30T00:00:00+00:00', '--job-id', '4390', '--raw', '--subdir', 'DAGS_FOLDER/dags/ingestion/daily_dag/dag.py', '--cfg-path', '/tmp/tmpwxasaq93', '--error-file', '/tmp/tmpl7t_gd8e']
[2022-09-01, 14:45:44 UTC] {standard_task_runner.py:80} INFO - Job 4390: Subtask extract_task
[2022-09-01, 14:45:45 UTC] {task_command.py:369} INFO - Running <TaskInstance: INGESTION-DAILY-dag.extract_task scheduled__2022-08-30T00:00:00+00:00 [running]> on host 10.XX.XXX.XXX
[2022-09-01, 14:48:17 UTC] {local_task_job.py:156} INFO - Task exited with return code 1
[2022-09-01, 14:48:17 UTC] {taskinstance.py:1395} INFO - Marking task as UP_FOR_RETRY. dag_id=INGESTION-DAILY-dag, task_id=extract_task, execution_date=20220830T000000, start_date=20220901T144544, end_date=20220901T144817
[2022-09-01, 14:48:17 UTC] {local_task_job.py:273} INFO - 0 downstream tasks scheduled from follow-on schedule check
But when I go to the pod logs, I get the following message:
[2022-09-01 14:06:31,624] {local_executor.py:128} ERROR - Failed to execute task an integer is required (got type ChunkedEncodingError).
Traceback (most recent call last):
File "/home/airflow/.local/lib/python3.8/site-packages/airflow/executors/local_executor.py", line 124, in _execute_work_in_fork
args.func(args)
File "/home/airflow/.local/lib/python3.8/site-packages/airflow/cli/cli_parser.py", line 51, in command
return func(*args, **kwargs)
File "/home/airflow/.local/lib/python3.8/site-packages/airflow/utils/cli.py", line 99, in wrapper
return f(*args, **kwargs)
File "/home/airflow/.local/lib/python3.8/site-packages/airflow/cli/commands/task_command.py", line 377, in task_run
_run_task_by_selected_method(args, dag, ti)
File "/home/airflow/.local/lib/python3.8/site-packages/airflow/cli/commands/task_command.py", line 183, in _run_task_by_selected_method
_run_task_by_local_task_job(args, ti)
File "/home/airflow/.local/lib/python3.8/site-packages/airflow/cli/commands/task_command.py", line 241, in _run_task_by_local_task_job
run_job.run()
File "/home/airflow/.local/lib/python3.8/site-packages/airflow/jobs/base_job.py", line 244, in run
self._execute()
File "/home/airflow/.local/lib/python3.8/site-packages/airflow/jobs/local_task_job.py", line 105, in _execute
self.task_runner.start()
File "/home/airflow/.local/lib/python3.8/site-packages/airflow/task/task_runner/standard_task_runner.py", line 41, in start
self.process = self._start_by_fork()
File "/home/airflow/.local/lib/python3.8/site-packages/airflow/task/task_runner/standard_task_runner.py", line 125, in _start_by_fork
os._exit(return_code)
TypeError: an integer is required (got type ChunkedEncodingError)
What I find strange is that I never had this error on other DAGs (where tasks are smaller and faster). I checked during an attempt: CPU and RAM usage are stable and low.
I get the same error locally; I also tried upgrading to 2.3.4, but nothing works.
Do you have any idea how to fix this?
Thanks a lot!
Nicolas
As @EDG956 said, this is not an error from Airflow but from the code.
I solved it by using a context manager (which alone was not enough) and recreating the session when the error occurs:
import requests

s = requests.Session()
while True:
    try:
        with s.get(base_url) as r:
            response = r
        break  # success: leave the retry loop
    except requests.exceptions.ChunkedEncodingError:
        # The connection dropped mid-response: rebuild the session and retry.
        s.close()
        s = requests.Session()
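A slightly more defensive variant of the same idea, with bounded retries and a backoff (a sketch; get_with_retry, the timeout, and the retry counts are illustrative):

import time
import requests

def get_with_retry(url, max_attempts=5, backoff_seconds=2):
    # Hypothetical helper: retry a GET on ChunkedEncodingError with a fresh session.
    session = requests.Session()
    for attempt in range(1, max_attempts + 1):
        try:
            with session.get(url, timeout=30) as response:
                response.raise_for_status()
                return response
        except requests.exceptions.ChunkedEncodingError:
            session.close()
            if attempt == max_attempts:
                raise
            time.sleep(backoff_seconds * attempt)  # back off before retrying
            session = requests.Session()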

Getting ModuleNotFoundError: No module named 'pytz' in Composer DAG run

I am doing a BigQuery operation in a Composer DAG and getting the following error:
Event with job id 36cc0fe962103bf2bb7a9c Failed
[2021-08-19 15:13:07,807] {base_task_runner.py:113} INFO - Job 263145: Subtask sample [2021-08-19 15:13:07,806] {pod_launcher.py:125} INFO - b' from google.cloud.bigquery.routine import RoutineReference\n'
[2021-08-19 15:13:07,807] {base_task_runner.py:113} INFO - Job 263145: Subtask sample [2021-08-19 15:13:07,807] {pod_launcher.py:125} INFO - b' File "/usr/local/lib/python3.7/dist-packages/google/cloud/bigquery/routine/__init__.py", line 18, in <module>\n'
[2021-08-19 15:13:07,810] {base_task_runner.py:113} INFO - Job 263145: Subtask sample [2021-08-19 15:13:07,808] {pod_launcher.py:125} INFO - b' from google.cloud.bigquery.enums import DeterminismLevel\n'
[2021-08-19 15:13:07,810] {base_task_runner.py:113} INFO - Job 263145: Subtask sample [2021-08-19 15:13:07,808] {pod_launcher.py:125} INFO - b' File "/usr/local/lib/python3.7/dist-packages/google/cloud/bigquery/enums.py", line 21, in <module>\n'
[2021-08-19 15:13:07,811] {base_task_runner.py:113} INFO - Job 263145: Subtask sample [2021-08-19 15:13:07,808] {pod_launcher.py:125} INFO - b' from google.cloud.bigquery.query import ScalarQueryParameterType\n'
[2021-08-19 15:13:07,812] {base_task_runner.py:113} INFO - Job 263145: Subtask sample [2021-08-19 15:13:07,809] {pod_launcher.py:125} INFO - b' File "/usr/local/lib/python3.7/dist-packages/google/cloud/bigquery/query.py", line 23, in <module>\n'
[2021-08-19 15:13:07,812] {base_task_runner.py:113} INFO - Job 263145: Subtask sample [2021-08-19 15:13:07,809] {pod_launcher.py:125} INFO - b' from google.cloud.bigquery.table import _parse_schema_resource\n'
[2021-08-19 15:13:07,812] {base_task_runner.py:113} INFO - Job 263145: Subtask sample [2021-08-19 15:13:07,809] {pod_launcher.py:125} INFO - b' File "/usr/local/lib/python3.7/dist-packages/google/cloud/bigquery/table.py", line 23, in <module>\n'
[2021-08-19 15:13:07,813] {base_task_runner.py:113} INFO - Job 263145: Subtask sample [2021-08-19 15:13:07,809] {pod_launcher.py:125} INFO - b' import pytz\n'
[2021-08-19 15:13:07,813] {base_task_runner.py:113} INFO - Job 263145: Subtask sample [2021-08-19 15:13:07,809] {pod_launcher.py:125} INFO - b"ModuleNotFoundError: No module named 'pytz'\n"
I didn't use pytz in my Airflow code or BigQuery method. I have tried adding pytz to my Composer environment, but it didn't work. Please suggest.
This error is due to the breakage described in https://github.com/googleapis/python-bigquery/issues/885; a fix was released in BigQuery client version 2.24.1.
You can update your BigQuery dependency using the command gcloud composer environments update. See Installing a Python dependency from PyPI for more details.
For testing purposes, the environment has Composer version 1.16.14 and Airflow version 1.10.15. Starting from the packages listed in Cloud Composer preinstalled packages, I copied only the Google dependencies and updated the packages below, because they are dependencies of BigQuery as per the BigQuery constraints file.
google-cloud-bigquery==2.24.1
google-api-core==1.29.0
grpcio==1.38.1
Full requirements.txt:
google-ads==4.0.0
google-api-core==1.29.0
google-api-python-client==1.12.8
google-apitools==0.5.31
google-auth==1.28.0
google-auth-httplib2==0.1.0
google-auth-oauthlib==0.4.3
google-cloud-automl==2.2.0
google-cloud-bigquery==2.24.1
google-cloud-bigquery-datatransfer==3.1.0
google-cloud-bigquery-storage==2.1.0
google-cloud-bigtable==1.7.0
google-cloud-build==2.0.0
google-cloud-container==1.0.1
google-cloud-core==1.6.0
google-cloud-datacatalog==3.1.0
google-cloud-dataproc==2.3.0
google-cloud-datastore==1.15.3
google-cloud-dlp==1.0.0
google-cloud-kms==2.2.0
google-cloud-language==1.3.0
google-cloud-logging==2.2.0
google-cloud-memcache==0.3.0
google-cloud-monitoring==2.0.0
google-cloud-os-login==2.1.0
google-cloud-pubsub==2.3.0
google-cloud-pubsublite==0.3.0
google-cloud-redis==2.1.0
google-cloud-secret-manager==1.0.0
google-cloud-spanner==1.19.1
google-cloud-speech==1.3.2
google-cloud-storage==1.36.2
google-cloud-tasks==2.2.0
google-cloud-texttospeech==1.0.1
google-cloud-translate==1.7.0
google-cloud-videointelligence==1.16.1
google-cloud-vision==1.0.0
google-cloud-workflows==0.2.0
google-crc32c==1.1.2
google-pasta==0.2.0
google-resumable-media==1.2.0
googleapis-common-protos==1.53.0
grpc-google-iam-v1==0.12.3
grpcio==1.38.1
grpcio-gcp==0.2.2
Command:
gcloud composer environments update your-composer-environment-name --update-pypi-packages-from-file requirements.txt --location your-composer-location
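Once the update completes, a quick sanity check (illustrative) from a task or a local shell confirms the pinned client is active:

from google.cloud import bigquery
import pytz  # the import that previously failed inside table.py

print(bigquery.__version__)  # expect 2.24.1 after the update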

Airflow log file Permission denied

I installed Airflow 2.1.2 with Python 3.6.5 on a Linux server and trigger a DAG as a run_as user, but it does not work as expected.
I created the logs folder with chmod -R 777, but when the DAG starts, the system creates the log directory under logs/ with
"drwx------ 3 root root"
which causes a permission denied error, since only root can access the folder.
Is there any solution to fix this?
Here is the error I got:
[2021-07-16 07:00:14,334] {base_task_runner.py:135} INFO - Running: ['sudo', '-E', '-H', '-u', 'hdfsprod', 'airflow', 'tasks', 'run', 'hdfs_kinit', 'hdfs_kinit', '2021-07-16T07:00:04.092676+00:00', '--job-id', '19', '--pool', 'default_pool', '--raw', '--subdir', 'DAGS_FOLDER/hdfs_test.py', '--cfg-path', '/tmp/tmp8_t6s5o2', '--error-file', '/tmp/tmpmf3zruty']
[2021-07-16 07:00:15,841] {base_task_runner.py:119} INFO - Job 19: Subtask hdfs_kinit [[34m2021-07-16 07:00:15,839[0m] {[34mdagbag.py:[0m487} INFO[0m - Filling up the DagBag from /opt/data/dags/hdfs_test.py[0m
[2021-07-16 07:00:15,992] {base_task_runner.py:119} INFO - Job 19: Subtask hdfs_kinit Traceback (most recent call last):
[2021-07-16 07:00:15,993] {base_task_runner.py:119} INFO - Job 19: Subtask hdfs_kinit File "/service/python3/lib/python3.6/pathlib.py", line 1246, in mkdir
[2021-07-16 07:00:15,993] {base_task_runner.py:119} INFO - Job 19: Subtask hdfs_kinit self._accessor.mkdir(self, mode)
[2021-07-16 07:00:15,993] {base_task_runner.py:119} INFO - Job 19: Subtask hdfs_kinit File "/service/python3/lib/python3.6/pathlib.py", line 387, in wrapped
[2021-07-16 07:00:15,993] {base_task_runner.py:119} INFO - Job 19: Subtask hdfs_kinit return strfunc(str(pathobj), *args)
[2021-07-16 07:00:15,993] {base_task_runner.py:119} INFO - Job 19: Subtask hdfs_kinit PermissionError: [Errno 13] Permission denied: '/opt/data/logs/hdfs_kinit/2021-07-16T07:00:04.092676+00:00'
[2021-07-16 07:00:15,994] {base_task_runner.py:119} INFO - Job 19: Subtask hdfs_kinit
The easiest way to fix it is to make your "run_as" user belong to the "root" group and change the umask to 002 in the scripts that start Airflow.
https://unix.stackexchange.com/questions/12842/make-all-new-files-in-a-directory-accessible-to-a-group
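To illustrate why the umask matters, a small sketch (the path is illustrative): with a restrictive umask, directories created by the parent process are not writable by the run_as user's group, while umask 002 keeps them group-writable.

import os
import stat

os.umask(0o002)  # group-writable files/dirs from here on in this process
os.makedirs('/tmp/airflow-logs-demo/task', exist_ok=True)
mode = stat.S_IMODE(os.stat('/tmp/airflow-logs-demo/task').st_mode)
print(oct(mode))  # 0o775: owner and group can write, unlike the drwx------ above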

Pod Launching failed: Pod took too long to start, Failed to run KubernetesPodOperator secret

I'm running the quickstart for the KubernetesPodOperator secret from the link below: https://cloud.google.com/composer/docs/how-to/using/using-kubernetes-pod-operator
The code used is below:
import datetime

from airflow import models
from airflow.contrib.kubernetes import secret
from airflow.contrib.operators import kubernetes_pod_operator

# A Secret is an object that contains a small amount of sensitive data such as
# a password, a token, or a key. Such information might otherwise be put in a
# Pod specification or in an image; putting it in a Secret object allows for
# more control over how it is used, and reduces the risk of accidental
# exposure.
secret_env = secret.Secret(
    # Expose the secret as an environment variable.
    deploy_type='env',
    # The name of the environment variable, since deploy_type is `env` rather
    # than `volume`.
    deploy_target='SQL_CONN',
    # Name of the Kubernetes Secret
    secret='airflow-secrets',
    # Key of a secret stored in this Secret object
    key='sql_alchemy_conn')
secret_volume = secret.Secret(
    'volume',
    # Path where we mount the secret as a volume
    '/var/secrets/google',
    # Name of the Kubernetes Secret
    'service-account',
    # Key in the form of the service account file name
    'service-account.json')

YESTERDAY = datetime.datetime.now() - datetime.timedelta(days=1)

# If a Pod fails to launch, or has an error occur in the container, Airflow
# will show the task as failed, as well as contain all of the task logs
# required to debug.
with models.DAG(
        dag_id='composer_sample_kubernetes_pod',
        schedule_interval=datetime.timedelta(days=1),
        start_date=YESTERDAY) as dag:
    # Only name, namespace, image, and task_id are required to create a
    # KubernetesPodOperator. In Cloud Composer, currently the operator defaults
    # to using the config file found at `/home/airflow/composer_kube_config` if
    # no `config_file` parameter is specified. By default it will contain the
    # credentials for Cloud Composer's Google Kubernetes Engine cluster that is
    # created upon environment creation.
    kubernetes_secret_vars_ex = kubernetes_pod_operator.KubernetesPodOperator(
        task_id='ex-kube-secrets',
        name='ex-kube-secrets',
        namespace='default',
        image='python:3.6-stretch',
        cmds=["python", "-c"],
        arguments=["print('hello world')"],
        labels={"foo": "bar"},
        startup_timeout_seconds=300,
        # The secrets to pass to the Pod; the Pod will fail to create if the
        # secrets you specify in a Secret object do not exist in Kubernetes.
        secrets=[secret_env, secret_volume],
        # env_vars allows you to specify environment variables for your
        # container to use. env_vars is templated.
        env_vars={
            'EXAMPLE_VAR': '/example/value',
            'GOOGLE_APPLICATION_CREDENTIALS': '/var/secrets/google/service-account.json'})
I have successfully created the secret using:
kubectl create secret generic airflow-secrets \
--from-literal sql_alchemy_conn=test_value
I am receiving this error:
> Reading remote log from gs://europe-west1-test-environme-5ad38518-bucket/logs/composer_kubernetes_pod/pod-ex-minimum/2020-11-08T10:29:28.767187+00:00/3.log.
[2020-11-09 10:57:38,764] {taskinstance.py:670} INFO - Dependencies all met for <TaskInstance: composer_kubernetes_pod.pod-ex-minimum 2020-11-08T10:29:28.767187+00:00 [queued]>
[2020-11-09 10:57:38,817] {taskinstance.py:670} INFO - Dependencies all met for <TaskInstance: composer_kubernetes_pod.pod-ex-minimum 2020-11-08T10:29:28.767187+00:00 [queued]>
[2020-11-09 10:57:38,818] {taskinstance.py:880} INFO -
--------------------------------------------------------------------------------
[2020-11-09 10:57:38,818] {taskinstance.py:881} INFO - Starting attempt 3 of 3
[2020-11-09 10:57:38,818] {taskinstance.py:882} INFO -
--------------------------------------------------------------------------------
[2020-11-09 10:57:38,894] {taskinstance.py:901} INFO - Executing <Task(KubernetesPodOperator): pod-ex-minimum> on 2020-11-08T10:29:28.767187+00:00
[2020-11-09 10:57:38,895] {base_task_runner.py:131} INFO - Running on host: airflow-worker-765557479f-zbzkm
[2020-11-09 10:57:38,895] {base_task_runner.py:132} INFO - Running: ['airflow', 'run', 'composer_kubernetes_pod', 'pod-ex-minimum', '2020-11-08T10:29:28.767187+00:00', '--job_id', '54', '--pool', 'default_pool', '--raw', '-sd', 'DAGS_FOLDER/kubernetes_secret3.py', '--cfg_path', '/tmp/tmp3qvpc1a6']
[2020-11-09 10:57:42,268] {base_task_runner.py:113} INFO - Job 54: Subtask pod-ex-minimum [2020-11-09 10:57:42,267] {configuration.py:618} INFO - Reading the config from /etc/airflow/airflow.cfg
[2020-11-09 10:57:42,365] {base_task_runner.py:113} INFO - Job 54: Subtask pod-ex-minimum [2020-11-09 10:57:42,364] {configuration.py:618} INFO - Reading the config from /etc/airflow/airflow.cfg
[2020-11-09 10:57:42,650] {base_task_runner.py:113} INFO - Job 54: Subtask pod-ex-minimum [2020-11-09 10:57:42,649] {default_celery.py:90} WARNING - You have configured a result_backend of redis://airflow-redis-service.default.svc.cluster.local:6379/0, it is highly recommended to use an alternative result_backend (i.e. a database).
[2020-11-09 10:57:42,651] {base_task_runner.py:113} INFO - Job 54: Subtask pod-ex-minimum [2020-11-09 10:57:42,651] {__init__.py:51} INFO - Using executor CeleryExecutor
[2020-11-09 10:57:42,652] {base_task_runner.py:113} INFO - Job 54: Subtask pod-ex-minimum [2020-11-09 10:57:42,651] {dagbag.py:397} INFO - Filling up the DagBag from /home/airflow/gcs/dags/kubernetes_secret3.py
[2020-11-09 10:57:43,393] {base_task_runner.py:113} INFO - Job 54: Subtask pod-ex-minimum Running <TaskInstance: composer_kubernetes_pod.pod-ex-minimum 2020-11-08T10:29:28.767187+00:00 [running]> on host airflow-worker-765557479f-zbzkm
[2020-11-09 10:57:59,728] {base_task_runner.py:113} INFO - Job 54: Subtask pod-ex-minimum [2020-11-09 10:57:59,727] {pod_launcher.py:142} INFO - Event: pod-ex-minimum-02a096e7 had an event of type Pending
[2020-11-09 10:58:00,736] {base_task_runner.py:113} INFO - Job 54: Subtask pod-ex-minimum [2020-11-09 10:58:00,736] {pod_launcher.py:142} INFO - Event: pod-ex-minimum-02a096e7 had an event of type Pending
[... the same "Pending" event repeats roughly once per second until 10:59:43; ~70 near-identical log lines elided ...]
[2020-11-09 10:59:44,596] {base_task_runner.py:113} INFO - Job 54: Subtask pod-ex-minimum [2020-11-09 10:59:44,595] {pod_launcher.py:142} INFO - Event: pod-ex-minimum-02a096e7 had an event of type Pending
[2020-11-09 10:59:44,645] {taskinstance.py:1148} ERROR - Pod Launching failed: Pod took too long to start
Traceback (most recent call last):
File "/usr/local/lib/airflow/airflow/contrib/operators/kubernetes_pod_operator.py", line 253, in execute
get_logs=self.get_logs)
File "/usr/local/lib/airflow/airflow/contrib/kubernetes/pod_launcher.py", line 113, in run_pod
raise AirflowException("Pod took too long to start")
airflow.exceptions.AirflowException: Pod took too long to start
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/usr/local/lib/airflow/airflow/models/taskinstance.py", line 985, in _run_raw_task
result = task_copy.execute(context=context)
File "/usr/local/lib/airflow/airflow/contrib/operators/kubernetes_pod_operator.py", line 265, in execute
raise AirflowException('Pod Launching failed: {error}'.format(error=ex))
airflow.exceptions.AirflowException: Pod Launching failed: Pod took too long to start
[2020-11-09 10:59:44,646] {base_task_runner.py:113} INFO - Job 54: Subtask pod-ex-minimum [2020-11-09 10:59:44,645] {taskinstance.py:1148} ERROR - Pod Launching failed: Pod took too long to start
[2020-11-09 10:59:44,646] {base_task_runner.py:113} INFO - Job 54: Subtask pod-ex-minimum Traceback (most recent call last):
[2020-11-09 10:59:44,647] {base_task_runner.py:113} INFO - Job 54: Subtask pod-ex-minimum File "/usr/local/lib/airflow/airflow/contrib/operators/kubernetes_pod_operator.py", line 253, in execute
[2020-11-09 10:59:44,647] {base_task_runner.py:113} INFO - Job 54: Subtask pod-ex-minimum get_logs=self.get_logs)
[2020-11-09 10:59:44,647] {base_task_runner.py:113} INFO - Job 54: Subtask pod-ex-minimum File "/usr/local/lib/airflow/airflow/contrib/kubernetes/pod_launcher.py", line 113, in run_pod
[2020-11-09 10:59:44,647] {taskinstance.py:1205} INFO - Marking task as FAILED.dag_id=composer_kubernetes_pod, task_id=pod-ex-minimum, execution_date=20201108T102928, start_date=20201109T105738, end_date=20201109T105944
[2020-11-09 10:59:44,647] {base_task_runner.py:113} INFO - Job 54: Subtask pod-ex-minimum raise AirflowException("Pod took too long to start")
[2020-11-09 10:59:44,647] {base_task_runner.py:113} INFO - Job 54: Subtask pod-ex-minimum airflow.exceptions.AirflowException: Pod took too long to start
[2020-11-09 10:59:44,647] {base_task_runner.py:113} INFO - Job 54: Subtask pod-ex-minimum
[2020-11-09 10:59:44,648] {base_task_runner.py:113} INFO - Job 54: Subtask pod-ex-minimum During handling of the above exception, another exception occurred:
[2020-11-09 10:59:44,648] {base_task_runner.py:113} INFO - Job 54: Subtask pod-ex-minimum
[2020-11-09 10:59:44,648] {base_task_runner.py:113} INFO - Job 54: Subtask pod-ex-minimum Traceback (most recent call last):
[2020-11-09 10:59:44,648] {base_task_runner.py:113} INFO - Job 54: Subtask pod-ex-minimum File "/usr/local/lib/airflow/airflow/models/taskinstance.py", line 985, in _run_raw_task
[2020-11-09 10:59:44,649] {base_task_runner.py:113} INFO - Job 54: Subtask pod-ex-minimum result = task_copy.execute(context=context)
[2020-11-09 10:59:44,649] {base_task_runner.py:113} INFO - Job 54: Subtask pod-ex-minimum File "/usr/local/lib/airflow/airflow/contrib/operators/kubernetes_pod_operator.py", line 265, in execute
[2020-11-09 10:59:44,650] {base_task_runner.py:113} INFO - Job 54: Subtask pod-ex-minimum raise AirflowException('Pod Launching failed: {error}'.format(error=ex))
[2020-11-09 10:59:44,650] {base_task_runner.py:113} INFO - Job 54: Subtask pod-ex-minimum airflow.exceptions.AirflowException: Pod Launching failed: Pod took too long to start
[2020-11-09 10:59:44,650] {base_task_runner.py:113} INFO - Job 54: Subtask pod-ex-minimum [2020-11-09 10:59:44,647] {taskinstance.py:1205} INFO - Marking task as FAILED.dag_id=composer_kubernetes_pod, task_id=pod-ex-minimum, execution_date=20201108T102928, start_date=20201109T105738, end_date=20201109T105944
[2020-11-09 10:59:44,703] {base_task_runner.py:113} INFO - Job 54: Subtask pod-ex-minimum Traceback (most recent call last):
[2020-11-09 10:59:44,704] {base_task_runner.py:113} INFO - Job 54: Subtask pod-ex-minimum File "/usr/local/lib/airflow/airflow/contrib/operators/kubernetes_pod_operator.py", line 253, in execute
[2020-11-09 10:59:44,705] {base_task_runner.py:113} INFO - Job 54: Subtask pod-ex-minimum get_logs=self.get_logs)
[2020-11-09 10:59:44,705] {base_task_runner.py:113} INFO - Job 54: Subtask pod-ex-minimum File "/usr/local/lib/airflow/airflow/contrib/kubernetes/pod_launcher.py", line 113, in run_pod
[2020-11-09 10:59:44,705] {base_task_runner.py:113} INFO - Job 54: Subtask pod-ex-minimum raise AirflowException("Pod took too long to start")
[2020-11-09 10:59:44,706] {base_task_runner.py:113} INFO - Job 54: Subtask pod-ex-minimum airflow.exceptions.AirflowException: Pod took too long to start
[2020-11-09 10:59:44,706] {base_task_runner.py:113} INFO - Job 54: Subtask pod-ex-minimum
[2020-11-09 10:59:44,707] {base_task_runner.py:113} INFO - Job 54: Subtask pod-ex-minimum During handling of the above exception, another exception occurred:
[2020-11-09 10:59:44,707] {base_task_runner.py:113} INFO - Job 54: Subtask pod-ex-minimum
[2020-11-09 10:59:44,708] {base_task_runner.py:113} INFO - Job 54: Subtask pod-ex-minimum Traceback (most recent call last):
[2020-11-09 10:59:44,708] {base_task_runner.py:113} INFO - Job 54: Subtask pod-ex-minimum File "/usr/local/bin/airflow", line 7, in <module>
[2020-11-09 10:59:44,708] {base_task_runner.py:113} INFO - Job 54: Subtask pod-ex-minimum exec(compile(f.read(), __file__, 'exec'))
[2020-11-09 10:59:44,708] {base_task_runner.py:113} INFO - Job 54: Subtask pod-ex-minimum File "/usr/local/lib/airflow/airflow/bin/airflow", line 37, in <module>
[2020-11-09 10:59:44,708] {base_task_runner.py:113} INFO - Job 54: Subtask pod-ex-minimum args.func(args)
[2020-11-09 10:59:44,709] {base_task_runner.py:113} INFO - Job 54: Subtask pod-ex-minimum File "/usr/local/lib/airflow/airflow/utils/cli.py", line 75, in wrapper
[2020-11-09 10:59:44,709] {base_task_runner.py:113} INFO - Job 54: Subtask pod-ex-minimum return f(*args, **kwargs)
[2020-11-09 10:59:44,709] {base_task_runner.py:113} INFO - Job 54: Subtask pod-ex-minimum File "/usr/local/lib/airflow/airflow/bin/cli.py", line 546, in run
[2020-11-09 10:59:44,709] {base_task_runner.py:113} INFO - Job 54: Subtask pod-ex-minimum _run(args, dag, ti)
[2020-11-09 10:59:44,709] {base_task_runner.py:113} INFO - Job 54: Subtask pod-ex-minimum File "/usr/local/lib/airflow/airflow/bin/cli.py", line 466, in _run
[2020-11-09 10:59:44,709] {base_task_runner.py:113} INFO - Job 54: Subtask pod-ex-minimum pool=args.pool,
[2020-11-09 10:59:44,710] {base_task_runner.py:113} INFO - Job 54: Subtask pod-ex-minimum File "/usr/local/lib/airflow/airflow/utils/db.py", line 74, in wrapper
[2020-11-09 10:59:44,710] {base_task_runner.py:113} INFO - Job 54: Subtask pod-ex-minimum return func(*args, **kwargs)
[2020-11-09 10:59:44,710] {base_task_runner.py:113} INFO - Job 54: Subtask pod-ex-minimum File "/usr/local/lib/airflow/airflow/models/taskinstance.py", line 985, in _run_raw_task
[2020-11-09 10:59:44,710] {base_task_runner.py:113} INFO - Job 54: Subtask pod-ex-minimum result = task_copy.execute(context=context)
[2020-11-09 10:59:44,710] {base_task_runner.py:113} INFO - Job 54: Subtask pod-ex-minimum File "/usr/local/lib/airflow/airflow/contrib/operators/kubernetes_pod_operator.py", line 265, in execute
[2020-11-09 10:59:44,710] {base_task_runner.py:113} INFO - Job 54: Subtask pod-ex-minimum raise AirflowException('Pod Launching failed: {error}'.format(error=ex))
[2020-11-09 10:59:44,711] {base_task_runner.py:113} INFO - Job 54: Subtask pod-ex-minimum airflow.exceptions.AirflowException: Pod Launching failed: Pod took too long to start
Do you have any insight? Thank you in advance, and sorry for my English.
It was an issue regarding the volume secret: I hadn't created it, and the Pod failed because it could not find it.
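(For reference, that volume secret can be created the same way as the env one, e.g. kubectl create secret generic service-account --from-file=service-account.json=/path/to/service-account.json, with the key-file path adjusted to your own setup.)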
The above error is generic and can't be used to find the root cause on its own. When you start your DAG, check the worker pod: run kubectl describe pod on the worker pod; this will tell you whether there was any error launching the worker pod for the task. Then you can also check the logs of that same worker pod.

Why are connection passwords lost?

I have to update (re-type and save) my SSH connection password every time I restart Airflow. Why is that?
I'm running Airflow 1.10.3 in a Docker container, and I can see that all passwords are stored properly in the Postgres database.
*** Reading local file: /root/airflow/logs/archive/check_source/2019-07-07T00:00:00+00:00/4.log
[2019-07-08 01:30:27,253] {__init__.py:1139} INFO - Dependencies all met for <TaskInstance: archive.check_source 2019-07-07T00:00:00+00:00 [queued]>
[2019-07-08 01:30:27,267] {__init__.py:1139} INFO - Dependencies all met for <TaskInstance: archive.check_source 2019-07-07T00:00:00+00:00 [queued]>
[2019-07-08 01:30:27,267] {__init__.py:1353} INFO -
--------------------------------------------------------------------------------
[2019-07-08 01:30:27,267] {__init__.py:1354} INFO - Starting attempt 4 of 4
[2019-07-08 01:30:27,268] {__init__.py:1355} INFO -
--------------------------------------------------------------------------------
[2019-07-08 01:30:27,295] {__init__.py:1374} INFO - Executing <Task(SSHOperator): check_source> on 2019-07-07T00:00:00+00:00
[2019-07-08 01:30:27,296] {base_task_runner.py:119} INFO - Running: [u'airflow', u'run', 'archive', 'check_source', '2019-07-07T00:00:00+00:00', u'--job_id', '1321', u'--raw', u'-sd', u'DAGS_FOLDER/archive.py', u'--cfg_path', '/tmp/tmpQwBRud']
[2019-07-08 01:30:28,392] {base_task_runner.py:101} INFO - Job 1321: Subtask check_source [2019-07-08 01:30:28,392] {settings.py:182} INFO - settings.configure_orm(): Using pool settings. pool_size=5, pool_recycle=1800, pid=656
[2019-07-08 01:30:28,741] {base_task_runner.py:101} INFO - Job 1321: Subtask check_source [2019-07-08 01:30:28,740] {__init__.py:51} INFO - Using executor LocalExecutor
[2019-07-08 01:30:28,975] {base_task_runner.py:101} INFO - Job 1321: Subtask check_source [2019-07-08 01:30:28,974] {__init__.py:305} INFO - Filling up the DagBag from /root/airflow/dags/archive.py
[2019-07-08 01:30:29,073] {base_task_runner.py:101} INFO - Job 1321: Subtask check_source [2019-07-08 01:30:29,073] {cli.py:517} INFO - Running <TaskInstance: archive_to_glacier.check_source 2019-07-07T00:00:00+00:00 [running]> on host airflow-webserver-66d5747dc7-99mhr
[2019-07-08 01:30:29,158] {ssh_operator.py:80} INFO - ssh_hook is not provided or invalid. Trying ssh_conn_id to create SSHHook.
[2019-07-08 01:30:29,204] {__init__.py:1580} ERROR - SSH operator error:
Traceback (most recent call last):
File "/usr/lib/python2.7/site-packages/airflow/models/__init__.py", line 1441, in _run_raw_task
result = task_copy.execute(context=context)
File "/usr/lib/python2.7/site-packages/airflow/contrib/operators/ssh_operator.py", line 167, in execute
raise AirflowException("SSH operator error: {0}".format(str(e)))
AirflowException: SSH operator error:
[2019-07-08 01:30:29,206] {__init__.py:1609} INFO - All retries failed; marking task as FAILED
[2019-07-08 01:30:29,232] {logging_mixin.py:95} INFO - [2019-07-08 01:30:29,232] {configuration.py:287} WARNING - section/key [smtp/smtp_user] not found in config
[2019-07-08 01:30:29,314] {logging_mixin.py:95} INFO - [2019-07-08 01:30:29,313] {email.py:126} INFO - Sent an alert email to [u'bruno.pessanha@imc.com']
[2019-07-08 01:30:29,605] {base_task_runner.py:101} INFO - Job 1321: Subtask check_source Traceback (most recent call last):
[2019-07-08 01:30:29,606] {base_task_runner.py:101} INFO - Job 1321: Subtask check_source File "/usr/bin/airflow", line 32, in <module>
[2019-07-08 01:30:29,606] {base_task_runner.py:101} INFO - Job 1321: Subtask check_source args.func(args)
[2019-07-08 01:30:29,606] {base_task_runner.py:101} INFO - Job 1321: Subtask check_source File "/usr/lib/python2.7/site-packages/airflow/utils/cli.py", line 74, in wrapper
[2019-07-08 01:30:29,606] {base_task_runner.py:101} INFO - Job 1321: Subtask check_source return f(*args, **kwargs)
[2019-07-08 01:30:29,606] {base_task_runner.py:101} INFO - Job 1321: Subtask check_source File "/usr/lib/python2.7/site-packages/airflow/bin/cli.py", line 523, in run
[2019-07-08 01:30:29,606] {base_task_runner.py:101} INFO - Job 1321: Subtask check_source _run(args, dag, ti)
[2019-07-08 01:30:29,606] {base_task_runner.py:101} INFO - Job 1321: Subtask check_source File "/usr/lib/python2.7/site-packages/airflow/bin/cli.py", line 442, in _run
[2019-07-08 01:30:29,606] {base_task_runner.py:101} INFO - Job 1321: Subtask check_source pool=args.pool,
[2019-07-08 01:30:29,606] {base_task_runner.py:101} INFO - Job 1321: Subtask check_source File "/usr/lib/python2.7/site-packages/airflow/utils/db.py", line 73, in wrapper
[2019-07-08 01:30:29,608] {base_task_runner.py:101} INFO - Job 1321: Subtask check_source return func(*args, **kwargs)
[2019-07-08 01:30:29,608] {base_task_runner.py:101} INFO - Job 1321: Subtask check_source File "/usr/lib/python2.7/site-packages/airflow/models/__init__.py", line 1441, in _run_raw_task
[2019-07-08 01:30:29,608] {base_task_runner.py:101} INFO - Job 1321: Subtask check_source result = task_copy.execute(context=context)
[2019-07-08 01:30:29,608] {base_task_runner.py:101} INFO - Job 1321: Subtask check_source File "/usr/lib/python2.7/site-packages/airflow/contrib/operators/ssh_operator.py", line 167, in execute
[2019-07-08 01:30:29,608] {base_task_runner.py:101} INFO - Job 1321: Subtask check_source raise AirflowException("SSH operator error: {0}".format(str(e)))
[2019-07-08 01:30:29,608] {base_task_runner.py:101} INFO - Job 1321: Subtask check_source airflow.exceptions.AirflowException: SSH operator error:
[2019-07-08 01:30:32,260] {logging_mixin.py:95} INFO - [2019-07-08 01:30:32,259] {jobs.py:2562} INFO - Task exited with return code 1
Try adding a new ssh_conn_id in Admin -> Connections: https://airflow.apache.org/howto/connection/index.html
Because of this line in the log:
INFO - ssh_hook is not provided or invalid. Trying ssh_conn_id to create SSHHook.
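Another common cause of this exact symptom, worth checking: if AIRFLOW__CORE__FERNET_KEY is not pinned in the container, a fresh key can be generated on each restart, and connection passwords encrypted with the old key can no longer be decrypted, so they appear "lost". A sketch for generating a stable key once and reusing it in the environment:

# Generate the key once, then set it as AIRFLOW__CORE__FERNET_KEY for every
# container start; a changing key makes stored passwords undecryptable.
from cryptography.fernet import Fernet

print(Fernet.generate_key().decode())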