To give some context, I am using Airflow 2.3.0 on Kubernetes with the Local Executor (which may sound weird, but it works for us for now) with one pod for the webserver and two for the scheduler.
I have a DAG consisting of a single task (PythonOperator) that makes many API calls (200K) using requests.
Every 15 calls, the data is loaded in a DataFrame and stored on AWS S3 (using boto3) to reduce the RAM usage.
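Roughly, the task body looks like this (a simplified sketch with placeholder names such as urls_to_call and the bucket/key, not my actual code):

import io

import boto3
import pandas as pd
import requests

s3 = boto3.client("s3")
session = requests.Session()
urls_to_call = []  # placeholder for the ~200K API endpoints
batch = []

for i, url in enumerate(urls_to_call):
    batch.append(session.get(url).json())
    if (i + 1) % 15 == 0:  # flush every 15 calls to keep RAM usage low
        buffer = io.StringIO()
        pd.DataFrame(batch).to_csv(buffer, index=False)
        s3.put_object(Bucket="my-bucket",
                      Key=f"daily_extract/part_{i + 1}.csv",
                      Body=buffer.getvalue())
        batch.clear()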
The problem is that I can't get to the end of this task because it fails randomly (after 1, 10, or 120 minutes).
I have made more than 50 attempts with no success, and the only logs on the task are:
[2022-09-01, 14:45:44 UTC] {taskinstance.py:1159} INFO - Dependencies all met for <TaskInstance: INGESTION-DAILY-dag.extract_task scheduled__2022-08-30T00:00:00+00:00 [queued]>
[2022-09-01, 14:45:44 UTC] {taskinstance.py:1159} INFO - Dependencies all met for <TaskInstance: INGESTION-DAILY-dag.extract_task scheduled__2022-08-30T00:00:00+00:00 [queued]>
[2022-09-01, 14:45:44 UTC] {taskinstance.py:1356} INFO -
--------------------------------------------------------------------------------
[2022-09-01, 14:45:44 UTC] {taskinstance.py:1357} INFO - Starting attempt 23 of 24
[2022-09-01, 14:45:44 UTC] {taskinstance.py:1358} INFO -
--------------------------------------------------------------------------------
[2022-09-01, 14:45:44 UTC] {taskinstance.py:1377} INFO - Executing <Task(_PythonDecoratedOperator): extract_task> on 2022-08-30 00:00:00+00:00
[2022-09-01, 14:45:44 UTC] {standard_task_runner.py:52} INFO - Started process 942 to run task
[2022-09-01, 14:45:44 UTC] {standard_task_runner.py:79} INFO - Running: ['airflow', 'tasks', 'run', 'INGESTION-DAILY-dag', 'extract_task', 'scheduled__2022-08-30T00:00:00+00:00', '--job-id', '4390', '--raw', '--subdir', 'DAGS_FOLDER/dags/ingestion/daily_dag/dag.py', '--cfg-path', '/tmp/tmpwxasaq93', '--error-file', '/tmp/tmpl7t_gd8e']
[2022-09-01, 14:45:44 UTC] {standard_task_runner.py:80} INFO - Job 4390: Subtask extract_task
[2022-09-01, 14:45:45 UTC] {task_command.py:369} INFO - Running <TaskInstance: INGESTION-DAILY-dag.extract_task scheduled__2022-08-30T00:00:00+00:00 [running]> on host 10.XX.XXX.XXX
[2022-09-01, 14:48:17 UTC] {local_task_job.py:156} INFO - Task exited with return code 1
[2022-09-01, 14:48:17 UTC] {taskinstance.py:1395} INFO - Marking task as UP_FOR_RETRY. dag_id=INGESTION-DAILY-dag, task_id=extract_task, execution_date=20220830T000000, start_date=20220901T144544, end_date=20220901T144817
[2022-09-01, 14:48:17 UTC] {local_task_job.py:273} INFO - 0 downstream tasks scheduled from follow-on schedule check
But when I go to the pod logs, I get the following message:
[2022-09-01 14:06:31,624] {local_executor.py:128} ERROR - Failed to execute task an integer is required (got type ChunkedEncodingError).
Traceback (most recent call last):
File "/home/airflow/.local/lib/python3.8/site-packages/airflow/executors/local_executor.py", line 124, in _execute_work_in_fork
args.func(args)
File "/home/airflow/.local/lib/python3.8/site-packages/airflow/cli/cli_parser.py", line 51, in command
return func(*args, **kwargs)
File "/home/airflow/.local/lib/python3.8/site-packages/airflow/utils/cli.py", line 99, in wrapper
return f(*args, **kwargs)
File "/home/airflow/.local/lib/python3.8/site-packages/airflow/cli/commands/task_command.py", line 377, in task_run
_run_task_by_selected_method(args, dag, ti)
File "/home/airflow/.local/lib/python3.8/site-packages/airflow/cli/commands/task_command.py", line 183, in _run_task_by_selected_method
_run_task_by_local_task_job(args, ti)
File "/home/airflow/.local/lib/python3.8/site-packages/airflow/cli/commands/task_command.py", line 241, in _run_task_by_local_task_job
run_job.run()
File "/home/airflow/.local/lib/python3.8/site-packages/airflow/jobs/base_job.py", line 244, in run
self._execute()
File "/home/airflow/.local/lib/python3.8/site-packages/airflow/jobs/local_task_job.py", line 105, in _execute
self.task_runner.start()
File "/home/airflow/.local/lib/python3.8/site-packages/airflow/task/task_runner/standard_task_runner.py", line 41, in start
self.process = self._start_by_fork()
File "/home/airflow/.local/lib/python3.8/site-packages/airflow/task/task_runner/standard_task_runner.py", line 125, in _start_by_fork
os._exit(return_code)
TypeError: an integer is required (got type ChunkedEncodingError)
What I find strange is that I have never had this error on other DAGs (where the tasks are smaller and faster). I checked during an attempt: CPU and RAM usage are stable and low.
I get the same error locally, and I also tried upgrading to 2.3.4, but nothing works.
Do you have any idea how to fix this?
Thanks a lot!
Nicolas
As #EDG956 said, this is not an error coming from Airflow but from my own code.
I solved it by using a context manager (which was not enough on its own) and recreating the session whenever a ChunkedEncodingError is raised:
import requests

s = requests.Session()
while True:
    try:
        with s.get(base_url) as r:
            response = r
        break  # the request succeeded, stop retrying
    except requests.exceptions.ChunkedEncodingError:
        # The response body was cut off mid-stream: close the broken session
        # and create a fresh one before retrying the same URL.
        s.close()
        s = requests.Session()
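If you prefer to cap the number of retries instead of looping forever, a small helper along these lines also works (just a sketch with made-up timeouts, not exactly what I run in the DAG):

import requests

def get_with_retries(url, max_attempts=5):
    """GET url, recreating the session whenever the response body is cut off."""
    session = requests.Session()
    try:
        for attempt in range(1, max_attempts + 1):
            try:
                response = session.get(url, timeout=(5, 60))
                response.raise_for_status()
                return response
            except requests.exceptions.ChunkedEncodingError:
                # Drop the broken keep-alive connections and start clean.
                session.close()
                if attempt == max_attempts:
                    raise
                session = requests.Session()
    finally:
        session.close()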
I installed Airflow 2.1.2 with Python 3.6.5 on a Linux server and triggered a DAG with a run_as user, but it does not work as expected.
I created the logs folder with chmod -R 777, but when the DAG starts, the system creates the log directory under logs/ with
"drwx------ 3 root root"
which causes a permission-denied error, since only root can access the folder.
Is there any way to fix this?
Here is the error I got:
[2021-07-16 07:00:14,334] {base_task_runner.py:135} INFO - Running: ['sudo', '-E', '-H', '-u', 'hdfsprod', 'airflow', 'tasks', 'run', 'hdfs_kinit', 'hdfs_kinit', '2021-07-16T07:00:04.092676+00:00', '--job-id', '19', '--pool', 'default_pool', '--raw', '--subdir', 'DAGS_FOLDER/hdfs_test.py', '--cfg-path', '/tmp/tmp8_t6s5o2', '--error-file', '/tmp/tmpmf3zruty']
[2021-07-16 07:00:15,841] {base_task_runner.py:119} INFO - Job 19: Subtask hdfs_kinit [2021-07-16 07:00:15,839] {dagbag.py:487} INFO - Filling up the DagBag from /opt/data/dags/hdfs_test.py
[2021-07-16 07:00:15,992] {base_task_runner.py:119} INFO - Job 19: Subtask hdfs_kinit Traceback (most recent call last):
[2021-07-16 07:00:15,993] {base_task_runner.py:119} INFO - Job 19: Subtask hdfs_kinit File "/service/python3/lib/python3.6/pathlib.py", line 1246, in mkdir
[2021-07-16 07:00:15,993] {base_task_runner.py:119} INFO - Job 19: Subtask hdfs_kinit self._accessor.mkdir(self, mode)
[2021-07-16 07:00:15,993] {base_task_runner.py:119} INFO - Job 19: Subtask hdfs_kinit File "/service/python3/lib/python3.6/pathlib.py", line 387, in wrapped
[2021-07-16 07:00:15,993] {base_task_runner.py:119} INFO - Job 19: Subtask hdfs_kinit return strfunc(str(pathobj), *args)
[2021-07-16 07:00:15,993] {base_task_runner.py:119} INFO - Job 19: Subtask hdfs_kinit PermissionError: [Errno 13] Permission denied: '/opt/data/logs/hdfs_kinit/2021-07-16T07:00:04.092676+00:00'
[2021-07-16 07:00:15,994] {base_task_runner.py:119} INFO - Job 19: Subtask hdfs_kinit
The easiest way to fix it is to make your "run_as" user belong to the "root" group and change the umask to 002 in the scripts that start Airflow.
https://unix.stackexchange.com/questions/12842/make-all-new-files-in-a-directory-accessible-to-a-group
I'm running the quickstart for the KubernetesPodOperator secret from the link below: https://cloud.google.com/composer/docs/how-to/using/using-kubernetes-pod-operator
The code I used is below:
import datetime

from airflow import models
from airflow.contrib.kubernetes import secret
from airflow.contrib.operators import kubernetes_pod_operator

# A Secret is an object that contains a small amount of sensitive data such as
# a password, a token, or a key. Such information might otherwise be put in a
# Pod specification or in an image; putting it in a Secret object allows for
# more control over how it is used, and reduces the risk of accidental
# exposure.
secret_env = secret.Secret(
    # Expose the secret as environment variable.
    deploy_type='env',
    # The name of the environment variable, since deploy_type is `env` rather
    # than `volume`.
    deploy_target='SQL_CONN',
    # Name of the Kubernetes Secret
    secret='airflow-secrets',
    # Key of a secret stored in this Secret object
    key='sql_alchemy_conn')

secret_volume = secret.Secret(
    'volume',
    # Path where we mount the secret as volume
    '/var/secrets/google',
    # Name of Kubernetes Secret
    'service-account',
    # Key in the form of service account file name
    'service-account.json')

YESTERDAY = datetime.datetime.now() - datetime.timedelta(days=1)

# If a Pod fails to launch, or has an error occur in the container, Airflow
# will show the task as failed, as well as contain all of the task logs
# required to debug.
with models.DAG(
        dag_id='composer_sample_kubernetes_pod',
        schedule_interval=datetime.timedelta(days=1),
        start_date=YESTERDAY) as dag:
    # Only name, namespace, image, and task_id are required to create a
    # KubernetesPodOperator. In Cloud Composer, currently the operator defaults
    # to using the config file found at `/home/airflow/composer_kube_config` if
    # no `config_file` parameter is specified. By default it will contain the
    # credentials for Cloud Composer's Google Kubernetes Engine cluster that is
    # created upon environment creation.
    kubernetes_secret_vars_ex = kubernetes_pod_operator.KubernetesPodOperator(
        task_id='ex-kube-secrets',
        name='ex-kube-secrets',
        namespace='default',
        image='python:3.6-stretch',
        cmds=["python", "-c"],
        arguments=["print('hello world')"],
        labels={"foo": "bar"},
        startup_timeout_seconds=300,
        # The secrets to pass to Pod, the Pod will fail to create if the
        # secrets you specify in a Secret object do not exist in Kubernetes.
        secrets=[secret_env, secret_volume],
        # env_vars allows you to specify environment variables for your
        # container to use. env_vars is templated.
        env_vars={
            'EXAMPLE_VAR': '/example/value',
            'GOOGLE_APPLICATION_CREDENTIALS': '/var/secrets/google/service-account.json'})
I have successfully created the secret using:
kubectl create secret generic airflow-secrets \
--from-literal sql_alchemy_conn=test_value
I am receiving this error:
> Reading remote log from gs://europe-west1-test-environme-5ad38518-bucket/logs/composer_kubernetes_pod/pod-ex-minimum/2020-11-08T10:29:28.767187+00:00/3.log.
[2020-11-09 10:57:38,764] {taskinstance.py:670} INFO - Dependencies all met for <TaskInstance: composer_kubernetes_pod.pod-ex-minimum 2020-11-08T10:29:28.767187+00:00 [queued]>
[2020-11-09 10:57:38,817] {taskinstance.py:670} INFO - Dependencies all met for <TaskInstance: composer_kubernetes_pod.pod-ex-minimum 2020-11-08T10:29:28.767187+00:00 [queued]>
[2020-11-09 10:57:38,818] {taskinstance.py:880} INFO -
--------------------------------------------------------------------------------
[2020-11-09 10:57:38,818] {taskinstance.py:881} INFO - Starting attempt 3 of 3
[2020-11-09 10:57:38,818] {taskinstance.py:882} INFO -
--------------------------------------------------------------------------------
[2020-11-09 10:57:38,894] {taskinstance.py:901} INFO - Executing <Task(KubernetesPodOperator): pod-ex-minimum> on 2020-11-08T10:29:28.767187+00:00
[2020-11-09 10:57:38,895] {base_task_runner.py:131} INFO - Running on host: airflow-worker-765557479f-zbzkm
[2020-11-09 10:57:38,895] {base_task_runner.py:132} INFO - Running: ['airflow', 'run', 'composer_kubernetes_pod', 'pod-ex-minimum', '2020-11-08T10:29:28.767187+00:00', '--job_id', '54', '--pool', 'default_pool', '--raw', '-sd', 'DAGS_FOLDER/kubernetes_secret3.py', '--cfg_path', '/tmp/tmp3qvpc1a6']
[2020-11-09 10:57:42,268] {base_task_runner.py:113} INFO - Job 54: Subtask pod-ex-minimum [2020-11-09 10:57:42,267] {configuration.py:618} INFO - Reading the config from /etc/airflow/airflow.cfg
[2020-11-09 10:57:42,365] {base_task_runner.py:113} INFO - Job 54: Subtask pod-ex-minimum [2020-11-09 10:57:42,364] {configuration.py:618} INFO - Reading the config from /etc/airflow/airflow.cfg
[2020-11-09 10:57:42,650] {base_task_runner.py:113} INFO - Job 54: Subtask pod-ex-minimum [2020-11-09 10:57:42,649] {default_celery.py:90} WARNING - You have configured a result_backend of redis://airflow-redis-service.default.svc.cluster.local:6379/0, it is highly recommended to use an alternative result_backend (i.e. a database).
[2020-11-09 10:57:42,651] {base_task_runner.py:113} INFO - Job 54: Subtask pod-ex-minimum [2020-11-09 10:57:42,651] {__init__.py:51} INFO - Using executor CeleryExecutor
[2020-11-09 10:57:42,652] {base_task_runner.py:113} INFO - Job 54: Subtask pod-ex-minimum [2020-11-09 10:57:42,651] {dagbag.py:397} INFO - Filling up the DagBag from /home/airflow/gcs/dags/kubernetes_secret3.py
[2020-11-09 10:57:43,393] {base_task_runner.py:113} INFO - Job 54: Subtask pod-ex-minimum Running <TaskInstance: composer_kubernetes_pod.pod-ex-minimum 2020-11-08T10:29:28.767187+00:00 [running]> on host airflow-worker-765557479f-zbzkm
[2020-11-09 10:57:59,728] {base_task_runner.py:113} INFO - Job 54: Subtask pod-ex-minimum [2020-11-09 10:57:59,727] {pod_launcher.py:142} INFO - Event: pod-ex-minimum-02a096e7 had an event of type Pending
[... the same "Event: pod-ex-minimum-02a096e7 had an event of type Pending" line repeats, roughly once per second with a few gaps, until 10:59:43 ...]
[2020-11-09 10:59:44,596] {base_task_runner.py:113} INFO - Job 54: Subtask pod-ex-minimum [2020-11-09 10:59:44,595] {pod_launcher.py:142} INFO - Event: pod-ex-minimum-02a096e7 had an event of type Pending
[2020-11-09 10:59:44,645] {taskinstance.py:1148} ERROR - Pod Launching failed: Pod took too long to start
Traceback (most recent call last):
File "/usr/local/lib/airflow/airflow/contrib/operators/kubernetes_pod_operator.py", line 253, in execute
get_logs=self.get_logs)
File "/usr/local/lib/airflow/airflow/contrib/kubernetes/pod_launcher.py", line 113, in run_pod
raise AirflowException("Pod took too long to start")
airflow.exceptions.AirflowException: Pod took too long to start
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/usr/local/lib/airflow/airflow/models/taskinstance.py", line 985, in _run_raw_task
result = task_copy.execute(context=context)
File "/usr/local/lib/airflow/airflow/contrib/operators/kubernetes_pod_operator.py", line 265, in execute
raise AirflowException('Pod Launching failed: {error}'.format(error=ex))
airflow.exceptions.AirflowException: Pod Launching failed: Pod took too long to start
[2020-11-09 10:59:44,646] {base_task_runner.py:113} INFO - Job 54: Subtask pod-ex-minimum [2020-11-09 10:59:44,645] {taskinstance.py:1148} ERROR - Pod Launching failed: Pod took too long to start
[2020-11-09 10:59:44,646] {base_task_runner.py:113} INFO - Job 54: Subtask pod-ex-minimum Traceback (most recent call last):
[2020-11-09 10:59:44,647] {base_task_runner.py:113} INFO - Job 54: Subtask pod-ex-minimum File "/usr/local/lib/airflow/airflow/contrib/operators/kubernetes_pod_operator.py", line 253, in execute
[2020-11-09 10:59:44,647] {base_task_runner.py:113} INFO - Job 54: Subtask pod-ex-minimum get_logs=self.get_logs)
[2020-11-09 10:59:44,647] {base_task_runner.py:113} INFO - Job 54: Subtask pod-ex-minimum File "/usr/local/lib/airflow/airflow/contrib/kubernetes/pod_launcher.py", line 113, in run_pod
[2020-11-09 10:59:44,647] {taskinstance.py:1205} INFO - Marking task as FAILED.dag_id=composer_kubernetes_pod, task_id=pod-ex-minimum, execution_date=20201108T102928, start_date=20201109T105738, end_date=20201109T105944
[2020-11-09 10:59:44,647] {base_task_runner.py:113} INFO - Job 54: Subtask pod-ex-minimum raise AirflowException("Pod took too long to start")
[2020-11-09 10:59:44,647] {base_task_runner.py:113} INFO - Job 54: Subtask pod-ex-minimum airflow.exceptions.AirflowException: Pod took too long to start
[2020-11-09 10:59:44,647] {base_task_runner.py:113} INFO - Job 54: Subtask pod-ex-minimum
[2020-11-09 10:59:44,648] {base_task_runner.py:113} INFO - Job 54: Subtask pod-ex-minimum During handling of the above exception, another exception occurred:
[2020-11-09 10:59:44,648] {base_task_runner.py:113} INFO - Job 54: Subtask pod-ex-minimum
[2020-11-09 10:59:44,648] {base_task_runner.py:113} INFO - Job 54: Subtask pod-ex-minimum Traceback (most recent call last):
[2020-11-09 10:59:44,648] {base_task_runner.py:113} INFO - Job 54: Subtask pod-ex-minimum File "/usr/local/lib/airflow/airflow/models/taskinstance.py", line 985, in _run_raw_task
[2020-11-09 10:59:44,649] {base_task_runner.py:113} INFO - Job 54: Subtask pod-ex-minimum result = task_copy.execute(context=context)
[2020-11-09 10:59:44,649] {base_task_runner.py:113} INFO - Job 54: Subtask pod-ex-minimum File "/usr/local/lib/airflow/airflow/contrib/operators/kubernetes_pod_operator.py", line 265, in execute
[2020-11-09 10:59:44,650] {base_task_runner.py:113} INFO - Job 54: Subtask pod-ex-minimum raise AirflowException('Pod Launching failed: {error}'.format(error=ex))
[2020-11-09 10:59:44,650] {base_task_runner.py:113} INFO - Job 54: Subtask pod-ex-minimum airflow.exceptions.AirflowException: Pod Launching failed: Pod took too long to start
[2020-11-09 10:59:44,650] {base_task_runner.py:113} INFO - Job 54: Subtask pod-ex-minimum [2020-11-09 10:59:44,647] {taskinstance.py:1205} INFO - Marking task as FAILED.dag_id=composer_kubernetes_pod, task_id=pod-ex-minimum, execution_date=20201108T102928, start_date=20201109T105738, end_date=20201109T105944
[2020-11-09 10:59:44,703] {base_task_runner.py:113} INFO - Job 54: Subtask pod-ex-minimum Traceback (most recent call last):
[2020-11-09 10:59:44,704] {base_task_runner.py:113} INFO - Job 54: Subtask pod-ex-minimum File "/usr/local/lib/airflow/airflow/contrib/operators/kubernetes_pod_operator.py", line 253, in execute
[2020-11-09 10:59:44,705] {base_task_runner.py:113} INFO - Job 54: Subtask pod-ex-minimum get_logs=self.get_logs)
[2020-11-09 10:59:44,705] {base_task_runner.py:113} INFO - Job 54: Subtask pod-ex-minimum File "/usr/local/lib/airflow/airflow/contrib/kubernetes/pod_launcher.py", line 113, in run_pod
[2020-11-09 10:59:44,705] {base_task_runner.py:113} INFO - Job 54: Subtask pod-ex-minimum raise AirflowException("Pod took too long to start")
[2020-11-09 10:59:44,706] {base_task_runner.py:113} INFO - Job 54: Subtask pod-ex-minimum airflow.exceptions.AirflowException: Pod took too long to start
[2020-11-09 10:59:44,706] {base_task_runner.py:113} INFO - Job 54: Subtask pod-ex-minimum
[2020-11-09 10:59:44,707] {base_task_runner.py:113} INFO - Job 54: Subtask pod-ex-minimum During handling of the above exception, another exception occurred:
[2020-11-09 10:59:44,707] {base_task_runner.py:113} INFO - Job 54: Subtask pod-ex-minimum
[2020-11-09 10:59:44,708] {base_task_runner.py:113} INFO - Job 54: Subtask pod-ex-minimum Traceback (most recent call last):
[2020-11-09 10:59:44,708] {base_task_runner.py:113} INFO - Job 54: Subtask pod-ex-minimum File "/usr/local/bin/airflow", line 7, in <module>
[2020-11-09 10:59:44,708] {base_task_runner.py:113} INFO - Job 54: Subtask pod-ex-minimum exec(compile(f.read(), __file__, 'exec'))
[2020-11-09 10:59:44,708] {base_task_runner.py:113} INFO - Job 54: Subtask pod-ex-minimum File "/usr/local/lib/airflow/airflow/bin/airflow", line 37, in <module>
[2020-11-09 10:59:44,708] {base_task_runner.py:113} INFO - Job 54: Subtask pod-ex-minimum args.func(args)
[2020-11-09 10:59:44,709] {base_task_runner.py:113} INFO - Job 54: Subtask pod-ex-minimum File "/usr/local/lib/airflow/airflow/utils/cli.py", line 75, in wrapper
[2020-11-09 10:59:44,709] {base_task_runner.py:113} INFO - Job 54: Subtask pod-ex-minimum return f(*args, **kwargs)
[2020-11-09 10:59:44,709] {base_task_runner.py:113} INFO - Job 54: Subtask pod-ex-minimum File "/usr/local/lib/airflow/airflow/bin/cli.py", line 546, in run
[2020-11-09 10:59:44,709] {base_task_runner.py:113} INFO - Job 54: Subtask pod-ex-minimum _run(args, dag, ti)
[2020-11-09 10:59:44,709] {base_task_runner.py:113} INFO - Job 54: Subtask pod-ex-minimum File "/usr/local/lib/airflow/airflow/bin/cli.py", line 466, in _run
[2020-11-09 10:59:44,709] {base_task_runner.py:113} INFO - Job 54: Subtask pod-ex-minimum pool=args.pool,
[2020-11-09 10:59:44,710] {base_task_runner.py:113} INFO - Job 54: Subtask pod-ex-minimum File "/usr/local/lib/airflow/airflow/utils/db.py", line 74, in wrapper
[2020-11-09 10:59:44,710] {base_task_runner.py:113} INFO - Job 54: Subtask pod-ex-minimum return func(*args, **kwargs)
[2020-11-09 10:59:44,710] {base_task_runner.py:113} INFO - Job 54: Subtask pod-ex-minimum File "/usr/local/lib/airflow/airflow/models/taskinstance.py", line 985, in _run_raw_task
[2020-11-09 10:59:44,710] {base_task_runner.py:113} INFO - Job 54: Subtask pod-ex-minimum result = task_copy.execute(context=context)
[2020-11-09 10:59:44,710] {base_task_runner.py:113} INFO - Job 54: Subtask pod-ex-minimum File "/usr/local/lib/airflow/airflow/contrib/operators/kubernetes_pod_operator.py", line 265, in execute
[2020-11-09 10:59:44,710] {base_task_runner.py:113} INFO - Job 54: Subtask pod-ex-minimum raise AirflowException('Pod Launching failed: {error}'.format(error=ex))
[2020-11-09 10:59:44,711] {base_task_runner.py:113} INFO - Job 54: Subtask pod-ex-minimum airflow.exceptions.AirflowException: Pod Launching failed: Pod took too long to start
Do you have any insight? Thank you in advance, and sorry for my English.
It was an issue with the volume secret: I hadn't created the service-account Secret in the cluster, so the pod could not find it and failed to start.
The above error is generic and can't be used to debug the root cause on its own. When you start your DAG, check the worker pod: run kubectl describe pod on the worker pod, which will tell you whether there was any error launching the pod for the task. Then you can also check the logs of that same worker pod.
I'm running Airflow on my computer (a MacBook Air, 1.6 GHz Intel Core i5, 8 GB 2133 MHz LPDDR3). A DAG with several tasks failed with the error below. I checked several articles online, but they were of little to no help. There is nothing wrong with the task itself (double-checked).
Any help is much appreciated.
[2019-08-27 13:01:55,372] {sequential_executor.py:45} INFO - Executing command: ['airflow', 'run', 'Makefile_DAG', 'normalize_companies', '2019-08-27T15:38:20.914820+00:00', '--local', '--pool', 'default_pool', '-sd', '/home/airflow/dags/makefileDAG.py']
[2019-08-27 13:01:56,937] {settings.py:213} INFO - settings.configure_orm(): Using pool settings. pool_size=5, max_overflow=10, pool_recycle=1800, pid=40647
[2019-08-27 13:01:57,285] {__init__.py:51} INFO - Using executor SequentialExecutor
[2019-08-27 13:01:59,423] {dagbag.py:90} INFO - Filling up the DagBag from /home/airflow/dags/makefileDAG.py
[2019-08-27 13:02:01,736] {cli.py:516} INFO - Running <TaskInstance: Makefile_DAG.normalize_companies 2019-08-27T15:38:20.914820+00:00 [queued]> on host ajays-macbook-air.local
Traceback (most recent call last):
File "/anaconda3/envs/airflow/bin/airflow", line 32, in <module>
args.func(args)
File "/anaconda3/envs/airflow/lib/python3.6/site-packages/airflow/utils/cli.py", line 74, in wrapper
return f(*args, **kwargs)
File "/anaconda3/envs/airflow/lib/python3.6/site-packages/airflow/bin/cli.py", line 522, in run
_run(args, dag, ti)
File "/anaconda3/envs/airflow/lib/python3.6/site-packages/airflow/bin/cli.py", line 435, in _run
run_job.run()
File "/anaconda3/envs/airflow/lib/python3.6/site-packages/airflow/jobs/base_job.py", line 213, in run
self._execute()
File "/anaconda3/envs/airflow/lib/python3.6/site-packages/airflow/jobs/local_task_job.py", line 111, in _execute
self.heartbeat()
File "/anaconda3/envs/airflow/lib/python3.6/site-packages/airflow/jobs/base_job.py", line 196, in heartbeat
self.heartbeat_callback(session=session)
File "/anaconda3/envs/airflow/lib/python3.6/site-packages/airflow/utils/db.py", line 70, in wrapper
return func(*args, **kwargs)
File "/anaconda3/envs/airflow/lib/python3.6/site-packages/airflow/jobs/local_task_job.py", line 159, in heartbeat_callback
raise AirflowException("Hostname of job runner does not match")
airflow.exceptions.AirflowException: Hostname of job runner does not match
[2019-08-27 13:05:05,904] {sequential_executor.py:52} ERROR - Failed to execute task Command '['airflow', 'run', 'Makefile_DAG', 'normalize_companies', '2019-08-27T15:38:20.914820+00:00', '--local', '--pool', 'default_pool', '-sd', '/home/airflow/dags/makefileDAG.py']' returned non-zero exit status 1..
[2019-08-27 13:05:05,905] {scheduler_job.py:1256} INFO - Executor reports execution of Makefile_DAG.normalize_companies execution_date=2019-08-27 15:38:20.914820+00:00 exited with status failed for try_number 2
Logs from the task:
[2019-08-27 13:02:13,616] {bash_operator.py:115} INFO - Running command: python /home/Makefile_Redo/normalize_companies.py
[2019-08-27 13:02:13,628] {bash_operator.py:124} INFO - Output:
[2019-08-27 13:05:02,849] {logging_mixin.py:95} INFO - [2019-08-27 13:05:02,848] {local_task_job.py:158} WARNING - The recorded hostname ajays-macbook-air.local does not match this instance's hostname AJAYs-MacBook-Air.local
[2019-08-27 13:05:02,860] {helpers.py:319} INFO - Sending Signals.SIGTERM to GPID 40649
[2019-08-27 13:05:02,861] {taskinstance.py:897} ERROR - Received SIGTERM. Terminating subprocesses.
[2019-08-27 13:05:02,862] {bash_operator.py:142} INFO - Sending SIGTERM signal to bash process group
[2019-08-27 13:05:03,539] {taskinstance.py:1047} ERROR - Task received SIGTERM signal
Traceback (most recent call last):
File "/anaconda3/envs/airflow/lib/python3.6/site-packages/airflow/models/taskinstance.py", line 922, in _run_raw_task
result = task_copy.execute(context=context)
File "/anaconda3/envs/airflow/lib/python3.6/site-packages/airflow/operators/bash_operator.py", line 126, in execute
for line in iter(sp.stdout.readline, b''):
File "/anaconda3/envs/airflow/lib/python3.6/site-packages/airflow/models/taskinstance.py", line 899, in signal_handler
raise AirflowException("Task received SIGTERM signal")
airflow.exceptions.AirflowException: Task received SIGTERM signal
[2019-08-27 13:05:03,550] {taskinstance.py:1076} INFO - All retries failed; marking task as FAILED
A weird thing I noticed in the above log is:
The recorded hostname ajays-macbook-air.local does not match this instance's hostname AJAYs-MacBook-Air.local
How is this possible, and is there any way to fix it?
I had the same problem on my Mac. The solution that worked for me was updating airflow.cfg with hostname_callable = socket:gethostname. The default, socket:getfqdn, can return different hostnames from time to time.
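To see why this matters, you can compare the two callables in a local Python shell (just an illustration, not part of any DAG):

import socket

# Airflow records the hostname when the task starts and compares it again in
# LocalTaskJob's heartbeat_callback; if the two values differ, the task is
# killed with "Hostname of job runner does not match".
print(socket.getfqdn())      # may flip between variants (e.g. letter case) from call to call
print(socket.gethostname())  # stable local hostname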