Hourly tasks "randomly" fail - airflow

I have an Apache Airflow DAG which runs hourly but "randomly" fails, meaning it succeeds about half of the time. The DAG consists of exactly one task, which uses the BashOperator to run a single curl command. The problem occurs with both scheduled and manual triggers.
Airflow Version: 1.10.13.
Executor: Celery
Env: Kubernetes with Bitnami Airflow Helm Chart (1 Airflow-Web, 1 Airflow-Scheduler, 1 Airflow-Worker, I can give more information on this if it helps)
DB: PSQL
Redis: A Redis instance is provided to the Airflow setup but no keys are present.
DAGs: DAGs are defined inside a K8s ConfigMap and are updated regularly by Airflow
What have I tried:
Dropped the DB and set everything up from scratch twice
Changed params like start_date, retries, etc.
Monitored the behaviour for 2 days
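One more check worth adding to that list: reproducing the worker-side DAG parse by hand (a minimal sketch, run inside the worker pod; the dag folder path appears in the worker logs below):

# Sketch: parse the DAG folder exactly the way the worker does.
from airflow.models import DagBag

bag = DagBag(dag_folder="/opt/bitnami/airflow/dags/external")
print(bag.import_errors)  # any parse errors end up in this dict
print(bag.dag_ids)        # 'persist_events_dag' should appear here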
Code
from builtins import range
from datetime import datetime, timedelta

from airflow.models import DAG
from airflow.operators.bash_operator import BashOperator

args = {
    'owner': 'Airflow',
    'start_date': datetime(2021, 1, 8, 8, 30, 0),
    'retries': 3,
    'retry_delay': timedelta(minutes=5),
}

dag = DAG(
    dag_id='persist_events_dag',
    default_args=args,
    schedule_interval='0 * * * *',
    tags=['SparkJobs']
)

run_job = BashOperator(
    task_id='curl_to_spark_api',
    bash_command="""
    # Here comes a curl command
    """,
    dag=dag,
)

run_job
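As an aside, since the actual curl command is elided above: for the BashOperator to surface HTTP errors, the curl call generally needs --fail, because curl otherwise exits 0 even on 4xx/5xx responses. A purely hypothetical illustration (endpoint and payload made up):

run_job = BashOperator(
    task_id='curl_to_spark_api',
    # --fail makes curl exit non-zero on HTTP errors so the task fails visibly.
    # The URL and payload below are illustrative placeholders, not the real command.
    bash_command=(
        "curl --fail --silent --show-error "
        "-X POST http://spark-api.example.com/jobs "
        "-d '{\"job\": \"persist_events\"}'"
    ),
    dag=dag,
)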
Logs
From the Airflow worker on failed runs:
[2021-01-08 10:36:13,313: INFO/MainProcess] Received task: airflow.executors.celery_executor.execute_command[c6e3fdfb-5474-47aa-b333-be8c69b23ebe]
[2021-01-08 10:36:13,314: INFO/ForkPoolWorker-15] Executing command in Celery: ['airflow', 'run', 'persist_events_dag', 'task_id', '2021-01-08T09:00:00+00:00', '--local', '--pool', 'default_pool', '-sd', '/opt/bitnami/airflow/dags/external/..2021_01_08_10_33_28.059622242/persist_screenviews.py']
[2021-01-08 10:36:19,279] {__init__.py:50} INFO - Using executor CeleryExecutor
[2021-01-08 10:36:19,356] {dagbag.py:417} INFO - Filling up the DagBag from /opt/bitnami/airflow/dags/external/..2021_01_08_10_33_28.059622242/persist_screenviews.py
/opt/bitnami/airflow/venv/lib/python3.6/site-packages/airflow/models/dag.py:1342: PendingDeprecationWarning: The requested task could not be added to the DAG because a task with task_id create_tag_template_field_result is already in the DAG. Starting in Airflow 2.0, trying to overwrite a task will raise an exception.
category=PendingDeprecationWarning)
Traceback (most recent call last):
File "/opt/bitnami/airflow/venv/bin/airflow", line 37, in <module>
args.func(args)
File "/opt/bitnami/airflow/venv/lib/python3.6/site-packages/airflow/utils/cli.py", line 76, in wrapper
return f(*args, **kwargs)
File "/opt/bitnami/airflow/venv/lib/python3.6/site-packages/airflow/bin/cli.py", line 538, in run
dag = get_dag(args)
File "/opt/bitnami/airflow/venv/lib/python3.6/site-packages/airflow/bin/cli.py", line 164, in get_dag
'parse.'.format(args.dag_id))
airflow.exceptions.AirflowException: dag_id could not be found: persist_events_dag. Either the dag did not exist or it failed to parse.
[2021-01-08 10:36:20,373: ERROR/ForkPoolWorker-15] execute_command encountered a CalledProcessError
Traceback (most recent call last):
File "/opt/bitnami/airflow/venv/lib/python3.6/site-packages/airflow/executors/celery_executor.py", line 78, in execute_command
close_fds=True, env=env)
File "/opt/bitnami/python/lib/python3.6/subprocess.py", line 311, in check_call
raise CalledProcessError(retcode, cmd)
subprocess.CalledProcessError: Command '['airflow', 'run', 'persist_events_dag', 'task_id', '2021-01-08T09:00:00+00:00', '--local', '--pool', 'default_pool', '-sd', '/opt/bitnami/airflow/dags/external/..2021_01_08_10_33_28.059622242/persist_screenviews.py']' returned non-zero exit status 1.
[2021-01-08 10:36:20,373: ERROR/ForkPoolWorker-15] None
[2021-01-08 10:36:20,526: ERROR/ForkPoolWorker-15] Task airflow.executors.celery_executor.execute_command[c6e3fdfb-5474-47aa-b333-be8c69b23ebe] raised unexpected: AirflowException('Celery command failed',)
Traceback (most recent call last):
File "/opt/bitnami/airflow/venv/lib/python3.6/site-packages/airflow/executors/celery_executor.py", line 78, in execute_command
close_fds=True, env=env)
File "/opt/bitnami/python/lib/python3.6/subprocess.py", line 311, in check_call
raise CalledProcessError(retcode, cmd)
subprocess.CalledProcessError: Command '['airflow', 'run', 'persist_events_dag', 'task_id', '2021-01-08T09:00:00+00:00', '--local', '--pool', 'default_pool', '-sd', '/opt/bitnami/airflow/dags/external/..2021_01_08_10_33_28.059622242/persist_screenviews.py']' returned non-zero exit status 1.
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/opt/bitnami/airflow/venv/lib/python3.6/site-packages/celery/app/trace.py", line 412, in trace_task
R = retval = fun(*args, **kwargs)
File "/opt/bitnami/airflow/venv/lib/python3.6/site-packages/celery/app/trace.py", line 704, in __protected_call__
return self.run(*args, **kwargs)
File "/opt/bitnami/airflow/venv/lib/python3.6/site-packages/airflow/executors/celery_executor.py", line 83, in execute_command
raise AirflowException('Celery command failed')
airflow.exceptions.AirflowException: Celery command failed
[2021-01-08 10:36:43,230: INFO/MainProcess] Received task: airflow.executors.celery_executor.execute_command[8ba7956f-6b96-48f8-b112-3a4d7baa8bf7]
[2021-01-08 10:36:43,231: INFO/ForkPoolWorker-15] Executing command in Celery: ['airflow', 'run', 'persist_events_dag', 'task_id', '2021-01-08T10:36:40.516529+00:00', '--local', '--pool', 'default_pool', '-sd', '/opt/bitnami/airflow/dags/external/..2021_01_08_10_33_28.059622242/persist_screenviews.py']
[2021-01-08 10:36:49,157] {__init__.py:50} INFO - Using executor CeleryExecutor
[2021-01-08 10:36:49,158] {dagbag.py:417} INFO - Filling up the DagBag from /opt/bitnami/airflow/dags/external/..2021_01_08_10_33_28.059622242/persist_screenviews.py
/opt/bitnami/airflow/venv/lib/python3.6/site-packages/airflow/models/dag.py:1342: PendingDeprecationWarning: The requested task could not be added to the DAG because a task with task_id create_tag_template_field_result is already in the DAG. Starting in Airflow 2.0, trying to overwrite a task will raise an exception.
category=PendingDeprecationWarning)
Traceback (most recent call last):
File "/opt/bitnami/airflow/venv/bin/airflow", line 37, in <module>
args.func(args)
File "/opt/bitnami/airflow/venv/lib/python3.6/site-packages/airflow/utils/cli.py", line 76, in wrapper
return f(*args, **kwargs)
File "/opt/bitnami/airflow/venv/lib/python3.6/site-packages/airflow/bin/cli.py", line 538, in run
dag = get_dag(args)
File "/opt/bitnami/airflow/venv/lib/python3.6/site-packages/airflow/bin/cli.py", line 164, in get_dag
'parse.'.format(args.dag_id))
airflow.exceptions.AirflowException: dag_id could not be found: persist_events_dag. Either the dag did not exist or it failed to parse.
[2021-01-08 10:36:50,560: ERROR/ForkPoolWorker-15] execute_command encountered a CalledProcessError
Traceback (most recent call last):
File "/opt/bitnami/airflow/venv/lib/python3.6/site-packages/airflow/executors/celery_executor.py", line 78, in execute_command
close_fds=True, env=env)
File "/opt/bitnami/python/lib/python3.6/subprocess.py", line 311, in check_call
raise CalledProcessError(retcode, cmd)
subprocess.CalledProcessError: Command '['airflow', 'run', 'persist_events_dag', 'task_id', '2021-01-08T10:36:40.516529+00:00', '--local', '--pool', 'default_pool', '-sd', '/opt/bitnami/airflow/dags/external/..2021_01_08_10_33_28.059622242/persist_screenviews.py']' returned non-zero exit status 1.
[2021-01-08 10:36:50,561: ERROR/ForkPoolWorker-15] None
[2021-01-08 10:36:50,649: ERROR/ForkPoolWorker-15] Task airflow.executors.celery_executor.execute_command[8ba7956f-6b96-48f8-b112-3a4d7baa8bf7] raised unexpected: AirflowException('Celery command failed',)
Traceback (most recent call last):
File "/opt/bitnami/airflow/venv/lib/python3.6/site-packages/airflow/executors/celery_executor.py", line 78, in execute_command
close_fds=True, env=env)
File "/opt/bitnami/python/lib/python3.6/subprocess.py", line 311, in check_call
raise CalledProcessError(retcode, cmd)
subprocess.CalledProcessError: Command '['airflow', 'run', 'persist_events_dag', 'task_id', '2021-01-08T10:36:40.516529+00:00', '--local', '--pool', 'default_pool', '-sd', '/opt/bitnami/airflow/dags/external/..2021_01_08_10_33_28.059622242/persist_screenviews.py']' returned non-zero exit status 1.
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/opt/bitnami/airflow/venv/lib/python3.6/site-packages/celery/app/trace.py", line 412, in trace_task
R = retval = fun(*args, **kwargs)
File "/opt/bitnami/airflow/venv/lib/python3.6/site-packages/celery/app/trace.py", line 704, in __protected_call__
return self.run(*args, **kwargs)
File "/opt/bitnami/airflow/venv/lib/python3.6/site-packages/airflow/executors/celery_executor.py", line 83, in execute_command
raise AirflowException('Celery command failed')
airflow.exceptions.AirflowException: Celery command failed
[2021-01-08 10:37:23,414: INFO/MainProcess] Received task: airflow.executors.celery_executor.execute_command[10478bca-c21d-4cf5-b366-5ca1f66c0fe1]
[2021-01-08 10:37:23,415: INFO/ForkPoolWorker-15] Executing command in Celery: ['airflow', 'run', 'persist_events_dag', 'task_id', '2021-01-08T09:00:00+00:00', '--local', '--pool', 'default_pool', '-sd', '/opt/bitnami/airflow/dags/external/..data/persist_screenviews.py']
[2021-01-08 10:37:29,457] {__init__.py:50} INFO - Using executor CeleryExecutor
[2021-01-08 10:37:29,458] {dagbag.py:417} INFO - Filling up the DagBag from /opt/bitnami/airflow/dags/external/..data/persist_screenviews.py
Running %s on host %s <TaskInstance: persist_events_dag.task_id 2021-01-08T09:00:00+00:00 [queued]> airflow-worker-0.airflow-worker-headless.default.svc.cluster.local
[2021-01-08 10:37:36,294: INFO/ForkPoolWorker-15] Task airflow.executors.celery_executor.execute_command[10478bca-c21d-4cf5-b366-5ca1f66c0fe1] succeeded in 12.8788606680464s: None
Even when a run succeeds, I nevertheless see the following error in the Airflow scheduler logs:
Process DagFileProcessor144631-Process:
Traceback (most recent call last):
File "/opt/bitnami/airflow/venv/lib/python3.6/site-packages/sqlalchemy/engine/base.py", line 1277, in _execute_context
cursor, statement, parameters, context
File "/opt/bitnami/airflow/venv/lib/python3.6/site-packages/sqlalchemy/engine/default.py", line 593, in do_execute
cursor.execute(statement, parameters)
psycopg2.errors.UniqueViolation: duplicate key value violates unique constraint "dag_pkey"
DETAIL: Key (dag_id)=(persist_events_dag) already exists.
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/opt/bitnami/python/lib/python3.6/multiprocessing/process.py", line 258, in _bootstrap
self.run()
File "/opt/bitnami/python/lib/python3.6/multiprocessing/process.py", line 93, in run
self._target(*self._args, **self._kwargs)
File "/opt/bitnami/airflow/venv/lib/python3.6/site-packages/airflow/jobs/scheduler_job.py", line 159, in _run_file_processor
pickle_dags)
File "/opt/bitnami/airflow/venv/lib/python3.6/site-packages/airflow/utils/db.py", line 74, in wrapper
return func(*args, **kwargs)
File "/opt/bitnami/airflow/venv/lib/python3.6/site-packages/airflow/jobs/scheduler_job.py", line 1609, in process_file
dag.sync_to_db()
File "/opt/bitnami/airflow/venv/lib/python3.6/site-packages/airflow/utils/db.py", line 74, in wrapper
return func(*args, **kwargs)
File "/opt/bitnami/airflow/venv/lib/python3.6/site-packages/airflow/models/dag.py", line 1535, in sync_to_db
orm_dag.tags = self.get_dagtags(session=session)
File "/opt/bitnami/airflow/venv/lib/python3.6/site-packages/airflow/utils/db.py", line 70, in wrapper
return func(*args, **kwargs)
File "/opt/bitnami/airflow/venv/lib/python3.6/site-packages/airflow/models/dag.py", line 1574, in get_dagtags
session.commit()
File "/opt/bitnami/airflow/venv/lib/python3.6/site-packages/sqlalchemy/orm/session.py", line 1042, in commit
self.transaction.commit()
File "/opt/bitnami/airflow/venv/lib/python3.6/site-packages/sqlalchemy/orm/session.py", line 504, in commit
self._prepare_impl()
File "/opt/bitnami/airflow/venv/lib/python3.6/site-packages/sqlalchemy/orm/session.py", line 483, in _prepare_impl
self.session.flush()
File "/opt/bitnami/airflow/venv/lib/python3.6/site-packages/sqlalchemy/orm/session.py", line 2536, in flush
self._flush(objects)
File "/opt/bitnami/airflow/venv/lib/python3.6/site-packages/sqlalchemy/orm/session.py", line 2678, in _flush
transaction.rollback(_capture_exception=True)
File "/opt/bitnami/airflow/venv/lib/python3.6/site-packages/sqlalchemy/util/langhelpers.py", line 70, in __exit__
with_traceback=exc_tb,
File "/opt/bitnami/airflow/venv/lib/python3.6/site-packages/sqlalchemy/util/compat.py", line 182, in raise_
raise exception
File "/opt/bitnami/airflow/venv/lib/python3.6/site-packages/sqlalchemy/orm/session.py", line 2638, in _flush
flush_context.execute()
File "/opt/bitnami/airflow/venv/lib/python3.6/site-packages/sqlalchemy/orm/unitofwork.py", line 422, in execute
rec.execute(self)
File "/opt/bitnami/airflow/venv/lib/python3.6/site-packages/sqlalchemy/orm/unitofwork.py", line 589, in execute
uow,
File "/opt/bitnami/airflow/venv/lib/python3.6/site-packages/sqlalchemy/orm/persistence.py", line 245, in save_obj
insert,
File "/opt/bitnami/airflow/venv/lib/python3.6/site-packages/sqlalchemy/orm/persistence.py", line 1083, in _emit_insert_statements
c = cached_connections[connection].execute(statement, multiparams)
File "/opt/bitnami/airflow/venv/lib/python3.6/site-packages/sqlalchemy/engine/base.py", line 1011, in execute
return meth(self, multiparams, params)
File "/opt/bitnami/airflow/venv/lib/python3.6/site-packages/sqlalchemy/sql/elements.py", line 298, in _execute_on_connection
return connection._execute_clauseelement(self, multiparams, params)
File "/opt/bitnami/airflow/venv/lib/python3.6/site-packages/sqlalchemy/engine/base.py", line 1130, in _execute_clauseelement
distilled_params,
File "/opt/bitnami/airflow/venv/lib/python3.6/site-packages/sqlalchemy/engine/base.py", line 1317, in _execute_context
e, statement, parameters, cursor, context
File "/opt/bitnami/airflow/venv/lib/python3.6/site-packages/sqlalchemy/engine/base.py", line 1511, in _handle_dbapi_exception
sqlalchemy_exception, with_traceback=exc_info[2], from_=e
File "/opt/bitnami/airflow/venv/lib/python3.6/site-packages/sqlalchemy/util/compat.py", line 182, in raise_
raise exception
File "/opt/bitnami/airflow/venv/lib/python3.6/site-packages/sqlalchemy/engine/base.py", line 1277, in _execute_context
cursor, statement, parameters, context
File "/opt/bitnami/airflow/venv/lib/python3.6/site-packages/sqlalchemy/engine/default.py", line 593, in do_execute
cursor.execute(statement, parameters)
sqlalchemy.exc.IntegrityError: (psycopg2.errors.UniqueViolation) duplicate key value violates unique constraint "dag_pkey"
DETAIL: Key (dag_id)=(persist_events_dag) already exists.
[SQL: INSERT INTO dag (dag_id, root_dag_id, is_paused, is_subdag, is_active, last_scheduler_run, last_pickled, last_expired, scheduler_lock, pickle_id, fileloc, owners, description, default_view, schedule_interval) VALUES (%(dag_id)s, %(root_dag_id)s, %(is_paused)s, %(is_subdag)s, %(is_active)s, %(last_scheduler_run)s, %(last_pickled)s, %(last_expired)s, %(scheduler_lock)s, %(pickle_id)s, %(fileloc)s, %(owners)s, %(description)s, %(default_view)s, %(schedule_interval)s)]
[parameters: {'dag_id': 'persist_events_dag', 'root_dag_id': None, 'is_paused': True, 'is_subdag': False, 'is_active': True, 'last_scheduler_run': datetime.datetime(2021, 1, 8, 10, 55, 4, 297819, tzinfo=<Timezone [UTC]>), 'last_pickled': None, 'last_expired': None, 'scheduler_lock': None, 'pickle_id': None, 'fileloc': '/opt/bitnami/airflow/dags/external/..2021_01_08_10_33_28.059622242/persist_screenviews.py', 'owners': 'Airflow', 'description': None, 'default_view': None, 'schedule_interval': '"0 * * * *"'}]
(Background on this error at: http://sqlalche.me/e/13/gkpj)
Process DagFileProcessor144689-Process:
Traceback (most recent call last):
File "/opt/bitnami/airflow/venv/lib/python3.6/site-packages/sqlalchemy/engine/base.py", line 1277, in _execute_context
cursor, statement, parameters, context
File "/opt/bitnami/airflow/venv/lib/python3.6/site-packages/sqlalchemy/engine/default.py", line 593, in do_execute
cursor.execute(statement, parameters)
psycopg2.errors.UniqueViolation: duplicate key value violates unique constraint "dag_pkey"
DETAIL: Key (dag_id)=(persist_events_dag) already exists.
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/opt/bitnami/python/lib/python3.6/multiprocessing/process.py", line 258, in _bootstrap
self.run()
File "/opt/bitnami/python/lib/python3.6/multiprocessing/process.py", line 93, in run
self._target(*self._args, **self._kwargs)
File "/opt/bitnami/airflow/venv/lib/python3.6/site-packages/airflow/jobs/scheduler_job.py", line 159, in _run_file_processor
pickle_dags)
File "/opt/bitnami/airflow/venv/lib/python3.6/site-packages/airflow/utils/db.py", line 74, in wrapper
return func(*args, **kwargs)
File "/opt/bitnami/airflow/venv/lib/python3.6/site-packages/airflow/jobs/scheduler_job.py", line 1609, in process_file
dag.sync_to_db()
File "/opt/bitnami/airflow/venv/lib/python3.6/site-packages/airflow/utils/db.py", line 74, in wrapper
return func(*args, **kwargs)
File "/opt/bitnami/airflow/venv/lib/python3.6/site-packages/airflow/models/dag.py", line 1535, in sync_to_db
orm_dag.tags = self.get_dagtags(session=session)
File "/opt/bitnami/airflow/venv/lib/python3.6/site-packages/airflow/utils/db.py", line 70, in wrapper
return func(*args, **kwargs)
File "/opt/bitnami/airflow/venv/lib/python3.6/site-packages/airflow/models/dag.py", line 1574, in get_dagtags
session.commit()
File "/opt/bitnami/airflow/venv/lib/python3.6/site-packages/sqlalchemy/orm/session.py", line 1042, in commit
self.transaction.commit()
File "/opt/bitnami/airflow/venv/lib/python3.6/site-packages/sqlalchemy/orm/session.py", line 504, in commit
self._prepare_impl()
File "/opt/bitnami/airflow/venv/lib/python3.6/site-packages/sqlalchemy/orm/session.py", line 483, in _prepare_impl
self.session.flush()
File "/opt/bitnami/airflow/venv/lib/python3.6/site-packages/sqlalchemy/orm/session.py", line 2536, in flush
self._flush(objects)
File "/opt/bitnami/airflow/venv/lib/python3.6/site-packages/sqlalchemy/orm/session.py", line 2678, in _flush
transaction.rollback(_capture_exception=True)
File "/opt/bitnami/airflow/venv/lib/python3.6/site-packages/sqlalchemy/util/langhelpers.py", line 70, in __exit__
with_traceback=exc_tb,
File "/opt/bitnami/airflow/venv/lib/python3.6/site-packages/sqlalchemy/util/compat.py", line 182, in raise_
raise exception
File "/opt/bitnami/airflow/venv/lib/python3.6/site-packages/sqlalchemy/orm/session.py", line 2638, in _flush
flush_context.execute()
File "/opt/bitnami/airflow/venv/lib/python3.6/site-packages/sqlalchemy/orm/unitofwork.py", line 422, in execute
rec.execute(self)
File "/opt/bitnami/airflow/venv/lib/python3.6/site-packages/sqlalchemy/orm/unitofwork.py", line 589, in execute
uow,
File "/opt/bitnami/airflow/venv/lib/python3.6/site-packages/sqlalchemy/orm/persistence.py", line 245, in save_obj
insert,
File "/opt/bitnami/airflow/venv/lib/python3.6/site-packages/sqlalchemy/orm/persistence.py", line 1083, in _emit_insert_statements
c = cached_connections[connection].execute(statement, multiparams)
File "/opt/bitnami/airflow/venv/lib/python3.6/site-packages/sqlalchemy/engine/base.py", line 1011, in execute
return meth(self, multiparams, params)
File "/opt/bitnami/airflow/venv/lib/python3.6/site-packages/sqlalchemy/sql/elements.py", line 298, in _execute_on_connection
return connection._execute_clauseelement(self, multiparams, params)
File "/opt/bitnami/airflow/venv/lib/python3.6/site-packages/sqlalchemy/engine/base.py", line 1130, in _execute_clauseelement
distilled_params,
File "/opt/bitnami/airflow/venv/lib/python3.6/site-packages/sqlalchemy/engine/base.py", line 1317, in _execute_context
e, statement, parameters, cursor, context
File "/opt/bitnami/airflow/venv/lib/python3.6/site-packages/sqlalchemy/engine/base.py", line 1511, in _handle_dbapi_exception
sqlalchemy_exception, with_traceback=exc_info[2], from_=e
File "/opt/bitnami/airflow/venv/lib/python3.6/site-packages/sqlalchemy/util/compat.py", line 182, in raise_
raise exception
File "/opt/bitnami/airflow/venv/lib/python3.6/site-packages/sqlalchemy/engine/base.py", line 1277, in _execute_context
cursor, statement, parameters, context
File "/opt/bitnami/airflow/venv/lib/python3.6/site-packages/sqlalchemy/engine/default.py", line 593, in do_execute
cursor.execute(statement, parameters)
sqlalchemy.exc.IntegrityError: (psycopg2.errors.UniqueViolation) duplicate key value violates unique constraint "dag_pkey"
DETAIL: Key (dag_id)=(persist_events_dag) already exists.
[SQL: INSERT INTO dag (dag_id, root_dag_id, is_paused, is_subdag, is_active, last_scheduler_run, last_pickled, last_expired, scheduler_lock, pickle_id, fileloc, owners, description, default_view, schedule_interval) VALUES (%(dag_id)s, %(root_dag_id)s, %(is_paused)s, %(is_subdag)s, %(is_active)s, %(last_scheduler_run)s, %(last_pickled)s, %(last_expired)s, %(scheduler_lock)s, %(pickle_id)s, %(fileloc)s, %(owners)s, %(description)s, %(default_view)s, %(schedule_interval)s)]
[parameters: {'dag_id': 'persist_events_dag', 'root_dag_id': None, 'is_paused': True, 'is_subdag': False, 'is_active': True, 'last_scheduler_run': datetime.datetime(2021, 1, 8, 10, 55, 33, 439514, tzinfo=<Timezone [UTC]>), 'last_pickled': None, 'last_expired': None, 'scheduler_lock': None, 'pickle_id': None, 'fileloc': '/opt/bitnami/airflow/dags/external/..2021_01_08_10_33_28.059622242/persist_screenviews.py', 'owners': 'Airflow', 'description': None, 'default_view': None, 'schedule_interval': '"0 * * * *"'}]
(Background on this error at: http://sqlalche.me/e/13/gkpj)
[2021-01-08 10:57:22,392] {scheduler_job.py:963} INFO - 1 tasks up for execution:
<TaskInstance: persist_events_dag.task_id 2021-01-08 09:00:00+00:00 [scheduled]>
[2021-01-08 10:57:22,402] {scheduler_job.py:997} INFO - Figuring out tasks to run in Pool(name=default_pool) with 128 open slots and 1 task instances ready to be queued
[2021-01-08 10:57:22,403] {scheduler_job.py:1025} INFO - DAG persist_events_dag has 0/16 running and queued tasks
[2021-01-08 10:57:22,407] {scheduler_job.py:1085} INFO - Setting the following tasks to queued state:
<TaskInstance: persist_events_dag.task_id 2021-01-08 09:00:00+00:00 [scheduled]>
[2021-01-08 10:57:22,420] {scheduler_job.py:1159} INFO - Setting the following 1 tasks to queued state:
<TaskInstance: persist_events_dag.task_id 2021-01-08 09:00:00+00:00 [queued]>
[2021-01-08 10:57:22,420] {scheduler_job.py:1195} INFO - Sending ('persist_events_dag', 'task_id', datetime.datetime(2021, 1, 8, 9, 0, tzinfo=<TimezoneInfo [UTC, GMT, +00:00:00, STD]>), 1) to executor with priority 1 and queue default
[2021-01-08 10:57:22,424] {base_executor.py:58} INFO - Adding to queue: ['airflow', 'run', 'persist_events_dag', 'task_id', '2021-01-08T09:00:00+00:00', '--local', '--pool', 'default_pool', '-sd', '/opt/bitnami/airflow/dags/external/persist_screenviews.py']
[2021-01-08 10:57:36,579] {scheduler_job.py:1334} INFO - Executor reports execution of persist_events_dag.task_id execution_date=2021-01-08 09:00:00+00:00 exited with status success for try_number 1
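One detail worth noting in these logs: the failing runs are told to parse the DAG from a timestamped ConfigMap snapshot directory (..2021_01_08_10_33_28.059622242), while the succeeding run uses the stable ..data symlink. A minimal sketch, assuming direct access to the metadata database, of checking which fileloc the scheduler has recorded:

# Sketch only: inspect the fileloc recorded for the DAG in the metadata DB.
# The connection string is a placeholder; point it at your PSQL instance.
from sqlalchemy import create_engine, text

engine = create_engine("postgresql+psycopg2://airflow:airflow@postgres:5432/airflow")
with engine.connect() as conn:
    row = conn.execute(
        text("SELECT dag_id, fileloc FROM dag WHERE dag_id = :d"),
        {"d": "persist_events_dag"},
    ).fetchone()
print(row)  # a stale timestamped fileloc here would match the failures above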

Related

Apache Airflow on Azure sqlalchemy Postgres SSL SYSCALL error: EOF detected

We deployed Apache Airflow 2.3.3 to Azure.
Webserver - Web App
Scheduler - ACI
Celery Worker - ACI
We were seeing errors in the Celery ACI console related to Postgres and Redis connection timeouts, as shown below:
[2022-09-22 18:55:50,650: WARNING/ForkPoolWorker-15] Failed operation _store_result. Retrying 2 more times.
Traceback (most recent call last):
File "/home/airflow/.local/lib/python3.7/site-packages/sqlalchemy/engine/base.py", line 1803, in _execute_context
cursor, statement, parameters, context
File "/home/airflow/.local/lib/python3.7/site-packages/sqlalchemy/engine/default.py", line 719, in do_execute
cursor.execute(statement, parameters)
psycopg2.DatabaseError: could not receive data from server: Connection timed out
SSL SYSCALL error: Connection timed out
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/home/airflow/.local/lib/python3.7/site-packages/celery/backends/database/__init__.py", line 47, in _inner
return fun(*args, **kwargs)
File "/home/airflow/.local/lib/python3.7/site-packages/celery/backends/database/__init__.py", line 117, in _store_result
task = list(session.query(self.task_cls).filter(self.task_cls.task_id == task_id))
File "/home/airflow/.local/lib/python3.7/site-packages/sqlalchemy/orm/query.py", line 2887, in __iter__
return self._iter().__iter__()
File "/home/airflow/.local/lib/python3.7/site-packages/sqlalchemy/orm/query.py", line 2897, in _iter
execution_options={"_sa_orm_load_options": self.load_options},
File "/home/airflow/.local/lib/python3.7/site-packages/sqlalchemy/orm/session.py", line 1689, in execute
result = conn._execute_20(statement, params or {}, execution_options)
File "/home/airflow/.local/lib/python3.7/site-packages/sqlalchemy/engine/base.py", line 1614, in _execute_20
return meth(self, args_10style, kwargs_10style, execution_options)
File "/home/airflow/.local/lib/python3.7/site-packages/sqlalchemy/sql/elements.py", line 326, in _execute_on_connection
self, multiparams, params, execution_options
File "/home/airflow/.local/lib/python3.7/site-packages/sqlalchemy/engine/base.py", line 1491, in _execute_clauseelement
cache_hit=cache_hit,
File "/home/airflow/.local/lib/python3.7/site-packages/sqlalchemy/engine/base.py", line 1846, in _execute_context
e, statement, parameters, cursor, context
File "/home/airflow/.local/lib/python3.7/site-packages/sqlalchemy/engine/base.py", line 2027, in _handle_dbapi_exception
sqlalchemy_exception, with_traceback=exc_info[2], from_=e
File "/home/airflow/.local/lib/python3.7/site-packages/sqlalchemy/util/compat.py", line 207, in raise_
raise exception
File "/home/airflow/.local/lib/python3.7/site-packages/sqlalchemy/engine/base.py", line 1803, in _execute_context
cursor, statement, parameters, context
File "/home/airflow/.local/lib/python3.7/site-packages/sqlalchemy/engine/default.py", line 719, in do_execute
cursor.execute(statement, parameters)
sqlalchemy.exc.DatabaseError: (psycopg2.DatabaseError) could not receive data from server: Connection timed out
SSL SYSCALL error: Connection timed out
[SQL: SELECT celery_taskmeta.id AS celery_taskmeta_id, celery_taskmeta.task_id AS celery_taskmeta_task_id, celery_taskmeta.status AS celery_taskmeta_status, celery_taskmeta.result AS celery_taskmeta_result, celery_taskmeta.date_done AS celery_taskmeta_date_done, celery_taskmeta.traceback AS celery_taskmeta_traceback
FROM celery_taskmeta
WHERE celery_taskmeta.task_id = %(task_id_1)s]
[parameters: {'task_id_1': 'c5f9f53c-8afe-4d67-8d3b-d7ad84875de1'}]
(Background on this error at: https://sqlalche.me/e/14/4xp6)
[2022-09-22 18:55:50,929: INFO/ForkPoolWorker-15] [c5f9f53c-8afe-4d67-8d3b-d7ad84875de1] Executing command in Celery: ['airflow', 'tasks', 'run', 'CS_ALERTING', 'CheckRunningTasks', 'scheduled__2022-09-22T18:00:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/CS_ALERTING.py']
[2022-09-22 18:55:51,241: INFO/ForkPoolWorker-15] Filling up the DagBag from /opt/airflow/platform_pam/dags/CS_ALERTING.py
[2022-09-22 18:55:53,467: INFO/ForkPoolWorker-15] Running <TaskInstance: CS_ALERTING.CheckRunningTasks scheduled__2022-09-22T18:00:00+00:00 [queued]> on host localhost
[2022-09-22 18:55:58,304: INFO/ForkPoolWorker-15] Task airflow.executors.celery_executor.execute_command[c5f9f53c-8afe-4d67-8d3b-d7ad84875de1] succeeded in 960.1964174450004s: None
[2022-09-22 19:29:25,931: WARNING/MainProcess] consumer: Connection to broker lost. Trying to re-establish the connection...
Traceback (most recent call last):
File "/home/airflow/.local/lib/python3.7/site-packages/redis/connection.py", line 706, in send_packed_command
sendall(self._sock, item)
File "/home/airflow/.local/lib/python3.7/site-packages/redis/_compat.py", line 9, in sendall
return sock.sendall(*args, **kwargs)
File "/usr/local/lib/python3.7/ssl.py", line 1034, in sendall
v = self.send(byte_view[count:])
File "/usr/local/lib/python3.7/ssl.py", line 1003, in send
return self._sslobj.write(data)
TimeoutError: [Errno 110] Connection timed out
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/home/airflow/.local/lib/python3.7/site-packages/celery/worker/consumer/consumer.py", line 332, in start
blueprint.start(self)
File "/home/airflow/.local/lib/python3.7/site-packages/celery/bootsteps.py", line 116, in start
step.start(parent)
File "/home/airflow/.local/lib/python3.7/site-packages/celery/worker/consumer/consumer.py", line 628, in start
c.loop(*c.loop_args())
File "/home/airflow/.local/lib/python3.7/site-packages/celery/worker/loops.py", line 97, in asynloop
next(loop)
File "/home/airflow/.local/lib/python3.7/site-packages/kombu/asynchronous/hub.py", line 301, in create_loop
poll_timeout = fire_timers(propagate=propagate) if scheduled else 1
File "/home/airflow/.local/lib/python3.7/site-packages/kombu/asynchronous/hub.py", line 143, in fire_timers
entry()
File "/home/airflow/.local/lib/python3.7/site-packages/kombu/asynchronous/timer.py", line 64, in __call__
return self.fun(*self.args, **self.kwargs)
File "/home/airflow/.local/lib/python3.7/site-packages/kombu/asynchronous/timer.py", line 126, in _reschedules
return fun(*args, **kwargs)
File "/home/airflow/.local/lib/python3.7/site-packages/kombu/transport/redis.py", line 557, in maybe_check_subclient_health
client.check_health()
File "/home/airflow/.local/lib/python3.7/site-packages/redis/client.py", line 3522, in check_health
check_health=False)
File "/home/airflow/.local/lib/python3.7/site-packages/redis/connection.py", line 726, in send_command
check_health=kwargs.get('check_health', True))
File "/home/airflow/.local/lib/python3.7/site-packages/redis/connection.py", line 718, in send_packed_command
(errno, errmsg))
redis.exceptions.ConnectionError: Error 110 while writing to socket. Connection timed out.
[2022-09-22 19:29:26,023: WARNING/MainProcess] /home/airflow/.local/lib/python3.7/site-packages/celery/worker/consumer/consumer.py:367: CPendingDeprecationWarning:
I referred to Airflow's documentation and found the section on setting up the database. We are modifying Airflow's Docker image and adding a Python file, airflow.www.db_utils.db_config (installed into site-packages), which defines the dictionary:
keepalive_kwargs = {
    "keepalives": 1,
    "keepalives_idle": 30,
    "keepalives_interval": 5,
    "keepalives_count": 5,
}
Finally, we are setting
ENV AIRFLOW__DATABASE__SQL_ALCHEMY_CONNECT_ARGS="airflow.www.db_utils.db_config.keepalive_kwargs"
Unfortunately, the error still persists. It would be great if someone could help me resolve this issue.
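For what it's worth, the same keepalive kwargs can be exercised directly with SQLAlchemy outside Airflow to confirm they reach psycopg2 (a sketch; the DSN is a placeholder):

# Sketch: psycopg2 forwards these connect_args to libpq as TCP keepalive settings.
from sqlalchemy import create_engine

keepalive_kwargs = {
    "keepalives": 1,           # enable TCP keepalives
    "keepalives_idle": 30,     # seconds idle before the first probe
    "keepalives_interval": 5,  # seconds between probes
    "keepalives_count": 5,     # failed probes before the connection is dropped
}

engine = create_engine(
    "postgresql+psycopg2://user:pass@host:5432/airflow",  # placeholder DSN
    connect_args=keepalive_kwargs,
)
engine.connect().close()  # raises if the connection cannot be established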

Airflow Scheduler on Webserver: "The scheduler does not appear to be running" after installation

I have installed Airflow 2.2.5 in my local VM (Oracle VirtualBox 8.6) with MySQL 8.0 as the database backend and went through the installation process described on the website (https://airflow.apache.org/docs/apache-airflow/stable/installation/index.html).
I am using the 'sql_alchemy_conn' string below:
mysql+mysqlconnector://airflow_user:airflow_password@localhost:3306/airflow_db
I managed to install Airflow and its dependencies (at least what was asked for throughout the process) and got to the point where I can log in to the webserver. However, the webserver shows a warning that the scheduler is not running.
When I execute "airflow scheduler" in the terminal, I get the following error:
[2022-09-14 15:43:28,943] {authentication.py:59} INFO - package: mysql.connector.plugins
[2022-09-14 15:43:28,947] {authentication.py:60} INFO - plugin_name: mysql_native_password
[2022-09-14 15:43:28,953] {authentication.py:64} INFO - AUTHENTICATION_PLUGIN_CLASS: MySQLNativePasswordAuthPlugin
[Airflow ASCII-art banner]
Traceback (most recent call last):
File "/usr/local/lib/python3.6/site-packages/mysql/connector/cursor.py", line 398, in _process_params_dict
conv = to_mysql(conv)
File "/usr/local/lib/python3.6/site-packages/mysql/connector/conversion.py", line 194, in to_mysql
) from None
TypeError: Python 'taskinstancestate' cannot be converted to a MySQL type
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/usr/local/lib64/python3.6/site-packages/sqlalchemy/engine/base.py", line 1277, in _execute_context
cursor, statement, parameters, context
File "/usr/local/lib64/python3.6/site-packages/sqlalchemy/engine/default.py", line 608, in do_execute
cursor.execute(statement, parameters)
File "/usr/local/lib/python3.6/site-packages/mysql/connector/cursor.py", line 541, in execute
stmt = _bytestr_format_dict(stmt, self._process_params_dict(params))
File "/usr/local/lib/python3.6/site-packages/mysql/connector/cursor.py", line 406, in _process_params_dict
) from err
mysql.connector.errors.ProgrammingError: Failed processing pyformat-parameters; Python 'taskinstancestate' cannot be converted to a MySQL type
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/usr/local/bin/airflow", line 8, in <module>
sys.exit(main())
File "/usr/local/lib/python3.6/site-packages/airflow/__main__.py", line 48, in main
args.func(args)
File "/usr/local/lib/python3.6/site-packages/airflow/cli/cli_parser.py", line 48, in command
return func(*args, **kwargs)
File "/usr/local/lib/python3.6/site-packages/airflow/utils/cli.py", line 92, in wrapper
return f(*args, **kwargs)
File "/usr/local/lib/python3.6/site-packages/airflow/cli/commands/scheduler_command.py", line 75, in scheduler
_run_scheduler_job(args=args)
File "/usr/local/lib/python3.6/site-packages/airflow/cli/commands/scheduler_command.py", line 46, in _run_scheduler_job
job.run()
File "/usr/local/lib/python3.6/site-packages/airflow/jobs/base_job.py", line 242, in run
session.commit()
File "/usr/local/lib64/python3.6/site-packages/sqlalchemy/orm/session.py", line 1046, in commit
self.transaction.commit()
File "/usr/local/lib64/python3.6/site-packages/sqlalchemy/orm/session.py", line 504, in commit
self._prepare_impl()
File "/usr/local/lib64/python3.6/site-packages/sqlalchemy/orm/session.py", line 483, in _prepare_impl
self.session.flush()
File "/usr/local/lib64/python3.6/site-packages/sqlalchemy/orm/session.py", line 2540, in flush
self._flush(objects)
File "/usr/local/lib64/python3.6/site-packages/sqlalchemy/orm/session.py", line 2682, in _flush
transaction.rollback(_capture_exception=True)
File "/usr/local/lib64/python3.6/site-packages/sqlalchemy/util/langhelpers.py", line 70, in __exit__
with_traceback=exc_tb,
File "/usr/local/lib64/python3.6/site-packages/sqlalchemy/util/compat.py", line 182, in raise_
raise exception
File "/usr/local/lib64/python3.6/site-packages/sqlalchemy/orm/session.py", line 2642, in _flush
flush_context.execute()
File "/usr/local/lib64/python3.6/site-packages/sqlalchemy/orm/unitofwork.py", line 422, in execute
rec.execute(self)
File "/usr/local/lib64/python3.6/site-packages/sqlalchemy/orm/unitofwork.py", line 589, in execute
uow,
File "/usr/local/lib64/python3.6/site-packages/sqlalchemy/orm/persistence.py", line 245, in save_obj
insert,
File "/usr/local/lib64/python3.6/site-packages/sqlalchemy/orm/persistence.py", line 1136, in _emit_insert_statements
statement, params
File "/usr/local/lib64/python3.6/site-packages/sqlalchemy/engine/base.py", line 1011, in execute
return meth(self, multiparams, params)
File "/usr/local/lib64/python3.6/site-packages/sqlalchemy/sql/elements.py", line 298, in _execute_on_connection
return connection._execute_clauseelement(self, multiparams, params)
File "/usr/local/lib64/python3.6/site-packages/sqlalchemy/engine/base.py", line 1130, in _execute_clauseelement
distilled_params,
File "/usr/local/lib64/python3.6/site-packages/sqlalchemy/engine/base.py", line 1317, in _execute_context
e, statement, parameters, cursor, context
File "/usr/local/lib64/python3.6/site-packages/sqlalchemy/engine/base.py", line 1511, in _handle_dbapi_exception
sqlalchemy_exception, with_traceback=exc_info[2], from_=e
File "/usr/local/lib64/python3.6/site-packages/sqlalchemy/util/compat.py", line 182, in raise_
raise exception
File "/usr/local/lib64/python3.6/site-packages/sqlalchemy/engine/base.py", line 1277, in _execute_context
cursor, statement, parameters, context
File "/usr/local/lib64/python3.6/site-packages/sqlalchemy/engine/default.py", line 608, in do_execute
cursor.execute(statement, parameters)
File "/usr/local/lib/python3.6/site-packages/mysql/connector/cursor.py", line 541, in execute
stmt = _bytestr_format_dict(stmt, self._process_params_dict(params))
File "/usr/local/lib/python3.6/site-packages/mysql/connector/cursor.py", line 406, in _process_params_dict
) from err
sqlalchemy.exc.ProgrammingError: (mysql.connector.errors.ProgrammingError) Failed processing pyformat-parameters; Python 'taskinstancestate' cannot be converted to a MySQL type
[SQL: INSERT INTO job (dag_id, state, job_type, start_date, end_date, latest_heartbeat, executor_class, hostname, unixname) VALUES (%(dag_id)s, %(state)s, %(job_type)s, %(start_date)s, %(end_date)s, %(latest_heartbeat)s, %(executor_class)s, %(hostname)s, %(unixname)s)]
[parameters: {'dag_id': None, 'state': <TaskInstanceState.RUNNING: 'running'>, 'job_type': 'SchedulerJob', 'start_date': datetime.datetime(2022, 9, 14, 18, 43, 29, 247880), 'end_date': None, 'latest_heartbeat': datetime.datetime(2022, 9, 14, 18, 43, 29, 247894), 'executor_class': 'LocalExecutor', 'hostname': 'airhost', 'unixname': 'root'}]
(Background on this error at: http://sqlalche.me/e/13/f405)
I am clueless about this part:
"mysql.connector.errors.ProgrammingError: Failed processing pyformat-parameters; Python 'taskinstancestate' cannot be converted to a MySQL type".
What does that mean? Any ideas? I haven't run any DAGs or built anything so far.
I managed to fix this. The problem was related to the mysqlconnector driver chosen for the 'sql_alchemy_conn' string. I switched to pymysql and the problem was solved:
mysql+pymysql://airflow_user:airflow_password@localhost:3306/airflow_db

Getting Timeout Issue in Airflow

I am getting an SSH operator "timed out" error in Airflow. Initially I was not seeing this issue, but now I am getting it continually.
Code
xx = [
    SSHOperator(
        task_id=str(i),
        command="sh /home/ec2-user/rapid/dht/hi_policy/bin/filename.sh ",
        ssh_conn_id=SECRET_NAME,
        dag=dag,
        do_xcom_push=True,
    )
    for i in dbo_list_1
]
Error
[2022-08-22 10:16:03,760] {{taskinstance.py:1482}} ERROR - Task failed with exception
Traceback (most recent call last):
File "/usr/local/lib/python3.7/site-packages/airflow/providers/ssh/operators/ssh.py", line 109, in execute
with self.ssh_hook.get_conn() as ssh_client:
File "/usr/local/lib/python3.7/site-packages/airflow/providers/ssh/hooks/ssh.py", line 240, in get_conn
client.connect(**connect_kwargs)
File "/usr/local/lib/python3.7/site-packages/paramiko/client.py", line 349, in <lambda>
retry_on_signal(lambda: sock.connect(addr))
socket.timeout: timed out
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/usr/local/lib/python3.7/site-packages/airflow/models/taskinstance.py", line 1138, in _run_raw_task
self._prepare_and_execute_task_with_callbacks(context, task)
File "/usr/local/lib/python3.7/site-packages/airflow/models/taskinstance.py", line 1311, in _prepare_and_execute_task_with_callbacks
result = self._execute_task(context, task_copy)
raise AirflowException(f"SSH operator error: {str(e)}")
airflow.exceptions.AirflowException: SSH operator error: timed out
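Depending on the SSH provider version, the connection timeout can be raised on the operator; older provider releases expose a single timeout argument, while newer ones split it into conn_timeout and cmd_timeout. A hedged sketch with placeholder values:

# Sketch: raising the SSH timeout. The parameter name depends on the provider
# version (`timeout` in older releases, `conn_timeout`/`cmd_timeout` in newer ones).
from airflow.providers.ssh.operators.ssh import SSHOperator

SECRET_NAME = "my_ssh_conn"       # placeholder, standing in for the question's value
dbo_list_1 = ["job_a", "job_b"]   # placeholder, standing in for the question's list

xx = [
    SSHOperator(
        task_id=str(i),
        command="sh /home/ec2-user/rapid/dht/hi_policy/bin/filename.sh ",
        ssh_conn_id=SECRET_NAME,
        timeout=60,  # assumption: a provider version where `timeout` covers the connection
        do_xcom_push=True,
    )
    for i in dbo_list_1
]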

MysqlOperator in airflow 2.0.1 failed with "ssl connection error"

I am new to Airflow and I am trying to test a MySQL connection using MySqlOperator in Airflow 2.0.1. However, I am getting an SSL connection error. I have tried adding extra parameters to disable SSL mode, but I still get the same error.
Here is my code (I tried to pass the SSL param = disable in the code), and it doesn't work:
from airflow import DAG
from airflow.providers.mysql.operators.mysql import MySqlOperator
from airflow.operators.python import PythonOperator
from airflow.operators.dummy_operator import DummyOperator
from airflow.utils.dates import days_ago
from datetime import timedelta

default_args = {
    'owner': 'airflow',
    'depend_on_past': False,
    'start_date': days_ago(2),
    'retries': 1,
    'retry_delay': timedelta(minutes=1)
}

with DAG(
    'mysqlConnTest',
    default_args=default_args,
    schedule_interval='@once',
    catchup=False) as dag:

    start_date = DummyOperator(task_id="start_task")

    # [START howto_operator_mysql]
    select_table_mysql_task = MySqlOperator(
        task_id='select_table_mysql',
        mysql_conn_id='mysql',
        sql="SELECT * FROM country;",
        autocommit=True,
        parameters={'ssl_mode': 'DISABLED'}
    )

    start_date >> select_table_mysql_task
and here is the error
*** Reading local file: /opt/airflow/logs/mysqlHookConnTest/select_table_mysql/2021-04-14T12:46:42.221662+00:00/2.log
[2021-04-14 12:47:46,791] {taskinstance.py:851} INFO - Dependencies all met for <TaskInstance: mysqlHookConnTest.select_table_mysql 2021-04-14T12:46:42.221662+00:00 [queued]>
[2021-04-14 12:47:47,007] {taskinstance.py:851} INFO - Dependencies all met for <TaskInstance: mysqlHookConnTest.select_table_mysql 2021-04-14T12:46:42.221662+00:00 [queued]>
[2021-04-14 12:47:47,047] {taskinstance.py:1042} INFO -
--------------------------------------------------------------------------------
[2021-04-14 12:47:47,054] {taskinstance.py:1043} INFO - Starting attempt 2 of 2
[2021-04-14 12:47:47,074] {taskinstance.py:1044} INFO -
--------------------------------------------------------------------------------
[2021-04-14 12:47:47,331] {taskinstance.py:1063} INFO - Executing <Task(MySqlOperator): select_table_mysql> on 2021-04-14T12:46:42.221662+00:00
[2021-04-14 12:47:47,377] {standard_task_runner.py:52} INFO - Started process 66 to run task
[2021-04-14 12:47:47,402] {standard_task_runner.py:76} INFO - Running: ['airflow', 'tasks', 'run', 'mysqlHookConnTest', 'select_table_mysql', '2021-04-14T12:46:42.221662+00:00', '--job-id', '142', '--pool', 'default_pool', '--raw', '--subdir', 'DAGS_FOLDER/MySqlHookConnTest.py', '--cfg-path', '/tmp/tmppujnrey3', '--error-file', '/tmp/tmpjl_g_p3t']
[2021-04-14 12:47:47,413] {standard_task_runner.py:77} INFO - Job 142: Subtask select_table_mysql
[2021-04-14 12:47:47,556] {logging_mixin.py:104} INFO - Running <TaskInstance: mysqlHookConnTest.select_table_mysql 2021-04-14T12:46:42.221662+00:00 [running]> on host ea95b9685a31
[2021-04-14 12:47:47,672] {taskinstance.py:1257} INFO - Exporting the following env vars:
AIRFLOW_CTX_DAG_OWNER=airflow
AIRFLOW_CTX_DAG_ID=mysqlHookConnTest
AIRFLOW_CTX_TASK_ID=select_table_mysql
AIRFLOW_CTX_EXECUTION_DATE=2021-04-14T12:46:42.221662+00:00
AIRFLOW_CTX_DAG_RUN_ID=manual__2021-04-14T12:46:42.221662+00:00
[2021-04-14 12:47:47,687] {mysql.py:72} INFO - Executing: SELECT idPais, Nombre, codigo, paisPlataforma, create_date, update_date FROM ob_cpanel.cpanel_pais;
[2021-04-14 12:47:47,710] {base.py:74} INFO - Using connection to: id: mysql. Host: sys-sql-pre-01.oneboxtickets.net, Port: 3306, Schema: , Login: lectura, Password: None, extra: None
[2021-04-14 12:47:48,134] {taskinstance.py:1455} ERROR - (2006, 'SSL connection error: error:1425F102:SSL routines:ssl_choose_client_version:unsupported protocol')
Traceback (most recent call last):
File "/home/airflow/.local/lib/python3.6/site-packages/airflow/models/taskinstance.py", line 1112, in _run_raw_task
self._prepare_and_execute_task_with_callbacks(context, task)
File "/home/airflow/.local/lib/python3.6/site-packages/airflow/models/taskinstance.py", line 1285, in _prepare_and_execute_task_with_callbacks
result = self._execute_task(context, task_copy)
File "/home/airflow/.local/lib/python3.6/site-packages/airflow/models/taskinstance.py", line 1315, in _execute_task
result = task_copy.execute(context=context)
File "/home/airflow/.local/lib/python3.6/site-packages/airflow/providers/mysql/operators/mysql.py", line 74, in execute
hook.run(self.sql, autocommit=self.autocommit, parameters=self.parameters)
File "/home/airflow/.local/lib/python3.6/site-packages/airflow/hooks/dbapi.py", line 173, in run
with closing(self.get_conn()) as conn:
File "/home/airflow/.local/lib/python3.6/site-packages/airflow/providers/mysql/hooks/mysql.py", line 144, in get_conn
return MySQLdb.connect(**conn_config)
File "/home/airflow/.local/lib/python3.6/site-packages/MySQLdb/__init__.py", line 85, in Connect
return Connection(*args, **kwargs)
File "/home/airflow/.local/lib/python3.6/site-packages/MySQLdb/connections.py", line 208, in __init__
super(Connection, self).__init__(*args, **kwargs2)
_mysql_exceptions.OperationalError: (2006, 'SSL connection error: error:1425F102:SSL routines:ssl_choose_client_version:unsupported protocol')
[2021-04-14 12:47:48,143] {taskinstance.py:1503} INFO - Marking task as FAILED. dag_id=mysqlHookConnTest, task_id=select_table_mysql, execution_date=20210414T124642, start_date=20210414T124746, end_date=20210414T124748
[2021-04-14 12:47:48,243] {local_task_job.py:146} INFO - Task exited with return code 1
We tried removing the last two parameters from the DAG code and instead added the following JSON to the connection's Extra field in the Airflow UI:
{"ssl": false}
and the issue reappears with another, similar error:
/opt/airflow/logs/mysqlOperatorConnTest/select_table_mysql/2021-04-15T11:26:50.578333+00:00/2.log
*** Fetching from: http://airflow-worker-0.airflow-worker.airflow.svc.cluster.local:8793/log/mysqlOperatorConnTest/select_table_mysql/2021-04-15T11:26:50.578333+00:00/2.log
[2021-04-15 11:27:54,471] {taskinstance.py:851} INFO - Dependencies all met for <TaskInstance: mysqlOperatorConnTest.select_table_mysql 2021-04-15T11:26:50.578333+00:00 [queued]>
[2021-04-15 11:27:54,497] {taskinstance.py:851} INFO - Dependencies all met for <TaskInstance: mysqlOperatorConnTest.select_table_mysql 2021-04-15T11:26:50.578333+00:00 [queued]>
[2021-04-15 11:27:54,497] {taskinstance.py:1042} INFO -
--------------------------------------------------------------------------------
[2021-04-15 11:27:54,497] {taskinstance.py:1043} INFO - Starting attempt 2 of 2
[2021-04-15 11:27:54,497] {taskinstance.py:1044} INFO -
--------------------------------------------------------------------------------
[2021-04-15 11:27:54,507] {taskinstance.py:1063} INFO - Executing <Task(MySqlOperator): select_table_mysql> on 2021-04-15T11:26:50.578333+00:00
[2021-04-15 11:27:54,510] {standard_task_runner.py:52} INFO - Started process 115 to run task
[2021-04-15 11:27:54,514] {standard_task_runner.py:76} INFO - Running: ['airflow', 'tasks', 'run', 'mysqlOperatorConnTest', 'select_table_mysql', '2021-04-15T11:26:50.578333+00:00', '--job-id', '68', '--pool', 'default_pool', '--raw', '--subdir', '/opt/airflow/dags/repo/MySqlOperatorConnTest.py', '--cfg-path', '/tmp/tmpy7bv58_z', '--error-file', '/tmp/tmpaoe808of']
[2021-04-15 11:27:54,514] {standard_task_runner.py:77} INFO - Job 68: Subtask select_table_mysql
[2021-04-15 11:27:54,644] {logging_mixin.py:104} INFO - Running <TaskInstance: mysqlOperatorConnTest.select_table_mysql 2021-04-15T11:26:50.578333+00:00 [running]> on host airflow-worker-0.airflow-worker.airflow.svc.cluster.local
[2021-04-15 11:27:54,707] {logging_mixin.py:104} WARNING - /opt/python/site-packages/sqlalchemy/sql/coercions.py:518 SAWarning: Coercing Subquery object into a select() for use in IN(); please pass a select() construct explicitly
[2021-04-15 11:27:54,725] {taskinstance.py:1255} INFO - Exporting the following env vars:
AIRFLOW_CTX_DAG_OWNER=airflow
AIRFLOW_CTX_DAG_ID=mysqlOperatorConnTest
AIRFLOW_CTX_TASK_ID=select_table_mysql
AIRFLOW_CTX_EXECUTION_DATE=2021-04-15T11:26:50.578333+00:00
AIRFLOW_CTX_DAG_RUN_ID=manual__2021-04-15T11:26:50.578333+00:00
[2021-04-15 11:27:54,726] {mysql.py:72} INFO - Executing: SELECT idPais, Nombre, codigo, paisPlataforma, create_date, update_date FROM ob_cpanel.cpanel_pais;
[2021-04-15 11:27:54,744] {connection.py:337} ERROR - Expecting value: line 2 column 9 (char 11)
Traceback (most recent call last):
File "/opt/python/site-packages/airflow/models/connection.py", line 335, in extra_dejson
obj = json.loads(self.extra)
File "/usr/local/lib/python3.8/json/__init__.py", line 357, in loads
return _default_decoder.decode(s)
File "/usr/local/lib/python3.8/json/decoder.py", line 337, in decode
obj, end = self.raw_decode(s, idx=_w(s, 0).end())
File "/usr/local/lib/python3.8/json/decoder.py", line 355, in raw_decode
raise JSONDecodeError("Expecting value", s, err.value) from None
json.decoder.JSONDecodeError: Expecting value: line 2 column 9 (char 11)
[2021-04-15 11:27:54,744] {connection.py:338} ERROR - Failed parsing the json for conn_id mysql
[2021-04-15 11:27:54,744] {base.py:65} INFO - Using connection to: id: mysql. Host: sys-sql-pre-01.oneboxtickets.net, Port: 3306, Schema: , Login: lectura, Password: XXXXXXXX, extra: None
[The identical JSONDecodeError traceback and "Failed parsing the json for conn_id mysql" error repeat seven more times in the log.]
[2021-04-15 11:27:54,787] {taskinstance.py:1455} ERROR - (2006, 'SSL connection error: error:1425F102:SSL routines:ssl_choose_client_version:unsupported protocol')
Traceback (most recent call last):
File "/opt/python/site-packages/airflow/models/taskinstance.py", line 1112, in _run_raw_task
self._prepare_and_execute_task_with_callbacks(context, task)
File "/opt/python/site-packages/airflow/models/taskinstance.py", line 1285, in _prepare_and_execute_task_with_callbacks
result = self._execute_task(context, task_copy)
File "/opt/python/site-packages/airflow/models/taskinstance.py", line 1315, in _execute_task
result = task_copy.execute(context=context)
File "/home/airflow/.local/lib/python3.8/site-packages/airflow/providers/mysql/operators/mysql.py", line 74, in execute
hook.run(self.sql, autocommit=self.autocommit, parameters=self.parameters)
File "/opt/python/site-packages/airflow/hooks/dbapi.py", line 173, in run
with closing(self.get_conn()) as conn:
File "/home/airflow/.local/lib/python3.8/site-packages/airflow/providers/mysql/hooks/mysql.py", line 144, in get_conn
return MySQLdb.connect(**conn_config)
File "/home/airflow/.local/lib/python3.8/site-packages/MySQLdb/__init__.py", line 85, in Connect
return Connection(*args, **kwargs)
File "/home/airflow/.local/lib/python3.8/site-packages/MySQLdb/connections.py", line 208, in __init__
super(Connection, self).__init__(*args, **kwargs2)
_mysql_exceptions.OperationalError: (2006, 'SSL connection error: error:1425F102:SSL routines:ssl_choose_client_version:unsupported protocol')
[2021-04-15 11:27:54,788] {taskinstance.py:1496} INFO - Marking task as FAILED. dag_id=mysqlOperatorConnTest, task_id=select_table_mysql, execution_date=20210415T112650, start_date=20210415T112754, end_date=20210415T112754
[2021-04-15 11:27:54,845] {local_task_job.py:146} INFO - Task exited with return code 1
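A side note on the log above: independent of the SSL failure, it repeatedly prints "Failed parsing the json for conn_id mysql", which means the Extra field of that Airflow connection is not valid JSON. A minimal sketch for inspecting the stored value, assuming direct access to the metadata DB through Airflow's own session (the conn_id "mysql" is taken from the log; everything else is illustrative):

import json
from airflow.models import Connection
from airflow.settings import Session

# Fetch the connection whose Extra field triggers the parse error.
session = Session()
conn = session.query(Connection).filter(Connection.conn_id == "mysql").first()
print(repr(conn.extra))

# json.loads() raises the same JSONDecodeError seen in the log if the
# stored value uses e.g. single quotes or a bare key=value string;
# re-saving the Extra field as valid JSON silences those errors.
json.loads(conn.extra)
session.close()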
We solved the SSL error by downgrading the MySQL client to 5.7. Our server version was 5.6 and the previous client was 8 (it came with the Docker image we were using), so we moved the client closer to the server version.
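If you want to confirm such a client/server mismatch before rebuilding an image, mysqlclient exposes both version strings. A quick sketch (the connection parameters are placeholders, substitute your own):

import MySQLdb  # the mysqlclient package used by Airflow's MySQL provider

# Placeholder credentials -- substitute your own.
conn = MySQLdb.connect(host="mysql.example.internal", user="airflow",
                       passwd="secret", db="airflow")

# A client far newer than the server (e.g. 8.x against 5.6) negotiates
# TLS versions the server does not support, which is what the
# "ssl_choose_client_version:unsupported protocol" error above means.
print("client:", MySQLdb.get_client_info())
print("server:", conn.get_server_info())
conn.close()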

Flask/uWSGI/MariaDB running under Supervisor stops working almost daily

I'm running a Flask app via uWSGI, which is managed by Supervisor (Nginx is the web server).
The whole setup seems to work perfectly; however, every day or two my Flask app starts throwing 500 errors (stemming from a SQLAlchemy db.commit() error), and I can't recover until I go in and reload uWSGI, after which everything works perfectly again. To reload, I typically kill the uWSGI processes and Supervisor brings them back. Once, I got it working by just 'touch'ing the wsgi file.
Also strangely, whenever I run Supervisor with processes=1, it loads TWO uWSGI instances. The first PID is the one logged as being operated by Supervisor; the other doesn't register anywhere. Regardless, if I kill the second process, it always comes back.
Below is the uWSGI startup command run by Supervisor:
[program:{{ app_name }}]
command=/usr/local/bin/uwsgi
--socket=/tmp/{{ app_name }}.sock
--logto={{ log_dir }}/application.log
--home={{ site_dir }}/env
--pythonpath={{ site_dir }}
--virtualenv={{ env_path }}
--wsgi-file={{ webapps_dir }}/manage.py
--callable={{ wsgi_callable }}
--max-requests=1000
--master
--processes=1
--uid={{ deployment_user }}
--gid=www-data
--chmod-socket=664
--chown-socket={{ deployment_user }}:www-data
directory={{ webapps_dir }}
autostart=true
autorestart=true
EDIT: Here are the errors spat out by the server:
2015-03-19 15:03:38,389 ERROR: Exception on /api/v1/account/login [POST] [in /var/www/website.com/env/local/lib/python2.7/site-packages/flask/app.py:1423]
Traceback (most recent call last):
File "/var/www/website.com/env/local/lib/python2.7/site-packages/flask/app.py", line 1817, in wsgi_app
response = self.full_dispatch_request()
File "/var/www/website.com/env/local/lib/python2.7/site-packages/flask/app.py", line 1477, in full_dispatch_request
rv = self.handle_user_exception(e)
File "/var/www/website.com/env/local/lib/python2.7/site-packages/flask_cors.py", line 272, in wrapped_function
return cors_after_request(app.make_response(f(*args, **kwargs)))
File "/var/www/website.com/env/local/lib/python2.7/site-packages/flask/app.py", line 1381, in handle_user_exception
reraise(exc_type, exc_value, tb)
File "/var/www/website.com/env/local/lib/python2.7/site-packages/flask/app.py", line 1475, in full_dispatch_request
rv = self.dispatch_request()
File "/var/www/website.com/env/local/lib/python2.7/site-packages/flask/app.py", line 1461, in dispatch_request
return self.view_functions[rule.endpoint](**req.view_args)
File "/var/www/website.com/env/local/lib/python2.7/site-packages/flask_classy.py", line 200, in proxy
response = view(**request.view_args)
File "./webportal/account/api.py", line 34, in login_user
user = auth.authenticate(username=email, password=password)
File "/var/www/website.com/server/webportal/auth.py", line 23, in authenticate
user = User.query.filter_by(email=username).first()
File "/var/www/website.com/env/local/lib/python2.7/site-packages/sqlalchemy/orm/query.py", line 2367, in first
ret = list(self[0:1])
File "/var/www/website.com/env/local/lib/python2.7/site-packages/sqlalchemy/orm/query.py", line 2228, in __getitem__
return list(res)
File "/var/www/website.com/env/local/lib/python2.7/site-packages/sqlalchemy/orm/query.py", line 2438, in __iter__
return self._execute_and_instances(context)
File "/var/www/website.com/env/local/lib/python2.7/site-packages/sqlalchemy/orm/query.py", line 2453, in _execute_and_instances
result = conn.execute(querycontext.statement, self._params)
File "/var/www/website.com/env/local/lib/python2.7/site-packages/sqlalchemy/engine/base.py", line 729, in execute
return meth(self, multiparams, params)
File "/var/www/website.com/env/local/lib/python2.7/site-packages/sqlalchemy/sql/elements.py", line 322, in _execute_on_connection
return connection._execute_clauseelement(self, multiparams, params)
File "/var/www/website.com/env/local/lib/python2.7/site-packages/sqlalchemy/engine/base.py", line 826, in _execute_clauseelement
compiled_sql, distilled_params
File "/var/www/website.com/env/local/lib/python2.7/site-packages/sqlalchemy/engine/base.py", line 958, in _execute_context
context)
File "/var/www/website.com/env/local/lib/python2.7/site-packages/sqlalchemy/engine/base.py", line 1159, in _handle_dbapi_exception
exc_info
File "/var/www/website.com/env/local/lib/python2.7/site-packages/sqlalchemy/util/compat.py", line 199, in raise_from_cause
reraise(type(exception), exception, tb=exc_tb)
File "/var/www/website.com/env/local/lib/python2.7/site-packages/sqlalchemy/engine/base.py", line 951, in _execute_context
context)
File "/var/www/website.com/env/local/lib/python2.7/site-packages/sqlalchemy/engine/default.py", line 436, in do_execute
cursor.execute(statement, parameters)
File "/var/www/website.com/env/local/lib/python2.7/site-packages/MySQLdb/cursors.py", line 205, in execute
self.errorhandler(self, exc, value)
File "/var/www/website.com/env/local/lib/python2.7/site-packages/MySQLdb/connections.py", line 36, in defaulterrorhandler
raise errorclass, errorvalue
OperationalError: (OperationalError) (2006, 'MySQL server has gone away') 'SELECT `Users`.id AS `Users_id`, `Users`.email AS `Users_email`, `Users`.password AS `Users_password`, `Users`.`emailConfirmed` AS `Users_emailConfirmed`, `Users`.`createdAtUtc` AS `Users_createdAtUtc`, `Users`.`updatedAtUtc` AS `Users_updatedAtUtc`, `Users`.`roleId` AS `Users_roleId` \nFROM `Users` \nWHERE `Users`.email = %s \n LIMIT %s' ('user#website.demo', 1)
2015-03-19 15:06:22,089 ERROR: Exception on /api/v1/account/login [POST] [in /var/www/website.com/env/local/lib/python2.7/site-packages/flask/app.py:1423]
Traceback (most recent call last):
File "/var/www/website.com/env/local/lib/python2.7/site-packages/flask/app.py", line 1817, in wsgi_app
response = self.full_dispatch_request()
File "/var/www/website.com/env/local/lib/python2.7/site-packages/flask/app.py", line 1477, in full_dispatch_request
rv = self.handle_user_exception(e)
File "/var/www/website.com/env/local/lib/python2.7/site-packages/flask_cors.py", line 272, in wrapped_function
return cors_after_request(app.make_response(f(*args, **kwargs)))
File "/var/www/website.com/env/local/lib/python2.7/site-packages/flask/app.py", line 1381, in handle_user_exception
reraise(exc_type, exc_value, tb)
File "/var/www/website.com/env/local/lib/python2.7/site-packages/flask/app.py", line 1475, in full_dispatch_request
rv = self.dispatch_request()
File "/var/www/website.com/env/local/lib/python2.7/site-packages/flask/app.py", line 1461, in dispatch_request
return self.view_functions[rule.endpoint](**req.view_args)
File "/var/www/website.com/env/local/lib/python2.7/site-packages/flask_classy.py", line 200, in proxy
response = view(**request.view_args)
File "./webportal/account/api.py", line 34, in login_user
user = auth.authenticate(username=email, password=password)
File "/var/www/website.com/server/webportal/auth.py", line 23, in authenticate
user = User.query.filter_by(email=username).first()
File "/var/www/website.com/env/local/lib/python2.7/site-packages/sqlalchemy/orm/query.py", line 2367, in first
ret = list(self[0:1])
File "/var/www/website.com/env/local/lib/python2.7/site-packages/sqlalchemy/orm/query.py", line 2228, in __getitem__
return list(res)
File "/var/www/website.com/env/local/lib/python2.7/site-packages/sqlalchemy/orm/query.py", line 2438, in __iter__
return self._execute_and_instances(context)
File "/var/www/website.com/env/local/lib/python2.7/site-packages/sqlalchemy/orm/query.py", line 2453, in _execute_and_instances
result = conn.execute(querycontext.statement, self._params)
File "/var/www/website.com/env/local/lib/python2.7/site-packages/sqlalchemy/engine/base.py", line 729, in execute
return meth(self, multiparams, params)
File "/var/www/website.com/env/local/lib/python2.7/site-packages/sqlalchemy/sql/elements.py", line 322, in _execute_on_connection
return connection._execute_clauseelement(self, multiparams, params)
File "/var/www/website.com/env/local/lib/python2.7/site-packages/sqlalchemy/engine/base.py", line 826, in _execute_clauseelement
compiled_sql, distilled_params
File "/var/www/website.com/env/local/lib/python2.7/site-packages/sqlalchemy/engine/base.py", line 893, in _execute_context
None, None)
File "/var/www/website.com/env/local/lib/python2.7/site-packages/sqlalchemy/engine/base.py", line 1159, in _handle_dbapi_exception
exc_info
File "/var/www/website.com/env/local/lib/python2.7/site-packages/sqlalchemy/util/compat.py", line 199, in raise_from_cause
reraise(type(exception), exception, tb=exc_tb)
File "/var/www/website.com/env/local/lib/python2.7/site-packages/sqlalchemy/engine/base.py", line 887, in _execute_context
conn = self._revalidate_connection()
File "/var/www/website.com/env/local/lib/python2.7/site-packages/sqlalchemy/engine/base.py", line 242, in _revalidate_connection
"Can't reconnect until invalid "
StatementError: Can't reconnect until invalid transaction is rolled back (original cause: InvalidRequestError: Can't reconnect until invalid transaction is rolled back) u'SELECT `Users`.id AS `Users_id`, `Users`.email AS `Users_email`, `Users`.password AS `Users_password`, `Users`.`emailConfirmed` AS `Users_emailConfirmed`, `Users`.`createdAtUtc` AS `Users_createdAtUtc`, `Users`.`updatedAtUtc` AS `Users_updatedAtUtc`, `Users`.`roleId` AS `Users_roleId` \nFROM `Users` \nWHERE `Users`.email = %s \n LIMIT %s' [immutabledict({})]
According to https://dev.mysql.com/doc/refman/5.0/en/gone-away.html, MySQL closes idle connections after 8 hours (the default wait_timeout)... Once that happens, the app gets into a state it can't recover from on its own.
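A common mitigation, sketched below assuming Flask-SQLAlchemy (the database URI and the 3600-second value are placeholders), is to recycle pooled connections before MySQL's wait_timeout closes them server-side, and to roll the session back when a request fails so a dropped connection doesn't leave behind the "invalid transaction" error from the second traceback:

from flask import Flask
from flask_sqlalchemy import SQLAlchemy

app = Flask(__name__)
app.config["SQLALCHEMY_DATABASE_URI"] = "mysql://user:password@localhost/webportal"  # placeholder
# Recycle pooled connections well before MySQL's default 8-hour
# wait_timeout, so SQLAlchemy never hands out one the server closed.
app.config["SQLALCHEMY_POOL_RECYCLE"] = 3600

db = SQLAlchemy(app)

@app.teardown_request
def rollback_on_error(exc):
    # If a request died mid-transaction (e.g. "server has gone away"),
    # roll the session back so the pool does not keep serving the
    # invalid transaction on subsequent requests.
    if exc is not None:
        db.session.rollback()

On SQLAlchemy 1.2 or newer, enabling pool_pre_ping on the engine achieves the same thing more robustly by testing each connection as it is checked out of the pool.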

Resources