Airflow BashOperator UnicodeEncodeError

I'm using Airflow 1.10.0 on Python 3.5 and I'm hitting a UnicodeEncodeError in the logging.
The operator uses the default output_encoding, which is already utf-8.
task_compile = BashOperator(
    task_id='task_compile',
    retries=1,
    retry_delay=timedelta(minutes=5),
    bash_command='/root/docker/tools/compile.sh',
    dag=dag
)
task_compile.set_downstream(task_last)
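For clarity, the same operator with the encoding spelled out; output_encoding='utf-8' only restates the default mentioned above (a sketch, not a fix by itself):
task_compile = BashOperator(
    task_id='task_compile',
    bash_command='/root/docker/tools/compile.sh',
    output_encoding='utf-8',  # already the default for BashOperator
    dag=dag
)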
The shell script spins up a Docker container and runs composer install. I tested another simple composer install task and nothing failed; the error only occurs with a certain set of dependencies. As shown in the stack trace, the module responsible for the exception is file_task_handler.py, when it emits the line to be written to the log file.
[2018-09-19 20:42:18,708] {bash_operator.py:111} INFO - Package operations: 134 installs, 0 updates, 0 removals
[2018-09-19 20:42:18,790] {bash_operator.py:111} INFO - - Installing ocramius/package-versions (1.3.0): Downloading (100%)
[2018-09-19 20:42:18,850] {bash_operator.py:111} INFO - - Installing symfony/flex (v1.1.1): Downloading (100%)
[2018-09-19 20:42:18,897] {bash_operator.py:111} INFO -
[2018-09-19 16:12:51,554] {logging_mixin.py:95} WARNING - --- Logging error ---
[2018-09-19 16:12:51,555] {logging_mixin.py:95} WARNING - Traceback (most recent call last):
[2018-09-19 16:12:51,555] {logging_mixin.py:95} WARNING - File "/usr/lib/python3.5/logging/__init__.py", line 983, in emit
stream.write(msg)
[2018-09-19 16:12:51,555] {logging_mixin.py:95} WARNING - UnicodeEncodeError: 'ascii' codec can't encode character '\U0001f3b6' in position 81: ordinal not in range(128)
[2018-09-19 16:12:51,555] {logging_mixin.py:95} WARNING - Call stack:
[2018-09-19 16:12:51,557] {logging_mixin.py:95} WARNING - File "/usr/local/bin/airflow", line 32, in <module>
args.func(args)
[2018-09-19 16:12:51,557] {logging_mixin.py:95} WARNING - File "/usr/local/lib/python3.5/dist-packages/airflow/utils/cli.py", line 74, in wrapper
return f(*args, **kwargs)
[2018-09-19 16:12:51,557] {logging_mixin.py:95} WARNING - File "/usr/local/lib/python3.5/dist-packages/airflow/bin/cli.py", line 498, in run
_run(args, dag, ti)
[2018-09-19 16:12:51,558] {logging_mixin.py:95} WARNING - File "/usr/local/lib/python3.5/dist-packages/airflow/bin/cli.py", line 402, in _run
pool=args.pool,
[2018-09-19 16:12:51,558] {logging_mixin.py:95} WARNING - File "/usr/local/lib/python3.5/dist-packages/airflow/utils/db.py", line 74, in wrapper
return func(*args, **kwargs)
[2018-09-19 16:12:51,558] {logging_mixin.py:95} WARNING - File "/usr/local/lib/python3.5/dist-packages/airflow/models.py", line 1633, in _run_raw_task
result = task_copy.execute(context=context)
[2018-09-19 16:12:51,558] {logging_mixin.py:95} WARNING - File "/usr/local/lib/python3.5/dist-packages/airflow/operators/bash_operator.py", line 110, in execute
self.log.info(line)
[2018-09-19 16:12:51,558] {logging_mixin.py:95} WARNING - File "/usr/lib/python3.5/logging/__init__.py", line 1280, in info
self._log(INFO, msg, args, **kwargs)
[2018-09-19 16:12:51,558] {logging_mixin.py:95} WARNING - File "/usr/lib/python3.5/logging/__init__.py", line 1416, in _log
self.handle(record)
[2018-09-19 16:12:51,558] {logging_mixin.py:95} WARNING - File "/usr/lib/python3.5/logging/__init__.py", line 1426, in handle
self.callHandlers(record)
[2018-09-19 16:12:51,558] {logging_mixin.py:95} WARNING - File "/usr/lib/python3.5/logging/__init__.py", line 1488, in callHandlers
hdlr.handle(record)
[2018-09-19 16:12:51,558] {logging_mixin.py:95} WARNING - File "/usr/lib/python3.5/logging/__init__.py", line 856, in handle
self.emit(record)
[2018-09-19 16:12:51,558] {logging_mixin.py:95} WARNING - File "/usr/local/lib/python3.5/dist-packages/airflow/utils/log/file_task_handler.py", line 61, in emit
self.handler.emit(record)

The issue is that the locale in the container is not set to UTF-8.
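A quick way to confirm (a sketch; run it in the same environment the task executes in):
# Show the active locale; "POSIX"/"C" or an empty LANG means Python
# may fall back to the ASCII codec for stdout and log streams
locale
python3 -c "import sys; print(sys.stdout.encoding)"
# Any UTF-8 locale fixes it, e.g.:
export LANG=C.UTF-8 LC_ALL=C.UTF-8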

I faced a similar issue and was able to resolve it by adding the environment variable LANG=en_US.UTF-8 to the supervisord configuration and restarting supervisord.
I use supervisor to start the Airflow scheduler, webserver and Flower.
Note: this environment variable needs to be added on all the Airflow worker nodes as well.
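For reference, a minimal sketch of such a supervisord program entry (the program name and command are illustrative; the environment line is the actual fix):
[program:airflow-scheduler]
command=airflow scheduler
; Give the process a UTF-8 locale so Python's logging handlers
; don't fall back to the ASCII codec
environment=LANG="en_US.UTF-8",LC_ALL="en_US.UTF-8"
autorestart=true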

Related

Airflow scheduler gets stuck

I have set up Airflow on Windows 10 WSL with Python 3.6.8. I started the scheduler with the airflow scheduler command, but it produced the following error:
[2020-04-28 19:24:06,500] {base_job.py:205} ERROR - SchedulerJob heartbeat got an exception
Traceback (most recent call last):
File "/home/akshay/.local/lib/python3.6/site-packages/sqlalchemy/engine/base.py", line 755, in _commit_impl
self.engine.dialect.do_commit(self.connection)
File "/home/akshay/.local/lib/python3.6/site-packages/sqlalchemy/engine/default.py", line 543, in do_commit
dbapi_connection.commit()
sqlite3.OperationalError: disk I/O error
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/home/akshay/.local/lib/python3.6/site-packages/airflow/jobs/base_job.py", line 173, in heartbeat
previous_heartbeat = self.latest_heartbeat
File "/usr/lib/python3.6/contextlib.py", line 88, in __exit__
next(self.gen)
File "/home/akshay/.local/lib/python3.6/site-packages/airflow/utils/db.py", line 45, in create_session
session.commit()
File "/home/akshay/.local/lib/python3.6/site-packages/sqlalchemy/orm/session.py", line 1036, in commit
self.transaction.commit()
File "/home/akshay/.local/lib/python3.6/site-packages/sqlalchemy/orm/session.py", line 507, in commit
t[1].commit()
File "/home/akshay/.local/lib/python3.6/site-packages/sqlalchemy/engine/base.py", line 1736, in commit
self._do_commit()
File "/home/akshay/.local/lib/python3.6/site-packages/sqlalchemy/engine/base.py", line 1767, in _do_commit
self.connection._commit_impl()
File "/home/akshay/.local/lib/python3.6/site-packages/sqlalchemy/engine/base.py", line 757, in _commit_impl
self._handle_dbapi_exception(e, None, None, None, None)
File "/home/akshay/.local/lib/python3.6/site-packages/sqlalchemy/engine/base.py", line 1482, in _handle_dbapi_exception
sqlalchemy_exception, with_traceback=exc_info[2], from_=e
File "/home/akshay/.local/lib/python3.6/site-packages/sqlalchemy/util/compat.py", line 178, in raise_
raise exception
File "/home/akshay/.local/lib/python3.6/site-packages/sqlalchemy/engine/base.py", line 755, in _commit_impl
self.engine.dialect.do_commit(self.connection)
File "/home/akshay/.local/lib/python3.6/site-packages/sqlalchemy/engine/default.py", line 543, in do_commit
dbapi_connection.commit()
sqlalchemy.exc.OperationalError: (sqlite3.OperationalError) disk I/O error
(Background on this error at: http://sqlalche.me/e/e3q8)
What is the reason for this failure, and what can I do to resolve it? My Airflow scheduler had been running fine for the last 10 days.

airflow trigger_dag command throwing error

I am executing the airflow trigger_dag cng-hello_world command on the Airflow server and it results in the error below. Please suggest a fix.
I followed this link: http://michal.karzynski.pl/blog/2017/03/19/developing-workflows-with-apache-airflow/
The same DAG runs fine when triggered via the Airflow UI.
[2019-02-06 11:57:41,755] {settings.py:174} INFO - setting.configure_orm(): Using pool settings. pool_size=5, pool_recycle=2000
[2019-02-06 11:57:43,326] {plugins_manager.py:97} ERROR - invalid syntax (airflow_api.py, line 7)
Traceback (most recent call last):
File "/usr/lib/python2.7/site-packages/airflow/plugins_manager.py", line 86, in <module>
m = imp.load_source(namespace, filepath)
File "/home/ec2-user/airflow/plugins/airflow_api.py", line 7
<!DOCTYPE html>
^
SyntaxError: invalid syntax
[2019-02-06 11:57:43,326] {plugins_manager.py:98} ERROR - Failed to import plugin /home/ec2-user/airflow/plugins/airflow_api.py
[2019-02-06 11:57:43,326] {plugins_manager.py:97} ERROR - invalid syntax (__init__.py, line 7)
Traceback (most recent call last):
File "/usr/lib/python2.7/site-packages/airflow/plugins_manager.py", line 86, in <module>
m = imp.load_source(namespace, filepath)
File "/home/ec2-user/airflow/plugins/__init__.py", line 7
<!DOCTYPE html>
^
SyntaxError: invalid syntax
[2019-02-06 11:57:43,327] {plugins_manager.py:98} ERROR - Failed to import plugin /home/ec2-user/airflow/plugins/__init__.py
[2019-02-06 11:57:47,236] {__init__.py:51} INFO - Using executor CeleryExecutor
[2019-02-06 11:57:48,420] {models.py:258} INFO - Filling up the DagBag from /home/ec2-user/airflow/dags
[2019-02-06 11:57:48,783] {cli.py:237} INFO - Created <DagRun cng-hello_world @ 2019-02-06 11:57:48+00:00: manual__2019-02-06T11:57:48+00:00, externally triggered: True>

createrepo does not delete the .repodata directory automatically

Command
createrepo .
Output
(process:2560): GLib-CRITICAL **: g_timer_stop: assertion 'timer != NULL' failed
(process:2560): GLib-CRITICAL **: g_timer_destroy: assertion 'timer != NULL' failed
Could not remove temp metadata dir: .repodata
Error was [Errno 39] Directory not empty: '/app/run/local_repo/application/test/./.repodata'
Please clean up this directory manually.
Traceback (most recent call last):
File "/usr/share/createrepo/genpkgmetadata.py", line 309, in <module>
main(sys.argv[1:])
File "/usr/share/createrepo/genpkgmetadata.py", line 274, in main
mdgen.doRepoMetadata()
File "/usr/lib/python2.7/site-packages/createrepo/__init__.py", line 1014, in doRepoMetadata
gen_func(complete_path, csum)
File "/usr/lib64/python2.7/site-packages/sqlitecachec.py", line 61, in getOtherdata
self.repoid))
TypeError: Can not create db_info table: disk I/O error
We have tried the following command to clean the cache:
yum clean all
The above issue was due to a disk error on the virtual machine: the disk had become corrupted.
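A quick way to confirm this kind of failure (a sketch; /dev/sdX is a placeholder, and fsck must be run on an unmounted filesystem):
# Look for kernel-level I/O errors behind the "disk I/O error"
dmesg | grep -i "i/o error"
# Then check the affected filesystem after unmounting it
fsck /dev/sdX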

PyInstaller - OSError: [Errno 2] No such file or directory

I run into "OSError: [Errno 2] No such file or directory" while running pyinstaller. Could someone point out what needs to be installed or done to solve it?
Below is the error message.
root@mylinkit:/usr# pyinstaller t123.py
2999 INFO: PyInstaller: 3.2.1
3002 INFO: Python: 2.7.12
3013 INFO: Platform: Linux-3.18.44-mips-with-glibc2.0
3026 INFO: wrote /usr/t123.spec
3069 INFO: UPX is not available.
3089 INFO: Extending PYTHONPATH with paths
['/usr', '/usr']
3092 INFO: checking Analysis
3258 INFO: checking PYZ
3346 INFO: checking PKG
3356 INFO: Bootloader /usr/lib/python2.7/site-packages/PyInstaller/bootloader/Linux-32bit/run
3358 INFO: checking EXE
3360 INFO: Building EXE because out00-EXE.toc is non existent
3362 INFO: Building EXE from out00-EXE.toc
3575 INFO: Appending archive to ELF section in EXE /usr/build/t123/t123
Traceback (most recent call last):
File "/usr/bin/pyinstaller", line 9, in <module>
load_entry_point('PyInstaller==3.2.1', 'console_scripts', 'pyinstaller')()
File "/usr/lib/python2.7/site-packages/PyInstaller/__main__.py", line 90, in run
run_build(pyi_config, spec_file, **vars(args))
File "/usr/lib/python2.7/site-packages/PyInstaller/__main__.py", line 46, in run_build
PyInstaller.building.build_main.main(pyi_config, spec_file, **kwargs)
File "/usr/lib/python2.7/site-packages/PyInstaller/building/build_main.py", line 788, in main
build(specfile, kw.get('distpath'), kw.get('workpath'), kw.get('clean_build'))
File "/usr/lib/python2.7/site-packages/PyInstaller/building/build_main.py", line 734, in build
exec(text, spec_namespace)
File "<string>", line 26, in <module>
File "/usr/lib/python2.7/site-packages/PyInstaller/building/api.py", line 411, in __init__
self.__postinit__()
File "/usr/lib/python2.7/site-packages/PyInstaller/building/datastruct.py", line 161, in __postinit__
self.assemble()
File "/usr/lib/python2.7/site-packages/PyInstaller/building/api.py", line 563, in assemble
self.name)
File "/usr/lib/python2.7/site-packages/PyInstaller/compat.py", line 486, in exec_command_all
stdout=subprocess.PIPE, stderr=subprocess.PIPE, **kwargs)
File "/usr/lib/python2.7/subprocess.py", line 711, in __init__
errread, errwrite)
File "/usr/lib/python2.7/subprocess.py", line 1343, in _execute_child
raise child_exception
OSError: [Errno 2] No such file or directory
root@mylinkit:/usr#
I've done some investigation and I believe binutils may be the missing dependency:
apt-get install binutils
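That would fit the traceback: the failure happens at the "Appending archive to ELF section" step, where PyInstaller launches an external tool via subprocess (objcopy from binutils, as far as I can tell for this code path), and subprocess raises [Errno 2] when the executable it tries to run doesn't exist. A quick check (a sketch):
# If this prints nothing, the binutils tools are missing from PATH
which objcopy objdump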

Plone 4 buildout error

Does anyone know how I can fix this error that I get when I do a buildout?
An internal error occurred due to a bug in either zc.buildout or in a
recipe being used:
Traceback (most recent call last):
File "/usr/local/Plone/buildout-cache/eggs/zc.buildout-1.7.1-py2.7.egg/zc/buildout/buildout.py", line 1866, in main
getattr(buildout, command)(args)
File "/usr/local/Plone/buildout-cache/eggs/zc.buildout-1.7.1-py2.7.egg/zc/buildout/buildout.py", line 625, in install
installed_files = self[part]._call(recipe.install)
File "/usr/local/Plone/buildout-cache/eggs/zc.buildout-1.7.1-py2.7.egg/zc/buildout/buildout.py", line 1345, in _call
return f()
File "/usr/local/Plone/buildout-cache/eggs/plone.recipe.precompiler-0.6-py2.7.egg/plone/recipe/precompiler/__init__.py", line 29, in install
return self._run()
File "/usr/local/Plone/buildout-cache/eggs/plone.recipe.precompiler-0.6-py2.7.egg/plone/recipe/precompiler/__init__.py", line 35, in _run
self._compile_eggs()
File "/usr/local/Plone/buildout-cache/eggs/plone.recipe.precompiler-0.6-py2.7.egg/plone/recipe/precompiler/__init__.py", line 67, in _compile_eggs
py_compile.compile(fn, None, None, True)
File "/usr/lib/python2.7/py_compile.py", line 123, in compile
with open(cfile, 'wb') as fc:
IOError: [Errno 13] Permission denied: '/usr/local/Plone/zeocluster/products/MyScriptModules/__init__.pyc'
Remove the file that causes the problem, which is:
/usr/local/Plone/zeocluster/products/MyScriptModules/__init__.pyc
Then re-run buildout as the user that removed the file.
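In other words (the buildout invocation is illustrative; use whatever you normally run):
# Remove the stale .pyc that buildout cannot overwrite
rm /usr/local/Plone/zeocluster/products/MyScriptModules/__init__.pyc
# Re-run buildout as the same (non-root) user
bin/buildout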
