ResourceInError: resources.bigip_instance: Went to status ERROR due to "Message: 'NoneType' object is not iterable, Code: 500" - openstack

I am using OpenStack (Ocata).
I am intermittently facing this error: Resource CREATE failed: ResourceInError: resources.bigip_instance: Went to status ERROR due to "Message: 'NoneType' object is not iterable, Code: 500".
We need your help with this. Here are the logs from the nova-compute container.
2021-11-04 02:17:03.051 6 INFO nova.virt.libvirt.driver [req-102ca315-27e6-42db-b6f7-96957b9957b4 171e3f22877942e7a6c3a2839fd12721 b706dec01b2e470982a34a2c8f0c17b6 - - -] [instance: adc77e69-ead0-4e10-9587-cae31e2d904a] Creating image
2021-11-04 02:17:03.092 6 ERROR nova.image.glance [req-102ca315-27e6-42db-b6f7-96957b9957b4 171e3f22877942e7a6c3a2839fd12721 b706dec01b2e470982a34a2c8f0c17b6 - - -] Error writing to /var/lib/nova/instances/_base/5b6cf3f5d3319fa96ce0e1537e7abcd5c59fe8d3.part: 'NoneType' object is not iterable
2021-11-04 02:17:03.093 6 ERROR nova.compute.manager [req-102ca315-27e6-42db-b6f7-96957b9957b4 171e3f22877942e7a6c3a2839fd12721 b706dec01b2e470982a34a2c8f0c17b6 - - -] [instance: adc77e69-ead0-4e10-9587-cae31e2d904a] Instance failed to spawn
2021-11-04 02:17:03.093 6 ERROR nova.compute.manager [instance: adc77e69-ead0-4e10-9587-cae31e2d904a] Traceback (most recent call last):
2021-11-04 02:17:03.093 6 ERROR nova.compute.manager [instance: adc77e69-ead0-4e10-9587-cae31e2d904a] File "/usr/lib/python2.7/dist-packages/nova/compute/manager.py", line 2133, in _build_resources
2021-11-04 02:17:03.093 6 ERROR nova.compute.manager [instance: adc77e69-ead0-4e10-9587-cae31e2d904a] yield resources
2021-11-04 02:17:03.093 6 ERROR nova.compute.manager [instance: adc77e69-ead0-4e10-9587-cae31e2d904a] File "/usr/lib/python2.7/dist-packages/nova/compute/manager.py", line 1939, in _build_and_run_instance
2021-11-04 02:17:03.093 6 ERROR nova.compute.manager [instance: adc77e69-ead0-4e10-9587-cae31e2d904a] block_device_info=block_device_info)
2021-11-04 02:17:03.093 6 ERROR nova.compute.manager [instance: adc77e69-ead0-4e10-9587-cae31e2d904a] File "/usr/lib/python2.7/dist-packages/nova/virt/libvirt/driver.py", line 2786, in spawn
2021-11-04 02:17:03.093 6 ERROR nova.compute.manager [instance: adc77e69-ead0-4e10-9587-cae31e2d904a] block_device_info=block_device_info)
2021-11-04 02:17:03.093 6 ERROR nova.compute.manager [instance: adc77e69-ead0-4e10-9587-cae31e2d904a] File "/usr/lib/python2.7/dist-packages/nova/virt/libvirt/driver.py", line 3193, in _create_image
2021-11-04 02:17:03.093 6 ERROR nova.compute.manager [instance: adc77e69-ead0-4e10-9587-cae31e2d904a] fallback_from_host)
2021-11-04 02:17:03.093 6 ERROR nova.compute.manager [instance: adc77e69-ead0-4e10-9587-cae31e2d904a] File "/usr/lib/python2.7/dist-packages/nova/virt/libvirt/driver.py", line 3309, in _create_and_inject_local_root
2021-11-04 02:17:03.093 6 ERROR nova.compute.manager [instance: adc77e69-ead0-4e10-9587-cae31e2d904a] instance, size, fallback_from_host)
2021-11-04 02:17:03.093 6 ERROR nova.compute.manager [instance: adc77e69-ead0-4e10-9587-cae31e2d904a] File "/usr/lib/python2.7/dist-packages/nova/virt/libvirt/driver.py", line 6988, in _try_fetch_image_cache
2021-11-04 02:17:03.093 6 ERROR nova.compute.manager [instance: adc77e69-ead0-4e10-9587-cae31e2d904a] size=size)
2021-11-04 02:17:03.093 6 ERROR nova.compute.manager [instance: adc77e69-ead0-4e10-9587-cae31e2d904a] File "/usr/lib/python2.7/dist-packages/nova/virt/libvirt/imagebackend.py", line 242, in cache
2021-11-04 02:17:03.093 6 ERROR nova.compute.manager [instance: adc77e69-ead0-4e10-9587-cae31e2d904a] *args, **kwargs)
2021-11-04 02:17:03.093 6 ERROR nova.compute.manager [instance: adc77e69-ead0-4e10-9587-cae31e2d904a] File "/usr/lib/python2.7/dist-packages/nova/virt/libvirt/imagebackend.py", line 584, in create_image
2021-11-04 02:17:03.093 6 ERROR nova.compute.manager [instance: adc77e69-ead0-4e10-9587-cae31e2d904a] prepare_template(target=base, *args, **kwargs)
2021-11-04 02:17:03.093 6 ERROR nova.compute.manager [instance: adc77e69-ead0-4e10-9587-cae31e2d904a] File "/usr/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py", line 274, in inner
2021-11-04 02:17:03.093 6 ERROR nova.compute.manager [instance: adc77e69-ead0-4e10-9587-cae31e2d904a] return f(*args, **kwargs)
2021-11-04 02:17:03.093 6 ERROR nova.compute.manager [instance: adc77e69-ead0-4e10-9587-cae31e2d904a] File "/usr/lib/python2.7/dist-packages/nova/virt/libvirt/imagebackend.py", line 238, in fetch_func_sync
2021-11-04 02:17:03.093 6 ERROR nova.compute.manager [instance: adc77e69-ead0-4e10-9587-cae31e2d904a] fetch_func(target=target, *args, **kwargs)
2021-11-04 02:17:03.093 6 ERROR nova.compute.manager [instance: adc77e69-ead0-4e10-9587-cae31e2d904a] File "/usr/lib/python2.7/dist-packages/nova/virt/libvirt/utils.py", line 458, in fetch_image
2021-11-04 02:17:03.093 6 ERROR nova.compute.manager [instance: adc77e69-ead0-4e10-9587-cae31e2d904a] images.fetch_to_raw(context, image_id, target)
2021-11-04 02:17:03.093 6 ERROR nova.compute.manager [instance: adc77e69-ead0-4e10-9587-cae31e2d904a] File "/usr/lib/python2.7/dist-packages/nova/virt/images.py", line 132, in fetch_to_raw
2021-11-04 02:17:03.093 6 ERROR nova.compute.manager [instance: adc77e69-ead0-4e10-9587-cae31e2d904a] fetch(context, image_href, path_tmp)
2021-11-04 02:17:03.093 6 ERROR nova.compute.manager [instance: adc77e69-ead0-4e10-9587-cae31e2d904a] File "/usr/lib/python2.7/dist-packages/nova/virt/images.py", line 123, in fetch
2021-11-04 02:17:03.093 6 ERROR nova.compute.manager [instance: adc77e69-ead0-4e10-9587-cae31e2d904a] IMAGE_API.download(context, image_href, dest_path=path)
2021-11-04 02:17:03.093 6 ERROR nova.compute.manager [instance: adc77e69-ead0-4e10-9587-cae31e2d904a] File "/usr/lib/python2.7/dist-packages/nova/image/api.py", line 184, in download
2021-11-04 02:17:03.093 6 ERROR nova.compute.manager [instance: adc77e69-ead0-4e10-9587-cae31e2d904a] dst_path=dest_path)
2021-11-04 02:17:03.093 6 ERROR nova.compute.manager [instance: adc77e69-ead0-4e10-9587-cae31e2d904a] File "/usr/lib/python2.7/dist-packages/nova/image/glance.py", line 602, in download
2021-11-04 02:17:03.093 6 ERROR nova.compute.manager [instance: adc77e69-ead0-4e10-9587-cae31e2d904a] {'path': dst_path, 'exception': ex})
2021-11-04 02:17:03.093 6 ERROR nova.compute.manager [instance: adc77e69-ead0-4e10-9587-cae31e2d904a] File "/usr/lib/python2.7/dist-packages/oslo_utils/excutils.py", line 220, in __exit__
2021-11-04 02:17:03.093 6 ERROR nova.compute.manager [instance: adc77e69-ead0-4e10-9587-cae31e2d904a] self.force_reraise()
2021-11-04 02:17:03.093 6 ERROR nova.compute.manager [instance: adc77e69-ead0-4e10-9587-cae31e2d904a] File "/usr/lib/python2.7/dist-packages/oslo_utils/excutils.py", line 196, in force_reraise
2021-11-04 02:17:03.093 6 ERROR nova.compute.manager [instance: adc77e69-ead0-4e10-9587-cae31e2d904a] six.reraise(self.type_, self.value, self.tb)
2021-11-04 02:17:03.093 6 ERROR nova.compute.manager [instance: adc77e69-ead0-4e10-9587-cae31e2d904a] File "/usr/lib/python2.7/dist-packages/nova/image/glance.py", line 586, in download
2021-11-04 02:17:03.093 6 ERROR nova.compute.manager [instance: adc77e69-ead0-4e10-9587-cae31e2d904a] for chunk in image_chunks:
2021-11-04 02:17:03.093 6 ERROR nova.compute.manager [instance: adc77e69-ead0-4e10-9587-cae31e2d904a] TypeError: 'NoneType' object is not iterable
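For reference, the TypeError at the bottom comes from nova iterating the chunk generator that Glance hands back when that generator turns out to be None. Below is a minimal sketch of that download path, not nova's actual code; it assumes python-glanceclient v2, and the endpoint, token, image ID and target path are placeholders.
from glanceclient import Client

glance = Client('2', endpoint='http://controller:9292', token='ADMIN_TOKEN')
image_id = 'IMAGE_UUID'

# glanceclient returns an iterable of chunks; in some failure modes (an image
# with no data, or a backend store that disappears mid-request) it can come
# back as None, and the "for chunk in image_chunks" loop then raises the
# TypeError shown in the traceback above.
image_chunks = glance.images.data(image_id)
if image_chunks is None:
    raise RuntimeError('Glance returned no data for image %s' % image_id)

with open('/var/lib/nova/instances/_base/IMAGE_HASH.part', 'wb') as dst:
    for chunk in image_chunks:
        dst.write(chunk)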

Related

MysqlOperator in airflow 2.0.1 failed with "ssl connection error"

I am new to Airflow and I am trying to test a MySQL connection using MySqlOperator in Airflow 2.0.1. However, I am getting an SSL connection error. I have tried to add extra parameters to disable SSL mode, but I still get the same error.
Here is my code (I tried to pass the SSL parameter as disabled in the code), but it doesn't work:
from datetime import timedelta

from airflow import DAG
from airflow.providers.mysql.operators.mysql import MySqlOperator
from airflow.operators.python import PythonOperator
from airflow.operators.dummy_operator import DummyOperator
from airflow.utils.dates import days_ago

default_args = {
    'owner': 'airflow',
    'depends_on_past': False,
    'start_date': days_ago(2),
    'retries': 1,
    'retry_delay': timedelta(minutes=1)
}

with DAG(
        'mysqlConnTest',
        default_args=default_args,
        schedule_interval='@once',
        catchup=False) as dag:

    start_task = DummyOperator(task_id="start_task")

    # [START howto_operator_mysql]
    select_table_mysql_task = MySqlOperator(
        task_id='select_table_mysql',
        mysql_conn_id='mysql',
        sql="SELECT * FROM country;",
        autocommit=True,
        parameters={'ssl_mode': 'DISABLED'}
    )

    start_task >> select_table_mysql_task
and here is the error
*** Reading local file: /opt/airflow/logs/mysqlHookConnTest/select_table_mysql/2021-04-14T12:46:42.221662+00:00/2.log
[2021-04-14 12:47:46,791] {taskinstance.py:851} INFO - Dependencies all met for <TaskInstance: mysqlHookConnTest.select_table_mysql 2021-04-14T12:46:42.221662+00:00 [queued]>
[2021-04-14 12:47:47,007] {taskinstance.py:851} INFO - Dependencies all met for <TaskInstance: mysqlHookConnTest.select_table_mysql 2021-04-14T12:46:42.221662+00:00 [queued]>
[2021-04-14 12:47:47,047] {taskinstance.py:1042} INFO -
--------------------------------------------------------------------------------
[2021-04-14 12:47:47,054] {taskinstance.py:1043} INFO - Starting attempt 2 of 2
[2021-04-14 12:47:47,074] {taskinstance.py:1044} INFO -
--------------------------------------------------------------------------------
[2021-04-14 12:47:47,331] {taskinstance.py:1063} INFO - Executing <Task(MySqlOperator): select_table_mysql> on 2021-04-14T12:46:42.221662+00:00
[2021-04-14 12:47:47,377] {standard_task_runner.py:52} INFO - Started process 66 to run task
[2021-04-14 12:47:47,402] {standard_task_runner.py:76} INFO - Running: ['airflow', 'tasks', 'run', 'mysqlHookConnTest', 'select_table_mysql', '2021-04-14T12:46:42.221662+00:00', '--job-id', '142', '--pool', 'default_pool', '--raw', '--subdir', 'DAGS_FOLDER/MySqlHookConnTest.py', '--cfg-path', '/tmp/tmppujnrey3', '--error-file', '/tmp/tmpjl_g_p3t']
[2021-04-14 12:47:47,413] {standard_task_runner.py:77} INFO - Job 142: Subtask select_table_mysql
[2021-04-14 12:47:47,556] {logging_mixin.py:104} INFO - Running <TaskInstance: mysqlHookConnTest.select_table_mysql 2021-04-14T12:46:42.221662+00:00 [running]> on host ea95b9685a31
[2021-04-14 12:47:47,672] {taskinstance.py:1257} INFO - Exporting the following env vars:
AIRFLOW_CTX_DAG_OWNER=airflow
AIRFLOW_CTX_DAG_ID=mysqlHookConnTest
AIRFLOW_CTX_TASK_ID=select_table_mysql
AIRFLOW_CTX_EXECUTION_DATE=2021-04-14T12:46:42.221662+00:00
AIRFLOW_CTX_DAG_RUN_ID=manual__2021-04-14T12:46:42.221662+00:00
[2021-04-14 12:47:47,687] {mysql.py:72} INFO - Executing: SELECT idPais, Nombre, codigo, paisPlataforma, create_date, update_date FROM ob_cpanel.cpanel_pais;
[2021-04-14 12:47:47,710] {base.py:74} INFO - Using connection to: id: mysql. Host: sys-sql-pre-01.oneboxtickets.net, Port: 3306, Schema: , Login: lectura, Password: None, extra: None
[2021-04-14 12:47:48,134] {taskinstance.py:1455} ERROR - (2006, 'SSL connection error: error:1425F102:SSL routines:ssl_choose_client_version:unsupported protocol')
Traceback (most recent call last):
File "/home/airflow/.local/lib/python3.6/site-packages/airflow/models/taskinstance.py", line 1112, in _run_raw_task
self._prepare_and_execute_task_with_callbacks(context, task)
File "/home/airflow/.local/lib/python3.6/site-packages/airflow/models/taskinstance.py", line 1285, in _prepare_and_execute_task_with_callbacks
result = self._execute_task(context, task_copy)
File "/home/airflow/.local/lib/python3.6/site-packages/airflow/models/taskinstance.py", line 1315, in _execute_task
result = task_copy.execute(context=context)
File "/home/airflow/.local/lib/python3.6/site-packages/airflow/providers/mysql/operators/mysql.py", line 74, in execute
hook.run(self.sql, autocommit=self.autocommit, parameters=self.parameters)
File "/home/airflow/.local/lib/python3.6/site-packages/airflow/hooks/dbapi.py", line 173, in run
with closing(self.get_conn()) as conn:
File "/home/airflow/.local/lib/python3.6/site-packages/airflow/providers/mysql/hooks/mysql.py", line 144, in get_conn
return MySQLdb.connect(**conn_config)
File "/home/airflow/.local/lib/python3.6/site-packages/MySQLdb/__init__.py", line 85, in Connect
return Connection(*args, **kwargs)
File "/home/airflow/.local/lib/python3.6/site-packages/MySQLdb/connections.py", line 208, in __init__
super(Connection, self).__init__(*args, **kwargs2)
_mysql_exceptions.OperationalError: (2006, 'SSL connection error: error:1425F102:SSL routines:ssl_choose_client_version:unsupported protocol')
[2021-04-14 12:47:48,143] {taskinstance.py:1503} INFO - Marking task as FAILED. dag_id=mysqlHookConnTest, task_id=select_table_mysql, execution_date=20210414T124642, start_date=20210414T124746, end_date=20210414T124748
[2021-04-14 12:47:48,243] {local_task_job.py:146} INFO - Task exited with return code 1
We have tried removing the last two parameters from the DAG code and instead adding the following JSON in the Extra field of the connection (Airflow UI):
{"ssl": false}
but the issue reappears with another, similar error:
/opt/airflow/logs/mysqlOperatorConnTest/select_table_mysql/2021-04-15T11:26:50.578333+00:00/2.log
*** Fetching from: http://airflow-worker-0.airflow-worker.airflow.svc.cluster.local:8793/log/mysqlOperatorConnTest/select_table_mysql/2021-04-15T11:26:50.578333+00:00/2.log
[2021-04-15 11:27:54,471] {taskinstance.py:851} INFO - Dependencies all met for <TaskInstance: mysqlOperatorConnTest.select_table_mysql 2021-04-15T11:26:50.578333+00:00 [queued]>
[2021-04-15 11:27:54,497] {taskinstance.py:851} INFO - Dependencies all met for <TaskInstance: mysqlOperatorConnTest.select_table_mysql 2021-04-15T11:26:50.578333+00:00 [queued]>
[2021-04-15 11:27:54,497] {taskinstance.py:1042} INFO -
--------------------------------------------------------------------------------
[2021-04-15 11:27:54,497] {taskinstance.py:1043} INFO - Starting attempt 2 of 2
[2021-04-15 11:27:54,497] {taskinstance.py:1044} INFO -
--------------------------------------------------------------------------------
[2021-04-15 11:27:54,507] {taskinstance.py:1063} INFO - Executing <Task(MySqlOperator): select_table_mysql> on 2021-04-15T11:26:50.578333+00:00
[2021-04-15 11:27:54,510] {standard_task_runner.py:52} INFO - Started process 115 to run task
[2021-04-15 11:27:54,514] {standard_task_runner.py:76} INFO - Running: ['airflow', 'tasks', 'run', 'mysqlOperatorConnTest', 'select_table_mysql', '2021-04-15T11:26:50.578333+00:00', '--job-id', '68', '--pool', 'default_pool', '--raw', '--subdir', '/opt/airflow/dags/repo/MySqlOperatorConnTest.py', '--cfg-path', '/tmp/tmpy7bv58_z', '--error-file', '/tmp/tmpaoe808of']
[2021-04-15 11:27:54,514] {standard_task_runner.py:77} INFO - Job 68: Subtask select_table_mysql
[2021-04-15 11:27:54,644] {logging_mixin.py:104} INFO - Running <TaskInstance: mysqlOperatorConnTest.select_table_mysql 2021-04-15T11:26:50.578333+00:00 [running]> on host airflow-worker-0.airflow-worker.airflow.svc.cluster.local
[2021-04-15 11:27:54,707] {logging_mixin.py:104} WARNING - /opt/python/site-packages/sqlalchemy/sql/coercions.py:518 SAWarning: Coercing Subquery object into a select() for use in IN(); please pass a select() construct explicitly
[2021-04-15 11:27:54,725] {taskinstance.py:1255} INFO - Exporting the following env vars:
AIRFLOW_CTX_DAG_OWNER=airflow
AIRFLOW_CTX_DAG_ID=mysqlOperatorConnTest
AIRFLOW_CTX_TASK_ID=select_table_mysql
AIRFLOW_CTX_EXECUTION_DATE=2021-04-15T11:26:50.578333+00:00
AIRFLOW_CTX_DAG_RUN_ID=manual__2021-04-15T11:26:50.578333+00:00
[2021-04-15 11:27:54,726] {mysql.py:72} INFO - Executing: SELECT idPais, Nombre, codigo, paisPlataforma, create_date, update_date FROM ob_cpanel.cpanel_pais;
[2021-04-15 11:27:54,744] {connection.py:337} ERROR - Expecting value: line 2 column 9 (char 11)
Traceback (most recent call last):
File "/opt/python/site-packages/airflow/models/connection.py", line 335, in extra_dejson
obj = json.loads(self.extra)
File "/usr/local/lib/python3.8/json/__init__.py", line 357, in loads
return _default_decoder.decode(s)
File "/usr/local/lib/python3.8/json/decoder.py", line 337, in decode
obj, end = self.raw_decode(s, idx=_w(s, 0).end())
File "/usr/local/lib/python3.8/json/decoder.py", line 355, in raw_decode
raise JSONDecodeError("Expecting value", s, err.value) from None
json.decoder.JSONDecodeError: Expecting value: line 2 column 9 (char 11)
[2021-04-15 11:27:54,744] {connection.py:338} ERROR - Failed parsing the json for conn_id mysql
[2021-04-15 11:27:54,744] {base.py:65} INFO - Using connection to: id: mysql. Host: sys-sql-pre-01.oneboxtickets.net, Port: 3306, Schema: , Login: lectura, Password: XXXXXXXX, extra: None
[2021-04-15 11:27:54,745] {connection.py:337} ERROR - Expecting value: line 2 column 9 (char 11)
Traceback (most recent call last):
File "/opt/python/site-packages/airflow/models/connection.py", line 335, in extra_dejson
obj = json.loads(self.extra)
File "/usr/local/lib/python3.8/json/__init__.py", line 357, in loads
return _default_decoder.decode(s)
File "/usr/local/lib/python3.8/json/decoder.py", line 337, in decode
obj, end = self.raw_decode(s, idx=_w(s, 0).end())
File "/usr/local/lib/python3.8/json/decoder.py", line 355, in raw_decode
raise JSONDecodeError("Expecting value", s, err.value) from None
json.decoder.JSONDecodeError: Expecting value: line 2 column 9 (char 11)
[2021-04-15 11:27:54,745] {connection.py:338} ERROR - Failed parsing the json for conn_id mysql
[2021-04-15 11:27:54,745] {connection.py:337} ERROR - Expecting value: line 2 column 9 (char 11)
Traceback (most recent call last):
File "/opt/python/site-packages/airflow/models/connection.py", line 335, in extra_dejson
obj = json.loads(self.extra)
File "/usr/local/lib/python3.8/json/__init__.py", line 357, in loads
return _default_decoder.decode(s)
File "/usr/local/lib/python3.8/json/decoder.py", line 337, in decode
obj, end = self.raw_decode(s, idx=_w(s, 0).end())
File "/usr/local/lib/python3.8/json/decoder.py", line 355, in raw_decode
raise JSONDecodeError("Expecting value", s, err.value) from None
json.decoder.JSONDecodeError: Expecting value: line 2 column 9 (char 11)
[2021-04-15 11:27:54,745] {connection.py:338} ERROR - Failed parsing the json for conn_id mysql
[2021-04-15 11:27:54,746] {connection.py:337} ERROR - Expecting value: line 2 column 9 (char 11)
Traceback (most recent call last):
File "/opt/python/site-packages/airflow/models/connection.py", line 335, in extra_dejson
obj = json.loads(self.extra)
File "/usr/local/lib/python3.8/json/__init__.py", line 357, in loads
return _default_decoder.decode(s)
File "/usr/local/lib/python3.8/json/decoder.py", line 337, in decode
obj, end = self.raw_decode(s, idx=_w(s, 0).end())
File "/usr/local/lib/python3.8/json/decoder.py", line 355, in raw_decode
raise JSONDecodeError("Expecting value", s, err.value) from None
json.decoder.JSONDecodeError: Expecting value: line 2 column 9 (char 11)
[2021-04-15 11:27:54,746] {connection.py:338} ERROR - Failed parsing the json for conn_id mysql
[2021-04-15 11:27:54,746] {connection.py:337} ERROR - Expecting value: line 2 column 9 (char 11)
Traceback (most recent call last):
File "/opt/python/site-packages/airflow/models/connection.py", line 335, in extra_dejson
obj = json.loads(self.extra)
File "/usr/local/lib/python3.8/json/__init__.py", line 357, in loads
return _default_decoder.decode(s)
File "/usr/local/lib/python3.8/json/decoder.py", line 337, in decode
obj, end = self.raw_decode(s, idx=_w(s, 0).end())
File "/usr/local/lib/python3.8/json/decoder.py", line 355, in raw_decode
raise JSONDecodeError("Expecting value", s, err.value) from None
json.decoder.JSONDecodeError: Expecting value: line 2 column 9 (char 11)
[2021-04-15 11:27:54,746] {connection.py:338} ERROR - Failed parsing the json for conn_id mysql
[2021-04-15 11:27:54,746] {connection.py:337} ERROR - Expecting value: line 2 column 9 (char 11)
Traceback (most recent call last):
File "/opt/python/site-packages/airflow/models/connection.py", line 335, in extra_dejson
obj = json.loads(self.extra)
File "/usr/local/lib/python3.8/json/__init__.py", line 357, in loads
return _default_decoder.decode(s)
File "/usr/local/lib/python3.8/json/decoder.py", line 337, in decode
obj, end = self.raw_decode(s, idx=_w(s, 0).end())
File "/usr/local/lib/python3.8/json/decoder.py", line 355, in raw_decode
raise JSONDecodeError("Expecting value", s, err.value) from None
json.decoder.JSONDecodeError: Expecting value: line 2 column 9 (char 11)
[2021-04-15 11:27:54,747] {connection.py:338} ERROR - Failed parsing the json for conn_id mysql
[2021-04-15 11:27:54,747] {connection.py:337} ERROR - Expecting value: line 2 column 9 (char 11)
Traceback (most recent call last):
File "/opt/python/site-packages/airflow/models/connection.py", line 335, in extra_dejson
obj = json.loads(self.extra)
File "/usr/local/lib/python3.8/json/__init__.py", line 357, in loads
return _default_decoder.decode(s)
File "/usr/local/lib/python3.8/json/decoder.py", line 337, in decode
obj, end = self.raw_decode(s, idx=_w(s, 0).end())
File "/usr/local/lib/python3.8/json/decoder.py", line 355, in raw_decode
raise JSONDecodeError("Expecting value", s, err.value) from None
json.decoder.JSONDecodeError: Expecting value: line 2 column 9 (char 11)
[2021-04-15 11:27:54,747] {connection.py:338} ERROR - Failed parsing the json for conn_id mysql
[2021-04-15 11:27:54,747] {connection.py:337} ERROR - Expecting value: line 2 column 9 (char 11)
Traceback (most recent call last):
File "/opt/python/site-packages/airflow/models/connection.py", line 335, in extra_dejson
obj = json.loads(self.extra)
File "/usr/local/lib/python3.8/json/__init__.py", line 357, in loads
return _default_decoder.decode(s)
File "/usr/local/lib/python3.8/json/decoder.py", line 337, in decode
obj, end = self.raw_decode(s, idx=_w(s, 0).end())
File "/usr/local/lib/python3.8/json/decoder.py", line 355, in raw_decode
raise JSONDecodeError("Expecting value", s, err.value) from None
json.decoder.JSONDecodeError: Expecting value: line 2 column 9 (char 11)
[2021-04-15 11:27:54,747] {connection.py:338} ERROR - Failed parsing the json for conn_id mysql
[2021-04-15 11:27:54,787] {taskinstance.py:1455} ERROR - (2006, 'SSL connection error: error:1425F102:SSL routines:ssl_choose_client_version:unsupported protocol')
Traceback (most recent call last):
File "/opt/python/site-packages/airflow/models/taskinstance.py", line 1112, in _run_raw_task
self._prepare_and_execute_task_with_callbacks(context, task)
File "/opt/python/site-packages/airflow/models/taskinstance.py", line 1285, in _prepare_and_execute_task_with_callbacks
result = self._execute_task(context, task_copy)
File "/opt/python/site-packages/airflow/models/taskinstance.py", line 1315, in _execute_task
result = task_copy.execute(context=context)
File "/home/airflow/.local/lib/python3.8/site-packages/airflow/providers/mysql/operators/mysql.py", line 74, in execute
hook.run(self.sql, autocommit=self.autocommit, parameters=self.parameters)
File "/opt/python/site-packages/airflow/hooks/dbapi.py", line 173, in run
with closing(self.get_conn()) as conn:
File "/home/airflow/.local/lib/python3.8/site-packages/airflow/providers/mysql/hooks/mysql.py", line 144, in get_conn
return MySQLdb.connect(**conn_config)
File "/home/airflow/.local/lib/python3.8/site-packages/MySQLdb/__init__.py", line 85, in Connect
return Connection(*args, **kwargs)
File "/home/airflow/.local/lib/python3.8/site-packages/MySQLdb/connections.py", line 208, in __init__
super(Connection, self).__init__(*args, **kwargs2)
_mysql_exceptions.OperationalError: (2006, 'SSL connection error: error:1425F102:SSL routines:ssl_choose_client_version:unsupported protocol')
[2021-04-15 11:27:54,788] {taskinstance.py:1496} INFO - Marking task as FAILED. dag_id=mysqlOperatorConnTest, task_id=select_table_mysql, execution_date=20210415T112650, start_date=20210415T112754, end_date=20210415T112754
[2021-04-15 11:27:54,845] {local_task_job.py:146} INFO - Task exited with return code 1
We solved this issue by downgrading the MySQL client to 5.7. Our server version was 5.6 and the previous client was 8 (from the Docker image we were using), so we moved the client closer to the server version.
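Before (or instead of) swapping client versions, it can help to reproduce the connection outside Airflow with the same driver the hook uses. A hedged sketch follows: the host and user are taken from the log above, the password is a placeholder, and the ssl_mode keyword is an assumption that only holds for mysqlclient >= 1.4.
import MySQLdb

# Connect with TLS disabled; a MySQL 5.6 server only speaks TLS 1.0, which a
# modern client/OpenSSL stack rejects during ssl_choose_client_version.
conn = MySQLdb.connect(
    host='sys-sql-pre-01.oneboxtickets.net',
    port=3306,
    user='lectura',
    passwd='SECRET',
    ssl_mode='DISABLED',
)
cur = conn.cursor()
cur.execute('SELECT VERSION()')
print(cur.fetchone())
conn.close()
If this succeeds while the operator still fails, the problem is in how the connection extras reach the hook rather than in the server itself.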

Failing to attach volume to openstack instance

I am attempting to attach a volume to an instance. My attempts consistently fail.
I am using the following command to create the attachment:
openstack server add volume test_instance testvol
where test_instance is my instance and testvol is the 20G volume that I created with Cinder.
I get no error messages when I attempt this attachment, but when I check my volume I do not see the attachment:
cinder list
+--------------------------------------+-----------+-------------+------+-------------+----------+-------------+
| ID | Status | Name | Size | Volume Type | Bootable | Attached to |
+--------------------------------------+-----------+-------------+------+-------------+----------+-------------+
| 3dce2112-eddc-4d39-b47d-8717b16dd5e9 | available | testvol | 20 | __DEFAULT__ | false | |
| a9aa8845-dd3a-4f47-8b5e-445c6170abbe | available | volume1 | 20 | iscsi | false | |
+--------------------------------------+-----------+-------------+------+-------------+----------+-------------+
I see no attachment. The above is what I get whether I attempt to attach testvol or volume1.
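For what it's worth, the same check can be scripted against the API with the OpenStack SDK; this is only a sketch, and the cloud name 'mycloud' in clouds.yaml is an assumption.
import openstack

conn = openstack.connect(cloud='mycloud')
vol = conn.block_storage.find_volume('testvol')
# status stays 'available' and attachments stays empty when the attach fails
print(vol.status, vol.attachments)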
Looking at the nova-compute log, I am getting the following messages:
2020-09-06 12:41:07.172 2734 INFO nova.compute.manager [req-f84fbcef-b5fb-4f43-87ed-25d213b371c2 ce97118ec1f348a09d196c6da826749f 65d7e58a440440c3894c0fb7bf958052 - default default] [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] Attaching volume 3dce2112-eddc-4d39-b47d-8717b16dd5e9 to /dev/vda
2020-09-06 12:41:11.114 2734 INFO os_brick.initiator.connectors.iscsi [req-f84fbcef-b5fb-4f43-87ed-25d213b371c2 ce97118ec1f348a09d196c6da826749f 65d7e58a440440c3894c0fb7bf958052 - default default] Trying to connect to iSCSI portal 192.168.1.105:3260
2020-09-06 12:41:11.141 2734 WARNING os_brick.initiator.connectors.iscsi [req-f84fbcef-b5fb-4f43-87ed-25d213b371c2 ce97118ec1f348a09d196c6da826749f 65d7e58a440440c3894c0fb7bf958052 - default default] Couldn't find iSCSI nodes because iscsiadm err: iscsiadm: No records found: oslo_concurrency.processutils.ProcessExecutionError: Unexpected error while running command.
2020-09-06 12:41:11.150 2734 WARNING os_brick.initiator.connectors.iscsi [req-f84fbcef-b5fb-4f43-87ed-25d213b371c2 ce97118ec1f348a09d196c6da826749f 65d7e58a440440c3894c0fb7bf958052 - default default] iscsiadm stderr output when getting sessions: iscsiadm: No active sessions.: oslo_concurrency.processutils.ProcessExecutionError: Unexpected error while running command.
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [req-f84fbcef-b5fb-4f43-87ed-25d213b371c2 ce97118ec1f348a09d196c6da826749f 65d7e58a440440c3894c0fb7bf958052 - default default] [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] Driver failed to attach volume 3dce2112-eddc-4d39-b47d-8717b16dd5e9 at /dev/vda: oslo_concurrency.processutils.ProcessExecutionError: Unexpected error while running command.
Command: iscsiadm -m node -T iqn.2010-10.org.openstack:volume-3dce2112-eddc-4d39-b47d-8717b16dd5e9 -p 192.168.1.105:3260 --interface default --op new
Exit code: 6
Stdout: ''
Stderr: 'iscsiadm: Could not open /var/lib/iscsi/nodes/iqn.2010-10.org.openstack:volume-3dce2112-eddc-4d39-b47d-8717b16dd5e9/192.168.1.105,3260: Permission deniedd\niscsiadm: Error while adding record: encountered iSCSI database failure\n'
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] Traceback (most recent call last):
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] File "/usr/lib/python3.6/site-packages/nova/virt/block_device.py", line 598, in _volume_attach
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] device_type=self['device_type'], encryption=encryption)
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] File "/usr/lib/python3.6/site-packages/nova/virt/libvirt/driver.py", line 1827, in attach_volume
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] encryption=encryption)
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] File "/usr/lib/python3.6/site-packages/nova/virt/libvirt/driver.py", line 1607, in _connect_volume
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] vol_driver.connect_volume(connection_info, instance)
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] File "/usr/lib/python3.6/site-packages/nova/virt/libvirt/volume/iscsi.py", line 64, in connect_volume
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] device_info = self.connector.connect_volume(connection_info['data'])
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] File "/usr/lib/python3.6/site-packages/os_brick/utils.py", line 137, in trace_logging_wrapper
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] return f(*args, **kwargs)
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] File "/usr/lib/python3.6/site-packages/oslo_concurrency/lockutils.py", line 359, in inner
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] return f(*args, **kwargs)
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] File "/usr/lib/python3.6/site-packages/os_brick/initiator/connectors/iscsi.py", line 519, in connect_volume
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] self._cleanup_connection(connection_properties, force=True)
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] File "/usr/lib/python3.6/site-packages/oslo_utils/excutils.py", line 220, in __exit__
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] self.force_reraise()
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] File "/usr/lib/python3.6/site-packages/oslo_utils/excutils.py", line 196, in force_reraise
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] six.reraise(self.type_, self.value, self.tb)
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] File "/usr/lib/python3.6/site-packages/six.py", line 703, in reraise
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] raise value
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] File "/usr/lib/python3.6/site-packages/os_brick/initiator/connectors/iscsi.py", line 513, in connect_volume
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] return self._connect_single_volume(connection_properties)
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] File "/usr/lib/python3.6/site-packages/os_brick/utils.py", line 61, in _wrapper
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] return r.call(f, *args, **kwargs)
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] File "/usr/lib/python3.6/site-packages/retrying.py", line 223, in call
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] return attempt.get(self._wrap_exception)
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] File "/usr/lib/python3.6/site-packages/retrying.py", line 261, in get
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] six.reraise(self.value[0], self.value[1], self.value[2])
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] File "/usr/lib/python3.6/site-packages/six.py", line 703, in reraise
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] raise value
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] File "/usr/lib/python3.6/site-packages/retrying.py", line 217, in call
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] attempt = Attempt(fn(*args, **kwargs), attempt_number, False)
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] File "/usr/lib/python3.6/site-packages/os_brick/initiator/connectors/iscsi.py", line 569, in _connect_single_volume
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] self._connect_vol(self.device_scan_attempts, props, data)
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] File "/usr/lib/python3.6/site-packages/os_brick/initiator/connectors/iscsi.py", line 626, in _connect_vol
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] session, manual_scan = self._connect_to_iscsi_portal(props)
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] File "/usr/lib/python3.6/site-packages/os_brick/initiator/connectors/iscsi.py", line 1040, in _connect_to_iscsi_portal
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] '--op', 'new'))
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] File "/usr/lib/python3.6/site-packages/os_brick/initiator/connectors/iscsi.py", line 993, in _run_iscsiadm
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] delay_on_retry=delay_on_retry)
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] File "/usr/lib/python3.6/site-packages/os_brick/executor.py", line 52, in _execute
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] result = self.__execute(*args, **kwargs)
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] File "/usr/lib/python3.6/site-packages/os_brick/privileged/rootwrap.py", line 169, in execute
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] return execute_root(*cmd, **kwargs)
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] File "/usr/lib/python3.6/site-packages/oslo_privsep/priv_context.py", line 247, in _wrap
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] return self.channel.remote_call(name, args, kwargs)
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] File "/usr/lib/python3.6/site-packages/oslo_privsep/daemon.py", line 204, in remote_call
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] raise exc_type(*result[2])
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] oslo_concurrency.processutils.ProcessExecutionError: Unexpected error while running command.
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] Command: iscsiadm -m node -T iqn.2010-10.org.openstack:volume-3dce2112-eddc-4d39-b47d-8717b16dd5e9 -p 192.168.1.105:3260 --interface default --op new
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] Exit code: 6
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] Stdout: ''
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] Stderr: 'iscsiadm: Could not open /var/lib/iscsi/nodes/iqn.2010-10.org.openstack:volume-3dce2112-eddc-4d39-b47d-8717b16dd5e9/192.168.1.105,3260: Permission deniedd\niscsiadm: Error while adding record: encountered iSCSI database failure\n'
2020
Does anyone know what is causing this failure? How can I get a volume attached to an instance?

Openstack Pike installation Neutron Compute could not be found error

Hi, I am new to OpenStack. I have installed the OpenStack Pike version (Keystone, Glance, Nova, Neutron, Horizon) on two Ubuntu 16.04 VMs (controller and compute node) and added ICMP and SSH rules to the security group, but I am not able to ping the instance IP. This is the link to the doc: https://docs.openstack.org/install-guide/openstack-services.html
For Neutron I chose self-service networks.
This is my console log of the instance:
NOCHANGE: partition 1 is size 10458315. it cannot be grown
info: initramfs loading root from /dev/vda1
info: /etc/init.d/rc.sysinit: up at 5.09
info: container: none
Starting logging: OK
modprobe: module virtio_blk not found in modules.dep
modprobe: module virtio_net not found in modules.dep
WARN: /etc/rc3.d/S10-load-modules failed
Initializing random number generator... done.
Starting acpid: OK
cirros-ds 'local' up at 7.70
no results found for mode=local. up 8.31. searched: nocloud configdrive ec2
Starting network...
udhcpc (v1.20.1) started
Sending discover...
Sending select for 10.0.0.11...
Lease of 10.0.0.11 obtained, lease time 86400
route: SIOCADDRT: File exists
WARN: failed: route add -net "0.0.0.0/0" gw "10.0.0.1"
cirros-ds 'net' up at 9.71
checking http://169.254.169.254/2009-04-04/instance-id
failed 1/20: up 9.90. request failed
failed 2/20: up 14.06. request failed
failed 3/20: up 20.58. request failed
failed 4/20: up 32.71. request failed
failed 5/20: up 35.68. request failed
failed 6/20: up 47.78. request failed
failed 7/20: up 50.76. request failed
failed 8/20: up 53.40. request failed
failed 9/20: up 56.17. request failed
failed 10/20: up 58.86. request failed
failed 11/20: up 63.43. request failed
failed 12/20: up 66.60. request failed
failed 13/20: up 78.80. request failed
failed 14/20: up 81.63. request failed
failed 15/20: up 84.49. request failed
failed 16/20: up 87.27. request failed
failed 17/20: up 89.93. request failed
failed 18/20: up 92.73. request failed
failed 19/20: up 95.79. request failed
failed 20/20: up 98.56. request failed
failed to read iid from metadata. tried 20
no results found for mode=net. up 101.26. searched: nocloud configdrive ec2
failed to get instance-id of datasource
Top of dropbear init script
Starting dropbear sshd: failed to get instance-id of datasource
WARN: generating key of type ecdsa failed!
OK
This is my neutron-l3-agent.log (Compute Node)
2018-07-02 18:22:35.009 13819 INFO neutron.common.config [-] Logging enabled!
2018-07-02 18:22:35.010 13819 INFO neutron.common.config [-] /usr/bin/neutron-l3-agent version 11.0.3
2018-07-02 18:22:35.329 13819 ERROR neutron.agent.l3.agent [-] An interface driver must be specified
This is my neutron-linuxbridge-agent.log (Compute Node)
2018-07-02 18:24:25.186 7298 INFO neutron.agent.securitygroups_rpc [req-85444d73-0a1f-49df-ba25-98c0edc8d64d - - - - -] Security group member updated [u'957d8ed4-dba5-4cf5-b6df-0c7c044e4376']
2018-07-02 18:24:25.364 7298 ERROR neutron.plugins.ml2.drivers.agent._common_agent [req-85444d73-0a1f-49df-ba25-98c0edc8d64d - - - - -] Error occurred while removing port tapac1db012-e4: RemoteError: Remote error: AgentNotFoundByTypeHost Agent with agent_type=L3 agent and host=Compute could not be found
[u'Traceback (most recent call last):\n', u' File "/usr/lib/python2.7/dist-packages/oslo_messaging/rpc/server.py", line 160, in _process_incoming\n res = self.dispatcher.dispatch(message)\n', u' File "/usr/lib/python2.7/dist-packages/oslo_messaging/rpc/dispatcher.py", line 213, in dispatch\n return self._do_dispatch(endpoint, method, ctxt, args)\n', u' File "/usr/lib/python2.7/dist-packages/oslo_messaging/rpc/dispatcher.py", line 183, in _do_dispatch\n result = func(ctxt, **new_args)\n', u' File "/usr/lib/python2.7/dist-packages/neutron/plugins/ml2/rpc.py", line 234, in update_device_down\n n_const.PORT_STATUS_DOWN, host)\n', u' File "/usr/lib/python2.7/dist-packages/neutron/plugins/ml2/rpc.py", line 331, in notify_l2pop_port_wiring\n l2pop_driver.obj.update_port_down(port_context)\n', u' File "/usr/lib/python2.7/dist-packages/neutron/plugins/ml2/drivers/l2pop/mech_driver.py", line 253, in update_port_down\n admin_context, agent_host, [port[\'device_id\']]):\n', u' File "/usr/lib/python2.7/dist-packages/neutron/db/l3_agentschedulers_db.py", line 303, in list_router_ids_on_host\n context, constants.AGENT_TYPE_L3, host)\n', u' File "/usr/lib/python2.7/dist-packages/neutron/db/agents_db.py", line 291, in _get_agent_by_type_and_host\n host=host)\n', u'AgentNotFoundByTypeHost: Agent with agent_type=L3 agent and host=Compute could not be found\n'].
2018-07-02 18:24:25.364 7298 ERROR neutron.plugins.ml2.drivers.agent._common_agent Traceback (most recent call last):
2018-07-02 18:24:25.364 7298 ERROR neutron.plugins.ml2.drivers.agent._common_agent File "/usr/lib/python2.7/dist-packages/neutron/plugins/ml2/drivers/agent/_common_agent.py", line 337, in treat_devices_removed
2018-07-02 18:24:25.364 7298 ERROR neutron.plugins.ml2.drivers.agent._common_agent cfg.CONF.host)
2018-07-02 18:24:25.364 7298 ERROR neutron.plugins.ml2.drivers.agent._common_agent File "/usr/lib/python2.7/dist-packages/neutron/agent/rpc.py", line 139, in update_device_down
2018-07-02 18:24:25.364 7298 ERROR neutron.plugins.ml2.drivers.agent._common_agent agent_id=agent_id, host=host)
2018-07-02 18:24:25.364 7298 ERROR neutron.plugins.ml2.drivers.agent._common_agent File "/usr/lib/python2.7/dist-packages/neutron/common/rpc.py", line 162, in call
2018-07-02 18:24:25.364 7298 ERROR neutron.plugins.ml2.drivers.agent._common_agent return self._original_context.call(ctxt, method, **kwargs)
2018-07-02 18:24:25.364 7298 ERROR neutron.plugins.ml2.drivers.agent._common_agent File "/usr/lib/python2.7/dist-packages/oslo_messaging/rpc/client.py", line 169, in call
2018-07-02 18:24:25.364 7298 ERROR neutron.plugins.ml2.drivers.agent._common_agent retry=self.retry)
2018-07-02 18:24:25.364 7298 ERROR neutron.plugins.ml2.drivers.agent._common_agent File "/usr/lib/python2.7/dist-packages/oslo_messaging/transport.py", line 123, in _send
2018-07-02 18:24:25.364 7298 ERROR neutron.plugins.ml2.drivers.agent._common_agent timeout=timeout, retry=retry)
2018-07-02 18:24:25.364 7298 ERROR neutron.plugins.ml2.drivers.agent._common_agent File "/usr/lib/python2.7/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 578, in send
2018-07-02 18:24:25.364 7298 ERROR neutron.plugins.ml2.drivers.agent._common_agent retry=retry)
2018-07-02 18:24:25.364 7298 ERROR neutron.plugins.ml2.drivers.agent._common_agent File "/usr/lib/python2.7/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 569, in _send
2018-07-02 18:24:25.364 7298 ERROR neutron.plugins.ml2.drivers.agent._common_agent raise result
2018-07-02 18:24:25.364 7298 ERROR neutron.plugins.ml2.drivers.agent._common_agent RemoteError: Remote error: AgentNotFoundByTypeHost Agent with agent_type=L3 agent and host=Compute could not be found
2018-07-02 18:24:25.364 7298 ERROR neutron.plugins.ml2.drivers.agent._common_agent [u'Traceback (most recent call last):\n', u' File "/usr/lib/python2.7/dist-packages/oslo_messaging/rpc/server.py", line 160, in _process_incoming\n res = self.dispatcher.dispatch(message)\n', u' File "/usr/lib/python2.7/dist-packages/oslo_messaging/rpc/dispatcher.py", line 213, in dispatch\n return self._do_dispatch(endpoint, method, ctxt, args)\n', u' File "/usr/lib/python2.7/dist-packages/oslo_messaging/rpc/dispatcher.py", line 183, in _do_dispatch\n result = func(ctxt, **new_args)\n', u' File "/usr/lib/python2.7/dist-packages/neutron/plugins/ml2/rpc.py", line 234, in update_device_down\n n_const.PORT_STATUS_DOWN, host)\n', u' File "/usr/lib/python2.7/dist-packages/neutron/plugins/ml2/rpc.py", line 331, in notify_l2pop_port_wiring\n l2pop_driver.obj.update_port_down(port_context)\n', u' File "/usr/lib/python2.7/dist-packages/neutron/plugins/ml2/drivers/l2pop/mech_driver.py", line 253, in update_port_down\n admin_context, agent_host, [port[\'device_id\']]):\n', u' File "/usr/lib/python2.7/dist-packages/neutron/db/l3_agentschedulers_db.py", line 303, in list_router_ids_on_host\n context, constants.AGENT_TYPE_L3, host)\n', u' File "/usr/lib/python2.7/dist-packages/neutron/db/agents_db.py", line 291, in _get_agent_by_type_and_host\n host=host)\n', u'AgentNotFoundByTypeHost: Agent with agent_type=L3 agent and host=Compute could not be found\n'].
2018-07-02 18:24:25.364 7298 ERROR neutron.plugins.ml2.drivers.agent._common_agent
2018-07-02 18:24:25.366 7298 INFO neutron.plugins.ml2.drivers.agent._common_agent [req-85444d73-0a1f-49df-ba25-98c0edc8d64d - - - - -] Attachment tapac454bed-28 removed
2018-07-02 18:24:25.810 7298 INFO neutron.agent.securitygroups_rpc [req-85444d73-0a1f-49df-ba25-98c0edc8d64d - - - - -] Security group member updated [u'957d8ed4-dba5-4cf5-b6df-0c7c044e4376']
2018-07-02 18:24:26.215 7298 ERROR neutron.plugins.ml2.drivers.agent._common_agent [req-85444d73-0a1f-49df-ba25-98c0edc8d64d - - - - -] Error occurred while removing port tapac454bed-28: RemoteError: Remote error: AgentNotFoundByTypeHost Agent with agent_type=L3 agent and host=Compute could not be found
[u'Traceback (most recent call last):\n', u' File "/usr/lib/python2.7/dist-packages/oslo_messaging/rpc/server.py", line 160, in _process_incoming\n res = self.dispatcher.dispatch(message)\n', u' File "/usr/lib/python2.7/dist-packages/oslo_messaging/rpc/dispatcher.py", line 213, in dispatch\n return self._do_dispatch(endpoint, method, ctxt, args)\n', u' File "/usr/lib/python2.7/dist-packages/oslo_messaging/rpc/dispatcher.py", line 183, in _do_dispatch\n result = func(ctxt, **new_args)\n', u' File "/usr/lib/python2.7/dist-packages/neutron/plugins/ml2/rpc.py", line 234, in update_device_down\n n_const.PORT_STATUS_DOWN, host)\n', u' File "/usr/lib/python2.7/dist-packages/neutron/plugins/ml2/rpc.py", line 331, in notify_l2pop_port_wiring\n l2pop_driver.obj.update_port_down(port_context)\n', u' File "/usr/lib/python2.7/dist-packages/neutron/plugins/ml2/drivers/l2pop/mech_driver.py", line 253, in update_port_down\n admin_context, agent_host, [port[\'device_id\']]):\n', u' File "/usr/lib/python2.7/dist-packages/neutron/db/l3_agentschedulers_db.py", line 303, in list_router_ids_on_host\n context, constants.AGENT_TYPE_L3, host)\n', u' File "/usr/lib/python2.7/dist-packages/neutron/db/agents_db.py", line 291, in _get_agent_by_type_and_host\n host=host)\n', u'AgentNotFoundByTypeHost: Agent with agent_type=L3 agent and host=Compute could not be found\n'].
2018-07-02 18:24:26.215 7298 ERROR neutron.plugins.ml2.drivers.agent._common_agent Traceback (most recent call last):
2018-07-02 18:24:26.215 7298 ERROR neutron.plugins.ml2.drivers.agent._common_agent File "/usr/lib/python2.7/dist-packages/neutron/plugins/ml2/drivers/agent/_common_agent.py", line 337, in treat_devices_removed
2018-07-02 18:24:26.215 7298 ERROR neutron.plugins.ml2.drivers.agent._common_agent cfg.CONF.host)
2018-07-02 18:24:26.215 7298 ERROR neutron.plugins.ml2.drivers.agent._common_agent File "/usr/lib/python2.7/dist-packages/neutron/agent/rpc.py", line 139, in update_device_down
2018-07-02 18:24:26.215 7298 ERROR neutron.plugins.ml2.drivers.agent._common_agent agent_id=agent_id, host=host)
2018-07-02 18:24:26.215 7298 ERROR neutron.plugins.ml2.drivers.agent._common_agent File "/usr/lib/python2.7/dist-packages/neutron/common/rpc.py", line 162, in call
2018-07-02 18:24:26.215 7298 ERROR neutron.plugins.ml2.drivers.agent._common_agent return self._original_context.call(ctxt, method, **kwargs)
2018-07-02 18:24:26.215 7298 ERROR neutron.plugins.ml2.drivers.agent._common_agent File "/usr/lib/python2.7/dist-packages/oslo_messaging/rpc/client.py", line 169, in call
2018-07-02 18:24:26.215 7298 ERROR neutron.plugins.ml2.drivers.agent._common_agent retry=self.retry)
2018-07-02 18:24:26.215 7298 ERROR neutron.plugins.ml2.drivers.agent._common_agent File "/usr/lib/python2.7/dist-packages/oslo_messaging/transport.py", line 123, in _send
2018-07-02 18:24:26.215 7298 ERROR neutron.plugins.ml2.drivers.agent._common_agent timeout=timeout, retry=retry)
2018-07-02 18:24:26.215 7298 ERROR neutron.plugins.ml2.drivers.agent._common_agent File "/usr/lib/python2.7/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 578, in send
2018-07-02 18:24:26.215 7298 ERROR neutron.plugins.ml2.drivers.agent._common_agent retry=retry)
2018-07-02 18:24:26.215 7298 ERROR neutron.plugins.ml2.drivers.agent._common_agent File "/usr/lib/python2.7/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 569, in _send
2018-07-02 18:24:26.215 7298 ERROR neutron.plugins.ml2.drivers.agent._common_agent raise result
2018-07-02 18:24:26.215 7298 ERROR neutron.plugins.ml2.drivers.agent._common_agent RemoteError: Remote error: AgentNotFoundByTypeHost Agent with agent_type=L3 agent and host=Compute could not be found
2018-07-02 18:24:26.215 7298 ERROR neutron.plugins.ml2.drivers.agent._common_agent [u'Traceback (most recent call last):\n', u' File "/usr/lib/python2.7/dist-packages/oslo_messaging/rpc/server.py", line 160, in _process_incoming\n res = self.dispatcher.dispatch(message)\n', u' File "/usr/lib/python2.7/dist-packages/oslo_messaging/rpc/dispatcher.py", line 213, in dispatch\n return self._do_dispatch(endpoint, method, ctxt, args)\n', u' File "/usr/lib/python2.7/dist-packages/oslo_messaging/rpc/dispatcher.py", line 183, in _do_dispatch\n result = func(ctxt, **new_args)\n', u' File "/usr/lib/python2.7/dist-packages/neutron/plugins/ml2/rpc.py", line 234, in update_device_down\n n_const.PORT_STATUS_DOWN, host)\n', u' File "/usr/lib/python2.7/dist-packages/neutron/plugins/ml2/rpc.py", line 331, in notify_l2pop_port_wiring\n l2pop_driver.obj.update_port_down(port_context)\n', u' File "/usr/lib/python2.7/dist-packages/neutron/plugins/ml2/drivers/l2pop/mech_driver.py", line 253, in update_port_down\n admin_context, agent_host, [port[\'device_id\']]):\n', u' File "/usr/lib/python2.7/dist-packages/neutron/db/l3_agentschedulers_db.py", line 303, in list_router_ids_on_host\n context, constants.AGENT_TYPE_L3, host)\n', u' File "/usr/lib/python2.7/dist-packages/neutron/db/agents_db.py", line 291, in _get_agent_by_type_and_host\n host=host)\n', u'AgentNotFoundByTypeHost: Agent with agent_type=L3 agent and host=Compute could not be found\n'].
2018-07-02 18:24:26.215 7298 ERROR neutron.plugins.ml2.drivers.agent._common_agent
Add this to your /etc/neutron/neutron.conf on both the controller and all of the compute nodes (under the [oslo_concurrency] section):
lock_path = /var/lib/neutron/tmp
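For clarity, the resulting section of neutron.conf would simply look like this (nothing else in that section needs to change):
[oslo_concurrency]
lock_path = /var/lib/neutron/tmp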

'unicode' object does not support item assignment (HTTP 400) (Request-ID: xxx)

Following the OpenStack Keystone client management doc:
I use openstack user list to check the users in OpenStack, but I get the error below:
'unicode' object does not support item assignment (HTTP 400) (Request-ID: req-ccf9d2b6-0801-45fd-9000-7feb3783eedc)
Why do I get this issue? It's strange.
In the OpenStack cloud MariaDB's keystone database, I ran SELECT * FROM user and got the information below:
EDIT
In my openstack cloud host machine's /var/log/keystone.log:
......
2017-09-20 15:15:24.376 9503 INFO keystone.common.wsgi [req-53ed55d1-125f-4ee7-b548-39c8d4e9c062 - - - - -] GET http://controller:35357/v3/users
2017-09-20 15:15:24.416 9503 ERROR keystone.common.wsgi [req-53ed55d1-125f-4ee7-b548-39c8d4e9c062 - - - - -] 'unicode' object does not support item assignment
2017-09-20 15:15:24.416 9503 ERROR keystone.common.wsgi Traceback (most recent call last):
2017-09-20 15:15:24.416 9503 ERROR keystone.common.wsgi File "/usr/lib/python2.7/site-packages/keystone/common/wsgi.py", line 228, in __call__
2017-09-20 15:15:24.416 9503 ERROR keystone.common.wsgi result = method(req, **params)
2017-09-20 15:15:24.416 9503 ERROR keystone.common.wsgi File "/usr/lib/python2.7/site-packages/keystone/common/controller.py", line 235, in wrapper
2017-09-20 15:15:24.416 9503 ERROR keystone.common.wsgi return f(self, request, filters, **kwargs)
2017-09-20 15:15:24.416 9503 ERROR keystone.common.wsgi File "/usr/lib/python2.7/site-packages/keystone/identity/controllers.py", line 231, in list_users
2017-09-20 15:15:24.416 9503 ERROR keystone.common.wsgi return UserV3.wrap_collection(request.context_dict, refs, hints=hints)
2017-09-20 15:15:24.416 9503 ERROR keystone.common.wsgi File "/usr/lib/python2.7/site-packages/keystone/common/controller.py", line 499, in wrap_collection
2017-09-20 15:15:24.416 9503 ERROR keystone.common.wsgi cls.wrap_member(context, ref)
2017-09-20 15:15:24.416 9503 ERROR keystone.common.wsgi File "/usr/lib/python2.7/site-packages/keystone/common/controller.py", line 468, in wrap_member
2017-09-20 15:15:24.416 9503 ERROR keystone.common.wsgi cls._add_self_referential_link(context, ref)
2017-09-20 15:15:24.416 9503 ERROR keystone.common.wsgi File "/usr/lib/python2.7/site-packages/keystone/common/controller.py", line 464, in _add_self_referential_link
2017-09-20 15:15:24.416 9503 ERROR keystone.common.wsgi ref['links']['self'] = cls.base_url(context) + '/' + ref['id']
2017-09-20 15:15:24.416 9503 ERROR keystone.common.wsgi TypeError: 'unicode' object does not support item assignment
In the end I found that the format of the extra data is wrong for several rows.
See the user's extra column in the snapshot; the data looks like this:
{"email": "xxxx#xx.com", "links": "http://103.x5.xx.1/user_resource/liaoss07"}
Note that links should not look like that, so I set extra to {} for every row whose extra is abnormal in this way.
My solution is:
UPDATE user SET extra='{}' WHERE id=xxx
You can keep the email inside the {} too.
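A minimal Python illustration of why such an extra value breaks the listing (not Keystone's code, just the failing pattern from the traceback): when links in extra is a plain string, the nested assignment in _add_self_referential_link fails.
ref = {"id": "liaoss07", "links": "http://103.x5.xx.1/user_resource/liaoss07"}
try:
    # keystone does: ref['links']['self'] = cls.base_url(context) + '/' + ref['id']
    ref["links"]["self"] = "http://controller:35357/v3/users/" + ref["id"]
except TypeError as exc:
    # Python 2 reports: 'unicode' object does not support item assignment
    print(exc)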

ERROR oslo_messaging.rpc.dispatcher [req-xxxxx] Exception during message handling: 'metadata'

I installed OpenStack (Kilo) on CentOS 7.
The error occurred while adding security group rules.
I'm using nova-network.
I got this error from nova-compute.log:
2015-07-31 16:55:05.669 5950 ERROR oslo_messaging.rpc.dispatcher [req-76584058-8dbd-4860-a048-e9dbad712779 843c47b4a71b4ac3a6c4375d558aa423 3f4e979ea9b9409a9425442a8b096457 - -
-] Exception during message handling: 'metadata'
2015-07-31 16:55:05.669 5950 TRACE oslo_messaging.rpc.dispatcher Traceback (most recent call last):
2015-07-31 16:55:05.669 5950 TRACE oslo_messaging.rpc.dispatcher File "/usr/lib/python2.7/site-packages/oslo_messaging/rpc/dispatcher.py", line 142, in _dispatch_and_reply
2015-07-31 16:55:05.669 5950 TRACE oslo_messaging.rpc.dispatcher executor_callback))
2015-07-31 16:55:05.669 5950 TRACE oslo_messaging.rpc.dispatcher File "/usr/lib/python2.7/site-packages/oslo_messaging/rpc/dispatcher.py", line 186, in _dispatch
2015-07-31 16:55:05.669 5950 TRACE oslo_messaging.rpc.dispatcher executor_callback)
2015-07-31 16:55:05.669 5950 TRACE oslo_messaging.rpc.dispatcher File "/usr/lib/python2.7/site-packages/oslo_messaging/rpc/dispatcher.py", line 130, in _do_dispatch
2015-07-31 16:55:05.669 5950 TRACE oslo_messaging.rpc.dispatcher result = func(ctxt, **new_args)
2015-07-31 16:55:05.669 5950 TRACE oslo_messaging.rpc.dispatcher File "/usr/lib/python2.7/site-packages/nova/compute/manager.py", line 6716, in refresh_instance_security_rules
2015-07-31 16:55:05.669 5950 TRACE oslo_messaging.rpc.dispatcher return self.manager.refresh_instance_security_rules(ctxt, instance)
2015-07-31 16:55:05.669 5950 TRACE oslo_messaging.rpc.dispatcher File "/usr/lib/python2.7/site-packages/nova/compute/manager.py", line 434, in decorated_function
2015-07-31 16:55:05.669 5950 TRACE oslo_messaging.rpc.dispatcher args = (_load_instance(args[0]),) + args[1:]
2015-07-31 16:55:05.669 5950 TRACE oslo_messaging.rpc.dispatcher File "/usr/lib/python2.7/site-packages/nova/compute/manager.py", line 425, in _load_instance
2015-07-31 16:55:05.669 5950 TRACE oslo_messaging.rpc.dispatcher expected_attrs=metas)
2015-07-31 16:55:05.669 5950 TRACE oslo_messaging.rpc.dispatcher File "/usr/lib/python2.7/site-packages/nova/objects/instance.py", line 492, in _from_db_object
2015-07-31 16:55:05.669 5950 TRACE oslo_messaging.rpc.dispatcher instance['metadata'] = utils.instance_meta(db_inst)
2015-07-31 16:55:05.669 5950 TRACE oslo_messaging.rpc.dispatcher File "/usr/lib/python2.7/site-packages/nova/utils.py", line 816, in instance_meta
2015-07-31 16:55:05.669 5950 TRACE oslo_messaging.rpc.dispatcher if isinstance(instance['metadata'], dict):
2015-07-31 16:55:05.669 5950 TRACE oslo_messaging.rpc.dispatcher KeyError: 'metadata'
2015-07-31 16:55:05.669 5950 TRACE oslo_messaging.rpc.dispatcher
Thank you warm-hearted!
I also faced the same error.
Try adding the rules to the security group first and then creating the instance.
It worked for me.
This is a workaround only; since I don't have enough privileges to comment, I am adding this as an answer.
If I find a proper solution I will post it here.
Although it's been 4 years and probably nobody cares about the root cause anymore, I'll put it here anyway, since I did some (re)search.
The object_compat decorator expects to get the Instance object with 'metadata' and 'system_metadata' attributes, but if those aren't in the db instance dict object, Instance._from_db_object will fail with a KeyError.
In Kilo this happens per refresh_instance_security_rules because, in the compute API code, the instance passed to refresh_instance_security_rules comes from the call to get the security group(s), which joins on the instances column, but that doesn't join on the metadata/system_metadata fields for the instances. So when the instances get to object_compat in the compute manager and the db instance dict is converted to the Instance object, it expects fields that aren't in the dict and we get the KeyError.
Refer to the following code commit for details.
https://opendev.org/openstack/nova/commit/9369aab04e37b7818d49b00e65857be8b3564e9e
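A stripped-down illustration of that mechanism (not nova's code; the dict below stands in for the instance row that comes back without the metadata join):
def instance_meta(instance):
    # mirrors the shape of nova.utils.instance_meta, which indexes 'metadata'
    if isinstance(instance['metadata'], dict):
        return instance['metadata']
    return dict(instance['metadata'])

db_inst = {'uuid': 'fake-uuid', 'vm_state': 'active'}  # no 'metadata' joined
instance_meta(db_inst)  # raises KeyError: 'metadata'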
