openstack masakari, AttributeError: 'Connection' object has no attribute 'instance_ha' - openstack

I want to set up instance HA for my OpenStack instances with the Masakari project. I installed masakari-api and masakari-engine on the controller nodes, and masakari-processmonitor, masakari-instancemonitor and masakari-hostmonitor on the compute nodes. But when I stop the openstack-nova-compute service to test instance HA, I get this error:
2019-02-02T10:10:03.782541+03:30 c2 masakari-processmonitor: Command: systemctl restart openstack-nova-compute.service
2019-02-02T10:10:03.782972+03:30 c2 masakari-processmonitor: Exit code: 1
2019-02-02T10:10:03.783363+03:30 c2 masakari-processmonitor: Stdout: u''
2019-02-02T10:10:03.783785+03:30 c2 masakari-processmonitor: Stderr: u'Job for openstack-nova-compute.service failed because the control process exited with error code. See "systemctl status openstack-nova-compute.service" and "journalctl -xe" for details.\n': ProcessExecutionError: Unexpected error while running command.
2019-02-02T10:10:08.776654+03:30 c2 masakari-processmonitor: 2019-02-02 10:10:08.775 115036 INFO masakarimonitors.ha.masakari [-] **Send a notification**. {'notification': {'hostname': '<my_domain_name>', 'type': 'PROCESS', 'payload': {'process_name': '/usr/bin/nova-compute', 'event': 'STOPPED'}, 'generated_time': datetime.datetime(2019, 2, 2, 6, 40, 8, 774997)}}
2019-02-02T10:10:08.780948+03:30 c2 masakari-processmonitor: 2019-02-02 10:10:08.778 115036 ERROR masakarimonitors.processmonitor.process [-] Exception caught: 'Connection' object has no attribute 'instance_ha': AttributeError: 'Connection' object has no attribute 'instance_ha'
2019-02-02T10:10:08.781513+03:30 c2 masakari-processmonitor: 2019-02-02 10:10:08.778 115036 ERROR masakarimonitors.processmonitor.process Traceback (most recent call last):
2019-02-02T10:10:08.782106+03:30 c2 masakari-processmonitor: 2019-02-02 10:10:08.778 115036 ERROR masakarimonitors.processmonitor.process File "/usr/lib/python2.7/site-packages/masakarimonitors/processmonitor/process.py", line 75, in main
2019-02-02T10:10:08.782882+03:30 c2 masakari-processmonitor: 2019-02-02 10:10:08.778 115036 ERROR masakarimonitors.processmonitor.process self.process_handler.restart_processes(down_process_list)
2019-02-02T10:10:08.783546+03:30 c2 masakari-processmonitor: 2019-02-02 10:10:08.778 115036 ERROR masakarimonitors.processmonitor.process File "/usr/lib/python2.7/site-packages/masakarimonitors/processmonitor/process_handler/handle_process.py", line 203, in restart_processes
2019-02-02T10:10:08.784149+03:30 c2 masakari-processmonitor: 2019-02-02 10:10:08.778 115036 ERROR masakarimonitors.processmonitor.process event)
2019-02-02T10:10:08.784772+03:30 c2 masakari-processmonitor: 2019-02-02 10:10:08.778 115036 ERROR masakarimonitors.processmonitor.process File "/usr/lib/python2.7/site-packages/masakarimonitors/ha/masakari.py", line 60, in send_notification
2019-02-02T10:10:08.785349+03:30 c2 masakari-processmonitor: 2019-02-02 10:10:08.778 115036 ERROR masakarimonitors.processmonitor.process client = self._make_client()
2019-02-02T10:10:08.785902+03:30 c2 masakari-processmonitor: 2019-02-02 10:10:08.778 115036 ERROR masakarimonitors.processmonitor.process File "/usr/lib/python2.7/site-packages/masakarimonitors/ha/masakari.py", line 43, in _make_client
2019-02-02T10:10:08.786500+03:30 c2 masakari-processmonitor: 2019-02-02 10:10:08.778 115036 ERROR masakarimonitors.processmonitor.process return conn.instance_ha
2019-02-02T10:10:08.786998+03:30 c2 masakari-processmonitor: 2019-02-02 10:10:08.778 115036 ERROR masakarimonitors.processmonitor.process AttributeError: 'Connection' object has no attribute 'instance_ha'
2019-02-02T10:10:08.787561+03:30 c2 masakari-processmonitor: 2019-02-02 10:10:08.778 115036 ERROR masakarimonitors.processmonitor.process
I checked my connections to the controller nodes and the compute node; all of them are OK.
I checked the RabbitMQ queues and the masakari ha-engine queue is there.
I checked MySQL for the masakari database and I have that too.
mysql> SHOW TABLES FROM masakari;
failover_segments
hosts
migrate_version
notifications
Everything looks correct, so where is my problem? Am I missing something?

The problem seems to be
AttributeError: 'Connection' object has no attribute 'instance_ha'.
In stable/queens, this issue is fixed with this patch.
If you still see this issue in stable/queens or any other version,
I recommend filing a bug report against OpenStack masakari-monitors on Launchpad. Please include the OpenStack version details.
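As a quick check before filing a bug: the traceback above says openstacksdk's Connection has no instance_ha proxy, which on a monitor node usually points at the masakari SDK plugin (shipped with python-masakariclient) being missing or mismatched with the installed masakari-monitors. A minimal, hedged diagnostic sketch (the cloud name is a placeholder for whatever credentials the monitor uses):
# Hedged diagnostic: reproduce the check masakari-monitors performs when it
# builds its client. If this prints False, the masakari SDK plugin is not
# registered and the AttributeError above is expected.
import openstack

# 'mycloud' is a placeholder clouds.yaml entry; substitute the credentials
# configured for the monitors on the compute node.
conn = openstack.connect(cloud='mycloud')
print(hasattr(conn, 'instance_ha'))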

Related

Getting Timeout Issue in Airflow

I am getting an SSH operator "timed out" error in Airflow. Initially I was not getting this issue, but now I am getting it continually.
Code
from airflow.providers.ssh.operators.ssh import SSHOperator

# dbo_list_1, SECRET_NAME and dag are defined elsewhere in the DAG file.
xx = [
    SSHOperator(
        task_id=str(i),
        command="sh /home/ec2-user/rapid/dht/hi_policy/bin/filename.sh ",
        ssh_conn_id=SECRET_NAME,
        dag=dag,
        do_xcom_push=True,
    )
    for i in dbo_list_1
]
Error
[2022-08-22 10:16:03,760] {{taskinstance.py:1482}} ERROR - Task failed with exception
Traceback (most recent call last):
File "/usr/local/lib/python3.7/site-packages/airflow/providers/ssh/operators/ssh.py", line 109, in execute
with self.ssh_hook.get_conn() as ssh_client:
File "/usr/local/lib/python3.7/site-packages/airflow/providers/ssh/hooks/ssh.py", line 240, in get_conn
client.connect(**connect_kwargs)
File "/usr/local/lib/python3.7/site-packages/paramiko/client.py", line 349, in <lambda>
retry_on_signal(lambda: sock.connect(addr))
socket.timeout: timed out
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/usr/local/lib/python3.7/site-packages/airflow/models/taskinstance.py", line 1138, in _run_raw_task
self._prepare_and_execute_task_with_callbacks(context, task)
File "/usr/local/lib/python3.7/site-packages/airflow/models/taskinstance.py", line 1311, in _prepare_and_execute_task_with_callbacks
result = self._execute_task(context, task_copy)
raise AirflowException(f"SSH operator error: {str(e)}")
airflow.exceptions.AirflowException: SSH operator error: timed out
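Since the failure happens inside paramiko's socket connect, a quick way to tell whether this is the operator or plain network reachability is to try the same connection with paramiko directly from the Airflow worker. This is only a hedged sketch; the host, user and key path are placeholders for whatever the ssh_conn_id points at:
# Minimal connectivity check with paramiko (HOST, USER, KEY_PATH are placeholders).
# If this also times out from the worker, the problem is reachability
# (security groups / firewall / VPC routing), not the SSHOperator itself.
import paramiko

HOST = "your-target-host.example.com"
USER = "ec2-user"
KEY_PATH = "/path/to/key.pem"

client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect(HOST, port=22, username=USER, key_filename=KEY_PATH, timeout=10)
_, stdout, _ = client.exec_command("echo ok")
print(stdout.read().decode())
client.close()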

I imported another WordPress file into my WordPress site. When I import the file it shows: Fatal error: Uncaught WC_Data_Exception

Fatal error: Uncaught WC_Data_Exception: Invalid billing email address in D:\XAMPP\htdocs\wordpress\wp-content\plugins\woocommerce\includes\abstracts\abstract-wc-data.php:810 Stack trace: #0 D:\XAMPP\htdocs\wordpress\wp-content\plugins\woocommerce\includes\class-wc-customer.php(971): WC_Data->error('customer_invali...', 'Invalid billing...') #1 D:\XAMPP\htdocs\wordpress\wp-content\plugins\woocommerce\includes\data-stores\class-wc-customer-data-store-session.php(113): WC_Customer->set_billing_email('admin1#localhos...') #2 D:\XAMPP\htdocs\wordpress\wp-content\plugins\woocommerce\includes\class-wc-data-store.php(159): WC_Customer_Data_Store_Session->read(Object(WC_Customer)) #3 D:\XAMPP\htdocs\wordpress\wp-content\plugins\woocommerce\includes\class-wc-customer.php(116): WC_Data_Store->read(Object(WC_Customer)) #4 D:\XAMPP\htdocs\wordpress\wp-content\plugins\woocommerce\includes\class-woocommerce.php(757): WC_Customer->__construct(6, true) #5 D:\XAMPP\htdocs\wordpress\wp-content\plugins\woocommerce\includes\wc-core-functions in D:\XAMPP\htdocs\wordpress\wp-content\plugins\woocommerce\includes\abstracts\abstract-wc-data.php on line 810

Forbidden: An error occurred (403) when calling the HeadObject operation:

My ~/.aws/credentials looks like this:
[default]
aws_access_key_id = XYZ
aws_secret_access_key = ABC
[testing]
source_profile = default
role_arn = arn:aws:iam::54:role/ad
I add my remote like this:
dvc remote add --local -v myremote s3://bib-ds-models-testing/data/dvc-test
I have made my .dvc/config.local look like this:
['remote "myremote"']
url = s3://bib-ds-models-testing/data/dvc-test
access_key_id = XYZ
secret_access_key = ABC/h2hOsRcCIFqwYWV7eZaUq3gNmS
profile='testing'
credentialpath = /Users/nyt21/.aws/credentials
but still after running dvc push -r myremote I get
ERROR: unexpected error - Forbidden: An error occurred (403) when calling the HeadObject operation: Forbidden
Update
Here is the output of dvc push -v:
2021-07-25 22:40:38,887 DEBUG: Check for update is enabled.
2021-07-25 22:40:39,022 DEBUG: Preparing to upload data to 's3://bib-ds-models-testing/data/dvc-test'
2021-07-25 22:40:39,022 DEBUG: Preparing to collect status from s3://bib-ds-models-testing/data/dvc-test
2021-07-25 22:40:39,022 DEBUG: Collecting information from local cache...
2021-07-25 22:40:39,022 DEBUG: Collecting information from remote cache...
2021-07-25 22:40:39,022 DEBUG: Matched '0' indexed hashes
2021-07-25 22:40:39,022 DEBUG: Querying 1 hashes via object_exists
2021-07-25 22:40:39,644 ERROR: unexpected error - Forbidden: An error occurred (403) when calling the HeadObject operation: Forbidden
------------------------------------------------------------
Traceback (most recent call last):
File "/Users/nyt21/opt/miniconda3/envs/dvc/lib/python3.8/site-packages/s3fs/core.py", line 246, in _call_s3
out = await method(**additional_kwargs)
File "/Users/nyt21/opt/miniconda3/envs/dvc/lib/python3.8/site-packages/aiobotocore/client.py", line 154, in _make_api_call
raise error_class(parsed_response, operation_name)
botocore.exceptions.ClientError: An error occurred (AccessDenied) when calling the ListObjectsV2 operation: Access Denied
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/Users/nyt21/opt/miniconda3/envs/dvc/lib/python3.8/site-packages/s3fs/core.py", line 1057, in _info
out = await self._simple_info(path)
File "/Users/nyt21/opt/miniconda3/envs/dvc/lib/python3.8/site-packages/s3fs/core.py", line 970, in _simple_info
out = await self._call_s3(
File "/Users/nyt21/opt/miniconda3/envs/dvc/lib/python3.8/site-packages/s3fs/core.py", line 265, in _call_s3
raise translate_boto_error(err)
PermissionError: Access Denied
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/Users/nyt21/opt/miniconda3/envs/dvc/lib/python3.8/site-packages/s3fs/core.py", line 246, in _call_s3
out = await method(**additional_kwargs)
File "/Users/nyt21/opt/miniconda3/envs/dvc/lib/python3.8/site-packages/aiobotocore/client.py", line 154, in _make_api_call
raise error_class(parsed_response, operation_name)
botocore.exceptions.ClientError: An error occurred (403) when calling the HeadObject operation: Forbidden
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/Users/nyt21/opt/miniconda3/envs/dvc/lib/python3.8/site-packages/dvc/main.py", line 55, in main
ret = cmd.do_run()
File "/Users/nyt21/opt/miniconda3/envs/dvc/lib/python3.8/site-packages/dvc/command/base.py", line 50, in do_run
return self.run()
File "/Users/nyt21/opt/miniconda3/envs/dvc/lib/python3.8/site-packages/dvc/command/data_sync.py", line 57, in run
processed_files_count = self.repo.push(
File "/Users/nyt21/opt/miniconda3/envs/dvc/lib/python3.8/site-packages/dvc/repo/__init__.py", line 51, in wrapper
return f(repo, *args, **kwargs)
File "/Users/nyt21/opt/miniconda3/envs/dvc/lib/python3.8/site-packages/dvc/repo/push.py", line 44, in push
pushed += self.cloud.push(objs, jobs, remote=remote)
File "/Users/nyt21/opt/miniconda3/envs/dvc/lib/python3.8/site-packages/dvc/data_cloud.py", line 79, in push
return remote_obj.push(
File "/Users/nyt21/opt/miniconda3/envs/dvc/lib/python3.8/site-packages/dvc/remote/base.py", line 57, in wrapper
return f(obj, *args, **kwargs)
File "/Users/nyt21/opt/miniconda3/envs/dvc/lib/python3.8/site-packages/dvc/remote/base.py", line 494, in push
ret = self._process(
File "/Users/nyt21/opt/miniconda3/envs/dvc/lib/python3.8/site-packages/dvc/remote/base.py", line 351, in _process
dir_status, file_status, dir_contents = self._status(
File "/Users/nyt21/opt/miniconda3/envs/dvc/lib/python3.8/site-packages/dvc/remote/base.py", line 195, in _status
self.hashes_exist(
File "/Users/nyt21/opt/miniconda3/envs/dvc/lib/python3.8/site-packages/dvc/remote/base.py", line 145, in hashes_exist
return indexed_hashes + self.odb.hashes_exist(list(hashes), **kwargs)
File "/Users/nyt21/opt/miniconda3/envs/dvc/lib/python3.8/site-packages/dvc/objects/db/base.py", line 438, in hashes_exist
remote_hashes = self.list_hashes_exists(hashes, jobs, name)
File "/Users/nyt21/opt/miniconda3/envs/dvc/lib/python3.8/site-packages/dvc/objects/db/base.py", line 389, in list_hashes_exists
ret = list(itertools.compress(hashes, in_remote))
File "/Users/nyt21/opt/miniconda3/envs/dvc/lib/python3.8/concurrent/futures/_base.py", line 619, in result_iterator
yield fs.pop().result()
File "/Users/nyt21/opt/miniconda3/envs/dvc/lib/python3.8/concurrent/futures/_base.py", line 444, in result
return self.__get_result()
File "/Users/nyt21/opt/miniconda3/envs/dvc/lib/python3.8/concurrent/futures/_base.py", line 389, in __get_result
raise self._exception
File "/Users/nyt21/opt/miniconda3/envs/dvc/lib/python3.8/concurrent/futures/thread.py", line 57, in run
result = self.fn(*self.args, **self.kwargs)
File "/Users/nyt21/opt/miniconda3/envs/dvc/lib/python3.8/site-packages/dvc/objects/db/base.py", line 380, in exists_with_progress
ret = self.fs.exists(path_info)
File "/Users/nyt21/opt/miniconda3/envs/dvc/lib/python3.8/site-packages/dvc/fs/fsspec_wrapper.py", line 92, in exists
return self.fs.exists(self._with_bucket(path_info))
File "/Users/nyt21/opt/miniconda3/envs/dvc/lib/python3.8/site-packages/fsspec/asyn.py", line 87, in wrapper
return sync(self.loop, func, *args, **kwargs)
File "/Users/nyt21/opt/miniconda3/envs/dvc/lib/python3.8/site-packages/fsspec/asyn.py", line 68, in sync
raise result[0]
File "/Users/nyt21/opt/miniconda3/envs/dvc/lib/python3.8/site-packages/fsspec/asyn.py", line 24, in _runner
result[0] = await coro
File "/Users/nyt21/opt/miniconda3/envs/dvc/lib/python3.8/site-packages/s3fs/core.py", line 802, in _exists
await self._info(path, bucket, key, version_id=version_id)
File "/Users/nyt21/opt/miniconda3/envs/dvc/lib/python3.8/site-packages/s3fs/core.py", line 1061, in _info
out = await self._version_aware_info(path, version_id)
File "/Users/nyt21/opt/miniconda3/envs/dvc/lib/python3.8/site-packages/s3fs/core.py", line 1004, in _version_aware_info
out = await self._call_s3(
File "/Users/nyt21/opt/miniconda3/envs/dvc/lib/python3.8/site-packages/s3fs/core.py", line 265, in _call_s3
raise translate_boto_error(err)
PermissionError: Forbidden
------------------------------------------------------------
2021-07-25 22:40:39,712 DEBUG: Version info for developers:
DVC version: 2.5.4 (pip)
---------------------------------
Platform: Python 3.8.10 on macOS-10.16-x86_64-i386-64bit
Supports:
http (requests = 2.26.0),
https (requests = 2.26.0),
s3 (s3fs = 2021.6.1, boto3 = 1.18.6)
Cache types: reflink, hardlink, symlink
Cache directory: apfs on /dev/disk3s1s1
Caches: local
Remotes: s3
Workspace directory: apfs on /dev/disk3s1s1
Repo: dvc, git
Having any troubles? Hit us up at https://dvc.org/support, we are always happy to help!
2021-07-25 22:40:39,713 DEBUG: Analytics is enabled.
2021-07-25 22:40:39,765 DEBUG: Trying to spawn '['daemon', '-q', 'analytics', '/var/folders/4x/xhm22wt16gl6m9nvkl9gllkc0000gn/T/tmpo86jdns5']'
2021-07-25 22:40:39,769 DEBUG: Spawned '['daemon', '-q', 'analytics', '/var/folders/4x/xhm22wt16gl6m9nvkl9gllkc0000gn/T/tmpo86jdns5']'
I can upload through Python:
import boto3
import os
import pickle

bucket_name = 'bib-ds-models-testing'
os.environ["AWS_PROFILE"] = "testing"
session = boto3.Session()
s3_client = boto3.client('s3')
s3_client.upload_file('/Users/nyt21/Devel/DVC/test/data/iris.csv',
                      'bib-ds-models-testing',
                      'data/dvc-test/my_iris.csv')
I don't use the AWS CLI, but the following also gives an access denied error:
aws s3 ls s3://bib-ds-models-testing/data/dvc-test
An error occurred (AccessDenied) when calling the ListObjectsV2
operation: Access Denied
but it works if I add --profile=testing
aws s3 ls s3://bib-ds-models-testing/data/dvc-test --profile=testing
PRE dvc-test/
Just so you know, the environment variable AWS_PROFILE is already set to 'testing'.
UPDATE
I have tried both AWS_PROFILE='testing' and AWS_PROFILE=testing, neither of them worked.
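One way to narrow this down is to reproduce, outside DVC, the HeadObject call that s3fs is failing on, once with the default credential chain and once with the named profile. This is only a hedged diagnostic sketch; the bucket comes from the question and the object key is just an example:
# Hedged diagnostic: compare the default credential chain with the 'testing'
# profile for the same kind of HeadObject call DVC/s3fs is making.
import boto3
from botocore.exceptions import ClientError

BUCKET = 'bib-ds-models-testing'
KEY = 'data/dvc-test/my_iris.csv'   # example object; any key under the prefix works

def can_head(session):
    s3 = session.client('s3')
    try:
        s3.head_object(Bucket=BUCKET, Key=KEY)
        return 'OK'
    except ClientError as exc:
        return exc.response['Error']['Code']

print('default chain :', can_head(boto3.Session()))
print('testing profile:', can_head(boto3.Session(profile_name='testing')))
If only the named profile succeeds, the DVC remote has to resolve that profile; explicit access_key_id/secret_access_key entries in .dvc/config.local likely take precedence over it, so removing them and keeping only the profile setting may be worth trying.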

Failing to attach volume to openstack instance

I am attempting to attach a volume to an instance. My attempts consistently fail.
I am using the following command to create the attachment:
openstack server add volume test_instance testvol
where test_instance is my instance and testvol is the 20G volume that I created with Cinder.
I get no error messages when I attempt this attachment, but when I check my volume I do not see the attachment:
cinder list
+--------------------------------------+-----------+-------------+------+-------------+----------+-------------+
| ID | Status | Name | Size | Volume Type | Bootable | Attached to |
+--------------------------------------+-----------+-------------+------+-------------+----------+-------------+
| 3dce2112-eddc-4d39-b47d-8717b16dd5e9 | available | testvol | 20 | __DEFAULT__ | false | |
| a9aa8845-dd3a-4f47-8b5e-445c6170abbe | available | volume1 | 20 | iscsi | false | |
+--------------------------------------+-----------+-------------+------+-------------+----------+-------------+
I see no attachment. The above is what I get whether I attempt to attach testvol or volume1.
Looking at the nova-compute log, I am getting the following messages:
2020-09-06 12:41:07.172 2734 INFO nova.compute.manager [req-f84fbcef-b5fb-4f43-87ed-25d213b371c2 ce97118ec1f348a09d196c6da826749f 65d7e58a440440c3894c0fb7bf958052 - default default] [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] Attaching volume 3dce2112-eddc-4d39-b47d-8717b16dd5e9 to /dev/vda
2020-09-06 12:41:11.114 2734 INFO os_brick.initiator.connectors.iscsi [req-f84fbcef-b5fb-4f43-87ed-25d213b371c2 ce97118ec1f348a09d196c6da826749f 65d7e58a440440c3894c0fb7bf958052 - default default] Trying to connect to iSCSI portal 192.168.1.105:3260
2020-09-06 12:41:11.141 2734 WARNING os_brick.initiator.connectors.iscsi [req-f84fbcef-b5fb-4f43-87ed-25d213b371c2 ce97118ec1f348a09d196c6da826749f 65d7e58a440440c3894c0fb7bf958052 - default default] Couldn't find iSCSI nodes because iscsiadm err: iscsiadm: No records found: oslo_concurrency.processutils.ProcessExecutionError: Unexpected error while running command.
2020-09-06 12:41:11.150 2734 WARNING os_brick.initiator.connectors.iscsi [req-f84fbcef-b5fb-4f43-87ed-25d213b371c2 ce97118ec1f348a09d196c6da826749f 65d7e58a440440c3894c0fb7bf958052 - default default] iscsiadm stderr output when getting sessions: iscsiadm: No active sessions.: oslo_concurrency.processutils.ProcessExecutionError: Unexpected error while running command.
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [req-f84fbcef-b5fb-4f43-87ed-25d213b371c2 ce97118ec1f348a09d196c6da826749f 65d7e58a440440c3894c0fb7bf958052 - default default] [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] Driver failed to attach volume 3dce2112-eddc-4d39-b47d-8717b16dd5e9 at /dev/vda: oslo_concurrency.processutils.ProcessExecutionError: Unexpected error while running command.
Command: iscsiadm -m node -T iqn.2010-10.org.openstack:volume-3dce2112-eddc-4d39-b47d-8717b16dd5e9 -p 192.168.1.105:3260 --interface default --op new
Exit code: 6
Stdout: ''
Stderr: 'iscsiadm: Could not open /var/lib/iscsi/nodes/iqn.2010-10.org.openstack:volume-3dce2112-eddc-4d39-b47d-8717b16dd5e9/192.168.1.105,3260: Permission deniedd\niscsiadm: Error while adding record: encountered iSCSI database failure\n'
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] Traceback (most recent call last):
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] File "/usr/lib/python3.6/site-packages/nova/virt/block_device.py", line 598, in _volume_attach
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] device_type=self['device_type'], encryption=encryption)
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] File "/usr/lib/python3.6/site-packages/nova/virt/libvirt/driver.py", line 1827, in attach_volume
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] encryption=encryption)
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] File "/usr/lib/python3.6/site-packages/nova/virt/libvirt/driver.py", line 1607, in _connect_volume
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] vol_driver.connect_volume(connection_info, instance)
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] File "/usr/lib/python3.6/site-packages/nova/virt/libvirt/volume/iscsi.py", line 64, in connect_volume
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] device_info = self.connector.connect_volume(connection_info['data'])
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] File "/usr/lib/python3.6/site-packages/os_brick/utils.py", line 137, in trace_logging_wrapper
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] return f(*args, **kwargs)
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] File "/usr/lib/python3.6/site-packages/oslo_concurrency/lockutils.py", line 359, in inner
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] return f(*args, **kwargs)
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] File "/usr/lib/python3.6/site-packages/os_brick/initiator/connectors/iscsi.py", line 519, in connect_volume
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] self._cleanup_connection(connection_properties, force=True)
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] File "/usr/lib/python3.6/site-packages/oslo_utils/excutils.py", line 220, in __exit__
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] self.force_reraise()
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] File "/usr/lib/python3.6/site-packages/oslo_utils/excutils.py", line 196, in force_reraise
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] six.reraise(self.type_, self.value, self.tb)
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] File "/usr/lib/python3.6/site-packages/six.py", line 703, in reraise
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] raise value
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] File "/usr/lib/python3.6/site-packages/os_brick/initiator/connectors/iscsi.py", line 513, in connect_volume
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] return self._connect_single_volume(connection_properties)
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] File "/usr/lib/python3.6/site-packages/os_brick/utils.py", line 61, in _wrapper
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] return r.call(f, *args, **kwargs)
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] File "/usr/lib/python3.6/site-packages/retrying.py", line 223, in call
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] return attempt.get(self._wrap_exception)
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] File "/usr/lib/python3.6/site-packages/retrying.py", line 261, in get
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] six.reraise(self.value[0], self.value[1], self.value[2])
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] File "/usr/lib/python3.6/site-packages/six.py", line 703, in reraise
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] raise value
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] File "/usr/lib/python3.6/site-packages/retrying.py", line 217, in call
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] attempt = Attempt(fn(*args, **kwargs), attempt_number, False)
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] File "/usr/lib/python3.6/site-packages/os_brick/initiator/connectors/iscsi.py", line 569, in _connect_single_volume
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] self._connect_vol(self.device_scan_attempts, props, data)
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] File "/usr/lib/python3.6/site-packages/os_brick/initiator/connectors/iscsi.py", line 626, in _connect_vol
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] session, manual_scan = self._connect_to_iscsi_portal(props)
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] File "/usr/lib/python3.6/site-packages/os_brick/initiator/connectors/iscsi.py", line 1040, in _connect_to_iscsi_portal
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] '--op', 'new'))
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] File "/usr/lib/python3.6/site-packages/os_brick/initiator/connectors/iscsi.py", line 993, in _run_iscsiadm
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] delay_on_retry=delay_on_retry)
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] File "/usr/lib/python3.6/site-packages/os_brick/executor.py", line 52, in _execute
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] result = self.__execute(*args, **kwargs)
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] File "/usr/lib/python3.6/site-packages/os_brick/privileged/rootwrap.py", line 169, in execute
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] return execute_root(*cmd, **kwargs)
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] File "/usr/lib/python3.6/site-packages/oslo_privsep/priv_context.py", line 247, in _wrap
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] return self.channel.remote_call(name, args, kwargs)
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] File "/usr/lib/python3.6/site-packages/oslo_privsep/daemon.py", line 204, in remote_call
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] raise exc_type(*result[2])
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] oslo_concurrency.processutils.ProcessExecutionError: Unexpected error while running command.
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] Command: iscsiadm -m node -T iqn.2010-10.org.openstack:volume-3dce2112-eddc-4d39-b47d-8717b16dd5e9 -p 192.168.1.105:3260 --interface default --op new
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] Exit code: 6
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] Stdout: ''
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] Stderr: 'iscsiadm: Could not open /var/lib/iscsi/nodes/iqn.2010-10.org.openstack:volume-3dce2112-eddc-4d39-b47d-8717b16dd5e9/192.168.1.105,3260: Permission deniedd\niscsiadm: Error while adding record: encountered iSCSI database failure\n'
Does anyone know what is causing this failure? How can I get a volume attached to an instance?

"Unauthorized: The request you have made requires authentication. (HTTP 401)" on Fiware Horizon

I am new to FIWARE. On my Ubuntu 16.04 machine, I have set up FIWARE's KeyRock, which is a combination of the OpenStack Keystone Identity Management service and the Horizon Dashboard. I have used this guide to set up everything.
After installing Keystone (which exposes a RESTful API), I used curl to send HTTP requests and everything works fine, i.e. I can run those commands against the Keystone server using curl.
However, when I run Horizon on the same server using the reference Django project located in the openstack_dashboard directory with:
sudo tools/with_venv.sh python manage.py runserver 0.0.0.0:8000
the Horizon server starts with no errors, but when I access it via a browser I get the following error:
A server error occurred. Please contact the administrator
On the Keystone server console, I get the following error:
2017-01-09 11:53:40.962 13285 ERROR keystone.notifications [-] Failed to construct notifier
2017-01-09 11:53:40.962 13285 TRACE keystone.notifications Traceback (most recent call last):
2017-01-09 11:53:40.962 13285 TRACE keystone.notifications File "/keystone/keystone/notifications.py", line 220, in _get_notifier
2017-01-09 11:53:40.962 13285 TRACE keystone.notifications transport = messaging.get_transport(CONF)
2017-01-09 11:53:40.962 13285 TRACE keystone.notifications File "/keystone/.venv/local/lib/python2.7/site-packages/oslo/messaging/transport.py", line 185, in get_transport
2017-01-09 11:53:40.962 13285 TRACE keystone.notifications invoke_kwds=kwargs)
2017-01-09 11:53:40.962 13285 TRACE keystone.notifications File "/keystone/.venv/local/lib/python2.7/site-packages/stevedore/driver.py", line 45, in __init__
2017-01-09 11:53:40.962 13285 TRACE keystone.notifications verify_requirements=verify_requirements,
2017-01-09 11:53:40.962 13285 TRACE keystone.notifications File "/keystone/.venv/local/lib/python2.7/site-packages/stevedore/named.py", line 55, in __init__
2017-01-09 11:53:40.962 13285 TRACE keystone.notifications verify_requirements)
2017-01-09 11:53:40.962 13285 TRACE keystone.notifications File "/keystone/.venv/local/lib/python2.7/site-packages/stevedore/extension.py", line 170, in _load_plugins
2017-01-09 11:53:40.962 13285 TRACE keystone.notifications self._on_load_failure_callback(self, ep, err)
2017-01-09 11:53:40.962 13285 TRACE keystone.notifications File "/keystone/.venv/local/lib/python2.7/site-packages/stevedore/driver.py", line 50, in _default_on_load_failure
2017-01-09 11:53:40.962 13285 TRACE keystone.notifications raise err
2017-01-09 11:53:40.962 13285 TRACE keystone.notifications ImportError: cannot import name _uuid_generate_random
2017-01-09 11:53:40.962 13285 TRACE keystone.notifications
2017-01-09 11:53:40.964 13285 WARNING keystone.common.wsgi [-] Authorization failed. The request you have made requires authentication. from 127.0.0.1
2017-01-09 11:53:40.970 13285 INFO eventlet.wsgi.server [-] 127.0.0.1 - - [09/Jan/2017 11:53:40] "POST /v3/auth/tokens HTTP/1.1" 401 331 0.261293
On the Horizon server console, I get the following error:
DEBUG:idm_logger:Creating a new internal keystoneclient connection to http://0.0.0.0:5000/v3.
Unauthorized: The request you have made requires authentication. (HTTP 401)
Traceback (most recent call last):
File "/horizon/openstack_dashboard/fiware_api/keystone.py", line 990, in _get_element_and_cache
resource_element = function(request, element)
File "/horizon/openstack_dashboard/fiware_api/keystone.py", line 1022, in <lambda>
request, basic, lambda req, n: internal_keystoneclient(req).roles.find(name=n), pickle_props=['name'])
File "/horizon/openstack_dashboard/fiware_api/keystone.py", line 64, in internal_keystoneclient
cache.set(CACHE_CLIENT, keystoneclient.session.get_token(), INTERNAL_CLIENT_CACHE_TIME)
File "/horizon/.venv/src/python-keystoneclient/keystoneclient/session.py", line 610, in get_token
return (self.get_auth_headers(auth) or {}).get('X-Auth-Token')
File "/horizon/.venv/src/python-keystoneclient/keystoneclient/session.py", line 589, in get_auth_headers
return auth.get_headers(self, **kwargs)
File "/horizon/.venv/src/python-keystoneclient/keystoneclient/auth/base.py", line 114, in get_headers
token = self.get_token(session)
File "/horizon/.venv/src/python-keystoneclient/keystoneclient/auth/identity/base.py", line 104, in get_token
return self.get_access(session).auth_token
File "/horizon/.venv/src/python-keystoneclient/keystoneclient/auth/identity/base.py", line 144, in get_access
self.auth_ref = self.get_auth_ref(session)
File "/horizon/.venv/src/python-keystoneclient/keystoneclient/auth/identity/v3.py", line 127, in get_auth_ref
authenticated=False, log=False, **rkwargs)
File "/horizon/.venv/src/python-keystoneclient/keystoneclient/session.py", line 488, in post
return self.request(url, 'POST', **kwargs)
File "/horizon/.venv/src/python-keystoneclient/keystoneclient/utils.py", line 318, in inner
return func(*args, **kwargs)
File "/horizon/.venv/src/python-keystoneclient/keystoneclient/session.py", line 389, in request
raise exceptions.from_response(resp, method, url)
Unauthorized: The request you have made requires authentication. (HTTP 401)
Traceback (most recent call last):
File "/usr/lib/python2.7/wsgiref/handlers.py", line 85, in run
self.result = application(self.environ, self.start_response)
File "/horizon/.venv/local/lib/python2.7/site-packages/django/contrib/staticfiles/handlers.py", line 64, in __call__
return self.application(environ, start_response)
File "/horizon/.venv/local/lib/python2.7/site-packages/django/core/handlers/wsgi.py", line 168, in __call__
self.load_middleware()
File "/horizon/.venv/local/lib/python2.7/site-packages/django/core/handlers/base.py", line 46, in load_middleware
mw_instance = mw_class()
File "/horizon/.venv/local/lib/python2.7/site-packages/django/middleware/locale.py", line 23, in __init__
for url_pattern in get_resolver(None).url_patterns:
File "/horizon/.venv/local/lib/python2.7/site-packages/django/core/urlresolvers.py", line 367, in url_patterns
patterns = getattr(self.urlconf_module, "urlpatterns", self.urlconf_module)
File "/horizon/.venv/local/lib/python2.7/site-packages/django/core/urlresolvers.py", line 361, in urlconf_module
self._urlconf_module = import_module(self.urlconf_name)
File "/usr/lib/python2.7/importlib/__init__.py", line 37, in import_module
__import__(name)
File "/horizon/openstack_dashboard/urls.py", line 36, in <module>
from openstack_dashboard.dashboards.idm_admin.user_accounts \
File "/horizon/openstack_dashboard/dashboards/idm_admin/user_accounts/views.py", line 28, in <module>
from openstack_dashboard.dashboards.idm_admin.user_accounts \
File "/horizon/openstack_dashboard/dashboards/idm_admin/user_accounts/forms.py", line 195, in <module>
class UpdateAccountForm(forms.SelfHandlingForm, UserAccountsLogicMixin):
File "/horizon/openstack_dashboard/dashboards/idm_admin/user_accounts/forms.py", line 202, in UpdateAccountForm
choices=get_account_choices())
File "/horizon/openstack_dashboard/dashboards/idm_admin/user_accounts/forms.py", line 172, in get_account_choices
use_idm_account=True),
File "/horizon/openstack_dashboard/fiware_api/keystone.py", line 1022, in get_basic_role
request, basic, lambda req, n: internal_keystoneclient(req).roles.find(name=n), pickle_props=['name'])
File "/horizon/openstack_dashboard/fiware_api/keystone.py", line 997, in _get_element_and_cache
exceptions.handle(request)
File "/horizon/horizon/exceptions.py", line 291, in handle
messages.error(request, message or fallback)
File "/horizon/horizon/messages.py", line 83, in error
fail_silently=fail_silently)
File "/horizon/horizon/messages.py", line 41, in add_message
if not horizon_message_already_queued(request, message):
File "/horizon/horizon/messages.py", line 28, in horizon_message_already_queued
if request.is_ajax():
AttributeError: 'NoneType' object has no attribute 'is_ajax'
[09/Jan/2017 12:39:35] "GET / HTTP/1.1" 500 59
I am sending a GET request via the browser, but the Keystone server is receiving a POST request, as indicated in its console output. I don't understand why this is so.
It looks like the Horizon server is not able to verify the credentials.
In the case of FIWARE, you would get this error on the Horizon server if the file local_settings.py (located in the /horizon/openstack_dashboard/local/ directory) is not properly configured. In this file there is a dictionary named IDM_USER_CREDENTIALS where you have to set the password for the admin profile that was created when you populated the Keystone database. It is also described in the guide you linked. Try changing the password and see if it works.
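For reference, the entry to check looks roughly like the sketch below. The exact key names and values are assumptions here, not the canonical FIWARE defaults; compare against the local_settings.py.example shipped with your Horizon checkout and use the admin credentials you created when populating the Keystone database.
# Illustrative excerpt of openstack_dashboard/local/local_settings.py
# (key names and values are placeholders).
IDM_USER_CREDENTIALS = {
    'username': 'idm',        # the admin user created when Keystone was populated
    'password': 'idm',        # must match that user's real password
    'project': 'idm',
    'domain': 'default',
}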
