Installing OpenStack Ocata.
When I install and configure the compute node, I need to restart the nova-compute service, but this gives me the error below.
When I run the hypervisor list on the controller node, I get no output.
2017-09-02 23:11:52.030 8491 ERROR nova File "/usr/lib/python2.7/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 402, in _send
2017-09-02 23:11:52.030 8491 ERROR nova msg.update({'_reply_q': self._get_reply_q()})
2017-09-02 23:11:52.030 8491 ERROR nova File "/usr/lib/python2.7/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 383, in _get_reply_q
2017-09-02 23:11:52.030 8491 ERROR nova conn = self._get_connection(rpc_common.PURPOSE_LISTEN)
2017-09-02 23:11:52.030 8491 ERROR nova File "/usr/lib/python2.7/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 374, in _get_connection
2017-09-02 23:11:52.030 8491 ERROR nova purpose=purpose)
2017-09-02 23:11:52.030 8491 ERROR nova File "/usr/lib/python2.7/dist-packages/oslo_messaging/_drivers/common.py", line 402, in __init__
2017-09-02 23:11:52.030 8491 ERROR nova self.connection = connection_pool.create(purpose)
2017-09-02 23:11:52.030 8491 ERROR nova File "/usr/lib/python2.7/dist-packages/oslo_messaging/_drivers/pool.py", line 144, in create
2017-09-02 23:11:52.030 8491 ERROR nova return self.connection_cls(self.conf, self.url, purpose)
2017-09-02 23:11:52.030 8491 ERROR nova File "/usr/lib/python2.7/dist-packages/oslo_messaging/_drivers/impl_rabbit.py", line 611, in __init__
2017-09-02 23:11:52.030 8491 ERROR nova self.ensure_connection()
2017-09-02 23:11:52.030 8491 ERROR nova File "/usr/lib/python2.7/dist-packages/oslo_messaging/_drivers/impl_rabbit.py", line 710, in ensure_connection
2017-09-02 23:11:52.030 8491 ERROR nova self.ensure(method=lambda: self.connection.connection)
2017-09-02 23:11:52.030 8491 ERROR nova File "/usr/lib/python2.7/dist-packages/oslo_messaging/_drivers/impl_rabbit.py", line 817, in ensure
2017-09-02 23:11:52.030 8491 ERROR nova ret, channel = autoretry_method()
2017-09-02 23:11:52.030 8491 ERROR nova File "/usr/lib/python2.7/dist-packages/kombu/connection.py", line 436, in _ensured
2017-09-02 23:11:52.030 8491 ERROR nova return fun(*args, **kwargs)
2017-09-02 23:11:52.030 8491 ERROR nova File "/usr/lib/python2.7/dist-packages/kombu/connection.py", line 507, in __call__
2017-09-02 23:11:52.030 8491 ERROR nova self.revive(create_channel())
2017-09-02 23:11:52.030 8491 ERROR nova File "/usr/lib/python2.7/dist-packages/kombu/connection.py", line 242, in channel
2017-09-02 23:11:52.030 8491 ERROR nova chan = self.transport.create_channel(self.connection)
2017-09-02 23:11:52.030 8491 ERROR nova File "/usr/lib/python2.7/dist-packages/kombu/connection.py", line 741, in connection
2017-09-02 23:11:52.030 8491 ERROR nova self._connection = self._establish_connection()
2017-09-02 23:11:52.030 8491 ERROR nova File "/usr/lib/python2.7/dist-packages/kombu/connection.py", line 696, in _establish_connection
2017-09-02 23:11:52.030 8491 ERROR nova conn = self.transport.establish_connection()
2017-09-02 23:11:52.030 8491 ERROR nova File "/usr/lib/python2.7/dist-packages/kombu/transport/pyamqp.py", line 116, in establish_connection
2017-09-02 23:11:52.030 8491 ERROR nova conn = self.Connection(**opts)
2017-09-02 23:11:52.030 8491 ERROR nova File "/usr/lib/python2.7/dist-packages/amqp/connection.py", line 180, in __init__
2017-09-02 23:11:52.030 8491 ERROR nova (10, 30), # tune
2017-09-02 23:11:52.030 8491 ERROR nova File "/usr/lib/python2.7/dist-packages/amqp/abstract_channel.py", line 67, in wait
2017-09-02 23:11:52.030 8491 ERROR nova self.channel_id, allowed_methods, timeout)
2017-09-02 23:11:52.030 8491 ERROR nova File "/usr/lib/python2.7/dist-packages/amqp/connection.py", line 274, in _wait_method
2017-09-02 23:11:52.030 8491 ERROR nova wait()
2017-09-02 23:11:52.030 8491 ERROR nova File "/usr/lib/python2.7/dist-packages/amqp/abstract_channel.py", line 69, in wait
2017-09-02 23:11:52.030 8491 ERROR nova return self.dispatch_method(method_sig, args, content)
2017-09-02 23:11:52.030 8491 ERROR nova File "/usr/lib/python2.7/dist-packages/amqp/abstract_channel.py", line 87, in dispatch_method
2017-09-02 23:11:52.030 8491 ERROR nova return amqp_method(self, args)
2017-09-02 23:11:52.030 8491 ERROR nova File "/usr/lib/python2.7/dist-packages/amqp/connection.py", line 530, in _close
2017-09-02 23:11:52.030 8491 ERROR nova (class_id, method_id), ConnectionError)
2017-09-02 23:11:52.030 8491 ERROR nova AccessRefused: (0, 0): (403) ACCESS_REFUSED - Login was refused using authentication mechanism AMQPLAIN.
For details see the broker logfile.
What could be the solution to this? Thanks in advance.
Neha,
1. Check that the rabbitmq-server service is active and running.
2. Ensure the RabbitMQ credentials provided in the Nova config file are correct.
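For point 2, a quick sanity check is to parse the transport_url from nova.conf and compare it with the account that actually exists on the broker (on the controller, `rabbitmqctl list_users` shows the accounts). A minimal sketch, assuming the conventional `openstack` user and the `RABBIT_PASS` placeholder from the install guide:

```python
# Sketch: extract the broker credentials that nova-compute will use
# from the transport_url in nova.conf. The conf contents below are a
# placeholder example, not taken from the question.
import configparser
from urllib.parse import urlparse

nova_conf = """
[DEFAULT]
transport_url = rabbit://openstack:RABBIT_PASS@controller:5672/
"""

parser = configparser.ConfigParser()
parser.read_string(nova_conf)
url = urlparse(parser["DEFAULT"]["transport_url"])

print(url.username)  # user nova-compute logs in as
print(url.hostname)  # broker host it contacts
print(url.port)      # broker port
```

If the user, password, host, or port printed here does not match what was created with `rabbitmqctl add_user` / `set_permissions`, RabbitMQ will reject the login with exactly the `ACCESS_REFUSED` error shown above.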
Related
I am attempting to attach a volume to an instance. My attempts consistently fail.
I am using the following command to create the attachment:
openstack server add volume test_instance testvol
where test_instance is my instance and testvol is the 20G volume that I created with Cinder.
I get no error messages when I attempt the attachment, but when I check my volume, I do not see it attached:
cinder list
+--------------------------------------+-----------+-------------+------+-------------+----------+-------------+
| ID | Status | Name | Size | Volume Type | Bootable | Attached to |
+--------------------------------------+-----------+-------------+------+-------------+----------+-------------+
| 3dce2112-eddc-4d39-b47d-8717b16dd5e9 | available | testvol | 20 | __DEFAULT__ | false | |
| a9aa8845-dd3a-4f47-8b5e-445c6170abbe | available | volume1 | 20 | iscsi | false | |
+--------------------------------------+-----------+-------------+------+-------------+----------+-------------+
I see no attachment. The above is what I get whether I attempt to attach testvol or volume1.
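For anyone scripting this check, the empty "Attached to" column can be detected programmatically. A small sketch that parses the exact `cinder list` output above (in practice, `openstack volume show <id> -f json` gives machine-readable output directly):

```python
# Sketch: parse the ASCII table printed by `cinder list` and report
# volumes whose "Attached to" column is empty. Uses the output from
# the question verbatim.
table = """
+--------------------------------------+-----------+---------+------+-------------+----------+-------------+
| ID                                   | Status    | Name    | Size | Volume Type | Bootable | Attached to |
+--------------------------------------+-----------+---------+------+-------------+----------+-------------+
| 3dce2112-eddc-4d39-b47d-8717b16dd5e9 | available | testvol | 20   | __DEFAULT__ | false    |             |
| a9aa8845-dd3a-4f47-8b5e-445c6170abbe | available | volume1 | 20   | iscsi       | false    |             |
+--------------------------------------+-----------+---------+------+-------------+----------+-------------+
"""

# Keep only the header and data rows (borders start with "+").
rows = [line for line in table.splitlines() if line.startswith("|")]
header = [cell.strip() for cell in rows[0].strip("|").split("|")]

unattached = []
for row in rows[1:]:
    cells = [cell.strip() for cell in row.strip("|").split("|")]
    record = dict(zip(header, cells))
    if not record["Attached to"]:
        unattached.append(record["Name"])

print(unattached)  # -> ['testvol', 'volume1']
```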
Looking at the nova-compute log, I am getting the following messages:
2020-09-06 12:41:07.172 2734 INFO nova.compute.manager [req-f84fbcef-b5fb-4f43-87ed-25d213b371c2 ce97118ec1f348a09d196c6da826749f 65d7e58a440440c3894c0fb7bf958052 - default default] [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] Attaching volume 3dce2112-eddc-4d39-b47d-8717b16dd5e9 to /dev/vda
2020-09-06 12:41:11.114 2734 INFO os_brick.initiator.connectors.iscsi [req-f84fbcef-b5fb-4f43-87ed-25d213b371c2 ce97118ec1f348a09d196c6da826749f 65d7e58a440440c3894c0fb7bf958052 - default default] Trying to connect to iSCSI portal 192.168.1.105:3260
2020-09-06 12:41:11.141 2734 WARNING os_brick.initiator.connectors.iscsi [req-f84fbcef-b5fb-4f43-87ed-25d213b371c2 ce97118ec1f348a09d196c6da826749f 65d7e58a440440c3894c0fb7bf958052 - default default] Couldn't find iSCSI nodes because iscsiadm err: iscsiadm: No records found: oslo_concurrency.processutils.ProcessExecutionError: Unexpected error while running command.
2020-09-06 12:41:11.150 2734 WARNING os_brick.initiator.connectors.iscsi [req-f84fbcef-b5fb-4f43-87ed-25d213b371c2 ce97118ec1f348a09d196c6da826749f 65d7e58a440440c3894c0fb7bf958052 - default default] iscsiadm stderr output when getting sessions: iscsiadm: No active sessions.: oslo_concurrency.processutils.ProcessExecutionError: Unexpected error while running command.
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [req-f84fbcef-b5fb-4f43-87ed-25d213b371c2 ce97118ec1f348a09d196c6da826749f 65d7e58a440440c3894c0fb7bf958052 - default default] [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] Driver failed to attach volume 3dce2112-eddc-4d39-b47d-8717b16dd5e9 at /dev/vda: oslo_concurrency.processutils.ProcessExecutionError: Unexpected error while running command.
Command: iscsiadm -m node -T iqn.2010-10.org.openstack:volume-3dce2112-eddc-4d39-b47d-8717b16dd5e9 -p 192.168.1.105:3260 --interface default --op new
Exit code: 6
Stdout: ''
Stderr: 'iscsiadm: Could not open /var/lib/iscsi/nodes/iqn.2010-10.org.openstack:volume-3dce2112-eddc-4d39-b47d-8717b16dd5e9/192.168.1.105,3260: Permission denied\niscsiadm: Error while adding record: encountered iSCSI database failure\n'
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] Traceback (most recent call last):
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] File "/usr/lib/python3.6/site-packages/nova/virt/block_device.py", line 598, in _volume_attach
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] device_type=self['device_type'], encryption=encryption)
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] File "/usr/lib/python3.6/site-packages/nova/virt/libvirt/driver.py", line 1827, in attach_volume
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] encryption=encryption)
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] File "/usr/lib/python3.6/site-packages/nova/virt/libvirt/driver.py", line 1607, in _connect_volume
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] vol_driver.connect_volume(connection_info, instance)
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] File "/usr/lib/python3.6/site-packages/nova/virt/libvirt/volume/iscsi.py", line 64, in connect_volume
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] device_info = self.connector.connect_volume(connection_info['data'])
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] File "/usr/lib/python3.6/site-packages/os_brick/utils.py", line 137, in trace_logging_wrapper
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] return f(*args, **kwargs)
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] File "/usr/lib/python3.6/site-packages/oslo_concurrency/lockutils.py", line 359, in inner
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] return f(*args, **kwargs)
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] File "/usr/lib/python3.6/site-packages/os_brick/initiator/connectors/iscsi.py", line 519, in connect_volume
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] self._cleanup_connection(connection_properties, force=True)
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] File "/usr/lib/python3.6/site-packages/oslo_utils/excutils.py", line 220, in __exit__
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] self.force_reraise()
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] File "/usr/lib/python3.6/site-packages/oslo_utils/excutils.py", line 196, in force_reraise
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] six.reraise(self.type_, self.value, self.tb)
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] File "/usr/lib/python3.6/site-packages/six.py", line 703, in reraise
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] raise value
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] File "/usr/lib/python3.6/site-packages/os_brick/initiator/connectors/iscsi.py", line 513, in connect_volume
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] return self._connect_single_volume(connection_properties)
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] File "/usr/lib/python3.6/site-packages/os_brick/utils.py", line 61, in _wrapper
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] return r.call(f, *args, **kwargs)
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] File "/usr/lib/python3.6/site-packages/retrying.py", line 223, in call
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] return attempt.get(self._wrap_exception)
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] File "/usr/lib/python3.6/site-packages/retrying.py", line 261, in get
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] six.reraise(self.value[0], self.value[1], self.value[2])
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] File "/usr/lib/python3.6/site-packages/six.py", line 703, in reraise
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] raise value
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] File "/usr/lib/python3.6/site-packages/retrying.py", line 217, in call
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] attempt = Attempt(fn(*args, **kwargs), attempt_number, False)
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] File "/usr/lib/python3.6/site-packages/os_brick/initiator/connectors/iscsi.py", line 569, in _connect_single_volume
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] self._connect_vol(self.device_scan_attempts, props, data)
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] File "/usr/lib/python3.6/site-packages/os_brick/initiator/connectors/iscsi.py", line 626, in _connect_vol
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] session, manual_scan = self._connect_to_iscsi_portal(props)
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] File "/usr/lib/python3.6/site-packages/os_brick/initiator/connectors/iscsi.py", line 1040, in _connect_to_iscsi_portal
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] '--op', 'new'))
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] File "/usr/lib/python3.6/site-packages/os_brick/initiator/connectors/iscsi.py", line 993, in _run_iscsiadm
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] delay_on_retry=delay_on_retry)
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] File "/usr/lib/python3.6/site-packages/os_brick/executor.py", line 52, in _execute
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] result = self.__execute(*args, **kwargs)
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] File "/usr/lib/python3.6/site-packages/os_brick/privileged/rootwrap.py", line 169, in execute
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] return execute_root(*cmd, **kwargs)
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] File "/usr/lib/python3.6/site-packages/oslo_privsep/priv_context.py", line 247, in _wrap
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] return self.channel.remote_call(name, args, kwargs)
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] File "/usr/lib/python3.6/site-packages/oslo_privsep/daemon.py", line 204, in remote_call
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] raise exc_type(*result[2])
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] oslo_concurrency.processutils.ProcessExecutionError: Unexpected error while running command.
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] Command: iscsiadm -m node -T iqn.2010-10.org.openstack:volume-3dce2112-eddc-4d39-b47d-8717b16dd5e9 -p 192.168.1.105:3260 --interface default --op new
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] Exit code: 6
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] Stdout: ''
2020-09-06 12:41:11.150 2734 ERROR nova.virt.block_device [instance: efdf9e5a-6a8c-4ef1-8bfd-7d072c750b6b] Stderr: 'iscsiadm: Could not open /var/lib/iscsi/nodes/iqn.2010-10.org.openstack:volume-3dce2112-eddc-4d39-b47d-8717b16dd5e9/192.168.1.105,3260: Permission denied\niscsiadm: Error while adding record: encountered iSCSI database failure\n'
Does anyone know what is causing this failure? How can I get a volume attached to an instance?
Hi, I am new to OpenStack. I have installed the OpenStack Pike release (Keystone, Glance, Nova, Neutron, Horizon) on two Ubuntu 16.04 VMs (a controller node and a compute node) and added ICMP and SSH rules to the security group, but I am not able to ping the instance IP. This is the link to the doc I followed: https://docs.openstack.org/install-guide/openstack-services.html
For Neutron, I chose self-service networks.
This is the console log of the instance:
NOCHANGE: partition 1 is size 10458315. it cannot be grown
info: initramfs loading root from /dev/vda1
info: /etc/init.d/rc.sysinit: up at 5.09
info: container: none
Starting logging: OK
modprobe: module virtio_blk not found in modules.dep
modprobe: module virtio_net not found in modules.dep
WARN: /etc/rc3.d/S10-load-modules failed
Initializing random number generator... done.
Starting acpid: OK
cirros-ds 'local' up at 7.70
no results found for mode=local. up 8.31. searched: nocloud configdrive ec2
Starting network...
udhcpc (v1.20.1) started
Sending discover...
Sending select for 10.0.0.11...
Lease of 10.0.0.11 obtained, lease time 86400
route: SIOCADDRT: File exists
WARN: failed: route add -net "0.0.0.0/0" gw "10.0.0.1"
cirros-ds 'net' up at 9.71
checking http://169.254.169.254/2009-04-04/instance-id
failed 1/20: up 9.90. request failed
failed 2/20: up 14.06. request failed
failed 3/20: up 20.58. request failed
failed 4/20: up 32.71. request failed
failed 5/20: up 35.68. request failed
failed 6/20: up 47.78. request failed
failed 7/20: up 50.76. request failed
failed 8/20: up 53.40. request failed
failed 9/20: up 56.17. request failed
failed 10/20: up 58.86. request failed
failed 11/20: up 63.43. request failed
failed 12/20: up 66.60. request failed
failed 13/20: up 78.80. request failed
failed 14/20: up 81.63. request failed
failed 15/20: up 84.49. request failed
failed 16/20: up 87.27. request failed
failed 17/20: up 89.93. request failed
failed 18/20: up 92.73. request failed
failed 19/20: up 95.79. request failed
failed 20/20: up 98.56. request failed
failed to read iid from metadata. tried 20
no results found for mode=net. up 101.26. searched: nocloud configdrive ec2
failed to get instance-id of datasource
Top of dropbear init script
Starting dropbear sshd: failed to get instance-id of datasource
WARN: generating key of type ecdsa failed!
OK
This is my neutron-l3-agent.log (Compute Node)
2018-07-02 18:22:35.009 13819 INFO neutron.common.config [-] Logging enabled!
2018-07-02 18:22:35.010 13819 INFO neutron.common.config [-] /usr/bin/neutron-l3-agent version 11.0.3
2018-07-02 18:22:35.329 13819 ERROR neutron.agent.l3.agent [-] An interface driver must be specified
This is my neutron-linuxbridge-agent.log (Compute Node)
2018-07-02 18:24:25.186 7298 INFO neutron.agent.securitygroups_rpc [req-85444d73-0a1f-49df-ba25-98c0edc8d64d - - - - -] Security group member updated [u'957d8ed4-dba5-4cf5-b6df-0c7c044e4376']
2018-07-02 18:24:25.364 7298 ERROR neutron.plugins.ml2.drivers.agent._common_agent [req-85444d73-0a1f-49df-ba25-98c0edc8d64d - - - - -] Error occurred while removing port tapac1db012-e4: RemoteError: Remote error: AgentNotFoundByTypeHost Agent with agent_type=L3 agent and host=Compute could not be found
[u'Traceback (most recent call last):\n', u' File "/usr/lib/python2.7/dist-packages/oslo_messaging/rpc/server.py", line 160, in _process_incoming\n res = self.dispatcher.dispatch(message)\n', u' File "/usr/lib/python2.7/dist-packages/oslo_messaging/rpc/dispatcher.py", line 213, in dispatch\n return self._do_dispatch(endpoint, method, ctxt, args)\n', u' File "/usr/lib/python2.7/dist-packages/oslo_messaging/rpc/dispatcher.py", line 183, in _do_dispatch\n result = func(ctxt, **new_args)\n', u' File "/usr/lib/python2.7/dist-packages/neutron/plugins/ml2/rpc.py", line 234, in update_device_down\n n_const.PORT_STATUS_DOWN, host)\n', u' File "/usr/lib/python2.7/dist-packages/neutron/plugins/ml2/rpc.py", line 331, in notify_l2pop_port_wiring\n l2pop_driver.obj.update_port_down(port_context)\n', u' File "/usr/lib/python2.7/dist-packages/neutron/plugins/ml2/drivers/l2pop/mech_driver.py", line 253, in update_port_down\n admin_context, agent_host, [port[\'device_id\']]):\n', u' File "/usr/lib/python2.7/dist-packages/neutron/db/l3_agentschedulers_db.py", line 303, in list_router_ids_on_host\n context, constants.AGENT_TYPE_L3, host)\n', u' File "/usr/lib/python2.7/dist-packages/neutron/db/agents_db.py", line 291, in _get_agent_by_type_and_host\n host=host)\n', u'AgentNotFoundByTypeHost: Agent with agent_type=L3 agent and host=Compute could not be found\n'].
2018-07-02 18:24:25.364 7298 ERROR neutron.plugins.ml2.drivers.agent._common_agent Traceback (most recent call last):
2018-07-02 18:24:25.364 7298 ERROR neutron.plugins.ml2.drivers.agent._common_agent File "/usr/lib/python2.7/dist-packages/neutron/plugins/ml2/drivers/agent/_common_agent.py", line 337, in treat_devices_removed
2018-07-02 18:24:25.364 7298 ERROR neutron.plugins.ml2.drivers.agent._common_agent cfg.CONF.host)
2018-07-02 18:24:25.364 7298 ERROR neutron.plugins.ml2.drivers.agent._common_agent File "/usr/lib/python2.7/dist-packages/neutron/agent/rpc.py", line 139, in update_device_down
2018-07-02 18:24:25.364 7298 ERROR neutron.plugins.ml2.drivers.agent._common_agent agent_id=agent_id, host=host)
2018-07-02 18:24:25.364 7298 ERROR neutron.plugins.ml2.drivers.agent._common_agent File "/usr/lib/python2.7/dist-packages/neutron/common/rpc.py", line 162, in call
2018-07-02 18:24:25.364 7298 ERROR neutron.plugins.ml2.drivers.agent._common_agent return self._original_context.call(ctxt, method, **kwargs)
2018-07-02 18:24:25.364 7298 ERROR neutron.plugins.ml2.drivers.agent._common_agent File "/usr/lib/python2.7/dist-packages/oslo_messaging/rpc/client.py", line 169, in call
2018-07-02 18:24:25.364 7298 ERROR neutron.plugins.ml2.drivers.agent._common_agent retry=self.retry)
2018-07-02 18:24:25.364 7298 ERROR neutron.plugins.ml2.drivers.agent._common_agent File "/usr/lib/python2.7/dist-packages/oslo_messaging/transport.py", line 123, in _send
2018-07-02 18:24:25.364 7298 ERROR neutron.plugins.ml2.drivers.agent._common_agent timeout=timeout, retry=retry)
2018-07-02 18:24:25.364 7298 ERROR neutron.plugins.ml2.drivers.agent._common_agent File "/usr/lib/python2.7/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 578, in send
2018-07-02 18:24:25.364 7298 ERROR neutron.plugins.ml2.drivers.agent._common_agent retry=retry)
2018-07-02 18:24:25.364 7298 ERROR neutron.plugins.ml2.drivers.agent._common_agent File "/usr/lib/python2.7/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 569, in _send
2018-07-02 18:24:25.364 7298 ERROR neutron.plugins.ml2.drivers.agent._common_agent raise result
2018-07-02 18:24:25.364 7298 ERROR neutron.plugins.ml2.drivers.agent._common_agent RemoteError: Remote error: AgentNotFoundByTypeHost Agent with agent_type=L3 agent and host=Compute could not be found
2018-07-02 18:24:25.364 7298 ERROR neutron.plugins.ml2.drivers.agent._common_agent [u'Traceback (most recent call last):\n', u' File "/usr/lib/python2.7/dist-packages/oslo_messaging/rpc/server.py", line 160, in _process_incoming\n res = self.dispatcher.dispatch(message)\n', u' File "/usr/lib/python2.7/dist-packages/oslo_messaging/rpc/dispatcher.py", line 213, in dispatch\n return self._do_dispatch(endpoint, method, ctxt, args)\n', u' File "/usr/lib/python2.7/dist-packages/oslo_messaging/rpc/dispatcher.py", line 183, in _do_dispatch\n result = func(ctxt, **new_args)\n', u' File "/usr/lib/python2.7/dist-packages/neutron/plugins/ml2/rpc.py", line 234, in update_device_down\n n_const.PORT_STATUS_DOWN, host)\n', u' File "/usr/lib/python2.7/dist-packages/neutron/plugins/ml2/rpc.py", line 331, in notify_l2pop_port_wiring\n l2pop_driver.obj.update_port_down(port_context)\n', u' File "/usr/lib/python2.7/dist-packages/neutron/plugins/ml2/drivers/l2pop/mech_driver.py", line 253, in update_port_down\n admin_context, agent_host, [port[\'device_id\']]):\n', u' File "/usr/lib/python2.7/dist-packages/neutron/db/l3_agentschedulers_db.py", line 303, in list_router_ids_on_host\n context, constants.AGENT_TYPE_L3, host)\n', u' File "/usr/lib/python2.7/dist-packages/neutron/db/agents_db.py", line 291, in _get_agent_by_type_and_host\n host=host)\n', u'AgentNotFoundByTypeHost: Agent with agent_type=L3 agent and host=Compute could not be found\n'].
2018-07-02 18:24:25.364 7298 ERROR neutron.plugins.ml2.drivers.agent._common_agent
2018-07-02 18:24:25.366 7298 INFO neutron.plugins.ml2.drivers.agent._common_agent [req-85444d73-0a1f-49df-ba25-98c0edc8d64d - - - - -] Attachment tapac454bed-28 removed
2018-07-02 18:24:25.810 7298 INFO neutron.agent.securitygroups_rpc [req-85444d73-0a1f-49df-ba25-98c0edc8d64d - - - - -] Security group member updated [u'957d8ed4-dba5-4cf5-b6df-0c7c044e4376']
2018-07-02 18:24:26.215 7298 ERROR neutron.plugins.ml2.drivers.agent._common_agent [req-85444d73-0a1f-49df-ba25-98c0edc8d64d - - - - -] Error occurred while removing port tapac454bed-28: RemoteError: Remote error: AgentNotFoundByTypeHost Agent with agent_type=L3 agent and host=Compute could not be found
[u'Traceback (most recent call last):\n', u' File "/usr/lib/python2.7/dist-packages/oslo_messaging/rpc/server.py", line 160, in _process_incoming\n res = self.dispatcher.dispatch(message)\n', u' File "/usr/lib/python2.7/dist-packages/oslo_messaging/rpc/dispatcher.py", line 213, in dispatch\n return self._do_dispatch(endpoint, method, ctxt, args)\n', u' File "/usr/lib/python2.7/dist-packages/oslo_messaging/rpc/dispatcher.py", line 183, in _do_dispatch\n result = func(ctxt, **new_args)\n', u' File "/usr/lib/python2.7/dist-packages/neutron/plugins/ml2/rpc.py", line 234, in update_device_down\n n_const.PORT_STATUS_DOWN, host)\n', u' File "/usr/lib/python2.7/dist-packages/neutron/plugins/ml2/rpc.py", line 331, in notify_l2pop_port_wiring\n l2pop_driver.obj.update_port_down(port_context)\n', u' File "/usr/lib/python2.7/dist-packages/neutron/plugins/ml2/drivers/l2pop/mech_driver.py", line 253, in update_port_down\n admin_context, agent_host, [port[\'device_id\']]):\n', u' File "/usr/lib/python2.7/dist-packages/neutron/db/l3_agentschedulers_db.py", line 303, in list_router_ids_on_host\n context, constants.AGENT_TYPE_L3, host)\n', u' File "/usr/lib/python2.7/dist-packages/neutron/db/agents_db.py", line 291, in _get_agent_by_type_and_host\n host=host)\n', u'AgentNotFoundByTypeHost: Agent with agent_type=L3 agent and host=Compute could not be found\n'].
2018-07-02 18:24:26.215 7298 ERROR neutron.plugins.ml2.drivers.agent._common_agent Traceback (most recent call last):
2018-07-02 18:24:26.215 7298 ERROR neutron.plugins.ml2.drivers.agent._common_agent File "/usr/lib/python2.7/dist-packages/neutron/plugins/ml2/drivers/agent/_common_agent.py", line 337, in treat_devices_removed
2018-07-02 18:24:26.215 7298 ERROR neutron.plugins.ml2.drivers.agent._common_agent cfg.CONF.host)
2018-07-02 18:24:26.215 7298 ERROR neutron.plugins.ml2.drivers.agent._common_agent File "/usr/lib/python2.7/dist-packages/neutron/agent/rpc.py", line 139, in update_device_down
2018-07-02 18:24:26.215 7298 ERROR neutron.plugins.ml2.drivers.agent._common_agent agent_id=agent_id, host=host)
2018-07-02 18:24:26.215 7298 ERROR neutron.plugins.ml2.drivers.agent._common_agent File "/usr/lib/python2.7/dist-packages/neutron/common/rpc.py", line 162, in call
2018-07-02 18:24:26.215 7298 ERROR neutron.plugins.ml2.drivers.agent._common_agent return self._original_context.call(ctxt, method, **kwargs)
2018-07-02 18:24:26.215 7298 ERROR neutron.plugins.ml2.drivers.agent._common_agent File "/usr/lib/python2.7/dist-packages/oslo_messaging/rpc/client.py", line 169, in call
2018-07-02 18:24:26.215 7298 ERROR neutron.plugins.ml2.drivers.agent._common_agent retry=self.retry)
2018-07-02 18:24:26.215 7298 ERROR neutron.plugins.ml2.drivers.agent._common_agent File "/usr/lib/python2.7/dist-packages/oslo_messaging/transport.py", line 123, in _send
2018-07-02 18:24:26.215 7298 ERROR neutron.plugins.ml2.drivers.agent._common_agent timeout=timeout, retry=retry)
2018-07-02 18:24:26.215 7298 ERROR neutron.plugins.ml2.drivers.agent._common_agent File "/usr/lib/python2.7/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 578, in send
2018-07-02 18:24:26.215 7298 ERROR neutron.plugins.ml2.drivers.agent._common_agent retry=retry)
2018-07-02 18:24:26.215 7298 ERROR neutron.plugins.ml2.drivers.agent._common_agent File "/usr/lib/python2.7/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 569, in _send
2018-07-02 18:24:26.215 7298 ERROR neutron.plugins.ml2.drivers.agent._common_agent raise result
2018-07-02 18:24:26.215 7298 ERROR neutron.plugins.ml2.drivers.agent._common_agent RemoteError: Remote error: AgentNotFoundByTypeHost Agent with agent_type=L3 agent and host=Compute could not be found
2018-07-02 18:24:26.215 7298 ERROR neutron.plugins.ml2.drivers.agent._common_agent [u'Traceback (most recent call last):\n', u' File "/usr/lib/python2.7/dist-packages/oslo_messaging/rpc/server.py", line 160, in _process_incoming\n res = self.dispatcher.dispatch(message)\n', u' File "/usr/lib/python2.7/dist-packages/oslo_messaging/rpc/dispatcher.py", line 213, in dispatch\n return self._do_dispatch(endpoint, method, ctxt, args)\n', u' File "/usr/lib/python2.7/dist-packages/oslo_messaging/rpc/dispatcher.py", line 183, in _do_dispatch\n result = func(ctxt, **new_args)\n', u' File "/usr/lib/python2.7/dist-packages/neutron/plugins/ml2/rpc.py", line 234, in update_device_down\n n_const.PORT_STATUS_DOWN, host)\n', u' File "/usr/lib/python2.7/dist-packages/neutron/plugins/ml2/rpc.py", line 331, in notify_l2pop_port_wiring\n l2pop_driver.obj.update_port_down(port_context)\n', u' File "/usr/lib/python2.7/dist-packages/neutron/plugins/ml2/drivers/l2pop/mech_driver.py", line 253, in update_port_down\n admin_context, agent_host, [port[\'device_id\']]):\n', u' File "/usr/lib/python2.7/dist-packages/neutron/db/l3_agentschedulers_db.py", line 303, in list_router_ids_on_host\n context, constants.AGENT_TYPE_L3, host)\n', u' File "/usr/lib/python2.7/dist-packages/neutron/db/agents_db.py", line 291, in _get_agent_by_type_and_host\n host=host)\n', u'AgentNotFoundByTypeHost: Agent with agent_type=L3 agent and host=Compute could not be found\n'].
2018-07-02 18:24:26.215 7298 ERROR neutron.plugins.ml2.drivers.agent._common_agent
Add this to /etc/neutron/neutron.conf on both the controller and all of the compute nodes, under the [oslo_concurrency] section:
lock_path = /var/lib/neutron/tmp
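In other words, after the edit the file should contain a section like this (the path is the Debian/Ubuntu package default; adjust it if your Neutron uses a different state directory):

```ini
# /etc/neutron/neutron.conf
[oslo_concurrency]
lock_path = /var/lib/neutron/tmp
```

Restart neutron-server on the controller and the Neutron agents on the compute nodes afterwards so the setting takes effect.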
I am new to FIWARE. On my Ubuntu 16.04 machine, I have set up FIWARE's KeyRock, which is a combination of the OpenStack Keystone Identity Management service and the Horizon Dashboard. I used this guide to set everything up.
After installing Keystone (which exposes a RESTful API), I used curl to send HTTP requests, and everything works fine, i.e. I can run these commands against the Keystone server using curl.
However, when I run Horizon on the same server using the reference Django project located in the openstack_dashboard directory with:
sudo tools/with_venv.sh python manage.py runserver 0.0.0.0:8000
the Horizon server starts with no errors, but when I access it in a browser I get the following error:
A server error occurred. Please contact the administrator
On the Keystone server console, I get the following error:
2017-01-09 11:53:40.962 13285 ERROR keystone.notifications [-] Failed to construct notifier
2017-01-09 11:53:40.962 13285 TRACE keystone.notifications Traceback (most recent call last):
2017-01-09 11:53:40.962 13285 TRACE keystone.notifications File "/keystone/keystone/notifications.py", line 220, in _get_notifier
2017-01-09 11:53:40.962 13285 TRACE keystone.notifications transport = messaging.get_transport(CONF)
2017-01-09 11:53:40.962 13285 TRACE keystone.notifications File "/keystone/.venv/local/lib/python2.7/site-packages/oslo/messaging/transport.py", line 185, in get_transport
2017-01-09 11:53:40.962 13285 TRACE keystone.notifications invoke_kwds=kwargs)
2017-01-09 11:53:40.962 13285 TRACE keystone.notifications File "/keystone/.venv/local/lib/python2.7/site-packages/stevedore/driver.py", line 45, in __init__
2017-01-09 11:53:40.962 13285 TRACE keystone.notifications verify_requirements=verify_requirements,
2017-01-09 11:53:40.962 13285 TRACE keystone.notifications File "/keystone/.venv/local/lib/python2.7/site-packages/stevedore/named.py", line 55, in __init__
2017-01-09 11:53:40.962 13285 TRACE keystone.notifications verify_requirements)
2017-01-09 11:53:40.962 13285 TRACE keystone.notifications File "/keystone/.venv/local/lib/python2.7/site-packages/stevedore/extension.py", line 170, in _load_plugins
2017-01-09 11:53:40.962 13285 TRACE keystone.notifications self._on_load_failure_callback(self, ep, err)
2017-01-09 11:53:40.962 13285 TRACE keystone.notifications File "/keystone/.venv/local/lib/python2.7/site-packages/stevedore/driver.py", line 50, in _default_on_load_failure
2017-01-09 11:53:40.962 13285 TRACE keystone.notifications raise err
2017-01-09 11:53:40.962 13285 TRACE keystone.notifications ImportError: cannot import name _uuid_generate_random
2017-01-09 11:53:40.962 13285 TRACE keystone.notifications
2017-01-09 11:53:40.964 13285 WARNING keystone.common.wsgi [-] Authorization failed. The request you have made requires authentication. from 127.0.0.1
2017-01-09 11:53:40.970 13285 INFO eventlet.wsgi.server [-] 127.0.0.1 - - [09/Jan/2017 11:53:40] "POST /v3/auth/tokens HTTP/1.1" 401 331 0.261293
On the Horizon server console, I get the following error:
DEBUG:idm_logger:Creating a new internal keystoneclient connection to http://0.0.0.0:5000/v3.
Unauthorized: The request you have made requires authentication. (HTTP 401)
Traceback (most recent call last):
File "/horizon/openstack_dashboard/fiware_api/keystone.py", line 990, in _get_element_and_cache
resource_element = function(request, element)
File "/horizon/openstack_dashboard/fiware_api/keystone.py", line 1022, in <lambda>
request, basic, lambda req, n: internal_keystoneclient(req).roles.find(name=n), pickle_props=['name'])
File "/horizon/openstack_dashboard/fiware_api/keystone.py", line 64, in internal_keystoneclient
cache.set(CACHE_CLIENT, keystoneclient.session.get_token(), INTERNAL_CLIENT_CACHE_TIME)
File "/horizon/.venv/src/python-keystoneclient/keystoneclient/session.py", line 610, in get_token
return (self.get_auth_headers(auth) or {}).get('X-Auth-Token')
File "/horizon/.venv/src/python-keystoneclient/keystoneclient/session.py", line 589, in get_auth_headers
return auth.get_headers(self, **kwargs)
File "/horizon/.venv/src/python-keystoneclient/keystoneclient/auth/base.py", line 114, in get_headers
token = self.get_token(session)
File "/horizon/.venv/src/python-keystoneclient/keystoneclient/auth/identity/base.py", line 104, in get_token
return self.get_access(session).auth_token
File "/horizon/.venv/src/python-keystoneclient/keystoneclient/auth/identity/base.py", line 144, in get_access
self.auth_ref = self.get_auth_ref(session)
File "/horizon/.venv/src/python-keystoneclient/keystoneclient/auth/identity/v3.py", line 127, in get_auth_ref
authenticated=False, log=False, **rkwargs)
File "/horizon/.venv/src/python-keystoneclient/keystoneclient/session.py", line 488, in post
return self.request(url, 'POST', **kwargs)
File "/horizon/.venv/src/python-keystoneclient/keystoneclient/utils.py", line 318, in inner
return func(*args, **kwargs)
File "/horizon/.venv/src/python-keystoneclient/keystoneclient/session.py", line 389, in request
raise exceptions.from_response(resp, method, url)
Unauthorized: The request you have made requires authentication. (HTTP 401)
Traceback (most recent call last):
File "/usr/lib/python2.7/wsgiref/handlers.py", line 85, in run
self.result = application(self.environ, self.start_response)
File "/horizon/.venv/local/lib/python2.7/site-packages/django/contrib/staticfiles/handlers.py", line 64, in __call__
return self.application(environ, start_response)
File "/horizon/.venv/local/lib/python2.7/site-packages/django/core/handlers/wsgi.py", line 168, in __call__
self.load_middleware()
File "/horizon/.venv/local/lib/python2.7/site-packages/django/core/handlers/base.py", line 46, in load_middleware
mw_instance = mw_class()
File "/horizon/.venv/local/lib/python2.7/site-packages/django/middleware/locale.py", line 23, in __init__
for url_pattern in get_resolver(None).url_patterns:
File "/horizon/.venv/local/lib/python2.7/site-packages/django/core/urlresolvers.py", line 367, in url_patterns
patterns = getattr(self.urlconf_module, "urlpatterns", self.urlconf_module)
File "/horizon/.venv/local/lib/python2.7/site-packages/django/core/urlresolvers.py", line 361, in urlconf_module
self._urlconf_module = import_module(self.urlconf_name)
File "/usr/lib/python2.7/importlib/__init__.py", line 37, in import_module
__import__(name)
File "/horizon/openstack_dashboard/urls.py", line 36, in <module>
from openstack_dashboard.dashboards.idm_admin.user_accounts \
File "/horizon/openstack_dashboard/dashboards/idm_admin/user_accounts/views.py", line 28, in <module>
from openstack_dashboard.dashboards.idm_admin.user_accounts \
File "/horizon/openstack_dashboard/dashboards/idm_admin/user_accounts/forms.py", line 195, in <module>
class UpdateAccountForm(forms.SelfHandlingForm, UserAccountsLogicMixin):
File "/horizon/openstack_dashboard/dashboards/idm_admin/user_accounts/forms.py", line 202, in UpdateAccountForm
choices=get_account_choices())
File "/horizon/openstack_dashboard/dashboards/idm_admin/user_accounts/forms.py", line 172, in get_account_choices
use_idm_account=True),
File "/horizon/openstack_dashboard/fiware_api/keystone.py", line 1022, in get_basic_role
request, basic, lambda req, n: internal_keystoneclient(req).roles.find(name=n), pickle_props=['name'])
File "/horizon/openstack_dashboard/fiware_api/keystone.py", line 997, in _get_element_and_cache
exceptions.handle(request)
File "/horizon/horizon/exceptions.py", line 291, in handle
messages.error(request, message or fallback)
File "/horizon/horizon/messages.py", line 83, in error
fail_silently=fail_silently)
File "/horizon/horizon/messages.py", line 41, in add_message
if not horizon_message_already_queued(request, message):
File "/horizon/horizon/messages.py", line 28, in horizon_message_already_queued
if request.is_ajax():
AttributeError: 'NoneType' object has no attribute 'is_ajax'
[09/Jan/2017 12:39:35] "GET / HTTP/1.1" 500 59
I am sending a GET request via the browser, but the Keystone server is receiving a POST request, as indicated in its console output. I don't understand why this is so.
It looks like the Horizon server is not able to verify the credentials.
In the case of FIWARE, you get this error on the Horizon server if the file local_settings.py (located in the /horizon/openstack_dashboard/local/ directory) is not properly configured. In this file there is a dictionary named IDM_USER_CREDENTIALS in which you have to set the password for the admin profile that was created when you populated the Keystone database. This is also described in the guide you linked. Try changing the password and see if it works.
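For reference, the dictionary looks roughly like this; the key names and values below are illustrative, so check your own local_settings.py for the exact structure your KeyRock version uses:

```python
# Hypothetical excerpt of /horizon/openstack_dashboard/local/local_settings.py.
# 'password' must match the admin password created when you populated the
# Keystone database; the other values are placeholders.
IDM_USER_CREDENTIALS = {
    'username': 'idm',
    'password': 'idm',   # change this to your real admin password
    'project': 'idm',
}
```

After editing the file, restart the Horizon runserver process so the new credentials are picked up.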
Message
Remote error: ProcessExecutionError Unexpected error while running command. Command: sudo nova-rootwrap /etc/nova/rootwrap.conf env CONFIG_FILE=["/etc/nova/nova.conf"] NETWORK_ID=1 dnsmasq --strict-order --bind-interfaces --conf-file= --pid-file=/opt/stac
Code
500
Details
File "/opt/stack/nova/nova/compute/manager.py", line 293, in decorated_function
  return function(self, context, *args, **kwargs)
File "/opt/stack/nova/nova/compute/manager.py", line 2003, in run_instance
  do_run_instance()
File "/opt/stack/nova/nova/openstack/common/lockutils.py", line 249, in inner
  return f(*args, **kwargs)
File "/opt/stack/nova/nova/compute/manager.py", line 2002, in do_run_instance
  legacy_bdm_in_spec)
File "/opt/stack/nova/nova/compute/manager.py", line 1150, in _run_instance
  notify("error", fault=e)  # notify that build failed
File "/opt/stack/nova/nova/openstack/common/excutils.py", line 68, in __exit__
  six.reraise(self.type_, self.value, self.tb)
File "/opt/stack/nova/nova/compute/manager.py", line 1134, in _run_instance
  instance, image_meta, legacy_bdm_in_spec)
File "/opt/stack/nova/nova/compute/manager.py", line 1287, in _build_instance
  filter_properties, bdms, legacy_bdm_in_spec)
File "/opt/stack/nova/nova/compute/manager.py", line 1333, in _reschedule_or_error
  self._log_original_error(exc_info, instance_uuid)
File "/opt/stack/nova/nova/openstack/common/excutils.py", line 68, in __exit__
  six.reraise(self.type_, self.value, self.tb)
File "/opt/stack/nova/nova/compute/manager.py", line 1328, in _reschedule_or_error
  bdms, requested_networks)
File "/opt/stack/nova/nova/compute/manager.py", line 2055, in _shutdown_instance
  self._try_deallocate_network(context, instance, requested_networks)
File "/opt/stack/nova/nova/compute/manager.py", line 2014, in _try_deallocate_network
  self._set_instance_error_state(context, instance['uuid'])
File "/opt/stack/nova/nova/openstack/common/excutils.py", line 68, in __exit__
  six.reraise(self.type_, self.value, self.tb)
File "/opt/stack/nova/nova/compute/manager.py", line 2009, in _try_deallocate_network
  self._deallocate_network(context, instance, requested_networks)
File "/opt/stack/nova/nova/compute/manager.py", line 1708, in _deallocate_network
  context, instance, requested_networks=requested_networks)
File "/opt/stack/nova/nova/network/api.py", line 94, in wrapped
  return func(self, context, *args, **kwargs)
File "/opt/stack/nova/nova/network/api.py", line 318, in deallocate_for_instance
  requested_networks=requested_networks)
File "/opt/stack/nova/nova/network/rpcapi.py", line 190, in deallocate_for_instance
  return cctxt.call(ctxt, 'deallocate_for_instance', **kwargs)
File "/opt/stack/oslo.messaging/oslo/messaging/rpc/client.py", line 361, in call
  return self.prepare().call(ctxt, method, **kwargs)
File "/opt/stack/oslo.messaging/oslo/messaging/rpc/client.py", line 150, in call
  wait_for_reply=True, timeout=timeout)
File "/opt/stack/oslo.messaging/oslo/messaging/transport.py", line 90, in _send
  timeout=timeout)
File "/opt/stack/oslo.messaging/oslo/messaging/_drivers/amqpdriver.py", line 409, in send
  return self._send(target, ctxt, message, wait_for_reply, timeout)
File "/opt/stack/oslo.messaging/oslo/messaging/_drivers/amqpdriver.py", line 402, in _send
  raise result
I am not able to figure out the problem. Please help!
I made the changes below in my localrc file and ran the stack.sh script again, and it worked:
stack@ubuntu:~/devstack$ cat localrc
FLOATING_RANGE=192.168.1.224/27
FIXED_RANGE=10.11.12.0/24
FIXED_NETWORK_SIZE=256
ADMIN_PASSWORD=root123
MYSQL_PASSWORD=root123
RABBIT_PASSWORD=root123
SERVICE_PASSWORD=root123
FORCE=yes
SERVICE_TOKEN=root123
LOGFILE=/opt/stack/logs/stack.sh.log
FLOATING_RANGE=192.168.1.224/27
FIXED_RANGE=10.11.12.0/24
FIXED_NETWORK_SIZE=256
FLAT_INTERFACE=eth0
stack@ubuntu:~/devstack$
I'm trying to add a new nova-compute node (Kubuntu 12.04) to my single-node OpenStack (Essex release) installation running on Ubuntu 12.04 LTS. However, I am getting the following error:
2012-06-22 14:05:12 INFO nova.rpc.common [-] Reconnecting to AMQP server on localhost:5672
2012-06-22 14:05:12 ERROR nova.rpc.common [-] AMQP server on localhost:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 27 seconds.
2012-06-22 14:05:12 TRACE nova.rpc.common Traceback (most recent call last):
2012-06-22 14:05:12 TRACE nova.rpc.common File "/usr/lib/python2.7/dist-packages/nova/rpc/impl_kombu.py", line 446, in reconnect
2012-06-22 14:05:12 TRACE nova.rpc.common self._connect()
2012-06-22 14:05:12 TRACE nova.rpc.common File "/usr/lib/python2.7/dist-packages/nova/rpc/impl_kombu.py", line 423, in _connect
2012-06-22 14:05:12 TRACE nova.rpc.common self.connection.connect()
2012-06-22 14:05:12 TRACE nova.rpc.common File "/usr/lib/python2.7/dist-packages/kombu/connection.py", line 154, in connect
2012-06-22 14:05:12 TRACE nova.rpc.common return self.connection
2012-06-22 14:05:12 TRACE nova.rpc.common File "/usr/lib/python2.7/dist-packages/kombu/connection.py", line 560, in connection
2012-06-22 14:05:12 TRACE nova.rpc.common self._connection = self._establish_connection()
2012-06-22 14:05:12 TRACE nova.rpc.common File "/usr/lib/python2.7/dist-packages/kombu/connection.py", line 521, in _establish_connection
2012-06-22 14:05:12 TRACE nova.rpc.common conn = self.transport.establish_connection()
2012-06-22 14:05:12 TRACE nova.rpc.common File "/usr/lib/python2.7/dist-packages/kombu/transport/pyamqplib.py", line 255, in establish_connection
2012-06-22 14:05:12 TRACE nova.rpc.common connect_timeout=conninfo.connect_timeout)
2012-06-22 14:05:12 TRACE nova.rpc.common File "/usr/lib/python2.7/dist-packages/kombu/transport/pyamqplib.py", line 52, in __init__
2012-06-22 14:05:12 TRACE nova.rpc.common super(Connection, self).__init__(*args, **kwargs)
2012-06-22 14:05:12 TRACE nova.rpc.common File "/usr/lib/python2.7/dist-packages/amqplib/client_0_8/connection.py", line 129, in __init__
2012-06-22 14:05:12 TRACE nova.rpc.common self.transport = create_transport(host, connect_timeout, ssl)
2012-06-22 14:05:12 TRACE nova.rpc.common File "/usr/lib/python2.7/dist-packages/amqplib/client_0_8/transport.py", line 281, in create_transport
2012-06-22 14:05:12 TRACE nova.rpc.common return TCPTransport(host, connect_timeout)
2012-06-22 14:05:12 TRACE nova.rpc.common File "/usr/lib/python2.7/dist-packages/amqplib/client_0_8/transport.py", line 85, in __init__
2012-06-22 14:05:12 TRACE nova.rpc.common raise socket.error, msg
2012-06-22 14:05:12 TRACE nova.rpc.common error: [Errno 111] ECONNREFUSED
2012-06-22 14:05:12 TRACE nova.rpc.common
What is the cause of this error, and how can I fix it?
As you noted, your configuration was missing the --rabbit_host=... flag, which indicates where to find the RabbitMQ service for your OpenStack cloud. The default value is localhost, which explains the connection refused on localhost:5672.
Other rabbit flags that nova accepts:
--rabbit_host=...
--rabbit_port=...
--rabbit_virtual_host=/
--rabbit_use_ssl
--rabbit_userid=...
--rabbit_password=...
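Putting it together, a compute node's nova.conf might point at the controller like this. The IP and credentials below are placeholders, and depending on your exact nova version the file uses either the old --flag form shown above or plain key=value entries under [DEFAULT]:

```ini
# /etc/nova/nova.conf -- point the compute node at the controller's RabbitMQ
[DEFAULT]
rabbit_host=192.168.1.10   # placeholder: your controller's IP, not localhost
rabbit_port=5672
rabbit_userid=guest
rabbit_password=guest
```

Also confirm that rabbitmq-server is actually running on the controller and listening on port 5672 before restarting nova-compute.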