I have installed an all-in-one setup using Packstack on a server. I created one Windows Server instance and it was working fine. The system was left idle and was later rebooted. After the reboot I can no longer see the Keystone service, and I cannot start it either; systemd says the unit is not found, even though Keystone was working fine earlier and there are no Keystone logs after the reboot. I am also unable to start the nova-conductor and nova-scheduler services. I'm attaching the log files for reference. Can anyone please suggest a way forward to eliminate the error?
Thanks in advance.
Nova-scheduler Log:
2021-12-28 14:02:36.568 44587 ERROR nova keystoneauth1.exceptions.connection.ConnectFailure: Unable to establish connection to http://172.26.6.238:5000/v3/auth/tokens: HTTPConnectionPool(host='172.26.6.238', port=5000): Max retries exceeded with url: /v3/auth/tokens (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f5f470529e8>: Failed to establish a new connection: [Errno 111] ECONNREFUSED',))
2021-12-28 14:02:36.568 44587 ERROR nova
2021-12-28 14:02:38.790 44597 INFO oslo_service.periodic_task [-] Skipping periodic task _discover_hosts_in_cells because its interval is negative
2021-12-28 14:02:38.799 44597 WARNING keystoneauth.identity.generic.base [-] Failed to discover available identity versions when contacting http://172.26.6.238:5000/v3. Attempting to parse version from URL.: keystoneauth1.exceptions.connection.ConnectFailure: Unable to establish connection to http://172.26.6.238:5000/v3: HTTPConnectionPool(host='172.26.6.238', port=5000): Max retries exceeded with url: /v3 (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f53f3575a20>: Failed to establish a new connection: [Errno 111] ECONNREFUSED',))
2021-12-28 14:02:38.803 44597 CRITICAL nova [-] Unhandled error: keystoneauth1.exceptions.connection.ConnectFailure: Unable to establish connection to http://172.26.6.238:5000/v3/auth/tokens: HTTPConnectionPool(host='172.26.6.238', port=5000): Max retries exceeded with url: /v3/auth/tokens (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f53f358e9e8>: Failed to establish a new connection: [Errno 111] ECONNREFUSED',))
2021-12-28 14:02:38.803 44597 ERROR nova Traceback (most recent call last):
2021-12-28 14:02:38.803 44597 ERROR nova File "/usr/lib/python3.6/site-packages/urllib3/connection.py", line 159, in _new_conn
2021-12-28 14:02:38.803 44597 ERROR nova (self._dns_host, self.port), self.timeout, **extra_kw)
2021-12-28 14:02:38.803 44597 ERROR nova File "/usr/lib/python3.6/site-packages/urllib3/util/connection.py", line 80, in create_connection
2021-12-28 14:02:38.803 44597 ERROR nova raise err
2021-12-28 14:02:38.803 44597 ERROR nova File "/usr/lib/python3.6/site-packages/urllib3/util/connection.py", line 70, in create_connection
2021-12-28 14:02:38.803 44597 ERROR nova sock.connect(sa)
2021-12-28 14:02:38.803 44597 ERROR nova File "/usr/lib/python3.6/site-packages/eventlet/greenio/base.py", line 253, in connect
2021-12-28 14:02:38.803 44597 ERROR nova socket_checkerr(fd)
2021-12-28 14:02:38.803 44597 ERROR nova File "/usr/lib/python3.6/site-packages/eventlet/greenio/base.py", line 51, in socket_checkerr
2021-12-28 14:02:38.803 44597 ERROR nova raise socket.error(err, errno.errorcode[err])
2021-12-28 14:02:38.803 44597 ERROR nova ConnectionRefusedError: [Errno 111] ECONNREFUSED
2021-12-28 14:02:38.803 44597 ERROR nova
2021-12-28 14:02:38.803 44597 ERROR nova During handling of the above exception, another exception occurred:
2021-12-28 14:02:38.803 44597 ERROR nova
2021-12-28 14:02:38.803 44597 ERROR nova Traceback (most recent call last):
2021-12-28 14:02:38.803 44597 ERROR nova File "/usr/lib/python3.6/site-packages/urllib3/connectionpool.py", line 600, in urlopen
2021-12-28 14:02:38.803 44597 ERROR nova chunked=chunked)
2021-12-28 14:02:38.803 44597 ERROR nova File "/usr/lib/python3.6/site-packages/urllib3/connectionpool.py", line 354, in _make_request
2021-12-28 14:02:38.803 44597 ERROR nova conn.request(method, url, **httplib_request_kw)
2021-12-28 14:02:38.803 44597 ERROR nova File "/usr/lib64/python3.6/http/client.py", line 1254, in request
2021-12-28 14:02:38.803 44597 ERROR nova self._send_request(method, url, body, headers, encode_chunked)
2021-12-28 14:02:38.803 44597 ERROR nova File "/usr/lib64/python3.6/http/client.py", line 1300, in _send_request
2021-12-28 14:02:38.803 44597 ERROR nova self.endheaders(body, encode_chunked=encode_chunked)
2021-12-28 14:02:38.803 44597 ERROR nova File "/usr/lib64/python3.6/http/client.py", line 1249, in endheaders
2021-12-28 14:02:38.803 44597 ERROR nova self._send_output(message_body, encode_chunked=encode_chunked)
2021-12-28 14:02:38.803 44597 ERROR nova File "/usr/lib64/python3.6/http/client.py", line 1036, in _send_output
2021-12-28 14:02:38.803 44597 ERROR nova self.send(msg)
2021-12-28 14:02:38.803 44597 ERROR nova File "/usr/lib64/python3.6/http/client.py", line 974, in send
2021-12-28 14:02:38.803 44597 ERROR nova self.connect()
2021-12-28 14:02:38.803 44597 ERROR nova File "/usr/lib/python3.6/site-packages/urllib3/connection.py", line 181, in connect
2021-12-28 14:02:38.803 44597 ERROR nova conn = self._new_conn()
2021-12-28 14:02:38.803 44597 ERROR nova File "/usr/lib/python3.6/site-packages/urllib3/connection.py", line 168, in _new_conn
2021-12-28 14:02:38.803 44597 ERROR nova self, "Failed to establish a new connection: %s" % e)
2021-12-28 14:02:38.803 44597 ERROR nova urllib3.exceptions.NewConnectionError: <urllib3.connection.HTTPConnection object at 0x7f53f358e9e8>: Failed to establish a new connection: [Errno 111] ECONNREFUSED
2021-12-28 14:02:38.803 44597 ERROR nova
2021-12-28 14:02:38.803 44597 ERROR nova During handling of the above exception, another exception occurred:
2021-12-28 14:02:38.803 44597 ERROR nova
2021-12-28 14:02:38.803 44597 ERROR nova Traceback (most recent call last):
2021-12-28 14:02:38.803 44597 ERROR nova File "/usr/lib/python3.6/site-packages/requests/adapters.py", line 449, in send
2021-12-28 14:02:38.803 44597 ERROR nova timeout=timeout
2021-12-28 14:02:38.803 44597 ERROR nova File "/usr/lib/python3.6/site-packages/urllib3/connectionpool.py", line 638, in urlopen
2021-12-28 14:02:38.803 44597 ERROR nova _stacktrace=sys.exc_info()[2])
2021-12-28 14:02:38.803 44597 ERROR nova File "/usr/lib/python3.6/site-packages/urllib3/util/retry.py", line 399, in increment
2021-12-28 14:02:38.803 44597 ERROR nova raise MaxRetryError(_pool, url, error or ResponseError(cause))
2021-12-28 14:02:38.803 44597 ERROR nova urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='172.26.6.238', port=5000): Max retries exceeded with url: /v3/auth/tokens (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f53f358e9e8>: Failed to establish a new connection: [Errno 111] ECONNREFUSED',))
2021-12-28 14:02:38.803 44597 ERROR nova
2021-12-28 14:02:38.803 44597 ERROR nova During handling of the above exception, another exception occurred:
2021-12-28 14:02:38.803 44597 ERROR nova
2021-12-28 14:02:38.803 44597 ERROR nova Traceback (most recent call last):
2021-12-28 14:02:38.803 44597 ERROR nova File "/usr/lib/python3.6/site-packages/keystoneauth1/session.py", line 1004, in _send_request
2021-12-28 14:02:38.803 44597 ERROR nova resp = self.session.request(method, url, **kwargs)
Nova-conductor Log:
2021-12-28 15:08:32.128 82721 ERROR nova ConnectionRefusedError: [Errno 111] ECONNREFUSED
2021-12-28 15:08:32.128 82721 ERROR nova
2021-12-28 15:08:32.128 82721 ERROR nova During handling of the above exception, another exception occurred:
2021-12-28 15:08:32.128 82721 ERROR nova
2021-12-28 15:08:32.128 82721 ERROR nova Traceback (most recent call last):
2021-12-28 15:08:32.128 82721 ERROR nova File "/usr/lib/python3.6/site-packages/urllib3/connectionpool.py", line 600, in urlopen
2021-12-28 15:08:32.128 82721 ERROR nova chunked=chunked)
2021-12-28 15:08:32.128 82721 ERROR nova File "/usr/lib/python3.6/site-packages/urllib3/connectionpool.py", line 354, in _make_request
2021-12-28 15:08:32.128 82721 ERROR nova conn.request(method, url, **httplib_request_kw)
2021-12-28 15:08:32.128 82721 ERROR nova File "/usr/lib64/python3.6/http/client.py", line 1254, in request
2021-12-28 15:08:32.128 82721 ERROR nova self._send_request(method, url, body, headers, encode_chunked)
2021-12-28 15:08:32.128 82721 ERROR nova File "/usr/lib64/python3.6/http/client.py", line 1300, in _send_request
2021-12-28 15:08:32.128 82721 ERROR nova self.endheaders(body, encode_chunked=encode_chunked)
2021-12-28 15:08:32.128 82721 ERROR nova File "/usr/lib64/python3.6/http/client.py", line 1249, in endheaders
2021-12-28 15:08:32.128 82721 ERROR nova self._send_output(message_body, encode_chunked=encode_chunked)
2021-12-28 15:08:32.128 82721 ERROR nova File "/usr/lib64/python3.6/http/client.py", line 1036, in _send_output
2021-12-28 15:08:32.128 82721 ERROR nova self.send(msg)
2021-12-28 15:08:32.128 82721 ERROR nova File "/usr/lib64/python3.6/http/client.py", line 974, in send
2021-12-28 15:08:32.128 82721 ERROR nova self.connect()
2021-12-28 15:08:32.128 82721 ERROR nova File "/usr/lib/python3.6/site-packages/urllib3/connection.py", line 181, in connect
2021-12-28 15:08:32.128 82721 ERROR nova conn = self._new_conn()
2021-12-28 15:08:32.128 82721 ERROR nova File "/usr/lib/python3.6/site-packages/urllib3/connection.py", line 168, in _new_conn
2021-12-28 15:08:32.128 82721 ERROR nova self, "Failed to establish a new connection: %s" % e)
2021-12-28 15:08:32.128 82721 ERROR nova urllib3.exceptions.NewConnectionError: <urllib3.connection.HTTPConnection object at 0x7fc0e5be6978>: Failed to establish a new connection: [Errno 111] ECONNREFUSED
2021-12-28 15:08:32.128 82721 ERROR nova
2021-12-28 15:08:32.128 82721 ERROR nova During handling of the above exception, another exception occurred:
2021-12-28 15:08:32.128 82721 ERROR nova
2021-12-28 15:08:32.128 82721 ERROR nova Traceback (most recent call last):
2021-12-28 15:08:32.128 82721 ERROR nova File "/usr/lib/python3.6/site-packages/requests/adapters.py", line 449, in send
2021-12-28 15:08:32.128 82721 ERROR nova timeout=timeout
2021-12-28 15:08:32.128 82721 ERROR nova File "/usr/lib/python3.6/site-packages/urllib3/connectionpool.py", line 638, in urlopen
2021-12-28 15:08:32.128 82721 ERROR nova _stacktrace=sys.exc_info()[2])
2021-12-28 15:08:32.128 82721 ERROR nova File "/usr/lib/python3.6/site-packages/urllib3/util/retry.py", line 399, in increment
2021-12-28 15:08:32.128 82721 ERROR nova raise MaxRetryError(_pool, url, error or ResponseError(cause))
2021-12-28 15:08:32.128 82721 ERROR nova urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='172.26.6.238', port=5000): Max retries exceeded with url: /v3/auth/tokens (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7fc0e5be6978>: Failed to establish a new connection: [Errno 111] ECONNREFUSED',))
2021-12-28 15:08:32.128 82721 ERROR nova
2021-12-28 15:08:32.128 82721 ERROR nova During handling of the above exception, another exception occurred:
2021-12-28 15:08:32.128 82721 ERROR nova
2021-12-28 15:08:32.128 82721 ERROR nova Traceback (most recent call last):
2021-12-28 15:08:32.128 82721 ERROR nova File "/usr/lib/python3.6/site-packages/keystoneauth1/session.py", line 1004, in _send_request
2021-12-28 15:08:32.128 82721 ERROR nova resp = self.session.request(method, url, **kwargs)
2021-12-28 15:08:32.128 82721 ERROR nova File "/usr/lib/python3.6/site-packages/requests/sessions.py", line 533, in request
2021-12-28 15:08:32.128 82721 ERROR nova resp = self.send(prep, **send_kwargs)
2021-12-28 15:08:32.128 82721 ERROR nova File "/usr/lib/python3.6/site-packages/requests/sessions.py", line 646, in send
2021-12-28 15:08:32.128 82721 ERROR nova r = adapter.send(request, **kwargs)
2021-12-28 15:08:32.128 82721 ERROR nova File "/usr/lib/python3.6/site-packages/requests/adapters.py", line 516, in send
2021-12-28 15:08:32.128 82721 ERROR nova raise ConnectionError(e, request=request)
2021-12-28 15:08:32.128 82721 ERROR nova requests.exceptions.ConnectionError: HTTPConnectionPool(host='172.26.6.238', port=5000): Max retries exceeded with url: /v3/auth/tokens (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7fc0e5be6978>: Failed to establish a new connection: [Errno 111] ECONNREFUSED',))
2021-12-28 15:08:32.128 82721 ERROR nova
2021-12-28 15:08:32.128 82721 ERROR nova During handling of the above exception, another exception occurred:
2021-12-28 15:08:32.128 82721 ERROR nova
2021-12-28 15:08:32.128 82721 ERROR nova Traceback (most recent call last):
2021-12-28 15:08:32.128 82721 ERROR nova File "/usr/bin/nova-conductor", line 10, in
2021-12-28 15:08:32.128 82721 ERROR nova sys.exit(main())
2021-12-28 15:08:32.128 82721 ERROR nova File "/usr/lib/python3.6/site-packages/nova/cmd/conductor.py", line 44, in main
2021-12-28 15:08:32.128 82721 ERROR nova topic=rpcapi.RPC_TOPIC)
2021-12-28 15:08:32.128 82721 ERROR nova File "/usr/lib/python3.6/site-packages/nova/service.py", line 270, in create
2021-12-28 15:08:32.128 82721 ERROR nova periodic_interval_max=periodic_interval_max)
2021-12-28 15:08:32.128 82721 ERROR nova File "/usr/lib/python3.6/site-packages/nova/service.py", line 128, in init
2021-12-28 15:08:32.128 82721 ERROR nova self.manager = manager_class(host=self.host, *args, **kwargs)
2021-12-28 15:08:32.128 82721 ERROR nova File "/usr/lib/python3.6/site-packages/nova/conductor/manager.py", line 121, in init
2021-12-28 15:08:32.128 82721 ERROR nova self.compute_task_mgr = ComputeTaskManager()
2021-12-28 15:08:32.128 82721 ERROR nova File "/usr/lib/python3.6/site-packages/nova/conductor/manager.py", line 247, in init
2021-12-28 15:08:32.128 82721 ERROR nova self.report_client = report.SchedulerReportClient()
2021-12-28 15:08:32.128 82721 ERROR nova File "/usr/lib/python3.6/site-packages/nova/scheduler/client/report.py", line 186, in init
2021-12-28 15:08:32.128 82721 ERROR nova self._client = self._create_client()
2021-12-28 15:08:32.128 82721 ERROR nova File "/usr/lib/python3.6/site-packages/nova/scheduler/client/report.py", line 229, in _create_client
2021-12-28 15:08:32.128 82721 ERROR nova client = self._adapter or utils.get_sdk_adapter('placement')
2021-12-28 15:08:32.128 82721 ERROR nova File "/usr/lib/python3.6/site-packages/nova/utils.py", line 1079, in get_sdk_adapter
2021-12-28 15:08:32.128 82721 ERROR nova return getattr(conn, service_type)
2021-12-28 15:08:32.128 82721 ERROR nova File "/usr/lib/python3.6/site-packages/openstack/service_description.py", line 93, in get
2021-12-28 15:08:32.128 82721 ERROR nova endpoint = proxy_mod.Proxy.get_endpoint(proxy)
2021-12-28 15:08:32.128 82721 ERROR nova File "/usr/lib/python3.6/site-packages/keystoneauth1/adapter.py", line 282, in get_endpoint
2021-12-28 15:08:32.128 82721 ERROR nova return self.session.get_endpoint(auth or self.auth, **kwargs)
2021-12-28 15:08:32.128 82721 ERROR nova File "/usr/lib/python3.6/site-packages/keystoneauth1/session.py", line 1225, in get_endpoint
2021-12-28 15:08:32.128 82721 ERROR nova return auth.get_endpoint(self, **kwargs)
2021-12-28 15:08:32.128 82721 ERROR nova File "/usr/lib/python3.6/site-packages/keystoneauth1/identity/base.py", line 380, in get_endpoint
2021-12-28 15:08:32.128 82721 ERROR nova allow_version_hack=allow_version_hack, **kwargs)
2021-12-28 15:08:32.128 82721 ERROR nova File "/usr/lib/python3.6/site-packages/keystoneauth1/identity/base.py", line 271, in get_endpoint_data
2021-12-28 15:08:32.128 82721 ERROR nova service_catalog = self.get_access(session).service_catalog
2021-12-28 15:08:32.128 82721 ERROR nova File "/usr/lib/python3.6/site-packages/keystoneauth1/identity/base.py", line 134, in get_access
2021-12-28 15:08:32.128 82721 ERROR nova self.auth_ref = self.get_auth_ref(session)
2021-12-28 15:08:32.128 82721 ERROR nova File "/usr/lib/python3.6/site-packages/keystoneauth1/identity/generic/base.py", line 208, in get_auth_ref
2021-12-28 15:08:32.128 82721 ERROR nova return self._plugin.get_auth_ref(session, **kwargs)
2021-12-28 15:08:32.128 82721 ERROR nova File "/usr/lib/python3.6/site-packages/keystoneauth1/identity/v3/base.py", line 184, in get_auth_ref
2021-12-28 15:08:32.128 82721 ERROR nova authenticated=False, log=False, **rkwargs)
2021-12-28 15:08:32.128 82721 ERROR nova File "/usr/lib/python3.6/site-packages/keystoneauth1/session.py", line 1131, in post
2021-12-28 15:08:32.128 82721 ERROR nova return self.request(url, 'POST', **kwargs)
2021-12-28 15:08:32.128 82721 ERROR nova File "/usr/lib/python3.6/site-packages/keystoneauth1/session.py", line 913, in request
2021-12-28 15:08:32.128 82721 ERROR nova resp = send(**kwargs)
2021-12-28 15:08:32.128 82721 ERROR nova File "/usr/lib/python3.6/site-packages/keystoneauth1/session.py", line 1020, in _send_request
2021-12-28 15:08:32.128 82721 ERROR nova raise exceptions.ConnectFailure(msg)
2021-12-28 15:08:32.128 82721 ERROR nova keystoneauth1.exceptions.connection.ConnectFailure: Unable to establish connection to http://172.26.6.238:5000/v3/auth/tokens: HTTPConnectionPool(host='172.26.6.238', port=5000): Max retries exceeded with url: /v3/auth/tokens (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7fc0e5be6978>: Failed to establish a new connection: [Errno 111] ECONNREFUSED',))
2021-12-28 15:08:32.128 82721 ERROR nova
The error messages in the log file say that Nova cannot connect to http://172.26.6.238:5000/v3/auth/tokens. That will be your Keystone service.
You need to investigate why your Keystone service is not listening for connections.
"I'm not able to start Keystone service as well."
Solve that problem first.
"It says the unit is not found ..."
Suspicious! Systemd unit files don't just spontaneously disappear. You need to figure out where they have gone.
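One thing to check: in a Packstack all-in-one deployment, Keystone is usually served by Apache (httpd) through WSGI rather than by a standalone openstack-keystone unit, which may be why systemd reports the unit as not found. Below is a minimal diagnostic sketch for that investigation, assuming a systemd host with Python 3 and using the controller IP from the log above; the unit names are only candidates to probe, not a claim about what your host runs.

# Minimal diagnostic sketch (assumptions: Python 3.6+, systemd host,
# controller IP taken from the log above; unit names are guesses to probe).
import socket
import subprocess

KEYSTONE_HOST = "172.26.6.238"
KEYSTONE_PORT = 5000

def port_open(host, port, timeout=3):
    """Return True if something accepts TCP connections on host:port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

print("Keystone endpoint reachable:", port_open(KEYSTONE_HOST, KEYSTONE_PORT))

# Check which candidate units exist and whether they are running.
for unit in ("httpd.service", "openstack-keystone.service"):
    result = subprocess.run(
        ["systemctl", "status", unit, "--no-pager"],
        stdout=subprocess.PIPE, stderr=subprocess.PIPE, universal_newlines=True,
    )
    summary = result.stdout.splitlines()[0] if result.stdout else result.stderr.strip()
    print(unit, "->", summary)

If the port probe fails but httpd exists, starting httpd and re-checking the Keystone endpoint is a reasonable next step; if neither unit exists, the installed packages themselves may be worth re-verifying.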
Related
I am creating a login system using Firebase. I have connected my project to Firebase, enabled email and password authentication, and added users. But when I run the app and try to log in using an email and password that I added, it shows the error below and the app crashes.
class _AdminLoginState extends State<AdminLogin> {
  String _username, _password;
  final GlobalKey<FormState> _formkey = GlobalKey<FormState>();

  Future<void> login() async {
    final formState = _formkey.currentState;
    if (formState.validate()) {
      formState.save();
      try {
        final FirebaseUser user = (await FirebaseAuth.instance
                .signInWithEmailAndPassword(email: _username, password: _password))
            .user;
        Navigator.push(context, MaterialPageRoute(builder: (context) => Dashboard()));
      } catch (e) {
        print(e.message);
      }
    }
  }
}
When I enter the username and password, the app shuts down and throws this error:
W/BiChannelGoogleApi(29343): [FirebaseAuth: ] getGoogleApiForMethod() returned Gms: com.google.firebase.auth.api.internal.zzak#4b54d30
E/JavaBinder(29343): *** Uncaught remote exception! (Exceptions are not yet supported across processes.)
E/JavaBinder(29343): java.lang.NoClassDefFoundError: Failed resolution of: Landroid/support/v4/util/ArrayMap;
E/JavaBinder(29343): at com.google.firebase.auth.internal.zzam.zzc(Unknown Source:22)
E/JavaBinder(29343): at com.google.firebase.auth.internal.zzam.zzde(Unknown Source:17)
E/JavaBinder(29343): at com.google.firebase.auth.api.internal.zzep.zza(Unknown Source:9)
E/JavaBinder(29343): at android.os.Binder.execTransact(Binder.java:735)
E/JavaBinder(29343): Caused by: java.lang.ClassNotFoundException: Didn't find class "android.support.v4.util.ArrayMap" on path: DexPathList[[zip file "/data/app/m20zero.adminbookingpage-tivSbYSPqm94eArdHgurKQ==/base.apk"],nativeLibraryDirectories=[/data/app/m20zero.adminbookingpage-tivSbYSPqm94eArdHgurKQ==/lib/arm64, /data/app/m20zero.adminbookingpage-tivSbYSPqm94eArdHgurKQ==/base.apk!/lib/arm64-v8a, /system/lib64, /vendor/lib64]]
E/JavaBinder(29343): at dalvik.system.BaseDexClassLoader.findClass(BaseDexClassLoader.java:134)
E/JavaBinder(29343): at java.lang.ClassLoader.loadClass(ClassLoader.java:379)
E/JavaBinder(29343): at java.lang.ClassLoader.loadClass(ClassLoader.java:312)
E/JavaBinder(29343): ... 16 more
E/AndroidRuntime(29343): FATAL EXCEPTION: Binder:29343_1
E/AndroidRuntime(29343): Process: m20zero.adminbookingpage, PID: 29343
E/AndroidRuntime(29343): java.lang.NoClassDefFoundError: Failed resolution of: Landroid/support/v4/util/ArrayMap;
E/AndroidRuntime(29343): at com.google.firebase.auth.internal.zzam.zzc(Unknown Source:22)
E/AndroidRuntime(29343): at com.google.firebase.auth.internal.zzam.zzde(Unknown Source:17)
E/AndroidRuntime(29343): at com.google.android.gms.internal.firebase_auth.zza.onTransact(Unknown Source:13)
E/AndroidRuntime(29343): at android.os.Binder.execTransact(Binder.java:735)
E/AndroidRuntime(29343): Caused by: java.lang.ClassNotFoundException: Didn't find class "android.support.v4.util.ArrayMap" on path: DexPathList[[zip file "/data/app/m20zero.adminbookingpage-tivSbYSPqm94eArdHgurKQ==/base.apk"],nativeLibraryDirectories=[/data/app/m20zero.adminbookingpage-tivSbYSPqm94eArdHgurKQ==/lib/arm64, /data/app/m20zero.adminbookingpage-tivSbYSPqm94eArdHgurKQ==/base.apk!/lib/arm64-v8a, /system/lib64, /vendor/lib64]]
E/AndroidRuntime(29343): at dalvik.system.BaseDexClassLoader.findClass(BaseDexClassLoader.java:134)
E/AndroidRuntime(29343): at java.lang.ClassLoader.loadClass(ClassLoader.java:379)
E/AndroidRuntime(29343): at java.lang.ClassLoader.loadClass(ClassLoader.java:312)
E/AndroidRuntime(29343): ... 16 more
I/Process (29343): Sending signal. PID: 29343 SIG: 9
Lost connection to device.
Did you change the minSdk version in the build.gradle file?
I have tried creating a snapshot of a running instance in our OpenStack setup. Upon clicking Create Snapshot, the snapshot gets queued, but after a few minutes it gets deleted. I'm not sure which logs to check.
I have gone through glance-all.log on the controller and observed the following messages.
glance-all.log:
<150>Aug 24 10:20:23 node-1 glance-registry: 2018-08-24 10:20:23.150 9643 INFO glance.registry.api.v1.images [req-3b08e166-c883-4abd-8f14-5169bbe77797 2771a2a4e11d41c9a79262539ff1505e 92d211537de24d0899be89c320a69a53 - - -] Updating metadata for image 3aee82a2-c4a1-4945-9bd1-5b6ccb2ebf77
<150>Aug 24 10:20:23 node-1 glance-registry: 2018-08-24 10:20:23.151 9643 INFO eventlet.wsgi.server [req-3b08e166-c883-4abd-8f14-5169bbe77797 2771a2a4e11d41c9a79262539ff1505e 92d211537de24d0899be89c320a69a53 - - -] 192.168.0.2 - - [24/Aug/2018 10:20:23] "PUT /images/3aee82a2-c4a1-4945-9bd1-5b6ccb2ebf77 HTTP/1.1" 200 885 0.031617
<150>Aug 24 10:20:23 node-1 glance-registry: 2018-08-24 10:20:23.199 9643 INFO glance.registry.api.v1.images [req-3b08e166-c883-4abd-8f14-5169bbe77797 2771a2a4e11d41c9a79262539ff1505e 92d211537de24d0899be89c320a69a53 - - -] Successfully deleted image 3aee82a2-c4a1-4945-9bd1-5b6ccb2ebf77
<150>Aug 24 10:20:23 node-1 glance-registry: 2018-08-24 10:20:23.200 9643 INFO eventlet.wsgi.server [req-3b08e166-c883-4abd-8f14-5169bbe77797 2771a2a4e11d41c9a79262539ff1505e 92d211537de24d0899be89c320a69a53 - - -] 192.168.0.2 - - [24/Aug/2018 10:20:23] "DELETE /images/3aee82a2-c4a1-4945-9bd1-5b6ccb2ebf77 HTTP/1.1" 200 722 0.047676
nova-compute.log:
2018-08-24 05:57:22.434 53672 INFO nova.compute.manager [req-27904691-aada-4e7b-a5d3-64c5a8990e20 2771a2a4e11d41c9a79262539ff1505e 92d211537de24d0899be89c320a69a53 - - -] [instance: 343bdab5-1002-4e63-a37a-70f953847088] instance snapshotting
2018-08-24 05:57:23.201 53672 INFO nova.compute.resource_tracker [req-75c74bc3-26d1-409e-84cd-f1a5730fc70a - - - - -] Compute_service record updated for node-2.domain.tld:node-2.domain.tld
2018-08-24 05:57:24.133 53672 INFO nova.compute.manager [req-4a1bd5c0-d54f-4429-a8ff-3e9cb46011ec - - - - -] [instance: 343bdab5-1002-4e63-a37a-70f953847088] VM Paused (Lifecycle Event)
2018-08-24 05:57:24.383 53672 INFO nova.compute.manager [req-4a1bd5c0-d54f-4429-a8ff-3e9cb46011ec - - - - -] [instance: 343bdab5-1002-4e63-a37a-70f953847088] During sync_power_state the instance has a pending task (image_snapshot). Skip.
2018-08-24 05:57:52.233 53672 INFO nova.virt.libvirt.driver [req-27904691-aada-4e7b-a5d3-64c5a8990e20 2771a2a4e11d41c9a79262539ff1505e 92d211537de24d0899be89c320a69a53 - - -] [instance: 343bdab5-1002-4e63-a37a-70f953847088] Beginning cold snapshot process
2018-08-24 05:58:07.238 53672 INFO nova.compute.manager [-] [instance: 343bdab5-1002-4e63-a37a-70f953847088] VM Stopped (Lifecycle Event)
2018-08-24 05:58:07.303 53672 INFO nova.compute.manager [req-63b9a442-4483-4238-97c3-0c20ef1dccc2 - - - - -] [instance: 343bdab5-1002-4e63-a37a-70f953847088] During sync_power_state the instance has a pending task (image_pending_upload). Skip.
2018-08-24 05:58:17.561 53672 INFO nova.compute.resource_tracker [req-75c74bc3-26d1-409e-84cd-f1a5730fc70a - - - - -] Auditing locally available compute resources for node node-2.domain.tld
2018-08-24 05:58:20.590 53672 INFO nova.compute.resource_tracker [req-75c74bc3-26d1-409e-84cd-f1a5730fc70a - - - - -] Total usable vcpus: 72, total allocated vcpus: 44
2018-08-24 05:58:20.591 53672 INFO nova.compute.resource_tracker [req-75c74bc3-26d1-409e-84cd-f1a5730fc70a - - - - -] Final resource view: name=node-2.domain.tld phys_ram=257584MB used_ram=135680MB phys_disk=1618GB used_disk=880GB total_vcpus=72 used_vcpus=44 pci_stats=[PciDevicePool(count=53,numa_node=0,product_id='1515',tags={dev_type='type-VF',physical_network='physnet6'},vendor_id='8086'), PciDevicePool(count=53,numa_node=0,product_id='1515',tags={dev_type='type-VF',physical_network='physnet7'},vendor_id='8086'), PciDevicePool(count=59,numa_node=0,product_id='1515',tags={dev_type='type-VF',physical_network='physnet8'},vendor_id='8086')]
2018-08-24 05:58:22.985 53672 INFO nova.compute.resource_tracker [req-75c74bc3-26d1-409e-84cd-f1a5730fc70a - - - - -] Compute_service record updated for node-2.domain.tld:node-2.domain.tld
2018-08-24 05:59:16.119 53672 INFO nova.compute.manager [req-4a1bd5c0-d54f-4429-a8ff-3e9cb46011ec - - - - -] [instance: 343bdab5-1002-4e63-a37a-70f953847088] VM Started (Lifecycle Event)
2018-08-24 05:59:17.427 53672 INFO nova.compute.manager [req-4a1bd5c0-d54f-4429-a8ff-3e9cb46011ec - - - - -] [instance: 343bdab5-1002-4e63-a37a-70f953847088] During sync_power_state the instance has a pending task (image_pending_upload). Skip.
2018-08-24 05:59:17.428 53672 INFO nova.compute.manager [req-4a1bd5c0-d54f-4429-a8ff-3e9cb46011ec - - - - -] [instance: 343bdab5-1002-4e63-a37a-70f953847088] VM Resumed (Lifecycle Event)
2018-08-24 05:59:17.557 53672 INFO nova.compute.resource_tracker [req-75c74bc3-26d1-409e-84cd-f1a5730fc70a - - - - -] Auditing locally available compute resources for node node-2.domain.tld
2018-08-24 05:59:19.828 53672 INFO nova.compute.manager [req-4a1bd5c0-d54f-4429-a8ff-3e9cb46011ec - - - - -] [instance: 343bdab5-1002-4e63-a37a-70f953847088] During sync_power_state the instance has a pending task (image_pending_upload). Skip.
2018-08-24 05:59:23.094 53672 INFO nova.compute.manager [req-27904691-aada-4e7b-a5d3-64c5a8990e20 2771a2a4e11d41c9a79262539ff1505e 92d211537de24d0899be89c320a69a53 - - -] [instance: 343bdab5-1002-4e63-a37a-70f953847088] Successfully reverted task state from image_pending_upload on failure for instance.
2018-08-24 05:59:23.100 53672 ERROR oslo_messaging.rpc.dispatcher [req-27904691-aada-4e7b-a5d3-64c5a8990e20 2771a2a4e11d41c9a79262539ff1505e 92d211537de24d0899be89c320a69a53 - - -] Exception during message handling: internal error: unable to execute QEMU command 'device_add': Device initialization failed
2018-08-24 05:59:23.100 53672 ERROR oslo_messaging.rpc.dispatcher Traceback (most recent call last):
2018-08-24 05:59:23.100 53672 ERROR oslo_messaging.rpc.dispatcher File "/usr/lib/python2.7/dist-packages/oslo_messaging/rpc/dispatcher.py", line 138, in _dispatch_and_reply
2018-08-24 05:59:23.100 53672 ERROR oslo_messaging.rpc.dispatcher incoming.message))
2018-08-24 05:59:23.100 53672 ERROR oslo_messaging.rpc.dispatcher File "/usr/lib/python2.7/dist-packages/oslo_messaging/rpc/dispatcher.py", line 185, in _dispatch
2018-08-24 05:59:23.100 53672 ERROR oslo_messaging.rpc.dispatcher return self._do_dispatch(endpoint, method, ctxt, args)
2018-08-24 05:59:23.100 53672 ERROR oslo_messaging.rpc.dispatcher File "/usr/lib/python2.7/dist-packages/oslo_messaging/rpc/dispatcher.py", line 127, in _do_dispatch
2018-08-24 05:59:23.100 53672 ERROR oslo_messaging.rpc.dispatcher result = func(ctxt, **new_args)
2018-08-24 05:59:23.100 53672 ERROR oslo_messaging.rpc.dispatcher File "/usr/lib/python2.7/dist-packages/nova/exception.py", line 110, in wrapped
2018-08-24 05:59:23.100 53672 ERROR oslo_messaging.rpc.dispatcher payload)
2018-08-24 05:59:23.100 53672 ERROR oslo_messaging.rpc.dispatcher File "/usr/lib/python2.7/dist-packages/oslo_utils/excutils.py", line 220, in __exit__
2018-08-24 05:59:23.100 53672 ERROR oslo_messaging.rpc.dispatcher self.force_reraise()
2018-08-24 05:59:23.100 53672 ERROR oslo_messaging.rpc.dispatcher File "/usr/lib/python2.7/dist-packages/oslo_utils/excutils.py", line 196, in force_reraise
2018-08-24 05:59:23.100 53672 ERROR oslo_messaging.rpc.dispatcher six.reraise(self.type_, self.value, self.tb)
2018-08-24 05:59:23.100 53672 ERROR oslo_messaging.rpc.dispatcher File "/usr/lib/python2.7/dist-packages/nova/exception.py", line 89, in wrapped
2018-08-24 05:59:23.100 53672 ERROR oslo_messaging.rpc.dispatcher return f(self, context, *args, **kw)
2018-08-24 05:59:23.100 53672 ERROR oslo_messaging.rpc.dispatcher File "/usr/lib/python2.7/dist-packages/nova/compute/manager.py", line 359, in decorated_function
2018-08-24 05:59:23.100 53672 ERROR oslo_messaging.rpc.dispatcher LOG.warning(msg, e, instance=instance)
2018-08-24 05:59:23.100 53672 ERROR oslo_messaging.rpc.dispatcher File "/usr/lib/python2.7/dist-packages/oslo_utils/excutils.py", line 220, in __exit__
2018-08-24 05:59:23.100 53672 ERROR oslo_messaging.rpc.dispatcher self.force_reraise()
2018-08-24 05:59:23.100 53672 ERROR oslo_messaging.rpc.dispatcher File "/usr/lib/python2.7/dist-packages/oslo_utils/excutils.py", line 196, in force_reraise
2018-08-24 05:59:23.100 53672 ERROR oslo_messaging.rpc.dispatcher six.reraise(self.type_, self.value, self.tb)
2018-08-24 05:59:23.100 53672 ERROR oslo_messaging.rpc.dispatcher File "/usr/lib/python2.7/dist-packages/nova/compute/manager.py", line 328, in decorated_function
2018-08-24 05:59:23.100 53672 ERROR oslo_messaging.rpc.dispatcher return function(self, context, *args, **kwargs)
2018-08-24 05:59:23.100 53672 ERROR oslo_messaging.rpc.dispatcher File "/usr/lib/python2.7/dist-packages/nova/compute/manager.py", line 387, in decorated_function
2018-08-24 05:59:23.100 53672 ERROR oslo_messaging.rpc.dispatcher kwargs['instance'], e, sys.exc_info())
2018-08-24 05:59:23.100 53672 ERROR oslo_messaging.rpc.dispatcher File "/usr/lib/python2.7/dist-packages/oslo_utils/excutils.py", line 220, in __exit__
2018-08-24 05:59:23.100 53672 ERROR oslo_messaging.rpc.dispatcher self.force_reraise()
2018-08-24 05:59:23.100 53672 ERROR oslo_messaging.rpc.dispatcher File "/usr/lib/python2.7/dist-packages/oslo_utils/excutils.py", line 196, in force_reraise
2018-08-24 05:59:23.100 53672 ERROR oslo_messaging.rpc.dispatcher six.reraise(self.type_, self.value, self.tb)
2018-08-24 05:59:23.100 53672 ERROR oslo_messaging.rpc.dispatcher File "/usr/lib/python2.7/dist-packages/nova/compute/manager.py", line 328, in decorated_function
2018-08-24 05:59:23.100 53672 ERROR oslo_messaging.rpc.dispatcher return function(self, context, *args, **kwargs)
2018-08-24 05:59:23.100 53672 ERROR oslo_messaging.rpc.dispatcher File "/usr/lib/python2.7/dist-packages/nova/compute/manager.py", line 387, in decorated_function
2018-08-24 05:59:23.100 53672 ERROR oslo_messaging.rpc.dispatcher kwargs['instance'], e, sys.exc_info())
2018-08-24 05:59:23.100 53672 ERROR oslo_messaging.rpc.dispatcher File "/usr/lib/python2.7/dist-packages/oslo_utils/excutils.py", line 220, in __exit__
2018-08-24 05:59:23.100 53672 ERROR oslo_messaging.rpc.dispatcher self.force_reraise()
2018-08-24 05:59:23.100 53672 ERROR oslo_messaging.rpc.dispatcher File "/usr/lib/python2.7/dist-packages/oslo_utils/excutils.py", line 196, in force_reraise
2018-08-24 05:59:23.100 53672 ERROR oslo_messaging.rpc.dispatcher six.reraise(self.type_, self.value, self.tb)
2018-08-24 05:59:23.100 53672 ERROR oslo_messaging.rpc.dispatcher File "/usr/lib/python2.7/dist-packages/nova/compute/manager.py", line 375, in decorated_function
2018-08-24 05:59:23.100 53672 ERROR oslo_messaging.rpc.dispatcher return function(self, context, *args, **kwargs)
2018-08-24 05:59:23.100 53672 ERROR oslo_messaging.rpc.dispatcher File "/usr/lib/python2.7/dist-packages/nova/compute/manager.py", line 435, in decorated_function
2018-08-24 05:59:23.100 53672 ERROR oslo_messaging.rpc.dispatcher instance=instance)
2018-08-24 05:59:23.100 53672 ERROR oslo_messaging.rpc.dispatcher File "/usr/lib/python2.7/dist-packages/oslo_utils/excutils.py", line 220, in __exit__
2018-08-24 05:59:23.100 53672 ERROR oslo_messaging.rpc.dispatcher self.force_reraise()
2018-08-24 05:59:23.100 53672 ERROR oslo_messaging.rpc.dispatcher File "/usr/lib/python2.7/dist-packages/oslo_utils/excutils.py", line 196, in force_reraise
2018-08-24 05:59:23.100 53672 ERROR oslo_messaging.rpc.dispatcher six.reraise(self.type_, self.value, self.tb)
2018-08-24 05:59:23.100 53672 ERROR oslo_messaging.rpc.dispatcher File "/usr/lib/python2.7/dist-packages/nova/compute/manager.py", line 425, in decorated_function
2018-08-24 05:59:23.100 53672 ERROR oslo_messaging.rpc.dispatcher *args, **kwargs)
2018-08-24 05:59:23.100 53672 ERROR oslo_messaging.rpc.dispatcher File "/usr/lib/python2.7/dist-packages/nova/compute/manager.py", line 3158, in snapshot_instance
2018-08-24 05:59:23.100 53672 ERROR oslo_messaging.rpc.dispatcher task_states.IMAGE_SNAPSHOT)
2018-08-24 05:59:23.100 53672 ERROR oslo_messaging.rpc.dispatcher File "/usr/lib/python2.7/dist-packages/nova/compute/manager.py", line 3188, in _snapshot_instance
2018-08-24 05:59:23.100 53672 ERROR oslo_messaging.rpc.dispatcher update_task_state)
2018-08-24 05:59:23.100 53672 ERROR oslo_messaging.rpc.dispatcher File "/usr/lib/python2.7/dist-packages/nova/virt/libvirt/driver.py", line 1747, in snapshot
2018-08-24 05:59:23.100 53672 ERROR oslo_messaging.rpc.dispatcher state, instance)
2018-08-24 05:59:23.100 53672 ERROR oslo_messaging.rpc.dispatcher File "/usr/lib/python2.7/dist-packages/nova/virt/libvirt/driver.py", line 1792, in _snapshot_domain
2018-08-24 05:59:23.100 53672 ERROR oslo_messaging.rpc.dispatcher self._attach_sriov_ports(context, instance, guest)
2018-08-24 05:59:23.100 53672 ERROR oslo_messaging.rpc.dispatcher File "/usr/lib/python2.7/dist-packages/nova/virt/libvirt/driver.py", line 3425, in _attach_sriov_ports
2018-08-24 05:59:23.100 53672 ERROR oslo_messaging.rpc.dispatcher guest.attach_device(cfg)
2018-08-24 05:59:23.100 53672 ERROR oslo_messaging.rpc.dispatcher File "/usr/lib/python2.7/dist-packages/nova/virt/libvirt/guest.py", line 250, in attach_device
2018-08-24 05:59:23.100 53672 ERROR oslo_messaging.rpc.dispatcher self._domain.attachDeviceFlags(conf.to_xml(), flags=flags)
2018-08-24 05:59:23.100 53672 ERROR oslo_messaging.rpc.dispatcher File "/usr/lib/python2.7/dist-packages/eventlet/tpool.py", line 186, in doit
2018-08-24 05:59:23.100 53672 ERROR oslo_messaging.rpc.dispatcher result = proxy_call(self._autowrap, f, *args, **kwargs)
2018-08-24 05:59:23.100 53672 ERROR oslo_messaging.rpc.dispatcher File "/usr/lib/python2.7/dist-packages/eventlet/tpool.py", line 144, in proxy_call
2018-08-24 05:59:23.100 53672 ERROR oslo_messaging.rpc.dispatcher rv = execute(f, *args, **kwargs)
2018-08-24 05:59:23.100 53672 ERROR oslo_messaging.rpc.dispatcher File "/usr/lib/python2.7/dist-packages/eventlet/tpool.py", line 125, in execute
2018-08-24 05:59:23.100 53672 ERROR oslo_messaging.rpc.dispatcher six.reraise(c, e, tb)
2018-08-24 05:59:23.100 53672 ERROR oslo_messaging.rpc.dispatcher File "/usr/lib/python2.7/dist-packages/eventlet/tpool.py", line 83, in tworker
2018-08-24 05:59:23.100 53672 ERROR oslo_messaging.rpc.dispatcher rv = meth(*args, **kwargs)
2018-08-24 05:59:23.100 53672 ERROR oslo_messaging.rpc.dispatcher File "/usr/lib/python2.7/dist-packages/libvirt.py", line 528, in attachDeviceFlags
2018-08-24 05:59:23.100 53672 ERROR oslo_messaging.rpc.dispatcher if ret == -1: raise libvirtError ('virDomainAttachDeviceFlags() failed', dom=self)
2018-08-24 05:59:23.100 53672 ERROR oslo_messaging.rpc.dispatcher libvirtError: internal error: unable to execute QEMU command 'device_add': Device initialization failed
2018-08-24 05:59:23.100 53672 ERROR oslo_messaging.rpc.dispatcher
2018-08-24 05:59:23.292 53672 INFO nova.compute.resource_tracker [req-75c74bc3-26d1-409e-84cd-f1a5730fc70a - - - - -] Total usable vcpus: 72, total allocated vcpus: 44
2018-08-24 05:59:23.292 53672 INFO nova.compute.resource_tracker [req-75c74bc3-26d1-409e-84cd-f1a5730fc70a - - - - -] Final resource view: name=node-2.domain.tld phys_ram=257584MB used_ram=135680MB phys_disk=1618GB used_disk=880GB total_vcpus=72 used_vcpus=44 pci_stats=[PciDevicePool(count=53,numa_node=0,product_id='1515',tags={dev_type='type-VF',physical_network='physnet6'},vendor_id='8086'), PciDevicePool(count=53,numa_node=0,product_id='1515',tags={dev_type='type-VF',physical_network='physnet7'},vendor_id='8086'), PciDevicePool(count=59,numa_node=0,product_id='1515',tags={dev_type='type-VF',physical_network='physnet8'},vendor_id='8086')]
Please help us find the appropriate logs to identify the root cause.
So I'm working on a Telegram bot for a multi-player board game. The webhook is set. There appears to be an HTTP 400 error when the bot tries to privately message each user in the group chat. What happens is that the bot gets stuck and keeps messaging one player.
My code:
import json
import logging
import urllib
import urllib2

import webapp2
from google.appengine.api import urlfetch

# TOKEN is defined elsewhere in main.py.
BASE_URL = 'https://api.telegram.org/bot' + TOKEN + '/'

class WebhookHandler(webapp2.RequestHandler):
    def post(self):
        urlfetch.set_default_fetch_deadline(60)
        body = json.loads(self.request.body)
        logging.info('request body:')
        logging.info(body)
        self.response.write(json.dumps(body))

        update_id = body['update_id']
        try:
            message = body['message']
        except KeyError:
            message = body['edited_message']
        message_id = message.get('message_id')
        date = message.get('date')
        text = message.get('text')
        fr = message.get('from')
        chat = message['chat']
        chat_id = chat['id']
        chat_type = chat['type']
        fr_user_id = fr['id']
        fr_user_name = fr['first_name']
        if not text:
            logging.info('no text')
            return

# Helper used by the handler above (defined around line 115 of main.py, per the traceback below).
def reply_to_user(desti_user_id, msg):
    if msg:
        resp = urllib2.urlopen(BASE_URL + 'sendMessage', urllib.urlencode({
            'chat_id': str(desti_user_id),
            'text': msg.encode('utf-8'),  # <-- line 115 in the traceback below
        })).read()
    else:
        logging.error('no msg or img specified')
        resp = None
    logging.info('send response:')
    logging.info(resp)
The log (Google App Engine):
HTTP Error 400: Bad Request (/base/data/home/runtimes/python27/python27_lib/versions/third_party/webapp2-2.5.2/webapp2.py:1552)
Traceback (most recent call last):
File "/base/data/home/runtimes/python27/python27_lib/versions/third_party/webapp2-2.5.2/webapp2.py", line 1535, in __call__
rv = self.handle_exception(request, response, e)
File "/base/data/home/runtimes/python27/python27_lib/versions/third_party/webapp2-2.5.2/webapp2.py", line 1529, in __call__
rv = self.router.dispatch(request, response)
File "/base/data/home/runtimes/python27/python27_lib/versions/third_party/webapp2-2.5.2/webapp2.py", line 1278, in default_dispatcher
return route.handler_adapter(request, response)
File "/base/data/home/runtimes/python27/python27_lib/versions/third_party/webapp2-2.5.2/webapp2.py", line 1102, in __call__
return handler.dispatch()
File "/base/data/home/runtimes/python27/python27_lib/versions/third_party/webapp2-2.5.2/webapp2.py", line 572, in dispatch
return self.handle_exception(e, self.app.debug)
File "/base/data/home/runtimes/python27/python27_lib/versions/third_party/webapp2-2.5.2/webapp2.py", line 570, in dispatch
return method(*args, **kwargs)
File "/base/data/home/apps/s~orbitaltest2/1.394273849972192493/main.py", line 241, in post
reply_to_user(player.user_id, "you are: a member of the resistance")
File "/base/data/home/apps/s~orbitaltest2/1.394273849972192493/main.py", line 115, in reply_to_user
'text': msg.encode('utf-8'),
File "/base/data/home/runtimes/python27/python27_dist/lib/python2.7/urllib2.py", line 127, in urlopen
return _opener.open(url, data, timeout)
File "/base/data/home/runtimes/python27/python27_dist/lib/python2.7/urllib2.py", line 410, in open
response = meth(req, response)
File "/base/data/home/runtimes/python27/python27_dist/lib/python2.7/urllib2.py", line 523, in http_response
'http', request, response, code, msg, hdrs)
File "/base/data/home/runtimes/python27/python27_dist/lib/python2.7/urllib2.py", line 448, in error
return self._call_chain(*args)
File "/base/data/home/runtimes/python27/python27_dist/lib/python2.7/urllib2.py", line 382, in _call_chain
result = func(*args)
File "/base/data/home/runtimes/python27/python27_dist/lib/python2.7/urllib2.py", line 531, in http_error_default
raise HTTPError(req.get_full_url(), code, msg, hdrs, fp)
HTTPError: HTTP Error 400: Bad Request
Not sure what happened.
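One way to see why Telegram rejects the request is to catch the HTTPError and log the response body, since the Bot API returns a JSON payload with a description of the problem. A minimal sketch of that, assuming the same BASE_URL and urllib/urllib2 setup as in the code above (reply_to_user_debug is just an illustrative name):

import json
import logging
import urllib
import urllib2

def reply_to_user_debug(desti_user_id, msg):
    # Same request as reply_to_user above, but logs Telegram's error description
    # instead of letting the HTTPError bubble up and abort the webhook handler.
    data = urllib.urlencode({
        'chat_id': str(desti_user_id),
        'text': msg.encode('utf-8'),
    })
    try:
        return urllib2.urlopen(BASE_URL + 'sendMessage', data).read()
    except urllib2.HTTPError as err:
        # Telegram answers errors with JSON such as
        # {"ok": false, "error_code": 400, "description": "..."}.
        body = err.read()
        try:
            detail = json.loads(body).get('description', body)
        except ValueError:
            detail = body
        logging.error('sendMessage failed for %s: %s', desti_user_id, detail)
        return None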
I called:
let push = IMFPushClient.sharedInstance()
logger?.logInfoWithMessages("subscribe to tags: \(tagsArray)")
push.subscribeToTags([tagsArray]) { (response: IMFResponse!, error: NSError!) -> Void in
to subscribe to a tag, but I got a 400, as shown in the stack trace:
2016-01-22 16:48:10.648 gschat-swift[44084:5194666] [INFO] [BlueList] subscribe to tags: ["Test"]
2016-01-22 16:48:10.648 gschat-swift[44084:5194666] [INFO] [IMFPushClient] -[IMFPushClient subscribeToTags:completionHandler:] in IMFPushClient.m:314 :: Entering: subscribeToTags.
2016-01-22 16:48:10.650 gschat-swift[44084:5194666] [DEBUG] [IMF] +[WLAFHTTPClientWrapper requestWithURL:] in WLAFHTTPClientWrapper.m:44 :: Request url is https://ChatBMS.mybluemix.net/imfpush/v1/apps/2b7d11a9-57bf-47e6-8a4c-8aabf044ac8a/subscriptions
2016-01-22 16:48:10.653 gschat-swift[44084:5194666] [DEBUG] [IMF] -[WLAFHTTPClientWrapper start] in WLAFHTTPClientWrapper.m:194 :: Starting the request with URL https://ChatBMS.mybluemix.net/imfpush/v1/apps/2b7d11a9-57bf-47e6-8a4c-8aabf044ac8a/subscriptions
2016-01-22 16:48:11.255 gschat-swift[44084:5194362] [DEBUG] [IMF] -[WLAFHTTPClientWrapper requestFailed:error:] in WLAFHTTPClientWrapper.m:215 :: Request Failed
2016-01-22 16:48:11.256 gschat-swift[44084:5194362] [DEBUG] [IMF] -[WLAFHTTPClientWrapper requestFailed:error:] in WLAFHTTPClientWrapper.m:216 :: Response Status Code : 400
2016-01-22 16:48:11.257 gschat-swift[44084:5194362] [DEBUG] [IMF] -[WLAFHTTPClientWrapper requestFailed:error:] in WLAFHTTPClientWrapper.m:217 :: Response Error : Expected status code in (200-299), got 400
2016-01-22 16:48:11.259 gschat-swift[44084:5194362] [ERROR] [IMFPushClient] __51-[IMFPushClient subscribeToTags:completionHandler:]_block_invoke in IMFPushClient.m:350 :: Error while subscribing to tags - Error is: Error Domain=WLAFNetworkingErrorDomain Code=-1011 "Expected status code in (200-299), got 400" UserInfo={WLAFNetworkingOperationFailingURLRequestErrorKey=<NSMutableURLRequest: 0x7f89b8d8dd10> { URL: https://ChatBMS.mybluemix.net/imfpush/v1/apps/2b7d11a9-57bf-47e6-8a4c-8aabf044ac8a/subscriptions }, NSLocalizedRecoverySuggestion={"message":"Bad Request - Invalid JSON","code":"FPWSE0004E"}, NSErrorFailingURLKey=https://ChatBMS.mybluemix.net/imfpush/v1/apps/2b7d11a9-57bf-47e6-8a4c-8aabf044ac8a/subscriptions, WLAFNetworkingOperationFailingURLResponseErrorKey=<NSHTTPURLResponse: 0x7f89b8eb7f00> { URL: https://chatbms.mybluemix.net/imfpush/v1/apps/2b7d11a9-57bf-47e6-8a4c-8aabf044ac8a/subscriptions } { status code: 400, headers {
Connection = "Keep-Alive";
"Content-Type" = "application/json";
Date = "Fri, 22 Jan 2016 16:48:09 GMT";
"Transfer-Encoding" = Identity;
"X-Backside-Transport" = "FAIL FAIL";
"X-Cf-Requestid" = "cf162009-bd95-46d2-46ca-3cad50a70153";
"X-Client-IP" = "80.111.218.187";
"X-Global-Transaction-ID" = 2240839783;
"X-Powered-By" = "Servlet/3.0";
} }, NSLocalizedDescription=Expected status code in (200-299), got 400}
2016-01-22 16:48:11.260 gschat-swift[44084:5194362] [FATAL] [BlueList] error Error Domain=com.ibm.mobilefoundation.push Code=10 "Expected status code in (200-299), got 400" UserInfo={WLAFNetworkingOperationFailingURLRequestErrorKey=<NSMutableURLRequest: 0x7f89b8d8dd10> { URL: https://ChatBMS.mybluemix.net/imfpush/v1/apps/2b7d11a9-57bf-47e6-8a4c-8aabf044ac8a/subscriptions }, NSLocalizedRecoverySuggestion={"message":"Bad Request - Invalid JSON","code":"FPWSE0004E"}, NSErrorFailingURLKey=https://ChatBMS.mybluemix.net/imfpush/v1/apps/2b7d11a9-57bf-47e6-8a4c-8aabf044ac8a/subscriptions, WLAFNetworkingOperationFailingURLResponseErrorKey=<NSHTTPURLResponse: 0x7f89b8eb7f00> { URL: https://chatbms.mybluemix.net/imfpush/v1/apps/2b7d11a9-57bf-47e6-8a4c-8aabf044ac8a/subscriptions } { status code: 400, headers {
Connection = "Keep-Alive";
"Content-Type" = "application/json";
Date = "Fri, 22 Jan 2016 16:48:09 GMT";
"Transfer-Encoding" = Identity;
"X-Backside-Transport" = "FAIL FAIL";
"X-Cf-Requestid" = "cf162009-bd95-46d2-46ca-3cad50a70153";
"X-Client-IP" = "80.111.218.187";
"X-Global-Transaction-ID" = 2240839783;
"X-Powered-By" = "Servlet/3.0";
} }, NSLocalizedDescription=Expected status code in (200-299), got 400} in subscribe to tags:["Test"]
I can see from the error that you are providing invalid JSON when attempting to subscribe. If you are trying to subscribe to all available tags, you first need to retrieve the available tags and then use those tags when subscribing. For example, in Swift:
push.retrieveAvailableTagsWithCompletionHandler {
    (response: IMFResponse!, error: NSError!) -> Void in
    let tags = response.availableTags()
    if (tags.count > 0) {
        // let's subscribe to all tags for demo purposes
        push.subscribeToTags(tags, completionHandler: {
            (response: IMFResponse!, error: NSError!) -> Void in
        })
    }
}
Here is documentation providing further insight:
Tag-based Notifications
I am trying to make a PUT request (or POST) to the Dropbox API, but it doesn't work; I get a GET instead.
import tornado.ioloop
import tornado.web
from tornado.httputil import HTTPHeaders
from tornado.httpclient import HTTPClient, HTTPRequest

url = "https://api-content.dropbox.com/1/files_put/sandbox/world.txt"

class MainHandler(tornado.web.RequestHandler):
    def post(self):
        headers = HTTPHeaders({'Authorization': 'Bearer TOKEN_FOR_DROPBOX'})
        HTTPClient().fetch(
            HTTPRequest(url, 'PUT', body="hello there", headers=headers))

application = tornado.web.Application([
    (r"/", MainHandler),
])

if __name__ == "__main__":
    application.listen(8888)
    tornado.ioloop.IOLoop.current().start()
Update: using GET gives an error: HTTPError: HTTP 400: Bad Request.
Here is the new code:
import tornado.ioloop
import tornado.web
from tornado.httpclient import HTTPClient, HTTPRequest

url = "https://api-content.dropbox.com/1/files_put/sandbox/wor.txt"

class MainHandler(tornado.web.RequestHandler):
    def get(self):
        self.set_header('Authorization', 'Bearer DROPBOX_TOKEN')
        self.set_header('Content-Type', 'text/plain')
        HTTPClient().fetch(
            HTTPRequest(url, 'PUT', body="hello there"))

application = tornado.web.Application([
    (r"/", MainHandler),
])

if __name__ == "__main__":
    application.listen(8888)
    tornado.ioloop.IOLoop.current().start()
but I get this error:
Traceback (most recent call last):
File "C:\Python27\lib\site-packages\tornado\web.py", line 1415, in _execute
result = yield result
File "C:\Python27\lib\site-packages\tornado\gen.py", line 870, in run
value = future.result()
File "C:\Python27\lib\site-packages\tornado\concurrent.py", line 215, in result
raise_exc_info(self._exc_info)
File "C:\Python27\lib\site-packages\tornado\gen.py", line 230, in wrapper
yielded = next(result)
File "C:\Users\Abdelouahab\Desktop\ttttt.py", line 14, in get
HTTPRequest(url, 'PUT', body="hello there"))
File "C:\Python27\lib\site-packages\tornado\httpclient.py", line 102, in fetch
self._async_client.fetch, request, **kwargs))
File "C:\Python27\lib\site-packages\tornado\ioloop.py", line 445, in run_sync
return future_cell[0].result()
File "C:\Python27\lib\site-packages\tornado\concurrent.py", line 215, in result
raise_exc_info(self._exc_info)
File "<string>", line 3, in raise_exc_info
HTTPError: HTTP 401: Unauthorized
ERROR:tornado.access:500 GET / (::1) 806.00ms
I tried using an HTTP request builder extension from Mozilla, and it worked, so I guess the problem is how to do it in Tornado?
Sorry, it seems that it was missing the Content-Type header:
import tornado.ioloop
import tornado.web
from tornado.httputil import HTTPHeaders
from tornado.httpclient import HTTPClient, HTTPRequest
from tornado.gen import coroutine

url = "https://api-content.dropbox.com/1/files_put/sandbox/wor.txt"

class MainHandler(tornado.web.RequestHandler):
    #coroutine
    def get(self):
        headers = HTTPHeaders({'Authorization': 'Bearer DROPBOX_TOKEN',
                               'Content-Type': 'text/plain'})
        HTTPClient().fetch(
            HTTPRequest(url, 'PUT', body="hello there", headers=headers))

application = tornado.web.Application([
    (r"/", MainHandler),
])

if __name__ == "__main__":
    application.listen(8888)
    tornado.ioloop.IOLoop.current().start()