Google Cloud Composer authentication for BigQueryHook - airflow

I'm trying to make use of the BigQueryHook but I am unable to get the service account authentication working.
I've followed the steps provided by Google and have copied the JSON file into the data/ directory in the environment's GCS bucket.
The Airflow Connection details have been filled in:
JSON Keyfile Path: /home/airflow/gcs/data/my-key-file.json
Keyfile JSON: content of the JSON file
Scopes: https://www.googleapis.com/auth/cloud-platform
Error in Stackdriver:
Traceback (most recent call last):
File "/usr/local/lib/airflow/airflow/models.py", line 374, in process_file
m = imp.load_source(mod_name, filepath)
File "/opt/python3.6/lib/python3.6/imp.py", line 172, in load_source
module = _load(spec)
File "", line 684, in _load
File "", line 665, in _load_unlocked
File "", line 678, in exec_module
File "", line 219, in _call_with_frames_removed
File "/home/airflow/gcs/dags/cloud_sql_to_bq.py", line 141, in
df = get_config()
File "/home/airflow/gcs/dags/cloud_sql_to_bq.py", line 71, in get_config
bq_client = bigquery.Client(project=bq_hook._get_field("my-project"), credentials=bq_hook._get_credentials())
File "/usr/local/lib/airflow/airflow/contrib/hooks/gcp_api_base_hook.py", line 103, in _get_credentials
key_path, scopes=scopes)
File "/opt/python3.6/lib/python3.6/site-packages/google/oauth2/service_account.py", line 209, in from_service_account_file
filename, require=['client_email', 'token_uri'])
File "/opt/python3.6/lib/python3.6/site-packages/google/auth/_service_account_info.py", line 71, in from_filename
with io.open(filename, 'r', encoding='utf-8') as json_file:
FileNotFoundError: [Errno 2] No such file or directory: '/home/airflow/gcs/data/my-key-file.json'
Any idea why it can't see the JSON file?
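For reference, here is a minimal sketch of the pattern the traceback suggests the DAG uses (the connection id and project field name are placeholders, not the actual code):

from airflow.contrib.hooks.bigquery_hook import BigQueryHook
from google.cloud import bigquery

# Hypothetical connection id; the hook reads "JSON Keyfile Path" from the
# Airflow Connection and builds service-account credentials from that file.
bq_hook = BigQueryHook(bigquery_conn_id="bigquery_default")

# Hand the hook's credentials to a plain google-cloud-bigquery client.
bq_client = bigquery.Client(
    project=bq_hook._get_field("project"),
    credentials=bq_hook._get_credentials(),
)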

Related

DVC Push KeyError fileSize

I've added a large list of CSV files to my DVC repository, but when I try to run dvc push it fails with
ERROR: unexpected error - KeyError('fileSize')
Edit
Searching around, it seems it might help to include the verbose log for the error.
T11:27:08~/documents/*****/data$ dvc push -v
2022-02-01 11:32:13,186 DEBUG: Adding '/home/jhylands/Documents/*****/.dvc/config.local' to gitignore file.
2022-02-01 11:32:13,199 DEBUG: Adding '/home/jhylands/Documents/*****/.dvc/tmp' to gitignore file.
2022-02-01 11:32:13,200 DEBUG: Adding '/home/jhylands/Documents/*****/.dvc/cache' to gitignore file.
2022-02-01 11:32:14,102 DEBUG: Preparing to transfer data from '/home/jhylands/Documents/*****/.dvc/cache' to '*********'
2022-02-01 11:32:14,102 DEBUG: Preparing to collect status from '********'
2022-02-01 11:32:14,103 DEBUG: Collecting status from '*******'
2022-02-01 11:32:14,439 DEBUG: GDrive remote auth with config '{'client_config_backend': 'settings', 'client_config_file': 'client_secrets.json', 'save_credentials': True, 'oauth_scope': ['https://www.googleapis.com/auth/drive', 'https://www.googleapis.com/auth/drive.appdata'], 'save_credentials_backend': 'file', 'save_credentials_file': '/home/jhylands/Documents/*****/.dvc/tmp/gdrive-user-credentials.json', 'get_refresh_token': True, 'client_config': {'client_id': '*****.apps.googleusercontent.com', 'client_secret': '****************', 'auth_uri': 'https://accounts.google.com/o/oauth2/auth', 'token_uri': 'https://oauth2.googleapis.com/token', 'revoke_uri': 'https://oauth2.googleapis.com/revoke', 'redirect_uri': ''}}'.
2022-02-01 11:32:14,994 DEBUG: Estimated remote size: 256 files
2022-02-01 11:32:14,995 DEBUG: Querying '316' hashes via traverse
2022-02-01 11:32:15,325 ERROR: unexpected error - KeyError('fileSize')
------------------------------------------------------------
Traceback (most recent call last):
File "/home/jhylands/.local/lib/python3.8/site-packages/pydrive2/files.py", line 226, in __getitem__
return dict.__getitem__(self, key)
KeyError: 'fileSize'
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/home/jhylands/.local/lib/python3.8/site-packages/dvc/main.py", line 55, in main
ret = cmd.do_run()
File "/home/jhylands/.local/lib/python3.8/site-packages/dvc/command/base.py", line 45, in do_run
return self.run()
File "/home/jhylands/.local/lib/python3.8/site-packages/dvc/command/data_sync.py", line 57, in run
processed_files_count = self.repo.push(
File "/home/jhylands/.local/lib/python3.8/site-packages/dvc/repo/__init__.py", line 49, in wrapper
return f(repo, *args, **kwargs)
File "/home/jhylands/.local/lib/python3.8/site-packages/dvc/repo/push.py", line 56, in push
pushed += self.cloud.push(
File "/home/jhylands/.local/lib/python3.8/site-packages/dvc/data_cloud.py", line 85, in push
return transfer(
File "/home/jhylands/.local/lib/python3.8/site-packages/dvc/objects/transfer.py", line 153, in transfer
status = compare_status(src, dest, obj_ids, check_deleted=False, **kwargs)
File "/home/jhylands/.local/lib/python3.8/site-packages/dvc/objects/status.py", line 158, in compare_status
dest_exists, dest_missing = status(
File "/home/jhylands/.local/lib/python3.8/site-packages/dvc/objects/status.py", line 131, in status
exists.update(odb.hashes_exist(hashes, name=odb.fs_path, **kwargs))
File "/home/jhylands/.local/lib/python3.8/site-packages/dvc/objects/db/base.py", line 499, in hashes_exist
remote_hashes = set(
File "/home/jhylands/.local/lib/python3.8/site-packages/dvc/objects/db/base.py", line 334, in _list_hashes_traverse
yield from itertools.chain.from_iterable(in_remote)
File "/usr/lib/python3.8/concurrent/futures/_base.py", line 611, in result_iterator
yield fs.pop().result()
File "/usr/lib/python3.8/concurrent/futures/_base.py", line 439, in result
return self.__get_result()
File "/usr/lib/python3.8/concurrent/futures/_base.py", line 388, in __get_result
raise self._exception
File "/usr/lib/python3.8/concurrent/futures/thread.py", line 57, in run
result = self.fn(*self.args, **self.kwargs)
File "/home/jhylands/.local/lib/python3.8/site-packages/dvc/objects/db/base.py", line 324, in list_with_update
return list(
File "/home/jhylands/.local/lib/python3.8/site-packages/dvc/objects/db/base.py", line 215, in _list_hashes
for path in self._list_paths(prefix, progress_callback):
File "/home/jhylands/.local/lib/python3.8/site-packages/dvc/objects/db/base.py", line 195, in _list_paths
for file_info in self.fs.find(fs_path, prefix=prefix):
File "/home/jhylands/.local/lib/python3.8/site-packages/dvc/fs/fsspec_wrapper.py", line 107, in find
yield from self.fs.find(path)
File "/home/jhylands/.local/lib/python3.8/site-packages/pydrive2/fs/spec.py", line 323, in find
"size": int(item["fileSize"]),
File "/home/jhylands/.local/lib/python3.8/site-packages/pydrive2/files.py", line 229, in __getitem__
raise KeyError(e)
KeyError: KeyError('fileSize')
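For what it's worth, the failing frame shows pydrive2's find() doing int(item["fileSize"]) on every listed Drive item. Drive items that carry no fileSize field (Google-native docs, folders) raise KeyError there. A hedged sketch of a defensive lookup, as a hypothetical patch rather than the released pydrive2 code:

def item_size(item):
    # pydrive2's GoogleDriveFile subclasses dict, so a missing metadata
    # key raises KeyError; fall back to 0 when no fileSize is reported.
    return int(item.get("fileSize", 0))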

Python integration with Azure Gremlin not working

I am trying to mimic what is shown in the linked Git repo.
I have commented out almost everything and am just trying to run simply
g.V().count()
My connection details are correct and match the documentation...
I am getting the following error:
Traceback (most recent call last):
File "c:\Users\PrasaRak\OneDrive\gremlin_azure_function\connect.py", line 193, in <module>
count_vertices(client)
File "c:\Users\PrasaRak\OneDrive\gremlin_azure_function\connect.py", line 116, in count_vertices
callback = client.submit(_gremlin_count_vertices)
File "C:\Users\PrasaRak\Miniconda3\envs\learn-gremlin\lib\site-packages\gremlin_python\driver\client.py", line 127, in submit
return self.submitAsync(message, bindings=bindings, request_options=request_options).result()
File "C:\Users\PrasaRak\Miniconda3\envs\learn-gremlin\lib\site-packages\gremlin_python\driver\client.py", line 148, in submitAsync
return conn.write(message)
File "C:\Users\PrasaRak\Miniconda3\envs\learn-gremlin\lib\site-packages\gremlin_python\driver\connection.py", line 55, in write
self.connect()
File "C:\Users\PrasaRak\Miniconda3\envs\learn-gremlin\lib\site-packages\gremlin_python\driver\connection.py", line 45, in connect
self._transport.connect(self._url, self._headers)
File "C:\Users\PrasaRak\Miniconda3\envs\learn-gremlin\lib\site-packages\gremlin_python\driver\tornado\transport.py", line 40, in connect
self._ws = self._loop.run_sync(
File "C:\Users\PrasaRak\Miniconda3\envs\learn-gremlin\lib\site-packages\tornado\ioloop.py", line 576, in run_sync
return future_cell[0].result()
File "C:\Users\PrasaRak\Miniconda3\envs\learn-gremlin\lib\site-packages\tornado\simple_httpclient.py", line 269, in run
stream = yield self.tcp_client.connect(
File "C:\Users\PrasaRak\Miniconda3\envs\learn-gremlin\lib\site-packages\tornado\gen.py", line 1133, in run
value = future.result()
File "C:\Users\PrasaRak\Miniconda3\envs\learn-gremlin\lib\site-packages\tornado\gen.py", line 1147, in run
yielded = self.gen.send(value)
File "C:\Users\PrasaRak\Miniconda3\envs\learn-gremlin\lib\site-packages\tornado\tcpclient.py", line 232, in connect
af, addr, stream = yield connector.start(connect_timeout=timeout)
File "C:\Users\PrasaRak\Miniconda3\envs\learn-gremlin\lib\site-packages\tornado\tcpclient.py", line 87, in start
self.try_connect(iter(self.primary_addrs))
File "C:\Users\PrasaRak\Miniconda3\envs\learn-gremlin\lib\site-packages\tornado\tcpclient.py", line 104, in try_connect
stream, future = self.connect(af, addr)
File "C:\Users\PrasaRak\Miniconda3\envs\learn-gremlin\lib\site-packages\tornado\tcpclient.py", line 276, in _create_stream
return stream, stream.connect(addr)
File "C:\Users\PrasaRak\Miniconda3\envs\learn-gremlin\lib\site-packages\tornado\iostream.py", line 1325, in connect
self._add_io_state(self.io_loop.WRITE)
File "C:\Users\PrasaRak\Miniconda3\envs\learn-gremlin\lib\site-packages\tornado\iostream.py", line 1157, in _add_io_state
self.io_loop.add_handler(
File "C:\Users\PrasaRak\Miniconda3\envs\learn-gremlin\lib\site-packages\tornado\platform\asyncio.py", line 83, in add_handler
self.asyncio_loop.add_writer(
File "C:\Users\PrasaRak\Miniconda3\envs\learn-gremlin\lib\asyncio\events.py", line 507, in add_writer
raise NotImplementedError
NotImplementedError
I think I got the answer.
The issue was Python 3.8 and Tornado compatibility when it comes to asyncio.
More info is at this link.
The fix was to add the following line in tornado/platform/asyncio.py:
asyncio.set_event_loop_policy(asyncio.WindowsSelectorEventLoopPolicy()) # python-3.8.0a4
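Rather than editing Tornado's installed source, the same policy can be set from your own entry point before any event loop is created. A minimal sketch (the win32 guard is an assumption for keeping the script cross-platform):

import sys
import asyncio

# Python 3.8 on Windows defaults to the Proactor event loop, which does not
# implement add_writer(); Tornado's websocket transport needs the selector loop.
if sys.platform == "win32":
    asyncio.set_event_loop_policy(asyncio.WindowsSelectorEventLoopPolicy())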

table graphite.account_profile does not exist

I have installed Graphite on Ubuntu 18.04 using apt-get.
I have also installed MySQL and created a database and a user in it.
I have run graphite-manage migrate auth to migrate the database models.
But I get this error:
Traceback (most recent call last):
File "/usr/lib/python2.7/dist-packages/django/core/handlers/exception.py", line 41, in inner
response = get_response(request)
File "/usr/lib/python2.7/dist-packages/django/core/handlers/base.py", line 249, in _legacy_get_response
response = self._get_response(request)
File "/usr/lib/python2.7/dist-packages/django/core/handlers/base.py", line 187, in _get_response
response = self.process_exception_by_middleware(e, request)
File "/usr/lib/python2.7/dist-packages/django/core/handlers/base.py", line 185, in _get_response
response = wrapped_callback(request, *callback_args, **callback_kwargs)
File "/usr/lib/python2.7/dist-packages/graphite/composer/views.py", line 35, in composer
profile = getProfile(request)
File "/usr/lib/python2.7/dist-packages/graphite/user_util.py", line 25, in getProfile
return default_profile()
File "/usr/lib/python2.7/dist-packages/graphite/user_util.py", line 44, in default_profile
profile, created = Profile.objects.get_or_create(user=user)
File "/usr/lib/python2.7/dist-packages/django/db/models/manager.py", line 85, in manager_method
return getattr(self.get_queryset(), name)(*args, **kwargs)
File "/usr/lib/python2.7/dist-packages/django/db/models/query.py", line 464, in get_or_create
return self.get(**lookup), False
File "/usr/lib/python2.7/dist-packages/django/db/models/query.py", line 374, in get
num = len(clone)
File "/usr/lib/python2.7/dist-packages/django/db/models/query.py", line 232, in len
self._fetch_all()
File "/usr/lib/python2.7/dist-packages/django/db/models/query.py", line 1118, in _fetch_all
self._result_cache = list(self._iterable_class(self))
File "/usr/lib/python2.7/dist-packages/django/db/models/query.py", line 53, in iter
results = compiler.execute_sql(chunked_fetch=self.chunked_fetch)
File "/usr/lib/python2.7/dist-packages/django/db/models/sql/compiler.py", line 899, in execute_sql
raise original_exception
ProgrammingError: (1146, u"Table 'graphite.account_profile' doesn't exist")
Just execute:
graphite-manage migrate --run-syncdb
And graphite-manage will create account_profile and the other tables for you. The --run-syncdb flag tells Django's migrate command to also create tables for apps that have no migrations.

Getting an error while starting airflow worker

I have installed Airflow and am trying to start the worker on my Mac, but I am getting the following error. I'm unable to identify what is causing this issue.
[2018-05-02 15:37:11,458: CRITICAL/MainProcess] Unrecoverable error: TypeError("Invalid argument(s) 'visibility_timeout' sent to create_engine(), using configuration MySQLDialect_mysqldb/QueuePool/Engine. Please check that the keyword arguments are appropriate for this combination of components.",)
Traceback (most recent call last):
File "/Users/manishz/anaconda2/envs/airflow/lib/python2.7/site-packages/celery/worker/worker.py", line 203, in start
self.blueprint.start(self)
File "/Users/manishz/anaconda2/envs/airflow/lib/python2.7/site-packages/celery/bootsteps.py", line 119, in start
step.start(parent)
File "/Users/manishz/anaconda2/envs/airflow/lib/python2.7/site-packages/celery/bootsteps.py", line 370, in start
return self.obj.start()
File "/Users/manishz/anaconda2/envs/airflow/lib/python2.7/site-packages/celery/worker/consumer/consumer.py", line 320, in start
blueprint.start(self)
File "/Users/manishz/anaconda2/envs/airflow/lib/python2.7/site-packages/celery/bootsteps.py", line 119, in start
step.start(parent)
File "/Users/manishz/anaconda2/envs/airflow/lib/python2.7/site-packages/celery/worker/consumer/tasks.py", line 37, in start
c.connection, on_decode_error=c.on_decode_error,
File "/Users/manishz/anaconda2/envs/airflow/lib/python2.7/site-packages/celery/app/amqp.py", line 302, in TaskConsumer
**kw
File "/Users/manishz/anaconda2/envs/airflow/lib/python2.7/site-packages/kombu/messaging.py", line 386, in __init__
self.revive(self.channel)
File "/Users/manishz/anaconda2/envs/airflow/lib/python2.7/site-packages/kombu/messaging.py", line 408, in revive
self.declare()
File "/Users/manishz/anaconda2/envs/airflow/lib/python2.7/site-packages/kombu/messaging.py", line 421, in declare
queue.declare()
File "/Users/manishz/anaconda2/envs/airflow/lib/python2.7/site-packages/kombu/entity.py", line 605, in declare
self._create_queue(nowait=nowait, channel=channel)
File "/Users/manishz/anaconda2/envs/airflow/lib/python2.7/site-packages/kombu/entity.py", line 614, in _create_queue
self.queue_declare(nowait=nowait, passive=False, channel=channel)
File "/Users/manishz/anaconda2/envs/airflow/lib/python2.7/site-packages/kombu/entity.py", line 649, in queue_declare
nowait=nowait,
File "/Users/manishz/anaconda2/envs/airflow/lib/python2.7/site-packages/kombu/transport/virtual/base.py", line 531, in queue_declare
self._new_queue(queue, **kwargs)
File "/Users/manishz/anaconda2/envs/airflow/lib/python2.7/site-packages/kombu/transport/sqlalchemy/__init__.py", line 82, in _new_queue
self._get_or_create(queue)
File "/Users/manishz/anaconda2/envs/airflow/lib/python2.7/site-packages/kombu/transport/sqlalchemy/__init__.py", line 70, in _get_or_create
obj = self.session.query(self.queue_cls) \
File "/Users/manishz/anaconda2/envs/airflow/lib/python2.7/site-packages/kombu/transport/sqlalchemy/__init__.py", line 65, in session
_, Session = self._open()
File "/Users/manishz/anaconda2/envs/airflow/lib/python2.7/site-packages/kombu/transport/sqlalchemy/__init__.py", line 56, in _open
engine = self._engine_from_config()
File "/Users/manishz/anaconda2/envs/airflow/lib/python2.7/site-packages/kombu/transport/sqlalchemy/__init__.py", line 51, in _engine_from_config
return create_engine(conninfo.hostname, **transport_options)
File "/Users/manishz/anaconda2/envs/airflow/lib/python2.7/site-packages/sqlalchemy/engine/__init__.py", line 424, in create_engine
return strategy.create(*args, **kwargs)
File "/Users/manishz/anaconda2/envs/airflow/lib/python2.7/site-packages/sqlalchemy/engine/strategies.py", line 162, in create
engineclass.__name__))
TypeError: Invalid argument(s) 'visibility_timeout' sent to create_engine(), using configuration MySQLDialect_mysqldb/QueuePool/Engine. Please check that the keyword arguments are appropriate for this combination of components.
I'd appreciate any help with it.
Thanks in advance,
Manish
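For anyone debugging the same thing: the bottom of the trace shows kombu's sqlalchemy transport forwarding its transport options straight into create_engine(), so a broker-only option like visibility_timeout blows up there. A minimal reproduction sketch (the connection string and timeout value are placeholders):

from sqlalchemy import create_engine

# create_engine() rejects keyword arguments it does not recognize; passing a
# Celery broker option such as visibility_timeout reproduces the TypeError.
engine = create_engine(
    "mysql://airflow:***@localhost/airflow",
    visibility_timeout=21600,  # hypothetical value from broker_transport_options
)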

Openstack RDO can't connect to MySQL server on '10.0.3.139'

Recently my OpenStack (RDO) setup has been failing with an error. The admin user's password is correct, and I can use MySQL as "keystone" and "root". Here is the detail from /var/log/keystone/keystone.log:
(OperationalError) (2003, "Can't connect to MySQL server on '10.0.3.139' (111)") None None
Traceback (most recent call last):
File "/usr/lib/python2.6/site-packages/keystone/common/wsgi.py", line 238, in __call__
result = method(context, **params)
File "/usr/lib/python2.6/site-packages/keystone/token/controllers.py", line 127, in authenticate
auth_token_data, roles_ref=roles_ref, catalog_ref=catalog_ref)
File "/usr/lib/python2.6/site-packages/keystone/common/manager.py", line 44, in _wrapper
return f(*args, **kw)
File "/usr/lib/python2.6/site-packages/keystone/token/providers/uuid.py", line 386, in issue_v2_token
self.token_api.get_token(token_id)
File "/usr/lib/python2.6/site-packages/keystone/token/core.py", line 128, in get_token
token_ref = self._get_token(unique_id)
File "/usr/lib/python2.6/site-packages/dogpile/cache/region.py", line 936, in decorate
should_cache_fn)
File "/usr/lib/python2.6/site-packages/dogpile/cache/region.py", line 588, in get_or_create
async_creator) as value:
File "/usr/lib/python2.6/site-packages/dogpile/core/dogpile.py", line 158, in __enter__
return self._enter()
File "/usr/lib/python2.6/site-packages/dogpile/core/dogpile.py", line 98, in _enter
generated = self._enter_create(createdtime)
File "/usr/lib/python2.6/site-packages/dogpile/core/dogpile.py", line 149, in _enter_create
created = self.creator()
File "/usr/lib/python2.6/site-packages/dogpile/cache/region.py", line 565, in gen_value
created_value = creator()
File "/usr/lib/python2.6/site-packages/dogpile/cache/region.py", line 932, in creator
return fn(*arg, **kw)
File "/usr/lib/python2.6/site-packages/keystone/token/core.py", line 140, in _get_token
return self.driver.get_token(token_id)
File "/usr/lib/python2.6/site-packages/keystone/token/backends/sql.py", line 46, in get_token
token_ref = session.query(TokenModel).get(token_id)
File "/usr/lib64/python2.6/site-packages/SQLAlchemy-0.7.8-py2.6-linux-x86_64.egg/sqlalchemy/orm/query.py", line 775, in get
return self._load_on_ident(key)
File "/usr/lib64/python2.6/site-packages/SQLAlchemy-0.7.8-py2.6-linux-x86_64.egg/sqlalchemy/orm/query.py", line 2512, in _load_on_ident
return q.one()
File "/usr/lib64/python2.6/site-packages/SQLAlchemy-0.7.8-py2.6-linux-x86_64.egg/sqlalchemy/orm/query.py", line 2184, in one
ret = list(self)
File "/usr/lib64/python2.6/site-packages/SQLAlchemy-0.7.8-py2.6-linux-x86_64.egg/sqlalchemy/orm/query.py", line 2227, in __iter__
return self._execute_and_instances(context)
File "/usr/lib64/python2.6/site-packages/SQLAlchemy-0.7.8-py2.6-linux-x86_64.egg/sqlalchemy/orm/query.py", line 2240, in _execute_and_instances
close_with_result=True)
File "/usr/lib64/python2.6/site-packages/SQLAlchemy-0.7.8-py2.6-linux-x86_64.egg/sqlalchemy/orm/query.py", line 2231, in _connection_from_session
**kw)
File "/usr/lib64/python2.6/site-packages/SQLAlchemy-0.7.8-py2.6-linux-x86_64.egg/sqlalchemy/orm/session.py", line 730, in connection
close_with_result=close_with_result)
File "/usr/lib64/python2.6/site-packages/SQLAlchemy-0.7.8-py2.6-linux-x86_64.egg/sqlalchemy/orm/session.py", line 736, in _connection_for_bind
return engine.contextual_connect(**kwargs)
File "/usr/lib64/python2.6/site-packages/SQLAlchemy-0.7.8-py2.6-linux-x86_64.egg/sqlalchemy/engine/base.py", line 2490, in contextual_connect
self.pool.connect(),
File "/usr/lib64/python2.6/site-packages/SQLAlchemy-0.7.8-py2.6-linux-x86_64.egg/sqlalchemy/pool.py", line 224, in connect
return _ConnectionFairy(self).checkout()
File "/usr/lib64/python2.6/site-packages/SQLAlchemy-0.7.8-py2.6-linux-x86_64.egg/sqlalchemy/pool.py", line 387, in __init__
rec = self._connection_record = pool._do_get()
File "/usr/lib64/python2.6/site-packages/SQLAlchemy-0.7.8-py2.6-linux-x86_64.egg/sqlalchemy/pool.py", line 741, in _do_get
con = self._create_connection()
File "/usr/lib64/python2.6/site-packages/SQLAlchemy-0.7.8-py2.6-linux-x86_64.egg/sqlalchemy/pool.py", line 188, in _create_connection
return _ConnectionRecord(self)
File "/usr/lib64/python2.6/site-packages/SQLAlchemy-0.7.8-py2.6-linux-x86_64.egg/sqlalchemy/pool.py", line 270, in __init__
self.connection = self.__connect()
File "/usr/lib64/python2.6/site-packages/SQLAlchemy-0.7.8-py2.6-linux-x86_64.egg/sqlalchemy/pool.py", line 330, in __connect
connection = self.__pool._creator()
File "/usr/lib64/python2.6/site-packages/SQLAlchemy-0.7.8-py2.6-linux-x86_64.egg/sqlalchemy/engine/strategies.py", line 80, in connect
return dialect.connect(*cargs, **cparams)
File "/usr/lib64/python2.6/site-packages/SQLAlchemy-0.7.8-py2.6-linux-x86_64.egg/sqlalchemy/engine/default.py", line 281, in connect
return self.dbapi.connect(*cargs, **cparams)
File "/usr/lib64/python2.6/site-packages/MySQLdb/__init__.py", line 81, in Connect
return Connection(*args, **kwargs)
File "/usr/lib64/python2.6/site-packages/MySQLdb/connections.py", line 187, in __init__
super(Connection, self).__init__(*args, **kwargs2)
OperationalError: (OperationalError) (2003, "Can't connect to MySQL server on '10.0.3.139' (111)") None None
What's wrong with MySQL? Could someone give me some advice?
Make sure the MySQL service is running using
service mysql status
Next, make sure it is listening on "10.0.3.139".
By default MySQL listens only on localhost.
To change that, edit /etc/mysql/my.cnf and set bind-address to 0.0.0.0, then restart the service. This makes it listen on all available interfaces.
Finally, make sure the MySQL database, user, and password are configured properly in /etc/keystone/keystone.conf, and that the user and database have actually been created in MySQL.
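To confirm the grants and bind-address actually work from the Keystone host, here is a quick hedged check with MySQLdb (host, user, and password are placeholders):

import MySQLdb

# If connect() fails with error 2003, MySQL is not reachable on that address;
# if it fails with 1045, the credentials or grants are wrong.
conn = MySQLdb.connect(host="10.0.3.139", user="keystone",
                       passwd="KEYSTONE_DBPASS", db="keystone")
conn.close()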
See the link below for more details.
http://docs.openstack.org/grizzly/openstack-compute/install/apt/content/install-keystone.html
