I've added a large list of CSV files to my DVC repository, but when I try to run dvc push it complains with
ERROR: unexpected error - KeyError('fileSize')
Edit:
Searching around, it seems it might help to include the verbose log for this error.
T11:27:08~/documents/*****/data$ dvc push -v
2022-02-01 11:32:13,186 DEBUG: Adding '/home/jhylands/Documents/*****/.dvc/config.local' to gitignore file.
2022-02-01 11:32:13,199 DEBUG: Adding '/home/jhylands/Documents/*****/.dvc/tmp' to gitignore file.
2022-02-01 11:32:13,200 DEBUG: Adding '/home/jhylands/Documents/*****/.dvc/cache' to gitignore file.
2022-02-01 11:32:14,102 DEBUG: Preparing to transfer data from '/home/jhylands/Documents/*****/.dvc/cache' to '*********'
2022-02-01 11:32:14,102 DEBUG: Preparing to collect status from '********'
2022-02-01 11:32:14,103 DEBUG: Collecting status from '*******'
2022-02-01 11:32:14,439 DEBUG: GDrive remote auth with config '{'client_config_backend': 'settings', 'client_config_file': 'client_secrets.json', 'save_credentials': True, 'oauth_scope': ['https://www.googleapis.com/auth/drive', 'https://www.googleapis.com/auth/drive.appdata'], 'save_credentials_backend': 'file', 'save_credentials_file': '/home/jhylands/Documents/*****/.dvc/tmp/gdrive-user-credentials.json', 'get_refresh_token': True, 'client_config': {'client_id': '*****.apps.googleusercontent.com', 'client_secret': '****************', 'auth_uri': 'https://accounts.google.com/o/oauth2/auth', 'token_uri': 'https://oauth2.googleapis.com/token', 'revoke_uri': 'https://oauth2.googleapis.com/revoke', 'redirect_uri': ''}}'.
2022-02-01 11:32:14,994 DEBUG: Estimated remote size: 256 files
2022-02-01 11:32:14,995 DEBUG: Querying '316' hashes via traverse
2022-02-01 11:32:15,325 ERROR: unexpected error - KeyError('fileSize')
------------------------------------------------------------
Traceback (most recent call last):
File "/home/jhylands/.local/lib/python3.8/site-packages/pydrive2/files.py", line 226, in __getitem__
return dict.__getitem__(self, key)
KeyError: 'fileSize'
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/home/jhylands/.local/lib/python3.8/site-packages/dvc/main.py", line 55, in main
ret = cmd.do_run()
File "/home/jhylands/.local/lib/python3.8/site-packages/dvc/command/base.py", line 45, in do_run
return self.run()
File "/home/jhylands/.local/lib/python3.8/site-packages/dvc/command/data_sync.py", line 57, in run
processed_files_count = self.repo.push(
File "/home/jhylands/.local/lib/python3.8/site-packages/dvc/repo/__init__.py", line 49, in wrapper
return f(repo, *args, **kwargs)
File "/home/jhylands/.local/lib/python3.8/site-packages/dvc/repo/push.py", line 56, in push
pushed += self.cloud.push(
File "/home/jhylands/.local/lib/python3.8/site-packages/dvc/data_cloud.py", line 85, in push
return transfer(
File "/home/jhylands/.local/lib/python3.8/site-packages/dvc/objects/transfer.py", line 153, in transfer
status = compare_status(src, dest, obj_ids, check_deleted=False, **kwargs)
File "/home/jhylands/.local/lib/python3.8/site-packages/dvc/objects/status.py", line 158, in compare_status
dest_exists, dest_missing = status(
File "/home/jhylands/.local/lib/python3.8/site-packages/dvc/objects/status.py", line 131, in status
exists.update(odb.hashes_exist(hashes, name=odb.fs_path, **kwargs))
File "/home/jhylands/.local/lib/python3.8/site-packages/dvc/objects/db/base.py", line 499, in hashes_exist
remote_hashes = set(
File "/home/jhylands/.local/lib/python3.8/site-packages/dvc/objects/db/base.py", line 334, in _list_hashes_traverse
yield from itertools.chain.from_iterable(in_remote)
File "/usr/lib/python3.8/concurrent/futures/_base.py", line 611, in result_iterator
yield fs.pop().result()
File "/usr/lib/python3.8/concurrent/futures/_base.py", line 439, in result
return self.__get_result()
File "/usr/lib/python3.8/concurrent/futures/_base.py", line 388, in __get_result
raise self._exception
File "/usr/lib/python3.8/concurrent/futures/thread.py", line 57, in run
result = self.fn(*self.args, **self.kwargs)
File "/home/jhylands/.local/lib/python3.8/site-packages/dvc/objects/db/base.py", line 324, in list_with_update
return list(
File "/home/jhylands/.local/lib/python3.8/site-packages/dvc/objects/db/base.py", line 215, in _list_hashes
for path in self._list_paths(prefix, progress_callback):
File "/home/jhylands/.local/lib/python3.8/site-packages/dvc/objects/db/base.py", line 195, in _list_paths
for file_info in self.fs.find(fs_path, prefix=prefix):
File "/home/jhylands/.local/lib/python3.8/site-packages/dvc/fs/fsspec_wrapper.py", line 107, in find
yield from self.fs.find(path)
File "/home/jhylands/.local/lib/python3.8/site-packages/pydrive2/fs/spec.py", line 323, in find
"size": int(item["fileSize"]),
File "/home/jhylands/.local/lib/python3.8/site-packages/pydrive2/files.py", line 229, in __getitem__
raise KeyError(e)
KeyError: KeyError('fileSize')
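For anyone else who hits this: the failing line is in pydrive2's fsspec wrapper, which assumes every item returned by the Drive API carries a fileSize field. In Drive v2 metadata, fileSize is only present for binary content, so a Google-native file (a Doc or Sheet, say) or a stray folder in the remote would trigger exactly this KeyError. A minimal defensive sketch of that mapping (hypothetical, not the actual pydrive2 code or its eventual fix):

def item_to_entry(item):
    # item is the raw Drive v2 metadata dict for one remote object.
    # Folders and Google-native files (Docs, Sheets) have no "fileSize",
    # so look it up defensively instead of item["fileSize"].
    return {
        "name": item["title"],
        "type": "file",
        "size": int(item.get("fileSize", 0)),
    }

Until the library handles this, removing non-binary files from the remote folder, or trying a newer pydrive2, may be the practical workaround.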
On my FreeBSD machine I have a file packages.sls at the following path: /usr/local/etc/salt/states
I'm getting the following error message when I run salt '*' state.apply packages:
freebsd:
Data failed to compile:
----------
Pillar failed to render with the following messages:
----------
Rendering SLS 'config' failed. Please see master log for details.
In the master log I have the following details:
2022-06-02 10:05:12,222 [salt.roster :104 ][ERROR ][3425] Can't access roster for backend flat: Roster file "/usr/local/etc/salt/roster" not found
2022-06-02 10:05:12,434 [salt.pillar :900 ][CRITICAL][3427] Rendering SLS 'config' failed, render error:
found unexpected end of stream
Traceback (most recent call last):
File "/usr/local/lib/python3.8/site-packages/salt/renderers/yaml.py", line 62, in render
data = yamlloader.load(yaml_data, Loader=get_yaml_loader(argline))
File "/usr/local/lib/python3.8/site-packages/salt/utils/yamlloader.py", line 169, in load
return yaml.load(stream, Loader=Loader)
File "/usr/local/lib/python3.8/site-packages/yaml/__init__.py", line 114, in load
return loader.get_single_data()
File "/usr/local/lib/python3.8/site-packages/yaml/constructor.py", line 49, in get_single_data
node = self.get_single_node()
File "yaml/_yaml.pyx", line 707, in yaml._yaml.CParser.get_single_node
File "yaml/_yaml.pyx", line 725, in yaml._yaml.CParser._compose_document
File "yaml/_yaml.pyx", line 776, in yaml._yaml.CParser._compose_node
File "yaml/_yaml.pyx", line 890, in yaml._yaml.CParser._compose_mapping_node
File "yaml/_yaml.pyx", line 776, in yaml._yaml.CParser._compose_node
File "yaml/_yaml.pyx", line 892, in yaml._yaml.CParser._compose_mapping_node
File "yaml/_yaml.pyx", line 905, in yaml._yaml.CParser._parse_next_event
yaml.scanner.ScannerError: while scanning a quoted scalar
in "<unicode string>", line 3, column 27
found unexpected end of stream
in "<unicode string>", line 4, column 1
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/usr/local/lib/python3.8/site-packages/salt/pillar/__init__.py", line 887, in render_pstate
state = compile_template(
File "/usr/local/lib/python3.8/site-packages/salt/template.py", line 99, in compile_template
ret = render(input_data, saltenv, sls, **render_kwargs)
File "/usr/local/lib/python3.8/site-packages/salt/loader/lazy.py", line 149, in __call__
return self.loader.run(run_func, *args, **kwargs)
File "/usr/local/lib/python3.8/site-packages/salt/loader/lazy.py", line 1201, in run
return self._last_context.run(self._run_as, _func_or_method, *args, **kwargs)
File "/usr/local/lib/python3.8/site-packages/salt/loader/lazy.py", line 1216, in _run_as
return _func_or_method(*args, **kwargs)
File "/usr/local/lib/python3.8/site-packages/salt/renderers/yaml.py", line 66, in render
raise SaltRenderError(err_type, line_num, exc.problem_mark.buffer)
salt.exceptions.SaltRenderError: found unexpected end of stream
2022-06-02 10:05:12,435 [salt.pillar :1224][CRITICAL][3427] Pillar render error: Rendering SLS 'config' failed. Please see master log for details.
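Note that the traceback complains about the pillar SLS named 'config', not about packages.sls: PyYAML hit an unterminated quoted scalar at line 3, column 27 of that pillar file. You can reproduce the same ScannerError outside Salt with plain PyYAML; a minimal sketch (the pillar content below is made up, check your own config.sls around line 3):

import yaml

# Hypothetical pillar snippet with an unterminated quote on line 3,
# mirroring "while scanning a quoted scalar ... line 3, column 27".
broken = """\
config:
  some_key: value
  another_key: 'unterminated
"""
try:
    yaml.safe_load(broken)
except yaml.scanner.ScannerError as err:
    print(err)  # ... found unexpected end of stream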
My SLS file packages.sls:
install_bash:
pkg.installed:
- pkgs:
- bash
- vim
- curl
Any idea how to solve this situation?
Thank you
It was a DNS/cache problem. The issue was solved after changing the hostname in minion.id, clearing the cache, accepting the new key, and restarting.
Hi, I have encountered the following problem when running the function install_tensorflow() in R.
I don't know what to do with this error. Any help will be greatly appreciated!
Below is the error message:
ERROR: Exception:
Traceback (most recent call last):
File "/Users/alexsong/Library/r-miniconda/envs/r-reticulate/lib/python3.7/site-packages/pip/_vendor/resolvelib/resolvers.py", line 171, in _merge_into_criterion
crit = self.state.criteria[name]
KeyError: 'tensorflow'
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/Users/alexsong/Library/r-miniconda/envs/r-reticulate/lib/python3.7/site-packages/pip/_vendor/urllib3/response.py", line 438, in _error_catcher
yield
File "/Users/alexsong/Library/r-miniconda/envs/r-reticulate/lib/python3.7/site-packages/pip/_vendor/urllib3/response.py", line 519, in read
data = self._fp.read(amt) if not fp_closed else b""
File "/Users/alexsong/Library/r-miniconda/envs/r-reticulate/lib/python3.7/site-packages/pip/_vendor/cachecontrol/filewrapper.py", line 62, in read
data = self.__fp.read(amt)
File "/Users/alexsong/Library/r-miniconda/envs/r-reticulate/lib/python3.7/http/client.py", line 461, in read
n = self.readinto(b)
File "/Users/alexsong/Library/r-miniconda/envs/r-reticulate/lib/python3.7/http/client.py", line 505, in readinto
n = self.fp.readinto(b)
File "/Users/alexsong/Library/r-miniconda/envs/r-reticulate/lib/python3.7/socket.py", line 589, in readinto
return self._sock.recv_into(b)
File "/Users/alexsong/Library/r-miniconda/envs/r-reticulate/lib/python3.7/ssl.py", line 1071, in recv_into
return self.read(nbytes, buffer)
File "/Users/alexsong/Library/r-miniconda/envs/r-reticulate/lib/python3.7/ssl.py", line 929, in read
return self._sslobj.read(len, buffer)
socket.timeout: The read operation timed out
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/Users/alexsong/Library/r-miniconda/envs/r-reticulate/lib/python3.7/site-packages/pip/_internal/cli/base_command.py", line 189, in _main
status = self.run(options, args)
File "/Users/alexsong/Library/r-miniconda/envs/r-reticulate/lib/python3.7/site-packages/pip/_internal/cli/req_command.py", line 178, in wrapper
return func(self, options, args)
File "/Users/alexsong/Library/r-miniconda/envs/r-reticulate/lib/python3.7/site-packages/pip/_internal/commands/install.py", line 317, in run
reqs, check_supported_wheels=not options.target_dir
File "/Users/alexsong/Library/r-miniconda/envs/r-reticulate/lib/python3.7/site-packages/pip/_internal/resolution/resolvelib/resolver.py", line 122, in resolve
requirements, max_rounds=try_to_avoid_resolution_too_deep,
File "/Users/alexsong/Library/r-miniconda/envs/r-reticulate/lib/python3.7/site-packages/pip/_vendor/resolvelib/resolvers.py", line 453, in resolve
state = resolution.resolve(requirements, max_rounds=max_rounds)
File "/Users/alexsong/Library/r-miniconda/envs/r-reticulate/lib/python3.7/site-packages/pip/_vendor/resolvelib/resolvers.py", line 318, in resolve
name, crit = self._merge_into_criterion(r, parent=None)
File "/Users/alexsong/Library/r-miniconda/envs/r-reticulate/lib/python3.7/site-packages/pip/_vendor/resolvelib/resolvers.py", line 173, in _merge_into_criterion
crit = Criterion.from_requirement(self._p, requirement, parent)
File "/Users/alexsong/Library/r-miniconda/envs/r-reticulate/lib/python3.7/site-packages/pip/_vendor/resolvelib/resolvers.py", line 82, in from_requirement
if not cands:
File "/Users/alexsong/Library/r-miniconda/envs/r-reticulate/lib/python3.7/site-packages/pip/_vendor/resolvelib/structs.py", line 124, in bool
return bool(self._sequence)
File "/Users/alexsong/Library/r-miniconda/envs/r-reticulate/lib/python3.7/site-packages/pip/_internal/resolution/resolvelib/found_candidates.py", line 143, in bool
return any(self)
File "/Users/alexsong/Library/r-miniconda/envs/r-reticulate/lib/python3.7/site-packages/pip/_internal/resolution/resolvelib/found_candidates.py", line 38, in _iter_built
candidate = func()
File "/Users/alexsong/Library/r-miniconda/envs/r-reticulate/lib/python3.7/site-packages/pip/_internal/resolution/resolvelib/factory.py", line 169, in _make_candidate_from_link
name=name, version=version,
File "/Users/alexsong/Library/r-miniconda/envs/r-reticulate/lib/python3.7/site-packages/pip/_internal/resolution/resolvelib/candidates.py", line 306, in init
version=version,
File "/Users/alexsong/Library/r-miniconda/envs/r-reticulate/lib/python3.7/site-packages/pip/_internal/resolution/resolvelib/candidates.py", line 144, in init
self.dist = self._prepare()
File "/Users/alexsong/Library/r-miniconda/envs/r-reticulate/lib/python3.7/site-packages/pip/_internal/resolution/resolvelib/candidates.py", line 226, in _prepare
dist = self._prepare_distribution()
File "/Users/alexsong/Library/r-miniconda/envs/r-reticulate/lib/python3.7/site-packages/pip/_internal/resolution/resolvelib/candidates.py", line 312, in _prepare_distribution
self._ireq, parallel_builds=True,
File "/Users/alexsong/Library/r-miniconda/envs/r-reticulate/lib/python3.7/site-packages/pip/_internal/operations/prepare.py", line 457, in prepare_linked_requirement
return self._prepare_linked_requirement(req, parallel_builds)
File "/Users/alexsong/Library/r-miniconda/envs/r-reticulate/lib/python3.7/site-packages/pip/_internal/operations/prepare.py", line 482, in _prepare_linked_requirement
self.download_dir, hashes,
File "/Users/alexsong/Library/r-miniconda/envs/r-reticulate/lib/python3.7/site-packages/pip/_internal/operations/prepare.py", line 234, in unpack_url
hashes=hashes,
File "/Users/alexsong/Library/r-miniconda/envs/r-reticulate/lib/python3.7/site-packages/pip/_internal/operations/prepare.py", line 108, in get_http_url
from_path, content_type = download(link, temp_dir.path)
File "/Users/alexsong/Library/r-miniconda/envs/r-reticulate/lib/python3.7/site-packages/pip/_internal/network/download.py", line 163, in call
for chunk in chunks:
File "/Users/alexsong/Library/r-miniconda/envs/r-reticulate/lib/python3.7/site-packages/pip/_internal/cli/progress_bars.py", line 159, in iter
for x in it:
File "/Users/alexsong/Library/r-miniconda/envs/r-reticulate/lib/python3.7/site-packages/pip/_internal/network/utils.py", line 88, in response_chunks
decode_content=False,
File "/Users/alexsong/Library/r-miniconda/envs/r-reticulate/lib/python3.7/site-packages/pip/_vendor/urllib3/response.py", line 576, in stream
data = self.read(amt=amt, decode_content=decode_content)
File "/Users/alexsong/Library/r-miniconda/envs/r-reticulate/lib/python3.7/site-packages/pip/_vendor/urllib3/response.py", line 541, in read
raise IncompleteRead(self._fp_bytes_read, self.length_remaining)
File "/Users/alexsong/Library/r-miniconda/envs/r-reticulate/lib/python3.7/contextlib.py", line 130, in exit
self.gen.throw(type, value, traceback)
File "/Users/alexsong/Library/r-miniconda/envs/r-reticulate/lib/python3.7/site-packages/pip/_vendor/urllib3/response.py", line 443, in _error_catcher
raise ReadTimeoutError(self._pool, None, "Read timed out.")
pip._vendor.urllib3.exceptions.ReadTimeoutError: HTTPSConnectionPool(host='files.pythonhosted.org', port=443): Read timed out.
Error: Error installing package(s): 'tensorflow==2.4'
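The root cause is just a network timeout while pip downloads the tensorflow wheel (it is several hundred MB): note the socket.timeout and ReadTimeoutError frames. One workaround, sketched under the assumption that you invoke it with the r-reticulate environment's Python, is to preinstall the wheel with a longer timeout and more retries, then rerun install_tensorflow() in R:

# Run with the r-reticulate env's python so the wheel lands where
# reticulate will look for it. pip's default download timeout is 15 s.
import subprocess
import sys

subprocess.check_call([
    sys.executable, "-m", "pip", "install",
    "--default-timeout", "1000",
    "--retries", "10",
    "tensorflow==2.4",
])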
I'm trying to make use of the BigQueryHook but I am unable to get the service account authentication working.
I've followed the steps provided by Google and have copied the JSON file into the data/ directory in the environment's GCS bucket.
The Airflow Connection details have been filled in:
JSON Keyfile Path: /home/airflow/gcs/data/my-key-file.json
Keyfile JSON: content of the JSON file
Scopes: https://www.googleapis.com/auth/cloud-platform
Error in Stackdriver:
Traceback (most recent call last):
File "/usr/local/lib/airflow/airflow/models.py", line 374, in process_file
m = imp.load_source(mod_name, filepath)
File "/opt/python3.6/lib/python3.6/imp.py", line 172, in load_source
module = _load(spec)
File "<frozen importlib._bootstrap>", line 684, in _load
File "<frozen importlib._bootstrap>", line 665, in _load_unlocked
File "<frozen importlib._bootstrap_external>", line 678, in exec_module
File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
File "/home/airflow/gcs/dags/cloud_sql_to_bq.py", line 141, in <module>
df = get_config()
File "/home/airflow/gcs/dags/cloud_sql_to_bq.py", line 71, in get_config
bq_client = bigquery.Client(project=bq_hook._get_field("my-project"), credentials=bq_hook._get_credentials())
File "/usr/local/lib/airflow/airflow/contrib/hooks/gcp_api_base_hook.py", line 103, in _get_credentials
key_path, scopes=scopes)
File "/opt/python3.6/lib/python3.6/site-packages/google/oauth2/service_account.py", line 209, in from_service_account_file
filename, require=['client_email', 'token_uri'])
File "/opt/python3.6/lib/python3.6/site-packages/google/auth/_service_account_info.py", line 71, in from_filename
with io.open(filename, 'r', encoding='utf-8') as json_file:
FileNotFoundError: [Errno 2] No such file or directory: '/home/airflow/gcs/data/my-key-file.json'
Any idea why it can't see the JSON file?
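One clue in the traceback: the failure happens in <module> of cloud_sql_to_bq.py, so the key file is read while the DAG file is being parsed, not while a task runs. In Cloud Composer, /home/airflow/gcs/data is mounted on the workers, but the webserver, which also parses DAG files, runs in a tenant project without that mount, and would raise exactly this FileNotFoundError. A sketch of a more defensive layout (the connection id is an assumption, not your actual config):

from google.cloud import bigquery

def get_config():
    # Build the hook and client lazily, inside the callable, instead of at
    # module import time where the webserver's DAG parser would also run it.
    from airflow.contrib.hooks.bigquery_hook import BigQueryHook
    bq_hook = BigQueryHook(bigquery_conn_id="my_gcp_connection")  # hypothetical id
    return bigquery.Client(
        # _get_field() takes the extra-field name, e.g. "project", rather
        # than a project id, so _get_field("my-project") looks suspect too.
        project=bq_hook._get_field("project"),
        credentials=bq_hook._get_credentials(),
    )

Then call get_config() from inside a PythonOperator callable rather than at module top level (the original df = get_config() runs on every parse).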
I have installed Graphite on Ubuntu 18.04 using apt-get.
I have also installed MySQL and created a database as well as a user in it.
I have run graphite-manage migrate auth to migrate the database models.
But I get this error:
Traceback (most recent call last):
File "/usr/lib/python2.7/dist-packages/django/core/handlers/exception.py", line 41, in inner
response = get_response(request)
File "/usr/lib/python2.7/dist-packages/django/core/handlers/base.py", line 249, in _legacy_get_response
response = self._get_response(request)
File "/usr/lib/python2.7/dist-packages/django/core/handlers/base.py", line 187, in _get_response
response = self.process_exception_by_middleware(e, request)
File "/usr/lib/python2.7/dist-packages/django/core/handlers/base.py", line 185, in _get_response
response = wrapped_callback(request, *callback_args, **callback_kwargs)
File "/usr/lib/python2.7/dist-packages/graphite/composer/views.py", line 35, in composer
profile = getProfile(request)
File "/usr/lib/python2.7/dist-packages/graphite/user_util.py", line 25, in getProfile
return default_profile()
File "/usr/lib/python2.7/dist-packages/graphite/user_util.py", line 44, in default_profile
profile, created = Profile.objects.get_or_create(user=user)
File "/usr/lib/python2.7/dist-packages/django/db/models/manager.py", line 85, in manager_method
return getattr(self.get_queryset(), name)(*args, **kwargs)
File "/usr/lib/python2.7/dist-packages/django/db/models/query.py", line 464, in get_or_create
return self.get(**lookup), False
File "/usr/lib/python2.7/dist-packages/django/db/models/query.py", line 374, in get
num = len(clone)
File "/usr/lib/python2.7/dist-packages/django/db/models/query.py", line 232, in len
self._fetch_all()
File "/usr/lib/python2.7/dist-packages/django/db/models/query.py", line 1118, in _fetch_all
self._result_cache = list(self._iterable_class(self))
File "/usr/lib/python2.7/dist-packages/django/db/models/query.py", line 53, in iter
results = compiler.execute_sql(chunked_fetch=self.chunked_fetch)
File "/usr/lib/python2.7/dist-packages/django/db/models/sql/compiler.py", line 899, in execute_sql
raise original_exception
ProgrammingError: (1146, u"Table 'graphite.account_profile' doesn't exist")
Just execute:
graphite-manage migrate --run-syncdb
And graphite-manage will create account_profile and the other tables for you.
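For context, migrate --run-syncdb creates tables for apps that ship without migrations, which is where graphite-web's account_profile comes from on this Django version. As a quick sanity check afterwards, a sketch using the same MySQLdb driver Django is using (host, credentials, and database name are placeholders for your own):

import MySQLdb

conn = MySQLdb.connect(host="localhost", user="graphite",
                       passwd="your-password", db="graphite")
cur = conn.cursor()
cur.execute("SHOW TABLES LIKE 'account_profile'")
print(cur.fetchall())  # should print (('account_profile',),) after --run-syncdb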
I have installed Airflow and am trying to start the worker on my Mac, but I am getting the following error. I'm unable to identify what is causing this issue.
[2018-05-02 15:37:11,458: CRITICAL/MainProcess] Unrecoverable error: TypeError("Invalid argument(s) 'visibility_timeout' sent to create_engine(), using configuration MySQLDialect_mysqldb/QueuePool/Engine. Please check that the keyword arguments are appropriate for this combination of components.",)
Traceback (most recent call last):
File "/Users/manishz/anaconda2/envs/airflow/lib/python2.7/site-packages/celery/worker/worker.py", line 203, in start
self.blueprint.start(self)
File "/Users/manishz/anaconda2/envs/airflow/lib/python2.7/site-packages/celery/bootsteps.py", line 119, in start
step.start(parent)
File "/Users/manishz/anaconda2/envs/airflow/lib/python2.7/site-packages/celery/bootsteps.py", line 370, in start
return self.obj.start()
File "/Users/manishz/anaconda2/envs/airflow/lib/python2.7/site-packages/celery/worker/consumer/consumer.py", line 320, in start
blueprint.start(self)
File "/Users/manishz/anaconda2/envs/airflow/lib/python2.7/site-packages/celery/bootsteps.py", line 119, in start
step.start(parent)
File "/Users/manishz/anaconda2/envs/airflow/lib/python2.7/site-packages/celery/worker/consumer/tasks.py", line 37, in start
c.connection, on_decode_error=c.on_decode_error,
File "/Users/manishz/anaconda2/envs/airflow/lib/python2.7/site-packages/celery/app/amqp.py", line 302, in TaskConsumer
**kw
File "/Users/manishz/anaconda2/envs/airflow/lib/python2.7/site-packages/kombu/messaging.py", line 386, in __init__
self.revive(self.channel)
File "/Users/manishz/anaconda2/envs/airflow/lib/python2.7/site-packages/kombu/messaging.py", line 408, in revive
self.declare()
File "/Users/manishz/anaconda2/envs/airflow/lib/python2.7/site-packages/kombu/messaging.py", line 421, in declare
queue.declare()
File "/Users/manishz/anaconda2/envs/airflow/lib/python2.7/site-packages/kombu/entity.py", line 605, in declare
self._create_queue(nowait=nowait, channel=channel)
File "/Users/manishz/anaconda2/envs/airflow/lib/python2.7/site-packages/kombu/entity.py", line 614, in _create_queue
self.queue_declare(nowait=nowait, passive=False, channel=channel)
File "/Users/manishz/anaconda2/envs/airflow/lib/python2.7/site-packages/kombu/entity.py", line 649, in queue_declare
nowait=nowait,
File "/Users/manishz/anaconda2/envs/airflow/lib/python2.7/site-packages/kombu/transport/virtual/base.py", line 531, in queue_declare
self._new_queue(queue, **kwargs)
File "/Users/manishz/anaconda2/envs/airflow/lib/python2.7/site-packages/kombu/transport/sqlalchemy/__init__.py", line 82, in _new_queue
self._get_or_create(queue)
File "/Users/manishz/anaconda2/envs/airflow/lib/python2.7/site-packages/kombu/transport/sqlalchemy/__init__.py", line 70, in _get_or_create
obj = self.session.query(self.queue_cls) \
File "/Users/manishz/anaconda2/envs/airflow/lib/python2.7/site-packages/kombu/transport/sqlalchemy/__init__.py", line 65, in session
_, Session = self._open()
File "/Users/manishz/anaconda2/envs/airflow/lib/python2.7/site-packages/kombu/transport/sqlalchemy/__init__.py", line 56, in _open
engine = self._engine_from_config()
File "/Users/manishz/anaconda2/envs/airflow/lib/python2.7/site-packages/kombu/transport/sqlalchemy/__init__.py", line 51, in _engine_from_config
return create_engine(conninfo.hostname, **transport_options)
File "/Users/manishz/anaconda2/envs/airflow/lib/python2.7/site-packages/sqlalchemy/engine/__init__.py", line 424, in create_engine
return strategy.create(*args, **kwargs)
File "/Users/manishz/anaconda2/envs/airflow/lib/python2.7/site-packages/sqlalchemy/engine/strategies.py", line 162, in create
engineclass.__name__))
TypeError: Invalid argument(s) 'visibility_timeout' sent to create_engine(), using configuration MySQLDialect_mysqldb/QueuePool/Engine. Please check that the keyword arguments are appropriate for this combination of components.
Appreciate any help on it.
Thanks in advance,
Manish
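The kombu/transport/sqlalchemy frames show Celery is using its SQLAlchemy broker, i.e. broker_url points at MySQL. Airflow's default Celery config sets broker_transport_options = {'visibility_timeout': 21600}; a Redis broker understands that option, but the SQLAlchemy transport forwards it verbatim to create_engine(), hence the TypeError. Either point broker_url at a real broker (Redis or RabbitMQ) or drop the option. A sketch of the latter, assuming Airflow 1.9+ where celery_config_options in airflow.cfg can point at a custom dict (the module name is made up):

# my_celery_config.py -- put it on the PYTHONPATH and set, in airflow.cfg:
#   [celery]
#   celery_config_options = my_celery_config.CELERY_CONFIG
from airflow.config_templates.default_celery import DEFAULT_CELERY_CONFIG

CELERY_CONFIG = dict(DEFAULT_CELERY_CONFIG)
# visibility_timeout only makes sense for Redis/SQS brokers; the SQLAlchemy
# transport passes unknown transport options straight to create_engine().
CELERY_CONFIG["broker_transport_options"] = {}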