My question is the same as the one here.
Module object has no attribute [CANTERA]
Ray Speth commented, but the OP of that post never responded, so I'm hoping that maybe Ray could help me out.
I installed Cantera and tried to make a gas object by doing the following:
import cantera as ct
gas1 = ct.Solution('gri30.xml')
and I got the error
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
AttributeError: 'module' object has no attribute 'Solution'
I did as the comment suggested and got the following output:
print(ct.__file__)
/usr/local/lib/python2.7/site-packages/cantera/__init__.py
print(ct.__version__)
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
AttributeError: 'module' object has no attribute '__version__'
Check that your Python and Cantera versions are compatible.
Also, check that your mechanism file was correctly assembled (the thermo and kinetics input files merged to make the .cti file).
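For reference, once the install is sorted out, a quick sanity check looks like this (a sketch; note that the bundled GRI-Mech 3.0 mechanism is named gri30.yaml in Cantera 2.5+, while older releases shipped gri30.xml/gri30.cti):
import cantera as ct

print(ct.__version__)              # a working install exposes this, e.g. '2.6.0'
gas1 = ct.Solution('gri30.yaml')   # use 'gri30.xml' or 'gri30.cti' on pre-2.5 releases
gas1.TPX = 300.0, ct.one_atm, 'CH4:1, O2:2, N2:7.52'
print(gas1.report())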
I am trying to export a table from BigQuery to a Google Cloud MySQL database.
I found an operator called BigQueryToMySqlOperator (documented here: https://airflow.apache.org/docs/apache-airflow-providers-google/stable/_api/airflow/providers/google/cloud/transfers/bigquery_to_mysql/index.html?highlight=bigquerytomysqloperator#module-airflow.providers.google.cloud.transfers.bigquery_to_mysql).
When I deploy the DAG containing this task onto Cloud Composer, the task always fails with the error:
Traceback (most recent call last):
File "/opt/python3.8/lib/python3.8/site-packages/airflow/models/taskinstance.py", line 1113, in _run_raw_task
self._prepare_and_execute_task_with_callbacks(context, task)
File "/opt/python3.8/lib/python3.8/site-packages/airflow/models/taskinstance.py", line 1287, in _prepare_and_execute_task_with_callbacks
result = self._execute_task(context, task_copy)
File "/opt/python3.8/lib/python3.8/site-packages/airflow/models/taskinstance.py", line 1317, in _execute_task
result = task_copy.execute(context=context)
File "/opt/python3.8/lib/python3.8/site-packages/airflow/providers/google/cloud/transfers/bigquery_to_mysql.py", line 166, in execute
for rows in self._bq_get_data():
File "/opt/python3.8/lib/python3.8/site-packages/airflow/providers/google/cloud/transfers/bigquery_to_mysql.py", line 138, in _bq_get_data
response = cursor.get_tabledata(
File "/opt/python3.8/lib/python3.8/site-packages/airflow/providers/google/cloud/hooks/bigquery.py", line 2508, in get_tabledata
return self.hook.get_tabledata(*args, **kwargs)
File "/opt/python3.8/lib/python3.8/site-packages/airflow/providers/google/cloud/hooks/bigquery.py", line 1284, in get_tabledata
rows = self.list_rows(dataset_id, table_id, max_results, selected_fields, page_token, start_index)
File "/opt/python3.8/lib/python3.8/site-packages/airflow/providers/google/common/hooks/base_google.py", line 412, in inner_wrapper
raise AirflowException(
airflow.exceptions.AirflowException: You must use keyword arguments in this methods rather than positional
I don't really understand why it is throwing this error. Can anyone help me figure out what went wrong, or how I should export data from BigQuery to a MySQL DB? Many thanks for your help!
EDIT: My operator code basically looks like this:
transfer_data = BigQueryToMySqlOperator(
task_id='task_id',
dataset_table='origin_bq_table',
mysql_table='dest_table_name',
replace=True,
)
Based on the stacktrace, you are most likely using apache-airflow-providers-google==2.2.0.
airflow.exceptions.AirflowException: You must use keyword arguments in this methods rather than positional
This error originates from the GoogleBaseHook, which can be traced back to the BigQueryToMySqlOperator:
BigQueryToMySqlOperator > BigQueryHook > BigQueryConnection > BigQueryCursor > get_tabledata
You are getting the AirflowException because get_tabledata is called as part of the operator's execute method.
Unfortunately, the test for the operator is not comprehensive, since it only checks whether or not the method was called with the correct parameters.
I think this will require a new release of the Google provider in which BigQueryToMySqlOperator calls list_rows with keyword arguments, instead of going through get_tabledata, which calls list_rows with positional arguments.
I have also opened a GitHub issue in the Airflow repository.
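In the meantime, one possible workaround (a sketch, not an official recipe: it bypasses the broken operator and copies the rows yourself; the connection IDs, dataset, and table names below are hypothetical, and it assumes the apache-airflow-providers-mysql package is installed) is to call BigQueryHook.list_rows with keyword arguments from a PythonOperator:
from airflow.operators.python import PythonOperator
from airflow.providers.google.cloud.hooks.bigquery import BigQueryHook
from airflow.providers.mysql.hooks.mysql import MySqlHook

def bq_to_mysql():
    bq_hook = BigQueryHook(gcp_conn_id='google_cloud_default', use_legacy_sql=False)
    mysql_hook = MySqlHook(mysql_conn_id='mysql_default')
    # keyword arguments satisfy the check in GoogleBaseHook that raises
    # "You must use keyword arguments in this methods rather than positional"
    rows = bq_hook.list_rows(
        dataset_id='origin_dataset',   # hypothetical
        table_id='origin_bq_table',
        max_results=1000,
    )
    mysql_hook.insert_rows(
        table='dest_table_name',
        rows=[tuple(row.values()) for row in rows],
        replace=True,
    )

transfer_data = PythonOperator(
    task_id='transfer_data',
    python_callable=bq_to_mysql,
)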
The arango module gives a weird error while accessing databases:
from arango import ArangoClient
client = ArangoClient(hosts='http://localhost:8529/')
sys_db = client.db('_system', username='root', password='root')
sys_db.databases()
Below is the error:
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/home/ubuntu/arangovenv/lib/python3.6/site-packages/arango/database.py", line 699, in databases
    return self._execute(request, response_handler)
  File "/home/ubuntu/arangovenv/lib/python3.6/site-packages/arango/api.py", line 66, in _execute
    return self._executor.execute(request, response_handler)
  File "/home/ubuntu/arangovenv/lib/python3.6/site-packages/arango/executor.py", line 82, in execute
    return response_handler(resp)
  File "/home/ubuntu/arangovenv/lib/python3.6/site-packages/arango/database.py", line 697, in response_handler
    return resp.body['result']
TypeError: string indices must be integers
Calling the database module from "packages/arango/database.py" directly gives me the same error.
my env:
1) Ubuntu 16.04
2) python-arango==5.2.1
any help appreciated.
If you are running it on some server, it may be an issue with the server's proxy settings; it was in my case, at least. I ran the following to clear the proxy and it worked fine:
export http_proxy=''
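Equivalently, you can clear the proxy from inside Python before creating the client (a sketch; python-arango talks to the server through requests, which honors these environment variables):
import os

# drop any proxy settings inherited from the shell so requests
# (used by python-arango) talks to localhost directly
for var in ('http_proxy', 'https_proxy', 'HTTP_PROXY', 'HTTPS_PROXY'):
    os.environ.pop(var, None)

from arango import ArangoClient

client = ArangoClient(hosts='http://localhost:8529/')
sys_db = client.db('_system', username='root', password='root')
print(sys_db.databases())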
As I guessed, resp.body is not the data type the code expects: line 697 of database.py is expecting a dict, but it is getting a string. For example:
>>> data = "MyName"
>>> data[0]
'M'
>>> data['anything']
TypeError: string indices must be integers
The first expression works, while the second throws the error.
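That matches the proxy explanation above: if a proxy answers with an HTML error page, resp.body ends up being a string rather than the parsed JSON dict the driver expects (a hypothetical illustration):
>>> body = "<html>502 Bad Gateway</html>"   # e.g. a proxy error page
>>> body['result']
TypeError: string indices must be integers
>>> body = {'result': ['_system', 'test']}  # the JSON dict the driver expects
>>> body['result']
['_system', 'test']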
I hope this solves your problem.
I'm trying to use datetime.today() to count down seconds, but every time I access an attribute of the result, the interpreter denies that the attribute exists. For example:
from datetime import datetime

x = datetime.today()
x = x.hour
print(x)
will return:
Traceback (most recent call last):
File "C:\Users\manuel\Downloads\graphics master v1.py", line 2, in <module>
x=x.hour
AttributeError: 'builtin_function_or_method' object has no attribute 'hour'
I have tested this in other programs to make sure that this is the correct attribute and syntax, but in my master program I keep getting this error.
datetime.today is a method, not an attribute: without parentheses you are assigning the method object itself, which has no hour attribute.
Try datetime.today() instead.
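A minimal demonstration of the difference:
from datetime import datetime

x = datetime.today      # no parentheses: x is the method object itself
print(type(x))          # <class 'builtin_function_or_method'>
y = datetime.today()    # calling it returns a datetime instance
print(y.hour)           # e.g. 14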
I am using products.sqlalchemypas-1.0-py2.6.egg for authenticating users from an MSSQL table. Authentication works as expected, but now I'm trying to implement a groups plugin to get groups from a different table. What happens is that when I try to log in, it gives me an error saying AttributeError: getGroupsForPrincipal.
The error traceback is:
2012-02-21T15:33:14 INFO Zope Ready to handle requests
2012-02-21T15:39:25 ERROR Zope.SiteErrorLog 1329838765.580.598770330561 http://localhost:8060/dev/login_form
Traceback (innermost last):
Module ZPublisher.Publish, line 115, in publish
Module ZPublisher.BaseRequest, line 596, in traverse
Module Products.PluggableAuthService.PluggableAuthService, line 235, in validate
Module Products.PluggableAuthService.PluggableAuthService, line 735, in _findUser
Module Products.PluggableAuthService.PluggableAuthService, line 668, in _getGroupsForPrincipal
AttributeError: getGroupsForPrincipal
My definition in plugin.py is:
def getGroupsForPrincipal(self, principal=getSecurityManager().getUser().getId(),request=None):
"Getting groups from SIMS"
import pdb; pdb.set_trace()
groups = []
results = self.simsGroupForUser(username=principal)
for row in results.dictionaries():
group = row.get('group')
groups.append(group)
return groups
I don't know why it's not able to reach this method in plugin.py; I do have an implements block where I declared this interface, and the resulting groups interface shows up on my acl_users PAS object.
EDIT:
I've tried importing my plugin in the debugger and calling this method directly, and I get the same error, so I don't know whether I need to define anything specific for PAS to pick up this method. I did declare IGroupsPlugin in my implements class.
Any comment is a great help, as always.
I don't think your method definition does what you expect it to: principal=getSecurityManager().getUser().getId() evaluates the default parameter at import time rather than at method execution time.
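The usual fix (a sketch mirroring the method from the question) is to default the parameter to None and resolve the current user at call time:
from AccessControl import getSecurityManager

def getGroupsForPrincipal(self, principal=None, request=None):
    "Getting groups from SIMS"
    if principal is None:
        # resolved when the method runs, not when the module is imported
        principal = getSecurityManager().getUser().getId()
    groups = []
    results = self.simsGroupForUser(username=principal)
    for row in results.dictionaries():
        groups.append(row.get('group'))
    return groups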
Just found that my file had wrong indentation; that's why it was giving the attribute error. Thanks all for your time and comments.
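For reference, this is the kind of indentation mistake that produces that symptom (a hypothetical reconstruction): if the def is nested one level too deep, it becomes a local function of the previous method instead of a method of the class, so the class never gets the attribute at all:
class SimsGroupsPlugin(object):   # hypothetical plugin class
    def authenticateCredentials(self, credentials):
        return None

        def getGroupsForPrincipal(self, principal=None, request=None):
            # over-indented: this is local to authenticateCredentials,
            # so the class has no getGroupsForPrincipal attribute
            return []

# SimsGroupsPlugin().getGroupsForPrincipal  ->  AttributeError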
I have a handler that adds a Member to a Group. The last line in this handler causes an error:
TypeError: Can't pickle objects in acquisition wrappers.
> /home/mnieber/.buildout/eggs/ZODB3-3.10.3-py2.6-linux-i686.egg/ZODB/serialize.py(431)_dump()
430 self._p.dump(classmeta)
--> 431 self._p.dump(state)
432 self._file.truncate()
In the pdb debugger I can see that indeed Plone is trying to pickle a value that is an Acquisition wrapper:
ipdb> state
((((<PloneUser 'newuser@usecm.com'>, ('Default_Group',), 'maarten@usecm.com', ('PAS',)),),),)
ipdb> type(state[0][0][0][0])
<type 'Acquisition.ImplicitAcquisitionWrapper'>
However, I cannot see which object is being pickled, and therefore I have no idea which part of my code needs fixing. My question is: how should I go about debugging this error? I have tried looking at all the stack frames, but none of them reveal which object is being serialized.
The handler is this one (run_insecure is a decorator I use to temporarily install a new security manager, which avoids a NotAuthorized error when adding the new member):
from zope.component import adapter
from zope.component.hooks import getSite
from Products.CMFCore.utils import getToolByName
from Products.PluggableAuthService.interfaces.events import IPrincipalCreatedEvent

@adapter(IPrincipalCreatedEvent)
@run_insecure  # my own decorator, defined elsewhere
def userCreatedHandler(event):
    portal_groups = getToolByName(getSite(), "portal_groups")
    membersGroup = portal_groups.getGroupById('Default_Group')
    membersGroup.addMember(event.principal)
The full error is this one:
Traceback (innermost last):
Module ZPublisher.Publish, line 134, in publish
Module Zope2.App.startup, line 301, in commit
Module transaction._manager, line 89, in commit
Module transaction._transaction, line 329, in commit
Module transaction._transaction, line 443, in _commitResources
Module ZODB.Connection, line 567, in commit
Module ZODB.Connection, line 623, in _commit
Module ZODB.Connection, line 658, in _store_objects
Module ZODB.serialize, line 422, in serialize
Module ZODB.serialize, line 431, in _dump
TypeError: Can't pickle objects in acquisition wrappers.
> /home/mnieber/.buildout/eggs/ZODB3-3.10.3-py2.6-linux-i686.egg/ZODB/serialize.py(431)_dump()
430 self._p.dump(classmeta)
--> 431 self._p.dump(state)
432 self._file.truncate()
I ran into this kind of problem with pickle, and solved it by debugging like you did.
Pickle (used to store objects in the ZODB) is trying to serialize your PloneUser and raising this acquisition-wrapper error.
In my case, I was wrapping the portal_workflow object in another class, and had to override the __getstate__ method on that class to solve my problem.
This method is called by pickle in order to serialize your object. If you override it and return your object's __dict__ without the PloneUser, then this error will not be raised.
This question (although not about your exact problem) has more info about what I'm trying to say.
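A minimal sketch of that idea (class and attribute names are hypothetical): drop the acquisition-wrapped user in __getstate__ and keep only a picklable identifier:
class MemberRecord(object):
    """Hypothetical persistent object that ended up holding a wrapped PloneUser."""
    def __init__(self, user, group_ids):
        self.user = user            # acquisition-wrapped, so not picklable
        self.group_ids = group_ids

    def __getstate__(self):
        # pickle/ZODB calls this to obtain the state to serialize
        state = self.__dict__.copy()
        user = state.pop('user', None)
        state['user_id'] = user.getId() if user is not None else None
        return state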
Nice that you could solve your problem.