I am having trouble importing TFBertModel, BertConfig, and BertTokenizerFast. I have tried the latest version of transformers, tokenizers==0.7.0, and transformers.modeling_bert, but none of them seem to work. I get the error:
from transformers import TFBertModel, BertConfig, BertTokenizerFast
ImportError: cannot import name 'TFBertModel' from 'transformers' (unknown location)
Any ideas for a fix? Thanks!
Do you have TensorFlow 2 installed? The model you are trying to import requires TensorFlow; the TF-prefixed classes are only available when TensorFlow is installed.
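A quick way to confirm: the TF* classes only import when TensorFlow is available. A minimal check (a sketch; the model name is just an example):
import importlib.util

if importlib.util.find_spec("tensorflow") is None:
    print("TensorFlow is not installed; run: pip install tensorflow")
else:
    from transformers import TFBertModel, BertConfig, BertTokenizerFast
    config = BertConfig.from_pretrained("bert-base-uncased")
    tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
    model = TFBertModel.from_pretrained("bert-base-uncased")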
I'm new to coding and to Streamlit.
I wanted to add AgGrid.
But when I try to run my code I get a JSON error that I have tried to fix.
How can I fix my AttributeError about JSON?
I have uninstalled and reinstalled:
pip uninstall streamlit-aggrid
pip install streamlit-aggrid
Nothing helps:
The imports for AgGrid:
import pandas as pd
import streamlit as st
import streamlit.components.v1 as components
from st_aggrid import GridOptionsBuilder, AgGrid, GridUpdateMode, DataReturnMode
and the lines for the AgGrid configuration:
# Define AgGrid configuration
gb = GridOptionsBuilder.from_dataframe(df_selection)
gb.configure_default_column(groupable=True, value=True, enableRowGroup=True, aggFunc='sum', editable=True)
gridOptions = gb.build()
# Render AgGrid
components.html(AgGrid(df_selection, gridOptions=gridOptions, width='100%', height='500px', data_return_mode=DataReturnMode.JSON).get_html(),height=800)
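For comparison, AgGrid normally renders itself inside the Streamlit app, so it is not wrapped in components.html and (as far as I know) does not expose a get_html() method; I am also not aware of a DataReturnMode.JSON member. A minimal sketch of the usual call, with DataReturnMode.AS_INPUT and GridUpdateMode.MODEL_CHANGED as assumed typical values:
import pandas as pd
import streamlit as st
from st_aggrid import GridOptionsBuilder, AgGrid, GridUpdateMode, DataReturnMode

df_selection = pd.DataFrame({"a": [1, 2, 3], "b": [4, 5, 6]})  # placeholder data

gb = GridOptionsBuilder.from_dataframe(df_selection)
gb.configure_default_column(groupable=True, value=True, enableRowGroup=True, aggFunc='sum', editable=True)
gridOptions = gb.build()

# AgGrid draws the grid and returns the (possibly edited) data
grid_response = AgGrid(
    df_selection,
    gridOptions=gridOptions,
    height=500,
    data_return_mode=DataReturnMode.AS_INPUT,   # assumed enum member, not JSON
    update_mode=GridUpdateMode.MODEL_CHANGED,
)
st.write(grid_response["data"])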
I'm trying to use a recorded case in OpenMDAO contained in "my_file.db".
When I execute the following code:
import openmdao.api as om
cr = om.CaseReader('my_file.db')
I get the following error:
ModuleNotFoundError: No module named 'groups'
'groups' is a folder from the OpenMDAO project that I used to record the case, and now I'm trying to read the case from a different directory. How can I redefine the path so that om.CaseReader can find the modules it needs?
Try setting your PYTHONPATH, as discussed here:
https://bic-berkeley.github.io/psych-214-fall-2016/using_pythonpath.html
Solved using:
import os
import sys

# Add the directory containing this script (and the 'groups' package) to sys.path
dirname = os.path.dirname(__file__)
sys.path.append(dirname)
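With that directory on sys.path (or on PYTHONPATH), the reader can resolve the 'groups' modules when it loads the recording. A minimal sketch of how it fits together:
import os
import sys

import openmdao.api as om

# Make the directory that contains the 'groups' package importable first
sys.path.append(os.path.dirname(os.path.abspath(__file__)))

cr = om.CaseReader('my_file.db')
print(cr.list_cases())  # enumerate the recorded cases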
I'm trying to import the following functions:
import numpy as np

def load_labels(path):
    y = np.load(path)
    return y

def print_sentence():
    print("hi")
from a Jupyter notebook named "save_load" into another Jupyter notebook, using the following code:
!pip install import-ipynb
import import_ipynb
import save_load
from save_load import load_labels, print_sentence
The function print_sentence works fine in the notebook, but with the function load_labels I receive the following error:
NameError: name 'np' is not defined
What could be the reason for this error? I've imported numpy as np in both notebooks.
In "save_load" instead of import numpy as np try import numpy, it worked for me.
You can try this:
import numpy as np
or
from numpy import *
I had the same problem when I was editing my code in VS Code. Try saving the file, then run it again.
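If the save_load notebook was edited after it was first imported, the importing notebook may still hold the stale cached module, which is consistent with the save-and-rerun fix above. A minimal sketch (the .npy path is only an illustration):
import importlib

import import_ipynb  # enables importing .ipynb files
import save_load

# Pick up the freshly saved notebook instead of the cached module
importlib.reload(save_load)
from save_load import load_labels, print_sentence

print_sentence()
labels = load_labels("labels.npy")  # illustrative path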
I am attempting to convert some pandas dataframes to R objects for use in rpy2.
I am getting an error when I try to import the conversion module: from rpy2.robjects import pandas2ri yields ImportError: cannot import name 're_type', as follows:
/opt/conda/lib/python3.6/site-packages/pandas/core/dtypes/inference.py in <module>()
6 from collections import Iterable
7 from numbers import Number
----> 8 from pandas.compat import (PY2, string_types, text_type,
9 string_and_binary_types, re_type)
10 from pandas._libs import lib
ImportError: cannot import name 're_type'
I haven't really seen any discussion of this error elsewhere.
In terms of dependencies, I am using pandas 0.23.4, rpy2 2.9.4, and R 3.4, running on the Docker Jupyter container datascience-notebook.
Really hoping someone can help me here!
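One thing worth checking (my own guess, not confirmed in the thread): the traceback shows pandas' own inference.py failing to import re_type from pandas.compat, which can indicate a stale or partially upgraded pandas install rather than an rpy2 problem. A quick sanity check:
import pandas

# Confirm which pandas is actually being imported and where it lives
print(pandas.__version__)
print(pandas.__file__)

# If the version or location looks inconsistent, a clean reinstall inside the
# container is the usual remedy, e.g.:
#   pip uninstall -y pandas && pip install pandas==0.23.4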
I am trying to introduce keras.initializers into my net, following this link:
import keras
from keras.optimizers import RMSprop, Adam
from keras.layers import Input, Embedding, LSTM, Dense, merge, Activation
from keras.models import Model, Sequential
model = Sequential()
model.add(Dense(100, init='lecun_uniform', input_shape=(6,)))
model.add(Activation('relu'))
model.add(Dense(27, init='lecun_uniform'))
model.add(Activation('linear'))
rms = RMSprop(lr = 0.01)
keras.initializers.RandomUniform(minval=-0.05, maxval=0.05, seed=None)
model.compile(loss='mse', optimizer=rms)
And it fails with the following error:
keras.initializers.RandomUniform(minval=-0.05, maxval=0.05, seed=None)
AttributeError: module 'keras' has no attribute 'initializers'
Any ideas as to why it happens?
You have to check the version of Keras being used. The probable mistake is that you have Keras 1.x.x installed and are trying to use initializers from Keras 2.x.x.
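If it does turn out to be Keras 1.x, upgrading and switching to the Keras 2 argument names is the usual fix. A minimal Keras 2-style sketch (kernel_initializer replaces init; layer sizes copied from the question):
import keras
print(keras.__version__)  # keras.initializers exists in 2.x

from keras.models import Sequential
from keras.layers import Dense, Activation
from keras.initializers import RandomUniform
from keras.optimizers import RMSprop

init = RandomUniform(minval=-0.05, maxval=0.05, seed=None)

model = Sequential()
model.add(Dense(100, kernel_initializer=init, input_shape=(6,)))
model.add(Activation('relu'))
model.add(Dense(27, kernel_initializer=init))
model.add(Activation('linear'))

model.compile(loss='mse', optimizer=RMSprop(lr=0.01))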