Error messages when running gamBoost with caret in R

I am running gamBoost using caret. The model was generated, but training produces the error messages below, and I am not clear what they mean.
rd <- train(formula0, data = df.clean, method = "gamBoost", trControl = train_control, metric = "RMSE")
The model summary and the error messages were included as screenshots (not reproduced here).


Errors running Oolong validation in R on both STM and seededLDA

I'm trying to use the oolong package to validate a couple of topic models I've created, using both an STM model and a seededLDA model (this code won't be reproducible):
oolong_test1a <- witi(input_model = model_stm_byt, input_corpus = YS$body)
OR
oolong_test1a <- witi(input_model = slda_howard_docs, input_corpus = howard_df$content)
In both cases it successfully creates an oolong test in my global environment. However, when I run either the word intrusion or topic intrusion test, I get this error in both my console and my viewer:
Listening on http://127.0.0.1:7122
Warning: Error in value[[3L]]: Couldn't normalize path in `addResourcePath`, with arguments: `prefix` = 'miniUI-0.1.1.1'; `directoryPath` = 'D:/temp/RtmpAh8J5r/RLIBS_35b54642a1c09/miniUI/www'
[No stack trace available]
I couldn't find any reference to this error anywhere else. I've checked I'm running the most recent version of oolong.
I've also tried to run it on the models/corpus that comes supplied with oolong. So this code is reproducible:
oolong_test <- witi(input_model = abstracts_keyatm, input_corpus = abstracts$text, userid = "Julia")
oolong_test$do_word_intrusion_test()
oolong_test$do_topic_intrusion_test()
This generates the same errors.
There is a new version on GitHub that fixes this issue:
devtools::install_github("chainsawriot/oolong")

Error loading quantized BERT model from local repository

After quantizing the BERT model, it works without any issue. But if I save the quantized model and load it, it does not work. It shows an error message: 'LinearPackedParams' object has no attribute '_modules'. I used the same device to save and load the quantized model.
model = SentenceTransformer('bert-base-nli-mean-tokens')
model.encode(sentences)
quantized_model = torch.quantization.quantize_dynamic(
    model, {torch.nn.Linear}, dtype=torch.qint8)
quantized_model.encode(sentences)
torch.save(quantized_model, "/PATH/TO/DESTINATION/Base_bert_quant.pt")
model = torch.load("/SAME/PATH/Base_bert_quant.pt")
model.encode(sentences)  # shows the error
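A common workaround (an assumption on my part, not stated in the original post) is to save only the quantized model's state_dict and re-create the quantized architecture before loading, rather than pickling the whole module with torch.save(model, ...). A minimal sketch with a plain nn.Linear standing in for the SentenceTransformer:

```python
import os
import tempfile

import torch
import torch.nn as nn

# Small stand-in model (assumption: the same pattern applies to larger models)
model = nn.Sequential(nn.Linear(8, 4))
quantized = torch.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8)

# Save only the parameters, not the pickled module object
path = os.path.join(tempfile.mkdtemp(), "quant_state.pt")
torch.save(quantized.state_dict(), path)

# Re-create the same quantized architecture, then load the saved parameters
reloaded = torch.quantization.quantize_dynamic(
    nn.Sequential(nn.Linear(8, 4)), {nn.Linear}, dtype=torch.qint8)
reloaded.load_state_dict(torch.load(path, weights_only=False))

# Both models now produce the same outputs
x = torch.randn(2, 8)
same = torch.allclose(quantized(x), reloaded(x))
```

Saving the state_dict avoids pickling the packed-parameter objects that cause the '_modules' error on load; the trade-off is that you must rebuild and re-quantize the architecture before calling load_state_dict.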

Getting this error when running a random forest model

This is the error:
Error in eval(predvars, data, env) :
object '.data_Holand.Netherlands' not found
This is my code when I run it:
rf2 <- randomForest(income ~ ., data = income_df_train2, importance = TRUE)
income_df_test2$.data_.Holand.Netherlands <- rep(0, times = nrow(income_df_test2))
predicted <- predict(rf2, newdata = income_df_test2, type = "prob") #### This line does not work; need to check why
ID <- income_df_test$Id
I already ran this model previously using training data, now I am running it against the test data to produce a csv file with the results.
Please take the time to read and investigate error messages before posting. The error says object '.data_Holand.Netherlands' was not found, which means that column does not exist in the test data frame. And it does not: you created a column named .data_.Holand.Netherlands (note the extra dot) instead of .data_Holand.Netherlands.
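A minimal sketch of the fix, using a toy data frame in place of income_df_test2 (the real data is not shown in the question): rename the mistyped column so it exactly matches the name the model was trained on.

```r
# Toy stand-in for income_df_test2 (assumption: the real data is not shown)
income_df_test2 <- data.frame(age = c(25, 40, 31))

# The mistyped dummy column from the question
income_df_test2$.data_.Holand.Netherlands <- rep(0, times = nrow(income_df_test2))

# Rename it to the exact name the model expects
names(income_df_test2)[names(income_df_test2) == ".data_.Holand.Netherlands"] <-
  ".data_Holand.Netherlands"

# predict(rf2, newdata = income_df_test2, type = "prob") should now find the column
```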

MXNet Time-series Example - Dropout Error when running locally

I am looking into using MXNet LSTM modelling for time-series analysis for a problem I am currently working on.
As a way of understanding how to implement this, I am following the example code given by MXNet at this link: https://mxnet.incubator.apache.org/tutorials/r/MultidimLstm.html
When running this script after downloading the necessary data to my local source, I am able to execute the code fine until I get to the following section to train the model:
## train the network
system.time(model <- mx.model.buckets(symbol = symbol,
                                      train.data = train.data,
                                      eval.data = eval.data,
                                      num.round = 100,
                                      ctx = ctx,
                                      verbose = TRUE,
                                      metric = mx.metric.mse.seq,
                                      initializer = initializer,
                                      optimizer = optimizer,
                                      batch.end.callback = NULL,
                                      epoch.end.callback = epoch.end.callback))
When running this section, the following error occurs once a connection to the API is established:
Error in mx.nd.internal.as.array(nd) :
[14:22:53] c:\jenkins\workspace\mxnet\mxnet\src\operator\./rnn-inl.h:359:
Check failed: param_.p == 0 (0.2 vs. 0) Dropout is not supported at the moment.
Is there currently an internal problem within the MXNet R package that prevents this code from running? I can't imagine they would provide a tutorial example for the package that is not executable.
My other thought is that it is something to do with my local device execution and connection to the API. I haven't been able to find any information about this being a problem for other users though.
Any inputs or suggestions would be greatly appreciated thanks.
Looks like you're running an old version of the R package. I think following the instructions on this page to build a recent R package should resolve this issue.

Receiving an R-based error message in Azure ML but not in R itself

This is my first time posting so I apologize if I don't have all my crap together.
I am fairly new to Azure ML and R, but I am trying to implement a Logit model through R in Azure since it doesn't seem to be one of the Microsoft-provided models in Azure ML.
When I run my model and other code in RStudio, I don't get any errors, but when I try to implement it through a "Create R Module" in Azure, I get an error message saying:
"Error 0063: The following error occurred during evaluation of R script:
---------- Start of error message from R ----------
cannot coerce class ""function"" to a data.frame
----------- End of error message from R -----------"
There seems to be very little Azure ML documentation that covers R model creation out there, so I thought I would turn here for some potential answers. It is likely that I am just missing something.
Here is the code I have been running:
Trainer Script:
model<-glm(Mat_Bin~Mat.Mkt.Pen+debt+Urban.Vmi+GDP.Per.Capita+Avg.Low+MSAcrash, data=data, family=binomial)
Scorer R Script:
scores <- as.data.frame(predict(model, subset(data, select = c(predict.MatModel3.))))
names(scores) <- c("Predicted Values")
I tried to upload pictures of everything, but apparently I need higher reputation to do so. But my experiment is super simple. Just some simple column manipulation and then the training and scoring of my model.
Any ideas on why I am getting that error message?
If it's just a simple naming error on your part, you could try:
model <- glm(Mat_Bin ~ Mat.Mkt.Pen + debt + Urban.Vmi + GDP.Per.Capita + Avg.Low + MSAcrash,
             data = scores, family = binomial)
