I installed the 'caTools' R package from the command line:
$ R
> install.packages("caTools", lib="~/R/library")
Then I ran these commands:
INPUT=/home/user/file.bam
OUTPUT=/home/user/file_cor.bam
Rscript run_spp_nodups.R -c=$INPUT -savp -out=$OUTPUT
And got the error:
Error: could not find function "runmean"
Execution halted
The function 'runmean' belongs to the package I installed, 'caTools'.
The R version should not be the issue: R on my machine is version 3.3.2, and 'caTools' depends on R (≥ 2.2.0).
The R code of 'run_spp_nodups.R' is too big to paste here, so I show only the part that uses runmean:
# Smooth the cross-correlation curve if required
cc <- crosscorr$cross.correlation
crosscorr$min.cc <- crosscorr$cross.correlation[ length(crosscorr$cross.correlation$y) , ] # minimum value and shift of cross-correlation
cat("Minimum cross-correlation value", crosscorr$min.cc$y,"\n",file=stdout())
cat("Minimum cross-correlation shift", crosscorr$min.cc$x,"\n",file=stdout())
sbw <- 2*floor(ceiling(5/iparams$sep.range[2]) / 2) + 1 # smoothing bandwidth
cc$y <- runmean(cc$y,sbw,alg="fast")
What's happening, and how do I solve it?
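One thing worth checking (this is an assumption about the cause, not something the error itself confirms): the package was installed into the custom library ~/R/library, but Rscript only searches the directories listed in .libPaths(), so a plain library(caTools) inside run_spp_nodups.R may not find it there. A minimal sketch of how to verify this from an interactive R session:

# Make the custom library visible (path taken from the install command above)
.libPaths(c("~/R/library", .libPaths()))
# Confirm the package is installed and that runmean() becomes available
"caTools" %in% rownames(installed.packages())   # should be TRUE
library(caTools)
exists("runmean")                                # should be TRUE

For the Rscript call, the same effect can be had by exporting R_LIBS_USER=~/R/library in the shell before running the script, or by adding the .libPaths() line near the top of run_spp_nodups.R before caTools is loaded.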
I am trying to run the textEmbed function in R.
Setup needed:
require(quanteda)
require(quanteda.textstats)
require(udpipe)
require(reticulate)
#udpipe_download_model(language = "english")
ud_eng <- udpipe_load_model(here::here('english-ewt-ud-2.5-191206.udpipe'))
virtualenv_list()
reticulate::import('torch')
reticulate::import('numpy')
reticulate::import('transformers')
reticulate::import('nltk')
reticulate::import('tokenizers')
require(text)
The following code runs:
tmp1 <- textEmbed(x = 'sofa help',
model = 'roberta-base',
layers = 11)
tmp1$x
However, the following code does not run:
tmp1 <- textEmbed(x = 'sofa help',
model = 'roberta-base',
layers = 11)
tmp1$x
It gives me the following error:
Error in x[[1]] : subscript out of bounds
In addition: Warning message:
Unknown or uninitialised column: `words`.
Any suggestions would be highly appreciated.
I believe that this error has been fixed in a newer version of the text package (version 0.9.50 and above).
(I cannot see any difference between the two code parts, but I think this error is related to submitting only one token/word to textEmbed, which now works.)
Also, see the updated instructions for how to install the text package: http://r-text.org/articles/Extended_Installation_Guide.html
library(text)
library(reticulate)
# Install text required python packages in a conda environment (with defaults).
text::textrpp_install()
# Show available conda environments.
reticulate::conda_list()
# Initialize the installed conda environment.
# save_profile = TRUE saves the settings so that you don't have to run textrpp_initialize() after restarting R.
text::textrpp_initialize(save_profile = TRUE)
# Test that the text package works.
textEmbed("hello")
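To confirm whether the installed version is recent enough (a small check added here; it is not part of the original answer), compare it against 0.9.50 and update from CRAN if needed:

packageVersion("text")                    # the fix is reported for 0.9.50 and above
if (packageVersion("text") < "0.9.50") {
  install.packages("text")                # update, then restart R before retrying textEmbed()
}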
I am using a CentOS 7 Linux compute cluster with 130 GB of RAM. I am trying to use the svm function from the e1071 R package. My matrix dimensions are rows = 350 and columns = 54250.
R script code (file_testR.R):
matris=matrix(rnorm(100),350,54251)
matris <- as.data.frame(matris)
matris$new_variable <- 0
matris$new_variable[1:175] <- "yes"
matris$new_variable[176:350] <- "no"
require(e1071)
svmfit_test <- svm(as.factor(matris$new_variable)~., data = matris, kernel = "linear", cross=10)
Bash code
Rscript --max-ppsize=500000 file_testR.R
I am getting the error below:
Error in model.matrix.default(Terms, m) :
long vectors not supported yet: ../../src/include/Rinlinedfuns.h:522
Calls: svm ... svm.formula -> model.matrix -> model.matrix.default
I would appreciate it if anybody could help me understand this issue.
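One workaround that is often suggested for this kind of error (a sketch only, assuming the goal is simply to fit the linear SVM rather than to keep the formula interface) is to pass an x matrix and a y factor to svm() directly. That bypasses model.matrix(), which is where the long-vector error is raised; whether the fit then completes within memory on the cluster is a separate question:

require(e1071)
# Split predictors and response instead of using the formula interface (names as in the script above)
x <- as.matrix(matris[, setdiff(names(matris), "new_variable")])
y <- as.factor(matris$new_variable)
svmfit_test <- svm(x = x, y = y, kernel = "linear", cross = 10)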
I've run a factor analysis on a dataset in R, using the psych package. Up until about a month ago it produced the same output every time, but recently the output is different.
When I try running it with older versions of the psych package, it also churns out different output. I'm at a loss for diagnosing this issue and getting the original output back. I don't see how it could be a coding issue, since the results were generated in the past; I'm just struggling to get the same output now.
Below is a condensed version of the code.
To check that the psych package is installed:
# check if package "psych" is installed. if not, remind the user to install
if(("psych" %in% rownames(installed.packages())) == FALSE){
  stop("Please install package 'psych' by running 'install.packages('psych')'")
}
library(psych) # this package is needed for factor analysis
To run the FA:
n_factor=3
# the variables defined below are used to record the iterative process
LIST_min_in_max_loading_vector <- NULL
LIST_drop_variable <- NULL
min_in_max_loading_vector=0
flag=1
while(TRUE){
cat("The ",flag," Step is done. \n")
fa_result<- fa(dat,nfactors=n_factor,rotate = "varimax", cor='poly')
max_loading_in_each_row <- sapply(1:dim(fa_result$loadings)[1],function(j) max(abs(fa_result$loadings[j,])))
variable_names=row.names(fa_result$loadings)
min_in_max_loading_vector <- min(max_loading_in_each_row)
# Please note that here we have a cut-off value 0.5.
# This means that the minimum of the absolute values of all the loadings must be bigger than 0.5
# it's also the stop condition of our iterative algorithm
if(min_in_max_loading_vector>0.5){
break
}
min_variable <- variable_names[which(max_loading_in_each_row==min_in_max_loading_vector)]
cat("The minimum of the maximum absolute loadings is:",min_in_max_loading_vector,"\n")
drop_index <- which(row.names(fa_result$loadings)==min_variable)
cat(min_variable," is dropped in this round.\n\n")
dat <- dat[,-drop_index]
#record the process of dropping
LIST_min_in_max_loading_vector[flag]=min_in_max_loading_vector
LIST_drop_variable[flag] <- min_variable
print(fa_result$loadings)
flag=flag+1
}
Can anyone help troubleshoot this?
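Since fa() output can differ between psych releases, one way to narrow this down (a sketch; the version that produced the original output is not known from the question, so the number below is a placeholder) is to record the environment alongside each run and, if needed, install a specific older psych release:

packageVersion("psych")   # version producing the current output
sessionInfo()             # record the full environment together with the results

# install.packages("remotes")   # if remotes is not installed yet
# "1.8.12" is a placeholder: substitute the psych version that produced the original output
remotes::install_version("psych", version = "1.8.12", repos = "https://cran.r-project.org")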
I recently dusted off a script which calls solve.QP from the quadprog package (I currently have version 1.5-5). Now it generates the error "object '.QP_qpgen2' not found". I don't understand why.
This object is not created by me but by the solve.QP function in quadprog.
On GitHub, Quadprog.R has this code (line 117):
res1 <- .Fortran(.QP_qpgen2,
as.double(Dmat), dvec=as.double(dvec),
as.integer(n), as.integer(n),
sol=as.double(sol), lagr=as.double(lagr),
crval=as.double(crval),
as.double(Amat), as.double(bvec), as.integer(n),
as.integer(q), as.integer(meq),
iact=as.integer(iact), nact=as.integer(nact),
iter=as.integer(iter), work=as.double(work),
ierr=as.integer(factorized))
The error can be generated from the code taken from the documentation for solve.QP:
##
## Assume we want to minimize: -(0 5 0) %*% b + 1/2 b^T b
## under the constraints: A^T b >= b0
## with b0 = (-8,2,0)^T
## and (-4 2 0)
## A = (-3 1 -2)
## ( 0 0 1)
## we can use solve.QP as follows:
##
Dmat <- matrix(0,3,3)
diag(Dmat) <- 1
dvec <- c(0,5,0)
Amat <- matrix(c(-4,-3,0,2,1,0,0,-2,1),3,3)
bvec <- c(-8,2,0)
solve.QP(Dmat,dvec,Amat,bvec=bvec)
I am using R v3.4.1, if that helps.
As stated in my comment, R 3.4 has a new mechanism for registering external routines, and quadprog relies on Fortran routines.
To solve this, you need to build the package from source under R 3.4 using the current Rtools. You need to have Rtools installed and set up (a Google search will turn up a guide to setting up Rtools for whatever system you are using). Then go to the CRAN page of the quadprog package and download the source file quadprog_1.5-5.tar.gz. Finally, run the command:
install.packages("PATH_TO_FILE/quadprog_1.5-5.tar.gz", repos = NULL, type="source", INSTALL_opts = "--merge-multiarch")
Alternatively, you can wait a few days; I'm sure the package on CRAN will be updated soon.
As I keep getting e-mails about this issue:
Use packageDescription("quadprog") to see which version of R your installed package was built under.
If that version is R 3.3.x (or earlier), use update.packages(checkBuilt=TRUE) to update to a version built under R 3.4.x.
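For completeness, those two checks as code (the Built string shown in the comment is only an example of what to look for):

packageDescription("quadprog")$Built     # e.g. "R 3.3.2; ..." means the package was built under R 3.3.x
update.packages(checkBuilt = TRUE)       # offers to update packages built under an older R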
I am using the "BMA" package in R 3.1.0, and get an error when running one of the functions in the package, iBMA.glm. When running the example in the package documentation:
## Not run:
############ iBMA.glm
library("MASS")
library("BMA")
data(birthwt)
y<- birthwt$lo
x<- data.frame(birthwt[,-1])
x$race<- as.factor(x$race)
x$ht<- (x$ht>=1)+0
x<- x[,-9]
x$smoke <- as.factor(x$smoke)
x$ptl<- as.factor(x$ptl)
x$ht <- as.factor(x$ht)
x$ui <- as.factor(x$ui)
### add 41 columns of noise
noise<- matrix(rnorm(41*nrow(x)), ncol=41)
colnames(noise)<- paste('noise', 1:41, sep='')
x<- cbind(x, noise)
iBMA.glm.out<- iBMA.glm( x, y, glm.family="binomial",
factor.type=FALSE, verbose = TRUE,
thresProbne0 = 5 )
summary(iBMA.glm.out)
I get the error:
Error in registerNames(names, package, ".__global__", add) :
The namespace for package "BMA" is locked; no changes in the global variables list may be made.
I get the error in RStudio running R 3.1.0 on Ubuntu.
On Windows 7, from RStudio and the R console, I get a similar error:
Error in utils::globalVariables(c("nastyHack_glm.family", "nastyHack_x.df")) :
The namespace for package "BMA" is locked; no changes in the global variables list may be made.
I also get the same error when running my own data through the function. I'm not clear on what this error means or how to work around it so that I can actually use the function. Any advice would be appreciated!
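One thing worth trying first (an assumption on my part, not a confirmed fix): the Windows error shows iBMA.glm calling utils::globalVariables() at run time, and that call can only succeed while a package's namespace is still being loaded, so a later BMA release that avoids it should not hit the locked-namespace error. Checking and updating the installed version:

packageVersion("BMA")      # note which version raised the error
install.packages("BMA")    # reinstall the current CRAN version, restart R, then retry iBMA.glm()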