sp_execute_external_script R script 'unable to start png() device'

I am trying to create a plot in SQL Server R using the sp_execute_external_script command, but it fails to create the plot png image:
DECLARE @stateName nvarchar(50) = 'Michigan'
EXEC sp_execute_external_script
    @language = N'R',
    @script = N'
covidWeeklyDataSet <- InputDataSet
# set up report file for chart
reportfile <- "C:\\temp\\Covid19-Weekly.png"
png(file = reportfile)
plot(x = covidWeeklyDataSet[, 1], y = covidWeeklyDataSet[, 2],
     main = paste(state_name, "Weekly Covid 19 Counts", sep = ""),
     col = 3, ylab = "Cases", xlab = "Dates", ylim = c(0, 35000))
par(new = TRUE)
plot(x = covidWeeklyDataSet[, 1], y = covidWeeklyDataSet[, 3],
     col = 2, ylab = "Cases", xlab = "Dates", ylim = c(0, 35000))
dev.off()
',
    @input_data_1 = N'SELECT [date], cases, deaths FROM #weekly',
    @params = N'@state_name nvarchar(20)',
    @state_name = @stateName
The error message is as follows:
Msg 39004, Level 16, State 20, Line 13
A 'R' script error occurred during execution of 'sp_execute_external_script' with HRESULT 0x80004004.
Msg 39019, Level 16, State 2, Line 13
An external script error occurred:
Error in png(file = reportfile) : unable to start png() device
Calls: source -> withVisible -> eval -> eval -> png
In addition: Warning messages:
1: In png(file = reportfile) : unable to open file 'C:\temp\Covid19-Weekly.png' for writing
2: In png(file = reportfile) : opening device failed
Error in execution. Check the output for more information.
Error in eval(ei, envir) : Error in execution. Check the output for more information.
Calls: runScriptFile -> source -> withVisible -> eval -> eval -> .Call
Execution halted
It also fails when run as an administrator. Please help.

Grant READ & WRITE permissions on C:\temp to "ALL APPLICATION PACKAGES", then verify with a simple test:
EXEC sp_execute_external_script
    @language = N'R',
    @script = N'
# file.create("c:\\temp\\mytest.png")
png(filename = "c:\\temp\\mytest.png",
    width = 500, height = 500, units = "px", pointsize = 12,
    bg = "white", res = NA)
x <- sample(c("A", "B", "C", "D"), 20, replace = TRUE)
plot(table(x))
dev.off()'
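If writing to a fixed folder such as C:\temp is not an option, a minimal sketch of an alternative, following the pattern used in Microsoft's plotting examples: render to tempfile(), which the R worker account can always write to, and hand the raw bytes back to SQL Server as varbinary(max).
EXEC sp_execute_external_script
    @language = N'R',
    @script = N'
# render to a per-session temp file that the R worker account can write to
image_file <- tempfile(fileext = ".png")
png(filename = image_file)
plot(x = InputDataSet[, 1], y = InputDataSet[, 2],
     col = 3, ylab = "Cases", xlab = "Dates")
dev.off()
# return the raw bytes of the PNG to SQL Server
OutputDataSet <- data.frame(plot = readBin(image_file, what = "raw",
                                           n = file.info(image_file)$size))
',
    @input_data_1 = N'SELECT [date], cases FROM #weekly'
WITH RESULT SETS ((plot varbinary(max)));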

Related

Error message when charting efficient frontier in RStudio

When I run these lines of code, I get the error message:
chart.RiskReward(maxret, risk.col = "StdDev", return.col = "mean",
chart.assets = "False")
chart.EfficientFrontier(maxret, match.col="StdDev", n.portfolios=100, type="l", tangent.line=FALSE)
Error in seq.default(from = minret, to = maxret, length.out = n.portfolios) :
'from' must be a finite number
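The traceback shows the failing call is seq(from = minret, to = maxret, length.out = n.portfolios), so the frontier code computed a minimum mean return that is not a finite number. A minimal sketch that reproduces the same failure and checks the input for non-finite means (the object name "returns" is an assumption for whatever was passed to optimize.portfolio):
# seq() refuses a non-finite starting point, which is exactly the error above
try(seq(from = NA_real_, to = 0.05, length.out = 100))
# Error in seq.default(...) : 'from' must be a finite number

# quick sanity check on the return series ("returns" is a placeholder for the
# xts/matrix of asset returns used to build the portfolio)
colMeans(returns)
any(!is.finite(colMeans(returns)))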

How to solve the 'invalid symbol coordinates' error

I am trying to run a line of code from a manual, but each time I get this error:
Error in symbols(x = coords[, 1], y = coords[, 2], bg = vertex.color, : invalid symbol coordinates
This is my code:
LGHomDf_P1 <- bridgeHomologues(pseudohom_stack = P1_homologues_5,
linkage_df = SN_DN_P1,
LOD_threshold = 5,
automatic_clustering = TRUE,
LG_number = 7,
parentname = "P1",
log = "Logfile_tetra.Rmd")
I have been reading some posts on the forum, but I could not find any information.
I do not think I have to use igraph, because it is not mentioned at all in the manual.
The package I am using is called polymapR.
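The error itself comes from the plot that bridgeHomologues draws, so a quick sanity check on the linkage input can help; a minimal sketch, assuming SN_DN_P1 is a linkage data frame with a LOD column (the column name is an assumption):
# how many linkages survive the chosen threshold? an empty or all-NA set at
# LOD_threshold = 5 leaves nothing to plot, which can surface as a plotting error
nrow(SN_DN_P1)
summary(SN_DN_P1$LOD)
sum(SN_DN_P1$LOD >= 5, na.rm = TRUE)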

Run ggsave() in task manager

I'm trying to run this code from Task Manager. It runs successfully in RStudio, but there is an error when I run it from Task Manager. Here is the R code.
grDevices::dev.set(1)
library(ggplot2)
pdf(NULL)
options(bitmapType = 'cairo', device = 'pdf')
g <- ggplot()+geom_line(data = data.frame(a = 1:10, b = 21:30),
aes(x = a, y = b))
ggsave('path/graph.pdf',
g,
device = 'png')
The error when I run it from Task Manager looks like this:
Error in (function (file = if (onefile) "Rplots.pdf" else "Rplot%03d.pdf", :
cannot open file 'Rplots.pdf'
Calls: ->
Execution halted
The post below talks about the vanilla option when calling Rscript, but I couldn't figure out what the solution is:
Rscript ggplot - ggsave problem
This helped:
grDevices::dev.set(1)
library(ggplot2)
pdf(NULL)
options(bitmapType = 'cairo', device = 'pdf')
g <- ggplot()+geom_line(data = data.frame(a = 1:10, b = 21:30),
aes(x = a, y = b))
ggsave(tf<-tempfile(fileext = ".png"),
g,
device = 'png')
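The tempfile() version works because the path is absolute and writable for the account running the task; the original call most likely fails because the relative 'path/graph.pdf' (and the default Rplots.pdf device) resolve against whatever working directory the scheduled task starts in. A minimal sketch that keeps the output in a known location (the folder C:/R/output is an assumption, substitute one the task's account can write to):
# write to an explicit, absolute location instead of relying on the
# scheduled task's working directory
library(ggplot2)

out_dir <- "C:/R/output"                      # assumed writable folder
dir.create(out_dir, showWarnings = FALSE, recursive = TRUE)

g <- ggplot(data.frame(a = 1:10, b = 21:30), aes(x = a, y = b)) +
  geom_line()

ggsave(file.path(out_dir, "graph.png"), plot = g,
       device = "png", width = 6, height = 4, dpi = 150)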

Error in checkForRemoteErrors(val) :

I am currently running an ensemble niche model analysis through a Linux cluster in a CentOS 6 environment. The package I am using is SSDM. My code is as follows:
Env <- load_var(path = getwd(), files = NULL,
                format = c(".grd", ".tif", ".asc", ".sdat", ".rst",
                           ".nc", ".envi", ".bil", ".img"),
                categorical = "af_anthrome.asc",
                Norm = TRUE, tmp = TRUE, verbose = TRUE, GUI = FALSE)
Env
head(Env)
warnings()

Occurrences <- load_occ(path = getwd(), Env,
                        file = "Final_African_Bird_occurrence_rarefied_points.txt",
                        Xcol = "decimallon", Ycol = "decimallat",
                        Spcol = "species", GeoRes = FALSE,
                        sep = ",", verbose = TRUE, GUI = FALSE)
head(Occurrences)
warnings()

SSDM <- stack_modelling(c("GLM", "GAM", "MARS", "GBM", "RF", "CTA",
                          "MAXENT", "ANN", "SVM"),
                        Occurrences, Env,
                        Xcol = "decimallon", Ycol = "decimallat",
                        Pcol = NULL, Spcol = "species", rep = 1,
                        name = "Stack", save = TRUE, path = getwd(), PA = NULL,
                        cv = "holdout", cv.param = c(0.75, 1), thresh = 1001,
                        axes.metric = "Pearson", uncertainty = TRUE, tmp = TRUE,
                        ensemble.metric = c("AUC", "Kappa", "sensitivity", "specificity"),
                        ensemble.thresh = c(0.75, 0.75, 0.75, 0.75), weight = TRUE,
                        method = "bSSDM", metric = "SES", range = NULL,
                        endemism = NULL, verbose = TRUE, GUI = FALSE, cores = 125)

save.stack(SSDM, name = "Bird", path = getwd(),
           verbose = TRUE, GUI = FALSE)
When running the stack_modelling function I get this error message:
Error in checkForRemoteErrors(val) :
125 nodes produced errors; first error: comparison of these types is not
implemented
Calls: stack_modelling ... clusterApply -> staticClusterApply ->
checkForRemoteErrors
In addition: Warning message:
In stack_modelling(c("GLM", "GAM", "MARS", "GBM", "RF", "CTA", "MAXENT", :
It seems you attributed more cores than your CPU have !
Execution halted
Error in unserialize(node$con) : error reading from connection
Calls: <Anonymous> ... doTryCatch -> recvData -> recvData.SOCKnode ->
unserialize
In addition: Warning message:
In eval(e, x, parent.frame()) :
Incompatible methods ("Ops.data.frame", "Ops.factor") for "=="
Execution halted
I understand that I may have attributed more cores than I have access to, but this same error message crops up when I use a fraction of the cores. I am not entirely sure what this error message is trying to tell me or how to fix it, as I am new to working on a cluster. Is it a problem with the parallel processing of the data? Is there a line of code that can help me fix this issue?
Thanks
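One thing to rule out first is the core count that triggered the warning; a minimal sketch that caps cores at what the node actually reports via parallel::detectCores() (same call as above, algorithm list shortened for brevity):
# cap the number of parallel workers at what the machine reports,
# leaving one core free for the master process
library(parallel)
n_cores <- max(1, detectCores(logical = FALSE) - 1)

SSDM <- stack_modelling(c("GLM", "GAM", "MARS"),   # shortened list for the sketch
                        Occurrences, Env,
                        Xcol = "decimallon", Ycol = "decimallat",
                        Spcol = "species", rep = 1,
                        name = "Stack", save = TRUE, path = getwd(),
                        cv = "holdout", cv.param = c(0.75, 1),
                        verbose = TRUE, GUI = FALSE,
                        cores = n_cores)           # instead of a hard-coded 125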

R script that generates pdf in SSIS

I have an SSIS package that loads data into a database and then executes an R script. The R script creates a new folder (named after the current date) and generates some PDF files into this folder. I have deployed the package on a server and created a job that executes it every night. The problem is that each morning I find only empty folders (with the correct date name) and no PDF files. However, if I execute the package manually in Visual Studio, it works fine and the PDFs are there. Am I missing something here? I appreciate every answer.
EDIT
When I execute the package manually, it is directly on the server.
Here is my R script:
dir.create(file.path(output.path, date))
library(RODBC)
conn <- odbcConnect("Azure", uid = "aaaaa", pwd = "aaaaa")
etldata <- sqlFetch(conn,"dbo.EtlLogsData", stringsAsFactors = FALSE)
pdf(paste('ETL_Duration_For_Effective_Date_', date,'.pdf',sep = ""),
width = 12,
height = 8,
paper = 'special')
par(mar = c(5, 17, 5, 3))
plot(c(min(etldata_day$st_sec), max(etldata_day$et_sec)),
c(sn[1], sn[1]),
ylim = c(0, n),
yaxt = 'n',
xaxt = 'n',
ylab = '',
xlab = 'Time',
main = paste('ETL Duration With Effective Date ', date, sep = ""))
abline(h = sn, untf = FALSE, col = "gray90")
for (i in 1:n){
lines(c(etldata_day$st_sec[i], etldata_day$et_sec[i]),
c(sn[i], sn[i]),
type = "l", lwd = 2)
arrows(etldata_day$st_sec[i], sn[i],
etldata_day$et_sec[i], sn[i],
length = 0.025, angle = 90, lwd = 2)
arrows(etldata_day$et_sec[i], sn[i],
etldata_day$st_sec[i], sn[i],
length = 0.025, angle = 90, lwd = 2)
}
# Print y axis labels
axis(2, at = sn, labels = etldata_day$TaskName, las = 1, cex.axis = 1)
# Print x axis labels
xat <- seq(from = min(etldata_day$st_sec), to = max(etldata_day$et_sec), length.out = 10)
xlabels <- secondsToString(xat)
axis(1, at = xat, labels = substr(xlabels,1,8), cex.axis = 1)
dev.off()
After plot() I use some for loops and lines().
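One detail worth noting in the script above: the date folder is created with file.path(output.path, date), but pdf() is opened with a relative file name, so the PDFs go to whatever working directory the job runs under rather than into that folder. A minimal sketch (reusing output.path and date from the script) that builds an absolute path instead:
# build an absolute file name so the PDF lands inside the newly created date
# folder, regardless of the working directory the SQL Agent / SSIS job uses
out_dir <- file.path(output.path, date)
dir.create(out_dir, showWarnings = FALSE, recursive = TRUE)

pdf(file.path(out_dir, paste0("ETL_Duration_For_Effective_Date_", date, ".pdf")),
    width = 12, height = 8, paper = "special")
# ... plotting code as above ...
dev.off()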
