I am trying to extract the body of a few Outlook emails whose subject contains the keyword "Permission", using RDCOMClient in R.
This is the code that I have written:
# Connect to Outlook through COM and get the MAPI namespace
OutApp <- COMCreate("Outlook.Application")
OutlookNameSpace <- OutApp$GetNameSpace("MAPI")

# Search the Inbox for messages whose subject contains "Permission"
folderName <- "Inbox"
search <- OutApp$AdvancedSearch(folderName, "urn:schemas:httpmail:subject like '%Permission%'")
results <- search$Results()

# Collect the body text of every matching message
body <- c()
for (i in 1:results$Count()) {
  body <- c(body, results$Item(i)$Body())
}
When I run the code line by line, I am able to obtain the character vector body without error.
However, when I run the entire chunk at once, this error is raised:
< CheckErrorInfo > 80020009
No support for InterfaceSupportsErrorInfo
CheckErrorInfo -2147352567
Error: Exception occurred.
I have tried adding Sys.sleep(1), as suggested in "Running Excel macros from R through RDCOMClient, error -2147418111", both outside and inside the for loop, but I still get the same error.
Ultimately, I would like to run this script automatically using source().
Could someone please help me understand why this error occurs, and how I can resolve it?
In addition, if I want to access a shared inbox instead of my personal inbox, how should I change folderName so that the search is done in the correct mailbox?
Thank you!
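One note on the shared-inbox part of the question: AdvancedSearch takes its scope argument as a folder-path string, so a shared mailbox that is open in the same Outlook profile can reportedly be targeted by its display name instead of "Inbox". A sketch (the mailbox name below is a made-up placeholder, and the exact scope syntax is an assumption worth checking against the Outlook object-model documentation):

# "Shared Mailbox Name" is a hypothetical placeholder for the mailbox's display name
scope <- "'\\\\Shared Mailbox Name\\Inbox'"
search <- OutApp$AdvancedSearch(scope, "urn:schemas:httpmail:subject like '%Permission%'")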
I've been using the RDCOMClient package in R for over a year now without a problem.
Now it has suddenly started giving me an error:
<checkErrorInfo> 80070057
No support for InterfaceSupportsErrorInfo
checkErrorInfo -2147024809
Error: The parameter is incorrect.
Here is my code (cleaned for privacy):
library(RDCOMClient)
library(lubridate)

# Render the monthly report, then email the resulting PDF through Outlook
rmarkdown::render("/report.Rmd", encoding = "UTF-8")

OutApp <- COMCreate("Outlook.Application")
outMail <- OutApp$CreateItem(0)  # 0 = olMailItem

text <- paste("Attached is the report")
path_to_attachment <- "W:\\Rwd\\report\\report_this_month\\report.pdf"

outMail[["to"]] <- "user@xyz.is"
outMail[["subject"]] <- "Monthly report"
outMail[["htmlbody"]] <- text
outMail[["attachments"]]$Add(path_to_attachment)
outMail$Send()
rm(OutApp, outMail)
I have a few other scripts that I schedule to send emails. One of them uses the blastula package (which also sends email through Outlook) and I have no problem there.
Any idea why I'm getting this error?
I came across this error and was clueless for a while, but in my case it turned out to be a matter of my attachment file path being wrong.
If I were you, I would first comment out the line outMail[["attachments"]]$Add(path_to_attachment) and see if things work as expected.
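A quick way to test that theory before sending, using only base R (file.exists() simply checks whether the path resolves):

# Fail fast, with a clear message, if the attachment path is wrong
if (!file.exists(path_to_attachment)) {
  stop("Attachment not found: ", path_to_attachment)
}
outMail[["attachments"]]$Add(path_to_attachment)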
I had a similar problem when trying to use RDCOMClient to print a PDF from a Word document. What worked for me was to use the URI for the file location, even for a local file. So for example:
path_to_attachment <- "file:/W:/Rwd/report/report_this_month/report.pdf"
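If you already have the path in Windows form, a small helper can build that URI (a sketch; it assumes the path contains no characters that would need URL-escaping):

# Turn "W:\\Rwd\\...\\report.pdf" into "file:/W:/Rwd/.../report.pdf"
to_file_uri <- function(path) {
  paste0("file:/", gsub("\\\\", "/", path))
}
path_to_attachment <- to_file_uri("W:\\Rwd\\report\\report_this_month\\report.pdf")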
I am using the code below to pull a .csv file from Microsoft Outlook into R for routine data manipulation, and I keep getting the following error (specifically after running the results lines):
<checkErrorInfo> 80020009
No support for InterfaceSupportsErrorInfo
checkErrorInfo -2147352567
Error: Exception occurred.
Similar posts on Stack Overflow suggested adding Sys.sleep() to fix this issue, giving the system adequate time to search through the e-mail subjects. After adding Sys.sleep() with various durations (ranging from 5 to 50 seconds), I'm still getting this error. Any suggestions or advice?
# Load in dataset from e-mail
library(RDCOMClient)

outlook_app <- COMCreate("Outlook.Application")

# Note: the schema namespace is "httpmail" ("httpsmail" is not a valid namespace)
search <- outlook_app$AdvancedSearch(
  "Inbox",
  "urn:schemas:httpmail:subject = 'Outcome Information to Date'"
)
Sys.sleep(5)
results <- search$Results()
Sys.sleep(5)

# Keep the message received today; COM stores dates as days since 1899-12-30
# (the Outlook property is spelled ReceivedTime)
for (i in 1:results$Count()) {
  if (as.Date("1899-12-30") + floor(results$Item(i)$ReceivedTime()) == Sys.Date()) {
    email <- results$Item(i)
  }
}

# Save the first attachment to a temporary file
attachment_file <- tempfile()
email$Attachments(1)$SaveAsFile(attachment_file)

# Save outcome data in a dataframe
outcomedata <- read.csv(attachment_file)
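One idea, offered as a sketch rather than a confirmed fix: AdvancedSearch runs asynchronously, so instead of a fixed Sys.sleep() you could poll until the result count stops changing. The stabilization heuristic and the timeout are assumptions; only calls already used above appear here:

# Poll the asynchronous search until its result count stabilizes (or we give up)
wait_for_search <- function(search, max_tries = 30) {
  last_count <- -1
  for (k in 1:max_tries) {
    Sys.sleep(1)
    n <- search$Results()$Count()
    if (n == last_count && n > 0) break  # unchanged since last poll: assume finished
    last_count <- n
  }
  search$Results()
}
results <- wait_for_search(search)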
I am trying to scrape the http://www.emedexpert.com/lists/brand-generic.shtml web page for brand and generic drug names.
library(httr)
library(rvest)
session <- read_html("http://www.emedexpert.com/lists/brand-generic.shtml")
form1 <- html_form(session)[[2]]
form2 <- set_values(form1, brand = "tylenol")
submit_form(session, form2)
however this results in the error message:
Error in xml2::url_absolute(form$url, session$url) :
not compatible with STRSXP
Therefore, based on this answer to the same error message ("Error: not compatible with STRSXP" on submit_form with rvest), I added a session$url as follows:
session$url <- "http://www.emedexpert.com/lists/brand-generic.shtml" # added from S.Ov
but I still get the same error message. So I also tried various permutations of setting form2$url, such as:
form2$url <- "http://www.emedexpert.com/lists/brand-generic.shtml"
form2$url <- ""
form2$url <- "/"
submit_form(session, form2)
At this point the error message goes away and I obtain a web page that contains most of the desired page. However, it completely lacks the table of brand and generic names.
Any suggestions?
Yes @hackR, RSelenium is not always the answer.
library(rvest)

# The brand/generic table is served by this endpoint, so fetch it directly
url <- "http://www.emedexpert.com/lists/bg.php?myc"
page <- html_session(url)
table <- html_table(read_html(page))[[1]]
I hope this helps.
I am quite new to R and am trying to access some information on the internet, but I am having problems with connections that don't seem to close. I would really appreciate it if someone here could give me some advice.
Originally I wanted to use the WebChem package, which in theory delivers everything I need, but when some of the output data is missing from a webpage, WebChem returns no data from that page at all. To get around this, I took most of the code from the package and altered it slightly to fit my needs. This worked fine for about the first 150 uses, but now, although I have changed nothing, when I call read_html I get the warning message "closing unused connection 4 (http:....." Although this is only a warning, read_html returns nothing after it is generated.
I have written a simplified version of the code, given below; it has the same problem.
Closing R completely (or even rebooting my PC) doesn't seem to make a difference; the warning message now appears the second time I run the code. I can run the queries one at a time outside the loop without problems, but as soon as I use the loop, the error occurs again on the second iteration.
I have tried to vectorise the code, and again it returned the same error message.
I tried showConnections(all = TRUE), but only got connections 0-2 for stdin, stdout and stderr.
I have tried searching for ways to close the HTML connection, but I can't define the URL as a connection, and close(qurl) and close(ttt) don't work either (they return "no applicable method for 'close' applied to an object of class "character"" and "no applicable method for 'close' applied to an object of class "c('xml_document', 'xml_node')"", respectively).
Does anybody know a way to close these connections so that they don't break my routine? Any suggestions would be very welcome. Thanks!
PS: I am using R version 3.3.0 with RStudio Version 0.99.902.
library(xml2)  # read_html(), xml_find_all(), xml_text()

# CAS registry numbers to look up on ChemIDplus
CasNrs <- c("630-08-0","463-49-0","194-59-2","86-74-8","148-79-8")
tit <- character()
for (i in 1:length(CasNrs)) {
  CurrCasNr <- as.character(CasNrs[i])
  baseurl <- 'http://chem.sis.nlm.nih.gov/chemidplus/rn/'
  qurl <- paste0(baseurl, CurrCasNr, '?DT_START_ROW=0&DT_ROWS_PER_PAGE=50')
  ttt <- try(read_html(qurl), silent = TRUE)
  tit[i] <- xml_text(xml_find_all(ttt, "//head/title"))
}
After researching the topic I came up with the following solution:
library(xml2)

url <- "https://website_example.com"
con <- url(url, "rb")  # open the connection explicitly, in binary read mode
html <- read_html(con)
close(con)             # close it ourselves once the HTML has been parsed
# + whatever you want to do with the html, since it's already saved!
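Applied to the loop from the question, the same pattern can be wrapped in a small helper (read_html_closed is a made-up name for this sketch, untested against ChemIDplus):

# Fetch and parse a page through a connection we manage ourselves;
# on.exit() guarantees the connection closes even if parsing fails
read_html_closed <- function(qurl) {
  con <- url(qurl, "rb")
  on.exit(close(con))
  xml2::read_html(con)
}
ttt <- try(read_html_closed(qurl), silent = TRUE)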
I haven't found a good answer to this problem. The best work-around I came up with is to include the function below, with Secs = 3 or 4. I still don't know why the problem occurs or how to stop it without building in a large delay.
CatchupPause <- function(Secs) {
  Sys.sleep(Secs)        # pause to let the connection finish its work
  closeAllConnections()  # then force-close anything still open
  gc()                   # and garbage-collect so R releases the handles
}
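For example, called at the end of each iteration of the loop from the question:

for (i in 1:length(CasNrs)) {
  # ... fetch and parse as before ...
  CatchupPause(3)  # pause, close stray connections, and garbage-collect
}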
I found this post while running into the same problem trying to scrape multiple datasets in the same script. The script would get progressively slower, and I believe it was due to the connections. Here is a simple loop that closes all of the connections after each iteration.
for (i in seq_along(df$URLs)) {
  # ... scrape df$URLs[i] as before ...
  closeAllConnections()  # takes no arguments; closes everything left open
}
I am trying and failing to use RCurl to automate the process of fetching a spreadsheet from a web site, China Labour Bulletin's Strike Map.
Here is the URL for the spreadsheet with the options set as I'd like them:
http://strikemap.clb.org.hk/strikes/api.v4/export?FromYear=2011&FromMonth=1&ToYear=2015&ToMonth=6&_lang=en
Here is the code I'm using:
library(RCurl)
temp <- tempfile()
temp <- getForm("http://strikemap.clb.org.hk/strikes/api.v4/export",
FromYear="2011", FromMonth="1",
ToYear="2015", ToMonth="6",
_lang="en")
And here is the error message I get in response:
Error: unexpected input in:
" ToYear=2015, ToMonth=6,
_"
Any suggestions on how to get this to work?
Try enclosing _lang in backticks.
temp <- getForm("http://strikemap.clb.org.hk/strikes/api.v4/export",
                FromYear = "2011",
                FromMonth = "1",
                ToYear = "2015",
                ToMonth = "6",
                `_lang` = "en")
I think R has trouble with an argument name that starts with an underscore: it is not a syntactically valid R name, so it must be quoted in backticks. This seems to have worked for me.
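If I remember the RCurl API correctly, another option is to pass the fields through getForm()'s .params argument, where the names are plain character strings and need no backticks (a sketch, unverified):

# All form fields go in a named character vector; "_lang" is just a quoted name here
temp <- getForm("http://strikemap.clb.org.hk/strikes/api.v4/export",
                .params = c(FromYear = "2011", FromMonth = "1",
                            ToYear = "2015", ToMonth = "6",
                            "_lang" = "en"))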