RSelenium rsDriver chrome browser error on Mac

I am using a Mac (OS 10.13.6) and am trying to learn how to use RSelenium.
I have installed RSelenium but am having trouble with the rsDriver command:
rD <- rsDriver(browser = "chrome", chromever = "80.0.3987.106")
I get this error:
Could not open chrome browser.
Client error message:
Undefined error in httr call. httr output: Failed to connect to localhost port 4567: Connection refused
Check server log for further details.
Warning message:
In rsDriver(browser = "chrome", chromever = "80.0.3987.106") :
Could not determine server status.
I've been poking around for help for a couple of days now but am not clear on the appropriate solution. I've tried the command with chromever="latest" and tried following the suggested workaround found here: stackoverflow.com/questions/55201226/. Furthermore, I don't know where to find the "server log" mentioned in the error.
Having never used this package before, or done this type of thing, I can't tell whether my machine is set up incorrectly (non-R requirements of RSelenium that I need to install, and where), whether this is strictly a Chrome browser setting/version issue, or whether it's a general Mac compatibility issue.
Does anyone have an updated (i.e. not involving the defunct checkForServer() command) set of steps, suitable for absolute Selenium beginners, for getting RSelenium set up and rsDriver working on a Mac?

After a lot of trial and error, I managed to solve the same issue by installing the Java SE Development Kit 14 on my Mac.
I hope this solves your issue.
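For anyone hitting the same wall: rsDriver() launches a Java-based Selenium server behind the scenes (via wdman), so a quick sanity check is to confirm that R can see a Java installation at all. A minimal sketch, with the version strings only as examples:
# Check whether a Java runtime is visible to R; rsDriver() needs one
# to start the Selenium standalone server.
system("java -version")
# If that prints a version, retry the driver:
library(RSelenium)
rD <- rsDriver(browser = "chrome", chromever = "80.0.3987.106")
remDr <- rD$client
remDr$navigate("https://www.r-project.org")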

Related

How can I get past this 'SSL connect error' when using jsonlite::fromJSON in R?

Issue
I get the following error whenever I run
all_companies <- jsonlite::fromJSON("https://www.sec.gov/files/company_tickers_exchange.json")
Warning: URL 'https://www.sec.gov/files/company_tickers_exchange.json': status was 'SSL connect error'
Error in open.connection(con, "rb") :
  cannot open the connection to 'https://www.sec.gov/files/company_tickers_exchange.json'
Thank you in advance for any help!
What I've tried
I am trying to run this on a new work computer; it runs fine on my old work computer, but the new one gives me this error.
The difference causing this issue seems to be that I am using {jsonlite} 1.8.2 on my new computer and 1.8.0 on my old one. Deep in the definition of the fromJSON() function, the working version seems to use curl::curl() to establish a connection, whereas the non-working version uses base::url().
The following example line from the help documentation runs fine on the new computer with version 1.8.2:
data1 <- fromJSON("https://api.github.com/users/hadley/orgs")
I can access the JSON file I am trying to read on an internet browser.
I do not have permission to install Rtools on my computer to be able to compile an older version of {jsonlite}.
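One workaround, assuming the problem really is base::url() failing the TLS handshake where libcurl succeeds, is to download the file with {curl} explicitly and parse the local copy. Note that sec.gov is also known to reject requests lacking a descriptive User-Agent header, so this sketch sets one (the address is a placeholder):
library(curl)
library(jsonlite)

url <- "https://www.sec.gov/files/company_tickers_exchange.json"
tmp <- tempfile(fileext = ".json")

# Fetch via libcurl directly, sidestepping base::url(); the
# User-Agent is a placeholder -- the SEC asks for contact info there.
h <- new_handle()
handle_setheaders(h, `User-Agent` = "your.name@example.com")
curl_download(url, tmp, handle = h)

all_companies <- fromJSON(tmp)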

Problems with RSelenium and ChromeDriver - "Could not open chrome browser"

I have been using RSelenium for years and have never had this issue. I recently updated Google Chrome to the latest available version, 110.0.5481.78. I am now getting the following error when I use rsDriver:
require(RSelenium)
rD <- rsDriver(browser = "chrome",port = 9537L, chromever = "110.0.5481.77")
 Could not open">
Could not open chrome browser.
Client error message:
Undefined error in httr call. httr output: Failed to connect to localhost port 9537: Connection refused
Check server log for further details.
Warning message:
In rsDriver(browser = "chrome", port = 9537L, chromever = "110.0.5481.77") :
Could not determine server status.
I have tried different versions of chromever from binman::list_versions("chromedriver"), as well as leaving the arguments blank altogether. In the past, when Chrome updated, a simple change to chromever got everything working perfectly again. I am not sure what has changed with this latest update.
Thanks in advance.
I just fixed this same problem by removing a file LICENSE.chromedriver as per this thread: https://github.com/ropensci/RSelenium/issues/264
Use
wdman::selenium(retcommand = TRUE)
to find the file location of the binman chromedriver files.
Navigate to that location, go to the driver version you're using, and delete the LICENSE.chromedriver file. Mine worked immediately after this action, but note that I also tried downgrading wdman to version 0.2.5 (I was on 0.2.6) first:
remotes::install_version('wdman',version = '0.2.5')
I'm not sure if it was both actions that fixed it or just the file delete!
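If you prefer to do the cleanup from R rather than in a file browser, something along these lines should work; it assumes binman's default cache layout, which binman::app_dir() resolves:
library(binman)

# Locate binman's chromedriver cache and delete every
# LICENSE.chromedriver file found beneath it.
chromedriver_dir <- app_dir("chromedriver")
license_files <- list.files(chromedriver_dir,
                            pattern = "LICENSE\\.chromedriver",
                            recursive = TRUE,
                            full.names = TRUE)
file.remove(license_files)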

Error: "Windows can't find 'C:/PROGRA~1/'. Check the spelling and try again."

I am getting an error in R on Windows 10 about a directory not being found while trying to install a package from GitHub*. Trying to troubleshoot this error led me to a few observations.
For example, both Windows Explorer and my browser can find C:/PROGRA~1, but only my browser can find C:/PROGRA~1/R, where R is installed. The specific Windows Explorer error is:
Windows can't find 'C:/PROGRA~1/R'. Check the spelling and try again.
Yet, Windows Explorer can find C:/Program Files/R with no problem. The error above is the same with C:/PROGRA~1/Adobe, C:/PROGRA~1/Google, or any other subdirectory. Even more interesting, Windows Explorer can't find the raw Program Files path either once a trailing slash is added: C:/PROGRA~1/ produces a similar error.
So can anyone explain to me why Windows Explorer is not able to find C:/PROGRA~1/R or C:/PROGRA~1/? Is this normal/expected? If I solve this, I can probably resolve my R error too. Thanks.
*Here is the full original error in R:
Error: Failed to install 'package' from GitHub:
create process 'C:/PROGRA~1/R/R-40~1.3/bin/x64/Rcmd.exe' (system error 267, The directory name is invalid.
) #win/processx.c:1040 (processx_exec)
Edit: My investigation suggests it is related to the direction of the slashes. For instance, C:/PROGRA~1\R (or even C:/PROGRA~1\) works in Windows Explorer, but only as long as the second slash is a backslash. Can this be of any help in resolving this issue? R doesn't seem to want to write that second slash as a backslash...
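If the slash direction really is the culprit, one workaround on the R side is to normalize the path before handing it to Windows. A minimal sketch:
# normalizePath() rewrites the separators; with winslash = "\\" the
# short (8.3) path comes back in the backslash form Explorer accepts.
normalizePath("C:/PROGRA~1/R", winslash = "\\", mustWork = FALSE)
# [1] "C:\\PROGRA~1\\R"   (i.e. the string C:\PROGRA~1\R)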

Getting: "Failed to fetch metadata" when starting up Jupyter

I am using JupyterLab as my IDE and have a couple of packages installed, namely 'jupyterlab-dash' and 'jupyterlab-plotly'. My issue is that when I launch 'jupyter lab' from the terminal, I see the following error:
Failed to fetch package metadata for 'jupyterlab-dash': URLError(gaierror(8, 'nodename nor servname provided, or not known'))
I'm not sure why this error appears, but I've noticed it only shows up when I have no internet connection (working on a train, etc.). Could it be that these packages try to call out to something at launch, and the error is raised because there is no connection?
In the end I am able to use JupyterLab and the packages as intended (at least, it seems that way), but I'm curious why this 'failed to fetch metadata' error appears.
Thanks,

phantomjs unable to find element on page

Recently, I've been having trouble driving phantomjs under RSelenium. It seems that the browser is unable to locate anything on the page using findElement(). If I pass something as simple as:
library("RSelenium")
RSelenium::checkForServer()
RSelenium::startServer()
rd <- remoteDriver(browserName = "phantomjs")
rd$open()
Sys.sleep(5)
rd$navigate("https://www.Facebook.com")
searchBar <- rd$findElement(using = "id", "email")
I get the error below:
Error: Summary: NoSuchElement
Detail: An element could not be located on the page using the given search parameters.
class: org.openqa.selenium.NoSuchElementException
Any thoughts on what is causing this? It doesn't seem to matter what page I navigate to; it simply fails anytime I try to locate an element on the webpage. This issue started recently and I noticed it when my cron jobs began failing.
I'm working in Ubuntu 14.04 LTS with R 3.3.1 and phantomjs 2.1.1. I don't suspect some type of compatibility issue as this has worked very recently and I haven't updated anything.
The version of phantomjs you installed may be limited. See here:
Disabled Ghostdriver due to pre-built source-less Selenium blobs.
Added README.Debian explaining differences from upstream "phantomjs".
If you installed recently using apt-get then this is most likely the case. You can download from the phantomjs website and place the bin location in your PATH.
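In R, that PATH change can be made per-session without touching your shell profile; a sketch, with the download location only as an example:
# Prepend a manually downloaded phantomjs bin directory to the PATH
# for this R session (the directory below is just an example).
Sys.setenv(PATH = paste("/home/john/phantomjs-2.1.1-linux-x86_64/bin",
                        Sys.getenv("PATH"), sep = ":"))
Sys.which("phantomjs")  # should now resolve to the new binary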
Alternatively use npm to install a version for you
npm install phantomjs-prebuilt
This will then put a link to the bin in node_modules/.bin/phantomjs.
For the reasons behind the limitations in apt-get you can read the README.Debian file contained here.
Limitations
Unlike original "phantomjs" binary that is statically linked with
modified QT+WebKit, Debian package is built with system libqt5webkit5.
Unfortunately the latter do not have webSecurity extensions therefore
"--web-security=no" is expected to fail.
https://github.com/ariya/phantomjs/issues/13727#issuecomment-155609276
Ghostdriver is crippled due to removed source-less pre-built blobs:
src/ghostdriver/third_party/webdriver-atoms/*
Therefore all PDF functionality is broken.
PhantomJS cannot run in headless mode (if there is no X server
available).
Unfortunately it can not be fixed in Debian. To achieve headless-ness
upstream statically link with customised QT + Webkit. We don't want to
ship forks of those projects. It would be great to eventually convince
upstream to use standard libraries. Meanwhile one can use "xvfb-run"
from "xvfb" package:
xvfb-run --server-args="-screen 0 640x480x16" phantomjs
If you don't want to put phantomjs on your PATH, you can pass its location as an extra capability:
library(RSelenium)
selServ <- startServer()
pBin <- list(phantomjs.binary.path = "/home/john/node_modules/phantomjs-prebuilt/lib/phantom/bin/phantomjs")
rd <- remoteDriver(browserName = "phantomjs", extraCapabilities = pBin)
Sys.sleep(5)
rd$open()
rd$navigate("https://www.Facebook.com")
searchBar <- rd$findElement(using = "id", "email")
rd$close()
selServ$stop()
