Press the next button in a Google search result (R)

I am trying to click through to the next page of Google search results using the following code:
library("RSelenium")
startServer()
checkForServer()
remDr <- remoteDriver()
remDr$open()
remDr$navigate("https://www.google.com/")
webElem <- remDr$findElement(using = "xpath", "//*/input[@id = 'lst-ib']")
webElem$sendKeysToElement(list("R Cran", "\uE007"))
webElem <- remDr$findElement(using = 'css selector', "#pnnext")
click <- webElem$getElementAttribute("href")
remDr$clickElement(click)
However, I receive the following error:
Error in envRefInferField(x, what, getClass(class(x)), selfEnv) :
‘clickElement’ is not a valid field or method name for reference class “remoteDriver”
Does clicking the next button in Google search results require different code?
Using inspect I can see that the source code for the button is:
<a id="pnnext" class="pn" style="text-align:left" href="/search?q=R+Cran&biw=1366&bih=657&ei=szhxVv_NMaHMygPW4pLQDg&start=10&sa=N">
Finally, this is what worked for me:
library("RSelenium")
startServer()
checkForServer()
remDr <- remoteDriver()
remDr$open()
remDr$navigate("https://www.google.com/")
webElem <- remDr$findElement(using = "xpath", "//*/input[@id = 'lst-ib']")
Sys.sleep(5)
webElem$sendKeysToElement(list("R Cran", "\uE007"))
Sys.sleep(5)
link <- remDr$executeScript("return document.getElementById('pnnext').href;")
remDr$navigate(link[[1]])

You are trying to "click" an attribute/string, which does not work the way you are attempting it.
On this line you grab the link as a string (which is not a WebElement for Selenium!):
click <- webElem$getElementAttribute("href")
and then on the next line you try to click this link/string via a method that actually needs a WebElement:
remDr$clickElement(click)
So here is what you can try:
1) you could click your last WebElement directly (skipping getElementAttribute):
webElem$clickElement()
or
2) you could navigate to the link you just retrieved through getElementAttribute:
click <- webElem$getElementAttribute("href")
# change your last line to this
remDr$navigate(click)

Not sure which client you are using, but you may need to wait until the AJAX request finishes, e.g. an explicit wait for visibilityOfElementLocated on #pnnext.
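RSelenium has no built-in equivalent of Selenium's WebDriverWait, but a small polling helper can stand in for an explicit wait. This is only a sketch; the #pnnext selector is taken from the question:

```r
# Poll for an element until it is found or a timeout elapses; a
# hand-rolled stand-in for Selenium's explicit waits.
wait_for_element <- function(remDr, using, value, timeout = 10, interval = 0.5) {
  deadline <- Sys.time() + timeout
  while (Sys.time() < deadline) {
    elem <- tryCatch(remDr$findElement(using = using, value = value),
                     error = function(e) NULL)
    if (!is.null(elem)) return(elem)
    Sys.sleep(interval)
  }
  stop("Timed out waiting for element: ", value)
}

# Usage: wait for the "Next" link to appear, then click it
# webElem <- wait_for_element(remDr, "css selector", "#pnnext")
# webElem$clickElement()
```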

Related

RSelenium - click on search button gives error "element is not attached to the page document"

I want to enter a zip code and click on search on a website using RSelenium:
library(RSelenium)
library(netstat)
zip <- "601"
rs_driver_object <- rsDriver(browser = 'chrome',
chromever = "101.0.4951.41",
verbose = F,
port = free_port())
remDr <- rs_driver_object$client
remDr$open()
remDr$maxWindowSize()
# go to website
remDr$navigate("https://pizza.dominos.com/")
# locate the search box
address_element <- remDr$findElement(using = 'id', value = 'geo-search')
# locate the search button
button_element <- remDr$findElement(using = 'class', value = "btn-search")
# send the zip code
address_element$sendKeysToElement(list(zip))
# click on search
button_element$clickElement()
However, when I do the last step of clicking, it shows me the error:
Selenium message:stale element reference: element is not attached to the page document
(Session info: chrome=101.0.4951.67)
Error: Summary: StaleElementReference
Detail: An element command failed because the referenced element is no longer attached to the DOM.
class: org.openqa.selenium.StaleElementReferenceException
Further Details: run errorDetails method
I was able to achieve this using the following code. The error stale element reference: element is not attached to the page document usually occurs when the backend HTML changes and the XPath you used becomes outdated.
# go to website
remDr$navigate("https://pizza.dominos.com/")
# locate the search box and enter the zip
zip <- "601"
remDr$findElement(using = 'id', value = 'geo-search')$sendKeysToElement(list(zip))
# click on search
remDr$findElement(using = 'xpath', value = "(//button[@class='btn-search'])[2]")$clickElement()
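Another hedged option is to re-locate the element on every attempt and retry when the command fails, since any element handle found before the DOM changed is invalid afterwards. A sketch; the XPath in the usage comment is the one from the answer above:

```r
# Retry a locate-and-act closure; StaleElementReference errors go away
# once the element is re-located against the current DOM.
with_stale_retry <- function(action, attempts = 3, pause = 1) {
  last_error <- NULL
  for (i in seq_len(attempts)) {
    result <- tryCatch(action(), error = function(e) e)
    if (!inherits(result, "error")) return(result)
    last_error <- result
    Sys.sleep(pause)
  }
  stop("Still failing after ", attempts, " attempts: ",
       conditionMessage(last_error))
}

# Usage: find the button freshly inside the closure, then click it
# with_stale_retry(function() {
#   remDr$findElement(using = "xpath",
#                     value = "(//button[@class='btn-search'])[2]")$clickElement()
# })
```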

RSelenium: clicking on subsequent links in for loop from a Google search

I'm using RSelenium to do some simple Google searches. Setup:
library(tidyverse)
library(RSelenium) # running docker to do this
library(rvest)
library(httr)
remDr <- remoteDriver(port = 4445L, browserName = "chrome")
remDr$open()
remDr$navigate("https://books.google.com/")
books <- remDr$findElement(using = "css", "[name = 'q']")
books$sendKeysToElement(list("NHL teams", key = "enter"))
bookElem <- remDr$findElements(using = "css", "h3.LC20lb")
That's the easy part. Now, there are 10 links on that first page, and I want to click on every link, back out, and then clink the next link. What's the most efficient way to do that? I've tried the following:
bookElem$clickElement()
Returns Error: attempt to apply non-function - I expected this to click on the first link, but no good. (This works if I take the s off findElements() - for the line above, not the for loop below.)
clack <- lapply(bookElem, function(y) {
y$clickElement()
y$goBack()
})
Begets an error, kind of like this question:
Error: Summary: StaleElementReference
Detail: An element command failed because the referenced element is no longer attached to the DOM.
Further Details: run errorDetails method
Would it be easier to use rvest, within RSelenium?
I think you could consider grabbing the links and looping through them without going back to the main page.
In order to achieve that, you will have to grab the link elements ("a tag").
bookElems <- remDr$findElements(using = "xpath",
"//h3[@class = 'LC20lb']//parent::a")
Then extract the "href" attribute and navigate to each link:
links <- sapply(bookElems, function(bookElem){
bookElem$getElementAttribute("href")
})
for(link in links){
remDr$navigate(link)
# DO SOMETHING
}
Full code would read:
remDr$open()
remDr$navigate("https://books.google.com/")
books <- remDr$findElement(using = "css", "[name = 'q']")
books$sendKeysToElement(list("NHL teams", key = "enter"))
bookElems <- remDr$findElements(using = "xpath",
"//h3[@class = 'LC20lb']//parent::a")
links <- sapply(bookElems, function(bookElem){
bookElem$getElementAttribute("href")
})
for(link in links){
remDr$navigate(link)
# DO SOMETHING
}
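As for the `# DO SOMETHING` placeholder and the rvest question: the two combine well. Selenium renders the page, rvest parses it. A sketch of the loop body; the `h1` selector is an assumption to be adjusted for the real pages:

```r
library(rvest)

# Let Selenium render each page, then hand the live DOM to rvest,
# which a plain static rvest fetch could not do for JS-built content.
for (link in links) {
  remDr$navigate(link)
  page <- read_html(unlist(remDr$getPageSource()))
  heading <- html_text2(html_element(page, "h1"))
  print(heading)
}
```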

Unable to click on element using (R)Selenium

I am trying to find an element using RSelenium:
remDr <- remoteDriver(remoteServerAddr = "192.168.99.100", port = 4445L,
browserName = "chrome")
remDr$open()
url <- "https://sudskapraksa.csp.vsrh.hr/decisionPdf?id=090216ba8084ca52"
remDr$navigate(url)
There is a captcha image (if you don't see it, try executing the navigate part about 10 times). I tried to select it using:
captcha_element <- remDr$findElement(using = "css selector", "img[id='captchaImg']")$clickElement()
captcha_element <- remDr$findElement(using = "id", "captchaImg")$clickElement()
captcha_element <- remDr$findElement(using = "xpath", "//*[#id='captchaIm']")$clickElement()
but it always return an error.
As per your code trials and subsequent comment update, you can identify the captcha image using the following locator strategy, but of course you can't invoke a click event, as none of the element's attributes contain any such event:
captcha_element <- remDr$findElement(using = "css selector", "img#captchaImg")
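Since no attribute carries a click handler, the practical next step is usually to capture the captcha for manual or OCR solving. A sketch using the element's src attribute and a full-page screenshot (an element-only crop is not assumed here):

```r
# The image cannot be clicked, but it can be captured: read its src
# (often a relative URL or data: URI) and screenshot the whole page.
captcha_element <- remDr$findElement(using = "css selector", "img#captchaImg")
captcha_src <- unlist(captcha_element$getElementAttribute("src"))

# Write a full-page screenshot to disk for manual inspection
remDr$screenshot(file = "captcha_page.png")
```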

Enter text in popup box using Rselenium

I'm trying to pull data from the Glassdoor website using RSelenium. I'm unable to enter the email id and password in the popup window.
This is my code. I'm not sure where I'm going wrong. When I try to highlight the email box, the sign-in button is highlighted instead.
remDr$navigate("https://www.glassdoor.com")
webElem <- remDr$findElement("class", "sign-in")
webElem$highlightElement()
webElem$clickElement()
email <- webElem$findElement(using = "name", "username")
email$highlightElement()
email$sendKeysToElement(list("EMAIL ID"))  # throws an error
The following works with latest chrome:
library(RSelenium)
rD <- rsDriver()
remDr <- rD$client
remDr$navigate("https://www.glassdoor.com")
webElem <- remDr$findElement("class", "sign-in")
webElem$highlightElement()
webElem$clickElement()
remDr$setImplicitWaitTimeout()
email <- remDr$findElement(using = "id", "signInUsername")
email$sendKeysToElement(list("EMAIL ID"))
....
rm(rD)
gc()

RSelenium web scraping problems

I'm trying to parse this HTML with R in order to extract some currency exchange rates. They are only visible after clicking on buttons in the center of the webpage (sorry, it's in Russian).
So far I've tried both RSelenium and rvest, but neither of them lets me get to this css: "tr:nth-child(2) td".
And if I try this:
library("RSelenium")
startServer()
mybrowser <- remoteDriver(browserName = "chrome")
mybrowser$open()
mybrowser$navigate("https://www.tinkoff.ru/about/documents/exchange/")
dol<-mybrowser$findElement(using = c('partial link text'), "USD")
it returns a "NoSuchElement" error.
I've highlighted the place in the HTML code where I need to get to:
txt<- ".documents-exchange-vertical-list__menu:nth-child(2) .documents-exchange-vertical-list__item+ .documents-exchange-vertical-list__item .Currency-Rate-Trigger";
dol <- mybrowser$findElement(using = 'css selector', txt)$clickElement()
# this may or may not work
dol<-mybrowser$findElement(using = 'css selector', "tr:nth-child(2) td:nth-child(1)")$getElementText()
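An alternative to reading the cells through Selenium, since the question also tried rvest: once the click has fired and the AJAX content has rendered, the live page source can be parsed with rvest. A sketch, assuming the click above succeeded:

```r
library(rvest)

# Give the AJAX-rendered table a moment, then parse the rendered DOM;
# a static rvest fetch would never see this content.
Sys.sleep(3)
page <- read_html(unlist(mybrowser$getPageSource()))
rates <- html_text2(html_elements(page, "tr:nth-child(2) td"))
rates
```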
