I have my own AWS DocumentDB cluster and I'm trying to connect to it from R using the mongolite package.
I tried to do this with mongolite's ssl_options,
with this code:
mong <- mongo(collection = "test", db = "test",
              url = '*******************.docdb.amazonaws.com:27017',
              verbose = TRUE,
              options = ssl_options(ca = 'rds-combined-ca-bundle.pem',
                                    weak_cert_validation = TRUE))
But I get this error:
> Error: No suitable servers found (`serverSelectionTryOnce` set):
> [socket timeout calling ismaster on
> '***********************-central-1.docdb.amazonaws.com:27017']
Can anyone help me solve this problem?
You can connect to Amazon DocumentDB with TLS using the mongolite package (https://jeroen.github.io/mongolite/index.html) and a connection string like the following:
j <- mongo(url = "mongodb://<yourUsername>:<yourPassword>@docdb-2019-02-21-02-57-28.cluster-ccuszbx3pn5e.us-east-1.docdb.amazonaws.com:27017/?ssl=true",
           options = ssl_options(weak_cert_validation = TRUE,
                                 ca = "rds-combined-ca-bundle.pem"))
The error you are seeing typically occurs when (1) the URL for the host (the Amazon DocumentDB cluster) in the connection string is incorrect or does not match the cluster you are trying to connect to, or (2) the client machine you are issuing the connection from is in a different region or VPC than your Amazon DocumentDB cluster.
For additional troubleshooting: https://docs.aws.amazon.com/documentdb/latest/developerguide/troubleshooting.html
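As a concrete sketch of that connection string (every value below is a placeholder, not a real cluster), you can build the URL first and then hand it to mongolite:

```r
# Placeholder credentials and endpoint -- substitute your own values
user <- "myUsername"
pass <- "myPassword"
host <- "mycluster.docdb.amazonaws.com"  # assumption: your cluster endpoint

# Note the "@" between the credentials and the host; a "#" there breaks the URL
url <- sprintf("mongodb://%s:%s@%s:27017/?ssl=true", user, pass, host)

# The actual connection needs a reachable cluster, so it is commented out:
# con <- mongolite::mongo(collection = "test", db = "test", url = url,
#                         options = mongolite::ssl_options(
#                           ca = "rds-combined-ca-bundle.pem"))
```

Building the URL separately also makes it easy to print and eyeball before connecting.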
Related
I'm trying to use RStudio to connect to a Google Cloud Postgres database. My R code is below.
From the documentation I'm relatively sure I have the correct setup.
# Load the DBI library
library(DBI)
library(RPostgres)

# Helper for getting a new connection to Cloud SQL
getSqlConnection <- function() {
  con <- dbConnect(
    RPostgres::Postgres(),
    dbname = 'my_dbname',
    host = 'my_host',
    port = 5432,
    user = 'username',
    password = 'my_password')
  return(con)
}

conn <- getSqlConnection()
res <- dbListTables(conn)
print(res)
When I run the code above, I get the error:
Error: FATAL: unsupported frontend protocol 1234.5679: server supports 2.0 to 3.0
I was hoping someone might have some advice on this error, or on how to set up an RStudio connection to a Google Cloud Postgres DB.
Thanks in advance!
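One workaround often suggested for this particular protocol error (a sketch only; `sslmode` is a standard libpq keyword that RPostgres forwards, and whether it helps depends on what sits between your client and the server) is to set `sslmode` explicitly so the client skips the SSL negotiation packet the server is rejecting:

```r
# Connection arguments as a list; all values are placeholders.
# sslmode = "disable" stops libpq from sending the SSLRequest packet
# (the 1234.5679 "protocol" from the error) that the server/proxy rejects.
con_args <- list(
  dbname   = "my_dbname",
  host     = "my_host",
  port     = 5432,
  user     = "username",
  password = "my_password",
  sslmode  = "disable"
)

# Needs a reachable database, so the connect call is commented out:
# con <- do.call(DBI::dbConnect, c(list(RPostgres::Postgres()), con_args))
```

If the instance requires SSL, `sslmode = "require"` with proper certificates is the alternative to try.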
I need to connect to an Oracle database from R.
Here is what I do:
library(RODBC)
library(RJDBC)
odbcConnect(dsn = "NBD-TEST-DEV-BLACKBOX",
            uid = "Y", pwd = "X")
I normally use the Oracle SQL Developer client to get access.
When I try to connect via R,
I get these errors:
Warning messages:
1: In RODBC::odbcDriverConnect("DSN=NBD-TEST-DEV-BLACKBOX;UID=x;PWD=y"):
[RODBC] ERROR: state IM002, code 0, message [Microsoft][ODBC Driver Manager] Data source name not found and no default driver specified
2: In RODBC::odbcDriverConnect("DSN=NBD-TEST-DEV-BLACKBOX;UID=x;PWD=y"):
ODBC connection failed
I think it's because the database is on another machine (R runs on a server with IP x.x.x.x, and Oracle is on IP Y.y.y.y).
So I have two questions:
1) How can I point R to the ODBC driver, and where does it live? My Oracle SQL Developer is at "C:\Users\Admin\Desktop\sqldeveloper-19.1.0.094.2042-x64".
2) How do I specify the needed IP when connecting?
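On both questions, one hedged sketch: instead of a DSN registered on the machine, you can pass a DSN-less connection string that names the ODBC driver and the remote host directly. The driver name, port, and service name below are assumptions; the driver must actually appear in the ODBC Data Source Administrator (odbcad32.exe), which SQL Developer's install folder does not provide, because SQL Developer ships its own JDBC driver rather than an ODBC one:

```r
drv     <- "Oracle in OraClient19Home1"  # assumption: as listed in odbcad32.exe
host    <- "Y.y.y.y"                     # the database server's IP
port    <- 1521L                         # assumption: default Oracle listener port
service <- "MYSERVICE"                   # assumption: your Oracle service name

# DSN-less connection string: driver, //host:port/service, credentials
conn_str <- sprintf("Driver={%s};Dbq=//%s:%d/%s;Uid=Y;Pwd=X;",
                    drv, host, port, service)

# Needs the driver installed and the server reachable, so commented out:
# con <- RODBC::odbcDriverConnect(conn_str)
```

For RODBC you would typically install the Oracle Instant Client with its ODBC package separately; the SQL Developer path alone is not enough.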
I have access to a one-way export function for a public company database through Elasticsearch. I am having problems connecting to it from R with the elastic package.
I have the server name (URL), a username, and a password, but I don't have any port number. They describe it as a REST API. Do I have to use the elastic package, or is there an easier way around it? The only information I have about the database is: http://distribution.virk.dk/cvr-permanent/virksomhed/_search?.
Host="Distribution.virk.dk"
index="cvr-permanent"
type="virksomhed"
The above link works with httr, but I wish to use elastic for automation purposes when making large requests of data.
So my connection looks like:
host <- "distribution.virk.dk"
port <- ''
path <- ''
schema <- "http"
user <- "user_name"
pass <- "secret"
connect(es_host = host, es_user = user, transport = schema,
        port = port, es_pwd = pass)
Even though I set port to blank, it still returns 9200.
If I try to use Search:
>Search(index="cvr-permanent", type="virksomhed", q='"cvrNummer":"33647093"', size=10)
Error in curl::curl_fetch_memory(url, handle = handle) :
Failed to connect to distribution.virk.dk port 9200: Timed out
(elastic maintainer here)
You should be able to pass httr::authenticate() to elastic::Search and other functions from the pkg, e.g.,
x <- Search(config = c(httr::verbose(), authenticate("foo", "bar")))
You should see the Authorization: Basic XXXXXX header in the request headers.
Does that work?
I am trying to make a connection from MongoDB Atlas into R, but nothing seems to work. I have tried mongolite and RMongo. Is there a good solution for connecting my Atlas MongoDB to RStudio?
library(mongolite)
mongo <- mongo(collection = "nameofcollection", db = "nameofdb",
               url = "mongodb://usr:pass@cluster0-shard-00-00-h8acf.mongodb.net:27017,cluster0-shard-00-01-12ucd.mongodb.net:27017,cluster0-shard-00-02-haucd.mongodb.net:27017/dbname?ssl=true&replicaSet=Cluster0-shard-0&authSource=admin",
               verbose = TRUE)
You need to change nameofcollection, nameofdb, and dbname, and substitute your own Atlas username, password, and database name.
Change the cluster URL to your own cluster's URL (Atlas gives you the URL).
1) Go to Atlas (https://cloud.mongodb.com)
2) Go to your cluster, and click on the Connect button
3) Select "Connect your application"
4) In there, you'll see a Connection String, which will allow you to see the host name/URL for the cluster
You can then use the following R code in order to connect to your Atlas MongoDB cluster:
library(mongolite)

mongo_db_user <- "myusername"
mongo_db_password <- "mypassword"
mongo_database <- "mydatabase"
mongo_collection <- "mycollection"
mongo_clustername <- "cluster123-abc.mongodb.net"

# the following is the connection string for an Atlas online MongoDB cluster
url_path <- sprintf("mongodb+srv://%s:%s@%s/admin",
                    mongo_db_user, mongo_db_password, mongo_clustername)

mongo_db <- mongo(collection = mongo_collection, db = mongo_database,
                  url = url_path, verbose = TRUE)

data <- data.frame(Date = c("2020-04-21", "2020-04-20"),
                   Returns = c(0.05, 0.02))
mongo_db$insert(data)

rm(mongo_db)  # disconnect
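To read the rows back, the same handle exposes a `$find()` method that takes a JSON query string; the filter below on the `Date` field is a hypothetical example matching the data inserted above:

```r
# A JSON query string filtering on the Date field (hypothetical example)
query <- '{"Date": "2020-04-21"}'

# Needs the live connection created above, so commented out:
# returns <- mongo_db$find(query)
```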
I was struggling with this for a while. Make sure to run `brew install openssl` in the terminal to set up the SSL connection! Then it worked fine for me. Cheers
I am trying to query AWS ElasticSearch Service (AWS ES) through a package in R called elastic. I am getting an error when trying to connect to the server.
Here is an example:
install.packages("elastic")
library(elastic)

aws_endpoint <- "<secret>"
# I am certain the endpoint exists and is correct, as it functions with Kibana
aws_port <- 80
# I have tried 9200, 9300, and 443 with no success

connect(es_host = aws_endpoint,
        es_port = 80,
        errors = "complete")
ping()
Search(index = "foobar", size = 1)$hits$hits
Whether I ping the server or actually try to search a document, I get this error:
Error: 404 - no such index
ES stack trace:
type: index_not_found_exception
reason: no such index
resource.type: index_or_alias
resource.id: us-east-1.es.amazonaws.com
index: us-east-1.es.amazonaws.com
I have gone into my AWS ES dashboard and made certain I am using indexes that exist. Why this error?
I imagine I am misunderstanding something about transport protocols. elastic interacts with Elasticsearch's HTTP API, which I thought was fine.
How do I establish an appropriate connection between R and AWS ES?
R version 3.3.0 (2016-05-03); elastic_0.7.8
Solved it.
es_path must be specified as an empty string (""). Otherwise, connect() treats the AWS region (i.e. us-east-1.es.amazonaws.com) as the path. I imagine connect() adds the misinterpreted path to the HTTP request, following the format shown here.
connect(es_host = aws_endpoint,
        es_port = 80,
        errors = "complete",
        es_path = "")
Just to be clear, the parameters I actually used is shown below, but they should not make a difference. Fixing es_path is the key.
connect(es_host = aws_endpoint,
        es_port = 443,
        es_path = "",
        es_transport_schema = "https")
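A related gotcha worth checking (the endpoint below is made up for illustration): `es_host` has to be the bare hostname, with no scheme prefix and no path component, since the scheme belongs in `es_transport_schema` and the path in `es_path`:

```r
# Hypothetical AWS ES endpoint -- yours will differ
aws_endpoint <- "search-mydomain-abc123.us-east-1.es.amazonaws.com"

# es_host should carry neither a scheme nor a path
has_scheme <- grepl("^https?://", aws_endpoint)
has_path   <- grepl("/", aws_endpoint, fixed = TRUE)

# connect(es_host = aws_endpoint, es_port = 443,
#         es_path = "", es_transport_schema = "https")
```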