Cannot allocate a new connection: 16 connections already opened (RMySQL)

I am very new to Shiny and R. Using Shiny, I am trying to connect to a database and fetch data from it. When I repeatedly access my Shiny app in the browser, I get the error Cannot allocate a new connection: 16 connections already opened. How can I overcome this error, or does Shiny only support 16 users at a time? I found another Stack Overflow post, RStudio Shiny Error mysqlNewConnection maximum of 16 connections, but the explanation there was not clear.

Maybe you open a new DB connection with obj <- dbConnect(...) every time you send a query in your code. You can simply call dbDisconnect(obj) on the object you created to kill the respective connection every time after your query has executed.
You can also use this function to kill all open connections at once:
library(RMySQL)
killDbConnections <- function () {
  all_cons <- dbListConnections(MySQL())
  print(all_cons)
  for (con in all_cons) {
    dbDisconnect(con)
  }
  print(paste(length(all_cons), "connections killed."))
}
I'd recommend writing a small function outside shiny that handles the whole opening and closing thing:
library(RMySQL)
sqlQuery <- function (query) {
  # create a DB connection object with the RMySQL package
  DB <- dbConnect(MySQL(), user = "youruser", password = "yourpassword",
                  dbname = "yourdb", host = "192.168.178.1")
  # close the DB connection when the function call exits
  on.exit(dbDisconnect(DB))
  # send the query to obtain a result set
  rs <- dbSendQuery(DB, query)
  # fetch all rows from the result set and convert to a data frame
  result <- fetch(rs, -1)
  # return the data frame
  return(result)
}
Hope that helps!
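For instance, inside a Shiny reactive you could then call it like this (the output and table names are illustrative, assuming the sqlQuery() helper above):

```r
# Each call opens its own connection and closes it on exit,
# so no connections accumulate across sessions.
output$companies <- renderTable({
  sqlQuery("SELECT * FROM yourtable LIMIT 10")
})
```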

Fast solution
If you hit this issue, first run the following code interactively to close all your connections to the MySQL database:
lapply(dbListConnections(MySQL()), dbDisconnect)
(Note that you can replace MySQL() with another DBI driver if you use another database management system.)
Faster solution
Just restart the R session: Cmd/Ctrl + Shift + F10 in RStudio.
How to avoid this issue
You need to tell Shiny how to disconnect properly. That part depends on the use case. If you open a connection each time you start Shiny, you could add inside the server function:
session$onSessionEnded(function() {
  dbDisconnect(con)
})
Where con is your connection.
If you open a connection each time you run a query, you must disconnect immediately after the query has run.
You should also take a look at the pool package, which is suggested by the Shiny team to manage connections.
There is also a very helpful Shiny article on databases.
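To illustrate the pool approach, here is a minimal sketch (credentials, host, and table names are placeholders taken from the example above; adjust to your setup):

```r
library(shiny)
library(pool)
library(RMySQL)

# One pool for the whole app; connections are checked out per query
# and returned automatically, so they never accumulate.
pool <- dbPool(
  drv = RMySQL::MySQL(),
  dbname = "yourdb",
  host = "192.168.178.1",
  user = "youruser",
  password = "yourpassword"
)

# Close all pooled connections when the app stops.
onStop(function() {
  poolClose(pool)
})

server <- function(input, output, session) {
  output$tbl <- renderTable({
    dbGetQuery(pool, "SELECT * FROM yourtable LIMIT 10")
  })
}
```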

dbDisconnect() was not working in my case, so I stopped the MySQL server from the terminal and started it again using:
sudo service mysql stop
sudo service mysql start
Then I ran the code using dbDisconnect(), and now it's working for me.

Related

copy an sqlite database into memory using sqlalchemy for testing flask app

I want to use my development database during testing of my flask app (it contains oauth tokens that I can't get during testing). I was hoping to be able to copy the disk based development database to memory and use that for testing so that it did not alter the development db.
so here is what I have done so far:
in my app when testing is selected use:
SQLALCHEMY_DATABASE_URI = "sqlite:///:memory:"
Now I wish to copy my existing db into this temporary memory-based db. I have searched other answers, but the closest I have come is:
# open existing db
old_db = sqlite3.connect('app.db')
# get the connection to the in memory db
connection = db.get_engine().raw_connection()
# copy old_db into memory
old_db.backup(connection)
but this results in
backup() argument 1 must be sqlite3.Connection, not _ConnectionFairy
Is there a way to use the connection like this to achieve my desired aim?
Thank you
Martyn
If anyone else comes to this, here's what I missed: the raw_connection() object needs to be turned into an sqlite3 connection via its .connection attribute:
connection = db.get_engine().raw_connection().connection
This is the full code:
# open existing db
old_db = sqlite3.connect('app.db')
# get the connection to the in memory db
conn = db.get_engine().raw_connection().connection
# copy old_db into memory
old_db.backup(conn)
Martyn
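For reference, the underlying sqlite3 backup mechanism can be demonstrated without SQLAlchemy at all; this standalone sketch (file and table names are illustrative) copies a disk database into a fresh in-memory one:

```python
import sqlite3

# Create a small disk-based database to stand in for app.db.
disk_db = sqlite3.connect("app_demo.db")
disk_db.execute("CREATE TABLE IF NOT EXISTS tokens (id INTEGER PRIMARY KEY, value TEXT)")
disk_db.execute("DELETE FROM tokens")
disk_db.execute("INSERT INTO tokens (value) VALUES ('secret')")
disk_db.commit()

# Copy the whole disk database into memory (requires Python 3.7+).
mem_db = sqlite3.connect(":memory:")
disk_db.backup(mem_db)

# The in-memory copy now has the data; changes to it leave the file untouched.
print(mem_db.execute("SELECT value FROM tokens").fetchone()[0])  # secret
```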

How Do I Format A Connection String to Analysis Server in RMDX?

Sorry in advance for asking a very basic/newbie question, but I'm trying to use RMDX to query some data from a Microsoft Analysis Server from RStudio. RMDX is the only package I've been able to successfully install. I've also tried adding X4R with install_github but had some difficulty (and in any case X4R also seems to use a URL as the connection string), and I've tried adding olapR from my RClient library to my R 3.5.2 library, but I get an error about it being built for a version of R with different internals.
RMDX takes a URL as a connection string and I don't know how to format the data connection... correctly, I guess? I've only used sql with RODBC in R before, and setting up a data source via ODBC Data Source Administrator doesn't work for the data warehouse.
Obviously I'm missing a lot of basics/theory/fundamentals so I'm just kind of shooting in the dark, but I've tried "localhost//[server-name]," "https://[server-name]," and copying the connection string used for some of the microsoft bi dashboards that connect to the same data warehouse that I want to query, and none work. Does anyone know how to solve this issue, or can anyone suggest an alternative way of executing MDX queries from RStudio? Thanks!
After experimenting along a similar route, I ended up writing a PowerShell script that connects to the MS SSAS/OLAP cube via its URL (usually the connection string that has msmdpump.dll mentioned in it somewhere, typically at the end) as the $connectionString. Inside that script (more precisely, a module) I rely heavily on the AdomdClient object and its properties, something along these lines:
#establish SSAS ADOMD Client and open connection
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.AnalysisServices.AdomdClient") | Out-Null;
echo 'Connecting to Database/Cube via Powershell module!'
$con = new-object Microsoft.AnalysisServices.AdomdClient.AdomdConnection($connectionString)
$con.Open()
$command = new-object Microsoft.AnalysisServices.AdomdClient.AdomdCommand($MDXquery, $con)
$dataAdapter = new-object Microsoft.AnalysisServices.AdomdClient.AdomdDataAdapter($command)
$ds = new-object System.Data.DataSet
#fetch data
echo 'fill data container w cube data'
$dataAdapter.Fill($ds)
$con.Close();
....
After that I call this PS script via system2(command = "powershell", ...) from within R with the various connection string, MDX query, and other parameters, save the result in a temp folder as a CSV file, and then load that back into my R session.
Hope that helps.
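On the R side, the call-out-and-read-back step might look roughly like this (the script name, parameter names, and variables are all assumptions about your setup, not part of the answer above):

```r
# Hypothetical script and parameter names; adjust to your environment.
out_file <- file.path(tempdir(), "cube_result.csv")

system2(
  command = "powershell",
  args = c("-ExecutionPolicy", "Bypass",
           "-File", "get_cube_data.ps1",
           "-ConnectionString", shQuote(connection_string),
           "-MDXQuery", shQuote(mdx_query),
           "-OutFile", shQuote(out_file))
)

# Load the CSV the script wrote back into the R session.
result <- read.csv(out_file)
```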

Can I Use Only One RODBC Connection in Foreach using doParallel in R?

I know that I can open an SQL Server connection in each worker, however, it opens multiple connections to the server at the same time. My work's Database Administrators are saying that I am using too many system resources by having multiple connections open at the same time, and that I need to use only one connection. Is it possible to open a single connection and pass it to each worker? I did read the answer as to why this can't be performed here: RODBC & foreach, but I was hoping there may be a new solution or new insights.
library(foreach)
library(doParallel)
cl <- makeCluster(detectCores() - 1)
registerDoParallel(cl)
clusterEvalQ(cl, {
  library(RODBC)
  Conn <- odbcConnect("SERVER_NAME")
})
foreach(iter = 1:10, .noexport = "Conn") %dopar% {
  # Code block
}
I also need this because I am creating a temporary table and need each worker to be able to access the connection that has the temporary table. Otherwise, each worker opens a new connection and doesn't have access to the connection with the temporary table. Thanks!
You wouldn't be able to work concurrently over a single connection, even if you could share it.
If you want multiple child processes, each with their own connection, to see the same data, then either use a permanent table, or a global temporary table created in the driver process, e.g.
create table ##foo(...)
Global temporary tables are automatically dropped when the session that created the table ends and all other tasks have stopped referencing them. The association between a task and a table is maintained only for the life of a single Transact-SQL statement. This means that a global temporary table is dropped at the completion of the last Transact-SQL statement that was actively referencing the table when the creating session ended.
CREATE TABLE
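A sketch of that pattern with RODBC (the DSN, source table, and query are placeholders for illustration):

```r
library(RODBC)
library(foreach)
library(doParallel)

# The driver process creates the global temp table; keep this connection
# open until the workers have finished, or ##foo will be dropped.
drv <- odbcConnect("SERVER_NAME")
sqlQuery(drv, "SELECT id, val INTO ##foo FROM dbo.source_table")

cl <- makeCluster(detectCores() - 1)
registerDoParallel(cl)

res <- foreach(i = 1:10, .packages = "RODBC") %dopar% {
  con <- odbcConnect("SERVER_NAME")  # one connection per worker
  on.exit(odbcClose(con))
  sqlQuery(con, sprintf("SELECT * FROM ##foo WHERE id %% 10 = %d", i - 1))
}

stopCluster(cl)
odbcClose(drv)
```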

R connection to SQL Server

I have been trying to make a connection between R and SQL Server, but I keep getting this error message:
object 'C_RODBCDriverConnect' not found
It seems as if R is trying to find that object but is failing to do so; does anyone have an idea of how I can solve this issue?
Reinstalling R is not an option, as it's a work computer and I do not have the rights to do so. Also note that I am using the RODBC package because the odbc package doesn't install properly (I kept getting the "non-zero exit status" error message).
Thanks in advance.
Zachary
Just make sure you create an ODBC connection in the ODBC Data Sources tool and ensure you have pointed it correctly to the SQL Server database.
library(RODBC)
# "name_of_connection" is the name you used in the ODBC connection setup
con_ref <- odbcConnect("name_of_connection")
# read the data in
x <- sqlQuery(con_ref, "select * from table_name")

oracle connection in r

Hi folks, I can connect to Oracle through R with no problem using the following code:
library(RODBC)
channel <- odbcConnect(dsn = "xxxx", uid = "xxxx", pwd = "xxxx")
odbcGetInfo(channel)## CHECKS CONNECTION TO ORACLE
COMPANIES <- sqlFetch(channel, "COMPANIES")
COMPANIES_EQUIPMENT <- sqlFetch(channel, "COMPANIES_EQUIPMENT")
EQUIPMENT_SENSORS <- sqlFetch(channel, "EQUIPMENT_SENSORS")
odbcClose(channel) ## CLOSES odbc CONNECTION
When I fetch the first data table, COMPANIES, there is no issue, but this means running the code just to fetch that one data frame. The problem is that when I run the above code to fetch all three data frames:
COMPANIES,
COMPANIES_EQUIPMENT,
EQUIPMENT_SENSORS
my R script just hangs. I have tried running each fetch statement individually and they all work, but when run together my script hangs. Any ideas?
I am not sure if the problem is R, the new laptop, or Oracle. Oracle seems OK, as I can connect with no issue, but is there perhaps a limit on the amount of data allowed?
I am using Oracle Instant Client 11.2 to connect my Windows 7 Professional laptop to Oracle, with RStudio version 1.0.143.
thanks
Nigel
Have you considered using sqlQuery instead of sqlFetch?
COMPANIES <- sqlQuery(channel, "SELECT * FROM COMPANIES")
You might need to replace the * with the names of the columns you need.
I personally use RJDBC to connect to Oracle:
driverClass <- "oracle.jdbc.OracleDriver"
classPath <- "<PATH_TO_INSTANTCLIENT>/instantclient_12_1/ojdbc6.jar"
connectPath <- "jdbc:oracle:thin:@//<HOST>:<PORT>/<DB_NAME>"
jdbcDriver <- RJDBC::JDBC(driverClass, classPath)
jdbcConnection <- RJDBC::dbConnect(jdbcDriver, connectPath, dbuser, dbpass)
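Once connected, queries go through the usual DBI verbs (the table name is illustrative, carried over from the question):

```r
# Fetch a table and close the connection when done.
COMPANIES <- RJDBC::dbGetQuery(jdbcConnection, "SELECT * FROM COMPANIES")
RJDBC::dbDisconnect(jdbcConnection)
```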
