I am trying to connect directly to a Domo dataset within my data warehouse in Domo. Is there a way I could make a direct connection to the Domo dataset in R? I simply want a direct connection to the dataset, so as to avoid having to download a CSV and then import it manually.
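One route worth trying (an assumption on my part, not something confirmed in this thread) is Domo's DomoR package, which fetches a dataset straight into a data frame over the Domo API. A minimal sketch; the instance name, access token, and dataset id are placeholders:

# a sketch assuming Domo's DomoR package (github.com/domoinc-r/DomoR);
# the customer instance, access token, and dataset id are placeholders
library(DomoR)
DomoR::init(customer = "your-instance", token = "your-access-token")
df <- DomoR::fetch("your-dataset-id")   # dataset arrives as a data frame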
I need to ingest CSV data into Kusto using a Kusto query, for a geographical map visualization, but I couldn't find any query to ingest CSV data. Please help me with this.
Thank you
Try the one-click ingestion wizard or the ingest-from-storage commands (this requires the data to be in an Azure Blob or ADLS Gen2 file).
You can also ingest using a query, as described here: https://learn.microsoft.com/en-us/azure/data-explorer/kusto/management/data-ingestion/ingest-from-storage
You will need to place your files in Azure Blob Storage or Azure Data Lake Store Gen 2.
However, the one-click ingestion wizard method is advisable, as it ingests data via the Data Management component, which offers better queuing and ingestion orchestration than ingesting directly into the database/table using the command shared above.
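If you prefer to drive this from R, here is a minimal sketch of the ingest-from-storage command sent via the AzureKusto package; the cluster, database, table, blob URL, and SAS token are all placeholders:

# a sketch using the AzureKusto package; every name and the SAS token are placeholders
library(AzureKusto)
endp <- kusto_database_endpoint(
  server   = "https://mycluster.westus.kusto.windows.net",
  database = "mydb")
# control command that pulls the CSV from blob storage into the target table
run_query(endp, ".ingest into table GeoData ('https://myaccount.blob.core.windows.net/container/file.csv?<SAS-token>') with (format = 'csv', ignoreFirstRecord = true)")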
I have developed a relatively simple R Shiny app that does the following:
I input a txt file.
The txt file is processed.
A new object is created: timestamp + a number.
I save it in my SQL database, for instance, or as a .txt file on my server.
I want to configure an API such that, on a specific request, it would serve the new object from the R Shiny server. I believe I could use plumber, but that means running two servers and somehow coordinating the data flow.
Is there a simpler solution, such that I could get the data from my R Shiny server using an API tool and a simple GET?
Thank you
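A plumber endpoint for this can stay very small. It would indeed be a second process next to Shiny, but since the two only share a file (or a database table), the coordination reduces to agreeing on that storage location. A minimal sketch, assuming the app writes its result to a known file; the path and port are placeholders:

# plumber.R - a minimal sketch; assumes the Shiny app writes the new
# object (timestamp + number) to a known file, whose path is a placeholder
library(plumber)

#* Return the latest processed object
#* @get /latest
function() {
  readLines("/srv/app-output/latest.txt")
}

Launch it with plumber::pr_run(plumber::pr("plumber.R"), port = 8000), and a simple GET to http://yourserver:8000/latest returns the object.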
My R program produces a data frame (dataframe1) and writes it into database A (e.g., an Oracle database), but another database B (e.g., MySQL) also needs dataframe1.
How do I synchronize database A's dataframe1 to MySQL database B using the URL from database B, from my R program on the offline server? Do I need to make asynchronous requests using any R packages? Do I have to add the URL to my previous R program?
Database B's URL: http://10.234.200.173:8088/Server/data/forecast/synchronize?kpi_no=$kpi_id, which was provided by a colleague in another department.
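Assuming that endpoint accepts an HTTP POST with a JSON body (a guess on my part; only the URL and the kpi_no parameter come from the question), a minimal httr sketch would be:

# a sketch: the URL and kpi_no come from the question; the payload shape is a guess
library(httr)
library(jsonlite)

# dataframe1 is the data frame the R program already wrote to database A
resp <- POST(
  "http://10.234.200.173:8088/Server/data/forecast/synchronize",
  query = list(kpi_no = "my_kpi_id"),            # placeholder id
  body  = toJSON(dataframe1, dataframe = "rows"),
  content_type_json()
)
stop_for_status(resp)                            # fail loudly on a non-2xx response

This is a plain synchronous request; packages like future or later could make it asynchronous, but that is only needed if the call must not block the rest of the program.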
I'm working on a process improvement that will use SQL in R to work with large datasets. Currently the source data is stored in several different MS Access databases. My initial approach was to use RODBC to read all of the source data into R, and then use sqldf() to summarize the data as needed. I'm running out of RAM before I can even begin using sqldf(), though.
Is there a more efficient way for me to complete this task using R? I've been looking for a way to run a SQL query that joins the separate databases before reading them into R, but so far I haven't found any packages that support this functionality.
If your data is in a database, dplyr (part of the tidyverse) would be the tool you are looking for.
You can use it to connect to a local / remote database, push your joins / filters / whatever there, and collect() the result as a data frame, as sketched below. You will find the process neatly summarized at http://db.rstudio.com/dplyr/
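A sketch of that workflow; the DSN, table, and column names are placeholders:

# a sketch of the dplyr database workflow; DSN, table, and column names are placeholders
library(DBI)
library(dplyr)
library(dbplyr)

con <- dbConnect(odbc::odbc(), dsn = "my_dsn")

tbl(con, "orders") %>%
  filter(year == 2017) %>%                         # translated to SQL, runs in the database
  group_by(customer_id) %>%
  summarise(total = sum(amount, na.rm = TRUE)) %>%
  collect()                                        # only the summary comes back into RAM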
What I am not quite certain of - but it is not an R issue but rather an MS Access issue - is the means for accessing data across multiple MS Access databases.
You may need to write custom SQL code for that and pass it to one of the databases via DBI::dbGetQuery(), having MS Access handle the database link; a sketch of that idea follows.
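For illustration, a hedged sketch: Jet/ACE SQL can reference a table in another database file inline, so a single query passed through DBI can join across files. The file paths and table/column names are placeholders, and I have not tested this against Access:

# a sketch assuming the Microsoft Access ODBC driver is installed;
# file paths and table/column names are placeholders
library(DBI)

con <- dbConnect(odbc::odbc(), .connection_string = paste0(
  "Driver={Microsoft Access Driver (*.mdb, *.accdb)};",
  "Dbq=C:\\data\\main.accdb;"))

# Jet/ACE SQL can point at a second database file inline
df <- dbGetQuery(con, "
  SELECT a.id, a.value, b.extra
  FROM main_table AS a
  INNER JOIN [;database=C:\\data\\other.accdb].other_table AS b
    ON a.id = b.id
")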
The link you posted looks promising. If it doesn't yield the intended results, consider linking one Access DB to all the others. Links take almost no memory. Union the links and fetch the data from there.
# Load RODBC package
library(RODBC)
# Connect to Access db (odbcConnectAccess() requires 32-bit R;
# for .accdb files use odbcConnectAccess2007())
channel <- odbcConnectAccess("C:/Documents/Name_Of_My_Access_Database")
# Get data
data <- sqlQuery(channel, "SELECT * FROM Name_of_table_in_my_database")
These URLs may help as well.
https://www.r-bloggers.com/getting-access-data-into-r/
How to connect R with Access database in 64-bit Window?
I have been using ArcMap to access GIS data on a spatial data server. I want to figure out how to do the same within R.
I know how to read shapefiles into R. I have successfully used maptools and rgdal to open and map locally stored shapefiles (e.g.
http://www.nceas.ucsb.edu/scicomp/usecases/ReadWriteESRIShapeFiles)
My problem is when the data is not stored locally, but rather on an application server. I believe it's an Oracle database. I've been given information about the (1) server, (2) instance (a number), (3) database, (4) user, and (5) password. Normally, I would include an example, but it's doubtful that an external user could access the servers.
For example, here's how to read and plot local files in R:
library(rgdal)
ogrInfo(".", "nw-rivers")                      # inspect a layer in the local directory
centroids.rg <- readOGR(".", "nw-centroids")   # read a layer
plot(centroids.rg)
The "." points to the local directory. How would I change this to access data on a server? The actual syntax of code would be helpful.
You can read data from Oracle Spatial DBs using GDAL/OGR:
http://www.gdal.org/ogr/drv_oci.html
if you have the driver in your GDAL/OGR installation. If:
require(rgdal)
ogrDrivers()
shows the Oracle driver, then you can use readOGR with all the parameters in the right place.
At a guess, and by analogy with the PostGIS example, I'd say try:
dsn="OCI:userid/password#database_instance:")
ogrListLayers(dsn)
s = readOGR(dsn, layername)
but I don't have an Oracle server to test it on (if I did, I'd ditch it tomorrow for PostGIS and spend the license savings on a yacht), and you don't sound certain it's an Oracle server anyway. The general principle for connecting to any spatial database is the same: check you have an OGR driver, figure out what the dsn parameter looks like, and try it.
Another way is to go via ODBC, or another non-spatial R database connection. However, you'll likely get the spatial data back in WKB or WKT form and have to convert it to SpatialWhatevers (points, lines, polygons?).
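An untested sketch of that route against Oracle: SDO_UTIL.TO_WKTGEOMETRY serialises SDO_GEOMETRY to WKT, and rgeos::readWKT() parses it back into sp objects. The DSN, table, and column names are placeholders, and I assume point geometries:

# an untested sketch assuming an ODBC DSN for the server and point geometries;
# the DSN, table, and column names are placeholders
library(RODBC)
library(rgeos)    # readWKT()

ch <- odbcConnect("my_oracle_dsn", uid = "userid", pwd = "password")
d  <- sqlQuery(ch, "SELECT id, SDO_UTIL.TO_WKTGEOMETRY(geom) AS wkt FROM my_spatial_table")

# one sp object per row, combined into a single Spatial layer
geoms <- lapply(seq_len(nrow(d)), function(i)
  readWKT(as.character(d$wkt[i]), id = as.character(d$id[i])))
pts <- do.call(rbind, geoms)
plot(pts)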
PostGIS example is here:
https://gis.stackexchange.com/questions/64950/which-is-the-best-way-of-working-with-postgis-data-in-r