I'm looking for a way to connect to a Dentrix G4 database to extract certain data. We've had hit-and-miss luck. Does anyone know of an ODBC driver that can connect to Dentrix G4?
I have never seen anyone successfully get in through ODBC. The G4 database files can be accessed directly, though: open the Patient.dat file in the Dentrix data folder in binary mode or in a hex reader and use hex/decimal/string conversion to turn the raw data back into human-readable text. Not a great solution, but it should be possible to get the data out of the system.
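The direct-read approach boils down to treating the file as fixed-width binary records. A minimal sketch of that idea in Python, stressing that the real Patient.dat layout is undocumented, so the record size, offsets, and field types below are entirely made up for illustration:

```python
import struct

# Hypothetical layout: 4-byte unsigned id, 20-byte name, 2-byte birth
# year, 6 bytes of padding. These are NOT Dentrix's real offsets; you
# would have to work the actual layout out in a hex editor first.
RECORD_FMT = "<I20sHxxxxxx"
RECORD_SIZE = struct.calcsize(RECORD_FMT)  # 32 bytes here

def decode_record(raw: bytes):
    """Unpack one fixed-width binary record into Python values."""
    patient_id, name, birth_year = struct.unpack(RECORD_FMT, raw)
    # Strip the NUL padding from the fixed-width text field
    return patient_id, name.rstrip(b"\x00").decode("ascii"), birth_year

# Synthetic record standing in for bytes read from Patient.dat;
# a real script would loop: while chunk := f.read(RECORD_SIZE): ...
raw = struct.pack(RECORD_FMT, 42, b"DOE, JOHN", 1975)
print(decode_record(raw))  # (42, 'DOE, JOHN', 1975)
```

Once the layout is reverse-engineered, the same loop over `f.read(RECORD_SIZE)` recovers every record in the file.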
Currently, using SAS Enterprise Guide, we have some code that pulls data from a data warehouse with a seemingly straightforward 'CONNECT TO DB2(x,y,z)' statement in a PROC SQL, where x=database name, y=user ID, z=password.
Looking in Tools > Connections > Profiles, I can find the host name and port for the connection. I'm trying to see if there's a way to use this information (and find any other needed information) to connect to the same data warehouse using R. Other posts here on SO have some code using JDBC or RODBC, but I wasn't able to get anywhere with that, as they mention drivers (which I don't know anything about) and the Oracle folder in my C drive didn't seem to have any of that information.
After reaching out to someone involved with the warehouse, their response was "it is a SAS data warehouse and not accessible via direct ODBC connections."
I'm not too sure what other information to ask for or provide for this, but do any of you know if what I'm looking to do is possible? Are SAS Data Warehouses structured some way that would prevent me from accessing it in R? If I can, what else do I need besides the host name, port, database name, user ID, and password?
Thanks in advance, I'm pretty new to all of this
I'm working on a process improvement that will use SQL in R to work with large datasets. Currently the source data is stored in several different MS Access databases. My initial approach was to use RODBC to read all of the source data into R, and then use sqldf() to summarize the data as needed. I'm running out of RAM before I can even begin to use sqldf(), though.
Is there a more efficient way for me to complete this task using R? I've been looking for a way to run a SQL query that joins the separate databases before reading them into R, but so far I haven't found any packages that support this functionality.
If your data is in a database, dplyr (part of the tidyverse) would be the tool you are looking for.
You can use it to connect to a local / remote database, push your joins / filters / whatever there and collect() the result as a data frame. You will find the process neatly summarized on http://db.rstudio.com/dplyr/
What I am not quite certain of (though it is not an R issue but rather an MS Access one) is how to access data across multiple MS Access databases.
You may need to write custom SQL for that, pass it to one of the databases via DBI::dbGetQuery(), and have MS Access handle the cross-database link.
The link you posted looks promising. If it doesn't yield the intended results, consider linking one Access DB to all the others. Links take almost no memory. Union the links and fetch the data from there.
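The union itself is plain Access SQL; its IN clause can even query tables sitting in other .mdb files directly. A sketch, where the table and file names are illustrative, not from the question:

```sql
-- Pull the same-shaped table from two external databases and the
-- current one, then stack the rows before R ever sees them.
SELECT * FROM Sales IN 'C:\data\db1.mdb'
UNION ALL
SELECT * FROM Sales IN 'C:\data\db2.mdb'
UNION ALL
SELECT * FROM Sales;  -- table local to (or linked into) the queried database
```

Run against one database over ODBC, this returns a single combined result set, so only the summarized rows need to travel into R.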
# Load the RODBC package
library(RODBC)
# Connect to the Access database (note: odbcConnectAccess needs 32-bit R)
channel <- odbcConnectAccess("C:/Documents/Name_Of_My_Access_Database.mdb")
# Get data
data <- sqlQuery(channel, "SELECT * FROM Name_of_table_in_my_database")
# Close the connection when done
odbcClose(channel)
These URLs may help as well.
https://www.r-bloggers.com/getting-access-data-into-r/
How to connect R with Access database in 64-bit Window?
My source data is in Oracle and the target data is in Teradata. Can you please suggest an easy and quick way to validate the data? There are 900 tables. If possible, please provide syntax too.
There is a product available known as the Teradata Gateway that works with Oracle and allows you to access Teradata in a "heterogeneous" manner. This may not be the most effective way to compare the data.
Ultimately, your requirements sound more process-driven; to do this effectively, the source data would need to be compared/validated as stage tables in the Teradata environment after your ETL/ELT process has completed.
I have been using ArcMap to access GIS data on a spatial data server. I want to figure out how to do the same within R.
I know how to read shapefiles into R. I have successfully used maptools and rgdal to open and map locally stored shapefiles (e.g.
http://www.nceas.ucsb.edu/scicomp/usecases/ReadWriteESRIShapeFiles)
My problem is when the data is not stored locally, but rather on an application server. I believe it's an Oracle database. I've been given the server, instance (a number), database, user, and password. Normally I would include an example, but it's doubtful that an external user could access the servers.
For example here's how to read and plot local files in R
library(rgdal)
ogrInfo(".", "nw-rivers")
centroids.rg <- readOGR(".", "nw-centroids")
plot(centroids.rg)
The "." points to the local directory. How would I change this to access data on a server? The actual syntax of code would be helpful.
You can read data from Oracle Spatial DBs using GDAL/OGR:
http://www.gdal.org/ogr/drv_oci.html
if you have the driver in your GDAL/OGR installation. If:
require(rgdal)
ogrDrivers()
shows the Oracle driver then you can use readOGR with all the parameters in the right place.
At a guess, and by analogy with the PostGIS example, I'd say try:
dsn = "OCI:userid/password@database_instance:"
ogrListLayers(dsn)
s = readOGR(dsn, layername)
but I don't have an Oracle server to test it on (if I did, I'd ditch it tomorrow for PostGIS and spend the license savings on a yacht), and you don't sound certain it's an Oracle server anyway. The general principle for connecting to any spatial database is the same: check you have an OGR driver, figure out what the dsn parameter looks like, and try it.
Another way is to go via ODBC, or another non-spatial R database connection. However, you'll likely get the spatial data back in WKB or WKT form and have to convert it to SpatialWhatevers (points, lines, polygons?).
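To make the WKT step concrete, here is a minimal sketch of pulling coordinates out of a WKT string, assuming the query hands back text like 'POINT (30 10)'. It handles only the simplest point form; real geometries (polygons, Z values, SRIDs) call for a proper parser such as shapely, or sf/wellknown on the R side:

```python
import re

def parse_wkt_point(wkt: str):
    """Extract (x, y) from a WKT POINT string, e.g. 'POINT (30 10)'.
    Illustration only; use a real WKT parser for anything beyond points."""
    m = re.match(r"\s*POINT\s*\(\s*([-\d.]+)\s+([-\d.]+)\s*\)",
                 wkt, re.IGNORECASE)
    if not m:
        raise ValueError(f"not a WKT point: {wkt!r}")
    return float(m.group(1)), float(m.group(2))

print(parse_wkt_point("POINT (30 10)"))  # (30.0, 10.0)
```

The same pattern-per-geometry-type approach gets unwieldy fast, which is why the OGR route above is usually preferable when the driver exists.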
PostGIS example is here:
https://gis.stackexchange.com/questions/64950/which-is-the-best-way-of-working-with-postgis-data-in-r
I am very new to the AS400, and I am stuck. I have read documentation but cannot find what I need.
I have an odbc connection to an AS400 server. When I run this command I get an Outfile with everything I need:
CALL QSYS.QCMDEXC('DSPUSRPRF USRPRF(*ALL) OUTPUT(*OUTFILE) OUTFILE(CHHFLE/TEST3)', 0000000061.00000)
Instead of the results going to an outfile I need to receive the results of this command to my script that is connecting through odbc. If I change 'OUTPUT(*OUTFILE)' to 'OUTPUT(*)' I get no results when I try to 'fetchall()'.
Is there any way to get this information through the odbc connection to my script?
EDIT: I am on a linux server, in a python script using pyodbc to connect. I can run sql queries successfully using this connection, but I can't figure out how to get the results of a command to come through as some sort of record set.
I hope I'm interpreting what you're asking correctly. It looks like you're accessing user profile data and dumping it to a file, and that you then want to use the contents of that file in a script or something that's running on Windows. If that's the case:
In general, when accessing data in a file from the Windows world, whether through ODBC and VBScript, or .NET, the AS/400 is treated like a database. All files in libraries are exposed via the built-in DB2 database. It's all automatic, and part of the Universal DB2 database.
So, after creating this file, you should have a file named TEST3 in library CHHFLE
You'd create a connection and execute the following SQL statement to read the contents:
Select * From CHHFLE.TEST3
This, of course, assumes that you have proper permissions to access this. You should be able to test this using the iSeries Navigator tool, which includes the ability to run SQL Scripts against the database before doing it in your script.
Added after reading comments above
There's info at this question on connecting to DB2 from Python. I hope it's helpful.
OUTPUT(*) is not stdout, unfortunately. That means you won't be able to redirect OUTPUT(*) to an ODBC connection. Dumping to a DB2 table via OUTPUT(*OUTFILE) is a good plan. Once that's done, use a standard cursor / fetch loop as though you were working with any other DB2 table.
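The fiddly part of that plan is QCMDEXC's second parameter, which must be the command string's length formatted as a DECIMAL(15,5). A sketch of the two-step flow with a small helper; the helper itself is my own illustration, and the DSN name is made up (commands containing single quotes would also need them doubled for SQL):

```python
def qcmdexc_call(cmd: str) -> str:
    """Build a CALL QSYS.QCMDEXC statement. The second parameter is the
    command length as DECIMAL(15,5), e.g. 61 -> 0000000061.00000."""
    return f"CALL QSYS.QCMDEXC('{cmd}', {len(cmd):010d}.00000)"

cl_cmd = "DSPUSRPRF USRPRF(*ALL) OUTPUT(*OUTFILE) OUTFILE(CHHFLE/TEST3)"
print(qcmdexc_call(cl_cmd))

# With a live connection (illustrative; needs an IBM i ODBC DSN):
# import pyodbc
# conn = pyodbc.connect("DSN=AS400", autocommit=True)
# cur = conn.cursor()
# cur.execute(qcmdexc_call(cl_cmd))          # dump profiles to the outfile
# cur.execute("SELECT * FROM CHHFLE.TEST3")  # then read it back over ODBC
# rows = cur.fetchall()
```

Computing the length automatically avoids the classic failure mode where the command is edited but the hard-coded DECIMAL(15,5) length is not.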