Can't load data to make dataframes and query with SQL - r

I am new to R and have worked with MS SQL for over a year now. I need to import an Excel file with two sheets into RStudio so that I can build data frames and write queries. I get an error and can't figure out why.
Can anyone here spot the line(s) causing the error? I've added two screenshots, one with the code and one with the sheets.
The sheets that need to be imported: (screenshot)
The code: (screenshot)

R cannot read your query directly; you have to write your query inside a dbGetQuery() call:
dbGetQuery(con, "SELECT * FROM covid")
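For completeness, here is a minimal end-to-end sketch of that workflow using readxl plus an in-memory SQLite database. The file name, sheet order, and table names are assumptions on my part, since the original screenshots are not available:

library(readxl)
library(DBI)
library(RSQLite)

# hypothetical file name; the real one is only visible in the screenshots
covid  <- read_excel("data.xlsx", sheet = 1)
sheet2 <- read_excel("data.xlsx", sheet = 2)

# copy both data frames into a throwaway in-memory SQLite database
con <- dbConnect(RSQLite::SQLite(), ":memory:")
dbWriteTable(con, "covid", covid)
dbWriteTable(con, "sheet2", sheet2)

# the SQL goes inside dbGetQuery(); R will not parse a bare SELECT statement
dbGetQuery(con, "SELECT * FROM covid")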

Related

Excel queries disappear when writing an R dataset into Excel

I have an Excel file in which I have made a data connection (the data comes via ODBC). I've also made a dataset in R which I now want to write into that existing Excel file. However, when I try to write the dataset into Excel, all of the data connections in the file get lost: the data is still there as plain tables, but the queries behind them are gone.
I have done that using openxlsx:
library(openxlsx)
wb <- loadWorkbook("myfile.xlsx")
writeData(wb, "Sheet1", mydataset, startCol = 1, startRow = 2)
saveWorkbook(wb, "myfile.xlsx", overwrite = TRUE)
I have also tried excel.link:
library(excel.link)
xl.workbook.activate("myfile.xlsx")
xl.sheet.activate("Sheet1")
xl[a2] <- mydataset
and that almost works: it keeps the queries as needed, but another problem appears. It ruins the encoding in Sheet1: special characters such as ä are now written as Ƥ.
Any ideas how I could fix the problem?
Thanks
Teele

Mongolite Showing No Data After Connecting to MongoDB

I'm pretty new to mongolite, and I'm trying to query my database, which is hosted server-side. However, I'm running into an issue: all of my queries return no data.
To connect, I run the following code:
con <- mongo(db="terrain", url="mongodb://22.92.59.149:27017")
After that, I run the following code, and get this output:
con$count('{}')
0
con$find('{}')
data frame with 0 columns and 0 rows
When I export the database from the command line as a CSV, it exports exactly the same file that was imported via the command line, so as far as I can tell the problem is on my end. Additionally, I believe I am looking in the correct location, because when I run listDatabases, I get a list of all of the databases:
admin <- mongo(db="admin", url="mongodb://22.92.59.149:27017")
admin$run('{"listDatabases":1}')
This segment of code outputs a list of all of the databases, their size on disk, and whether or not they are empty. It is the same result you get when you query the DB names from the command line.
What am I missing here? I'm sure it's something very simple.
Thanks in advance!
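One thing worth checking (an assumption on my part, not something confirmed in the thread): mongolite connects to a single collection, and mongo() defaults to collection = "test". If the documents were imported into a named collection, the connection has to name it explicitly:

library(mongolite)
# "terrainData" is a hypothetical collection name; use whatever name the
# command-line import actually targeted
con <- mongo(collection = "terrainData",
             db = "terrain",
             url = "mongodb://22.92.59.149:27017")
con$count('{}')  # should now report the number of documents in the collection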

Export R data frame to MS Access

I am trying to export a data frame from R to MS Access, but it seems that there is no package available for this task. Is there a way to export a data frame directly to Access? Any help will be greatly appreciated.
The following works for medium-sized datasets, but may fail if MyRdataFrame is too large for Access's 2 GB limit, or if there are type-conversion errors.
library(RODBC)
db <- "C:Documents/PreviouslySavedBlank.accdb"
Mycon <- odbcConnectAccess2007(db)
sqlSave(Mycon, MyRdataFrame)
There is also the ImportExport package.
The database has to exist already (at least in my case), so you have to create it first.
It has to be an Access 2000 database with the .mdb extension.
Here is an example:
ImportExport::access_export("existing_database.mdb", as.data.frame(your_R_data),
                            tablename = "bob")
where "bob" is the name of the table you want to create in the database. Choose your own name, of course; it has to be a table that does not already exist.
It will also add a first column called rownames, which is just an index column.
Note that creating an .accdb file and then changing the extension to .mdb won't work; you really have to open it and save it as .mdb. I added as.data.frame(), but if your data is already a data frame then there is no need.
There might be a way to write .accdb files directly with sqlSave (which is used internally by ImportExport) by specifying the driver from the RODBC package. This is covered in the link in the comment from @BenJacobson. But the solution above worked for me, and it was only one line.
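For reference, that direct route might look roughly like the sketch below. This is an assumption on my part, not something I have verified: it uses RODBC's odbcDriverConnect() with an explicit Access driver string, and it requires that the Microsoft Access ODBC driver is installed and matches your R session's 32/64-bit architecture.

library(RODBC)
# the driver name and file path are illustrative
con <- odbcDriverConnect(
  "Driver={Microsoft Access Driver (*.mdb, *.accdb)};DBQ=C:/Documents/PreviouslySavedBlank.accdb")
sqlSave(con, MyRdataFrame, tablename = "bob", rownames = FALSE)
odbcClose(con)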

Bulkload option exporting data from R to Teradata using RODBC

I have done a lot of research on how to upload a huge .txt data file through R to a Teradata DB. I tried RODBC's sqlSave(), but it did not work. I also followed some other similar questions posted, such as:
Write from R to Teradata in 3.0 OR Export data frame to SQL server using RODBC package OR How to quickly export data from R to SQL Server.
However, since Teradata is structured somewhat differently from MS SQL Server, most of the options suggested there do not apply to my situation.
I know there is a teradataR package available, but it has not been updated for 2-3 years.
So here are the 2 main problems I am facing:
1. How to bulk load (all records at once) data in .txt format into Teradata using R, if there is any way. (So far I have only managed to do this with SAS, but I need to explore it in R.)
2. The data is big (500+ MB), so I cannot comfortably load it into R; I am sure there is a way to work around this by pulling the data directly from the server.
Here is what I tried, following one of those posts, but it was for MS SQL Server:
toSQL = data.frame(...)  # this doesn't work for me because the data is too big
write.table(toSQL, "C:\\export\\filename.txt", quote = FALSE, sep = ",",
            row.names = FALSE, col.names = FALSE, append = FALSE)
sqlQuery(channel, "BULK
  INSERT Yada.dbo.yada
  FROM '\\\\<server-that-SQL-server-can-see>\\export\\filename.txt'
  WITH
  (
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\\n'
  )")
*Note: there is an option in Teradata to insert/import data, but that is the same as writing millions of rows of INSERT statements.
Sorry that I do not have sample code at this point, since the package I found wasn't the right one to use.
Has anyone had similar issues/problems?
Thank you so much for your help in advance!
I am not sure if you have figured out how to do this yet, but I second Andrew's solution: if you have Teradata installed on your computer, you can easily run the FastLoad utility from the shell.
So I would:
export my data frame to a .txt file (comma separated)
create my FastLoad script and reference the exported .txt file from within the FastLoad script (you can learn more about it here)
run the shell command referencing my FastLoad script.
setwd("pathforyourfile")
write.table(mtcars, "mtcars.txt", sep = ",", row.names = FALSE,
            quote = FALSE, na = "NA", col.names = FALSE)
shell("fastload < mtcars_fastload.txt")
I hope this sorts out your issue. Let me know if you need help, especially with the FastLoad script; more than happy to help.
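To make that concrete, here is a rough sketch that writes a minimal FastLoad script from R and then runs it, loading three columns of mtcars. The TDPID, credentials, database and table names are all placeholders, the target table must already exist and be empty, and the exact FastLoad syntax should be checked against Teradata's documentation:

# export three columns of mtcars as comma-separated text
write.table(mtcars[, c("mpg", "cyl", "hp")], "mtcars.txt", sep = ",",
            row.names = FALSE, col.names = FALSE, quote = FALSE)

# generate a minimal FastLoad script; every name below is a placeholder
script <- c(
  "LOGON tdpid/username,password;",
  "SET RECORD VARTEXT \",\";",
  "DEFINE mpg (VARCHAR(20)), cyl (VARCHAR(20)), hp (VARCHAR(20))",
  "FILE = mtcars.txt;",
  "BEGIN LOADING mydb.mtcars ERRORFILES mydb.mtcars_e1, mydb.mtcars_e2;",
  "INSERT INTO mydb.mtcars VALUES (:mpg, :cyl, :hp);",
  "END LOADING;",
  "LOGOFF;"
)
writeLines(script, "mtcars_fastload.txt")
shell("fastload < mtcars_fastload.txt")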

Read a PostgreSQL local file into R in chunks and export to .csv

I have a local file in PostgreSQL format that I would like to read into R in chunks and export as .csv.
I know this might be a simple question, but I'm not at all familiar with PostgreSQL or SQL. I've tried different things using R libraries like RPostgreSQL, RSQLite and sqldf, but I couldn't get my head around it.
If your final goal is to create a CSV file, you can do it directly using PostgreSQL.
You can run something like this:
COPY my_table TO 'C:\my_table.csv' DELIMITER ',' CSV HEADER;
Sorry if I misunderstood your requirement.
The requirement is to programmatically create a very large .csv file from scratch and populate it with data from a database? I would use the following approach.
Step 1 - isolate the database data into a single table with an auto-incrementing primary key field. Whether you always use the same table, or create and drop one each time, depends on the possibility of concurrent use of the program.
Step 2 - create the .csv file with your programming code. It can either be empty, or have column headers, depending on whether or not you need column headers.
Step 3 - get the minimum and maximum primary key values from your table.
Step 4 - set up a loop in your programming code using the values from Step 3 (a sketch in R follows these steps). Inside the loop:
query the table to get x rows
append those rows to your file
increment the variables that control your loop
Step 5 - Do whatever you have to do with the file. Don't try to read it with your programming code.
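In R, that loop might look roughly like the sketch below. It rests on assumptions: the data has been restored into a running PostgreSQL server, the table is called my_table, and the auto-incrementing key from Step 1 is called id (both names are hypothetical):

library(DBI)
library(RPostgres)

con <- dbConnect(RPostgres::Postgres(), dbname = "mydb")  # hypothetical connection details

# Step 3: find the key range to iterate over
bounds <- dbGetQuery(con, "SELECT MIN(id) AS lo, MAX(id) AS hi FROM my_table")
lo <- as.numeric(bounds$lo)
hi <- as.numeric(bounds$hi)

chunk <- 10000
first <- TRUE
for (start in seq(lo, hi, by = chunk)) {
  rows <- dbGetQuery(con, sprintf(
    "SELECT * FROM my_table WHERE id >= %.0f AND id < %.0f", start, start + chunk))
  # Steps 2 and 4: write column headers on the first pass only, then append
  write.table(rows, "my_table.csv", sep = ",", row.names = FALSE,
              col.names = first, append = !first)
  first <- FALSE
}
dbDisconnect(con)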
