Mongolite Showing No Data After Connecting to MongoDB - R

I'm pretty new to Mongolite, and I'm trying to query my database, which is hosted on a remote server. However, I'm running into an issue where all of my queries return no data.
To connect, I run the following code:
con <- mongo(db="terrain", url="mongodb://22.92.59.149:27017")
After that, I run the following code, and get this output:
con$count('{}')
0
con$find('{}')
data frame with 0 columns and 0 rows
When I export the database from the command line as a CSV, it exports exactly the same file that was imported via the command line, so as far as I can tell the problem is on my end. Additionally, I believe I am looking in the correct location, because when I run listDatabases, I get a list of all of the databases:
admin <- mongo(db="admin", url="mongodb://22.92.59.149:27017")
admin$run('{"listDatabases":1}')
This segment of code outputs a list of all of the databases, their size on disk, and whether they are empty or not. This is the same result you get when querying the DB names from the command line.
What am I missing here? I'm sure it's something very simple.
Thanks in advance!
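One detail worth ruling out, as a hedged sketch: mongolite's mongo() defaults to a collection named "test" when no collection argument is supplied, so a connection opened with only db= may be counting an empty collection. The collection name "elevation" below is purely hypothetical.
library(mongolite)
# List the collections that actually exist inside the "terrain" database
terrain <- mongo(db = "terrain", url = "mongodb://22.92.59.149:27017")
terrain$run('{"listCollections": 1}')
# Connect to a specific collection ("elevation" is a made-up name) and re-check
con <- mongo(collection = "elevation", db = "terrain",
             url = "mongodb://22.92.59.149:27017")
con$count('{}')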

Related

Can't load data to make dataframes and query with SQL

I am new to R and have worked with MSSQL for over a year now. I need to import an Excel file with two sheets into RStudio so I can make data frames and write queries. I'm getting an error and can't figure out why.
Can anyone here spot the line(s) causing the error? I've added two screenshots, one with the code and one with the sheets.
The sheets that need to be imported:
The code:
R cannot read your query directly; you have to write your query inside a dbGetQuery() call:
dbGetQuery(con, "SELECT * FROM covid")
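Since the screenshots aren't visible here, below is a minimal self-contained sketch of the whole flow, using an in-memory SQLite database as a stand-in for the real connection; the file name "covid.xlsx" and the sheet names are assumptions.
library(readxl)
library(DBI)
library(RSQLite)

# Read each sheet of the workbook into its own data frame (names are assumed)
covid        <- read_excel("covid.xlsx", sheet = "covid")
vaccinations <- read_excel("covid.xlsx", sheet = "vaccinations")

# Load the data frames into an in-memory database so they can be queried with SQL
con <- dbConnect(RSQLite::SQLite(), ":memory:")
dbWriteTable(con, "covid", covid)
dbWriteTable(con, "vaccinations", vaccinations)

# Queries go through dbGetQuery(), which returns a data frame
result <- dbGetQuery(con, "SELECT * FROM covid")
head(result)
dbDisconnect(con)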

write_ods keeps writing an empty data frame to the directory?

I am trying to export a data frame into an .ods sheet; however, it is not working correctly.
I have tried exporting the information to many different directories, but all have failed. I have also been able to use read_ods in the correct way. When using write_ods I get zero errors, yet the file I am writing is always empty when I open it.
print(final)
write_ods(x = final, path = "C:/Users/Administrator/Desktop/SpendingOptimizerStreamlined/CoeffSheet.ods")
temp <- read_ods(path = "C:/Users/Administrator/Desktop/SpendingOptimizerStreamlined/CoeffSheet.ods")
print(temp)
I have printed the final data frame that I want to export and it is full of the correct data.
I then write to the directory and get no errors.
To confirm that the previous command worked correctly, I then read in the previously exported data.
Then I print the data out and see zero columns and zero rows.
I'm not quite sure why this keeps happening. Is the write_ods command still supported, or am I just doing something wrong? Thank you!
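As a quick sanity check, a minimal round trip with a built-in data set and a short relative path can help isolate whether write_ods itself or the Desktop path is the problem; the file name below is arbitrary.
library(readODS)
# Write and immediately re-read a known data frame in the current working directory
write_ods(mtcars, path = "roundtrip_check.ods")
check <- read_ods("roundtrip_check.ods")
str(check)   # should show 32 obs. of 11 variables if the round trip worked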

U-SQL An output statement must have at least one local run error

I'm trying to play a little bit with U-SQL. I want to run a script locally, but I'm getting this error: "An output statement must have at least one local run error". I put my input file in the data root directory, and I just want to extract one column into a new file. It's a simple script, just to see how it works.
Did I miss some step here?
U-SQL scripts can either do DDL operations, like CREATE TABLE, or data output operations, in which case they must have an OUTPUT statement, something like this:
// Output results
OUTPUT #output
TO "/output/output.csv"
USING Outputters.Csv();

Bulkload option exporting data from R to Teradata using RODBC

I have done a lot of research on how to upload huge .txt data from R to a Teradata DB. I tried to use RODBC's sqlSave(), but it did not work. I also followed some other similar questions posted, such as:
Write from R to Teradata in 3.0 OR Export data frame to SQL server using RODBC package OR How to quickly export data from R to SQL Server.
However, since Teradata is structured somewhat differently from MS SQL Server, most of the suggested options are not applicable to my situation.
I know that there is a teradataR package available, but it has not been updated in 2-3 years.
So here are the 2 main problems I am facing:
1. How to bulk load (all records at once) data in .txt format into Teradata using R, if there is any way. (So far I have only tried doing this with SAS, but I need to explore it in R.)
2. The data is big (500+ MB), so I cannot load it through R; I am sure there is a way to get around this by pulling the data directly from the server.
Here is what I tried, following one of those posts, but it was for MS SQL Server:
toSQL = data.frame(...) # this doesn't work for me because the data is too big
write.table(toSQL,"C:\\export\\filename.txt",quote=FALSE,sep=",",row.names=FALSE,col.names=FALSE,append=FALSE);
sqlQuery(channel,"BULK
INSERT Yada.dbo.yada
FROM '\\\\<server-that-SQL-server-can-see>\\export\\filename.txt'
WITH
(
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\\n'
)");
*Note: there is an option in Teradata to insert/import data, but that is the same as writing millions of INSERT statements.
Sorry that I do not have sample code at this point, since the package I found wasn't the right one to use.
Has anyone had similar issues/problems like this?
Thank you so much for your help in advance!
I am not sure if you have figured out how to do this yet, but I second Andrew's solution. If you have Teradata installed on your computer, you can easily run the FastLoad utility from the shell.
So I would:
export my data frame to a .txt file (comma separated)
create my FastLoad script and reference the exported .txt file from within the FastLoad script (you can learn more about it here)
run the shell command referencing my FastLoad script.
setwd("pathforyourfile")
write.table(mtcars, "mtcars.txt", sep = ",", row.names = FALSE, quote = FALSE, na = "NA", col.names = FALSE)
shell("fastload < mtcars_fastload.txt")
I hope this sorts out your issue. Let me know if you need help, especially with the FastLoad script. More than happy to help.
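For what it's worth, here is a rough, untested sketch of generating such a FastLoad script from R. The server name, credentials, database, and error-table names are all placeholders, and only three mtcars columns are used to keep the DEFINE list short; FastLoad also expects the target table to already exist and to be empty.
# Build the FastLoad script as plain text (all names below are placeholders)
fastload_script <- c(
  "LOGON tdserver/user,password;",
  'SET RECORD VARTEXT ",";',
  "DEFINE mpg (VARCHAR(20)), cyl (VARCHAR(20)), hp (VARCHAR(20))",
  "FILE = mtcars.txt;",
  "BEGIN LOADING mydb.mtcars_td ERRORFILES mydb.mtcars_e1, mydb.mtcars_e2;",
  "INSERT INTO mydb.mtcars_td VALUES (:mpg, :cyl, :hp);",
  "END LOADING;",
  "LOGOFF;"
)
writeLines(fastload_script, "mtcars_fastload.txt")

# Export only the columns defined above, then hand the script to fastload
write.table(mtcars[, c("mpg", "cyl", "hp")], "mtcars.txt",
            sep = ",", row.names = FALSE, col.names = FALSE, quote = FALSE)
shell("fastload < mtcars_fastload.txt")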

RODBC error: sqlSave unable to append to table

We hit an error with RODBC's sqlSave command. We are a bit confused about what to do, since the same sqlSave command works when the data we are trying to save to the Sybase database is small (under ~10,000 rows). When trying to save bigger data (~200,000 rows), the saving process starts without any problems, but it crashes after a few thousand rows have been saved. Then we hit this error message: "unable to append to table.."
We use this kind of code:
library(RODBC)
channel <- odbcConnect("linfo-test", uid="DBA", pwd="xxxxxx", believeNRows=FALSE)
sqlSave(channel=channel, dat=matkat, tablename = "testitaulu", append = TRUE)
odbcClose(channel)
If someone has any idea why this happens only with bigger data, and how we could fix it, we would be extremely grateful. We are out of ideas ourselves.
sqlSave with append=TRUE pretty much never works. You will have to explicitly write an SQL INSERT INTO statement, which is unfortunate. Sorry for the bad news.
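A naive sketch of that explicit INSERT INTO approach, reusing the DSN, table, and data frame names from the question; every value is sent as a quoted string and nothing is escaped, so real code would need proper type and quote handling.
library(RODBC)
channel <- odbcConnect("linfo-test", uid = "DBA", pwd = "xxxxxx", believeNRows = FALSE)

# Build and run one INSERT statement per row of the data frame
for (i in seq_len(nrow(matkat))) {
  values <- paste0("'", unlist(matkat[i, ], use.names = FALSE), "'", collapse = ", ")
  sqlQuery(channel, sprintf("INSERT INTO testitaulu VALUES (%s)", values))
}
odbcClose(channel)
Row-by-row inserts will be slow for ~200,000 rows, so batching several rows per statement (if the database accepts it) or a bulk-load utility is worth considering.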
sqlSave does work, but you have to be really careful with it.
You need to remember:
in R, the column names of your data frame and the SQL table have to match EXACTLY; remember that R is case sensitive, and even a leading or trailing space can make a difference
make sure that your data types in R and in the SQL table match
make sure that you are not missing any column in the insert (even if it has a default value in SQL)
the connection has to be set up properly; if the SQL user doesn't have permission to read and write to the destination table (used in sqlSave), it can also fail
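A quick way to check the first two points from R, reusing the channel, table, and data frame names from the question (sqlColumns describes the destination table):
# Compare the data frame's column names against the destination table's columns
tbl_cols <- sqlColumns(channel, "testitaulu")$COLUMN_NAME
setdiff(names(matkat), tbl_cols)   # data frame columns missing from the table
setdiff(tbl_cols, names(matkat))   # table columns not supplied by the data frame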
