Converting df to csv format, without creating a file - r

I am creating a process to convert API data into a df (a pandas DataFrame).
My problem is:
The data only looks correct after I export it to a CSV file with df.to_csv("df.csv", sep=','). If I don't do that, the first column appears as one big list of data.
Is there a way to convert to CSV format without creating an external file?

From the documentation of DataFrame.to_csv:
path_or_buf : string or file handle, default None
File path or object, if None is provided the result is returned as a
string.
So simply doing:
csv_string = df.to_csv(None, sep=",")
gives you a string containing a CSV representation of your DataFrame, without creating an external file.

Related

Converting RData to CSV file returns incorrect CSV file

I do not have any expertise in R, and I have to convert RData files to CSV to analyze the data. I followed these links to do it: Converting Rdata files to CSV and "filename.rdata" file Exploring and Converting to CSV. The second option seemed simpler, as I failed to understand the first one. This is what I have tried so far, along with the results:
>ddata <- load("input_data.RData")
>print(ddata)
[1] "input_data"
> print(ddata[[1]])
[1] "input_data"
> write.csv(ddata,"test.csv")
From the first link I learnt that we can see the RData type, and when I did str(ddata) I found out that it is a list of size 1. Hence, I checked whether print(ddata[[1]]) would print anything apart from just "input_data". With write.csv I was able to write it to a CSV without any errors, but the CSV file contains just the following two lines:
"","x"
"1","input_data"
Can you please help me understand what I am doing wrong and show me a way to get all the details into a CSV?
The object ddata contains the name(s) of the object(s) that load() created. Try typing the command ls(). That should give you the names of the objects in your environment. One of them should be input_data; that is the object you want. If it is a data frame (check with str(input_data)), you can create the CSV file with
write.csv(input_data, "test.csv")
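Put together, the whole flow looks roughly like the sketch below (a minimal example assuming, as in the question, that the RData file contains a single data frame named input_data):
ddata <- load("input_data.RData")   # load() returns the *names* of the objects it creates
ddata                               # [1] "input_data"
str(input_data)                     # inspect the object that load() put in the environment
write.csv(input_data, "test.csv")   # export the data frame itself, not its name
# If the object name is not known in advance, get() looks it up by name:
# write.csv(get(ddata[1]), "test.csv")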

How to download uri data:text file using R?

I'm trying to download what I believe is called a data:text URI using R. The "URL" is of this form:
URL <- "data:text/csv;charset=utf8,Supply%2001%2F02%2F2020%2C0%3A00%2C0%3A05%2C0%3A10%2C0%3A15%2C0..."
followed by hundreds more characters. When I type the "URL" into a web browser and press enter, the desired CSV file downloads without a problem. When I try to use something like download.file or curl_download on the above string, however, I get an error like this:
Error in curl_download(URL, destfile = "test1234.csv") :
Port number ended with 't'
Any insights on how I can download a csv data: file like this using R? Thanks!
If it's any use, the file I'm trying to download is pasted below. I had to first save the string as a .txt file and then import that .txt file with read_file in order to store it as a string in R.
The data URL isn't really a normal URL: it contains all the data inside the text itself rather than pointing to the data at a different location. It is made up of two parts, a "header" and then the data itself. The header consists of "data:text/csv;charset=utf8,", and the data follows, but it has been URL (percent) encoded. You can read the data by removing the header, decoding the values, and then reading the text as a CSV file with read.csv. For example:
read.csv(text = URLdecode(gsub("^data:text/csv;charset=utf8,", "", URL)),
         check.names = FALSE)
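As a self-contained illustration, with a short made-up data: URL standing in for the much longer one in the question:
URL <- "data:text/csv;charset=utf8,a%2Cb%0A1%2C2%0A3%2C4"
read.csv(text = URLdecode(gsub("^data:text/csv;charset=utf8,", "", URL)),
         check.names = FALSE)
#   a b
# 1 1 2
# 2 3 4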

Convert Raw Vector back to Excel R

I am working with an Excel file saved in S3, and I am trying to access it using R. To get the file I am using fl <- get_object(paste(file_path,file_name), bucket = bucket). This works fine and returns the file as a raw vector. The problem I am having is that every function I have found for reading an Excel file requires an actual file (i.e. a path), not a raw vector.
Is there a way to read a raw vector (of an Excel file) into a data frame? Or, convert the raw vector back to an Excel file so I can reference that file in read_excel() or the like?
The Python code below does what I need, but for reasons far beyond my control, I must do this in R.
fl = s3.get_object(Bucket=bucket,Key= file_path + file_name)
df = pd.read_excel(fl['Body'])
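A common workaround is the second option mentioned in the question: write the raw vector back out to a temporary .xlsx file and point read_excel() at that. A minimal sketch, assuming get_object() is aws.s3::get_object() and read_excel() comes from readxl, with file_path, file_name and bucket as in the question:
library(aws.s3)     # assumed source of get_object()
library(readxl)     # read_excel()

fl <- get_object(paste(file_path, file_name), bucket = bucket)  # raw vector, as in the question
tmp <- tempfile(fileext = ".xlsx")
writeBin(fl, tmp)     # write the raw bytes back out as an .xlsx file
df <- read_excel(tmp)
unlink(tmp)           # remove the temporary file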

Is there a way to avoid creating a temporary file in R?

I have a database in which VCF files have been inserted as a blob variable. I am able to retrieve it without issue. However, I then need to pass it to various functions (VariantAnnotation, etc.) that expect a VCF file name. Is there a way to "fake" a file object to pass to these functions if I already have all the data in a character string?
I'm currently writing it out to a file so I can pass it on:
#x contains the entire vcf file as a character string
library(VariantAnnotation)           # provides readVcf()
temp_filename = tempfile(fileext = ".vcf")
writeChar(x, temp_filename)          # write the string out as a temporary .vcf file
testVcf = readVcf(temp_filename)
unlink(temp_filename)                # clean up the temporary file
This works ok, but I would like to avoid the unnecessary file I/O if possible.

Import data csv with particular quotes in R

I have a csv like this:
"Data,""Ultimo"",""Apertura"",""Massimo"",""Minimo"",""Var. %"""
"28.12.2018,""86,66"",""86,66"",""86,93"",""86,32"",""0,07%"""
What is the correct way to import this, please?
I tried with read.csv("IT000509408=MI Panoramica.csv", header=T,sep=",", quote="\"") but it doesn't work.
Each row in your file is encoded as a single csv field.
So instead of:
123,"value"
you have:
"123,""value"""
To fix this you can read the file as CSV (which will give you one field per row, without the extra quotes), and then write the full value of each field to a new file as plain text (without using a CSV writer); the resulting file can then be read as normal CSV.
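A minimal sketch of that approach, using the filename from the question (the output name fixed.csv is made up for illustration):
# Read the broken file: each line parses as one correctly unescaped field
fields <- read.csv("IT000509408=MI Panoramica.csv",
                   header = FALSE, stringsAsFactors = FALSE)[[1]]
# Write those fields back out as plain text lines (no CSV re-quoting) ...
writeLines(fields, "fixed.csv")
# ... then read the repaired file normally (dec = "," for the decimal commas)
dat <- read.csv("fixed.csv", dec = ",", stringsAsFactors = FALSE)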
