Closed 3 years ago. This question needs details or clarity and is not currently accepting answers.
How can I work with an Excel file larger than 100 MB? I have already imported it, but the Shiny app doesn't run.
If the file is in Excel format, you can save it as CSV and then read it with data.table::fread(). That is faster and easier to work with, and a CSV is lighter than an xlsx file.
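A minimal sketch of that approach, assuming the workbook has been saved as data.csv (the file name is a placeholder):

library(data.table)

# fread() detects the separator and column types automatically and is much
# faster than reading the original xlsx.
dt <- fread("data.csv")

# In a Shiny app, read the file once at startup (outside any reactive
# expression) so it is not re-read every time an input changes.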
Closed 4 days ago. This question needs to be more focused and is not currently accepting answers.
This is a broader, more general question, but I find reading .sav files into R to be a nightmare. I use the haven package, but I often run into errors because of the format in which .sav files are read (as one example of many, it refuses to let me coerce the dbl+lbl columns into a numeric data frame).
What I typically do to get around this annoying process is to just save the .sav file as a .csv, then re-read it into R, but I'm sure there's got to be a better way, right?!
n/a; this is a more general question.
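One way to avoid the CSV round trip is to strip or convert the SPSS labels after reading. A hedged sketch, assuming a file called survey.sav (the file name is a placeholder) and that dropping the value labels is acceptable:

library(haven)

dat <- read_sav("survey.sav")

# dbl+lbl columns are "labelled" vectors; zap_labels() drops the value labels
# and leaves plain numeric/character columns behind.
dat_plain <- zap_labels(dat)

# Alternatively, turn the value labels into factor levels instead of
# discarding them.
dat_factors <- as_factor(dat)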
Closed 5 days ago. This question needs details or clarity and is not currently accepting answers.
How to skip lines based on a string in R?
I'm attempting to create a data frame from a TSV file, and I need to skip all lines up to a delimiter string "[data]". So I want to know how many lines to skip in order to grab the header located below the "[data]" string.
In Python I would load the file as a string and find the line number, but I can't figure out how to do this in R.
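The same read-the-lines-and-find-the-marker approach works in R. A sketch, assuming the file is called input.tsv (a placeholder name) and that "[data]" appears exactly once:

lines <- readLines("input.tsv")

# Find the line containing the "[data]" delimiter; fixed = TRUE treats it as a
# literal string rather than a regular expression.
marker <- grep("[data]", lines, fixed = TRUE)

# The header sits on the line after the marker, so skip everything up to and
# including the marker line.
df <- read.delim("input.tsv", skip = marker, header = TRUE, sep = "\t")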
Closed 7 years ago. This question needs details or clarity and is not currently accepting answers.
Running Compile Notebook from RStudio.
I am getting:
Error: could not find function "SegNeigh"
"SegNeigh" being my own function, properly sourced; the script runs fine without R Markdown.
Any help appreciated.
Compiling the notebook runs the code in a fresh R session, so functions defined in your current global environment are not available. In order for the R Markdown document to find the function, you either need to define SegNeigh in the document itself or place it in another file and source that file explicitly from the document.
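A minimal sketch, assuming SegNeigh() is saved in a file called SegNeigh.R in the same directory as the .Rmd (both file names are placeholders):

# In an R code chunk near the top of the document:
source("SegNeigh.R")  # makes SegNeigh() available to every later chunk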
Closed 8 years ago. This question is opinion-based and is not currently accepting answers.
I'm currently trying to import a 42 MB .xlsx file into R with gdata, and it has brought my laptop churning to a halt. Would it be quicker to convert the file to CSV and then import that into R?
Importing as a CSV is orders of magnitude faster.
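A rough sketch of the two routes, assuming the workbook has been exported to data.csv (file names are placeholders and timings will vary by machine):

library(gdata)
library(data.table)

# Reading the xlsx directly with gdata goes through a Perl script and is slow
# for large files.
system.time(x1 <- read.xls("data.xlsx", sheet = 1))

# Reading the exported CSV instead is typically far faster.
system.time(x2 <- fread("data.csv"))

# readxl::read_excel("data.xlsx") is another option (not mentioned above) if
# you need to keep working from the xlsx file itself.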
Closed 8 years ago. This question needs details or clarity and is not currently accepting answers.
I have an R dataset (an .Rdata file) that I need to convert to either SAS (.sas7bdat or .xpt) or SPSS (.sav or .por). How can I import this dataset into SAS or SPSS?
If you want to use this in SPSS, consider using the STATS_GETR extension command. It can read R workspace or data files and map appropriate elements directly to an SPSS dataset. This extension command is available from the SPSS Community (www.ibm.com/developerworks/spssdevcentral) website or, for Statistics 22, it can be installed via the Utilities menu.
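If you would rather do the conversion from the R side instead, here is a hedged sketch using the haven package (file and object names are placeholders; this is an alternative route, not the extension command described above):

library(haven)

# Load the R dataset; this restores the object(s) saved in the .Rdata file.
load("mydata.Rdata")               # suppose it contains a data frame called mydata

write_sav(mydata, "mydata.sav")    # SPSS .sav
write_xpt(mydata, "mydata.xpt")    # SAS transport (.xpt) file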