I'm trying to look at some data in RStudio using View(). The code runs smoothly, but for some reason the display looks off: the column labels are slightly misaligned from the rest of the column. See the image here: https://ibb.co/gTJFCj0
Scrolling is also incredibly slow. It takes close to 5 seconds for it to respond at all.
I'm new to R, so I have absolutely no clue where to even begin looking. I'm guessing it may be that RStudio is trying to display everything at once?
I have no idea what's going on; I never had this problem before. I'm working on a new project, and whenever I call grid.arrange, or any kind of plot in a loop of plots, R displays the plots from my last project (the last time I used grid.arrange), which I obviously don't want anymore. It refuses to display all of the plots together, and I have to go through them one by one. It doesn't make sense, since my computer has been shut down and restarted with new updates since the last time I worked on that project. I tried rm=ls() and I still get the same problem. Help :/
The function is rm(list=ls()), not rm=ls(). You could also try file.remove(".RData") in the working directory.
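A minimal sketch of both suggestions, run from the project's working directory:

# Clear all objects from the global environment
# (note the list = ls() argument; rm = ls() is not valid R)
rm(list = ls())

# Delete the saved workspace so stale objects are not reloaded
# the next time the project opens
if (file.exists(".RData")) file.remove(".RData")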
I am trying to run my code to create a nice transition_reveal for my line graphs.
The data I've got is very large as it is daily data over 20 years for about 130 different variables.
When I run my code I sometimes get the following error:
Sometimes this error happens; sometimes the code runs successfully, but only if I cut the data into smaller parts, and they have to be very small parts. If I do that, then since it is an animation I'd have to create overlap between the parts, and it gets complicated. I'd much prefer to run the whole thing. I don't mind if it takes hours; I can do other things.
But it doesn't make sense... it's not as if my RAM is storing all the data at the same time; it only stores what it needs before replacing it. Therefore, it should never fill up. Here is an image of my Task Manager while running the code:
The RAM usually fills up to about 95%, sometimes going lower and sometimes higher. Then, seemingly by random chance, it hits my maximum at 100% and the code just fails.
This is why splitting my data into 20 parts is difficult: I can't loop over them, because there is always a chance that even a small part will hit 100% RAM and cause an error.
I don't know if I'm doing anything wrong. I don't think buying more RAM would solve the problem. Maybe there is a way to let R use my SSD as RAM as well, but I don't know how to do this.
Any help would be much appreciated. Thanks.
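For context, here is a minimal sketch of the kind of transition_reveal code described above; df, date, value, and series are placeholder names, not the actual data. Since gganimate renders every frame before assembling the animation, lowering nframes and the frame dimensions in animate() is one way to reduce peak memory use:

library(ggplot2)
library(gganimate)

# df: hypothetical daily data over ~20 years, one row per date per series
p <- ggplot(df, aes(x = date, y = value, colour = series)) +
  geom_line() +
  transition_reveal(date)

# Fewer, smaller frames lower the peak RAM needed while rendering
animate(p, nframes = 100, width = 800, height = 600)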
I found a workaround for this that actually worked better for the project I was working on, but I'm still curious.
I was creating some maps in R using the leaflet package. The code ran well, but I was using about 1.2 million rows of data and had 7 layers that I wanted to include in my map. RStudio struggled with this, so I saved the map as an HTML file, closed RStudio, and tried viewing the map in a browser. It would load after a while, but would always freeze and crash. My workaround was to create each layer as its own map. RStudio struggled a bit with this too, and the HTML files can be a bit sluggish, but everything works.
Does anyone have experience with, or thoughts on, better ways to handle large data sets like this while working with leaflet?
A second question, if you will indulge me: I couldn't figure out how to add a title to the map that wasn't overly distracting from the data I was presenting, which was about the only drawback of splitting the maps up.
You could try the leafgl package: https://github.com/r-spatial/leafgl
It uses WebGL to render leaflet maps with more than 1M points blazingly fast compared with the default leaflet methods.
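A minimal sketch along the lines of the leafgl README; the simulated points below are a stand-in for the real 1.2 million rows:

library(leaflet)
library(leafgl)
library(sf)

# Simulate ~1M points as an sf object (placeholder data)
n <- 1e6
pts <- st_as_sf(
  data.frame(x = rnorm(n, 10, 1), y = rnorm(n, 49, 0.8)),
  coords = c("x", "y"), crs = 4326
)

leaflet() %>%
  addProviderTiles(providers$CartoDB.Positron) %>%
  addGlPoints(data = pts, group = "points")  # WebGL-rendered point layer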
When viewing dataframes in the "Source" section of RStudio, the columns are frequently misaligned, making them difficult to read.
I'm not sure what is causing this, and playing around with the width of the columns doesn't seem to help. I've looked at the "known issue" of column misalignment, but as far as I could tell, it seemed to be a different error. I'm not sure if this is due to something in my settings, but nothing there seems to apply, and restarting RStudio doesn't do anything either.
This happens when clicking on the object in the Data section of the environment, which prints the following to the console:
View(cps_tiers)
At this point, the dataframe loads fine, but the columns are hard to read. Here is an image of what I'm describing:
Calum You was correct, I updated RStudio and the issue seems to have gone away. Thanks for the help!
One quick question to which I could not find an answer.
I want to know if it is possible to set the title of the R console to something else (using RGui, on Windows).
The main use I'm thinking of is to show some kind of progress information when running a script which takes a long time to complete.
Any suggestions?
On Windows you can use the setWindowTitle function; the name that you give it will show up at the top of the window, or be the label on the icon when it is minimized.
I have the following line in my .Rprofile:
utils::setWindowTitle(getwd())
So that each instance of R has a label showing which folder/directory it was opened in (I often have several open at a time that I switch between as I work on different projects). This is nice when starting R by double-clicking on the .RData file and for keeping track of which window is which.
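As a rough sketch of the progress idea (Windows RGui only; the loop body is a stand-in for the real work):

n <- 1000
for (i in seq_len(n)) {
  # ... long-running work here ...
  utils::setWindowTitle(sprintf("progress: %d%%", round(100 * i / n)))
}
utils::setWindowTitle(getwd())  # restore the title when done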
But for indicating the progress of a long-running process, progress bars are probably the better approach. On Windows you can use winProgressBar, or on any platform you can use txtProgressBar or tkProgressBar (the tcltk package is needed for the latter). The growing bar gives a quick visual of the progress, and you can also use the label to give the specific iteration or other information.
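A minimal cross-platform sketch using txtProgressBar, with Sys.sleep standing in for the actual computation:

n <- 50
pb <- txtProgressBar(min = 0, max = n, style = 3)
for (i in seq_len(n)) {
  Sys.sleep(0.1)            # placeholder for the real work
  setTxtProgressBar(pb, i)  # advance the bar to step i
}
close(pb)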