Why would an R Shiny app run indefinitely in RStudio?

I wanted to learn how to build Shiny apps in R, so I started this beginner-level tutorial. However, when I run the app on my desktop (Windows 10 x64, 16GB RAM, 500GB SSD, i5-3470 CPU, dual display) in RStudio 1.3.1093 using R 4.0.3, it loads indefinitely with no error output. I tried running even the basic built-in examples (which you can find here), and they also fail to load. The exact same scripts and examples run on my laptop (Windows 10 x64, 8GB RAM, 250GB SSD; R and RStudio versions the same) without issue. I've reinstalled the shiny package, reinstalled R and RStudio, and changed whether the app runs internally or externally, with no success. I did find this post, which seems to describe the same issue, but with no solution.
I know it's not much to go on, but I'm at a loss as to the next thing I should check. Any suggestions?
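For reference, the built-in examples mentioned above are launched with shiny::runExample(); a minimal check along those lines (the specific example name is just an illustration, not taken from the question) would be:

library(shiny)

# Launch one of the demo apps bundled with the package; per the question,
# these also load indefinitely on the affected desktop
runExample("01_hello")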

I figured out from this mostly unrelated post that there was a file at the path C:/Users/.../Documents/R/win-library/4.0/ called 00LOCK which was preventing R from downloading and updating packages. I'm not sure how it got there or why R was not telling me there were issues updating the packages, but the Shiny app seems to work perfectly fine now.
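For anyone hitting the same thing, a minimal sketch of how one might find and clear a leftover 00LOCK entry from inside R, assuming it sits in the first entry of .libPaths():

# The user package library where 00LOCK showed up (assumed to be the first
# entry of .libPaths(), e.g. .../Documents/R/win-library/4.0)
lib <- .libPaths()[1]

# Any leftover lock entries from interrupted package installs
locks <- list.files(lib, pattern = "^00LOCK", full.names = TRUE)
locks

# Remove them so install.packages()/update.packages() can run cleanly again
unlink(locks, recursive = TRUE)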

Related

Downloaded newest RStudio (1.4.1717), and R files are no longer associated with RStudio and I can't find RStudio in my Programs folder

Also on a Windows 10 machine, with R version 4.1.0. Since I can't find RStudio on my machine when Windows asks what program I want to use to open my code files, I just have to go through the File menu to open the code file. It's obviously not a big deal, but quite annoying. One of my coworkers is having the same problem after downloading the new RStudio version too, so I'm wondering if it's a bug... or if I'm the bug.

RStudio Project creation on Windows network share issue

RStudio 1.2.5033 and 1.3.1073 are crashing when creating standard new projects (although not R package projects) on "some" Windows network share drives.
As of September 2020, this is expected to be fixed in RStudio's next Boost update. See https://stackoverflow.com/a/63738420/1216790 for a similar (or the root) cause of the issue, and https://github.com/rstudio/rstudio/issues/7716#issuecomment-686641326 regarding the expected solution.

Crashing when knitting R Markdown under Linux

Ubuntu 16.04 LTS, R version 3.4.3, RStudio version 1.1.383
I'm playing around with and learning R Markdown this afternoon. I am not doing any intensive data analysis. I am knitting my R Markdown into HTML with the following command: rmarkdown::render("document.Rmd").
About every half hour my Ubuntu GNOME session almost totally freezes. I can sort of move the mouse cursor around, and every several minutes I'm presented with a brief window of time where the computer works again before going back into a deep freeze. I'm not running any other programs.
I've kept my System Monitor open and notice that rsession and rstudio usually use ~200 MiB of memory. When the computer freezes, rsession rises to ~4 GiB, and this happens directly after I issue the rmarkdown::render("document.Rmd") command in RStudio.
I ran sudo apt-get update and sudo apt-get upgrade. What else should I do? Do I update the Linux kernel? Upgrade RStudio? Submit a bug report? Is this a memory leak (and what is that)?
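Not an answer from the thread, but one workaround sketch (assuming the callr package is installed) is to knit in a separate background R process, so the interactive session stays responsive and the render can be stopped if its memory starts ballooning:

library(callr)

# Run the render in its own R process instead of the interactive rsession
p <- r_bg(
  function(input) rmarkdown::render(input),
  args = list("document.Rmd")
)

p$is_alive()   # check whether the render is still running
# p$kill()     # stop it if memory use starts climbing again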

Fixing pandoc "out of memory" error when running the profvis R package

I'm trying to use the profvis package to do memory profiling of a large job in R (64-bit), running under RStudio on Windows 7. profvis keeps crashing, and I get an error message saying that Pandoc is out of memory. The message is copied below.
My understanding, and please correct me if this is wrong, is that the problem is likely to go away if I can set the /LARGEADDRESSAWARE switch on Pandoc. And to do that, I need to install a linker, etc., and do my own build, after learning how to do all those things. Or there is a shortcut, involving installing MS Visual Studio, running the editbin utility, and setting the switch that way. However, a new install of Visual Studio is unhappy on my machine, and demands that I fix some unspecified problem with Windows Management Instrumentation before it will go forward.
So my question is this: Is there a way to set the /LARGEADDRESSAWARE switch on Pandoc from inside R?
I had a similar problem and was able to resolve it by following the advice at https://www.techpowerup.com/forums/threads/large-address-aware.112556/. See the attached file in that post called laa_2_0_4.zip. I downloaded it and ran the executable it contains. Basic mode was sufficient; I simply navigated to C:/Program Files/RStudio/bin/pandoc/pandoc, turned on the checkbox for Large Address Aware Flag (step 2), then did Commit Changes (step 3). After this, the profvis-invoked pandoc command ran to completion. I was able to watch pandoc's memory consumption in Task Manager rise to a peak of about 2.7 GB.
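It can also help to confirm which pandoc binary is actually being invoked before patching it. Assuming a reasonably recent rmarkdown (find_pandoc()/pandoc_exec() are not in very old versions), from inside R:

# Directory of the pandoc bundled with RStudio (RStudio sets this variable)
Sys.getenv("RSTUDIO_PANDOC")

# Where the rmarkdown toolchain locates pandoc, and the full path of the
# executable it will invoke
rmarkdown::find_pandoc()
rmarkdown::pandoc_exec()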

shiny Red Hat Enterprise Linux 5.8 specific issue

Is there a known issue with installing shiny on Red Hat Enterprise Linux 5.8? I have R and shiny code working on multiple PCs and Mac OS. My Linux farm IT/SysAdmin person says the R and shiny packages installed properly (I'm at his mercy for installs). I can run other R packages he installed.
But when I run runApp() from an R prompt, the browser fires up and the input widgets and non-reactive elements show up, but output from reactive and render blocks does not appear. There are no traceback or error messages in the R console.
This behavior is the same for the demonstration shiny code from RStudio.
I can't try URL examples served from shiny-server sites to see if those work, because external webpage browsing is turned off on this system. Thanks.
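For context, a minimal reactive app of the kind the RStudio demos use (this snippet is illustrative, not from the original post); under the symptom described, the slider would draw but the plot below it would stay blank, with nothing in the console:

library(shiny)

ui <- fluidPage(
  sliderInput("n", "Number of points", min = 10, max = 100, value = 50),  # non-reactive UI: shows up
  plotOutput("scatter")                                                   # render output: stays blank
)

server <- function(input, output) {
  output$scatter <- renderPlot({
    plot(rnorm(input$n), rnorm(input$n))
  })
}

runApp(shinyApp(ui, server))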
My IT support person did not give me full details, but he said he needed to recompile some of the base OS packages and install them in a separate location, then upgrade to Firefox 12. It now works.
