Intel Parallel Studio - InstallerCache directory multiple GBs in size

After installing Intel Parallel Studio, C:\ProgramData\Intel\InstallerCache is multiple GBs in size, and it does not seem to have been used or changed since installation.
[Example WinDirStat screenshot showing the folder's size]
Is it safe to delete this folder to save some space without otherwise affecting the functionality of Parallel Studio?

Related

Why would R Shiny app run indefinitely in R Studio?

I wanted to learn how to build Shiny apps in R, so I started this beginner-level tutorial. However, when I run the app on my desktop (Windows 10 x64, 16 GB RAM, 500 GB SSD, i5-3470 CPU, dual display) in RStudio 1.3.1093 using R 4.0.3, it loads indefinitely with no error output. I tried running even the basic built-in examples (which you can find here) and they also failed to load. The exact same scripts and examples run without issue on my laptop (Windows 10 x64, 8 GB RAM, 250 GB SSD; R and RStudio versions the same). I've reinstalled the shiny package, reinstalled R and RStudio, and changed whether the app runs internally or externally, with no success. I did find this post which seems to describe the same issue, but with no solution.
I know it's not much to go on, but I'm at a loss as to the next thing I should check. Any suggestions?
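For reference, the built-in examples mentioned above are typically launched as follows; this is only a minimal sketch, assuming the shiny package is installed.

    # Launching the example apps bundled with the shiny package.
    library(shiny)

    runExample()            # with no argument, lists the available example apps
    runExample("01_hello")  # runs the "Hello Shiny" example in the viewer/browser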
I figured out from this mostly unrelated post that there was a directory at the path C:/Users/.../Documents/R/win-library/4.0/ called 00LOCK, which was preventing R from downloading and updating packages. I'm not sure how it got there or why R was not telling me there were issues in updating the packages, but the Shiny app seems to work perfectly fine now.
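A minimal sketch of how such a stale 00LOCK directory can be located and removed, assuming it sits somewhere on the library search path:

    # Look for a stale 00LOCK directory in each library path and remove it
    # so that installing and updating packages works again.
    locks <- unlist(lapply(.libPaths(), function(lib)
      list.files(lib, pattern = "^00LOCK", full.names = TRUE)))
    if (length(locks) > 0) unlink(locks, recursive = TRUE)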

RStudio potential memory leak / background activity?

I'm having a lot of trouble working with RStudio on a new PC, and I could not find a solution searching the web.
When RStudio is open, it constantly eats up memory until it becomes unworkable. If I work on an existing project, it takes half an hour to an hour to become impossible to work with. If I start a new project without loading any objects or packages, just writing scripts without even running them, it takes longer to reach that point, but it still does.
When I first start the program, the Task Manager already shows memory usage of 950-1000 MB (sometimes more), and as I work it climbs to around 6000 MB, at which point it is impossible to work with because every action is delayed and 'stuck'. Just to compare, on my old PC the Task Manager shows 100-150 MB while working in the program. When I open the "Memory Usage Report" within RStudio, the amount "used by session" is very small while the amount "used by system" is almost at its maximum, yet RStudio is the only thing taking up system memory on the PC.
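For reference, a minimal sketch of how the session-side figure can be cross-checked from the R console using base R's gc():

    # Report how much memory the R session itself holds; column 2 of gc()'s
    # output is megabytes. If this stays small while Task Manager shows GBs,
    # the growth is in the RStudio front-end rather than the R process.
    gc_info <- gc()
    sum(gc_info[, 2])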
Things I tried: installing older versions of both R and RStudio, pausing my anti-virus program, changing compatibility mode, and setting zoom to 100%. It feels like RStudio is continuously running something in the background, since the memory usage keeps growing (and quite quickly), but maybe it is something else entirely.
I am currently using the latest versions of R and RStudio (4.1.2 and 2021.09.0-351) on a PC with an Intel i7 processor, 64-bit Windows 10, and 16 GB of RAM.
What should I look for at this point?
On Windows, there are several typical memory and CPU issues with RStudio. In this answer, I explain how the RStudio interface itself uses memory and CPU as soon as you open a project (e.g., when RStudio shows you some .Rmd files). The memory/CPU cost of the computation itself is not covered (i.e., performance issues when executing a line of code).
When working on long .Rmd files within RStudio on Windows, CPU and/or memory usage sometimes gets very high and increases progressively (e.g., because of a process named 'Qtwebengineprocess'). To solve the problem caused by long .Rmd files loaded within an RStudio session, you should:
Pay attention to the RStudio processes that consume memory while scanning your code (i.e., disable or enable options in RStudio's 'Global Options' menu). For example, try disabling inline display (Tools => Global Options => R Markdown => Show equation and image preview => Never). This post put me on the track of considering that memory/CPU leaks are sometimes due to RStudio itself, not the data or the code.
Set up a bookdown project in order to split your large .Rmd files into several smaller ones. See here.
As a bonus step, check whether any loaded packages conflict using the command tidyverse_conflicts(), but that is already a 'computing problem' (not covered here); a short sketch follows this list.
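A minimal sketch of that bonus check, assuming the tidyverse meta-package is installed:

    # List functions that are masked between loaded packages,
    # e.g. dplyr::filter() masking stats::filter().
    library(tidyverse)
    tidyverse_conflicts()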

Is a computation using R Portable less reproducible than using a container?

I am investigating ways for my group to improve the reproducibility of our analyses. The aim is that reviewers, or we ourselves in 10 years, are able to recompute our results.
My first choice would be containers using Singularity, which are basically a SquashFS with all needed files except the Linux kernel. But apart from our cluster, we work on Windows machines. Our IT does not feel equipped to support Linux VMs on every machine, nor do I expect my fellow biologists to reliably keep working inside a container inside a VM rather than circumventing the system, because a deadline is always looming.
Therefore, my next best idea is copying R Portable into each project and using renv to keep a dedicated R package library per project. Additionally, I will set the locale using Sys.setlocale in the project's .Rprofile. The working directory can be set to the project folder (necessary so that R loads the .Rprofile and users aren't tempted to use absolute paths) by using relative paths in Windows shortcuts.
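A minimal sketch of what such a project-level .Rprofile could look like; the locale string is only an example, and renv is assumed to have already been initialised in the project:

    # .Rprofile at the root of the project
    source("renv/activate.R")  # written by renv::init(); restores the project library
    Sys.setlocale("LC_ALL", "English_United States.1252")  # example Windows locale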
The R Portable download is about 200 MB, whereas, for example, the rocker/r-ver Docker image is 260 MB, so roughly the same amount of information seems to be available in the computational environment. Does this setup enable me to reproduce our analyses on other Windows machines?

Running out of space using Xamarin on my PC

I created a Xamarin app on my PC (Windows 10, Visual Studio 2019) along with an Android emulator (Pixel Pie 9.0 - API 28).
I noticed that creating XAML pages with corresponding view models and running the changes on the emulator consumes a considerable amount of space on my drive.
I am new to Xamarin/Android development, so my guess is that some caching of data (perhaps?) takes up a lot of space. For example, creating a registration form, then building and running the change on the emulator, dropped my free disk space from 13.5 GB to 10 GB.
How can I free up space on my hard drive? At this rate it feels like it will run out very quickly.
Some of it is cache (mostly downloaded NuGet packages), but a lot of it is also just the compiled app (the debug version of which is often much larger than a normal 'release' build). The emulator also consumes some space to create and run, which is unavoidable: installing the Android SDK and emulator images does not create the actual emulator disk image until you run the emulator itself.
In general, you should not expect disk usage to keep increasing at that rate unless you run multiple versions of the emulator; just adding new code to the same project shouldn't change the size much.
To free space temporarily, you can always delete the obj and bin folders in your project, but they will be recreated as soon as you build it again.

The file is too large - why does the file size limit differ on each computer?

There is a file size limit implemented in RStudio. I have an R file with a size of 3.3 MB. On my computer the limit is set to 5 MB, so I can open the file without any problems. However, I just tried to open the file on another computer where the limit is set to 2 MB, and I receive the error message: The file xxx is too large to open in the source editor.
I am wondering why the file size limit differs between computers. Could it be because of the version of R? I am using version 3.2.1, and the other computer uses version 3.1.2.
Question 1: Why does RStudio have different file size limits on different computers?
Question 2: If it is because of the version of R, is there a list of file size limits for each version of R available?
