R package extrafont::font_import() not finishing

As I was writing up this question, I tricked myself into finding the (now fairly obvious) solution. However, since it caused me a lot of confusion, I figured I would leave this up to save other people time. I can't write my own answer, so if someone else wants to answer it I will mark it. And no hard feelings if someone wants to mark this as obvious/duplicate/not a question.
Basically, I "couldn't" see the y/n prompt because I wasn't looking at the console (I was doing this in R Notebook for no particular reason), and the function was endlessly waiting for me to respond. After I figured this out, font_import() finished after 5 minutes. Hope this is helpful to someone.
Here is my question:
I'm trying to set up the extrafont package in R so I can use Times New Roman in ggplot. I am using this as a reference: How to change font of ggplot into Times New Roman (on OS X)?
Here is what I have tried:
library(extrafont)
#font_import()
#font_import(pattern = "TIMES")
font_import(paths="C:\\Windows\\Fonts")
I let the first one run for 5 hours last night, and I tried the other ones this morning to maybe keep it from doing unnecessary work. None of them finished. However, I think something is going wrong for two reasons: 1) I don't get the prompt shown here: https://rdrr.io/cran/extrafont/man/font_import.html, and 2) my CPU and RAM utilization don't indicate that my computer is actually working on anything that should take some time.

The font_import function gives a prompt on the console that needs to be answered with y or n. Hence, it was waiting for a response, and nothing happened.
(Posting answer since OP cannot post themselves.)

FYI, I had a similar issue (newbie to R), until I figured out that there was a prompt in the console.
If you want to avoid having to wait and answer in the console every time, set
prompt=FALSE
https://www.rdocumentation.org/packages/extrafont/versions/0.18/topics/font_import
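For example, a minimal sketch (the Windows font path is the one from the question; loadfonts() is the usual follow-up step to register the fonts for plotting on Windows):
library(extrafont)
font_import(paths = "C:\\Windows\\Fonts", prompt = FALSE)  # skips the interactive y/n confirmation
loadfonts(device = "win")  # register the imported fonts with the Windows graphics device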

Related

RStudio: My code now runs many times slower than it did before on the same computer

I'm looking for some advice, please. After about 6 months I came back to code I wrote that, at the time, took around 30 minutes to finish. Now when I run it, it's way slower. It looks like it could take days. The hardware hasn't changed since then. I'm using Windows 10; I updated RStudio to the current version (2022.07.2 Build 576), but I didn't update R, which is still "4.1.2 (2021-11-01)".
I noticed that, in contrast to before, RStudio is now not using more than around 400 MB of RAM. Before it was much more. I'm not running any other software and there is plenty of RAM available.
I had an idea that antivirus might be causing this, even though I didn't change any settings. I added RStudio and R to its exceptions, but that didn't change anything.
I also updated RStudio from the previous version, which didn't help.
Please, does anyone have an idea what could be causing this? Sorry if the description is not optimal; it's my first post here and I'm not a programmer, I just use R for data analysis for my biology-related diploma thesis.
Thanks a lot!
Daniel

RStudio keeps on running code chunk with no output

I was running a spatstat envelope to generate simulation samples; however, it got stuck and did not run. So I attempted to close the application, but that failed.
RStudio diagnostic log
Additional error message:
This application has requested the Runtime to terminate it in an
unusual way. Please contact the application's support team for more
information
There are several typing errors in the command shown in the question. The argument rank should be nrank and the argument glocal should be global. I will assume that these were typed correctly when you ran the command.
Since global=TRUE this command will generate 2 * nsim = 198 realisations of a completely random pattern and calculate the L function for each one of them. In my experience it should take only a minute or two to compute this, unless the geometry of the window is very complicated. One hour is really extraordinary.
So I'm guessing either you have a very complicated window (so that the edge correction calculation is taking a long time) or that RStudio is hanging somehow.
Try setting correction="border" or correction="none" and see if that makes it run faster. (These are the fastest choices.) If that works, then read the help for Lest or Kest about edge corrections, and choose an edge correction that you like. If not, then try running the same command in R instead of RStudio.
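For illustration, a minimal sketch of the corrected call under those assumptions (nsim = 99 matches the 2 * nsim = 198 realisations mentioned above; the point pattern pp is a placeholder, since the full original command is not shown):
library(spatstat)
E <- envelope(pp, Lest, nsim = 99, nrank = 1, global = TRUE, correction = "border")
plot(E)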

Speed up performance of R Script, Performance changes between runs

I have a script that I want to run a number of times (5,000-10,000). The run time is usually around 0.10 seconds, but sometimes it goes up to 1-2 seconds and other times even up to 7 seconds. It's not that common for it to take this long, but I would like to know why this could happen.
I have a script-file calling other script-files. My only warnings are these:
"closing unused connection #NUMBER", which I'm trying to fix.
I try to use rm() in the end of each script-file.
My script writes and reads to some files, xml and txt.
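For reference, here is a minimal sketch of where that warning usually comes from: a connection that is opened but never explicitly closed gets cleaned up later by the garbage collector, which is when the warning prints. The file name is just a placeholder.
con <- file("results.txt", open = "w")  # placeholder file name
writeLines("some output", con)
close(con)  # closing explicitly avoids the "closing unused connection" warning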
Does anyone have any idea what the problem could be? I know it's hard, but maybe someone has experience with this (the time a script takes changing between runs).
I would also appreciate any tips on how I can track down the problem. I'm a bit of a beginner at this; maybe there's a good guide to debugging in R?
Thanks!

Naming of Plot Commands in Sage

I've started teaching myself Sage and I'm a bit confused about the naming of some commands in graphics. The most basic command for graphics is perhaps plot, with its variants polar_plot, contour_plot, etc. However, I've also seen some variants of plot that are obtained by adding a postfix to it, for instance, plot_vector_field.
Does anyone know the reason why some graphical commands belong to the first category (prefix_plot) and some to the second (plot_postfix)? I'm asking because if there is a good reason for this, it can help me remember the names more easily, and if there is no special reason, this might be something to suggest as a change in future releases of Sage, as it is open source.
PS: This is my first question on Stack Overflow, and I hope this is the right place to ask it; otherwise, please feel free to move it anywhere you feel it might belong.

Why is R slowing down as time goes on, when the computations are the same?

So I think I don't quite understand how memory works in R. I've been running into problems where the same piece of code gets slower later in the week (using the same R session - sometimes even when I clear the workspace). I've tried to develop a toy problem that I think reproduces the "slowing down" effect I have been observing when working with large objects. Note the code below is somewhat memory intensive (don't blindly run it without adjusting n and N to match what your setup can handle). Note that it will likely take about 5-10 minutes before you start to see the slowing down pattern (possibly even longer).
N=4e7 #number of simulation runs
n=2e5 #number of simulation runs between calculating time elapsed
meanStorer=rep(0,N);
toc=rep(0,N/n);
x=rep(0,50);
for (i in 1:N){
  if(i%%n == 1){tic=proc.time()[3]}
  x[]=runif(50);
  meanStorer[i] = mean(x);
  if(i%%n == 0){toc[i/n]=proc.time()[3]-tic; print(toc[i/n])}
}
plot(toc)
meanStorer is certainly large, but it is pre-allocated, so I am not sure why the loop slows down as time goes on. If I clear my workspace and run this code again, it starts just as slow as the last few calculations! I am using RStudio (in case that matters). Also, here is some of my system information:
OS: Windows 7
System Type: 64-bit
RAM: 8 GB
R version: 2.15.1 ($platform yields "x86_64-pc-mingw32")
Here is a plot of toc, prior to using pre-allocation for x (i.e. using x=runif(50) in the loop)
Here is a plot of toc, after using pre-allocation for x (i.e. using x[]=runif(50) in the loop)
Is ?rm not doing what I think it's doing? What's going on under the hood when I clear the workspace?
Update: with the newest version of R (3.1.0), the problem no longer persists, even when increasing N to N=3e8 (note R doesn't allow vectors too much larger than this).
It is quite unsatisfying that the fix is just updating R to the newest version, because I can't figure out why there were problems in version 2.15. It would still be nice to know what caused them, so I am going to leave this question open.
As you state in your updated question, the high-level answer is that you were using an old version of R with a bug; with the newest version of R (3.1.0), the problem no longer persists.
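As a side note on the pre-allocation point discussed in the question, a minimal sketch of the difference between the two assignments used for x:
x <- rep(0, 50)   # pre-allocate once
x <- runif(50)    # rebinds x to a freshly allocated vector on every iteration
x[] <- runif(50)  # fills the existing vector in place, reusing its memory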
