I have a script that I want to run many times (5,000 to 10,000 runs). The run time is usually around 0.10 sec, but sometimes it goes up to 1-2 seconds, and other times even up to 7 sec. It's not that common for it to take this long, but I would like to know why it happens.
I have a script-file calling other script-files. My only warnings are these:
"closing unused connection #NUMBER", which I'm trying to fix.
I try to use rm() at the end of each script-file.
My script reads from and writes to some files (XML and TXT).
Does anyone have any idea what the problem could be? I know it's hard to say, but maybe someone has experience with this kind of thing (the time a script takes varying from run to run).
I would also appreciate any tips on how to track down the problem. I'm a bit of a beginner at this; maybe there's a good guide to debugging in R?
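For reference, this is roughly how I measure the run times (a simplified sketch; "main.R" is just a placeholder for my real entry script, which sources the other script-files):

# simplified timing loop; "main.R" stands in for the real entry script
n_runs <- 5000
elapsed <- numeric(n_runs)
for (i in seq_len(n_runs)) {
  elapsed[i] <- system.time(source("main.R"))["elapsed"]
}
summary(elapsed)   # most runs are around 0.10 sec, but the maximum occasionally reaches several seconds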
Thanks!
I'm looking for advice, please. After about 6 months I got back to some code I wrote that back then took around 30 minutes to finish. Now, when I run it, it's much slower; it looks like it could take days. Since then the hardware hasn't changed, I'm using Windows 10, I have updated RStudio to the current version (2022.07.2 Build 576), and I didn't update my R version, which is "4.1.2 (2021-11-01)".
I noticed that, in contrast to before, RStudio is now not using more than around 400 MB of RAM; before it was much more. I'm not running any other software and there is plenty of RAM available.
I suspected the antivirus might be causing this, even though I hadn't changed any settings. I added RStudio and R to its exceptions, but that didn't change anything.
I also updated RStudio from the previous version, which didn't help.
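In case it's relevant, here is the kind of quick check I can run (just a sketch; I don't know whether this is related to the cause):

# on Windows with R 4.1.x, memory.limit() reports the maximum memory R is allowed to use (in MB)
memory.limit()
# gc() shows how much memory R has actually allocated so far
gc()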
Please, does anyone have an idea what could be causing this? Sorry if the description isn't optimal; it's my first post here and I'm not a programmer, I just use R for data analysis for my biology-related diploma thesis.
Thanks a lot!
Daniel
I am trying to run my code to create a nice transition_reveal for my line graphs.
The data I've got is very large as it is daily data over 20 years for about 130 different variables.
When I run my code I sometimes get the following error:
Sometimes this error happens; other times it runs successfully, but only if I cut the data into smaller parts, and they have to be very small parts. If I do that, since it is an animation, I'd have to create overlap between the parts and it gets complicated. I'd much prefer to run the whole thing; I don't mind if it takes hours, since I can do other things in the meantime.
But it doesn't make sense to me... it's not as if my RAM is storing all the data at the same time; it only stores what it needs before replacing it, so it should never fill up. Here is an image of my Task Manager while running the code:
The RAM usually gets quite full, at about 95%, sometimes going lower and sometimes higher. Then, seemingly by random chance, it hits my maximum at 100% and the code just fails.
This is why splitting my data into 20 parts is difficult: I can't just loop over them, because there is always a chance that even a small part hits 100% RAM and causes an error.
I don't know if I'm doing something wrong. I don't think buying more RAM would solve the problem. Maybe there is a way to let it use my SSD as RAM as well, but I don't know how to do this.
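For context, my call looks roughly like this (a simplified sketch; df, date, value and series stand in for my real data and column names, and the nframes value is only an example):

library(ggplot2)
library(gganimate)

# simplified; `df` stands in for the real data (daily values over 20 years, ~130 variables)
p <- ggplot(df, aes(x = date, y = value, colour = series)) +
  geom_line() +
  transition_reveal(date)

# render the animation; fewer frames should mean less intermediate frame data to build at once
animate(p, nframes = 200, fps = 10)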
Any help would be much appreciated. Thanks.
As I was writing up this question, I tricked myself into finding the (now fairly obvious) solution. However, since it caused me a lot of confusion, I figured I would leave this up to save other people time. I can't write my own answer, so if someone else wants to answer it I will mark it. And no hard feelings if someone wants to mark this as obvious/duplicate/not a question.
Basically, I "couldn't" see the y/n prompt because I wasn't looking at the console (I was doing this in R Notebook for no particular reason), and the function was endlessly waiting for me to respond. After I figured this out, font_import() finished after 5 minutes. Hope this is helpful to someone.
Here is my question:
I'm trying to set up the extrafont package in R so I can use Times New Roman in ggplot. I am using this as a reference: How to change font of ggplot into Times New Roman (on OS X)?
Here is what I have tried:
library(extrafont)
# font_import()                              # first attempt: import every font (ran for hours without finishing)
# font_import(pattern = "TIMES")             # second attempt: only the Times New Roman files
font_import(paths = "C:\\Windows\\Fonts")    # third attempt: restrict the search to the Windows fonts folder
I let the first one run for 5 hours last night, and I tried the other ones this morning hoping to keep it from doing unnecessary work. None of them finished. However, I think something is going wrong, for two reasons: 1) I don't get the prompt shown here: https://rdrr.io/cran/extrafont/man/font_import.html, and 2) my CPU and RAM utilization don't indicate that my computer is actually working on anything time-consuming.
The font_import function shows a prompt in the console that needs to be answered with y or n. Hence, it was waiting for a response, and nothing appeared to happen.
(Posting answer since OP cannot post themselves.)
FYI, I had a similar issue (I'm a newbie to R) until I figured out that there was a prompt in the console.
If you want to avoid having to answer the prompt in the console every time, set
prompt = FALSE
https://www.rdocumentation.org/packages/extrafont/versions/0.18/topics/font_import
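For example (a minimal sketch; the path assumes the standard Windows fonts folder used in the question):

library(extrafont)
# import fonts from the Windows fonts folder without the interactive y/n confirmation
font_import(paths = "C:\\Windows\\Fonts", prompt = FALSE)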
I apologize in advance for the somewhat vague question, but I don't have anyone else to ask. I was running the following code in R:
library(SAScii)
# parse the SAS import instructions, which start at line 14 of the .sas file
parse.SAScii("16crime.sas", beginline = 14)
# read the fixed-width .dat file using those SAS import instructions
x <- read.SAScii("opafy16nid.dat", "16crime.sas", beginline = 14)
The .dat file is 2.3 GB, which is large, at least for my computer's specs, but seemingly doable. Originally I tried running it on my Windows OneDrive, which should have about 9 GB of free space; I have 8 GB of RAM. After it ran for about 8 hours and R showed about 16,000 records processed, my computer started making a very bizarre noise. It almost sounded like the noise you hear when calling a fax machine (this is the vague part I apologize for), and it lasted about 3 minutes before ceasing. Everything froze and I was unable to do a thing. Against my better judgement, after about an hour of waiting for something to happen, I forced a shutdown on my computer and rebooted. Seemingly, everything is working fine now, and running scans on the hard drive came back with no errors. I want to try it again because the data never finished loading into R, but I don't want my computer to blow up either. Can anyone comment on that noise and whether I should try this again?
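In case it helps, here is what I'm thinking of trying first to reproduce this on a smaller scale (just a sketch; if I read the documentation right, read.SAScii has an n argument, passed through to read.fwf, that limits how many records are read, but I'm not certain):

library(SAScii)
# trial run: read only the first 10,000 records to gauge speed and memory use
test <- read.SAScii("opafy16nid.dat", "16crime.sas", beginline = 14, n = 10000)
object.size(test)   # rough estimate of how much memory 10,000 records take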
I wrote a program to solve a linear program in Julia using GLPKMathProgInterface and JuMP. The Julia code is called from a Python program, which runs multiple instances of the Julia code through separate command-line calls. While I'm extremely happy with the performance of the actual solver, the initialization is extremely slow. I was wondering if there are approaches to speed this up.
For example, if I just save the following to a file
@time using DataFrames, CSV, GLPKMathProgInterface, JuMP, ArgParse
and run it
mylabtop:~ me$ julia test.jl
12.270137 seconds (6.54 M allocations: 364.537 MiB, 3.05% gc time)
This seems extremely slow. Is there some good way to speed up loading these modules, such as a precompile step I could run once?
Since you haven't gotten any answers yet, let me give you the general first-order answers, although I hope someone more qualified will answer your question in more detail (and correct me if I'm wrong).
1) As of the time of this writing, loading packages in Julia is sometimes rather slow. It has been discussed many times and you can expect improvements in the future; AFAIK this will happen in early 1.x releases after 1.0 is out. Have a look at this thread.
2) Since you typically only have to pay the loading-time cost once per Julia session, one approach is to keep the session running for as long as possible. You can execute your script with include("test.jl") from within the session. Let me also mention the amazing Revise.jl - it's hardly possible to overemphasize this package!
3) (I have no experience with this more difficult approach.) There is PackageCompiler.jl, which allows you to compile a package into your system image. Read this blog post by Simon.
4) (Not recommended) There is also the highly experimental static-julia, which statically compiles your script into a shared library and executable.
Hope that helps.