I usually run a lot of simulations, mainly statistical ones, so I use the R language most of the time. My laptop had been great until about a month ago, when my simulations started taking an abnormally long time to finish. I decided to get a new machine with plenty of RAM, a fast CPU, and so on. I installed Windows 11 Pro, then downloaded R and RStudio, and no other software.
I am surprised that this new PC is slow too, despite the cutting-edge parts I put in it.
I know for a fact that my coding skills are not great, yet I have tried my best to optimize the code, and I recently discovered that the problem is not the efficiency of my code.
I was complaining to a friend of mine about this situation and asked him to run my code on his machine. Interestingly, his machine ran the exact same code in a few hours, while it takes days on mine. His R version is much older than the 4.2.1 I am using on my PC.
I am not sure if this is the right place to seek help for such problems. Your help is appreciated.
I am new here, and I came looking for help since my business has stopped due to a problem I have.
I am not into coding, so I asked someone to build a training app for training YOLOv4 models for my job. I work in agriculture.
It was working fine until I changed my computer, and now it is not working anymore. What about the person who made it? He has been unreachable for months, since he found a job overseas.
It has two versions: training with CUDA or without CUDA. The second one works, but it is too slow to be practical (weeks versus hours with CUDA).
I have already installed everything that should be installed: CMake, PowerShell, CUDA, OpenCV, and cuDNN, and I have an NVIDIA 1650 Super.
Attached is the error message.
Can anybody help? You can keep the training app if you can make it work again.
Thanks in advance,
Miguel
I am running 64-bit RStudio on a Windows 10 laptop with an NVIDIA GPU in it. However, when I run code, specifically R Shiny apps, it takes a long time. The laptop has a GPU, but Task Manager shows that the GPU is not being utilized. Would the GPU make my program run faster? I do not know much about hardware, so forgive my ignorance on the subject.
In answer to your question: the GPU would have no impact on the speed of your code as it stands.
By default, most R code is single-threaded, meaning it will only use one CPU core. There are various ways to do parallel processing (using more than one core) in R, and there are also packages that can make use of GPUs. However, it sounds like you are not using either of these; see the sketch below.
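For illustration only, here is a minimal sketch of multi-core processing with base R's parallel package. Note that slow_task, the input range, and the core count are hypothetical stand-ins for whatever your app actually computes:

    # Minimal sketch: running a function across several cores with the base
    # `parallel` package. `slow_task` is a hypothetical placeholder.
    library(parallel)

    slow_task <- function(i) {
      mean(rnorm(1e6)) + i  # stands in for an expensive computation
    }

    n_cores <- max(1, detectCores() - 1)  # leave one core free for the OS

    # On Windows, forking is unavailable, so use a PSOCK cluster.
    cl <- makeCluster(n_cores)
    results <- parLapply(cl, 1:100, slow_task)
    stopCluster(cl)

On Linux or macOS, mclapply(1:100, slow_task, mc.cores = n_cores) is a simpler fork-based alternative.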
There are various ways you could restructure your application to make it more efficient, but how you would go about this is specific to your code. I would suggest you ask a separate question about that.
Also, Hadley's excellent book, Advanced R, has techniques for profiling and benchmarking your code to improve performance: http://adv-r.had.co.nz/Profiling.html
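As a rough illustration of the kind of workflow that chapter covers, base R's system.time() and Rprof() are enough to find hot spots; slow_fn here is a made-up example function:

    # Sketch: timing and profiling a hypothetical function with base R tools.
    slow_fn <- function() {
      x <- matrix(rnorm(1e6), ncol = 100)
      apply(x, 1, sd)  # row-wise sd via apply() is a classic hot spot
    }

    system.time(slow_fn())       # coarse wall-clock timing

    Rprof("profile.out")         # start the sampling profiler
    invisible(slow_fn())
    Rprof(NULL)                  # stop profiling
    summaryRprof("profile.out")  # report where the time went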
I've been using R 3.1.2 on an early-2014 13" MacBook Air with 8 GB of RAM and a 1.7 GHz Intel Core i7, running OS X Mavericks.
Recently, I've started to work with substantially larger data frames (2+ million rows and 500+ columns) and I am running into performance issues. In Activity Monitor, I'm seeing virtual memory sizes of 64 GB, 32 GB paging files, etc., and the "memory pressure" indicator is red.
Can I "throw more hardware" at this problem? Since the MacBook Air tops out at 8 GB of physical memory, I was thinking about buying a Mac Pro with 64 GB of memory. Before I spend the $5K+, I wanted to ask if there are any inherent limitations in R other than the ones I've read about here: R Memory Limits, or if anyone who has a Mac Pro has experienced any issues running R/RStudio on it. I've searched using Google and haven't come up with anything specific about running R on a Mac Pro.
Note that I realize I'll still be using 1 CPU core unless I rewrite my code. I'm just trying to solve the memory problem first.
Several thoughts:
1) It's a lot more cost-effective to use a cloud service like https://www.dominodatalab.com (not affiliated). Amazon AWS would also work; the benefit of Domino is that it takes the work out of managing the environment so you can focus on the data science.
2) You might want to redesign your processing pipeline so that not all of your data needs to be loaded in memory at the same time (soon you will find you need 128 GB, and then what?). Read up on memory mapping, using databases, and separating your pipeline into several steps that can be executed independently of each other (googling brought up http://user2007.org/program/presentations/adler.pdf); a chunked-reading sketch follows below. Running out of memory is a common problem when working with real-life datasets, and throwing more hardware at the problem is not always your best option (though sometimes it really can't be avoided).
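To see why the hardware route only postpones the problem: a 2-million-row by 500-column numeric data frame needs roughly 2e6 × 500 × 8 bytes ≈ 8 GB just to sit in memory, before R makes any working copies, which matches the red memory-pressure symptom on an 8 GB machine. As a hedged sketch of the chunked approach (the file name "big.csv" is hypothetical, and every column is assumed numeric), per-column sums can be computed without ever holding the whole file:

    # Sketch: per-column sums over a large CSV, reading a fixed number of
    # rows at a time so only one chunk is ever in memory.
    chunk_rows <- 100000
    con <- file("big.csv", open = "r")

    # The first chunk carries the header; keep the names for later chunks.
    chunk <- read.csv(con, nrows = chunk_rows)
    nms <- names(chunk)
    totals <- colSums(chunk)  # assumes all columns are numeric

    repeat {
      chunk <- tryCatch(
        read.csv(con, header = FALSE, col.names = nms, nrows = chunk_rows),
        error = function(e) NULL  # read.csv errors once the file is exhausted
      )
      if (is.null(chunk) || nrow(chunk) == 0) break
      totals <- totals + colSums(chunk)
    }
    close(con)
    totals

The same pattern works for any per-chunk statistic that can be combined across chunks (sums, counts, min/max); order-dependent statistics like medians need a different approach, such as a database or an on-disk format.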
Does anyone know whether it would be possible to run R scripts on a Synology DS214 NAS? If yes, further information or links would be much appreciated.
The underlying OS on the DiskStation is Linux-based, so in principle it may be possible, but I highly doubt it will be in practice. First of all, you'd need to compile R for the Marvell Armada XP CPU, which is part of the ARM family. To compile R on this chip, you'll need all of the software R depends on, as described in the R Installation and Administration manual.
Finally, a DiskStation is not at all designed to crunch numbers. Even if you do manage to overcome the potentially insurmountable problems of compiling the software you need on the DS214, the execution of code isn't going to be, well, snappy. Also, the 512 MB of on-board RAM will make all but the smallest data analysis jobs impossible, or impossibly slow.
For some reason, when I use the standalone build from InstallShield 2009 Professional, there are times when I get an error and times when the build completes successfully, with no distinguishable reason why. The error that shows up usually reads something like:
IsCmdBld.exe - Application Error
The instruction at "0xa781543" referenced memory at "0x6a19a778". The memory could not be "written"
Click on OK to terminate the program
Now, this message only comes up sometimes; it doesn't occur with any regularity or pattern. Does anybody have any ideas on this? Thanks.
If the buggy-memory approach suggested below doesn't pan out, I wouldn't be surprised if there were a bug in IsCmdBld.exe causing this.
I'd hazard a guess that you've got some dodgy or incompatible RAM. If you can take the machine offline and let Memtest run for 24 hours, you should have enough information to debug.
I've had a similar issue in the past with a machine where the RAM/motherboard combination would cause one or two errors about once every 24 hours. The same RAM in another machine would be fine, and another brand of RAM in that machine would be fine. Running the machine off a UPS would also be fine; I guess that tiny electrical fluctuations were just enough to highlight minor incompatibilities between the two pieces of hardware.
If that doesn't help, I'd suggest taking your question over to the InstallShield forums; the last IS bug I had turned out to be a genuine bug that was lodged, fixed, and addressed with a distributed hotfix.