I am looking to run some experiments that are hard-disk read/write intensive on my Unix box.
Could you suggest some programs I could use for this purpose?
Do you mean testing tools as described here?
I am running RStudio (64-bit) on a Windows 10 laptop with an Nvidia GPU in it; however, when I run code, specifically Shiny apps, it takes a long time. This laptop has a GPU, but Task Manager shows that the GPU is not being utilized. Would the GPU make my program run faster? I do not know much about hardware, so forgive my ignorance regarding this.
In answer to your question: the GPU would have no impact on the speed of your code as it stands.
By default, most R code is single-threaded, meaning it will only use one CPU core. There are various ways to do parallel processing (using more than one core) in R, and there are also packages that can make use of GPUs. However, it sounds like you are not using either of these.
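To make the single-core point concrete, here is a minimal sketch using the built-in parallel package (a PSOCK cluster, which also works on Windows); slow_task is just a made-up stand-in for independent work:

    library(parallel)

    # Hypothetical stand-in for a slow, independent piece of work
    slow_task <- function(i) {
      Sys.sleep(0.5)   # pretend computation
      i^2
    }

    # Leave one core free for the rest of the system
    cl <- makeCluster(max(1, detectCores() - 1))

    # Single-threaded: tasks run one after another on one core
    res_serial <- lapply(1:8, slow_task)

    # Parallel: the same tasks are farmed out to the worker processes
    res_parallel <- parLapply(cl, 1:8, slow_task)

    stopCluster(cl)

This only pays off when the pieces of work are genuinely independent of each other.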
There are various ways you could write your application to make it more efficient, but how to go about that is specific to your code. I would suggest asking a separate question about it.
Also, Hadley's excellent book Advanced R has techniques for profiling and benchmarking your code to improve performance: http://adv-r.had.co.nz/Profiling.html
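As a quick taste of what that chapter covers, base R's Rprof() will already show you where the time goes; the workload below is just a placeholder:

    # Profile a stand-in workload to see which functions take the time
    Rprof("profile.out")
    result <- lapply(1:1e4, function(i) sort(runif(1e3)))
    Rprof(NULL)
    summaryRprof("profile.out")$by.self   # functions ranked by self time

    # Quick benchmark of a single expression
    system.time(sort(runif(1e6)))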
I have a program in Julia and I know it uses a lot of memory; I want to know where that is happening.
How can I monitor or profile the code to find the problem?
Please read the manual: https://docs.julialang.org/en/v1/manual/profile/index.html
Profiling for memory use is discussed at the bottom. The Juno IDE has some nice tools for interacting with the profiler: http://docs.junolab.org/latest/man/juno_frontend/#Profiler-1
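As a minimal sketch of what the manual describes (build_table is just a made-up example): @time reports allocations alongside run time, and the --track-allocation flag narrows them down to individual source lines.

    # Quick check: @time prints the allocation count and total bytes
    function build_table(n)
        rows = Vector{Vector{Float64}}()
        for i in 1:n
            push!(rows, rand(100))   # each iteration allocates a new vector
        end
        return rows
    end

    @time build_table(10_000)

    # Finer-grained view: run the script with
    #   julia --track-allocation=user script.jl
    # and inspect the generated *.mem files, which annotate each source
    # line with the number of bytes it allocated.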
Does anyone know whether it would be possible to run R scripts on a Synology DS214 NAS? If so, further information or links would be much appreciated.
The underlying OS on the DiskStation is Linux-based, so in principle it may be possible, but I highly doubt it will be in practice. First of all, you'd need to compile R for the Marvell Armada XP CPU, which is part of the ARM family of CPUs. To compile R on this chip, you'll need all the related software that R depends on, as described in the R Installation and Administration manual.
Finally, a DiskStation is not at all designed to crunch numbers. Even if you do manage to overcome the potentially insurmountable problems of compiling the software you need on the DS214, the execution of code isn't going to be, well, snappy. Also, the 512 MB of on-board RAM will make all but small data analysis jobs impossible or impossibly slow.
I was looking at the existing RAMDisk discussions ... and none seem to bring up any reliability issues. I recently started using a Dataram ramdisk for my source code and am wondering if there are any risks I should be concerned with.
It did speed up the compile time by 30%.
I am not fully familiar with that product, but the answer probably depends on whether you have a (good) UPS and what you are using to sync changes with your hard drive. I looked into this a while ago (on a Linux machine), mapping a portion of RAM as a disk and using rsync to persist changes to the hard drive, but I dropped the idea and got a faster hard drive instead :) I would be very interested in seeing this working...
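For the record, what I mean is roughly this on Linux (the paths, size, and sync interval are arbitrary placeholders):

    # Mount 2 GB of RAM as a working directory
    sudo mkdir -p /mnt/ramdisk
    sudo mount -t tmpfs -o size=2G tmpfs /mnt/ramdisk

    # Copy the source tree in and work on it there...
    rsync -a ~/projects/myapp/ /mnt/ramdisk/myapp/

    # ...then periodically sync changes back to the hard drive (e.g. from
    # cron), so a power cut only costs you the last interval of work
    rsync -a --delete /mnt/ramdisk/myapp/ ~/projects/myapp/

Without a UPS, anything not yet synced back is gone the moment power is lost, which is the main reliability risk.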
Does anyone know how exactly RSLs work with AIR? I have a terminal server that runs several instances of a very large AIR application, which unfortunately uses 100 MB of RAM on startup and 200 MB after a bit of use. This is obviously not really workable, and I'm thinking that RSLs may be a solution if they're cached on the machine. However, I haven't been able to find much of anything on this, and I'd really like to know if anyone has.
On a second note, what are some good ways to reduce the initial memory footprint of an AIR application?
RSLs will only help with download size, not RAM usage. To use less memory, I recommend AMF instead of XML, as XML parsing has some overhead.
Hope that helps.
-James
Try using the profiler that comes with Flex Builder. It will help you see what is eating up the memory, and then you can change your code accordingly.
After a good bit of research, I found that the answer is simply this: using RSLs with Flex gives you the advantage of caching most of the core Flex libraries in the browser, greatly reducing download time after the initial one. This is not, however, the case for AIR, so it does you basically no good there.
Like James said, it'll only help with download size.
As far as memory goes... check out articles by Grant Skinner - he's helped me out a lot.
Thanks!