How to know if a library is effectively used in R?

I have a huge codebase, and all the libraries are attached at the beginning of the code. Now I'm cleaning it up a bit, removing parts of it and rewriting others.
I was wondering if there is a way to know whether a specific library is actually used by the code (so I can clean up the library section too)? I could list, for each library, all the functions it attaches and then search the code to check that none of those functions are called, but that would take a long time. I could also remove the library and try to run the code, but I don't like that solution (not robust enough).
I'm sorry if the question has already been asked, but so far I haven't found any solution :(.
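A rough base-R sketch of the approach described above (automating the "list the functions, then search the code" idea): for each attached package, list its exported objects and check whether any of them appear in the script. The file name and package vector are placeholders, and the matching is a plain text search, so treat the result as a hint rather than proof.
code_text <- paste(readLines("my_big_script.R"), collapse = "\n")
pkgs <- c("dplyr", "ggplot2")   # packages attached at the top of the script
maybe_used <- vapply(pkgs, function(pkg) {
  funs <- getNamespaceExports(pkg)   # exported objects of the package
  any(vapply(funs, grepl, logical(1), x = code_text, fixed = TRUE))
}, logical(1))
pkgs[!maybe_used]   # candidates for removal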

Related

R shiny rhandsontable tagging with multi-value select boxes

I'd like to make it possible to use a multi-value select list in a cell, such as is typically used for tagging.
This is possible in different datatable packages in other languages, but I am sadly failing to find a solution for this in R Shiny.
I'd like to stick with rhandsontable, as for my use case the multi-cell copy/paste functionality is too important to give up, but if this is completely impossible and there are good alternatives out there that I didn't find, please feel free to suggest them.
For a reference of a similar behaviour you can see what is happening here:
https://select2.org/tagging#tagging-with-multi-value-select-boxes
this is the first result I got googling it; I can find the same for most of the other packages I have tried in the past (not in R, unfortunately, as I am quite new to R).
Have you ever seen a functionality like this implemented with rhandsontable? Is it even possible to do?

How to see where in my code a function gets called in RStudio?

I'm currently cleaning up my first big R project and I'm at a point where I have a lot of functions implemented, but I am not sure which functions get called from other scripts and which are never used. So now I want to find all calls of a given function in my project. Is this possible?
I'm using RStudio and a lot of other IDEs I've used got a feature like this, so I was wondering if this is also implemented in RStudio.
I searched the web and Stack Overflow but got no answer, so I assume that this is not possible, but I wanted to ask just in case it IS possible and I simply didn't find the right answer.
Thank you!
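In case it helps, here is a small base-R sketch of the brute-force "search the whole project" approach. It assumes the working directory is the project root; "my_function" is a placeholder for the function being tracked down.
target <- "my_function"
r_files <- list.files(".", pattern = "\\.[Rr]$", recursive = TRUE, full.names = TRUE)
hits <- lapply(r_files, function(f) {
  lines <- readLines(f, warn = FALSE)
  idx <- grep(paste0(target, "\\s*\\("), lines)   # lines that look like a call
  if (length(idx)) data.frame(file = f, line = idx, code = lines[idx]) else NULL
})
do.call(rbind, hits)   # one row per candidate call site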

Ada dependency graph

I need to create a dependency graph for a software suite that I am working on. In the past the company I work for has always done this manually, but I am guessing that there is a tool somewhere that will do what we need.
The software I am working with is Ada95, and has about 200 code modules/files, with about 40 packages. I need to create a map that will trace every output, individually, back to each input or constant that will have an impact on the output. Does anybody know of a tool that would accomplish this? Or even just partially accomplish it?
AdaCore's GPS (available from http://libre.adacore.com) comes with a command line tool named gnatinspect. You can use this tool to load all cross-reference information generated by the compiler (assuming you are compiling with GNAT). This creates a sqlite database (gnatinspect.db) which contains all information you need. gnatinspect itself provides a number of pre-made queries that might get you at least partially to where you want to go.
You could also look at ASIS as a way to do this kind of query directly on the code. I am told it is not so easy to use the first time around, though.
There is also an older tool provided with GNAT (gnatxref) which does something similar, although it is being superseded by gnatinspect.
Finally, you could look at gnat2xml as an alternative to ASIS if you are more comfortable parsing XML files.

Is it possible to include the os library in lua 4.0?

I'm stuck using the 4.0 version of lua which does not seem to support the os library. Is there a way to include this library into my project?
Or is there another way to get the date/time functionality it contains?
Preferably by using a *.lua file and not a *.c file since I don't have complete access to the code.
When I run the following line,
print(os.time{year=1970, month=1, day=1, hour=0})
I get an error stating:
attempt to index global 'os' (a nil value)
As @Colonel Thirty Two said, it's not possible to use the os library, so the time() function is not available to me.
Adding to the (totally correct) currently accepted answer (that if "os" access was not allowed to you, you're generally done), there's some very slight chance the Original Programmer may have provided you with some alternative facilities to do your thing (fingers crossed). In a perfect world, those would be described in some kind of a User's Manual for your scripting environment. But if the manual was lost to time (or never existed in the first place), you might possibly try your luck at exploring any preloaded libraries by digging through the result of the globals() Basic Function. (At least I hope that's how it was done in 4.0 too.) That is, if the Original Programmer didn't block globals() for you too...

Where in R do I permanently store my custom functions?

I have several custom functions that I use frequently in R. Rather than source this file (or parts thereof) in each script, is there some way to add it to a base R file so that the functions are always available when I use R?
Yes, create a package. There are numerous tutorials as well as the Writing R Extensions manual that came with your copy of R.
It may seem like too much work at first, but you will probably be glad you did it in the long run.
PS And you can then load that package from ~/.Rprofile. For really short code, you can also define it there.
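For what it's worth, a minimal ~/.Rprofile sketch of that idea might look like this, assuming your personal package is called "myutils" (a placeholder name):
if (interactive()) {
  suppressMessages(require(myutils))   # attach the personal package in every session
}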
A package may be overkill for a few useful functions. I'd argue there's nothing wrong with explicitly source()ing them as you need them - at least it is explicit, so that if you email someone your code you won't forget to include those other scripts.
Another option is to use the .Rprofile file. You can read about the details in ?Startup. Basically, the idea is that:
...a file called ‘.Rprofile’ is searched for in the current directory or
in the user's home directory (in that order). The user profile file is
sourced into the workspace.
You can read here about how many people use this functionality.
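As an illustration (not the only way to do it), a .Rprofile could source a file of personal helpers into its own attached environment, so they don't clutter the global workspace; the path and the name below are placeholders:
.my_env <- new.env()
sys.source("~/R/my_functions.R", envir = .my_env)   # load the helper definitions
attach(.my_env, name = "my_functions")              # put them on the search() path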
The accepted answer is best long-term: Make a package.
Luckily, the learning curve for doing this has been dramatically reduced by the devtools package: it automates package creation (a nice assist in getting off on the right foot), encourages good practices (like documenting with roxygen2), and helps with using online version control (Bitbucket, GitHub or other) and with sharing your package with others. It's also very helpful for smoothing your way to CRAN submission.
Good docs at http://adv-r.had.co.nz and http://r-pkgs.had.co.nz .
To create your package, for instance, you can:
install.packages("devtools")
devtools::create("path/to/package/pkgname")
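A possible follow-up once you have added functions under R/ (standard devtools calls, shown here as a sketch of one workflow rather than a required sequence):
devtools::document("path/to/package/pkgname")   # generate roxygen2 documentation and NAMESPACE
devtools::install("path/to/package/pkgname")    # install locally so library(pkgname) works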
You could also look at the 'mvbutils' package: it lets you set up a hierarchical set of "tasks" (folders with workspace ".RData" files in them) such that you can always see what's in the ancestral tasks (i.e. the ancestors are in the search() path). So you can put your custom functions in the "starting task" where you always start R, and then change to whatever project-specific task you require; you avoid cluttered workspaces, but you can still use (and edit) your custom functions because the starting task is always ancestral. Objects (including functions) are stored in ".RData" files and are thus loaded/saved automatically, but there are separate text-backup facilities for functions.
There are lots of different ways of working in R, and no "one-size-fits-all" best solution. It's also not easy to find an overview! Speaking just for myself:
I'm not a fan of having to 'source' everything in every time; for one thing, it simply doesn't work with big data sets and/or results of model runs.
I think packages are hard to create and maintain; there is a really significant overhead. After the first 5 packages you write, it does get a bit easier provided you do it on at least a weekly basis so you don't forget how, but really...
In fact, 'mvbutils' also has a bunch of tools for facilitating the creation and (especially) maintenance of packages, designed to interface smoothly with the task-hierarchy system. I use & edit my own packages all the time (including editing mvbutils itself); but if it wasn't for the tools in 'mvbutils', I'd be grinding my teeth in frustration most days of the week.
