Get all functions available at runtime - julia

Is there a way to get all functions available at runtime? Or is there a hidden database that keeps track of all the loaded functions, variables, and modules accessible by our code?

I'll post my comment as an answer since it seemed useful:
Do you mean user-defined functions / variables and loaded modules? You can get those with whos(). To get functions exported from a module you can do whos(MyModule). (Note: in Julia 1.0 and later, whos() was renamed to varinfo(), and names(MyModule) lists a module's exported symbols.)

Related

[R][package] Load all functions in a package's NAMESPACE

Tl;dr
I have a complete NAMESPACE file for the package I am building. I want to execute all the importFrom(x,y) clauses in that file to get the set of necessary functions I use in the package. Is there an automated way to do that?
Full question
I'm currently working on building a package that in turn depends on a bunch of other packages. Think: both Hmisc and dplyr.
The other contributors to the package don't really like using devtools::build() every 5 minutes when they debug, which I can understand. They want to be able to toggle a dev_mode boolean which, when set to true, loads all dependencies and sources all scripts in the R/ and data-raw/ folders, so that all functions and data objects are loaded in memory. I don't necessarily think that's a perfectly clean solution, but they're technically my clients and it's a big help to them so please don't offer a frame challenge unless it's the most user-friendly solution imaginable.
The problem with this solution is that when I load two libraries whose namespaces clash, functions in the package that would otherwise work perfectly start throwing errors. I thus need to improve my import system.
Thanks to thorough documentation (and the help of devtools::check()), my NAMESPACE is complete. I guess I could split it into pieces and eval() some well-chosen parts of it, but it feels like I'm probably not the first person to encounter this difficulty. Is there an automated way to parse NAMESPACE and import the proper functions from the proper packages?
Thanks!
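One way to automate this, sketched below without any devtools machinery: read the importFrom() directives out of the NAMESPACE file and bind each imported function in the global environment. The function name, the default path, and the one-directive-per-line layout (which is what roxygen2 generates) are assumptions of this sketch, not a documented API.

```r
# Sketch: execute the importFrom(pkg, fn) clauses of a NAMESPACE file.
# Assumes one directive per line and unquoted names.
load_namespace_imports <- function(path = "NAMESPACE") {
  lines <- readLines(path)
  directives <- grep("^importFrom\\(", lines, value = TRUE)
  for (d in directives) {
    # "importFrom(pkg, fn1, fn2)" -> c("pkg", "fn1", "fn2")
    parts <- strsplit(gsub("importFrom\\(|\\)|[\"' ]", "", d), ",")[[1]]
    pkg <- parts[1]
    for (fn in parts[-1]) {
      # getExportedValue() fetches the function without attaching pkg
      assign(fn, getExportedValue(pkg, fn), envir = globalenv())
    }
  }
  invisible(directives)
}
```

Because getExportedValue() pulls each function directly from the package namespace, nothing is attached to the search path, so the clash problem described above does not arise in dev_mode.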

Can I automatically add functions called using pkg::fct to the importFrom section in roxygen2?

When I write a package, I usually call external functions using pkg::fct() just to make it very clear and explicit where the function is coming from. I'm aware of the small overhead, but can usually ignore it.
On the other hand I like it when all external functions appear in the roxygen tags to give an overview of what is used in the function.
Is there a way to automatically add all functions called via pkg::fct() to @importFrom? And is that a good idea after all?
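A simple version of this can be sketched with a regex scan over the package's R sources, printing one @importFrom tag per pkg::fn call found. The directory name "R" and the regex are assumptions; a textual scan will miss calls built at runtime and may match pkg::fn text inside strings, so treat the output as a starting point rather than a generated truth.

```r
# Sketch: scan .R files for pkg::fn usages and print roxygen tags.
suggest_importFrom <- function(dir = "R") {
  files <- list.files(dir, pattern = "\\.[Rr]$", full.names = TRUE)
  hits <- unlist(lapply(files, function(f) {
    txt <- paste(readLines(f), collapse = "\n")
    # match pkg::fn identifiers, e.g. "stats::median"
    m <- gregexpr("[A-Za-z][A-Za-z0-9.]*::[A-Za-z._][A-Za-z0-9._]*", txt)
    regmatches(txt, m)[[1]]
  }))
  tags <- vapply(sort(unique(hits)), function(h) {
    p <- strsplit(h, "::", fixed = TRUE)[[1]]
    sprintf("#' @importFrom %s %s", p[1], p[2])
  }, character(1))
  cat(tags, sep = "\n")
  invisible(tags)
}
```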

Include library calls in functions?

Is it good practice to include every library I need to execute a function within that function?
For example, my file global.r contains several functions I need for a shiny app. Currently I load all needed packages at the top of the file, so when I'm switching projects or copying these functions I have to remember to load the packages / include them in the new code. With library calls inside each function, by contrast, every needed package would travel with that function. Of course I'd have to re-check all functions in a fresh R session, but I think this could help in the long run.
When I try to load a package twice, R won't load it again but notices it's already loaded. My main question is whether restructuring in this way would slow my functions down.
I only saw that practice once, library calls inside functions, so I'm not sure.
As one of the commenters suggests, you should avoid loading packages within a function, since:
1. The function now has a global effect - as a general rule, this is something to avoid.
2. There is a very small performance hit.
The first point is the big one. As with most optimisation, only worry about the second point if it's an issue.
Now that we've established the principle, what are the possible solutions?
In small projects, I have a file called packages.R that includes all the library calls I need. This is sourced at the top of my analysis script. BTW, all my functions are in a file called func.R. This workflow was stolen/adapted from a previous SO question.
If you're only importing a single function, you could use the :: trick, e.g. package::funcA(...). That way you avoid loading the package.
For larger projects, I create an R package that handles all necessary imports. The benefit of creating a package is detailed in this answer on structuring large R projects.
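The :: trick in the second point can be seen in action with the base-R tools package, which is installed with R but not attached by default:

```r
# Calling tools::file_ext() works without library(tools) ...
ext <- tools::file_ext("analysis.R")

# ... and the package never lands on the search path: its namespace is
# loaded, but nothing is attached, so no global effect on lookups.
on_search_path <- "package:tools" %in% search()

ext            # "R"
on_search_path # FALSE
```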

Best practices to handle personal functions in R

I have written personal functions in R that are not specific to one (or a few) projects.
What are the best practices (in R) to put those kind of functions?
Is the best way to do it to have one file that gets sourced at startup? or is there a better (recommended) way to deal with this situation?
Create a package named "utilities", put utility functions in that package, try to aim for one function per file, and store the package in a source control system (e.g., Git, SVN). It will save you time in the long run.
P.S. .Rprofile tends to get accidentally deleted.
If you have many, it would be good to make it into a package that you load each time you start working.
It is probably not a good idea to have a monolithic script with a bunch of functions. Instead break the file up into several files each of which either has only one function (my preference) or has a group of functions that are logically similar. That makes it easier to find things when you need to make changes.
Most people use the .Rprofile file for this. Here are two links which talk about this file in some detail.
http://www.statmethods.net/interface/customizing.html
http://blog.revolutionanalytics.com/2013/10/sample-rprofile.html
At the top of my .Rprofile file I call library() for the various libraries which I normally use. I also have some personal handy functions which I've come to rely on. Because this file is sourced on startup, they are available to me every session.
From my experience, a package is the best choice for personal functions. First I put all new functions into a personal package, which I call My. When I find that some functions are similar and worth becoming an independent package, I create a new package and move them there.
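Getting from loose functions to a first package can be sketched with base R's utils::package.skeleton(); the package name "myutils" and the clamp() function below are made-up examples, not anything the answers above prescribe:

```r
# A personal function sitting in the global environment
clamp <- function(x, lo, hi) pmin(pmax(x, lo), hi)

# Scaffold a package around it: creates DESCRIPTION, NAMESPACE,
# an R/ directory with one file per function, and man/ stubs.
utils::package.skeleton(
  name = "myutils",
  list = c("clamp"),
  path = tempdir()
)
```

From there you fill in DESCRIPTION, tidy the generated Rd stubs, and install the package so library(myutils) replaces sourcing a monolithic script.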

Hiding Undocumented Functions in a Package - Use of .function_name?

I've got some functions I need to make available in a package, and I don't want to export them or write documentation for them. I'd just hide them inside another function, but they need to be available to several functions, so doing that becomes a scoping and maintenance issue. What is the right way to do this? By that I mean: do they need special names, do they go somewhere other than the R subdirectory, can I put them in a single file, etc.? I've checked out the manuals, and what I'm after is like the .Internal concept in core R, but I haven't found any instructions about how to do this generally. I thought I had seen something about this before but cannot locate it just now. Thx.
My solution is to remove the unnecessary functions from NAMESPACE and call internal functions via NAME-OF-PACKAGE:::NAME-OF-INTERNAL-FUNCTION. For example, if your package name is RP and the name of the internal function is IFC, it would look like RP:::IFC(). Notice that with :: (two colons) you can only call functions that are listed in NAMESPACE, while with ::: (three colons) you can call all functions, internal as well as exported.
After asking on R-help, here is the answer. @Dwin is correct: do not export the internal functions (so fix up your export instructions in NAMESPACE - don't use exportPattern but rather name the functions explicitly using export()). You can call them what you want; there is no special naming convention. You do not have to write Rd files for them if you don't export them.
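A NAMESPACE following that advice names the public functions explicitly and simply never mentions the helpers (all function names here are illustrative):

```
# NAMESPACE: only the public API is exported
export(fit_model)
export(plot_model)
# internal helpers get no export() line and stay package-private;
# they remain reachable as mypkg:::prepare_data() if ever needed
```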
