How to print in REPL the code of functions in Julia? - julia

In Julia, many of the Base and closely related functions are also written in pure Julia, and the code is easily available. One can skim through the repository or the locally downloaded files and see how a function is written/implemented. But I think there is already some built-in method that does that for you, so you can write in the REPL or a Jupyter Notebook something like:
@code functioninquestion()
and get something like:
functioninquestion(input::Type)
some calculations
return
end
without paging through the code.
I just don't remember the method or call. I have read the Reflection/Introspection section of the manual, but I cannot seem to use anything there. I've tried methods, methodswith, code_lowered, and expand, and cannot seem to make them give what I want.

This is not currently supported but probably will be in the future.

Though this may not be what the OP is looking for, @less is very convenient for reading the underlying code (so I use it very often). For example,
julia> @less 1 + 2
gives
+(x::Int, y::Int) = box(Int,add_int(unbox(Int,x),unbox(Int,y)))
which corresponds to the line given by
julia> @which 1 + 2
+(x::Int64, y::Int64) at int.jl:8

@edit functioninquestion() will open your editor at the location of the given method.
It probably wouldn't be too hard to take the same information used by @edit, open the file, skip to the method definition, and then display it directly in the REPL (or Jupyter).
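A rough sketch of that idea (a hedged illustration, not a polished tool; show_source is a made-up name, and it assumes the file reported by functionloc can be found on disk):

# Look up the (file, line) a method is defined at, the same information
# @edit uses, and print a handful of lines from that point.
function show_source(f, types; context::Int = 10)
    file, line = functionloc(f, types)
    path = Base.find_source_file(String(file))   # resolve Base's relative paths
    path === nothing && error("source file not found on this machine")
    for str in readlines(path)[line:min(end, line + context)]
        println(str)
    end
end

show_source(+, (Int, Int))   # prints roughly ten lines starting at the + definition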
EDIT: While I was answering, somebody else mentioned @less, which seems to do exactly what you want already.

There is now another tool for this, https://github.com/timholy/CodeTracking.jl. It is part of the Revise.jl ecosystem (and works better when Revise is also loaded). It should work inside Jupyter and with functions defined in the REPL, unlike @edit/@less.
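For example (a minimal sketch, assuming a session where both Revise and CodeTracking are loaded; @code_string returns a method's source text as a String):

julia> using Revise, CodeTracking

julia> f(x) = 2x + 1
f (generic function with 1 method)

julia> @code_string f(3)
"f(x) = 2x + 1"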

Related

Asking for an input and printing in a C interface for R

I'm trying to modify a CRAN package. From what I understand, they use a C interface using .Call().
So I made changes in the C code (can't do it anywhere else since it's in a loop) but I need to ask the user to input an integer.
I read the "Writing R extension" doc and found out that you need to use specific functions as Rprintf() instead of printf().
But I can't seem to find a way to replace scanf() so how can we ask for an input?
And finally is Rprintf() supposed to print in the R console because it is what I'd want but I can't find where it is printed?
Edit:
I'm unable to try it right now, but it seems that using capture_output() may work to get the Rprintf() output. Therefore the only remaining issue would be the scanf() :)
Thanks a lot!
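A hedged sketch of one common workaround (ask_for_int is a made-up name, not from the question): call back into R's readline() from C, since the C API has no scanf() counterpart, and print with Rprintf(), which writes to the R console.

#include <R.h>
#include <Rinternals.h>
#include <stdlib.h>

/* Sketch only: prompt via R's readline(), print via Rprintf(). */
SEXP ask_for_int(void)
{
    SEXP prompt = PROTECT(Rf_mkString("Enter an integer: "));
    SEXP call   = PROTECT(Rf_lang2(Rf_install("readline"), prompt));
    SEXP answer = PROTECT(Rf_eval(call, R_GlobalEnv));
    int value   = atoi(CHAR(STRING_ELT(answer, 0)));
    Rprintf("You entered: %d\n", value);   /* goes to the R console */
    UNPROTECT(3);
    return Rf_ScalarInteger(value);
}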

source() functions preventing the read of downstream functions

I am trying to teach myself R, coming from a Python programming background.
I am clearly having problems with sourcing one file (file_read_functions.R) when the functions stored in it are called from a file in the same directory (read_files.R).
read_files.R is as follows:
constant_source <- 'constants.R'
function_source <- 'file_read_functions.R'
class_source <- 'classes.R'
source(class_source)
source(constant_source)
source(function_source)
cellecta_counts = read_cellecta_counts(filepath = cell_counts_by_gene_id)
file_read_functions.R is as follows:
constants <- 'constants.R'
classes <- 'classes.R'
assignments <- 'assignment_functions.R'
source(constants)
source(classes)
source(assignments)
read_cellecta_counts = function(filepath) {
  print("hello")
  return(filepath)
}
With the above, if I move read_cellecta_counts to before the source functions, the code can successfully find the function. What might be the cause?
This seems like a straightforward error message to me. The function object wasn't found, so that means you haven't defined it anywhere, or haven't loaded it.
If it's a function from a package, maybe you forgot to load the package, or call the function as package::function(). If it is a function you wrote as a simple script, maybe you forgot to source it or define it locally. If it's a function you wrote as part of a package, you can load all functions by using the shortcut CTRL+SHIFT+L in RStudio.
That said, I believe you can benefit a lot from reading the chapter on Debugging from Hadley Wickham's "Advanced R" book. It is really well written and easy to understand, especially for beginners in the R language. The chapter will teach you how to use some debugging tools, either interactively or not. You can find it here.
I found it out. By commenting out the parts piece by piece in file_read_functions, I found that there was a typo within assignment_functions.R. It was a simple typo; I had the following:
constant_source <- 'constants.R'
source(constants_source)
The extra 's' did it.
If you get an error like this in the future, be sure to check all of the upstream source() files; if there is an error in any of those, it will not be able to find any subsequent functions.
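A hedged illustration of that advice, wrapping each source() call so the failing file is reported explicitly (file names taken from the question):

for (f in c('constants.R', 'classes.R', 'assignment_functions.R')) {
  tryCatch(source(f),
           error = function(e) stop("failed while sourcing ", f, ": ",
                                    conditionMessage(e)))
}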
Thank you all for your patience.

Is it unwise to modify the class of functions in other packages?

There's a bit of a preamble before I get to my question, so hang with me!
For an R package I'm working on, I'd like to make it as easy as possible for users to partially apply functions inline. I had the idea of using the [] operators to call my partial application function, which I've named "partialApplication." What I aim to achieve is this:
dnorm[mean = 3](1:10)
# Which would be exactly equivalent to:
dnorm(1:10, mean = 3)
To achieve this I tried defining a new [] method for objects of class function, i.e.
`[.function` <- function(...) partialApplication(...)
However, R gives a warning that the [] method for function objects is "locked." (Is there any way to override this?)
My idea seemed to be thwarted, but I thought of one simple solution: I can invent a new S3 class "partialAppliable" and create a [] method for it, i.e.
`[.partialAppliable` = function(...) partialApplication(...)
Then, I can take any function I want and append 'partialAppliable' to its class, and now my method will work.
class(dnorm) = append(class(dnorm), 'partialAppliable')
dnorm[mean = 3](1:10)
# It works!
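The partialApplication() helper itself isn't shown in the question; a minimal sketch of one way it could be written (an assumption, not necessarily the OP's version) is:

partialApplication <- function(f, ...) {
  fixed <- list(...)   # the arguments supplied inside [ ]
  function(...) do.call(f, c(list(...), fixed))
}

With a definition like that, dnorm[mean = 3](1:10) forwards to dnorm(1:10, mean = 3).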
Now here's my question/problem: I'd like users to be able to use any function they want, so I thought, what if I loop through all the objects in the active environment (using ls) and append 'partialAppliable' to the class of all functions? For instance:
allobjs = unlist(lapply(search(), ls))
# This lists all objects defined in all active packages
for (i in allobjs) {
  if (is.function(get(i))) {
    curfunc = get(i)
    class(curfunc) = append(class(curfunc), 'partialAppliable')
    assign(i, curfunc)
  }
}
Voilà! It works. (I know, I should probably assign the modified functions back into their original package environments, but you get the picture).
Now, I'm not a professional programmer, but I've picked up that doing this sort of thing (globally modifying all variables in all packages) is generally considered unwise/risky. However, I can't think of any specific problems that will arise. So here's my question: what problems might arise from doing this? Can anyone think of specific functions/packages that will be broken by doing this?
Thanks!
This is similar to what the Defaults package did. The package is archived because the author decided that modifying other packages' code is a "very bad thing". I think most people would agree. Just because you can do it does not mean it's a good idea.
And, no, you most certainly should not assign the modified functions back to their original package environments. CRAN does not like it when packages modify the user's search path unnecessarily, so I would be surprised if they allowed a package to modify other packages' function arguments.
You could work around that by putting all the modified functions in an environment on the search path. But then you have to ensure that environment is always searched first, which means modifying the search path every time another package is loaded.
Changing arguments for functions in other packages also has the potential to make it very difficult for others to reproduce your results because they must have all your argument settings. Unless you always call functions with all their arguments specified, which defeats the purpose of what you're trying to do.

Is there a fast way to copy and paste function arguments to R console

Using R and debugging, I often have a function with several arguments set by default.
e.g.
foo <- function(x = c(3, 4, 5), y = 'house', dateidx = '1990-01-01') {}
Often I just want to manually run through some lines in the function, while using the pre-set parameters. If the parameter list is long, I have to type or paste each argument to the console manually before stepping through the function.
x=c(3,4,5)
y= 'house'
dateidx = '1990-01-01'
It's OK if the list of arguments is small, but if there is a long list of arguments, it gets tedious. Is there some way to just copy the whole set of arguments, paste it to the console, and do something like unlist, so that all the arguments are passed to the console as if I had entered each one manually?
p.s. I'm weakly familiar with the debug tool, but sometimes I find it easier and faster to just troubleshoot lines quickly and manually as above.
There is no easy pre-existing way to do this, mainly because this is a problem solved by the debugger.
One could imagine hacking something together that might parse these parameters with a regex and set them automatically, or something like that. However, the effort would be much better spent learning how to use the debugger.
It should be quite quick to test the part of the code you are interested in with the debugger if you learn how to use it. RStudio has a visual debugger. Using this, you can simply mark the command you are interested in testing with a breakpoint and run the script. The script will run until it reaches the breakpoint, then stop there so you can inspect what is happening.
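That said, a rough sketch of the "hack something together" idea (push_defaults is a made-up name; it only evaluates simple defaults that don't depend on other arguments):

push_defaults <- function(f, envir = parent.frame()) {
  defaults <- formals(f)
  for (name in names(defaults)) {
    # skip arguments that have no default value
    if (identical(defaults[[name]], quote(expr = ))) next
    assign(name, eval(defaults[[name]], envir), envir = envir)
  }
  invisible(names(defaults))
}

push_defaults(foo)   # afterwards x, y and dateidx exist in the calling environment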

How do you get the code of a CLISP memory image

I have a memory image that I can't find the source for, and I want to get the code out of it again. What do I have to do to achieve that? I can obviously load the image, but then I'd need to guess the function names.
You may get "interesting" symbols with (apropos ""), and the function names with WITH-PACKAGE-ITERATOR and FBOUNDP. But the source code is (probably) lost: try DISASSEMBLE on functions and see the information which is there.
In addition to DISASSEMBLE, you might try EXT:UNCOMPILE. Note, however, that it will only work on functions compiled in an interactive session (i.e., from REPL), not on those loaded from a compiled .fas file.
So, the suggested procedure is:
1. LIST-ALL-PACKAGES - figure out which packages are interesting.
2. DO-EXTERNAL-SYMBOLS - figure out which symbols in the interesting packages are interesting.
3. DISASSEMBLE or EXT:UNCOMPILE on those interesting symbols.
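A rough, untested sketch of those three steps (it simply dumps whatever the image still knows about every external function):

(dolist (pkg (list-all-packages))
  (do-external-symbols (sym pkg)
    (when (fboundp sym)
      (format t "~&;; ~S~%" sym)
      (ignore-errors (disassemble sym)))))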
However, the easiest way is to contact your vendor. Remember, CLISP is distributed under the GNU GPL.
