I need a macro (variable) for a GNU makefile that searches for a file or directory matching a given mask in one of the directories above the current one. For example, the current working directory is /home/sysop/powerup/native/apps/toopl, and the directory /home/sysop/powerup/native/SDK/build also exists. I want to find the location of the SDK/build directory relative to the current one, so I wrote a recursive macro for that:
upfind = $(if $(wildcard $(1)),$(1),$(if $(filter $(abspath $(1)),$(abspath ../$(1))),$(error "can't find $(1)"),$(call upfind,../$(1))))
Now I can use it in the following way:
relpath = $(call upfind, ../SDK/build)
This assigns the value "../../SDK/build" to the relpath variable.
That all works, but I need to propagate this macro to multiple makefiles, so I'm looking for a way to make the upfind macro more compact. I hope somebody can suggest how to rewrite it in a more compact way. For example, it would be enough to limit the recursion to some depth, so using the $(abspath) function wouldn't be necessary. But how can I determine the recursion level, or measure the length of the argument ($(1))?
Not sure what you are asking here, so this probably isn't an answer.
The recursion limit is simple enough.
First, a bit of tidying:
assert-root = $(if $(filter $(abspath $1),$(abspath ../$1)),$(error Can't find $1))
upfind = $(if $(wildcard $1),$1,${assert-root}$(call upfind,../$1))
(Note how ${assert-root} is not called; it simply inherits the existing $1.)
This makes it a bit clearer how we can limit the recursion depth: just pass an ever-lengthening $2.
maxup := 3
assert-depth = $(if $(filter ${maxup},$(words $2)),$(error Can't find [$1] within ${maxup} parents))
upfind = $(if $(wildcard $1),$1,${assert-depth}$(call upfind,../$1,_ $2))
Do both at the same time if you like:
upfind = $(if $(wildcard $1),$1,${assert-depth}${assert-root}$(call upfind,../$1,_ $2))
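A quick usage sketch with the depth-limited version (same example path as in the question):

# with maxup, assert-depth, assert-root and upfind defined as above
relpath := $(call upfind,../SDK/build)
$(info found SDK/build at $(relpath))   # from apps/toopl this should print ../../SDK/build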
What is the sane way to go from a Module object to a path to the file in which it was declared?
To be precise, I am looking for the file where the keyword module occurs.
The indirect method is to find the location of the automatically defined eval method in each module.
moduleloc(mm::Module) = first(functionloc(mm.eval, (Symbol,)))
For example:
moduleloc(mm::Module) = first(functionloc(mm.eval, (Symbol,)))
using DataStructures
moduleloc(DataStructures)
Outputs:
/home/oxinabox/.julia/v0.6/DataStructures/src/DataStructures.jl
This indirect method works, but it feels like a bit of a kludge.
Have I missed some inbuilt function to do this?
I will remind answerers that Modules are not the same thing as packages.
Consider the existence of submodules, or even modules that are loaded by include-ing some absolute path that is outside the package directory or the load path.
Modules simply do not store the file location where they were defined. You can see that for yourself in their definition in C. Your only hope is to look through the bindings they hold.
Methods, on the other hand, do store their file location. And eval is the one function that is defined in every single module (although not in baremodules). Slightly more correct might be:
moduleloc(mm::Module) = first(functionloc(mm.eval, (Any,)))
as that more precisely mirrors the auto-defined eval method.
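For example, with the same package as in the question (a sketch only; the exact path will of course depend on your installation):

moduleloc(mm::Module) = first(functionloc(mm.eval, (Any,)))

using DataStructures
moduleloc(DataStructures)   # e.g. .../DataStructures/src/DataStructures.jl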
If you aren't looking for a programmatic way of doing it you can use the methods function.
using DataFrames
locations = methods(DataFrames.readtable).ms
It's for all methods but it's hardly difficult to find the right one unless you have an enormous number of methods that differ only in small ways.
There is now pathof:
using DataStructures
pathof(DataStructures)
"/home/ederag/.julia/packages/DataStructures/59MD0/src/DataStructures.jl"
See also: pkgdir.
pkgdir(DataStructures)
"/home/ederag/.julia/packages/DataStructures/59MD0"
Tested with julia-1.7.3
require obviously needs to perform that operation. Looking into loading.jl, I found that finding the module path has changed a bit recently: in v0.6.0, there is a function
load_hook(prefix::String, name::String, ::Void)
which you can call "manually":
julia> Base.load_hook(Pkg.dir(), "DataFrames", nothing)
"/home/philipp/.julia/v0.6/DataFrames/src/DataFrames.jl"
However, this has changed for the better in the current master; there's now a function find_package, which we can copy:
macro return_if_file(path)
    quote
        path = $(esc(path))
        isfile(path) && return path
    end
end

function find_package(name::String)
    endswith(name, ".jl") && (name = chop(name, 0, 3))
    for dir in [Pkg.dir(); LOAD_PATH]
        dir = abspath(dir)
        @return_if_file joinpath(dir, "$name.jl")
        @return_if_file joinpath(dir, "$name.jl", "src", "$name.jl")
        @return_if_file joinpath(dir, name, "src", "$name.jl")
    end
    return nothing
end
and add a little helper:
find_package(m::Module) = find_package(string(module_name(m)))
Basically, this takes Pkg.dir() and looks in the "usual locations".
Additionally, chop in v0.6.0 doesn't take these additional arguments, which we can fix by adding
chop(s::AbstractString, m, n) = SubString(s, m+1, endof(s)-n)
Also, if you're not on Unix, you might want to care about the definitions of isfile_casesensitive above the linked code.
And if you're not so concerned about corner cases, maybe this is enough or can serve as a basis:
function modulepath(m::Module)
    name = string(module_name(m))
    Pkg.dir(name, "src", "$name.jl")
end
julia> Pkg.dir("DataStructures")
"/home/liso/.julia/v0.7/DataStructures"
Edit: I now realized that you want to use a Module object!
julia> m = DataStructures
julia> Pkg.dir(repr(m))
"/home/liso/.julia/v0.7/DataStructures"
Edit 2: I am not sure whether you are trying to find the path to the module or to an object defined in the module (I hope that parsing the path from the next result is easy):
julia> repr(which(DataStructures.eval, (String,)))
"eval(x) in DataStructures at /home/liso/.julia/v0.7/DataStructures/src/DataStructures.jl:3"
What does the following declaration mean? More specifically, the ':P' part.
How do I read similar declarations in a GNU makefile?
BUILD_TOOL = ${toolname:P}
That's a BSD make feature:
:P The path of the node which has the same name as the variable
is the value. If no such node exists or its path is null, then
the name of the variable is used. In order for this modifier to
work, the name (node) must at least have appeared on the rhs of
a dependency.
There is nothing similar to that in GNU make; you'll have to set the variable directly and use it in both places (target and variable etc.)
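If what you are after is the ':P'-style behaviour (resolve a name to the path where it was found, falling back to the bare name), something along these lines can approximate it in GNU make; the directory list and tool name here are made up for illustration:

TOOL_DIRS  := tools/bin build/bin          # directories you would otherwise search
BUILD_TOOL := $(firstword $(wildcard $(addsuffix /toolname,$(TOOL_DIRS))))
BUILD_TOOL := $(if $(BUILD_TOOL),$(BUILD_TOOL),toolname)   # fall back to the plain name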
I've run into a difficulty in defining generic rules for a non-recursive make system.
Background
For further reading, rather than me reproducing too much existing material, see this earlier question, which covers the ground pretty well, and previously helped me when constructing this system.
For the make system I am constructing, I want to define dependencies between system components - e.g. component A depends on component B - and then leave the make system to ensure that any products of the B build process are built before they will be needed by build steps for A. It's slightly wasteful due to the granularity (some unneeded intermediates may be built), but for my use case it meets a comfortable balance point between ease-of-use and build performance.
A difficulty the system must deal with is that the order of makefile loading cannot be controlled - indeed, should not matter. However because of this, a component defined in an early-loaded makefile may depend on a component defined in a not-yet-read makefile.
To allow common patterns to be applied across all the component-defining makefiles, each component uses variables such as: $(component_name)_SRC. This is a common solution for non-recursive (but recursively included) make systems.
For information on GNU make's different types of variables, see the manual. In summary: simply expanded variables (SEVs) are expanded as a makefile is read, exhibiting behaviour similar to an imperative programming language; recursively expanded variables (REVs) are expanded during make's second phase, after all the makefiles have been read.
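A two-line illustration of the difference:

x   := hello
sev := $(x)    # simply expanded: captures "hello" right now
rev  = $(x)    # recursively expanded: re-reads $(x) every time it is used
x   := goodbye
# At this point $(sev) is still "hello", while $(rev) expands to "goodbye".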
The Problem
The specific issue arises when trying to turn a list of depended-on components into a list of the files those components represent.
I've distilled my code down to this runnable example which leaves out a lot of the detail of the real system. I think this is sufficiently simple to demonstrate the issue without losing its substance.
rules.mk:
$(c)_src := $(src)
$(c)_dependencies := $(dependencies)
### This is the interesting line:
$(c)_dependencies_src := $(foreach dep, $($(c)_dependencies), $($(dep)_src))
$(c) : $($(c)_src) $($(c)_dependencies_src)
	@echo $^
Makefile:
.PHONY: foo_a.txt foo_b.txt bar_a.txt hoge_a.txt
### Bar
c := bar
src := bar_a.txt
dependencies :=
include rules.mk
### Foo
c := foo
src := foo_a.txt foo_b.txt
dependencies := bar hoge
include rules.mk
### Hoge
c := hoge
src := hoge_a.txt
dependencies := bar
include rules.mk
These will run to give:
$ make foo
foo_a.txt foo_b.txt bar_a.txt
$
hoge_a.txt is not included in the output because, at the time foo_dependencies_src is defined as a SEV, hoge_src doesn't exist yet.
Expansion after all the makefiles have been read is a problem REVs should be able to solve and I did previously try defining $(c)_dependencies_src as a REV, but that doesn't work either because $(c) is then expanded at substitution time, not definition time, so it no longer holds the correct value.
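To spell out the failure mode with the names from the example above:

c := foo
$(c)_dependencies_src = $(foreach dep,$($(c)_dependencies),$($(dep)_src))   # the REV attempt
c := hoge    # the next component's include of rules.mk reassigns c ...
# ... so when make finally expands foo_dependencies_src in its second phase,
# the inner $(c) yields "hoge", not "foo", and the wrong lists are used.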
In case anyone is wondering why I am not using target-specific variables, I am concerned that the application of the variable to all the prerequisites of the target described in the manual will cause an unwanted interaction between the rules for different components.
I'd like to know:
Is there a solution to this specific issue? (i.e. is there a simple way to make that line achieve what I want it to?)
Is there a more typical way of building a make system like this? (i.e. a single make instance, loading components from multiple makefiles and defining dependencies between those components.)
If there are multiple solutions, what are the trade-offs between them?
A final comment: as I've written this question, I've begun to realise that there might be a solution using eval to construct a REV definition. However, as I couldn't find this problem covered anywhere else on SO, I thought it worthwhile asking anyway for the sake of future searchers, plus I'd like to hear more experienced users' thoughts on this or any other approaches.
The short answer is that there's no good solution for what you are asking. It's not possible to stop expansion of a variable partway through and defer the rest until later. Not only that, but because you use the variable in a prerequisite list, even if you could get the value of the $(c)_dependencies_src variable to contain just the variable references you wanted, in the very next line they would be completely expanded as part of the prerequisite list, so it wouldn't gain you anything.
There's only one way to postpone the expansion of prerequisites and that's to use the secondary expansion feature. You would have to do something like:
$(c)_src := $(src)
$(c)_dependencies := $(dependencies)
.SECONDEXPANSION:
$(c) : $($(c)_src) $$(foreach dep, $$($$@_dependencies), $$($$(dep)_src))
	@echo $^
(untested). This side-steps the issue with $(c)_dependencies_src by just not defining it at all and putting it into the prerequisite list directly, but as a secondary expansion.
As I wrote in my comment above, though, I personally would not design a system that worked like this. I prefer systems where all the variables are created up-front using a namespace (typically prepending the target name), and then at the end, after all variables have been defined, a single "rules.mk" or similar is included that uses all those variables to construct the rules, most likely (unless all your recipes are very simple) using eval.
So, something like:
targets :=
### Bar
targets += bar
bar_c := bar
bar_src := bar_a.txt
bar_dependencies :=
### Foo
targets += foo
foo_c := foo
foo_src := foo_a.txt foo_b.txt
foo_dependencies := bar hoge
### Hoge
targets += hoge
hoge_c := hoge
hoge_src := hoge_a.txt
hoge_dependencies := bar
# Now build all the rules
include rules.mk
And then in rules.mk you would see something like:
define make_c
$1 : $($1_src) $(foreach dep, $($1_dependencies), $($(dep)_src))
	@echo $$^
endef

$(foreach T,$(targets),$(eval $(call make_c,$T)))
You can even get rid of the settings for targets if you are careful about your variable names, by adding something like this to rules.mk:
targets := $(patsubst %_c,%,$(filter %_c,$(.VARIABLES)))
In order to allow the same components in different targets you'd just need to add more to the namespace to differentiate the two different components.
Is there a way to print the unexpanded definition of a recursive variable? I have a complicated build system, and a user can set some values. I'd like to echo the user definition to another file, for later use.
For example,
externals = $(HOME)/externals
all:
	echo $(externals)
doesn't work, because it echoes using the current definition of HOME. I'd like it to echo
the literal string $(HOME)/externals without expanding $(HOME).
The value function is probably what you want here.
8.8 The value Function
The value function provides a way for you to use the value of a variable without having it expanded. Please note that this does not undo expansions which have already occurred; for example if you create a simply expanded variable its value is expanded during the definition; in that case the value function will return the same result as using the variable directly.
The syntax of the value function is:
$(value variable)
Note that variable is the name of a variable, not a reference to that variable. Therefore you would not normally use a '$' or parentheses when writing it. (You can, however, use a variable reference in the name if you want the name not to be a constant.)
The result of this function is a string containing the value of variable, without any expansion occurring. For example, in this makefile:
FOO = $PATH
all:
	@echo $(FOO)
	@echo $(value FOO)
The first output line would be ATH, since the '$P' would be expanded as a make variable, while the second output line would be the current value of your $PATH environment variable, since the value function avoided the expansion.
The value function is most often used in conjunction with the eval function (see Eval Function).
Though in addition to this you are going to need to use single quotes on that echo line or the shell will expand things on you.
$ cat Makefile
externals = $(HOME)/externals
all:
	echo $(externals)
allv:
	echo $(value externals)
allvq:
	echo '$(value externals)'
$ make all
echo /home/user/externals
/home/user/externals
$ make allv
echo $(HOME)/externals
/bin/sh: HOME: command not found
/externals
$ make allvq
echo '$(HOME)/externals'
$(HOME)/externals
I'm trying to teach myself the basics of Bourne shell scripting using a textbook I borrowed from the library, and I'm working through the questions at the end of each chapter. However, I just got to one and I'm stumped...
Write a script that takes zero or more arguments and prints the last argument in the list. For example, given the argument 'myProgram arg1 arg2 arg3', the output would be 'arg3'.
Could anyone give me some advice on how to set this one up? I'm trying to review the section on user input and arguments, but I haven't worked with that much so far, so I don't have much practice yet.
echo ${!#} # bash only
eval echo \${$#} # sh-compatible
Explanation
The number of arguments is $#. Variables can be accessed indirectly via ${!VAR}. For example:
$ VAR="PATH"
$ echo ${!VAR}
/sbin:/bin:/usr/sbin:/usr/bin
Put those together and if we have a variable $n containing an integer we can access the $nth command-line argument with ${!n}. Or instead of $n let's use $#; the last command-line argument is ${!#}!
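For example, as a tiny script (last.sh is just a hypothetical name):

#!/bin/bash
# last.sh -- print the last command-line argument
echo "${!#}"

$ ./last.sh arg1 arg2 arg3
arg3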
Additionally, this can be more longwindedly written using array slicing ($@ is an array holding all the command-line arguments) as:
echo ${@:$#:$#}
Oddly, you cannot use an array index:
# Does not work
echo ${@[$#]}
I'll just give you some pointers. Since you want to learn bash, you probably don't just want a piece of code that does what the question asks:
1) Do you know how to count how many arguments your bash function has?
2) Do you know how to loop?
3) Do you know how to "pop" one of the arguments?
4) Do you know how to print out the first argument?
If you put all that together, I bet you'll come up with it.
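(If you try all that and still get stuck, here is one possible sh-compatible shape built from those pointers -- spoiler alert:)

#!/bin/sh
# Count the arguments; while more than one remains, pop ("shift") the first;
# whatever is then left in $1 is the last argument (empty if none were given).
while [ "$#" -gt 1 ]; do
    shift
done
echo "$1"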