I've been trying to get this to work with quote, quote-splicing, eval, and whatever else I can think of, but no luck so far. I understand why it doesn't work - it's being seen as a map, and it's trying to eval a, b, and c - just not how to get around it.
(def destructor {a :a b :b c :c})
; CompilerException java.lang.RuntimeException: Unable to resolve symbol: a in this context, compiling:(:1:15)
(let [destructor my-map]
'etc)
I have a rather involved destructuring map that I'm considering using several times, so it seemed a good idea to tuck it away somewhere. Maybe there are better ways to go about it?
Good idea, but you can't do it in quite this way, because the thing you want to store is not really a value, so you can't store it in a var.
Instead, you can define a macro that includes this in its expansion:
(defmacro with-abc [abc & body]
  `(let [~'{:keys [a b c]} ~abc]
     ~@body))
(with-abc foo
(...use a, b, and c...))
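For reference, a quick macroexpand-1 at a REPL (assuming some var foo holding a map) should show something like the following; the ~' trick is what keeps a, b, and c from being namespace-qualified by the syntax-quote:
(macroexpand-1 '(with-abc foo (println a)))
;=> (clojure.core/let [{:keys [a b c]} foo] (println a))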
Something like @amalloy's answer was my first instinct too and it's probably the way to go. That said, it might be worth considering a plain ol' higher-order function:
(defn destruct
  [{a :a b :b c :c} f]
  (f a b c))

(destruct my-map
  (fn [a b c]
    (println a)
    (println b)
    (println c)))
It's a little noisier and you're forced to name the bindings every time, but you avoid potential hygiene issues and, depending on your level of comfort with metaprogramming, it's a little easier to put together.
You'll need to quote the destructuring pattern
(def my-destructor '{a :a b :b c :c})
You can do this with levels of quoting/unquoting, but it is easier to see with a little helper function.
(defn- with-destructor* [binding & body]
  `(let ~binding ~@body))
This eval occurs at "compile" time (during macro expansion)
(defmacro with-destructor [[destructor form] & body]
  (eval `(with-destructor* [~destructor '~form] '~@body)))
As shown
(macroexpand-1 '(with-destructor [my-destructor {:a 1 :c 3}] (+ a c)))
;=> (clojure.core/let [{a :a, b :b, c :c} {:a 1, :c 3}] (+ a c))
Result
(with-destructor [my-destructor {:a 1 :c 3}] (+ a c))
;=> 4
Related
I want to convert the clojure.lang.PersistentArrayMap {a 1, b 2} into the clojure.lang.PersistentVector [a 1 b 2] in Clojure.
I have tried to write a function in Clojure which converts {a 1, b 2} into [a 1 b 2], and I have also written a macro which gives me the expected end result. However, in Clojure we cannot pass values generated inside functions to macros, so I wanted to know how I can implement a macro directly which converts {a 1, b 2} into (let [a 1 b 2] (println a)), which will return 1.
Dummy Macro:
(defmacro mymacro [binding & body]
  ---some implementation---)
Execution:
(mymacro '{a 1, b 2} (println a))
Output:
1
nil
My Implementation:
A function which converts the map into the desired output:
(defn myfn [x]
(let [a (into (vector) x) b (vec (mapcat vec a))] b))
Execution:
(myfn '{a 1, b 2})
Output:
[a 1 b 2]
MACRO:
(defmacro list-let [bindings & body] `(let ~(vec bindings) ~@body))
Execution:
(list-let [a 1 b 2] (println a))
Output:
1
nil
I wanted to know how I can implement the same thing inside the macro itself, avoiding the separate function, to get the required output, something like the dummy macro given above. I am also interested in knowing whether there is any way through which I can pass the value from my function to the macro without using (def).
In general, macro code is plain Clojure code (with the difference that it returns Clojure code to be evaluated later). So almost anything you can think of coding in Clojure, you can do inside a macro to the arguments passed.
For example, here is the thing you're trying to do (if I understood you correctly):
(defmacro map-let [binding-map & body]
  (let [b-vec (reduce into [] binding-map)]
    `(let ~b-vec ~@body)))

(map-let {a 10 b 20}
  (println a b)
  (+ a b))
;; prints: 10 20
;;=> 30
or like this:
(defmacro map-let [binding-map & body]
  `(let ~(reduce into [] binding-map) ~@body))
or even like this:
(defmacro map-let [binding-map & body]
  `(let [~@(apply concat binding-map)] ~@body))
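Whichever variant you pick, a macroexpand-1 at a REPL should show the map flattened into an ordinary binding vector, something like:
(macroexpand-1 '(map-let {a 10 b 20} (+ a b)))
;=> (clojure.core/let [a 10 b 20] (+ a b))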
You don't need a macro for this, and you should always prefer functions over macros when possible.
For your particular case, I have already written a function keyvals which you may find handy:
(keyvals m)
"For any map m, returns the keys & values of m as a vector,
suitable for reconstructing via (apply hash-map (keyvals m))."
(keyvals {:a 1 :b 2})
;=> [:b 2 :a 1]
(apply hash-map (keyvals {:a 1 :b 2}))
;=> {:b 2, :a 1}
And, here are the full API docs.
If you are curious about the implementation, it is very simple:
(s/defn keyvals :- [s/Any]
  "For any map m, returns the (alternating) keys & values of m as a vector,
   suitable for reconstructing m via (apply hash-map (keyvals m)).
   (keyvals {:a 1 :b 2}) => [:a 1 :b 2]"
  [m :- tsk/Map]
  (reduce into [] (seq m)))
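If you don't want the tupelo/Schema dependency, the same idea in plain clojure.core is just as short; a minimal sketch (plain-keyvals is a hypothetical name):
(defn plain-keyvals
  "Returns the alternating keys & values of map m as a vector."
  [m]
  (reduce into [] (seq m)))

(plain-keyvals {:a 1 :b 2})
;=> [:a 1 :b 2]   ; pair order depends on the map's internal ordering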
I know this is a recurring question (here, here, and more), and I know that the problem is related to creating lazy sequencies, but I can't see why it fails.
The problem: I had written a (not very nice) quicksort algorithm to sort strings that uses loop/recur. But applied to 10000 elements, I get a StackOverflowError:
(defn qsort [list]
  (loop [[current & todo :as all] [list] sorted []]
    (cond
      (nil? current) sorted
      (or (nil? (seq current)) (= (count current) 1)) (recur todo (concat sorted current))
      :else (let [[pivot & rest] current
                  pred #(> (compare pivot %) 0)
                  lt (filter pred rest)
                  gte (remove pred rest)
                  work (list* lt [pivot] gte todo)]
              (recur work sorted)))))
I used it in this way:
(defn tlfnum [] (str/join (repeatedly 10 #(rand-int 10))))
(defn tlfbook [n] (repeatedly n #(tlfnum)))
(time (count (qsort (tlfbook 10000))))
And this is part of the stack trace:
[clojure.lang.LazySeq seq "LazySeq.java" 49]
[clojure.lang.RT seq "RT.java" 521]
[clojure.core$seq__4357 invokeStatic "core.clj" 137]
[clojure.core$concat$fn__4446 invoke "core.clj" 706]
[clojure.lang.LazySeq sval "LazySeq.java" 40]
[clojure.lang.LazySeq seq "LazySeq.java" 49]
[clojure.lang.RT seq "RT.java" 521]
[clojure.core$seq__4357 invokeStatic "core.clj" 137]]}
As far as I know, loop/recur performs tail call optimization, so no stack is used (it is, in fact, an iterative process written using recursive syntax).
Reading other answers, and because of the stack trace, I see there's a problem with concat and adding a doall before concat solves the stack overflow problem. But... why?
Here's part of the code for the two-arity version of concat.
(defn concat [x y]
  (lazy-seq
    (let [s (seq x)]
      ,,,)))
Notice that it uses two other functions, lazy-seq and seq. lazy-seq is a bit like a lambda: it wraps some code without executing it yet. The code inside the lazy-seq block has to result in some kind of sequence value. When you call any sequence operation on the lazy-seq, it will first evaluate the code ("realize" the lazy seq), and then perform the operation on the result.
(def lz (lazy-seq
          (println "Realizing!")
          '(1 2 3)))
(first lz)
;; prints "Realizing!"
;; => 1
Now try this:
(defn lazy-conj [xs x]
  (lazy-seq
    (println "Realizing" x)
    (conj (seq xs) x)))
Notice that it's similar to concat: it calls seq on its first argument and returns a lazy-seq.
(def up-to-hundred
(reduce lazy-conj () (range 100)))
(first up-to-hundred)
;; prints "Realizing 99"
;; prints "Realizing 98"
;; prints "Realizing 97"
;; ...
;; => 99
Even though you asked for only the first element, it still ended up realizing the whole sequence. That's because realizing the outer "layer" results in calling seq on the next "layer", which realizes another lazy-seq, which again calls seq, etc. So it's a chain reaction that realizes everything, and each step consumes a stack frame.
(def up-to-ten-thousand
(reduce lazy-conj () (range 10000)))
(first up-to-ten-thousand)
;;=> java.lang.StackOverflowError
You get the same problem when stacking concat calls. That's why, for instance, (reduce concat ,,,) is always a smell; instead you can use (apply concat ,,,) or (into () cat ,,,).
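A rough sketch of the difference (the exact element count at which the stack blows up will vary with JVM settings):
;; reduce builds thousands of nested lazy layers; realizing the first
;; element has to unwind them all and will typically overflow the stack:
(first (reduce concat () (map vector (range 10000))))
;=> StackOverflowError (typically)

;; apply concat produces a single, flat lazy concatenation instead:
(first (apply concat (map vector (range 10000))))
;=> 0

;; into with the cat transducer does the whole thing eagerly:
(first (into [] cat (map vector (range 10000))))
;=> 0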
Other lazy operators like filter and map can exhibit the exact same problem. If you really have a lot of transformation steps over a sequence consider using transducers instead.
;; without transducers: many intermediate lazy seqs and deep call stacks
(->> my-seq
     (map foo)
     (filter bar)
     (map baz)
     ,,,)

;; with transducers: seq processed in a single pass
(sequence (comp (map foo)
                (filter bar)
                (map baz))
          my-seq)
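If you don't need a lazy result at all, the same transducer stack can be run eagerly with into (foo, bar, baz, and my-seq stand in for your own functions and data, as above):
(into []
      (comp (map foo)
            (filter bar)
            (map baz))
      my-seq)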
Arne had a good answer (and, in fact, I'd never noticed cat before!). If you want a simpler solution, you can use the glue function from the Tupelo library:
Gluing Together Like Collections
The concat function can sometimes have rather surprising results:
(concat {:a 1} {:b 2} {:c 3} )
;=> ( [:a 1] [:b 2] [:c 3] )
In this example, the user probably meant to merge the 3 maps into one. Instead, the three maps were mysteriously converted into length-2 vectors, which were then nested inside another sequence.
The conj function can also surprise the user:
(conj [1 2] [3 4] )
;=> [1 2 [3 4] ]
Here the user probably wanted to get [1 2 3 4] back, but instead got a nested vector by mistake.
Instead of having to wonder if the items to be combined will be merged, nested, or converted into another data type, we provide the glue function to always combine like collections together into a result collection of the same type:
; Glue together like collections:
(is (= (glue [ 1 2] '(3 4) [ 5 6] ) [ 1 2 3 4 5 6 ] )) ; all sequential (vectors & lists)
(is (= (glue {:a 1} {:b 2} {:c 3} ) {:a 1 :c 3 :b 2} )) ; all maps
(is (= (glue #{1 2} #{3 4} #{6 5} ) #{ 1 2 6 5 3 4 } )) ; all sets
(is (= (glue "I" " like " \a " nap!" ) "I like a nap!" )) ; all text (strings & chars)
; If you want to convert to a sorted set or map, just put an empty one first:
(is (= (glue (sorted-map) {:a 1} {:b 2} {:c 3}) {:a 1 :b 2 :c 3} ))
(is (= (glue (sorted-set) #{1 2} #{3 4} #{6 5}) #{ 1 2 3 4 5 6 } ))
An Exception will be thrown if the collections to be 'glued' are not all of the same type. The allowable input types are:
all sequential: any mix of lists & vectors (vector result)
all maps (sorted or not)
all sets (sorted or not)
all text: any mix of strings & characters (string result)
I put glue into your code instead of concat and still got a StackOverflowError. So, I also replaced the lazy filter and remove with eager versions keep-if and drop-if to get this result:
(defn qsort [list]
  (loop [[current & todo :as all] [list] sorted []]
    (cond
      (nil? current) sorted
      (or (nil? (seq current)) (= (count current) 1))
        (recur todo (glue sorted current))
      :else (let [[pivot & rest] current
                  pred #(> (compare pivot %) 0)
                  lt (keep-if pred rest)
                  gte (drop-if pred rest)
                  work (list* lt [pivot] gte todo)]
              (recur work sorted)))))
(defn tlfnum [] (str/join (repeatedly 10 #(rand-int 10))))
(defn tlfbook [n] (repeatedly n #(tlfnum)))
(def result
(time (count (qsort (tlfbook 10000)))))
-------------------------------------
Clojure 1.8.0 Java 1.8.0_111
-------------------------------------
"Elapsed time: 1377.321118 msecs"
result => 10000
(def m {:a 1 :b 2 :c 3})
Let's say I want each value in m to be incremented. The only way I can think of to do that is
(into {}
      (map
        (fn [[key val]]
          [key (inc val)])
        m))
Is there a better way to do this? I need to do this a lot in my code and it looks kind of hacky. I really do need to use a map here (mostly for O(1) lookups; the key will be a UUID and the value a map), not a vector or a list.
Found something that looks good here: http://clojuredocs.org/clojure.core/reduce-kv.
(defn update-map [m f]
  (reduce-kv (fn [m k v]
               (assoc m k (f v)))
             {} m))
Then you can do
(update-map {:a 1 :b 2} inc)
to get
{:a 2 :b 3}
If needed, you can supply k to f, or make an update-key-values function that takes two functions f and g and applies them to the keys and values respectively.
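For example, here is a sketch of that variant (update-key-values is a hypothetical name); note that Clojure 1.11+ also ships update-vals and update-keys in clojure.core for the single-function cases:
(defn update-key-values
  "Applies f to every key and g to every value of map m."
  [m f g]
  (reduce-kv (fn [acc k v]
               (assoc acc (f k) (g v)))
             {}
             m))

(update-key-values {:a 1 :b 2} name inc)
;=> {"a" 2, "b" 3}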
I am new to Clojure and attempting to figure out how to do this.
I want to create a new hash-map that for a subset of the keys in the hash-map applies a function to the elements. What is the best way to do this?
(let [my-map {:hello "World" :try "This" :foo "bar"}]
  (println (doToMap my-map [:hello :foo] (fn [k] (.toUpperCase k)))))
This should then result a map with something like
{:hello "WORLD" :try "This" :foo "BAR"}
(defn do-to-map [amap keyseq f]
(reduce #(assoc %1 %2 (f (%1 %2))) amap keyseq))
Breakdown:
It helps to look at it inside-out. In Clojure, hash-maps act like functions; if you call them like a function with a key as an argument, the value associated with that key is returned. So given a single key, the current value for that key can be obtained via:
(some-map some-key)
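For example, with a map literal:
({:a 1 :b 2} :a)
;=> 1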
We want to take old values, and change them to new values by calling some function f on them. So given a single key, the new value will be:
(f (some-map some-key))
We want to associate this new value with this key in our hash-map, "replacing" the old value. This is what assoc does:
(assoc some-map some-key (f (some-map some-key)))
("Replace" is in scare-quotes because we're not mutating a single hash-map object; we're returning new, immutable, altered hash-map objects each time we call assoc. This is still fast and efficient in Clojure because hash-maps are persistent and share structure when you assoc them.)
We need to repeatedly assoc new values onto our map, one key at a time. So we need some kind of looping construct. What we want is to start with our original hash-map and a single key, and then "update" the value for that key. Then we take that new hash-map and the next key, and "update" the value for that next key. And we repeat this for every key, one at a time, and finally return the hash-map we've "accumulated". This is what reduce does.
The first argument to reduce is a function that takes two arguments: an "accumulator" value, which is the value we keep "updating" over and over; and a single argument used in one iteration to do some of the accumulating.
The second argument to reduce is the initial value passed as the first argument to this fn.
The third argument to reduce is a collection of arguments to be passed as the second argument to this fn, one at a time.
So:
(reduce fn-to-update-values-in-our-map
initial-value-of-our-map
collection-of-keys)
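A tiny standalone example of that shape, summing numbers with an explicit accumulator function, initial value, and collection:
(reduce (fn [acc x] (+ acc x)) 0 [1 2 3 4])
;=> 10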
fn-to-update-values-in-our-map is just the assoc statement from above, wrapped in an anonymous function:
(fn [map-so-far some-key] (assoc map-so-far some-key (f (map-so-far some-key))))
So plugging it into reduce:
(reduce (fn [map-so-far some-key] (assoc map-so-far some-key (f (map-so-far some-key))))
amap
keyseq)
In Clojure, there's a shorthand for writing anonymous functions: #(...) is an anonymous fn consisting of a single form, in which %1 is bound to the first argument to the anonymous function, %2 to the second, etc. So our fn from above can be written equivalently as:
#(assoc %1 %2 (f (%1 %2)))
This gives us:
(reduce #(assoc %1 %2 (f (%1 %2))) amap keyseq)
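Applied to the map from the question, this gives:
(do-to-map {:hello "World" :try "This" :foo "bar"}
           [:hello :foo]
           (fn [v] (.toUpperCase v)))
;=> {:hello "WORLD", :try "This", :foo "BAR"}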
(defn doto-map [m ks f & args]
(reduce #(apply update-in %1 [%2] f args) m ks))
Example call
user=> (doto-map {:a 1 :b 2 :c 3} [:a :c] + 2)
{:a 3, :b 2, :c 5}
Hope this helps.
The following seems to work:
(defn doto-map [ks f amap]
  (into amap
        (map (fn [[k v]] [k (f v)])
             (filter (fn [[k v]] (ks k)) amap))))
user=> (doto-map #{:hello :foo} (fn [k] (.toUpperCase k)) {:hello "World" :try "This" :foo "bar"})
{:hello "WORLD", :try "This", :foo "BAR"}
There might be a better way to do this. Perhaps someone can come up with a nice one-liner :)
In Common Lisp I've noticed that I can write this:
(defun foo (&key (a 1) (b 2) (c (+ a b))) (print (+ a b c)))
And when I call (foo), 6 is printed. So the argument c can refer to values set for a and b. But I can't seem to find a way to do something similar with defstruct. Something like:
CL-USER> (defstruct thing a b c)
THING
CL-USER> (setq q (make-thing :a 1 :b 2 :c (+ a b)))
; Evaluation aborted
CL-USER> (setq q (make-thing :a 1 :b 2 :c (+ :a :b)))
; Evaluation aborted
Is there a way to do this?
You can do this using the :constructor option of defstruct.
CL-USER> (defstruct (thing
(:constructor make-thing (&key a b (c (+ a b)))))
a b c)
THING
CL-USER> (make-thing :a 1 :b 2)
#S(THING :A 1 :B 2 :C 3)
For more info, see CLHS entry for defstruct.
Not that way. But using Lisp reader tricks you can do:
(make-thing :a #1=1 :b #2=2 :c (+ #1# #2#))
Otherwise use defclass and specialize the generic function shared-initialize.
Note that these reader macros will reference the form, not the result of evaluating it. Thus
(make-thing :id #1=(generate-unique-id) :my-id-is #1#)
will make a thing with two distinct calls to generate-unique-id.