Clojure: summing values in a collection of maps - collections

I am trying to sum up values of a collection of maps by their common keys. I have this snippet:
(def data [{:a 1 :b 2 :c 3} {:a 1 :b 2 :c 3}])
(for [xs data] (map xs [:a :b]))
((1 2) (1 2))
Final result should be ==> (2 4)
Basically, I have a list of maps. Then I perform a list comprehension to take only the keys I need.
My question now is: how can I sum up those values? I tried to use "reduce", but it works only over sequences, not over collections.
Thanks.
=== EDIT ===
Using the suggestion from Joost I came out with this:
(apply merge-with + (for [x data] (select-keys x [:col0 :col1 :col2])))
This iterates over the collection and sums the chosen keys. The "select-keys" part I added is needed to avoid trouble when the maps in the collection contain non-numeric literals, not only numbers.
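For instance, with the keys from the original example above (just an illustrative sketch), the same pattern yields a single map of sums for the chosen keys:
(def data [{:a 1 :b 2 :c 3} {:a 1 :b 2 :c 3}])
(apply merge-with + (for [x data] (select-keys x [:a :b])))
;;=> {:a 2, :b 4}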

If you really want to sum the values of the common keys you can do the whole transformation in one step:
(apply merge-with + data)
=> {:a 2, :b 4, :c 6}
To sum the sub-sequences you already have:
(apply map + '((1 2) (1 2)))
=> (2 4)
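And if you want the flat (2 4) from the question, you can feed the output of your comprehension straight into that (a sketch using the data defined above):
(apply map + (for [xs data] (map xs [:a :b])))
;;=> (2 4)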

Related

Combine and do calculation two vectors return map Clojure

I got two vectors, [shoes milk shoes] and [1 3 1], and the map I want to get is {shoes 2, milk 3}. I tried to zipmap two vectors and only {shoes 1 milk 3} shows. Without loop and iterate, is there another way to do that?
You can also employ a slightly different solution for that: generate a one-entry map for each item-to-amount pair, and then merge them with +:
(let [goods '[shoes milk shoes]
      amounts [1 3 1]]
  (apply merge-with + (map hash-map goods amounts)))
;;=> {milk 3, shoes 2}
You can do that with a reduce:
build up tuples of key/value from your two lists
accumulate into a map: add the value to the value already stored for that key (or start from 0 if the key is missing, i.e. nil is passed)
(let [v1 '[shoes milk shoes]
      v2 [1 3 1]]
  (reduce
    (fn [m [k v]]
      (update m k (fnil + 0) v))
    {}
    (map vector v1 v2)))
; → {shoes 2, milk 3}
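The (fnil + 0) part is what handles the first occurrence of a key: fnil substitutes 0 for a nil argument before calling +. A quick sketch:
((fnil + 0) nil 3) ;;=> 3  (key not yet in the map)
((fnil + 0) 2 3)   ;;=> 5  (key already accumulated to 2)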
I liked the solution of #leetwinski a lot.
Actually, to solve similar problems in the future,
I would suggest first collecting the values of each key into lists, in their order of occurrence:
(defn vecs2hash
  [keys values]
  (apply merge-with concat (map (fn [k v] {k (list v)}) keys values)))
Then:
(def hm (vecs2hash '[shoes milk shoes milk] [1 4 2 3]))
hm
;; => {milk (4 3), shoes (1 2)}
Then, one could write a function dealing with each of the collected elements as you wish.
Define a new function to sum up all values in the list:
(defn sum-up-value
  [val]
  (apply + val))
Then define a helper function that applies a given function to all the value-lists:
(defn apply-to-each-value
  [fun hm]
  (apply hash-map (interleave (keys hm) (map fun (vals hm)))))
So in your case:
(apply-to-each-value sum-up-value hm)
;; {milk 7, shoes 3}
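Putting the pieces together for the original two vectors (assuming the helper names defined above):
(apply-to-each-value sum-up-value (vecs2hash '[shoes milk shoes] [1 3 1]))
;;=> {shoes 2, milk 3}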

clojure - (Another one) StackOverflow with loop/recur

I know this is a recurring question (here, here, and more), and I know that the problem is related to creating lazy sequences, but I can't see why it fails.
The problem: I had written a (not very nice) quicksort algorithm to sort strings that uses loop/recur. But when applied to 10000 elements, I get a StackOverflowError:
(defn qsort [list]
  (loop [[current & todo :as all] [list] sorted []]
    (cond
      (nil? current) sorted
      (or (nil? (seq current)) (= (count current) 1)) (recur todo (concat sorted current))
      :else (let [[pivot & rest] current
                  pred #(> (compare pivot %) 0)
                  lt (filter pred rest)
                  gte (remove pred rest)
                  work (list* lt [pivot] gte todo)]
              (recur work sorted)))))
I used it in this way:
(defn tlfnum [] (str/join (repeatedly 10 #(rand-int 10))))
(defn tlfbook [n] (repeatedly n #(tlfnum)))
(time (count (qsort (tlfbook 10000))))
And this is part of the stack trace:
[clojure.lang.LazySeq seq "LazySeq.java" 49]
[clojure.lang.RT seq "RT.java" 521]
[clojure.core$seq__4357 invokeStatic "core.clj" 137]
[clojure.core$concat$fn__4446 invoke "core.clj" 706]
[clojure.lang.LazySeq sval "LazySeq.java" 40]
[clojure.lang.LazySeq seq "LazySeq.java" 49]
[clojure.lang.RT seq "RT.java" 521]
[clojure.core$seq__4357 invokeStatic "core.clj" 137]]}
As far as I know, loop/recur performs tail call optimization, so no stack is used (it is, in fact, an iterative process written using recursive syntax).
Reading other answers, and because of the stack trace, I see there's a problem with concat, and adding a doall before concat solves the stack overflow problem. But... why?
Here's part of the code for the two-arity version of concat.
(defn concat [x y]
  (lazy-seq
    (let [s (seq x)]
      ,,,)))
Notice that it uses two other functions, lazy-seq and seq. lazy-seq is a bit like a lambda: it wraps some code without executing it yet. The code inside the lazy-seq block has to result in some kind of sequence value. When you call any sequence operation on the lazy-seq, it will first evaluate the code ("realize" the lazy seq), and then perform the operation on the result.
(def lz (lazy-seq
(println "Realizing!")
'(1 2 3)))
(first lz)
;; prints "Realizing!"
;; => 1
Now try this:
(defn lazy-conj [xs x]
  (lazy-seq
    (println "Realizing" x)
    (conj (seq xs) x)))
Notice that it's similar to concat: it calls seq on its first argument and returns a lazy-seq.
(def up-to-hundred
(reduce lazy-conj () (range 100)))
(first up-to-hundred)
;; prints "Realizing 99"
;; prints "Realizing 98"
;; prints "Realizing 97"
;; ...
;; => 99
Even though you asked for only the first element, it still ended up realizing the whole sequence. That's because realizing the outer "layer" results in calling seq on the next "layer", which realizes another lazy-seq, which again calls seq, etc. So it's a chain reaction that realizes everything, and each step consumes a stack frame.
(def up-to-ten-thousand
(reduce lazy-conj () (range 10000)))
(first up-to-ten-thousand)
;;=> java.lang.StackOverflowError
You get the same problem when stacking concat calls. That's why, for instance, (reduce concat ,,,) is always a smell; instead you can use (apply concat ,,,) or (into () cat ,,,).
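A minimal sketch of the difference (the size 10000 is arbitrary): reduce stacks one lazy-seq per element, while apply concatenates everything at a single level of laziness:
(def nested (reduce concat (map list (range 10000)))) ;; 10000 nested lazy-seqs
(def flat   (apply concat  (map list (range 10000)))) ;; one level of laziness

(first flat) ;;=> 0
;; (first nested) risks a StackOverflowError, for the same reason as the qsort above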
Other lazy operators like filter and map can exhibit the exact same problem. If you really have a lot of transformation steps over a sequence consider using transducers instead.
;; without transducers: many intermediate lazy seqs and deep call stacks
(->> my-seq
     (map foo)
     (filter bar)
     (map baz)
     ,,,)
;; with transducers: seq processed in a single pass
(sequence (comp
            (map foo)
            (filter bar)
            (map baz))
          my-seq)
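If you want the result realized eagerly rather than as a lazy sequence, the same transducer can be fed to into (foo, bar, baz and my-seq are the placeholders from above):
;; collect eagerly into a vector in one pass
(into []
      (comp (map foo) (filter bar) (map baz))
      my-seq)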
Arne had a good answer (and, in fact, I'd never noticed cat before!). If you want a simpler solution, you can use the glue function from the Tupelo library:
Gluing Together Like Collections
The concat function can sometimes have rather surprising results:
(concat {:a 1} {:b 2} {:c 3} )
;=> ( [:a 1] [:b 2] [:c 3] )
In this example, the user probably meant to merge the 3 maps into one. Instead, the three maps were mysteriously converted into length-2 vectors, which were then nested inside another sequence.
The conj function can also surprise the user:
(conj [1 2] [3 4] )
;=> [1 2 [3 4] ]
Here the user probably wanted to get [1 2 3 4] back, but instead got a nested vector by mistake.
Instead of having to wonder if the items to be combined will be merged, nested, or converted into another data type, we provide the glue function to always combine like collections together into a result collection of the same type:
; Glue together like collections:
(is (= (glue [ 1 2] '(3 4) [ 5 6] ) [ 1 2 3 4 5 6 ] )) ; all sequential (vectors & lists)
(is (= (glue {:a 1} {:b 2} {:c 3} ) {:a 1 :c 3 :b 2} )) ; all maps
(is (= (glue #{1 2} #{3 4} #{6 5} ) #{ 1 2 6 5 3 4 } )) ; all sets
(is (= (glue "I" " like " \a " nap!" ) "I like a nap!" )) ; all text (strings & chars)
; If you want to convert to a sorted set or map, just put an empty one first:
(is (= (glue (sorted-map) {:a 1} {:b 2} {:c 3}) {:a 1 :b 2 :c 3} ))
(is (= (glue (sorted-set) #{1 2} #{3 4} #{6 5}) #{ 1 2 3 4 5 6 } ))
An Exception will be thrown if the collections to be 'glued' are not all of the same type. The allowable input types are:
all sequential: any mix of lists & vectors (vector result)
all maps (sorted or not)
all sets (sorted or not)
all text: any mix of strings & characters (string result)
I put glue into your code instead of concat and still got a StackOverflowError. So, I also replaced the lazy filter and remove with eager versions keep-if and drop-if to get this result:
(defn qsort [list]
  (loop [[current & todo :as all] [list] sorted []]
    (cond
      (nil? current) sorted
      (or (nil? (seq current)) (= (count current) 1))
      (recur todo (glue sorted current))
      :else (let [[pivot & rest] current
                  pred #(> (compare pivot %) 0)
                  lt (keep-if pred rest)
                  gte (drop-if pred rest)
                  work (list* lt [pivot] gte todo)]
              (recur work sorted)))))
(defn tlfnum [] (str/join (repeatedly 10 #(rand-int 10))))
(defn tlfbook [n] (repeatedly n #(tlfnum)))
(def result
(time (count (qsort (tlfbook 10000)))))
-------------------------------------
Clojure 1.8.0 Java 1.8.0_111
-------------------------------------
"Elapsed time: 1377.321118 msecs"
result => 10000

why clojure conj acts different between vector and map

> (conj [0] 1 2 3)
[0 1 2 3]
> (conj {:a "ei"} {:b "bi"})
{:b "bi", :a "ei"}
See, when it acts on a vector, it puts 1, 2, 3 at the end of it,
whereas it puts :b "bi" in front of the :a map k-v pair.
Why is this?
thanks
As with many hash map implementations, Clojure's hash maps do not sort their entries, nor retain the order in which they were inserted. This allows for better performance.
Note that conj does not have general ordering semantics either (it has ordering semantics for some concrete types, such as vectors).
You don't have to go as far as maps to get inconsistent behaviour from conj:
(conj [1] 2) ; [1 2]
(conj (list 1) 2) ; (2 1)
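For maps, the item being conjed is a key-value entry (a map entry or a two-element vector), which is another reason the result can look different from what you might expect. A small sketch:
(conj {:a 1} [:b 2]) ;=> {:a 1, :b 2}
(conj #{1 2} 3)      ;=> a set of 1, 2, 3 (printed order not guaranteed; sets have no defined order either)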
Hash maps have no defined order. But, for any map,
the seq of entries will always be the same
the vals and keys will be in consistent order.
Thus, for map m,
(= (keys m) (map key m))
(= (vals m) (map val m))
(= m (zipmap (keys m) (vals m)))
Currently, this sequence seems to be independent of insertion order. I tested this on sets by randomly shuffling random integers.
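For example, these invariants are easy to check at the REPL (a quick sketch with a small literal map):
(def m {:a 1, :b 2, :c 3})
(= (keys m) (map key m))         ;=> true
(= (vals m) (map val m))         ;=> true
(= m (zipmap (keys m) (vals m))) ;=> true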

Idiomatic way to iterate over a map, 'changing' each value and returning a map?

(def m {:a 1 :b 2 :c 3})
Let's say I want each value in m to be incremented. The only way I can think of to do that is
(into {}
      (map
        (fn [[key val]]
          [key (inc val)])
        m))
Is there a better way to do this? I need to do this a lot in my code and it looks kind of hacky. I really do need to use a map here (mostly for O(1) lookups, the key will be a UUID and the value a map), not a vector or a list.
Found something that looks good here: http://clojuredocs.org/clojure.core/reduce-kv.
(defn update-map [m f]
  (reduce-kv (fn [m k v]
               (assoc m k (f v)))
             {}
             m))
Then you can do
(update-map {:a 1 :b 2} inc)
to get
{:a 2 :b 3}
If needed, you can supply k to f, or make an update-key-values function that takes two functions f and g and applies them to the keys and values respectively.
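A sketch of such an update-key-values (the name and signature are just illustrative). Also, if you are on Clojure 1.11 or later, clojure.core/update-vals already does the value-only version directly:
(defn update-key-values
  "Applies f to every key and g to every value of m."
  [m f g]
  (reduce-kv (fn [acc k v]
               (assoc acc (f k) (g v)))
             {}
             m))

(update-key-values {:a 1 :b 2} name inc)
;;=> {"a" 2, "b" 3}

;; Clojure 1.11+: value-only version built in
(update-vals {:a 1 :b 2} inc)
;;=> {:a 2, :b 3}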

Count elements in a vector of vectors in clojure

If I have a matrix defined as:
(def m1 [[1 2 3][4 5 6][7 8 9]])
How do I go about counting the vectors within the vector in Clojure? I know that (count m1) will return 3, which is the number of vectors I have in the initial vector, but I can't remember how to count the inner vectors (it's been a very, very long time since I've had to deal with any Lisp dialect). Also, I do not want to flatten the vector and then count it, because I need to count the values separately (i.e. I want to return 3, 3, 3 because each of the inner vectors has 3 elements). One last restriction, I guess, is that I want to do this without using map right away, because I realized I can simply do (map count m1).
That's actually very simple, just call:
(map count m1)
Or if you want to have your result also in vector:
(mapv count m1)
You'll want to use map. It will apply count to each element in the vector and return a list of counts.
(def m1 [[1 2 3][4 5 6][7 8 9]])
(map count m1)
=> (3 3 3)
Your edit: "I want to do this without using map."
(defn counts [vs]
  (loop [vs vs, cs []]
    (if (empty? vs)
      cs
      (recur (rest vs), (conj cs (count (first vs)))))))
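Used on the matrix from the question:
(counts [[1 2 3] [4 5 6] [7 8 9]])
;;=> [3 3 3]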
One answer is quite good, but it uses count. Assuming you can use neither count nor map:
((fn [lst]
   (loop [l lst, n 0]
     (if (empty? l)
       n
       (recur (rest l) (inc n)))))
 '(1 2 3))
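To connect this back to the question, you could reuse that counting logic per inner vector with a reduce (a sketch that still avoids both count and map):
(defn length [coll]
  (loop [l coll, n 0]
    (if (empty? l)
      n
      (recur (rest l) (inc n)))))

(reduce (fn [acc v] (conj acc (length v))) [] [[1 2 3] [4 5 6] [7 8 9]])
;;=> [3 3 3]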
