Say that I have a function:
(defn get-token [char]
  (defn char->number? []
    (re-matches #"\d" (str char)))
  (defn whitespace? []
    (or (= \space char)
        (= \newline char)))
  (defn valid-ident-char? []
    (re-matches #"[a-zA-Z_$]" (str char)))
  (cond
    (whitespace?) nil
    (= \' char) :quote
    (= \) char) :rparen
    (= \( char) :lparen
    (char->number?) :integer
    (valid-ident-char?) :identifier
    :else (throw (Exception. "invalid character"))))
When I map this function over the string "(test 1 2)", for instance, I get a list of keywords, one per character:
'(:lparen :identifier :identifier :identifier :identifier nil :integer nil :integer :rparen)
Seeing that this is not entirely what I want, I am trying to write a function that takes a collection and "condenses" the collection to combine adjacent elements that are equal.
The final usage might look like this:
(defn combine-adjacent [coll]
  ;; implementation...
  )

(->> "(test 1 2)"
     (map get-token)
     (combine-adjacent)
     (remove nil?))
; => (:lparen :identifier :integer :integer :rparen)
What is the idiomatic Clojure way to achieve this?
Clojure 1.7 will introduce a new function called dedupe to accomplish exactly this:
(dedupe [0 1 1 2 2 3 1 2 3])
;= (0 1 2 3 1 2 3)
If you're prepared to use 1.7.0-alpha2, you could use it today.
The implementation relies on transducers (dedupe produces a transducer when called with no arguments; the unary overload is defined simply as (sequence (dedupe) coll)), so it wouldn't be straightforward to backport.
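For example, assuming the get-token function from the question, the whole pipeline could be expressed as a single transduction (a sketch for 1.7+; strings are seqable, so sequence can consume one directly):
;; dedupe with no args returns a transducer; composed with the map and remove
;; transducers, no intermediate lazy seqs are materialized.
(sequence (comp (map get-token)
                (dedupe)
                (remove nil?))
          "(test 1 2)")
;; => (:lparen :identifier :integer :integer :rparen)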
Not sure how idiomatic it is, but one way to do it is to use partition-by to group the incoming sequence into runs of equal elements, and then map first over those runs.
So, in code:
(defn combine-adjacent [input]
  (->> input (partition-by identity) (map first)))
or
(defn combine-adjacent [input]
  (->> (partition-by identity input) (map first)))
should work.
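For example, on a small input the intermediate runs look like this:
(partition-by identity [0 1 1 2 2 3 1 2 3])
;; => ((0) (1 1) (2 2) (3) (1) (2) (3))
(combine-adjacent [0 1 1 2 2 3 1 2 3])
;; => (0 1 2 3 1 2 3)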
There are a couple of tricks for comparing items to the one next door:
First, we can compare the sequence to its own tail, keeping the first element and then each later element that differs from its predecessor:
(defn combine-adjacent
  [s]
  (when (seq s)
    (cons (first s)
          (mapcat #(when (not= % %2) [%]) (rest s) s))))
Alternatively, we can take the sequence by overlapping pairs and drop repeats, padding the final pair so the last element is kept:
(defn combine-adjacent
  [s]
  (mapcat (fn [[a b]] (if (not= a b) [a] ()))
          (partition 2 1 [::end] s)))
Both of these take advantage of a helpful property of mapcat: for each input element you can return zero or more elements of the result sequence. The empty list for the false case in the second version is not strictly needed (an if with no else branch returns nil, which mapcat treats as no elements), but it may help with clarity.
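Either version then behaves like dedupe on the earlier example:
(combine-adjacent [0 1 2 2 2 3 2 3])
;; => (0 1 2 3 2 3)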
I'm running into a problem where immutability suddenly doesn't hold for my vectors. I was wondering if there was a way to create fresh, immutable vector copies of a given set.
Clojuredocs suggested "aclone" but that is giving me an error stating that there's no such method.
(defn stripSame [word wordList]
  (def setVec (into #{} wordList))
  (def wordSet word)
  (def wordVec (into #{} [wordSet]))
  (def diffSet (set/difference setVec wordVec))
  (def diffVec (into [] diffSet))
  diffVec)
(defn findInsOuts [word passList]
  (def wordList (stripSame word passList))
  (println word wordList)
  (def endLetter (subs word (dec (count word))))
  (def startLetter (subs word 0 1))
  (println startLetter endLetter)
  (def outs (filter (partial starts endLetter) wordList))
  (def ins (filter (partial ends startLetter) wordList))
  ;(println ins outs)
  (def indexes [(count ins) (count outs)])
  indexes)
(defn findAll [passList]
  (def wordList (into [] passList))
  (println wordList)
  (loop [n 0 indexList []]
    (println "In here" (get wordList n) wordList)
    (if (< n (count wordList))
      (do
        (def testList wordList)
        (def indexes (findInsOuts (get wordList n) testList))
        (println (get wordList n) indexes)
        (recur (inc n) (conj indexList [(get wordList n) indexes]))))))
passList is a list of words like so (lol at good) which is then cast into a vector.
So basically findAll calls findInsOuts for every word in the list. findInsOuts first removes the search word from the vector (to prevent duplicates) and then counts how many of the remaining words start with the search word's last letter and how many end with its first letter. The problem is that somehow this vector behaves as if it were mutable, so the copy of the vector in findAll also has that word permanently stripped.
When I try to create a new vector and then act on that vector the same thing still happens, which implies that they're aliased/sharing the same memory location.
How can I create a fresh vector for use that is actually immutable?
Any help is appreciated
I'm afraid your code is riddled with misunderstandings. For a start, don't use def within defn. Use let instead. This turns your first function into ...
(defn stripSame [word wordList]
  (let [setVec  (into #{} wordList)
        wordSet word
        wordVec (into #{} [wordSet])
        diffSet (clojure.set/difference setVec wordVec)
        diffVec (into [] diffSet)]
    diffVec))
For example,
=> (stripSame 2 [1 2 :buckle-my-shoe])
[1 :buckle-my-shoe]
The function can be simplified to ...
(defn stripSame [word wordList]
  (vec (disj (set wordList) word)))
... or, using a threading macro, to ...
(defn stripSame [word wordList]
  (-> wordList set (disj word) vec))
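As an aside, the apparent mutation in the original code comes from def itself: def always creates or overwrites a namespace-level var shared by every caller, so re-defing wordList inside findInsOuts also changes the wordList that findAll sees. A minimal sketch of the mechanism (bad-strip is a made-up name for illustration):
(defn bad-strip [word wordList]
  (def setVec (into #{} wordList)) ; creates/overwrites a global var, not a local
  (vec (disj setVec word)))

(bad-strip :a [:a :b :c])
setVec
;; => a set still containing :a, :b and :c -- the "temporary" binding escaped
;;    the function and will be silently overwritten by the next call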
I don't think the function does what you think it does, because it doesn't always preserve the order of elements in the vector.
If I were you, I'd work my way through some of the community tutorials on this page. There are several good books referred to there too. Once you get to grips with the idioms of the language, you'll find the sort of thing you are trying to do here much clearer and easier.
I am trying to understand the below program to find the Fibonacci series using recursion in Clojure.
(defn fib
  [x]
  (loop [i '(1 0)]
    (println i)
    (if (= x (count i))
      (reverse i)
      (recur
        (conj i (apply + (take 2 i))))))) ; <- this line is not clear
For example, for the call (fib 4) I get the output below,
(1 0)
(1 1 0)
(2 1 1 0)
(0 1 1 2)
From this I infer that conj adds the value of (apply + (take 2 i)) to the start of i. But that is not how I understood conj to behave. Can someone help me understand how exactly this works?
That is the behavior of conj, for lists. conj doesn't always add to the end:
(conj '(1) 2) ; '(2 1)
(conj [1] 2) ; [1 2]
The placement of the added element depends on the type of the collection. Since adding to the end of a list is expensive, conj adds to the front instead. It's the same operation (adding an element to a collection), but optimized for the collection being used.
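Applied to the fib loop from the question, each recur therefore conses the new sum onto the front of the list:
(conj '(1 0) (apply + (take 2 '(1 0))))     ; => (1 1 0)
(conj '(1 1 0) (apply + (take 2 '(1 1 0)))) ; => (2 1 1 0)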
Per Clojure documentation:
The 'addition' may happen at different 'places' depending on the concrete type.
Appending to list happens to beginning of list, appending to vector happens to the end...
See more examples at https://clojuredocs.org/clojure.core/conj
I know this is a recurring question (here, here, and more), and I know that the problem is related to creating lazy sequences, but I can't see why it fails.
The problem: I had written a (not very nice) quicksort algorithm to sort strings that uses loop/recur. But applied to 10000 elements, I get a StackOverflowError:
(defn qsort [list]
  (loop [[current & todo :as all] [list] sorted []]
    (cond
      (nil? current) sorted
      (or (nil? (seq current)) (= (count current) 1)) (recur todo (concat sorted current))
      :else (let [[pivot & rest] current
                  pred #(> (compare pivot %) 0)
                  lt (filter pred rest)
                  gte (remove pred rest)
                  work (list* lt [pivot] gte todo)]
              (recur work sorted)))))
I use it in this way:
;; assumes (require '[clojure.string :as str])
(defn tlfnum [] (str/join (repeatedly 10 #(rand-int 10))))
(defn tlfbook [n] (repeatedly n #(tlfnum)))
(time (count (qsort (tlfbook 10000))))
And this is part of the stack trace:
[clojure.lang.LazySeq seq "LazySeq.java" 49]
[clojure.lang.RT seq "RT.java" 521]
[clojure.core$seq__4357 invokeStatic "core.clj" 137]
[clojure.core$concat$fn__4446 invoke "core.clj" 706]
[clojure.lang.LazySeq sval "LazySeq.java" 40]
[clojure.lang.LazySeq seq "LazySeq.java" 49]
[clojure.lang.RT seq "RT.java" 521]
[clojure.core$seq__4357 invokeStatic "core.clj" 137]]}
As far as I know, loop/recur performs tail call optimization, so no stack is consumed (it is, in fact, an iterative process written in recursive syntax).
Reading other answers, and judging from the stack trace, I see there's a problem with concat, and adding a doall before concat solves the stack overflow. But... why?
Here's part of the code for the two-arity version of concat.
(defn concat [x y]
  (lazy-seq
    (let [s (seq x)]
      ,,,)))
Notice that it uses two other functions, lazy-seq, and seq. lazy-seq is a bit like a lambda, it wraps some code without executing it yet. The code inside the lazy-seq block has to result in some kind of sequence value. When you call any sequence operation on the lazy-seq, then it will first evaluate the code ("realize" the lazy seq), and then perform the operation on the result.
(def lz (lazy-seq
          (println "Realizing!")
          '(1 2 3)))

(first lz)
;; prints "Realizing!"
;; => 1
Now try this:
(defn lazy-conj [xs x]
  (lazy-seq
    (println "Realizing" x)
    (conj (seq xs) x)))
Notice that it's similar to concat: it calls seq on its first argument and returns a lazy-seq.
(def up-to-hundred
  (reduce lazy-conj () (range 100)))
(first up-to-hundred)
;; prints "Realizing 99"
;; prints "Realizing 98"
;; prints "Realizing 97"
;; ...
;; => 99
Even though you asked for only the first element, it still ended up realizing the whole sequence. That's because realizing the outer "layer" results in calling seq on the next "layer", which realizes another lazy-seq, which again calls seq, etc. So it's a chain reaction that realizes everything, and each step consumes a stack frame.
(def up-to-ten-thousand
  (reduce lazy-conj () (range 10000)))
(first up-to-ten-thousand)
;;=> java.lang.StackOverflowError
You get the same problem when stacking concat calls. That's why, for instance, (reduce concat ,,,) is always a smell; instead you can use (apply concat ,,,) or (into () cat ,,,).
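A minimal sketch of the difference (the names nested and flat are just for illustration):
;; reduce nests one lazy concat layer per input collection ...
(def nested (reduce concat (map vector (range 10000))))
;; ... so realizing even the first element has to walk all of those layers:
;; (first nested) ; => StackOverflowError (typically)

;; apply hands all the inputs to concat in a single call, so the layers are
;; consumed one at a time as the result is walked:
(def flat (apply concat (map vector (range 10000))))
(first flat) ; => 0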
Other lazy operators like filter and map can exhibit the exact same problem. If you really have a lot of transformation steps over a sequence consider using transducers instead.
;; without transducers: many intermediate lazy seqs and deep call stacks
(->> my-seq
     (map foo)
     (filter bar)
     (map baz)
     ,,,)

;; with transducers: seq processed in a single pass
(sequence (comp (map foo)
                (filter bar)
                (map baz))
          my-seq)
Arne had a good answer (and, in fact, I'd never noticed cat before!). If you want a simpler solution, you can use the glue function from the Tupelo library:
Gluing Together Like Collections
The concat function can sometimes have rather surprising results:
(concat {:a 1} {:b 2} {:c 3} )
;=> ( [:a 1] [:b 2] [:c 3] )
In this example, the user probably meant to merge the 3 maps into one. Instead, the three maps were mysteriously converted into length-2 vectors, which were then nested inside another sequence.
The conj function can also surprise the user:
(conj [1 2] [3 4] )
;=> [1 2 [3 4] ]
Here the user probably wanted to get [1 2 3 4] back, but instead got a nested vector by mistake.
Instead of having to wonder if the items to be combined will be merged, nested, or converted into another data type, we provide the glue function to always combine like collections together into a result collection of the same type:
; Glue together like collections:
(is (= (glue [ 1 2] '(3 4) [ 5 6] ) [ 1 2 3 4 5 6 ] )) ; all sequential (vectors & lists)
(is (= (glue {:a 1} {:b 2} {:c 3} ) {:a 1 :c 3 :b 2} )) ; all maps
(is (= (glue #{1 2} #{3 4} #{6 5} ) #{ 1 2 6 5 3 4 } )) ; all sets
(is (= (glue "I" " like " \a " nap!" ) "I like a nap!" )) ; all text (strings & chars)
; If you want to convert to a sorted set or map, just put an empty one first:
(is (= (glue (sorted-map) {:a 1} {:b 2} {:c 3}) {:a 1 :b 2 :c 3} ))
(is (= (glue (sorted-set) #{1 2} #{3 4} #{6 5}) #{ 1 2 3 4 5 6 } ))
An Exception will be thrown if the collections to be 'glued' are not all of the same type. The allowable input types are:
all sequential: any mix of lists & vectors (vector result)
all maps (sorted or not)
all sets (sorted or not)
all text: any mix of strings & characters (string result)
I put glue into your code instead of concat and still got a StackOverflowError. So, I also replaced the lazy filter and remove with eager versions keep-if and drop-if to get this result:
;; assumes (require '[tupelo.core :refer [glue keep-if drop-if]])
(defn qsort [list]
  (loop [[current & todo :as all] [list] sorted []]
    (cond
      (nil? current) sorted
      (or (nil? (seq current)) (= (count current) 1))
      (recur todo (glue sorted current))
      :else (let [[pivot & rest] current
                  pred #(> (compare pivot %) 0)
                  lt (keep-if pred rest)
                  gte (drop-if pred rest)
                  work (list* lt [pivot] gte todo)]
              (recur work sorted)))))
(defn tlfnum [] (str/join (repeatedly 10 #(rand-int 10))))
(defn tlfbook [n] (repeatedly n #(tlfnum)))
(def result
(time (count (qsort (tlfbook 10000)))))
-------------------------------------
Clojure 1.8.0 Java 1.8.0_111
-------------------------------------
"Elapsed time: 1377.321118 msecs"
result => 10000
The ClojureDocs page for lazy-seq gives an example of generating a lazy-seq of all positive numbers:
(defn positive-numbers
  ([] (positive-numbers 1))
  ([n] (cons n (lazy-seq (positive-numbers (inc n))))))
This lazy-seq can be evaluated for pretty large indexes without throwing a StackOverflowError (unlike the sieve example on the same page):
user=> (nth (positive-numbers) 99999999)
100000000
If only recur can be used to avoid consuming stack frames in a recursive function, how is it possible this lazy-seq example can seemingly call itself without overflowing the stack?
A lazy sequence has the rest of the sequence generating calculation in a thunk. It is not immediately called. As each element (or chunk of elements as the case may be) is requested, a call to the next thunk is made to retrieve the value(s). That thunk may create another thunk to represent the tail of the sequence if it continues. The magic is that (1) these special thunks implement the sequence interface and can transparently be used as such and (2) each thunk is only called once -- its value is cached -- so the realized portion is a sequence of values.
Here is the general idea without the magic, just good ol' functions:
(defn my-thunk-seq
  ([] (my-thunk-seq 1))
  ([n] (list n #(my-thunk-seq (inc n)))))

(defn my-next [s] ((second s)))

(defn my-realize [s n]
  (loop [a [], s s, n n]
    (if (pos? n)
      (recur (conj a (first s)) (my-next s) (dec n))
      a)))
user=> (-> (my-thunk-seq) first)
1
user=> (-> (my-thunk-seq) my-next first)
2
user=> (my-realize (my-thunk-seq) 10)
[1 2 3 4 5 6 7 8 9 10]
user=> (count (my-realize (my-thunk-seq) 100000))
100000 ; level (constant) stack consumption
The magic bits happen inside of clojure.lang.LazySeq defined in Java, but we can actually do the magic directly in Clojure (implementation that follows for example purposes), by implementing the interfaces on a type and using an atom to cache.
(deftype MyLazySeq [thunk-mem]
  clojure.lang.Seqable
  (seq [_]
    (if (fn? @thunk-mem)
      (swap! thunk-mem (fn [f] (seq (f)))))
    @thunk-mem)
  ;; Implementing ISeq is necessary because cons calls seq
  ;; on anyone who does not, which would force realization.
  clojure.lang.ISeq
  (first [this] (first (seq this)))
  (next [this] (next (seq this)))
  (more [this] (rest (seq this)))
  (cons [this x] (cons x (seq this))))

(defmacro my-lazy-seq [& body]
  `(MyLazySeq. (atom (fn [] ~@body))))
Now this already works with take, etc., but as take calls lazy-seq we'll make a my-take that uses my-lazy-seq instead to eliminate any confusion.
(defn my-take
  [n coll]
  (my-lazy-seq
    (when (pos? n)
      (when-let [s (seq coll)]
        (cons (first s) (my-take (dec n) (rest s)))))))
Now let's make a slow infinite sequence to test the caching behavior.
(defn slow-inc [n] (Thread/sleep 1000) (inc n))

(defn slow-pos-nums
  ([] (slow-pos-nums 1))
  ([n] (cons n (my-lazy-seq (slow-pos-nums (slow-inc n))))))
And the REPL test
user=> (def nums (slow-pos-nums))
#'user/nums
user=> (time (doall (my-take 10 nums)))
"Elapsed time: 9000.384616 msecs"
(1 2 3 4 5 6 7 8 9 10)
user=> (time (doall (my-take 10 nums)))
"Elapsed time: 0.043146 msecs"
(1 2 3 4 5 6 7 8 9 10)
Keep in mind that lazy-seq is a macro, and therefore does not evaluate its body when your positive-numbers function is called. In that sense, positive-numbers isn't truly recursive. It returns immediately, and the inner "recursive" call to positive-numbers doesn't happen until the seq is consumed.
user=> (source lazy-seq)
(defmacro lazy-seq
"Takes a body of expressions that returns an ISeq or nil, and yields
a Seqable object that will invoke the body only the first time seq
is called, and will cache the result and return it on all subsequent
seq calls. See also - realized?"
{:added "1.0"}
[& body]
(list 'new 'clojure.lang.LazySeq (list* '^{:once true} fn* [] body)))
I think the trick is that the producer function (positive-numbers) isn't actually calling itself recursively: it doesn't accumulate stack frames the way basic Little-Schemer-style recursion would, because LazySeq invokes it on demand for the individual entries in the sequence. Once a closure has been evaluated for an entry it can be discarded, so the thunks from previous invocations of the function can be garbage-collected as the code churns through the sequence.
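A quick way to see this at the REPL (assuming the positive-numbers definition above):
(def nums (positive-numbers))
(realized? (rest nums)) ; => false -- the tail is still an unevaluated thunk
(take 5 nums)           ; => (1 2 3 4 5), realized element by element as printed
(realized? (rest nums)) ; => true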
I would like to reduce the following seq:
({0 "Billie Verpooten"}
{1 "10:00"}
{2 "17:00"}
{11 "11:10"}
{12 "19:20"})
to
{:name "Billie Verpooten"
:work {:1 ["10:00" "17:00"]
:11 ["11:10" "19:20"]}}
but I have no idea how to do this.
I was thinking of a recursive function that uses destructuring.
There's a function for reducing a sequence to something in the standard library, and it's called reduce. Though in your specific case, it seems appropriate to remove the special case key 0 first and partition the rest into the pairs of entries that they're meant to be.
The following function gives the result described in your question:
(defn build-map [maps]
  (let [entries (map first maps)
        key-zero? (comp zero? key)]
    {:name (val (first (filter key-zero? entries)))
     :work (reduce (fn [acc [[k1 v1] [k2 v2]]]
                     (assoc acc (keyword (str k1)) [v1 v2]))
                   {}
                   (partition 2 (remove key-zero? entries)))}))
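Applied to the data from the question:
(build-map '({0 "Billie Verpooten"}
             {1 "10:00"}
             {2 "17:00"}
             {11 "11:10"}
             {12 "19:20"}))
;; => {:name "Billie Verpooten",
;;     :work {:1 ["10:00" "17:00"], :11 ["11:10" "19:20"]}}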
Just for variety here is a different way of expressing an answer by threading sequence manipulation functions:
user> (def data '({0 "Billie Verpooten"}
                  {1 "10:00"}
                  {2 "17:00"}
                  {11 "11:10"}
                  {12 "19:20"}))

user> {:name (-> data first first val)
       :work (as-> data x
               (rest x)
               (into {} x)
               (zipmap (map first (partition 1 2 (keys x)))
                       (partition 2 (vals x))))}
The as-> threading macro is new in Clojure 1.5 and makes expressing this sort of function a bit more concise.