Emacs Lisp has reduce-vec. What's the proper way to reduce a multidimensional array in Common Lisp, without using loop or reinventing the wheel?
You should be able to use something like the following. It works for arrays of any number of dimensions.
(defun reduce-multidimensional-array (fn arr &rest args)
  (apply #'reduce
         fn
         ;; Displace a one-dimensional array onto ARR's storage; the
         ;; element type must match ARR's for the displacement to be valid.
         (make-array (array-total-size arr)
                     :displaced-to arr
                     :element-type (array-element-type arr))
         args))
In short, this works by creating a one-dimensional array that shares elements with the array passed in. Since reduce works on one-dimensional arrays, it is possible to reduce the new array.
The function array-total-size returns the total number of elements in the array and the :displaced-to keyword argument causes the new array to share elements with the array passed in (even if they have different dimensions).
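For example, reducing a 2x2 array with the function above:

(reduce-multidimensional-array #'+ #2A((1 2) (3 4)))
;; => 10
(reduce-multidimensional-array #'max #2A((1 2) (3 4)) :initial-value 0)
;; => 4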
When developing with Common Lisp, we have four possibilities for defining new setf forms:
We can define a function whose name is a list of two symbols, the first one being setf, e.g. (defun (setf some-observable) (…)).
We can use the short form of defsetf.
We can use the long form of defsetf.
We can use define-setf-expander.
I am not sure what the right or intended use case is for each of these possibilities.
An answer could point out the most generic solution and outline the contexts in which the other solutions are superior.
define-setf-expander is the most general of these. All of setf's functionality is encompassed by it.
Defining a setf function works fine for most accessors. It is also valid to use a generic function here, so the need for polymorphism alone is not a reason to use something else. Controlling evaluation, either for correctness or for performance, is the main reason not to use a setf function.
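For illustration, a plain setf function for a hypothetical accessor might look like this (the names middle-name and person are made up):

(defun middle-name (person)
  (second person))

(defun (setf middle-name) (new-value person)
  ;; By convention the new value is the first parameter and is
  ;; also returned.
  (setf (second person) new-value))

;; After this, (setf (middle-name p) "Quincy") calls (setf middle-name);
;; defining it with defmethod instead works just as well.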
For correctness, many forms of destructuring are not possible to do with a setf function (e.g. (setf (values ...) ...)). Similarly, I've seen an example that makes a functional data structure behave locally like a mutable one by having (setf (dict-get key some-dict) 2) assign a new dictionary to some-dict.
For performance, consider the silly case of (incf (nth 10000 list)), which, if the nth writer were implemented as a plain function, would require traversing 10,000 list nodes twice, but which a setf expander can do with a single traversal.
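As a sketch of that idea (my-nth is a hypothetical stand-in, since the setf expansion of a standard symbol like nth must not be redefined), a setf expander can capture the target cons cell once:

(defun my-nth (n list)
  (nth n list))

(define-setf-expander my-nth (n list)
  (let ((cell (gensym "CELL"))
        (store (gensym "STORE")))
    (values (list cell)                ; temporary variable
            (list `(nthcdr ,n ,list))  ; bound once to the target cons
            (list store)               ; store variable
            `(setf (car ,cell) ,store) ; storing form
            `(car ,cell))))            ; accessing form

;; (incf (my-nth 10000 list)) now walks the list only once.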
Very often one finds statements that lists have a performance disadvantage compared to vectors because of consing and additional GC steps, and some functions work on generic sequences, accepting both lists and vectors.
But some functions, like intersection, expect two lists. Is there a library providing an alternative for vectors?
I started with something like this, but have the feeling that there should be a more mature solution.
(defun vec-intersec (vec-1 vec-2 &aux (result (make-array 0 :adjustable t :fill-pointer 0)))
  "A simple implementation of intersection for vectors instead of lists."
  (loop :for v1 :across vec-1
        :if (find v1 vec-2 :test #'equal)
          :do (vector-push-extend v1 result))
  result)
It always depends on the size of your collection and what you want to do with it.
Below about 20 to 50 elements, lists are often perfectly OK even for random access (if you're not in a tight inner loop, or consing a lot).
If you already have vectors, it might be most convenient to sort one of them so that you can do a binary search instead of a naïve linear one. If that is not enough, and your collections are bigger, putting the elements into a hash table (as keys, with an appropriate :test) gives you faster (amortized) lookup.
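For instance, a hash-table-based variant of the function above could look like this (vec-intersect-hashed is just an illustrative name):

(defun vec-intersect-hashed (vec-1 vec-2 &key (test 'equal))
  "Intersection for vectors using a hash table for amortized O(1) lookup."
  (let ((table (make-hash-table :test test))
        (result (make-array 0 :adjustable t :fill-pointer 0)))
    ;; Register every element of the second vector ...
    (loop :for v2 :across vec-2
          :do (setf (gethash v2 table) t))
    ;; ... then keep the elements of the first vector that are present.
    (loop :for v1 :across vec-1
          :when (gethash v1 table)
            :do (vector-push-extend v1 result))
    result))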
This should take you quite far. If you identify an issue that cannot be solved in such a simple way, you might want to look into FSet or CL-Containers, which support more advanced data structures.
I store graph coordinates in a circular list:
(defun make-circular-list (size &key initial-element)
  (let ((list (make-list size :initial-element initial-element)))
    (nconc list list)))
(defvar *coordinates* (make-circular-list 1024 :initial-element 0.0))
Now it's easy to update *coordinates* whenever new coordinates must be set.
However, I have a library function that takes a sequence of coordinates to draw lines on a graph. Of course this function does not work with a circular structure, so I would like to make a copy of fixed length. A list or an array is fine.
So far I have tried subseq and make-array with the :initial-contents keyword, but both fail. loop or dotimes do work, but I was hoping to avoid iterating over each element of the list.
Is it possible to efficiently copy this circular structure or make a fixed length array in the spirit of a displaced-array?
There is nothing wrong with using LOOP.
(loop for c in *coordinates* repeat 1024 collect c)
By the way, sometimes it might be useful to hide the circular list behind a CLOS object.
(defclass circular-list ()
  ((list)
   (size)
   (first-element)
   (last-element)))
And so on...
That way you can provide CLOS methods for accessing and changing it (create, add, copy, delete, as-list, ...). You can also control how it prints with a method for PRINT-OBJECT.
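For example, fleshing out the skeleton above (the initargs, reader and method names are my own):

(defclass circular-list ()
  ((list :initarg :list :reader cl-list)   ; the circular cons chain
   (size :initarg :size :reader cl-size)
   (first-element)
   (last-element)))

(defmethod as-list ((c circular-list))
  ;; Copy SIZE elements of the circular structure into a proper list.
  (loop for x in (cl-list c) repeat (cl-size c) collect x))

(defmethod print-object ((c circular-list) stream)
  ;; Print a short summary instead of the circular structure itself.
  (print-unreadable-object (c stream :type t)
    (format stream "of ~D elements" (cl-size c))))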
I have the following code which increments the first element of every pair in a vector:
(vec (map (fn [[key value]] [(inc key) value]) [[0 :a] [1 :b]]))
However, I fear this code is inelegant, as it first creates a sequence using map and then converts it back to a vector.
Consider this analog:
(into [] (map (fn [[key value]] [(inc key) value]) [[0 :a] [1 :b]]))
On #clojure on irc.freenode.net I was told that using the code above is bad, because into expands into (reduce conj [] (map-indexed ...)), which produces many intermediate objects in the process. Then I was told that into actually doesn't expand into (reduce conj ...) and uses transients when it can. Measuring elapsed time also showed that into is actually faster than vec.
So my questions are:
What is the proper way to use map over vectors?
What happens underneath when I use vec and into with vectors?
Related but not duplicate questions:
Clojure: sequence back to vector
How To Turn a Reduce-Realized Sequence Back Into Lazy Vector Sequence
Actually, as of Clojure 1.4.0 the preferred way of doing this is to use mapv, which is like map except that its return value is a vector. It is by far the most efficient approach, with no unnecessary intermediate allocations at all.
Clojure 1.5.0 will bring a new reducers library which will provide a generic way to map, filter, take, drop etc. while creating vectors, usable with into []. You can play with it in the 1.5.0 alphas and in the recent tagged releases of ClojureScript.
As for (vec some-seq) and (into [] some-seq), the first ultimately delegates to a Java loop which pours some-seq into an empty transient vector, while the second does the same thing in very efficient Clojure code. In both cases there are some initial checks involved to determine which approach to take when constructing the final return value.
vec and into [] are significantly different for Java arrays of small length (up to 32) -- the first will alias the array (use it as the tail of the newly created vector) and demands that the array not be modified subsequently, lest the contents of the vector change (see the docstring); the latter creates a new vector with a new tail and doesn't care about future changes to the array.
This is a conceptual question on how one would implement the following in Lisp (assuming Common Lisp in my case, but any dialect would work). Assume you have a function that creates closures that sequentially iterate over an arbitrary collection of data (or otherwise return different values) and return nil when exhausted, i.e.
(defun make-counter (up-to)
  (let ((cnt 0))
    (lambda ()
      (if (< cnt up-to)
          (incf cnt)
          nil))))
CL-USER> (defvar gen (make-counter 3))
GEN
CL-USER> (funcall gen)
1
CL-USER> (funcall gen)
2
CL-USER> (funcall gen)
3
CL-USER> (funcall gen)
NIL
CL-USER> (funcall gen)
NIL
Now, assume you are trying to permute a combination of one or more of these closures. How would you implement a function that returns a new closure that successively produces each permutation of all the closures contained within it? i.e.:
(defun permute-closures (counters)
  ......)
such that the following holds true:
CL-USER> (defvar collection (permute-closures (list
                                                (make-counter 3)
                                                (make-counter 3))))
CL-USER> (funcall collection)
(1 1)
CL-USER> (funcall collection)
(1 2)
CL-USER> (funcall collection)
(1 3)
CL-USER> (funcall collection)
(2 1)
...
and so on.
The way I had designed it originally was to add a 'pause' parameter to the initial counting lambda, such that while iterating you can still call it and receive the old cached value if you pass :pause t, in hopes of making the permutation slightly cleaner. Also, while the example above is a simple list of two identical closures, the list can be an arbitrarily complicated tree (which can be permuted in depth-first order, so the resulting permutation set would have the shape of the tree).
I had this implemented, but my solution wasn't very clean and am trying to poll how others would approach the problem.
Thanks in advance.
Edit: Thank you for all the answers. What I ended up doing was adding a 'continue' argument to the generator and flattening my structure by replacing any nested list with a closure that permuted that list. The generators did not advance and always returned the last cached value unless 'continue' was passed. Then I just recursively called each generator until I got to either the last cdr or a NIL. If I got to the last cdr, I just bumped it. If I got to a NIL, I bumped the one before it and reset every closure following it.
You'll clearly need some way of using each value returned by a generator more than once.
In addition to Rainer Joswig's suggestions, three approaches come to mind.
Caching values
permute-closures could, of course, remember every value returned by each generator by storing it in a list, and reuse that over and over. This approach obviously implies some memory overhead, and it won't work very well if the generated sequences can be infinite.
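A rough sketch of this approach for a flat list of generators (the tree-shaped case is not handled here) could look like the following; it drains each generator once and then steps through the cached values like an odometer:

(defun permute-closures (counters)
  (let* ((cached (map 'vector
                      (lambda (gen)
                        ;; Drain the generator into a vector of its values.
                        (coerce (loop for v = (funcall gen)
                                      while v collect v)
                                'vector))
                      counters))
         (n (length cached))
         (indices (make-array n :initial-element 0))
         (done (or (zerop n)
                   (some #'zerop (map 'list #'length cached)))))
    (lambda ()
      (unless done
        (prog1 (loop for i below n
                     collect (aref (aref cached i) (aref indices i)))
          ;; Advance the rightmost index; carry to the left on overflow.
          (loop for i downfrom (1- n) to 0
                do (incf (aref indices i))
                   (if (< (aref indices i) (length (aref cached i)))
                       (return)
                       (setf (aref indices i) 0))
                finally (setf done t)))))))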
Creating new generators on each iteration
In this approach, you would change the signature of permute-closures to take as arguments not ready-to-use generators but thunks that create them. Your example would then look like this:
(permute-closures (list (lambda () (make-counter 3))
                        (lambda () (make-counter 3))))
This way, permute-closures is able to reset a generator by simply recreating it.
Making generator states copyable
You could provide a way of making copies of generators along with their states. This is kind of like approach #2 in that permute-closures would reset the generators as needed, except that the resetting would be done by reverting to a copy of the original state. Also, you would be able to do partial resets (i.e., backtrack to an arbitrary point rather than just the beginning), which may or may not make the code of permute-closures significantly simpler.
Copying generator states might be a bit easier in a language with first-class continuations (like Scheme), but if all generators follow some predefined structure, abstracting it away with a define-generator macro or some such should be possible in Common Lisp as well.
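For the simple counter from the question, a minimal sketch of a copyable generator (the :copy message is my own convention) could be:

(defun make-copyable-counter (up-to &optional (cnt 0))
  (lambda (&optional message)
    (case message
      ;; Return a fresh counter that starts from the current state.
      (:copy (make-copyable-counter up-to cnt))
      ;; Any other call behaves like the original counter.
      (t (if (< cnt up-to)
             (incf cnt)
             nil)))))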
I would add one of these to the counter:
being able to reset the counter to the start
letting the counter return NIL when the count is done and then starting again from the first value on the next call (a sketch of this option follows below)
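A minimal sketch of the second option, keeping the shape of the original make-counter:

(defun make-wrapping-counter (up-to)
  (let ((cnt 0))
    (lambda ()
      (if (< cnt up-to)
          (incf cnt)
          ;; Signal the end of a cycle with NIL, then start over.
          (progn (setf cnt 0)
                 nil)))))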