I'm curious to learn the reason for this since set seems unique. For instance:
(set 'nm 3) ;; set evaluates its first argument, so the symbol has to be quoted
nm ;; ==> evaluates to 3
(set 'nm 'nn) ;; assigns nn to the value cell of nm
nm ;; ==> evaluates to nn
nn ;; ==> ERROR. no value
(set nm 3) ;; since nm evaluates to nn ...
nm ;; evaluates to nn
nn ;; evaluates to 3
To achieve similar behavior, I've only been able to use setf:
(setq tu 'ty) ;;
(symbol-value 'tu) ;; returns ty
(setq (symbol-value 'tu) 5) ;; ERROR. setq expects a symbol
(setf (symbol-value tu) 5) ;; tu is unquoted, so this sets the value cell of ty
tu ;; ==> evaluates to ty
ty ;; ==> evaluates to 5
In other programming languages the reasons for demoting a feature are usually pretty clear: inefficiency, proneness to bugs, or insecurity come to mind. I wonder what the criteria for deprecating set were at the time. All I've been able to glean from the web is this, which is laughable. Thanks.
The main reason set is deprecated is that its use can lead to errors when it is used on bound variables (e.g., in functions):
(set 'a 10)
==> 10
a
==> 10
(let ((a 1)) ; new lexical binding
(set 'a 15) ; modification of the value slot of the global symbol, not the lexical variable
a) ; access the lexical variable
==> 1 ; nope, not 15!
a
==> 15
set is a legacy function from the times when Lisp was "the language of symbols and lists". Lisp has matured since then; direct operations on symbol slots are relatively rare, and there is no reason to use set instead of the more explicit (setf symbol-value).
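For illustration, here is a minimal sketch of the two forms side by side (the name counter is just an example):

(set 'counter 0)                   ; legacy style
(setf (symbol-value 'counter) 0)   ; equivalent, and explicit about which cell is modified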
In your example:
(set nm 'nn) ;; assigns nn to the value cell of nm
nm ;; ==> evaluates to nn
This is completely wrong, and it is the main reason why set is deprecated. It's the symbol you get when evaluating nm that is bound to nn. Unfortunately, in your example that value is the number 3, and it will signal an error at run time, since you cannot use numbers as variables. If you were to write (setq 3 'nn), the error can be seen at compile time.
With set there is an additional reason: it's very difficult to compile efficiently when you don't know which symbol is to be bound, so the compiler cannot optimize it.
Scheme didn't have automatic quoting either in its first version, and it didn't even have a quoting macro like setq; instead it was stated that set' would suffice. It's obvious that it didn't, since Scheme no longer has set but only define.
I personally disagree that it should be removed (deprecation results in removal eventually) but like eval it should be avoided and only used as a last resort.
I have a question concerning the relation between symbols and global variables.
The hyperspec states for the value attribute of a symbol:
"If a symbol has a value attribute, it is said to be bound, and that fact can be detected by the function boundp. The object contained in the value cell of a bound symbol is the value of the global variable named by that symbol, and can be accessed by the function symbol-value."
If I apply the following steps:
CL-USER> (intern "*X*")
*X*
NIL
CL-USER> (boundp '*x*)
NIL
CL-USER> (setf (symbol-value '*x*) 1)
1
CL-USER> (boundp '*x*)
T
As I understand it, the conditions cited above are fulfilled: there should be a global variable named by the symbol, and the value of the variable is the symbol-value. But this is wrong.
CL-USER> (describe '*x*)
COMMON-LISP-USER::*X*
[symbol]
*X* names an undefined variable:
Value: 1
; No value
CL-USER>
It has to be proclaimed special.
CL-USER> (proclaim '(special *x*))
; No value
CL-USER> (describe '*x*)
COMMON-LISP-USER::*X*
[symbol]
*X* names a special variable:
Value: 1
; No value
CL-USER>
Can you please explain this behaviour? What does "undefined variable" mean? I did not find this term in the hyperspec.
(I use SBCL 1.3.15.)
Thank you for your answers.
Edit:
(Since this comment applies to both answers below (user Svante and coredump), I write it as an Edit and not as a comment to both answers).
I agree with the answers that *x* is a global variable.
The hyperspec states for global variable:
"global variable n. a dynamic variable or a constant variable."
Therefore I think now that the reason why SBCL says it is "undefined" is not whether it is special, but whether it is a dynamic (special) variable or a constant variable (hyperspec: "constant variable n. a variable, the value of which can never change").
The third kind mentioned in the answers below (maybe I misunderstand the answers), a global variable which is neither special nor a constant, does not exist according to the hyperspec.
Can you agree with that?
Edit 2:
Ok, in summary, I thought, since the hyperspec does not define undefined global variables, they do not exist.
But the correct answer is that they do exist and are undefined, which means it is implementation dependent how they are dealt with.
Thank you for your answers, I accept all three of them, but I can only mark one.
There is a global variable named by the symbol, and its value is the symbol-value. That is what the output tells you. The thing that is undefined is the status of the variable: whether it is special. I agree that the wording of the output is a bit special.
If you set the value of a variable without creating it first (which is what a naked setq would do as well), it is undefined whether it becomes special or not.
Conventionally, one does not use global variables that are not special. That's why you should use defvar, defparameter etc..
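A minimal sketch of the conventional approach (the names *y* and *z* are just examples):

(defvar *y*)          ; proclaims *Y* special, but leaves it unbound
(defparameter *z* 1)  ; proclaims *Z* special and binds it to 1
(boundp '*y*)         ; ==> NIL
(boundp '*z*)         ; ==> T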
As said in the other answer(s), *X* is not declared special (dynamic). SBCL also gives you a warning if you lexically bind the symbol:
FUN> (let ((*X* 30)) (list *X* (symbol-value '*X*)))
; in: LET ((*X* 30))
; (LET ((FUN::*X* 30))
; (LIST FUN::*X* (SYMBOL-VALUE 'FUN::*X*)))
;
; caught STYLE-WARNING:
; using the lexical binding of the symbol (FUN::*X*), not the
; dynamic binding, even though the name follows
; the usual naming convention (names like *FOO*) for special variables
;
; compilation unit finished
; caught 1 STYLE-WARNING condition
(30 10)
Note also what happens if *X* is locally declared as special:
FUN> (let ((*X* 30)) (declare (special *X*)) (list *X* (symbol-value '*X*)))
(30 30)
The symbol-value accessor retrieves the binding from the dynamic environment.
"global variable which is not special (and not a constant) does not exist according to the hyperspec"
The actual behaviour is undefined in the standard, but in implementations it might work in some way.
This example in LispWorks:
CL-USER 46 > (boundp 'foo)
NIL
So FOO is unbound.
CL-USER 47 > (defun baz (bar) (* foo bar))
BAZ
The above defines a function baz in the LispWorks interpreter - it is not compiled. There is no warning.
Now we set this symbol foo:
CL-USER 48 > (setq foo 20)
20
CL-USER 49 > (baz 22)
440
We have successfully called it, even though FOO was not declared as a global variable.
Let's check, if it is declared as special:
CL-USER 50 > (SYSTEM:DECLARED-SPECIAL-P 'foo)
NIL
Now we compile the function from above:
CL-USER 51 > (compile 'baz)
;;;*** Warning in BAZ: FOO assumed special
BAZ
The compiler says that it does not know of FOO and assumes that it is special.
This behaviour is undefined and implementations differ:
an interpreter might just use the global symbol value and not complain at all - see the LispWorks example above. That's relatively common in implementations.
a compiler might assume that the undefined variable is a special variable and warns. This is also relatively common in implementations.
a compiler might assume that the undefined variable is a special variable and also globally declare it to be special. This is not so common - CMUCL did (does?) that by default. This behaviour is not liked, since there is no standard way to undo the declaration.
I would like to have a variable containing an integer that comes from user input. It can't accept strings or decimal numbers.
I would like some help to understand what I am doing wrong here.
My code until now (I appreciate the help):
(format t "~%Enter a number: ")
(loop (defvar numb (read))
(cond (((rationalp numb)1)
(print "No decimal numbers are allowed, please enter an integer"))
(((stringp numb)1)
(print "No strings are allowed, please enter an integer"))
)
(when ((integerp numb)1) (return numb))
)
Working code
Here is how I would do it:
(defun ask-and-read (prompt)
"Prompt the user and read his input."
(princ prompt *query-io*)
(force-output *query-io*) ; flush the buffers
(let ((*read-eval* nil)) ; close the security hole
(read *query-io*)))
(defun request-object (prompt predicate)
"Ask the user for an object using prompt.
Only accept data which satisfies the predicate."
(loop
for object = (ask-and-read prompt)
when (funcall predicate object)
return object
do (format *query-io* "Alas, ~S (~S) does not satisfy ~S, please try again~%"
object (type-of object) predicate)))
Example:
> (request-object "Enter an integer: " #'integerp)
Enter an integer: 4.6
Alas, 4.6 (SINGLE-FLOAT) does not satisfy #<SYSTEM-FUNCTION INTEGERP>, please try again
Enter an integer: 5/7
Alas, 5/7 (RATIO) does not satisfy #<SYSTEM-FUNCTION INTEGERP>, please try again
Enter an integer: asdf
Alas, ASDF (SYMBOL) does not satisfy #<SYSTEM-FUNCTION INTEGERP>, please try again
Enter an integer: 7
==> 7
> (request-object "Enter a real: " #'realp)
Enter a real: 4.5
==> 4.5
> (request-object "Enter a real: " #'realp)
Enter a real: 5/8
==> 5/8
> (request-object "Enter a real: " #'realp)
Enter a real: "sdf"
Alas, "sdf" ((SIMPLE-BASE-STRING 3)) does not satisfy #<SYSTEM-FUNCTION REALP>, please try again
Enter a real: 8
==> 8
Please see the documentation for the facilities I used:
princ
force-output
*query-io*
read
*read-eval*
loop:
for
when
return
do
format
Your mistakes
Code formatting
Your code is unreadable because you have incorrect indentation.
Lispers do not count parens - this is the job for compilers and editors.
We look at indentation.
Please do yourself a favor and use Emacs - it will indent the code for you and you will often see your errors yourself.
Defvar is a top-level form
First of all, defvar is a top-level form which is used to define global variables, not set them.
Subsequent calls do not change the value:
(defvar *abc* 1)
*abc*
==> 1
(defvar *abc* 10)
*abc*
==> 1 ; not 10!
Use setq to set a variable.
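For example, continuing the sketch above (setq works on the already-defined *abc*):

(setq *abc* 10)
*abc*
==> 10 ; setq does change the value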
Prefer local variables to global variables
While Lisp does allow global variables, the predominant programming
style in Lisp is the functional style: every function receives its
"input" data as arguments and returns its "output" data as values.
To achieve functional style, prefer a local to a global variable.
You create local variables through let or let*, or, in loop, through Local Variable Initializations.
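A minimal sketch, reusing the name numb from your code:

(let ((numb 42))   ; NUMB exists only inside this LET
  (* numb 2))
==> 84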
Cond and When have very specific syntax
You have extra parens and stray 1s (?!) in your cond and when forms.
Remember, parens are meaningful in Lisp.
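For comparison, here is a rough sketch of your loop with only the clause syntax fixed: each cond clause is (test form...), and when takes a single test followed by forms. The defvar is replaced by a loop-local variable as discussed above, and the *read-eval* point below still applies:

(loop for numb = (read)
      do (cond ((stringp numb)
                (print "No strings are allowed, please enter an integer"))
               ((not (integerp numb))   ; floats and ratios land here
                (print "No decimal numbers are allowed, please enter an integer")))
      when (integerp numb)
        return numb)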
Security first!
Binding *read-eval* to nil
before read is necessary to
avoid a nuclear war if a user enters #.(launch-nuclear-missiles)
in response to your prompt, because normally read evaluates whatever
comes after #..
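A small sketch of the difference, using read-from-string so it can be tried without typing input:

(read-from-string "#.(+ 1 2)")     ; default *READ-EVAL* is T
==> 3 ; the form after #. was evaluated at read time

(let ((*read-eval* nil))
  (read-from-string "#.(+ 1 2)"))
;; signals a READER-ERROR instead of evaluating the form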
Or in other words: Is it possible for a variable in CL not to be (part of) a symbol?
I think I may have a profound misconception about variables in CL.
I always thought CL has no variables, only symbols, and symbols have (among other properties) a name and a value cell (which is the variable).
And when someone said "variable x has the value 42" I thought it was short for "the value cell of the symbol named x stores the value 42".
But this is probably wrong.
When I type
> (let ((a 42))
(type-of 'a))
SYMBOL
; caught STYLE-WARNING:
; The variable A is defined but never used.
is the lexical variable a in this example a full-fledged symbol whose value cell has been set to 42?
The warning The variable A is defined but never used suggests otherwise, and it appears that the lexical variable is not the same thing as the symbol a in the form (type-of 'a).
Common Lisp has two data types which have a special meaning for evaluation:
cons cells / lists -> used in Lisp source code, lists are Lisp forms
symbols -> used as names for various purposes
If you want to use them as data in Lisp code, then you have to quote them.
Both are used in the Lisp source code, but once you compile code, they may disappear.
Variables are written as symbols in the source code. But in compiled code they may go away - when they are lexical variables.
Example using SBCL:
a file with
(defun test (foo)
(+ foo foo))
Now we do:
CL-USER> (proclaim '(optimize (debug 0))) ; the compiler saves no debug info
; No value
CL-USER> (compile-file "/tmp/test.lisp")
; compiling file "/private/tmp/test.lisp" (written 23 MAY 2017 09:06:51 PM):
; compiling (DEFUN TEST ...)
; /tmp/test.fasl written
; compilation finished in 0:00:00.013
#P"/private/tmp/test.fasl"
NIL
NIL
CL-USER> (find-symbol "FOO")
FOO
:INTERNAL
The compiler has read the source code and created a compiled FASL file. We see that the symbol FOO is now in the current package. FOO names the variable in our source code.
Now quit SBCL and restart it.
Let's load the machine code:
CL-USER> (load "/tmp/test")
T
CL-USER> (find-symbol "FOO")
NIL
NIL
There is no symbol FOO anymore. It's also not possible to retrieve the lexical value of the variable FOO using the symbol FOO. There is no mapping (like some kind of explicit lexical environment) from symbols to lexical values.
The value cell is used for dynamic (AKA "special") variables, not lexical variables. Lexical variables are symbols in the source code, but they don't have any runtime relationship to the symbol (except for internal use by the debugger).
So if you wrote:
(let ((a 42))
(declare (special a))
(print (symbol-value 'a)))
it would work because the declaration makes it a dynamic variable, and then you can access the value in the value cell.
You are not checking the type of the bound variable a or its value but that of a literal constant symbol that happens to have the same name as the variable in your let form:
(let ((a 42))
(type-of 'literal-symbol))
; ==> symbol (since 'literal-symbol evaluates to a symbol, just like 'a does)
To check the type of the value of the binding a you do it without the literal quote:
(let ((a 42))
(type-of a))
; ==> (integer 0 281474976710655)
Here you actually check the type of the let bound value and it's an integer. Surprised that 42 is a number and not a symbol?
(let ((a 10) (b 'a))
(list a b))
; ==> (10 a)
The variable a and the quoted literal 'a are not the same. They just happen to look the same when displayed, but 'a is data and a is code. A CL compiler might use lists and symbols internally, but what a variable is at execution time is entirely up to the implementation: most implementations stack-allocate when they can, and the code that evaluates a stack-allocated variable is replaced by something that picks the value at the right index from the stack. CL has a disassemble function, and if you check its output in SBCL you'll see it looks more like the output of a C compiler than like the original Lisp source.
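A small sketch of that, assuming SBCL as above (add-twice is just an example function; the output is implementation specific and omitted here):

(defun add-twice (x)
  (+ x x))

(disassemble 'add-twice)
;; prints machine code; the lexical variable X appears only as a
;; register or stack slot, not as a symbol lookup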
I've just been reading up on the sharpsign colon reader macro and it sounded like it had a very similar effect to gensym
Sharpsign Colon: "introduces an uninterned symbol"
Gensym: "Creates and returns a fresh, uninterned symbol"
So a simple test
CL-USER> #:dave
; Evaluation aborted on #<UNBOUND-VARIABLE DAVE {1002FF77D3}>.
CL-USER> (defparameter #:dave 1)
#:DAVE
CL-USER> #:dave
; Evaluation aborted on #<UNBOUND-VARIABLE DAVE {100324B493}>.
Cool so that fails as it should.
Now for the macro test
(defmacro test (x)
(let ((blah '#:jim))
`(let ((,blah ,x))
(print ,blah))))
CL-USER> (test 10)
10
10
CL-USER>
Sweet, so it can be used in a gensym kind of way.
To me this looks cleaner than gensym, with an apparently identical result. I'm sure I'm missing a vital detail, so my question is: what is it?
Every time the macro is expanded, it will use the same symbol.
(defmacro foo () `(quote #:x))
(defmacro bar () `(quote ,(gensym)))
(eq (foo) (foo)) => t
(eq (bar) (bar)) => nil
Gensym will create a new symbol every time it is evaluated, but sharp colon will only create a new symbol when it is read.
While using sharp colon is unlikely to cause problems, there are a couple rare cases where using it would lead to nearly impossible to find bugs. It is better to be safe to begin with by always using gensym.
If you want to use something like sharp colon, you should look at the defmacro! macro from Let Over Lambda.
GENSYM is like MAKE-SYMBOL. The difference is that GENSYM supports fancier naming by counting up, so the symbols have more or less unique names, which makes debugging a bit easier when gensyms appear, for example, in macro expansions.
#:foo is a notation for the reader.
So you have a function which creates these and a literal notation. Note that, when *print-circle* is true, some kind of identity may be preserved in s-expressions: #(#1=#:FOO #1#).
Generally this is similar to (a . b) and (cons 'a 'b), #(a b) and (vector 'a 'b)... One is literal data and the other one is a form which will create ('cons') fresh objects.
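A small sketch of that difference (lit and fresh are just example names; a function returns the same literal object on each call, while cons makes a fresh pair every time):

(defun lit ()   '(a . b))        ; returns the literal cons from the source
(defun fresh () (cons 'a 'b))    ; conses a fresh pair on every call
(eq (lit) (lit))                 ; ==> T
(eq (fresh) (fresh))             ; ==> NIL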
If you look at your macro, the main problem is that nested usage of it could cause problems. Both lexically or dynamically.
lexically it could be the same variable, which is rebound.
dynamically, if it is a special variable it could also be rebound
Using a freshly generated symbol at macro expansion time makes sure that different expansions do not share bindings.
This is with SBCL 1.0.55 on Debian squeeze. I'm probably missing something obvious, but I'm a beginner, so please bear with me.
CL-USER> (defparameter x 0)
CL-USER> (case x (t 111) )
111
So it looks like case here is matching the variable x with the truth symbol t. This happens with everything I've tried; this x is just an example. I don't see why this would happen. Since case uses eql for matching, I tried
CL-USER> (eql x t)
NIL
So, eql does not match x and t. What am I missing? Thanks in advance.
Described in the CASE documentation.
otherwise-clause::= ({otherwise | t} form*)
The syntax says that an otherwise clause is either (otherwise form-1 ... form-n) or (t form-1 ... form-n). Note that the syntax says {otherwise | t}. The vertical bar is an OR in a syntax specification. So the marker for an otherwise clause is either otherwise or t.
That means, if your case clause begins with otherwise or t, then we have an otherwise-clause.
In the case construct in Common Lisp, t, used by itself, is equivalent to default in C; that is, it's evaluated if the expression doesn't match any of the other cases. If you want to match the actual symbol t, use (t) instead.
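A small sketch of the difference, with x bound to 0 as in the question:

(case x
  (t 111))            ; T here means "otherwise", so this always returns 111
==> 111

(case x
  ((t) 111)           ; (T) matches only the symbol T itself
  (otherwise 222))
==> 222

(case 't
  ((t) 111)
  (otherwise 222))
==> 111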