Besides the general question in the title,
How do functional programmers and functional languages approach the domain of simulations, which seem to be most naturally handled by object-oriented languages?
Are there open-source examples of complex simulations written in a (mostly) functional style?
What changes of perspective would an OO-programmer need, in order to approach simulations from a functional paradigm?
I'm asking this while learning how Clojure's creator Rich Hickey specifically sought to tame the "incidental complexity" of OO-programming and mutable state, e.g. Clojure's separation of identity and state makes a lot of sense (Hickey's ants.clj is on the study list). Another related area is using functional programming for games, which are often simulations with lots of stateful "things" all over the place; there are some articles/papers written about FP and games, more would be welcome.
Perhaps experienced functional programmers can provide additional background and advice on how to re-orient one's thinking to functional style, specifically for simulations. Thanks in advance!
Michal's answer is excellent, but I thought I'd add a couple other neat examples I've personally found helpful/interesting.
The first is a post (and code) about functional fluid dynamics by Lau Jensen. Though he definitely goes the mutable route for speed here, the style is rather functional. I'd bet that by Clojure 1.3 this could be done (mostly!) immutably with reasonable performance.
The next is a simple Snake game implemented in Clojure. Easy enough to read through in an hour or so, and the style is really pleasant and cohesive.
Also, some neat code to look at (I think!) is code modeling neural networks. Jeff Foster has some single layer perceptron code, and some more idiomatic revisions of the code. Worth looking at, even if you're not acquainted with NNs. He also has some more recent posts regarding fluid dynamics, though this time in Haskell. (Part I and Part II) Also fun, I think, is his implementation of the Game of Life (& Part II).
Finally, as Michal mentioned before me, Brian Carper is working on an RPG in Clojure. He recently posted some artwork for the game, so I'm betting it's still being worked on ;)
I love using the sequence libraries for working with tons of data; it feels more natural using abstractions like map and reduce, and fun, handy tools like juxt rather than simple imperative iterations. You do pay a tax, I've found, by using Clojure/functional langs in reimplementing well-known and well-implemented imperative algorithms.
Have fun!
I'm not sure that I'm up to the challenge of writing up a comprehensive analysis of the problem posed in the question, but I can at least post some interesting links on the FP vs. games front:
Jörg W. Mittag provides a number of interesting examples in this answer to a question on "real world" Haskell programming (with links to some interesting write-ups -- the Purely Functional Retrogames series is really worth a read).
In Clojure land, Phil Hagelberg has implemented a text-based adventure game for his PeepCode screencast on Clojure programming; the code is available on GitHub. Then there's Brian Carper's RPG project; no code publicly released yet and just that one post from a while ago (it looked very cool, though, so let's all come together to pressure Brian to continue ;-)). Finally, here's an example of a simple game using Penumbra (for some reason -- possibly unrelated to Clojure -- I couldn't get it to work, but maybe you will; plus there's a write-up attached).
As for simulations, studying ants.clj is a great idea. Also, I remember seeing a series of SICP-based lectures from an introductory programming course at UC Berkeley (I think...?) available somewhere (90% sure it was on their YouTube channel); they've got three lectures on OOP in Scheme and I think they mention simulation as a domain providing good use cases for the approach. Note that I have a pretty vague memory of this one, so it's hard for me to say how useful it might be to you.
I'm writing a game in Clojure, using a mostly functional style. For example, the entire game state is modelled as an immutable data structure.
This has required some slightly convoluted coding. For example, you frequently end up creating functions with a lot of parameters in order to pass around the various elements of game state and to ensure that updates are applied to the very latest version of that state.
But it has also thrown up some really nice advantages, for example concurrency has proved pretty trivial and you can do fun things like cloning the entire game state to run different simulations in the AI.
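To give a rough idea of what that looks like, here is a minimal sketch in the same spirit (not code from the actual game; the state shape and function names are invented for illustration). Every update is a pure function from the old state to a new state, and "cloning" the state for AI lookahead is just keeping a reference to the old value:

    (def initial-state
      {:turn  0
       :units {:u1 {:pos [0 0] :hp 10}}})

    ;; Each update is a pure function: old state in, new state out.
    (defn move-unit [state id pos]
      (assoc-in state [:units id :pos] pos))

    (defn end-turn [state]
      (update state :turn inc))

    ;; Run a hypothetical turn; the original value is untouched, so the AI
    ;; can simulate freely on top of any snapshot it likes.
    (def after (-> initial-state (move-unit :u1 [1 0]) end-turn))

    (:turn initial-state) ;; => 0
    (:turn after)         ;; => 1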
Overall, I'm delighted with Clojure as a language for simulations / games.
Based on this experience, the things I think would improve Clojure for games / simulations would be:
Better support for primitives, especially as function parameters and return values
Implementation of core language functions that are less hard on memory allocations (GC pressure is an issue for interactive games!)
You can see an early version of the game here: Ironclad - Generals of Steam. It's basically a steampunk themed strategy game.
Uncle Bob has been playing with Clojure lately and in particular writing an orbital simulator as his most public example.
Some links:
Blog post
Code on github
To complement the other answers: There is a discipline called Functional Reactive Programming that addresses the issue of functionally representing systems that change over time and react to external events. See
What is (functional) reactive programming?
Is functional GUI programming possible?
Functional Reactive Programming at The Haskell Wiki.
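To give a flavour of the core idea without committing to any particular FRP library, here is a toy sketch in Clojure (real FRP systems add events, switching and efficient sampling; the names below are invented). A "behaviour" is just a value that varies with time, i.e. a function of t, and combinators build new behaviours out of old ones:

    (defn constant-b [x] (fn [t] x))   ;; a behaviour that never changes
    (def time-b identity)              ;; the current time, as a behaviour

    (defn lift2 [f b1 b2]              ;; combine two behaviours pointwise
      (fn [t] (f (b1 t) (b2 t))))

    (def position-b (lift2 + (constant-b 100) time-b))  ;; x(t) = 100 + t

    (position-b 0)   ;; => 100
    (position-b 25)  ;; => 125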
Simulations are a form of interpreter -- which are easy to write in a functional style. They can be also designed as self-optimizing simulators, based on treating them as a compiler.
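A minimal sketch of that view in Clojure (all names invented for illustration): the world is plain data, each event is an "instruction", and running the simulation is just folding an interpreter function over the event stream.

    (defn interpret [world event]
      (case (:type event)
        :spawn (assoc-in world [:agents (:id event)] {:energy 10})
        :feed  (update-in world [:agents (:id event) :energy] + (:amount event))
        :tick  (update world :time inc)
        world))                                  ;; ignore unknown events

    (def events
      [{:type :spawn :id :a1}
       {:type :feed  :id :a1 :amount 5}
       {:type :tick}])

    (reduce interpret {:time 0 :agents {}} events)
    ;; => {:time 1, :agents {:a1 {:energy 15}}}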
Related
I'm struggling with understanding the difference between functional and imperative programming. From reading https://www.sitepoint.com/what-is-functional-programming/ I see that there are a number of principles in functional programming that I use all the time in what I thought was an imperative programming.
I've read that functional programs use pure functions, so does that mean every time I make or use a pure function I'm writing in the functional paradigm?
I've also passed functions in and used them as first class objects, does that mean I was writing in the functional paradigm?
I pretty much use all of the functional paradigm principles in my code, but I never thought I was doing functional programming. Is the act of using any of these functional programming principles considered the functional programming paradigm?
Functional principles are just techniques and ideas. They are the bread and butter of the functional programming paradigm, which is what you get when you take these tools and lean on their unique strengths to gain system-wide advantages.
A pure function is just a function with no side effects. You've written a million of these. But now, if you write only pure functions, your app can be split across processing cores with no effort or risk.
You've used constants before. But if you almost always use constants, then the things that are variable are the only things you have to think about when tracing code, and that is quite an advantage.
And you've chained functions before, but when you make everything pipe-able your entire language begins to feel like wiring up data flows, rather than giving the computer step-by-step instructions. This is much easier for humans to reason about and is less error-prone.
The techniques always have their advantages. When they become baseline assumptions, those advantages multiply. That's the functional paradigm.
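A small, hypothetical example of both points in Clojure: each step is a side-effect-free transformation, and the threading macro wires them into a data flow rather than step-by-step instructions.

    (defn active-emails [users]
      (->> users
           (filter :active?)   ;; keep active users
           (map :email)        ;; project the field we need
           (remove nil?)       ;; drop missing addresses
           (into #{})))        ;; dedupe into a set

    (active-emails
      [{:name "a" :active? true  :email "a@example.com"}
       {:name "b" :active? false :email "b@example.com"}
       {:name "c" :active? true  :email nil}])
    ;; => #{"a@example.com"}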
Moving my comments here for clarity:
good question! In this article medium.com/#charlesbailey333/… it talked about how Rust had advantages over C++ because it incorporates functional programming ideas better. The evidence they give is that it supports Map, Reduce, and Filter. It almost seems like they're saying that those functions are "functional programming functions", but I don't think those functions are anything special. – Joshua Segal 16 mins ago
Okay great! This I can help with. So this author is struggling to use the actual term for what they're referencing. It's called "expressiveness". Basically it means: how close is the code I'm writing to the mental model of what I'm doing? For example, you want to give someone directions on how to get from A to B. Ideally, you do this by expressing it in turns and street names. However, if your language forces you to express this using the angle of the accelerator pedal and the angle of the steering wheel, it is much clunkier to do. C++ did it clunky. Rust did it elegantly and expressively.
In general, the functional and declarative languages tend to be much better at "expressing" your ideas in code and visually. You have branching paths? Your code literally looks like a branching tree. You have a data flow with a transformer? Guess what friend, that's just a function that transforms X to Y and some sort of pipe operator that takes care of looping and new info.
The thing is, expressiveness isn't a statistic or something you can optimize for. It's an emergent "feeling" when using the language. The paradigms are general principles that tend to be internally consistent that produce useful "feelings". FP feels like flowing pipes and transformations. OOP feels like gadgets and features that talk to each other. The different mental models have different uses. FP is better for data processing. OOP can be good for UI and stateful services. At the boundaries they can clash a little, which is where the clunk comes from in C++.
At this point anything that is "completely OOP" or "completely FP" is usually shit, to be frank, so it can be a little hard to see the identities of the two when they are so merged. If you do complete OOP you can't compose anything and you have to write a million connector classes. If you do complete FP you can't modify state or have side effects (like... uh... showing stuff on screen?). These are genres. What makes something a house beat? If something else uses a house beat is it automatically house music? Does anyone care about the categorization?
Software Engineering as it is taught today is entirely focused on object-oriented programming and the 'natural' object-oriented view of the world. There is a detailed methodology that describes how to transform a domain model into a class model with several steps and a lot of (UML) artifacts like use-case-diagrams or class-diagrams. Many programmers have internalized this approach and have a good idea about how to design an object-oriented application from scratch.
The new hype is functional programming, which is taught in many books and tutorials. But what about functional software engineering?
While reading about Lisp and Clojure, I came about two interesting statements:
Functional programs are often developed bottom up instead of top down ('On Lisp', Paul Graham)
Functional programmers use maps where OO programmers use objects/classes ('Clojure for Java Programmers', talk by Rich Hickey).
So what is the methodology for a systematic (model-based ?) design of a functional application, i.e. in Lisp or Clojure? What are the common steps, what artifacts do I use, how do I map them from the problem space to the solution space?
Thank God that the software-engineering people have not yet discovered functional programming. Here are some parallels:
Many OO "design patterns" are captured as higher-order functions. For example, the Visitor pattern is known in the functional world as a "fold" (or if you are a pointy-headed theorist, a "catamorphism"). In functional languages, data types are mostly trees or tuples, and every tree type has a natural catamorphism associated with it.
These higher-order functions often come with certain laws of programming, aka "free theorems".
Functional programmers use diagrams much less heavily than OO programmers. Much of what is expressed in OO diagrams is instead expressed in types, or in "signatures", which you should think of as "module types". Haskell also has "type classes", which are a bit like interface types.
Those functional programmers who use types generally think that "once you get the types right, the code practically writes itself."
Not all functional languages use explicit types, but the How To Design Programs book, an excellent book for learning Scheme/Lisp/Clojure, relies heavily on "data descriptions", which are closely related to types.
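To make the fold/catamorphism point concrete, here is a sketch in Clojure (the tree representation is invented for the example: a tree is either a number, i.e. a leaf, or a vector of subtrees). The traversal is written once; each use just supplies the functions to apply at leaves and at branches, which is exactly the job a Visitor does in OO code.

    (defn fold-tree [leaf-f branch-f tree]
      (if (vector? tree)
        (branch-f (map #(fold-tree leaf-f branch-f %) tree))
        (leaf-f tree)))

    (def t [1 [2 3] [4 [5]]])

    (fold-tree identity      #(reduce + %) t)  ;; sum of leaves => 15
    (fold-tree (constantly 1) #(reduce + %) t) ;; leaf count    => 5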
So what is the methodology for a systematic (model-based ?) design of a functional application, i.e. in Lisp or Clojure?
Any design method based on data abstraction works well. I happen to think that this is easier when the language has explicit types, but it works even without. A good book about design methods for abstract data types, which is easily adapted to functional programming, is Abstraction and Specification in Program Development by Barbara Liskov and John Guttag, the first edition. Liskov won the Turing award in part for that work.
Another design methodology that is unique to Lisp is to decide what language extensions would be useful in the problem domain in which you are working, and then use hygienic macros to add these constructs to your language. A good place to read about this kind of design is Matthew Flatt's article Creating Languages in Racket. The article may be behind a paywall. You can also find more general material on this kind of design by searching for the term "domain-specific embedded language"; for particular advice and examples beyond what Matthew Flatt covers, I would probably start with Graham's On Lisp or perhaps ANSI Common Lisp.
What are the common steps, what artifacts do I use?
Common steps:
Identify the data in your program and the operations on it, and define an abstract data type representing this data.
Identify common actions or patterns of computation, and express them as higher-order functions or macros. Expect to take this step as part of refactoring.
If you're using a typed functional language, use the type checker early and often. If you're using Lisp or Clojure, the best practice is to write function contracts first including unit tests—it's test-driven development to the max. And you will want to use whatever version of QuickCheck has been ported to your platform, which in your case looks like it's called ClojureCheck. It's an extremely powerful library for constructing random tests of code that uses higher-order functions.
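For a flavour of what such property tests look like, here is a sketch using clojure.test.check, a later QuickCheck port for Clojure (the property itself is a deliberately trivial example, not domain code):

    (require '[clojure.test.check :as tc]
             '[clojure.test.check.generators :as gen]
             '[clojure.test.check.properties :as prop])

    ;; Property: reversing a vector twice gives back the original.
    (def reverse-twice-is-identity
      (prop/for-all [v (gen/vector gen/small-integer)]
        (= v (reverse (reverse v)))))

    (tc/quick-check 100 reverse-twice-is-identity)
    ;; => {:result true, :num-tests 100, ...}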
For Clojure, I recommend going back to good old relational modeling. Out of the Tar Pit is an inspirational read.
Personally I find that all the usual good practices from OO development apply in functional programming as well - just with a few minor tweaks to take account of the functional worldview. From a methodology perspective, you don't really need to do anything fundamentally different.
My experience comes from having moved from Java to Clojure in recent years.
Some examples:
Understand your business domain / data model - equally important whether you are going to design an object model or create a functional data structure with nested maps. In some ways, FP can be easier because it encourages you to think about data model separately from functions / processes but you still have to do both.
Service orientation in design - actually works very well from a FP perspective, since a typical service is really just a function with some side effects. I think that the "bottom up" view of software development sometimes espoused in the Lisp world is actually just good service-oriented API design principles in another guise.
Test Driven Development - works well in FP languages, in fact sometimes even better because pure functions lend themselves extremely well to writing clear, repeatable tests without any need for setting up a stateful environment. You might also want to build separate tests to check data integrity (e.g. does this map have all the keys in it that I expect, to balance the fact that in an OO language the class definition would enforce this for you at compile time).
Prototyping / iteration - works just as well with FP. You might even be able to prototype live with users if you get extremely good at building tools / DSLs and using them at the REPL.
OO programming tightly couples data with behavior. Functional programming separates the two. So you don't have class diagrams, but you do have data structures, and you particularly have algebraic data types. Those types can be written to very tightly match your domain, including eliminating impossible values by construction.
So there aren't books and books on it, but there is a well established approach to, as the saying goes, make impossible values unrepresentable.
In so doing, you can make a range of choices about representing certain types of data as functions instead, and conversely, representing certain functions as a union of data types instead so that you can get, e.g., serialization, tighter specification, optimization, etc.
Then, given that, you write functions over your ADTs such that you establish some sort of algebra -- i.e. there are fixed laws which hold for these functions. Some are maybe idempotent -- the same after multiple applications. Some are associative. Some are transitive, etc.
Now you have a domain over which you have functions which compose according to well behaved laws. A simple embedded DSL!
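A toy Clojure illustration of that kind of algebra (the domain is invented for the example): treat a user's permissions as a set and let `grant` combine them; the laws come for free from set union.

    (require '[clojure.set :as set])

    (def grant set/union)

    ;; Laws that hold for grant (stated here, not proved):
    ;;   associative: (grant a (grant b c)) = (grant (grant a b) c)
    ;;   commutative: (grant a b)           = (grant b a)
    ;;   idempotent:  (grant a a)           = a
    ;;   identity:    (grant a #{})         = a

    (grant #{:read} (grant #{:write} #{:read}))
    ;; => #{:read :write}   (set element order may vary)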
Oh, and given properties, you can of course write automated randomized tests of them (ala QuickCheck).. and that's just the beginning.
Object Oriented design isn't the same thing as software engineering. Software engineering has to do with the entire process of how we go from requirements to a working system, on time and with a low defect rate. Functional programming may be different from OO, but it does not do away with requirements, high level and detailed designs, verification and testing, software metrics, estimation, and all that other "software engineering stuff".
Furthermore, functional programs do exhibit modularity and other structure. Your detailed designs have to be expressed in terms of the concepts in that structure.
One approach is to create an internal DSL within the functional programming language of choice. The "model" then is a set of business rules expressed in the DSL.
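As a rough sketch of that idea in Clojure (domain and rule names invented for illustration): the rules are plain data, and the "engine" that interprets them is a few lines of ordinary code.

    (def rules
      [{:name :bulk-discount
        :if   (fn [order] (>= (:quantity order) 100))
        :then (fn [order] (update order :discount (fnil + 0) 10))}
       {:name :loyal-customer
        :if   (fn [order] (:loyal? order))
        :then (fn [order] (update order :discount (fnil + 0) 5))}])

    ;; Apply each rule whose condition holds, threading the order through.
    (defn apply-rules [rules order]
      (reduce (fn [o rule] (if ((:if rule) o) ((:then rule) o) o))
              order
              rules))

    (apply-rules rules {:quantity 120 :loyal? true})
    ;; => {:quantity 120, :loyal? true, :discount 15}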
See my answer to another post:
How does Clojure approach Separation of Concerns?
I agree more needs to be written on the subject on how to structure large applications that use an FP approach (Plus more needs to be done to document FP-driven UIs)
While this might be considered naive and simplistic, I think "design recipes" (a systematic approach to problem solving applied to programming as advocated by Felleisen et al. in their book HtDP) would be close to what you seem to be looking for.
Here, a few links:
http://www.northeastern.edu/magazine/0301/programming.html
http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.86.8371
I've recently found this book:
Functional and Reactive Domain Modeling
I think is perfectly in line with your question.
From the book description:
Functional and Reactive Domain Modeling teaches you how to think of the domain model in terms of pure functions and how to compose them to build larger abstractions. You will start with the basics of functional programming and gradually progress to the advanced concepts and patterns that you need to know to implement complex domain models. The book demonstrates how advanced FP patterns like algebraic data types, typeclass based design, and isolation of side-effects can make your model compose for readability and verifiability.
There is the "program calculation" / "design by calculation" style associated with Prof. Richard Bird and the Algebra of Programming group at Oxford University (UK), I don't think its too far-fetched to consider this a methodology.
Personally while I like the work produced by the AoP group, I don't have the discipline to practice design in this way myself. However that's my shortcoming, and not one of program calculation.
I've found Behavior Driven Development to be a natural fit for rapidly developing code in both Clojure and SBCL. The real upside of leveraging BDD with a functional language is that I tend to write much finer-grained unit tests than I usually do when using procedural languages, because I do a much better job of decomposing the problem into smaller chunks of functionality.
Honestly if you want design recipes for functional programs, take a look at the standard function libraries such as Haskell's Prelude. In FP, patterns are usually captured by higher order procedures (functions that operate on functions) themselves. So if a pattern is seen, often a higher order function is simply created to capture that pattern.
A good example is fmap. This function takes a function as an argument and applies it to all the "elements" of the second argument. Since it is part of the Functor type class, any instance of a Functor (such as a list, graph, etc...) may be passed as a second argument to this function. It captures the general behavior of applying a function to every element of its second argument.
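Clojure has no Functor type class in core, but the same pattern -- "apply f to every element, keep the shape" -- is captured by ordinary higher-order functions. `map` does it for sequences, and a few lines give you a tree version (fmap-tree is a made-up name for this sketch):

    (require '[clojure.walk :as walk])

    (defn fmap-tree [f tree]
      ;; apply f to every number in an arbitrarily nested structure
      (walk/postwalk #(if (number? %) (f %) %) tree))

    (map inc [1 2 3])               ;; => (2 3 4)
    (fmap-tree inc [1 [2 [3 4]] 5]) ;; => [2 [3 [4 5]] 6]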
Well,
Generally, many functional programming languages have been used at universities for a long time, for "small toy problems."
They are getting more popular now since OOP has difficulties with "parallel programming" because of "state". And sometimes a functional style is simply better for the problem at hand, as with Google's MapReduce.
I am sure that when the functional guys hit the wall [try to implement systems bigger than 1,000,000 lines of code], some of them will come up with new software-engineering methodologies with buzz words :-). They should answer the old question: how do we divide a system into pieces so that we can "bite" each piece one at a time [i.e. work in an iterative, incremental and evolutionary way] using a functional style?
It is certain that the functional style will affect our object-oriented style. We have already "stolen" many concepts from functional systems and adapted them to our OOP languages.
But will functional programs be used for such big systems? Will they become mainstream? That is the question.
And nobody can come up with a realistic methodology without implementing such a big system, without making his or her hands dirty. First you should make your hands dirty, then suggest solutions. Solutions or suggestions without "real pains and dirt" will be "fantasy".
I'm getting a bit tired of having to code explicitly for multicore if I want more speed, particularly when I'm just writing a one-off script. My dev box already has 8 cores and that number is going up a lot faster than the clock speed. Functional languages seem to offer a potential escape hatch, but I haven't put in the effort to master one of them yet.
I'd love to see some sample chunks of real-world code that are much better and/or more parallelizable than non-functional alternatives. I'm not picky about the language -- I'm more interested in the concepts.
Thanks!
How about MapReduce? It's incredibly parallelizable, and even though the implementation described in the paper isn't in a functional language, it was inspired by Lisp's map and reduce.
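A small Clojure sketch of that map/reduce shape (a toy word count, not Google's system): because nothing here mutates shared state, swapping `map` for `pmap` parallelises the per-document work with no other changes.

    (require '[clojure.string :as str])

    (defn word-counts [doc]
      (frequencies (str/split doc #"\s+")))

    (defn total-counts [docs]
      ;; pmap does the per-document "map" step in parallel;
      ;; merge-with + is the "reduce" step.
      (reduce (partial merge-with +) {} (pmap word-counts docs)))

    (total-counts ["to be or not to be" "to see or not to see"])
    ;; => {"to" 4, "be" 2, "or" 2, "not" 2, "see" 2}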
This (long, but very good) video gives both an intro to F# and a compelling demo of how easy it is to parallelize code in the language:
http://channel9.msdn.com/pdc2008/TL11/
Your question is asking for material right at the state of the art. I think your best introduction to this field, with examples, is the book Implicit Parallel Programming in pH by Nikhil and Arvind.
LINQ is a nice example of functional programming in mainstream languages. Reified code and monads? In MY C#? :) Anyways, w.r.t. threading, there's mention of Parallel LINQ. By using immutability and higher order functions (and Expression, perhaps), libraries can parallelize things for us.
And another link to F# with async workflows. What's impressive is the ability to take sync code and, with a few small annotations, turn it into async code. The code retains a lot of the imperative qualities you might be used to. You don't have to completely change things to take advantage of this; the compiler handles it all.
There's an extended example of a text indexer/searcher using mapreduce in Chapter 20 ("Programming Multi-core CPUs") of Programming Erlang. I don't know how impressive that is, but it looks like code mortals can write.
Purely Functional Data Structures (long PDF), by Chris Okasaki.
A teacher of mine used to joke that the greatest example of functional code is the code that is not written.
SICP - "Structure and Interpretation of Computer Programs"
An explanation of the same would be nice.
Can someone explain metalinguistic abstraction?
SICP really drove home the point that it is possible to look at code and data as the same thing.
I understood this before when thinking about universal Turing machines (the input to a UTM is just a representation of a program) or the von Neumann architecture (where a single storage structure holds both code and data), but SICP made the idea much more clear. Scheme (Lisp) helped here, as the syntax for a program is exactly the same as the syntax for lists in general, namely S-expressions.
Once you have the "equivalence" of code and data, suddenly a lot of things become easy. For example, you can write programs that have different evaluation methods (lazy, nondeterministic, etc). Previously, I might have thought that this would require an extension to the programming language; in reality, I can just add it on to the language myself, thus allowing the core language to be minimal. As another example, you can similarly implement an object-oriented framework; again, this is something I might have naively thought would require modifying the language.
Incidentally, one thing I wish SICP had mentioned more: types. Type checking at compilation time is an amazing thing. The SICP implementation of object-oriented programming did not have this benefit.
I didn't read that book yet, I have only looked at the video courses, but it taught me a lot. Functions as first class citizens was mind blowing for me. Executing a "variable" was something very new to me. After watching those videos the way I now see JavaScript and programming in general has greatly changed.
Oh, I think I've lied, the thing that really struck me was that + was a function.
I think the most surprising thing about SICP is to see how few primitives are actually required to make a Turing complete language--almost anything can be built from almost nothing.
Since we are discussing SICP, I'll put in my standard plug for the video lectures at http://groups.csail.mit.edu/mac/classes/6.001/abelson-sussman-lectures/, which are the best Introduction to Computer Science you could hope to get in 20 hours.
The one that I thought was really cool was streams with delayed evaluation. The one about generating primes was something I thought was really neat. Like a "PEZ" dispenser that magically dispenses the next prime in the sequence.
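A sketch of that prime "dispenser" in Clojure, using lazy sequences in place of SICP's streams (simple trial division against the primes found so far, not an efficient sieve; the names are invented for the example):

    (defn lazy-primes
      ([] (lazy-primes [] 2))
      ([found n]
       (if (some #(zero? (mod n %))
                 (take-while #(<= (* % %) n) found))
         (recur found (inc n))                              ;; composite: try the next candidate
         (lazy-seq (cons n (lazy-primes (conj found n) (inc n)))))))

    (take 10 (lazy-primes))
    ;; => (2 3 5 7 11 13 17 19 23 29)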
One example of "the data and the code are the same thing" from A. Rex's answer got me in a very deep way.
When I was taught Lisp back in Russia, our teachers told us that the language was about lists: car, cdr, cons. What really amazed me was the fact that you don't need those functions at all - you can write your own, given closures. So, Lisp is not about lists after all! That was a big surprise.
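The classic demonstration, translated to Clojure: cons, car and cdr built from nothing but closures. A "pair" is just a function that remembers its two arguments and hands them to whatever selector you pass in.

    (defn my-cons [a b] (fn [pick] (pick a b)))   ;; a pair is a closure over a and b
    (defn my-car  [pair] (pair (fn [a b] a)))
    (defn my-cdr  [pair] (pair (fn [a b] b)))

    (my-car (my-cons 1 2)) ;; => 1
    (my-cdr (my-cons 1 2)) ;; => 2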
A concept I was completely unfamiliar with was the idea of coroutines, i.e. having two functions doing complementary work and having the program flow control alternate between them.
I was still in high school when I read SICP, and I had focused on the first and second chapters. For me at the time, I liked that you could express all those mathematical ideas in code, and have the computer do most of the dirty work.
When I was tutoring SICP, I got impressed by different aspects. For one, the conundrum that data and code are really the same thing, because code is executable data. The chapter on metalinguistic abstraction is mind-boggling to many and has many take-home messages. The first is that all the rules are arbitrary. This bothers some students, especially those who are physicists at heart. I think the beauty is not in the rules themselves, but in studying the consequences of the rules. A one-line change in code can mean the difference between lexical scoping and dynamic scoping.
Today, though SICP is still fun and insightful to many, I do understand that it's becoming dated. For one, it doesn't teach debugging skills and tools (I include type systems in there), which is essential for working in today's gigantic systems.
I was most surprised by how easy it is to implement languages: that one could write an interpreter for Scheme on a blackboard.
I came to feel recursion in a different sense after reading some of the chapters of SICP.
I am right now on Section "Sequences as Conventional Interfaces" and have found the concept of procedures as first class citizens quite fascinating. Also, the application of recursion is something I have never seen in any language.
Closures.
Coming from a primarily imperative background (Java, C#, etc. -- I only read SICP a year or so ago for the first time, and am re-reading it now), thinking in functional terms was a big revelation for me; it totally changed the way I think about my work today.
I read most of the book (without doing the exercises). What I have learned is how to abstract the real world at a specific level, and how to implement a language.
Each chapter has ideas surprise me:
The first two chapters show me two ways of abstracting the real world: abstraction with the procedure, and abstraction with data.
Chapter 3 introduces time in the real world. That results in states. We try assignment, which raises problems. Then we try streams.
Chapter 4 is about metalinguistic abstraction, in other words, we implement a new language by constructing an evaluator, which determines the meaning of expressions.
Since the evaluator in Chapter 4 is itself a Lisp program, it inherits the control structure of the underlying Lisp system. So in Chapter 5, we dive into the step-by-step operation of a real computer with the help of an abstract model, the register machine.
Thanks.
I really feel that I should learn Lisp and there are plenty of good resources out there to help me do it.
I'm not put off by the complicated syntax, but where in "traditional commercial programming" would I find places where it would make sense to use it instead of a procedural language?
Is there a commercial killer-app out there that's been written in Lisp?
Lisp is a large and complex language with a large and complex runtime to support it. For that reason, Lisp is best suited to large and complicated problems.
Now, a complex problem isn't the same as a complicated one. A complex problem is one with a lot of small details, but which isn't hard. Writing an airline booking system is a complex business, but with enough money and programmers it isn't hard. Get the difference?
A complicated problem is one which is convoluted, one where traditional divide and conquer doesn't work. Controlling a robot, or working with data that isn't tabular (languages, for example), or highly dynamic situations.
Lisp is really well suited to problems where the solution must be expandable; the classic example is the Emacs text editor. It is fully programmable, and thus a programming environment in its own right.
In his famous book PAIP, Norvig says that Lisp is ideal for exploratory programming. That is, programming a solution to a problem that isn't fully understood (as opposed to an on-line booking system). In other words: Complicated problems.
Furthermore, learning Lisp will remind you of something fundamental that has been forgotten: The difference between Von Neumann and Turing. As we know, Turing's model of computation is an interesting theoretical model, but useless as a model for designing computers. Von Neumann, on the other hand, designed a model of how computers and computation were to execute: The Von Neumann model.
Central to the Von Neumann model is that you have but one memory, and store both your code and your data there. Notice carefully that a Java program (or C#, or whatever you like) is a manifestation of the Turing model. You set your program in concrete, once and for all. Then you hope you can deal with all data that gets thrown on it.
Lisp maintains the Von Neumann model; there is no sharp, pre-determined border between code and data. Programming in Lisp opens your mind to the power of the Von Neumann model. Programming in Lisp makes you see old concepts in a new light.
Finally, being interactive, you'll learn to interact with your programs as you develop them (as opposed to compile and run). This also changes the way you program, and the way you view programming.
With this intro I can finally offer a reply to your question: Will you find places where it outshines "traditional" languages?
If you are an advanced programmer, you need advanced tools. And there is no tool more advanced than Lisp.
Or, in other words: The answer is yes if your problems are hard. No otherwise.
One of the main uses for Lisp is in Artificial Intelligence. A friend of mine at college took a graduate AI course and for his main project he wrote a "Lights Out" solver in Lisp. Multiple versions of his program utilized slightly different AI routines and testing on 40 or so computers yielded some pretty neat results (I wish it was online somewhere for me to link to, but I don't think it is).
Two semesters ago I used Scheme (a language based on Lisp) to write an interactive program that simulated Abbott and Costello's "Who's on First" routine. Input from the user was matched against some pretty complicated data structures (resembling maps in other languages, but much more flexible) to choose what an appropriate response would be. I also wrote a routine to solve a 3x3 slide puzzle (an algorithm which could easily be extended to larger slide puzzles).
In summary, learning Lisp (or Scheme) may not yield many practical applications beyond AI but it is an extremely valuable learning experience, as many others have stated. Programming in a functional language like Lisp will also help you think recursively (if you've had trouble with recursion in other languages, this could be a great help).
In response to #lassevk:
complicated syntax??
The syntax for lisp is incredibly simple.
Killer app written in lisp: emacs. Lisp will allow you to extend emacs at will to do almost anything you can think of that an editor might do.
But, you should only learn Lisp if you want to, and you may never get to use it at work, but it is still awesome.
Also, I want to add: even if you find places where lisp will make sense, you will probably not convince anyone else that it should be used over java, c++, c#, python, ruby, etc.
I can't answer from first-hand experience but you should read what Paul Graham wrote on Lisp. As for the "killer-app" part, read Beating the averages.
I programmed in Lisp professionally for about a year, and it is definitely worth learning. You will have unparalleled opportunity to remove redundancy from your code, by being able to replace all boilerplate code with functions where possible, and macros where not. You will also be able to access unparalleled flexibility at runtime, translating freely between code and data. Thus, situations where user actions can trigger the need to build complex structures dynamically is where Lisp truly shines. Popular airline flight schedulers are written in Lisp, and there is also a lot of CAD/CAM in Lisp.
Lisp is very useful for creating little DSLs. I've got a copy of Lisp in a Box running at work and I've written little DSLs to interrogate SQL server databases and generate data layers etc in C#. All my boiler plate code is now written in lisp macros that output to C#. I generate HTML, XML, all sorts of things with it. While I wish I could use Lisp for everyday coding, Lisp can bring practical benefits.
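To give a feel for the shape of such a little DSL, here is a toy sketch in Clojure rather than Common Lisp (nothing like the real thing; the entity description and the generated SQL strings are invented for illustration). A declarative description expands, at macroexpansion time, into the boilerplate you would otherwise write by hand:

    (require '[clojure.string :as string])

    (defmacro defentity [entity & columns]
      (let [table (str entity)
            cols  (string/join ", " (map str columns))
            marks (string/join ", " (repeat (count columns) "?"))]
        `(def ~entity
           {:select ~(str "SELECT " cols " FROM " table)
            :insert ~(str "INSERT INTO " table " (" cols ") VALUES (" marks ")")})))

    (defentity customers id name email)

    (:select customers)
    ;; => "SELECT id, name, email FROM customers"
    (:insert customers)
    ;; => "INSERT INTO customers (id, name, email) VALUES (?, ?, ?)"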
If you like programming you should learn Lisp for the pure joy of it. XKCD perfectly expresses the intellectual enlightenment that ensues. Learning Lisp is for the programmer what meditation is for the Buddhist monk (and I meant this without any blasphemous connotation).
Any language looks a lot harder when one doesn't use its common indentation conventions. When one follows them for Lisp, one sees how readily it expresses a syntax-tree structure:
    (defun quicksort (lis)
      (if (null lis)
          nil
          (let* ((x (car lis))
                 (r (cdr lis))
                 (fn (lambda (a) (< a x))))
            (append (quicksort (remove-if-not fn r))
                    (list x)
                    (quicksort (remove-if fn r))))))
I found that learning a new language always influences your programming style in the languages you already know. For me, it always meant thinking of different ways to solve a problem in my primary language, which is Java. I think in general it just widens your horizons in terms of programming.
I took a "lisp class" in college back in the eighties. Despite grokking all the concepts presented in the class, I was left without any appreciation for what makes lisp great. I'm afraid that a lot of people look at lisp as just another programming language, which is what that course in college did for me so many years ago. If you see someone complaining about lisp syntax (or lack thereof), there's a good chance that they're one of those people who has failed to grasp lisp's greatness. I was one of those people for a very long time.
It wasn't until two decades later, when I rekindled my interest in lisp, that I began to "get" what makes lisp interesting--for me anyway. If you manage to learn lisp without having your mind blown by closures and lisp macros, you've probably missed the point.
Learning LISP/Scheme may not give you any increased application space, but it will help you get a better sense of functional programming, its rules, and its exceptions.
It's worth the time investment just to learn the difference between the beauty of six nested pure functions and the nightmare of six nested functions with side effects.
From http://www.gigamonkeys.com/book/introduction-why-lisp.html
One of the most commonly repeated myths about Lisp is that it's "dead." While it's true that Common Lisp isn't as widely used as, say, Visual Basic or Java, it seems strange to describe a language that continues to be used for new development and that continues to attract new users as "dead." Some recent Lisp success stories include Paul Graham's Viaweb, which became Yahoo Store when Yahoo bought his company; ITA Software's airfare pricing and shopping system, QPX, used by the online ticket seller Orbitz and others; Naughty Dog's game for the PlayStation 2, Jak and Daxter, which is largely written in a domain-specific Lisp dialect Naughty Dog invented called GOAL, whose compiler is itself written in Common Lisp; and the Roomba, the autonomous robotic vacuum cleaner, whose software is written in L, a downwardly compatible subset of Common Lisp.
Perhaps even more telling is the growth of the Common-Lisp.net Web site, which hosts open-source Common Lisp projects, and the number of local Lisp user groups that have sprung up in the past couple of years.
If you have to ask yourself if you should learn lisp, you probably don't need to.
Learning Lisp will put JavaScript in a completely different light! Lisp really forces you to grasp both recursion and the whole "functions as first class objects" paradigm. See Crockford's excellent article on Scheme vs JavaScript. JavaScript is perhaps the most important language around today, so understanding it better is immensely useful!
"Lisp is worth learning for the profound enlightenment experience you will have when you finally get it; that experience will make you a better programmer for the rest of your days, even if you never actually use Lisp itself a lot."
--Eric S. Raymond, "How to Become a Hacker"
http://www.paulgraham.com/avg.html
I agree that Lisp is one of those languages that you may never use in a commercial setting. But even if you don't get to, learning it will definitely expand your understanding of programming as a whole. For example, I learned Prolog in college and while I never used it afterwards, it gave me a greater understanding of many programming concepts and (at times) a greater appreciation for the languages I do use.
But if you are going to learn it...by all means, read On Lisp
Complicated syntax? The beauty of lisp is that it has a ridiculously simple syntax. It's just a list, where each element of the list can be either another list or an elementary data type.
It's worth learning because of the way it enhances your coding ability to think about and use functions as just another data type. This will improve upon the way you code in an imperative and/or object-oriented language because it will allow you to be more mentally flexible with how your code is structured.
GIMP's Script-Fu is Lispish. That's a Photoshop-killer app.
Okay, I might be weird, but I really don't like Paul Graham's essays that much, and On Lisp is a really rough-going book if you don't have some grasp of Common Lisp already. Instead, I'd say go for Seibel's Practical Common Lisp. As for "killer apps", Common Lisp seems to find its place in niche shops, like ITA, so while there isn't an app synonymous with CL the way Rails is for Ruby, there are places in industry that use it if you do a little digging.
To add to the other answers:
Because the SICP course (the videos are available here) is awesome: teaches you Lisp and a lot more!
Killer app? Franz Inc. has a long list of success stories, but this list only includes users of AllegroCL... There are probably others. My favourite is the story about Naughty Dog, since I was a big fan of the Crash Bandicoot games.
For learning Common Lisp, I'd recommend Practical Common Lisp. It has a hands-on approach that at least for me made it easier than other books I've looked at.
You could use Clojure today to write tests and scripts on top of the Java VM. While there are other Lisp languages implemented on the JVM, I think Clojure does the best job of integrating with Java.
There are times when the Java language itself gets in the way of writing tests for Java code (including "traditional commercial programming"). (I don't mean that as an indictment of Java -- other languages suffer from the same problem -- but it's a fact. Since the topic here is Lisp, not Java, I won't elaborate. Please feel free to start a new topic if someone wants to discuss it.) Clojure eliminates many of those hindrances.
Lisp can be used anywhere you use traditional programming. It's not that different, it's just more powerful. Writing a web app? You can do it in Lisp. Writing a desktop application? You can do it in Lisp. Whatever it is, you can probably do it in Lisp, or Python, or any other general-purpose language (there are a few languages that are suited to only one task).
The biggest obstacle will probably be acceptance of your boss, your peers or your customers. That's something you will have to work with them. Choosing a pragmatic solution like Clojure that can leverage the current install base of Java infrastructure, from the JVM to the libraries, might help you. Also, if you have a Java program, you may do a plug-in architecture and write Clojure plug-ins for it and end up writing half your code in Clojure.
Not a reason but (trivial) AutoCAD has LISP & DCL runtime support. It is a convenient way to write complex macros (including ActiveX automation) if you don't want to use VBA or their C++ or .NET SDKs, or if a DIESEL expression doesn't cut it.
A lot of AutoCAD's functions are actually LISP routines.
This is a topic I myself have pondered for a while, but I have not really come to a decision; as usual, time is the main problem... ;)
And since I can't find these links so far in this post, I'll add them for public interest:
Success and Failure story:
Lisping at JPL
Really impressive success story:
Lisp in use at the Orbitz corporation
Comparison and analysis of whether to use Lisp instead of Java:
Lisp as an Alternative to Java
Syntax is irrelevant, readability is not!
Not saying this is a killer app but it looks like it could be cool
http://code.google.com/p/plop/
Killer app? The flight search engine by ITA Software is one.
As for "why", it will most probably make you a better developer and is extremnely unlikely to make you a worse one. It may, however, make you prefer lisp dialects to other languages.