An example of a practical application of Isabelle/HOL

I have looked into the Isabelle tutorial, which presents an example of its use in verifying a security protocol. However, it is a bit beyond my understanding, as I only know the basics. I'm looking for some examples which are not just simple theorems but practical applications using Isabelle/HOL.
For example, proving some algorithms, or maybe verifying properties of a system, or some non-trivial mathematical theorem. Are such examples available anywhere?
I have looked into the list of all applications provided on the official Isabelle page, but most of them are proofs of theorems.
I am also looking at an example of a file system verification using Alloy. It provides a proof where the properties of files/directories can be verified. I'm looking for something similar to it.

A few highly non-trivial examples I can think of right now are:
seL4, an entire operating system kernel written in C that was verified with Isabelle.
The AFP entry Jinja_Threads contains, as far as I know, a fully formalised bytecode compiler for a Java-like language with arrays and threads.
Jeremy Avigad's proof of the Prime Number Theorem.
The proof of Kepler's conjecture. A part of this was done in Isabelle; most of it, however, was done in the more ‘basic’ theorem prover HOL Light, whose logic is similar to Isabelle's.
As Joachim mentioned, I am sure you can find more interesting applications in the AFP.

Related

Isabelle/Pure, Isabelle/HOL, Isabelle/Isar conceptual questions

I need to do a presentation on a paper which at some point makes use of Isabelle/Isar and Isabelle/HOL.
I tried researching online about Isabelle/HOL and Isabelle/Isar to be able to explain the relations in one or two slides.
Here are the relations as I currently understand them:
Isabelle - provides a generic infrastructure for deductive systems
Based on the Standard ML programming language
provides an IDE which allows you to write theories which can later be proved.
Isabelle/Pure - minimal version of higher-order logic according to this link:
Is it an actual language that can be input into the Isabelle IDE?
Or is it a technical specification?
Isabelle/HOL (Higher-Order Logic):
Is it a library or a language?
How does it relate to Isabelle/Pure?
Is it procedural in nature?
Do tactics only exist in Isabelle/HOL?
Is it LCF (Logic for Computable Functions)?
Isabelle/Isar:
Structured proof language based on Isabelle/Pure
Declarative
Is it an extension of Isabelle/HOL, as stated here?
Do locales only exist in Isabelle/Isar?
What does the Isabelle IDE support by default?
Just feels like I'm getting conflicting information from different sources and would like to sort this out.
Thanks in advance
Edit - Check out this highly related question and Manuel Eberl's answer here: What are all the isabelle/slashes?
As this is an answer to a homework question, and I myself only have limited understanding of all parts of the Isabelle project, this answer merely tries to point you in the right direction for at least some parts of your question.
From the Isabelle/Isar Reference Manual:
The Isabelle system essentially provides a generic infrastructure for building deductive systems (programmed in Standard ML), with a special focus on interactive theorem proving in higher-order logics.
It continues by introducing Isar:
In contrast Isar provides an interpreted language environment of its own,
which has been specifically tailored for the needs of theory and proof development.
[...]
The main concern of Isar is the design of a human-readable structured proof
language
Let's try to connect Pure to all of this by looking at publications from Makarius Wenzel regarding the topic:
Thus Isar proof texts may be understood as structured compositions of formal entities of the Pure framework, namely propositions, facts, and goals
In colloquial terms, Pure is the semantic foundation. Isar is a language that "follows" these semantics and provides syntax for them. Isabelle is just (one of) the platforms it all runs on.
Some of your confusions around the distinction between Pure and Isar seem to stem from the fact that the Isabelle Pure source code defines, or at least seems to define, both the semantics (Pure) and the syntax (Isar) in one go:
(* The Pure theory, with definitions of Isar commands and some lemmas. *)
In my humble opinion, this might be related to your understanding of syntax, semantics and "implementations" of the two. "Pure" outside of computers or paper is just semantics and thus, like math, just a thing in our brains. Give it syntax and you can put it on paper or type it into a machine. For the machine to be able to process your text (since this is ultimately what we are after), it needs an implementation: some framework telling it how to read the syntax and how to then process it. This framework is Isabelle. On top of Isabelle, there is Isabelle/Pure, which defines the semantics (the processing), and Isabelle/Isar, which defines the syntax. For practical reasons, Isabelle's Pure implementation already provides the Isar syntax in one go.
From all of this, you might be able to figure HOL out yourself!
Some more references:
The Isabelle/Isar Implementation

Theory of automata prerequisites

I'm interested in automata theory to improve my understanding of programming and compiler design (I would like to create some simple syntaxes in my own projects, for example: L-systems, AI, neural net structures and intelligent object-to-object conversation, 'AI dialog'), but there are things I need to learn before I go forward.
There are a lot of new symbols and mathematical concepts I need to learn before studying automata theory. I could not copy and paste examples because of the symbols, and
I don't have the required reputation to post an image, so here's a link to a wiki article.
Context-free grammar article on Wikipedia
Under the heading "Proper CFGs" you can see some definitions. I don't understand them.
Could someone please tell me what this notation is called so I can Google it. Any other pointers or information would also be helpful, but just knowing a few key words will help. Also, if anyone knows of a comprehensive resource that can be accessed for free, e.g. an IIT video lecture on the subject of that notation, I would be eternally grateful, as I
can't afford tutoring or even textbooks at this time.
The resource I'm using at the moment for automata theory (for anyone who is interested) is the Theory of Automata IIT Lectures on YouTube.
The symbols ∀ and ∃ are logical quantifiers, respectively meaning "for all" and "there exists".
Typically you are first introduced to them in a discrete mathematics course, though they're a part of predicate logic (also known as first-order logic); in my particular university's CS program, Discrete Math is a pre-requisite for Logic for Computer Science, which in turn is a pre-requisite for Formal Languages and Automata.
The star symbol * in the term (V ∪ Σ)* there is studied in formal languages/automata theory itself: it is the Kleene star operator. Its input is an alphabet (a set of symbols), and it produces the set of all strings of zero or more symbols over that alphabet.
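To make that concrete, here is a tiny Prolog sketch (my own illustration, not taken from the Wikipedia article) of the Kleene star over the two-symbol alphabet {a, b}: a string in {a, b}* is simply any finite list of symbols drawn from the alphabet, including the empty one.

% Alphabet: two symbols, a and b.
symbol(a).
symbol(b).

% star(Ws) holds when Ws is a string (list of symbols) in {a,b}*,
% including the empty string [].
star([]).
star([S|Ss]) :- symbol(S), star(Ss).

% ?- star([a,b,a]).              % true: "aba" is in {a,b}*
% ?- length(Ws, 2), star(Ws).    % enumerates aa, ab, ba, bb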
A useful tool for studying formal languages and automata is JFLAP.
This topic, at the level that you have referred to in your link, is really only for mathematicians or graduate-level theoretical computer science students. The symbols you are referring to are just symbolic logic. If you are really interested in automata theory, I would recommend trying to find resources that explore the topic from a conceptual level and avoid using complex logical statements. OR, if you really want to dive in, you can teach yourself symbolic logic, some set theory, probably some modern algebra, and then tackle automata theory from there.
I read many books on the subject of Languages and Automata, including the Dragon books on compilers (and the much more pragmatic Jack Crenshaw's Let's Build a Compiler), but none of it really clicked until I read the classic Computation: Finite and Infinite Machines by Marvin Minsky. Being an old book, it does not cover the latest research and developments in the field at all, but he explains the state of the art of the 1960s in automata, neural networks, Turing machines, functional programming and lambda calculus, and the oft-neglected third wheel of string-rewriting systems. And the writing is exceptionally good and engaging. IIRC Minsky even co-authored a robot story with Isaac Asimov, so he has some serious writing credentials.
Like I say, this book will not bring you up-to-date in any of these fields, but it's the best book I've found for explaining everything from the ground up. And it would provide a very firm basis for reading anything more recent. This book is in the bibliography of every book published since.

Short implementation examples of abstract interpretation

I am taking a course on abstract interpretation, but I haven't seen any examples of how the theory maps down to actual code.
I am looking for short code examples, where I preferably won't have to work with a whole compiler. The analysis doesn't have to be useful, I would just like to see an example where the analysis is derived and then implemented.
Does anyone know of any such examples, perhaps from a university course?
Abstract interpretation is based on a mathematical construction called a Galois connection. The idea is very simple:
Abstract the behaviour of the program.
Perform the analysis on the abstract level.
Use the Galois connection to relate the actual and the abstract program.
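As a rough, self-contained illustration of steps 1 and 2 (a toy example of my own, not taken from any tutorial cited here), the following Prolog snippet abstractly evaluates arithmetic expressions over the sign domain neg/zero/pos/top instead of over the concrete integers:

% alpha(+Integer, -Sign): abstraction of a concrete value.
alpha(0, zero) :- !.
alpha(N, pos)  :- N > 0, !.
alpha(_, neg).

% Abstract evaluation of expressions built from num/1, add/2 and mul/2.
aeval(num(N), S)    :- alpha(N, S).
aeval(add(A, B), S) :- aeval(A, SA), aeval(B, SB), add_sign(SA, SB, S).
aeval(mul(A, B), S) :- aeval(A, SA), aeval(B, SB), mul_sign(SA, SB, S).

% Abstract operators; top means "sign unknown".
add_sign(zero, S, S) :- !.
add_sign(S, zero, S) :- !.
add_sign(S, S, S)    :- !.     % pos + pos = pos, neg + neg = neg
add_sign(_, _, top).           % mixed or unknown signs

mul_sign(zero, _, zero) :- !.
mul_sign(_, zero, zero) :- !.
mul_sign(top, _, top)   :- !.
mul_sign(_, top, top)   :- !.
mul_sign(S, S, pos)     :- !.  % equal non-zero signs
mul_sign(_, _, neg).           % different non-zero signs

% ?- aeval(mul(num(-3), add(num(2), num(5))), S).
% S = neg.

The Galois connection (step 3) is what relates the two levels: alpha maps concrete values to abstract ones, and soundness means every abstract result covers the corresponding concrete result.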
This is the best tutorial I have seen so far about Abstract Interpretation:
There is this paper by Bertot:
Structural abstract interpretation: A formal study using Coq
It gives a full implementation of an abstract interpreter for a simple toy language using the Coq proof assistant. I used this as a concrete reference, and found it useful, although a little hard going, which is to be expected given the subject matter. Coq is a great little piece of software.
I also came across, in a Cousot paper:
A gentle introduction to formal verification of computer systems by abstract interpretation
rough details (but I am sure there will be useful citations for full details) of an implementation in Astrée. I am not familiar with Astrée, so I didn't actually read that section, but I think it meets your criteria.
If you come across any more, please let me know! I would especially like to see a Prolog abstract interpreter.
Maybe this tool is also interesting for you:
Interproc Analyzer
It is an abstract analyzer for a very simple language, which nevertheless offers
interprocedural analyses. You can try out the analysis and get numerical invariants about the analyzed program. The source code is available (OCaml).
A really thorough and precise course, given by one of the "creators" of Abstract Interpretation, Patrick Cousot (already mentioned in one of the answers):
MIT course about Abstract Interpretation. The course also offers assignments, in OCaml.
There is MonoREIL, which comes with the recently open sourced tool BinNavi.
See here for a short intro.
Note that the context of the MonoREIL framework is not compilers but the analysis of binary code. Yet, it has been used for real-world applications; see slide 34 ff. of this introduction (which contains more formal background).

What are the best uses of Logic Programming?

By Logic Programming I mean the sub-paradigm of declarative programming languages. Don't confuse this question with "What problems can you solve with if-then-else?"
A language like Prolog is very fascinating, and it's worth learning for the sake of learning, but I have to wonder what class of real-world problems is best expressed and solved by such a language. Are there better languages? Does logic programming exist by another name in more trendy programming languages? Is the cynical version of the answer a variant of the Python Paradox?
Prototyping.
Prolog is dynamic and has been for 50 years. The compiler is liberal, the syntax minimalist, and "doing stuff" is easy, fun and efficient. SWI-Prolog has a built-in tracer (debugger!), and even a graphical tracer. You can change the code on the fly, using make/0, you can dynamically load modules, add a few lines of code without leaving the interpreter, or edit the file you're currently running on the fly with edit(1). Do you think you've found a problem with the foobar/2 predicate?
?- edit(foobar).
And as soon as you leave the editor, that thing is going to be re-compiled. Sure, Eclipse does the same thing for Java, but Java isn't exactly a prototyping language.
Apart from the pure prototyping stuff, Prolog is incredibly well suited for translating a piece of logic into code. So, automatic provers and that type of stuff can easily be written in Prolog.
The first Erlang interpreter was written in Prolog - and for a reason, since Prolog is very well suited for parsing, and encoding the logic you find in parse trees. In fact, Prolog comes with a built-in parser! No, not a library, it's in the syntax, namely DCGs.
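To give a flavour of what that looks like (a minimal sketch of my own, not tied to any particular application), here is a DCG for a toy "noun verb noun" fragment of English; the rules read like ordinary Prolog with --> instead of :- :

sentence    --> noun_phrase, verb, noun_phrase.
noun_phrase --> [the], noun.
noun        --> [cat] ; [dog].
verb        --> [chases].

% ?- phrase(sentence, [the, cat, chases, the, dog]).
% true.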
Prolog is used a lot in NLP, particularly in syntax and computational semantics.
But, Prolog is underused and underappreciated. Unfortunately, it seems to bear an academic or "unusable for any real purpose" stigma. But it can be put to very good use in many real-world applications involving facts and the computation of relations between facts. It is not very well suited for number crunching, but CS is not only about number crunching.
Since Prolog = Syntactic Unification + Backward chaining + REPL,
most places where syntactic unification is used are also a good use for Prolog (a small sketch follows the list below).
Uses of syntactic unification include:
AST transformations
Type Inference
Term rewriting
Theorem proving
Natural language processing
Pattern matching
Combinatorial test case generation
Extract sub structures from structured data such as an XML document
Symbolic computation i.e. calculus
Deductive databases
Expert systems
Artificial Intelligence
Parsing
Query languages
Constraint Logic Programming (CLP)
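For instance, here is a minimal sketch of my own showing two items from that list, pattern matching (extracting substructure) and one-step term rewriting, both driven purely by syntactic unification:

% Extracting substructure: unifying a term against a pattern binds the
% pattern's variables.
% ?- point(X, Y) = point(3, 4).
% X = 3, Y = 4.

% One-step algebraic rewrite rules, selected by unification against the head:
simplify(X + 0, X) :- !.
simplify(0 + X, X) :- !.
simplify(X * 1, X) :- !.
simplify(X * 0, 0) :- !.
simplify(X, X).               % anything else is already simplified

% ?- simplify(cost(N) + 0, S).
% S = cost(N).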
Many very good and well-suited use cases of logic programming have already been mentioned. I would like to complement the existing list with several tasks from an extremely important application area of logic programming:
Logic programming blends seamlessly, more seamlessly than other paradigms, with constraints, resulting in a framework called Constraint Logic Programming.
This leads to dedicated constraint solvers for different domains, such as:
CLP(FD) for integers
CLP(B) for Booleans
CLP(Q) for rational numbers
CLP(R) for floating point numbers.
These dedicated constraint solvers lead to several important use cases of logic programming that have not yet been mentioned, some of which I show below.
When choosing a Prolog system, the power and performance of its constraint solvers are often among the deciding factors, especially for commercial users.
CLP(FD) — Reasoning over integers
In practice, CLP(FD) is one of the most important applications of logic programming, and is used to solve tasks from the following areas, among others:
scheduling
resource allocation
planning
combinatorial optimization
See clpfd for more information and several examples.
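As a small taste (a made-up toy problem of mine, using SWI-Prolog's library(clpfd)), here is a tiny resource-allocation sketch: three tasks get pairwise different machines numbered 1..3, task 1 must avoid machine 2, and task 3 needs a higher-numbered machine than task 2.

:- use_module(library(clpfd)).

assign([T1, T2, T3]) :-
    [T1, T2, T3] ins 1..3,        % each task runs on machine 1, 2 or 3
    all_different([T1, T2, T3]),  % no machine is used twice
    T1 #\= 2,                     % task 1 must not use machine 2
    T3 #> T2,                     % task 3 needs a higher machine than task 2
    label([T1, T2, T3]).

% ?- assign(Ts).
% Ts = [1, 2, 3] ;
% Ts = [3, 1, 2].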
CLP(B) — Boolean constraints
CLP(B) is often used in connection with:
SAT solving
circuit verification
combinatorial counting
See clpb.
CLP(Q) — Rational numbers
CLP(Q) is used to solve important classes of problems arising in Operations Research:
linear programming
integer linear programming
mixed integer linear programming
See clpq.
One of the things Prolog gives you for free is a backtracking search algorithm -- you could implement it yourself, but if your problem is best solved by having that algorithm available, then it's nice to use it.
The two things I've seen it be good at are mathematical proofs and natural language understanding.
Prolog is ideal for non-numeric problems. This article gives a few examples of some applications of Prolog and it might help you understand the type of problems that it might solve.
Prolog is great at solving puzzles and the like. That said, in the domain of puzzle-solving it makes easy/medium puzzle-solving easier and complicated puzzle solving harder. Still, writing solvers for grid puzzles and the like such as Hexiom, Sudoku, or Nurikabe is not especially tough.
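To illustrate both the free backtracking search mentioned above and the puzzle solving just mentioned, here is a classic sketch (my own minimal version, using only built-in list predicates): N-queens by plain generate-and-test.

% Qs is a list of column positions, one queen per row.
queens(N, Qs) :-
    numlist(1, N, Ns),
    permutation(Ns, Qs),      % generate: distinct columns, one per row
    safe(Qs).                 % test: no two queens share a diagonal

safe([]).
safe([Q|Qs]) :- no_attack(Q, Qs, 1), safe(Qs).

no_attack(_, [], _).
no_attack(Q, [Q1|Qs], D) :-
    abs(Q - Q1) =\= D,        % different diagonal at row distance D
    D1 is D + 1,
    no_attack(Q, Qs, D1).

% ?- queens(6, Qs).
% Qs = [2, 4, 6, 1, 3, 5] .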
One simple answer is "build systems". The language used to build Makefiles (at least, the part to describe dependencies) is essentially a logic programming language, although not really a "pure" logic programming language.
Yes, Prolog has been around since 1972. It was invented by Alain Colmerauer with Philippe Roussel, based on Robert Kowalski's procedural interpretation of Horn clauses. Alain was a French computer scientist and professor at Aix-Marseille University from 1970 to 1995.
And Alain invented it to analyse Natural Language. Several successful prototypes were created by him and his "followers".
His own system, Orbis, understood questions in English and French about the solar system. See his personal site.
Warren and Pereira's system Chat80 did QA on world geography.
Today, IBM Watson is a contemporary QA system based on logic, with a huge dose of statistics about real-world phrases.
So you can imagine where its strength lies.
Retired in 2006, he remained active until he died in 2017. He was named Chevalier de la Legion d’Honneur by the French government in 1986.

Interactive math proof system

I'm looking for a tool (GUI preferred but CLI would work) that allows me to input math expressions and then perform manipulations of them but restricts me to only mathematically valid operations. Also, the tool must be able to save a session and later prove that the given set of saved operations is valid.
Note: I am not looking for a system to generate proofs, only one that checks that the steps I manually specify are valid.
I have used ACL2 for similar operations and it does well for some cases but it is very hard to use for everything else.
This little project is my motivation. It is a D template type that allows for equation solving. Given this equation:
(A * B) = C + D / F;
Any one of the symbols can be set as the unknown, and evaluating that expression will result in an assignment to that variable. It works by building expression trees into the type and then using rewrite rules to convert them into something that can be evaluated for the unknown.
What I need is some way to validate the rewrite rules. They can be validated by testing the assertion that, given that some relation holds, another one holds as well.
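For what it's worth, the rewrite-rule idea itself is easy to prototype in a term-rewriting language; here is a minimal Prolog sketch of my own (not the D code above, and certainly not a validated proof) that isolates an unknown atom x by repeatedly rewriting the equation, assuming x occurs exactly once, somewhere on the left-hand side, and every other leaf is a number:

% solve(+X, +Equation, -Solution): isolate the unknown X on the left.
solve(X, X = R, X = V)   :- V is R.
solve(X, L + R = T, Ans) :- occurs(X, L), !, solve(X, L = T - R, Ans).
solve(X, L + R = T, Ans) :- solve(X, R = T - L, Ans).
solve(X, L - R = T, Ans) :- occurs(X, L), !, solve(X, L = T + R, Ans).
solve(X, L - R = T, Ans) :- solve(X, R = L - T, Ans).
solve(X, L * R = T, Ans) :- occurs(X, L), !, solve(X, L = T / R, Ans).
solve(X, L * R = T, Ans) :- solve(X, R = T / L, Ans).
solve(X, L / R = T, Ans) :- occurs(X, L), !, solve(X, L = T * R, Ans).
solve(X, L / R = T, Ans) :- solve(X, R = L / T, Ans).

occurs(X, X) :- !.
occurs(X, T) :- compound(T), arg(_, T, A), occurs(X, A), !.

% ?- solve(x, x * 2 = 3 + 8 / 4, Ans).
% Ans = (x = 2.5).

Each solve/3 clause is exactly the kind of rewrite rule the question is about; a proof assistant would let you state and check the algebraic fact behind each one (e.g. that L + R = T entails L = T - R over the reals).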
Several American proof assistants were mentioned already (usually with LISP syntax), so here is a Europe-centric list to complement that:
Coq
Isabelle
HOL4
HOL-Light
Mizar
All of them are notorious for TTY interfaces, but Coq and Isabelle provide good support for the Proof General / Emacs interface. Moreover, Coq comes with CoqIDE, which is based on OCaml/GTK and the on-board text widget. Recent Isabelle includes the Isabelle/jEdit Prover IDE, which is based on jEdit and augmented by semantic markup provided by the prover in real time as the user types.
ACL2 is notorious: we used to say it was an expert system, and so could only be used by experts, who had to learn from Warren Hunt, J Moore, or Bob Boyer. The thing you need to do in ACL2 is really, really understand how the proof system itself works; then you can "hint" it in directions that reduce the search space.
There are several other systems that can help with this kind of thing, though, depending on what you're trying to do.
If you want to work with continuous math or number theory, the ideal is Mathematica. The problem is that you can buy a used car for the same amount of money (unless you qualify for an academic license, a far better deal).
Something similar, and free, is Maxima, an open-source descendant of Macsyma. That page also points to several others, like Axiom, that I've got no experience with.
For mathematical logic operations, there's PVS from SRI. They've got some other cool stuff like model-checking in the same framework.
There's ongoing research in this area; it's called "theorem proving in computer algebra".
People are trying to merge the ease of use and power of computer algebra systems like Mathematica, Maple, ... with the logical rigor of proof systems. The problems are:
Computer algebra systems are not rigorous. They tend to forget side conditions such as that a divisor must not be 0.
The proof systems are hard and tedious to use (as you have discovered).
In addition to Charlie Martin's links, you may also want to check out Maple. My experience with such software is about 5 years old, but I recall at the time finding Maple to be much more intuitive than Mathematica.
The Lean prover is interactive through a JavaScript GUI.
An old and unmaintained system is 'Ontic':
http://www.cs.cmu.edu/afs/cs/project/ai-repository/ai/areas/kr/systems/ontic/0.html

Resources