The Craft of Prolog is a beautiful book; it's one of those SICP-style books that doesn't just teach you a programming language, but changes your way of thinking about programming. I am also amused by the sly (and not-so-sly) digs that O'Keefe gets in at, well, everyone else who writes Prolog.
FWIW, I compiled a list of freely available Prolog resources a while back. Anyone looking to learn Prolog might find some of this useful, even though it is a few years old now:
After learning a bit of Prolog and seeing its beauty you may wonder how to leverage the paradigm in your own work. Lispers in particular have pondered this for nearly 3 decades - Peter Norvig's Paradigms of Artificial Intelligence famously includes a Prolog interpreter/compiler.
The Reasoned Schemer also continues in this tradition. I used that as my introduction into the beauty of Prolog and created core.logic - https://github.com/clojure/core.logic
If you're interested in being able to mix functional, object-oriented, and logic programming techniques in a single practical language - I think Clojure is a great place to start (yes, yes, Mozart/Oz is good too ;).
On your last sentence: well, dataflow variables plus backtracking are crucial things in Prolog; Mozart/Oz has dataflow vars right away, and backtracking can be invoked on demand.
How exactly would you do this in Clojure? I'm a big fan of the platform, but I can't see how to easily apply the above-mentioned techniques. The dataflow library in clojure-contrib seems to have a very narrow scope, AFAIR.
I'll also chime in and say that Prolog is an interesting language to experiment with even if you don't use it for anything else later. It is said that there are some languages that change the way that you think about programming -- Prolog is definitely one of these.
It is one of the few languages where the majority of the work is done through pattern matching (unification), rather than explicit algorithms.
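As a small illustration (the standard append/3 relation, not anything specific to the comments here): the same two clauses both build and take apart lists purely by unification, so the relation can be run in several directions.

% The standard append/3 relation, written out in full: all the "work"
% is done by unifying list structure, not by an explicit algorithm.
append([], Ys, Ys).
append([X|Xs], Ys, [X|Zs]) :- append(Xs, Ys, Zs).

% ?- append([1,2], [3], Zs).    % Zs = [1,2,3]
% ?- append(Xs, Ys, [1,2,3]).   % enumerates every way to split [1,2,3]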
Haha, in university, on a project with a friend, we ended up rewriting cpp (the C preprocessor) in Prolog. We got an A- and "I'm not sure why you'd do this …" on the project.
I did a university module in Prolog (for programming intelligent systems). Although I have not used it since, I always remember it fondly. It is a totally different paradigm from the Pythons/Rubies/Cs/Javas and really quite refreshing once you get into it.
Unfortunately I now struggle to think of a use for it on a day to day basis.
I think Prolog still has a place in modern software development. It provides a natural way of carrying out a backward-chaining depth-first search through a state space (good for game AI, planning, scheduling, etc.). Also, constraint solving is now a very useful feature built into most logic programming languages.
Prolog purists argue that whole applications can be written in Prolog alone, but I tend to take a different approach: integrating Prolog with other languages lets you use it for the things it's really good at and get the best out of multiple languages.
I'm a logic fan, so I like Prolog a lot, though for many people, admittedly, its greatest attribute will be expanding the way you think about programming.
However, for some problems you might want to go straight to constraint programming, as you mentioned, if you need speed and ease of modeling. I like MiniZinc for many problems, or if you're interested you could try my constraint programming language (http://sabrlang.org), which is based on spatial and temporal logic.
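For a flavor of what built-in constraint solving looks like from inside Prolog itself, here is a sketch of the classic SEND+MORE=MONEY puzzle, assuming SWI-Prolog's library(clpfd) (other systems ship similar libraries):

:- use_module(library(clpfd)).

% Assign distinct digits to the letters so that SEND + MORE = MONEY.
send_more_money([S,E,N,D,M,O,R,Y]) :-
    Vars = [S,E,N,D,M,O,R,Y],
    Vars ins 0..9,
    all_different(Vars),
    S #\= 0, M #\= 0,
    S*1000 + E*100 + N*10 + D + M*1000 + O*100 + R*10 + E
        #= M*10000 + O*1000 + N*100 + E*10 + Y,
    label(Vars).

% ?- send_more_money(Digits).
% Digits = [9, 5, 6, 7, 1, 0, 8, 2]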
One can have Prolog-as-a-service. The SWI-Prolog implementation has an excellent web server out of the box. We feed the Prolog database from some external sources, do the reasoning, and expose the results via a dead-simple JSON API. It Just Works (tm)!
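A minimal sketch of what such a service could look like (the capital/2 facts and the /capitals path are made up for illustration; the HTTP and JSON libraries are the stock SWI-Prolog ones):

:- use_module(library(http/thread_httpd)).
:- use_module(library(http/http_dispatch)).
:- use_module(library(http/http_json)).

% Hypothetical facts; in practice these would be fed from external sources.
capital(france, paris).
capital(japan, tokyo).

:- http_handler(root(capitals), list_capitals, []).

% Collect all capital/2 facts and reply with them as JSON.
list_capitals(_Request) :-
    findall(_{country: C, capital: Cap}, capital(C, Cap), Rows),
    reply_json_dict(_{capitals: Rows}).

start(Port) :-
    http_server(http_dispatch, [port(Port)]).

% ?- start(8080).   % then GET http://localhost:8080/capitals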
Same here, all University of Amsterdam CS students (informatics, AI and information sciences) took a compulsory course in Prolog.
Except I was lucky enough to have a use for it after uni (about two years ago), writing generators for various puzzle games. It's one of the tasks that has Prolog written all over it.
Prolog matches very well with the grammar formalism that we use (attribute-value grammars), since attribute-value structures with arbitrary depth and re-entrancy can easily be represented as Prolog terms and larger analyses can be constructed through unification.
E.g. the grammar rules are plain Prolog rules where one argument is a list that represents the right-hand side of a grammar rule, and another argument is the left-hand side. The rule goals define relations between the RHS and LHS by unifying parts of the RHS structures with the LHS structure. For instance, the rule for np -> det, n is this:
grammar_rule(np_det_n, NP, [ Det, N ] ) :-
np_det_n_struct(NP,Det,N).
Where np_det_n_struct unifies paths between NP <-> Det, NP <-> N, and unifies some paths with atoms. Completing a grammar rule is simply a matter of unifying members of the RHS list.
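A purely illustrative toy version of such a structure-building predicate (not the poster's actual typed feature structures), in which an agreement value is shared between the daughters and the mother by unification:

% Toy encoding: det/2 and n/2 terms carry an agreement value that must
% unify, and the resulting np term re-uses both daughters.
np_det_n_struct(np(Agr, Det, N), Det, N) :-
    Det = det(Agr, _Word),
    N   = n(Agr, _Lemma).

% ?- grammar_rule(np_det_n, NP, [det(sg, the), n(sg, cat)]).
% NP = np(sg, det(sg, the), n(sg, cat))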
This looks like the purpose for which DCGs are designed.
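For readers who have not seen DCGs: the np -> det, n rule above would look roughly like this in DCG notation (a generic sketch, not the grammar being discussed):

np --> det, n.
det --> [the].
n --> [cat].

% ?- phrase(np, [the, cat]).
% true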
DCGs are not really practical when implementing different parsing/generation strategies, where you usually want to be able to access categories easily (i.e. not as Prolog goals). Also, once you represent RHS constituents as a list rather than as goals, you do not need DCGs' difference lists to maintain adjacency.
Also, why ... instead of ...
Since the rule identifiers are never used as goals. They are just for printing pretty parse trees and for seeing which rules fired in constructing an attribute-value structure (e.g. rule counts are used as features in the disambiguation component). Rules match on category (remember, there is more than one way to construct an np), which is a type in the type hierarchy for feature structures. An av-structure with type np will be represented as a Prolog term with the functor np.
Btw. note that the representation above is only the one the grammar writer uses. For parsing/generation, transformed terms are used. E.g. during generation, grammar rules use the syntactic head as the first term argument, and chart edges use the next unprocessed RHS slot as the first term argument. Both are done to take advantage of first-argument indexing.
I did a module in logic and Prolog for a CS master's and agree it's very refreshing. I actually got a lot more out of it as a language in the context of predicate logic (which I started out hating but by the end really enjoyed).
A different problem to the 'non-first-class-ness' of clauses that others have mentioned is that Prolog is really just a unification engine, not a reasoning engine. Thus, even a manual definition
x(A, C) :- x(A, B), x(B, C).
is not always a good idea. For example, if you put this clause first (in the definition of `x`), it'll always try to reason by transitivity, rather than ever checking directly. The sub-clauses will then be checked by transitivity, &c. This is easy to fix by not putting this clause first; but then you run into further problems if `x` is reflexive; consider
le(1, 1).
le(A, C) :- le(A, B), le(B, C).
Now
?- le(1, 0).
will unify with `le(A, C)` by `A = 1`, `C = 0`, and then search for `B` such that `le(A, B)`. It'll put `B = 1` and then try to solve `le(B, C)`, i.e. just `le(1, 0)`, and you're at the first stage of an infinite loop.
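(As an aside, and not something raised in the thread: in systems that support tabling, such as SWI-Prolog or XSB, this particular loop can be avoided by tabling the predicate.)

:- table le/2.        % memoize calls to le/2 (SWI-Prolog / XSB tabling)

le(1, 1).
le(A, C) :- le(A, B), le(B, C).

% ?- le(1, 0).
% false.              % terminates instead of looping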
Unfortunately, no. The names of predicates (functors, in Prolog-speak) have to be symbols, not variables, and Horn clauses themselves aren't first-class either.
You can do it via metaprogramming; a functor can be converted to a list [name, param1, param2, ...], and then you can write normal Prolog operations on the list that bind variables and such, and then convert back. This page shows how to use that to build a higher-order map function: http://en.wikibooks.org/wiki/Prolog/Higher_Order_Programming
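A short sketch of that idea (my own illustration, not the code from the linked page): build a callable term from a predicate name with =.., then call it on each list element.

% map_goal(Name, Xs, Ys): apply the binary predicate called Name
% to corresponding elements of Xs and Ys.
map_goal(_, [], []).
map_goal(Name, [X|Xs], [Y|Ys]) :-
    Goal =.. [Name, X, Y],   % e.g. Name = double gives double(X, Y)
    call(Goal),
    map_goal(Name, Xs, Ys).

double(X, Y) :- Y is 2 * X.

% ?- map_goal(double, [1,2,3], Ys).
% Ys = [2, 4, 6]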
I had a terrible time getting Teyjus up and running. I played around with Curry a while ago (using PAKCS (http://www.informatik.uni-kiel.de/~pakcs), if I recall correctly), but didn't explore enough to say anything about it.
Nope, in vanilla Prolog you'd have to type "X(A,C) :- X(A,B), X(B,C)." for every single X you want to be transitive.
In formal logic, "A if B" is just syntactic sugar for "(B && A) || (!B)" or in English: when B is true, A is true. (But when B is false, A could be either true or false.) So you could rewrite your expression as (edit: fixed):
transitive(X, A, B, C) :- (X(A,B), X(B,C), X(A,C)); not(X(A,B), X(B,C)).
(This saves testing `x(A, C)` twice. Also, `->` implicitly cuts, preventing backtracking over `x(A, C)`; but that shouldn't matter here.) However, this set-up mistakenly has `x(A, C)` as the antecedent; I think you actually want it as the consequent.
The problem is that transitive builds up a relationship between a functor and its variables by trying to unify them with already known facts.
But no new fact is generated.
In the rule which derives transitivity from the first two facts, the last relationship (R3) is what we want to conclude, not what we want to find in the database.
Neither call, though, will derive from fact1 and fact2 that "descendant(a,c)." is true. To arrive there you would have to "assert" these (new) facts explicitly in the transitive(...) definition.
But Prolog is not my comfort zone, there might be (and probably is) a way out of that dilemma.
Note that there's another flaw: `transitive` as defined only checks if there's one instance of transitivity that works. If you wanted to check (rather than assert) transitivity, you'd need something more like
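Something along these lines, perhaps (a sketch of my own, since the original snippet isn't shown here), checking every pair of provable facts rather than just finding one instance:

% is_transitive(Rel): succeeds if, for every chain Rel(A,B), Rel(B,C)
% currently provable, Rel(A,C) is also provable.
is_transitive(Rel) :-
    forall(( G1 =.. [Rel, A, B], call(G1),
             G2 =.. [Rel, B, C], call(G2) ),
           ( G3 =.. [Rel, A, C], call(G3) )).

% ?- is_transitive(descendant).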
(It's been a while since I've Prologged, so I may be forgetting something subtle about why you'd need to use the dribble `=..` to build something callable; but that's not the point, of course!)
Right, I see this as the third way: stating transitivity which need not be asserted explicitly. The question arises what one would want to achieve with a call to
transitive(descendant).
(aside: the X =.. [F,A,B] is the Prolog way of saying
eval("F(A,B)")
and making it callable)
The reason Prolog is important to learn and play with is to understand the concept of declarative programming as a programming paradigm.
That is a very important skill when designing systems, because you learn to define what the system should accomplish rather than describe how to go about accomplishing it.
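A classic toy illustration of that "what, not how" point (assuming permutation/2 from library(lists), as in SWI-Prolog): sorting defined as "an ordered permutation of the input", with no sorting algorithm in sight.

% Declarative (and hopelessly slow) sort: Ys is a permutation of Xs
% that happens to be in non-decreasing order.
slow_sort(Xs, Ys) :- permutation(Xs, Ys), ordered(Ys).

ordered([]).
ordered([_]).
ordered([X, Y | Rest]) :- X =< Y, ordered([Y | Rest]).

% ?- slow_sort([3,1,2], Ys).
% Ys = [1, 2, 3]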
- Learn Prolog Now! by Patrick Blackburn, Johan Bos, and Kristina Striegnitz provides an introduction to Prolog:
http://www.learnprolognow.org/
- Prolog and Natural-Language Analysis by Fernando C. N. Pereira and Stuart M. Shieber provides an introduction to Prolog, and applies it to natural language processing. The book is more difficult than Learn Prolog Now!, but if you enjoy SICP, you will enjoy this book.
http://www.mtome.com/Publications/PNLA/pnla-digital.html
- The Craft of Prolog by Richard O'Keefe is a must for learning to write correct, efficient Prolog code. Many eye-openers.
http://www.amazon.com/Craft-Prolog-Logic-Programming/dp/0262...
- Warren's Abstract Machine: A Tutorial Reconstruction by Hassan Ait-Kaci is a thorough overview of David Warren's abstract machine for executing Prolog code. Since modern Prolog implementations are often based on the WAM, it helps in understanding how to design efficient Prolog programs.
http://wambook.sourceforge.net/