
My Gripes with Prolog

azhenley 152 points buttondown.com
upghost
Author: "Not my favorite language"

Prolog: "Mistakes were made"

As an avid Prolog fan, I would have to agree with a lot of Mr. Wayne's comments! There are some things about the language that are now part of the ISO standard that are a bit unergonomic.

On the other hand, you don't have to write Prolog like that! The only shame is that there are 10x more examples (at least) of bad Prolog on the internet than good Prolog.

If you want to see some really beautiful stuff, check out Power of Prolog[1] (which Mr. Wayne courteously links to in his article!)

If you are really wondering why Prolog: the thing that makes it special among all languages is metainterpretation. No, seriously, I would strongly recommend you check it out[2]

This is all that it takes to write a metainterpreter in Prolog:

  mi1(true).
  mi1((A,B)) :-
          mi1(A),
          mi1(B).
  mi1(Goal) :-
          Goal \= true,
          Goal \= (_,_),
          clause(Goal, Body),
          mi1(Body).
Writing your own Prolog-like language in Prolog is nearly as fundamental as for-loops in other languages.
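
For anyone who wants to try it, here's a minimal sketch of the meta-interpreter in action. The nat/1 program is my own toy example, not from the linked material, and some systems require a `:- dynamic nat/1.` declaration before clause/2 may inspect the predicate:

  % A toy program: Peano-style natural numbers.
  nat(zero).
  nat(s(X)) :- nat(X).

  % Queries run through mi1/1 behave like direct queries:
  % ?- mi1(nat(s(s(zero)))).
  % true.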

[1] https://www.youtube.com/@ThePowerOfProlog

https://www.metalevel.at/prolog

[2] https://www.youtube.com/watch?v=nmBkU-l1zyc

https://www.metalevel.at/acomip/

schmuhblaster
I also have a strange obsession with Prolog and Markus Triska's article on meta-interpreters heavily inspired me to write a Prolog-based agent framework with a meta-interpreter at its core[0].

I have to admit that writing Prolog sometimes makes me want to bash my head against the wall, but sometimes the resulting code has a particular kind of beauty that's hard to explain. Anyways, Opus 4.5 is really good at Prolog, so my head feels much better now :-)

[0] http://github.com/deepclause/deepclause-desktop

kamaal
I have to admit that writing Prolog sometimes makes me want to bash my head against the wall

I think much of the frustration with older tech like this comes from the fact that these things were mostly written (and rewritten to perfection) on paper first, and only the near-final program was input into a computer with a keyboard.

Modern ways of carving out a program through 'successive approximations' with a keyboard and monitor, until you get something to work, are mostly a recent phenomenon. Most of us are used to working like this, which quite honestly is mostly trial and error. The frustration is understandable because you are basically throwing darts, most of the time in the dark.

I knew a programmer from the 1980s (he built medical electronics equipment) who would tell me how even writing C worked back then. It was mostly writing a lot, on paper. You had to prove things on paper first.

bjourne
But I wonder if that characterization is actually flattering for Prolog? I can't think of any situation, skill, technology, paradigm, or production process for which "doing it right the first time" beats iterative refinement.
pixl97
Being that Prolog is from the '70s, I would guess you'd be a bit more careful with punch cards.
qohen
ROFL.

Like Lisp and Smalltalk, Prolog was used primarily in the 1980s, so it was run on Unix workstations and also, to some extent, on PCs. (There were even efforts to create hardware designed to run Prolog a la Lisp machines.)

And, like Lisp and Smalltalk, Prolog can be very nice for iterative development/rapid prototyping (where the prototypes might be good enough to put into production).

The people who dealt with Prolog on punchcards were the academics who created and/or refined it in its early days.[0]

[0] https://softwarepreservation.computerhistory.org/prolog/

kamaal
I mean, there are nearly two full decades between the appearance of Prolog (1972) and the PC revolution of the late 1980s and early 1990s.

The people who dealt with Prolog on punchcards were the academics who created and/or refined it in its early days.[0]

That's like a decade of work. That's hardly 'early days'.

Also, the programming culture in the PC days and before that is totally different. Heck, even the editors from that era (e.g. vi) are designed for an entirely different workflow. That is, lots of planning and correctness before you decide to input the code into the computer.

qohen
By 1979 at the latest -- probably closer to 1975 -- the primary Prolog implementation of the day (Warren's DEC-10 version) had an interpreter, where you could load files of code in and modify the code and you had a REPL with the ability to do all kinds of things.

I posted an excerpt of the manual, with a link to a PDF of it, in a reply to another comment[0]

(And, since even the earliest versions of Prolog were interpreted, they may've had features like this too).

And, as far as editors are concerned, versions of vi (and, of course, emacs) are still used to this day by people who don't necessarily do lots of planning and correctness before deciding to input the code into the computer.

[0] https://news.ycombinator.com/item?id=46664671

qohen
And one other thing: just because early Prolog interpreters were implemented on punchcards doesn't mean that Prolog programs run by those interpreters needed to be. It's quite possible that basically nobody ever wrote Prolog programs using punchcards, given that Prolog has the ability to read in files of code and data.
kamaal
"doing it right the first time" beats iterative refinement.

It's not iterative refinement that's bad. It's just that when you use a keyboard as a thinking device, there is a tendency to assume the first trivially working solution is completely correct.

This doesn't happen with pen and paper, as it slows you down. You get the mental space to think through a lot of things, exceptions, etc. Even with iterative refinement, you are then likely to build something that is correct, compared to just committing the first typed function to the repo.

schmuhblaster
I think much of the frustration with older tech like this comes from the fact that these things were mostly written (and rewritten to perfection) on paper first, and only the near-final program was input into a computer with a keyboard.

I very much agree with this, especially since Prolog's execution model doesn't seem to go that well with the "successive approximations" method.

kamaal
Before the personal computer revolution, compute time, and even development/test time, on large computers was rationed.

One can imagine how development would work in an ecosystem like that. You have to understand both the problem and your solution, and you need to be sure it would work, before you start typing it out at a terminal.

This is the classic Donald Knuth workflow: he stays disconnected from a computer for long periods of time, focused on the problems and solutions, working them out with pen and paper, until he has arrived at solutions that just work, correctly, and well enough to be explained in a textbook.

When you take this away, you also take away the need to put in the hard work required to make things work correctly. Take a look at how many Java devs are out there who use the wrong data structure for the problem, and then try to shoehorn their solution to roughly fit the problem. Eventually the solution works for some acceptable inputs, and the remainder is left to be discovered by an eventual production bug. Stack Overflow is full of such questions.

Languages like Prolog just don't offer that sort of freedom. And you have to be in some way serious about what you are doing in terms of truly understanding both the problem and solution well enough to make them work.

qohen
Languages like Prolog just don't offer that sort of freedom.

Yes, they do -- that's why people have enjoyed using such languages.

It might help to think of them as being like very-high-level scripting-languages with more rigorous semantics (e.g. homoiconicity) and some nifty built-ins, like Prolog's relational-database. (Not to mention REPLs, tooling, etc.)

Read, for example, what Paul Graham wrote about using Lisp for Viaweb (which became Yahoo Store)[0] and understand that much of what he says applies to languages like Prolog and Smalltalk too.

[0] https://www.paulgraham.com/avg.html

qohen
...these things were mostly written(and rewritten till perfection) on paper first and only the near-end program was input into a computer with a keyboard.

Not if you were working in a high-level language with an interpreter, REPL, etc. where you could write small units of code that were easily testable and then integrated into the larger whole.

As with Lisp.

And Prolog.

kamaal
Personal computers are a thing from the late 1980s.

Even then PC use in businesses was fairly limited.

Prolog appeared in 1972.

Either way, before the PC, programming was nothing like it is today. It was mostly a math discipline. Math is done on paper.

qohen
The following is from David H.D. Warren's manual for DEC-10 Prolog, from 1979[0]. It describes how Prolog development is done interactively, by being able to load code in dynamically into an interpreter and using the REPL -- note that the only mention of using paper is if the developer wants to print out a log of what they did during their session:

Interactive Environment: Performance is all very well. What the programmer really needs is a good interactive environment for developing his programs. To address this need, DEC-10 Prolog provides an interpreter in addition to the compiler.

The interpreter allows a program to be read in quickly, and to be modified on-line, by adding and deleting single clauses, or by updating whole procedures. Goals to be executed can be entered directly from the terminal. An execution can be traced, interrupted, or suspended while other actions are performed. At any time, the state of the system can be saved, and resumed later if required. The system maintains, on a disk file, a complete log of all interactions with the user's terminal. After a session, the user can examine this file, and print it out on hard copy if required.

[0] https://softwarepreservation.computerhistory.org/prolog/edin...

pixl97
I'm assuming they were written on paper because they were commonly punched into paper at some stage after that. We tend to be more careful with non-erasable media.
tannhaeuser
Opus 4.5 is really good at Prolog

Anything you'd like to share? I did some research within the realm of classic robotic-like planning ([1]) and the results were impressive with local LLMs already a year ago, to the point that obtaining textual descriptions for complex enough problems became the bottleneck, suggesting that prompting is of limited use when you could describe the problem in Prolog concisely and directly already, given Prolog's NLP roots and one-to-one mapping of simple English sentences. Hence that report isn't updated to GLM 4.7, Claude whatever, or other "frontier" models yet.

[1] https://quantumprolog.sgml.net/llm-demo/part1.html

schmuhblaster
Opus 4.5 helped me implement a basic coding agent in a DSL built on top of Prolog: https://deepclause.substack.com/p/implementing-a-vibed-llm-c.... It worked surprisingly well. With a bit of context it was able to (almost) one-shot about 500 lines of code. With older models, I felt that they "never really got it".
goku12
This is the sort of comment I'm on HN for. Information, especially links to appropriate resources, that only a true practitioner can offer.
gpvos
Indeed. Favorited it. My Prolog is too rusty to understand it all, but even just skimming the metainterpretation article was enlightening.
Nora23
Same here. The metainterpretation stuff is fascinating but dense.
hackyhacky
In short, "Here are my gripes about Prolog, a language that I don't understand."

It's perfectly fine to not like Prolog, but I do feel that if you're going to write an article about why you don't like it, you should at least spend some time figuring it out first.

He says of the cut operator "This is necessary for optimization but can lead to invalid programs." Imagine if a programmer new to C++ said the same thing of the "break" keyword. That's how ridiculous it sounds. Yes, cut can be used to prune backtracking and eliminate unneeded work, but that's hardly its purpose. It leads to "invalid" programs (by which I assume he means programs that do something other than what he wants) only in cases where you are using it wrong. Cut is no more "necessary for optimization" than break is. It's a control structure that you don't understand.

Negation (\+) is confusing, and the author correctly provides examples where its meaning is unintuitive when applied to unbound variables. That's because it's not strictly speaking a negation predicate, but rather a "not provable" predicate. In that light, the examples in the article make perfect sense. Yes, Prolog is a programming language, so the order of terms matter, even if the order wouldn't matter in pure logic.
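
A concrete illustration of that order sensitivity, with answers as a typical top level would print them:

  % X is bound before the negation: 1 = 2 is not provable, so \+ succeeds.
  % ?- X = 1, \+ X = 2.
  % X = 1.

  % X is unbound: X = 2 can be proved (they unify), so \+ fails.
  % ?- \+ X = 2, X = 1.
  % false.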

Look, Prolog is a weird language. It has a learning curve. It's not "just another language" in the Java, C++, Pascal, Python mold. I get it. But this article has the flavor of an impatient newbie getting frustrated because he can't be bothered to read the documentation.

doorhammer
I'm wildly out of my depth here, but sometimes I find I learn quickly if I try out my intuition publicly and fail spectacularly :)

"This is necessary for optimization but can lead to invalid programs."

Is this not the case? It feels right in my head, but I assume I'm missing something.

My understanding:

- Backtracking gets used to find other possible solutions
- Cut stops backtracking early, which means you might miss valid solutions
- Cut is often useful to prune search branches you know are a waste of time but Prolog doesn't
- But if you're wrong, you might cut a branch with solutions you would have wanted, and if Prolog iterates all other solutions then I guess you could say it's provided an invalid solution/program?

Again, please be gentle. This sounded reasonable to me and I'm trying to understand why it wouldn't be. It's totally possible that it feels reasonable because it might be a common misconception I've seen other places. My understanding of how Prolog actually works under-the-hood is very patchy.

hackyhacky
I'm wildly out of my depth here, but sometimes I find I learn quickly if I try out my intuition publicly and fail spectacularly :)

Fair enough. I believe this is a variation of Cunningham's Law, which states "the best way to get the right answer on the internet is not to ask a question; it's to post the wrong answer."

Everything you wrote about backtracking is completely correct. If I may paraphrase, it boils down to: cut can be used to avoid executing unnecessary code, but using it in the wrong place will avoid executing necessary code, and that would be bad. My point is: the same could be said about the "break" keyword in C++: it can avoid unnecessary iterations in a loop, or it can exit a loop prematurely. Cut and break are both control structures which make sense in the context of their respective languages, but neither would be accurately described as "for optimization."

YeGoblynQueenne
Well, sometimes you can gain a few LIPS by cutting strategically but it's not a big deal. Most textbooks will tell you that cuts help the compiler optimise etc, but most of the time you're not writing e.g. a raytracer in Prolog, so the efficiency gains are slim.
YeGoblynQueenne
Cut stops backtracking early which means you might miss valid solutions

That's right, but missing valid solutions doesn't mean that your program is "invalid", whatever that means. The author doesn't say.

Cuts are difficult and dangerous. The danger is that they make your program behave in unexpected ways. Then again, Prolog programs behave in unexpected ways even without the cut, and once you understand why, you can use the cut to make them behave.

In my experience, when one begins to program in Prolog, they pepper their code with cuts to try and stop unwanted backtracking, which can often be avoided by understanding why Prolog is backtracking in the first place. But that's a hard thing to get one's head around, so everyone who starts out makes a mess of their code with the cut.

There are very legitimate and safe ways to use cuts. Prolog textbooks sometimes introduce a terminology of "red" and "green" cuts. Red cuts change the set of answers found by a query, green cuts don't. And that, in itself, is already hard enough to get one's head around.

At first, don't use the cut, until you know what you're doing, is I think the best advice to give to beginner Prolog programmers. And to advanced ones sometimes. I've seen things...
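
For the curious, the classic textbook illustration of the red/green distinction uses a max/3 predicate (this is the standard example found in most Prolog texts, not anything specific to this thread):

  % Green cut: the discarded choice point could never have produced a
  % different answer, so the set of answers is unchanged.
  max_green(X, Y, X) :- X >= Y, !.
  max_green(X, Y, Y) :- X < Y.

  % Red cut: the second clause is only correct *because* of the cut.
  max_red(X, Y, X) :- X >= Y, !.
  max_red(_, Y, Y).

  % The red version bites when the third argument is already bound:
  % ?- max_red(3, 1, 1).
  % true.   % wrong: the first head doesn't unify, so the cut never runs.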

davidgay
In my experience, when one begins to program in Prolog, they pepper their code with cuts to try and stop unwanted backtracking, which can often be avoided by understanding why Prolog is backtracking in the first place.

This gets to the heart of my problem with Prolog: it's sold as if it's logic programming - just write your first-order predicate logic and we'll solve it. But then to actually use it you have to understand how it's executed - "understanding why Prolog is backtracking in the first place".

At that point, I would just prefer a regular imperative programming language, where understanding how it's executed is really straightforward, combined with some nice unification library and maybe a backtracking library that I can use explicitly when they are the appropriate tools.

hackyhacky
This gets to the heart of my problem with Prolog: it's sold as if it's logic programming - just write your first-order predicate logic and we'll solve it. But then to actually use it you have to understand how it's executed

Prolog is a logic-flavored programming language. I don't recall Prolog ever being "sold" as pure logic. More likely, an uninformed person simply assumed that Prolog used pure logic.

Complaining that Prolog logic doesn't match mathematical logic is like complaining that C++ objects don't accurately model real-life objects.

AlotOfReading

    I don't recall Prolog ever being "sold" as pure logic.
One of the guides linked above describes it as:

    The core of Prolog is restricted to a Turing complete subset of first-order predicate logic called Horn clauses
hackyhacky
The core of Prolog is restricted to a Turing complete subset of first-order predicate logic called Horn clauses

Does this sound to you like an attempt to deceive the reader into believing, as the GP comment stated, that the user can

just write your first-order predicate logic and we'll solve it.
AlotOfReading
It absolutely does sound like "write your first order logic in this subset and we'll solve it". There's no reasonable expectation that it's going to do the impossible, like solve decidability for first-order logic.
hackyhacky
It absolutely does sound like "write your first order logic in this subset and we'll solve it".

No it does not. Please read the words that you are citing, not the words that you imagine. I honestly can't tell if you are unable to parse that sentence or if you are cynically lying about your interpretation in order to "win" an internet argument.

All programming languages are restricted, at least, to a "Turing complete subset of first-order predicate logic." There is absolutely no implication or suggestion of automatically solving any, much less most, first order logic queries.

davidgay
Except it cannot decide all Horn clauses.
YeGoblynQueenne
This gets to the heart of my problem with Prolog: it's sold as if it's logic programming - just write your first-order predicate logic and we'll solve it. But then to actually use it you have to understand how it's executed - "understanding why Prolog is backtracking in the first place".

Prolog isn't "sold" as a logic programming language. It is a logic programming language. Like, what else is it?

I have to be honest and say I've heard this criticism before and it's just letting the perfect be the enemy of the good. The criticism is really that Prolog is not a 100% purely declarative language with 100% the same syntax and semantics as First Order Logic.

Well, it isn't, but if it was, it would be unusable. That would make the critics very happy, or at least the kind of critics that don't want anyone else to have cool stuff, but in the current timeline we just have a programming language that defines the logic programming paradigm, so it makes no sense to say it isn't a logic programming language.

Edit:

At that point, I would just prefer a regular imperative programming language, where understanding how it's executed is really straightforward, combined with some nice unification library and maybe a backtracking library that I can use explicitly when they are the appropriate tools.

Yeah, see what I mean? Let's just use Python, or Java, or C++ instead, which has 0% of FOL syntax and semantics and is 0% declarative (or maybe 10% in the case of C++ templates). Because we can't make do with 99% logic-based and declarative, gosh no. Better have no alternative than have a less than absolutely idealised perfect ivory tower alternative.

Btw, Prolog's value is its SLD-Resolution based interpretation. Backtracking is an implementation detail. If you need backtracking use yield or whatever other keyword your favourite imperative language gives you. As to unification, good luck with a "nice unification library" for other languages. Most programmers can't even get their head around regexes. And good luck convincing functional programmers that "two-way pattern matching" (i.e. unification) is less deadly than the Bubonic Plague.

doorhammer
Red cuts change the set of answers found by a query, green cuts don't.

Ohhh, interesting. So a green cut is basically what I described as cutting branches you know are a waste of time, and red cuts are the ones where you're wrong and cut real solutions?

At first, don't use the cut, until you know what you're doing, is I think the best advice to give to beginner Prolog programmers. And to advanced ones sometimes. I've seen things...

Yeah, I'm wondering how much of this is almost social or use-case in nature?

E.g., I'm experimenting with Prolog strictly as a logic language and I experiment with (at a really novice level) things like program synthesis or model-to-model transformations to emulate macro systems that flow kind of how JetBrains MPS handles similar things. I'm basically just trying to bend and flex bidirectional pure relations (I'm probably conflating fp terms here) because it's just sort of fun to me, yeah?

So cut _feels_ like something I'd only use if I were optimizing and largely just as something I'd never use because for my specific goals, it'd be kind of antithetical--and also I'm not an expert so it scares me. Basically I'm using it strictly because of the logic angle, and cut doesn't feel like a bad thing, but it feels like something I wouldn't use unless I created a situation where I needed it to get solutions faster or something--again, naively anyway.

Whereas if I were using Prolog as a daily GP language to actually get stuff done, which I know it's capable of, it makes a lot of sense to me to see cut and `break` as similar constructs for breaking out of a branch of computation that you know doesn't actually go anywhere?

I'm mostly spit-balling here and could be off base. Very much appreciate the response, either way.

YeGoblynQueenne
So a green cut is basically what I described as cutting branches you know are a waste of time, and red cuts are the ones where you're wrong and cut real solutions?

Basically, yes, except it's not necessarily "wrong", just dangerous because it's tempting to use it when you don't really understand what answers you're cutting. So you may end up cutting answers you'd like to see after all. The "red" is supposed to signify danger. Think of it as red stripes, like.

Which make stuff go faster too (well, a little bit). So, yeah, cuts in general help the compiler/interpreter optimise code execution. I however use it much more for its ability to help me control my program. Prolog makes many concessions to efficiency and usability, and the upshot of this is you need to be aware of its idiosyncrasies, the cut being just one of them.

Whereas if I were using Prolog as a daily GP language to actually get stuff done, which I know it's capable of, it makes a lot of sense to me to see cut and `break` as similar constructs for breaking out of a branch of computation that you know doesn't actually go anywhere?

Cuts work like breaks sometimes, but not always. To give a clear example of where I always use cuts, there's a skeleton you use when you want to process the elements of a list that looks like this:

  list_processing([], Bind, Bind) :-
      !. % <-- Easiest way to not backtrack once the list is empty.

  list_processing([X|Xs], ..., Acc, Bind, ...) :-
      condition(X),
      !, % Easiest way to not fall over to the last clause.
      process_a(X, Y),
      list_processing(Xs, ..., [Y|Acc], Bind, ...).

  list_processing([X|Xs], ..., Acc, Bind, ...) :-
      process_b(X, Y),
      list_processing(Xs, ..., [Y|Acc], Bind, ...).
So, the first cut is a green cut because it doesn't change the set of answers your program will find, because once the list in the first argument is empty, it's empty, there's no more to process. However, Prolog will leave two choice points behind, for each of the other two clauses, because it can't know what you're trying to do, so it can't just stop because it found an empty list.

The second cut is technically a red cut: you'd get more answers if you allowed both process_a and process_b to modify your list's elements, but the point is you don't want that, so you cut as soon as you know you only want process_a. So this is forcing a path down one branch of search, not quite like a break (nor a continue).

You could also get the same behaviour without a cut, by e.g. having a negated condition(X) check in the last clause and also checking that the list is not empty in every other clause (most compilers are smart enough to know that means no more choice points are needed), but, why? All you gain this way is theoretical purity, and more verbose code. I prefer to just cut there and get it done. Others of course disagree.
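
For concreteness, the cut-free variant described in that last paragraph would look something like this (a sketch, dropping the elided extra arguments of the skeleton for brevity; condition/1, process_a/2 and process_b/2 are the same placeholders as above, and first-argument indexing takes care of distinguishing [] from [_|_]):

  list_processing([], Bind, Bind).

  list_processing([X|Xs], Acc, Bind) :-
      condition(X),
      process_a(X, Y),
      list_processing(Xs, [Y|Acc], Bind).

  list_processing([X|Xs], Acc, Bind) :-
      \+ condition(X),
      process_b(X, Y),
      list_processing(Xs, [Y|Acc], Bind).

Same answers, no cuts, more verbose: exactly the trade-off described.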

dwattttt
In short, "Here are my gripes about Prolog, a language that I don't understand."

this article has the flavor of an impatient newbie getting frustrated because he can't be bothered to read the documentation.

The author has written about Prolog in a positive light before (and linked to it in the post), I don't get the impression that these are all "the author doesn't understand what they're doing".

Their first complaint, that "strings are not standardised, so code working with strings in SWI-Prolog is not compatible with Scryer Prolog", seems an appropriate thing to be unhappy about (unless the author is just wrong?).

Your response to their gripe about \+ being "not provable" instead of "negation" notes it's a subtle difference, and that Prolog differs from pure logic there.

The author even notes that doing due diligence, they found a solution to a complaint they had. This doesn't strike me as "can't be bothered to read the documentation".

YeGoblynQueenne
Code logic is expressed entirely in rules, predicates which return true or false for certain values.

Open any Prolog programming textbook (Clocksin & Mellish, Bratko, Sterling & Shapiro, O'Keefe, anything) and the first thing you learn about Prolog is that "code logic" is expressed in facts and rules, and that Prolog predicates don't "return" anything.

The confusion only deepens after that. There are no boolean values? In an untyped language? Imagine that. true/0 and false/0 are not values? In a language where everything is a predicate? Imagine that. Complete lack of understanding that "=" is a unification operator, and that unification is not assignment, like, really, it's not, it's not just a fancy way to pretend you don't do assignment while sneaking it in by the backdoor to be all smug and laugh at the noobs who aren't in the in-group, it's unification, it doesn't work as you think it should work if you think it should work like assignment because everything is immutable so you really, really don't need assignment. Complete misunderstanding of the cut, and its real dangers, complete misunderstanding of Negation as Failure, a central concept in logic programming (including in ASP) and so on and so on and so on and on.

The author failed to do due diligence. And if they've written "in a positive light" about Prolog, I would prefer not to read it because I'll pull my remaining hair out, which is not much after reading this.

dwattttt
Is it your contention that the author doesn't understand that Prolog predicates don't "return" anything, and that they were expecting assignment rather than unification? I would read it again; their examples clearly state these (noting that the author does say "return", but also clearly shows bidirectional examples).

Both you and GP have had some fairly strong responses to what looked like mild complaints, the kind I would expect anyone to have with a language they've used enough to find edges to.

YeGoblynQueenne
See this:

  The original example in the last section was this:

  foo(A, B) :-
      \+ (A = B),
      A = 1,
      B = 2.

  foo(1, 2) returns true, so you'd expect f(A, B) to return A=1, B=2. But it returns false.

foo(A,B) fails because \+(A = B) fails, because A = B succeeds. That's because = is not an assignment but a unification, and in the query foo(A,B), A and B are variables, so they always unify.

In fact here I'm not sure whether the author expects = to work as an assignment or an equality. In \+(A = B) they seem to expect it to work as an equality, but in A = 1, B = 2, they seem to expect it to work as an assignment. It is neither.

I appreciate unification is confusing and takes effort to get one's head around, but note that the author is selling a book titled LOGIC FOR PROGR∀MMERS (in small caps) so they should really try to understand what the damn heck this logic programming stuff is all about. The book is $30.

dwattttt
The author also wrote in the same article:

This is also why you can't just write A = B+1: that unifies A with the compound term +(B, 1)
YeGoblynQueenne
Yes, and then they were horribly confused about why foo(A,B) fails, regardless. They clearly have heard of unification and find it a fascinating concept but have no idea what it means.

Honestly, we don't have to wrap everyone on the internets in a fuzzy warm cocoon of acceptance. Sometimes people talk bullshit. If they're open to learn, that's great, but the author is snarkily dismissing any criticism, so they can stew in their ignorance as far as I am concerned.

Like the OP says, the author didn't bother to RTFM before griping.

IceDane
I thought the same thing.

I, too, am a big fan of prolog and have (at least) yearly binges where I write a lot of it for fun and profit (and some frustration), but I do not consider myself to be an expert. But even I can see that the author has what I would consider a pretty basic understanding of prolog. Which makes it even more surprising they are writing a book that uses prolog.

cogman10
Yeah, exactly why I'm not writing the same sort of article about Haskell or prolog. I'm inexperienced in both and the effort to learn them was more than I wanted to spend.
ristos
No functions:

You can't do length + 1 as a single expression in any language though; in Python, for example, len(ls) + 1 is two expressions. But I get what you mean, it is a little less terse. Those commas in Prolog are super useful, though. And you can make your example bidirectional too:

  :- use_module(library(clpfd)).

  foo(List, X) :- length(List, Out), X #= Out + 1.

  ?- foo(X, 3).
  X = [_, _].

  ?- foo(X, 3), X = [a,b].
  X = [a, b].

-----

No standardized collection types:

I think that's a feature rather than a bug. The ability to just use infix and compound terms in a generic way without it being tied to a clause or operator is a huge feature. For example:

    pets(dog-rex-poodle, dog-fluffy-bishon).

That defines the collection in a semantic or mathematical sort of way, then you can pass that into whatever data structure you want, whether it's an ordered map or a hashmap or whatever.

----

No boolean values:

That's also a feature in prolog, same as having no null. It's the same sort of motivation for Haskell's Maybe and Either types or Rust's Option and Result types, same kind of usefulness.

----

Cuts are confusing:

Conditionals don't need to use cuts, the modern prolog way of doing it is using the reif library:

    :- use_module(library(reif)).

    if_(Cond, Then, Else)

For example:

    :- use_module(library(clpfd)).
    :- use_module(library(reif)).

    sign(X, S) :- if_(X #> 0, S = pos, S = nonpos).

---

Non-cuts are confusing:

This isn't doing what you're thinking:

    \+ (A = B)

It's not testing disequality: when A and B are unbound, `A = B` unifies them and succeeds, so the negation fails. You want `dif(A, B)`.
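A quick transcript shows the difference (assuming a system with `dif/2`, such as SWI-Prolog or Scryer):

    ?- A = 1, B = 2, \+ (A = B).
    A = 1, B = 2.    % succeeds: both arguments are already ground

    ?- \+ (A = B), A = 1, B = 2.
    false.           % two fresh variables unify, so the negation fails

    ?- dif(A, B), A = 1, B = 2.
    A = 1, B = 2.    % dif/2 delays the disequality until it can be decided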

----

Straying outside of default queries is confusing:

It might be tempting to use bagof, but it's not monotonic. You should try to write as much prolog code as you can in a pure, monotonic way, so your code can take advantage of prolog's unique advantages, like monotonic debugging.

Check out the SWI prolog discourse group, there's a lot of good stuff in there and everyone's very supportive to newbies. And Markus Triska's website and youtube channel Power of Prolog is a super useful resource for all this stuff, a good place to go if you want to know what sets prolog apart. A lot of prolog's usefulness doesn't show up as an embedded language either, like minikanren isn't "pretty much prolog", it lacks a lot of stuff that prolog offers. Multiparadigm programming languages are also really popular now, but a lot of what sets programming languages apart isn't what they can do, it's what they can't do, what sort of paradigm or way of thinking they force you to adopt.

victorbjorklund
You can do it in one expression in Elixir:

x = length(list) + 1

On the other hand I don’t really see why it is an issue with Prolog (maybe there are more complicated situations than the example)

ristos
Oh ok, you're referring to assignment as one expression, not the PLT definition of expression, which for that example would be more like 3-6 expressions depending on whether you count x, list, and 1 as expressions. Yeah, a complicated formula might be more verbose in Prolog because there are no return statements, it's all unification, but I think that's a feature, not a bug: it allows you to create bidirectional formulas, and it naturally extends to "multiple return values" in a sense, because everything is a relation.
YeGoblynQueenne
It's not an issue. "+" is not a function in Prolog. So, for example, X = 1 + 1 means that the term 1 + 1, or +(1,1), is bound to the variable X. It doesn't mean that 1 + 1 is evaluated, and its result assigned to the variable X. Prolog doesn't have assignment, and all its data structures are immutable, including variables, so once a variable is bound it stays bound for the scope of the current execution branch. That's unification.
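A minimal transcript of that distinction (the `is/2` evaluation predicate is standard Prolog):

    ?- X = 1 + 1.
    X = 1+1.     % X is bound to the term +(1,1); nothing is evaluated

    ?- X is 1 + 1.
    X = 2.       % is/2 evaluates its right-hand side as arithmetic

    ?- 1 + 1 = 2.
    false.       % the term +(1,1) does not unify with the integer 2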

The thing to keep in mind is that Prolog borrows its syntax and semantics from (a fragment of) First Order Logic, and that's where its treatment of functions comes from. In FOL there's a concept of a "term": a variable is a term, a constant is a term, and a function symbol followed by one or more comma-separated terms in parentheses is a term. So if f is a function symbol, f(X,a(b), c) is a term, where X is a variable (in Prolog notation where variables start with capital letters or _). Terms are mapped to functions over objects in the domain of discourse by a pre-interpretation (long story). Terms are arguments to atomic formulae which look exactly like terms with one or more arguments, except they have predicate symbols rather than function symbols, so if p is a predicate symbol, then p(f(X,a(b), c), d, 1, 2, 3) is an atomic formula, or atom.

There's a bit of terminological confusion here because Prolog calls everything a "term", including atomic formulae, and it calls its constants "atoms", but, well, you learn to deal with it.

The difference between terms (in both Prolog and FOL) and functions (in most programming languages) is that terms are not evaluated and they are not replaced by their values. Rather, during execution of a Prolog program, variables in a term are bound to values, by unification, and when execution is over, if you've got all your ducks in a row, all variables in your program should be bound to some term with no variables, and all terms be "ground" (i.e. have no variables).

That's because a ground term is essentially a proposition, and so it has a truth value of true or false. The point of FOL and of Prolog, and every other logic programming language is to carry out a proof of a theorem, expressed as a logic program. If a term has variables we can't know its truth value because it may correspond to a different value of a function, depending on the values of its variables. So to know the truth or falsehood of a term we need to ground it. Unification in Prolog is a dirty hack that allows the proof to proceed without fully grounding terms, until the truth of an entire theorem is known, at which point everything becomes ground (or should be ... more or less, it depends).

ASP (Answer Set Programming) instead starts by grounding every term. Then, it basically treats a logic program as a SAT formula and uses a SAT-solver to find its truth or falsehood (it does more than that: it gives you all models of the theorem, i.e. every set of ground atoms that makes it true).

And that's where the tiger got its stripes. Don't expect Prolog (or ASP for that matter) to work like other languages, it has its own semantics. You don't have to like them, but there's a reason why everything is the way it is.

usgroup
I think this article is problematic because Prolog is truly a different paradigm which requires time to understand. Laments about no strings, no functions and "x is confusing" read like expectations of a different paradigm.

Prolog is also unusual in a sense that it is essential to understand what the interpreter does with your code in order to be able to write it well. For vanilla Prolog, that's not so hard. However, when constraint programming and other extensions are added, that becomes much harder to do.

aeonik
Datalog has the same capabilities as prolog but allows strings right?

My understanding is that they have very different evaluation strategies, bottom up vs top down. But with laziness and pruning you can still achieve the same goals in datalog with more ergonomics, right?

I think every language should have a prolog or datalog implementation, kind of like regex.

cmrdporcupine
In many respects "Datalog" doesn't refer to a single language or implementation or standard. It really just refers to a set of approaches for querying relational datasets using something like Prolog's unification.

By which I mean there are Datalogs that look like Prolog a bit, and others that don't. And things that are "Datalogs" that don't even have their own PL but instead more of an API. And no standard at all.

usgroup
No, datalog is a decidable subset of Prolog. That changes everything.
pjmlp
When I did my degree in Software Engineering, logic programming (with Tarski's World), and Programming with Prolog were required classes.

There were only two prevalent attitudes, some of us really loved FP (me included), others hated it and could hardly wait to get it done.

Somehow there was a similar overlap with those of us that enjoyed going out of mainstream languages, and those that rather stay with Pascal and C.

usgroup
Yeah that sounds like me too. Prolog became a fetish a few years ago. I used it intensely for 2 years, wrote a lot about it, until it became a part of me. It's intangible what it does to you, but it's the dual of what you might expect.
rramadass
I think this article is problematic because Prolog is truly a different paradigm which requires time to understand.

Prolog is also unusual in a sense that it is essential to understand what the interpreter does with your code in order to be able to write it well.

100% this!

Coming from procedural/OO paradigms i did not understand how to think about Prolog until i read Robert Kowalski's paper Predicate Logic as a Programming Language - https://www.researchgate.net/publication/221330242_Predicate...

I still have a long way to go but at least i am on the right track.

gota
Agreed, but cuts are confusing even considering the paradigm. Especially considering the paradigm, actually!

They are necessary in practice, though. But boy does a cut here and there make it harder to catch up on some Prolog codebase.

shevy-java
I liked the idea behind Prolog, but I absolutely detest the syntax.

IMO it would be better to have something like Prolog as part of a "better designed" language per se. I can't come up with a good proposal myself - language design is hard, including syntax design - but imagine if Prolog were a part of Python. That feature would then be used by more people. (This is just an example; randomly creeping features into a more successful language also often won't work. I am just giving this as an example that MIGHT be better.)

dragonwriter
“Something like Prolog” as a part of a more traditional language is kind of the idea of miniKanren, which has been implemented for many languages: https://minikanren.org/
subjectsigma
The syntax of Prolog is important! Since Prolog programs are just Prolog data structures, it is easy to write Prolog meta-interpreters which generate or consume Prolog code.

https://www.youtube.com/watch?v=nmBkU-l1zyc
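As a tiny illustration of that homoiconicity: a goal is an ordinary term that can be taken apart with `=../2` (univ) and then executed with `call/1` (`length/2` is not strictly ISO, but is available everywhere):

    ?- Goal = length([a,b], N), Goal =.. [Name | Args], call(Goal).
    Goal = length([a, b], 2),
    Name = length,
    Args = [[a, b], 2],
    N = 2.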

YeGoblynQueenne
The syntax of Prolog is (a fragment of) the syntax of First Order Logic. It's not supposed to look like your friendly neighbourhood programming language because it's mathematical notation.

Count yourself lucky you (probably) learned programming in a language like Java or Python, and not, say, FORTRAN. Because then you'd really pray for the simplicity and elegance of definite clauses.

(Or not. FORTRAN programmers can write FORTRAN in any language, even FORTRAN).

infotainment
I always felt like Prolog's ability to execute programs was entirely accidental.

To me, it feels like a data description language that someone discovered could be tricked into performing computation.

jjgreen
... a bit like life ...
flopsamjetsam
Conway's Life? Or DNA?
mlajtos
yes
hwayne
Check out datalog! https://learn-some.com/ The tutorial there uses Clojure syntax but Datalog normally uses a Prolog syntax.
akritid
This datalog implementation uses prolog syntax, can even run the queries in prolog to contrast the model: https://des.sourceforge.io/
YeGoblynQueenne
It's the other way around. We kind of stumbled on the whole idea of computation thanks to work on First Order Logic, that Prolog borrows its syntax and semantics from.

It's all the other programming languages that have weird syntax, including LISP btw. Prolog's syntax is the quintessential syntax of a formal language for computation.

As to the "data description" part this is just a leaky abstraction almost universally adopted by programming languages, other than LISPs and logic programming languages. In truth, there is no separation between data and computation. And so there is no need for special syntax for either. Prolog is a "data description" language only in the sense that you can describe data and computation in one go.

subjectsigma
Re: the comma-at-end-of-line thing: I would sometimes write Prolog like so to avoid that issue:

    goal :-
        true
        , subgoal(A, B)
        , subgoal(B, C)
        .        
This is definitely not standard and I don't know if the WAM optimizes out the gratuitous choice point, but it certainly makes the code easier to work with.
cyberpunk
I actually like , ; . in erlang. Maybe I’m an alien.
Joel_Mckay
"Depends how you felt about Elixir" |> String.graphemes() |> Enum.frequencies()

Best regards =3

YeGoblynQueenne
It's not standard but that's how I write Prolog. I think I got it from SQL?

I don't usually leave the full-stop on its own line though. You can always select the entire line, then move one down to cut it without catching the full stop. If that makes sense?

doorhammer
I always come back to prolog to tool around with it but haven’t done a ton.

Bidirectionality has always been super fascinating.

Didn’t know about Picat. 100% going to check it out.

hwayne
I'll warn you that Picat is very much a "research language" and a lot of the affordances you'd expect with a polished PL just aren't there yet. There's also this really great "field notes" repo from another person who learned it: https://github.com/dsagman/picat
doorhammer
Totally fair. Realistically “check it out” means I’ll probably spin up an env and try modeling a few things to see how it feels.

I’m mostly a language tourist who likes kicking the tires on modes of modeling problems that feel different to my brain.

Started skimming those notes. Really solid info. Appreciate it!

doorhammer
Side note: Just clocked your name. Read through Practical TLA+ recently modeling a few things at work. Incredibly helpful book for working through my first concrete model in practice.
wodenokoto
As someone who is interested in learning more about these kinds of tools, where does one start? Prolog? Datalog? miniKanren? And now TFA also introduces Picat.

And once you've settled on one of these, which learning resource should one go with?

cess11
If you just want to dip in, grab https://www.scryer.pl/ and do some exercises from https://www.metalevel.at/prolog.

Scryer is a good start because it's ISO. Datalog is kind of a subset, MiniKanren is somewhat related but not Prolog, and Picat is kind of Prolog with an imperative language within it.

usgroup
SWI Prolog is just fine, and you'll find it to be batteries included unlike many other choices. The first thing to learn is the "Prolog state of mind", or how to express your intentions in Prolog without trying to turn it into a functional or imperative programming language.

Prolog will show you another way of thinking. If it does not then you are doing it wrong.

rramadass
see https://news.ycombinator.com/item?id=45915699 first.

Then checkout the books recommended by user "YeGoblynQueenne" who knows this domain pretty well.

floxy
I guess we are supposed to pile on, so I'll add that the author should read "The Art of Prolog" (Sterling & Shapiro) and then "The Craft of Prolog" (O'Keefe).
YeGoblynQueenne
And also "Prolog Programming for AI" by Bratko and "Programming in Prolog" by Clocksin and Mellish.

Although these days I'd recommend anyone interested in Prolog starts in at the deep end with "Foundations of Logic Programming" by J. W. Lloyd, because I've learned the hard way that teaching Prolog as a mere programming language, without explaining the whole logic programming thing, fails.

codekilla
Thanks for the reference. Have you ever worked with Maude? Curious what the advantages of one over the other might be. Maude seem like it might be more focused on being a meta logic, and I'm guessing it is probably easier to write programs in Prolog.
YeGoblynQueenne
I've never worked with Maude.
tannhaeuser
No standardized strings

ISO "strings" are just atoms or lists of single-character atoms (or lists of integer character codes) [...]. Code written with strings in SWI-Prolog will not work in [other] Prolog.

That's because SWI isn't following ISO (and even moving away from ISO in other places eg.[1]).

ISO Prolog strings are lists of character codes, period. It's just that there are convenient string-manipulation-like predicates operating on atom names, such as sub_atom, atom_concat, atom_length, etc ([2]). You'd use atom_codes to convert between atoms and code lists, or use appropriate list predicates.
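For example, all of these atom predicates are in the ISO standard:

    ?- atom_length(hello, L).
    L = 5.

    ?- sub_atom(hello, 1, 3, _, Sub).
    Sub = ell.

    ?- atom_concat(hello, ' world', A).
    A = 'hello world'.

And atom_codes/2 relates an atom to its list of character codes in both directions, so it works for parsing as well as for constructing atoms.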

[1] https://www.reddit.com/r/prolog/comments/1089peh/can_someone...

[2] https://quantumprolog.sgml.net/docs/libreference.html#string...

YeGoblynQueenne
That's where ISO clashes with the de-facto standard of its most popular implementation, that is also the best maintained. Too bad for ISO.

... we've disagreed about this before though :)

iainctduncan
Anyone have experience with the author's book? I am just getting into this world right now, as it happens, and am working on Art of Prolog, Simply Logical, and the Reasoned Schemer, but other suggestions for resources that are particularly good would be welcome!
usgroup
I can recommend "Simply Logical". I also suggest Advent of Code as a nice way to cut your teeth with expressing thoughts with Prolog.
cbarrick
Please just let me end rules with a trailing comma instead of a period, I'm begging you.

The reason Prolog doesn't support trailing commas is exactly the same reason JSON doesn't support trailing commas: the language is designed to be parsed by an operator precedence parser, and the comma is an infix operator.

The fact that Prolog is a purely operator precedence syntax, with a dynamic operator table, allows you to build DSLs embedded in Prolog. CLP(FD) uses this to create the #= operator.
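As a sketch of what the dynamic operator table enables (the `===>` operator here is made up purely for illustration):

    :- op(700, xfx, ===>).    % declare ===> as a non-associative infix operator

    rewrite(a ===> b).        % a ===> b now parses as the term ===>(a, b)

    ?- rewrite(X ===> Y).
    X = a,
    Y = b.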

usgroup
Generally speaking, Prolog syntax is ridiculously simple and uniform. Its pattern matching is the most universal of any programming language partly because of this.
xelxebar
Here's a nicely-designed tiling window manager, implemented in SWI-Prolog:

https://github.com/Seeker04/plwm

It actually has quite good UX affordances. More than that, however, I find the code eminently hackable, even as someone with very little Prolog experience. Reading through the plwm code really demystified the apparent gap between toy and practical Prolog for me. Heck, even the SWI-Prolog codebase itself is quite approachable!

I'm also mildly surprised at some of the OP's gripes. A while back, I ran through Triska's The Power of Prolog[0], which crisply grounds Prolog's mental model and introduces standard conventions. In particular, it covers desugaring syntax into normal predicates, e.g. -/2 as pairs, [,]/2 as special syntax for ./2 cons cells, etc. Apparently, I just serendipitously stumbled into good pedagogical resources!

I'd be interested in ways that people work logic programming concepts and techniques into non-LP languages.

[0]https://www.metalevel.at/prolog

shawn_w
I frequently find myself thinking "this would be a great fit for prolog etc." but always fail when it comes to the execution.
mcswell
That's easy:

thisWouldBeGreat :- !, fail.

drob518
Prolog is pretty unique. I’ve run into similar frustrations when I have used it as well. I think you have to drop all your preconceived notions and past programming experience when you use Prolog and just accept it as it is. Prolog was developed to run in very small machines and so the resolution algorithm operates in a strict order. This means clause order sometimes matters. Overall, I’ve found it useful to always keep in mind that I’m unifying trees of terms, not so much programming in a pure logical language.
fithisux
The logic programming space has more than Prolog

Picat (mentioned by the author), Datalog, Mercury, XSB

are there more?

usgroup
Personally I don't like the Standard ML based mash-ups.

I think Curry is an interesting take on logic programming. A sort of Haskell meets Prolog.

bw86
findall instead of bagof can also help for these cases.

    | ?- findall(A, (tree(A, N), branch(N)), As).

    As = [n,n1]

    yes
See https://lpn.swi-prolog.org/lpnpage.php?pagetype=html&pageid=...
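The classic differences show up when there are no solutions, and when the goal has free variables. Given the toy facts p(1, a) and p(2, b):

    ?- findall(X, p(X, c), Xs).
    Xs = [].                 % findall/3 succeeds with the empty list

    ?- bagof(X, p(X, c), Xs).
    false.                   % bagof/3 fails when there are no solutions

    ?- bagof(X, p(X, Y), Xs).
    Y = a, Xs = [1] ;
    Y = b, Xs = [2].         % bagof/3 backtracks over the free variable Y

    ?- bagof(X, Y^p(X, Y), Xs).
    Xs = [1, 2].             % ^/2 existentially quantifies Y away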
boxed
The line reorder issue is evergreen and it seems all languages need to either go through this phase and fix it, or gaslight its users forever that it's "not really a problem".
egl2020
Maybe it's just me, but my gripe is that it looks declarative, but you have to read the code in execution order.
xlii
Someone bashing on my pet language? Cracks knuckles

Just kidding. Some of those are stylistic choices I don't have gripes but can understand the criticism. There is however one thing about "Non-cuts are confusing" I'd like to clarify:

In this example:

    foo(A, B) :-
      \+ (A = B),
      A = 1,
      B = 2.
It's very obvious why it fails and it has nothing to do with non-cut. Let's say A can be apple and B can be orange and now you're asking Prolog to compare apples to oranges! ;)

In short one has to "hint" Prolog what A and B can be so it can "figure out" whether the comparison can be made and what its result is. Assuming there exists an is_number(X) clause that can instantiate X as a number, the following would work just fine:

    foo(A, B) :-
      is_number(A),
      is_number(B),
      \+ (A = B),
      A = 1,
      B = 2.
(note that this would be a stupid and very slow clause. Instantiation in clauses like is_number(X) usually starts from some defined bounds. For A = 10000, B = 10001 and a lower bound of 1, the pessimistic case for this clause would require ~100M checks!)
YeGoblynQueenne
I think that should be nonvar(A), nonvar(B), because the reason the unification succeeds and \+ (A = B) fails is that A and B are variables (when called as foo(A,B)). What confuses the author is unification, as far as I can tell.

But, really, that's just not good style. It's bound to fail at some point. It's supposed to be a simple example, but it ends up not being simple at all because the author is confused about what's it supposed to behave like.

YeGoblynQueenne
I expect by this time tomorrow I'll have been Cunningham'd and there will be a 2000 word essay about how all of my gripes are either easily fixable by doing XYZ or how they are the best possible choice that Prolog could have made.

In that case I won't try to correct any of the author's misconceptions, but I'll advise anyone reading the article to not take anything the author says seriously because they are seriously confused and have no idea what they're talking about.

Sorry to be harsh, but it seems to me the author is trying their damnedest to misunderstand everything ever written about Prolog, and to instead apply entirely the wrong abstractions to it. I don't want to go into the weeds, since the author doesn't seem ready to appreciate that, but Prolog isn't Python, or Java, or even Picat, and to say e.g. that Prolog predicates "return true or false" is a strong hint that the author failed to read any of the many textbooks on Prolog programming, because they all make sure to drill into you the fact that Prolog predicates don't "return" anything because they're not functions. And btw, Prolog does have functions, but like I say, not going into the weeds.

Just stay away. Very misinformed article.