Functional programmers vaunt the Curry-Howard(-Lambek) correspondence, as if it endorses the lambda calculus. In fact, it shows the limitations of computer science.

Theorems are eternal truths[1]—one can use their results freely. This attitude does not work with the results of computations.

Lazy evaluation shows one practical problem in free reuse; consider Hudak's example:

\begin{align*} (\lambda x.\, (+~x~x))~(*~5~4) &\Rightarrow (+~(*~5~4)~(*~5~4)) \\ &\Rightarrow (+~20~(*~5~4)) \\ &\Rightarrow (+~20~20) \\ &\Rightarrow 40 \end{align*}

Such situations are common enough that lazy languages must be implemented with sharing (call-by-need), so a duplicated argument is evaluated at most once.
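Sharing is observable from inside the language. A minimal sketch (the name `expensive` is mine, and `trace` is used purely as a debugging probe): the let-bound thunk below is forced once and its result reused, so the trace message fires a single time even though `x` appears twice.

```haskell
import Debug.Trace (trace)

-- Hypothetical stand-in for a costly computation; trace announces
-- (on stderr) each time the expression is actually evaluated.
expensive :: Int
expensive = trace "evaluating (* 5 4)" (5 * 4)

-- Call-by-need: the let-bound thunk is evaluated once and the result
-- shared, so the trace fires only once although x is used twice.
shared :: Int
shared = let x = expensive in x + x

main :: IO ()
main = print shared  -- prints 40
```

Under call-by-name, by contrast, both occurrences of `x` would re-run the multiplication, exactly as in Hudak's reduction sequence.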

Consider next

```haskell
fac :: Integer -> Integer
fac 0 = 1
fac n = n * fac (n-1)
```

Putatively `fac (n-1)` has type Integer, but it is qualitatively different from `n` (also of type Integer): it is the result of a computation, while `n` is a value. In this example the conflation is excusable, but consider the Ackermann function, viz.

\begin{align*} A(0,n) &= n + 1 \\ A(m+1,0) &= A(m,1) \\ A(m+1,n+1) &= A(m,A(m+1,n)) \end{align*}

Computing $$A(5,3)$$ by stepping through the above would take longer than the heat death of the universe—it is morally different from an Integer provided as an argument or even the result of an $$O(n)$$ computation.
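The recurrences transcribe directly into Haskell, which makes the point concrete: the definition below is a perfectly good Thing, yet `ack 5 3` will never yield a value.

```haskell
-- Ackermann, transcribed directly from the recurrences above
-- (Integer, so the enormous intermediate values cannot overflow).
ack :: Integer -> Integer -> Integer
ack 0 n = n + 1
ack m 0 = ack (m - 1) 1
ack m n = ack (m - 1) (ack m (n - 1))

main :: IO ()
main = print (ack 3 3)  -- 61, already via thousands of recursive calls
```

`ack 3 3` answers promptly; `ack 4 2` is a 19,729-digit number; `ack 5 3` is beyond any physical computation. All three expressions have type Integer.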

Theorems stand on their own; algorithms are verbs. They relate inputs to outputs; an algorithm is a Thing even before any evaluation takes place, but evaluation is the common case in programming. It is understood that a procedure quicksort will be applied to inputs, and it derives its meaning from the plausibility of such application.

Girard singles out cut-elimination (guaranteed by Gentzen's Hauptsatz) as what makes a programming language. In the case of intuitionistic logic, cut-elimination is Church-Rosser (confluent), which is not at all the case for classical logic; but in any case mathematics without modus ponens is uncanny ("like crossing the English Channel with fists and feet bound"). Computer science, however, is very much about normalizing, about carrying out $$\beta$$-reductions, a.k.a. cut-elimination.

# Girard, Negation

$$\neg\forall x. \neg P(x) = \exists x. P(x)$$ is actually an interesting way to handle existence ("it is inconsistent to say $$x$$ does not exist, thus $$x$$ exists"); Euclid first uses it to prove the existence of the angle bisector. Perhaps it reflects an anxious worldview (Kubrick, Kafka), a taut overarching machine of language where everything is predetermined vs. the intuitionist (playing and moulding logic yourself).
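The classical step can be made explicit. In Lean 4, for instance, deriving the existential from $$\neg\forall x.\,\neg P(x)$$ goes through `Classical.byContradiction` (the theorem name here is my own; the constructive fragment alone does not prove it):

```lean
theorem exists_of_not_forall_not {α : Type} {P : α → Prop}
    (h : ¬ ∀ x, ¬ P x) : ∃ x, P x :=
  Classical.byContradiction fun hne =>
    h fun x hp => hne ⟨x, hp⟩
```

The proof exhibits no witness; it merely shows that the witness's nonexistence is inconsistent.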

Intuitionists rejected this and went on to develop intuitionistic logic; that it is isomorphic to the lambda calculus should not vindicate their fundamentalism: mathematicians already distinguish a proof of existence from a constructive algorithm.

Linear logic provides an example of constructive negation; negation is entirely absent from functional programming. Intuitionists reject $$\neg\forall x.\, \neg P(x) = \exists x.\, P(x)$$, objecting to the law of the excluded middle, but programming lacks negation altogether, and the duality of logic (the De Morgan laws, e.g. $$A^\bot \wedge B^\bot = (A \vee B)^\bot$$) is missing. This is more interesting!
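For contrast, linear negation is involutive and defined by De Morgan-style dualities over all the connectives (writing $$\parr$$ and $$\with$$, from the cmll package, for par and with):

\begin{align*} (A \otimes B)^\bot &= A^\bot \parr B^\bot & (A \parr B)^\bot &= A^\bot \otimes B^\bot \\ (A \oplus B)^\bot &= A^\bot \with B^\bot & (A \with B)^\bot &= A^\bot \oplus B^\bot \\ (\oc A)^\bot &= \wn(A^\bot) & (\wn A)^\bot &= \oc(A^\bot) \end{align*}

Here negation never disappears into a one-way arrow: every connective has a dual, and $$A^{\bot\bot} = A$$.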

[1] Prolog is order-dependent, making a mockery of logic (Girard).