"Linear Logic is a refinement of Classical Logic: using Girard's well-known translation we can investigate the usage of resources in proofs of traditional theorems."
One of the liveliest threads at the seminar concerned natural language. Several talks explored how the resource-sensitivity of Linear Logic maps beautifully onto the combinatorics of sentence meaning — the way words and phrases "combine" their meanings exactly once to produce the meaning of a whole sentence. This is the foundation of the glue semantics approach to natural language, which uses an implicational fragment of Linear Logic to assemble meanings compositionally. The core proof-search challenge — finding all distinct derivations efficiently, since each derivation corresponds to a distinct reading of an ambiguous sentence — anticipated concerns that are very much alive in today's work on parsing and semantic composition.
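To make the proof-search idea concrete, here is a minimal sketch in Python (a toy, not any particular glue-semantics implementation; the atoms `x`, `y`, `f` and the three premises are invented for illustration). It enumerates all proofs of a sequent in the implicational fragment, consuming each premise exactly once; the classic "everyone loves someone" premises yield exactly two derivations, one per quantifier scoping.

```python
from itertools import product, count

_fresh = count()

def imp(a, b):
    return ("imp", a, b)

def spine(f):
    # Unfold a1 -o a2 -o ... -o p into ([a1, ..., an], p).
    args = []
    while isinstance(f, tuple):
        args.append(f[1])
        f = f[2]
    return args, f

def splits(items, n):
    # Every way to distribute the remaining premises among n subgoals.
    for assignment in product(range(n), repeat=len(items)):
        parts = [[] for _ in range(n)]
        for item, k in zip(items, assignment):
            parts[k].append(item)
        yield parts

def prove(ctx, goal):
    # Yield one lambda term per proof of ctx |- goal that uses every
    # premise in ctx exactly once (the multiplicative discipline).
    if isinstance(goal, tuple):                  # goal a -o b: hypothesise a, prove b
        v = f"v{next(_fresh)}"
        for body in prove(ctx + [(goal[1], v)], goal[2]):
            yield f"(lam {v}. {body})"
        return
    for i, (f, t) in enumerate(ctx):             # atomic goal: pick a matching premise
        args, head = spine(f)
        if head != goal:
            continue
        rest = ctx[:i] + ctx[i + 1:]
        if not args:
            if not rest:                         # an axiom must consume everything
                yield t
            continue
        for parts in splits(rest, len(args)):
            for terms in product(*(list(prove(p, a)) for p, a in zip(parts, args))):
                app = t
                for s in terms:
                    app = f"({app} {s})"
                yield app

# "everyone loves someone": two derivations = two quantifier scopings
premises = [
    (imp("x", imp("y", "f")), "loves"),        # loves    : x -o y -o f
    (imp(imp("x", "f"), "f"), "everyone"),     # everyone : (x -o f) -o f
    (imp(imp("y", "f"), "f"), "someone"),      # someone  : (y -o f) -o f
]
readings = list(prove(premises, "f"))
for r in readings:
    print(r)
```

The ambiguity falls out of the proof search itself: one derivation applies `everyone` outermost, the other `someone`, and no third proof exists. (The brute-force context splitting is exponential; real implementations use lazier resource-management strategies.)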
Another major theme was functional programming and memory management. The intuition is elegant: if a variable is used exactly once (a "linear" variable), then the object it refers to has exactly one pointer to it at any given time, which means it can safely be updated in place without copying. Several talks at the seminar explored how far this idea could be pushed — and where it runs into difficulties, particularly with higher-order functions. This line of thinking is now central to the design of modern systems programming languages. Rust's celebrated ownership and borrowing system is a direct descendant of these ideas, and linear types have since made their way into Haskell as well, a sign that the functional programming community has come to take resource-awareness seriously.
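The use-exactly-once intuition can be sketched with a dynamic check in Python (a toy invented here for illustration; real linear type systems, like Rust's, enforce this discipline statically, at compile time):

```python
class Linear:
    """Wrap a value so that it can be consumed at most once."""

    def __init__(self, value):
        self._value, self._used = value, False

    def take(self):
        # Consuming the wrapper transfers the sole reference to the caller,
        # so a later in-place mutation cannot be observed through an alias.
        if self._used:
            raise RuntimeError("linear value used more than once")
        self._used = True
        value, self._value = self._value, None
        return value

def append_in_place(cell, item):
    buf = cell.take()        # unique access: safe to mutate without copying
    buf.append(item)
    return Linear(buf)       # hand the resource back out, still linear

xs = Linear([1, 2])
xs = append_in_place(xs, 3)  # no copy was made, yet no caller can tell
result = xs.take()
print(result)                # [1, 2, 3]
```

A second `take()` on the same wrapper raises at run time; a static linear type system rejects the offending program before it ever runs, which is what makes the copy-free update sound rather than merely hopeful.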
The seminar also featured what became perhaps its most contentious debate: whether Linear Logic really is "the" logic of resources, or whether a rival system — the Logic of Bunched Implications, BI, proposed by Peter O'Hearn and David Pym — did a better job. The BI proponents argued that Linear Logic captured a process reading of resources (independent parallel computations) but not a fully satisfactory spatial reading (separate regions of memory). This dispute, which generated "the most controversy in the meeting", turned out to matter enormously: BI became the foundation of Separation Logic, which is now one of the most important frameworks for formally verifying the correctness of programs that manipulate memory. O'Hearn shared the 2016 Gödel Prize with Stephen Brookes, in part for this line of work.
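The spatial reading BI supplies is visible in Separation Logic's central inference rule. The separating conjunction $P \ast Q$ asserts that the heap splits into two disjoint regions, one satisfying $P$ and one satisfying $Q$; the frame rule (stated here in its standard form, not quoted from the seminar report) then licenses purely local reasoning:

```latex
% Frame rule: a command C verified against the part of the heap it touches
% can run in any larger heap, and the untouched "frame" R is preserved
% (side condition: C modifies no variable free in R).
\[
  \frac{\{P\}\; C\; \{Q\}}
       {\{P \ast R\}\; C\; \{Q \ast R\}}
\]
```

It is exactly this ability to ignore the rest of memory while reasoning about one region that made Separation Logic scale to real programs.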
Not every thread has yet reached its destination, and that is part of what makes this area exciting. Applications of Linear Logic to logic programming, planning, and concurrency have had a long and productive history, and there is no reason to think the story is finished — these are hard problems, and resource-aware reasoning keeps finding new angles on them. What makes a seminar like this valuable is precisely that ideas from different communities — proof theorists, linguists, programming language researchers, verification specialists — get to collide and cross-pollinate, with payoffs that can take years or decades to fully materialise.
One striking absence from the 1999 programme was Differential Linear Logic, introduced by Ehrhard and Regnier a few years later. By adding a notion of differentiation — in the calculus sense — to the logical framework, it opened a connection between proof theory and the mathematics of smooth functions and power series that nobody had anticipated. It gave a logical account of the "Taylor expansion" of proofs, and has since made contact with probabilistic programming, neural network semantics, and the theory of quantitative information flow. A quarter century on, Differential Linear Logic is still actively developing: the right categorical semantics, the proper treatment of higher types, and the connections to adjacent fields are not yet fully settled. It is a reminder that a logical idea, once it takes hold, can keep ramifying in unexpected directions long after the original seminar has ended.
Twenty-five years later, it is fair to say that the core intuition of the 1999 meeting has aged very well. Thinking carefully about how resources are used — in memory, in concurrency, in meaning assembly, in the very structure of proofs — remains one of the most fertile ideas at the intersection of logic and computer science. Some of the seeds planted at Dagstuhl are still growing.
The original seminar report is available from Dagstuhl.