Sunday, July 3, 2016

Proof-Theoretic Semantics for Non-Philosophers

`Proof-theoretic semantics is an alternative to truth-condition semantics,' writes Peter Schroeder-Heister in the Stanford Encyclopedia of Philosophy, and he ought to know, since he created the label for the idea in 1991. He says that proof-theoretic semantics is part of the tradition according to which `the meaning of a term should be explained by reference to the way it is used in our language'. Meaning-as-use is the slogan for that tradition, which somehow sends people back to Wittgenstein and his `language games'.

This post is merely a summary of the first part of Schroeder-Heister's Stanford Encyclopedia entry on proof-theoretic semantics. Why bother, then? I almost hear you ask. Well, I couldn't find a short account that I liked. This is definitely not it, but it should work as a marker while I continue looking.
Also, I have found many interesting articles on Prawitz's, Schroeder-Heister's and Girard's webpages that might be it. But no time to read them now, with only a week before the course at NASSLLI.

Mathematicians might want to think of it as a generalization of proof theory starting from Gentzen's work (Investigations into Logical Deduction). Gentzen famously remarked that the introduction rules in his calculus of natural deduction could be said to define the meanings of the logical constants, while the elimination rules can be obtained as consequences of those definitions.
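To make Gentzen's remark concrete, here are the standard natural deduction rules for conjunction (my rendering, in LaTeX; the encyclopedia entry discusses the point in general terms). The introduction rule stipulates what counts as a canonical proof of A ∧ B; the elimination rules extract no more than what the introduction rule put in.

\[
\frac{A \qquad B}{A \wedge B}\ (\wedge\mathrm{I})
\qquad\qquad
\frac{A \wedge B}{A}\ (\wedge\mathrm{E}_1)
\qquad\qquad
\frac{A \wedge B}{B}\ (\wedge\mathrm{E}_2)
\]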

Hilbert is credited with the revolutionary idea that proofs should be studied as mathematical objects themselves, but it was Prawitz (1972) who made the original distinction between `reductive' proof theory and `general' proof theory. Hilbert-style “reductive proof theory” is the “attempt to analyze the proofs of mathematical theories with the intention of reducing them to some more elementary part of mathematics such as finitistic or constructive mathematics”, while in general proof theory “proofs are studied in their own right in the hope of understanding their nature”.

Meanwhile Kreisel proposes to explain proof theory “from a neglected point of view. Proofs and their representations by formal derivations are treated as principal objects of study, not as mere tools for analyzing the consequence relation” (Kreisel, 1971). Kreisel focuses on the dichotomy between a theory of proofs and a theory of provability, while Prawitz concentrates on the different goals proof theory may pursue. However, both stress the necessity of studying proofs as fundamental entities by means of which we acquire mathematical knowledge.
Thus in general proof theory we are not only interested in whether B follows from A, but also in the ways by which we arrive at B starting from A.

Most forms of proof-theoretic semantics are intuitionistic in spirit, which means in particular that principles of classical logic such as the law of excluded middle or the double negation law are rejected or at least considered problematic. The main tool of proof-theoretic semantics, the calculus of natural deduction, is biased towards intuitionistic logic, in the sense that the straightforward formulation of its elimination rules is the intuitionistic one. Classical logic is only available by means of some rule of indirect proof, which, at least to some extent, destroys the symmetry of the reasoning principles. Of particular importance is the functional view of implication, according to which a proof of A → B is a constructive function which, when applied to a proof of A, yields a proof of B. This functional perspective underlies many conceptions of proof-theoretic semantics, in particular those of Lorenzen, Prawitz and Martin-Löf.
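This functional reading becomes very concrete under the propositions-as-types view. A minimal sketch in Haskell (my illustration; the names below are mine and nothing here is quoted from the encyclopedia entry): a proof of A → B is literally a function from proofs of A to proofs of B, and modus ponens is just function application.

    -- Propositions as types (a toy illustration):
    -- a proof of A -> B is a function from proofs of A to proofs of B,
    -- and a proof of A /\ B is a pair of proofs.
    type And a b = (a, b)

    -- A proof of (A /\ B) -> (B /\ A): the function that swaps the pair.
    swapProof :: And a b -> And b a
    swapProof (x, y) = (y, x)

    -- Modus ponens is just function application.
    modusPonens :: (a -> b) -> a -> b
    modusPonens f x = f x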

Natural deduction is based on five major ideas:
  • Discharge of assumptions: Assumptions can be “discharged” or “eliminated” in the course of a derivation, so the central notion of natural deduction is that of a derivation depending on assumptions.
  • Separation/Modularity: Each primitive rule schema contains only a single logical constant.
  • Introduction and elimination: The rules for logical constants come in pairs. The introduction rule(s) allow(s) one to infer a formula with the constant in question as its main operator; the elimination rule(s) permit(s) one to draw consequences from such a formula.
  • Reduction: For every detour consisting of an introduction rule immediately followed by an elimination rule there is a reduction step removing this detour (see the sketch after this list).
  • Normalization: By successive applications of reductions, derivations can be transformed into normal forms which contain no detours.
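For example (a standard illustration in my own rendering, not a quotation from the entry), the detour of introducing a conjunction and immediately eliminating it reduces to the derivation of the conjunct one already had:

% assumes amsmath (for \dfrac) and amssymb (for \leadsto)
\[
\dfrac{\dfrac{\begin{array}{c}\mathcal{D}_1\\ A\end{array}
       \qquad
       \begin{array}{c}\mathcal{D}_2\\ B\end{array}}
      {A \wedge B}\ (\wedge\mathrm{I})}
      {A}\ (\wedge\mathrm{E}_1)
\quad\leadsto\quad
\begin{array}{c}\mathcal{D}_1\\ A\end{array}
\]

Normalization says that repeatedly applying such steps terminates in a derivation with no detours left.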
Lorenzen (1955) introduced the idea of the inversion principle, which says that everything that can be obtained from every defining condition of A can be obtained from A itself.
von Kutschera (1968) introduced “Gentzen semantics”, a semantics of logically complex implication-like statements A1,…,An → B with respect to calculi K which govern the reasoning with atomic sentences.
----



Sunday, June 19, 2016

Nominalizations and Zombie Nouns

I was in Stanford for some of the DELPH-IN meeting and we had the traditional conversation about nominalizations once again.



Is it useful to connect `destruction' with `destroy'? Should this be a lexical resource? How? Should we classify them (or some of them) as nominalizations or deverbals? Which ones?

I have been working on this stuff for a while. First with Olya Gurevich (and Dick and Tracy), then in Portuguese, with Livy Real (and Claudia and Alexandre).

It looks like there's still plenty to do.

But Knowledge Representation aside, as Helen Sword says, nominalizations are `zombie nouns'.

Persepolis seems beautiful, even after being destroyed by Alexander in 330 B.C.
 (By the way, Alexander the Great also destroyed Thebes and Tyre, the latter by building a causeway!!!)

Saturday, March 12, 2016

Late with Women's International Day 2016

Someone I don't know personally yet (Evangelia Antonakos)
posted this in the Facebook group Women in Logic
List of women in mathematics - Wikipedia, the free encyclopedia
and guess what? I am there!!!

As Groucho Marx would've said, things are really wrong if I'm invited to this party. Still, a party is a party; might as well enjoy it.

Thank you Evangelia, this was a super present!

Monday, February 22, 2016

Bridges and all that

I talked on October 26th, 2012 in the Nuance NLU Seminar organized by Mark Fanty
(NLP Research: Bridges from Language to Logic)
about my work on Natural Language-based Knowledge Representations at PARC and how I wanted to improve it.

I first gave a talk similar to this in February 2010 at SRI (slides on Slideshare).

Then I tried to think a bit more along the lines of expressivity vs. complexity, following the work of Larry Moss and others: first at the Stanford Workshop on Natural Logic, Proof Theory, and Computational Semantics in March 2011, and then at RAIN in Austin, Texas, in June 2011, when I called it Little Engines of Inference.

Now this work has stopped for lack of computational power. I need to decide how to continue it, but I am drawing a blank right now.

(picture of the Milky Way in the Appalachian mountains by Lam, CUNY 2015 photo contest)

Sunday, February 21, 2016

Gentle with Gentilics

(photo of Yosemite Firefall by Sangeeta Dey via National Geographic)

Yesterday Livy submitted a joint paper on gentilics in OpenWordNet-PT. This is good: as she reminded me, this work has been kicking around for quite a while.

This also reminds me that the work on NomLex-PT needs tidying up. Will write more about it later on. There is a paper, for example, here.

Sunday, February 14, 2016

Bauer and Banach-Tarski


Dan Piponi had a short and thought-provoking post in G+ the other day. To me the best bit was
"Conversations between people who view mathematics as being about beliefs, and people who view mathematics as being about practices, can often be at cross purposes.
I think I am firmly on the side of the ones who think that it's about practices, not beliefs. One thing that surprises me about Philosophy and philosophical questions is the notion of cross-purposed-ness, how one can be interested in a subject, yet, totally not engage with certain questions and manners of approaching it.

Piponi's whole post is below. In response to an ensuing discussion with Rodrigo Freire on whether intuitionistic practices of Mathematics are self-contradictory, given that the majority of mathematicians do believe in Excluded Middle and Choice, Victor Botelho posted the link to the following video of Andrej Bauer talking at Princeton in 2013.

Five Stages of Accepting Constructive Mathematics

I loved it!! If you haven't seen it yet, do yourself a favour and have a look. It touches on lots of the questions that aren't at cross-purposes for me. Says some things I'd say too, but it has much more than I know, so I learned lots, and hope to follow up with some more learning on the subject of constructive mathematics. Including some on what to do with Banach-Tarski and the constructive "theorem" that every function is continuous. 

Philosophy is hard, and I don't want to venture there much, but it would be nice to get enough of a feeling for the place to avoid it.



 ------------------------
Dan Piponi on Beliefs vs Practices in Mathematics
 
Many people think of religions primarily as systems of belief. I think this may be a skewed view because of the predominance of Christianity and Islam, both of which make creeds prominent. For example, although Judaism does have something like a creed, it tends to place more emphasis on practice than belief.

This reflects my view of mathematics. I think that for many, mathematics is a matter of belief. For them, mathematics is a way to find out what is and isn't true. I tend to see mathematics as a set of practices. As a result, I find myself bemused by debates over whether 2 really exists, or whether infinite sets exist, whether the continuum really is an infinite collection of points, whether infinitesimals exist, whether the axiom of choice is true, and so on. I find some ultrafinitists particularly confusing. They seem to believe themselves to be expressing skepticism of some sort, whereas to me, expressing skepticism about mathematical constructions is a category error. So to me, these ultrafinitists are surprising because of what they believe, not because of what they don't. This doesn't just apply to ultrafinitists. In an essay by Boolos [1], he seems confident in the self-evident truth of the existence of integers, say, but expresses more and more doubt as he considers larger and larger cardinals. Many mathematicians seem to have a scale of believability, and ultrafinitists just draw the scale differently.

Conversations between people who view mathematics (or religion) as being about beliefs, and people who view mathematics (or religion) as being about practices, can often be at cross purposes. And members of one group can often find themselves dragged into debates that they don't care for because of the framing of questions. (I don't want to debate the existence of infinite sets, not because I can't justify my beliefs, but because I'm more interested in how to use such sets. I don't think belief is a precondition for use.)

Of course you can't completely separate belief and practice and I certainly do have some mathematical beliefs. For example I put a certain amount of trust in mathematics in my daily job because I believe certain practices will allow me to achieve certain goals.

[1] Must we believe in Set Theory? https://books.google.com/books/about/Logic_Logic_and_Logic.html?id=2BvlvetSrlgC (I hope I'm not mischaracterizing this essay, but even if I am, the point still stands.)

Friday, February 12, 2016

Billy Goats Gruff

In 2009 Ruy de Queiroz first asked his logician friends, me included, about Rolf Schock Prize nominations.

I think it's very nice that Ruy gets asked this kind of question. So I tried to help in 2009 with a tale of Bills.

I'd like to nominate Three Bills: William A. Howard, William W. Tait and F. William Lawvere. No, it's not a kid's story, even if the three Billy Goats Gruff come to mind. The reason I think this is a sensible nomination is that the three are involved in the project that came to be known as the Curry-Howard Correspondence. The Curry-Howard correspondence is one of the big forces behind the advancement of logic and philosophy as engines of Computer Science in the last fifty years.

The Curry-Howard correspondence asserts that propositions in (certain systems of) logic correspond to types in (certain kinds of) programming languages, that proofs in those logics correspond to terms in these programming languages, and that simplification of proofs corresponds to computation in the related languages. As can be easily seen even in this oversimplified description, the correspondence has far-reaching consequences for both logic and programming. In particular it allows results in either area to be exploited in the other--a hallmark of a deep result in Mathematics.
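To make the three clauses a little more tangible, here is a toy Haskell sketch (mine, and deliberately minimal; none of the names below come from Howard, Tait or Lawvere): the proposition A ∧ B becomes the pair type, a proof becomes a term of that type, and simplifying a `detour' in a proof is exactly an evaluation step of the program.

    -- Curry-Howard in miniature (a toy illustration):
    -- propositions are types, proofs are terms,
    -- and simplifying a proof is running the program.

    -- A "detour" proof of A from assumptions A and B:
    -- prove A and B, pair them (/\-introduction),
    -- then project the first component (/\-elimination).
    detourProof :: a -> b -> a
    detourProof x y = fst (x, y)

    -- Its normal form, reached by the reduction fst (x, y) ~> x,
    -- is the direct proof that just returns the first assumption.
    directProof :: a -> b -> a
    directProof x y = x

    -- Evaluation removes the detour: both define the same function.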

Howard is already named, together with H. Curry, in the correspondence, which wouldn't have the impact it has without the work of Tait and Lawvere. Tait is responsible for the part of the correspondence that relates proof simplification to computation, possibly the most important component of the correspondence. And Lawvere (together with Lambek and many other category theorists) is the one who made it all beautiful mathematics, by relating it to the Theory of Categories. Lawvere showed in his thesis (1968) that the correspondence could be extended to (special classes of) categories and that, once again, results in pure category theory correspond to logical (and hence also programming) results, so that one can work in whichever field has the tools available and then transfer the results to the other fields. It's remarkable and very beautiful, albeit still hard to explain without the full mathematical apparatus.

Nonetheless, given the Rolf Schock Prize's interest in making sure that society in general is aware of the importance of philosophy and logic for humanity, and given the importance of computing and its understanding for society as a whole, the nomination of the three Bills seems to me a sensible option.