Logic, computation, and normativity

Coordinator(s)
OKADA Mitsu - Full Professor at Keio University's Department of Philosophy
Person(s) involved
AKIYOSHI Ryota
ARAHATA Yasuhiro
KASHIWABATA Tatsuta
TAKAHASHI Yuta
BENOIST Jocelyn (Professor of contemporary philosophy and philosophy of knowledge at Université Paris 1 Panthéon-Sorbonne)
MINESHIMA Koji (Simulation Science, Ochanomizu University)
TERUI Kazuyuki (Mathematical Institute, Kyoto University)
CODOGNET Philippe (Japanese French Laboratory for Informatics, Tokyo)
JOINET Jean-Baptiste (Université Lyon 3)
Additional information
Presentation

Tokyo-Paris cooperation, January 1st, 2015 to December 31st, 2017

Scientific program

It is usually assumed that logic is a normative science, as opposed to a descriptive science such as physics or biology. But what does that mean exactly? The answer depends crucially on what logic actually is, and because modern logic has evolved dramatically from the time of Frege’s Begriffsschrift (1879) to the present day, Frege’s classical conception of logical normativity can no longer be ours. This project is premised on the basic idea that the relationships between logic and computer science – more precisely, between logical proofs and computational programs – raise new issues about the very meaning of the normativity of logic.

 

The goal of our project is to provide a new analysis of logical normativity from the contemporary standpoint, in which the notion of a “normal” proof becomes central. The idea of a normal proof was originally introduced by the Hilbert School, under the early influence of Husserl, in order to carry out Hilbert’s Consistency Program; it was later revived in Martin-Löf’s Intuitionistic Type Theory and further developed through the Curry-Howard isomorphism. Today, the notion of a normal proof turns out to be essential to our understanding of logic, especially in the combined proof-theoretic and computational view of logic that is so central in contemporary logic and computer science. Recent research in proof theory puts the relation between proofs and computation in a new perspective and gives it a new twist, making it deeper than before. Linear logic, ludics, and subsequent developments show that the notion of a normal proof is essential in a variety of logical settings. The notion of normalization also appears in important computational theories, for example in the compositionality of game semantics. A typical example is the introduction, within linear logic, of various representations of proof normalization, through which the distinction between classical and intuitionistic logic is given a new sense and the classical distinction between semantics and syntax is critically examined. These new aspects of logic, as connected with computation and programs, call for an updated concept of logical normativity.
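The correspondence at the heart of the program above can be made concrete: under the Curry-Howard isomorphism, proofs are terms of the lambda calculus, and proof normalization (cut elimination) is beta reduction. The following minimal sketch in Python, not part of the project materials and using hypothetical names, implements a toy lambda-calculus normalizer (assuming distinct bound-variable names, so no capture handling is needed) and normalizes a proof of A → A that takes a "detour".

```python
from dataclasses import dataclass

# Minimal lambda-calculus terms. Under Curry-Howard, terms are proofs
# and beta-normalization corresponds to proof normalization.

@dataclass(frozen=True)
class Var:
    name: str

@dataclass(frozen=True)
class Lam:
    param: str
    body: object

@dataclass(frozen=True)
class App:
    fun: object
    arg: object

def subst(term, name, value):
    """Substitute value for the free variable `name` in term
    (assumes bound names are all distinct, so capture cannot occur)."""
    if isinstance(term, Var):
        return value if term.name == name else term
    if isinstance(term, Lam):
        if term.param == name:
            return term  # `name` is shadowed here
        return Lam(term.param, subst(term.body, name, value))
    return App(subst(term.fun, name, value), subst(term.arg, name, value))

def normalize(term):
    """Repeatedly contract beta-redexes: (lambda x. b) a  ->  b[x := a]."""
    if isinstance(term, App):
        fun = normalize(term.fun)
        arg = normalize(term.arg)
        if isinstance(fun, Lam):
            return normalize(subst(fun.body, fun.param, arg))
        return App(fun, arg)
    if isinstance(term, Lam):
        return Lam(term.param, normalize(term.body))
    return term

# A proof of A -> A containing a detour: the identity applied to the identity.
detour = App(Lam("f", Var("f")), Lam("x", Var("x")))
print(normalize(detour))  # normalizes to the detour-free proof: lambda x. x
```

Normalizing the `detour` term eliminates the redundant application step, just as cut elimination removes detours from a formal proof; the normal form `Lam("x", Var("x"))` is the direct proof of A → A.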