Papers by Valeria de Paiva

Journal of Symbolic Logic, 1984
In 1965 Zadeh introduced the concept of fuzzy sets. The characteristic of fuzzy sets is that the range of truth values of the membership relation is the closed interval [0, 1] of real numbers. The logical operations ⊃, ∼ on [0, 1] which are used for Zadeh's fuzzy sets seem to be those of Łukasiewicz's logic, where p ⊃ q = min(1, 1 − p + q) and ∼p = 1 − p. L. S. Hay extended in [4] Łukasiewicz's logic to a predicate logic and proved its weak completeness theorem: if P is valid then P + Pⁿ is provable for each positive integer n. She also showed that one can, without losing consistency, obtain completeness of the system by use of an additional infinitary rule. Now, from a logical standpoint, each logic has its corresponding set theory in which each logical operation is translated into a basic operation for set theory; namely, the relations ⊆ and = on sets are translations of the logical operations → and ↔. For Łukasiewicz's logic, (P ∧ (P ⊃ Q)) ⊃ Q is not valid. Translating it to the set v...
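The Łukasiewicz operations quoted in the abstract are easy to check numerically. The sketch below (plain Python; the function names `imp` and `neg` are ours, not from the paper) evaluates the abstract's example that (P ∧ (P ⊃ Q)) ⊃ Q is not valid, interpreting ∧ as min:

```python
# Łukasiewicz implication and negation on [0, 1], as given in the abstract:
#   p ⊃ q = min(1, 1 - p + q),   ∼p = 1 - p
def imp(p, q):
    return min(1.0, 1.0 - p + q)

def neg(p):
    return 1.0 - p

# Modus ponens in the form (P ∧ (P ⊃ Q)) ⊃ Q fails to take value 1
# for some truth values, so it is not valid in this semantics:
p, q = 0.7, 0.4
antecedent = min(p, imp(p, q))   # ∧ interpreted as min
print(imp(antecedent, q))        # strictly less than 1, hence not valid
```

Any pair with p > q gives a counterexample, which is exactly the failure of modus ponens as a valid formula that the abstract mentions.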
The general aim of research on Categorical Proof Theory is to apply Logic and Category Theory to understand, reason about and optimise computational processes. In particular, for this course we will develop (linear) logic, and its categorical proof theory, as a tool for solving problems in theoretical computer science. Health Warning: These notes are very preliminary. I am sure they contain several mistakes, big and small. They are also supposed to be read in conjunction with going to the lectures. And, last disclaimer, I haven't got around to putting the bibliographic references in yet. As soon as I manage to do that, they'll be on my homepage. In any case, I would be grateful to anyone willing to point the mistakes out to me.
In this paper we consider the problem of deriving a term assignment system for Girard's Intuitionistic Linear Logic for both the sequent calculus and natural deduction proof systems. Our system differs from previous calculi (e.g. that of Abramsky) and has two important properties which they lack. These are the substitution property (the set of valid deductions is closed under substitution) and subject reduction (reduction on terms is well typed). We define a simple (but more general than previous proposals) categorical model for Intuitionistic Linear Logic and show how this can be used to derive the term assignment system. We also consider term reduction arising from cut-elimination in the sequent calculus and normalisation in natural deduction. We explore the relationship between these, as well as with the equations which follow from our categorical model.
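As a loose illustration of the linearity discipline such a term assignment must maintain (this is our own toy sketch, not the paper's calculus), the following code checks that every bound variable in a lambda term is used exactly once, i.e. that neither weakening (dropping) nor contraction (duplicating) of assumptions occurs:

```python
# Toy linearity check for a fragment of a linear lambda-calculus.
# Term grammar and names are illustrative, not taken from the paper.
from dataclasses import dataclass

@dataclass(frozen=True)
class Var: name: str
@dataclass(frozen=True)
class Lam: param: str; body: object   # linear abstraction (⊸ introduction)
@dataclass(frozen=True)
class App: fun: object; arg: object   # ⊸ elimination

def free_uses(t):
    """Multiset (dict name -> count) of free-variable occurrences in t."""
    if isinstance(t, Var):
        return {t.name: 1}
    if isinstance(t, Lam):
        uses = free_uses(t.body)
        uses.pop(t.param, None)
        return uses
    if isinstance(t, App):
        a, b = free_uses(t.fun), free_uses(t.arg)
        for k, v in b.items():
            a[k] = a.get(k, 0) + v
        return a
    raise TypeError(t)

def is_linear(t):
    """Every bound variable is used exactly once."""
    if isinstance(t, Var):
        return True
    if isinstance(t, Lam):
        return is_linear(t.body) and free_uses(t.body).get(t.param, 0) == 1
    if isinstance(t, App):
        return is_linear(t.fun) and is_linear(t.arg)
    raise TypeError(t)

# \x. x is linear; \x. f x x duplicates x (would need !); \x. y discards x.
print(is_linear(Lam("x", Var("x"))))                                # True
print(is_linear(Lam("x", App(App(Var("f"), Var("x")), Var("x")))))  # False
print(is_linear(Lam("x", Var("y"))))                                # False
```

Duplication and discarding are exactly what the `!` modality reintroduces in a controlled way in Intuitionistic Linear Logic.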
This note starts the formal study of the type system of the functional language Ponder. Some of the problems of proving soundness and completeness are discussed and some preliminary results about fragments of the type system are shown. It consists of 6 sections. In section 1 we review briefly Ponder's syntax and describe its typing system. In section 2 we consider a very restricted fragment of the language for which we can prove soundness of the type inference mechanism, but not completeness. Section 3 describes possible models of this fragment and some related work. Section 4 describes the type-inference algorithm for a larger fragment of Ponder and in section 5 we come up against some problematic examples. Section 6 is a summary of further work.

Logic, language, information, and computation : 22nd International Workshop, WoLLIC 2015, Bloomington, IN, USA, July 20-23, 2015, proceedings. Modeling Language Design for Complex Systems Simulation.- Formalization of Mathematics for Fun and Profit.- From Residuated Lattices via GBI-algebras to BAOs.- Towards a Nominal Chomsky Hierarchy.- Multi-Linear Algebraic Semantics for Natural Language.- Categories of Games.- Learning in the limit, general topology, and modal logic.- The Word Problem for Finitely Presented Quandles is Undecidable.- Intuitionistic Ancestral Logic as a Dependently Typed Abstract Programming Language.- On Topologically Relevant Fragments of the Logic of Linear Flows of Time.- An Equation-Based Classical Logic.- Cyclic multiplicative proof nets of linear logic with an application to language parsing.- A Dichotomy Result for Ramsey Quantifiers.- Parametric Polymorphism - Universally.- On the weak index problem for game automata.- Proof-theoretic aspects of the Lambek-Grishin Calculus.- Syllogistic Logic with "Most".- Characterizing Frame Definability in Team Semantics via The Universal Modality....

ArXiv, 2016
Lexical semantics continues to play an important role in driving research directions in NLP, with the recognition and understanding of context becoming increasingly important in delivering successful outcomes in NLP tasks. Besides traditional processing areas such as word sense and named entity disambiguation, the creation and maintenance of dictionaries, annotated corpora and resources have become cornerstones of lexical semantics research and produced a wealth of contextual information that NLP processes can exploit. New efforts both to link and construct from scratch such information - as Linked Open Data or by way of formal tools coming from logic, ontologies and automated reasoning - have increased the interoperability and accessibility of resources for lexical and computational semantics, even in those languages for which they have previously been limited. LexSem+Logics 2016 combines the 1st Workshop on Lexical Semantics for Lesser-Resourced Languages and the 3rd Workshop on L...
Journal of Logic and Computation
Proceedings of the ACL-PASCAL Workshop on Textual Entailment and Paraphrasing - RTE '07, 2007
This paper describes our system as used in the RTE3 task. The system maps premise and hypothesis pairs into an abstract knowledge representation (AKR) and then performs entailment and contradiction detection (ECD) on the resulting AKRs. Two versions of ECD were used in RTE3, one with strict ECD and one with looser ECD.

Lecture Notes in Computer Science, 2014
Natural language processing systems, even when given proper syntactic and semantic interpretations, still lack the common sense inference capabilities required for genuinely understanding a sentence. Recently, there have been several studies developing a semantic classification of verbs and their sentential complements, aiming at determining which inferences people draw from them. Such constructions may give rise to implied commitments that the author normally cannot disavow without being incoherent or without contradicting herself, as described for instance in the work of Karttunen. In this paper, we model such knowledge at the semantic level by attempting to associate such inferences with specific word senses, drawing on WordNet and VerbNet. This allows us to investigate to what extent the inferences apply to semantically equivalent words within and across languages.
Lecture Notes in Computer Science, 1998
A generic method for constructing categorical models of Linear Logic is provided and instantiated to yield traditional models such as coherence spaces, hypercoherences, phase spaces, relations, etc. The generic construction is modular, as expected. Hence we discuss multiplicative connectives, modalities and additive connectives in turn. Modelling the multiplicative connectives of Linear Logic is a generalisation of previous work, requiring a few non-standard concepts. More challenging is the modelling of the modalities `!' (and, respectively `?'), which is achieved in the surprisingly general setting of this construction by considering !-candidates and showing that they exist and constitute a modality, under appropriate conditions.
Classical intensional semantic frameworks, like Montague's Intensional Logic (IL), identify intensional identity with logical equivalence. This criterion of cointensionality is excessively coarse-grained, and it gives rise to several well known difficulties. Theories of fine-grained intensionality have been proposed to avoid this problem. Several of these provide a formal solution to the problem, but they do not ground this solution in a substantive account of intensional difference. Applying the distinction between operational and denotational meaning, developed for the semantics of programming languages, to the interpretation of natural language expressions, offers the basis for such an account. It permits us to escape some of the complications generated by the traditional modal characterization of intensions.

The benefits of the extended Curry-Howard correspondence relating the simply typed lambda-calculus to proofs of intuitionistic propositional logic and to appropriate classes of categories that model the calculus are widely known. In this paper we show an analogous correspondence between a simple constructive modal logic CK (with both necessity □ and possibility ♦ operators) and a lambda-calculus with modality constructors. Then we investigate classes of categorical models for this logic. Parallel work for constructive S4 (CS4) has appeared before in [Bierman and de Paiva]. The work on the basic system CK has appeared initially with co-authors Bellin and Ritter in the conference Methods for the Modalities [Bellin et al., 2001]. Since then the technical work has been improved by, and taken to a different, higher-order categorical setting by, Ritter and myself. Here we expound on the logical significance of the earlier work.
Brown and Gurr [1, 2] have introduced a model of Petri Nets that is based on de Paiva's Dialectica categories. This model was refined in an unpublished technical report [3], where Petri nets with multiplicities, instead of elementary nets (i.e., nets with multiplicities zero and one only), were considered. In this note we expand this modelling to deal with fuzzy Petri nets. The basic idea is to use as the dualizing object in the Dialectica categories construction the unit interval, which has all the properties of a lineale structure [6].
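A minimal sketch of a lineale structure on [0, 1] of the kind alluded to above, using the Łukasiewicz t-norm as the tensor and its residuum as the internal implication (our choice of operations for illustration, not quoted from the paper or from [6]):

```python
# Lineale-style operations on the unit interval [0, 1]:
# tensor = Łukasiewicz t-norm, limp = its residuated implication.
def tensor(a, b):
    return max(0.0, a + b - 1.0)

def limp(a, b):
    return min(1.0, 1.0 - a + b)

# The key residuation (adjunction) law:
#   tensor(a, c) <= b   iff   c <= limp(a, b)
a, b = 0.6, 0.5
c = limp(a, b)            # the largest c satisfying tensor(a, c) <= b
print(tensor(a, c) <= b)  # True
```

It is this residuation law, internal to [0, 1], that lets the interval play the role of the dualizing object in the Dialectica construction.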
This paper describes a preliminary classification of transitive verbs in terms of the implications of existence (or non-existence) associated with their direct object nominal arguments. The classification was built to underlie the lexical marking of verbs in the lexical resources that the automated system BRIDGE developed at Xerox PARC used for textual inference. Similar classifications are required for other logic-based textual inference systems, but very little is written about the issue.
This preliminary account of our work on improving the verb lexicon of OpenWordNet-PT describes some of the issues that one faces when manually cleaning up a semi-automatically constructed lexical resource and some of the lessons we learned while doing it.
This paper presents OpenWordNet-PT, a freely available open-source wordnet for Portuguese, with its latest developments and practical uses. We provide a detailed description of the RDF representation developed for OpenWordnet-PT. We highlight our efforts to extend the coverage of our resource and add nominalization relations connecting nouns and verbs. Finally, we present several real-world applications where OpenWordnet-PT was put to use, including a large-scale high-throughput sentiment analysis system.
This paper presents NomLex-PT, a lexical resource describing Portuguese nominalizations. NomLex-PT connects verbs to their nominalizations, thereby enabling NLP systems to observe the potential semantic relationships between the two words when analysing a text. NomLex-PT is freely available and encoded in RDF for easy integration with other resources. Most notably, we have integrated NomLex-PT with OpenWordNet-PT, an open Portuguese Wordnet.
This paper presents NomLex-BR, a lexical resource describing Brazilian Portuguese nominalizations, and its integration with OpenWordnet-PT. We first describe the original English NOMLEX lexical resource and how we used it to bootstrap a Portuguese version. Subsequently, we describe how this lexicon can be embedded into OpenWordnet-PT, which facilitates its use and helps spot-checking both the bigger integrated resource and the original lexicon. Lastly, we outline some of the other, more substantial work that we plan to engage for the project of using linguistic insights for knowledge representation in Portuguese.
Proceedings of the international conference on Formal Ontology in Information Systems - FOIS '01, 2001
workshop itself but extending beyond it to embrace a wider range of topics related to Chu spaces and Dialectica constructions. By way of complement to Barr's article in this issue on the conceptual evolution of Chu spaces, we offer here some prefatory remarks of our own on the nature, uses, and history of Chu spaces. We then give our perspective on where this volume's papers fit in the overall scheme of things. Further material on Chu spaces can be found at the website http://chu.stanford.edu maintained by the second editor.