ABSTRACT The present work presents a general theoretical framework for the study of operators which merge partial probabilistic knowledge from different sources which are individually consistent, but may be collectively inconsistent. We consider a number of principles for such an operator to satisfy including a set of principles derived from those of Konieczny and Pino Pérez which were formulated for the different context of propositional merging. Finally we investigate two specific such merging operators derived from the Kullback-Leibler notion of informational distance: the social entropy operator, and its dual, the linear entropy operator. The first of these is strongly related to both the multi-agent normalised geometric mean pooling operator and the single agent maximum entropy inference process. By contrast the linear entropy operator is similarly related to both the arithmetic mean pooling operator and the limit centre of mass inference process.
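As a concrete illustration of the two pooling operators named in this abstract, the sketch below computes the normalised geometric mean and the arithmetic mean of several agents' distributions over a finite outcome space; the function names and example numbers are illustrative, not taken from the paper.

```python
import numpy as np

def geometric_pool(dists):
    """Normalised geometric mean pooling: P(w) is proportional to
    (prod_i P_i(w))**(1/n).  If any agent assigns probability 0 to an
    outcome, the pooled probability of that outcome is 0."""
    dists = np.asarray(dists, dtype=float)
    with np.errstate(divide="ignore"):           # log(0) -> -inf -> exp -> 0
        pooled = np.exp(np.mean(np.log(dists), axis=0))
    return pooled / pooled.sum()

def arithmetic_pool(dists):
    """Arithmetic mean pooling: P(w) = (1/n) * sum_i P_i(w)."""
    return np.mean(np.asarray(dists, dtype=float), axis=0)

# Two agents' probability distributions over three outcomes.
agents = [[0.7, 0.2, 0.1],
          [0.2, 0.6, 0.2]]
print(geometric_pool(agents))    # approx. [0.434, 0.402, 0.164]
print(arithmetic_pool(agents))   # [0.45, 0.4, 0.15]
```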
The present work may perhaps be seen as a point of convergence of two historically distinct sequences of results. One sequence of results started with the work of Tennenbaum [59] who showed that there could be no nonstandard recursive model of the system PA of first order Peano arithmetic. Shepherdson [65] on the other hand showed that the system of arithmetic with open induction was sufficiently weak to allow the construction of nonstandard recursive models. Between these two results there remained for many years a large gap occasioned by a general lack of interest in weak systems of arithmetic. However Dana Scott observed that the addition alone of a nonstandard model of PA could not be recursive, while more recently McAloon [82] improved these results by showing that even for the weaker system of arithmetic with only bounded induction, neither the addition nor the multiplication of a nonstandard model could be recursive. Another sequence of results starts with the work of Lessan ...
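For reference, Tennenbaum's theorem, which opens the first sequence of results, admits the following standard formulation (ours, not quoted from the paper):

```latex
% Tennenbaum's theorem (standard formulation): no countable
% nonstandard model of PA is recursive.  Taking the domain to be
% \mathbb{N} without loss of generality:
\[
(\mathbb{N},\oplus,\otimes)\models\mathrm{PA}\ \text{nonstandard}
\;\Longrightarrow\;
\oplus\ \text{and}\ \otimes\ \text{are not both recursive.}
\]
% Scott's observation and McAloon's refinement cited above strengthen
% this to: each of \oplus and \otimes is individually non-recursive.
```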
The purpose of this note is to describe the underlying insights and results obtained by the authors, and others, in a series of papers aimed at modelling the distribution of 'natural' probability functions, more precisely the probability ...
IΣ₀ denotes the subtheory of first-order Peano arithmetic obtained by restricting the induction schema to formulae with only bounded quantifiers. Let EXP denote the corresponding theory obtained by adding to the language a function symbol to denote exponentiation. Let PT ...
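For concreteness, the bounded induction schema referred to here is (in standard notation):

```latex
% The induction schema of I\Sigma_0, restricted to bounded formulas
% \varphi, i.e. formulas all of whose quantifiers have the form
% (\forall x \le t) or (\exists x \le t) for a term t:
\[
\bigl(\varphi(0) \,\land\, \forall x\,(\varphi(x) \to \varphi(x+1))\bigr)
\;\to\; \forall x\,\varphi(x),
\qquad \varphi\ \text{bounded}.
\]
```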
ABSTRACT An attempt is made to estimate the distribution of the length of a polymer molecule when the excluded volume effect is taken into account. The model of a self-avoiding walk on a lattice is used, and exact enumerations of walks of up to 18 steps on the two-dimensional simple quadratic lattice (nearly 125 million walks) and of up to 13 steps on the three-dimensional simple cubic lattice (nearly 950 million walks) are undertaken on a computer. It is conjectured that the circular and spherical symmetry properties of the limiting space distributions for simple random walks are not changed by the self-avoiding condition. The limiting distribution of a rectangular coordinate is estimated, and is found to differ appreciably from the Gaussian distribution of a simple random walk. Instead the distribution can be well fitted by a function of the form exp(−|x|^ν) dx, where ν is equal to 4 in the two-dimensional case and 2.5 in the three-dimensional case. The distributions of length corresponding to these functions are then calculated, and are significantly sharper than those corresponding to randomly linked units.
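The exact enumeration described can be reproduced in miniature; the backtracking sketch below counts self-avoiding walks on the two-dimensional square lattice (a generic method, not the authors' original program).

```python
def count_saws(max_steps):
    """Count self-avoiding walks of each length up to max_steps on the
    2-D square lattice, by depth-first backtracking from the origin."""
    counts = [0] * (max_steps + 1)
    counts[0] = 1                        # the empty walk
    visited = {(0, 0)}

    def extend(x, y, steps):
        if steps == max_steps:
            return
        for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (x + dx, y + dy)
            if nxt not in visited:       # the self-avoidance condition
                visited.add(nxt)
                counts[steps + 1] += 1
                extend(nxt[0], nxt[1], steps + 1)
                visited.remove(nxt)

    extend(0, 0, 0)
    return counts

# Known counts c_1..c_5 on the square lattice: 4, 12, 36, 100, 284.
print(count_saws(5))   # [1, 4, 12, 36, 100, 284]
```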
A model for a complete first order theory T in a language of finite type is minimally saturated if it is recursively saturated and elementarily embeddable in every recursively saturated model of T. Such a model is unique when it exists, and may be regarded as the smallest ...
We describe an axiomatic approach to the a priori choice of hierarchies of second order probability distributions within the context of inexact reasoning. In this manner we give an epistemological characterisation of a certain hierarchy of symmetric Dirichlet priors up to a ...
The European Summer Meeting of the Association for Symbolic Logic was held at Keele University, Staffordshire, on 20-29 July, 1993. The program included twenty invited hour talks, two invited short courses and forty-eight contributed papers. There were also ...
We consider the desirability, or otherwise, of various forms of induction in the light of certain principles and inductive methods within predicate uncertain reasoning. Our general conclusion is that there remain conflicts within the area whose resolution will require a ...
The purpose of this paper is to describe the underlying insights and results obtained by the authors, and others, in a series of papers aimed at modeling the distribution of 'natural' probability functions, more precisely the probability functions on {0, 1} n which we encounter ...
ABSTRACT In this paper we rework the results in our earlier paper [1] using an alternative initial set of connectives. We show that this yields a more satisfactory limiting prior probability distribution than that of our earlier paper, and that this distribution extends naturally to more variables. We then consider these multivariate prior probability distributions in the light of established desiderata. Introduction and Notation. In this paper we shall continue the investigations into 'natural prior probability distributions' initiated in [1], [2]. Briefly, in these papers we considered the problem of estimating the probability that a 'naturally encountered' probability function P on the set SL₁ of sentences of the propositional language L₁ with a single propositional variable q₁ would have P(q₁), equivalently the expected (truth) value of q₁, lying in a real interval (a, b). [Recall that for a language Lₙ = {q₁, ..., qₙ} a probability function P on SLₙ is determined by its values on the atoms ...
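The recalled fact can be completed in standard notation (the completion after the truncation is a standard statement, not quoted from the paper):

```latex
% The atoms of L_n are the 2^n maximal consistent conjunctions
% of literals:
\[
\alpha \;=\; \pm q_1 \land \pm q_2 \land \cdots \land \pm q_n .
\]
% A probability function P on SL_n is determined by its values on
% the atoms: for any sentence \theta of L_n,
\[
P(\theta) \;=\; \sum_{\alpha \models \theta} P(\alpha),
\qquad \sum_{\alpha} P(\alpha) = 1 .
\]
```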
The present paper seeks to establish a logical foundation for studying axiomatically multi-agent probabilistic reasoning over a discrete space of outcomes. We study the notion of a social inference process, which generalises the concept of an inference process for a single agent which was used by Paris and Vencovská to characterise axiomatically the method of maximum entropy inference. Axioms for a social inference process are introduced and discussed, and a particular social inference process called the Social Entropy Process, or SEP, is defined which satisfies these axioms. SEP is justified heuristically by an information theoretic argument, and incorporates both the maximum entropy inference process for a single agent and the multi-agent normalised geometric mean pooling operator.
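A minimal sketch of the single-agent ingredient, the maximum entropy inference process: given a knowledge base of linear constraints on a probability function over finitely many atoms, ME selects the constraint-satisfying distribution of greatest Shannon entropy. The numerical formulation below is a generic illustration, not the paper's construction.

```python
import numpy as np
from scipy.optimize import minimize

def me_inference(n_atoms, constraints):
    """Maximum entropy inference: maximise -sum(p log p) over
    probability vectors p satisfying linear constraints a . p = b.
    `constraints` is a list of (a, b) pairs."""
    def neg_entropy(p):
        p = np.clip(p, 1e-12, 1.0)          # avoid log(0)
        return float(np.sum(p * np.log(p)))

    cons = [{"type": "eq", "fun": lambda p: p.sum() - 1.0}]
    for a, b in constraints:
        cons.append({"type": "eq",
                     "fun": lambda p, a=a, b=b: float(np.dot(a, p) - b)})

    p0 = np.full(n_atoms, 1.0 / n_atoms)    # start from the uniform point
    res = minimize(neg_entropy, p0, bounds=[(0, 1)] * n_atoms,
                   constraints=cons)
    return res.x

# Knowledge base: P(atom1 or atom2) = 0.8 on a 4-atom space.
print(me_inference(4, [([1, 1, 0, 0], 0.8)]))
# ME answer: approx. [0.4, 0.4, 0.1, 0.1]
```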
"""The present work presents a general theoretical framework for the study of operators which merge partial probabilistic evidence from different sources which are individually coherent, but may be collectively incoherent. We consider a number of principles for such an operator to satisfy including a set of principles derived from those of Konieczny and Pino Perez [11] which were formulated for the different context of propositional merging.

Finally we investigate two specific such merging operators derived from the Kullback-Leibler notion of informational distance: the social entropy operator, and its dual, the linear entropy operator. The first of these is strongly related to both the multi-agent normalised  geometric mean pooling operator and the single agent maximum entropy inference process,
ME.  By contrast the linear entropy operator is similarly related to both the arithmetic mean pooling operator and the limit centre of mass inference process, CM-infinity."""
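The duality asserted here can be made explicit by a standard computation (ours, not quoted from the paper): minimising total Kullback-Leibler divergence in one argument order yields the normalised geometric mean, and in the other order the arithmetic mean.

```latex
% With D_KL(P||Q) = \sum_\omega P(\omega)\log(P(\omega)/Q(\omega)),
% over probability functions P on a finite outcome space:
\[
\arg\min_{P}\ \sum_{i=1}^{n} D_{\mathrm{KL}}(P \,\|\, P_i)
\;:\quad
P(\omega) \;=\;
\frac{\bigl(\prod_{i} P_i(\omega)\bigr)^{1/n}}
     {\sum_{\omega'}\bigl(\prod_{i} P_i(\omega')\bigr)^{1/n}}
\quad\text{(normalised geometric mean)},
\]
\[
\arg\min_{P}\ \sum_{i=1}^{n} D_{\mathrm{KL}}(P_i \,\|\, P)
\;:\quad
P(\omega) \;=\; \frac{1}{n}\sum_{i=1}^{n} P_i(\omega)
\quad\text{(arithmetic mean)}.
\]
```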
"ABSTRACT Within the framework of discrete probabilistic uncertain reasoning a large literature exists justifying the maximum entropy inference process, ME, as being optimal in the context of a single agent whose subjective... more
"ABSTRACT

Within the framework of discrete probabilistic uncertain reasoning a
large literature exists justifying the maximum entropy inference process,
ME, as being optimal in the context of a single agent whose subjective
probabilistic knowledge base is consistent. In [9] Paris and Vencovska,
extending the work of Johnson and Shore [6], completely characterised the
ME inference process by an attractive set of axioms which an inference
process should satisfy, thus providing a quite different justification for
ME from that of the more traditional possible worlds or information
theoretic arguments whose origins go back to nineteenth century statistical
mechanics as in [8] or [5].
More recently the second author in [10] and [11] extended the Paris-
Vencovska axiomatic approach to inference processes to the context of
several agents whose subjective probabilistic knowledge bases, while in-
dividually consistent, may be collectively inconsistent. In particular he
defines a "social entropy process", SEP, which is a natural extension of
the single agent ME. However, while SEP is known to possess many
attractive properties, these are almost certainly insufficient to uniquely
characterise SEP. It is therefore of particular interest to study those
Paris-Vencovska principles valid for ME whose immediate generalisations
to the multiagent case are not satisfied by SEP. One of these principles is
the Irrelevant Information Principle, a principle which very few inference
processes satisfy even in single agent context. In this paper we will inves-
tigate whether SEP can satisfy an interesting modified generalisation of
this principle."
"This work stems from a desire to combine ideas arising from two historically different schemes of probabilistic reasoning, each having its own axiomatic traditions, into a single broader axiomatic framework, capable of providing general... more
"This work stems from a desire to combine ideas arising from two historically different schemes of probabilistic reasoning, each having its own axiomatic traditions, into a single broader axiomatic framework, capable of providing general new insights into the nature of probabilistic inference in a multiagent context.
In the present sketch of our work we first describe briefly the background context, and we then present a set of natural principles to be satisfied by any general method of aggregating the partially defined probabilistic beliefs of several agents into a single probabilistic belief function. We will call such a general method of aggregation a social inference process. Finally we define a particular social inference process, the Social Entropy Process (abbreviated to SEP), which satisfies the principles formulated earlier. SEP has a natural justification in terms of information theory, and is closely related to the maximum entropy inference process: indeed it can be regarded as a natural extension of that inference process to the multiagent context."
""ABSTRACT The present paper introduces a new approach to the the theory of voting in the context of binary collective choice, which seeks to define a dynamic optimal voting rule by using insights derived from the mathematical theory... more
""ABSTRACT

The present paper introduces a new approach to the the theory of voting in
the context of binary collective choice, which seeks to define a dynamic optimal voting rule by using insights derived from the mathematical theory of information. In order to define such a voting rule, a method of defining a (non-additive, but monotone) "measure" of the weight of independent opinion of an arbitrary set of voters is suggested, which is value free to the extent that it depends only on probabilistic information extracted from previous patterns of voting, but does not require for its definition any direct information concerning either the correctness or incorrectness of previous voting decisions, or the content of those decisions. The approach to the definition of such a measure, which the author calls gravitas, is axiomatic. The voting rule is then defined by comparing the gravitas of the set of those voters who vote for a given motion with the gravitas of the set of those who vote against that motion.""
Despite the acutely obvious failings of the formal structures of liberal democracy, the conceptual limitations inherent in the historical development of all actually existing forms of democracy have induced a paralysing intellectual timidity and a failure to ask basic philosophical, logical, and mathematical questions about how radically different notions of democratic choice might be envisioned. This talk is aimed at broadening a foundational approach to the intuitive notion of democracy, by pointing to a fundamental flaw, common even to the simplest implementations of virtually all existing notions, which is not sufficiently widely appreciated.
https://m.youtube.com/watch?v=VHbCTAcIfCw