
Ross Pain

Over the last fifteen years, an ambitious explanatory framework has been proposed to unify explanations across biology and cognitive science. Active inference, whose most famous tenet is the free energy principle, has inspired excitement and confusion in equal measure. Here, we lay the ground for proper critical analysis of active inference, in three ways. First, we give simplified versions of its core mathematical models. Second, we outline the historical development of active inference and its relationship to other theoretical approaches. Third, we describe three different kinds of claim (labelled mathematical, empirical and general) routinely made by proponents of the framework, and suggest dialectical links between them. Overall, we aim to increase philosophical understanding of active inference so that it may be more readily evaluated. This paper is the Introduction to the Topical Collection "The Free Energy Principle: From Biology to Cognition".
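A minimal sketch of the kind of core mathematical model at issue, using standard notation rather than anything taken from the paper itself: write o for observations, s for hidden states, p for the agent's generative model, and q for an approximate posterior over hidden states. The variational free energy that active inference models typically minimise can then be written as

\[
F(q, o) \;=\; \mathbb{E}_{q(s)}\big[\ln q(s) - \ln p(o, s)\big]
        \;=\; D_{\mathrm{KL}}\big[q(s)\,\|\,p(s \mid o)\big] \;-\; \ln p(o)
        \;\ge\; -\ln p(o).
\]

Because the KL term is non-negative, F upper-bounds surprisal (negative log evidence), so minimising F both improves the approximate posterior q(s) and, implicitly, increases the evidence for the agent's generative model.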
The free energy principle is notoriously difficult to understand. In this paper, we relate the principle to a framework that philosophers of biology are familiar with: Ruth Millikan's teleosemantics. We argue that: (i) systems that minimise free energy are systems with a proper function; and (ii) Karl Friston's notion of implicit modelling can be understood in terms of Millikan's notion of mapping relations. Our analysis reveals some surprising formal similarities between the two frameworks, and suggests interesting lines of future research. We hope this will aid further philosophical evaluation of the free energy principle.
Recent work by Stout and colleagues indicates that the neural correlates of language and Early Stone Age toolmaking overlap significantly. The aim of this paper is to add computational detail to their findings. I use an error minimisation model to outline where the information processing overlap between toolmaking and language lies. I argue that the Early Stone Age signals the emergence of complex structured representations. I then highlight a feature of my account: It allows us to understand the early evolution of syntax in terms of an increase in the number and complexity of models in a cognitive system, rather than the development of new types of processing.
In their landmark 2010 paper, "The weirdest people in the world?", Henrich, Heine, and Norenzayan outlined a serious methodological problem for the psychological and behavioural sciences. Most of the studies produced in the field use people from Western, Educated, Industrialised, Rich and Democratic (WEIRD) societies, yet inferences are often drawn to the species as a whole. In drawing such inferences, researchers implicitly assume that either there is little variation across human populations, or that WEIRD populations are generally representative of the species. Yet neither of these assumptions is justified. In many psychological and behavioural domains, cultural variation begets cognitive variation, and WEIRD samples are recurrently shown to be outliers. In the years since the article was published, attention has focused on the implications this has for research on extant human populations. Here we extend those implications to the study of ancient H. sapiens, their hominin forebears, and cousin lineages. We assess a range of characteristic arguments and key studies in the cognitive archaeology literature, identifying issues stemming from the problem of sample diversity. We then look at how worrying the problem is, and consider some conditions under which inferences to ancient populations via cognitive models might be provisionally justified.
Hutto and Myin claim that teleosemantics cannot account for mental content. In their view, teleosemantics accounts for a poorer kind of relation between cognitive states and the world but lacks the theoretical tools to account for a richer kind. We show that their objection imposes two criteria on theories of content: a truth-evaluable criterion and an intensionality criterion. For the objection to go through, teleosemantics must be subject to both these criteria and must fail to satisfy them. We argue that teleosemantics meets the truth-evaluable criterion and is not required to meet the intensionality criterion. We conclude that Hutto and Myin's objection fails.
Cognitive archaeologists attempt to infer the cognitive and cultural features of past hominins and their societies from the material record. This task faces the problem of minimum necessary competence: as the most sophisticated thinking of ancient hominins may have been in domains that leave no archaeological signature, it is safest to assume that tool production and use reflect only the lower boundary of cognitive capacities. Cognitive archaeology involves selecting a model from the cognitive sciences and then assessing some aspect of the material record through that lens. We give examples to show that the background theoretical commitments in cognitive science that inform those models lead to different minimum necessary competence results. This raises an important question: what principles should guide us in selecting a model from the cognitive sciences? We outline two complementary responses to this question. The first involves using independent lines of evidence to converge on a particular capacity. This can then influence model choice. The second is a broader suggestion. Theoretical diversity is a good thing in science, but it is only beneficial over a limited amount of time. According to recent modelling work, one way of limiting diversity is to introduce extreme priors. We argue that having a broad spectrum of views in the philosophy of cognitive science may actually help cognitive archaeologists address the problem of minimum necessary competence.
How do technologies that are too complex for any one individual to produce ("cumulative technological culture") arise and persist in human populations? Contra prevailing views focusing on social learning, Osiurak & Reynaud (2020) argue that the primary driver for cumulative technological culture is our ability for technical reasoning. Whilst sympathetic to their overall position, we argue that two specific aspects of their account are implausible: first, that technical reasoning is unique to humans; and second, that technical reasoning is a necessary condition for the production of cumulative technological culture. We then present our own view, which keeps technical reasoning at the forefront whilst jettisoning these conditions. This produces an account of cumulative technological culture that maintains an important role for technical reasoning, whilst being more evolutionarily plausible.
Veissière et al. must sacrifice explanatory realism and precision in order to develop a unified formal model. Drawing on examples from cognitive archaeology, we argue that this makes it difficult for them to derive the kinds of testable predictions that would allow them to resolve debates over the nature of human social cognition and cultural acquisition.
This paper examines the inferential framework employed by Palaeolithic cognitive archaeologists, using the work of Wynn and Coolidge as a case study. I begin by distinguishing minimal-capacity inferences from cognitive-transition inferences. Minimal-capacity inferences attempt to infer the cognitive prerequisites required for the production of a technology. Cognitive-transition inferences use transitions in technological complexity to infer transitions in cognitive evolution. I argue that cognitive archaeology has typically used cognitive-transition inferences informed by minimal-capacity inferences, and that this reflects a tendency to favour cognitive explanations for transitions in technological complexity. Next I look at two alternative explanations for transitions in technological complexity: the demographic hypothesis and the environmental hypothesis. This presents us with a dilemma: either reject these alternative explanations or reject traditional cognitive-transition inferences. Rejecting the former is unappealing as there is strong evidence that demographic and environmental influences play some causal role in technological transitions. Rejecting the latter is unappealing as it means abandoning the idea that technological transitions tell us anything about transitions in hominin cognitive evolution. I finish by briefly outlining some conceptual tools from the philosophical literature that might help shed some light on the problem.
Shaun Gallagher [2019] argues for a 'non-classical' conception of nature, which includes subjects as irreducible constituents. As such, first-person phenomenology can be naturalised and at the same time resist reduction to the third-person. In the first part of this paper, I raise three concerns for the claim that nature is irreducibly subject-involving. In the second part of the paper, I suggest that embracing a process ontology could help strengthen Gallagher's proposal.
Review of William B. Irvine's "You: A Natural History".