Wilfrid Sellars' denunciation of the Myth of the Given was meant to clarify, against empiricism, that perceptual episodes alone are insufficient to ground and justify perceptual knowledge. Sellars showed that in order to accomplish such epistemic tasks, more resources and capacities, such as those involved in using concepts, are needed. Perceptual knowledge belongs to the space of reasons and not to an independent realm of experience. Dan Hutto and Eric Myin have recently presented the Hard Problem of Content as an ensemble of reasons against naturalistic accounts of content. In a nutshell, it states that covariance relations, even though they are naturalistically acceptable explanatory resources, do not constitute content. The authors exploit this move in order to promote their preferred radical enactivist and anti-representationalist option, according to which basic minds, the lower stratum of cognition, do not involve content. Although it is controversial to argue that the Hard Problem of Content effectively dismisses naturalistic theories of representation, a central aspect of it, the idea that information as covariance does not suffice to explain content, finds support among defenders of classical cognitive representationalism, such as Marcin Miłkowski. This support, together with the acknowledgment that this remark about covariance is a point already made by Sellars in his criticism of the Myth of the Given, has a number of interesting implications.
Not only is it of interest for the debates about representationalism in cognitive science, where it can be understood as an anticipatory move, but it also offers some clues and insights for reconsidering several issues along Sellarsian lines: a conflation between two concepts of representation that is often assumed in cognitive science, a distinction between two relevant types of normativity, and a reconsideration of the naturalism involved in such explanations.
Stephen Turner’s anti-normativism is based on the idea that the normative can be explained away by social science. Exploiting the idea, fostered by the sociology of scientific knowledge, that reasons can be understood naturalistically as the causes of scientists’ beliefs, and endorsing a non-normative conception of rationality, Turner has argued that normative accounts are better understood as “Good Bad Theories” (GBT). GBT are understood as false accounts that play a role in social coordination, like magical or religious rituals in primitive societies (e.g., taboo and the like). According to Turner, “norms,” “obligations,” “reasons,” and “commitments” are like taboo and can be explained away as GBT. Hence, Turner expected normative accounts to disappear completely in a fully disenchanted world. Turner focuses on the idea, widespread among philosophers, that the normative does not reduce to the causal: his main claim is that social science succeeds in reducing the normative to causal terms, overcoming normative/causal dualism. Furthermore, this success is presented as creating a serious challenge for normativism. By focusing on certain (supposedly normative) features of beliefs, like those involved in belief-change dynamics, I will point out some interesting implications and problems for Turner’s anti-normativism.
Discursive pluralism, recently fostered by anti-representationalist views, poses an interesting challenge to representationalism by stating that not all assertions conform to a descriptive model of language. Although in recent years alethic pluralism has become more and more popular as a way out of this issue, the discussion also hosts other interesting minority approaches in the anti-representationalist camp. In particular, the late stage of contemporary expressivism offers a few relevant insights, going from Price's denunciation of “placement problems” to Brandom's inferentialism. This paper attempts to show how these expressivist ideas combine well together, composing a unitary and metaphysically sober metaphilosophical framework.
Stephen Turner claims that social science can explain away normativity. Exploiting a non-normative view of rationality and a causal view of belief, he has claimed that normativist views are akin to what he calls Good Bad Theories (GBT). GBT are false accounts that play a social-coordination role, like primitive rituals (taboo and the like). Hence, "norms," "commitments," and "obligations" are just like taboo and can be explained away as GBT. Normativism, as a consequence, is doomed to disappear in a disenchanted world. Turner focuses on the normativist idea that the normative does not reduce to the causal: he claims that social science succeeds in the reduction. This claim is presented as a major challenge to philosophical normativism. In what follows, I discuss some aspects of Turner's challenge by focusing on certain features of belief and belief change that prima facie promote a normativist view; this is the basis for focusing on some problems concerning the scope of Turner's argument.
The paper focuses on the kind of expertise required by doctors in health communication and argues that such expertise is twofold: both epistemological and communicative competences are necessary to achieve patient compliance. Firstly, we introduce the specific epistemic competences that deal with diagnosis and its problems. Secondly, we focus on the communicative competences and argue that an inappropriate strategy in communicating the reasons for diagnosis and therapy can make patient compliance unworkable. Finally, we focus on the case of the diabetes metaphor and propose the deliberate use of metaphors in health communication as an educational tool. On the one hand, metaphors might help doctors explain the disease in simpler terms and frame the experience of illness according to the patient’s specific needs. On the other hand, metaphors might encourage a change in patients’ beliefs about their own experience of illness, and enable them to reach shared decision-making with doctors.
Introduction: The paper investigates the impact of the use of metaphors in reasoning tasks concerning vaccination, especially in cases of defeasible reasoning. We assumed that both metaphor and defeasible reasoning can help people understand vaccination as an important collective health phenomenon, by anticipating possible defeating conditions.
Methods: We hypothesized that extended metaphor could improve both the argumentative and the communicative effects of the message. We designed an empirical study to test our main hypotheses: participants (N = 196, 78% females; Mean age = 27.97 years, SD age = 10.40) were presented with a text about vaccination, described in either literal or metaphorical terms, based on uncertain vs. safe reasoning scenarios.
Results: The results of the study confirmed that defeasible reasoning is relevant for the communicative impact of a text and that an extended metaphor enhances the overall communicative effects of the message, in terms of understandability, persuasion, perceived safety, feeling of control over the health situation, collective trust in expertise, and uptake of experts' advice. However, the results show that this effect is significantly nuanced by the type of defeasible reasoning, especially in the case of participants' trust in expertise and commitment to experts' advice.
Conclusion: Both communicative and defeasible reasoning competences are needed to enhance trust in immunization, with possible different outcomes at an individual and collective level.
It is a common opinion that chance events cannot be understood in causal terms. Conversely, according to a causal view of chance, intersections between independent causal chains originate accidental events, called "coincidences." The present paper takes this causal conception of chance into proper consideration and tries to shed new light on it. More precisely, starting from Hart and Honoré's view of coincidental events (Hart and Honoré in Causation in the Law, Clarendon Press, Oxford, 1959), this paper furnishes a more detailed account of the nature of coincidences, according to which coincidental events are hybrids constituted by ontic (physical) components, that is, the intersections between independent causal chains, plus epistemic aspects, where by "epistemic" we mean what is related, in some sense, to knowledge: for example, access to information, but also expectations, relevance, and significance, that is, psychological aspects. In particular, this paper investigates the role of the epistemic aspects in our understanding of what coincidences are. In fact, although the independence between the causal lines involved plays a crucial role in understanding coincidental events, that condition proves insufficient for a satisfactory definition of coincidences. The main target of the present work is to show that the epistemic aspects of coincidences are, together with the independence between the intersecting causal chains, a constitutive part of coincidental phenomena. Many examples are offered throughout the paper to reinforce this idea. This conception, contrary to the views of, for example, Antoine Augustin Cournot and Jacques Monod, entails that a purely objectivist view of coincidences is not tenable.
European Journal of Pragmatism and American Philosophy, 2021
Hilary Putnam spent much of his career criticizing the fact/value dichotomy, and this became apparent already during the phase when he defended internal realism. He later changed his epistemological and metaphysical view by endorsing natural realism, with the consequence of embracing alethic pluralism, the idea that truth works differently in various discourse domains. Despite these changes of mind in epistemology and in the theory of truth, Putnam went on criticizing the fact/value dichotomy. However, alethic pluralism entails drawing distinctions among discourse domains, especially between factual and nonfactual domains, and these distinctions are in tension with the rejection of the fact/value dichotomy, since that rejection would in principle undermine the standing of factual domains as genuine. This issue raises, prima facie, some doubts about the effective compatibility of these views.
The paper investigates the epistemological and communicative competences that experts need in order to use and communicate evidence in the reasoning process leading to diagnosis. Diagnosis and diagnosis communication are presented as intertwined processes that should be jointly addressed in medical consultations, to empower patients’ compliance in illness management. The paper presents defeasible reasoning as specific to diagnostic praxis, showing how this type of reasoning threatens effective diagnosis communication and entails that we should understand diagnostic evidence as defeasible as well. It argues that metaphors might be effective communicative devices for letting patients understand the relevant defeaters in the diagnostic reasoning process, helping to improve effective diagnosis communication and encouraging a change in patients’ beliefs and attitudes toward their own experience of illness and its management.
Massimo Dell’Utri’s recent volume 'Putnam' reconstructs the long theoretical and argumentative path of Hilary Putnam, which led him to explore the implications of Quine’s theses on analyticity and the various forms of realism in epistemology, the philosophy of science, and the philosophy of mathematics, their implications for the philosophy of mind, and, in more recent times, for ethical, meta-ethical, and value-theoretic questions. This critical review retraces the reading proposed in the book and then focuses on some implications that this reconstruction brings into focus regarding a delicate passage in the evolution of Putnam’s thought: the turn marked by the abandonment of internal realism and the arrival at natural realism.
This chapter explores some key themes of Huw Price's global expressivist program and his appropriation of inferentialist views. Some remarks concerning certain internal tensions within that program follow.
Anti-representationalism is the hallmark of Richard Rorty’s critique of the epistemological tradition. According to it, knowledge does not “mirror” reality and the human mind is not a representational device. Anti-representationalism is a family of philosophical theses, each dealing with the notion of “representation” in a different way. Though prima facie one may feel entitled to think of anti-representationalism as a uniform philosophical movement, things stand quite differently. In fact, among the many anti-representationalist options, we can identify two main versions: a global anti-representationalism that entirely rejects the philosophical uses of the notion of “representation”, and a local version that merely removes the notion of “representation” from the explanatory toolbox. In this chapter I compare Rorty’s global anti-representationalism and Robert Brandom’s local version, exploiting a recent discussion by Brandom and a famous exchange between Rorty and Bjørn Ramberg about Donald Davidson’s take on the special role of the intentional vocabulary.
Robert Brandom has developed an account of conceptual content as instituted by social practices, where such practices are understood as implicitly normative. Brandom proposed the idea of implicit norms in order to meet some requirements imposed by Wittgenstein's remarks on rule-following: escaping the regress of rules on the one hand, and avoiding mere regular behavior on the other. Anandi Hattiangadi has criticized this account as failing to meet such requirements. In what follows, I try to show how a correct understanding of sanctions and an expressivist reading of the issue can meet these challenges.
Among the many features that go hand in hand with the recent onset of populism in many countries, an interesting phenomenon is surely the shift of public discourse in the direction of social media. Is there anything special about communication on social media that is particularly suitable for the development of such movements and ideas? In what follows, I attempt to read Facebook comments as exhibiting an anaphoric structure. This analysis allows me to emphasize a number of interesting features that such communications exhibit. Finally, I also try to highlight some of the main implications of this model in comparison with ordinary communication.
Wittgenstein’s Investigations proposed an egalitarian view of language games, emphasizing their plurality (“language has no downtown”). Uses of words depend on the game one is playing, and may change when playing another. Furthermore, there is no privileged game dictating the rules for the others: there are as many games as there are purposes. This view is pluralist and egalitarian, but it says little about the connection between meaning and use, and about how a set of rules is responsible for them in practice. Brandom’s Making It Explicit attempted a straightforward answer to these questions by developing Wittgensteinian insights: the primacy of social practice over meanings; the idea that meaning is use; the idea of rule-following as a way to understand participation in social practices. Nonetheless, Brandom defended a non-Wittgensteinian conception of discursive practice: language has a “downtown”, the game of “giving and asking for reasons”. This is the idea of a normative structure of language, consisting of advancing claims and drawing inferences. By means of assertions, speakers undertake “commitments” that can be challenged or defended in terms of reasons (those successfully justified can gain “entitlement”). This game is not one among many: it is indispensable to the very idea of discursive practice. In this paper, my aim is to explore the main motivations and implications of both perspectives.
Anaphoric deflationism is a prosententialist account of the use of “true.” Prosentences are, for sentences, the equivalent of what pronouns are for nouns: as pronouns refer to previously introduced nouns, so prosentences like “that’s true” inherit their content from previously introduced sentences. This kind of deflationism concerning the use of “true” (especially in Brandom’s version) is an explanation in terms of anaphora: the prosentence depends anaphorically on the sentence providing its content. A relevant implication of this theory is that “true” is not understood as a predicate and that truth is not a property. Primitivism, defended by Frege, Moore, and Davidson, is associated with two ideas: (1) that truth is a primitive and central trait of our conceptual system and (2) that truth, as such, cannot be defined. This second claim can be called “negative primitivism,” and it points especially to the “indefinability” of truth generally advocated by primitivists. In what follows, a connection is established between the deflationist’s rejection of the predicate and of the property, and the primitivist ideas about the indefinability of truth. This connection establishes a common framework that lends further explanatory power to both options. According to the resulting view, this indefinability can explain the appeal and soundness of a deflationist dismissal of predicates and properties dealing with truth. https://doi.org/10.1007/s12136-018-0363-6
Anaphoric deflationism is a kind of prosententialist account of the use of “true.” It holds that “true” is an expressive operator and not a predicate. In particular, “is true” is explained as a “prosentence.” Prosentences are, for sentences, the equivalent of what pronouns are for nouns: as pronouns refer to previously introduced nouns, so prosentences like “that’s true” inherit their semantic content from previously introduced sentences. So, if Jim says, “The candidate is going to win the election,” and Bill replies, “that’s true,” the real meaning of Bill’s statement is “It is true that the candidate is going to win the election.” This kind of prosententialist deflationism concerning the use of “true,” especially in Robert Brandom’s version, is an explanation given in terms of anaphora: the prosentence is an anaphoric dependent of the sentence providing its content. Båve (Philosophical Studies, 145, 297-310, 2009) argued that the anaphoric account is not as general as prosententialists claim, and that the analogy between prosentences and pronouns is explanatorily idle because it does not do any real explanatory work. The two criticisms are connected: the alleged lack of unity within the anaphoric theory can be used to show its poor explanatory value, since the plurality of uses of “is true” would exceed the anaphoric account. On this view, prosententialism is just a superficial re-description, and the real work is done by more general semantic notions, namely semantic equivalence and consequence between “p” and “‘p’ is true.” I analyze Båve’s arguments and highlight that he fails to acknowledge the importance of a pragmatic and expressive dimension explained by the anaphoric account, a dimension that semantic “equivalence” and “consequence” are not capable of explaining.
I then show that the anaphoric account can actually explain semantic equivalence and consequence, and this is crucial because equivalence and consequence do not explain anaphoric dependence. This reverses the allegation of generality: The anaphoric account is more general. Again, the cases typically used to defend prosententialism, if correctly described, show a unitary structure: They are all versions of lazy anaphoric dependence. Therefore, the unifying principle performing the explanation here is lazy anaphora.
Wilfrid Sellars' denunciation of the Myth of the Given was meant to clarify, against empiricism, ... more Wilfrid Sellars' denunciation of the Myth of the Given was meant to clarify, against empiricism, that perceptual episodes alone are insufficient to ground and justify perceptual knowledge. Sellars showed that in order to accomplish such epistemic tasks, more resources and capacities, such as those involved in using concepts, are needed. Perceptual knowledge belongs to the space of reasons and not to an independent realm of experience. Dan Hutto and Eric Myin have recently presented the Hard Problem of Content as an ensemble of reasons against naturalistic accounts of content. In a nutshell, it states that covariance relations-even though they are naturalistically acceptable explanatory resources-do not constitute content. The authors exploit this move in order to promote their preferred radical enactivist and anti-representationalist option, according to which, basic minds-the lower stratum of cognition-do not involve content. Although it is controversial to argue that the Hard Problem of Content effectively dismisses naturalistic theories of representation, a central aspect of it-the idea that information as covariance does not suffice to explain content-finds support among the defenders of classical cognitive representationalism, such as Marcin Miłkowski. This support-together with the acknowledgment this remark about covariance is a point already made by Sellars in his criticism of the Myth of the Given-has a number of interesting implications. 
Not only is it of interest for the debates about representationalism in cognitive science, where it can be understood as an anticipatory move, but it also offers some clues and insights for reconsidering some issues along Sellarsian lines-a conflation between two concepts of representation that is often assumed in cognitive science, a distinction between two types of relevant normativities, and a reconsideration of the naturalism involved in such explanations.
Stephen Turner’s anti-normativism is based on the idea that the normative can be explained away b... more Stephen Turner’s anti-normativism is based on the idea that the normative can be explained away by social science. Exploiting the idea fostered by the sociology of scientific knowledge that reasons can be understood naturalistically as the causes of the beliefs of scientists and endorsing a non-normative conception of rationality, Turner has argued that normative accounts are better understood as “Good Bad Theories” (GBT). GBT are understood as false accounts that play a role in social coordination like magical or religious rituals in primitive societies (e.g. Tabu and the like). According to Turner, “norms,” “obligations,” “reasons,” and “commitments” are like Tabu and can be explained away as GBT. Hence, Turner expected normative accounts to disappear completely in a fully disenchanted world. Turner focuses on the idea, widespread among philosophers, that the normative does not reduce to the causal: his main claim is that social science succeeds in the reduction of the normative in causal terms, overcoming normative/causal dualism. Furthermore, this success is presented as creating a serious challenge for normativism. By focusing on certain (supposedly normative) features of beliefs like those involved in belief change dynamics, I will point out some interesting implications and problems for Turner’s anti-normativism.
Discursive pluralism, recently fostered by anti-representationalist views, by stating that not al... more Discursive pluralism, recently fostered by anti-representationalist views, by stating that not all assertions conform to a descriptive model of language, poses an interesting challenge to representationalism. Although in recent years alethic pluralism has become more and more popular as an interesting way out for this issue, the discussion also hosts other interesting minority approaches in the anti- representationalist camp. In particular, the late stage of contemporary expressivism offers a few relevant insights, going from Price's denunciation of “placement problems” to Brandom's inferentialism. This paper attempts to show how these expressivist ideas combine well together, composing a unitary and metaphysically sober metaphilosophical framework.
Stephen Turner claims that social science can explain away normativity. By exploiting a nonnormat... more Stephen Turner claims that social science can explain away normativity. By exploiting a nonnormative view of rationality and a causal view of belief, he claimed that normativist views are akin to what he calls Good Bad Theories (GBT). GBT are false accounts that play a role of social coordination like primitive rituals (Taboo and the like). Hence, "norms", "commitments", and "obligations" are just like Taboo and can be explained away as GBT. Normativism, as a consequence, is doomed to disappear in a disenchanted world. Turner focuses on the normativist idea that the normative does not reduce to the causal: he claims that social science succeeds in the reduction. This claim is presented as a major challenge to philosophical normativism. In what follows, I try to discuss some aspects of Turner's challenge by focusing on certain features of belief and belief-change that prima facie promote a normativist view: this is the basis to focus on some problems concerning the scope of Turner's argument.
The paper focuses on the kind of expertise required by doctors in health
communication and argues... more The paper focuses on the kind of expertise required by doctors in health communication and argues that such an expertise is twofold: both epistemological and communicative competences are necessary to achieve compliance with the patient. Firstly, we introduce the specific epistemic competences that deal with diagnosis and its problems. Secondly, we focus on the communicative competences and argue that an inappropriate strategy in communicating the reasons of diagnosis and therapy can make patient compliance unworkable. Finally, we focus on the case of diabetes metaphor and propose the deliberate use of metaphors in health communication as an educational tool. On the one hand, metaphors might help doctors in explaining the disease in simpler terms and framing the experience of illness according to patient’s specific needs. On the other hand, metaphors might encourage a change in patient’s beliefs on their own experience of illness, and enable them to reach a shared decision making with doctors.
Introduction: The paper investigates the impact of the use of metaphors in reasoning tasks concer... more Introduction: The paper investigates the impact of the use of metaphors in reasoning tasks concerning vaccination, especially for defeasible reasoning cases. We assumed that both metaphor and defeasible reasoning can be relevant to let people understand vaccination as an important collective health phenomenon, by anticipating possible defeating conditions.
Methods: We hypothesized that extended metaphor could improve both the argumentative and the communicative effects of the message. We designed an empirical study to test our main hypotheses: participants (N = 196, 78% females; Mean age = 27.97 years, SD age = 10.40) were presented with a text about vaccination, described in either literal or metaphorical terms, based on uncertain vs. safe reasoning scenarios.
Results: The results of the study confirmed that defeasible reasoning is relevant for the communicative impact of a text and that an extended metaphor enhances the overall communicative effects of the message, in terms of understandability, persuasion, perceived safety, and feeling of control over the health situation, collective trust in expertise and uptake of experts' advice. However, the results show that this effect is significantly nuanced by the type of defeasible reasoning, especially in the case of participants' trust in expertise and commitment to experts' advice.
Conclusion: Both communicative and defeasible reasoning competences are needed to enhance trust in immunization, with possible different outcomes at an individual and collective level.
It is a common opinion that chance events cannot be understood in causal terms. Conversely, accor... more It is a common opinion that chance events cannot be understood in causal terms. Conversely, according to a causal view of chance, intersections between independent causal chains originate accidental events, called ''coincidences.'' The present paper takes into proper consideration this causal conception of chance and tries to shed new light on it. More precisely, starting from Hart and Honoré's view of coincidental events (Hart and Honoré in Causation in the Law. Clarendon Press, Oxford, 1959), this paper furnishes a more detailed account on the nature of coincidences, according to which coincidental events are hybrids constituted by ontic (physical) components, that is the intersections between independent causal chains, plus epistemic aspects; where by ''epistemic'' we mean what is related, in some sense, to knowledge: for example, access to information, but also expectations , relevance, significance, that is psychological aspects. In particular, this paper investigates the role of the epistemic aspects in our understanding of what coincidences are. In fact, although the independence between the causal lines involved plays a crucial role in understanding coincidental events, that condition results to be insufficient to give a satisfactory definition of coincidences. The main target of the present work is to show that the epistemic aspects of coincidences are, together with the independence between the intersecting causal chains, a constitutive part of coincidental phenomena. Many examples are offered throughout this paper to enforce this idea. This conception, despite-for example-Antoine Augustine Cournot and Jacques Monod's view, entails that a pure objectivist view about coincidences is not tenable.
European Journal of Pragmatism and American Philosophy, 2021
Hilary Putnam spent much of his career criticizing the fact/value dichotomy, and this was already apparent during the phase when he defended internal realism. He later changed his epistemological and metaphysical view by endorsing natural realism, with the consequence of embracing alethic pluralism, the idea that truth works differently in different discourse domains. Despite these changes of mind in epistemology and in the theory of truth, Putnam continued to criticize the fact/value dichotomy. However, alethic pluralism entails drawing distinctions among discourse domains, especially between factual and nonfactual domains, and these distinctions are in tension with the rejection of the fact/value dichotomy, which would in principle prevent singling out factual domains as genuine. This issue raises, prima facie, some doubts about the effective compatibility of these views.
The paper investigates the epistemological and communicative competences that experts need in order to use and communicate evidence in the reasoning process leading to diagnosis. Diagnosis and diagnosis communication are presented as intertwined processes that should be jointly addressed in medical consultations, so as to improve patients' compliance in illness management. The paper presents defeasible reasoning as specific to diagnostic praxis, showing how this type of reasoning threatens effective diagnosis communication and entails that diagnostic evidence should be understood as defeasible as well. It argues that metaphors can be effective communicative devices for letting patients understand the relevant defeaters in the diagnostic reasoning process, helping to improve diagnosis communication and also encouraging a change in patients' beliefs and attitudes about their own experience of illness and its management.
Massimo Dell'Utri's recent volume 'Putnam' reconstructs Hilary Putnam's long theoretical and argumentative itinerary, which led him to explore the implications of Quine's theses on analyticity and the various forms of realism in epistemology, philosophy of science, and philosophy of mathematics, their implications for the philosophy of mind, and, more recently, for ethical and meta-ethical questions and the theory of value. This critical review retraces the reading proposed in the book, and then focuses on some implications that this reconstruction brings into focus concerning a delicate passage in the evolution of Putnam's thought: the turn marked by his abandonment of internal realism and his arrival at natural realism.
This chapter explores some key themes of Huw Price's global expressivist program and his appropriation of inferentialist views. Some remarks concerning certain internal tensions within that program follow.
Anti-representationalism is the hallmark of Richard Rorty's critique of the epistemological tradition. According to it, knowledge does not "mirror" reality and the human mind is not a representational device. Anti-representationalism is a family of philosophical theses, each dealing with the notion of "representation" in a different way. Though prima facie one may feel entitled to regard anti-representationalism as a uniform philosophical movement, things stand quite differently. In fact, among the many anti-representationalist options, we can identify two main versions: a global anti-representationalism that entirely rejects the philosophical uses of the notion of "representation", and a local version that merely removes the notion of "representation" from the explanatory toolbox. In this chapter I compare Rorty's global anti-representationalism and Robert Brandom's local version, exploiting a recent discussion by Brandom and a famous exchange between Rorty and Bjørn Ramberg about Donald Davidson's take on the special role of the intentional vocabulary.
Robert Brandom has developed an account of conceptual content as instituted by social practices, where such practices are understood as implicitly normative. Brandom proposed the idea of implicit norms in order to meet some requirements imposed by Wittgenstein's remarks on rule-following: escaping the regress of rules on the one hand, and avoiding mere regular behavior on the other. Anandi Hattiangadi has criticized this account as failing to meet such requirements. In what follows, I try to show how a correct understanding of sanctions and an expressivist reading of the issue can meet these challenges.
Among the many features that go hand in hand with the recent rise of populism in many countries, an interesting phenomenon is surely the shift of public discourse toward social media. Is there anything special about communication on social media that is particularly suitable for the development of such movements and ideas? In what follows, I attempt to read Facebook comments as exhibiting an anaphoric structure. This analysis allows me to emphasize a number of interesting features that such communications exhibit. Finally, I also try to highlight some of the main implications of this model in comparison with ordinary communication.
Wittgenstein’s Investigations proposed an egalitarian view of language games, emphasizing their plurality (“language has no downtown”). Uses of words depend on the game one is playing, and may change when one plays another. Furthermore, there is no privileged game dictating the rules for the others: there are as many games as purposes. This view is pluralist and egalitarian, but it says little about the connection between meaning and use, and about how a set of rules is responsible for them in practice. Brandom’s Making It Explicit attempted a straightforward answer to these questions by developing Wittgensteinian insights: the primacy of social practice over meanings; the idea that meaning is use; the idea of rule-following as a way to understand participation in social practices. Nonetheless, Brandom defended a non-Wittgensteinian conception of discursive practice: language has a “downtown”, the game of “giving and asking for reasons”. This is the idea of a normative structure of language, consisting of advancing claims and drawing inferences. By means of assertions, speakers undertake “commitments” that can be challenged or defended in terms of reasons (those successfully justified gain “entitlement”). This game is not one among many: it is indispensable to the very idea of discursive practice. In this paper, my aim is to explore the main motivations and implications of both perspectives.
Anaphoric deflationism is a prosententialist account of the use of “true.” Prosentences are, for sentences, the equivalent of what pronouns are for nouns: as pronouns refer back to previously introduced nouns, so prosentences like “that’s true” inherit their content from previously introduced sentences. This kind of deflationism about the use of “true” (especially in Brandom’s version) is an explanation in terms of anaphora: the prosentence depends anaphorically on the sentence providing its content. A relevant implication of this theory is that “true” is not understood as a predicate and that truth is not a property. Primitivism, defended by Frege, Moore, and Davidson, is associated with two ideas: (1) that truth is a primitive and central trait of our conceptual system, and (2) that truth, as such, cannot be defined. This second claim can be called “negative primitivism,” as it especially points to the facts about the “indefinability” of truth generally advocated by primitivists. In what follows, a connection is established between the deflationist’s rejection of the predicate and the property, on the one hand, and the facts (and primitivist ideas) about the indefinability of truth, on the other. This connection establishes a common framework that lends further explanatory power to both options. According to the resulting view, this indefinability can explain the appeal and soundness of a deflationist dismissal of predicates and properties dealing with truth. https://doi.org/10.1007/s12136-018-0363-6
Anaphoric deflationism is a kind of prosententialist account of the use of “true.” It holds that “true” is an expressive operator and not a predicate. In particular, “is true” is explained as a “prosentence.” Prosentences are, for sentences, the equivalent of what pronouns are for nouns: as pronouns refer back to previously introduced nouns, so prosentences like “that’s true” inherit their semantic content from previously introduced sentences. So, if Jim says, “The candidate is going to win the election,” and Bill replies, “That’s true,” the real meaning of Bill’s statement is “It is true that the candidate is going to win the election.” This kind of prosententialist deflationism about the use of “true,” especially in Robert Brandom’s version, is an explanation given in terms of anaphora: the prosentence is an anaphoric dependent of the sentence providing its content. Båve (Philosophical Studies 145: 297–310, 2009) argued that the anaphoric account is not as general as prosententialists claim, and that the analogy between prosentences and pronouns is explanatorily idle because it does not do any real explanatory work. The two criticisms are connected: the alleged lack of unity within the anaphoric theory can be used to show its poor explanatory value, since the plurality of uses of “is true” exceeds the anaphoric account. Therefore, on this view, prosententialism is just a superficial re-description, and the real work is done by more general semantic terms, namely “semantic equivalence and consequence” between “p” and ““p” is true.” I analyze Båve’s arguments and highlight that he fails to acknowledge the importance of a pragmatic and expressive dimension explained by the anaphoric account, a dimension that semantic “equivalence” and “consequence” are not capable of explaining.
I then show that the anaphoric account can actually explain semantic equivalence and consequence, and this is crucial because equivalence and consequence do not explain anaphoric dependence. This reverses the allegation of generality: the anaphoric account is the more general one. Moreover, the cases typically used to defend prosententialism, if correctly described, show a unitary structure: they are all versions of lazy anaphoric dependence. Therefore, the unifying principle performing the explanation here is lazy anaphora.
It is a common opinion that chance events cannot be understood in causal terms.
Conversely, according to a causal view of chance, intersections between independent causal chains originate accidental events, called "coincidences".
Despite its importance, this notion of chance is quite neglected in the contemporary literature, and it seems to elude a precise definition. The present study gives proper consideration to this causal conception of chance and tries to shed new light on it.
More precisely, this work investigates the role of the epistemic aspects in our understanding of what coincidences are. In fact, although the independence between the causal lines involved plays an important part in understanding coincidental events, that condition does not seem sufficient for a complete definition of a coincidence. The main aim of the present work is to show that the epistemic aspects of coincidences, such as epistemic access, expectations, relevance, and so on, are, together with the independence between the intersecting causal chains, a constitutive part of coincidental phenomena. Many examples are discussed throughout this work to highlight the role of epistemic aspects in our understanding of the nature of coincidental events.
A famous philosophical image of persuasion is the one provided by Wittgenstein’s On Certainty (especially §§611-612). There, Wittgenstein depicted persuasion as something hardly reducible to reasons. In this paper I do not intend to deal with the hermeneutics of Wittgenstein’s texts and philosophy. What I intend to do, rather, is to develop certain philosophical consequences of this image. Reviewing these consequences leads to the acknowledgement of some problems. I then sketch the outline of a different idea of linguistic rational practice, Robert Brandom’s “giving and asking for reasons”, which will be useful for introducing more viable conceptions of persuasion (compatible with our reason-giving practices).
According to a causal view of chance, the intersections between totally independent causal chains originate accidental events, called “absolute-coincidences”. The present study focuses on the epistemological aspects of this causal view. More precisely, these are the main problems we investigate:
1. How important is the degree of epistemic access in identifying absolute-coincidences? There could be events that are not absolute-coincidences but, since our knowledge is insufficient, we consider them absolutely-coincidental. Conversely, there could be events that are absolute-coincidences but, since our knowledge is insufficient, we do not consider them absolutely-coincidental. Here, our information about the independence of the causal lines involved seems to play a crucial role.
2. Are absolute-coincidences mind-dependent events? The representational background seems to be important in choosing the causal lines involved. Moreover, in identifying absolute-coincidences there seems to be something more at work than the independence between the causal lines we consider: something such as relevance, and so on.
This essay aims to show that: (a) the degree of epistemic access is crucial for grasping absolute-coincidences; (b) mental aspects are constitutive of absolute-coincidences.
Pietro Salis (Università di Cagliari) – Giustificazionismo e passato [Justificationism and the Past]
The reality of the past represents one of the main problems for the justificationist semantics proposed by Michael Dummett. The anti-realism typical of this perspective yields a rather counterintuitive conception of the past, according to which it "ceases to exist" when it leaves no traces or testimonies. In Truth and the Past, Dummett returned to the question, abandoning anti-realism about the past with the aim of avoiding this conception. This turn represents an unprecedented shift in the direction of realism, limited, however, by his firm refusal to adhere to a bivalent notion of truth. My talk aims to reconstruct and critically analyze the reasons for this turn of Dummett's, and to probe the solidity and coherence of this remodulation of justificationism.
Inferentialism, especially Brandom’s theory (Brandom, 1994), is the project of understanding meaning in terms of inferences, and language as a social practice governed by discursive norms. Discursive practice is thus understood as the basic rational practice, where the commitments undertaken by participants are evaluated as correct or incorrect. This model of explanation is also meant to vindicate, in terms of reasons, the commitments we ourselves undertake and to assess the commitments we attribute to others, in an objective sense: starting from our subjective normative and doxastic attitudes, we should be able to use the normative discursive resources apt to assess our commitments, not only with reference to what we take to be correct, but also with reference to how things actually are.
My main hypothesis is that this objectivity is not achieved on the basis of the rational structure of discursive practice alone. The main doubt concerns the fact that material inferences, those responsible for the content of our concepts (and commitments), are in general non-monotonic. These inferences put experts in an advantageous position, namely as those capable of defeasible reasoning. I believe that this asymmetry among language users is the crucial factor in assessing the objectivity of claims within discursive practice.
It is a common opinion that chance events cannot be understood in causal terms. Conversely, according to a causal view of chance, intersections between independent causal chains originate accidental events, called “coincidences”. First, this book explores this causal conception of chance, defended by authors like Antoine Augustin Cournot and Jacques Monod, and tries to shed new light on it. Second, it considers a relevant alternative, provided by those accounts that, instead of acknowledging an intersection among causal lines, claim to trace coincidences back to some common cause. Third, starting from Herbert Hart and Anthony Honoré’s view of coincidences (Causation in the Law, Clarendon Press, Oxford, 1959), the book provides a more detailed account of coincidences, according to which coincidental events are hybrids constituted by ontic (physical) components, that is, the intersections between independent causal chains, plus epistemic aspects, including, but not limited to, access to information, expectations, relevance, significance, and desires, which are in turn psychological aspects.
The main aim of the present work is to show that the epistemic aspects of coincidences are, together with the independence between the intersecting causal chains, a constitutive part of coincidental phenomena. The book also introduces and discusses recent work in psychology concerning judgments about coincidences; these data offer further materials and reasons for reflecting on our understanding of coincidences and for refining this hybrid conception.
What does it mean to "use concepts"? What relation holds between the use of a conceptual system and the use of a natural language? Do the social practices in which human beings are involved influence the meanings of their linguistic expressions? What connects reasoning with the use of concepts? These are some of the central questions in the work of the American philosopher Robert Brandom. On the basis of such questions, and through an articulate engagement with authors such as Kant, Hegel, Frege, Wittgenstein, Sellars, and Dummett, Brandom has elaborated a complex theory of language and discursive practice. This volume is devoted, first, to presenting the guidelines and main argumentative strategies of Brandom's theory; second, to discussing some key issues for the inferentialist model, selected with particular attention to the main discussions raised by the theoretical debate in recent decades.
Papers by Pietro Salis
communication and argues that such an expertise is twofold: both epistemological and communicative competences are necessary to achieve patient compliance. First, we introduce the specific epistemic competences that deal with diagnosis and its problems. Second, we focus on the communicative competences and argue that an inappropriate strategy in communicating the reasons for diagnosis and therapy can make patient compliance unworkable. Finally, we focus on the case of the diabetes metaphor and propose the deliberate use of metaphors in health communication as an educational tool. On the one hand, metaphors might help doctors explain the disease in simpler terms and frame the experience of illness according to the patient’s specific needs. On the other hand, metaphors might encourage a change in patients’ beliefs about their own experience of illness, and enable them to reach shared decision-making with their doctors.
Methods: We hypothesized that extended metaphor could improve both the argumentative and the communicative effects of the message. We designed an empirical study to test our main hypotheses: participants (N = 196, 78% females; Mean age = 27.97 years, SD age = 10.40) were presented with a text about vaccination, described in either literal or metaphorical terms, based on uncertain vs. safe reasoning scenarios.
Results: The results of the study confirmed that defeasible reasoning is relevant to the communicative impact of a text and that an extended metaphor enhances the overall communicative effects of the message, in terms of understandability, persuasion, perceived safety, feeling of control over the health situation, collective trust in expertise, and uptake of experts' advice. However, the results show that this effect is significantly nuanced by the type of defeasible reasoning, especially in the case of participants' trust in expertise and commitment to experts' advice.
Conclusion: Both communicative and defeasible reasoning competences are needed to enhance trust in immunization, with possible different outcomes at an individual and collective level.