We have a conundrum. The physical basis of information is clearly a highly active research area. Yet the power of information theory comes precisely from separating it from the detailed problems of building physical systems to perform information processing tasks. Developments in quantum information over the last two decades seem to have undermined this separation, leading to suggestions that information is itself a physical entity and must be part of our physical theories, with resource-cost implications. We will consider a variety of ways in which physics seems to affect computation, but will ultimately argue to the contrary: rejecting the claims that information is physical provides a better basis for understanding the fertile relationship between information theory and physics. Instead, we will argue that the physical resource costs of information processing are to be understood through the need to consider physically embodied agents for whom information processing tasks are performed. Doing so sheds light on what it takes for something to be implementing a computational or information processing task of a given kind.
The celebrated Dreimännerarbeit by Born, Heisenberg and Jordan contains a matrix-mechanical derivation by Jordan of Planck's formula for blackbody fluctuations. Jordan appears to have considered this to be one of his finest contributions to quantum theory, but the status of his derivation is puzzling. In our Dreimenschenarbeit, we show how to understand what Jordan was doing in the double context of a Boltzmannian approach to statistical mechanics and of the early 'statistical interpretation' of matrix mechanics.
The notion of a physical collapse of the wave function is embodied in dynamical collapse models. These involve a modification of the unitary evolution of the wave function so as to give a dynamical account of collapse. The resulting dynamics is at first sight time asymmetric, for the simple reason that the wave function depends on collapse events in the past but not on those in the future. Here we show that dynamical wave function collapse models admit a general description that has no inbuilt direction of time. Given some simple constraints, we show that there exist empirically equivalent pictures of collapsing wave functions in both time directions, each satisfying the same dynamical rules. A preferred direction is singled out only by the asymmetric initial and final time constraints on the state of the Universe.
The amount of heat generated by computers is rapidly becoming one of the main problems for developing new generations of information technology. The thermodynamics of computation sets the ultimate physical bounds on heat generation. A lower bound is set by the Landauer Limit, at which computation becomes thermodynamically reversible. For classical computation there is no physical principle which prevents this limit being reached, and approaches to it are already being experimentally tested. In this paper we show that for quantum computation there is an unavoidable excess heat generation that renders it inherently thermodynamically irreversible. The Landauer Limit cannot, in general, be reached by quantum computers. We show the existence of a lower bound to the heat generated by quantum computing that exceeds that given by the Landauer Limit, give the special conditions where this excess cost may be avoided, and show how classical computing falls within these special conditions.
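For orientation, the Landauer Limit invoked here is the standard textbook bound (restated from the general literature, not from the paper itself): erasing one bit of information in an environment at temperature T must dissipate heat of at least

Q ≥ k_B T ln 2 ≈ 3 × 10⁻²¹ J at T = 300 K,

where k_B is Boltzmann's constant. The paper's claim is then that quantum computation generically incurs a heat cost strictly above this classical floor.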
Macro-realism is the position that certain "macroscopic" observables must always possess definite values: e.g. the table is in some definite position, even if we don't know what that is precisely. The traditional understanding is that by assuming macro-realism one can derive the Leggett-Garg inequalities, which constrain the possible statistics from certain experiments. Since quantum experiments can violate the Leggett-Garg inequalities, this is taken to rule out the possibility of macro-realism in a quantum universe. However, recent analyses have exposed loopholes in the Leggett-Garg argument, which allow many types of macro-realism to be compatible with quantum theory and hence with violation of the Leggett-Garg inequalities. This paper takes a different approach to ruling out macro-realism, and the result is a no-go theorem for macro-realism in quantum theory that is stronger than the Leggett-Garg argument. This approach uses the framework of ontological models: an elegant way to reason about foundational issues in quantum theory which has successfully produced many other recent results, such as the PBR theorem.
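For reference, the simplest Leggett-Garg inequality has the standard three-time form (quoted from the general literature, not from this paper): for a dichotomic observable Q(t) = ±1 measured at times t₁ < t₂ < t₃, with two-time correlators C_ij = ⟨Q(t_i)Q(t_j)⟩,

K = C₁₂ + C₂₃ − C₁₃ ≤ 1.

Macro-realism plus non-invasive measurability implies the bound, while quantum mechanics on a two-level system can reach K = 3/2.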
Macroscopic realism is the thesis that macroscopically observable properties must always have definite values. The idea was introduced by Leggett and Garg (1985), who wished to show a conflict with the predictions of quantum theory. However, their analysis required not just the assumption of macroscopic realism per se, but also that the observable properties could be measured non-invasively. In recent years there has been increasing interest in experimental tests of the violation of the Leggett-Garg inequality, but it has remained a matter of controversy whether this second assumption is a reasonable requirement for a macroscopic realist view of quantum theory. In a recent critical assessment Maroney and Timpson (2017) identified three different categories of macroscopic realism, and argued that only the simplest category could be ruled out by Leggett-Garg inequality violations. Allen, Maroney, and Gogioso (2016) then showed that the second of these approaches was also incompatible with quantum theory in Hilbert spaces of dimension 4 or higher. However, we show that the distinction introduced by Maroney and Timpson between the second and third approaches is not noise tolerant, so unfortunately Allen's result, as given, is not directly empirically testable. In this paper we replace Maroney and Timpson's three categories with a parameterization of macroscopic realist models, which can be related to experimental observations in a noise tolerant way, and recover the original definitions in the noise-free limit. We show how this parameterization can be used to experimentally rule out classes of macroscopic realism in Hilbert spaces of dimension 3 or higher, including the category tested by the Leggett-Garg inequality, without any use of the non-invasive measurability assumption.
Collapse models are modifications of quantum theory where the wave function is treated as physically real and the collapse of the wave function is a physical process. This appears to introduce a time reversal asymmetry into the dynamics of the wave function since the collapses affect only the future state. This paper challenges this conclusion, showing that in three different examples of time asymmetries associated with collapse models, if the physically real part of the model can be reduced to the locations in space and time about which collapses occur, then such a model works both forward and backward in time, in each case satisfying the Born rule. Despite the apparent asymmetry of the collapse process, these models in fact have time reversal symmetry. Any physically observed time asymmetries that arise in such models are due to the asymmetric imposition of initial or final time boundary conditions, rather than from an inherent asymmetry in the dynamical law. This is the standard explanation of time asymmetric behaviour resulting from time symmetric laws.
According to a recent no-go theorem (M. Pusey, J. Barrett and T. Rudolph, Nature Physics 8, 475 (2012)), models in which quantum states correspond to probability distributions over the values of some underlying physical variables must have the following feature: the distributions corresponding to distinct quantum states do not overlap. This is significant because if the distributions do not overlap, then the quantum state itself is encoded by the physical variables. In such a model, it cannot coherently be maintained that the quantum state merely encodes information about underlying physical variables. The theorem, however, considers only models in which the physical variables corresponding to independently prepared systems are independent. This work considers models that are defined for a single quantum system of dimension d, such that the independence condition does not arise. We prove a result in a similar spirit to the original no-go theorem, in the form of an upper bound on the extent to which the probability distributions can overlap, consistently with reproducing quantum predictions. In particular, models in which the quantum overlap between pure states is equal to the classical overlap between the corresponding probability distributions cannot reproduce the quantum predictions in any dimension d ≥ 3. The result is noise tolerant, and an experiment is motivated to distinguish the class of models ruled out from quantum theory.
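For orientation, the overlaps being compared take the following forms; the classical definition is standard, while the quantum overlap stated here is one common convention in this literature and is an assumption of this sketch rather than a quotation from the paper. For pure states ψ₁, ψ₂ represented by distributions μ₁, μ₂ over physical variables λ,

classical overlap: ω_C = ∫ min{μ₁(λ), μ₂(λ)} dλ,
quantum overlap: ω_Q = 1 − √(1 − |⟨ψ₁|ψ₂⟩|²).

Reproducing quantum statistics forces ω_C ≤ ω_Q; 'maximally epistemic' models are those with equality, and the result above bounds ω_C strictly below ω_Q for d ≥ 3.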
Can a density matrix be regarded as a description of the physically real properties of an individual system? If so, it may be possible to attribute the same objective significance to statistical mechanical properties, such as entropy or temperature, as to properties such as mass or energy. Non-linear modifications to unitary evolution can be proposed, based upon this idea, to account for thermodynamic irreversibility.
We examine the relationship between quantum contextuality (in both the standard Kochen-Specker sense and in the generalised sense proposed by Spekkens) and models of quantum theory in which the quantum state is maximally epistemic. We find that preparation noncontextual models must be maximally epistemic, and these in turn must be Kochen-Specker noncontextual. This implies that the Kochen-Specker theorem is sufficient to establish both the impossibility of maximally epistemic models and the impossibility of preparation noncontextual models. The implication from preparation noncontextual to maximally epistemic then also yields a proof of Bell's theorem from an EPR-like argument.
One of the recent no-go theorems on ψ-epistemic interpretations of quantum theory proves that there are no 'maximally epistemic' interpretations of quantum theory. The proof utilises similar arrangements to Clifton's quantum contextuality proof, and has parallels to Harrigan and Rudolph's quantum deficiency no-go theorem, itself based on the Kochen-Specker quantum contextuality proof. This paper shows how the Kochen-Specker theorem can also be turned into a no 'maximally epistemic' theorem, but of a more limited kind.
A novel no-go theorem is presented which sets a bound upon the extent to which 'ψ-epistemic' interpretations of quantum theory are able to explain the overlap between non-orthogonal quantum states in terms of an experimenter's ignorance of an underlying state of reality. The theorem applies to any Hilbert space of dimension greater than two. In the limit of large Hilbert spaces, no more than half of the overlap between quantum states can be accounted for. Unlike other recent no-go theorems, no additional assumptions, such as forms of locality, invasiveness, or non-contextuality, are required. The result continues to hold in the presence of a small but finite amount of noise, and is open to experimental verification in a sufficiently precise experimental arrangement.
Studies in History and Philosophy of Modern Physics, 2017
A quantum pre- and post-selection paradox involves making measurements at two separate times on a quantum system, and making inferences about the state of the system at an intermediate time, conditional upon the observed outcomes. The inferences lead to predictions about the results of measurements performed at the intermediate time, which have been well confirmed experimentally, but which nevertheless seem paradoxical when inferences about different intermediate measurements are combined. The three box paradox is the paradigm example of such an effect, where a ball is placed in one of three boxes and is shuffled between the boxes in between two measurements of its location. By conditionalising on the outcomes of those measurements, it is inferred that between the two measurements the ball would have been found with certainty in Box 1 and with certainty in Box 2, if either box had been opened on its own. Despite experimental confirmation of the predictions, and much discussion, it has remained unclear what exactly is supposed to be paradoxical, or what specifically is supposed to be quantum, about these effects. In this paper I identify precisely the conditions under which the quantum three box paradox occurs, and show that these conditions are the same as arise in the derivation of the Leggett–Garg Inequality, which is supposed to demonstrate the incompatibility of quantum theory with macroscopic realism. I will argue that, as in Leggett–Garg Inequality violations, the source of the effect actually lies in the disturbance introduced by the intermediate measurement, and that the quantum nature of the effect is that no classical model of measurement disturbance can reproduce the paradox.
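To make the inference concrete, here is the standard three-box construction of Aharonov and Vaidman, sketched for orientation rather than quoted from the paper. Pre-select |ψ⟩ = (|1⟩ + |2⟩ + |3⟩)/√3 and post-select |φ⟩ = (|1⟩ + |2⟩ − |3⟩)/√3, where |k⟩ denotes the ball in box k. The Aharonov-Bergmann-Lebowitz rule then gives, for opening box 1 alone at the intermediate time,

P(1) = |⟨φ|P₁|ψ⟩|² / (|⟨φ|P₁|ψ⟩|² + |⟨φ|(1 − P₁)|ψ⟩|²) = (1/9) / (1/9 + 0) = 1,

since the amplitudes ⟨φ|P₂|ψ⟩ = 1/3 and ⟨φ|P₃|ψ⟩ = −1/3 cancel in the second term. By the symmetry of both states under swapping boxes 1 and 2, opening box 2 alone would likewise find the ball with certainty.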
"One of the most striking features of quantum mechanics is the profound effect exerted by measure... more "One of the most striking features of quantum mechanics is the profound effect exerted by measurements alone. Sophisticated quantum control is now available in several experimental systems, exposing discrepancies between quantum and classical mechanics whenever measurement induces disturbance of the interrogated system. In practice, such discrepancies may frequently be explained as the back-action required by quantum mechanics adding quantum noise to a classical signal. Here we implement the 'three-box' quantum game of Aharonov and Vaidman in which quantum measurements add no detectable noise to a classical signal, by utilising state-of-the-art control and measurement of the nitrogen vacancy centre in diamond.
Quantum and classical mechanics then make contradictory predictions for the same experimental procedure, however classical observers cannot invoke measurement-induced disturbance to explain this discrepancy. We quantify the residual disturbance of our measurements and obtain data that rule out any classical model by > 7.8 standard deviations, allowing us for the first time to exclude the property of macroscopic state-definiteness from our system. Our experiment is then equivalent to a Kochen-Spekker test of quantum non-contextuality that successfully addresses the measurement detectability loophole"
In two recent papers, Maroney and Turgut separately and independently show generalisations of Landauer's erasure principle to indeterministic logical operations, as well as to logical states with variable energies and entropies. Here we show that, although Turgut's generalisation seems more powerful, in that it implies but is not implied by Maroney's and does not rely upon initial probability distributions over logical states, it does not hold for non-equilibrium states, while Maroney's generalisation holds even in non-equilibrium. While a generalisation of Turgut's inequality to non-equilibrium seems possible, it lacks the properties that make the equilibrium inequality appealing. The non-equilibrium generalisation also no longer implies Maroney's inequality, which may still be derived independently. Furthermore, we show that Turgut's inequality can only give a necessary, but not sufficient, criterion for thermodynamic reversibility. Maroney's inequality gives the necessary and sufficient conditions.
Schulman (Entropy 7(4):221–233, 2005) has argued that Boltzmann's intuition, that the psychological arrow of time is necessarily aligned with the thermodynamic arrow, is correct. Schulman gives an explicit physical mechanism for this connection, based on the brain being representable as a computer, together with certain thermodynamic properties of computational processes. Hawking (Physical Origins of Time Asymmetry, Cambridge University Press, Cambridge, 1994) presents similar, if briefer, arguments. The purpose of this paper is to critically examine the support for the link between thermodynamics and an arrow of time for computers. The principal arguments put forward by Schulman and Hawking will be shown to fail. It will be shown that any computational process that can take place in an entropy-increasing universe can equally take place in an entropy-decreasing universe. This conclusion does not automatically imply that a psychological arrow can run counter to the thermodynamic arrow. Some alternative possible explanations for the alignment of the two arrows will be briefly discussed.
Are principles of information processing necessary to demonstrate the consistency of statistical mechanics? Does the physical implementation of a computational operation have a fundamental thermodynamic cost, purely by virtue of its logical properties? These two questions lie at the centre of a large body of literature concerned with the Szilard engine (a variant of the Maxwell's demon thought experiment), Landauer's principle (supposed to embody the fundamental principle of the thermodynamics of computation) and possible connections between the two. A variety of attempts to answer these questions have illustrated many open questions in the foundations of statistical mechanics.
The relationships among reversible Carnot cycles, the absence of perpetual motion machines, and the existence of a nondecreasing globally unique entropy function form the starting point of many textbook presentations of the foundations of thermodynamics. However, the thermal fluctuation phenomena associated with statistical mechanics have been argued to restrict the domain of validity of this basis of the second law of thermodynamics. Here we demonstrate that fluctuation phenomena can be incorporated into the traditional presentation, extending rather than restricting the domain of validity of the phenomenologically motivated second law. Consistency conditions lead to constraints upon the possible spectrum of thermal fluctuations. In a special case this uniquely selects the Gibbs canonical distribution, and more generally incorporates the Tsallis distributions. No particular model of microscopic dynamics need be assumed.
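For reference, the two distribution families named above take their standard forms (stated here for orientation, not drawn from the paper). The Gibbs canonical distribution over energy levels E_i is

p_i = e^(−βE_i) / Z, with Z = Σ_i e^(−βE_i) and β = 1/(k_B T),

while the Tsallis family replaces the exponential with a q-exponential,

p_i ∝ [1 − (1 − q) β E_i]^(1/(1−q)),

which recovers the Gibbs form in the limit q → 1.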
Starting in the middle of November 2002, the CMS experiment undertook an evaluation of the European DataGrid Project (EDG) middleware using its event simulation programs. A joint CMS-EDG task force performed a "stress test" by submitting a large number of jobs to many distributed sites. The EDG testbed was complemented with additional CMS-dedicated resources. A total of ~10000 jobs consisting of two different computational types were submitted from four different locations in Europe over a period of about one month. Nine sites were active, providing integrated resources of more than 500 CPUs and about 5 TB of disk space (with the additional use of two Mass Storage Systems). Descriptions of the adopted procedures, the problems encountered and the corresponding solutions are reported. Results and evaluations of the test, both from the CMS and the EDG perspectives, are described.
An overview is presented of the characteristics of HEP computing and its mapping to the Grid paradigm. This is followed by a synopsis of the main experiences and lessons learned by HEP experiments in their use of DataGrid middleware, using both the EDG application testbed and the LCG production service. Particular reference is made to experiment 'data challenges', and a forward look is given to necessary developments in the framework of the EGEE project.
The CMS Experiment is defining its Computing Model and is experimenting with and testing the new distributed features offered by many Grid Projects. This report describes the use by CMS of the early-deployed systems of LCG (LCG-0 and LCG-1). Most of the features discussed here came from the EU-implemented middleware, even if some of the tested capabilities were in common with the US-developed middleware. This report describes the simulation of about 2 million CMS detector events, which were generated as part of the official CMS Data Challenge 04 (Pre-Challenge Production). The simulations were done on a CMS-dedicated testbed (CMS-LCG-0), where an ad-hoc modified version of the LCG-0 middleware was deployed and where the CMS Experiment had complete control, and on the official early LCG delivered system (the LCG-1 version). Modifications to the CMS simulation tools for event production were studied and achieved, together with necessary adaptations of the middleware services. Bilateral feedback (between CMS and LCG middleware) played an important role in making progress (including bug corrections). Fractions of successful processing of simulation jobs ranged from 70% to 90%. Most of the failure reasons were identified, with RLS instability being the greatest cause of failure. An evaluation of the LCG-1 middleware is also presented and discussed. The new functionalities introduced and the better-distributed organization of services were tested and eventually stressed. While the overall efficiency decreased in the early implementation of the system with respect to the LCG-0 testbed, a consistent success rate of 50-60% was achieved. One of the major difficulties for the simulation of CMS events on the LCG-1 system was identified as the consistent configuration of the distributed sites.
Distributed data transfer is currently characterised by the use of widely disparate tools, meaning that significant human effort is required to maintain the distributed system. In order to realise the possibilities represented by Grid infrastructure, the reality of a heterogeneous computing environment must be tackled by providing means by which these disparate elements can communicate. Two such data distribution tools are the SRB and the EU DataGrid's Data Management fabric, both widely used by many large scientific projects. Both provide similar functionality: the replication and cataloguing of datasets in a globally distributed environment. Significant quantities of data are currently stored in both. Moving data from the SRB to the EUDG, however, requires significant intervention and is therefore not scalable. This paper presents a mechanism by which the SRB can automatically interact with the GIGGLE framework as implemented by the EUDG, allowing access to SRB data using Grid tools.
The GridPP Collaboration is building a UK computing Grid for particle physics, as part of the international effort towards computing for the Large Hadron Collider. The project, funded by the UK Particle Physics and Astronomy Research Council (PPARC), began in September 2001 and completed its first phase 3 years later. GridPP is a collaboration of approximately 100 researchers in 19 UK university particle physics groups, the Council for the Central Laboratory of the Research Councils and CERN, reflecting the strategic importance of the project. In collaboration with other European and US efforts, the first phase of the project demonstrated the feasibility of developing, deploying and operating a Grid-based computing system to meet the UK needs of the Large Hadron Collider experiments. This note describes the work undertaken to achieve this goal.
CMS currently uses a number of tools to transfer data which, taken together, form the basis of a heterogeneous datagrid. The range of tools used, and the directed, rather than optimized, nature of CMS's recent large-scale data challenge required the creation of a simple infrastructure that allowed a range of tools to operate in a complementary way. The system created comprises a hierarchy of simple processes (named 'agents') that propagate files through a number of transfer states. File locations and some application metadata were stored in POOL file catalogues, with LCG LRC or MySQL back-ends. Agents were assigned limited responsibilities, and were restricted to communicating state in a well-defined, indirect fashion through a central transfer management database. In this way, the task of distributing data was easily divided between different groups for implementation. The prototype system was developed rapidly, and achieved the required sustained transfer rate of ~10 MBps, with O(10)...
The European DataGrid (EDG) project ran from 2001 to 2004, with the aim of producing middleware which could form the basis of a production Grid, and of running a testbed to demonstrate the middleware. HEP experiments (initially the four LHC experiments and subsequently BaBar and D0) were involved from the start in specifying requirements, and subsequently in evaluating the performance of the middleware, both with generic tests and through increasingly complex data challenges. A lot of experience has therefore been gained which may be valuable to future Grid projects, in particular LCG and EGEE, which are using a substantial amount of the middleware developed in EDG. We report our experiences with job submission, data management and mass storage, information and monitoring systems, Virtual Organisation management and Grid operations, and compare them with some typical Use Cases defined in the context of LCG. We also describe some of the main lessons learnt from the project, in particu...
We develop the argument that the Gibbs-von Neumann entropy is the appropriate statistical mechanical generalisation of the thermodynamic entropy, for macroscopic and microscopic systems, whether in thermal equilibrium or not, as a consequence of Hamiltonian dynamics. The mathematical treatment utilises well known results [Gib02, Tol38, Weh78, Par89], but most importantly, incorporates a variety of arguments on the phenomenological properties of thermal states [Szi25, TQ63, HK65, GB91] and of ...
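For reference, the Gibbs-von Neumann entropy at issue is the standard expression (stated here for orientation, not quoted from the abstract): for a system with density matrix ρ,

S = −k_B Tr(ρ ln ρ),

whose classical phase-space counterpart is the Gibbs entropy S = −k_B ∫ ρ ln ρ dΓ.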