ORIGINAL RESEARCH article

Front. Phys., 19 June 2017
Sec. Interdisciplinary Physics
Volume 5 - 2017 | https://doi.org/10.3389/fphy.2017.00019

The Real and the Mathematical in Quantum Modeling: From Principles to Models and from Models to Principles

  • Theory and Cultural Studies Program, Purdue University, West Lafayette, IN, United States

The history of mathematical modeling outside physics has been dominated by the use of classical mathematical models, C-models, primarily those of a probabilistic or statistical nature. More recently, however, quantum mathematical models, Q-models, based in the mathematical formalism of quantum theory have become more prominent in psychology, economics, and decision science. The use of Q-models in these fields remains controversial, in part because it is not entirely clear whether Q-models are necessary for dealing with the phenomena in question or whether C-models would still suffice. My aim, however, is not to assess the necessity of Q-models in these fields, but instead to reflect on what the possible applicability of Q-models may tell us about the corresponding phenomena there, vis-à-vis quantum phenomena in physics. In order to do so, I shall first discuss the key reasons for the use of Q-models in physics. In particular, I shall examine the fundamental principles that led to the development of quantum mechanics. Then I shall consider a possible role of similar principles in using Q-models outside physics. Psychology, economics, and decision science borrow already available Q-models from quantum theory, rather than derive them from their own internal principles, while quantum mechanics was derived from such principles, because there was no readily available mathematical model to handle quantum phenomena, although the mathematics ultimately used in quantum mechanics did in fact exist then. I shall argue, however, that the principle perspective on mathematical modeling outside physics might help us to understand better the role of Q-models in these fields and possibly to envision new models, conceptually analogous to but mathematically different from those of quantum theory, that may be helpful or even necessary there or in physics itself. I shall, in closing, suggest one possible type of such models, singularized probabilistic models, SP-models, some of which are time-dependent, TDSP-models. The necessity of using such models may change the nature of mathematical modeling in science and, thus, the nature of science, as it happened in the case of Q-models, which not only led to a revolutionary transformation of physics but also opened new possibilities for scientific thinking and mathematical modeling beyond physics.

Introduction

The history of mathematical modeling outside physics has been dominated by classical mathematical models, C-models, based on mathematical models developed in classical physics, especially probabilistic or statistical models, borrowed from classical statistical physics or chaos and complexity theories. More recently, however, models based in the mathematical formalism of quantum theory, Q-models, primarily borrowed from quantum mechanics but occasionally also quantum field theory, have become more current outside physics, specifically in psychology, economics, and decision science, the fields (beyond physics) with which I will be primarily concerned here [e.g., 1, 2]1. My abbreviation follows P. Dirac's distinction between c-numbers (classical numbers) and q-numbers (quantum numbers), because the variables used in Q-models are in fact q-numbers. Quantum mechanics and Q-models are based in the mathematics of Hilbert spaces over complex numbers, C, with Hilbert-space operators used as physical variables in the equations of quantum mechanics, as against functions of real (mathematical) variables, c-numbers, that serve as physical variables in classical physics. The use of Q-models in these fields remains controversial, because it is not entirely clear whether they are necessary for dealing with the phenomena in question or whether C-models would suffice. It is true that debates and sometimes controversies have also accompanied quantum mechanics since its birth in 1925. These debates, initiated by the famous confrontation between N. Bohr and A. Einstein on, in Bohr's phrase, "epistemological problems in atomic physics," used in the title of his account of this confrontation, have never lost their intensity and appear to be interminable [3, v. 2, pp. 32–66]. However, as Bohr's phrase indicates, the reasons for these controversies have been primarily philosophical. The effectiveness of quantum mechanics or higher-level quantum theories, such as quantum field theory, has not been in question: they are among the best-confirmed theories in physics. The situation is different in psychology, economics, and decision science, where it is the scientific effectiveness or at least necessity of Q-models that is doubted. My aim here, however, is not to assess this effectiveness or necessity, but instead to reflect on what the possible applicability of Q-models may tell us about the corresponding phenomena in these fields vis-à-vis quantum phenomena in physics. In order to do so, I shall first consider the key reasons for the use of Q-models in physics. In particular, I shall examine the fundamental principles that grounded and indeed led to the development of quantum theory. Then I shall consider a possible role of similar principles in using Q-models beyond quantum theory. My emphases are due to the fact that psychology, economics, and decision science borrow already available Q-models from quantum theory, rather than derive them from their own fundamental principles, while quantum mechanics and then quantum field theory were derived from such principles. This is not surprising because there was at the time no available mathematical model or (a more general concept, which includes an interpretation of the model used) theory to effectively handle quantum phenomena. The "old quantum theory" of M. Planck, A. Einstein, N. Bohr, and A. Sommerfeld, which ushered in the quantum revolution, became manifestly inadequate by the time W. Heisenberg began his work on quantum mechanics, which he discovered in 1925 [4].
For the reasons explained below (mostly a search for a more rigorous derivation of the formalism), the research in quantum foundations is still concerned with deriving quantum theory from such principles, a project in part motivated by the rise of quantum information theory. That does not appear to be a significant concern outside physics where the use of Q-models is motivated primarily by their predictive capacities, which is of course a crucial consideration in physics as well. It may, however, be beneficial to consider the deeper reasons for the possible use of Q-models in these fields, or, in terms of my title, the real that gives rise to the mathematical of Q-models there. The principle perspective on mathematical modeling beyond physics might help us to do this and possibly to envision new, post-quantum, models there or even in physics. I shall, in closing, suggest one possible type of such models, singularized probabilistic models, SP-models, some of which are time-dependent, TDSP-models, and consider their implications for mathematical modeling in science and for our understanding of the nature of science2.

Physical Principles and Mathematical Models in Quantum Mechanics

Theories, Principles, and Models in Fundamental Physics

I would like to begin by outlining the key features of the standard mathematical model of quantum mechanics, more customarily used as a probabilistically or statistically predictive model in view of the difficulties in maintaining its representational capacities, which continue to be debated (a minimal numerical illustration of features (2)–(4) follows the list):

(1) The Hilbert-space formalism over the field of complex numbers, C, an abstract vector space of any dimension, finite or infinite (in quantum mechanics, either finite or countably infinite), possessing the structure of an inner product that allows lengths and angles to be measured, analogously to an n-dimensional Euclidean space (which is a Hilbert space over real numbers R);

(2) The noncommutativity of the Hilbert-space operators, also known as “observables,” which are mathematical entities associated, in terms of probabilistic or statistical predictions, with physically observable quantities;

(3) The nonadditive nature of the probabilities involved: the probability of an event that might occur in two or more mutually exclusive alternative ways is, in general, not equal to the sum of the probabilities for each alternative, and instead obeys the law of the addition of the so-called "quantum amplitudes," associated with complex Hilbert-space vectors, for these alternatives (technically, these amplitudes are linked to probability densities);

(4) Born's rule or an analogous rule (such as von Neumann's projection postulate or Lüders's postulate) added to the formalism, which establishes the relation between amplitudes as complex entities and probabilities as real numbers (by using square moduli or, equivalently, the multiplication of these quantities and their complex conjugates), in accordance with (3) above3.
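To make features (2)–(4) concrete, here is a minimal numerical sketch (in Python, with the two-dimensional operators and the state vector chosen purely for illustration, not drawn from the text): it exhibits noncommuting observables, quantum amplitudes, and Born's rule, and shows why the resulting probabilities are non-additive.

```python
import numpy as np

# Feature (2): two noncommuting "observables" on a two-dimensional Hilbert space.
# The Pauli matrices sigma_z and sigma_x do not commute.
sigma_z = np.array([[1, 0], [0, -1]], dtype=complex)
sigma_x = np.array([[0, 1], [1, 0]], dtype=complex)
print(np.allclose(sigma_z @ sigma_x, sigma_x @ sigma_z))  # False: noncommutativity

# A normalized state vector and two orthogonal outcome vectors.
psi = np.array([3 + 4j, 1 - 2j], dtype=complex)
psi = psi / np.linalg.norm(psi)
e0 = np.array([1, 0], dtype=complex)
e1 = np.array([0, 1], dtype=complex)

# Feature (4): Born's rule -- probability = squared modulus of the amplitude.
amp0, amp1 = np.vdot(e0, psi), np.vdot(e1, psi)
p0, p1 = abs(amp0) ** 2, abs(amp1) ** 2
print(np.isclose(p0 + p1, 1.0))  # exhaustive alternatives: probabilities sum to 1

# Feature (3): when alternatives interfere, amplitudes add rather than
# probabilities; |a0 + a1|^2 differs from |a0|^2 + |a1|^2 by a cross term.
print(abs(amp0 + amp1) ** 2, p0 + p1)  # generally unequal
```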

In the development of quantum mechanics, discovered in 1925, these features were not initially assumed, but were derived from certain physical features of quantum phenomena and principles arising from these features. The formalism was only given a properly Hilbert-space form by J. von Neumann, in 1932, in The Mathematical Foundations of Quantum Mechanics, a standard text ever since [7]4.

I shall now explain the concepts of theory, principle, and model, as they will be understood here. By a theory, I mean an organized assemblage of concepts, explanations, principles, and models by means of which one is able to relate, in one way or another, to the phenomena or (they are not always the same) objects the theory considers. In defining principles, I follow Einstein's distinction between “constructive” and “principle” theories, two contrasting, although in practice often intermixed, types of theories [8, 9, pp. 35–50]. “Constructive theories” aim “to build up a picture of the more complex phenomena out of the materials of a relatively simple formal scheme from which they start out” [8, p. 228]. Thus, according to Einstein, the kinetic theory of gases, as a constructive theory in classical physics, “seeks to reduce mechanical, thermal, and diffusional processes to movements of molecules—i.e., to build them up out of the hypothesis of molecular motion,” described by the laws of classical mechanics [8, p. 228]. By contrast, principle theories “employ the analytic, not the synthetic, method. The elements which form their basis and starting point are not hypothetically constructed but empirically discovered ones, general characteristics of natural processes, principles that give rise to mathematically formulated criteria which the separate processes or the theoretical representations of them have to satisfy” [8, p. 228]. Thus, thermodynamics, a classical principle theory (parallel to the kinetic theory of gases as a constructive theory), “seeks by analytical means to deduce necessary conditions, which separate events have to satisfy, from the universally experienced fact that perpetual motion is impossible” [8, p. 228].

Principles, then, are "empirically discovered, general characteristics of natural processes, …that give rise to mathematically formulated criteria which the separate processes or the theoretical representations of them have to satisfy." I shall adopt this definition, but with the following qualification, which Einstein would likely have accepted. Principles are not empirically discovered but formulated, constructed, on the basis of empirically established evidence. "The impossibility of perpetual motion" is hardly empirically given; it is a principle formulated on the basis of such evidence.

Constructive theories are, more or less by definition, realist theories, and conversely, many realist theories are constructive. Realist theories represent, commonly causally, the phenomena or objects they consider and their behavior, in science by means of mathematical models assumed to idealize how nature or reality works, in the case of constructive theories at the simpler, or deeper, level of reality constructed by the theory. In other words, a constructive theory offers a representation of the processes underlying and connecting the observable phenomena considered, commonly by understanding the ultimate character of these processes on the model of classical mechanics or classical electrodynamics, as in the kinetic theory of gases described above, or in other forms of classical statistical physics. All such theories assume that the individual behavior of the ultimate constituents of the systems they consider is described by the laws of classical mechanics. A realist theory may represent the objects or phenomena it considers in a more direct, if still idealized, manner, as classical mechanics (which deals with individual or sufficiently small systems) or classical electrodynamics do. I shall discuss the concepts of reality and realism, which encompass that of a realist theory, in more detail below. First, however, I shall define a mathematical model.

By a “mathematical model” I refer to a mathematical structure or set of mathematical structures that enables any type of relation to the (observed) phenomena or objects considered. (As I shall only deal with mathematical models here, the term “model” hereafter refers to mathematical models.) All modern, post-Galilean, physical theories are defined by their uses of such models. The requirement of using mathematical models may be seen as a principle, the mathematization principle, “the M principle,” arguably the single defining principle of all modern physics, from Galileo on. Such models may be realist, representational, as in classical physics, specifically classical mechanics, or predictive, as in classical statistical physics (the models of which are, however, underlain by representational models of classical mechanics), or in quantum mechanics, without assuming realism and causality even in considering elementary individual quantum processes, such as those concerning elementary quantum objects, “elementary particles.” This assumption is expressly abandoned or even precluded in non-realist interpretations of quantum phenomena and quantum mechanics, following Bohr and “the spirit of Copenhagen,” as Heisenberg called it [10, p. iv]5. The M principle is upheld in quantum mechanics, but, in non-realist interpretations, in a way different from how it is used in realist theories.

The probabilistic or statistical character of quantum predictions must also be maintained by realist interpretations of these theories or alternative theories (such as Bohmian theories) of quantum phenomena, in conformity with quantum experiments, in which only probabilistic or statistical predictions are possible. The reason for this is that the repetition of identically prepared quantum experiments in general leads to different outcomes, a difference that cannot be improved beyond a certain limit (defined by Planck's constant, h) by improving the conditions of measurement, which is possible in classical physics. This fact is also manifested in Heisenberg's uncertainty relations, which are statistical in character as well. This situation leads to the quantum probability or (depending on interpretation) quantum statistics principle, the QP/QS principle, arguably the single defining principle in Q-models in physics and beyond, keeping in mind that in psychology, economics, and decision science, we do not have anything corresponding to elementary individual physical processes, involving the ultimate elementary constituents of nature, "elementary particles." Nor do we have anything analogous to h. The probabilities themselves necessary for making correct predictions, in either quantum mechanics or in using Q-models elsewhere, are, thus far, calculated by using the Hilbert-space or mathematically equivalent formalisms and the (non-additive) procedure described above that uses quantum amplitudes and Born's or a similar rule6.
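As a reminder of the form these statistical limits take, the uncertainty relations for position and momentum can be written as below (a standard textbook expression, not tied to any particular interpretation discussed here), where the dispersions are computed over an ensemble of identically prepared systems, which is the sense in which the relations are statistical:

```latex
\Delta q\,\Delta p \;\geq\; \frac{\hbar}{2},
\qquad
\Delta q = \sqrt{\langle \hat{q}^{2}\rangle - \langle \hat{q}\rangle^{2}},
\quad
\Delta p = \sqrt{\langle \hat{p}^{2}\rangle - \langle \hat{p}\rangle^{2}},
\qquad
\hbar = \frac{h}{2\pi}.
```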

Realist models are, then, representational models, idealizing the nature of objects or phenomena they consider. The term "realism" will be primarily understood here as referring to the possibility, at least, again, in principle, of such models, and, in the first place, theories allowing for such models. One could define another type of realism, which would refer to theories that presuppose an independent architecture of reality they consider, while allowing that this architecture cannot be represented, either at a given moment in history or perhaps ever, but if so, only due to practical human limitations [9, pp. 11–23]. In the first case, a theory that is strictly predictive may be accepted, but with the hope that a future theory will do better, by being a realist theory of the representational type. Einstein adopted this attitude toward quantum mechanics, which he expected to be eventually replaced by a (representational) realist theory. Even in the second case, the ultimate nature of reality is commonly deemed to be conceivable on realist models of classical physics, possibly adjusting them to accommodate new phenomena. However, this type of realism implies that there is no representational theory or model of the ultimate nature of the phenomena or objects considered. Either type of realism is abandoned or even precluded in quantum mechanics, when interpreted in the spirit of Copenhagen. However, such interpretations do assume the concept of reality, by which I refer to what exists or is assumed to exist, without making any claim upon the character of this existence, the type of claim that defines realist theories. By existence I refer, ultimately, to a capacity to have effects on the world, which in turn assumes the existence of the world by virtue of its capacity to have effects upon itself, effects that are established by means of, and are thus registered as effects of, our interactions with the world. In physics, the primary reality considered is that of nature or matter. It is generally assumed to exist independently of our interaction with it, which also assumes that it has existed when we did not exist and will continue to exist when we no longer exist. This assumption is also made in non-realist interpretations of quantum mechanics, in the absence of a representation or even (as against the second, non-representational type of realism defined above) conception of the character of this existence. Thus, if realism presupposes a representation or at least a conception of reality, this concept of reality is that of "reality without realism" [9, 11]. The assumption of this concept of reality is a principle, the RWR principle. The existence or reality of quantum objects, a form of reality beyond representation or even conception, is inferred from effects they have on our world, specifically on experimental technology. It has not been possible, at least thus far, to observe a moving electron or photon, or for that matter even stationary electrons (there are no stationary photons, which only exist in motion before they are absorbed by other forms of matter, such as electrons). It is only possible to observe traces of their interactions with measuring instruments, traces that do not allow us to reconstitute the independent behavior or movement of quantum objects, an impossibility reflected in Heisenberg's uncertainty relations.
In non-realist, RWR-principle-based, interpretations, quantum mechanics only predicts, in probabilistic or statistical terms (no other predictions are, again, possible on experimental grounds), effects manifested in measuring instruments impacted by quantum objects.

While a principle theory, which, as I explained, need not be constructive in Einstein's sense, could be either realist or non-realist, a constructive theory is by definition realist. Realist or, it follows, constructive theories do involve principles, such as the equivalence principle in general relativity, or the principle of causality, which, to adopt Kant's definition, commonly used ever since, states that, if an event takes place, it has a cause of which it is an effect [12, pp. 305, 308]7. Asymmetrically, however, a principle theory need not involve constructive aspects or be realist. In non-realist, RWR-principle-based, interpretations, quantum mechanics is a principle theory by definition, by virtue of the RWR principle. It is not possible, in such interpretations, to have a constructive theorization of the ultimate entities, quantum objects, which are responsible for the observable quantum phenomena, unless one sees quantum objects as constructed as, in principle, unconstructible. According to Bohr, thus formulating the RWR principle, "in quantum mechanics we are not dealing with an arbitrary renunciation of a more detailed analysis of atomic phenomena, but with a recognition that such an analysis is in principle excluded," beyond a certain point [3, v. 2, p. 62]. In this interpretation, quantum mechanics divorces itself from the representation of the connections between observed quantum phenomena, which it only relates in terms of predictions, in general probabilistic or statistical in character, thus fulfilling the M principle under the conditions of the RWR principle.

Finally, the present view does not assume a permanent, Platonist, essence to any given principle, which can always be abandoned under the pressure of new experimental findings or new ways of theorizing previously available experimental findings. Indeed, one might argue that the greatest form of creative thinking in science or other theoretical fields is that which leads to the invention of new principles, which implies the transformation of principles, rather than any Platonist permanence to them.

The Physical Principles of the Quantum Theory

The RWR principle and the corresponding interpretation of quantum mechanics emerged only in the 1930s. Heisenberg's discovery of quantum mechanics in 1925 and Bohr's initial interpretation of it, proposed in 1927, were based on the following principles, with Bohr's complementarity principle added in 1927:

(1) the proto-RWR principle, according to which, “quantum mechanics does not deal with a space–time description of the motion of atomic particles” [3, v. 1, p. 48];

(2) the principle of discreteness or the QD principle, according to which all observed quantum phenomena are individual and discrete in relation to each other, which is fundamentally different from the atomic discreteness of quantum objects themselves;

(3) the principle of the probabilistic or statistical nature of quantum predictions, the QP/QS principle, which applies even (in contrast to classical statistical physics) in the case of primitive or elementary quantum processes, and which also reflects the special, non-additive, nature of quantum probabilities and the rules, such as Born's rule, for deriving them, and

(4) the correspondence principle, which, as initially understood by Bohr, required that the predictions of quantum theory must coincide with those of classical mechanics in the classical limit, but was given by Heisenberg a new and more rigorous form of “the mathematical correspondence principle,” which required that the equations of quantum mechanics convert into those of classical mechanics in the classical limit, thus, in accordance with the M principle.

I speak of the proto-RWR principle because Heisenberg saw the project of describing the motion of electrons as unachievable at the time, rather than “in principle excluded,” as Bohr assumed a decade later [3, v. 2, p. 62]. This was, nevertheless, a radical move on Heisenberg's part, as Bohr was the first to realize: “In contrast to ordinary [classical] mechanics, the new quantum mechanics does not deal with a space–time description of the motion of atomic particles. It operates with manifolds of quantities [matrices] which replace the harmonic oscillating components of the motion and symbolize the possibilities of transitions between stationary states in conformity with the correspondence principle. These quantities satisfy certain relations which take the place of the mechanical equations of motion and the quantization rules [of the old quantum theory]” [3, v. 1, p. 48].

Quantum discreteness was eventually (as part of Bohr's ultimate interpretation) recast by Bohr in terms of his concept of "phenomenon," defined in terms of what is observed in measuring instruments under the impact of quantum objects, in contradistinction to quantum objects themselves, which cannot be observed or represented [3, v. 2, p. 64]. Quantum phenomena are, in Bohr's interpretation, irreducibly discrete in relation to each other, and there is no continuous or any other conceivable process that could be assumed to connect them. Probability has a temporal structure by virtue of its futural and discrete nature: one can only verifiably estimate future discrete events. Such events may, however, be continuously and causally connected, as they are in classical physics, even though we may not be able to track these connections to make exact predictions, as happens in classical statistical mechanics or chaos theory. By contrast, in non-realist, RWR-principle-based, interpretations, the nature of quantum phenomena and events precludes us from causally (or otherwise) connecting them. This means that only probabilistic or statistical predictions are possible, even ideally and in principle, and even in dealing with elementary individual quantum objects, such as those known as "elementary particles," and the processes and events they lead to, objects and processes that cannot be decomposed into smaller objects and processes. This qualification distinguishes quantum mechanics from classical probabilistic or statistical theories, or of course from classical mechanics, where such predictions could, at least ideally, be exact in dealing with individual classical objects or a small number of classical objects. In quantum mechanics, in non-realist interpretations, this type of idealization is not possible, a fact reflected in the uncertainty relations. The theory only estimates the probabilities or statistics of the outcomes of discrete future events, on the basis of previous events, and tells us nothing about what happens between events. Nor does it describe the data observed in measuring instruments and hence quantum phenomena. They are described by classical physics, which, however, cannot predict them.

The QP/QS principle was mathematically expressed in Heisenberg's scheme by matrices containing the necessary probability amplitudes cum Born's rule. Heisenberg only formulated this rule in the case of electrons' quantum jumps in the hydrogen atom, rather than as universally applicable in quantum mechanics, as Born did. Born's rule is not inherent in the formalism but is added to it—it is postulated.

The correspondence principle was central to Heisenberg's derivation of quantum mechanics. In its mathematical form, introduced by Heisenberg, the principle required that both the equations of quantum mechanics, which were formally those of classical mechanics, and the variables used, which were different, convert into those of classical mechanics in the classical limit, a conversion automatic in the case of the equations but not of the variables. (The processes themselves, however, are still quantum even in this limit.) Thus, the principle gave Heisenberg half of the mathematical architecture he needed.
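One standard way to display the mathematical correspondence principle (a textbook illustration in later notation, not Heisenberg's original matrix derivation) is through the Heisenberg equation of motion, which is formally parallel to Hamilton's classical equation and passes into it when the commutator divided by iℏ is replaced by the Poisson bracket in the classical limit:

```latex
\frac{d\hat{A}}{dt} \;=\; \frac{1}{i\hbar}\,[\hat{A},\hat{H}]
\quad\longleftrightarrow\quad
\frac{dA}{dt} \;=\; \{A,H\},
\qquad
[\hat{A},\hat{H}] \;=\; \hat{A}\hat{H}-\hat{H}\hat{A}.
```

The form of the equation is the same in both cases; what changes is the nature of the variables, which is the sense in which Heisenberg retained the equations of classical mechanics while replacing the variables by noncommuting ones.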

An important qualification is in order. Heisenberg's derivation of quantum mechanics from principles cannot be considered a strictly rigorous derivation, especially in a mathematical sense. As he noted in The Physical Principles of the Quantum Theory (from whose title I borrow the title of this section): "The deduction of the fundamental equation of quantum mechanics is not a deduction in the mathematical sense of the word, since the equations to be obtained form themselves the postulates of the theory. Although made highly plausible, their ultimate justification lies in the agreement of their predictions with the experiment" [10, p. 108]. While Heisenberg, again, borrowed the form of the equations themselves from classical mechanics by the mathematical correspondence principle, he virtually guessed the variables he needed—one of the most extraordinary guesses in the history of physics. A more rigorous derivation of quantum mechanics from fundamental principles may, thus, be pursued. More recent work in this direction has been in quantum information theory in the case of discrete quantum variables, such as spin, which require finite-dimensional Hilbert spaces, as opposed to infinite-dimensional ones for continuous variables, such as position and momentum [e.g., 13–15]8. I shall comment on this work below.

Bohr's interpretation of quantum phenomena and quantum mechanics added a new principle, the complementarity principle. It arises from Bohr's concept of complementarity and may be defined as requiring: “(a) a mutual exclusivity of certain phenomena, entities, or conceptions; and yet (b) the possibility of considering each one of them separately at any given point, and (c) the necessity of considering all of them at different moments for a comprehensive account of the totality of phenomena that one must consider in quantum physics” [9, p. 70].

In Bohr's ultimate interpretation, this concept applies strictly to what is observed in measuring instruments, quantum phenomena, and not to quantum objects, placed beyond representation or even conception. Complementarity is a reflection of the fact that, in a radical departure from classical physics or relativity, the behavior of quantum objects of the same type, say, electrons, is not governed by the same physical law, especially a representational physical law, in all possible contexts, specifically in complementary contexts. In other words, the behavior of quantum objects has mutually incompatible effects in complementary set-ups, although this mutual incompatibility is, generally, manifested collectively, in multiple identically prepared experiments. On the other hand, the mathematical formalism of quantum mechanics offers correct probabilistic or statistical predictions of quantum phenomena in all contexts, in non-realist interpretations, under the assumption that quantum objects and processes are beyond representation or even conception, by the RWR principle.

In some non-realist interpretations, such as the one the present author would favor, following W. Pauli, individual quantum events are not subject even to the probabilistic laws of quantum mechanics. This makes these laws collective, statistical [9, pp. 173–186; 11]. The QP/QS principle, accordingly, becomes strictly the QS principle. According to Pauli:

As this indeterminacy is an unavoidable element of every initial state of a system that is at all possible according to the [quantum-mechanical] laws of nature, the development of the system can never be determined as was the case in classical mechanics. The theory predicts only the statistics of the results of an experiment, when it is repeated under a given condition. Like the ultimate fact without any cause, the individual outcome of a measurement is, however, in general not comprehended by laws. This must necessarily be the case, if quantum or wave mechanics is interpreted as a rational generalization of classical physics, which take into account the finiteness of the quantum of action [h]. The probabilities occurring in the new laws have then to be considered to be primary, which means not deducible from deterministic laws. [19, p. 32]

Thus, in Pauli's view or the present view, this "beyond the law" includes the probabilistic or, in this view, statistical laws of quantum mechanics, laws that, thus, only apply to statistical multiplicities of repeated quantum events. Individual quantum events are not subject to laws, even to the probabilistic or statistical laws of quantum mechanics. Their outcomes cannot, in general, be assigned a probability: they are strictly random9. Only the statistics of multiple (identically prepared) experiments could be predicted and repeated, which repeatability appears to have been, thus far, necessary for scientific practice. Whether, however, one interprets quantum mechanics on such statistical lines or on Bayesian lines, by assigning probability to individual events, we are compelled to rethink the concept of physical law as unavoidably contextual. This is "an entirely new situation as regards the description of physical phenomena that, the notion of complementarity aims at characterizing" [20, p. 700].

There are other important features of quantum phenomena, mathematically expressed in the quantum-mechanical formalism, in particular, the so-called "quantum non-locality," which refers to the existence of the statistical correlations between spatially separated quantum events, and "quantum entanglement," which reflects these correlations in the formalism. These features were discovered later and played no role in the initial derivation of quantum mechanics by either Heisenberg or Schrödinger. They do figure significantly in quantum information theory and recent attempts, mentioned above, to derive quantum mechanics from the principles of quantum information. Their analysis would require a treatment beyond my scope10. A few key points may, however, be mentioned. First, while quantum entanglement is a clearly defined feature of the formalism, the situation is different in the case of quantum non-locality. Although originating in the experimentally well-confirmed fact that certain spatially separated quantum phenomena or events exhibit statistical correlations (not found in classical physics), quantum non-locality is a complex and much debated issue. The problematic was in effect introduced in 1935 in the famous article by Einstein et al. [22]. I say "in effect" because neither EPR's article nor Bohr's equally famous reply to it [20] used the language of correlations or entanglement. The latter term was introduced, in both German [Verschränkung] and English, by Schrödinger in his response to EPR's article, known as "the cat-paradox paper," after the paradox found there [23]. The subject remained dormant until the 1960s, when it was rekindled by the Bell and Kochen-Specker theorems, even to the point of nearly defining the current debate concerning quantum foundations. The theoretical and experimental research on the subject during the last decades has been massive, and the literature concerning it is immense. The term "non-locality" is not uniformly used in referring to quantum correlations, because it may suggest some sort of instantaneous physical connections between distant events, a "spooky action at a distance," as Einstein called it. Such connections are incompatible with relativity, although the principle of locality, which prohibits such connections, is independent of relativity. This type of physical non-locality, which is found, for example, in Bohmian mechanics, is commonly viewed as undesirable. The absence of realism allows one to avoid physical non-locality, as Bohr argued in his reply to EPR's article, which contended that quantum mechanics is either incomplete or physically nonlocal [20, 22].

From Models to Principles in Q-Modeling Outside Physics

Q-Models, Fundamental Principles, and Reality without Realism Outside Physics

In addressing Q-models in physics in the preceding discussion, my main question, arising from the history of quantum theory, was: Given certain fundamental physical principles, established on the basis of experimental evidence, in particular the QD and QP/QS principles, and perhaps adopting additional principles, such as the correspondence principle or the RWR (or proto-RWR) principle, what are the mathematical models that would enable us to handle this evidence? In turning now to the Q-models beyond physics, my main question is the reverse: Assuming that mathematical Q-models apply in psychology, economics, and decision science, which features and which fundamental principles are behind such models, and how do they accord with the fundamental principles of quantum mechanics? There are two sets of principles I have in mind. The first contains the principles that led to the emergence of quantum mechanics; the second, the principles of quantum information theory, which are, however, in accord with most principles of the first set. I shall be primarily concerned with the first set (apart from the correspondence principle, unique to quantum theory), but will also comment on the second11.

But why is this question important in the first place? As noted from the outset, if there are phenomena outside physics that appear to require Q-models, one need not, unlike at the time of the introduction of quantum mechanics, invent such models at this point. One can borrow them, "ready-made," from quantum theory, which is what happened in the case of Q-modeling outside physics. Nevertheless, establishing, now inferentially, fundamental principles behind Q-models might allow us to draw important conclusions about the nature of the phenomena handled by these models. To put it in stronger terms, finding the fundamental principles behind a given model, even if this model is already available, is important because otherwise we do not have a rigorous theory or a rigorous model, which is true even if a constructive theory is available, but is all the more important if it is not. Otherwise, we do not really know what our models are models of, especially, again, in the absence of a constructive theory and realism, which absence is likely if Q-models apply and is my main interest here. These considerations are also relevant in pursuing projects of more rigorous derivation of quantum mechanics from principles in physics, for example along the lines of quantum information theory, even though the theory itself is already established. Part of the reason is, again, that doing so can give us a deeper understanding of quantum phenomena and quantum theory. More, however, is at stake. The main value of such projects lies in solving outstanding problems of fundamental physics, as in quantum field theory (which still has unresolved problems, its extraordinary successes notwithstanding) or quantum gravity, which has no model as yet [24, 25]. The same argument applies to Q-modeling beyond physics. The future of mathematical modeling there is at stake as well.

Before addressing the relationships between fundamental principles and Q-models in psychology, economics, and decision science, it may be helpful to summarize the non-realist, the RWR-principle-based, interpretation of quantum phenomena and quantum mechanics outlined in Section Physical Principles and Mathematical Models in Quantum Mechanics. While quantum objects are assumed to exist, the character of this existence or reality is, by the RWR principle, assumed to be beyond representation and even conception. As such, this reality is different from the reality of quantum phenomena, which are defined by what is observed in measuring instruments under the impact of quantum objects and, thus, can be represented. There are no mathematically expressed physical laws corresponding to the behavior of quantum objects. There are, however, mathematical laws that, expressing the QP/QS principle, enable correct probabilistic or statistical predictions of the outcomes of quantum experiments, manifested in measuring instruments, in all contexts. In addition, there are two interpretations of these mathematical laws. The first is probabilistic, along Bayesian lines, in which case these laws are seen as allowing one to assign probabilities to the outcomes of individual quantum events in accordance with one or the other law of the available set of laws, specifically those applicable in complementary situations. The second is statistical, when no such probabilities could be assigned because the outcomes of individual quantum experiments are not comprehended even by these laws and are seen as random, while these laws are assumed to predict the statistics of multiple identically prepared experiments in the corresponding contexts.

It is clear, however, that this conceptual architecture, in either the Bayesian or statistical interpretation, cannot apply unaltered in considering, along non-realist lines, human phenomena found in psychology, economics, or decision science and the possible Q-models there. This is because, while there are individual objects or, as the case may be, (human) subjects and processes to consider, there are no elementary objects of the type found in quantum physics. There is nothing analogous to elementary particles, such as electrons or photons, and there is rarely a completely random individual behavior. When one deals in these fields with large multiplicities, one can, in using either C- or Q-models, average the individual behavior and statistically disregard the differences in this behavior, differences defined by psychological or other human and social factors, in which case one could apply either a Bayesian or statistical interpretation of the Q-model used. While, however, this averaging is sometimes possible in psychology, economics, and decision science, there are often significant obstacles in using it. Each sequence of events considered in such situations is singular, unique. Accordingly, if a Q-model applies in a given class of such cases, it would have to be interpreted on Bayesian lines, if one can establish such a class. If not, then, as discussed below, another type of model may be possible, the singularized probabilistic (SP) models, some of which are time-dependent (TDSP). Each such model is unique to the individual situation considered, rather than applicable to a class of individual situations; and this uniqueness may pose difficulties for the scientific use of such models.

The QP/QS Principle and the Complementarity Principle

Beginning with Tversky and Kahneman's work in the 1970s–80s [e.g., 26], it has been primarily the presence of probabilistic data akin to those encountered in quantum physics that suggested using Q-models in cognitive psychology, decision science, and economics [e.g., 1, 2]12. Economic behavior may also involve psychological factors of the type analyzed by Tversky and Kahneman. (Kahneman was eventually awarded a Nobel Prize in economics.) The recourse to Q-models is motivated by the fact that one could not effectively use the classical (additive) rules but could use the quantum-mechanical-like (non-additive) rules for predicting the probabilities of the outcomes of certain psychological experiments, such as those involving responses to certain specific questions, asked sequentially. These responses were found to be statistically dependent on the order in which the questions were asked, which, again, in parallel with quantum mechanics, suggested that a non-commutative model and, in combination with the non-additive rules for calculating the probabilities involved, a Q-model could be more effective13. To clarify this parallel, in quantum mechanics, simultaneous measurements of, or simultaneous questions concerning, two or more complementary variables, such as the position and the momentum of a given quantum object, are mutually exclusive or incompatible. Correlatively, changing the order of measuring (of asking the question concerning) the position and then the momentum of a quantum object, in general, changes the outcomes and hence our predictions concerning them. This circumstance is reflected, experimentally, in the uncertainty relations, and mathematically, in the non-commutativity of the multiplication of the corresponding Hilbert-space operators in the formalism, and epistemologically, in the complementarity of these two measurements. One can, analogously, consider psychologically incompatible and, thus, complementary questions in psychology and attempt to handle the corresponding events statistically by a Q-model [e.g., 1, pp. 259–260]. The situation involves further complexities in and outside quantum physics, which I put aside here. I would like, however, to mention R. Spekkens's article, which introduced "a toy theory," based on the following principle, linked to complementarity: "the number of questions about the physical state of a system that are answered must always be equal to the number that are unanswered in a state of maximal knowledge. Many quantum phenomena are found to have analogs within this toy theory." Many but not all! For the theory expressly fails to reproduce some among the crucial features of quantum theory, specifically and intriguingly some of those related to correlations and entanglement, such as "violations of Bell inequalities and the existence of a Kochen-Specker theorem" [27, p. 032110]. This failure reminds us that models based on the existence of incompatible questions, in and outside physics, may mathematically differ from quantum mechanics.
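A minimal sketch of how noncommutativity yields such question-order effects: two yes/no questions are modeled as projectors onto different directions of a two-dimensional Hilbert space, and the probability of answering "yes" to both depends on the order in which they are posed. The angles and the initial state below are arbitrary illustrative choices, not parameters taken from any of the studies cited.

```python
import numpy as np

def projector(theta):
    """Projector onto the 'yes' answer of a question, modeled as a direction
    at angle theta in a real two-dimensional Hilbert space."""
    v = np.array([np.cos(theta), np.sin(theta)])
    return np.outer(v, v)

# Hypothetical angles: question A at 0.0 rad, question B at 0.6 rad,
# and an initial "belief state" at 0.2 rad.
P_A, P_B = projector(0.0), projector(0.6)
psi = np.array([np.cos(0.2), np.sin(0.2)])

def prob_yes_then_yes(P_first, P_second, state):
    """Joint probability of 'yes' to the first question and then 'yes' to the
    second, with the state projected (updated) after the first answer."""
    after_first = P_first @ state              # unnormalized state after the first 'yes'
    after_second = P_second @ after_first      # and after the second 'yes'
    return float(after_second @ after_second)  # Born/Lueders joint probability

print(prob_yes_then_yes(P_A, P_B, psi))  # ask A first, then B
print(prob_yes_then_yes(P_B, P_A, psi))  # ask B first, then A: generally different
```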

Q-models are, then, used to predict probabilities and correlations found in such experiments, without being expressly concerned with the principles characterizing the situations considered, but only assuming certain mathematical principles inherent in the quantum-mechanical formalism. Some among the principles of the first kind are, nevertheless, implicitly at work, specifically the QP/QS principle or the principle of incompatibility, in effect complementarity14. Whether these Q-models are required or C-models, models derived from the mathematics of classical physics, suffice remains, again, an open question, although it is difficult to assume that C-models could provide the non-additive probabilities necessary in such cases. A model alternative to that of quantum mechanics, possibly also free of quantum amplitudes and dealing directly with probabilities, is, in principle, possible even, as noted earlier, in quantum physics, but such a model is unlikely to be akin to those of classical physics. Thus, while they are both realist and causal, Bohmian models are mathematically different from those of classical physics. It may also be possible to construct a realist and causal mathematical model that would represent a deeper level of reality and that would have quantum mechanics as its limit, and then extend this model beyond physics [e.g., 30].

In any event, one can see the QP/QS principle, in part in conjunction with complementarity, as the main principle behind the use of Q-models beyond physics, accompanied, as in quantum mechanics, by the specific (non-additive) calculus of probability. Indeed, the QP/QS principle, along with the QD principle, was the starting principle for Heisenberg. The role of complementarity, only implicit initially by virtue of the non-commutative nature of Heisenberg's scheme, became apparent shortly thereafter, helped by Heisenberg's discovery of the uncertainty relations in 1927. It became clear that non-commutativity, the uncertainty relations, and complementarity were correlative, representing, respectively, the mathematical, physical, and epistemological aspects of the quantum-mechanical situation, defined by quantum discreteness (the QD principle). As noted earlier, quantum discreteness was eventually rethought by Bohr in terms of quantum phenomena, defined by what is observed in measuring instruments impacted by quantum objects, as opposed to the nature of quantum objects and processes, which are beyond conception and, hence, cannot be thought of as either discrete or continuous.

The psychological, economic, and decision-making phenomena treated by means of Q-models do not exhibit this type of irreducible discreteness or individuality. The processes that connect these phenomena are more akin to processes considered in classical physics, especially in chaos or complexity theory, again, often providing mathematical models, C-models, used in these fields. Now, assuming the defining role of, jointly, the QP/QS principle and the complementarity principle in considering these phenomena, could some form of the QD principle, correlative to the QP/QS principle in quantum mechanics, find its place in considering or even in order to derive Q-models in these fields? And if so, or in the first place, would the RWR principle, or a proto-RWR principle of the type used by Heisenberg, also be applicable? There are reasons to believe that such might be the case.

The RWR and QD Principles

Bohr thought that, along with the complementarity principle, the RWR principle might apply in biology and psychology. In considering biology, he argued as follows:

The existence of life must be considered as an elementary fact that cannot be explained, but must be taken as a starting point in biology, in a similar way as the quantum of action, which appears as an irrational element from the point of view of the classical mechanical physics, taken together with the existence of elementary particles, forms the foundation of atomic physics. The asserted impossibility of a physical or chemical explanation of the function peculiar to life would in this sense be analogous to the insufficiency of the mechanical analysis for the understanding of the stability of atoms. [31, p. 458; emphasis added]

The ultimate character of biological processes may, thus, be beyond representation or even conception, in accord with the RWR principle. Once the theory suspends accounting for the connections between the phenomena considered, these phenomena are unavoidably discrete, leading to the QD principle, and our predictions concerning them are unavoidably probabilistic, leading to the QP/QS principle. Our predictions concerning them are likely to follow a (non-additive) probability calculus of the type used in quantum probability, and thus are likely to require a Q-model. This is because, by the RWR or proto-RWR principle, it would be difficult or even impossible to treat the processes connecting the phenomena considered as either continuous or causal. Bohr's appeal to “an irrational element” is noteworthy, and I shall comment on it below. It is important that, as Bohr clearly implies here, this approach is possible even if the nature of biological processes is not physically quantum in the sense of being able to have physically quantum effects. (The ultimate constitution of all matter is quantum, but this constitution does not manifest itself apart from quantum experiments.) If they were quantum, such processes would be unrepresentable or inconceivable in Bohr's interpretation. At stake here, however, are parallel, rather than physically connected, situations that may require using the same type of mathematical models, Q-models, without possible connections between the systems defining these situations15.

A recent article by Haven and Khrennikov provides an instructive example of possible roles of both the RWR and QD principles in market economics, in their Q-modeling of market phenomena involving arbitrage as analogous to quantum tunneling [33]. The term "quantum tunneling" refers to a quantum object's capacity to "tunnel" through an energy barrier that it would not be able to surmount if it behaved classically. It is a quantum phenomenon par excellence. The quantum process itself behind any given case of quantum tunneling cannot be observed. One only ascertains that a particle can be found beyond the barrier, which is to say, that the corresponding measurement will register an impact of this particle on the measuring instrument beyond the barrier. Thus, in accord with the general situation that obtains in quantum mechanics, one deals with two discrete phenomena, connected by probabilistic or (in which case we need multiple trials) statistical predictions concerning the second event on the basis of the first. "Arbitrage" is the practice of taking advantage of a price difference between two or more markets: striking a combination of matching deals that capitalize on the imbalance, the profit being the difference between the market prices. An arbitrage is a transaction that involves no negative cash flow at any probabilistic or temporal state and a positive cash flow in at least one state; in simple terms, it is the possibility, ideally, of a risk-free profit at zero cost. In practice, there are always risks in arbitrage, sometimes minor (such as fluctuation of prices decreasing profit margins) and sometimes major (such as devaluation of a currency or derivative). In most ideal models, an arbitrage involves taking advantage of differences in the price of a single asset or identical cash-flows.
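For reference, the physics side of the analogy can be summarized by the standard (WKB-type) expression for the probability of tunneling through a rectangular barrier of height V_0 and width L by a particle of mass m and energy E < V_0; classically this probability is zero, whereas quantum mechanically the particle can be detected beyond the barrier with probability

```latex
T \;\approx\; e^{-2\kappa L},
\qquad
\kappa \;=\; \frac{\sqrt{2m\,(V_0 - E)}}{\hbar}.
```

How Haven and Khrennikov map financial quantities onto these parameters is specified in their paper [33] and is not reproduced here; only the general form of the analogy is at issue.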

Now, if arbitrage can be modeled analogously to quantum tunneling in physics, one might expect features analogous to those found in quantum tunneling, which dramatically exhibits the character of quantum phenomena. Haven and Khrennikov are primarily concerned with the use of Q-models in predicting the probabilities involved, by the QP/QS principle (accompanied by the non-additive calculus of probabilities), rather than with the QD and the RWR, or proto-RWR, principles. They do, however, offer some considerations concerning discreteness:

We believe that the equivalent of quantum discreteness in this paper corresponds to the idea that each act of arbitrage is a discrete event corresponding to the detection of a quantum system after it passed …the barrier. In reality arbitrage opportunities do not occur on a continuous time scale. They appear at discrete time spots and often experience very short lives. We would like to argue that it is the tunneling effect which is closely associated to the occurrence of arbitrage. …We also mentioned the wave function in the discussion above, and quantum discreteness is narrowly linked with quantum probabilities. [33, p. 4095]

This view at least allows for an interpretation of the phenomenon of arbitrage in terms of the QD and the RWR principles, even if it does not require it. Haven and Khrennikov, while, again, allowing for the applicability of the QD principle, do not appear to subscribe to the RWR principle, or even to the proto-RWR principle16. In effect, however, they follow the proto-RWR principle, insofar as they are not concerned with representing how arbitrage actually occurs, any more than Heisenberg was concerned with representing the behavior of the electron in the hydrogen atom in deriving his formalism. They are only concerned with predicting the probabilities or statistics of future events of arbitrage.

Thus, situations governed by the QD, QP/QS, and RWR (or proto-RWR) principles are possible in economics, psychology, and decision science, and just as in quantum mechanics, they may allow for either a statistical or Bayesian view of the Q-model used. When finite-dimensional Q-models (dealing with discrete variables, such as spin) are used, as they often are in these fields, one can also consider the application of the principles of quantum information theory. While I cannot address the subject in detail, the operational framework, used in this field, merits a brief detour. This framework allows one to arrive at Q-models in a more rigorous and first-principle-like way, by using the rules governing the structure of operational devices, "circuits," via recent work on monoidal categories and linear logic [13–15, 34].

According to Chiribella et al.: "The operational-probabilistic framework combines the operational language of circuits with the toolbox of probability theory: on the one hand experiments are described by circuits resulting from the connection of physical devices, on the other hand each device in the circuit can have classical outcomes and the theory provides the probability distribution of outcomes when the devices are connected to form closed circuits (that is, circuits that start with a preparation and end with a measurement)" [13, p. 3]. A circuit is an arrangement of measuring instruments capable of quantum measurements and predictions, which are, again, probabilistic or statistical, and sometimes, as in the EPR type of experiments, are correlated, which gives a circuit a very specific architecture, corresponding to quantum but not to classical experiments. A realist representation of a circuit is possible because a circuit is described by classical physics, even though it interacts with quantum objects, and thus has a quantum stratum, enabling this interaction. Hence, the information obtained by means of a circuit is physically classical, too, but the architecture and mode of transmission of this information are quantum: they cannot be generated by a classical process.
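A minimal sketch of a "closed circuit" in this operational sense (the preparation and measurement below are arbitrary illustrative choices, not taken from Chiribella et al.): a preparation device fixes a state, a measurement device has classical outcomes, and the closed circuit amounts to the probability distribution over those outcomes.

```python
import numpy as np

def preparation():
    """Preparation device: outputs a state, here an arbitrary illustrative
    qubit density matrix."""
    psi = np.array([1.0, 1.0j]) / np.sqrt(2)
    return np.outer(psi, psi.conj())

def measurement():
    """Measurement device: a list of effects, one per classical outcome
    (here the two projectors of a computational-basis measurement)."""
    e0 = np.array([[1, 0], [0, 0]], dtype=complex)
    e1 = np.array([[0, 0], [0, 1]], dtype=complex)
    return [e0, e1]

def closed_circuit(rho, effects):
    """A closed circuit: a preparation wired into a measurement. Its 'value'
    is the probability distribution over the classical outcomes."""
    probs = np.array([np.trace(effect @ rho).real for effect in effects])
    return probs / probs.sum()  # normalize against rounding error

print(closed_circuit(preparation(), measurement()))  # e.g. [0.5 0.5]
```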

As discussed earlier, Heisenberg found the formalism of quantum mechanics by adopting, in addition to the QD, QP/QS, and proto-RWR principles, the mathematical correspondence principle and, by the latter principle, using the equations of classical mechanics while changing the variables in these equations. This principle was not exactly a first principle. In particular, it depended on formally adopting the equations of classical mechanics, while one might prefer these equations to be a consequence of fundamental quantum principles. Heisenberg's variables were new, which was his great discovery. But they were arrived at more as a guess, a logical guess, fitting the probabilities of transitions between the energy levels of the electron in the hydrogen atom with which he worked. In the operational framework, one derives finite-dimensional quantum theory in a more first-principle-like way, in particular independently of classical mechanics (which does not exist for discrete variables, such as spin). This derivation is made possible by applying the rules that define the operational language of circuits, as the language of monoidal categories and linear logic, thus giving a mathematical structure to operational circuits themselves and, in effect, to measuring instruments [13, p. 4, 33]. These rules are more empirical, but they are not completely empirical (which no rules may ever be), because circuits are given a mathematical structure, from which the mathematical architecture of the theory emerges17. The resulting formalism is equivalent to the standard Hilbert-space formalism. As in Heisenberg's case, one only deals with “mathematical representations” providing the probabilities or statistics of the outcomes of discrete quantum experiments, in accord with the QD and QP/QS principles, without providing a representation of quantum processes themselves, in accord with the RWR principle.

In the areas of social science that concern human subjects, establishing the mathematical architecture of such “circuits” is a formidable task. However, given important recent work along the lines of category theory beyond physics [e.g., 35], this approach may prove viable in enabling a principle approach to Q-modeling outside physics18.

Q-Theories as Rational Theories of the Irrational

As indicated earlier, while the main reasons for using Q-models in psychology, economics, and decision science are due to the quantum-like nature or calculus of the probabilities associated with predicting certain phenomena, the underlying dynamics of the cognitive or psychological processes leading to each such phenomenon individually might, in principle, be causal or partially causal. This dynamics might also not be causal, especially given the quantum (non-additive) character of the probabilities involved. If it is causal or partially causal, then, unlike in the case of quantum processes in non-realist interpretations, an analysis of these psychological processes may be possible, rather than “in principle excluded” [3, v. 2, p. 62]. This is because one might expect psychological, social, or economic reasons to shape these situations, and one of the tasks of analyzing them is to explain these reasons, an imperative that is hard to avoid, as is clearly apparent in Tversky and Kahneman's articles [26, 37] or in Pothos and Busemeyer's survey [1].

Psychological, social, or economic research using Q-models may renounce this task, especially in statistical analysis, thus in effect assuming a form of the proto-RWR principle, akin to that used by Heisenberg. Even in this case, however, the question would still arise to what degree the QP/QS, QD, and (strictly) RWR principles, or the principles of quantum information theory, could apply in these fields, in particular in considering individual situations. As explained earlier, in quantum mechanics, in non-realist interpretations, the latter could either be treated on Bayesian lines or, in statistical interpretations, assumed to be random, an assumption that would, again, be difficult to make in the fields in question at the moment. Some considerations of discreteness are unavoidable because, as noted, probability has an irreducibly futural and discrete character, dealing as it does with estimates concerning discrete future events.

It is a more complex question whether one can renounce, as one does in quantum mechanics in non-realist interpretations, considering or even assuming the existence of continuous processes connecting these events. I would surmise that this may be the case and that our brains may work, at least sometimes, in accordance with the QD, the QP/QS, and the RWR principles. This means that they would not be relying on and calculating hidden causality connecting events but would instead function by relying on the quantum-like workings of probabilities and correlations. This type of brain functioning would define what may be called a Bayesian Q-brain, which would require the corresponding Bayesian models. Importantly, however, this kind of Bayesian brain is fundamentally different from the rational Bayesian agents associated with the term Bayesian in cognitive psychology. Indeed, Q-models are in part advanced in these fields against this concept of human agency. A Bayesian Q-brain need not always function “rationally,” at least not in accordance with any single concept of rationality. A corresponding Bayesian Q-model, if possible, would allow one to predict the outcomes of decisions governed by the brain processes of the individual subjects involved without having, even conjecturally, full access to these processes, by the RWR principle. Nor do those who make these decisions have this access: these processes are unconscious, and, if one assumes the RWR principle, this part of the unconscious is not causal or “rational” (in its own way), as S. Freud, for example, saw it [38]. Freud's thinking on this point was, however, ultimately more complex, even if against his own grain.
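
What “Bayesian” would mean for such a Q-model can be illustrated by a minimal sketch, an assumption-laden toy example rather than a model of any actual cognitive data: the agent's state of expectation is a density matrix, and registering a new event updates it by the Lüders rule, the quantum counterpart of Bayesian conditioning, rather than by classical Bayes' theorem.

```python
import numpy as np

def lueders_update(rho, E):
    """Quantum analog of Bayesian conditioning: update the state rho on the
    registered outcome associated with the projector E (Lueders rule)."""
    p = np.real(np.trace(E @ rho))   # probability of the outcome
    return (E @ rho @ E) / p, p      # post-observation state, outcome probability

# A hypothetical "state of expectation" over two options, with interference
# encoded in the off-diagonal terms of the density matrix.
psi = np.array([np.sqrt(0.6), np.sqrt(0.4)], dtype=complex)
rho = np.outer(psi, psi.conj())

# Registering an event: a yes/no question represented by a projector that does
# not commute with the original basis (here, onto the |+> direction).
plus = np.array([1, 1], dtype=complex) / np.sqrt(2)
E_yes = np.outer(plus, plus.conj())

rho_after, p_yes = lueders_update(rho, E_yes)
print("P(yes) =", round(p_yes, 3))
print("updated expectation of option 0:", round(np.real(rho_after[0, 0]), 3))
```

Because the projector does not commute with the basis in which the initial expectation was formed, the update redistributes probability in a way classical conditioning cannot, which is precisely the kind of behavior Q-models are invoked to capture.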

It is instructive to return, in this context, to Bohr's invocation of “an irrational element,” in the passage cited above and repeated elsewhere in his writings. The idea and even the language of irrationality have often been seen as problematic by Bohr's critics and even by some of his advocates. I would argue that this assessment results from a misunderstanding of Bohr's meaning. This “irrationality” is not any “irrationality” of quantum mechanics, which Bohr saw as a rational theory, a “rational quantum mechanics,” and for whose rational character he argued throughout his writing (e.g., [3, v. 1, p. 48; v. 2, p. 63]). However, he did see it as a rational theory of something—the nature of quantum objects and processes—that is inaccessible to rational thinking, or at least to a rational representation. If, as he says, the quantum of action [h] “appears as an irrational element from the point of view of the classical mechanical physics,” this only means that it cannot be rationally incorporated into the latter [31, p. 458].

Tversky and Kahneman's arguments, and related ones, are also sometimes seen as pointing to “irrational” elements in decision-making, replacing purportedly “rational” Bayesian agents with at least partially “irrational” Bayesian agents. The “rational” Bayesian agents, as explained above, use probabilistic reasoning subject to updating their estimates on the basis of new information (which defines the Bayesian approach to probability). The irrationality of “irrational” Bayesian agents may be divided into three main, sometimes overlapping, types. The first type is in effect a form of rationality, albeit one different from the rationality presumed to be dominant in the class of situations considered, say, the rationality of maximizing one's monetary benefits. In addition, this alternative rationality may be unconscious. The second type of irrationality refers to something that could be explained, but that defies being explained as anything assumed beforehand to be rational, say, as a form of rational behavior. This irrationality may, upon further analysis, reveal itself to be irrationality of the first type, but it may also be an alternative form of rationality19. Finally, the third type of irrationality is that invoked by Bohr: a realist theory cannot incorporate it in its handling of the corresponding phenomena, while a non-realist Q-model or theory can make it part of its probabilistically predictive scheme without explaining it. In this way, the QD, QP (or, if averaging is possible, QS), and RWR principles can be brought together in this domain.
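
The first two types of “irrationality” are exactly what Q-models are often brought in to handle. A standard illustration, given here only as a minimal numerical sketch in the spirit of quantum-cognition accounts such as [1] and not as their actual model, is the conjunction fallacy: when judgments about two properties are represented by non-commuting projectors, the probability of sequentially affirming both can exceed the probability of affirming one of them directly, reproducing the pattern Tversky and Kahneman observed.

```python
import numpy as np

def projector(theta):
    """Rank-one projector onto the direction (cos theta, sin theta) in a 2D real space."""
    v = np.array([np.cos(theta), np.sin(theta)])
    return np.outer(v, v)

# Hypothetical angles: the initial belief state and the two judgment directions
# ("bank teller" and "feminist" in the Linda problem), chosen only for illustration.
psi = np.array([np.cos(np.radians(80)), np.sin(np.radians(80))])  # belief state
P_bank = projector(np.radians(0))
P_fem = projector(np.radians(40))

p_bank_direct = np.linalg.norm(P_bank @ psi) ** 2              # judge "bank teller" alone
p_fem_then_bank = np.linalg.norm(P_bank @ (P_fem @ psi)) ** 2  # judge "feminist", then "bank teller"

print(f"P(bank teller)                = {p_bank_direct:.3f}")
print(f"P(feminist, then bank teller) = {p_fem_then_bank:.3f}")  # larger: the "fallacy"
```

Nothing in the sketch represents the cognitive process itself; it only shows that a non-additive, order-dependent probability calculus can accommodate, without explaining, behavior that looks irrational from a classical Bayesian standpoint.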

There is yet another possibility, which leads to a different type of models or theories, conforming to the QD, QP (but not QS), and RWR principles. I shall call such models or theories singularized probabilistic (SP) models or theories, keeping in mind their non-realist, RWR-principle-based character. Realist SP-models are possible, but I shall not be concerned with them. SP-models may also be time-dependent (TDSP). Such models can only be briefly sketched here in conceptual and somewhat abstract terms, but their possibility is intriguing. SP- or TDSP-models need not be mathematically related to Q-models, but they might be, given the shared principles on which they are based.

Singularized Probabilistic (SP) Theories and Models

Let us recall that, as reflected in the complementarity principle, in quantum mechanics there is no single, uniform physical law applicable to quantum behavior in all contexts, while the same mathematical formalism or model can be used in all contexts. Depending on whether an interpretation is statistical or (Bayesian) probabilistic, the individual quantum behavior is either assumed to be random or to be subject to the probabilistic law whose application is defined by the context. By contrast, in the case of an SP-model or theory, the following situation obtains. While, as in quantum physics, there is no single uniform physical law, realist or not, each individual behavior obeys its own singular law, defined by its own mathematical model, rather than conforming to one or another contextual probabilistic or statistical law, from a (determinable) set of such laws determined by the theory, using a single mathematical model. Under the RWR principle, assumed here for SP-models, such a model still does not represent the reality of the ultimate processes considered, which makes the absence of not only determinism but also causality automatic, just as in quantum mechanics under the RWR principle. One cannot, however, any longer adopt a statistical view, which assumes multiplicities of events that could be averaged (in quantum mechanics, contextually). In each case, only a Bayesian view of the corresponding (unique) model is possible. Such individual laws and accompanying mathematical models may also be changing in time, a change registered each time a new observation occurs. If so, the corresponding model or theory becomes time-dependent, TDSP.

The concept of an SP and especially a TDSP model or theory is a radical idea, to my knowledge rarely, if ever, entertained, at least in science20. Indeed, it is not clear whether such theories and, especially, the mathematical models defined by them are scientifically viable, particularly if the corresponding mathematical laws are assumed to be changing in time, possibly on small scales. For an effective scientific practice to be possible, one might need regularities beyond those found in each singular situation, for which a mathematical model, unique to it, would be introduced, say, in order to predict the outcome of events. Such changes of laws and models could, in principle, be governed mathematically, that is, have an overall mathematical model. Thus, one could have a set of models mathematically parameterized so as to allow one to use them for different individual situations and to adjust them to make effective predictions in all of these situations, as sketched below. If not, then each case would require its own mathematical model. Would mathematical-experimental sciences, as they are practiced now, still be possible, then?
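
The following toy sketch, purely illustrative and tied to no actual data, conveys one way such a parameterized TDSP scheme could look: each individual case carries its own predictive model (here, just its own outcome probability), and that model is re-estimated after every new observation of that case, so that the "law" is both singular and time-dependent.

```python
import random

class SingularModel:
    """A toy per-case predictive model: its only parameter is the probability of
    a 'positive' next event, re-fitted after each new observation of this case."""
    def __init__(self, prior=0.5):
        self.p = prior
        self.history = []

    def predict(self):
        return self.p

    def observe(self, outcome):
        # Time dependence: the singular law is revised with every observation.
        self.history.append(outcome)
        self.p = (1 + sum(self.history)) / (2 + len(self.history))  # Laplace-style re-estimate

# Each individual case gets its own model; no statistics are pooled across cases.
cases = {name: SingularModel() for name in ("case_A", "case_B")}
random.seed(0)
for step in range(5):
    for name, model in cases.items():
        outcome = random.random() < 0.7 if name == "case_A" else random.random() < 0.2
        model.observe(int(outcome))
print({name: round(m.predict(), 2) for name, m in cases.items()})
```

Whether anything like this could support an effective scientific practice, rather than an open-ended proliferation of singular models, is precisely the question raised above.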

Furthermore, there might, in a given domain, be individual cases whose character will defeat our attempts to treat them by mathematical means. Indeed, this is already so in the case of individual quantum processes if one adopts a statistical view, according to which each individual process is random, beyond the law. Now, however, there would not be statistical regularities, of the type found in quantum physics, applicable to multiplicities of repeatable cases (handled, moreover, by the same model, even if contextually), because there would be no repeatable cases in any meaningful sense. There would be neither statistical averaging nor individual mathematical probabilistic treatment. This situation may be more familiar in literature, which is concerned with the particular or the singular, for example, with the unique life history of a novel's protagonist. One also encounters this singularity or uniqueness in life itself. Such histories resist and even preclude mathematical handling and statistical averaging, which is, again, allowed by the otherwise equally unique histories of individual quantum objects (histories that cannot be thought of as classical trajectories of motion). But they may become part of science, at least outside physics and perhaps especially in psychology (which often deals with the same human conditions as literature), part of a science that will combine science and non-science, or at least combine mathematical modeling, whether of the more standard or the SP/TDSP type, with nonmathematical modeling. Indeed, as just indicated, SP/TDSP-modeling already poses complexities for scientific practice. Could this situation also emerge in physics, for example, in dealing with quantum gravity? This is not inconceivable. If it does, it will not end mathematical modeling in physics or, again, beyond it, or the mathematical-experimental character of modern science, which has defined it beginning with Galileo. It might, however, change both, just as happened in the case of quantum theory, which not only led to a revolutionary transformation—physical, mathematical, and philosophical—of physics itself but also opened new possibilities for scientific thinking and mathematical modeling beyond physics.

Author Contributions

The author confirms being the sole contributor of this work and approved it for publication.

Funding

This work was funded by The Purdue Distinguished Professorship Research Fund.

Conflict of Interest Statement

The author declares that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Acknowledgments

I would like to thank Mauro G. D'Ariano, Emmanuel Haven, Gregg Jaeger, and Andrei Khrennikov for helpful discussions concerning the subjects considered in this article. I am grateful to both readers of the article for their constructive criticisms, especially one of these readers, who made helpful specific suggestions for revisions and who directed my attention to Robert Spekkens's article “Evidence for the epistemic view of quantum states: A toy theory.”

Footnotes

1. ^I shall only discuss the standard quantum mechanics or quantum field theory, bypassing alternative theories of quantum phenomena, such as Bohmian theories, which are sometimes used in mathematical modeling outside physics, but which would require a separate consideration. By “quantum phenomena” I refer to those physical phenomena in considering which Planck's constant, h, must be taken into account, and by “quantum objects” (thus different from quantum phenomena) to those entities in nature that are responsible for the appearance of quantum phenomena, manifested in measuring instruments involved in quantum experiments or in certain natural phenomena.

2. ^The discussion to follow in part builds on two previous articles [5, 6], but only in part: overall the present argument is different, especially (but not exclusively) by virtue of considering SP-models.

3. ^I bypass more technical definitions, found in standard texts and reference sources.

4. ^There are alternative formalisms, such as those in terms of C*-algebras or more recently category theory, thus far, all mathematically equivalent to the Hilbert-space formalism.

5. ^The designation “the spirit of Copenhagen” is preferable to a more common “the Copenhagen interpretation,” because there is no single Copenhagen interpretation.

6. ^That does not mean that an alternative way of doing so, for example, by bypassing amplitudes or by using some alternative formalism (not mathematically equivalent to the standard one), is impossible.

7. ^Causality is, thus, an ontological category, characterizing the nature of reality. It proceeds by connecting a cause (an event, phenomenon, a state of a system, or force) to an effect, while the principle of causality connects an event to a cause. Determinism is assumed here to be an epistemological category. It designates our ability to predict the state of a system (ideally) exactly at any moment of time once we know its state at a given moment of time. In classical mechanics (which deals with a small number of objects), causality and determinism coincide. Once a classical system is large, one can no longer predict its causal behavior exactly. In other words, a system may be causal without our theory of its behavior being deterministic, as is the case, for example, in classical statistical physics or chaos theory. Causal influences are generally, although not always, assumed to propagate from past or present towards future. Relativity theory further precludes the propagation of physical influences faster than the speed of light in a vacuum, c. Principle theories do not require causality, which is, again, difficult to assume in quantum physics without, however, violating relativity or more generally the principle of locality, which requires that all physical influences are local (still under the assumption that they cannot, locally, propagate faster than c).

8. ^Among the key earlier approaches are [16], Fuchs's work, which “mutated” to the program of quantum Bayesianism or QBism [17], and [18].

9. ^Randomness may be defined by this impossibility. This concept of randomness is not ontological, because one cannot ascertain the reality of this randomness, but epistemological. It is ultimately a matter of assumption or belief, practically justified in a given interpretation.

10. ^I have discussed the subject, also in relation to complementarity, in Plotnitsky [9, pp. 136–54]. These connections also bring in a related (EPR-correlation) concept, “contextuality.” This concept plays a significant role in Q-modeling beyond physics [1, pp. 363–5; 21].

11. ^I have discussed the role of principles of quantum information theory beyond physics in Plotnitsky [6].

12. ^I also refer to these works for more detailed discussions of the ways in which Q-models are used in these fields.

13. ^As noted earlier, this does not mean that such probabilities could not be predicted by means of alternative models even in quantum physics.

14. ^Complementarity has received some attention outside physics, beginning with Bohr's own (tentative) suggestions. Inspired by Bohr, others have proposed using the concept in philosophy, biology, and psychology. See Plotnitsky [28, pp. 158–66] and [29].

15. ^There are several recent arguments for such connections, the most prominent of which is arguably that of Penrose [32], developed in several subsequent studies. The model itself that Penrose has in mind is, thus far, only mathematically conjectured, following certain approaches to quantum gravity.

16. ^As indicated earlier, elsewhere Khrennikov argued for a classical-like model at the ultimate level of the constitution of nature in physics [30].

17. ^See also Plotnitsky [9, pp. 248–58] and Hardy [15].

18. ^See also a recent approach to representing sensation-perception dynamics in terms of quantum-like mental instruments, which are akin to “circuits,” in Khrennikov [36].

19. ^Some might still see, as Freud did, this “irrationality” as a form of unconscious “rationality.” Once again, however, Freud, against his own grain, could not ultimately avoid giving the unconscious a stratum that is beyond representation, if not conception.

20. ^Something akin to this possibility has been suggested in physics by Unger and Smolin [39], but in a different context and based on a very different set of principles than those adopted here, most especially because, as against the present argument, they assume realism and causality.

References

1. Pothos E, Busemeyer JR. Can quantum probability provide a new direction for cognitive modeling? Behav Brain Sci. (2013) 36:255. doi: 10.1017/S0140525X12001525

2. Haven E, Khrennikov A. Quantum Social Science. Cambridge: Cambridge University Press (2013).

3. Bohr N. The Philosophical Writings of Niels Bohr, 3 vols. Woodbridge, CT: Ox Bow Press (1987).

4. Heisenberg W. Quantum-theoretical re-interpretation of kinematical and mechanical relations. In: Van der Waerden BL, editor, Sources of Quantum Mechanics. New York, NY: Dover (1925/1968). p. 261–77.

5. Plotnitsky A. The visualizable, the representable, and the inconceivable: realist and non-realist mathematical models in physics and beyond. Philos Trans R Soc A (2016) 374:20150101. doi: 10.1098/rsta.2015.0101

6. Plotnitsky A. Quantum principles and mathematical models in physics and beyond. In: Haven E, Khrennikov A, editors. The Palgrave Book of Quantum Models in Social Science. London: Palgrave-MacMillan (2016). p. 335–57.

7. Von Neumann J. Mathematical Foundations of Quantum Mechanics, Transl. by Beyer, RT. Princeton, NJ: Princeton University Press (1932/1983).

8. Einstein A. What is the Theory of Relativity. The London Times, 28 November, 1919. In Einstein, A. Ideas and Opinions. New York, NY: Bonanza Books (1919/1954).

9. Plotnitsky A. The Principles of Quantum Theory, from Planck's Quanta to the Higgs Boson: The Nature of Quantum Reality and the Spirit of Copenhagen. New York, NY: Springer/Nature (2016).

10. Heisenberg W. The Physical Principles of the Quantum Theory. Transl. by Eckhart K, Hoyt FC. New York, NY: Dover (1930/1949).

11. Plotnitsky A, Khrennikov A. Reality without realism: on the ontological and epistemological architecture of quantum mechanics. arXiv:1502.06310 (2015). doi: 10.1007/s10701-015-9942-1

12. Kant I. Critique of Pure Reason, Transl. by Guyer P, Wood, AW. Cambridge: Cambridge University Press (1997).

13. Chiribella G, D'Ariano GM, Perinotti P. Informational derivation of quantum theory. Phys Rev A (2011) 84:012311. doi: 10.1103/PhysRevA.84.012311

14. D'Ariano GM, Chiribella G, Perinotti P. Quantum Theory from First Principles: An Informational Approach. Cambridge: Cambridge University Press (2017).

15. Hardy L. Foliable operational structures for general probabilistic theory. In: Halvorson H, editor. Deep Beauty: Understanding the Quantum World through Mathematical Innovation. Cambridge: Cambridge University Press (2011). p. 409–42. doi: 10.1017/CBO9780511976971.013

16. Zeilinger A. A foundational principle for quantum mechanics. Found. Phys. (1999) 29:631–43. doi: 10.1023/A:1018820410908

17. Fuchs CA, Mermin ND, Schack R. An introduction to QBism with an application to the locality of quantum mechanics. Am J Phys. (2014) 82:749. doi: 10.1119/1.4874855

18. Hardy L. Quantum mechanics from five reasonable axioms. arXiv:quant-ph/0101012 (2001).

19. Pauli W. Writings on physics and philosophy. Berlin: Springer (1994).

20. Bohr N. Can quantum-mechanical description of physical reality be considered complete? Phys Rev. (1935) 48:696.

21. Dzhafarov E, Jordan SR, Zhang R, Cervantes V. (editors). Reality, Contextuality, and Probability in Quantum Theory and Beyond. Singapore: World Scientific (2016). p. 93–138.

22. Einstein A, Podolsky B, Rosen N. Can quantum-mechanical description of physical reality be considered complete? In: Wheeler JA, Zurek WH, editors, Quantum Theory and Measurement, Princeton, NJ: Princeton University Press (1935/1983). p. 138–41; 152–67.

23. Schrödinger E. The present situation in quantum mechanics. In Wheeler JA and Zurek, WH editors. Quantum Theory and Measurement, Princeton, NJ: Princeton University Press (1935/1983), 152–67.

24. Hardy L. Towards quantum gravity: a framework for probabilistic theories with non-fixed causal structure. J Phys. (2007) A40:3081–99. doi: 10.1088/1751-8113/40/12/S12

25. D'Ariano GM, Perinotti P. Derivation of the Dirac equation from principles of information processing. Phys Rev A (2014) 90:062106. doi: 10.1103/PhysRevA.90.062106

26. Tversky A, Kahneman D. Availability: a heuristic for judging frequency and probability. Cogn Psychol. (1973) 5:207. doi: 10.1016/0010-0285(73)90033-9

27. Spekkens R. Evidence for the epistemic view of quantum states: a toy theory. Phys Rev A (2007) 75:032110. doi: 10.1103/PhysRevA.75.032110

28. Plotnitsky A. Niels Bohr and Complementarity: An Introduction. New York, NY: Springer (2012).

29. Wang Z, Busemeyer J. Reintroducing the concept of complementarity into psychology. Front Psychol. (2015) 6:1822. doi: 10.3389/fpsyg.2015.01822

30. Khrennikov A. Quantum probabilities and violation of CHSH-inequality from classical random signals and threshold type detection scheme. Progr. Theor. Phys. (2012) 128:31. doi: 10.1143/PTP.128.31

31. Bohr N. Light and life. Nature (1933) 131:458.

32. Penrose R. The Emperor's New Mind. Oxford: Oxford University Press (1995).

33. Haven E, Khrennikov A. Quantum-like tunnelling and levels of arbitrage. Int J Theor Phys. (2013). 52:4083. doi: 10.1007/s10773-013-1722-0

34. Coecke B. Quantum picturalism. Contemp Phys. (2009) 51:59–83. doi: 10.1080/00107510903257624

35. Abramsky S, Brandenburger A. The sheaf-theoretic structure of non-locality and contextuality. arXiv:1102.0264 [quant-ph] (2011). doi: 10.1088/1367-2630/13/11/113036

36. Khrennikov A. Quantum-like modeling of cognition. Front Phys. (2015) 22:77. doi: 10.3389/fphy.2015.00077

37. Tversky A, Kahneman D. Extensional versus intuitive reasoning: the conjunction fallacy in probability judgment. Psychol. Rev. (1983) 90:293. doi: 10.1037/0033-295X.90.4.293

38. Freud S. General Psychological Theory: Papers on Metapsychology. New York, NY: Touchstone (2008).

39. Unger RM, Smolin L. The Singular Universe and the Reality of Time: A Proposal in Natural Philosophy. Cambridge: Cambridge University Press (2014).

Keywords: principles, models, probability, statistics, reality, realism

Citation: Plotnitsky A (2017) The Real and the Mathematical in Quantum Modeling: From Principles to Models and from Models to Principles. Front. Phys. 5:19. doi: 10.3389/fphy.2017.00019

Received: 16 November 2016; Accepted: 26 May 2017;
Published: 19 June 2017.

Edited by:

Emmanuel E. Haven, University of Leicester, United Kingdom

Reviewed by:

Marco G. Mazza, Max Planck Institute for Dynamics and Self Organization (MPG), Germany
Gregg Jaeger, Boston University, United States

Copyright © 2017 Plotnitsky. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

*Correspondence: Arkady Plotnitsky, plotnits@purdue.edu
