Consciousness, measurement, and quantum mechanics – Part 7


(See Part 1, Part 2, Part 3, Part 4, Part 5, and Part 6. Also I am going to suspend the limit of three comments per post for this series of posts because it is a topic that benefits from back and forth discussions.)

In order to fully appreciate the role of Heisenberg’s uncertainty principle in the question of objective reality and measurement, a highly truncated history of quantum mechanics might help.

The theory traces its beginnings to 1900 when Max Planck decided to assume that the material that made up the walls of the cavity inside a body that was at a uniform temperature (called a blackbody) could be treated as oscillators that could only absorb and radiate energy in discrete amounts (‘quanta’) and not continuously as had been previously assumed. The size of these quanta depended upon the frequency of oscillation as well as a new constant he introduced that has come to be known as Planck’s constant h. The value of this constant was very small, which is why it had long seemed that the energy could be absorbed and radiated in any amount. This was a purely ad hoc move on his part that had no theoretical justification whatsoever except that it gave the correct result for the radiation spectrum of the energy emitted by the blackbody. Planck himself viewed it as a purely mathematical trick that had no basis in reality but was just a placeholder until a real theory came along. But as time went on and the idea of quanta caught on, he began to think that it could represent something real.
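As a rough illustration of why this graininess went unnoticed for so long, the short Python sketch below computes the size of a single quantum for visible light (the choice of green light is arbitrary, purely for scale):

    # Size of one of Planck's energy quanta, E = h * f
    h = 6.626e-34          # Planck's constant in joule-seconds
    f = 5.5e14             # frequency of green light in hertz (illustrative)
    quantum = h * f
    print(quantum)         # about 3.6e-19 joules per quantum
    print(1.0 / quantum)   # roughly 2.7e18 quanta are needed to make up one joule

With quanta that small, the energy exchanged in any everyday process looks perfectly continuous.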

In 1905 Einstein proposed that light energy also came in quanta and this was used to explain the photoelectric effect, which was what he was awarded the Nobel prize for in 1921. Then Niels Bohr in 1913 used the idea of quantization to come up with a model of simple atoms that explained some of their radiation spectra. Both of their works used Planck’s constant.

Erwin Schrodinger proposed his eponymous equation in 1926 and it set the field of quantum mechanics in motion because it laid the foundations of a real theory that enabled one to systematically set about calculating observables. At almost the same time (his paper appeared in 1925), Werner Heisenberg came up with an alternative formulation based on matrices. (Later on P. A. M. Dirac showed that the two formulations were equivalent.) But Schrodinger’s theory was in the form of a differential equation that enabled one to calculate the wave function of a particle moving under the influence of a potential. Differential equations and wave behavior were both very familiar to physicists, and thus Schrodinger’s approach was more easily accessible and more widely used.

But there was from the beginning confusion about what the wave function in Schrodinger’s equation meant, what information it carried, and what it told us about the world. The fact that it was a complex quantity was a hindrance to creating a physical picture. It was Max Born’s interpretation that the square of the absolute value of the wave function (a real quantity) represented a probability density that enabled the connection of the wave function to observables.

What came to be known as the Copenhagen interpretation, advocated by Bohr, became the dominant view, and it placed heavy emphasis on the measuring device playing the role of the observer. Bohr advocated something called ‘complementarity’ in dealing with measurements.

By 1935 conceptual understanding of the quantum theory was dominated by Niels Bohr’s ideas concerning complementarity. Those ideas centered on observation and measurement in the quantum domain. According to Bohr’s views at that time, observing a quantum object involves an uncontrollable physical interaction with a measuring device that affects both systems. The picture here is of a tiny object banging into a big apparatus. The effect this produces on the measuring instrument is what issues in the measurement “result” which, because it is uncontrollable, can only be predicted statistically. The effect experienced by the quantum object limits what other quantities can be co-measured with precision. According to complementarity when we observe the position of an object, we affect its momentum uncontrollably. Thus we cannot determine both position and momentum precisely. A similar situation arises for the simultaneous determination of energy and time. Thus complementarity involves a doctrine of uncontrollable physical interaction that, according to Bohr, underwrites the Heisenberg uncertainty relations and is also the source of the statistical character of the quantum theory.

In the early days, much of the discussion centered on the maximum information that one could glean from measurements on a single particle. But this posed a problem when it came to simultaneously obtaining both the position and momentum of that particle, as illustrated in a thought experiment posed by Heisenberg in his proposal concerning the uncertainty principle.

The problem can be seen by asking what we mean by measuring the position of an object. We can represent the act of measurement as looking at the object through a microscope. How precisely the position can be located is given by the uncertainty Δx, which is set by what is known as the resolving power of the microscope and depends upon the diameter of the microscope lens and the wavelength of the light used.

But what does ‘looking’ mean in practice? It means that we send in a photon from some light source that bounces off the particle and enters the lens of a microscope. But when the photon bounces off the particle, it changes that particle’s momentum. By measuring the recoil of the photon, we can calculate the original momentum of the particle. But since the lens of the microscope has a certain width, that limits the precision of the calculation of the recoil, and this leads to an uncertainty in the calculated momentum of the particle given by Δp that also depends upon the width of the microscope lens and the wavelength of light used. What Heisenberg showed was that the product ΔxΔp was independent of the lens width and light frequency but had to be at least of the order of Planck’s constant h. It could not be the case that Δx=Δp=0, i.e., we could never know x and p exactly simultaneously. Hence if you measure one of them exactly so that its uncertainty tends to zero, the other one is completely unknown, i.e., its uncertainty tends to infinity, so that the product of uncertainties is non-zero as required.
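To put rough numbers on this, here is a minimal Python sketch using the commonly quoted form of the bound, ΔxΔp ≥ ħ/2 with ħ = h/2π (the order-of-magnitude statement above uses h itself; the numerical factor does not change the conclusion):

    import math

    h = 6.626e-34              # Planck's constant, J s
    hbar = h / (2 * math.pi)

    def min_dp(dx):
        # Smallest momentum uncertainty allowed when the position
        # uncertainty is dx, from dx * dp >= hbar / 2
        return hbar / (2 * dx)

    dp = min_dp(1e-10)         # localize a particle to atomic size, 0.1 nanometre
    print(dp)                  # ~5.3e-25 kg m/s
    print(dp / 9.11e-31)       # for an electron, a velocity uncertainty of ~5.8e5 m/s
    print(dp / 1e-3)           # for a 1 gram object, ~5.3e-22 m/s, utterly negligible

The same momentum uncertainty that is enormous for an electron is imperceptible for anything macroscopic.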

The thorny question was whether the particle did have an exact position and momentum (that we could not determine because of the unavoidable disturbances caused by the measurement) or whether the particle did not have a position or momentum until it was measured. Those who believed in an objective reality thought it was the former, while the Copenhagen approach assumed the latter, but there did not seem to be any way of distinguishing the two approaches experimentally. Since the wave function solution of Schrodinger’s equation had the property that the position and momentum could not both be precisely specified simultaneously, it was assumed that those two properties did not exist until they were measured. Hence there seemed to be no objective reality.

Einstein was a firm believer in objective reality. He particularly disliked the idea of the instantaneous collapse of the wave function everywhere upon measurement. After all, it was he who showed that nothing could travel faster than the speed of light, which was why he referred to this collapse as ‘spooky action at a distance’. The 1935 EPR paper that Einstein co-wrote with Boris Podolsky and Nathan Rosen is a beautiful example of the use of a thought experiment to make this particular point. (The arguments in the paper are often attributed to Einstein alone and poor Podolsky and Rosen are shunted to the background, even though Podolsky is reputed to have played a major role in developing the ideas in the paper.)

The EPR paper suggested a way to show that a particle does indeed have an exact position and momentum prior to measurement, and thus that Heisenberg’s uncertainty principle was merely a statement about the limits of measurement, not about the limits placed on what exists as reality. EPR first set about defining what they meant by objective (or what they call ‘physical’) reality, saying: “If, without in any way disturbing a system, we can predict with certainty (i.e. with probability equal to unity) the value of a physical quantity, then there exists an element of physical reality corresponding to this physical quantity.” In other words, 100% predictability of the value of a quantity implied that the quantity was as good as having been measured, even if the measurement had not actually been carried out.

To show this they constructed an entangled wave function for two particles. The two particles then move in opposite directions so that they no longer interact. They then show that if we make a measurement of the exact momentum of particle 1, the wave function collapses, as a result of that measurement made at the location of particle 1, in such a way that particle 2 must have exactly the opposite momentum with 100% predictability, i.e., the momentum of particle 2 will have objective reality. However, if instead we make a measurement of the exact position of particle 1, the wave function collapses in such a way that particle 2 will have an exact location with 100% predictability, i.e., the position of particle 2 will have objective reality. Thus, by measuring only the momentum or only the position of particle 1, we are able to predict with certainty either the exact momentum or the exact position of particle 2, even though we have made no measurement on particle 2 at all.
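For the mathematically inclined, the kind of two-particle state EPR used can be sketched, in the spirit of their 1935 paper, as

\[
\Psi(x_1, x_2) \;=\; \int_{-\infty}^{\infty} e^{(i/\hbar)\, p\, (x_1 - x_2 + x_0)}\, dp \;\propto\; \delta(x_1 - x_2 + x_0),
\]

a state in which the momenta of the two particles are exactly anti-correlated (a measurement giving momentum p for particle 1 implies −p for particle 2) while their positions are exactly correlated (finding particle 1 at x implies particle 2 is at x + x₀, where x₀ is a fixed separation).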

EPR take the next crucial step and argue that the measurements on particle 1 could have no effect on what is going on with particle 2 since they are widely separated. They argue that whichever property of particle 1 we choose to measure, that choice cannot in any way disturb particle 2. But since particle 2’s exact position and momentum can each be predicted with 100% certainty by measuring particle 1, they must be simultaneously real. In other words, even though neither property of particle 2 has been directly measured, both must exist exactly. Thus for particle 2, Δx=Δp=0 and the uncertainty principle is violated. The only way this conclusion can be avoided, they say, is if the reality of particle 2’s position and momentum depends upon the act of measurement carried out on particle 1 far away, and “No reasonable definition of reality could be expected to permit this.”

The EPR paper caused consternation in the physics community because the uncertainty principle was seen as an integral part of quantum theory, and for it not to hold would cast doubt on the very foundations of the theory. But there seemed to be no way of testing this. It took another thirty years for John Bell to propose a way to do so by making measurements on both particles, and another twenty years for the results to start coming in. Alas for EPR, as discussed in Part 6, if the EPR claim that measurements on particle 1 have no effect on particle 2 were true, the results would have to disagree with the statistical predictions of quantum mechanics; the experiments confirmed the quantum mechanical predictions, and even Einstein never disputed the validity of calculations obtained using quantum mechanics. Thus quantum mechanics does not provide a ‘reasonable definition of reality’ if by that we mean that an object’s properties must exist independently of and prior to any measurement made on it.
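As a concrete illustration of the disagreement that Bell’s analysis exposes (a sketch only; Part 6 has the details), the quantum mechanical correlation for spin measurements on an entangled pair with detectors set at angles a and b is −cos(a−b). The combination of four such correlations used in the CHSH form of Bell’s inequality can never exceed 2 in any local ‘objective reality’ theory, but quantum mechanics predicts 2√2:

    import math

    def E(a, b):
        # Quantum mechanical correlation of spin measurements on an
        # entangled (singlet) pair, detectors at angles a and b
        return -math.cos(a - b)

    # Detector settings (in radians) that give the largest violation
    a1, a2 = 0.0, math.pi / 2
    b1, b2 = math.pi / 4, 3 * math.pi / 4

    S = E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2)
    print(abs(S))  # ~2.83 = 2*sqrt(2); any local hidden-variable theory gives at most 2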

But what Bell took away from Einstein, he also gave back. We see that the instantaneous collapse of the wave function occurs everywhere simultaneously, and not just at the location where the measurement takes place. Although quantum mechanics is a local theory in that it does not allow information to propagate faster than the speed of light, this particular aspect of wave function collapse is a non-local phenomenon, and it leads to the denial of the idea of objective reality. But Bell showed that you could recover objective reality if you added in another non-local effect: allow the results of measurements on particle 2 to also depend on the way that the detector at particle 1’s location is set up, however far away it may be. If you did that, you could recover agreement with the statistical predictions of quantum mechanics. David Bohm had already constructed a ‘pilot wave’ model that explicitly does this. You would have expected that Einstein would have welcomed it, but he was dismissive, calling it ‘too cheap’. He seemed to have hoped for a more sophisticated theory.

So there things stand. Unless one is willing to adopt one of the alternatives, such as non-local theories, the Many-Worlds Interpretation, or the spontaneous collapse models that were discussed in Part 4 and Part 6 of this series (and few in the mainstream physics community have done so), one has to conclude that quantum mechanics precludes the existence of objective reality.

That is the end of this series of posts on this topic.

Comments

  1. file thirteen says

    Great stuff Mano. I really enjoyed the series of posts, especially this one.

    If there is no objective reality at the quantum scale, has anyone proposed an explanation as to why the reality we experience is so, well, solid? Is it similar to an infinite series converging to a finite sum? (1 + 1/2 + 1/4 + 1/8 + … = 2)

  2. Mano Singham says

    file thirteen @#2,

    There is a somewhat hand-waving argument to explain that.

    It is based on Louis de Broglie’s revolutionary idea (introduced in 1924 in his PhD thesis) that just as waves have particle-like properties that we call quanta, all particles also have wave-like properties, and that it is the wave-like behavior of small particles like electrons that leads to all the quantum effects we have been discussing. de Broglie postulated that the wavelength associated with any particle is given by h/p, where h is Planck’s constant and p is the momentum of the particle. Since h is so tiny, we can observe the wave effects, such as interference, diffraction, the uncertainty principle etc., only when we look at small objects on the microscopic scale that have tiny momenta. That has been confirmed in some ingenious experiments.

    The claim is that those effects also exist for macroscopic objects, but the momentum of any macroscopic object is so large that its wavelength is infinitesimally small, making those wave effects unobservable. That is why macroscopic objects appear to behave classically.
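    To put rough numbers on that, here is a small Python sketch comparing the de Broglie wavelength h/p of an electron with that of a thrown baseball (the speeds chosen are just illustrative):

        h = 6.626e-34  # Planck's constant, J s

        def de_broglie(mass, speed):
            # de Broglie wavelength = h / p = h / (mass * speed)
            return h / (mass * speed)

        print(de_broglie(9.11e-31, 1.0e6))  # electron at ~1e6 m/s: ~7e-10 m, about the size of an atom
        print(de_broglie(0.145, 40.0))      # 145 g baseball at 40 m/s: ~1e-34 m, far too small to ever observe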

  3. Deepak Shetty says

    @Mano
    Good explanations for a complex topic!
    One question -- are there any current advancements in this area? Most of what you describe, I dimly recollect reading about a decade or two ago -- has any progress been made on more definitive answers, or has the scientific community by and large just accepted the lack of objective reality (as you state in your conclusion), so that this isn't really an area of interest?

  4. Mano Singham says

    Deepak @#4

    In science, there are almost always competing schools of thought. Working on theories other than the standard paradigm carries risks because nothing might come of it. It is always harder to go against an accepted paradigm. So most scientists work within that standard paradigm, and this is true for quantum theory too.

    There are people looking at alternatives but they tend to be a minority. Often they are established physicists who have secured their careers or young people who wish to make a name for themselves by overturning the dominant paradigm. Sometimes they do this work on the side. For example, John Bell was a scientist at the CERN accelerator working on standard physics problems. His work in this area was, at least initially, not part of his main job. But once his paper took off and he became famous, he was able to devote more time to it.

  5. Jean says

    Since certain characteristics of particles become real only when measured, does that also mean that they cease to be real when the measurement is done with? If time symmetry works at that level, it would seem to make sense that things becoming real also means that things can cease to be.

  6. file thirteen says

    I feel I’m going to regret asking this, but where does gravity feature in all this? It seems that it’s range independent, but is that just because there are “gravitons”, like light photons but undetectable to us, that absolutely everything that has gravitational attraction is spewing out all the time, like tiny suns? That sounds absurd, and I thought there was currently no quantum theory of gravity, but if there are no carriers of gravitational force, doesn’t the existence of gravity break locality?

  7. ed says

    This is a very good series!

    I’ll say the same thing I said in the other post. The wave function is not a physical object, by construction. Worrying about its collapse as if it were a physical object is similar to worrying about the feelings of gods.

  8. Mano Singham says

    file thirteen@#7,

    You should not feel any regret!

    The classical theory of gravity says that gravitational waves travel at the speed of light, so it is a local theory.

    Gravity does not feature directly in this because the force of gravity is so weak compared to the other fundamental forces that it does not play a significant role in quantum mechanics calculations and can be ignored.

    Although we do not yet have a quantum theory of gravity, from the classical theory and from general principles we expect that such a quantum theory of gravity will have gravitons which are the carriers of the gravitational force. Gravitons are the gravitational counterpart to the photons in quantized electromagnetism, and will be massless and hence travel at the speed of light, and have spin 2. (Photons have spin 1.)

  9. Mano Singham says

    Jean @#6,

    The wave function is an extended object in space, which is why we say that a particle does not have an exact location. When we measure its position and find it at some location, we collapse the wave function to a point at that location. Once we stop measuring the particle and it is free to move, its wave function will evolve once again under the influence of the forces acting on it and become an extended object without an exact position.

    But this reversal to not having an exact location is not due to time symmetry. It is how wave functions evolve in Schrodinger’s equation.
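    For anyone curious about how quickly that happens, here is a minimal sketch using the standard result for an initially Gaussian wave packet evolving freely under Schrodinger’s equation (the masses and sizes are illustrative choices):

        import math

        hbar = 1.055e-34  # J s

        def width(sigma0, mass, t):
            # Width of an initially Gaussian free-particle wave packet after time t
            return sigma0 * math.sqrt(1 + (hbar * t / (2 * mass * sigma0**2))**2)

        # An electron localized to 1 nanometre spreads to tens of micrometres within a nanosecond.
        print(width(1e-9, 9.11e-31, 1e-9))   # ~5.8e-5 m
        # A microgram dust grain localized to 1 micrometre barely spreads at all, even over a year.
        print(width(1e-6, 1e-9, 3.15e7))     # ~1e-6 m, essentially unchanged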

  10. Jean says

    Mano,

    I was not only talking about position but going back to your previous posts also. If I have understood correctly, you mentioned that the spin does not have an up or down property until we measure it; it basically becomes real due to the measurement, not because it had a prior defined but unknown value. That’s where the time symmetry would come into play: after we measure the spin, would it ever become undefined again, which would make it no longer real? If not, how does it respect time symmetry?

  11. file thirteen says

    Thanks Mano. Still confused though. If gravitons are a thing, what produces them? Photons come from stars mostly, so you can have an object like the moon that has a dark side -- the only photons that we see from it are reflected photons. But if gravitons worked in the same way (only emitted by stars), then wouldn’t the gravity on the dark side of the moon be less than on its light side? Or do gravitons not get reflected, in which case why do planets have any gravity at all if the gravitons pass straight through? Or does everything emit gravitons, in which case does that mean all of us are like tiny graviton stars? None of this makes sense to me.

  12. Jean says

    file thirteen @#12

    I’ll let Mano respond on gravitons, but everything that has a temperature over absolute zero emits photons. And photons are also emitted in a multitude of different ways that have nothing to do with stars. You’re looking at one source to read this message; the photons from your screen (whatever type you’re using) have nothing to do with a star and are produced by that screen directly, not reflected from the sun.

    I’ll add one thing about gravitons (if they exist): everything with a mass would emit gravitons.

  13. file thirteen says

    Ok, thanks Jean, that makes sense. Next foolish question:

    Everything not at absolute zero emits heat at a rate proportional to the temperature difference between it and its surroundings. What about gravity? Is there any evidence that clumped things might emit more (or less) gravitons (assuming they exist) than the same mass of separated objects?

    I guess yes, I might be talking about the observed discrepancy in galaxy mass distribution. I’ll let someone more knowledgeable step in here before I drift further into wild speculation…

  14. Mano Singham says

    Jean and file thirteen,

    There is a great deal of similarity between electromagnetic radiation and gravitational radiation so I will address both in one response. The key word is ‘radiation’, the emission of energy from a source that spreads out through space. It is in this context that the concepts of photons and gravitons are the most useful.

    Radiation of electromagnetic energy occurs when a source emits it. In the case of electromagnetic energy, as Jean points out, any macroscopic object with a temperature emits radiation. So does any electric charge in motion. So this kind of radiation is ubiquitous and palpable and we experience it all the time. Depending on the context, this radiation can be treated as classical EM waves, or the waves can be quantized and treated as a stream of photons.

    The case of gravitation is similar. Any matter in motion will generate gravitational waves. Such waves can also be generated when matter is turned into energy. The thing about gravity is that the force of gravity is so extremely weak that it is impossible to detect in everyday life and very hard to do so even with highly sensitive and sophisticated instruments. It took the merger of two massive black holes into one, with a correspondingly huge loss of mass converted into energy, for gravitational waves to be directly detected for the first time in 2015. In theory, any piece of matter in motion is generating such gravitational waves (just as any electric charge in motion generates EM waves) but we simply cannot feel it. It is this gravitational radiation that one attempts to quantize as a stream of gravitons. But while we have some idea of what the properties of gravitons should be (massless and spin 2), quantizing the gravitational field has proven to be very difficult.

    Note that the above deals with electromagnetic and gravitational radiation. What about the static cases when a charge or a mass is at rest? The charge and mass still create electric and gravitational fields everywhere, respectively, but those fields are static and not radiating energy. The concepts of photons and gravitons are less useful in those cases, harder to incorporate, and so not usually invoked.

  15. Robbo says

    Jean and file thirteen,

    a bit more clarification:

    When Mano says “in motion” he really means accelerated motion. A mass or charge moving at a constant velocity does not emit radiation.

    Plus, file thirteen, heat isn’t always emitted in proportion to a temperature difference. Heat transfer via conduction or convection is, in simple models, proportional to a temperature difference. Heat transfer via radiation (blackbody radiation) goes as temperature to the 4th power (a quick numerical illustration follows below).

    And, also, you got me thinking about a gravitational analog to electromagnetic blackbody radiation…is it a thing?
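    A quick numerical illustration of that T^4 behaviour, using the standard Stefan-Boltzmann constant (the area and temperatures are arbitrary):

        sigma = 5.67e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

        def radiated_power(area, temperature):
            # Power radiated by an ideal blackbody surface: sigma * A * T^4
            return sigma * area * temperature**4

        print(radiated_power(1.0, 300))  # ~460 W from one square metre at room temperature
        print(radiated_power(1.0, 600))  # ~7300 W: doubling the temperature gives 16 times the power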

  16. ondrbak says

    First of all, great series and great discussions!
    That having been said, I find the notion of “objective reality” in this context odd. We don’t, however much we try to, have first-hand access to objective reality. All we have are models attempting to fit very indirect measurements.
    Sure, an informal model that fits our biological sensations is much more familiar and has traditionally been thought of as representing an objectively real ontology.
    But recognizing that whatever I intuitively perceive as “reality” is still a model far removed from whatever it tries to represent, I don’t see a meaningful difference between chairs and sandwiches corresponding to our qualia, and mathematical objects corresponding to readings on some measurement devices.
    We shouldn’t be talking about what’s objectively real and what’s not, but rather how our different models of reality can be reconciled or shown to be compatible with each other.
    And I guess it’s fine to use “objective reality” and “objectively real” as a shorthand for “a part of naive perceptual ontology”, as long as we don’t try to claim that those are really real and objectively objective.
    Overall, I’m biased towards a view inspired by Sean Carroll that what “exists” is the universe as a whole. Everything beyond that are semi-arbitrary exercises in coarse-graining.

  17. jenorafeuer says

    One of my favourite bits about Heisenberg’s Uncertainty Principle is that you can actually run it backwards and use it to make more accurate measurements, as counter-intuitive as that might seem.

    And the key word is interferometry, as in the double-slit experiment, and it ties back to some of what Mano said above about de Broglie.

    So, assume that there’s a star close enough that it isn’t point-like; it has a visible radius measured in micro-arc-seconds. (There are in fact several of these.) As a result any photon arriving from that star must be coming from a region of the sky the size of the visible star. Which means that not only is the momentum of the photon defined by its frequency, but specifically the transverse component of the momentum is highly constrained to be within the angle defined by the radius of the star.

    So if we put a good frequency filter on the sensor, say a radio telescope, the primary uncertainty in the photon’s momentum will be proportional to the visible size of the star.

    Which means, by the Heisenberg Uncertainty Principle, that the uncertainty of the photon’s position will be inversely proportional to the visible size of the star. And the thing is for a double-slit experiment to produce an interference pattern, the photon has to effectively be able to go through both slits at once… which means the uncertainty in its position must be greater than the separation between the slits. So in theory for a star of a known angular radius, we can calculate the uncertainty of the photon’s position, and a pair of radio telescopes closer together than that distance will show an interference pattern… but a pair of radio telescopes further apart than that will not.

    That exact experiment was also done in Australia decades ago, using a pair of radio telescopes on railway cars along the longest stretch of straight railway track in the world running across the outback, where they could focus both telescopes on a single target and move them apart bit by bit and determine when the interference pattern vanished, and then calculate the angular radius of the star from that distance.

    For stars with known angular radii, the values matched. For other stars with angular radii smaller than could be easily measured by traditional means, it gave better results. It still only works on reasonably nearby stars, because the smaller the object’s image in the sky, the further apart the telescopes have to be to get any resolution, and eventually we run out of planet to put them on.

    This is actually part of the basis for VLBI, Very Long Baseline Interferometry.
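    (A rough numerical sketch of the relation described above: for a uniform disc of angular diameter θ, the fringes wash out once the separation between the two receivers exceeds roughly 1.22λ/θ. The wavelength and angular size below are purely illustrative, not the parameters of the Australian experiment.)

        wavelength = 500e-9        # 500 nm, an optical wavelength chosen for illustration
        theta = 10e-3 / 206265.0   # 10 milli-arcseconds converted to radians
        baseline = 1.22 * wavelength / theta
        print(baseline)            # ~12.6 metres: beyond this separation the fringes disappear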
