PI Lecture on Density Matrices

Kea kindly pointed out to me that the Perimeter Institute just put on the web a lecture on density matrices and the foundations of quantum mechanics. The lecturer is Christopher Fuchs, and the duration is an hour and a quarter. As a promoter of density matrix theory, I thought I would discuss it here.

The lecture begins with the contribution of the foundations of quantum mechanics to the descent into insanity of John Forbes Nash some 50 years ago. Fuchs shares my view that quantum probabilities are not "non-commutative generalizations" of classical probabilities, but his analysis is based on the assumption that quantum states should be written entirely in probability form. In this I disagree.

For those who don't know what this is about, quantum states are usually represented by "state vectors", which are vectors in a Hilbert space. One manipulates these state vectors by applying operators to them; an "operator" is a linear transformation that maps state vectors to new state vectors, so when one operates on a state vector one gets a new vector. To compute a probability, one takes two vectors and combines them using the inner product of the Hilbert space. This gives a complex number, and the probability is the square of the absolute value of that complex number.
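Here is a minimal numerical sketch of that rule, with two example qubit vectors of my own choosing (numpy; nothing here comes from the lecture itself):

```python
import numpy as np

a = np.array([1.0, 1.0j]) / np.sqrt(2)   # the state being measured
b = np.array([1.0, 0.0])                 # the state being tested for

amplitude = np.vdot(b, a)                # the inner product <b|a>, a complex number
probability = abs(amplitude) ** 2        # square of the absolute value

print(probability)                       # 0.5
```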

For those motivated to rewrite quantum mechanics in terms of classical probabilities, it makes sense to move to a mathematical description of the quantum state that is closer to the square of the absolute value of the complex valued inner product described above. One does this by converting the state vectors to “density matrices” as follows:
\rho = |a\rangle\langle a|

Then the probabilities of various things are computed with the trace. For example, to compute the average value of the result of a measurement with the operator H, the standard technique gives \langle a| H |a\rangle , and with density matrices the same result is found by tr(\rho H) where \rho is the density matrix.
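A quick check of this equivalence, with an example state and operator I picked for illustration (numpy):

```python
import numpy as np

a = np.array([1.0, 1.0j]) / np.sqrt(2)      # an example state vector |a>
H = np.array([[1.0, 0.0], [0.0, -1.0]])     # an example Hermitian operator (Pauli z)

rho = np.outer(a, a.conj())                  # density matrix |a><a|

expect_vector = (a.conj() @ H @ a).real      # <a| H |a>
expect_density = np.trace(rho @ H).real      # tr(rho H)

print(expect_vector, expect_density)         # both give 0.0 for this state
```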

For a given operator H, one can write the density matrix in the basis of the eigenvectors of H. The probabilities of the various possible outcomes of a measurement of H then show up as real numbers on the diagonal of the density matrix. For a state that is "pure" in H, this diagonal will have only one nonzero value, a 1 in the position corresponding to the eigenstate that the system is in.

All right so far. The diagonal elements of the density matrix (when written in the basis defined by the eigenvectors of some operator H) are the probabilities that the quantum state will be found to be in the various states when H is measured. The problem is that the off diagonal elements are not easy to write in probability form.
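To make the diagonal claim concrete, here is a sketch under my own choice of H and state (numpy): writing the density matrix in the eigenbasis of H puts the measurement probabilities on the diagonal.

```python
import numpy as np

H = np.array([[0.0, 1.0], [1.0, 0.0]])   # Pauli x as the measured operator
psi = np.array([1.0, 0.0])               # spin-up along z
rho = np.outer(psi, psi.conj())

eigvals, U = np.linalg.eigh(H)           # columns of U are eigenvectors of H
rho_in_H_basis = U.conj().T @ rho @ U

print(np.diag(rho_in_H_basis).real)      # [0.5, 0.5]: probabilities of the two outcomes
```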

Now I agree with Fuchs that the density matrix is closer to the foundations of quantum mechanics than the state vector, but I disagree that it is particularly useful to continue to push the density matrix form so as to write it entirely in probability form.

Naive State Vector Formalism

The simplest way to describe the difference between the state vector formalism and the density matrix formalism is to describe how they treat a quantum state that is changing with time. One might imagine a 2-slit interference problem with a Schroedinger equation solution of \psi(x;t) .

Both formalisms describe the progress of a quantum object through the passage of time. In the state vector formalism, one breaks the description up into a series of "freeze frames" or snapshots of the quantum state. At the time t_0 , the quantum state is assumed to be \psi(x;t_0) , which is a complex valued function of position.

The state vector formalism is defective in that it is not uniquely defined. One can always multiply a state vector by an arbitrary complex phase and obtain a new state vector that is mathematically different from the previous one, but gives identical physical results. For example, \exp(2i\pi/3)\psi(x;t) is mathematically different from \psi(x;t) , but is physically identical.
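A quick numerical check of this remark, with an example vector of my own (numpy): multiplying the state vector by \exp(2i\pi/3) changes the vector but leaves the density matrix, and hence all probabilities, unchanged.

```python
import numpy as np

psi = np.array([0.6, 0.8j])
phase = np.exp(2j * np.pi / 3)

rho1 = np.outer(psi, psi.conj())
rho2 = np.outer(phase * psi, (phase * psi).conj())

print(np.allclose(rho1, rho2))   # True: the arbitrary phase drops out
```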

The origin of the state vector formalism is at least partly due to the 2-slit experiment. The interference observed in that experiment is reminiscent of the interference seen in classical waves, so the natural assumption is that the quantum state is a wave. One can suppose that the arbitrary phase problem is due to our coarse instruments and that the quantum state actually possesses some unknown and unknowable phase.

This naive belief should have been eliminated by a series of thought experiments early in the history of quantum mechanics culminating in the EPR paradox, but it is still prevalent as an unconscious assumption very common in physicists who are not working on the foundations of quantum mechanics.

One way to deal with the arbitrary complex phase within the state vector formalism is to use mathematics in which all state vectors differing only by such a phase are treated as the same object. This assumption makes it difficult to justify linear superposition, since the linear superposition of two quantum states depends on these arbitrary complex phases. And without linear superposition, most of the utility of quantum mechanics goes away.

Density Matrix Formalism

The density matrix version of the state vector Schroedinger's wave equation solution \psi(x;t) is \rho(x,x';t) = \psi^*(x';t)\psi(x;t) . The probability density is then simply \rho(x,x;t) . The arbitrary complex phase, and the need to take an absolute value and square in order to get a probability, have been eliminated. What is new is that the quantum state is described by TWO copies of the spatial coordinates.
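A discretized sketch of \rho(x,x';t) = \psi^*(x';t)\psi(x;t) at a fixed time (numpy; the Gaussian wave packet is just an example I picked, not anything from the lecture):

```python
import numpy as np

x = np.linspace(-5.0, 5.0, 200)
dx = x[1] - x[0]
psi = np.exp(-x**2 / 2 + 1j * x)               # example wave function at one time
psi /= np.sqrt(np.sum(np.abs(psi)**2) * dx)    # normalize

rho = np.outer(psi, psi.conj())                # rho[i, j] ~ psi(x_i) psi*(x_j)

prob_density = np.diag(rho).real               # rho(x, x; t) = |psi(x)|^2
print(np.sum(prob_density) * dx)               # ~1.0
```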

We will call these two copies of the coordinates the "initial" coordinate x and the "final" coordinate x', in analogy with the situation for an S-matrix operator in elementary particle theory. While these are, as in the state vector formalism, "snapshots" of the situation at a given time t, this happens only because we began with a theory (Schroedinger's) that assumed that quantum states could be so described. When one looks at the elementary particle situation, describing the incoming and outgoing sides of a density operator as initial and final makes more direct physical sense.

So another, more physical, way of getting to an operator description of a quantum state is to consider the situation before and after the passage of a very small amount of time. That would be \rho(x,x';t) = \psi^*(x';t+\delta t)\psi(x;t) . In the limit as \delta t \to 0 , this is just the usual density matrix operator. Consequently, the density matrix formalism treats the quantum state as a series of pairs of snapshots of the particle, initial and final, and the quantum state consists of a series of transformations (defined by the density operator) rather than a series of snapshots or wave values. This interpretation of the formalism combines “S-matrix” theory with density matrix theory. For the S-matrix, one lets \delta t \to \infty .

Density Matrices in terms of Probabilities

The problem with using density matrices as indications of probability is that while a density matrix gives all the information that can be known about a quantum state, it codes that information up in a manner that is rather inconvenient to extract. The lecture at PI was directed towards information theory where the quantum states are finite dimensional. Conveniently, that’s also the subject I’ve concentrated on.

We will use the example of spin-1/2, or qubits. When a qubit is written in the basis that diagonalizes the usual Pauli matrix \sigma_z , the diagonal quantum states are spin up, spin down, and mixed states of these two. (By the way, these mixed states are obtained by linear superposition of the spin up and spin down states. This shows that linear superposition does exist in the density matrix formalism, but it has the nature of a statistical combination rather than a mapping of states.)
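A sketch of that parenthetical remark (numpy): a statistical combination of the spin-up and spin-down density matrices is again a valid, but mixed, density matrix. The weights are my own example.

```python
import numpy as np

up = np.array([[1.0, 0.0], [0.0, 0.0]])     # |z+><z+|
down = np.array([[0.0, 0.0], [0.0, 1.0]])   # |z-><z-|

mixed = 0.75 * up + 0.25 * down             # convex (statistical) combination

print(np.trace(mixed))                      # 1.0: still a density matrix
print(np.trace(mixed @ mixed))              # 0.625 < 1: mixed, not pure
```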

The diagonal elements of a density matrix correspond to a "complete measurement" of the quantum system, which corresponds to a set of primitive projection operators that annihilate each other and that sum to unity. Such a set of primitive projection operators corresponds to the results of a consistent measurement of the quantum state. For example, with spin-1/2, one can build an experiment that divides incoming spin-1/2 particles into spin up and spin down, but that is all. One cannot build an experiment that consistently divides them into more than those two spin directions. (Consistent here means commutative, which means that the results of the measurements do not depend on the order in which they are performed.)
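A check of the spin-1/2 case just described (numpy): the two projection operators (1 \pm \sigma_z)/2 annihilate each other and sum to unity.

```python
import numpy as np

sigma_z = np.array([[1.0, 0.0], [0.0, -1.0]])
I = np.eye(2)

P_up = (I + sigma_z) / 2
P_down = (I - sigma_z) / 2

print(np.allclose(P_up @ P_down, 0))    # mutually annihilating
print(np.allclose(P_up + P_down, I))    # sum to unity
print(np.allclose(P_up @ P_up, P_up))   # idempotent (primitive projection)
```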

The off diagonal elements of a density matrix are more difficult to interpret as probabilities. In the density matrix formalism, the state is an operator that acts on other states (which are also operators), and the off diagonal elements are necessary to define such actions.

But in the density operator formalism, the states are not defined as matrices, and consequently there is no definition of "off diagonal". One writes the spin-up state as (1+\hat{z})/2 , and this is just as diagonal or non-diagonal as spin in the +x direction, (1+\hat{x})/2 . So there is no problem in defining "off diagonal" states in the density operator formalism: all the degrees of freedom are interpreted as probabilities when the state is measured with respect to the corresponding degree of freedom.
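A sketch of reading probabilities directly off density operators (numpy, with \hat{z} and \hat{x} taken to be the usual Pauli matrices): the transition probability between the spin +z and spin +x states is just the trace of their product.

```python
import numpy as np

I = np.eye(2)
sigma_x = np.array([[0.0, 1.0], [1.0, 0.0]])
sigma_z = np.array([[1.0, 0.0], [0.0, -1.0]])

rho_z = (I + sigma_z) / 2   # spin in +z direction
rho_x = (I + sigma_x) / 2   # spin in +x direction

print(np.trace(rho_z @ rho_x).real)   # 0.5, the usual quantum probability
```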

Getting back to the PI lecture, the program that Fuchs is pursuing is to find sets of quantum states such that, when you measure the probabilities between these and an arbitrary quantum state, you can then reconstruct the arbitrary quantum state. In the case of spin-1/2, it may or may not be clear (but it is true) that if you measure the probability of spin in all six directions, +x, +y, +z, -x, -y, -z, then you can take these 6 probabilities and from them compute the density matrix of that arbitrary state. (Of course this will require multiple experiments, as measurement perturbs the system.) So the question arises: what set of measurements should you use to get just what you need to know to describe the quantum state?
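Here is a sketch of that six-direction reconstruction (numpy), using a randomly chosen pure qubit state as the "unknown" to be recovered; the recipe is the standard Bloch-vector expansion, not anything specific to the lecture.

```python
import numpy as np

I = np.eye(2)
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]])
sz = np.array([[1, 0], [0, -1]], dtype=complex)

rng = np.random.default_rng(0)
v = rng.normal(size=2) + 1j * rng.normal(size=2)
rho = np.outer(v, v.conj()) / np.vdot(v, v)        # the state to be reconstructed

def prob(direction):
    """Probability of finding spin along the given Pauli direction."""
    return np.trace(rho @ (I + direction) / 2).real

# <sigma> along each axis is p(+) - p(-); these three numbers fix the density matrix.
rho_rebuilt = (I + (prob(sx) - prob(-sx)) * sx
                 + (prob(sy) - prob(-sy)) * sy
                 + (prob(sz) - prob(-sz)) * sz) / 2

print(np.allclose(rho, rho_rebuilt))   # True
```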

This seems to be a fairly unimportant mathematical question. Since Hermitian 2×2 matrices have only 4 real degrees of freedom, you don't need those 6 measurements. After some manipulation, one finds that one may be able to have a rather cute solution with 4 terms that satisfies certain symmetries. This gives a geometric problem that has been solved for 2×2 complex matrices and 3×3 complex matrices.

Somewhat oddly, there is a mild connection to the "snark algebra" I work with here. The solution to the 2×2 complex matrix problem (which is spin-1/2) is to have 4 pure states that are equally distributed across the Bloch sphere. They sit on the vertices of a tetrahedron. At one time I used this as a model for the snarks. This was before I went to perpendicular states.
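A sketch of that tetrahedron configuration (numpy; the particular vertex coordinates are my own choice): four Bloch vectors at the vertices of a tetrahedron give pure states whose pairwise overlaps tr(\rho_a \rho_b) are all equal to 1/3.

```python
import numpy as np

sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]])
sz = np.array([[1, 0], [0, -1]], dtype=complex)
I = np.eye(2)

# Tetrahedron vertices on the unit (Bloch) sphere.
verts = np.array([[1, 1, 1], [1, -1, -1], [-1, 1, -1], [-1, -1, 1]]) / np.sqrt(3)

states = [(I + n[0] * sx + n[1] * sy + n[2] * sz) / 2 for n in verts]

for i in range(4):
    for j in range(i + 1, 4):
        print(np.trace(states[i] @ states[j]).real)   # each pair gives 1/3
```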

Knowing this, it is easy to compute a cool fact about these states. We know that the total surface area of the unit sphere is 4\pi , so each face of the tetrahedron, projected onto the sphere, subtends an area of \pi . The Berry phase for a closed path is half the enclosed area, so when the states are multiplied in triples they have to give a complex phase of \exp(i\pi/2) = i . (For a proof of this, see "a beautiful theorem" linked near the top of this post. I only recently found out that this is called Berry phase, or "quantum phase".)
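A numerical check of the \exp(i\pi/2) = i claim (numpy), repeating the same tetrahedron of pure states used in the previous sketch: the triple product tr(\rho_a \rho_b \rho_c) carries this quantum phase.

```python
import numpy as np

sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]])
sz = np.array([[1, 0], [0, -1]], dtype=complex)
I = np.eye(2)

verts = np.array([[1, 1, 1], [1, -1, -1], [-1, 1, -1], [-1, -1, 1]]) / np.sqrt(3)
states = [(I + n[0] * sx + n[1] * sy + n[2] * sz) / 2 for n in verts]

triple = np.trace(states[0] @ states[1] @ states[2])
print(triple / abs(triple))   # approximately 1j; the sign flips if the triple is reordered
```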

The problem has also been solved for 3×3 complex matrices, which have 9 (Hermitian) degrees of freedom. I didn’t write down the solution, but it appeared to me that they were related to the “snark algebra”, the circulant complex 3×3 matrices. See section 3.3, page 58/40, of my book on density matrices, which solves the projection operator problem for 3×3 matrices and specializes to the circulant case.



Filed under Uncategorized

3 responses to "PI Lecture on Density Matrices"

  1. Doug

    Hi Carl,

    Are physicists, like engineers, beginning to recognize the noncommutative game theory contributions of John Nash, particularly Nash equilibria?

  2. Hi Carl,
    “(For a proof of this, see “a beautiful theorem” back at the top of this post.”
    http://en.wikipedia.org/wiki/EPR_Paradox
    ? Could not find “a beautiful theorem” at the top?

    Was that supposed to be this,
    Euler’s_identity

    or was it about Berry phase?

  3. carlbrannen

    Thanks Mike, I’ve corrected the link. I’m not sure how I screwed that up.
