Category Archives: General QFT

Why a New Particle Matters

A while back, when the MiniBooNE experiment announced evidence for a sterile neutrino, I was excited. It’s still not clear whether they really found something; here’s an article laying out the current status. If they did, it would be a new particle beyond those predicted by the Standard Model, something like the neutrinos but which doesn’t interact with any of the fundamental forces except gravity.

At the time, someone asked me why this was so exciting. Does it solve the mystery of dark matter, or any other long-standing problems?

The sterile neutrino MiniBooNE is suggesting isn’t, as far as I’m aware, a plausible candidate for dark matter. It doesn’t solve any long-standing problems (for example, it doesn’t explain why the other neutrinos are so much lighter than other particles). It would even introduce new problems of its own!

It still matters, though. One reason, which I’ve talked about before, is that each new type of particle implies a new law of nature, a basic truth about the universe that we didn’t know before. But there’s another reason why a new particle matters.

There’s a malaise in particle physics. For most of the twentieth century, theory and experiment were tightly linked. Unexpected experimental results would demand new theory, which would in turn suggest new experiments, driving knowledge forward. That mostly stopped with the Standard Model. There are a few lingering anomalies, like the phenomena we attribute to dark matter, that show the Standard Model can’t be the full story. But as long as every other experiment fits the Standard Model, we have no useful hints about where to go next. We’re just speculating, and too much of that warps the field.

Critics of the physics mainstream pick up on this, but I’m not optimistic about what I’ve seen of their solutions. Peter Woit has suggested that physics should emulate the culture of mathematics, caring more about rigor and being more careful to confirm things before speaking. The title of Sabine Hossenfelder’s “Lost in Math” might suggest the opposite, but I get the impression she’s arguing for something similar: that particle physicists have been using sloppy arguments and should clean up their act, taking foundational problems seriously and talking to philosophers to help clarify their ideas.

Rigor and clarity are worthwhile, but the problems they’ll solve aren’t the ones causing the malaise. If there are problems we can expect to solve just by thinking better, they’re problems we found by thinking in the first place: quantum gravity theories that stop making sense at very high energies, paradoxical thought experiments with black holes. For those, rigor and clarity can matter: to some extent they’re already present, but I can appreciate the argument that they’re not yet nearly enough.

What rigor and clarity won’t do is make physics feel (and function) like it did in the twentieth century. For that, we need new evidence: experiments that disobey the Standard Model, and do it in a clear enough way that we can’t just chalk it up to predictable errors. We need a new particle, or something like it. Without that, our theories are most likely underdetermined by the data, and anything we propose is going to be subjective. Our subjective judgements may get better, we may get rid of the worst-justified biases, but at the end of the day we still won’t have enough information to actually make durable progress.

That’s not a popular message, in part because it’s not something we can control. There’s a degree of helplessness in realizing that if nature doesn’t throw us a bone then we’ll probably just keep going in circles forever. It’s not the kind of thing that lends itself to a pithy blog post.

If there’s something we can do, it’s to keep our eyes as open as possible, to make sure we don’t miss nature’s next hint. It’s why people are getting excited about low-energy experiments, about precision calculations, about LIGO. Even this seemingly clickbaity proposal that dark matter killed the dinosaurs is motivated by the same sort of logic: if the only evidence for dark matter we have is gravitational, what can gravitational evidence tell us about what it’s made of? In each case, we’re trying to widen our net, to see new phenomena we might have missed.

I suspect that’s why this reviewer was disappointed that Hossenfelder’s book lacked a vision for the future. It’s not that the book lacked any proposals whatsoever. But it lacked this kind of proposal, of a new place to look, where new evidence, and maybe a new particle, might be found. Without that we can still improve things, we can still make progress on deep fundamental mathematical questions, we can kill off the stupidest of the stupid arguments. But the malaise won’t lift, we won’t get back to the health of twentieth-century physics. For that, we need to see something new.


Path Integrals and Loop Integrals: Different Things!

When talking science, we need to be careful with our words. It’s easy for people to see a familiar word and assume something totally different from what we intend. And if we use the same word twice, for two different things…

I’ve noticed this problem with the word “integral”. When physicists talk about particle physics, there are two kinds of integrals we mention: path integrals, and loop integrals. I’ve seen plenty of people get confused, and assume that these two are the same thing. They’re not, and it’s worth spending some time explaining the difference.

Let’s start with path integrals (also referred to as functional integrals, or Feynman integrals). Feynman promoted a picture of quantum mechanics in which a particle travels along many different paths, from point A to point B.

[Image: three paths from A to B]

You’ve probably seen a picture like this. Classically, a particle would just take one path, the shortest path, from A to B. In quantum mechanics, you have to add up all possible paths. Most longer paths cancel, so on average the short, classical path is the most important one, but the others do contribute, and have observable, quantum effects. The sum over all paths is what we call a path integral.
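This “longer paths cancel” business can be sketched numerically. Below is a toy illustration of my own (not a real physics calculation): label each path by a single displacement d away from the classical one, give it an action that grows like d², and add up the phases. The paths far from the classical one cover most of the range, but their rapidly spinning phases mostly cancel out.

```python
import cmath

# Toy sketch of a "sum over paths" (an illustration, not a real calculation):
# each path from A to B is labeled by one number d, its displacement from
# the classical straight path, with action S(d) = K * d**2 (hbar = 1).
K = 50.0               # how quickly the action grows away from the classical path
N = 2001
STEP = 2.0 / (N - 1)   # spacing of the path labels in [-1, 1]

def phase_sum(ds):
    """Riemann sum of the phases exp(i * S(d)) over the given paths."""
    return sum(cmath.exp(1j * K * d * d) for d in ds) * STEP

paths = [-1.0 + i * STEP for i in range(N)]
full = phase_sum(paths)                                  # all paths
near = phase_sum([d for d in paths if abs(d) < 0.2])     # near-classical paths
far = full - near                                        # everything else

# The far paths cover four times as much range, but their contributions
# largely cancel, so the near-classical paths dominate the total:
print(abs(near) > abs(far))  # True
```

The same stationary-phase cancellation is why the classical path dominates in the real path integral.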

It’s easy enough to draw this picture for a single particle. When we do particle physics, though, we aren’t usually interested in just one particle: we want to look at a bunch of different quantum fields, and figure out how they will interact.

We still use a path integral to do that, but it doesn’t look like a bunch of lines from point A to B, and there isn’t a convenient image I can steal from Wikipedia for it. The quantum field theory path integral adds up, not all the paths a particle can travel, but all the ways a set of quantum fields can interact.

How do we actually calculate that?

One way is with Feynman diagrams, and (often, but not always) loop integrals.

[Image: a two-loop Feynman diagram]

I’ve talked about Feynman diagrams before. Each one is a picture of one possible way that particles can travel, or that quantum fields can interact. In some (loose) sense, each one is a single path in the path integral.

Each diagram serves as instructions for a calculation. We take information about the particles, their momenta and energy, and end up with a number. To calculate a path integral exactly, we’d have to add up all the diagrams we could possibly draw, to get a sum over all possible paths.

(There are ways to avoid this in special cases, which I’m not going to go into here.)

Sometimes, getting a number out of a diagram is fairly simple. If the diagram has no closed loops in it (if it’s what we call a tree diagram) then knowing the properties of the in-coming and out-going particles is enough to know the rest. If there are loops, though, there’s uncertainty: you have to add up every possible momentum of the particles in the loops. You do that with a different integral, and that’s the one that we sometimes refer to as a loop integral. (Perhaps confusingly, these are also often called Feynman integrals: Feynman did a lot of stuff!)

$$\frac{i^{a+l(1-d/2)}\pi^{ld/2}}{\prod_i \Gamma(a_i)}\int_0^\infty\cdots\int_0^\infty \prod_i\alpha_i^{a_i-1}\,U^{-d/2}e^{iF/U-i\sum_i m_i^2\alpha_i}\,d\alpha_1\cdots d\alpha_n$$

Loop integrals can be pretty complicated, but at heart they’re the same sort of thing you might have seen in a calculus class. Mathematicians are pretty comfortable with them, and they give rise to numbers that mathematicians find very interesting.
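To back up that “calculus class” claim, here’s a small numerical sketch (a toy example of my own, not one from the literature): a convergent Euclidean loop-style integral, ∫ d⁴k/(2π)⁴ 1/(k²+m²)³, whose exact value is 1/(32π²m²). Once the angular integral is done (d⁴k = 2π²k³dk), what’s left is an ordinary one-dimensional integral you can handle with Simpson’s rule.

```python
import math

# Toy loop integral (Euclidean, convergent):
#   I(m) = integral of d^4k/(2pi)^4 * 1/(k^2 + m^2)^3 = 1/(32 pi^2 m^2).
# The angular part gives d^4k = 2 pi^2 k^3 dk, leaving a radial integral.

def loop_integral(m, cutoff=500.0, steps=200000):
    """Simpson's rule for the radial integral, up to a large momentum cutoff."""
    f = lambda k: k**3 / (k * k + m * m)**3
    h = cutoff / steps
    total = f(0.0) + f(cutoff)
    for i in range(1, steps):
        total += (4 if i % 2 else 2) * f(i * h)
    return (2 * math.pi**2 / (2 * math.pi)**4) * (total * h / 3)

m = 1.0
numeric = loop_integral(m)
exact = 1 / (32 * math.pi**2 * m**2)
print(abs(numeric - exact) < 1e-6)  # True: ordinary calculus recovers it
```

Real loop integrals are harder (divergences, more scales, more dimensions to integrate over), but they’re the same species of object.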

Path integrals are very different. In some sense, they’re an “integral over integrals”, adding up every loop integral you could write down. Mathematicians can define path integrals in special cases, but it’s still not clear that the general case, the overall path integral picture we use, actually makes rigorous mathematical sense.

So if you see physicists talking about integrals, it’s worth taking a moment to figure out which one we mean. Path integrals and loop integrals are both important, but they’re very, very different things.

Unreasonably Big Physics

The Large Hadron Collider is big, eight and a half kilometers across. It’s expensive, with a cost to construct and operate in the billions. And with an energy of 6.5 TeV per proton, it’s the most powerful collider in the world, accelerating protons to 0.99999999 of the speed of light.

The LHC is reasonable. After all, it was funded, and built. What does an unreasonable physics proposal look like?

It’s probably unfair to call the Superconducting Super Collider unreasonable; after all, it did almost get built. It would have been a 28-kilometer-wide circle in the Texas desert, accelerating protons to an energy of 20 TeV, three times the energy of the LHC. When it was cancelled in 1993, it was projected to cost twelve billion dollars, and two billion had already been spent digging the tunnel. The US hasn’t invested in a similarly sized project since.

A better example of an unreasonable proposal might be the Collider-in-the-Sea. (If that link is paywalled, this paper covers most of the same information.)

[Image: map of the proposed Collider-in-the-Sea]

If you run out of room on land, why not build your collider underwater?

Ok, there are pretty obvious reasons why not. Surprisingly, the people proposing the Collider-in-the-Sea do a decent job of answering them. They plan to put it far enough out that it won’t disrupt shipping, and deep enough down that it won’t interfere with fish. Apparently at those depths even a hurricane barely ripples the water, and they argue that the technology exists to keep a floating ring stable under those conditions. All in all, they’re imagining a collider 600 kilometers in diameter, accelerating protons to 250 TeV, all for a cost they claim would be roughly comparable to the (substantially smaller) new colliders that China and Europe are considering.

I’m sure that there are reasons I’ve overlooked why this sort of project is impossible. (I mean, just look at the map!) Still, it’s impressive that they can marshal this much of an argument.

Besides, there are even more impossible projects, like this one, by Sugawara, Hagura, and Sanami. Their proposal for a 1000 TeV neutrino beam isn’t intended for research: rather, the idea is a beam powerful enough to send neutrinos through the Earth to destroy nuclear bombs. Such a beam could cause the bombs to detonate prematurely, “fizzling” with about 3% of the explosion they would normally produce.

In this case, Sugawara and co. admit that their proposal is pure fantasy. With current technology they would need a ring larger than the Collider-in-the-Sea, and the project would cost hundreds of billions of dollars. It’s not even clear who would want to build such a machine, or who could get away with building it: the authors imagine a science fiction-esque world government to foot the bill.

There’s a spectrum of papers that scientists write, from whimsical speculation to serious work. The press doesn’t always make the difference clear, so it’s a useful skill to see the clues in the writing that show where a given proposal lands. In the case of the Sugawara and co. proposal, the paper is littered with caveats, explicitly making it clear that it’s just a rough estimate. Even the first line, dedicating the paper to another professor, should get you to look twice: while this sometimes happens on serious papers, often it means the paper was written as a fun gift for the professor in question. The Collider-in-the-Sea doesn’t have these kinds of warning signs, and it’s clear its authors take it a bit more seriously. Nonetheless, comparing the level of detail to other accelerator proposals, even those from the same people, should suggest that the Collider-in-the-Sea isn’t entirely on the same level. As wacky as it is to imagine, we probably won’t get a collider that takes up most of the Gulf of Mexico, or a massive neutrino beam capable of blowing up nukes around the world.

The Rippling Pond Universe

[Background: Someone told me they couldn’t imagine popularizing Quantum Field Theory in the same flashy way people popularize String Theory. Naturally I took this as a challenge. Please don’t take any statements about what “really exists” here too seriously, this isn’t intended as metaphysics, just metaphor.]

 

You probably learned about atoms in school.

Your teacher would have explained that these aren’t the same atoms the ancient Greeks imagined. Democritus thought of atoms as indivisible, unchanging spheres, the fundamental constituents of matter. We know, though, that atoms aren’t indivisible. They’re clouds of electrons, buzzing in their orbits around a nucleus of protons and neutrons. Chemists can divide the electrons from the rest, nuclear physicists can break the nucleus. The atom is not indivisible.

And perhaps your teacher remarked on how amazing it is, that the nucleus is such a tiny part of the atom, that the atom, and thus all solid matter, is mostly empty space.

 

You might have learned that protons and neutrons, too, are not indivisible. That each proton, and each neutron, is composed of three particles called quarks, particles which can be briefly freed by powerful particle colliders.

And you might have wondered, then, even if you didn’t think to ask: are quarks atoms? The real atoms, the Greek atoms, solid indestructible balls of fundamental matter?

 

They aren’t, by the way.

 

You might have gotten an inkling of this, learning about beta decay. In beta decay, a neutron transforms, becoming a proton, an electron, and a neutrino. Look for an electron inside a neutron, and you won’t find one. Even if you look at the quarks, you see the same transformation: a down quark becomes an up quark, plus an electron, plus a neutrino. If quarks were atoms, indivisible and unchanging, this couldn’t happen. There’s nowhere for the electron to hide.

 

In fact, there are no atoms, not the way the Greeks imagined. Just ripples.

[Image: ripples from a water drop]

Picture the universe as a pond. This isn’t a still pond: something has disturbed it, setting ripples and whirlpools in motion. These ripples and whirlpools skim along the surface of the pond, eddying together and scattering apart.

Our universe is not a simple pond, and so these are not simple ripples. They shine and shimmer, each with their own bright hue, colors beyond our ordinary experience that mix in unfamiliar ways. The different-colored ripples interact, merge and split, and the pond glows with their light.

Stand back far enough, and you notice patterns. See that red ripple, that stays together and keeps its shape, that meets other ripples and interacts in predictable ways. You might imagine the red ripple is an atom, truly indivisible…until it splits, transforms, into ripples of new colors. The quark has changed, down to up, an electron and a neutrino rippling away.

All of our world is encoded in the colors of these ripples, each kind of charge its own kind of hue. With a wink (like your teacher’s, telling you of empty atoms), I can tell you that distance itself is just a kind of ripple, one that links other ripples together. The pond’s very nature as a place is defined by the ripples on it.

 

This is Quantum Field Theory, the universe of ripples. Democritus said that in truth there are only atoms and the void, but he was wrong. There are no atoms. There is only the void. It ripples and shimmers, and each of us lives as a collection of whirlpools, skimming the surface, seeming concrete and real and vital…until the ripples dissolve, and a new pattern comes.

The Way You Think Everything Is Connected Isn’t the Way Everything Is Connected

I hear it from older people, mostly.

“Oh, I know about quantum physics, it’s about how everything is connected!”

“String theory: that’s the one that says everything is connected, right?”

“Carl Sagan said we are all stardust. So really, everything is connected.”

[Image: a Connect Four board]

It makes Connect Four a lot easier anyway

I always cringe a little when I hear this. There’s a misunderstanding here, but it’s not a nice clean one I can clear up in a few sentences. It’s a bunch of interconnected misunderstandings, mixing some real science with a lot of confusion.

To get it out of the way first, no, string theory is not about how “everything is connected”. String theory describes the world in terms of strings, yes, but don’t picture those strings as links connecting distant places: string theory’s proposed strings are very, very short, much smaller than the scales we can investigate with today’s experiments. The reason they’re thought to be strings isn’t that they connect distant things; it’s that being strings lets them wiggle (counteracting some troublesome wiggles in quantum gravity) and wind (curling up in six extra dimensions in a multitude of ways, giving us what looks like a lot of different particles).

(Also, for technical readers: yes, strings also connect branes, but that’s not the sort of connection these people are talking about.)

What about quantum mechanics?

Here’s where it gets trickier. In quantum mechanics, there’s a phenomenon called entanglement. Entanglement really does connect things in different places…for a very specific definition of “connect”. And there’s a real (but complicated) sense in which these connections end up connecting everything, which you can read about here. There’s even speculation that these sorts of “connections” in some sense give rise to space and time.

You really have to be careful here, though. These are connections of a very specific sort. Specifically, they’re the sort that you can’t do anything through.

Connect two cans with a length of string, and you can send messages between them. Connect two particles with entanglement, though, and you can’t send messages between them…at least not any faster than between two non-entangled particles. Even in a quantum world, physics still respects locality: the principle that you can only affect the world where you are, and that any changes you make can’t travel faster than the speed of light. Ansibles, science-fiction devices that communicate faster than light, can’t actually exist according to our current knowledge.

What kind of connection is entanglement, then? That’s a bit tricky to describe in a short post. One way to think about entanglement is as a connection of logic.

Imagine someone takes a coin and cuts it along the rim into a heads half and a tails half. They put the two halves in two envelopes, and randomly give you one. You don’t know whether you have heads or tails…but you know that if you open your envelope and it shows heads, the other envelope must have tails.

[Image: a nickel]

Unless they’re a spy. Then it could contain something else.

Entanglement starts out with connections like that. Instead of a coin, take a particle that isn’t spinning and “split” it into two particles spinning in different directions, “spin up” and “spin down”. Like the coin, the two particles are “logically connected”: you know if one of them is “spin up” the other is “spin down”.

What makes a quantum coin different from a classical coin is that there’s no way to figure out the result in advance. If you watch carefully, you can see which coin gets put in to which envelope, but no matter how carefully you look you can’t predict which particle will be spin up and which will be spin down. There’s no “hidden information” in the quantum case, nowhere nearby you can look to figure it out.

That makes the connection seem a lot weirder than a regular logical connection. It also has slightly different implications, weirdness in how it interacts with the rest of quantum mechanics, things you can exploit in various ways. But none of those ways, none of those connections, allow you to change the world faster than the speed of light. In a way, they’re connecting things in the same sense that “we are all stardust” is connecting things: tied together by logic and cause.
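If you want to see the “can’t do anything through it” part concretely, here’s a small sketch using the textbook probabilities for a spin-singlet pair (the probabilities are standard quantum mechanics; the code itself is just my illustration). Whatever angle Bob chooses to measure along, Alice’s own results stay 50/50, so she can’t read a message out of them.

```python
import math

def singlet_joint(a, b):
    """Joint outcome probabilities for spin measurements along angles a and b
    on a spin-singlet pair: opposite results occur with probability
    cos^2((a - b)/2), matching results with sin^2((a - b)/2)."""
    p_opp = math.cos((a - b) / 2) ** 2
    p_same = math.sin((a - b) / 2) ** 2
    return {("up", "down"): p_opp / 2, ("down", "up"): p_opp / 2,
            ("up", "up"): p_same / 2, ("down", "down"): p_same / 2}

# Alice measures along angle 0; Bob tries several different angles.
# Alice's marginal probability of "up" never budges from one half:
for bob_angle in (0.0, 1.0, 2.5):
    joint = singlet_joint(0.0, bob_angle)
    p_alice_up = joint[("up", "up")] + joint[("up", "down")]
    print(round(p_alice_up, 12))  # 0.5 each time
```

The correlations only show up when the two of them later compare notes, and comparing notes happens at ordinary, slower-than-light speeds.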

So as long as this is all you mean by “everything is connected” then sure, everything is connected. But often, people seem to mean something else.

Sometimes, they mean something explicitly mystical. They’re people who believe in dowsing rods and astrology, in sympathetic magic, rituals you can do in one place to affect another. There is no support for any of this in physics. Nothing in quantum mechanics, in string theory, or in big bang cosmology has any support for altering the world with the power of your mind alone, or the stars influencing your day to day life. That’s just not the sort of connection we’re talking about.

Sometimes, “everything is connected” means something a bit more loose, the idea that someone’s desires guide their fate, that you could “know” something happened to your kids the instant it happens from miles away. This has the same problem, though, in that it’s imagining connections that let you act faster than light, where people play a special role. And once again, these just aren’t that sort of connection.

Sometimes, finally, it’s entirely poetic. “Everything is connected” might just mean a sense of awe at the deep physics in mundane matter, or a feeling that everyone in the world should get along. That’s fine: if you find inspiration in physics then I’m glad it brings you happiness. But poetry is personal, so don’t expect others to find the same inspiration. Your “everyone is connected” might not be someone else’s.

The Many Worlds of Condensed Matter

Physics is the science of the very big and the very small. We study the smallest scales, the fundamental particles that make up the universe, and the largest, stars on up to the universe as a whole.

We also study the world in between, though.

That’s the domain of condensed matter, the study of solids, liquids, and other medium-sized arrangements of stuff. And while it doesn’t make the news as often, it’s arguably the biggest field in physics today.

(In case you’d like some numbers, the American Physical Society has divisions dedicated to different sub-fields. Condensed Matter Physics is almost twice the size of the next biggest division, Particles & Fields. Add in other sub-fields that focus on medium-sized stuff, like solid state physics, optics, or biophysics, and you get a majority of physicists focused on the middle of the distance scale.)

When I started grad school, I didn’t pay much attention to condensed matter and related fields. Beyond the courses in quantum field theory and string theory, my “breadth” courses were on astrophysics and particle physics. But over and over again, from people in every sub-field, I kept hearing the same recommendation:

“You should take Solid State Physics. It’s a really great course!”

At the time, I never understood why. It was only later, once I had some research under my belt, that I realized:

Condensed matter uses quantum field theory!

The same basic framework, describing the world in terms of rippling quantum fields, doesn’t just work for fundamental particles. It also works for materials. Rather than describing the material in terms of its fundamental parts, condensed matter physicists “zoom out” and talk about overall properties, like sound waves and electric currents, treating them as if they were the particles of quantum field theory.

This tends to confuse the heck out of journalists. Not used to covering condensed matter (and sometimes egged on by hype from the physicists), they mix up the metaphorical particles of these systems with the sort of particles made by the LHC, with predictably dumb results.

Once you get past the clumsy journalism, though, this kind of analogy has a lot of value.

Occasionally, you’ll see an article about string theory providing useful tools for condensed matter. This happens, but it’s less widespread than some of the articles make it out to be: condensed matter is a huge and varied field, and string theory applications tend to be of interest to only a small piece of it.

It doesn’t get talked about much, but the dominant trend is actually in the other direction: increasingly, string theorists need to have at least a basic background in condensed matter.

String theory’s curse/triumph is that it can give rise not just to one quantum field theory, but many: a vast array of different worlds obtained by twisting extra dimensions in different ways. Particle physicists tend to study a fairly small range of such theories, looking for worlds close enough to ours that they still fit the evidence.

Condensed matter, in contrast, creates its own worlds. Pick the right material, take the right slice, and you get quantum field theories of almost any sort you like. While you can’t go to higher dimensions than our usual four, you can certainly look at lower ones, at the behavior of currents on a sheet of metal or atoms arranged in a line. This has led some condensed matter theorists to examine a wide range of quantum field theories with one strange behavior or another, theories that wouldn’t have occurred to particle physicists but that, in many cases, are part of the cornucopia of theories you can get out of string theory.

So if you want to explore the many worlds of string theory, the many worlds of condensed matter offer a useful guide. Increasingly, tools from that community, like integrability and tensor networks, are migrating over to ours.

It’s gotten to the point where I genuinely regret ignoring condensed matter in grad school. Parts of it are ubiquitous enough, and useful enough, that some of it is an expected part of a string theorist’s background. The many worlds of condensed matter, as it turned out, were well worth a look.

What Makes Light Move?

Light always moves at the speed of light.

It’s not alone in this: anything that lacks mass moves at the speed of light. Gluons, if they weren’t constantly interacting with each other, would move at the speed of light. Neutrinos, back when we thought they were massless, were thought to move at the speed of light. Gravitational waves, and by extension gravitons, move at the speed of light.

This is, on the face of it, a weird thing to say. If I say a jet moves at the speed of sound, I don’t mean that it always moves at the speed of sound. Find it in its hangar and hopefully it won’t be moving at all.

And so, people occasionally ask me, why can’t we find light in its hangar? Why does light never stand still? What makes light move?

(For the record, you can make light “stand still” in a material, but that’s because the material is absorbing and reflecting it, so it’s not the “same” light traveling through. Compare the speed of a wave of hands in a stadium versus the speed you could run past the seats.)

This is surprisingly tricky to explain without math. Some people point out that if you want to see light at rest you need to speed up to catch it, but you can’t accelerate enough unless you too are massless. This probably sounds a bit circular. Some people talk about how, from light’s perspective, no time passes at all. This is true, but it seems to confuse more than it helps. Some people say that light is “made of energy”, but I don’t like that metaphor. Nothing is “made of energy”, nor is anything “made of mass” either. Mass and energy are properties things can have.

I do like game metaphors though. So, imagine that each particle (including photons, particles of light) is a character in an RPG.

[Image: Light Yagami]

For bonus points, play Light in an RPG.

You can think of energy as the particle’s “character points”. When the particle builds its character it gets a number of points determined by its energy. It can spend those points increasing its “stats”: mass and momentum, via the lesser-known big brother of E=mc^2, E^2=p^2c^2+m^2c^4.

Maybe the particle chooses to play something heavy, like a Higgs boson. Then they spend a lot of points on mass, and don’t have as much to spend on momentum. If they picked something lighter, like an electron, they’d have more to spend, so they could go faster. And if they spent nothing at all on mass, like light does, they could use all of their energy “points” boosting their speed.

Now, it turns out that these “energy points” don’t boost speed one for one, which is why low-energy light isn’t any slower than high-energy light. Instead, speed is determined by the ratio between energy and momentum. When they’re proportional to each other, when E^2=p^2c^2, then a particle is moving at the speed of light.

(Why this is the case is trickier to explain. You’ll have to trust me or wikipedia that the math works out.)
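Here’s a small sketch of the “character points” bookkeeping (the numbers are just illustrative): given a total energy and a mass, E² = p²c² + m²c⁴ fixes the momentum, and the speed comes out as v = pc²/E.

```python
import math

C = 299792458.0  # speed of light, m/s

def speed(energy, mass):
    """Speed of a particle with total energy E (joules) and mass m (kg),
    from E^2 = (p c)^2 + (m c^2)^2 and v = p c^2 / E."""
    rest_energy = mass * C**2
    if energy < rest_energy:
        raise ValueError("total energy can't be below the rest energy")
    pc = math.sqrt(energy**2 - rest_energy**2)
    return C * pc / energy

# Massless: every "point" goes to momentum, so v = c at any energy.
print(math.isclose(speed(1e-19, 0.0), C))         # True
print(math.isclose(speed(1e-13, 0.0), C))         # True
# Massive (an electron): faster with more energy, but always below c.
m_e = 9.109e-31
print(speed(1e-13, m_e) < speed(1e-12, m_e) < C)  # True
```

Spend points on mass and there’s less left for momentum; spend nothing on mass and, whatever your energy, the ratio E/p locks your speed to c.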

Some of you may be happy with this explanation, but others will accuse me of passing the buck. Ok, a photon with any energy will move at the speed of light. But why do photons have any energy at all? And even if they must move at the speed of light, what determines which direction?

Here I think part of the problem is an old physics metaphor, probably dating back to Newton, of a pool table.

[Image: racked pool balls]

A pool table is a decent metaphor for classical physics. You have moving objects following predictable paths, colliding off each other and the walls of the table.

Where people go wrong is in projecting this metaphor back to the beginning of the game. At the beginning of a game of pool, the balls are at rest, racked in the center. Then one of them is hit with the pool cue, and they’re set into motion.

In physics, we don’t tend to have such neat and tidy starting conditions. In particular, things don’t have to start at rest before something whacks them into motion.

A photon’s “start” might come from an unstable Higgs boson produced by the LHC. The Higgs decays, and turns into two photons. Since energy is conserved, these two each must have half of the energy of the original Higgs, including the energy that was “spent” on its mass. This process is quantum mechanical, and with no preferred direction the photons will emerge in a random one.

Photons in the LHC may seem like an artificial example, but in general whenever light is produced it’s due to particles interacting, and conservation of energy and momentum will send the light off in one direction or another.

(For the experts, there is of course the possibility of very low energy soft photons, but that’s a story for another day.)

Not even the beginning of the universe resembles that racked set of billiard balls. The question of what “initial conditions” make sense for the whole universe is a tricky one, but there isn’t a way to set it up where you start with light at rest. It’s not just that it’s not the default option: it isn’t even an available option.

Light moves at the speed of light, no matter what. That isn’t because light started at rest, and something pushed it. It’s because light has energy, and a particle has to spend its “character points” on something.