So You Want to Prove String Theory (Or: Nima Did Something Cool Again)

Nima Arkani-Hamed, of Amplituhedron fame, has been making noises recently about proving string theory.

Now, I can already hear the smartarses in the comments correcting me here. You can’t prove a scientific theory; you can only provide evidence for it.

Well, in this case I don’t mean “provide evidence”. (Direct evidence for string theory is quite unlikely at the moment given the high energies at which it becomes relevant and large number of consistent solutions, but an indirect approach might yet work.) I actually mean “prove”.

See, there are two ways to think about the problem of quantum gravity. One is as an experimental problem: at high enough energies for quantum gravity to be relevant, what actually happens? Since it’s going to be a very long time before we can probe those energies, though, in practice we instead have a technical problem: can we write down a theory that looks like gravity in familiar situations, while avoiding the pesky infinities that come with naive attempts at quantum gravity?
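For those who like to see the problem spelled out: the heart of it is that gravity’s coupling, Newton’s constant, carries units. Schematically, in natural units, the dimensionless strength of gravity grows with energy, which is why the naive approach falls apart around the Planck scale. (This is a back-of-the-envelope statement, not anything specific to the work discussed below.)

```latex
% Schematic: gravity's coupling has units, so its dimensionless strength
% grows with energy, and naive perturbation theory breaks down near the Planck scale.
\[
  \text{effective coupling} \;\sim\; G_N E^2 \;=\; \left(\frac{E}{M_{\mathrm{Pl}}}\right)^{2},
  \qquad M_{\mathrm{Pl}} \sim 10^{19}\ \mathrm{GeV}.
\]
```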

If you can prove that string theory is the only theory that does that, then you’ve proven string theory. If you can prove that string theory is the only theory that does that [with certain conditions] then you’ve proven string theory [with certain conditions].

That, in broad terms, is what Nima has been edging towards. At this year’s Strings conference, he unveiled some progress towards that goal. And since I just recently got around to watching his talk, you get to hear my take on it.

Nima has been working with Yu-tin Huang, an amplitudeologist who tends to show up everywhere, and one of his students. Working in parallel, an all-star cast has been doing a similar calculation for Yang-Mills theory. The Yang-Mills story is cool, and probably worth a post in its own right, but I think you guys are more interested in the quantum gravity one.

What is Nima doing here?

Nima is looking at scattering amplitudes, probabilities for particles to scatter off of each other. In this case, the particles are gravitons, the particle form of gravitational waves.

Normally, the problems with quantum gravity show up when your scattering amplitudes have loops. Here, Nima is looking at amplitudes without loops, the most important contributions when the force in question is weak (the “weakly coupled” in Nima’s title).

Even for these amplitudes you can gain insight into quantum gravity by seeing what happens at high energies (the “UV” in the title). String amplitudes have nice behavior at high energies, naive gravity amplitudes do not. The question then becomes, are there other amplitudes that preserve this nice behavior, while still obeying the rules of physics? Or is string theory truly unique, the only theory that can do this?
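To make “nice behavior” a little more concrete, here’s the standard tree-level comparison, written schematically in the spinor-helicity shorthand amplitudes people love (I’m dropping couplings, and conventions for the string scale α′ vary between references). The closed-string (Virasoro–Shapiro) amplitude is the naive gravity one dressed with gamma functions, and that dressing is what makes it fall off exponentially at high energies and fixed angle.

```latex
% Schematic four-graviton comparison at tree level (couplings and conventions suppressed;
% alpha' is the string scale, and s, t, u are the usual Mandelstam variables).
\[
  M_4^{\text{gravity}} \;\propto\; \frac{\langle 1\,2\rangle^{4}\,[3\,4]^{4}}{s\,t\,u},
  \qquad
  M_4^{\text{string}} \;\propto\; \frac{\langle 1\,2\rangle^{4}\,[3\,4]^{4}}{s\,t\,u}
  \prod_{x = s,\,t,\,u}
  \frac{\Gamma\!\left(1-\tfrac{\alpha' x}{4}\right)}{\Gamma\!\left(1+\tfrac{\alpha' x}{4}\right)}.
\]
```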

The team that asked a similar question about Yang-Mills theory found that string theory was unique, that every theory that obeyed their conditions was in some sense “stringy”. That makes it even more surprising that, for quantum gravity, the answer was no: the string theory amplitude is not unique. In fact, Nima and his collaborators found an infinite set of amplitudes that met their conditions, related by a parameter they could vary freely.

What are these other amplitudes, then?

Nima thinks they can’t be part of a consistent theory, and he’s probably right. They have a number of tests they haven’t done: in particular, they’ve only been looking at amplitudes involving two gravitons scattering off each other, but a real theory should have consistent answers for any number of gravitons interacting, and it doesn’t look like these “alternate” amplitudes can be generalized to work for that.

That said, at this point it’s still possible that these other amplitudes are part of some sort of sensible theory. And that would be incredibly interesting, because we’ve never seen anything like that before.

There are approaches to quantum gravity besides string theory, sure. But common to all of them is an inability to actually calculate scattering amplitudes. If there really were a theory that generated these “alternate” amplitudes, it wouldn’t correspond to any existing quantum gravity proposal.

(Incidentally, this is also why this sort of “proof” of string theory might not convince everyone. Non-string quantum gravity approaches tend to talk about things fairly far removed from scattering amplitudes, so some would see this kind of thing as apples and oranges.)

I’d be fascinated to see where this goes. Either we have a new set of gravity scattering amplitudes to work with, or string theory turns out to be unique in a more rigorous and specific way than we’ve previously known. No matter what, something interesting is going to happen.

After the talk, David Gross drew on his experience of the origin of string theory to question whether this work is just retreading the path to an old dead end. String theory arose from an attempt to find a scattering amplitude with nice properties, but it was only by understanding that amplitude physically, in terms of vibrating strings, that the field was able to make real progress.

I generally agree with Nima’s answer, but to re-frame it in my own words: in the amplitudes sub-field, there’s something of a cycle. We try to impose general rules until, by using those rules, we arrive at a new calculation technique. We then do a bunch of calculations with the new technique. Finally, we look at the results of those calculations, try to find new general rules, and start the cycle again.

String theory is the result of people applying general rules to scattering amplitudes and learning enough to discover not just a new calculation technique, but a new physical theory. Now, we’ve done quite a lot of string theory calculations, and quite a lot more quantum field theory calculations as well. We have a lot of “data”.

And when you have a lot of data, it becomes much more productive to look for patterns. Now, if we start trying to apply general rules, we have a much better idea of what we’re looking for. This lets us get a lot further than people did the first time through the cycle. It’s what let Nima find the Amplituhedron, and it’s something Yu-tin has a pretty good track record of as well.

So in general, I’m optimistic. As a community, we’re poised to find out some very interesting things about what gravity scattering amplitudes can look like. Maybe, we’ll even prove string theory. [With certain conditions, of course.😉 ]

Science Is a Collection of Projects, Not a Collection of Beliefs

Read a textbook, and you’ll be confronted by a set of beliefs about the world.

(If it’s a half-decent textbook, it will give justifications for those beliefs, and they will be true, putting you well on the way to knowledge.)

The same is true of most science popularization. In either case, you’ll be instructed that a certain set of statements about the world (or about math, or anything else) are true.

If most of your experience with science comes from popularizations and textbooks, you might think that all of science is like this. In particular, you might think of scientific controversies as matters of contrasting beliefs. Some scientists “believe in” supersymmetry, some don’t. Some “believe in” string theory, some don’t. Some “believe in” a multiverse, some don’t.

In practice, though, only settled science takes the form of beliefs. The rest, science as it is actually practiced, is better understood as a collection of projects.

Scientists spend most of their time working on projects. (Well, or procrastinating in my case.) Those projects, not our beliefs about the world, are how we influence other scientists, because projects build off each other. Any time we successfully do a calculation or make a measurement, we’re opening up new calculations and measurements for others to do. We all need to keep working and publishing, so anything that gives people something concrete to do is going to be influential.

The beliefs that matter come later. They come once projects have been so successful, and so widespread, that their success itself is evidence for beliefs. They’re the beliefs that serve as foundational assumptions for future projects. If you’re going to worry that some scientists are behaving unscientifically, these are the sorts of beliefs you want to worry about. Even then, things are often constrained by viable projects: in many fields, you can’t have a textbook without problem sets.

Far too many people seem to miss this distinction. I’ve seen philosophers focus on scientists’ public statements instead of their projects when trying to understand the implications of their science. I’ve seen bloggers and journalists who mostly describe conflicts of beliefs, what scientists expect and hope to be true rather than what they actually work on.

Do scientists have beliefs about controversial topics? Absolutely. Do those beliefs influence what they work on? Sure. But only so far as there’s actually something there to work on.

That’s why you see quite a few high-profile physicists endorsing some form of multiverse, but barely any actual journal articles about it. The belief in a multiverse may or may not be true, but regardless, there just isn’t much that one can do with the idea right now, and it’s what scientists are doing, not what they believe, that constitutes the health of science.

Different fields seem to understand this to different extents. I’m reminded of a story I heard in grad school, of two dueling psychologists. One of them believed that conversation was inherently cooperative, and showed that, unless unusually stressed or busy, people would put in the effort to understand the other person’s perspective. The other believed that conversation was inherently egocentric, and showed that, the more stressed or busy people are, the more they assume that everyone else has the same perspective they do.

Strip off the “beliefs”, and these two worked on the exact same thing, with the same results. With their beliefs included, though, they were bitter rivals who bristled if their grad students so much as mentioned the other scientist.

We need to avoid this kind of mistake. The skills we have, the kind of work we do, these are important, these are part of science. The way we talk about it to reporters, the ideas we champion when we debate, those are sidelines. They have some influence, dragging people one way or another. But they’re not what science is, because on the front lines, science is about projects, not beliefs.

Physics Is about Legos

There’s a summer camp going on at Waterloo’s Institute for Quantum Computing called QCSYS, the Quantum Cryptography School for Young Students. A lot of these kids are interested in physics in general, not just quantum computing, so they give them a tour of Perimeter. While they’re here, they get a talk from a local postdoc, and this year that postdoc was me.

There’s an image that Perimeter has tossed around a lot recently, All Known Physics in One Equation. This article has an example from a talk given by Neil Turok. I thought it would be fun to explain that equation in terms a (bright, recently taught about quantum mechanics) high school student could understand. To do that, I’d have to explain what the equation is made of: spinors and vectors and tensors and the like.

The last time I had to explain that kind of thing here, I used a video game metaphor. For this talk, I came up with a better metaphor: legos.

Vectors are legos. Spinors are legos. Tensors are legos. They’re legos because they can be connected up together, but only in certain ways. Their “bumps” have to line up properly. And their nature as legos determines what you can build with them.
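To give a flavor of what “bumps lining up” means, here are two typical terms of the sort that appear in that equation, written schematically with most of the indices suppressed. The Lorentz indices pair up with each other, while the spinor indices hidden inside the fermion fields and the gamma matrix pair up separately; a term where they don’t match simply isn’t allowed.

```latex
% Two typical Lagrangian terms: each index ("bump") has to pair up with a partner
% of the same type for the term to make sense. Most indices are suppressed here.
\[
  \mathcal{L} \;\supset\; -\tfrac{1}{4}\, F_{\mu\nu} F^{\mu\nu}
  \;+\; \bar{\psi}\, i \gamma^{\mu} D_{\mu} \psi .
\]
```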

If you’re interested, here’s my presentation. Experts be warned: there’s a handwaving warning early in this talk, and it applies to a lot of it. In particular, the discussion of gauge group indices leaves out a lot. My goal in this talk was to give a vague idea of what the Standard Model Lagrangian is “made of”, and from the questions I got I think I succeeded.

The Metaphysics of Card Games

I tend to be skeptical of attempts to apply metaphysics to physics. In particular, I get leery when someone tries to describe physics in terms of which fundamental things exist, and which things are made up of other things.

Now, I’m not the sort of physicist who thinks metaphysics is useless in general. I’ve seen some impressive uses of supervenience, for example.

But I think that, in physics, talk of “things” is almost always premature. As physicists, we describe the world mathematically. It’s the most precise way of describing the universe that we have access to. The trouble is, slightly different mathematics can imply the existence of vastly different “things”.

To give a slightly unusual example, let’s talk about card games.

[Image: the back of a Magic: The Gathering card]

To defeat metaphysics, we must best it at a children’s card game!

Magic: The Gathering is a collectible card game in which players play powerful spellcasters who fight by casting spells and summoning creatures. Those spells and creatures are represented by cards.

If you wanted to find which “things” exist in Magic: The Gathering, you’d probably start with the cards. And indeed, cards are pretty good candidates for fundamental “things”. As a player, you have a hand of cards, a discard pile (“graveyard”) and a deck (“library”), and all of these are indeed filled with cards.

However, not every “thing” in the game is a card. That’s because the game is in some sense limited: it needs to represent a broad set of concepts while still using physical, purchasable cards.

Suppose you have a card that represents a general. Every turn, the general recruits a soldier. You could represent the soldiers with actual cards, but they’d have to come from somewhere, and over many turns you might quickly run out.

Instead, Magic represents these soldiers with “tokens”. A token is not a card: you can’t shuffle a token into your deck or return it to your hand, and if you try to, it just ceases to exist. But otherwise, the tokens behave just like other creatures: they’re both the same type of “thing”, something Magic calls a “permanent”. Permanents live in an area between players called the “battlefield”.

And it gets even more complicated! Some creatures have special abilities. When those abilities are activated, they’re treated like spells in many ways: you can cast spells in response, and even counter them with the right cards. However, they’re not spells, because they’re not cards: like tokens, you can’t shuffle them into your deck. Instead, both they and spells that have just been cast live in another area, the “stack”.

So while Magic might look like it just has one type of “thing”, cards, in fact it has three: cards, permanents, and objects on the stack.
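If it helps to see the distinction laid out explicitly, here’s a toy sketch in Python. The names are my own invention, not anything from an official rules engine, but they capture the three kinds of “thing”:

```python
# A toy sketch (invented names, not any official rules engine)
# of Magic's three kinds of "thing" and how they differ.

class Card:
    """Lives in hands, libraries, and graveyards; can be drawn or shuffled."""
    def __init__(self, name):
        self.name = name

class Permanent:
    """Lives on the battlefield. A token is a Permanent with no Card behind it."""
    def __init__(self, name, source_card=None):
        self.name = name
        self.source_card = source_card  # None for a token

class StackObject:
    """A spell being cast or an activated ability, waiting on the stack.
    An ability has no Card of its own, so it can't be shuffled anywhere."""
    def __init__(self, name, source_card=None):
        self.name = name
        self.source_card = source_card

def return_to_hand(permanent, hand):
    """Only a Permanent backed by a Card can go back to a hand."""
    if permanent.source_card is not None:
        hand.append(permanent.source_card)
    # a token just ceases to exist: there's no card to move anywhere

# Example: a general (a real card) and a soldier token it recruits.
hand = []
general = Permanent("General", source_card=Card("General"))
soldier = Permanent("Soldier token")  # no source_card: a token
return_to_hand(general, hand)   # the general's card goes back to the hand
return_to_hand(soldier, hand)   # nothing happens; the token vanishes
```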

We can contrast this with another card game, Hearthstone.

[Image: a Hearthstone screenshot]

Hearthstone is much like Magic. You are a spellcaster, you cast spells, you summon creatures, and those spells and creatures are represented by cards.

The difference is, Hearthstone is purely electronic. You can’t go out and buy the cards in a store, they’re simulated in the online game. And this means that Hearthstone’s metaphysics can be a whole lot simpler.

In Hearthstone, if you have a general who recruits a soldier every turn, the soldiers can be cards just like the general. You can return them to your hand, or shuffle them into your deck, just like a normal card. Your computer can keep track of them, and make sure they go away properly at the end of the game.

This means that Hearthstone doesn’t need a concept of “permanents”: everything on its “battlefield” is just a card, which can have some strange consequences. If you return a creature to your hand, and you have room, it will just go there. But if your hand is full, and the creature has nowhere to go, it will “die”, in exactly the same way it would have died in the game if another creature killed it. From the game’s perspective, the creature was always a card, and the card “died”, so the creature died.
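The same toy sketch for Hearthstone needs only one kind of “thing” (again, the names are mine, not Blizzard’s actual code):

```python
# Toy sketch: in Hearthstone everything is a card, so "return to hand" and
# "dying because the hand is full" fall out of one set of rules.

MAX_HAND_SIZE = 10  # Hearthstone's hand limit

class Card:
    """Everything the game tracks, including summoned creatures, is just a Card."""
    def __init__(self, name):
        self.name = name
        self.alive = True

def return_to_hand(card, hand):
    """Returning a creature is just moving a card; a full hand has strange consequences."""
    if len(hand) < MAX_HAND_SIZE:
        hand.append(card)
    else:
        card.alive = False  # nowhere to go, so the creature "dies" like any other card

# Example: with a full hand, a returned creature dies instead.
hand = [Card(f"card {i}") for i in range(MAX_HAND_SIZE)]
creature = Card("Recruited Soldier")
return_to_hand(creature, hand)
print(creature.alive)  # False: the card had nowhere to go
```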

These small differences in implementation, in the “mathematics” of the game, change the metaphysics completely. Magic has three types of “things”, Hearthstone has only one.

And card games are a special case, because in some sense they’re built to make metaphysics easy. Cards are intuitive, everyday objects, and both Magic and Hearthstone are built off of our intuitions about them, which is why I can talk about “things” in either game.

Physics doesn’t have to be built that way. Physics is meant to capture our observations, and help us make predictions. It doesn’t have to sort itself neatly into “things”. Even if it does, I hope I’ve convinced you that small changes in physics could lead to large changes in which “things” exist. Unless you’re convinced that you understand the physics of something completely, you might want to skip the metaphysics. A minor mathematical detail could sweep it all away.

arXiv, Our Printing Press

[Image: Johannes Gutenberg]

Johannes Gutenberg, inventor of the printing press, and possibly the only photogenic thing on the Mainz campus

I’ve had a few occasions to dig into older papers recently, and I’ve noticed a trend: old papers are hard to read!

Ok, that might not be surprising. The older a paper is, the greater the chance it will use obsolete notation, or assume a context that has long passed by. Older papers have different assumptions about what matters, or what rigor requires, and their readers cared about different things. All this is to be expected: a slow, gradual approach to a modern style and understanding.

I’ve been noticing, though, that this slow, gradual approach doesn’t always hold. Specifically, it seems to speed up quite dramatically at one point: the introduction of arXiv, the website where we store all our papers.

Part of this could just be a coincidence. As it happens, the founding papers in my subfield, those that started Amplitudes with a capital “A”, were right around the time that arXiv first got going. It could be that all I’m noticing is the difference between Amplitudes and “pre-Amplitudes”, with the Amplitudes subfield sharing notation more than they did before they had a shared identity.

But I suspect that something else is going on. With arXiv, we don’t just share papers (that was done, piecemeal, before arXiv). We also share LaTeX.

LaTeX is a document formatting language, like a programming language for papers. It’s used pretty much universally in physics and math, and increasingly in other fields. As it turns out, when we post a paper to arXiv, we don’t just send a pdf: we include the raw LaTeX code as well.
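To give a sense of what that means, here’s a made-up example of the kind of source arXiv makes public: the Parke-Taylor formula, as someone might have typed it into their paper (my own transcription, not lifted from anyone’s actual file).

```latex
% A made-up example of the kind of LaTeX source arXiv makes public:
% the Parke-Taylor formula as someone might have typed it.
\[
  A_n^{\mathrm{MHV}} \;=\;
  \frac{\langle i\, j\rangle^{4}}
       {\langle 1\,2\rangle\,\langle 2\,3\rangle \cdots \langle n\,1\rangle}.
\]
```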

Before arXiv, if you wanted to include an equation from another paper, you’d format it yourself. You’d probably do it a little differently from the other paper, in accord with your own conventions, and just to make it easier on yourself. Over time, more and more differences would crop up, making older papers harder and harder to read.

With arXiv, you can still do all that. But you can also just copy.

Since arXiv makes the LaTeX code behind a paper public, it’s easy to lift the occasional equation. Even if you’re not lifting it directly, you can see how they coded it. Even if you don’t plan on copying, the default gets flipped around: instead of trying to match the previous paper’s equation and accidentally getting it wrong, any difference you introduce is intentional.

This reminds me, in a small-scale way, of the effect of the printing press on anatomy books.

Before the printing press, books on anatomy tended to be full of descriptions, but not illustrations. Illustrations weren’t reliable: there was no guarantee the monk who copied them would do so correctly, so nobody bothered. This made it hard to tell when an anatomist (fine, it was always Galen) was wrong: he could just be using an odd description. It was only after the printing press that books could actually have illustrations that were reliable across copies of a book. Suddenly, it was possible to point out that a fellow anatomist had left something out: it would be missing from the illustration!

In a similar way, arXiv seems to have led to increasingly standard notation. We still aren’t totally consistent…but we do seem a lot more consistent than older papers, and I think arXiv is the reason why.

Thought Experiments, Minus the Thought

My second-favorite Newton fact is that, despite inventing calculus, he refused to use it for his most famous work of physics, the Principia. Instead, he used geometrical proofs, tweaked to smuggle in calculus without admitting it.

Essentially, these proofs were thought experiments. Newton would start with a standard geometry argument, one that would have been acceptable to mathematicians centuries earlier. Then, he’d imagine taking it further, pushing a line or angle to some infinite point. He’d argue that, if the proof worked for every finite choice, then it should work in the infinite limit as well.

These thought experiments let Newton argue on the basis of something that looked more rigorous than calculus. However, they also held science back. At the time, only a few people in the world could understand what Newton was doing. It was only later, when Newton’s laws were reformulated in calculus terms, that a wider group of researchers could start doing serious physics.

What changed? If Newton could describe his physics with geometrical thought experiments, why couldn’t everyone else?

The trouble with thought experiments is that they require careful setup, setup that has to be thought through for each new thought experiment. Calculus took Newton’s geometrical thought experiments, and took out the need for thought: the setup was automatically a part of calculus, and each new researcher could build on their predecessors without having to set everything up again.
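The derivative is the textbook example: the limiting argument, a secant line tilting toward a tangent, gets done once and packaged into a definition that everyone can compute with afterwards, using rules like the product rule and chain rule instead of redoing the geometry each time.

```latex
% The limiting argument, done once and packaged into a definition
% that everyone can compute with afterwards.
\[
  f'(x) \;=\; \lim_{h \to 0} \frac{f(x+h) - f(x)}{h}.
\]
```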

This sort of thing happens a lot in science. An example from my field is the scattering matrix, or S-matrix.

The S-matrix, deep down, is a thought experiment. Take some particles, and put them infinitely far away from each other, off in the infinite past. Then, let them approach, close enough to collide. If they do, new particles can form, and these new particles will travel out again, infinitely far away, in the infinite future. The S-matrix, then, is a metaphorical matrix that tells you, for each possible set of incoming particles, what the probability is to get each possible set of outgoing particles.
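Written out schematically, the “metaphorical matrix” is just the overlap between an incoming state and an outgoing state, with the scattering probability given by its square:

```latex
% Schematically: the S-matrix element between an "in" state and an "out" state,
% and the corresponding scattering probability.
\[
  S_{fi} \;=\; \langle f,\, \mathrm{out} \,|\, i,\, \mathrm{in} \rangle,
  \qquad
  P(i \to f) \;=\; \left| S_{fi} \right|^{2}.
\]
```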

In a real collider, the particles don’t come from infinitely far away, and they don’t travel infinitely far before they’re stopped. But the distances are long enough, compared to the sizes relevant for particle physics, that the S-matrix is the right idea for the job.

Like calculus, the S-matrix is a thought experiment minus the thought. When we want to calculate the probability of particles scattering, we don’t need to set up the whole thought experiment all over again. Instead, we can start by calculating, and over time we’ve gotten very good at it.

In general, sub-fields in physics can be divided into those that have found their S-matrices, their thought experiments minus thought, and those that have not. When a topic has to rely on thought experiments, progress is much slower: people argue over the details of each setup, and it’s difficult to build something that can last. It’s only when a field turns the corner, removing the thought from its thought experiments, that people can start making real collaborative progress.

Still Traveling

I’m still traveling this week, so this will be a short post.

Last year, when I went to Amplitudes, I left Europe right after. This felt like a bit of a waste: an expensive, transcontinental flight, and I was only there for a week?

So this year, I resolved to visit a few more places. I was at the Niels Bohr Institute in Copenhagen earlier this week.

[Image: the Niels Bohr Institute]

Where the live LHC collisions, represented as lights shining on the face of the building, are rather spoiled by the lack of any actual darkness to see them by.

Now, I’m at Mainz, visiting Johannes Henn.

Oddly enough, I’ve got family connections to both places. My great-grandfather spent some time at the Niels Bohr Institute on his way out of Europe, and I have a relative who works at Mainz. So while the primary purpose of this trip was research, I’ve gotten to learn a little family history in the process.