# We Didn’t Deserve Hawking

I don’t usually do obituaries. I didn’t do one when Joseph Polchinski died, though his textbook is sitting within arm’s reach of me right now. I never collaborated with Polchinski, I never met him, and others were much better at telling his story.

I never met Stephen Hawking, either. When I was at Perimeter, I’d often get asked if I had. Visitors would see his name on the Perimeter website, and I’d have to disappoint them by explaining that he hadn’t visited the institute in quite some time. His health, while exceptional for a septuagenarian with ALS, wasn’t up to the travel.

Was his work especially relevant to mine? Only because of its relevance to everyone who does gravitational physics. The universality of singularities in general relativity, black hole thermodynamics, Hawking radiation: these sharpened the questions around quantum gravity. Without his work, string theory wouldn’t have tried to answer the questions Hawking posed, and it wouldn’t have become the field it is today.

Hawking was unique, though, not necessarily because of his work, but because of his recognizability. Those visitors to Perimeter were a cross-section of the Canadian public. Some of them didn’t know the name of the speaker for the lecture they came to see. Some, arriving after reading Lee Smolin’s book, could only refer to him as “that older fellow who thinks about quantum gravity”. But Hawking? They knew Hawking. Without exception, they knew Hawking.

Who was the last physicist the public knew, like that? Feynman, at the height of his popularity, might have been close. You’d have to go back to Einstein to find someone who was really solidly known like that, someone you could mention in homes across the world and expect recognition. And who else has that kind of status? Bohr might have it in Denmark. Go further back, and you’ll find people know Newton, they know Galileo.

Einstein changed our picture of space and time irrevocably. Newton invented physics as we know it. Galileo and Copernicus pointed up to the sky and shouted that the Earth moves!

Hawking asked questions. He told us what did and didn’t make sense, he showed what we had to take into account. He laid the rules of engagement, and the rest of quantum gravity came and asked alongside him.

We live in an age of questions now. We’re starting to glimpse the answers, we have candidates and frameworks and tools, and if we’re feeling very optimistic we might already be sitting on a theory of everything. But we haven’t turned that corner yet, from asking questions to changing the world.

These ages don’t usually get a household name. Normally, you need an Einstein, a Newton, a Galileo, you need to shake the foundations of the world.

Somehow, Hawking gave us one anyway. Somehow, in our age of questions, we put a face in everyone’s mind, a figure huddled in a wheelchair with a snarky, computer-generated voice. Somehow Hawking reached out and reminded the world that there were people out there asking, that there was a big beautiful puzzle that our field was trying to solve.

Deep down, I’m not sure we deserved that. I hope we deserve it soon.

Occasionally, you’ll see people argue that PhD degrees are unnecessary. Sometimes they’re non-scientists who don’t know what they’re talking about, sometimes they’re Freeman Dyson.

With the wide range of arguers comes a wide range of arguments, and I don’t pretend to be able to address them all. But I do think that PhD programs, or something like them, are necessary. Grad school performs a task that almost nothing else can: it turns students into researchers.

The difference between studying a subject and researching it is a bit like the difference between swimming laps in a pool and being a fish. You can get pretty good at swimming, to the point where you can go back and forth with no real danger of screwing up. But a fish lives there.

To do research in a subject, you really have to be able to “live there”. It doesn’t have to be your whole life, or even the most important part of your life. But it has to be somewhere you’re comfortable, where you can immerse yourself and interact with it naturally. You have to have “fluency”, in the same sort of sense you can be fluent in a language. And just as you can learn a language much faster by immersion than by just taking classes, most people find it a lot easier to become a researcher if they’re in an environment built around research.

Does that have to be grad school? Not necessarily. Some people get immersed in real research from an early age (Dyson certainly fell into that category). But even (especially) for a curious person, it’s easy to get immersed in something else instead. As a kid, I would probably happily have become a Dungeons and Dragons researcher if that was a real thing.

Grad school is a choice, to immerse yourself in something specific. You want to become a physicist? You can go somewhere where everyone cares about physics. A mathematician? Same deal. They even pay you, so you don’t need to fit research in between a bunch of part-time jobs. There are classes for those who learn better from classes, libraries for those who learn better from books, and those who learn from conversation can walk down the hall, knock on a door, and learn something new. You get the opportunity to surround yourself with a topic, to work it into your bones.

And the crazy thing? It really works. You go in with a student’s knowledge of a subject, often decades out of date, and you end up giving talks in front of the world’s experts. In most cases, you end up genuinely shocked by how much you’ve changed, how much you’ve grown. I know I was.

I’m not saying that all aspects of grad school are necessary. The thesis doesn’t make sense in every field, there’s a reason why theoretical physicists usually just staple their papers together and call it a day. Different universities have quite different setups for classes and teaching experience, so it’s unlikely that there’s one true way to arrange those. Even the concept of a single advisor might be more of an administrative convenience than a real necessity. But the core idea, of a place that focuses on the transformation from student to researcher, that pays you and gives you access to what you need…I don’t think that’s something we can do without.

# Writing the Paper Changes the Results

You spent months on your calculation, but finally it’s paid off. Now you just have to write the paper. That’s the easy part, right?

Not quite. Even if writing itself is easy for you, writing a paper is never just writing. To write a paper, you have to make your results as clear as possible, to fit them into one cohesive story. And often, doing that requires new calculations.

It’s something that first really struck me when talking to mathematicians, who may be the most extreme case. For them, a paper needs to be a complete, rigorous proof. Even when they have a result solidly plotted out in their head, when they’re sure they can prove something and they know what the proof needs to “look like”, actually getting the details right takes quite a lot of work.

Physicists don’t have quite the same standards of rigor, but we have a similar paper-writing experience. Often, trying to make our work clear raises novel questions. As we write, we try to put ourselves in the mind of a potential reader. Sometimes our imaginary reader is content and quiet. Other times, though, they object:

“Does this really work for all cases? What about this one? Did you make sure you can’t do this, or are you just assuming? Where does that pattern come from?”

Addressing those objections requires more work, more calculations. Sometimes, it becomes clear we don’t really understand our results at all! The paper takes a new direction, flows with new work to a new, truer message, one we wouldn’t have discovered if we hadn’t sat down and tried to write it out.

# At Least One Math Term That Makes Sense

I’ve complained before about how mathematicians name things. Mathematicians seem to have a knack for taking an ordinary, bland word, almost indistinguishable from the other ordinary, bland words they’ve used before, and assigning it an incredibly specific mathematical concept. Varieties and forms, motives and schemes: in each case you end up wishing they had picked a word that was just a little more descriptive.

Sometimes, though, a word that seems completely out of place actually has a fairly reasonable explanation. Such is the case for the word “period”.

Suppose you want to classify numbers. You have the integers, and the rational numbers. A bigger class is the “algebraic” numbers, which you can get “from algebra”: more specifically, as solutions of polynomial equations with rational coefficients. Numbers that aren’t algebraic are “transcendental”, a popular example being $\pi$.

Periods lie in between: a set that contains the algebraic numbers, but also many of the transcendental numbers. They’re numbers you can get, not from algebra, but from calculus: roughly, integrals of rational functions, taken over domains carved out by rational inequalities. These numbers were popularized by Kontsevich and Zagier, and they’ve led to a lot of fruitful inquiry in both math and physics.

But why the heck are they called periods?

Think about $e^{i x}$.

Or if you prefer, think about a circle.

$e^{i x}$ is a periodic function, with period $2\pi$. Take $x$ from $0$ to $2\pi$ and the function repeats: you’ve traveled in a circle.

Thought of another way, $2\pi$ is the volume of the circle. Up to a factor of $i$, it’s the integral around the unit circle of $\frac{dz}{z}$: the contour integral $\oint \frac{dz}{z} = 2\pi i$. And that sort of integral nicely matches Kontsevich and Zagier’s definition of a period.

The idea of a period, then, comes from generalizing this. What happens when you only go partway around the circle, stopping at some point $z$ on the unit circle? Then you need $x=-i \ln z$. So a logarithm can also be thought of as measuring part of the period of $e^{i x}$. And indeed, since a logarithm can be expressed as $\int\frac{dz}{z}$, logarithms count as periods in the Kontsevich-Zagier sense.
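You can see both of these numbers emerge concretely from integrals of rational functions: $\pi$ is the integral of $4/(1+x^2)$ from $0$ to $1$, and $\ln 2$ is the integral of $1/x$ from $1$ to $2$. A minimal numerical sketch in Python (the midpoint-rule integrator here is my own illustration, not anything from Kontsevich and Zagier):

```python
import math

def midpoint_integral(f, a, b, n=200_000):
    """Approximate the integral of f over [a, b] with the midpoint rule."""
    h = (b - a) / n
    return h * sum(f(a + (i + 0.5) * h) for i in range(n))

# pi as a period: the integral of the rational function 4/(1+x^2) over [0, 1]
pi_period = midpoint_integral(lambda x: 4 / (1 + x * x), 0.0, 1.0)

# log(2) as a period: the integral of the rational function 1/x over [1, 2]
log2_period = midpoint_integral(lambda x: 1 / x, 1.0, 2.0)

print(pi_period, log2_period)
```

Both agree with `math.pi` and `math.log(2)` to many digits, which is the point: a transcendental number pinned down entirely by rational data fed into an integral.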

Starting there, you can loosely think about the polylogarithm functions I like to work with as collections of logs, measuring periods of interlocking circles.

And if you need to go beyond polylogarithms, when you can’t just go circle by circle?

Then you need to think about functions with two periods, like Weierstrass’s elliptic function. Just as you can think about $e^{i x}$ as a circle, you can think of Weierstrass’s function in terms of a torus.

Obligatory donut joke here

The torus has two periods, corresponding to the two circles you can draw around it. The periods of Weierstrass’s function are transcendental numbers, and they fit Kontsevich and Zagier’s definition of periods. And if you take the inverse of Weierstrass’s function, you get an elliptic integral, just like taking the inverse of $e^{i x}$ gives a logarithm.

So mathematicians, I apologize. Periods, at least, make sense.

# Valentine’s Day Physics Poem 2018

Valentine’s Day was this week, so long-time readers should know what to expect. To continue this blog’s tradition, I’m posting another one of my old physics poems.

Winding Number One

When you feel twisted up inside, you may be told to step back

That after a long time, from a long distance

All things fall off.

So I stepped back.

But looking in from a distance

On the border (at infinity)

A shape remained

Etched deep

In the equation of my being

A shape that wouldn’t fall off

Even at infinity.

And they may tell you to wait and see,

That you will evolve in time

That all things change, continuously.

So I let myself change.

But no matter how long I waited

How much I evolved

I could not return

My new state cannot be deformed

To what I was before.

The shape at my border

Is basic, immutable.

Faced with my thoughts

I try to draw a map

And run out of space.

I need two selves

Two lives

To map my soul.

A double cover.

And now, faced by my dual

Tracing each index

Integrated over manifold possibilities

We do not vanish

We have winding number one.

By A. Physicist

…because it disagrees with precision electroweak measurements

…………………………………..with bounds from ATLAS and CMS

…………………………………..with the power spectrum of the CMB

…………………………………..with Eötvös experiments

…because it isn’t gauge invariant

………………………….Lorentz invariant

………………………….diffeomorphism invariant

………………………….background-independent, whatever that means

…because it violates unitarity

…………………………………locality

…………………………………causality

…………………………………observer-independence

…………………………………technical naturalness

…………………………………international treaties

…………………………………cosmic censorship

…because you screwed up the calculation

…because you didn’t actually do the calculation

…because I don’t understand the calculation

…because you predict too many magnetic monopoles

……………………………………too many proton decays

……………………………………too many primordial black holes

…………………………………..remnants, at all

…because it’s fine-tuned

…because it’s suspiciously finely-tuned

…because it’s finely tuned to be always outside of experimental bounds

…because you’re misunderstanding quantum mechanics

…………………………………………………………..black holes

………………………………………………………….effective field theory

…………………………………………………………..thermodynamics

…………………………………………………………..the scientific method

…because Condensed Matter would contribute more to Chinese GDP

…because the approximation you’re making is unjustified

…………………………………………………………………………is not valid

…………………………………………………………………………is wildly overoptimistic

………………………………………………………………………….is just kind of lazy

…because there isn’t a plausible UV completion

…because you care too much about the UV

…because it only works in polynomial time

…………………………………………..exponential time

…………………………………………..factorial time

…because even if it’s fast it requires more memory than any computer on Earth

…because it requires more bits of memory than atoms in the visible universe

…because it has no meaningful advantages over current methods

…because it has meaningful advantages over my own methods

…because it can’t just be that easy

…because it’s not the kind of idea that usually works

…because it’s not the kind of idea that usually works in my field

…because it isn’t canonical

…because it’s ugly

…because it’s baroque

…because it ain’t baroque, and thus shouldn’t be fixed

…because only a few people work on it

…because far too many people work on it

…because clearly it will only work for the first case

……………………………………………………………….the first two cases

……………………………………………………………….the first seven cases

……………………………………………………………….the cases you’ve published and no more

…because I know you’re wrong

…because I strongly suspect you’re wrong

…because I strongly suspect you’re wrong, but saying I know you’re wrong looks better on a grant application

…….in a blog post

…because I’m just really pessimistic about something like that ever actually working

…because I’d rather work on my own thing, that I’m much more optimistic about

…because if I’m clear about my reasons

……and what I know

…….and what I don’t

……….then I’ll convince you you’re wrong.

……….or maybe you’ll convince me?

# Unreasonably Big Physics

The Large Hadron Collider is big, eight and a half kilometers across. It’s expensive, with a cost to construct and operate in the billions. And with an energy of 6.5 TeV per proton, it’s the most powerful collider in the world, accelerating protons to 0.99999999 of the speed of light.
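That string of nines isn’t arbitrary: it follows from the beam energy and the proton’s rest energy via special relativity, $E=\gamma m c^2$ and $\beta=\sqrt{1-1/\gamma^2}$. A quick back-of-envelope check in Python (the constants are standard rounded values):

```python
import math

proton_rest_energy_GeV = 0.938272  # proton rest mass energy, ~938.272 MeV
beam_energy_GeV = 6500.0           # LHC beam energy: 6.5 TeV per proton

# Lorentz factor gamma = E / (m c^2): roughly 6900 for an LHC proton
gamma = beam_energy_GeV / proton_rest_energy_GeV

# Speed as a fraction of c: beta = sqrt(1 - 1/gamma^2)
beta = math.sqrt(1 - 1 / gamma**2)

print(f"gamma = {gamma:.0f}, v/c = {beta:.8f}")
```

Formatted to eight decimal places, $v/c$ comes out to the quoted 0.99999999.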

The LHC is reasonable. After all, it was funded, and built. What does an unreasonable physics proposal look like?

It’s probably unfair to call the Superconducting Super Collider unreasonable; after all, it almost got built. It would have been a 28-kilometer-wide circle in the Texas desert, accelerating protons to an energy of 20 TeV, three times the energy of the LHC. When it was cancelled in 1993, it was projected to cost twelve billion dollars, and two billion had already been spent digging the tunnel. The US hasn’t invested in a similarly sized project since.

A better example of an unreasonable proposal might be the Collider-in-the-Sea. (If that link is paywalled, this paper covers most of the same information.)

If you run out of room on land, why not build your collider underwater?

Ok, there are pretty obvious reasons why not. Surprisingly, the people proposing the Collider-in-the-Sea do a decent job of answering them. They plan to put it far enough out that it won’t disrupt shipping, and deep enough down that it won’t interfere with fish. Apparently at those depths even a hurricane barely ripples the water, and they argue that the technology exists to keep a floating ring stable under those conditions. All in all, they’re imagining a collider 600 kilometers in diameter, accelerating protons to 250 TeV, all for a cost they claim would be roughly comparable to the (substantially smaller) new colliders that China and Europe are considering.

I’m sure that there are reasons I’ve overlooked why this sort of project is impossible. (I mean, just look at the map!) Still, it’s impressive that they can marshal this much of an argument.

Besides, there are even more impossible projects, like this one, by Sugawara, Hagura, and Sanami. Their proposal for a 1000 TeV neutrino beam isn’t intended for research: rather, the idea is a beam powerful enough to send neutrinos through the Earth to destroy nuclear bombs. Such a beam could cause the bombs to detonate prematurely, “fizzling” with about 3% of the yield they would have normally.

In this case, Sugawara and co. admit that their proposal is pure fantasy. With current technology they would need a ring larger than the Collider-in-the-Sea, and the project would cost hundreds of billions of dollars. It’s not even clear who would want to build such a machine, or who could get away with building it: the authors imagine a science fiction-esque world government to foot the bill.

There’s a spectrum of papers that scientists write, from whimsical speculation to serious work. The press doesn’t always make the difference clear, so it’s a useful skill to see the clues in the writing that show where a given proposal lands. In the case of the Sugawara and co. proposal, the paper is littered with caveats, explicitly making it clear that it’s just a rough estimate. Even the first line, dedicating the paper to another professor, should get you to look twice: while this sometimes happens on serious papers, often it means the paper was written as a fun gift for the professor in question.

The Collider-in-the-Sea doesn’t have these kinds of warning signs, and it’s clear its authors take it a bit more seriously. Nonetheless, comparing the level of detail to other accelerator proposals, even those from the same people, should suggest that the Collider-in-the-Sea isn’t entirely on the same level. As wacky as it is to imagine, we probably won’t get a collider that takes up most of the Gulf of Mexico, or a massive neutrino beam capable of blowing up nukes around the world.