Tag Archives: newton

A Newtonmas Present of Internet Content

I’m lazy this Newtonmas, so instead of writing a post of my own I’m going to recommend a few other people who do excellent work.

Quantum Frontiers is a shared blog updated by researchers connected to Caltech’s Institute for Quantum Information and Matter. While the whole blog is good, I’m going to be more specific and recommend the posts by Nicole Yunger Halpern. Nicole is a genuinely great writer, and her posts are full of vivid imagery and fun analogies. If she isn’t better known, it’s only because she lacks the attention-grabbing habit of getting into stupid arguments with other bloggers. Definitely worth a follow.

Recommending Slate Star Codex feels a bit strange, because it seems like everyone I’ve met who would enjoy the blog already reads it. It’s not a physics blog by any stretch, so it’s also an unusual recommendation to give here. Slate Star Codex writes about a wide variety of topics, and while the author isn’t an expert in most of them he does a lot more research than you or I would. If you’re interested in up-to-date meta-analyses on psychology, social science, and policy, pored over by someone with scrupulous intellectual honesty and an inexplicably large amount of time to indulge it, then Slate Star Codex is the blog for you.

I mentioned Piled Higher and Deeper a few weeks back, when I reviewed the author’s popular science book We Have No Idea. Piled Higher and Deeper is a webcomic about life in grad school. Humor is all about exaggeration, and it’s true that Piled Higher and Deeper exaggerates just how miserable and dysfunctional grad school can be…but not by as much as you’d think. I recommend that anyone considering grad school read Piled Higher and Deeper, and take it seriously. Grad school can really be like that, and if you don’t think you can deal with spending five or six years in the world of that comic you should take that into account.

Our Bargain

Sabine Hossenfelder has a blog post this week chastising particle physicists and cosmologists for following “upside-down Popper”, or assuming a theory is worth working on merely because it’s falsifiable. She describes her colleagues churning out one hypothesis after another, each tweaking an old idea just enough to make it falsifiable in the next experiment, without caring whether the hypothesis is actually likely to be true.

Sabine is much more of an expert in this area of physics (phenomenology) than I am, and I don’t presume to tell her she’s wrong about that community. But the problem she’s describing is part of something bigger, something that affects my part of physics as well.

There’s a core question we’d all like to answer: what should physicists work on? What criteria should guide us?

Falsifiability isn’t the whole story. The next obvious criterion is a sense of simplicity, of Occam’s Razor or mathematical elegance. Sabine has argued against the latter, which prompted a friend of mine to comment that between rejecting falsifiability and elegance, Sabine must want us to stop doing high-energy physics at all!

That’s more than a little unfair, though. I think Sabine has a reasonably clear criterion in mind. It’s the same criterion that most critics of the physics mainstream care about. It’s even the same criterion being used by the “other side”, the sort of people who criticize anything that’s not string/SUSY/inflation.

The criterion is quite a simple one: physics research should be productive. Anything we publish, anything we work on, should bring us closer to understanding the real world.

And before you object that this criterion is obvious, that it’s subjective, that it ignores the very real disagreements between the Sabines and the Luboses of the world…before any of that, please let me finish.

We can’t achieve this criterion. And we shouldn’t.

We can’t demand that all physics be productive without breaking a fundamental bargain, one we made when we accepted that science could be a career.

[Image: portrait of Robert Hooke, captioned “The Hunchback of Notre Science”]

It wasn’t always this way. Up until the nineteenth century, science was a hobby, not a job.

After Newton published his theory of gravity, he was famously accused by Robert Hooke of stealing the idea. There’s some controversy about this, but historians agree on a few points: that Hooke did write a letter to Newton suggesting a 1/r^2 force law, and that Hooke, unlike Newton, never really worked out the law’s full consequences.

Why not? In part, because Hooke, unlike Newton, had a job.

Hooke was arguably the first person for whom science was a full-time source of income. As curator of experiments for the Royal Society, he was responsible for setting up demonstrations for each Royal Society meeting. Later, he also handled correspondence for the Royal Society Journal. These responsibilities took up much of his time, and as a result, even if he was capable of following up on the consequences of 1/r^2, he wouldn’t have had time to focus on it. That kind of calculation wasn’t what he was being paid for.

We’re better off than Hooke today. We still have our responsibilities, to journals and teaching and the like, at various stages of our careers. But in the centuries since Hooke, expectations have changed, and real original research is no longer something we have to fit into our spare time. It’s now a central expectation of the job.

When scientific research became a career, we accepted a kind of bargain. On the positive side, you no longer have to be independently wealthy to contribute to science. More than that, the existence of professional scientists is the bedrock of technological civilization. With enough scientists around, we get modern medicine and the internet and space programs and the LHC, things that wouldn’t be possible in a world of rare wealthy geniuses.

We pay a price for that bargain, though. If science is a steady job, then it has to provide steady work. A scientist has to be able to go in, every day, and do science.

And the problem is, science doesn’t always work like that. There isn’t always something productive to work on. Even when there is, there isn’t always something productive for you to work on.

Sabine blames “upside-down Popper” on the current publish-or-perish environment in physics. If physics careers weren’t so cut-throat and the metrics they are judged by weren’t so flawed, then maybe people would have time to do slow, careful work on deeper topics rather than pumping out minimally falsifiable papers as fast as possible.

There’s a lot of truth to this, but I think at its core it’s a bit too optimistic. Each of us only has a certain amount of expertise, and sometimes that expertise just isn’t likely to be productive at the moment. Because science is a job, a person in that position can’t just go work at the Royal Mint like Newton did. (The modern-day equivalent would be working for Wall Street, but physicists rarely come back from that.) Instead, they keep doing what they know how to do, slowly branching out, until they’ve either learned something productive or their old topic becomes useful once more. You can think of it as a form of practice, where scientists keep their skills honed until they’re needed.

So if we slow down the rate of publication, if we create metrics for universities that let them hire based on the depth and importance of work and not just the number of papers and citations, if we manage all of that, then yes, we will improve science a great deal. But Lisa Randall still won’t work on Haag’s theorem.

In the end, we’ll still have physicists working on topics that aren’t actually productive.

[Image: a physicist lazing about unproductively under an apple tree]

So do we have to pay physicists to work on whatever they want, no matter how ridiculous?

No, I’m not saying that. We can’t expect everyone to do productive work all the time, but we can absolutely establish standards to make the work more likely to be productive.

Strange as it may sound, I think our standards for this are already quite good, or at least better than many other fields.

First, there’s falsifiability itself, or specifically our attitude towards it.

Physics’s obsession with falsifiability has one important benefit: it means that when someone proposes a new model of dark matter or inflation that they tweaked to be just beyond the current experiments, they don’t claim to know it’s true. They just claim it hasn’t been falsified yet.

This is quite different from what happens in biology and the social sciences. There, if someone tweaks their study to be just within statistical significance, people typically assume the study demonstrated something real. Doctors base treatments on it, and politicians base policy on it. Upside-down Popper has its flaws, but at least it’s never going to kill anybody, or put anyone in prison.

Admittedly, that’s a pretty low bar. Let’s try to set a higher one.

Moving past falsifiability, what about originality? We have very strong norms against publishing work that someone else has already done.

Ok, you (and probably Sabine) would object, isn’t that easy to get around? Aren’t all these Popper-flippers pretending to be original but really just following the same recipe each time, modifying their theory just enough to stay falsifiable?

To some extent. But if they were really following a recipe, you could beat them easily: just write the recipe down.

Physics progresses best when we can generalize, when we skip from case-by-case to understanding whole swaths of cases at once. Over time, there have been plenty of cases in which people have done that, where a number of fiddly hand-made models have been summarized in one parameter space. Once that happens, the rule of originality kicks in: now, no-one can propose another fiddly model like that again. It’s already covered.

As long as the recipe really is just a recipe, you can do this. You can write up what these people are doing in computer code, release the code, and then that’s that, they have to do something else. The problem is, most of the time it’s not really a recipe. It’s close enough to one that they can rely on it, close enough to one that they can get paper after paper when they need to…but it still requires just enough human involvement, just enough genuine originality, to be worth a paper.

The good news is that the range of “recipes” we can code up increases with time. Some spaces of theories we might never be able to describe in full generality (I’m glad there are people trying to do statistics on the string landscape, but good grief it looks quixotic). Some of the time though, we have a real chance of putting a neat little bow on a subject, labeled “no need to talk about this again”.

This emphasis on originality keeps the field moving. It means that despite our bargain, despite having to tolerate “practice” work as part of full-time physics jobs, we can still nudge people back towards productivity.

 

One final point: it’s possible you’re completely ok with the idea of physicists spending most of their time “practicing”, but just wish they wouldn’t make such a big deal about it. Maybe you can appreciate that “can I cook up a model where dark matter kills the dinosaurs” is an interesting intellectual exercise, but you don’t think it should be paraded in front of journalists as if it were actually solving a real problem.

In that case, I agree with you, at least up to a point. It is absolutely true that physics has a dysfunctional relationship with the media. We’re too used to describing whatever we’re working on as the most important thing in the universe, and journalists are convinced that’s the only way to get the public to pay attention. This is something we can and should make progress on. An increasing number of journalists are breaking from the trend, focusing not on covering the “next big thing” but on telling stories about people. We should do all we can to promote those journalists, to spread their work over the hype, to encourage the kind of stories that treat “practice” as interesting puzzles pursued by interesting people, not the solution to the great mysteries of physics. I know that if I ever do anything newsworthy, there are some journalists I’d give the story to before any others.

At the same time, it’s important to understand that some of the dysfunction here isn’t unique to physics, or even to science. Deep down the reason nobody can admit that their physics is “practice” work is the same reason people at job interviews claim to love the company, the same reason college applicants have to tell stirring stories of hardship and couples spend tens of thousands on weddings. We live in a culture in which nothing can ever just be “ok”, in which admitting things are anything other than exceptional is akin to calling them worthless. It’s an arms-race of exaggeration, and it goes far beyond physics.

(I should note that this “culture” may not be as universal as I think it is. If so, it’s possible its presence in physics is due to you guys letting too many of us Americans into the field.)

 

We made a bargain when we turned science into a career. We bought modernity, but the price we pay is subsidizing some amount of unproductive “practice” work. We can negotiate the terms of our bargain, and we should, tilting the field with incentives to get it closer to the truth. But we’ll never get rid of it entirely, because science is still done by people. And sometimes, despite what we’re willing to admit, people are just “ok”.

Thought Experiments, Minus the Thought

My second-favorite Newton fact is that, despite inventing calculus, he refused to use it for his most famous work of physics, the Principia. Instead, he used geometrical proofs, tweaked to smuggle in calculus without admitting it.

Essentially, these proofs were thought experiments. Newton would start with a standard geometry argument, one that would have been acceptable to mathematicians centuries earlier. Then, he’d imagine taking it further, pushing a line or an angle to an infinite or infinitesimal extreme. He’d argue that, if the proof worked for every finite choice, then it should work in the limit as well.
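
In modern terms, the step Newton kept smuggling in is just a limit. As a minimal example (in modern notation, nothing like Newton’s actual geometric language), the slope of a line through two nearby points on a curve is an ordinary finite statement for every separation h; the “infinite” move is letting h shrink away:

\[
\frac{f(x+h)-f(x)}{h} \;\xrightarrow{\;h \to 0\;}\; f'(x)
\]

Each finite h is a chord a classical geometer could check; the limit is the derivative in disguise.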

These thought experiments let Newton argue on the basis of something that looked more rigorous than calculus. However, they also held science back. At the time, only a few people in the world could understand what Newton was doing. It was only later, when Newton’s laws were reformulated in calculus terms, that a wider group of researchers could start doing serious physics.

What changed? If Newton could describe his physics with geometrical thought experiments, why couldn’t everyone else?

The trouble with thought experiments is that they require careful setup, setup that has to be thought through for each new thought experiment. Calculus took Newton’s geometrical thought experiments, and took out the need for thought: the setup was automatically a part of calculus, and each new researcher could build on their predecessors without having to set everything up again.

This sort of thing happens a lot in science. An example from my field is the scattering matrix, or S-matrix.

The S-matrix, deep down, is a thought experiment. Take some particles, and put them infinitely far away from each other, off in the infinite past. Then, let them approach, close enough to collide. If they do, new particles can form, and these new particles will travel out again, infinitely far away in the infinite future. The S-matrix, then, is a metaphorical matrix that tells you, for each possible set of incoming particles, what the probability is to get each possible set of outgoing particles.
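
In symbols, the standard schematic (leaving out every detail of the particles involved): the probability of a given outcome is the squared magnitude of a matrix element, and because the probabilities of all possible outcomes must add up to one, the S-matrix is unitary:

\[
P(\text{in} \to \text{out}) = \big|\langle \text{out} \,|\, S \,|\, \text{in} \rangle\big|^2, \qquad S^\dagger S = 1.
\]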

In a real collider, the particles don’t come from infinitely far away, and they don’t travel infinitely far before they’re stopped. But the distances are long enough, compared to the sizes relevant for particle physics, that the S-matrix is the right idea for the job.

Like calculus, the S-matrix is a thought experiment minus the thought. When we want to calculate the probability of particles scattering, we don’t need to set up the whole thought experiment all over again. Instead, we can start by calculating, and over time we’ve gotten very good at it.

In general, sub-fields in physics can be divided into those that have found their S-matrices, their thought experiments minus thought, and those that have not. When a topic has to rely on thought experiments, progress is much slower: people argue over the details of each setup, and it’s difficult to build something that can last. It’s only when a field turns the corner, removing the thought from its thought experiments, that people can start making real collaborative progress.

Newtonmas 2015

Merry Newtonmas!

I’ll leave up my poll a bit longer, but the results are already looking pretty consistent.

A strong plurality of my readers, a little more than a quarter, have PhDs in high energy or theoretical physics. Another big chunk (a bit over a fifth) are physics grad students. Altogether, that means almost half of my readers have some technical background in what I do.

In the comments, Cliff suggests this is a good reason to start writing more technical posts. Looking at the results, I agree, it looks like there would definitely be an audience for that sort of thing. Technical posts take a lot more effort than general audience posts, so don’t expect a lot of them…but you can definitely look forward to a few technical posts next year.

On the other hand, between people with some college physics and people who only saw physics in high school, about a third of my audience wouldn’t get much out of technical posts. Most of my posts will still be geared to this audience, since it’s kind of my brand at this point, but I do want to start experimenting with aiming a few posts to more specific segments.

Beyond that, I’ve got a smattering of readers in other parts of physics, and a few mathematicians. Aside from the occasional post defending physics notation, there probably won’t be much aimed at either group, but do let me know what I can do to make things more accessible!

 

Merry Newtonmas!

Yesterday, people around the globe celebrated the birth of someone whose new perspective and radical ideas changed history, perhaps more than anyone else’s.

I’m referring, of course, to Isaac Newton.

Ho ho ho!

Born on December 25, 1642, Newton is justly famed as one of history’s greatest scientists. By relating gravity on Earth to the force that holds the planets in orbit, Newton arguably created physics as we know it.

However, as with many prominent scientists, Newton’s greatness was not so much in what he discovered as in how he discovered it. Others had already had similar ideas about gravity. Robert Hooke in particular had written to Newton mentioning a law much like the one Newton eventually wrote down, leading Hooke to accuse Newton of plagiarism.

Newton’s great accomplishment was not merely proposing his law of gravitation, but justifying it, in a way that no-one had ever done before. When others (Hooke for example) had proposed similar laws, they were looking for a law that perfectly described the motion of the planets. Kepler had already proposed ellipse-shaped orbits, but it was clear by Newton and Hooke’s time that such orbits did not fully describe the motion of the planets. Hooke and others hoped that if some sufficiently skilled mathematician started with the correct laws, they could predict the planets’ motions with complete accuracy.

The genius of Newton was in attacking this problem from a different direction. In particular, Newton showed that his law of gravitation does result in Kepler’s (imperfect) ellipses…provided that there is only one planet.

With multiple planets, things become much more complicated. Even just two planets orbiting a single star, the infamous three-body problem, is so difficult that it’s impossible to write down an exact solution in general.

Sensibly, Newton didn’t try to write down an exact solution. Instead, he figured out an approximation: since the Sun is much bigger than the planets, he could simplify the problem and arrive at a partial solution. While he couldn’t perfectly predict the motions of the planets, he knew more than just that they were “approximately” ellipses: he had a prediction for how different from ellipses they should be.
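
In modern notation, a rough sketch of the idea (not Newton’s own formulation): treat the planets’ pulls on each other as a small correction to the Sun’s, controlled by the tiny ratio of a planet’s mass to the Sun’s, and expand around the one-planet ellipse:

\[
\vec{r}(t) = \vec{r}_0(t) + \epsilon\, \vec{r}_1(t) + O(\epsilon^2), \qquad \epsilon \sim \frac{m_\text{planet}}{M_\text{Sun}} \approx 10^{-3} \text{ even for Jupiter,}
\]

where \(\vec{r}_0\) is the ideal ellipse and \(\epsilon\,\vec{r}_1\) is the predicted deviation from it.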

That step was Newton’s great contribution. That insight, that science was able not just to provide exact answers to simpler problems but to guess how far those answers might be off, was something no-one else had really thought about before. It led to error analysis in experiments, and perturbation methods in theory. More generally, it led to the idea that scientists have to be responsible, not just for getting things “almost right”, but for explaining how their results are still wrong.

So this holiday season, let’s give thanks to the man whose ideas created science as we know it. Merry Newtonmas everyone!

Perimeter and Patronage

I’m visiting the Perimeter Institute this week. For the non-physicists in the audience, Perimeter is a very prestigious institute of theoretical physics, founded by Mike Lazaridis, the founder of BlackBerry. It’s quite swanky. Some first impressions:

  • This occurred to me several times: this place is what the Simons Center wants to be when it grows up.
  • You’d think the building would be impossible to navigate because it was designed by a theoretical physicist, but Freddy Cachazo assured us that the institute actually had to ask the architect to tone down the most impossibly ridiculous parts of the design. Looks like the only person crazier than a physicist is an artist.
  • Having table service at an institute café feels very swanky at first, but it’s actually a lot less practical than cafeteria-style dining. I think the Simons Center Café has it right on this one, even if they don’t quite understand the concept of hurricane relief (don’t have a link for that joke, but I can explain if you’re curious).
  • Perimeter has some government money, but much of its funding comes from private companies and foundations, particularly Research in Motion (or RIM, now BlackBerry). Incidentally, I’m told that PeRIMeter is supposed to be a reference to RIM.

What interests me is that you don’t see this sort of thing (private support) very often in other fields. Private donors will fund efforts to solve some real-world problem, like autism or income inequality. They rarely fund basic research*. When they do fund basic research, it’s usually at a particular university. Something like Perimeter, a private institute for basic research, is rather unusual. Perimeter itself describes its motivation as something akin to a long-range strategic investment, but I think this also ties back to the concept of patronage.

Like art, physics has a history of being a fashionable thing for wealthy patrons to support, usually when the research topic is in line with their wider interests. Newton, for example, re-cast his research in terms of its implications for an understanding of the tides to interest the nautically-minded King James II, despite the fact that he couldn’t predict the tides any better than anyone else in his day. Much like supporting art, supporting physics can allow someone’s name to linger on through history, while not running a risk of competing with others’ business interests like research in biology or chemistry might.

[Image: portrait of King James II, captioned “A man who liked his sailors”]

*Basic research is a term scientists use to refer to research that isn’t done with a particular application in mind. In theoretical physics, this often means theories that aren’t “true”.

There’s something about Symmetry…

Physicists talk a lot about symmetry. Listen to an article about string theory and you might get the idea that symmetry is some sort of mysterious, mystical principle of beauty, inexplicable to the common man or woman.

Well, if it were inexplicable, I wouldn’t be blogging about it, now would I?

Symmetry in physics is dead simple. At the same time, it’s a bit misleading.

When you think of symmetry, you probably think of objects: symmetric faces, symmetric snowflakes, symmetric sculptures. Symmetry in physics can be about objects, but it can also be about places: symmetry is the idea that if you do an experiment from a different point of view, you should get the same results. In a way, this is what makes all of physics possible: two people in two different parts of the world can do the same experiment, but because of symmetry they can compare results and agree on how the world works.

Of course, if that were all there was to symmetry, then it would hardly have the mystical reputation it does. The exciting, beautiful, and above all useful thing about symmetry is that, whenever there is a symmetry, there is a conservation law.

A conservation law is a law of physics that states that some quantity is conserved, that is, cannot be created or destroyed, but merely changed from one form to another. Energy is the classic example: you can’t create energy out of nothing, but you can turn the potential energy of gravity on top of a hill into the kinetic energy of a rolling ball, or the chemical energy of coal into the electrical energy in your power lines.
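
As a minimal worked example: if a ball starts at rest at height h and rolls down without friction, all of its gravitational potential energy becomes kinetic energy,

\[
m g h = \tfrac{1}{2} m v^2 \quad\Rightarrow\quad v = \sqrt{2 g h},
\]

so a 5-meter hill delivers \(v = \sqrt{2 \times 9.8 \times 5} \approx 9.9\) m/s at the bottom, whatever the ball’s mass.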

The fact that every symmetry creates a conservation law is not obvious. Proving it in general and describing how it works required a major breakthrough in mathematics. It was worked out by Emmy Noether, one of the greatest minds of her time, which given that her time included Einstein says rather a lot. Noether struggled for most of her life with the male-dominated establishment of academia, and spent many years teaching unpaid and under the names of male faculty, forbidden from being a professor because of her gender.
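
For a taste of the simplest case, here’s a standard textbook sketch (nowhere near the generality of Noether’s actual theorem): if a system’s Lagrangian \(L(q, \dot{q})\) doesn’t change when you shift the coordinate \(q\), the Euler–Lagrange equation immediately hands you a conserved quantity:

\[
\frac{\partial L}{\partial q} = 0 \quad\Rightarrow\quad \frac{d}{dt}\frac{\partial L}{\partial \dot{q}} = 0, \quad \text{so } p \equiv \frac{\partial L}{\partial \dot{q}} \text{ is constant.}
\]

Shift symmetry in space gives conservation of momentum; shift symmetry in time gives conservation of energy.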

[Image: Emmy Noether, captioned “Why must women always be banished to the Noether regions of physics?”]

Noether’s proof is remarkable, but if you’re not familiar with the mathematics it won’t mean much to you. If you want to get a feel for the connection between symmetries and conservation laws, you need to go back a bit further. For the best example, we need to go all the way back to the dawn of physics.

Christiaan Huygens was a contemporary of Isaac Newton, and like Noether he was arguably as smart as, if not smarter than, his more famous colleague. Huygens could be described as the first theoretical physicist. Long before Newton first wrote his three laws of motion, Huygens used thought experiments to prove deep facts about physics, and he did it using symmetry.

In one of Huygens’ thought experiments, two men face each other, one standing on a boat and the other on the bank of a river. The men grab onto each other’s hands, and dangle a ball on a string from each pair of hands. In this way, it is impossible to tell which man is moving each ball.

[Image: illustration of Huygens’ boat thought experiment, captioned “Stop hitting yourself!”]

From the man on the bank’s perspective, he moves the two balls together at the same speed, which happens to be the same speed as the river. The balls are identical, so by the symmetry of his situation they should bounce back with equal speeds afterwards as well.

On the other hand, the man in the boat thinks that he’s only moving one ball. Since the man on the bank is moving one of the balls along at the same speed as the river, from the man on the boat’s perspective that ball is just staying still, while the other ball is moving with twice the speed of the river. If the man on the bank sees the balls bounce off of each other at equal speed, then the man on the boat will see the moving ball stop, and the ball that was staying still start to move with the same speed as the original ball. From what he could see, a moving ball hit a ball at rest, and transferred its entire momentum to the new ball.
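
To spell out the bookkeeping in modern notation (with the river flowing at speed v, both balls of equal mass m, and the river’s direction taken as positive), the two frames differ only by subtracting v from every velocity:

\[
\begin{aligned}
\text{Bank frame:} &\quad (+v,\; -v) \;\to\; (-v,\; +v)\\
\text{Boat frame:} &\quad (0,\; -2v) \;\to\; (-2v,\; 0)
\end{aligned}
\]

In the boat frame the total momentum is \(-2mv\) both before and after the collision: the moving ball hands its entire momentum to the resting one, exactly as the man on the boat describes.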

Using arguments like these, Huygens developed the idea of conservation of momentum, the idea of a number related to an object’s mass and speed that can never be created or destroyed, only transferred from one object to another. And he did it using symmetry. At heart, his arguments showed that momentum, the mysterious “quantity of motion”, was merely a natural consequence of the fact that two people can look at a situation in two different ways. And it is that fact, and the power that fact has to explain the world, that makes physicists so obsessed with symmetry.