Tag Archives: physics

arXiv vs. snarXiv: Can You Tell the Difference?

Have you ever played arXiv vs snarXiv?

arXiv is a preprint repository: it’s where we physicists put our papers before they’re published in journals.

snarXiv is…well…sound it out.

A creation of David Simmons-Duffin, snarXiv randomly generates titles and abstracts out of trendy arXiv buzzwords. It’s designed so that the papers on it look almost plausible…until you take a closer look, anyway.
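
To get a feel for how this works, here’s a toy sketch in Python (the word lists are my own invention, not Simmons-Duffin’s actual grammar, which is far more elaborate):

import random

# A toy snarXiv: glue random buzzwords together into an arXiv-flavored title.
# The vocabulary here is made up for illustration, not taken from the real site.
ADJECTIVES = ["Holographic", "Supersymmetric", "Non-Abelian", "Anomalous"]
OBJECTS = ["Black Holes", "Instantons", "Wilson Loops", "Flux Compactifications"]
CONTEXTS = ["at Strong Coupling", "in de Sitter Space", "on the Lattice"]

def fake_title() -> str:
    """Generate one snarXiv-style paper title."""
    return " ".join([random.choice(ADJECTIVES),
                     random.choice(OBJECTS),
                     random.choice(CONTEXTS)])

print(fake_title())  # e.g. "Anomalous Instantons at Strong Coupling"

The real snarXiv builds its titles recursively, nesting phrases inside phrases, which is what makes its output so much more convincing than this.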

Hence the game, arXiv vs snarXiv. Given just the titles of two papers, can you figure out which one is real, and which is fake?

I played arXiv vs snarXiv for a bit today, waiting for some code to run. Out of twenty questions, I only got two wrong.

Sometimes, it was fairly clear which paper was fake because snarXiv overreached. By trying to pile on too many buzzwords, it ended up with a title that repeated itself, or didn’t quite work grammatically.

Other times, I had to use some actual physics knowledge. Usually, this meant noticing when a title tied together unrelated areas in an implausible way. When a title claims to tie obscure mathematical concepts from string theory to a concrete problem in astronomy, it’s pretty clearly snarXiv talking.

The toughest questions, including the ones I got wrong, were when snarXiv went for something subtle. For short enough titles, the telltale signs of snarXiv were suppressed. There just weren’t enough buzzwords for a mistake to show up. I’m not sure there’s a way to distinguish titles like that, even for people in the relevant sub-field.

How well do you do at arXiv vs snarXiv? Any tips?

Jury-Rigging: The Many Uses of Dropbox

I’ll be behind the Great Firewall of China next week, so I’ve been thinking about various sites I won’t be able to access. Prominent among them is Dropbox, a service that hosts files online.

A helpful box to drop things in

What do physicists do with Dropbox? Quite a lot.

For us, Dropbox is a great way to keep collaborations on the same page. By sharing a Dropbox folder, we can share research programs, mathematical expressions, and paper drafts. It makes it a lot easier to keep one consistent version of a document between different people, and it’s a lot simpler than emailing files back and forth.

All that said, Dropbox has its drawbacks. You still need to be careful not to have two people editing the same thing at the same time, lest one overwrite the other’s work. You’ve got the choice between editing in place, making everyone else receive notifications whenever the files change, or editing in a separate folder, and having to be careful to keep it coordinated with the shared one.

Programmers will know there are cleaner solutions to these problems. GitHub is designed to share code, and you can work together on a paper with ShareLaTeX. So why do we use Dropbox?

Sometimes, it’s more important for a tool to be easy and universal, even if it doesn’t do everything you want. GitHub and ShareLaTeX might solve some of the problems we have with Dropbox, but they introduce extra work too. Because no one disadvantage of Dropbox takes up too much time, it’s simpler to stick with it than to introduce a variety of new services to fill the same role.

This is the source of a lot of jury-rigging in science. Our projects aren’t often big enough to justify more professional approaches: usually, something hacked together out of what’s available really is the best choice.

For one, it’s why I use WordPress. WordPress.com is not a great platform for professional blogging: it doesn’t give you a lot of control without charging, and surprise updates can make using it confusing. However, it takes a lot less effort than switching to something more professional, and for the moment at least I’m not really in a position that justifies the extra work.

Thought Experiments, Minus the Thought

My second-favorite Newton fact is that, despite inventing calculus, he refused to use it for his most famous work of physics, the Principia. Instead, he used geometrical proofs, tweaked to smuggle in calculus without admitting it.

Essentially, these proofs were thought experiments. Newton would start with a standard geometry argument, one that would have been acceptable to mathematicians centuries earlier. Then, he’d imagine taking it further, pushing a line or angle to some infinite point. He’d argue that, if the proof worked for every finite choice, then it should work in the infinite limit as well.
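
In modern notation (which Newton pointedly avoided), the same move is the limit definition of the derivative: the slope of a secant line makes sense for every finite separation h, so it should still make sense as h shrinks to nothing,

\frac{f(x+h)-f(x)}{h} \longrightarrow f'(x) \quad \text{as } h \to 0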

These thought experiments let Newton argue on the basis of something that looked more rigorous than calculus. However, they also held science back. At the time, only a few people in the world could understand what Newton was doing. It was only later, when Newton’s laws were reformulated in calculus terms, that a wider group of researchers could start doing serious physics.

What changed? If Newton could describe his physics with geometrical thought experiments, why couldn’t everyone else?

The trouble with thought experiments is that they require careful setup, setup that has to be thought through for each new thought experiment. Calculus took Newton’s geometrical thought experiments, and took out the need for thought: the setup was automatically a part of calculus, and each new researcher could build on their predecessors without having to set everything up again.

This sort of thing happens a lot in science. An example from my field is the scattering matrix, or S-matrix.

The S-matrix, deep down, is a thought experiment. Take some particles, and put them infinitely far away from each other, off in the infinite past. Then, let them approach, close enough to collide. If they do, new particles can form, and these new particles will travel out again, infinitely far away in the infinite future. The S-matrix, then, is a metaphorical matrix that tells you, for each possible set of incoming particles, what the probability is to get each possible set of outgoing particles.
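
Schematically, in the standard notation:

P(\text{in} \rightarrow \text{out}) = \left|\langle \text{out} | S | \text{in} \rangle\right|^2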

In a real collider, the particles don’t come from infinitely far away, and they don’t travel infinitely far before they’re stopped. But the distances are long enough, compared to the sizes relevant for particle physics, that the S-matrix is the right idea for the job.

Like calculus, the S-matrix is a thought experiment minus the thought. When we want to calculate the probability of particles scattering, we don’t need to set up the whole thought experiment all over again. Instead, we can start by calculating, and over time we’ve gotten very good at it.

In general, sub-fields in physics can be divided into those that have found their S-matrices, their thought experiments minus thought, and those that have not. When a topic has to rely on thought experiments, progress is much slower: people argue over the details of each setup, and it’s difficult to build something that can last. It’s only when a field turns the corner, removing the thought from its thought experiments, that people can start making real collaborative progress.

The (but I’m Not a) Crackpot Style Guide

Ok, ok, I believe you. You’re not a crackpot. You’re just an outsider, one with a brilliant new idea that would overturn the accepted paradigms of physics, if only someone would just listen.

Here’s the problem: you’re not alone. There are plenty of actual crackpots. We get contacted by them fairly regularly. And most of the time, they’re frustrating and unpleasant to deal with.

If you want physicists to listen to you, you need to show us you’re not one of those people. Otherwise, most of us won’t bother.

I can’t give you a foolproof way to do that. But I can give some suggestions that will hopefully make the process a little less frustrating for everyone involved.

Don’t spam:

Nobody likes spam. Nobody reads spam. If you send a mass email to every physicist whose email address you can find, none of them will read it. If you repeatedly post the same thing in a comment thread, nobody will read it. If you want people to listen to you, you have to show that you care about what they have to say, and in order to do that you have to tailor your message. This leads into the next point,

Ask the right people:

Before you start reaching out, you should try to get an idea of who to talk to. Physics is quite specialized, so if you’re taking your ideas seriously you should try to contact people with a relevant specialization.

Now, I know what you’re thinking: your ideas are unique, no-one in physics is working on anything similar.

Here, it’s important to distinguish the problem you’re trying to solve from how you’re trying to solve it. Chances are, no-one else is working on your specific idea…but plenty of people are interested in the same problems.

Think quantum mechanics is built on shoddy assumptions? There are people who spend their lives trying to modify quantum mechanics. Have a beef against general relativity? There’s a whole sub-field of people who modify gravity.

These people are a valuable resource for you, because they know what doesn’t work. They’ve been trying to change the system, and they know just how hard it is to change, and just what evidence you need to be consistent with.

Contacting someone whose work just uses quantum mechanics or relativity won’t work. If you’re making elementary mistakes, we can put you on the right track…but if you think you might be making elementary mistakes, you should start out by asking for help on a forum or the like, not contacting a professional. If you think you’ve really got a viable replacement for an established idea, you need to contact people who work on overturning established ideas, since they’re most aware of the complicated webs of implications involved. Relatedly,

Take ownership of your work:

I don’t know how many times someone has “corrected” something in the comments, and several posts later admitted that the “correction” comes from their own theory. If you’re arguing from your own work, own it! If you don’t, people will assume you’re trying to argue from an established theory, and are just confused about how that theory works. This is a special case of a broader principle,

Epistemic humility:

I’m not saying you need to be humble in general, but if you want to talk productively you need to be epistemically humble. That means being clear about why you know what you know. Did you get it from a mathematical proof? A philosophical argument? Reading pop science pieces? Something you remember from high school? Being clear about your sources makes it easier for people to figure out where you’re coming from, and helps you avoid putting your foot in your mouth if it turns out your source is incomplete.

Context is crucial:

If you’re commenting on a blog like this one, pay attention to context. Your comment needs to be relevant enough that people won’t parse it as spam.

If all a post does is mention something like string theory, crowing about how your theory is a better explanation for quantum gravity isn’t relevant. Ditto for if all it does is mention a scientific concept that you think is mistaken.

What if the post is promoting something that you’ve found to be incorrect, though? What if someone is wrong on the internet?

In that case, it’s important to keep in mind the above principles. A popularization piece will usually try to present the establishment view, and merits a different response than a scientific piece arguing something new. In both cases, own your own ideas and be specific about how you know what you know. Be clear on whether you’re talking about something that’s controversial, or something that’s broadly agreed on.

You can get an idea of what works and what doesn’t by looking at comments on this blog. When I post about dark matter, or cosmic inflation, there are people who object, and the best ones are straightforward about why. Rather than opening with “you’re wrong”, they point out which ideas are controversial. They’re specific about whose ideas they’re referencing, and are clear about what is pedagogy and what is science.

Those comments tend to get much better responses than the ones that begin with cryptic condemnations, follow with links, and make absolute statements without backing them up.

On the internet, it’s easy for misunderstandings to devolve into arguments. Want to avoid that? Be direct, be clear, be relevant.

In Defense of Lord Kelvin, Michelson, and the Physics of Decimals

William Thomson, Lord Kelvin, was a towering genius of 19th century physics. He is often quoted as saying,

There is nothing new to be discovered in physics now. All that remains is more and more precise measurement.

Certainly sounds like something I would say!

As it happens, he never actually said this. It’s a paraphrase of a quote from Albert Michelson, of the Michelson-Morley Experiment:

While it is never safe to affirm that the future of Physical Science has no marvels in store even more astonishing than those of the past, it seems probable that most of the grand underlying principles have been firmly established and that further advances are to be sought chiefly in the rigorous application of these principles to all the phenomena which come under our notice. It is here that the science of measurement shows its importance — where quantitative work is more to be desired than qualitative work. An eminent physicist remarked that the future truths of physical science are to be looked for in the sixth place of decimals.

Now that’s more like it!

In hindsight, this quote looks pretty silly. When Michelson said that “it seems probable that most of the grand underlying principles have been firmly established” he was leaving out special relativity, general relativity, and quantum mechanics. From our perspective, the grandest underlying principles had yet to be discovered!

And yet, I think we should give Michelson some slack.

Someone asked me on Twitter recently what I would choose if given the opportunity to unravel one of the secrets of the universe. At the time, I went for the wishing-for-more-wishes answer: I’d ask for a procedure to discover all of the other secrets.

I was cheating, to some extent. But I do think that the biggest and most important mystery isn’t black holes or the big bang, isn’t asking what will replace space-time or what determines the constants in the Standard Model. The most critical, most important question in physics, rather, is to find the consequences of the principles we actually know!

We know our world is described fairly well by quantum field theory. We’ve tested it, not just to the sixth decimal place, but to the tenth. And while we suspect it’s not the full story, it should still describe the vast majority of our everyday world.

If we knew not just the underlying principles, but the full consequences of quantum field theory, we’d understand almost everything we care about. But we don’t. Instead, we’re forced to calculate with approximations. When those approximations break down, we fall back on experiment, trying to propose models that describe the data without precisely explaining it. This is true even for something as “simple” as the distribution of quarks inside a proton. Once you start trying to describe materials, or chemistry or biology, all bets are off.

This is what the vast majority of physics is about. Even more, it’s what the vast majority of science is about. And that’s true even back to Michelson’s day. Quantum mechanics and relativity were revelations…but there are still large corners of physics in which neither matters very much, and even larger parts of the more nebulous “physical science”.

New fundamental principles get a lot of press, but you shouldn’t discount the physics of “the sixth place of decimals”. Most of the big mysteries don’t ask us to challenge our fundamental paradigm: rather, they’re challenges to calculate or measure better, to get more precision out of rules we already know. If a genie gave me the solution to any of physics’ mysteries I’d choose to understand the full consequences of quantum field theory, or even of the physics of Michelson’s day, long before I’d look for the answer to a trendy question like quantum gravity.

Entropy is Ignorance

(My last post had a poll in it! If you haven’t responded yet, please do.)

Earlier this month, philosopher Richard Dawid ran a workshop entitled “Why Trust a Theory? Reconsidering Scientific Methodology in Light of Modern Physics” to discuss his idea of “non-empirical theory confirmation” for string theory, inflation, and the multiverse. The talks haven’t been published online yet, so I’m stuck reading coverage, mostly these posts by skeptical philosopher Massimo Pigliucci. I find the overall concept annoying, and may rant about it later. For now, though, I’d like to talk about a talk on the second day by philosopher Chris Wüthrich about black hole entropy.

Black holes, of course, are the entire-stars-collapsed-to-a-point-from-which-no-light-can-escape that everyone knows and loves. Entropy is often thought of as the scientific term for chaos and disorder, the universe’s long slide towards dissolution. In reality, it’s a bit more complicated than that.

For one, you need to take Elric into account…

Can black holes be disordered? Naively, that doesn’t seem possible. How can a single point be disorderly?

Thought about in a bit more detail, the conclusion seems even stronger. Via something called the “No Hair Theorem”, it’s possible to prove that black holes can be described completely with just three numbers: their mass, their charge, and how fast they are spinning. With just three numbers, how can there be room for chaos?
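
In programmer’s terms, the theorem’s claim looks something like this (a toy sketch of my own, not any standard library):

from dataclasses import dataclass

# Toy illustration of the No Hair Theorem: classically, these three
# numbers are a complete description of a black hole as seen from outside.
@dataclass(frozen=True)
class BlackHole:
    mass: float              # how heavy it is
    charge: float            # its electric charge
    angular_momentum: float  # how fast it spins

# Two black holes with the same three numbers are indistinguishable:
# there is nowhere left for disorder to hide.
print(BlackHole(10.0, 0.0, 0.5) == BlackHole(10.0, 0.0, 0.5))  # True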

On the other hand, you may have heard of the Second Law of Thermodynamics. The Second Law states that entropy always increases. Absent external support, things will always slide towards disorder eventually.

If you combine this with black holes, then this seems to have weird implications. In particular, what happens when something disordered falls into a black hole? Does the disorder just “go away”? Doesn’t that violate the Second Law?

This line of reasoning has led to the idea that black holes have entropy after all. It led Bekenstein to calculate the entropy of a black hole based on how much information is “hidden” inside, and Hawking to find that black holes in a quantum world should radiate as if they had a temperature consistent with that entropy. One of the biggest successes of string theory is an explanation for this entropy. In string theory, black holes aren’t perfect points: they have structure, arrangements of strings and higher dimensional membranes, and this structure can be disordered in a way that seems to give the right entropy.
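
For reference, the resulting Bekenstein-Hawking formula ties a black hole’s entropy to the area A of its event horizon:

S_{BH} = \frac{k_B c^3 A}{4 G \hbar}

This is the number that a microscopic theory like string theory has to reproduce by counting arrangements of strings and membranes.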

Note that none of this has been tested experimentally. Hawking radiation, if it exists, is very faint: not the sort of thing we could detect with a telescope. Wüthrich is worried that Bekenstein’s original calculation of black hole entropy might have been on the wrong track, which would undermine one of string theory’s most well-known accomplishments.

I don’t know Wüthrich’s full argument, since the talks haven’t been posted online yet. All I know is Pigliucci’s summary. From that summary, it looks like Wüthrich’s primary worry is about two different definitions of entropy.

See, when I described entropy as “disorder”, I was being a bit vague. There are actually two different definitions of entropy. The older one, Gibbs entropy, grows with the number of states of a system. What does that have to do with disorder?

Think about two different substances: a gas, and a crystal. Both are made out of atoms, but the patterns involved are different. In the gas, atoms are free to move, while in the crystal they’re (comparatively) fixed in place.

Blurrily so in this case

There are many different ways the atoms of a gas can be arranged and still be a gas, but fewer in which they can be a crystal, so a gas has more entropy than a crystal. Intuitively, the gas is more disordered.
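
Quantitatively, in the standard form: for a system with probability p_i of being found in state i,

S_{\text{Gibbs}} = -k_B \sum_i p_i \ln p_i

which, when all \Omega available arrangements are equally likely, reduces to Boltzmann’s k_B \ln \Omega: the more arrangements, the more entropy.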

When Bekenstein calculated the entropy of a black hole he didn’t use Gibbs entropy, though. Instead, he used Shannon entropy, a concept from information theory. Shannon entropy measures the amount of information in a message, with a formula very similar to that of Gibbs entropy: the more different ways you can arrange something, the more information you can use it to send. Bekenstein used this formula to calculate the amount of information that gets hidden from us when something falls into a black hole.
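
Side by side with the Gibbs formula above, the resemblance is hard to miss:

H = -\sum_i p_i \log_2 p_i

It’s the same sum, with Boltzmann’s constant traded for units of bits and the natural log for a base-2 log.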

Wüthrich’s worry here (again, as far as Pigliucci describes) is that Shannon entropy is a very different concept from Gibbs entropy. Shannon entropy measures information, while Gibbs entropy is something “physical”. So by using one to predict the other, are predictions about black hole entropy just confused?

It may well be he has a deeper argument for this, one that wasn’t covered in the summary. But if this is accurate, Wüthrich is missing something fundamental. Shannon entropy and Gibbs entropy aren’t two different concepts. Rather, they’re both ways of describing a core idea: entropy is a measure of ignorance.

A gas has more entropy than a crystal because it can be arranged in a larger number of different ways. But let’s not talk about a gas. Let’s talk about a specific arrangement of atoms: one is flying up, one to the left, one to the right, and so on. Space them apart, but be very specific about how they are arranged. This arrangement could well be a gas, but now it’s a specific gas. And because we’re this specific, there are now many fewer states the gas can be in, so this (specific) gas has less entropy!
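
In the extreme case, pinning down every atom exactly leaves only one allowed state, and the entropy vanishes entirely:

S = k_B \ln \Omega = k_B \ln 1 = 0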

Now of course, this is a very silly way to describe a gas. In general, we don’t know what every single atom of a gas is doing; that’s why we call it a gas in the first place. But it’s that lack of knowledge that we call entropy. Entropy isn’t just something out there in the world, it’s a feature of our descriptions…but one that, nonetheless, has important physical consequences. The Second Law still holds: the world goes from lower entropy to higher entropy. And while that may seem strange, it’s actually quite logical: the things we describe in vaguer terms should become more common than the things we describe in specific terms; after all, there are many more of them!

Entropy isn’t the only thing like this. In the past, I’ve bemoaned the difficulty of describing the concept of gauge symmetry. Gauge symmetry is in some ways just part of our descriptions: we prefer to describe fundamental forces in a particular way, and that description has redundant parameters. We have to make those redundant parameters “go away” somehow, and that leads to non-existent particles called “ghosts”. However, gauge symmetry also has physical consequences: it was how people first knew that there had to be a Higgs boson, long before it was discovered. And while it might seem weird to think that a redundancy could imply something as physical as the Higgs, the success of the concept of entropy should make this much less surprising. Much of what we do in physics is reasoning about different descriptions, different ways of dividing up the world, and then figuring out the consequences of those descriptions. Entropy is ignorance…and if our ignorance obeys laws, if it’s describable mathematically, then it’s as physical as anything else.

Bras and Kets, Trading off Instincts

Some physics notation is a joke, but that doesn’t mean it shouldn’t be taken seriously.

Take bras and kets. On the surface, as silly a physics name as any. If you want to find the probability that a state in quantum mechanics turns into another state, you write down a “bracket” between the two states:

\langle a | b\rangle
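
(Strictly speaking, the bracket is a probability amplitude: the probability itself is its squared magnitude, |\langle a | b \rangle|^2.)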

This leads, with typical physics logic, to the notation for the individual states: separate out the two parts, into a “bra” and a “ket”:

\langle a | \qquad | b \rangle

It’s kind of a dumb joke, and it annoys the heck out of mathematicians. Not for the joke, of course; mathematicians probably have worse.

Mathematicians are annoyed when we use complicated, weird notation for something that looks like a simple, universal concept. Here, we’re essentially just taking inner products of vectors, something mathematicians have been doing in one form or another for centuries. Yet rather than use their time-tested notation we use our own silly setup.

There’s a method to the madness, though. Bras and kets are handy for our purposes because they allow us to leverage one of the most powerful instincts of programmers: the need to close parentheses.

In programming, various forms of parentheses and brackets allow you to isolate parts of code for different purposes. One set of lines might only activate under certain circumstances, another set of brackets might make text bold. But in essentially every language, you never want to leave an open parenthesis. Doing so is almost always a mistake, one that leaves the rest of your code open to whatever isolated region you were trying to create.

Open parentheses make programmers nervous, and that’s exactly what “bras” and “kets” are for. As it turns out, the states represented by “bras” and “kets” are in a certain sense un-measurable: the only things we can measure are the brackets between them. When people say that in quantum mechanics we can only predict probabilities, that’s a big part of what they mean: the states themselves mean nothing without being assembled into probability-calculating brackets.

This ends up making “bras” and “kets” very useful. If you’re calculating something in the real world and your formula ends up with a free “bra” or a “ket”, you know you’ve done something wrong. Only when all of your bras and kets are assembled into brackets will you have something physically meaningful. Since most physicists have done some programming, the programmer’s instinct to always close parentheses comes to the rescue, nagging until you turn your formula into something that can be measured.
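
To make the analogy concrete, here’s a toy checker in Python (my own sketch, nothing standard), treating bras and kets exactly like parentheses:

def all_brackets_closed(expression: str) -> bool:
    """Return True if every bra '<' is closed into a bracket by a ket '>'."""
    depth = 0
    for ch in expression:
        if ch == "<":        # opening a bra
            depth += 1
        elif ch == ">":      # closing it into a full bracket
            depth -= 1
            if depth < 0:    # a ket with no bra to match
                return False
    return depth == 0        # any bra left open is a mistake

print(all_brackets_closed("<a|b>"))      # True: a measurable bracket
print(all_brackets_closed("<a|b> <c|"))  # False: a free bra remains

A formula that leaves a bra or ket unmatched fails the check, just as it should set off a physicist’s instincts.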

So while our notation may be weird, it does serve a purpose: it makes our instincts fit the counter-intuitive world of quantum mechanics.