The history of the idea of mass-energy conversion is a slightly murky one. Textbooks and lecturers find it convenient to say that
Albert Einstein was the first person to suggest that mass and energy were interchangeable, but really ... he wasn't. It's a handy piece of educational fiction, but it ain't so.
By 1905, a number of researchers
were reckoned to be close to the E=mc² result. The basic argument went something like this: imagine a mirrored cavity embedded in a piece of material, containing a trapped light-complex, in equilibrium with its container. The radiation pressure of the trapped light within the container is the same in all directions. But if the container and its trapped electromagnetic (EM) energy are now viewed by a different observer who reckons that the container is "
moving", then that observer will assign different
Doppler-shifted energies and radiation pressures to different parts of the light-complex: The forward-aimed components now get assigned greater energy and
momentum than the rearward-aimed components, and the overall momentum of the complex no longer cancels out - the container's nominal motion gives the trapped light an overall momentum that points in the direction of motion.
So the EM contents of the moving container appear to contribute additional momentum to it, as if it contained a speck of matter rather than EM energy. If we aren't allowed to look inside the container, we might not be able to tell whether it contains EM energy or real matter, and by working out how much energy it takes to reproduce the external effects associated with a given amount of mass, we end up with a very short equation for the conversion factor between rest mass and rest energy. That (if we calculate it correctly) is
E=mc².
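To see how the numbers work out, here's a minimal sketch of that bookkeeping (mine, using the standard relativistic Doppler factors, rather than anything quoted from the 1905-era papers). Split the trapped energy E into two equal packets aimed forward and rearward along the line of motion, and remember that light of energy E carries momentum E/c. In the frame where the container moves at velocity v, the forward packet's energy is scaled by γ(1 + v/c) and the rearward packet's by γ(1 − v/c), with γ = 1/√(1 − v²/c²), so the net momentum is

p = (E/2c)·γ(1 + v/c) − (E/2c)·γ(1 − v/c) = γEv/c²

which is exactly the momentum p = γmv of a lump of matter with mass m = E/c².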
However, it seems that Einstein's competitors either didn't calculate the conversion ratio properly, or failed to come out and suggest in print that this wasn't merely an
apparent conversion of mass and energy, but The Real Thing. Einstein did both, and earned the credit.
If we want to go back further, to find an older example of the idea of "interconvertibility" in a major English-language physics text by a famous author, all we have to do is open a copy of
Isaac Newton's "
Opticks" [
Babson archives]/[
1717 edition.pdf], and flip to the "Queries" section at the back. The relevant section is
Query 30:
Qu.30: Are not gross Bodies and Light convertible into one another, and may not Bodies receive much of their Activity from the Particles of Light which enter their Composition?...
The changing of Bodies into Light, and Light into Bodies, is very conformable to the Course of Nature, which seems delighted with Transmutations.
I've quoted this at the start of Chapter 2 of
the "Relativity..." book ("
Gravity, Energy and Mass"), which goes through some of these arguments in more detail (with the help of some pictures).
Traditionally, at this point in the discussion, a physicist will interrupt and say something like,
"Okay, perhaps Newton had the idea, but we weren't able to calculate the specific relationship until we had special relativity. Einstein used Lorentz's relationships in his calculations rather than Newtonian physics, so so E=mc² is clearly specific to Einstein's physics."
But that's not true either. It's correct that Einstein originally
presented E=mc² in the context of his new "special" theory, but if he'd done the momentum calculations with the same degree of care using "olde" Newtonian
emission theory, he'd have gotten the same result (with slightly less working). In fact, we can construct a
continuum of hypothetical theories whose relationships differ by Lorentz-like ratios, and
all of them generate E=mc². Turns out,
E=mc² is a general result. I've put the details of the "Newtonian optics" argument into
the book's "Appendices" section, as "Calculations 2"
So, while some physics histories present Einstein's discovery of E=mc² in 1905 as a triumph of the scientific method, the reality seems to be that the equation's discovery is marked by a sequence of earlier human failures going back two hundred years.
To start with, Newton couldn't calculate E=mc² because he'd gotten the relationship between energy and frequency upside down, and assumed (reasonably but wrongly) that the "bigger", redder wavelengths of light carried more energy and momentum for a given amplitude, rather than less ("
The Newtonian Catastrophe", chapter 3). Newton lived 'til 1727, and then his his successors
Newton lived 'til 1727, and his successors still couldn't calculate E=mc², because they trusted Newton to have gotten it right. If you were an English physicist, suggesting that Newton might have made a mistake was heresy. Towards the end of the century (1783),
John Michell used Newton's arguments
to calculate the gravitational wavelength-shifting of light, but he was still citing Newton's writing and using the old, bad "inverted" relationships. Defending Newton from criticism was by now a matter of national pride: back in 1772,
Joseph Priestley's
History of Optics had cheerfully ridiculed the mental capacity of those poor souls in Europe who were so behind the times that they actually still thought that light was a wave! Antagonism between the two sets of researchers meant that the Newtonian camp couldn't admit the possibility of major error.
The next couple of decades saw Europe shaken up by the
French Revolution, and then Continental physics really began to hit its stride. Newton's mistake had generated a bad prediction that light should travel more quickly through glass than air, and when Continental experimenters started using new technology to measure lightspeeds, they were able to show, quite conclusively (and perhaps slightly gleefully), that
this wasn't the case. As we got to the mid-C19th, work by
Christian Doppler and others meant that we were now
quite sure how to calculate the effect of velocity on light for any given model, but instead of going back and correcting Newton's error, Newton's supporters slunk off with their tails between their legs and did their best to rewrite physics history, in the hope that later English-speaking physics students wouldn't realise just how dumb they'd been.
The latter part of the C19th was then "lost", too. Although we now had plenty of expert wave theorists, lightwaves were now generally reckoned to propagate through some sort of
aetheric medium, and there was no agreed set of governing principles defining what that medium's properties ought to be. The older Newtonian principles concerning the behaviour of light (such as the idea that matter and light ought to obey a single set of underlying rules) were now widely considered to be "damaged goods", and the proliferation of aether models meant that we now had a bewildering array of competing predictions for exactly how the properties of light ought to be affected by motion. There were just too many damned versions for us to be able to do these sorts of calculations confidently and be sure that our results meant anything.
That state of affairs lasted until the early Twentieth Century.
This is where Einstein came onto the scene. Einstein had three advantages over most other contemporary theorists when it came to deriving E=mc² - he was a fan of the idea that the
principle of relativity should apply to light, he was definite about the set of equations that he wanted to use, and he was (apparently) blissfully unaware of almost all of the previous two centuries of political bickering on the subject (probably helped in part by his habit, as a student, of not bothering to turn up for lectures). So Einstein was able to come to the problem "fresh", without a lot of preconceptions. He'd already tinkered with
emission theory, recognised some of its problems, and then latched onto
Lorentzian electrodynamics and decided that this was The Future.
In 1905, he published his "reimagining" of Lorentzian electrodynamics, which took the characteristics of Lorentz's relativistic aether and deleted the "physical medium" aspect as unnecessary. According to Einstein in 1905, the aether was irrelevant to the problem - all that was required to generate the Lorentzian relationships was the principle of relativity and an assumption about lightspeeds. These two postulates were then sufficient to generate all of Lorentz's important math.
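For concreteness, the core of that "important math" is the Lorentz transformation for motion along x,

x′ = γ(x − vt),   t′ = γ(t − vx/c²),   γ = 1/√(1 − v²/c²)

which, in Einstein's presentation, follows from those two postulates alone, with no physical medium anywhere in the derivation.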
And then (
in a very short followup paper) ... if the Lorentzian relationships in the previous paper were correct, then internal energy imparted mass to bodies, according to the relationship E=mc².
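Compressed down to its essentials, the followup argument ran like this: let a body at rest emit two equal light pulses, of energy L/2 each, in opposite directions, so that the body stays at rest. Viewed from a frame moving at velocity v, the Doppler-shifted pulse energies add up to γL rather than L, so the moving body has given up an extra (γ − 1)L ≈ ½(L/c²)v² of energy, which can only have come out of its kinetic energy ½mv². The emission has therefore reduced the body's mass by L/c².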
At this point, Einstein was on a roll, and he was looking forwards rather than backwards ... he didn't really have much motivation to point out that, if the relationships in his earlier paper were
wrong, and we reverted to the previous relativistic calculations for light, we
still got E=mc². Pointing
that out was a job for peer review and outside commentators, but almost no-one noticed.
We then coasted through another century, without much to suggest that anyone had connected the dots and understood the broader context for what Einstein had done and how it really related to Newton's earlier work. Right into the 1990s, students were still being told that E=mc² was unique to special relativity, and that the fact that atom bombs worked was ample evidence that no other system of equations could be right. Those claims weren't scientifically or mathematically correct, and weren't researched, but everyone seemed to believe them. Some people wrote research papers and entire
books on the history of E=mc², and
still somehow managed not to mention the Newtonian connection.
Not everybody missed it. The Sandman series by
Neil Gaiman quotes and
cites the key section in "Opticks" and points out its significance. But Sandman isn't a book on theoretical physics; it's an illustrated fantasy graphic novel. So what we
appear to have here is a subject where some people who write university textbooks seem to be doing rather less background research and fact-checking than some people who write comic books.
I feel that this is an unhappy situation. But it seems to explain a lot about why theoretical physics is in its current state.