It seems simple to say that an 80-proof rum is just 40% alcohol by volume, arrived at by a straightforward halving, but strictly speaking, that is only true under the American system. Under the historical British method in force until 1980, a 40% ABV drink is 70 proof.
It has long been a problem to decide exactly how strong a given drink was (or is). From ancient times, Archimedes's principle was used to determine specific gravity (i.e., density) by means of hydrometers, but I can trace no records showing any consistent, state-mandated application of the principle to establishing the alcohol content of spirits. In any event, for most of history, brewing and distilling were primarily cottage industries in an overwhelmingly agricultural world, and while rudimentary regulations existed regarding quality control, it was not until the era of industrial mass production around the 18th century – with its usual attendant evil of taxation – that consistency and proof of strength became something to be sought after.
The word proof as applied to alcoholic beverages takes its name from the (possibly anecdotal) exercise supposedly undergone by any rum during the Royal Navy days of yore, as well as – subsequently – the tests a spirit had to pass to rate its strength for taxation purposes. In short, a proof spirit was the most diluted (weakest) form of that spirit which would still support the combustion of gunpowder. Not surprisingly, the Royal Navy was intimately involved in this: in order to show that the rum stocks on board were unadulterated, gunpowder was doused with the spirit and set alight. If it ignited, then the spirit was deemed at proof, or overproof; if it did not, the liquor was deemed to have too much water and was underproof. It was discovered that a ratio of 7:4 of alcohol to water was just enough to support combustion, and this was deemed “100 degrees proof”. Naturally, this was more of a rule of thumb than anything else, since the quality and type of gunpowder were never taken into account, and surely those would have had an effect on the combustion.
(Also, we may have the story in reverse: a master gunner would need to know the best kind and amount of gunpowder, depending on burn rates, to use in which size of cannon – to prevent an explosion prior to expelling the cannonball – and having gunpowder doused in alcohols of varying strengths gave him a quick measure of burn rates; also, fuses soaked in alcohol and gunpowder were common, to prevent them being doused by seawater during battles, and gunpowder and/or rum was often added to drinking water as a preservative…but I digress).
In order to address mankind’s innate love of complexity, clearer and more complicated definitions of strength emerged. First, a legal standard was promulgated in the early 18th century, stating that a “proof spirit” was half rainwater and half spirit proven by the gunpowder method (this would roughly approximate to today’s ABV measurement); a gallon of such proof spirit, with a density of 0.923, was deemed to weigh 7 lbs 12 oz at 10.5°C (51°F).
By the third decade of the 18th century, tax was being levied on drinks according to their alcohol content, and Clarke’s hydrometer was developed, adopted, and remained in use until 1817. Clarke’s hydrometer was cited in the 1762 law (and again in 1802) defining a standard gallon of proof spirits: six parts spirits and one part water by weight, weighing 7 pounds, 13 ounces at 50°F. Its proper functioning depended on its being floated in the liquid, having first been calibrated against liquids of known density, such as water and pure ethyl alcohol.
The problem with all such hydrometers to that time was that they worked properly only at a constant, reliable temperature, and only if the mixture contained nothing but alcohol and water…which of course was not always the case. Tax evaders constantly added other ingredients – molasses, spices, sweeteners and so on – which increased the density of the liquid without affecting its alcoholic strength (alcohol is less dense than water, which is the principle on which all such hydrometers function).
Finally, in 1817, the more accurate Sikes’s hydrometer became the legal method for determining proof: it was established, using this instrument (pretty much just a refined version of Clarke’s), that “proven spirits” were at least 57.1% alcohol by volume and 49.28% alcohol by weight – the next century and a half of British proof measures (and therefore much of the rest of the world’s) were based on this number. Proof was still, however, primarily established by weight, not volume – volumes were calculated indirectly. The Royal Navy, too, kept up its own measurements with the gunpowder test (which retained a peculiar longevity) and discovered that the ideal strength for gunpowder to ignite was actually 95.5 degrees of English proof – this equates to 54.5% ABV, and therefore if one sees any navy rum at either 57.1% or 54.5% (like the Navy Neaters, for example, issued at 95.5 degrees), it’s okay and there’s no mistake.
Continental Europe settled on the Gay-Lussac system, developed by the famous chemist. He invented a “centesimal alcoholometer” – a hydrometer calibrated to one hundred percentage-by-volume divisions – and provided the theoretical background for its use in an 1824 paper. The system, volumetric in nature, became law in France in 1884 and was adopted by the EU in 1973.
However, time marches on, as do measures, and the term proof as defined by Great Britain is no longer in use. All spirits are now measured for strength in terms of % alcohol by volume, and while this is not quite half the old proofing formula, it’s close enough for Government work, apparently. United States regulations on alcohol state that the proof of an alcoholic beverage is twice its alcohol content expressed as a percentage by volume at 60°F; so an 80-proof whisky is 40% alcohol. The Europeans use the Gay-Lussac method, and this is expressed in degrees, not percentages (the numbers come out the same).
These days, even hydrometers are archaic relics of a less exacting past. Modern measurements of proof rely on pycnometry, hydrostatic balances and (now) electronic densimetry, though all still rest on aspects of Gay-Lussac’s principle. Other more labour-intensive and exacting methods fell by the wayside, while new ones (like infrared analysis) are constantly being bandied about. In the end, they all measure the amount of ethyl alcohol in a given sample. And all of that is still expressed in simple terms: proof.
In summary, then: in the 1950s, say, a given whisky or rum could be quoted as 80 proof if measured in the US, 40° (Gay-Lussac) in Europe or 70 proof in the UK (and still other figures, based on mass, in some US states). But global standards are now based on simple alcohol-by-volume measures of strength, and companies regularly place the ABV % on the bottle (sometimes also the proof in the older terminology). The methods of assessment have grown more complex even as the terms remain the same as those of three hundred years ago. It’s like the width of all modern railroad tracks conforming to the width of Roman roads, which were themselves based on the width of wagon tracks dictated by the span of two oxen hitched up side by side…
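For the numerically inclined, those three scales reduce to simple ratios, and the conversions above can be sketched in a few lines of Python (the function names are my own invention, and the British figure uses the Sikes ratio of 100 degrees proof = 57.1% ABV):

```python
# Converting a spirit's alcohol-by-volume (ABV) percentage into the three
# historical "proof" scales. Function names are illustrative, not standard.

def us_proof(abv: float) -> float:
    """US proof: twice the percentage of alcohol by volume (at 60 deg F)."""
    return 2.0 * abv

def uk_proof(abv: float) -> float:
    """British (Sikes) proof: 100 degrees proof corresponds to 57.1% ABV."""
    return abv * 100.0 / 57.1

def gay_lussac_degrees(abv: float) -> float:
    """Gay-Lussac degrees are numerically identical to % ABV."""
    return abv

abv = 40.0  # a typical modern rum or whisky
print(us_proof(abv))            # 80.0
print(round(uk_proof(abv), 1))  # 70.1 (labelled simply as 70 proof in practice)
print(gay_lussac_degrees(abv))  # 40.0
```

The same ratio confirms the naval figures quoted earlier: 57.1% ABV comes out at 100 degrees of British proof, and 54.5% ABV at roughly 95.5 degrees.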
It really is enough to drive a man to drink.