Compass Points to Fathom

The marine world is steeped in traditional terminology. Sometimes, though, sailing can seem almost too traditional: measuring anchor chain in shots or shackles, depth in feet or fathoms, distances in nautical miles or degrees of latitude (but never longitude). In response, we proffer this portolan of measurement, being a guide to most of the ways one can measure distances and angles at sea.

We start our journey with an explanation of degrees, which leads us to ancient Babylonia. Some 5,200 years ago, a succession of mighty empires along the banks of the Euphrates River invented writing, astronomy, bookkeeping and the sexagesimal counting system. This last invention is a way of counting that uses 60 as its base. We use 10 as the base for our contemporary number system, except when it comes to counting things that fit neatly in a circle, like time and angles. For these, we still use the Babylonian base-60 system. Hence, at least some scholars assume, arose the common convention of using 360° in a circle.

Skip forward to 130 B.C., when, on the island of Rhodes, we’ll find Hipparchus: astronomer, mathematician and all-around genius. He knew the earth to be a sphere, so he applied Babylonian degrees to it and produced a latitude and longitude grid for the planet. Although the Greeks were already using degrees, he apparently devised the idea of subdividing them. In keeping with Babylonian usage, he subdivided degrees into 60 minute parts (our “minutes”), and each of those into 60 secondary parts (our “seconds”). With this subdivision arises the traditional way of expressing the size of an angle: degrees, minutes and seconds. Thus, you might find an angle of a little more than 12° written 12° 17′ 55″.

But those of us with GPS units are anything but tradition-bound mariners. We live in a digital age, and we want our angles expressed in modern, or base-10, numbers. GPS manufacturers are happy to oblige, and now one can find angles expressed as degrees, minutes and seconds (12° 17′ 55″), degrees and decimal minutes (12° 17.917′), and even decimal degrees (12.299°). Any angle — not just latitudes and longitudes — can be expressed in these units. The choice really depends on what is easiest for the user.
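For readers who like to check such conversions, here is a minimal sketch in Python; the function names are our own invention, not anything from a GPS manual.

```python
# A small sketch converting among the GPS angle formats mentioned above.
def dms_to_decimal(degrees, minutes, seconds):
    """Degrees, minutes and seconds -> decimal degrees."""
    return degrees + minutes / 60 + seconds / 3600

def decimal_to_degrees_minutes(decimal_degrees):
    """Decimal degrees -> whole degrees and decimal minutes."""
    whole = int(decimal_degrees)
    return whole, (decimal_degrees - whole) * 60

print(dms_to_decimal(12, 17, 55))            # about 12.299 degrees
print(decimal_to_degrees_minutes(12.29861))  # (12, 17.917), i.e. 12° 17.917'
```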

However, mariners of old didn’t need degrees — they needed winds. Within the eastern Mediterranean, conventional wisdom held there were eight distinct winds, each with specific characteristics, such as temperature, humidity and direction. By at least the 1st century B.C. (and probably much earlier), the Greeks used this eight-point system for measuring angles.

Venice rose to the rank of regional superpower in the latter part of the 13th century, thanks largely to the magnetic compass. Despite this, wind roses remained in common use, and around this time, Portuguese mapmakers introduced the 16-point rose on their sailing charts. By the 15th century, Italian and/or Portuguese mapmakers had begun to include roses with 32 points, eight per quadrant. This 32-point rose is what European mariners used during the Age of Exploration, and what we still use today.

It’s easy to see why. Start with the eight Greek points. They include the four cardinal directions (N, E, S, W) and the four intercardinal directions (NE, SE, SW, NW). Divide the arcs between each of these points in half, and you’ve invented the 16-point system. These inter-intercardinal points correspond to those standbys of lazy crossword-puzzle makers: NNE, ENE, ESE, etc. Divide those arcs in half to form the 32-point system, and we have the inter-inter-intercardinal points: N by E, NE by N, NE by E and so on. One modern point is 1/32nd of a circle, or 11.25°.
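As a toy illustration (the layout and names below are ours, not any official convention), each step east of north adds one point, or 11.25°, to the bearing:

```python
# One point of the 32-point rose is 1/32 of a circle, or 11.25 degrees.
POINT_DEGREES = 360 / 32

# The first quadrant of the rose, one name per point east of north.
FIRST_QUADRANT = ["N", "N by E", "NNE", "NE by N", "NE",
                  "NE by E", "ENE", "E by N", "E"]

for points, name in enumerate(FIRST_QUADRANT):
    print(f"{name:8s} {points * POINT_DEGREES:6.2f} degrees")
```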

The handy mil

The mil is a group of units worthy of attention. Many types of mils exist, but we’ll focus on the version used in North Atlantic Treaty Organization military forces. Take an object that is one unit wide. Place it 1,000 units away. The angle subtended by the object is one mil. (Strictly, that definition gives 6,283 mils in a circle; NATO rounds the figure to 6,400 for easier division.) With 6,400 needed to go around a circle, the mil is tiny. One mil is roughly the thickness of a fingernail held at arm’s length. Despite its size, the mil is a useful unit, so useful that artillery gunners and tank commanders use it as their principal angle measure. Assume, in our more peaceful pursuits, you know that a lighthouse is 40 feet high, and it subtends an angle of 10 mils. Then its distance (in thousands of feet) is:

Distance = height/angle = 40 feet/10 mils = 4 (thousand) feet.
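A minimal sketch of that rule in Python, with a function name of our own choosing:

```python
# Mil rule: distance = size / angle-in-mils, answered in thousands of the size unit.
def range_from_mils(size, angle_mils):
    """Distance to an object, in the same unit its size is given in."""
    return size / angle_mils * 1000

print(range_from_mils(40, 10))   # the 40-foot lighthouse at 10 mils: 4000 feet
```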

Mils are cool. Whatever unit you use to measure the size of the object is the unit that pops out at the end (in convenient thousands). A viewfinder calibrated in mils provides automatic range-finding abilities, which is the idea built into monocular range finders advertised in this very magazine. This cute little trick works because a mil is a “natural” way of measuring an angle. Natural units tie an object’s size, its distance and its subtended angle directly together, with no conversion factor needed; a natural system is handy.

Take the radian, the natural unit par excellence. Radians are the fastest way of accurately solving — in your head — all kinds of nav problems. For example, suppose during a race you want to steer 10° off the rhumb line to pick up better wind. How much will that course change slow your speed of advance toward the mark? You think, “OK, I take the cosine of 10°, so where’s my calculator …” But by the time you find one, radians will have given you the answer: 10° is about one-sixth of a radian, the decrease in speed of advance is just half the square of the angle, and (1/6)²/2 = 1/72 is, gee, about 0.014, or 1.5 percent! Bang, you throw the helm over, because that stronger wind is surely going to boost your speed by more than 1.5 percent.
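Here is that mental shortcut checked in Python, purely as an illustration (the variable names are ours); the small-angle estimate comes out close to the exact cosine.

```python
import math

# The radian shortcut: steering off course by a small angle costs a fraction
# of speed of advance equal to about half the square of the angle (in radians).
off_course_degrees = 10
angle = off_course_degrees / 60            # mental-math conversion (exact divisor is 57.3)

shortcut_loss = angle ** 2 / 2             # 1/72, about 0.014
exact_loss = 1 - math.cos(math.radians(off_course_degrees))   # about 0.015

print(f"shortcut: {shortcut_loss:.4f}   exact: {exact_loss:.4f}")
```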

Unlike all the other angle measures we’ve seen, the radian doesn’t fit evenly into the circle: there are 2π of them, about 6.28, in a full turn. First named by a professor at Queen’s University in 1871, the radian is, like the mil, a natural unit. In fact, it’s an über-natural unit: Take the radius of a circle and lay it along the circumference of the same circle. The angle that arc subtends at the center is one radian. An easy way to see a radian is to lay two yardsticks on a table, so they form a T shape. Place your eye at the end of the yardstick making the vertical of the T, and look toward the other yardstick. The angle subtended is roughly one radian.

The beauty of radians is that they make most geometry and trig calculations much easier. Remember sine and cosine? Forget about the calculators and tables: Always do trig with radians. Divide degrees by 60 to convert them to radians (the exact divisor is 57.3, but 60 is close enough for head work). For any angle less than about 0.75 radians (or 45°):

Sine (angle) = angle

Tangent (angle) = angle

Cosine (angle) = 1 – angle²/2

These approximations are excellent for the sine and cosine, and good for the tangent. The tangent is particularly helpful. Say a lighthouse 100 feet high occupies a pinky-width, or one degree, from your cockpit. How far away is it?

Distance = length/angle (in radians) = 100 feet/(1/60) = 6,000 feet, or about 1 nm.
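A quick Python check of both the pinky-width lighthouse and the small-angle rules, offered as an illustration only:

```python
import math

# The lighthouse by pinky: 100 feet tall, subtending about one degree.
angle_radians = 1 / 60                     # one degree, by the divide-by-60 shortcut
print(100 / angle_radians)                 # 6000 feet, or about 1 nm

# How good are the approximations near the 0.75-radian limit?
a = 0.5                                    # 0.5 radian is about 29 degrees
print(math.sin(a), a)                      # 0.479 vs 0.500
print(math.tan(a), a)                      # 0.546 vs 0.500
print(math.cos(a), 1 - a ** 2 / 2)         # 0.878 vs 0.875
```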

Is this distance as precise as the one you might get with your brand-new combination radar/chartplotter with the 4-kw open-array antenna? No, but instead of leaving the helm and going below, you remained in your cockpit and held up your pinky. There are times when that radar will save your life, but other times, that pinky may just do the trick. Remember our motto: Use whatever unit is easiest.

Putting your body into it

Nearly all traditional units of measure are based on some part of the body, human or animal. No two bodies are the same, so we shouldn’t be surprised that most units had multiple definitions.

No better example exists than the foot. Everyone had his or her own unit of length called the foot. The one we use today derives, like so many other measures, from the Saxons, by way of the Romans, Greeks and Egyptians. During the Roman occupation of the British Isles, the Roman foot (a little less than 12 modern inches) was the standard. After the empire, local Saxon feet, ranging from 6 to 10 inches, returned to prominence. Following the Norman invasion, the modern foot was defined. A story suggests the foot was defined around 1100 as one-third the distance from King Henry I’s nose to the tip of his outstretched finger. Despite a few minor changes during the reign of Edward I, the foot we use today is Henry’s.

Of course, the definition of the foot has become more precise in modern times. In the Metric Act of 1866, Congress defined a foot as exactly 1,200/3,937 of a meter. This survey foot is the legal foot for, as you may have guessed, surveying. In 1959, the National Bureau of Standards (an agency of the federal government once part of the Coast and Geodetic Survey) redefined the foot as exactly 30.48 cm. Great Britain uses the same definition, so this is called the international foot. The two feet differ by about two parts in a million, so don’t worry too much about which foot your local tax assessor uses.

If you long for a more nautical unit, don’t turn to the fathom. In modern standards, the fathom is exactly 6 feet long and is only used on boats. But throughout its long history (its first use in written English dates to the year 800), the fathom was used as much on land as at sea. A fathom is simply the distance between the fingertips of one’s outstretched hands, or, as explained in a publication from 1519, “The length … tween the both toppys of his myddell finger, than he maketh a fathome.”

Fathoms are the perfect unit to use for measuring line or rope. You grab the line with one hand and pull it back while reaching forward with the other hand. Between your hands lies, by definition, one fathom of line. This pretty much explains why charts show depth in fathoms. Try it the next time you heave a sounding line — pull it up by the fathom. Count the fathoms you bring up, and announce the result to the crew. If you happen to be anchoring, throw some additional terms into the mix. After reporting the depth in fathoms, ask how many shots or shackles of chain the skipper wants paid out. In the United States, a shot or shackle is 15 fathoms, or 90 feet, of anchor rode. Had you been in the Royal Navy before it standardized with its NATO allies in 1949, laying out one extra shackle of rode would have brought you just 12.5 fathoms of chain.

Everyone who had a foot, it seems, had a mile. The Scots, Irish, Saxons, Germans, Belgics, Danes, and Swedes — all of them had their own version, each different from the others. The mile we use today followed a now-familiar path. It began in Egypt, was adopted by the Greeks, then the Romans, who in turn foisted it onto the various peoples they subjugated. The Roman mile is 1,000 paces, or mille passus. A military pace (two steps) was about 5 feet long, so the Roman mile was 5,000 feet long.

It was Queen Elizabeth I who devised the statute mile. In 1593, the 35th year of her reign, she declared by statute that the mile would henceforth consist of eight furlongs (a unit of 220 yards), making the mile 1,760 yards, or 5,280 feet long. We still use this definition today, although, of course, the actual length of 5,280 feet changes each time the foot’s definition changes. The survey mile (of 5,280 survey feet) is about three millimeters longer than the international mile (of 5,280 international feet).

Nautical miles

Some readers may wonder why our discussion of the mile excluded mention of the nautical mile. The reason is simple — they have no relationship. None. The nautical mile comes not from feet and paces but from the conversion between angles and distances. Now, with both angular and linear measures understood, we can put them together to measure things on our spherical earth. Our goal is to see how, because we live and sail on a sphere, angles can be used to measure linear distances, and vice versa.

Any two points on a sphere are separated by some angle, measured from the center of the sphere. No matter what units you use to measure the angle or distance, this relationship holds (the factor simply accounts for the units; if the angle is in radians and the distance and radius share a unit, the factor is 1):

Linear distance = angle x radius x factor

If we assume for now that earth is perfectly spherical, every degree, minute or second of latitude is equivalent to a linear distance on the sea surface. By long-standing definition, a nautical mile is the distance equivalent to one minute of latitude.
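Here is that relationship sketched in Python, assuming a perfectly spherical earth of mean radius about 3,440 nautical miles (the function name is ours):

```python
import math

EARTH_RADIUS_NM = 3440.1   # mean radius of a spherical earth, in nautical miles

def meridian_distance_nm(latitude_change_minutes):
    """Distance sailed due north or south, in nm, for a latitude change given in minutes."""
    angle_radians = math.radians(latitude_change_minutes / 60)
    return angle_radians * EARTH_RADIUS_NM

print(meridian_distance_nm(1))    # about 1.0 nm: one minute of latitude
print(meridian_distance_nm(60))   # about 60 nm: one degree of latitude
```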

To see how the nautical mile arose, we must return to our friend Hipparchus, busily refining latitude and longitude. Following those before him, he placed the primary line of latitude at the equator, halfway between the poles. Additional parallels of latitude were placed every degree, north and south of the equator. The angular distance from the equator to the pole, measured from Earth’s center, is 90°, so Hipparchus’ latitudes went from 90° north to 90° south.

He drew meridians of longitude perpendicular to the equator, running through the poles. He placed the prime meridian of longitude exactly where generations of later cartographers would — right through his own backyard. The angular distance from the prime meridian all the way around the earth and back to the prime meridian is exactly 360°, so Hipparchus labeled his longitude from 0° to 180° east and 0° to 180° west of the prime meridian.

Hipparchus certainly knew the earth was spherical and probably knew of Eratosthenes’ determination, around 240 B.C., of the earth’s circumference. By careful observation, Eratosthenes measured earth’s circumference at roughly 25,000 international miles, within a few percent of the modern value. Had he wanted to, Hipparchus could have done a little division problem. Earth’s circumference of 25,000 miles spans 360° x 60 = 21,600 minutes of latitude, so one minute of latitude would equal about 6,100 feet, within 1 percent of a fully modern nautical mile! It demonstrates an interesting, even surprising, fact: the nautical mile is totally unrelated to the statute mile. It is a quirk of fate that earth happens to be the right size for one minute of latitude to land anywhere near a statute mile.
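The little division problem itself, sketched in Python with the figures from the paragraph above:

```python
# Eratosthenes' circumference, divided into minutes of latitude.
circumference_feet = 25_000 * 5_280        # roughly 25,000 miles, in feet
minutes_in_a_circle = 360 * 60             # 21,600 minutes of arc

print(circumference_feet / minutes_in_a_circle)   # about 6,111 feet per minute
```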

Alas, Hipparchus didn’t do this, and the nautical mile was a long time coming. One must know earth’s circumference to even use the term nautical mile, and the term doesn’t appear, in English anyway, until 1632. As early as 1525, various people in France, England and Holland were trying to measure earth’s circumference by laboriously surveying the length of 1° of latitude. By 1635, Englishman Richard Norwood used crude wooden octants to measure the latitudes of London and York; he then measured the distance between them using horizontal angles. From this, he apparently calculated the linear distance equivalent to one minute of latitude, and hence the size of a nautical mile: 6,075 feet. This is clear from recommendations he made in his 1637 book Sea-Man’s Practice, the Bowditch of the 17th and early 18th centuries. He was obviously pretty good at his job — his value is less than 2 feet from the modern one!

Many definitions

Like all other units, the nautical mile has lots of definitions, all slightly different. The Royal Navy used the Admiralty mile, 6,080 feet, until 1929. Then, at the International Extraordinary Hydrographic Conference in Monaco, the world defined a nautical mile as exactly 1,852 meters, or 6,076.11549 feet. The world, that is, except the United States, which continued to use the sea, or geographical, mile of 6,080.20 feet. The United States adopted the international nautical mile of 1,852 meters on July 1, 1954.

Ironically, none of these nautical miles is exactly equal to 1/60 of a degree of latitude! Earth isn’t exactly spherical, but it resembles a flattened sphere, or oblate spheroid. At the poles, one minute of latitude is about 31 feet greater than the international nautical mile, while at the equator, one minute of latitude is about 30 feet less. If that seems backward to you, you’re in good company.

At the pole, earth has a small radius but a large radius of curvature: the flattened surface there curves gently. A gently curving surface means a longer minute of latitude and hence a long “nautical mile.” At the equator, earth has a large radius but a small radius of curvature: the surface curves more sharply, so a minute of latitude is shorter and the “nautical mile” there comes up short. Those of us lucky enough to sail at the latitude of 44° 23′ 49″ have the nerdy pleasure of sailing where one minute of latitude is exactly equivalent to one international nautical mile.
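For the curious, here is a sketch (our own calculation, using WGS-84 ellipsoid constants) of how the length of a minute of latitude varies with latitude:

```python
import math

# WGS-84 ellipsoid constants.
EQUATORIAL_RADIUS_M = 6378137.0
ECCENTRICITY_SQUARED = 0.00669438

def minute_of_latitude_feet(latitude_degrees):
    """Length of one minute of geodetic latitude, in feet."""
    s = math.sin(math.radians(latitude_degrees))
    # Meridian radius of curvature at this latitude.
    radius_of_curvature = (EQUATORIAL_RADIUS_M * (1 - ECCENTRICITY_SQUARED)
                           / (1 - ECCENTRICITY_SQUARED * s * s) ** 1.5)
    return radius_of_curvature * math.radians(1 / 60) / 0.3048

print(minute_of_latitude_feet(0))    # about 6,046 feet at the equator
print(minute_of_latitude_feet(45))   # about 6,077 feet, near one international nm
print(minute_of_latitude_feet(90))   # about 6,108 feet at the poles
```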

The knot is simply a measure of speed, exactly equivalent to one nautical mile per hour. Most readers will be familiar with the chip log, in which a piece of wood attached to a rope is lowered from a boat’s taffrail. Once clear of the wake, a 28-second timer is started, and the rope is paid out. Knots in the line mark off every 47.25 feet; the number of knots that run out in 28 seconds gives the vessel’s speed through the water in knots. (Why the odd numbers? Norwood originally suggested them in 1637. The spacing works because 47.25 feet is to one nautical mile as 28 seconds is to one hour.)
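The proportion behind those odd numbers, in a line or two of Python (illustrative only):

```python
# Knot spacing on a chip log: same ratio to a nautical mile as the timer is to an hour.
NAUTICAL_MILE_FEET = 6076.12
TIMER_SECONDS = 28

print(NAUTICAL_MILE_FEET * TIMER_SECONDS / 3600)   # about 47.26 feet between knots
```

Run the same proportion with Norwood’s 6,075-foot mile and you get exactly the traditional 47.25 feet.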

As definitions of the nautical mile changed, so did the distance between knots on the line and seconds in the “hour” glass. The 1847 edition of Bowditch required 51 feet and 30 seconds, or 47.6 feet and 28 seconds, for the chip log.

Being comfortable with a range of units is part of the tradition of sailing, particularly for those of us who think that Dacron is still pretty high tech.

Larry McKenna is a sailor, freelance writer and president of Working Knowledge. He lives in Overland Park, Kan., and sails in New England.
