Feels strange to have single decimal places in your inch figures. I’d have expected something like eighths, sixteenths and thirty-seconds of inches, even though the decimal representation of them is unwieldy (1⁄32″ becomes 0.03125in). That also helps you avoid fractional pixels, which is worth doing if convenient: 1⁄32″ is 3px, because 1in is defined as 96px.
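The pixel arithmetic above can be checked with exact rational arithmetic. A minimal sketch (function name is made up; `Fraction` is Python stdlib):

```python
from fractions import Fraction

CSS_PX_PER_INCH = 96  # CSS defines 1in as exactly 96px

def inches_to_px(inches: Fraction) -> Fraction:
    """Convert an inch measurement to CSS pixels, exactly."""
    return inches * CSS_PX_PER_INCH

# Power-of-two fractions of an inch land on whole pixels down to 1/32":
for denom in (2, 4, 8, 16, 32):
    print(f'1/{denom}" = {inches_to_px(Fraction(1, denom))}px')  # 48, 24, 12, 6, 3
```

Below 1/32″ (e.g. 1/64″ = 1.5px) you start getting fractional pixels again, which is why 1/32″ is a convenient floor.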
(As an Australian, the main place I’ve ever seen fractions of inches is old tools like spanners from before the adoption of the metric system, and that’s all fractions with powers-of-two denominators. But maybe Americans use decimal fractions of inches? I suppose I have seen laptop screens described in that way, e.g. 13.1″, 15.4″ and 15.6″, though people typically truncate to the inch.)
All small measurements in the US are done in power-of-two fractions of an inch. In fact, much of the initial opposition to the metric system here was led by the construction industry, which had found from experience that 1/2", 1/4", 1/8", 1/16", etc. measurements are too convenient to ever give up.
The standard system excels when doing actual work. If you have a rod with a 1" diameter and you need to take it down to 13/16", you can use a lathe to make a 3/32" cut. Or if you need to divide 13/16" into 4 equal parts, each part is 13/64".
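That shop arithmetic can be sketched with exact fractions (variable names are made up for illustration):

```python
from fractions import Fraction

stock = Fraction(1)        # 1" diameter rod
target = Fraction(13, 16)  # desired diameter

# A lathe cut removes material from the radius, so reducing the
# diameter by 3/16" means a cut depth of half that: 3/32".
cut_depth = (stock - target) / 2
print(cut_depth)  # 3/32

# Dividing 13/16" into 4 equal parts just scales the denominator:
part = target / 4
print(part)  # 13/64
```

The point of the comment holds either way: with power-of-two denominators, both results stay as tidy binary fractions rather than repeating decimals.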
The metric system's strength is teaching in elementary schools where skills with fractions are not as strong.
Not exactly. Things are done in power-of-two fractions until you get down to around 1/64", but any further and they're done in thou (1/1000", aka mils) or tenths (1/10000"). There's overlap: machinists will use thou up to and sometimes beyond a full inch.
Rulers, drill bits, and wrenches typically use fractions of an inch where the denominator is a power of two, but circuit board design often uses decimal fractions of an inch. The holes in a breadboard are commonly 0.1 inches apart.
Routing a PCB is like constantly flipping your ruler around: old, classic through-hole parts are specified in inches/mils, while new, modern surface-mount parts are in millimeters. All electrical aspects (traces, spacing, lengths) are measured in inches/mils, while the mechanical aspects are often measured in millimeters (if the enclosure is not made in the U.S.). And if you use the wrong system, your engineering tables and formulas stop working, the numbers lose their context (measure the gap in a differential pair in millimeters and the number makes no sense), and an integer becomes a float with three significant digits after the decimal point.
So here is the joke: the electronics industry is moving towards the metric system, one mil at a time. At least CAD programs have a button to switch between the two instantly.
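The ruler-flipping above is just a fixed scale factor, since 1 in is exactly 25.4 mm by international definition. A quick sketch (helper names are made up):

```python
MM_PER_INCH = 25.4  # exact, by international definition

def mils_to_mm(mils: float) -> float:
    """Convert thousandths of an inch (mils) to millimeters."""
    return mils * MM_PER_INCH / 1000

def mm_to_mils(mm: float) -> float:
    """Convert millimeters to mils."""
    return mm * 1000 / MM_PER_INCH

# A classic 100 mil (0.1") through-hole pin pitch in metric:
print(round(mils_to_mm(100), 4))  # 2.54
```

This also shows where the "float with three significant digits" complaint comes from: a nice round 100 mil becomes 2.54 mm the moment you switch systems.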
Engineering and machining specifications are usually in 0.001" increments, referred to as thousandths or "thou". Tools like wrenches or drill bits use mixed fractions of the form x + m/2^n, like 2 1/2 = 2.5". It goes down to 1/64", I think.
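Those machinist-style mixed fractions convert cleanly to thou with exact arithmetic. A small sketch (`parse_inches` and `to_thou` are hypothetical helpers, not any real tool's API):

```python
from fractions import Fraction

def parse_inches(s: str) -> Fraction:
    """Parse a measurement like '2 1/2' or '13/16' into inches."""
    total = Fraction(0)
    for part in s.split():
        total += Fraction(part)  # Fraction accepts '2' and '1/2' strings
    return total

def to_thou(inches: Fraction) -> Fraction:
    """Express a measurement in thousandths of an inch ('thou')."""
    return inches * 1000

print(to_thou(parse_inches("2 1/2")))  # 2500
print(to_thou(parse_inches("1/64")))   # 125/8, i.e. 15.625 thou
```

Note the crossover point the parent comments mention: 1/64" is 15.625 thou, so below that the fractional notation stops being convenient and thou take over.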
You're saying that Americans preferentially use base-2 fractions (nearly) everywhere except on computers; on computers you preferentially use base-10, even though said computers are going to store the values as base-2?
> "on computers you preferentially use base-10, even though said computers are going to store the values as base-2?"
I don't understand this comment at all. Most people are not computer programmers and therefore have no insight into how computers represent numbers. They just populate text fields in Excel, or whatever, with numbers. It's easier to use decimal points to do that than fractions.
When I'm not using computers or calculators, I do prefer American-style fractions over decimal point representations. Fractions are a lot easier for me to manipulate in my head. If I'm out in my garage, upside down, underneath a car, and trying to decide which drill bit to use, I don't have convenient access to the calculator on my cell phone or a pencil and paper, and reliable mental math is important. The fractions are easier, so that's what I want.
I believe they're saying it's ironic that, despite the base-2 values being ideal for binary computers, users are having to use base-10 to enter them into the computer and hence getting none of the possible advantage. I don't believe they're making a value judgement.
A feature of some English measures is clean division into three equal parts, e.g. three teaspoons to the tablespoon, twelve inches to the foot, and twelve ounces to the pound (for one of several definitions of "ounce").
Thirds don’t fit neatly into binary nor do they fit neatly into decimal. This is what the comment refers to.
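You can see this directly by expanding 1/3 in different bases: it repeats forever in base 2 and base 10, but terminates in base 12. A sketch (`expand` is a made-up helper):

```python
from fractions import Fraction

def expand(frac: Fraction, base: int, places: int = 8) -> str:
    """First `places` digits of `frac` after the radix point in `base`."""
    assert 2 <= base <= 16 and 0 <= frac < 1
    digits = []
    for _ in range(places):
        frac *= base
        d = int(frac)
        frac -= d
        digits.append("0123456789abcdef"[d])
        if frac == 0:  # expansion terminates
            break
    return "0." + "".join(digits)

print(expand(Fraction(1, 3), 2))   # 0.01010101  (repeats forever)
print(expand(Fraction(1, 3), 10))  # 0.33333333  (repeats forever)
print(expand(Fraction(1, 3), 12))  # 0.4         (terminates)
```

A fraction terminates in base b exactly when its reduced denominator divides some power of b, which is why 12 (divisible by 3) handles thirds and neither 2 nor 10 does.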
> I think they were referring to the War of 1812. A war the US lost
The outcome of history indicates the US won that war. Technically it was considered a stalemate. The US lost no territory, repelled a malevolent empire again, kept its economy & trade fully intact, and won numerous major battles throughout the war. In the following decades the US became an economic juggernaut, while the British Empire began to fade into the history books.
A supposedly mighty empire failed twice in less than 40 years to bring a small nation to its knees. Quite humiliating.
- "In 1813, the US won the Battle of Lake Erie, gaining control of the lake, and they defeated Tecumseh's Confederacy at the Battle of the Thames, defeating Britain's largest Native American ally, a primary war goal."
- "In 1814, the British burned Washington, but the US later repulsed British attempts to invade New York and Maryland, ending invasions from Canada into the northern and mid-Atlantic states."
- "Attempts to smother American maritime trade failed, however, and soon both sides began to desire peace."
- "In early 1815, after a peace treaty had been signed, but before this news had reached the Americas, the US defeated the British Army near New Orleans, Louisiana."
- "Peace negotiations began in August 1814, and the Treaty of Ghent was signed on December 24, 1814. News of the peace finally reached America in February 1815 about the same time as news of the victory at New Orleans. The Americans triumphantly celebrated the restoration of their national honor, leading to the collapse of anti-war sentiment and the beginning of the Era of Good Feelings, a period of national unity. The treaty was unanimously ratified by the US Senate on February 17, 1815, ending the war with no boundary changes"
That's just straight up wrong. The US started the war when they invaded Canada. The British Empire did not want to fight the war. To them, the United States was just a sideshow of the much larger and much more important Napoleonic Wars.
If anything, it was the British and Canada that repelled a malevolent United States, who had had eyes on annexing Canada since independence.
> In the following decades the US became an economic juggernaut, while the British Empire began to fade into the history books.
Uh, what. The 19th century was the golden age of the British Empire, whilst the US remained a small player on the world stage. You must be confusing the War of 1812 with the First or Second World War.
I know it's the Fourth of July for you Yanks, but this comment is just straight-up propaganda.
And quoting Wikipedia to prove that it was a victory for the US is ironic given that Wikipedia states that the war resulted in stalemate and status quo ante bellum. The only real losers were the Native Americans.
Measuring by volume was from a time period before we had reliable digital scales. These days all baking measuring should be done by weight since it is far more accurate. I won’t even consider a recipe that goes by volume. Fortunately those are all written by amateurs.
I have a lot of junk in my kitchen drawers (admittedly a personal problem) and will opt for measuring directly from the bag into the bowl on a scale over digging around for that quarter cup every time.
Mechanical scales were normal in the UK for most of the 20th century. Everyone I knew had one in the 1990s, from around 2000 people started buying digital scales, and I took my mum's mechanical scale when I left for university.
Measuring ingredients by weight has a long history in Britain. The old Victorian cookbooks use weights.
I’m generally a fan of simple mechanical devices over electronics, but I think digital scales are a good example where electronics really do make things simpler; and as it’s a commodity (like calculators and digital watches) the electronic version is amazingly cheap too.
The ability to zero the count as you add each ingredient to a single bowl is very useful.
> These days all baking measuring should be done by weight since it is far more accurate.
It also has no "tamping" problem: a volumetric measurement of flour can differ in the amount of actual flour by a factor of 2+ depending on whether the flour is tamped or aerated. Volumetric measurements are supposed to use aerated flour, i.e. first make flour fly everywhere, then try to coerce it into your cup.
The units were already defined in relation to metric units, but different countries used slightly different conversion factors. In the US this happened officially through the Mendenhall Order (though the OWM had unofficially been doing it for some time); in the UK through the Weights and Measures Act of 1897, which, while it nominally defined the meter in terms of the yard, meant in practice that as the precision of the meter increased the yard followed rather than led.
Welcome back to HNC News. Tonight's top story: US law makers have introduced a new bill that would require all digital displays in the United States to feature a pixel density of 96 ppi. White house officials say the new bill represents "the first step on the march to bring freedom to the cyber world". More on this after the break.
Anchor: "Let's bring in a Congressman who agreed to comment on this extraordinary turn of events"
Congressman: "I'm glad to support the expansion of freedom to all realms. Thankfully, it is now safe for us to do this that we can defend ourselves against the dangerous 'cypherpunks'. Thank you to everyone who has helped make this possible"
As a European whose only familiarity with inches is that "it's about 3cm" that actually annoys me.
Back when screens were small it sort of worked, I knew what 13", 15", 17" and 20" screens looked like. Since most monitors fell in that range (yeah, I'm showing my age) that was good enough. Anything bigger than that was "damn huge", anything smaller was "damn small".
But now monitors and especially TVs are immense. I see TVs with diagonals of 30", 55", 65"... That means absolutely nothing to me. I have to convert into centimeters for it to make sense. Fortunately some resellers do put the metric measurement next to the inches, but not everybody does it.
If it makes you feel any better, I'm from the US and the unit is kinda meaningless. It's not like I can visualize 42" (without converting to feet first), and because it's the diagonal length it's basically impossible to visualize anyway - I have to have seen it to know.
In Japanese, they don’t translate “55"” normally, instead it becomes a “55 unit” TV. FWIW, as an American, the diagonal is unhelpful, so you need to look up the horizontal and vertical measurements anyway to know what will fit on your shelf.
As a European I find this really cool, and unlike the self-loathing Americans here in the comments I understand the idea, the humor, and the originality.
Happy 4th USA, you are great today due to the greatness of your forefathers and the men that built this nation.
How hard would it be to write "men and women" (or "women and men")?
What do you reckon? Do you reckon the fellas who "did all the work" in US history did all their own cooking and cleaning? Do you reckon their intimate partners and close others contributed nothing to the people to whom history attributes greatness?
Probably not old boy, fair chance the wives, partners, and mistresses went a long way to getting us where we are.
And that's if they weren't directly responsible but simply never attributed. How does that saying go? Next to every great man is an even greater woman. It's often true, except that the "woman" part could be generalised to partner or carer.
Some people act alone, outliers though.
And to still, in this socio-political climate, not bother to include women is at least tone-deaf, if not outright intentionally exclusive.
The Constitution is THE Supreme Law of the land. It literally is the ‘bible’ in how America is to be governed. The difference is that The Constitution has mechanisms within itself for change. This is the opposite of dangerous. It is the reason slavery was abolished, women have the vote, etc. It is a living document and it serves the people.
I would say one of the tenets of the Bible and other holy books is that they are immutable. They're the words of God, and he doesn't make mistakes. (In theory at least; I've heard the Bible's translations haven't always been kind to the original text.)
That's a huge difference from the constitution, a fundamental one I'd say. And most of the time, the people I see treating the constitution as gospel deny this very important mutability: the founding fathers were right and will always be right; the constitution must not be amended, it is already perfect.
The constitution is the document that defines the existence of the United States and all of the rights, freedoms, and protections it provides and aspires to provide. It’s not perfect, but it is a heck of a lot better than blind adherence to the flag and whoever waves it.
Seeing as the constitution defines the United States, I think you get a lot closer to the truth when reflecting on it than you do when worshiping the flag or the military or whatever else it is that July 4th is supposed to be about.
It is also important to remember that the constitution is a living document, meant to grow and evolve over time. If there are problems - and there are - it is up to us to change it.
The United States is the world's third most populous country, and home to a number of technology businesses–one of the main draws of the site. I don't know if anyone has officially disclosed Hacker News's traffic data, but a lot of people have done analysis of what kind of traffic they get from being on the front page and it's invariably dominated by the United States: https://nicklafferty.com/blog/what-happens-when-you-re-on-th...
Most adults in the UK still measure their weight in stones and pounds, at least in my personal experience, and I have never come across anyone (except those raised in other countries) using either plain pounds or metric units for this purpose.
> Looks like they had 3 fingers per hand, so base 6 seems a likely choice.
Depends how they use their fingers and whether they use other body parts. There is evidence for human cultures having ranged from 4 to 10, 12, 16, and 20 (technically there are higher ones but they usually have sub-bases in that range).
You can count to 12 on a single hand (24 on two hands) by pointing at the phalanges with the thumb, for instance. Some native cultures had base 8, using the spaces between the fingers rather than the fingers themselves.
Opinion: Highly composite numbers make good bases.
2, 6, 12, 60, etc
Number of fingers doesn't matter, although you can shoehorn almost any base into them. For example, base 6 lets you count to 5 on one hand and use your other to count multiples of 6. This lets you count to 35 on two hands.
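The two-hand base-6 scheme above is just `divmod`: one hand holds the sixes digit, the other the units. A tiny sketch (`base6_hands` is a made-up name):

```python
def base6_hands(n: int) -> tuple[int, int]:
    """Split n into (sixes hand, units hand) for two-hand base-6 counting."""
    if not 0 <= n <= 35:
        raise ValueError("two hands of five fingers cover 0-35 in base 6")
    return divmod(n, 6)

print(base6_hands(17))  # (2, 5): two fingers on one hand, five on the other
print(base6_hands(35))  # (5, 5): all ten fingers up, the maximum
```

The same trick generalizes: with base b per hand you reach b^2 - 1, so base 6 gives 35 rather than the 10 you get by counting fingers directly.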
What have I, or those I represent, to do with your national independence? Are the great principles of political freedom and of natural justice, embodied in that Declaration of Independence, extended to us?... What, to the American slave, is your 4th of July? I answer; a day that reveals to him, more than all other days in the year, the gross injustice and cruelty to which he is the constant victim.
"As I have said, this southern threat lost many votes, but it gained more than would cover the lost. It frightened the timid, but stimulated the brave; and the result was—the triumphant election of Abraham Lincoln."
No, it’s not. Before American independence the British crown held slaves.
Prior to independence both SC and Virginia actually requested to pause or slow down the slave trade. They had to make this request to England and it was refused. The Declaration of Independence laid the groundwork for the eventual end of slavery that culminated in the Civil War.
After US independence, Britain began to wash its hands of slavery, finally abolishing it in 1834. It's also worth keeping in mind that during the War of 1812 the British attempted to arm both slaves and Native Americans to help their war effort... while still practicing slavery themselves. You can imagine the effect such an action had on local relations as well.
America’s independence should be celebrated by everyone. Selective history should not.
> Before American independence the British crown held slaves.
Well exactly. And it was a practice that continued for many, many years afterwards. Arguably the early US economy was mostly built upon slavery. All that wealth created by people forced into labour who received no compensation for their hard work, but instead just enriched the slave owners.
It's selective history to ignore this and pretend that America was some beautiful egalitarian society striving towards anti-slavery from the start, and that any slavery that remained was solely the fault of the British.
This should be a day of shame and dishonor, if we're not being selective about history.
Tunnel vision on slavery is a pretty bad mindset. There are many facets to American society beyond slavery. It's almost impossible to imagine a person today who supports slavery. We've come a long way, and nobody is celebrating slavery.
You're the one who brought up Independence Day. And again, your knowledge of history is shallow. I am able to remember when the schools became integrated. I also know why the bathrooms at the rest stops in the South have two entrances ("separate but equal"). And I had a teacher who as a young girl was chased by the Klan.
What we have now is so much improved over what it was like just 40 or 50 years ago that anyone younger just can't appreciate the difference. I advise you to turn off your internet for a few days, go drink a beer & chat with your neighbors and maybe shoot some fireworks off. The internet amplifies biases - don't let it.
The moral panic was... impressive. The local TV station decided to broadcast that there was a riot at the high school (there wasn't), and all these angry parents showed up to pull their kids out of school. Meanwhile, the kids were like: "Steve, your mom looks really freaked out."
It's cute that this library is 1776 bytes, but that's quickly going to change as bugs are found and new features need to get added. It seems like coupling a product's name to something that will likely change very shortly is just bad product naming.
meh, call it done. it's modifiable by the user, so leave it alone and let the user mess with it. i have no idea what the author's intentions are/were for the project, but it seems like 'mission accomplished'.
FWIW I'm not saying I dislike the project, I love when people post quirky projects like this to HN for fun. I'm just saying that even the smallest change to this project would cause the author to lose the "1776 USA" marketing strategy. One could say it's an unstable strategy at a local maximum.
What name are you concerned about? The file is called 'usa.css'. The file size is specifically listed right next to the download link. Not unusual. If the file size does change, it'll just be another number, yet the name of the file/project is still 'usa.css'.
I'm talking about the cute fact that the library is 1776 bytes. It was obvious that author deliberately tweaked the file size to be 1776 bytes, to aid in the marketing of the library, and that expecting to keep the file size at 1776 bytes is an unrealistic goal if the author wishes to iterate on the code. I don't get why this is being made into a huge issue. It's a simple observation.
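For what it's worth, keeping the byte count pinned while iterating isn't even hard: you can pad the stylesheet with a trailing comment. A tongue-in-cheek sketch (nothing suggests the author actually does this; `pad_css` and the sample CSS are made up):

```python
TARGET = 1776  # bytes, for the gimmick

def pad_css(source: str, target: int = TARGET) -> str:
    """Pad an ASCII CSS file to an exact byte count with a trailing comment."""
    raw = source.encode("utf-8")
    slack = target - len(raw) - len(b"/**/")  # room inside an empty comment
    if slack < 0:
        raise ValueError("source is already too large to pad to the target")
    return source + "/*" + " " * slack + "*/"

padded = pad_css("body { background: navy; }\n")
print(len(padded.encode("utf-8")))  # 1776
```

So the "unstable strategy" is really only unstable in one direction: it breaks the moment the minified source itself grows past 1772 bytes.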
It’s Independence Day in the US. Patriotism is the theme of the day. This is just a humorous holiday-themed style sheet with a readme to match. It is not a serious style sheet that the author expects anyone to use.
Perhaps this is a cultural difference, but the entire thing just seems strange from my own (European) perspective. Only in the US do people seem to take pride in their measurement system as some sort of patriotic symbol, which is why I originally thought it was a parody. *shrug*
I'm not even going to get started with all the problems I see with "God bless America", which would be long, tedious, probably inflammatory, and fairly off-topic.
You're reading way too into this. Someone just decided to have some fun and share it with HN. It isn't really serious, but parody isn't the right description. People aren't taking pride in the measurement system, that's somewhat of a joke. "God bless America" is a common cheer taken from a popular song. The UK has something similar with "God save the queen".
No, 'international' does not mean 'French'. Yes, the French came up with it but since it now has international adoption it is the international standard with the United States as one of the last hold-outs. That's what you get when you have a country run by lawyers with very limited input from science.