As much as we talk about math literacy here (compared to other platforms anyway), I can't help but feel like the incomprehensibility of high-level math to laymen is a feature. It's not talked about outside of academia often, because why would it be, but for a long time philosophy resisted efforts to 'mathmatecize' (my own word) the field out of concern that it would become inscrutable to outsiders. As a result, academic philosophy is much more politically polarized than math. Where would we be if every step forward in mathematics was met with backlash for not paying adequate attention to some arbitrary set of external implications? Anyone can disagree of course, but just looking at the political affiliation ratios of educators makes it clear that one aspect of the hard sciences historically has been an ability to stay focused on utility despite post-modernism. I'm probably being unfair though; the humanities continue to produce persuasive arguments (persuasive emphasis being one of the first things most lit and speech majors learn in college critical thinking and writing courses), they've just dropped the pretense of being truth-seeking pursuits.

I might see it this way because I've been studying math for a long time (though not professionally for most of it), but I don't think modern math is ever made more complicated or harder to understand just for the heck of it. That's because math already feels extremely hard to most people.

I think anyone in professional math is working really hard to make the field more comprehensible to themselves and so to the world - and the reason is that the more a mathematician understands, and the more compactly they can understand it, the further they can go.

And while the curt way a mathematician puts things certainly makes it harder for the layman, I think the biggest harm to the layman's understanding of math is a math education that conditions people not to think abstractly and to see math as a series of dull exercises.

I agree. If some mathematicians purposefully obfuscated certain topics or made them more complicated than necessary, it would create an arbitrage opportunity for others to clarify and simplify those topics.

There are definitely tangible rewards for those who can clarify and simplify a topic, because it can lead to discoveries and insights.

This gradual clarification of a topic hinges on the actual importance of the topic. If a topic is not important, I can see people getting away with obfuscating or complicating results.

Respectable, pro mathematicians don't obfuscate. But there are definitely math professors in non-top universities who get kicks out of torturing students and think "this is supposed to be hard" etc. I'm definitely a math enthusiast but I have personally met people like that. Typically the exams are under strong time pressure and are more about rote learning of fixed types of exercises and remembering definitions.

On the other hand, I had math profs who radiated curiosity and fun and the exams were just a few questions, but you had to think to solve them and you weren't under much time pressure.

So while mathematicians aren't purposefully obscure, the math people (teachers) the average person interacts with are sometimes purposefully obscure.

> As a result, academic philosophy is much more politically polarized than math.

I don't think this is a clear conclusion at all. Philosophy is inherently very political (all political ideas ultimately come from philosophy, after all). Philosophy is also concerned with the real world and human affairs much more than abstract math is.

It's also funny that you bring up post-modernism, a philosophical current famous and often ridiculed for being one of the most deliberately obfuscated.

I also know that a lot of the efforts to bring mathematical thinking and abstraction into the humanities end up exactly in obscurantism: most of the field can't understand it, but it has cultural clout so they can't admit that, and it becomes a source of empty prestige, with maths literally sprinkled on studies just to access that clout. Even in medicine, math-y papers are/were often accepted unconditionally, with peer review unable to check the actual maths, but unwilling to admit their lack of mathematical culture. For example, there was a medical article published in 1993 [0] which is a re-discovery of the trapezoidal rule of integration. It got around 100 citations.
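For context, the "method" that paper rediscovered is just the trapezoidal rule: approximate the area under a sampled curve by summing the trapezoids between consecutive points. A minimal sketch in Python (the function name and sample data are my own, for illustration):

```python
def trapezoid(xs, ys):
    """Approximate the integral of y over x using the trapezoidal rule."""
    area = 0.0
    for i in range(len(xs) - 1):
        # Each panel is a trapezoid with parallel sides ys[i], ys[i+1].
        area += 0.5 * (ys[i] + ys[i + 1]) * (xs[i + 1] - xs[i])
    return area

# Area under y = x on [0, 1] is exactly 0.5; the rule is exact for lines.
print(trapezoid([0.0, 0.5, 1.0], [0.0, 0.5, 1.0]))  # 0.5
```

The rule is exact for straight lines, which is why the example recovers 0.5 exactly - any first-year numerical methods text covers it.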

Of course, that article is at least correct. In postmodern literary studies, there is a famous example of an intentionally meaningless, but very abstract and complex sounding article that was successfully published in one of the leading journals, despite ultimately being gibberish [1].

My ultimate point being: I don't think there is any link between the obscurity of a field and its resistance to internal or external politics.

In philosophy 101, my professor said that philosophy was focused on controversial topics because once you could prove a side, it would become science. E.g. the heavens used to be a philosophic topic until we had the ability to prove heliocentrism.

Math is much more focused on the realm of what can be proved. Unproven statements in math (e.g. conjectures or axioms) are the exceptions that prove the rule.

Professional mathematicians spend their lives working on the unproven. It just seems like math is about what's already been proven because it takes so much education to reach the unproven ideas and unanswered questions.

Obfuscation has been a practice across the entirety of academia for a long while now; the art of obfuscation is not new at all. A small anecdote: I remember a Taiwanese friend mentioning the history of the complexity of Chinese characters and how they were devised, in part, to ensure that the lower class(es) could not learn them.

Obfuscation inherently guarantees a lot of things out of the box: you appear more enlightened, only a class of people can understand and thus judge your work, and all newcomers can only become "in the know" if and only if they are trained by someone who already is "in the know". This is very appealing to anyone with even minor insecurities (most people), and has been widely utilized throughout many areas of human society in time. Math is merely one example where this practice has presence.

EDIT: I will admit that the more the internet has democratized knowledge, and the more prevalent higher degrees become in society, the more aware people are of this issue and the more it gets called out. So, all in all, there is a positive side: this art of obfuscation is declining in use for things where it should never have been used (like education, science, etc.).

I'd say this is generally not true, at least in math. There's definitely poor notation and misnomers that come from tradition, but it's hardly deliberate as "obfuscation" implies.

I'm not sure I agree (I have a PhD in maths). There are many maths papers and books I've tried to understand where the material seems to have been deliberately made more difficult than it needs to be. Sure, you could argue that I didn't have the necessary prerequisite knowledge at that point, or that the authors simply weren't good at explaining things, but I do get the impression that it's sometimes more than that.

Using specialised meanings for words without definition or reference (words that are used for other purposes elsewhere, so they can't be searched for) is a key problem; the same goes for notation, as does leaving out crucial steps. If you're already in the know, those things are probably clear.

Most people suck at explaining. So the fact that most scientists suck at explaining is like... yeah, that's what naturally happens, unless you actively reward good explanations.

The perfect case would be if someone is great at research and also great at explaining. But great skills are rare, and specific combinations of great skills are even more rare. So you will have a scientist or two who are great explainers at the same time, but then you have a lot of scientists who are great at research but suck at explaining, and also a lot of people who are great at explaining but suck at research. How does academia reward them?

The ones with both skills will get the highest rewards: they will invent lots of cool stuff and describe it clearly, so they will get many citations. But there are only a few such people per generation. The ones who are good researchers but suck at explanation can still invent something and publish the type of paper you complain about; and they will get a (smaller) reward for doing so. The ones who are good at explaining but suck at research... will probably be fired.

There are a few things that a person whose only skill is explaining could do in academia. They wouldn't get a reward for explaining someone else's paper clearly, but they could get a reward for writing a meta-review that would explain multiple papers clearly. In theory, their scientific contribution would be in comparing the papers, but in practice, hopefully even people who only needed one of the papers would cite them as a reward for making it easier to understand. -- I don't know if this would actually be sufficient to survive academically.

People who are good at explaining can also make money by writing popular textbooks. The question is whether there is sufficient demand for a popular textbook explaining some obscure math.

> They wouldn't get a reward for explaining someone else's paper clearly, but they could get a reward for writing a meta-review that would explain multiple papers clearly.

I see tremendous value in such work, and I think it is one of the biggest current bottlenecks in the sciences. We've got the cutting-edge stuff happening in papers, and we've got 5-10 year old knowledge that makes it into textbooks, but in the middle there is nothing. Everyone is working on their own to catch up. We're talking about thousands of grad students independently banging their heads against the same problems...

If there were more "review journals" things would drastically improve. Maybe the NSF could provide some funding for such efforts? Not sure what the metrics would be for this---not citations; we'd need something more like views or upvotes, as on HN. The other option would be for grad students to self-organize into "reading clubs" on different subjects and share notes.

Firstly, let's not fall for a false dichotomy. There is likely a simple equilibrium of review and creativity that must be maintained for an optimally healthy scientific community. Obviously, the old style of numerous journals plus a publish-or-perish mentality was an unhealthy implementation of a scientific system.

But I would move away from papers entirely. They hold no redeeming value over alternatives like a git repository published publicly with an "issues" forum, e.g. GitHub. This absolutely needs to become the new standard, and we need to get rid of papers and journals, both physically and digitally, in their entirety. They have zero redeeming factors at this point in time except to boost the ego of the published. Quality control can easily be decentralized and made available to everyone using something like GitHub. We know this because that is literally what the security and trust of open-source software is predicated on: putting everything in the open.

Modern math papers also love cryptic symbolism. I've been doing some research and reading papers on a certain topic from the 50s to the present day. The difference in math presentation is startling: a paper from the 50s really tries to explain things, e.g. it uses simple partial derivatives and some simplifications because that's just enough for the paper; a paper from 2020, on the other hand, describes the very same math as a cryptic sequence of triangles and dots, and the reader really needs to be familiar with the context to understand what the compressed formula expands into. I believe the main motivation behind this is the desire to look sophisticated, so anyone reading the author's paper would be impressed at a glance by the complexity of the math.

Like ∇^2 u = Δu = f? It's not really cryptic; it's taught in multivariable calculus. And it's not done to look sophisticated; it's so that you can instantly recognize that it's Laplace, or Poisson, etc.

If I tell you B = (dA_z/dy - dA_y/dz)i + (dA_x/dz - dA_z/dx)j + (dA_y/dx - dA_x/dy)k do you instantly see what operator is applied? That's why you just write B = ∇×A.

People presenting ideas in shorter notation is not showing a lack of progress - usually it's the opposite. In old optimization papers people would write out the entire normal equations, entry by entry. Not really insightful at all, compared to A'Ax = A'b.
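To make that last comparison concrete, here is the compact matrix form of the least-squares normal equations, A'Ax = A'b, in NumPy (the data values are made up for illustration - a line fit y = c0 + c1*t through three points):

```python
import numpy as np

# Overdetermined system: find x minimizing ||Ax - b||^2
# for a line fit through (1, 1), (2, 2), (3, 2).
A = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
b = np.array([1.0, 2.0, 2.0])

# The normal equations in one line, versus writing out every entry.
x = np.linalg.solve(A.T @ A, A.T @ b)
print(x)  # intercept 2/3, slope 1/2
```

The entry-by-entry version of A'Ax = A'b for even this tiny problem is several lines of sums of products; the matrix form is both shorter and instantly recognizable.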

That's a trivial triangle. Modern papers go far beyond that and invent their own partial differential operators, like "F.∇" which means "do the grad first, then the dot product", use uncommon context-dependent subscript and superscript indices that can mean fairly arbitrary things, and so on. But my complaint is mostly about the use of unnecessary, terse formalism where a short full form is replaced by a 40% shorter form just to use an uber-generic higher-order operator to look fancy and refer to a couple of mysterious (and useless in the context) theorems or other papers. This is when I have to read other similar papers to see what that cryptic equation really means (and often it means something mundane). Programmers have the same kind of mental problem: in order to look sophisticated, they intentionally use more cryptic and terse solutions to impress the "less initiated" and to hide the fact that what they are doing isn't that complex.

This seems to be presenting a false dichotomy. B = (dA_z/dy - dA_y/dz)i + (dA_x/dz - dA_z/dx)j + (dA_y/dx - dA_x/dy)k and B = ∇×A are not the only ways, out of all possible strings, to encode that information. This is the power of modern programming languages. They are mathematics with much better syntax.

Every author has their own motivations, but I will echo what the parent has expressed that often, if not the majority of the time, authors are making their papers far more difficult to read than they need be.

To that, I'll add an anecdote. Some years back, I had a fight with a coauthor on a paper who wanted to remove the majority of the substeps within a proof. He believed that it was condescending to include such material and that any interested reader should be able to derive it for themselves. And, in fact, he asked a colleague in the room at the time, who agreed with him. I stated flatly that, as the primary author, I could not understand nor complete these proofs without these steps, so the material stays. I contend that at least in this one particular case a combination of exclusivity and arrogance led to an attempt at obfuscation.

Remember, publishing isn't just about sharing knowledge. It's also a way to posture, advertise, build a brand, and get grant money. Beyond that, mathematicians are still people with all the imperfections that implies.

Could well be, and I have no definitive evidence otherwise, but sometimes it's hard to imagine anyone being that bad at exposition. Some "benefits" might be seeming smarter and more knowledgeable than they are, making it harder to exactly replicate their methods (most relevant in applied areas), or citations by intimidation ("I don't understand this, but it seems impressive and relevant, so I better cite it.").

They all sound plausible and likely. There are people obfuscating knowledge for all the reasons you mentioned.

At the same time, it doesn't mean there's a concerted effort to obfuscate mathematics.

Explaining things is hard, because you have to create a clear picture of the audience in your mind. Then, you have to draw the lines between what they know and what you want to introduce. The curse of knowledge is a big obstacle here.

Same for the Chinese characters. The complexity (the number of characters and their different readings and meanings) stems from all the historical cruft the system accumulated over millennia, not from something purposefully engineered to be obscure. A lot of work went into keeping the thing usable.

The other thing is that at the time, yes, people who could read and write were a small minority. That has been the case in other societies with writing too. Not because of a cabal making it difficult, but because getting an education was very costly.

I'd pay to research that topic: how people ensured gatekeeping out of fear, and how much of it is useful or not (because sometimes cutting people out can reduce the paradox of choice, which can drag things to a halt).

ps: I'm also very curious about how much obfuscation is just an icing of fetishism [0] on top of a very natural compression tendency (old mathematics papers were written in full words and I can agree that the meaning can be lost on the way and part of the beauty of mathematics is the short symbols that cover a large set of possibilities).

[0] too much pride and a bit of confusion even on the value of notation leading people to think it's more important than intuition and the concept described.

Many of the early mathematicians were mystics. So in a way the incomprehensibility is a feature and not a bug, but for those who are well versed in the language it is a rich source of analogies and metaphors.

> "Good mathematicians see analogies. Great mathematicians see analogies between analogies."

To see analogies between analogies requires compressing concepts into symbols that at first glance are not as comprehensible as regular words and sentences but math is really just another language and people can learn it and apply it fruitfully.

Seriously, how intellectually compromised and paranoid do you have to be to ascribe these weird motivations onto the entire field of mathematicians???

They’re not a cabal; quite the opposite, they fight like schoolchildren (look up Leslie Lamport’s view on natural deduction, especially OR-elimination).

The conflict between philosophy and mathematics that you describe is generally understood as the debate around logical positivism, the adoption of which necessitates an extremely naive philosophy of scientific knowledge.

You can even throw Gödel/Turing into this, which really finalizes any dumb debate about formalizing philosophy as math/logic, but at no point is there any proof of a conspiracy to keep mathematics off-limit for “normal people”.

Like, it’s literally published in journals, and it is easily followed, as long as you actually spend the time to understand the fundamentals it’s built on.

As an exclusive club, it's a feature. As a pursuit of knowledge, it's a tragedy.

There are many highly intelligent people who are differently able to absorb mathematical literature. Consider, perhaps, if the gender imbalance is caused by the time needed to take care of children and a household - time which mathematics in its current state unfairly demands.

Human brains are structurally not very different. If raw compute is about the same across individuals, then the progress of academia relies on people with highly differing forms of intelligence. An obvious example is people on the spectrum with seemingly absent emotional intelligence but vast capacity for abstract creativity. There should be many more types of intelligence available to us given a better communication protocol.

Since I'm firmly in the camp of gaining knowledge for the benefit of mankind, I think the camp which instead of descriptive names gives names like "Hausdorff" or "Abelian" to important mathematical concepts are intellectual looters and a detriment to the field. As you seem to say we are in the same camp, I hope you see the conflict in gatekeeping.

You're not about to guess what a 'commutative group' is from its name any more than you're going to guess what an 'abelian group' is, unless you have already learned what commutative or abelian means in the specific context of algebra.
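For what it's worth, the property behind both names is the same: a*b = b*a for all elements. A tiny illustrative check in Python (the helper name and the example group, Z_4 under addition mod 4, are my own choices):

```python
def is_commutative(elements, op):
    """Return True if op(a, b) == op(b, a) for every pair of elements."""
    return all(op(a, b) == op(b, a) for a in elements for b in elements)

# Z_4 under addition mod 4 is commutative (abelian).
print(is_commutative(range(4), lambda a, b: (a + b) % 4))  # True

# String concatenation is not commutative: "xy" != "yx".
print(is_commutative(["x", "y"], lambda a, b: a + b))  # False
```

Neither name tells you this by itself; either way, you have to learn what the property is before the label means anything.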

I'm always baffled that programmers, who freely adopt frameworks and languages with mostly meaningless names (which incidentally makes them easier to google for), continue to insist that names are the major stumbling block in learning math. It's utter nonsense. You pick up the names and terminology quickly enough (or look them up if you forget); the hard part is applying them.

Sure, if you're actively working in that subfield. I took abstract algebra in university and remember we covered groups and commutative groups, but I can't tell you exactly what rings were or what Abelian referred to. (But just as knowing the names of things is not the same as knowing the things, forgetting these labels is not the same as forgetting the material.)

Naming things is important. Yes it just takes one googling, but in many cases you are interfacing with many different topics and just reading straight up the useful descriptive name would allow keeping the flow without having to look up stuff.

Maybe this is more of a stumbling block for some than others. Often I find I have to make up mnemonics and other strategies like "the longer word is the one that ..." or "put the two terms in alphabetical order to match their descriptive names' alphabetical order".

Again, this doesn't happen for terms we use every day. But if you use it every year or so, it's a stumbling block to have to ask "which one was that again?" when the descriptive name would immediately clear it up.

I think it's mostly vanity and a "respecting the elders" and credit assignment thing that so many things are named for mathematicians instead of descriptive names. Having something named after you is like one of the biggest "awards" a mathematician can get. But this has no concern for didactics.

I doubt many laymen would be able to infer the meaning of “commutative” from today’s meaning of “to commute”. https://www.merriam-webster.com/dictionary/commute defines it as ”change, alter”, “convert”, or “compensate”, or as a synonym for “to commutate”, with the meaning “to reverse every other half cycle of (an alternating current) so as to form a direct current” (first used, according to that dictionary, in 1890; https://projecteuclid.org/download/pdf_1/euclid.rmjm/1181070... claims Kronecker introduced the term “Abelian group” in 1870)

Unless dictionaries 150-ish years ago had significantly closer descriptions of the term, “Abelian”, being a new term not loaded with pre-existing definition might be the better choice for naming this property.

Does it? I think that could describe “associative” equally well as “commutative”.

Edit: “symmetric” might be an alternative for Abelian because the Cayley table (https://en.wikipedia.org/wiki/Cayley_table) of an Abelian group is symmetric. I’m not sure that’s immediately clear enough for laymen, though.

"Symmetric group" is already taken. Given a set S, the symmetric group G_s is the group of all permutations of S. Not to be confused with a permutation group of S, which is a group of some permutations of S.

I suppose we could rename "symmetric group" to "maximal permutation group", but stringing adjectives together is not a sustainable strategy for naming. Plus, it is not clear to me how accurate "maximal permutation group" is as a name. That is, the symmetric group S_3 is the maximal permutation group of {1,2,3}. However, it is not generally a maximal permutation group, as it is contained within the permutation group S_4.

Symmetric is also good, but it has a very distinct geometric interpretation. If we can avoid overloading terminology in a confusing way when designing these names, we should do so.

Perhaps pair-independent is a distinctive name for associative...

Well, introducing the new term “Abelian” certainly avoids overloading terminology in a confusing way :-)

“Pair invariant” IMO, isn’t good, certainly worse than “Pairing invariant”. “Parentheses invariant” might work, but of course would get just as confusing/incorrect as “pair invariant” once one moves from groups to fields.

IMHO, people don't get philosophy. There's philosophy about the "inner side of things", that uses cryptic symbolism with ten layers of meanings, and philosophy for the public. Only the latter is taught in schools and discussed. All these famous philosophers of the past - Plato, Aristotle, and even the modern Kant - were known first for their contribution to that "inner" philosophy, and coincidentally they also left a few texts comprehensible by the public, and only those texts are discussed. There are famous philosophers that only cared about the "inner side of things", there are statues to honor them, their books are kept by some libraries and individuals who recognize significance of those books, but you wouldn't hear their names, unless you do some research.

Modern science has the same dualism: some scientists bother to educate the public and simplify ideas for them, while some don't and only leave incomprehensible cryptic math. Einstein is a good example: GTR is what made him famous, but he got his public award (the Nobel prize) for some side research (the photoelectric effect).

I'd recommend starting with Plato and doing some research about who he really was. Plato's Wikipedia page is a good example of what I'm talking about: lots of attention to insignificant details and an attempt to present Plato as some kind of "inventor of dialogue techniques."

To be honest, you are not writing clearly, and this raises my crank/quack alarms. If you have a concrete text that discusses this plainly, even if it's a full book, I'd be glad to hear it.

Just saying that it's some mystery, some forbidden secret knowledge, which mere mortals don't understand, is not convincing.

I have never heard of the phrase "the inner side of things". If this has a meaning, surely someone has written about it.

I think the incomprehensibility of high level mathematics is as much a planned feature as is incomprehensibility of Haskell to a common programmer. And yet, people who know Haskell find it actually more expressive than e.g. Java, precisely because it is more abstract, and neatly unifies some concepts.

I think what I realized from being a programmer is that, just like in programming, there is no single language of mathematics. Different sub-fields invented different languages, and since everybody wants to remain productive, the effort to unify the languages is extremely difficult and unrewarding. Yet it is a much more accepted activity in mathematics than in programming.

But it's not really intentional; it's just a side effect of the natural tendency of various research groups to develop somewhat different languages.

>Anyone can disagree of course but just looking at the political affiliation ratios of educators makes it clear that one aspect of the hard sciences historically has been an ability to stay focused on utility despite post-modernism.

Can you explain this further? How does political affiliation of educators have anything to do with the focus on "utility despite post-modernism"? Further, isn't there some utility in a discussion on the meaning, importance, and place of 'utility'?

>they've just dropped the pretense of being truth seeking persuits.

Is there any evidence or reasoning to persuade the reader of this statement?

>but for a long time philosophy resisted efforts to 'mathmatecize' (my own word) the field because of concern that it would become inscrutible to outsiders

This isn't the case in my experience; rather, philosophers were concerned about the implications in terms of the kinds of arguments they can make and their relevance if the field were 'mathematized' - you may not be aware, for instance, of 'analytical Marxism', which proposes that mathematical models are a superior way to continue Marx's project - but it is not without its critics on methodological grounds[0]. As you would expect, mathematicians and quantitative economists working within analytical Marxism share some 'politically polarized' views too. Where is there space for the 'utility' of a political-economic project within your schema?

"The conception of mathematics as a mere language contains, however, the seeds of its own destruction. The notion of language as a simple medium through which ideas are communicated has been challenged from diverse perspectives; it has been reinterpreted as both constitutive of, and constituted by, the process of theorizing (e.g., by Williams 1977, 32). The use of mathematics in social theory, too, may be reconceptualized as a discursive condition of theories, which constrains and limits, and is partly determined by, those theories. Mathematical concepts, such as the equilibrium position associated with the solution to a set of simultaneous equations or the exogenous status of the rules of a game, partly determine the notions of relation and causality among the theoretical objects designated by the theories in which the means of mathematical formalization are utilized."

[0] As Amariglio & Callari said, claims of rigor and clarity may just as well be used as rhetorical devices, in exactly the same way Descartes proceeded with rhetoric to 'prove' his own mind and God.

Working in theoretical physics (quantum computing) I think about this a lot: the unreasonable effectiveness of mathematics. I see two ways of understanding this. (1) That mathematicians are in the game of building powerful tools, and therefore these tools inherently capture a wide range of phenomena. (2) When confronted with a difficult problem we become the drunk person looking for their keys under the street light, even if we know the keys are elsewhere. In this case, the street light is whatever good calculations we have at hand, ie. mathematics.

A nice example is that Gauss realized in 1805 that Fourier transforms could be made less tedious by subdividing the problem (n=12=3*4 in his case), 160 years before Cooley-Tukey.
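The subdivision Gauss used is the same divide-and-conquer idea behind the Cooley-Tukey FFT. A radix-2 sketch in Python (this assumes the input length is a power of two; Gauss's own n=12 case would use mixed radices):

```python
import cmath

def fft(x):
    """Recursive radix-2 FFT; len(x) must be a power of two."""
    n = len(x)
    if n == 1:
        return list(x)
    # Subdivide into two half-size transforms over even/odd samples.
    even = fft(x[0::2])
    odd = fft(x[1::2])
    out = [0] * n
    for k in range(n // 2):
        # Twiddle factor combines the two half-size results.
        t = cmath.exp(-2j * cmath.pi * k / n) * odd[k]
        out[k] = even[k] + t
        out[k + n // 2] = even[k] - t
    return out

# DFT of a unit impulse is all ones.
print(fft([1, 0, 0, 0]))
```

Each level splits one size-n transform into two size-n/2 transforms plus n/2 twiddle multiplications, which is what brings the cost from O(n^2) down to O(n log n).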

Yes, but Gauss was also a perfectionist, and he didn't publish anything that wasn't up to his standards. He even discovered non-Euclidean geometry at some point but didn't think it was worth publishing, so Bolyai's son János (and also Lobachevsky) had to re-discover it.

> Bolyai’s son János was also a mathematician. In 1832, János published his brilliant discovery of non-Euclidean geometry. His father, overjoyed that his son might have achieved something worthy of praise from Gauss, the man he admired more than any other, asked Gauss for his view of the work.

There was a bit of a feud and I don't know if it was ever properly settled

> Whether Gauss actually fleshed-out non-Euclidean geometry as comprehensively as Bolyai and Lobachevsky is uncertain.

As much as we talk about math literacy here (compared to other platforms anyway), I can't help but feel like the incomprehensibility of high level math to laymen is a feature. It's not talked about outside of academia often, because why would it be, but for a long time philosophy resisted efforts to 'mathmatecize' (my own word) the field because of concern that it would become inscrutible to outsiders. As a result, academic philosophy is much more politically polarized than math. Where would we be if every step forward in mathematics was met with backlash for not paying adequate concern to some arbitrary set of external implications? Anyone can disagree of course but just looking at the political affiliation ratios of educators makes it clear that one aspect of the hard sciences historically has been an ability to stay focused on utility despite post-modernism. I'm probably being unfair though, the humanities continue to produce persuasive arguments (persuasive emphasis being one of the first things most lit and speech majors learn during critical thinking and writing courses in college) they've just dropped the pretense of being truth seeking persuits.

I might see it this way because I've been studying math for a long time (though not professionally most of it) but I don't think modern math is ever made more complicated or hard to understand just for the heck of it. That's because math is already felt as extremely hard for most people.

I think everyone in professional math is working hard to make the field more comprehensible to themselves, and so to the world - the reason being that the more a mathematician understands, and the more compactly they understand it, the further they can go.

And while the curt way mathematicians put things certainly makes it harder for the layman, I think the biggest harm to the layman's understanding of math is a math education that conditions people not to think abstractly and to see math as a series of dull exercises.

I agree. If some mathematicians purposefully obfuscated certain topics or made them more complicated than necessary, it would create an arbitrage opportunity for others to clarify and simplify those topics.

There are definitely tangible rewards for those who can clarify and simplify a topic, because it can lead to discoveries and insights.

This gradual clarification of a topic hinges on the actual importance of the topic. If a topic is not important, I can see people getting away with obfuscating or complicating results.

Respectable, professional mathematicians don't obfuscate. But there are definitely math professors at non-top universities who get a kick out of torturing students and think "this is supposed to be hard", etc. I'm definitely a math enthusiast, but I have personally met people like that. Typically their exams are under strong time pressure and are more about rote learning of fixed types of exercises and remembering definitions. On the other hand, I had math profs who radiated curiosity and fun, and their exams were just a few questions; you had to think to solve them, but you weren't under much time pressure.

So while mathematicians aren't purposefully obscure, the math people (teachers) the average person interacts with are sometimes purposefully obscure.

> As a result, academic philosophy is much more politically polarized than math.

I don't think this is a clear conclusion at all. Philosophy is inherently very political (all political ideas ultimately come from philosophy, after all). Philosophy is also concerned with the real world and human affairs much more than abstract math is.

It's also funny that you bring up post-modernism, a philosophical current famous and often ridiculed for being one of the most deliberately obfuscated.

I also know that a lot of the efforts at bringing mathematical thinking and abstraction into the humanities end up exactly in obscurantism: most of the field can't understand it, but it has cultural clout, so they can't admit that, and it becomes a source of empty prestige, with maths literally sprinkled onto studies just to access that clout. Even in medicine, math-y papers are/were often accepted unconditionally, with peer review unable to check the actual maths but unwilling to admit its lack of mathematical culture. As an example, there was a medical article published in 1993 [0] which is a re-discovery of the trapezoidal method of integration. It got around 100 citations.
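For context, the method in question is just the trapezoidal rule for unevenly sampled data; a minimal sketch (the sample times and values below are made up):

```python
def trapezoid(ts, ys):
    """Area under the piecewise-linear curve through the (t, y) samples."""
    return sum((t1 - t0) * (y0 + y1) / 2
               for (t0, y0), (t1, y1) in zip(zip(ts, ys), zip(ts[1:], ys[1:])))

# e.g. measurements taken at uneven times (made-up numbers):
ts = [0, 30, 60, 120]
ys = [5.0, 7.5, 6.0, 5.5]
auc = trapezoid(ts, ys)  # "area under the curve" -> 735.0
```

Nothing more to it than summing trapezoid areas between consecutive samples.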

Of course, that article is at least correct. In postmodern literary studies, there is a famous example of an intentionally meaningless, but very abstract and complex sounding article that was successfully published in one of the leading journals, despite ultimately being gibberish [1].

My ultimate point being: I don't think there is any link between the obscurity of a field and its resistance to internal or external politics.

[0] http://care.diabetesjournals.org/cgi/content/abstract/17/2/1...

[1] https://en.m.wikipedia.org/wiki/Sokal_affair

In philosophy 101, my professor said that philosophy was focused on controversial topics because once you could prove a side, it would become science. E.g. the heavens used to be a philosophic topic until we had the ability to prove heliocentrism.

Math is much more focused on the realm of what can be proved. Unproven statements in math (e.g. conjectures or axioms) are the exceptions that prove the rule.

Professional mathematicians spend their lives working on the unproven. It just seems like math is about what's already been proven because it takes so much education to reach the unproven ideas and unanswered questions.

Obfuscation has been a practice of the entirety of academia for a long while now; the art of obfuscation is not new at all. A small anecdote: I remember a Taiwanese friend mentioning the history of the complexity of Chinese characters and how they were devised, in part, to ensure that the lower class(es) could not learn them.

Obfuscation inherently guarantees a lot of things out of the box: you appear more enlightened, only a class of people can understand and thus judge your work, and all newcomers can only become "in the know" if and only if they are trained by someone who already is "in the know". This is very appealing to anyone with even minor insecurities (most people), and has been widely utilized throughout many areas of human society in time. Math is merely one example where this practice has presence.

EDIT: I will admit that the more the internet has democratized knowledge, and the more prevalent higher degrees become in society, the more awareness people have of this issue and the more it gets called out. So, all in all, there is a positive side: this art of obfuscation is in decline for things where it should never have been used (like education, science, etc.).

I'd say this is generally not true, at least in math. There's definitely poor notation and misnomers that come from tradition, but it's hardly deliberate as "obfuscation" implies.

I'm not sure I agree (I have a PhD in maths). There are many maths papers and books I've tried to understand where the exposition seems to have been deliberately made more difficult than it needs to be. Sure, you could argue that I didn't have the necessary prerequisite knowledge at that point, or that the authors simply weren't good at explaining things, but I do get the impression that it's sometimes more than that.

Using specialised meanings for words without definition or reference (words that are used for other purposes elsewhere so they can't be searched for) is a key problem, as is the same with notation, and as is leaving out crucial steps. If you're already in the know, those things are probably clear.

Most people suck at explaining. So the fact that most scientists suck at explaining is like... yeah, that's what naturally happens, unless you actively reward good explanations.

The perfect case would be if someone is great at research and also great at explaining. But great skills are rare, and specific combinations of great skills are even more rare. So you will have a scientist or two who are great explainers at the same time, but then you have a lot of scientists who are great at research but suck at explaining, and also a lot of people who are great at explaining but suck at research. How does academia reward them?

The ones with both skills will get the highest rewards: they will invent lots of cool stuff and describe it clearly, so they will get many citations. But there are only a few such people per generation. The ones who are good researchers but suck at explanation can still invent something and publish the type of paper you complain about; and they will get a (smaller) reward for doing so. The ones who are good at explaining but suck at research... will probably be fired.

There are a few things that a person whose only skill is explaining could do in academia. They wouldn't get a reward for explaining someone else's paper clearly, but they could get a reward for writing a meta-review that explains multiple papers clearly. In theory, their scientific contribution would be in comparing the papers, but in practice, hopefully even people who only needed one of the papers would cite them as a reward for making it easier to understand. I don't know if this would actually be sufficient to survive academically.

People who are good at explaining can also make money by writing popular textbooks. The question is whether there is sufficient demand for a popular textbook explaining some obscure math.

> They wouldn't get a reward for explaining someone else's paper clearly, but they could get a reward for writing a meta-review that would explain multiple papers clearly.

I see tremendous value in such work, and I think it is one of the biggest current bottlenecks in the sciences. We've got the cutting-edge stuff happening in papers, and we've got 5-10-year-old knowledge that makes it into textbooks, but in the middle there is nothing. Everyone is working on their own to catch up. We're talking thousands of grad students independently banging their heads against the same problems...

If there were more "review journals", things would drastically improve. Maybe the NSF could provide some funding for such efforts? I'm not sure what the metrics would be for this---not citations; we'd need something more like views or upvotes, as on HN. The other option would be for grad students to self-organize into "reading clubs" on different subjects and share notes.

Explainers-first shouldn't be fired if they make up a small percentage, because they do a wonderful job without the bulk of resources.

I'm not sure if explainers-first are the minority, though. After all, it's much easier to review than to create. And we do want new research.

Firstly, let's not fall for a false dichotomy. There is likely a simple equilibrium of review and creativity that must be maintained for an optimally healthy scientific community. Obviously, the old style of numerous journals plus a publish-or-perish mentality was an unhealthy implementation of a scientific system.

But I would move away from papers entirely. They hold no redeeming values over alternatives like a git repository published publicly with an "issues" forum (e.g., GitHub, etc.). This absolutely needs to become the new standard, and we need to get rid of papers and journals, both physically and digitally, in their entirety. They have zero redeeming factors at this point in time except to boost the ego of the published. Quality control can easily be decentralized and made available to everyone using something like GitHub. We know this because that is literally what the security and trust of open-source software is predicated on: putting everything in the open.

Modern math papers also love cryptic symbolism. I've been doing some research and reading papers on a certain topic from the 50s to the present day. The difference in math presentation is startling: a paper from the 50s really tries to explain things, e.g. it uses simple partial derivatives and some simplifications because that's just enough for the paper; a paper from 2020, on the other hand, describes the very same math as a cryptic sequence of triangles and dots, and the reader really needs to be familiar with the context to understand what this compressed formula expands into. I believe the main motivation behind this is the desire to look sophisticated, so anyone reading the author's paper would be impressed at a glance by the complexity of the math.

Like ∇^2 u = Δu = f? It's not really cryptic, it's taught in multivariable. And it's not done to look sophisticated, it's so that you can instantly recognize that it's Laplace, or Poisson, etc.

If I tell you B = (dA_z/dy - dA_y/dz)i + (dA_x/dz - dA_z/dx)j + (dA_y/dx - dA_x/dy)k do you instantly see what operator is applied? That's why you just write B = ∇×A.
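For what it's worth, that expansion is easy to sanity-check numerically; a pure-Python sketch with a test field of my own choosing (A = (0, xy, 0), so ∇×A should be (0, 0, y)):

```python
def curl(A, p, h=1e-6):
    """Numerically evaluate (nabla x A) at point p via central differences.
    A is a function (x, y, z) -> (Ax, Ay, Az)."""
    def d(i, j):  # partial of A_i with respect to coordinate j, at p
        qp = list(p); qp[j] += h
        qm = list(p); qm[j] -= h
        return (A(*qp)[i] - A(*qm)[i]) / (2 * h)
    return (d(2, 1) - d(1, 2),   # dAz/dy - dAy/dz
            d(0, 2) - d(2, 0),   # dAx/dz - dAz/dx
            d(1, 0) - d(0, 1))   # dAy/dx - dAx/dy

A = lambda x, y, z: (0.0, x * y, 0.0)   # curl A should be (0, 0, y)
B = curl(A, (1.0, 2.0, 3.0))            # approximately (0, 0, 2)
```

The compact ∇×A and the component-by-component formula are the same thing; the short form just lets you recognize the operator at a glance.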

People presenting ideas in shorter notation is not showing a lack of progress - usually it's the opposite. In old optimization papers people would write out the entire normal equations, entry by entry. Not really insightful at all, compared to A'Ax = A'b.

That's a trivial triangle. Modern papers go far beyond that and invent partial diff operators, like "F.∇", which means "do the grad first, then the dot product", use uncommon context-dependent subscript and superscript indices that can mean fairly arbitrary things, and so on. But my complaint is mostly about the use of unnecessary, terse formalism, where a short full form is replaced by a 40%-shorter form just to use an über-generic higher-order operator to look fancy and refer to a couple of mysterious (and useless in the context) theorems or other papers. That's when I have to go read other similar papers to see what that cryptic equation really means (and often it means some mundane thing). Programmers have the same kind of mental problem: in order to look sophisticated, they intentionally use more cryptic and terse solutions to impress the "less initiated" and to hide the fact that what they are doing isn't that complex.

This seems to be presenting a false dichotomy. B = (dA_z/dy - dA_y/dz)i + (dA_x/dz - dA_z/dx)j + (dA_y/dx - dA_x/dy)k and B = ∇×A are not the only ways, out of all possible strings, to encode that information. This is the power of modern programming languages. They are mathematics with much better syntax.

Can you point us to that specific pair of papers? I have never seen a debauchery of triangles and dots used to write a PDE.

But that sounds like bad exposition. What benefit would any author get from purposely obfuscating what they're trying to share?

Every author has their own motivations, but I will echo what the parent has expressed: often, if not the majority of the time, authors make their papers far more difficult to read than they need to be.

To that, I'll add an anecdote. Some years back, I had a fight with a coauthor on a paper who wanted to remove the majority of the substeps within a proof. He believed that it was condescending to include such material and that any interested reader should be able to derive it for themselves. In fact, he asked a colleague in the room at the time, who agreed with him. I stated flatly that, as the primary author, I could not understand nor complete these proofs without those steps, so the material stayed. I contend that at least in this one particular case, a combination of exclusivity and arrogance led to an attempt at obfuscation.

Remember, publishing isn't just about sharing knowledge. It's also a way to posture, advertise, build a brand, and get grant money. Beyond that, mathematicians are still people with all the imperfections that implies.

Could well be, and I have no definitive evidence otherwise, but sometimes it's hard to imagine anyone being that bad at exposition. Some "benefits" might be seeming smarter and more knowledgeable than they are, making it harder to exactly replicate their methods (most relevant in applied areas), or citations by intimidation ("I don't understand this, but it seems impressive and relevant, so I better cite it.").

They all sound plausible and likely. There are people obfuscating knowledge for all the reasons you mentioned.

At the same time, it doesn't mean there's a concerted effort to obfuscate mathematics.

Explaining things is hard, because you have to create a clear picture of the audience in your mind. Then, you have to draw the lines between what they know and what you want to introduce. The curse of knowledge is a big obstacle here.

Same for the Chinese characters. The complexity (the number of characters and their different readings and meanings) stems from all the historical cruft the system accumulated over millennia, not from something purposefully engineered to be obscure. A lot of work went into keeping the thing usable.

The other thing is that at the time, yes, the people who could read and write were a small minority. That has been the case in other societies with writing too. Not because of a cabal making it difficult, but because getting an education was very costly.

I'd pay to research that topic: how people enforced gatekeeping out of fear, and how much of it is useful (because sometimes cutting people out can reduce the paradox of choice, which can drag things to a halt) or not.

ps: I'm also very curious about how much obfuscation is just an icing of fetishism [0] on top of a very natural compression tendency (old mathematics papers were written in full words and I can agree that the meaning can be lost on the way and part of the beauty of mathematics is the short symbols that cover a large set of possibilities).

[0] too much pride and a bit of confusion even on the value of notation leading people to think it's more important than intuition and the concept described.

Many of the early mathematicians were mystics. So in a way the incomprehensibility is a feature and not a bug but for those that are well versed in the language it is a rich source of analogies and metaphors.

> "Good mathematicians see analogies. Great mathematicians see analogies between analogies."

To see analogies between analogies requires compressing concepts into symbols that at first glance are not as comprehensible as regular words and sentences but math is really just another language and people can learn it and apply it fruitfully.

--

https://en.wikipedia.org/wiki/Stefan_Banach#Quotes#:~:text=G....

Seriously, how intellectually compromised and paranoid do you have to be to ascribe these weird motivations onto the entire field of mathematicians???

They’re not a cabal; quite the opposite, they fight like schoolchildren (look up Leslie Lamport’s view on natural deduction, especially OR-elimination).

The conflict between philosophy and mathematics that you describe is generally understood as the debate around logical positivism, the adoption of which necessitates an extremely naive philosophy of scientific knowledge.

You can even throw Gödel/Turing into this, which really finalizes any dumb debate about formalizing philosophy as math/logic, but at no point is there any proof of a conspiracy to keep mathematics off-limit for “normal people”.

Like, it’s literally published in journals, and it is easily followed, as long as you actually spend the time to understand the fundamentals it’s built on.

As an exclusive club, it's a feature. As a pursuit of knowledge, it's a tragedy.

There are many highly intelligent people who differ in their ability to absorb mathematical literature. Consider, perhaps, whether the gender imbalance is caused by the time needed to take care of children and a household - time which mathematics in its current state unfairly demands.

Human brains are structurally not very different. If raw compute is about the same across individuals, then the progress of academia relies on people with highly differing forms of intelligence. An obvious example is people on the spectrum with seemingly absent emotional intelligence but vast capacity for abstract creativity. There should be many more types of intelligence available to us, given a better communication protocol.

Since I'm firmly in the camp of gaining knowledge for the benefit of mankind, I think the camp which instead of descriptive names gives names like "Hausdorff" or "Abelian" to important mathematical concepts are intellectual looters and a detriment to the field. As you seem to say we are in the same camp, I hope you see the conflict in gatekeeping.

What descriptive alternative would you use for "abelian group"?

commutative group (which happens to be another name they often go by).

Which is just a synonym for abelian.

You're not about to guess what a 'commutative group' is from its name any more than you're going to guess what an 'abelian group' is, unless you have already learned what commutative or abelian means in the specific context of algebra.

I'm always baffled that programmers, who freely adopt frameworks and languages with mostly meaningless names (which incidentally makes them easier to google for), continue to insist that names are the major stumbling block in learning math. It's utter nonsense. You pick up the names and terminology quickly enough (or look them up if you forget); the hard part is applying them.

Sure, if you're actively working in that subfield. I took abstract algebra in university and remember we covered groups and commutative groups, but I can't tell you exactly what rings were or what "Abelian" referred to. (But just as knowing the names of things is not the same as knowing the things, forgetting these labels is not the same as forgetting the material.)

Naming things is important. Yes, it just takes one googling, but in many cases you are interfacing with many different topics, and being able to read a useful descriptive name straight up would let you keep your flow without having to look things up.

Maybe this is more of a stumbling block for some than others. Often I find I have to make up mnemonics and other strategies like "the longer word is the one that ..." or "put the two terms in alphabetical order to match their descriptive names' alphabetical order".

Again, this doesn't happen for terms we use every day. But if you use it every year or so, it's a stumbling block to have to ask "which one was that again?" when the descriptive name would immediately clear it up.

I think it's mostly vanity and a "respecting the elders" and credit assignment thing that so many things are named for mathematicians instead of descriptive names. Having something named after you is like one of the biggest "awards" a mathematician can get. But this has no concern for didactics.

No, to commute is plain English. Abelian is not.

I doubt many laymen would be able to infer the mathematical meaning of “commutative” from today’s meaning of “to commute”. https://www.merriam-webster.com/dictionary/commute defines it as ”change, alter”, “convert”, or “compensate”, or as a synonym for “to commutate”, with the meaning “to reverse every other half cycle of (an alternating current) so as to form a direct current” (first used, according to that dictionary, in 1890; https://projecteuclid.org/download/pdf_1/euclid.rmjm/1181070... claims Kronecker introduced the term “Abelian group” in 1870).

https://dictionary.cambridge.org/dictionary/english/commute doesn’t give any definition close to the mathematical meaning, either.

Unless dictionaries 150-ish years ago had significantly closer descriptions of the term, “Abelian”, being a new term not loaded with pre-existing definition might be the better choice for naming this property.

"Commutative" communicates something. "Abelian" communicates nothing.

"Order-independent" like I suggested communicates a lot.

Does it? I think that could describe “associative” equally well as “commutative”.

Edit: “symmetric” might be an alternative for Abelian because the Cayley table (https://en.wikipedia.org/wiki/Cayley_table) of an Abelian group is symmetric. I’m not sure that’s immediately clear enough for laymen, though.
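The Cayley-table view is easy to make concrete; a small Python sketch (the group encodings below are my own choices):

```python
from itertools import permutations

def is_abelian(elements, op):
    """A group is abelian/commutative iff its Cayley table is symmetric."""
    return all(op(a, b) == op(b, a) for a in elements for b in elements)

# Z_4 under addition mod 4: commutative
print(is_abelian(range(4), lambda a, b: (a + b) % 4))        # True

# S_3, permutations of (0, 1, 2) under composition: not commutative
def compose(p, q):                      # (p o q)(i) = p[q[i]]
    return tuple(p[q[i]] for i in range(len(q)))

print(is_abelian(list(permutations(range(3))), compose))     # False
```

The symmetry of the table is literally the statement op(a, b) == op(b, a) for every pair.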

"Symmetric group" is already taken. Given a set S, the symmetric group G_s is the group of all permutations of S. Not to be confused with a permutation group of S, which is a group of some permutations of S.

I suppose we could rename "symmetric group" to "maximal permutation group", but stringing adjectives together is not a sustainable strategy for naming. Plus, it is not clear to me how accurate "maximal permutation group" is as a name. That is, the symmetric group S_3 is the maximal permutation group of {1,2,3}. However, it is not generally a maximal permutation group, as it is contained within the permutation group S_4.

Symmetric is also good, but it has a very distinct geometric interpretation. If we can avoid overloading terminology in a confusing way when designing these names, we should do so.

Perhaps pair-independent is a distinctive name for associative...

Well, introducing the new term “Abelian” certainly avoids overloading terminology in a confusing way :-)

“Pair invariant”, IMO, isn’t good - certainly worse than “pairing invariant”. “Parentheses invariant” might work, but of course would get just as confusing/incorrect as “pair invariant” once one moves from groups to fields.

"Associative" doesn't provide any information to avoid confusion when dealing with mixed operators either.

Somehow including "peer" in the terminology could help.

And same for "Hilbert space" please :)

The set of an order-independent operator, perhaps.

IMHO, people don't get philosophy. There's philosophy about the "inner side of things", that uses cryptic symbolism with ten layers of meanings, and philosophy for the public. Only the latter is taught in schools and discussed. All these famous philosophers of the past - Plato, Aristotle, and even the modern Kant - were known first for their contribution to that "inner" philosophy, and coincidentally they also left a few texts comprehensible by the public, and only those texts are discussed. There are famous philosophers that only cared about the "inner side of things", there are statues to honor them, their books are kept by some libraries and individuals who recognize significance of those books, but you wouldn't hear their names, unless you do some research.

Modern science has the same dualism: some scientists bother to educate the public and simplify ideas for them, while others don't and leave only incomprehensible, cryptic math. Einstein is a good example: general relativity is what made him famous, but he got his public award (the Nobel Prize) for some side research.

Can you link some texts that are about the "inner side of things"? Not sure what you exactly mean by these words.

I'd recommend starting with Plato and doing some research into who he really was. Plato's Wikipedia page is a good example of what I'm talking about: lots of attention to insignificant details and an attempt to present Plato as some kind of "inventor of dialogue techniques."

To be honest, you are not writing clearly, and this raises my crank/quack alarms. If you have a concrete text that discusses this plainly, even if it's a full book, I'd be glad to hear it.

Just saying that it's some mystery, some forbidden secret knowledge, which mere mortals don't understand, is not convincing.

I have never heard of the phrase "the inner side of things". If this has a meaning, surely someone has written about it.

I think the incomprehensibility of high level mathematics is as much a planned feature as is incomprehensibility of Haskell to a common programmer. And yet, people who know Haskell find it actually more expressive than e.g. Java, precisely because it is more abstract, and neatly unifies some concepts.

What I realized from being a programmer is that, just like in programming, there is no single language of mathematics. Different sub-fields invented different languages, and since everybody wants to remain productive, the effort to unify the languages is extremely difficult and unrewarding. Yet it is a much more accepted activity in mathematics than in programming.

But it's not really intentional, it's just a side effect of natural tendency of various research groups to have somewhat different languages.

So you're thinking that being cryptic acts as a natural filter to keep the noise away?

>Anyone can disagree of course but just looking at the political affiliation ratios of educators makes it clear that one aspect of the hard sciences historically has been an ability to stay focused on utility despite post-modernism.

Can you explain this further? How does political affiliation of educators have anything to do with the focus on "utility despite post-modernism"? Further, isn't there some utility in a discussion on the meaning, importance, and place of 'utility'?

>they've just dropped the pretense of being truth-seeking pursuits.

Is there any evidence or reasoning to persuade the reader of this statement?

>but for a long time philosophy resisted efforts to 'mathematicize' (my own word) the field because of concern that it would become inscrutable to outsiders

This isn't the case in my experience; rather, philosophers were concerned about the implications in terms of the kinds of arguments they could make, and their relevance, if the field were 'mathematized'. You may not be aware, for instance, of 'analytical Marxism', which proposes that mathematical models are a superior way to continue Marx's project - but it is not without its critics on methodological grounds [0]. As you would expect, mathematicians and quantitative economists working within analytical Marxism share some 'politically polarized' views too. Where is there space for the 'utility' of a political-economic project within your schema?

"The conception of mathematics as a mere language contains, however, the seeds of its own destruction. The notion of language as a simple medium through which ideas are communicated has been challenged from diverse perspectives; it has been reinterpreted as both constitutive of, and constituted by, the process of theorizing (e.g., by Williams 1977, 32). The use of mathematics in social theory, too, may be reconceptualized as a discursive condition of theories, which constrains and limits, and is partly determined by, those theories. Mathematical concepts, such as the equilibrium position associated with the solution to a set of simultaneous equations or the exogenous status of the rules of a game, partly determine the notions of relation and causality among the theoretical objects designated by the theories in which the means of mathematical formalization are utilized."

[0] As Amariglio & Callari said, claims of rigor and clarity may just as well be used as rhetorical devices, in exactly the same way Descartes proceeded with rhetoric to 'prove' his own mind and God.

Working in theoretical physics (quantum computing) I think about this a lot: the unreasonable effectiveness of mathematics. I see two ways of understanding this. (1) That mathematicians are in the game of building powerful tools, and therefore these tools inherently capture a wide range of phenomena. (2) When confronted with a difficult problem we become the drunk person looking for their keys under the street light, even if we know the keys are elsewhere. In this case, the street light is whatever good calculations we have at hand, ie. mathematics.

A nice example is that Gauss realized in 1805 that Fourier transforms could be made less tedious by subdividing the problem (n=12=3*4 in his case), 160 years before Cooley-Tukey.
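The subdividing trick is exactly what Cooley-Tukey later systematized. A minimal radix-2 sketch in Python (Gauss's n = 12 = 3*4 split is mixed-radix, but the idea is the same; the naive DFT is included only as a check):

```python
import cmath

def fft(x):
    """Recursive radix-2 Cooley-Tukey; len(x) must be a power of two."""
    n = len(x)
    if n == 1:
        return list(x)
    even, odd = fft(x[0::2]), fft(x[1::2])
    tw = [cmath.exp(-2j * cmath.pi * k / n) * odd[k] for k in range(n // 2)]
    return [even[k] + tw[k] for k in range(n // 2)] + \
           [even[k] - tw[k] for k in range(n // 2)]

def dft(x):
    """Naive O(n^2) transform, for checking the fast version."""
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * cmath.pi * k * t / n) for t in range(n))
            for k in range(n)]
```

For, say, x = [0, 1, ..., 7], the two agree to around machine precision, but the recursive version does O(n log n) work instead of O(n^2).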

Yes, but Gauss was also a perfectionist, and he didn't publish anything that wasn't up to his standards. He even discovered non-Euclidean geometry at some point but didn't think it was worth publishing, so Bolyai's son János (and also Lobachevsky) had to re-discover it.

> Bolyai’s son János was also a mathematician. In 1832, János published his brilliant discovery of non-Euclidean geometry. His father, overjoyed that his son might have achieved something worthy of praise from Gauss, the man he admired more than any other, asked Gauss for his view of the work.

There was a bit of a feud and I don't know if it was ever properly settled

> Whether Gauss actually fleshed-out non-Euclidean geometry as comprehensively as Bolyai and Lobachevsky is uncertain.

--

https://www.famousscientists.org/gauss-and-non-euclidean-geo...

As I said the first time this topic was posted:

This article is ironic, coming from such a math-phobic publication as Nature:

http://www.dam.brown.edu/people/mumford/blog/2014/Grothendie...

Why is it ironic, and why are they math-phobic?