> But why companies keep trying to push new languages to massive adoption.
How many languages has this happened with, and how often? Go, Rust, Kotlin, Swift, Julia, and TypeScript form a generation of newer programming languages, and they're between roughly twelve (Go) and seven (Swift) years old. Each of them offers something new compared to its predecessors. Almost every other commonly used language is from the 80s or 90s.
There are other, more niche languages, but they fall under your point about research, experimentation, doing stuff for fun, etc. Just because some language pops up in blog posts or on HN every once in a while doesn't mean it has wide adoption.
Is any of this really that much? Learning a new language once every few years shouldn't be that challenging, especially for an experienced developer.
Hobbyists create new languages because it's fun, or, in the earlier stages, because they think they can make something better than existing languages.
Everyone still creates DSLs because they are more effective at solving certain problems than general-purpose languages.
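A familiar illustration of that point: regular expressions are a DSL embedded in nearly every general-purpose language. A minimal sketch in Python (the `is_iso_date` helper is hypothetical, written here only for contrast):

```python
import re

# The DSL version: one line states *what* the pattern is.
pattern = re.compile(r"\d{4}-\d{2}-\d{2}")  # ISO-style date, e.g. 2021-03-15

# The general-purpose version: longer, and easier to get subtly wrong.
def is_iso_date(s: str) -> bool:
    if len(s) != 10 or s[4] != "-" or s[7] != "-":
        return False
    return all(c.isdigit() for i, c in enumerate(s) if i not in (4, 7))

assert pattern.fullmatch("2021-03-15")
assert is_iso_date("2021-03-15")
```

Both do the same job, but the DSL makes the intent legible at a glance, which is exactly why people keep building them.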
People create new languages (for the most part) when it helps them substantially improve their capability to solve the problems they care about (i.e. "delivering value").
If you don’t feel the pain of those problems enough to understand why the improvements matter, then you can happily ignore the new language hype.
I can’t comment about the JS ecosystem, but every new language (~half dozen) I’ve learned has given me new perspectives on solving problems, and sometimes the language itself serves as a better tool I reach for frequently. This may be somewhat biased by the fact that the problems I’m most interested in are typically algorithmic/mathematical in nature, and YMMV for different classes of problems.
People underestimate how much work it takes to create a successful language. Creating the language itself is easy (I have created plenty just for fun), especially with LLVM available. The hard part (99% of the work) is the rest: libraries, documentation, standards, etc. The language's environment, libraries, and documentation matter far more for developer productivity than the language itself.
Sometimes having your own language simply fits into the larger ecosystem better than any existing language would. Several of the major languages of the last couple of decades were about ecosystems more than the language itself. See C#, which is basically Java, but for .NET: once the decision was made not to use the Java standard library or VM, why try to adapt Java directly? Or see Swift, which meshes well with the Apple ecosystem, with language- and compiler-level support for the reference counting Apple favors these days. They could have used something like a strict subset of C++ with their own libraries; in fact, they did originally, but it was pretty kludgy as I understand it. And they were already using their own compiler, so might as well use your own language?
No single language is ideal for all tasks, so new languages often get invented to optimize for a particular sort of task.
Developing a new language is a significant investment, though, so it's natural for companies that do so to want to commercialize it and get it used by as many people as possible.
>Instead of delivering the value and focusing on the task, on the product we spend half of the time just learning yet another new super puper technology.
You shouldn't, in the vast majority of cases. You should choose the language best suited to the task at hand and use that. Chasing the new and shiny is tempting (it's fun!), but it rarely makes engineering or economic sense.
I think there are three (arguably four, depending on how you categorize) major programming paradigms. If you are competent in one, then learning other languages in the same paradigm is pretty easy because it's really just a change in syntax.
Learning a different paradigm is a bit more difficult because it requires a change in how you conceptualize the problem.
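As a small illustration of that conceptual shift (sketched in Python so both styles appear in one language): the same computation framed imperatively, as step-by-step mutation of an accumulator, versus functionally, as a composition of transformations with no mutation:

```python
nums = [3, 1, 4, 1, 5]

# Imperative: describe *how* -- initialize state, then mutate it in a loop.
total = 0
for n in nums:
    total += n * n

# Functional: describe *what* -- map each element to its square, then fold.
total_fp = sum(n * n for n in nums)

assert total == total_fp == 52
```

The syntax difference is trivial; the real work of learning a new paradigm is internalizing the second way of framing problems.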
In the big picture, the evolution of PLs has been a reach for higher levels of abstraction.
People used to program everyday business apps in arduous things like assembly language, C++, etc.
But the search has staggered in various directions and taken missteps half the time, due to historical accidents and corporate games. Look at, e.g., the current primitive and fragmented state of GPU languages for a glaring example.
A good way to understand this is to look at the problems the languages themselves claim to address, then fan out and see how people struggle to address those same problems in other languages where similar work is done. A lot of the time the language itself isn't the goal or the end product, but approaching it from that perspective helps frame less exploratory approaches in a more conventional environment.