My personal experience with data scientists at startups is that they're hired much, much too early, when the product has far more fundamental issues to solve, and only because it looks cool to say you're doing AI.
In practice, they're often frustrated for years by the lack of infrastructure to work on their ideas, but live with it because life is good.
I suspect AI today is like big data ten years ago: a lot of companies think they need AI, but in fact what they need is a good product and a few algorithms requiring high-school-level maths.
> I suspect AI today is like big data ten years ago
Exactly. Also, as soon as big data came around, nobody was doing just data; everyone was doing big data, even if they had the same 10GB MySQL database they'd had for years.
AI is a bit the same. Doing any analytics? - Now it's AI. Opening an Excel spreadsheet and doing a curve fit - I'm a data scientist doing AI. Doing any actual ML - it's not just learning anymore but super deep learning.
Sometimes tech reminds me of rich, bored, stay-at-home SOs who are constantly redecorating their house. Not because they need it, but because they're bored and the next trendy design looks cool anyway.
It comes from both top and bottom. At the top, managers want to justify their salaries to their managers, so they always redesign / rebuild / reorganize, even if things work pretty well as is. At the bottom, new programmers fresh out of college want to assert themselves. The best way to do that is to claim that everything existing is old and shit and needs to be rewritten. So they volunteer, of course.
His comment while accurate was probably a bit dramatized for effect.
My guess is he is a consultant hired to do a typical project (i.e., help me move X into the cloud, help me re-architect our data model for big data, help me implement AI for our dog walking app) and at that point he just shows them they aren't ready for it or flat out don't need it.
It's just my guess, but it's what consultants do. The bad ones are happy to take on your project and charge you $400 an hour. The good ones, unfortunately, deal with the dilemma of turning down lucrative work in the spirit of doing what's right.
My guess is that at some point you get tired of it and take solace in the fact that you're at least helping them to do it right.
After all, if they think they are going to hit big data scale and want the tools to handle it, you aren't completely a bad guy if you help them do it right, especially if you've already advised them not to do it.
I regret to inform you that my original comment was sarcasm. I'm not a consultant, but would be perfectly willing to do the job as I described. The hardest part would be keeping a straight face while demanding the $20k.
SVM, other kernel methods, Bayesian networks, genetic algorithms, clustering etc.
> Isn't regression always a form of curve fitting?
Sure, and that's been done for many years. I was just saying that today, everyone who was doing that isn't doing curve fitting or regression analysis anymore, but "AI" and "Deep Learning" (doesn't matter if there are no neurons involved).
I think it's a no-true-Scotsman fallacy. We don't have a good definition for ML, so we reject methods and problems as not "real" ML. I've done some logistic regression in my work and hesitated to call it ML. But I've read a survey about tools used by people who do ML, and logistic regression was the top item on that list.
Neurons are just an inspiration from biology. You can call them layers of neurons or you can call them matrices and do matrix multiplication. Nothing special about neurons.
I understand your argument that people like to use buzzwords and you don't like this. But it's a general problem which applies to everything, not just ML.
>My personal experience with data scientists at startups is that they're hired much, much too early, when the product has far more fundamental issues to solve, and only because it looks cool to say you're doing AI.
Not just startups, but big established companies as well. At bigger companies not only do you have data and infrastructure issues, you also have business process and political issues. I've seen more than a few cases where fixing a business process would have a much higher ROI than a model, but it's easier and cheaper to hire a data scientist and make a big noise about it than it is to admit your business process (that includes 50+ people) is a mess and do the work to fix it.
> a lot of companies think they need AI, but in fact what they need is a good product and a few algorithms requiring high-school-level maths
Yep, AI is the new silver bullet that will solve everyone's problems. It seems much easier to throw a million dollars at someone with the right credentials than to make hard choices and build a better product.
That's because you ultimately don't care about building the better product. You care about making an impeccable impression of doing the right thing - in the current market this will get you more capital than your customers will ever pay you.
This is my experience as well, and I'm wondering if this industry could learn something from another one: video games. In that industry, many specialized technical people work on a large project, but their goals and "infrastructure" (seem to be) aligned.
Are the issues we see in many software companies w.r.t. AI/ML/DS an effect of poor role definition/team hierarchy? It seems to me that a five-person team complete with an "engineer", scientist/researcher, a couple of devs for APIs and pretty pictures, and one other "utility" role would totally kill it and create amazing things. But, I've never worked in an environment where the ML people aren't completely segregated into their environment, so I don't know.
They can’t, and don’t. The secret is that experienced PhDs (mostly) dominate the high end of “AI” hiring, but don’t have much title or departmental differentiation from people who are “only” specialized software engineers in top tech companies. But this isn’t very well known, large companies want to recruit as much talent as they can regardless of role, and startups want to compete on paper - therefore, you have the following effects:
1. Startups give whatever sexy title they want to the people they can afford, which makes titles fairly useless, because they almost never can afford the talent commanding “sky-high” salaries.
2. Within large tech companies like Google and Facebook, it’s hard to immediately tell which of the many data sciencey, machine learning-y titles correspond to the truly stratospheric salaries versus the engineers that work with those roles. For some teams, like DeepMind, Google Brain or FAIR, it’s easier to tell. But for others it’s a mixed bag.
For comparison, see the fashionable term of art “quant” in the financial industry, which has similarly devolved into marketing and a bimodal distribution. As a rule of thumb, you generally can’t trust AI titles or salaries at startups unless those startups are really known for their talent; further, you can safely assume that, at companies capable of paying for top talent, the very impressive salaries belong to titles which seem the most exotic and out of reach for general engineers in the job description.
Fortunately startups don’t really need to compete for top talent as a genuine technological differentiator, they just need to engage in signaling, so this is mostly a non-problem. Startups almost never have problems usefully improved by the cutting edge of machine learning research, and can instead use off the shelf tools and existing software to accomplish the same things. Frankly, it’s exceptionally rare for a startup to even have the massive data, pipeline and munging infrastructure requisite for actual research.
This actually makes a lot of sense. I have a lot of friends employed as Data Scientists or Data Engineers who don't seem to be able to explain exactly what they do, or describe what "research" they are doing. From what I understand, it seems like they are designing pipelines that ingest data, run an off-the-shelf AI algorithm and display nice graphs. There are a lot of pipelines you can design for different kinds of data, so I imagine they always have something to do.
In my experience, the best scientists are not motivated by money, at least not beyond enough to provide a comfortable lifestyle. For me it's much more important to work on interesting problems in a good team. Also, startups can provide an exhilarating feeling of 'anything is possible' which you rarely get at a large corporation.
You'd be amazed how little impact spreading that wealth around in the form of salaries will actually make on an individual salary. And on just how much those offices and perks actually contribute to company image.
I strongly disagree. And given that I'm working for a company whose entire purpose is to make money, I fail to see why I should not do the same. Nothing good comes from me forgoing making all that I can make.
You should try to get as much as you can for your family. But the best way to do this is to move up the value chain, not keep trying to squeeze blood out of a stone. You seem dead set on getting more compensation for bringing the exact same amount of value to the table.
My solution to your "fact" is to not work for a bunch of dicks. I've never gotten fired "as soon as it makes sense" once I transitioned into development. You seem to work for a lot of dicks. Stop doing that.
No, I work for business people. The very same type of people you tell me I should gift money by leaving it on the table.
This is going to be the last reply, but you've not made your case as to why I should gift my employer free money by leaving it on the table. There is exactly zero benefit to me for not getting everything I can, and quite a bit of upside. I have also found your double standard regarding the behavior of companies and the behavior of employees, and your handwaving away of that double standard, to be quite insane.
In short, you feel free to do whatever you want, and gift your employer free money. I, on the other hand, am going to take care of myself and my family by getting the maximum value I can out of the time I have to give my employer, and get paid as much as I can.
The original employer-employee relationship was between farmers and strongman warlords. If the strongmen didn't protect the farmers then they didn't get fed and if the farmers didn't produce as much as they could then they ran the risk of getting overrun by guys they didn't have an existing working relationship with.
So yeah, I think that was an inherently civil relationship. Agrarian empires were the original forms of civilization. Just because there's a hierarchy doesn't make it not civilized. Hierarchy is instead what makes it civilized. Hierarchy means that everyone can relax and focus on what's in their wheelhouse.
This is a value for value business relationship. It’s not mercenary for “coders” to capture more of the value that they created in the first place. An employer is not entitled to get cheaper labor just because it helps them get richer.
Like I told the other commenter, if you want to capture more of the value, you need to own more of the business.
Look, if we're talking giant corporations here, I agree with you. The company isn't going to miss another $5k/year. But if you play hardball with a $3M company, then they're only going to keep you until they can outsource your job away. Their budget is what they live and die on, and surplus profit typically gets rolled back into the business, not wasted on dividends.
You're directly affecting company viability by not being willing to leave some money on the table.
I understand what you are saying especially when it comes to a "lifestyle" business that keeps an employee on when they have a bad year. The employee trades some salary for security. I'm not so sure how common that sort of business is though.
You've got the start-ups that are focused on a big exit to pay their investors and let the founders cash out. They (and their VC investors) want their employees to sacrifice/invest/commit "like a founder" - but without the founders' upside.
You've also got the large businesses that will, in your own words, "keep you until they can outsource your job away" for a little more profit.
This is the reality for developers and they have slowly decided to seek their fair share to the consternation of businesses that were used to adding that value to their own bottom line.
When I hear someone saying that "coders" (or sometimes "code monkeys") shouldn't be focused on salary I hear someone who (A) doesn't respect my profession and (B) doesn't see why they shouldn't be able to exploit me.
(When it comes to my own clients, I do leave money on the table. I could justify it by saying that I do it "in the interest of a long-term relationship", but in reality I'm emotionally invested in their success and I want their projects to succeed.)
That's not a bad approach but...from my perspective, why should I work for a company that either doesn't value my contributions or can't capitalize on them? The money I leave on the table doesn't go to charity after all, it goes in someone else's pocket or is invested to the benefit of the business owners.
To paraphrase on old saying "Developers go where they are wanted and stay where they're well treated."
Look at it from a developer's perspective: why should they bust their hump to deliver 20% more value only to be rewarded with a 3% increase in salary? ("Gee Chris, I would love to give more but company policy...") Why shouldn't I go someplace that does value my efforts?
The only way to fix the structural inequality of the employer-employee relationship is to have your own business. It is not a business relationship. It's an evolution on the lord-serf relationship.
> Look at it from a developer's perspective: why should they bust their hump to deliver 20% more value only to be rewarded with a 3% increase in salary? ("Gee Chris, I would love to give more but company policy...") Why shouldn't I go someplace that does value my efforts?
You should not bust your hump. Deploy adroit political acumen to reduce your workload. I can't remember the last time I busted my hump on a development job.
But if you want more money, you absolutely should go somewhere else. What I'm saying is that expecting your existing company to be the vehicle for that advancement is naive at best.
I requested, and got, two large raises at my last job. I was still underpaid at the end of it. I'm underpaid now, even though I got another massive raise when I switched companies.
The reality is, you get a market salary from the market, not from any one company. A company is either going to be open to paying market rates or they won't be. You have to make the decision whether to accept that. Playing hardball with a company that's not prepared to pay you market just won't get you anywhere. Find a company that's prepared to pay market.
I have brought additional value to a company and had them just tell me no. No, they were not going to give me a raise. I had to jump ship to get a raise. Your stance is nice from an ideal CEO perspective, but it never works.
Hm, I think I disagree with this. The famous statistician and scientist RA Fisher said "To consult the statistician after an experiment is finished is often merely to ask him to conduct a post mortem examination. He can perhaps say what the experiment died of." and less-statistically-inclined researchers in academia have often observed this to be true: when statistical/analytical/"data" related considerations are not taken into account during the early stages (design, planning) of a project, it is very difficult (and time and money consuming) to "bolt them on" after the fact. If "AI" (or whatever you want to call it) is going to be a fundamental feature of a product, data scientists (or whatever you want to call them) should be involved right from the very beginning.
This is an interesting subject, because intuitively one wants to make the comparison "Is Conan O'Brien paid a multiple of how many times funnier he is than a local standup?". But economically what's important isn't how funny he is, but how many viewers he can draw. So then you might ask, does Conan draw 30x more viewers than some no-name comic? But that also isn't the comparison that matters. If the no-name comic can draw 10M viewers, but Conan can draw 20M viewers, should he be paid 2x as much? It depends on the costs for the rest of the show. If a show costing $500K to produce and $10K for a no-name host earns $1M, then if they were to replace the no-name host with Conan and could earn $2M per show, it would make sense to do that as long as you weren't paying more than $1,010,000 (101x the no-namer) for Conan to host the show.
The point I am trying to make here is that these figures vary from industry to industry and from job to job. I could have conversely changed these numbers around and shown that it doesn't make sense to pay Conan a huge multiple of the no-name comic's pay - for example, if the revenue does not increase substantially (say, from $1M to $2M), or if the salary of the person in question makes up a much larger portion of the company's overall expenses.
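The break-even figure above can be checked with a quick sketch (all numbers are the hypothetical ones from the example, not real figures):

```python
# Hypothetical figures from the example: the same show with a
# no-name host versus a bigger draw.
production_cost = 500_000   # per-show cost, excluding the host
noname_pay = 10_000         # no-name host's fee
noname_revenue = 1_000_000  # revenue with the no-name host
conan_revenue = 2_000_000   # revenue with the bigger name

# Profit with the no-name host.
baseline_profit = noname_revenue - production_cost - noname_pay  # 490,000

# The most you could pay the bigger name while still doing at least
# as well as the baseline.
max_pay = conan_revenue - production_cost - baseline_profit

print(max_pay)               # 1010000
print(max_pay // noname_pay) # 101, i.e. 101x the no-namer's fee
```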
I think it's more of a winner takes all phenomenon in the Conan case. Usain Bolt makes millions because he's a tenth of a second (or less!) faster than many other people.
Those other people - with the exception of maybe Justin Gatlin - make a far more modest salary. It's less than a 1% difference in performance that leads to orders-of-magnitude differences in outcomes.
Reminds me of my first year as a webdev. I was extremely lucky to ride the apex of the web 2.0 hype/insanity wave. Now I can't believe myself that, as a 21-year-old with just 3 years of for-profit programming experience, I got CAD 85k at my first real job in Canada for just jQuery animations.
Later down the career path in Canada, I was frequently asked "you worked for guy A and B, that must've been a hell of a job?" or "how did I get there to begin with?" All of them dismissed my explanation that "I just used to be on page 1 of Google for thing A".
The flip side of the coin? The moment web 2.0 became more of an "in-house" production at mid-to-big-sized companies that no longer needed a "hired gun" outsider, and the hype wave moved on to other things, I really hit a wall. Actually, in my next 2 jobs I took salaries in the $70ks and was contemplating selling major life assets after my last employer in Canada was unable to extend my work permit.
Things went much better after I scaled down my appetites and stopped looking for employment with companies obsessed with "rockstar hiring"
Exactly, and winner-takes-all is extremely common in markets competing for rather limited attention. In sports, consumers really don't pay much attention to high school athletes, car designs are becoming more and more homogeneous, and in pop music there's increasingly more convergence in terms of style (although the irony is that the entertainment industry is entirely driven forward by trying to "discover" a new trend that appeals). My point is mostly that our collective attention spans are very limited, and any business that depends upon attention from mass consumers will be constrained by the simple fact that we all only have 24 hours a day, and that we can only support / raise so many children, who then also only have so much time. Thus, it is a mass war for attention, with primarily superpowers (celebrities) around.
But he isn't being paid for being 1% faster .... he is being paid because he is the FASTEST. It doesn't matter if he was 0.5% faster, 10% faster, or any other percentage. It matters that he is the FASTEST. That's the label that draws people's attention, not the 1% incremental faster-ness. =)
I don't see the parallel. A breakthrough or a patent is something that you do not know for sure you will achieve, but by hiring the best people in the field the chances of that happening go up and such competitive advantages are the cornerstones of corporate empires.
It's very similar, it's just that the metrics are not so easily measured. A top-tier employee might accept the job only if their salary is 2x what a lower-tier worker might accept. Now you have to balance what you think the difference in likelihood is between these two workers finding a "breakthrough", and what a breakthrough is worth to your business.
Yea, I don't disagree. It is clearly much more difficult to estimate certain things compared to an entertainer and how many audience members they can attract. That's part of the reason myself and another poster used it as an example. I think the point you're making is also what the article was trying to get at, that there are an unknown number of AI experts and it's hard to estimate what they should be paid which is why many of them are getting these crazy high salaries.
Your overall point is well taken, although I would say one of the defining differences between top AI experts and late-night talk show hosts is that demand is constrained for the latter (there is a very limited number of slots), whereas it isn't for the former.
The effect of this being that all late-night talk show hosts are always subject to a direct financial comparison (but fortunately for them, their audience is sticky and belongs to them personally, not to the television network). With top AI researchers, there's no real comparison or stack ranking; it's all about passing a certain bar, which is objectively evident based on their past work, at least as long as AI is hot.
> but fortunately for them, their audience is sticky and belongs to them personally, not to the television network
That's maybe half correct at best. Jay Leno could only take a small portion of his audience with him from the Tonight Show, if he chose to set up a competitor show. The same is currently true about Fallon, and likely far worse in his case today. Conan could never match his audience potential as host of the Tonight Show (pretty much no matter what he does, and certainly not on cable at TBS), because of the value of that specific platform, built up over decades and given its prominence on NBC.
A very large share of the Tonight Show audience, stays with the Tonight Show, regardless of host (barring the next Johnny Carson abandoning the show, or a truly horrendous product implosion at the current show). That audience largely belongs to Comcast NBC.
If Jimmy Kimmel leaves ABC, he would be replaced. Kimmel would struggle given the limited options, ABC would simply plop the next Kimmel into his spot and move on, with a large percentage of the same audience giving the next person a try.
Craig Ferguson's audience did not go with him, as another example.
I'm going to go on a totally off topic tangent, but this kind of makes me wonder why no-name comics aren't pursued more aggressively for TV. How many people are actually watching Conan because of name recognition, or the amount of money they're spending on production costs? It seems like a relatively unknown person with a smaller budget could outdo the competition just by actually being funny. Maybe I'm just too grumpy, but I really don't enjoy most of this segment at this point. Colbert in particular has been disappointing because I was a fan of what he did on Comedy Central.
Celebrities are their own brand. They have followings that are big (some bigger than others). That is the difference between them and no-names. Celebrities often have proven talent. Conan has decades of bankable funniness as a writer and entertainer - it's not a given any no-name can be that good or reliable.
What people make is ultimately what demand & supply dictate. On the other hand, for one-off cases where the market is super small, people make what they can negotiate. Of course, even in that one-off cases there is data about the pay of superstar within previous movies/shows/engineering orgs to use as an anchor.
But what about your analysis based on unit economics - how does that tie in? (Because it is of course relevant.) By the fact that if you hire lots of superstars at market price, but your unit economics don't allow the operation to be profitable, you will eventually have to shut the operation down. Goodbye, superstars and whole operation!
I suspect there may be some sort of hype in the AI area regarding salaries. Sure, there are a couple of rockstars, but I suspect that competition is really intense, and that compensation in the ML area has an even more pronounced power-law distribution than vanilla software engineering, which doesn't face as much pressure from maths, stats and other grads looking for a career in which to apply their quantitative skills.
I think that in the real world, paradoxically, math skills are easier to find than solid software design and development abilities. It may stem from the fact that the first is taught quite well in school, while the other is more about individual learning and sometimes a contrarian stance toward the system (which can be reflected in even a somewhat childish "I'll learn Haskell because OOP and Java suck!"), and that is harder to find and therefore more valuable.
In my experience with AI start-ups, if you have a rockstar CV in your deck you get funded. If you don't, you don't. It is very difficult for non-experts to evaluate the competence of budding AI teams. The best heuristic, track record, thus prevails, which in turn attaches a lot of value--from the company's perspective--to that single CV.
I’d laugh if I wasn’t crying. I narrowly escaped high school where the popular kids and “school celebrities” won. All that studying in university, all those labs, grinding that entry level job, building my skills, grad school, more hard work... and at the end of the day the popular kids inevitably win again.
A lot of the comments here are skeptical of the claims in the article. (This is how comments should, and do, function).
However, for all the reasons that the article may be wrong
- the shortage is temporary
- the shortage is overblown
- the shortage is illusory
- the shortage doesn't apply etc
Please compare with the situation for other high earners (CEOs, entertainers and bankers). Can the same arguments be made for them? (Conan O'Brien is funny, but is he 30x funnier than the person at my local comedy club?)
The work of an AI researcher is mechanically reproduced so a 1% benefit can be enormous. It could be the case that the competition isn't for more of them, but for the best of them.
Utility increase is not linear with cost increase. Conan or any other valued expert may not be 30 times better, maybe just something like 1.5 - 5 times better. But I do agree that demand will be satisfied in the long term.
Conan can deliver a joke written by someone else such that it is at least as funny as the local stand-up comedian, almost every time. He just has to be funny enough to not miss more than two times in a row. The local can have off nights. The big name has to kill it at every show. The really big name has to kill it at every show, in front of cameras.
And it's not just that. Entertainers acquire a fan base. You cannot grow your earnings without engaging with and growing your fan base. You can make a seven-figure income as a celebrity entertainer when 300,000 people are willing to throw $10 your way every year. (The other 2/3 goes to support staff and overhead.)
That's not a matter of shortage. It's a matter of competition. At a certain point, people cannot spare another moment to follow another person, and have exhausted their entertainment budgets. The top names aren't the best. They're the most reliable.
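The back-of-the-envelope fan-base arithmetic works out as follows (a sketch of the figures quoted above, not real data):

```python
fans = 300_000          # people willing to pay
per_fan = 10            # dollars per fan per year
gross = fans * per_fan  # $3,000,000 per year

# The entertainer keeps one third; the other 2/3 goes to support
# staff and overhead, as in the comment above.
take_home = gross // 3

print(f"${take_home:,}")  # $1,000,000, a seven-figure income
```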
CEOs and bankers have an uncommon skill set, for business management and financial management, but they don't have anything that can't be easily replicated by people in the same business that want to move up into the higher-paid positions. Those high salaries are partially from prestige competition. The CEO of a 20000 person company doesn't necessarily have more skill than the CEO of a 2000 person company. The manager of a $1 billion fund isn't necessarily more skilled than the manager of a $10 million fund. They just work for people who can afford to pay more. Often, they just know more influential people, and were in the right place at the right time.
AI research, on the other hand, depends on some serious skill. While there are a lot of people out there that can write dumb programs, and fewer that can write programs that can handle every foreseeable situation, there are rare individuals that can write programs able to do things the programmer never anticipated. It's like a comedian that can write a superjoke that gets a laugh from everyone, every time, forever. Or a banker that can get 15% returns every year, without fail, for 50 years. Or a CEO that grows earnings by 3.5% every quarter, and always meets expectations. Building a robot that can catch a thrown ball is a feat of that magnitude. It's absolutely incredible that we have any people able to do that.
There are plenty of software developers out there that can move into AI if they had to, but it would take some time for them to get up to speed on the state of the art, and it would take entire teams of them to produce the same level of benefit as one current AI specialist. Companies that see a path to monetization for AI are looking to find and hire the Carmack of AI and get there first, rather than 100 people able to constantly surf behind that leading wave by about six months.
The shortage is temporary because software folks are good at following the smell of money. The shortage is overblown, because this research was already happening before the truckloads of money pulled up to the dock. It is not illusory, because the previous lack of funding for AI has produced relatively few experts. But it will probably turn into a glut later on, because the people offering the money will eventually learn that AI research can't be rushed in the manner to which they are accustomed, and will abandon all those they enticed into the field.
What kind of salaries are MS applied Data Scientists commanding? I am very interested in enrolling in Georgia Tech's OMSA, but very few graduate outcome statistics exist for these types of programs, and job searches usually don't provide compensation. Further, the vagueness surrounding the name "data scientist" clutters up the information that is available.
I'm not really sure where these analytics programs are falling in the scheme of things. The usual advice is go CS or stats instead. As far as salaries, it's not too difficult to get 100k in the midwest. More on the coasts.
This is the sort of thing which was starting to happen with software engineers way back in the 1990s as the Internet started growing explosively. It was stopped by the large tech companies engaging in illegal wage-fixing for years, setting the standard for software engineers being paid rates divorced from the value they create. Sure they got busted for it decades later, but by that point the danger was passed and astronomical profits assured.
This is an exaggeration. Total compensation for returning interns at FANG may approach a 120k salary, with stock plans and bonus compensation that max out at 20k additional value per year. Even algorithmic trading or strats-quant positions rarely breach 140-150k for recent graduates.
For the first year, maxing out at 20k is incorrect, if only for my own case at the A. Mine is closer to +40k. At F it can be anywhere from +113k per year or more for converting interns, and even better if you can negotiate at G; I've heard numbers going up to 200 TC. This is just including signing, not year-end, which can be even more, but I'm not as familiar with that at other companies.
At least from my recollection, JS/2S base approach that but have very significant performance bonuses that easily match big tech.
As someone with a PhD in an engineering field (but currently working as a data scientist because all of my research was computational science), I wonder if independent machine learning publications would help with getting one of these jobs or if the CS PhD from a brand name school is an absolute requirement. I have 8 publications in my field, so I’m familiar with the peer review process, and it seems that with (a lot of) time permitting it might be possible to get a couple of NIPS presentations or JMLR papers. But I don’t know if publications alone would be enough to get hired for these AI positions.
I had a hard time making the decision when I chose an engineering PhD over a CS one, but 6 years ago machine learning hadn’t taken off like it has now, and engineering / hard science prospects seemed brighter at the time. If I had known it was going to become this big and this interesting, I definitely would have gone for CS instead.
> but 6 years ago machine learning hadn’t taken off... I definitely would have gone for CS instead.
I think you're looking at it the wrong way - that's exactly why there's so much demand for CS PhDs right now.
Rewind to the mid-2000's, when the CS postdocs/PhDs graduating over the past couple of years were choosing to major in CS. They were warned that everything was being outsourced to India and advised to choose a "real" engineering field, or perhaps finance/physics/math. It's hard to imagine today, but a lot of smaller colleges/universities were killing their CS majors back in the mid 00's!
So not only is there a shortage of CS PhDs in the pipeline, but the ones that made it through came in with a burning passion for the science (as opposed to the money/hotness). This combination of input bias and restricted supply is what makes the current labor market so damn hot.
In 3 years, that will invert, and some major struggling to justify its existence because it's hard but has "no future" will blow up. Rinse and repeat.
Not necessarily. Or more precisely, the intuition doesn't have to be in the domain itself.
I've done operational improvement work across supply chain/inventory management, marketing, digital analytics, ecommerce, healthcare revenue cycle management, and call center operations. In almost every case I started with little if any direct domain knowledge. The intuition that was valuable for my work was around systems-oriented thinking applied to business processes and being able to quickly suss out weak or suspiciously opaque areas of the system. The necessary domain knowledge to do so was always picked up from domain experts as I went along.
In fact, taking a naive approach on domain knowledge has always worked in my favor to uncover invalid assumptions that those with domain knowledge just accepted without question.
It was a bit of a winding process, but essentially early on in my career I learned that I fit this description very well. And stumbled upon the same fact that patio11 did: it's incredibly hard for most companies to consistently fill that type of role. And now that I have a history of succeeding in those types of roles, it's a lot easier to talk my way into new ones.
That said, you're spot on that the core of what I do is data consulting, although most of it falls under a domain-specific name and has been W2, internal consulting roles. I'm actually in the process of switching my full time role to a less demanding one so I can focus on ramping up my actual consulting business.
Well, sometimes the domain is a specialized subset of ml, like vision or natural language processing. And there's also a decent amount of intuition involved in just getting the architecture + hyperparameters approximately right...
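To make that intuition point concrete: the brute-force substitute for hyperparameter intuition is just searching over candidates and keeping whatever does best on held-out data. A minimal sketch (toy data and numbers entirely made up for illustration) using ridge regression, where the one hyperparameter is the regularization strength:

```python
import numpy as np

# Toy data: 5 features, known weights, a little noise.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
w_true = np.array([1.0, -2.0, 0.5, 0.0, 3.0])
y = X @ w_true + rng.normal(scale=0.1, size=200)

X_train, X_val = X[:150], X[150:]
y_train, y_val = y[:150], y[150:]

def ridge_fit(X, y, lam):
    # Closed-form ridge solution: (X^T X + lam*I)^{-1} X^T y
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

# Grid search: try each candidate, keep the lowest validation error.
best_lam, best_err = None, float("inf")
for lam in [0.01, 0.1, 1.0, 10.0, 100.0]:
    w = ridge_fit(X_train, y_train, lam)
    err = np.mean((X_val @ w - y_val) ** 2)
    if err < best_err:
        best_lam, best_err = lam, err

print(best_lam, best_err)
```

With one hyperparameter this is cheap; with a deep net's dozens of interacting architecture and training knobs, the grid explodes, which is where the intuition earns its salary.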
Two quarters (~1 semester) of Calculus is required, so a lot of integration is left out. Discrete Math is part of the CS curriculum, essentially as an introduction to proofs to prepare people for Algorithms. Linear Algebra is a recommended prereq for some classes, but a lot of people don't take it because more fundamental algebra is covered in classes like Analysis.
I graduated this December with a BS in CS. Our curriculum requires stats, and while Linear Algebra and Numerical Analysis aren't directly required, the degree requirements tend to force you to take them as well for the BS. I'm now working as a Data Scientist for a very large bank in New York, mostly on the strength of my experience with AI.
In my technical university, physics is (or at least used to be a while ago) required for everyone (except for some "informatics(?)" degree program which had somehow managed to weasel out of the requirement).
Of course, as a physicist myself I fervently believe this is good and well, and the path towards enlightenment for all mankind etc.
Physics was also required for me when I was briefly a CS major.
I was generally on an upward climb on the ladder of abstraction (Electronics Tech -> EE -> CS -> Math), but the early engineering bent meant that I needed a few physics classes, and despite settling on the math degree, I still think the handful of physics classes I took were some of the best education I've had. It's an interesting confluence of abstract reasoning, practical concerns, model-building, and problem solving. It's not as if choosing math made that confluence unavailable to me, or that I really regret it, but I do sometimes think the particular balance a good physics program strikes might have been better for me.
Senior here. At the University of Denver, you're required to get a minor in math, which includes a year of Calculus and two classes from Stats, Linear Algebra, Differential Equations, or other high-level math.
I was a math + physics major in college, and it was more than 3 decades ago, so take this with appropriate grains of salt. What I would have called high level math in the curriculum that I studied, wasn't so much about specific topics, but about the sophistication and creativity of your approach.
The engineering / science math was pretty much a matter of looking at the problem, guessing its "form," and applying a known technique based on that form. For instance, "this looks like integration by parts." Eventually you'll be shoved out into the world where there are problems for which there is no known solution, and you have to create your own techniques.
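As a quick sketch of what that "form-matching" looks like in practice (a textbook example, not from the original comment): seeing a polynomial times an exponential triggers the by-parts reflex, with $u = x$ and $dv = e^{x}\,dx$:

```latex
\int x e^{x}\,dx
  \;=\; x e^{x} - \int e^{x}\,dx
  \;=\; (x - 1)e^{x} + C
```

The course teaches you to recognize the form; research-level math starts when no known form applies.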
The more advanced courses did two things. First, they set aside problems and advanced towards proofs. This is really where math came alive for me. Proofs are so much more varied that you have to abandon the security of a bag full of known tricks. The other thing is that the derivations get longer, so you have to develop a longer train of thought, if you will.
Courses that were higher-level were like:
I don't know what new goodies there are, but I'd love to dive back into it again.
Ah ha. I read it as saying people getting 300k in cash. It makes more sense now. For me, I'll have to wait out several years until my RSU gets fully vested. Let's hope I don't get fired in the meantime :)
Certain elite grad program dept's might have best-in-world knowledge in a particular niche, e.g. ETH Zurich has acrobatic indoor drones better than even NASA / US Military / Boeing right now. If I were doing a make-or-break indoor drone startup, it might be worth it to pay a fresh grad to clone his research environment in my company; an acqui-hire of sorts. This escalates when two mega corps, like say Alphabet and Apple, fund competing indoor drone companies at the same time.
I think self-driving tech has matured past the phase where only a few key university dept's have most of the intellectual capital, but I still hear mid-200k figures for freshly minted PhDs from the right school.
Is a Ph.D necessary for competing for those high salaries? It hasn't been for other CS jobs, and from my experience with ML and NLP - it seems wildly unnecessary. I'd pick experience over a degree every time.
Presumably the "fresh Phds" getting 300k right out of school have extremely relevant experience and are quite capable of actually putting their learning to use (I don't have data to prove this).
People who just know TF, but couldn't implement it themselves or explain what it does and doesn't do well, aren't the kind of people attracting 7 figures.
There's a huge amount of advanced statistics, math, and "intuition" that takes years working with data to build. Abstractions like TF (etc.etc.) make applying existing solutions to existing problems more tractable, but the real gold-rush is happening around the new/relatively-unsolved problems.
One thing I've heard - and it seems plausible - is that one skill that's desired for these positions is the ability to consume and make use of academic research. If you have a PhD you've done that, and in a sense you are a practitioner of reading research.
I wonder if the algorithms can determine a fair wage for the said talent. Like many have observed here, the salary is for an expectation of innovation and differentiation. However, an a-priori guarantee, or even a relatively high likelihood, of such an outcome is hard to come by unless the said person is Geoffrey Hinton. So what exactly are the salaries for?
There is no such thing as a "fair" salary, especially when you're talking about the differences between $300k, $750k, and $2 million. There is what the market is paying similar people, what leverage you can use to negotiate higher pay, and how much [above|below] market your employer is willing to pay to keep you away from its competition, among other things.
The entire idea that it's "fair" to pay someone $400k a year to do a job but "unfair" to pay them $350k to do the same job is silly.
I am amazed that rather than see this as opportunity for racking up top $$$ for at least the next 3-10 years by learning Calculus, Probability/Statistics, and Linear Algebra on the way to whatever degree one seeks, there is so much criticism of this.
This situation is a pure win for smart people who put their minds to the task at hand. Buzzwordy or not, AI/ML/Newfangled Regression has more than enough wins with speech recognition, game-playing, image recognition, and recommendations to have a strong future.
Or, if you insist on negativity and you prefer to expend your time kvetching about whether Famous CS Person X or Famous CS Person Y should make the most money (Fantasy Data Science?), then more for me getting $h!+ done whilst you bicker. I mean I scratch my head that Mark Wahlberg is the highest paid actor in Hollywood, but hey, good for him in my book. Too bad his burger shop is crap.
I suspect that calculus, statistics, and linear algebra are required for most 4-year CS degrees. Some schools even have an "AI" track for their undergrad degrees. This article specifically refers to PhDs in Machine Learning, which is a much higher bar.
If you do research in the same area that Google's research department is throughout your graduate degree, net research internships at insert fancy company with AI component here every summer, and get a PhD, $300k seems completely reasonable.
Personally, I think we're in a SaaS bubble and an AI bubble. "AI" isn't nearly as developed or useful as the amount of VC money being dumped into it.
You could also just become a plumber and probably make $300k a year[i] (~$150/hour). Engineers are incredibly undervalued for all the effort they put into their education (and continuing education). A plumber could pull in at least $150 per hour in the Bay Area. I just hate the "sky-high" headlines, when in other professions these rates are normal. Try finding a (below-average, English-major) attorney at $150 per hour.
This is, like the comment of varelse, also a bit of an oversimplification.
To earn that kind of money as a plumber you would need to either work your way up over the course of roughly 15 to 20 years at a very generous plumbing company, or be an independent plumber with next level marketing skills.
The average plumber, pipe- or steamfitter just coming out of their training is not going to get anywhere near $300,000 a year.
In my case, I turned a career in driver and low-level algorithm coding into a career in AI in <2 years emphasizing the ability to implement AI algorithms in C/C++ and CUDA. That said, when Tensorflow Monkeys can make $300K right out of school, imagine how someone who could bang the metal directly could do...
TLDR: Chance favored the prepared skill set. No fancy degree required, just results.
PS Before that, I blew a boring blind-allocated gig at Google insisting that GPUs were about to play a huge role there a year before they acquired DNNResearch. I even tried to join the very beginnings of the Google Brain team but they didn't have any openings or the budget/willingness to make one for me.
With no evidence whatsoever, I would argue against that. "Top talent" does not equate to "moving into an industry with potential". I think the other comment re TensorMonkeys is valid. How many AI researchers actually understand the ins and outs of what they do or the tools they use? Seems to me that AI is/will become abstraction on abstraction etc. as AI gets more complex, and that means the people using the tech are more and more removed from "first principles", and therefore need to know less, and therefore aren't really "top talent" in my book. Maybe at the start, though.
The top end of engineers and researchers in artificial intelligence earn magnificent salaries. That domain is not, itself, the top end of software engineering, it's a specialization of it with its own salary track. You aren't actually comparing like for like here, because most engineers doing work in "artificial intelligence" aren't even pulling down the salary you noted for the top end of plumbing (which is itself wildly unrealistic).
$150/hr is the maximum of the range they give, and then you have to subtract payroll taxes, equipment, marketing, etc. On top of that, it's not like plumbers are working jobs back to back 40hrs a week.
Yes. They get to write off equipment, marketing, etc.

> It's not like plumbers are working jobs back to back 40hrs a week.

In California, probably more like 60hrs.
Are you just making up numbers? You've asked for sources, other commenters (including myself) have provided several, and the sources disagree with what you're stating spectacularly. I'm baffled by how confident you're being about something that, as far as everyone can currently tell in this thread, is plainly incorrect.
> You could also just become a plumber and probably make $300k a year[i] (~150$/hour).
...how many plumbers do you personally know? The reason everyone in this thread is pushing back against this is because the suggestion is ludicrous. It doesn't make sense in terms of market dynamics - the barrier to becoming a plumber is significantly lower than the barrier to becoming a software engineer. That's not intended to demean the profession, it's just true - you do not necessarily need any education (and the profession embraces this far more than tech does, which is itself progressive on that point) and requisite domain knowledge is not as extensive or as rapidly changing as software engineering. You need technical knowledge, but unless you're taking the most complex plumbing jobs and the most mundane engineering work, they're simply not comparable.
Some plumbers do well, particularly if they own their own business and are thereby successful entrepreneurs. But you can't judge the typical outcome of a professional career by the entrepreneurs who use it as a basis for their business. The modal plumber doesn't earn anything resembling $300k/year - according to the BLS, there isn't a single state where the average is even $100k. Where are you getting your data, from how much they're billing you when you have someone fix a problem in your house?
Plumbers make closer to 40k a year. Some tiny percentage of them will get as high as the low hundreds - not as plumbers, but as owners of successful plumbing companies. But the top tiny percentage of software people who move into ownership and succeed become... stupid rich. If you compare apples to apples, the wealthy plumber fantasy fades quickly.
That's not how labor-intensive businesses work. Again, the amount a client pays per hour for labor != the take-home pay of the person whose time you're ostensibly being charged for.
Revenue is not profit, and "price per hour for labor" != "hourly rate I pay my laborers". This is even true in owner-operator businesses. Furthermore, in bursty markets, "annual income amortized over career" != "my hourly rate multiplied by 40-60 multiplied by the number of weeks I want to work".
The numbers from my links are INCOME, not "what the business charged the client".
> "AI" isn't nearly as developed or useful as the amount of VC money being dumped into it.
That's the entire point of VC though - VCs don't put money into regular roads or into normal houses, as these things have already got their usefulness determined and we know they're darn useful. VCs pump money into things that look like they could use a few billion dollars to get developed and ready for the mass market. AI fits that perfectly according to your description.
VCs who put money into things that are already proven are VCs who are missing out on the Next Big Things. "AI" will become fully developed largely using the money that VC is putting into it.
My issue is that when I hear that a startup is AI-focused, that means they're just implementing something with machine learning (and possibly unnecessarily). These startups generate a lot of hype, which I believe is unwarranted.
I imagine Google researchers are working towards advancing AI techniques and knowledge more generally, not trying to disrupt some industry with existing machine learning techniques.
This comment cuts to the basic point. Positions vaguely labeled "Senior AI researcher" are for research directors, not engineers. Taking a Udacity ML class and "teaching yourself some linear algebra" and Python is similar to achieving basic literacy in a foreign language - you may be able to ask for directions, but try writing a novel.
Any job that requires significant expertise will command salaries at the top of the curve. Corporate litigation and IP attorneys command 400k-2m salaries, yet we don't get articles about IP lawyer bubbles, because the competition is vicious.
Yeah that bubble will burst when AI/ML replaces us or America* creates a surplus of competent engineers. Don't hold your breath for that. I made top $$$ (on a relative scale) in college tutoring aspiring computer scientists and engineers who hated these subjects.
(1) Calculus? I did well in courses in calculus, advanced calculus, advanced calculus for applications, general topology, modern analysis, real analysis, measure theory, functional analysis, and lots of applications to US national security and business. Taught calculus in a good university. Published peer-reviewed original research in calculus. Studied a lot more in calculus -- exterior algebra, numerical methods, the Navier Stokes equations, ordinary differential equations, deterministic optimal control, optimization, etc.
(2) Linear algebra. Worked in it for numerical methods, various approaches to curve fitting, multi-variate statistics. Did undergraduate honors paper on group representations that is just more linear algebra. Programmed a lot with linear algebra. Worked carefully through some of the best books, e.g., one by E. Nearing, student of E. Artin, and Halmos, assistant to von Neumann. Did a lot in linear algebra as part of optimization. Same for the FFT. Same for Markov processes. Reinvented k-D trees and associated cutting plane tree back tracking for nearest neighbors. First actual course in linear algebra was an "advanced, second" course from world expert R. Horn -- found very little new and led the class by wide margins on all measures. Using linear algebra and LINPACK for a small part of current startup.
(3) Crucial tools in the applications desired by AI, ML, data science need a lot in probability, stochastic processes, statistics, and optimization. Have excellent backgrounds in all of those.
(4) Ph.D. research in stochastic optimal control, a grand example of a machine doing some learning as it exploits the history of the stochastic process driving the system to be controlled.
(5) Software. Programmed in lots of languages for lots of operating systems for lots of applications, especially for US national security.
So, it looks like (1)-(5) would be good qualifications for the "shortage" of people for AI/ML?
But, I sent over 1000 resumes; my resume is on several public resume collections; I've applied to Google, Microsoft, and many others. I've never been arrested or charged with a crime other than minor traffic violations. I have not been convicted of a traffic violation in over 10 years. I've never used illegal drugs or used legal drugs illegally. I'm a native-born US citizen in the US. I have no handicaps and no serious medical problems. I've done good work in applied math and computing at GE, FedEx, and, in AI, at IBM's Watson lab.
Result: I don't get phone calls from recruiters. Basically I'm 100% totally unemployable at anything above manual work at minimum wage.
So, I'm doing my own startup based on computing and some applied math I derived. To users, my work will be just a Web site. But the site is an excellent, and the first even good, solution for a problem pressing for about 90% of everyone in the world with access to the Internet. I've designed the Web site and server farm and written the code. The code is 100,000 lines of typing based on Microsoft's .NET. The 100,000 lines have lots of comments and about 24,000 programming language statements. The code appears to run as intended and to be ready for at least early production.
Still, with that background, I'm 100% unemployable, at ANYTHING above minimum wage. This situation has cost me my chances of owning a house, getting married, having children, and my savings and inheritance.
I don't know dude, I've seen you post about this many times.
I suspect it has something to do with the way you (a) look, or (b) socially interact with people. I don't mean this in an offensive way, but if you look / sound like Donald Knuth you probably won't get a job because you're not Donald Knuth. You're just some quirky eccentric person that probably won't fit in because you can't make small talk around a water fountain or you don't have an interesting story to tell after a weekend because you probably don't actually do anything on weekends that's fun. I don't know, it could be a hundred reasons. I bet it's got absolutely nothing to do with how smart you are, what you know, or who you studied under. I mean, for starters, why not just go back to academia? Are they rejecting you too?
At some point, if you're being rejected for 1000 jobs, I think you need to have an honest talk with yourself and ask whether the problem is you or whether the problem is the job market. You're a smart guy, I'm sure you can run the math and answer that question.
Best of luck!
This situation has nothing to do with water fountain small talk or any such things. Proof: My resume copies got nearly no meaningful responses at all. That is, the whole job hunting effort died long before any opportunity for small talk.
Again, once again, over again, yet again, one more time: the qualifications (1)-(5) on a resume are terrific for AI/ML and innovative applications but get no responses at all.
Lesson: The claims that there is a shortage of people with good qualifications is total BS.
So, I'm starting my own business with some crucial, core original applied math. The math is difficult to duplicate or equal, especially by people who don't value (1)-(5) above.
Each job requires someone to create it. For the special case of a company founder, he creates his own. I'm no longer hoping someone will create a job for me and, as a company founder, am creating my own.
For my Web site, it's enough for lots of people to like the site. I hope and believe that a lot of people will like the site. For the revenue, it's enough for the advertisers that my Web site delivers lots of clicks from users with good demographics, and I hope and believe that will happen. And, except for trivialities, those two are enough.
Point: Back to the "lesson" above, there's no shortage.
Why are people claiming a shortage when there isn't one? There is a standard list of reasons, and there may be reasons not on the list. I don't have information enough to select the reasons in this case.
I can think of no better way to demonstrate that you are right than to succeed at building your own business. OTOH, if the business fails, IMO you ought to consider that you need someone objective to review your perspective on and approach to building a career.
Perhaps start by having a friend with a steady gig look at your resume? Perhaps forward it to a recruiter? Or if by the time you read this you are filthy rich, congratulations!
Is the first paragraph of this actually true though? At least if you believe the article, the people racking up those salaries are CS/ML/Stats phds, not people who learned stats and linear algebra on the way to their biochem degree or sociology degree or maybe even EE degree. It seems like the idea is more people who can produce truly novel work.
Moreover, the article also claims there's a pipeline for the folks (below the PhD level? The writing is a little unclear) that's coming out: " At the current education rate, an influx of new experts will start to moderate salaries in three to four years, he says."
So who is it making educational choices today who will be well-positioned to cash in? Probably nobody. Rather, like with all talent shortages, the people who win are those who had the luck or foresight to have already studied something that happened to get big, and now everyone else is rushing to catch up and flood the market.
My doctorate is not in CS, Math or an engineering field, but I did (somewhat unknowingly) apply ML/statistics to my thesis, which was (more or less) generative adversarial search 20+ years ago.
Everything I did then turned out to be strongly related to techniques like Naive Bayes, Logistic Regression, Principal Component Analysis, Boosting, Bagging, and bunch of other techniques that get reinvented over and over again. Once I mapped them over to their ML incarnations, they were familiar territory going forward.
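To illustrate how thin that mapping layer can be (my own toy example, not from the thesis above): Principal Component Analysis, one of the techniques listed, is just the linear algebra from elsewhere in this thread - center the data, take an SVD, and the right singular vectors are the principal axes.

```python
import numpy as np

# Toy data: 3 features with very different variances (scales 3, 1, 0.1),
# so the principal axes are easy to recover.
rng = np.random.default_rng(1)
data = rng.normal(size=(100, 3)) @ np.diag([3.0, 1.0, 0.1])

# PCA in three lines: center, SVD, project onto the top components.
centered = data - data.mean(axis=0)
U, S, Vt = np.linalg.svd(centered, full_matrices=False)

explained_variance = S**2 / (len(data) - 1)  # variance along each axis
projected = centered @ Vt[:2].T              # keep the top two components
print(explained_variance)
```

Once you've seen it written this way, the "ML incarnation" is mostly vocabulary on top of familiar math.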
Could you say a bit about their background? (e.g. degree type, work experience, self-education) Given that a lot of this thread is about how one can make it in this field without a PhD in ML, your comment would be very illuminating.
One is a former game developer (with only a high school degree) who happened to have an interest in ML and studied up in his free time. A few were in a startup, bachelors in Computer Science from non-Stanford universities, that was doing AI work and was acquired. Another is a developer of a prominent open source package, originally utterly unrelated to AI, but he started working on learning related applications with his software and landed at Google as a result. Another is a friend with a PhD but in a totally different area (Computer Graphics) who was hired into Google where she landed on a ML team.
I agree with your position vis-a-vis all of the complaining that people seem to be engaged in here.
That said, I think it's a bit unfair to imply that the only thing required of people to participate in this particular frenzy would be to learn a little probability and statistics. I've been working on a system that learns from an enormous set of medical imaging studies for the purposes of analyzing same, and the technical ML and domain knowledge you need to bring to that party is actually pretty humbling. You're not gonna do this stuff with a little statistics and some H1B's. At least, not in a fashion that any medical practitioner will take seriously.
I think in a few years, you'll have lots of lower level people who will know enough to provide meaningful contributions. Of course, the flip side of that is that their salaries will be significantly lower.
I think you're misunderstanding who these high salaries are for. It's not for people who brushed up on their sophomore math courses. The demand is for people with graduate degrees and postgraduate experience with published papers and a pedigree that leads one to expect continued novel work at the level of cutting edge research.
> I mean I scratch my head that Mark Wahlberg is the highest paid actor in Hollywood
Alternatively Dwayne Johnson as well (whom I happen to enjoy more than Wahlberg). He doesn't do high-brow entertainment (starting from his wrestling days). Simple, fun entertainment for a super broad audience never goes out of style. Your average person is pretty happy to forget their 9to5 and troubles, and go see a big movie for escapism. That will never cease to be true. Arnold Schwarzenegger filled that role when I was a kid. That's not a criticism at all, it serves just as legitimate of a purpose as people that prefer to watch TED Talks instead of the next Rock action flick. My father had a basic saying that I didn't appreciate until I was much older; whenever I would criticize something that seemed a bit neanderthal-like (so to speak), he'd respond with: it takes all types. Generically what he meant, was that the world functions courtesy of a wild variety of all types of people. Would it function better if everyone were elitist and brilliant? I don't think so.
Miranda Sings, Jenna Marbles, PewDiePie, minecraft videos, twitch game streaming, et al., same fundamental as Wahlberg and The Rock.
Check out online courses. Coursera, EDX, etc. Andrew Ng's intro to machine learning course is a nice way to get up to speed with some basic concepts without diving too far into math. It doesn't cover any state of the art machine learning concepts, though.
Some other resources I have bookmarked:
- Convolutional Neural Networks for Visual Recognition Youtube playlist 
- Deep Learning for Self-Driving Cars 
- Natural Language Processing with Deep Learning