AMD is having its day

(economist.com)

394 points | by jkuria 11 days ago

21 comments

  • ConcernedCoder 11 days ago

    Jim Keller, the designer of "Zen" architecture, is now working for Intel... https://web.archive.org/web/20180426124248/https://www.kitgu...

    • kick 11 days ago

      It's kind of his "thing" to bounce around to different, interesting projects. He wrote the x86_64 spec while working at AMD (and was responsible for the K8, a.k.a. the Athlon 64), he played a big part in Apple getting their SoCs to best-in-class, and that's not nearly all he's done.

      Incredibly accomplished man, I don't think he's going to stay in one place for very long. He'll probably come back to AMD at some point or another.

      • sufiyan 11 days ago

        You really need to look a little more at Jim and his style. He has been around since the days of DEC (Digital Equipment Corporation), and he has been a great engineer. But what sets him apart is not his technical competence but his ability to organise great teams around him and get the work done. And he has always been the person given the reins of companies in crisis. It is a marvel how he manages to get the product back up.

        • Bombthecat 11 days ago

          Connections. I bet he has a horde of people willing to work for him, even if the pay were low. They know it will succeed. So he can assemble an Avengers team (pun intended).

        • K0SM0S 11 days ago

          I'm seriously hoping he gets on board a big RISC-V project eventually and helps get this baby off to the races.

          • kick 11 days ago

            RISC-V already has more accomplished people than Keller working on it, and I don't think his presence is required for it to continue to get better.

            What RISC-V needs at this point is marketing and a company with a dedicated sales team, not technical competency.

            • lliamander 11 days ago

              Now that POWER is fully open, it will be interesting to see which RISC architecture becomes the x86 challenger (if either). RISC-V is eating up Intel (and ARM) from below, but POWER has the, well, power to compete at top of the line (before Zen 2 came out, anyway). The next gen POWER is supposed to be built on Samsung's 7nm process.

              If RISC-V wins, it will be because it has a better developer/enthusiast/hacker story, but if so it will take a longer time to get there.

              Either way, I hope for the sake of our infrastructure that an open standard wins out, and that we have enough competing manufacturers that we never have a repeat of Intel's utter dominance.

              • floatboth 11 days ago

                POWER is basically only top of the line though, other than the relatively rare NXP QorIQ chips that were used in e.g. AmigaOne stuff.

                Most of the hype for RISC-V seems to be in tiny embedded stuff (e.g. WD's disk controllers) and academia (naturally). SiFive has finally made an out-of-order core, but I don't really see the market for big unix-capable RISC-V. RISC-V is really a royalty-free "MIPS in a trenchcoat", so expect it to be used where MIPS is used now.

                ARM is everywhere, from smartphones (unfortunately Qualcomm dominance, but Apple is kicking ass in performance) to AWS EC2 (in-house silicon!) to massive HPC clusters (Fujitsu A64FX is impressive, they use HBM2 RAM (!) to make a SIMD-capable CPU into something almost GPU-like in a way). AWS has basically ensured that ARM is the next ISA for servers :P

                • K0SM0S 10 days ago

                  I'm certainly not downplaying ARM's domination in general, but don't take this bit for granted:

                  > AWS has basically ensured that ARM is the next ISA for servers :P

                  These hyper-scalers (inc. Microsoft, Google, Alibaba...) will take a little bit of everything, because they can generally always find cookie-cutter workloads impeccably suited to any architecture, but also for industrial diversification, R&D, etc. As with AMD, the presence of some ARM CPUs in the biggest datacenters says little about market forces; what matters is how many actual FLOPS are effectively handled by each vendor. I'm positive Intel still has the lion's share, and the inflection point in favor of AMD (the nearest competitor) would be 2023 at best, more likely 2025-26, assuming Intel eventually catches up in price/perf/W.

                  A reasonably heterogeneous infrastructure is very good when you're an order of magnitude bigger than entire datacenters, or so I hear.

                  I share your concern that RISC-V is currently largely confined to the MIPS space; and indeed it's a totally different ballgame to break into ARM's space, let alone x86 (but I don't see why RISC-V would seek the latter, especially considering POWER is up there).

                • MayeulC 11 days ago

                  The open source community has been questioning RISC-V's openness lately [1]. While I wish they could correct that, maybe it could give POWER an edge.

                  [1] https://www.phoronix.com/scan.php?page=news_item&px=Libre-RI...

                • K0SM0S 11 days ago

                  I don't really know what you mean by “more accomplished people than Keller” but I sure do share your overview.

                  My point was more that economies of scale happen with killer products (e.g. how CISC won over RISC back in the '90s: it wasn't about tech but about product, and as you say it comes down to sales and marketing). Somehow a guy like Jim got involved in some of the best scalers out there (AMD's x64, Apple's A chips, Tesla's self-driving AI, AMD's Zen, now something at Intel with 3D stacking most likely, and next...?)

                  Here's to RISC-V, anyhow!

                  • 0xcde4c3db 11 days ago

                    To me, David Patterson is the most obvious person who might qualify for that description. He (co-)wrote the book on computer architecture and how it relates to performance (Computer Architecture: A Quantitative Approach) and led development of RISC-I and RISC-II ("RISC-V" is a reference to this lineage; I believe other projects not officially bearing the "RISC" name were counted as III and IV).

                    • K0SM0S 11 days ago

                      He's a giant, no question about it.

                      I just hope the people involved in RISC-V are collectively able, as an industry, to deliver the kind of fabrication-process backbone, funded by great and aggressively marketed products, that has carried pretty much every winning standard, CPU instruction sets very much included. A combination of very applied engineering + shrewd business mindset.

                      • 0xcde4c3db 11 days ago

                        I think RISC-V is going to win for fundamentally the same reason that ARM won: it removes obstacles between product teams and processor cores. ARM didn't win by having the best instruction set design or the most advanced architecture, it won because licensing it was less of a hassle than licensing a different core/ISA. It's like ARM had a "buy now" button where everyone else had "contact sales" buttons. But extending that metaphor, RISC-V is "you don't need to click a button; it's already on your porch".

                        I assume that RISC-V still has a gap (relative to ARM) between the ISA/HDL levels and getting to an actual tape-out. But I think SiFive et al. can tackle it.

                        • hakfoo 10 days ago

                          I think there were a couple reasons ARM won:

                          * ARM wasn't directly in the manufacturing business. This meant they weren't a supply chokepoint when demand scaled by a factor of thousands. Just take the IP core and fab it yourself!

                          * They triangulated the market unexpectedly well.

                          There was a definite gap in the market between "performance at any cost" big architectures (x86/Itanium/POWER/PA-RISC/SPARC) and "draws no current, but gets winded running the clock for a VCR" little ones (6502/Z80/68HC11/Atmel/PIC). For a long time, this was filled on an ad-hoc basis: one-off designs for game consoles, set-top boxes, and ahead-of-their-time smart appliances. Most of these were closed, single-vendor, no-interoperability systems where "ease of integration" and "off-the-shelf tools" didn't really matter, so we had all sorts of bit players.

                          It took the arrival of full-power handheld devices with third-party software support (PDAs and then smartphones) for a large demand to appear, and ARM was ready with a product line that fit the space well.

                          Compare the alternative, a world where we had had x86-based Windows CE devices and eventually iPhones. Intel/AMD/VIA would have been scrambling to put together the diversity of custom designs these devices cried out for. It would have required a massive respin of their architectures (look how hard it was to get Atom into a phone-ready state, and even then it did poorly) and then backlogged their foundries.

                      • dnautics 11 days ago

                        Patterson also worked on TPUv1 (iirc) and that wasn't so hot.

                      • kick 11 days ago

                        > I don't really know what you mean by “more accomplished people than Keller” but I sure do share your overview.

                        Look at the SiFive team for starters. Anyway, here's to RISC-V!

                  • Roritharr 11 days ago

                    I wish I knew what his secret is.

                    He can't be a genius of THAT magnitude; that would be a miracle.

                    • ownagefool 11 days ago

                      There probably isn't a secret so much as he's a guy that likes to deliver and companies let him.

                      I go from company to company consulting, and the companies almost always have very competent staff they just don't listen to.

                      • S_A_P 11 days ago

                        This! I also consult and have been to quite a few companies. Often I find that the most assertive person is the one who gets their ideas/perspectives pushed through, not necessarily the most competent. Sometimes the confidence comes from experience, and usually that leads to OK decision-making. Other times, not so much.

                      • baybal2 11 days ago

                        Just lots of experience. Normally, people don't hop around in semiconductors as much as in software engineering, where tenures are extremely short.

                        Semiconductor engineering was hurt badly when it tried to borrow its work culture from the dotcom world and treat engineers as disposable.

                        I myself know of another man of equal magnitude in the analog world. He worked in electronics for 24 years, most of it in China. He worked for one of my employers, where he was deservedly fired for failing a potentially multi-million-dollar project. Apple hired him, literally, the next week. I suspect he had his hands on Apple's AirPods.

                        He was tough to work with; he has a reputation as "Shenzhen's Rob Widlar." Very few companies can make a working environment for him.

                        • goldcd 11 days ago

                          The total investment in a new architecture/chip is significantly higher than one guy's salary (although I presume he's compensated very well).

                          Maybe it's just that once the corporate gears of a company have aligned to invest in something new/interesting, he's interested and they're paying.

                          • kick 11 days ago

                            If you ask most people who have familiarity with x86_64, I'm certain they would use a different term than "genius."

                            Though the first step for you would be to work on projects you're more passionate about, and that are more technically challenging. A billing site (judging by your HN about section) won't challenge you in a way that will make you get better.

                            • jandrese 11 days ago

                              But the funny thing is he leaves behind a string of industry defining successes in his wake. You can't argue that he's not tremendously influential and successful in the traditional sense.

                              He's the Kelly Johnson of silicon.

                              • mzarate06 11 days ago

                                > If you ask most people who have familiarity with x86_64, I'm certain they would use a different term than "genius."

                                What term do you think they'd use, and why?

                                • amluto 11 days ago

                                  Mostly backwards compatible hack.

                                  • zaarn 11 days ago

                                    For a backwards-compatible hack it works well enough. And "good enough" usually beats "better".

                                  • kick 11 days ago

                                    "The reason we're stuck with this shitty fifty year old architecture that's steadily gotten worse through every iteration," "Satan," "Evil Computational War Criminal," a few less nice terms.

                                    If you ask a Sun employee: "Enemy #1."

                                    Less sarcastically: "slightly above average."

                                    x86_64 is a nightmare, and we're a decade behind where we could be because it was what the industry settled for.

                                    • CamouflagedKiwi 11 days ago

                                      That seems extremely unlikely. If a new architecture could offer that much advantage (a decade is massive in this space) surely there'd be some pressure to move. One could easily imagine Apple moving to ARM, for example.

                                      Unfortunately, I think a much more likely situation is that x86 (/64) is only slightly hobbling things, and Intel are easily able to push past that with technical craftiness. As in many other cases, implementation trumps theoretical design.

                                      • kick 11 days ago

                                        Apple is moving to ARM, the most recent iPad Pro outperforms its laptop line (last I saw, at least), they've spent a fraction of what Intel, AMD, and VIA have invested into x86_64, and basically every major Apple leak mentions that they're investigating moving the Mac line to ARM in a generation or two.

                                        A new architecture can offer that much advantage; x86_64 wasn't even the third most-performant implemented ISA when it was released, and every other ISA makes gains far faster than it. SPARC and POWER are still competitive with it despite having 1/1,000,000th the amount invested in them, and in just a few years and with comparatively nothing invested into it, RISC-V is starting to rival a portion of the chips (though not the upper line of them yet).

                                        It "won" because of backwards compatibility, nothing more.

                                        • CamouflagedKiwi 11 days ago

                                          That's just not the case; Apple were already well behind in performance at the point x86_64 came out, due to them being stuck on POWER. They actively moved to x86, despite big compatibility issues, because of how much better they were.

                                          The evidence is that architectures are just not as important as all that. x86 is clearly pretty bad in many ways, but clever tricks and microcoding have been able to overcome those issues.

                                          • kick 11 days ago

                                            > That's just not the case; Apple were already well behind in performance at the point x86_64 came out, due to them being stuck on POWER. They actively moved to x86, despite big compatibility issues, because of how much better they were.

                                            This is a complete misinterpretation of the above comment.

                                            • bsder 11 days ago

                                              > Apple were already well behind in performance at the point x86_64 came out, due to them being stuck on POWER.

                                              The issue was chipsets and peripherals, not POWER performance (which generally always beat x86 at the same point).

                                              The problem was that the entire ecosystem was built around communicating with an x86. So, you couldn't get a Northbridge or Southbridge equivalent that was even remotely close in performance or power consumption to those in x86 space.

                                              Unless you decided that you were going to take on everything in chip design, you couldn't compete. And Apple didn't decide to take on everything until Intel told Apple to pound sand and pissed Jobs off.

                                              • tigershark 11 days ago

                                                Absolutely false. They had to use liquid cooling in their top Mac at the time, and it was still behind an Opteron. Then the Intel Core architecture came out and pretty much destroyed it performance-wise.

                                                • goatinaboat 11 days ago

                                                  The way I remember it, they couldn't get a PPC of the current (at the time) generation suitable for laptops: too power-hungry, too hot. IBM weren't interested in supplying such a part, so Apple were really left with no choice. It was a similar story moving from 68k to PPC; the 68060 wasn't what they needed.

                                                  • fluffy87 11 days ago

                                                    There has never been a PowerPC system that beats x86 at any price point.

                                                    • bsder 10 days ago

                                                      You are far too young.

                                                      The 601-based PowerPCs were the first able to do 3D graphics well on the microprocessor.

                                                      The G4-based titanium PowerBooks were sufficiently better that they became iconic at a time when Apple wasn't regarded that well.

                                                      Sure, the G5 and up were disastrous, but the writing was on the wall well before that. Chipsets on the G3- and G4-based systems used more power than the processor, and that only stopped being true because the G5 was quite so poor.

                                                    • foldr 11 days ago

                                                      > not POWER performance (which generally always beat x86 at the same point).

                                                      Do you have a source for this? First time I've seen anyone claim that the G5 was competitive with contemporary Intel CPUs.

                                                      • usrusr 11 days ago

                                                        Contemporary Apple marketing that was extremely proficient at putting POWER in a favorable light (just sneak enough AltiVec into every comparison to make up for the rest), plus every single one (of the then few) Mac users who parroted it religiously?

                                                        Those were different times for Apple. The ship had noticeably turned, but the storm wasn't over yet.

                                                        • geerlingguy 11 days ago

                                                          By the time the G5 required massive liquid cooling rigs and was nowhere near ready for laptop use... even die-hard Mac users were not parroting the party line any more. For most, the Intel move was a welcome one, as that was also during the height of the move from desktop to laptop. There was almost no chance for a decent G5-based laptop, at least not one as nice and sleek as the G4s that were already in existence.

                                              • nl 11 days ago

                                                Amusingly we can compare it with a clean design that was supposed to replace x86: Itanium. That had the full weight of Intel and partners behind it and was such a flop that it's been completely ignored in this thread.

                                                x86_64 is genius because it is the perfect example of the art of the possible.

                                                It drove SPARC into irrelevance, forced Intel to adopt it instead of Itanium and drove PowerPC out of consumer computers. I'd love to have a nightmare like that on my resume.

                                                • kick 10 days ago

                                                  I was going to mention Itanium, but didn't want to go through the "But Itanium was terrible!!!!!" flamewar that happens every time it gets brought up.

                                                • tux1968 11 days ago

                                                  Where could I read more about this analysis? Genuinely interested to understand how x86_64 is such an albatross and more details about what could be accomplished in its absence.

                                                  • kick 11 days ago

                                                    Previous HN thread on (some of) the failures of x86 itself (it notes that most of this is not solved with x86_64):

                                                    https://news.ycombinator.com/item?id=276418

                                                    There's plenty of criticism of CISC in general, but that gets into flamewar status.

                                                    A lot of modern abstraction, vulnerabilities and inefficiency can be summed up with "x86_64 sucks to write for, so let's build a new (or recreate an old) abstract machine!"

                                                    In its quest to maintain fifty years of (near) compatibility with an architecture originally used in a calculator, the industry created a monster.

                                                    There's a reason Apple (and Sun and MIPS before it) was able to get competitive with Intel's chips despite only getting a silicon team like half a decade ago and using an architecture generally seen as low-performance: they ruthlessly removed cruft, cutting ties with backwards compatibility in the process.

                                                    Backwards compatibility is a scourge unto innovation and ease of use, as even Intel saw (IA64, for all of its faults, was better than x86_64 in virtually every way).

                                                    I have a lot more criticism of x86_64, and can get a lot more technical in that criticism, but this comment is already getting a bit on the heavy side.

                                                    • sharpneli 11 days ago

                                                      IA64 had a glaring fault that trumps all of its advantages. It was explicitly a VLIW design, so the user, or the compiler, had to manually pack multiple different kinds of ops into each single instruction.

                                                      Modern superscalar OoO is pretty much like that, except there is a piece of hardware internally that does it, almost like a hardware JIT. This freedom means both that the magical compiler doesn't have to exist and that processors can have varying numbers of execution units. See that you could actually use one more int ALU? Just add it, and even older software automatically gets the benefit.

                                                      • nl 10 days ago

                                                        Ironically the link you posted supports the opposite view: that x86_64 is ugly but that isn't really an issue. Quote:

                                                        > in summary: x86 is ugly (and below is why I think so) but we don't care because compilers enable us to just forget about what ISA we're using. This is a good thing. This is the way it should be. But it's also an example of what the OP was talking about -- bad hardware design (in this case the x86 ISA; the actual hardware is quite good) not mattering because software is sufficiently good

                                            • Razengan 11 days ago

                                              That is admirable and pleasantly enviable. Smart people like that being able to work on what they want at any organization they want makes everything better for everyone: the corporations, the users, and the world in general.

                                            • sliken 11 days ago

                                              Indeed, after working at Tesla for a while.

                                              While the core is great, I think the biggest contributors to the success are a few decisions outside of the design of the Zen core.

                                              One is an investment back in the Opteron days in an efficient, scalable, low-power, low-latency, high-performance serial interconnect. This is a key enabler for the new chiplet strategy that has paid off so handsomely for AMD. Now a single piece of silicon can scale from low-end Ryzen through the highest-end servers without having to divide their limited R&D budget across numerous different designs.

                                              Additionally, selling off GlobalFoundries enables them to pick the best fab per generation, something that Intel can't do, or at least hasn't done.

                                              So Intel had a setback at the fab, where the difficulty (chip yield) is increased by their larger chips, while AMD can switch fabs (they switched from GlobalFoundries to TSMC at 7nm) and make much smaller chips. The top-of-the-line Epyc chip has 9 chips inside (I/O + 8 CPU chiplets).

                                              This positions AMD particularly well for the future: they could now rev the I/O chip for DDR5, more memory channels, or any other performance tweak without having to re-engineer the CPU chiplets.

                                              It also doesn't require all the chip-related technologies to move in sync. The I/O chiplet is actually made on an older process than the CPU chiplets. If the PCIe 4 I/O chip had been late, AMD could have shipped a PCIe 3 I/O chiplet. If mid-cycle PCIe 5 becomes a must-have, AMD would have much less engineering to do to fix it.

                                              • snuxoll 11 days ago

                                                PCIe generations are the ONE thing AMD cannot easily roll back or advance out of cycle, since it's used as the interconnect between everything on the die.

                                                • sliken 11 days ago

                                                  Really?

                                                  The Infinity Fabric (the current generation, which evolved from HyperTransport) is a cache-coherent serial network that can also do PCIe (non-cache-coherently). This can be swapped on a per-serial-connection basis as needed.

                                                  Seems like they could rev the PCIe side without changing the IF side. The I/O chip has separate connections for off-chip, so they could use PCIe 5 there and not change the connections to any of the CPU chiplets.

                                                  At least it seems that way... corrections?

                                                  • wtallis 11 days ago

                                                    The PCIe and IF connections are basically two MACs sharing the same PHY, and the PHY is the hard part to speed up. There's not really a plausible way that AMD could end up in a situation where they would have reason to boost the speed of just one of PCIe or off-package IF. And the on-package IF PHY between the IO die and the CPU chiplets is easier to get working at a given speed than the off-package version (though I don't remember off the top of my head if they are still using two separate PHY designs on the current generation). So by the time they've got an upgraded IO die with PCIe 5/6/etc., it's pretty much guaranteed that they will be ready to bump up the IF link speed to match.

                                                • AndriyKunitsyn 11 days ago

                                                  Gosh, modern hardware is so interesting, yet I know almost nothing about it! What is a "serial interconnect"? Is it the thing that connects CPU cores?

                                              • zokula 11 days ago

                                                Jim Keller is not the person behind Zen; Mike Clark is!

                                                Mike Clark is the person who designed Ryzen. He is the guy who came up with the name Zen as well.

                                                https://www.statesman.com/business/20160904/amid-challenges-...

                                              • gok 11 days ago

                                                Jim is a legend. He led the AMD K8, which was the first real "oh shit" moment for Intel, and co-authored the x86-64 ISA. He bootstrapped the silicon teams for both Apple and Tesla.

                                                • abledon 11 days ago

                                                  It's like he's a demigod in a Neil Gaiman novel, having fun playing games in the mortal realm.

                                                  He creates the "oh shit" moment at company A regarding company B, then goes and works at company B to create an "oh shit" moment for company A.

                                                  • Itsdijital 11 days ago

                                                    In no way trying to bring politics or anything into this. Just a neat fact.

                                                    Jim's brother-in-law is Jordan Peterson. Yup, that Jordan Peterson. Small worldish thing I guess?

                                                    • tosser199 11 days ago

                                                      They all grew up in the same small Canadian town. Peterson managed to marry his high school sweetheart.

                                                      (Now these two posts look like a tabloid, speaking about personal lives.)

                                                  • WillPostForFood 11 days ago

                                                    Jim Keller recently spoke at UC Berkeley on the topic of Moore's Law not being dead. Very impressive guy.

                                                    https://www.youtube.com/watch?v=oIG9ztQw2Gc

                                                    • NotCamelCase 11 days ago

                                                      I like how some people pin the recent success of AMD on that one person, even if he doesn't claim it himself.

                                                      Is it because of not truly knowing what it takes to do VLSI nowadays, or just hoping that if one man did it, it can be done again?

                                                      • afiori 10 days ago

                                                        > even if he doesn't claim it himself.

                                                        To be fair, many people who are the single reason for a company's success do not claim it themselves.

                                                    • unfocused 11 days ago

                                                      Interesting timing for this article.

                                                      I'm in Canada, and for the first time in over a decade, I decided I couldn't pay the premium of $3200 (taxes included) for a Macbook Pro with 512GB SSD and 16GB of RAM.

                                                      I bought a Lenovo 2 in 1 touch laptop with lots of ports, and an AMD Ryzen 3700U with RX Vega 10 graphics, for $700 (taxes included).

                                                      The AMD Ryzen 3700U is just as good as the 8th Gen i5. Plenty for me at home. And with the money I saved, I can upgrade my NAS.

                                                      This combination of AMD + Windows + Lenovo has finally pushed me to try the world outside of Apple.

                                                      • corysama 11 days ago

                                                        Check out WSL. I’m of the opinion it was created specifically for web devs in your situation.

                                                          After all, what really is a MacBook to a web dev besides an off-the-shelf reliable laptop with a Unix terminal?

                                                        • snvzz 11 days ago

                                                            Similarly, I got a ThinkPad X395 with the 3700U.

                                                          On Linux (Arch's current kernel), everything* works.

                                                            *I haven't looked into the IR camera or the fingerprint reader, but the latter supposedly has a driver, as per the Arch Wiki's ThinkPad support page.

                                                          • ralusek 11 days ago

                                                              How's the trackpad? I have heard people say that they've found an Apple-comparable trackpad, but I have yet to encounter one.

                                                            • fyfy18 11 days ago

                                                                I switched to ThinkPads last year (first an X250, now a T470s) and this was my big concern too. I just sat down one day and spent an hour or so tweaking the various options for the trackpad driver until I got it how I was used to from Apple. I'd say it's 95% as good, and some things like dragging are in fact easier, as there are physical mouse buttons (as well as tap-to-click).

                                                              I have no idea why the defaults are so bad, but I've kind of got used to that as a Linux user ¯\_(ツ)_/¯
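
                                                                For a concrete starting point, the knobs I mean are the xf86-input-libinput options in xorg.conf; roughly this kind of thing, e.g. in /etc/X11/xorg.conf.d/30-touchpad.conf (an illustrative sketch with placeholder values, not my exact config):

                                                                    Section "InputClass"
                                                                        Identifier "touchpad overrides"
                                                                        MatchIsTouchpad "on"
                                                                        Driver "libinput"
                                                                        # Mac-like behaviour: tap-to-click and "natural" (reversed) scrolling
                                                                        Option "Tapping" "on"
                                                                        Option "NaturalScrolling" "true"
                                                                        # Pointer speed, range -1.0 to 1.0
                                                                        Option "AccelSpeed" "-0.2"
                                                                        # Keep the cursor from jumping around while typing
                                                                        Option "DisableWhileTyping" "true"
                                                                    EndSection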

                                                              • nsomaru 11 days ago

                                                                Can you detail your distro and the tool you used to tweak? What settings worked well?

                                                                • macintux 11 days ago

                                                                  > and some things like dragging are infact easier as there are physical mouse buttons

                                                                  Did you ever try the 3-finger drag on a Mac? I find it hard to imagine any button being easier than that.

                                                                  It’s a shame Apple hides it in Accessibility. That gesture is wonderful.

                                                                • floatboth 11 days ago

                                                                  Any Synaptics multi-touch trackpad feels as good as Apple. Apple hardware is nothing special.

                                                                  It's all about the software. Scrolling in Firefox on Wayland with the upcoming vsync patch (https://bugzilla.mozilla.org/show_bug.cgi?id=1542808) and OpenGL layer compositing enabled feels perfect. (Fun fact, I actually contributed the inertial scrolling on GTK patch to Firefox :D)

                                                                  If you run legacy software and your trackpad scrolling is delivered to the application as emulated scroll wheel events, don't be surprised that it doesn't feel right.

                                                                  • JetSetWilly 11 days ago

                                                                    Apple's trackpads are good (although overly large these days), but their keyboards are utterly horrific, so it depends what you prioritise. If you mainly consume media and browse the web, then Apple's input is optimised for that case.

                                                                    If you write or code or otherwise do real work, then something like a thinkpad has input better optimised around that.

                                                                    • nl 11 days ago

                                                                      Keyboards seem a pretty individual thing. I like Apple keyboards (even the touchbar ones), and think their standalone desktop keyboards are the best keyboards for programmers available.

                                                                      • heavenlyblue 10 days ago

                                                                        Apple keyboards either lead to a weird feeling in the finger joints or make you type on the keyboard as if you had a nail job with huge extensions.

                                                                        But if I had to pick a keyboard I never actually have to type on myself - I would pick the Apple keyboard 100% of the time.

                                                                    • SoreGums 11 days ago

                                                                      I use a $250 mouse + a $250 mechanical keyboard, as the laptop is docked... Pretty sure nothing compares to Apple's trackpads. At the same time, sitting hunched over a laptop on a desk is suboptimal too, yet many do it. Point is, for significantly less money a non-Apple device is most likely gonna get the job done and not be a disaster hardware-wise. Is the trackpad really worth $2,000?

                                                                      • omnimus 11 days ago

                                                                        What's the $250 mouse?

                                                                        Btw, not sure why you compare a trackpad to a mouse. People who commute or work from cafes need a good trackpad, and Apple's are the best. The trackpad itself costs much less, but I doubt people would use it as a mouse replacement in office situations.

                                                                        • macintux 11 days ago

                                                                          FWIW, I use Apple’s trackpad exclusively in office or on the road. The gestures and tap to click are irreplaceable for me.

                                                                      • unfocused 7 days ago

                                                                        Ok, so after using it for a couple of hours, the trackpad feels great! In fact, side by side they are identical in size with the MacBook Pro 2013. They clearly copied the size from Apple, but the feel is fine. The keyboard feels a bit different, but you get used to it.

                                                                        I also did not change any setting on the trackpad. This is out of the box.

                                                                        • unfocused 10 days ago

                                                                          It's shipping as I type this, and I should have it this week. I'll update this thread, since I've been using a MacBook Pro trackpad for 10+ years now, and give you a first impression.

                                                                      • jangid 11 days ago

                                                                        AMD doesn't lag because of hardware alone. Driver and library support on the software side is very weak. Intel and Nvidia spend huge amounts of money supporting library maintainers, thereby creating a lobby.

                                                                        For example, look into the list of supported GPUs in the GitHub repositories of the popular machine-learning libraries PyCharm, TensorFlow, etc. AMD and OpenCL are nowhere compared to Nvidia's CUDA.

                                                                        • jamesblonde 11 days ago

                                                                          ROCm has been upstreamed to TensorFlow 2.0 now and it is gaining traction on AI platforms:

                                                                          https://www.logicalclocks.com/blog/welcoming-amd-rocm-to-hop...

                                                                          Also, the performance of AMD GPUs for deep learning is improving (50% in the last 12 months through software alone). The Radeon VII ($600) is about the same performance as the Nvidia 2080 Ti ($1,100), and the Radeon VII can be used in a data center (Nvidia forces you to use Volta GPUs: $9,000 for a V100):

                                                                          https://cdn.oreillystatic.com/en/assets/1/event/299/ROCm%20a...

                                                                          • Symmetry 11 days ago

                                                                            Has this resulted in any increase in uptake of their GPUs? I'm usually the one to praise The Economist for covering technical subjects without embarrassing missteps but this:

                                                                            > Its GPUs—which provide 3D graphics for video games and, increasingly, the computational grunt for trendy machine-learning algorithms—go up against those from Nvidia, whose revenues last year of $11.7bn were nearly twice those of AMD.

                                                                            seems wrong to me. They've been doing perfectly well in HPC tasks in supercomputers. And their CPUs have been doing great. But as far as I'm aware most machine learning work that moves away from NVidia moves to custom silicon rather than AMD GPUs. For the sake of openness I'd very much like AMD to start becoming a more popular option though.

                                                                          • lone_haxx0r 11 days ago

                                                                            On the other hand, Nvidia's Linux drivers are blobs (that don't even support Wayland) while AMD and Intel actively contribute to Mesa, and treat their users well (or at least much better than nvidia).

                                                                            • chousuke 11 days ago

                                                                              I look forward to the day that someone pulls the rug from under Nvidia like AMD did to Intel.

                                                                              I'm not sure if it will happen, or when, but damn would it be satisfying.

                                                                              • Erlich_Bachman 11 days ago

                                                                                There have been numerous situations just this year where I've stumbled on problems from having an AMD card, whereas Nvidia with their "blobs" has worked just fine. Gaming, video editing, even ML: they all seem to work perfectly on Linux with Nvidia, and AMD always has some problems. So at the very least this is contested. Nvidia seems much more supportive of the open source OS.

                                                                                • rhn_mk1 11 days ago

                                                                                  My experience in research taught me that Nvidia creates a hostage-like situation. CUDA being a standard, it locked us into the proprietary driver and into overly expensive GPUs (or struggling to source used Titans) for double-precision computation. With the AMD stack not supporting CUDA at the time, their cheaper, more powerful double-precision GPUs felt like forbidden fruit.

                                                                                  To top this off, the proprietary driver gave us permanent issues with OS integration, testing, and deployment, and since it was distributed out of band, it was really difficult to get every developer on the same version. Each version of the GPU compiler had different and conflicting quirks, increasing the chaos of which developer/researcher used which branch of our software.

                                                                                  I would not recommend tying any serious work to NVidia hardware to any team.

                                                                                  • yulaow 11 days ago

                                                                                    You don't have a laptop with an Nvidia Optimus card, do you? It's a shitshow of never-ending problems with Nvidia, while every AMD laptop solution works as intended instantly after OS installation.

                                                                                    • tankenmate 11 days ago

                                                                                      I have had exactly the opposite experience: issues with Nvidia cards/drivers that I couldn't fix. Over the last 5~10 years AMD support on Linux (and other open source OSes) has been much better, to the point that for some of my workloads AMD on Linux is faster than AMD on Windows.

                                                                                      • garaetjjte 11 days ago

                                                                                        That's not surprising; AMD's OpenGL drivers for Windows are horribly slow.

                                                                                    • toma1k 11 days ago

                                                                                      There is one advantage to Nvidia's proprietary driver: you get support for new cards on day one. With AMD/Intel it takes a few months for the drivers to be in good shape.

                                                                                      • clarry 11 days ago

                                                                                        I'm pretty sure amdgpu has supported AMD's recent cards (from the past few years) before they hit the shelves. Same for Intel. It's always fun seeing the commits adding support for chips that you can't buy yet. And I've bought some very bleeding-edge hardware from both companies in the past few years, with good results OOTB on Linux.

                                                                                    • Itsdijital 11 days ago

                                                                                      It's crazy how AMD still hasn't fixed its long-standing driver issues. They've been a weak point for years and still are.

                                                                                      I'm looking at their new cards but holding off, because people are saying the only way to avoid crashes and black screens is to install on a fresh Windows install. Yeah, no thanks.

                                                                                    • Erlich_Bachman 11 days ago

                                                                                      For a random person looking for those libraries: they meant *PyTorch. PyCharm is a Python IDE used for any Python project.

                                                                                      • Zardoz84 11 days ago

                                                                                        Looks like Nvidia should put more money into its drivers for Linux. They are a piece of crap.

                                                                                        • gameswithgo 11 days ago

                                                                                          This is true for their GPUs but is a non-issue for their CPUs.

                                                                                        • arcanus 11 days ago

                                                                                          "For now, AMD’s resurgence is good news for consumers, IT departments, cloud-computing firms and anybody who uses software. Like any good monopolist, Intel charges a steep price for its products—unless AMD is doing well. Sure enough, Intel’s newest set of desktop chips, due in November, are some of its thriftiest in years."

                                                                                          • puranjay 11 days ago

                                                                                            I just built a new PC with Ryzen 3700x. I use it mostly for music production. Fantastic performance so far. 10 instances of Serum and my CPU doesn't even cross 15%.

                                                                                            Moreover, I feel like I'm buying for the future with Ryzen. Intel is going to change its architecture after the 10th gen, so buying a 9th-gen Intel right now seems like throwing money at a dead end.

                                                                                            • pier25 11 days ago

                                                                                              Same. My new 3700x build can push dozens of Diva instances.

                                                                                              If you want to see more details: https://vi-control.net/community/threads/my-ryzen-3700x-buil...

                                                                                              • puranjay 11 days ago

                                                                                                That's eerily similar to my build, down to the Noctua fan

                                                                                                • tuananh 10 days ago

                                                                                                  Ryzen 3000 benefits a lot from faster RAM. You may consider upgrading.

                                                                                                  • pier25 10 days ago

                                                                                                    Depends on the type of workload.

                                                                                                    In Cinebench or Blender rendering it makes practically no difference.

                                                                                                    I upgraded my RAM from 2133 to 3200MHz and it made no difference for DSP audio processing either.

                                                                                                    • tuananh 9 days ago

                                                                                                      Yep, it depends on the workload. CB20 doesn't scale well with memory.

                                                                                              • yumario 11 days ago

                                                                                                From the game-theory point of view, I think it's better if one company is the underdog. Think about it: if one company is the underdog, that company has a lot to gain by competing, while the other has a lot to lose if it doesn't compete. Therefore we get competition. Now, the more equal the companies' market shares, the greater the risk and the smaller the reward for competing... A better strategy would be not to undercut your competitor and instead divide the market share. Which leads to stagnation.

                                                                                                Do people here think this sounds reasonable?

                                                                                                Edit: Mathematically the argument would be as follows:

                                                                                                Consider two companies, A and B. A has market share 'a' and B has 'b'. n is the total market. Then a + b = n.

                                                                                                A's reward for competing will be n - a = b.

                                                                                                A's risk for competing will be a (its remaining market share).

                                                                                                A will compete as long as the reward is greater than the risk.

                                                                                                This will reach an equilibrium at a = b.
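
                                                                                                Edit 2: the same toy model in a few lines of Python, if that helps (same simplifying assumptions as above: A's reward for competing is the competitor's share, its risk is its own share, nothing else priced in):

                                                                                                    n = 100                       # total market
                                                                                                    for a in range(10, 100, 10):  # A's market share
                                                                                                        b = n - a                 # B's market share
                                                                                                        reward, risk = b, a       # gain if A wins, loss if it fails
                                                                                                        # A competes only while reward > risk, i.e. while a < b
                                                                                                        print(a, b, reward > risk)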

                                                                                                • SmirkingRevenge 11 days ago

                                                                                                  Take it for what you will, but an Intel engineer relayed to me (many years ago, in the aftermath of the Microsoft antitrust trial) that they attempt to ensure that AMD maintains a certain level of competitiveness as a hedge against antitrust. Sometimes that involves actually helping AMD via partnerships or technology sharing if they are struggling too much; other times it means giving them a swift kick to the crotch if they are gaining too much ground. It may have been BS, it may not have been, and much has certainly changed over the years... but...

                                                                                                  AMD has always been nipping at Intel's heels, for 20+ years now, never really losing or gaining too much to pose a real threat. Yet we've seen how ruthlessly Intel will snuff out potential contenders (such as Transmeta, RIP), so it does kind of make you think there's something to it.

                                                                                                  • Slartie 11 days ago

                                                                                                    This was the reason for me to still buy AMD shares when they were deep underwater. I considered them basically immortal, because if the company ever were in existential danger, Intel would have a huge incentive to stage some kind of indirect rescue operation (not an outright purchase, which would kill the goose, but something else that would surely prop up the AMD share price), because the monetary value of AMD as antitrust insurance was easily much higher than AMD's market cap.

                                                                                                    Sold those shares way too early in hindsight, but still got a good return out of that thinking, and if AMD ever gets into trouble again, I won't hesitate to apply the same logic.

                                                                                                    • georgeburdell 11 days ago

                                                                                                      This actually would mesh pretty well with that one Kaby Lake-G NUC with Radeon graphics inside, released in 2017 and probably rooted in 2016, when AMD's stock was at $3: https://www.tomshardware.com/news/intel-discontinue-kaby-lak...

                                                                                                      • wtallis 11 days ago

                                                                                                        There's a simpler explanation for why Intel used AMD GPUs for a while: Intel's new integrated GPU design couldn't ship until they sorted out their 10nm issues, and their older design wasn't competitive. When Intel went shopping for GPUs on the open market, they could get them cheaper from AMD than from Nvidia (though the HBM2 requirement was a clear downside). They actually paired AMD GPUs with both 14nm Kaby Lake and their failed 10nm Cannon Lake processors that had broken integrated GPUs. The short-lived Intel/AMD GPU partnership came to an end because Nvidia's lead over AMD got too big, but it was doomed to be cancelled as soon as Intel got 10nm working.

                                                                                                        • lonelappde 10 days ago

                                                                                                          What is Nvidia's lead over AMD? Intel's non-AMD GPUs are Intel GPUs.

                                                                                                          Do Nvidia integrated GPUs (motherboard GPUs) even exist?

                                                                                                          Nvidia claims they exist, but all their weblinks to details are dead. https://www.nvidia.com/object/main_mobo_gpus.html

                                                                                                          • wtallis 10 days ago

                                                                                                            Nvidia and AMD both make discrete mobile GPUs, and those are the only two options for offering better GPU performance on an Intel laptop when Intel's own GPU is inadequate. Nvidia's GPUs have for years generally had a substantial power-efficiency advantage over AMD's.

                                                                                                      • nickflood 11 days ago

                                                                                                          That's what everyone does now: Microsoft Teams versus Slack, Facebook versus Snap. Nobody wants to be hit as hard as Microsoft was.

                                                                                                        • agumonkey 11 days ago

                                                                                                            it's been said enough times for me to remember it: yeah, not killing AMD was a benefit, but so was keeping their head under the water

                                                                                                          • SmirkingRevenge 11 days ago

                                                                                                            If it turned out to be actually true... I wonder sometimes if it makes a better example of antitrust policy actually working to some degree... or failing.

                                                                                                            • agumonkey 11 days ago

                                                                                                                yeah, it's working, but within its limits...

                                                                                                        • roenxi 11 days ago

                                                                                                          Ease of entry into the market is much more important than the present market structure. An underdog can become the market leader in a year or two [0]. The issue with Intel/AMD is that there are only something like 4 legal entities out of >7 billion legal entities on earth who are licensed to produce x86 chips. 4 is better than 1, but it is still a low number.

                                                                                                            It is nearly impossible to maintain an oligopoly in a market that is easy to enter, and questions of underdog/overdog become irrelevant. All companies have to offer a reasonable (value/$) proposition to customers or they go broke.

                                                                                                          [0] Poor Nokia - http://www.asymco.com/wp-content/uploads/2012/02/Screen-Shot...

                                                                                                          • wincy 11 days ago

                                                                                                            Is there a time in the future that x86 related patents will expire?

                                                                                                            • Dylan16807 11 days ago

                                                                                                              Sure! For actual x86, they already have. You can make a perfectly good 32-bit chip with SSE2.

                                                                                                              The patents on the core of x86_64 will expire in a couple years. Even if you can't have the more recent vector instructions, that's pretty good for a lot of use cases.
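
                                                                                                                  Rough arithmetic behind that timeline: US utility patents run 20 years from filing, and AMD published the x86_64 spec in 2000, so the earliest core filings (roughly 1999-2000) lapse around 2019-2020, with later ones trailing into the early 2020s.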

                                                                                                              • nazgulnarsil 11 days ago

                                                                                                                  Each of those 4 entities has whole teams devoted to building up a war chest of patents + colluding on standards to keep the balls in the air indefinitely.

                                                                                                            • airza 11 days ago

                                                                                                              This basically describes the argument for antitrust regulation

                                                                                                              • porknubbins 11 days ago

                                                                                                                Possibly some kind of tacit collusion is more likely with relatively similarly sized competitors. But to me a giant vs underdog situation is usually worse because in the real world I don’t think the underdog usually comes from behind and wins. In fact I’m amazed by the number of times AMD seems to have accomplished this with a fraction of Intel’s resources.

                                                                                                                • WrtCdEvrydy 11 days ago

                                                                                                                  It feels backwards... the closer the companies are, the more there is to lose from not competing... but you do get commoditization. I wonder what a world with AMD being the leader and Intel being the underdog would look like.

                                                                                                                  • yumario 11 days ago

                                                                                                                      Consider two companies, A and B. A has market share a and B has market share b. n is the total market, so a + b = n.

                                                                                                                      A's reward for competing will be n - a = b.

                                                                                                                      A's risk for competing will be a (its remaining market share).

                                                                                                                      A will compete as long as the reward is greater than the risk.

                                                                                                                    This will reach an equilibrium at a = b.
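
                                                                                                                      A minimal sketch of that model in Python, with one toy assumption of mine bolted on (competing transfers a small slice of share to whichever side still sees reward > risk):

                                                                                                                        # Toy simulation of the model above; the 0.01 "slice" is invented.
                                                                                                                        n = 1.0                # total market
                                                                                                                        a, b = 0.2, 0.8        # initial shares of A and B (a + b = n)
                                                                                                                        slice_ = 0.01
                                                                                                                        while abs(a - b) > slice_:
                                                                                                                            if b > a:          # A's reward (b) exceeds its risk (a), so A competes
                                                                                                                                a, b = a + slice_, b - slice_
                                                                                                                            else:              # symmetric case for B
                                                                                                                                a, b = a - slice_, b + slice_
                                                                                                                        print(round(a, 2), round(b, 2))   # -> 0.5 0.5, the claimed a = b equilibrium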

                                                                                                                    • foota 11 days ago

                                                                                                                        I understand this is an armchair discussion about economics, but this is wildly oversimplified. What does it even mean to "compete" in this case? Obviously if either Intel or AMD stopped researching entirely, they would quickly fall by the wayside, as long as the other still posed a credible threat of being able to innovate.

                                                                                                                      I would say that your a and b could be like, levels of investment into researching. They'll research at some level, or risk falling too far behind, but can't spend too much as then they'll have to divert funds from other things like marketing or production, or just run out of money. They'll both likely choose to invest at a pace where they think they'll be able to match the other's innovation, but not so much as to overspend.

                                                                                                                      • yumario 11 days ago

                                                                                                                          Yes, it's oversimplified. Suppose that a = b, i.e., both companies have the same market share. Also suppose that when both companies have equal market share, their innovation rates, prices, etc. are the same. The model has two equilibria: either both companies compete, in which case their market shares fluctuate around a = b and we get a sort of predator-prey model[1], or both companies do not compete and their market shares stay the same.

                                                                                                                          What does it mean not to compete? It could mean many things, like not lowering prices, delaying innovations until the competitor has released their own, etc.

                                                                                                                        [1] http://ccnmtl.columbia.edu/projects/seeu/dr/restrict/modules...
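
                                                                                                                          For reference, the linked predator-prey model is the textbook Lotka-Volterra system; a minimal Euler integration, with parameters invented purely for illustration:

                                                                                                                            # dx/dt = alpha*x - beta*x*y ("prey"), dy/dt = delta*x*y - gamma*y ("predator")
                                                                                                                            alpha, beta, delta, gamma = 1.1, 0.4, 0.1, 0.4
                                                                                                                            x, y = 10.0, 10.0
                                                                                                                            dt = 0.001
                                                                                                                            for step in range(50_000):
                                                                                                                                x, y = x + (alpha * x - beta * x * y) * dt, y + (delta * x * y - gamma * y) * dt
                                                                                                                                if step % 10_000 == 0:
                                                                                                                                    print(f"t={step * dt:5.1f}  x={x:6.2f}  y={y:6.2f}")
                                                                                                                            # The two populations cycle around a fixed point instead of settling
                                                                                                                            # (modulo Euler drift) - the "fluctuates around a = b" behaviour above.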

                                                                                                                        • RussianCow 11 days ago

                                                                                                                          The issue with this is the scenario where either company deviates from the status quo. For example, company A decides to make a tradeoff: invest more into R&D at the expense of sales and marketing. If Company B remains the same, company A may suffer a temporary dip in sales, but in exchange, their product becomes better over time and they are able to take more than the original 50% of the market due to having a superior product.

                                                                                                                          What you're talking about really only works if both companies agree to stay stagnant and collude to keep the status quo as-is. To be fair to your point, this has happened a few times historically, but it is usually considered price fixing and is very illegal[0].

                                                                                                                          [0]: https://en.wikipedia.org/wiki/Price_fixing#United_States

                                                                                                                          • yumario 11 days ago

                                                                                                                              I think a clearer example would be telecom companies. Say A and B are telecoms with equal market share. A could "compete" in an attempt to gain market share and roll out gigabit bandwidth, but this will only cause company B to retaliate and install gigabit bandwidth as well. The market share and revenue will therefore fluctuate back to equal, but both companies will have lost the money involved in installing the higher bandwidth. So if market share is equal, the best strategy is "tit for tat", i.e. wait until your opponent does something. That has two equilibria: either constant tit for tat, or waiting for the opponent to make a move.

                                                                                                                              When one company's market share is smaller than the other's, it is always better for it to "invest", i.e. compete.
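
                                                                                                                              To make the tit-for-tat intuition concrete, here is the story as a one-shot 2x2 game in Python (all numbers invented):

                                                                                                                                # If one side builds gigabit and the other waits, it grabs share; if
                                                                                                                                # both build, shares snap back to equal and both eat the build-out cost.
                                                                                                                                COST, GRAB = 0.2, 0.3

                                                                                                                                def payoffs(a_builds, b_builds):
                                                                                                                                    a, b = 0.5, 0.5
                                                                                                                                    if a_builds and not b_builds:
                                                                                                                                        a, b = a + GRAB, b - GRAB
                                                                                                                                    if b_builds and not a_builds:
                                                                                                                                        a, b = a - GRAB, b + GRAB
                                                                                                                                    return a - COST * a_builds, b - COST * b_builds

                                                                                                                                for a_builds in (0, 1):
                                                                                                                                    for b_builds in (0, 1):
                                                                                                                                        print(a_builds, b_builds, payoffs(a_builds, b_builds))
                                                                                                                                # Each side's best reply to "wait" is "build" (0.6 > 0.5), so one-shot
                                                                                                                                # play lands on mutual building (0.3, 0.3) even though mutual waiting
                                                                                                                                # (0.5, 0.5) is better for both - a prisoner's dilemma, which is why a
                                                                                                                                # repeated-game tit-for-tat strategy is what sustains the waiting outcome.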

                                                                                                                            • RussianCow 11 days ago

                                                                                                                              There is always an advantage to being first; if there wasn't, nobody would ever invest in anything new. Even in your example, the first company to bring gigabit internet to an area can secure some contracts that will still be in place when their competitors respond, so they get a head start when that does happen, which can lead to a longer term market share advantage if they can keep the momentum going. Even if that advantage doesn't last forever, it certainly makes them a lot of money in the meantime.

                                                                                                                              See: Uber vs Lyft, the iPhone (and, historically, many other Apple products), Coca-Cola, Netflix, etc.

                                                                                                                              • yumario 11 days ago

                                                                                                                                  I don't think that's the heart of the question. My question is: is it better for innovation if two companies have equal market share, or if one has a smaller market share? I'm trying to argue that it is better if one company has a smaller market share.

                                                                                                                                  Case 1: Netflix. Netflix caused innovation in the movie rental industry. But when Netflix first began, it had a much smaller market share compared to Blockbuster. Would Netflix innovate again? Sure, but I doubt it would do anything revolutionary again. Most likely it will grow stagnant once the new market stabilizes between Hulu, HBO, Disney, Amazon, etc.

                                                                                                                                  Case 2: Apple. Apple was the underdog in the early 2000s, and that caused them to innovate, while Microsoft had grown stagnant. Today Apple and Microsoft sort of have similar market shares and they don't really compete with each other anymore. Microsoft shifted to the cloud and dropped Windows Phone. Apple keeps doing what they are doing, with marginal upgrades to iPhones and Macs. I don't really expect them to come up with another iPhone-level innovation any time soon.

                                                                                                                                  Case 3: Amazon. No big company is really trying to compete with Amazon these days. I don't see Google or Facebook coming up with their own online stores. It's just not worth it to compete, though there are a lot of smaller online stores.

                                                                                                                                  Case 4: The automobile industry. Sure, there are new car models each year... but nothing revolutionary, just marginal upgrades over last year's model. It was not until an underdog (Tesla) tried to gain market share that we saw any sort of innovation from them.

                                                                                                                                  Most of the innovation today happens at smaller companies, and they eventually either succeed and become the next Apple or Google, fail and go out of business, or get bought by the big companies.

                                                                                                                                • RussianCow 11 days ago

                                                                                                                                  > My question is is better for innovation if two companies have equal market share or if one has a smaller market share?

                                                                                                                                  If that's your question, then the answer is obvious, isn't it? The higher the potential reward, the more motivation to innovate. If you only control 1% of the market, innovating might easily mean a 1000% growth of your company; but if you already own half, the best you can possibly do is double that. The upside just isn't there to justify big, risky plays.

                                                                                                                                  That said, there is always a desire to grow, even if not by as much, so there is still incentive to not become stagnant and to continue making improvements to your products, even if marginal.
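
                                                                                                                                    Putting rough numbers on that asymmetry (my arithmetic): the maximum upside factor from competing is n / a. At a = 1% of the market that is 100x; at a = 50% it is only 2x, while the downside is the a you already hold, so the risk/reward ratio worsens as a grows.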

                                                                                                                                  • yumario 9 days ago

                                                                                                                                      Yeah, when Threadripper came out, I believe it offered the same performance as Intel's i9 for about half the price. Why did AMD choose such aggressive pricing? Because they have the desire to grow. If Intel and AMD had the same market share, it would not make sense to put out a product for half the price of your competitor's. So if Intel and AMD had the same share, we would probably still be paying $1000+ for Intel's i9.

                                                                                                                  • Camas 11 days ago

                                                                                                                    This is gibberish.

                                                                                                                    • zanny 11 days ago

                                                                                                                              Regardless, AMD would have to grow by an order of magnitude to be "on Intel's level". They have a long way to go to stop being an underdog.

                                                                                                                      • heavenlyblue 9 days ago

                                                                                                                        This is as simplified as saying "racism needs to exist because every other race is trying to take over my genes".

                                                                                                                      • sliken 11 days ago

                                                                                                                                Nipping at the heels, market-share-wise.

                                                                                                                                Crushing it on the actual CPU performance side. The new Epyc Romes are pretty amazing chips compared to Intel's warmed-over Xeon 82xx series.

                                                                                                                        • tyingq 11 days ago

                                                                                                                          Assuming they follow on with a credible laptop CPU, AMD is a clear BIG underdog winner, and Intel has some reflecting to do. It's a pretty unique moment in time.

                                                                                                                          • big_chungus 11 days ago

                                                                                                                            It's been great for consumers, though. This ought to be a textbook illustration of how serious competition can drive a net increase in surplus, especially for consumers. I'd guess anyone who follows hardware, even from a distance, would agree. Finally broke the four-core "ceiling" intel imposed, too. I don't know if AMD will stay in front long-term (intel's got some nice stuff coming out on new nodes), but that doesn't matter. If intel is forced to put some work in and pull ahead again, still better.

                                                                                                                            Interestingly enough, I'm hoping that intel does the same to nvidia in the GPU space. Nvidia is still the choice if you want the best performance, and cuda is the standard. That might change (and prices might drop) if intel's cards end up being good. Fingers crossed.

                                                                                                                            • topmonk 11 days ago

                                                                                                                                    Wouldn't we all fare better if those doing the duplicate work of researching for their individual companies were instead able to work together on a single design? If, instead of two corporations wasting resources fighting each other for market dominance, we were all only allowed to work for the government, tasks like these could be performed in a way that nothing was wasted; not only that, but we would all work for the greater good of the people rather than the profit of a few.

                                                                                                                                    Such a configuration would surely beat the pants off of any capitalist environment, regardless of the regulatory structure imposed. Not to mention that, by having goals that put the people first, we would end up with a society of better-served, happier people, where empathy was the rule rather than the exception.

                                                                                                                              • timlatim 11 days ago

                                                                                                                                Intel's current situation with the 10nm node actually illustrates why having separate designs is a net benefit for the industry and the consumer. It was supposed to be delivered earlier, but due to a design/planning failure (https://www.tomshardware.com/news/intel-ceo-cpu-shortage-com...), it was stuck in development and kept Intel from rolling out their new architecture (Ice Lake) until recently.

                                                                                                                                What prevents a government-led project from hitting the same problem? Without serious competition from AMD, I'm quite sure there would be no price reductions and we'd be paying more for inferior products.

                                                                                                                                • tyingq 11 days ago

                                                                                                                                  Nowhere near where it was in the 90's. All of the various RISC players regularly leapfrogged each other. There was a pretty mad rush to buy, too, because you often ran software that charged "per CPU". So there was a free software upgrade of sorts.

                                                                                                                                  Oracle tried to beat that system with pricing tied to the clock frequency. That approach was kind of funny...they were lining their pockets with effort that other companies were putting forth.

                                                                                                                                  • topmonk 11 days ago

                                                                                                                                    The person in charge could have the same philosophy as you, and after seeing several reports of non-progress, invite some other people with alternative ideas to work in tandem.

                                                                                                                                    There is nothing wrong with trying different approaches in parallel, but why does it have to go all the way from cradle to market that way? At a certain point it would be obvious which design was the better one, and then the group who was working on the design that proved inferior could then join the more successful group and double their productivity. If it was possible to cut losses way ahead of the curve, imagine what could have already been achieved.

                                                                                                                                    • jl6 11 days ago

                                                                                                                                      It’s still not obvious that Intel’s R&D is inferior (late, sure, but it hasn’t killed the company yet), and ultimately the market is the best judge of which is the best design.

                                                                                                                                      • Symmetry 11 days ago

                                                                                                                                            It's not really superior or inferior, but different approaches with different levels of risk. Intel invests lots of engineer hours tying their layouts very intimately together with the particular process they're using, to get out every scrap of performance. AMD does their best to modularize their design at every level, both to not spend engineering hours they don't have and to let them move to whatever process node they need to.

                                                                                                                                        When Intel's fabs were doing well they were dominant. When they ran into problems they were stuck in a way AMD wouldn't have been if TSMC had problems.

                                                                                                                                        • topmonk 11 days ago

                                                                                                                                          > ultimately the market is the best judge of which is the best design.

                                                                                                                                          This is just bias. How can you know what the ultimate best judge of anything is? You would have to know what all the alternatives that could ever be conceived would be.

                                                                                                                                                And I can prove you wrong more directly. If the market is the best judge of the best design, why is so much money spent on advertising? If a better-designed product with less advertising always won against a worse-designed product with better advertising - since "ultimately the market is the best judge of which is the best design" - then there would be no point spending money on advertising at all.

                                                                                                                                          • iguy 11 days ago

                                                                                                                                            Re "ultimately the market" most people would point to 20th C experiments with command economies here. The hope was as you describe, a benevolent leader trying different approaches until one was clearly superior, and avoiding all the cost & waste of competitors doing the same work twice, not to mention the waste of advertising.

                                                                                                                                            And it didn't work very well, even on purely economic grounds. Cars are a prototypical example to think about, and they were spectacularly awful for decades. It turns out the benevolent leaders are good at figuring out what's good for their careers, and less interested in making useful products (meaning safe, reliable, fast, efficient, etc.) Even in the west there was too little competition, but it turned out that duplicating almost everything GM did independently in Japan was a great investment to be making.

                                                                                                                                                  Another way to say this: if 2 competitors each develop the thing, then in the worst case they spend twice as much money as in utopia. But we can't get to utopia. And in situations where there is only one option (be it enforced by the state, or just a monopoly), it's easy to go wrong by a much larger factor than 2. If you don't like cars, think about drug companies, or academic publishers, who seem happy to increase prices by 10 or 100 times if they can get away with it.

                                                                                                                                    • mamurphy 11 days ago

                                                                                                                                            The problem is incentives. If you are the only chip in town, who cares if you go for a long lunch and aren't that creative in pushing performance limits? Capitalism creates winners and losers, and creates incentives for people to try their best to not be on the losing team.

                                                                                                                                      • topmonk 11 days ago

                                                                                                                                              The problem is not incentives, but culture. From a view purely based on incentives, there is no reason to support your grandmother, for example, rather than kicking her out in the cold. Only our culture stops us from treating our closest family members this way.

                                                                                                                                              If we expanded what we thought of as "us" from just the individual unit, however small that might be (the familial unit for some, just the individual self for others), to include the entire country and the government running it, there is no telling what we could achieve.

                                                                                                                                        • depressedpanda 10 days ago

                                                                                                                                          > If we expanded what we thought of as “us” from just the individual unit, however small that might be [...], to include the entire country and government running it, there is no telling what we could achieve.

                                                                                                                                          Please consider that people have historically tried to achieve the Utopia you describe, where everyone is expected to put the State before the individual. It degenerates into totalitarianism, since forceful coercion and defining a "them" to the "us" is necessary for that to happen.

                                                                                                                                                I much prefer the arrangement of putting more emphasis on the individual, and money being wasted on duplicated R&D, than what you seem to be arguing for.

                                                                                                                                          • mongol 11 days ago

                                                                                                                                            I would say empathy plays a stronger role than culture in your example.

                                                                                                                                        • itronitron 11 days ago

                                                                                                                                          you need a diversity of opinion in order to make meaningful progress and that is more likely to happen when there are different companies

                                                                                                                                          • lliamander 11 days ago

                                                                                                                                                  An under-appreciated point. All hierarchies impose some degree of conformity, whether they are corporate or government. The market is in fact a very non-hierarchical structure, and provides the necessary space for creativity.

                                                                                                                                            • topmonk 11 days ago

                                                                                                                                              If you are able to recognize that diversity of opinion is desirable, why can't the person in charge recognize this, and create groups that work on different ideas in tandem?

                                                                                                                                              Once the ones that are successful bear fruit, those that are working on the failures can be moved over to the successful groups. And instead of the current environment, where there is a lot of wasted effort allowing an inferior product to go all the way to market to languish there, the decision can be made much earlier than that, saving all those wasted resources.

                                                                                                                                              • itronitron 11 days ago

                                                                                                                                                That has worked at companies but it tends to not be a stable state for them long-term. Hewlett-Packard (HP) is the best example I can think of that started off as an innovator and then got re-orged into a walking corpse.

                                                                                                                                                I think it is a result of human nature responding to the incentives provided by their company, their education, and culture. Most people, if hired to lead the work on an idea, will become upset or stressed if they learn of another group at the company that is also focused on the same area.

                                                                                                                                                Companies can adapt their culture and organizational model to create an ecosystem in which diverse efforts can develop at their natural pace. However if that was easy to do well then there would be no need for companies to hire external design agencies.

                                                                                                                                                At the scale of chip manufacturers such as Intel and AMD they probably feel that they are considering many diverse opinions, but each company has its view and once they decide on the path forward they will cut any staff that are not contributing directly to that path (or at least AMD will.)

                                                                                                                                                • lonelappde 10 days ago

                                                                                                                                                  The problem is how to adequately choose the person in charge. Adam Smith wrote a book that explains this pretty well.

                                                                                                                                                        Your idea is appealing in basic theory, for sure, but a few thousand years of experiment and a more sophisticated theory of actual human nature disprove it.

                                                                                                                                              • mika9090 11 days ago

                                                                                                                                                      I also suggest that those people charged with designing this new unified CPU should have a progress quota. And if they do not meet this quota, they will be sent to a labour camp. This will surely speed things up!

                                                                                                                                                      And just to be sure all is fair, you will be in charge of defining this quota.

                                                                                                                                                • sharpneli 11 days ago

                                                                                                                                                        In that case we would likely be stuck with Bulldozer-era AMD, or with Intel as it is now. With no competition allowed, the incumbent stays lazy.

                                                                                                                                                        The fault lies with human organizations in general. Everything stagnates, and there will always be a boss saying "We have always done it this way." Only by coming from outside can one get around these issues.

                                                                                                                                                        In the '50s or so, the US was actually worried that what you describe might come to pass, and that communist society would trash capitalism because factories would share innovations instead of competing with them.

                                                                                                                                                  For some reason the innovations didn’t really come.

                                                                                                                                                  • monkeywork 11 days ago

                                                                                                                                                    did you forget your /s ?

                                                                                                                                                • CamouflagedKiwi 11 days ago

                                                                                                                                                  Not that unique; the early Athlon64s were miles ahead of Intel, who'd gone a long way down the wrong road at the time. This feels very similar, but doesn't bode that well for AMD if they can only do this once a decade.

                                                                                                                                                  • dtech 11 days ago

                                                                                                                                                    Those 2003 (AMD Opteron) - 2007 (Intel Core 2) years were some of the most successful years for AMD, and it continued for a little while after until they couldn't keep up anymore.

                                                                                                                                                          If the result of this is that AMD gains power and market share for a few years, and then Intel drives another major leap in CPU quality, I'd consider that pretty good.

                                                                                                                                                    • gscott 11 days ago

                                                                                                                                                            CPUs are now lasting a lot longer... a top-end CPU is likely to still be fast enough after 6 years, maybe longer. Catching up to Intel and surpassing them may be good enough for a whole decade.

                                                                                                                                                      • clarry 11 days ago

                                                                                                                                                        Dunno, my Ryzen 7 1800X was relatively "top end" when I got it on launch day, and I'm already itching for 16 cores & faster RAM. I can't imagine using my T470p for work for five more years, it's already feeling horrendously slow.

                                                                                                                                                      • lonelappde 10 days ago

                                                                                                                                                              If you don't care about electricity/thermals, mid-tier 95 W quad-core chips from 2011 are still plenty fast. If you do care about energy, then Intel gets better every year, with 5 W TDP-down i7 chips coming out.

                                                                                                                                                      • tyingq 11 days ago

                                                                                                                                                        I feel like that's less than fair. AMD is a much smaller company with fewer resources.

                                                                                                                                                        • adventured 11 days ago

                                                                                                                                                          > AMD is a much smaller company with fewer resources.

                                                                                                                                                          To put that point into context, Intel's operating profit in just its most recent quarter ($6.55b) is greater than AMD's revenue over the last four quarters combined ($6b).

                                                                                                                                                          • CamouflagedKiwi 11 days ago

                                                                                                                                                            I'm not trying to be harsh; in both cases I think it's impressive what they did. Just saying it sucks a bit that in between Intel inevitably seems to get back in front.

                                                                                                                                                            • simion314 11 days ago

                                                                                                                                                              >Just saying it sucks a bit that in between Intel inevitably seems to get back in front.

                                                                                                                                                                    What sucks is that Intel used illegal means to fight AMD, so despite AMD having the better CPU, OEMs were not using them in their builds.

                                                                                                                                                        • twotwotwo 11 days ago

                                                                                                                                                          I'm interested to see it but I expect it'll be just a noticeable step up, not a blockbuster. The process and focus on IPC seem like good things for power consumption. But it seems like there's a ton of work Intel's done to optimize sleeps/twiddle clock freq/etc. for the thin-and-light laptop use case, and I'm not sure AMD is close to catching up with that.

                                                                                                                                                          They could always surprise me. (TSMC does fab power-sipping mobile chips too!) I just don't have super high expectations.

                                                                                                                                                          • atq2119 11 days ago

                                                                                                                                                            It's a good thing that the next Surface is going to use AMD's CPUs. It's likely that (unlike other laptop manufacturers) Microsoft will spend a decent amount of time to work through sleep and power issues on the software side at least for Windows, which will then trickle down to other manufacturers.

                                                                                                                                                            • mkl 11 days ago

                                                                                                                                                              Only the 15 inch Surface Laptop 3. All other models are still Intel.

                                                                                                                                                        • ekianjo 11 days ago

                                                                                                                                                          Problem is that in the enterprise segment there is almost no AMD offering with the big brands so it is going to take a while to displace the Xeons. I looked at Lenovo workstations recently and their offering is 90% Intel.

                                                                                                                                                          • washadjeffmad 11 days ago

                                                                                                                                                            OEMs don't list all configurations they sell on their sites, but they certainly want your money. We've asked for some ridiculous shit over the years (including sTR4 and SP3), and Dell has never failed to deliver a quote.

                                                                                                                                                            • godzillabrennus 11 days ago

                                                                                                                                                              Time to go Supermicro or Gigabyte for a cycle. Let the big brands feel the pain of not incorporating AMD.

                                                                                                                                                              • lhoff 11 days ago

                                                                                                                                                                Supermicro has been offering Epyc servers since the product launch of the first generation. We ordered a server back in June 2017 and it got delivered in November, because the CPUs were sold out in the beginning.

                                                                                                                                                                Now with Rome it gets even more widespread. The Epyc CPUs are perfect for a single-CPU all-flash storage server due to the 128 PCIe lanes. Intel can't offer anything comparable at the same or a similar price point, because you need to buy 2 CPUs.
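
                                                                                                                                                                Rough lane math behind that (assuming the usual x4 link per NVMe drive, and the 48 PCIe 3.0 lanes of a contemporary Xeon): 128 / 4 = 32 drives from one Epyc socket, versus (2 x 48) / 4 = 24 drives even from a pair of Xeons, before you give any lanes to NICs or HBAs.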

                                                                                                                                                                • sixothree 11 days ago

                                                                                                                                                                  Is that why all Dell Intel servers seem to come in 2-CPU-only configs?

                                                                                                                                                                • jzwinck 11 days ago

                                                                                                                                                                  SuperMicro's IPMI (their iLO equivalent) and rack-mounting hardware were really poor a few years ago. Has that changed? Keep in mind the big brands have improved those parts since then, so unless SM made amazing progress, they're not a drop-in replacement.

                                                                                                                                                                  • sliken 11 days ago

                                                                                                                                                                    Works for me. I just want to be able to turn on, turn off, reset, get a video console, and reinstall the OS from an ISO.

                                                                                                                                                                    Sure, the software/tools could be nicer, but I've not been overly impressed with the competition. At least HTML5 has replaced the pain of ancient buggy Java solutions that waste 5 minutes over a slow connection with sandbox permissions, Java splash screens, and some awful client that's mostly VNC, but with a hacked-up auth system.
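
                                                                                                                                                                    For what it's worth, everything in that short list except the ISO part is scriptable with the stock ipmitool CLI; a minimal sketch in Python (the host and credentials are placeholders, and ISO/virtual-media mounting stays vendor-specific):

                                                                                                                                                                      import subprocess

                                                                                                                                                                      def ipmi(host, user, password, *args):
                                                                                                                                                                          """Run one ipmitool command against a BMC over the network (lanplus)."""
                                                                                                                                                                          cmd = ["ipmitool", "-I", "lanplus", "-H", host, "-U", user, "-P", password, *args]
                                                                                                                                                                          return subprocess.run(cmd, capture_output=True, text=True, check=True).stdout

                                                                                                                                                                      print(ipmi("bmc.example", "admin", "secret", "chassis", "power", "status"))
                                                                                                                                                                      ipmi("bmc.example", "admin", "secret", "chassis", "power", "reset")
                                                                                                                                                                      # Text console via serial-over-LAN; the graphical KVM goes through the
                                                                                                                                                                      # vendor's web UI instead:
                                                                                                                                                                      #   ipmitool -I lanplus -H bmc.example -U admin -P secret sol activate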

                                                                                                                                                                    • sithadmin 11 days ago

                                                                                                                                                                      SuperMicro's rails and OOB management are still horrendous.

                                                                                                                                                                      • imroot 11 days ago

                                                                                                                                                                        I keep on hearing a rumor about a new HTML5 based RSA/OOB: It's one of those things that I keep on hoping for when I unbox a new SuperMicro, but, I've got 12K+ of these in the field and I'm finally at a point where I can replace them with Dells.

                                                                                                                                                                        In about 5 years, maybe I'll replace the ones I have at home (admittedly, I really should just do a blade server at home -- it'd probably be less power draw!)

                                                                                                                                                                        • sliken 9 days ago

                                                                                                                                                                      There are two kinds of SM rails that I'm familiar with: the cheaper crappy ones, and the "tool-less" ones. The tool-less ones are so nice I can mount rails from a single side of the rack with one hand.

                                                                                                                                                                          • detaro 11 days ago

                                                                                                                                                                            What's the problem with their OOB?

                                                                                                                                                                            • snuxoll 11 days ago

                                                                                                                                                                              Same Avocent garbage as everyone else I assume, it’s not like any of the OEMs make their own.

                                                                                                                                                                              • SteveNuts 11 days ago

                                                                                                                                                                            iDRAC is MILES ahead of SM IPMI, in my experience.

                                                                                                                                                                                • greatpatton 11 days ago

                                                                                                                                                                              Be more specific, because the basic stuff needed by 99% of setups has been covered for ages. We are managing hundreds of machines with just SM IPMI.

                                                                                                                                                                                • detaro 11 days ago

                                                                                                                                                                                  More specific? That you think it's garbage doesn't really tell me what's garbage about it compared to others. I've only used supermicro's a little, and it seemed to do what it offered just fine.

                                                                                                                                                                                  • snuxoll 10 days ago

                                                                                                                                                                                    I haven’t used it, but I assume it’s really no better or worse. Virtually every OEM licenses their OOB management controllers from Avocent - HP, Dell, Lenovo, etc. all use more or less the same platform with a custom skin and a couple addon features packed in.

                                                                                                                                                                                    From what I’ve seen, Supermicro is no different - they use Avocent hardware like everyone else. Everything just depends on what generation of controller they ship.

                                                                                                                                                                                    EDIT: as an example, the iDRAC on my Dell R210 IIs, the IMM on my Lenovo TD340, and the iLO 3 on my HP ML10 are all the same generation of Avocent hardware with few differences. My Dell RX20s have newer OOB modules and suck less to work with, but so does comparably new hardware from other OEMs, because they're all made by the same damned company.

                                                                                                                                                                          • sliken 11 days ago

                                                                                                                                                                            Dell and HP sell several solutions, I've not been tracking the others.

                                                                                                                                                                            • justinclift 11 days ago

                                                                                                                                                                              HP now put their BIOS/firmware downloads behind a paywall, which - from my point of view - excludes them from all future purchases.

                                                                                                                                                                              eg older gear that would generally be suitable for dev use can't even be updated any more. Screw that crowd. :(

                                                                                                                                                                              • JohnJamesRambo 11 days ago

                                                                                                                                                                                Just when I thought HP couldn’t be more stupid, they surpass all expectations; that’s so crazy.

                                                                                                                                                                                They just sent my brother a $500 video card about eight months late after already replacing the whole computer that had a faulty video card. One hand doesn’t know what the other does in that corporation.

                                                                                                                                                                            • lazylizard 11 days ago

                                                                                                                                                                              And asus.

                                                                                                                                                                            • sliken 11 days ago

                                                                                                                                                                              No idea on the desktop/workstation uptake. But servers seem available from plenty of OEMs. Supermicro, Asus, and Tyan have a wide variety of single and dual socket offerings.

                                                                                                                                                                              What impresses me most is that the transition from Epyc Naples to Epyc Rome was pretty quick and doubled the performance on a wide variety of heavy floating point codes.

                                                                                                                                                                              • snvzz 11 days ago

                                                                                                                                                                                >I looked at Lenovo workstations recently and their offering is 90% Intel.

                                                                                                                                                                                This will only change if people choose what to get from the remaining 10%.

                                                                                                                                                                                I did my part, getting an X395 recently, which worked out quite well.

                                                                                                                                                                                • beatgammit 3 days ago

                                                                                                                                                                                  Well, at least Thinkpad is available in AMD, which is a huge potential enterprise market.

                                                                                                                                                                                  • at-fates-hands 11 days ago

                                                                                                                                                                                    Are Xeons still running hot and sucking up a ton of energy? My old workstation from a few years ago is a beast, but the dual Xeons it has run pretty hot, temperature-wise.

                                                                                                                                                                                    I'm curious if they've solved that problem yet.

                                                                                                                                                                                    • sliken 11 days ago

                                                                                                                                                                                      Well, two years ago or so, before the current Xeon Scalable and Epyc chips, it was relatively common to have server chips in the 65-95 watt range. Servers with two such chips (mid-range) would often run $2,500 to $4,000.

                                                                                                                                                                                      Unfortunately, the realities of Moore's law (or the lack of it, really) mean that perf/watt stopped doubling. So in a single generation a mid-range server went from 65-95 watts per chip to 180 watts or so, and at the same time server costs approximately doubled.

                                                                                                                                                                                      Higher-end chips are now over 200 watts.

                                                                                                                                                                                      The good news is that while they dissipate quite a bit of heat, the physical size of the socket, heat spreader, and die has increased significantly. So you still have to dump the heat, but it's not as hard (read: as noisy) as cooling the smaller chips with their higher heat density was.

                                                                                                                                                                                    • Teknoman117 11 days ago

                                                                                                                                                                                      Dell EMC is now offering AMD Epyc Rome based systems as well.

                                                                                                                                                                                      • lliamander 11 days ago

                                                                                                                                                                                        Epyc Rome launched with HPE (as in, you could buy a Rome based server from HPE the day it launched).

                                                                                                                                                                                        • dragonwriter 11 days ago

                                                                                                                                                                                          AWS isn't a big brand in the enterprise space? Because they certainly do have AMD offerings...

                                                                                                                                                                                          • lliamander 11 days ago

                                                                                                                                                                                            I ran neofetch on an AMI and found it was running a first-gen Epyc chip.

                                                                                                                                                                                            • chrisseaton 11 days ago

                                                                                                                                                                                              It seems a bit token, though?

                                                                                                                                                                                            • tyingq 11 days ago

                                                                                                                                                                                              Agree, but AMZN or MSFT could announce something significant at any moment. The incentive is there.

                                                                                                                                                                                          • sandGorgon 11 days ago

                                                                                                                                                                                            AMD needs to figure out a way to get CUDA support on its integrated GPU or get Tensorflow/Pandas/etc etc to adopt its GPU acceleration libraries.

                                                                                                                                                                                            Their brand is still not thought of as equal to Intel's. So the other way to build the brand is through developer adoption.

                                                                                                                                                                                            • selimthegrim 8 days ago

                                                                                                                                                                                              So why aren’t they handing out free GPUs to colleges and universities like nVidia?

                                                                                                                                                                                              • sandGorgon 8 days ago

                                                                                                                                                                                                because nobody wants their GPU.

                                                                                                                                                                                                AMD integrated-GPU laptops are very common in India and are very cheap compared to equivalent Intel ones. For example, the Ryzen 5 with Vega 11.

                                                                                                                                                                                                People would rather shell out for expensive nvidia. Because no academic/research software can use the AMD GPU.

                                                                                                                                                                                                If CUDA was available on AMD, then the sale of their laptops (and servers) would 10x overnight.

                                                                                                                                                                                            • boyadjian 11 days ago

                                                                                                                                                                                              AMD HD 4770, HD 6870, HD 7870, R9 290, RX Vega 64: all the graphics cards that I bought recently have been AMD. And though my previous CPUs were Intel, my next CPU will be an AMD.

                                                                                                                                                                                              • achow 11 days ago

                                                                                                                                                                                                • m0zg 11 days ago

                                                                                                                                                                                                  And I sure hope they keep it up. Intel was getting way too comfortable in its dominant position, and that hurts everyone (including Intel in the longer term). The best outcome for all is to have two companies with approximately equal marketshare competing on merit. There isn't much you can squeeze out of general purpose CPUs anymore, but I'd be quite grateful for the next 2 or 3x. And then they can start competing in acceleration hardware and GPUs.

                                                                                                                                                                                                  • tempsy 11 days ago

                                                                                                                                                                                                    There's a strange cohort of "meme" stocks that I see in certain male-dominated internet subcultures. Tesla and AMD seem to be at the top of that list.

                                                                                                                                                                                                    • tristor 11 days ago

                                                                                                                                                                                                      Can you expand on what you mean by this? I don't understand. Are you saying people pick these stocks because of cultural reasons and they're actually not good investments? Are you saying that male-dominated internet subcultures have a cultural reason to pick these stocks, specifically?

                                                                                                                                                                                                      • tempsy 11 days ago

                                                                                                                                                                                                        It's just a weird mix: both stocks are techy/geeky and extremely volatile (though generally on an upward trend), Elon Musk adds to Tesla's weirdness, and there's a strange infatuation with AMD's CEO Lisa Su.

                                                                                                                                                                                                        Tesla especially attracts a certain weirdness because there's a large cohort of people who think Elon is full of it and hiding massive fraud and are hellbent on exposing him (using $TSLAQ everywhere).

                                                                                                                                                                                                        If you follow earnings season both stocks did extremely well, especially Tesla. Made and broke a lot of people.

                                                                                                                                                                                                        • akvadrako 11 days ago

                                                                                                                                                                                                          Tesla hasn't been on an upward trend for two years. AMD hasn't for about one year. Neither has reached its highs of the past few years.

                                                                                                                                                                                                        • KaoruAoiShiho 11 days ago

                                                                                                                                                                                                          Probably the second one leads to a bias that causes the first one. Some tech/geek communities really like those two stocks. There are other male communities that are into other stocks. Even in tech, the culture leads to some stocks becoming overvalued and others undervalued: AMD is overvalued while FB is undervalued because of the cultural feelings toward those companies.

                                                                                                                                                                                                          • jsf01 11 days ago

                                                                                                                                                                                                            AMD is worth ~40B today. FB is worth ~550B today. AMD supplies a product in growing demand while FB supplies access to demographics in growing demand. But the downside risks are not the same. It only took FB’s viral growth to end MySpace. User loyalty is far more fleeting than the market for CPUs. Within five years or your preferred investment time horizon, compare the valuations of each firm and see how your assessment stood the test of time.

                                                                                                                                                                                                            • mantap 11 days ago

                                                                                                                                                                                                              Facebook has a plan A and a plan B for dealing with competition. Plan A is to buy out competitors to acquire users and diversify their brand (Instagram, Whatsapp). Plan B is to use FB's overwhelming resource advantage to copy their product (Snapchat). It's fair to say that FB is well aware of what happened to MySpace.

                                                                                                                                                                                                              Any potential "Facebook killer" needs to circumvent both of these tactics. Not impossible but not easy either.

                                                                                                                                                                                                              • webninja 11 days ago

                                                                                                                                                                                                                Facebook’s Lasso doesn’t seem to be killing TikTok

                                                                                                                                                                                                                • adventured 11 days ago

                                                                                                                                                                                                                  Zuckerberg went to Washington in September & October and took care of that. While he was busy with several other topics, you can be sure he leaned on the right members of Congress about TikTok.

                                                                                                                                                                                                                  The US has a very strong interest in every possible regard to protect Facebook's dominance and to kill or restrain TikTok by forcing it apart from Bytedance or burying it through the app stores.

                                                                                                                                                                                                                  If it gets separated from Bytedance it's either toast or it ends up in the belly of a US giant (or maybe even Spotify depending on the valuation).

                                                                                                                                                                                                                  • riffraff 11 days ago

                                                                                                                                                                                                                    Even if the US Congress somehow blocks TikTok, FB appears to be losing a lot of ground to it in the rest of the world.

                                                                                                                                                                                                                • fireattack 11 days ago

                                                                                                                                                                                                                  Not very familiar with this, what's FB's copy of Snapchat?

                                                                                                                                                                                                                  • nsomaru 11 days ago

                                                                                                                                                                                                                    Facebook copies features into WhatsApp and Instagram like “stories” which are posts that disappear and cannot be saved, a core feature of Snapchat

                                                                                                                                                                                                                • KaoruAoiShiho 11 days ago

                                                                                                                                                                                                                  What does this even mean, haha. AMD is an also-ran competing on price. It has no competitive advantages, no moat, and a weak brand. That it has a 180 P/E is an absolute joke. It's so high purely because the culture really dislikes Intel/Nvidia making so much money on their pricey products and badly wants a viable competitor. This leads people to become emotionally invested in the alternative, even though objectively it's neither a fast-growing nor a defensive business.

                                                                                                                                                                                                                  FB, on the other hand: incredible profit growth year after year. Powerful network effects lead them to crush new markets all the time. Huge profits from their monopolies let them easily take big risks. The whole comment about MySpace/social media is exactly what I'm talking about. Nerdy males are extremely dismissive of products that are mostly used by women or by people from different milieus. People have a hard time pulling for companies that they don't see succeeding; it's a cognitive error leading to imagination inflation. Know your biases.

                                                                                                                                                                                                                  Please, keep this thread in mind in 5 years!

                                                                                                                                                                                                                  • ric2b 10 days ago

                                                                                                                                                                                                                    > It has no competitive advantages, no moat, weak brand.

                                                                                                                                                                                                                    Competitive advantage: it has more experience in GPUs than Intel, and because it doesn't run its own fabs it is more flexible in terms of fab tech; it can switch to a different fab easily.

                                                                                                                                                                                                                    Moat: Seriously? High-end chip design is one of the hardest industries to get into, you think some startup can just enter the market?

                                                                                                                                                                                                                    Weak brand: it's very well known around the whole world and has no negative associations; I don't even know what you mean by this.

                                                                                                                                                                                                                    • dralley 10 days ago

                                                                                                                                                                                                                      Don't forget they have the competitive advantage of chiplets, which enable higher yields, and the moat of the x86/AMD64 cross-licensing agreement with Intel. Intel and AMD are effectively the only organizations that can legally produce modern x86 CPUs, even if others had the technical knowledge to do so.

                                                                                                                                                                                                                • WrtCdEvrydy 11 days ago

                                                                                                                                                                                                                  > FB is undervalued

                                                                                                                                                                                                                  Yes, I too thought myspace stock was undervalued.

                                                                                                                                                                                                              • inamberclad 11 days ago

                                                                                                                                                                                                                They're easy to predict if you keep an eye on them. For example, their announcements at CES earlier this year caused a 20% jump in about 3 days. Very good if you know that they're going to come out swinging.

                                                                                                                                                                                                                • ivalm 11 days ago

                                                                                                                                                                                                                  Sure, but earnings were flat. My point is, statements of predictability are generally way oversold. If it were easy to predict, it wouldn't be.

                                                                                                                                                                                                                • longcommonname 11 days ago

                                                                                                                                                                                                                  Younger investors, and not just men, are migrating from the traditional blue-chip stocks that their parents preferred.

                                                                                                                                                                                                                  • Symmetry 11 days ago

                                                                                                                                                                                                                    Index funds and bonds are there for stability. If I'm going to pick an individual stock it's because I think I understand something other people in the market don't.

                                                                                                                                                                                                                  • fred_is_fred 11 days ago

                                                                                                                                                                                                                    And what's interesting is that AMD was a tech bro darling before memes were a thing, it dates back to the late 90s and early 2000s.

                                                                                                                                                                                                                  • m15i 11 days ago

                                                                                                                                                                                                                    Are there any good alternatives to Nvidia GPUs/cuDNN for deep learning?

                                                                                                                                                                                                                    • rarecoil 11 days ago

                                                                                                                                                                                                                      ROCm doesn't completely suck with the Radeon VII, which is a Radeon Instinct MI50. Deep learning is not my day job and I'd like to avoid supporting Nvidia's insane prices for DL-capable cards, so I've been dealing with the performance hit, using the R7 for DL tasks and switching it off when the power isn't needed. The XFX Radeon VII is actually on sale at Newegg for $569, so that's a lot of power (and 16 GB of HBM2) for the price.

                                                                                                                                                                                                                      • jamesblonde 11 days ago

                                                                                                                                                                                                                        Agreed. The Radeon VII is currently the best price/compute GPU out there for deep learning. Its performance on ResNet-50 is about the same as the 2080 Ti's:

                                                                                                                                                                                                                        https://github.com/ROCmSoftwarePlatform/tensorflow-upstream/...

                                                                                                                                                                                                                        • bitL 10 days ago

                                                                                                                                                                                                                          That's only theoretical. Try to use ROCm with the latest frameworks, or with external models that ship custom CUDA operations/losses. Basic stuff might work, in a more complicated way than on Nvidia; advanced stuff is guaranteed either to not work, or to work a couple of months later, once it lands in ROCm.

                                                                                                                                                                                                                          The Radeon VII is a beast for FP64 computation. If you do simulation or heavy computations that require supercomputer-level precision, grab one while you can; it's the best price/performance of all GPUs on the market.

                                                                                                                                                                                                                          However, folks, please don't follow the advice about using it for deep learning if you want to actually run a deep learning business in any way.

                                                                                                                                                                                                                      • ccffpphh 11 days ago

                                                                                                                                                                                                                        https://github.com/RadeonOpenCompute/ROCm

                                                                                                                                                                                                                        ROCm makes it possible to use consumer-grade AMD GPUs for deep learning.
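
                                                                                                                                                                                                                        As a minimal sketch of what that looks like, assuming the upstream tensorflow-rocm build is installed (the package name is TensorFlow's upstream ROCm port; everything else is illustrative):

                                                                                                                                                                                                                            # Assumes `pip install tensorflow-rocm` on a Linux distro with the ROCm stack set up.
                                                                                                                                                                                                                            import tensorflow as tf

                                                                                                                                                                                                                            # A ROCm build exposes the AMD GPU through the same device API as a CUDA build,
                                                                                                                                                                                                                            # so existing TensorFlow code runs unchanged.
                                                                                                                                                                                                                            print(tf.test.is_gpu_available())

                                                                                                                                                                                                                            with tf.device('/GPU:0'):
                                                                                                                                                                                                                                a = tf.random.uniform((1024, 1024))
                                                                                                                                                                                                                                b = tf.random.uniform((1024, 1024))
                                                                                                                                                                                                                                c = tf.matmul(a, b)  # dispatched to the Radeon via rocBLAS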

                                                                                                                                                                                                                        • thom 11 days ago

                                                                                                                                                                                                                          Companies like https://myrtle.ai/ and https://www.graphcore.ai/ are popping up.

                                                                                                                                                                                                                          • bitL 11 days ago

                                                                                                                                                                                                                            Nope, just grab yourself an RTX 8000 and be able to train the latest SOTA models (albeit slowly). The Titan RTX is already insufficient, and nobody else is in the game for actually owning DL hardware :(

                                                                                                                                                                                                                            • gaogao 11 days ago

                                                                                                                                                                                                                              Google has TPUs now

                                                                                                                                                                                                                              • miguelisolano 11 days ago

                                                                                                                                                                                                                                But only through Google Cloud for now, as far as I'm aware.

                                                                                                                                                                                                                            • dredmorbius 11 days ago

                                                                                                                                                                                                                              • jimbo1qaz 11 days ago

                                                                                                                                                                                                                                If I block web fonts, all references to AMD are lowercase. MiloTE and MiloSCTE (small caps).

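                                                                                                                                                                                                                                    /* presumably the site's own rule: lowercase the <small>-wrapped abbreviations so the small-caps webfont (MiloSCTE) renders them as capitals */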
                                                                                                                                                                                                                                    .blog-post small {
                                                                                                                                                                                                                                        text-transform: lowercase;
                                                                                                                                                                                                                                    }

                                                                                                                                                                                                                                • some1else 11 days ago

                                                                                                                                                                                                                                  This rule would ensure that the fallback font also displays small caps for that tag:

                                                                                                                                                                                                                                    .blog-post small {
                                                                                                                                                                                                                                        font-variant: small-caps;
                                                                                                                                                                                                                                    }
                                                                                                                                                                                                                                  
                                                                                                                                                                                                                                  Small capitals only apply to lower-case letters, so it's common to lower-case small capital abbreviations, especially in body text:

                                                                                                                                                                                                                                  http://theworldsgreatestbook.com/book-design-part-5/

                                                                                                                                                                                                                                  • Finch2192 11 days ago

                                                                                                                                                                                                                                    Can I ask -- What was your line of thinking that led you to find this out?

                                                                                                                                                                                                                                    I'm the kind of guy who could probably figure out how to do this if it was, like, given to me as a task. But never in a million years would I just stumble across this.

                                                                                                                                                                                                                                    • jimbo1qaz 11 days ago

                                                                                                                                                                                                                                      I don't like it when different websites look very different. So in my primary browser (Firefox), I only allow sites to use my preferred sans-serif, serif, and monospace fonts. (In the options page's font dialog, I unchecked "Allow pages to choose their own fonts, instead of your selections above".)

                                                                                                                                                                                                                                      I noticed that all occurrences of AMD were lowercase, and found that the CSS sets them to lowercase. Opening the page in Chrome, I saw a giant banner on the bottom covering a third of the page, and another on top, which closed when I closed the bottom banner. In Chrome, AMD was written in small caps with the font MiloSCTE, while the rest of the body text used MiloTE.

                                                                                                                                                                                                                                      • jacobolus 11 days ago

                                                                                                                                                                                                                                        I assume the previous poster blocks all web fonts as a matter of course (to save bandwidth? speed web page loading? reduce vectors for tracking?)

                                                                                                                                                                                                                                        Anyhow, people should ideally be using small caps a whole lot more. Using all caps for abbreviations is much uglier and less legible, especially in texts where many abbreviations appear.

                                                                                                                                                                                                                                    • pschastain 11 days ago

                                                                                                                                                                                                                                      Full article is behind a paywall :-/

                                                                                                                                                                                                                                      • Narishma 11 days ago

                                                                                                                                                                                                                                        Is there a non-paywalled version?

                                                                                                                                                                                                                                        • opencl 11 days ago

                                                                                                                                                                                                                                          The article is basically devoid of content beyond "AMD makes less money than Intel, Zen is good, their marketshare is small but rapidly increasing".

                                                                                                                                                                                                                                          • eBombzor 11 days ago

                                                                                                                                                                                                                                            • ShinTakuya 11 days ago

                                                                                                                                                                                                                                              Check the FAQ.

                                                                                                                                                                                                                                              • kick 11 days ago

                                                                                                                                                                                                                                                In comments, it's ok to ask how to read an article and to help other users do so.

                                                                                                                                                                                                                                                • j4kp07 11 days ago

                                                                                                                                                                                                                                                  So where do we complain? This posting is essentially an advert.

                                                                                                                                                                                                                                                  • zaroth 11 days ago

                                                                                                                                                                                                                                                    Oftentimes people will upvote the topic/headline as something interesting or noteworthy, or because the discussion in the comments is interesting. I'm sure a double-digit percentage of people are not even clicking through to TFA, and another double-digit percentage click, hit the paywall, bounce back to the comments for the discussion, and may still upvote.

                                                                                                                                                                                                                                                    • kick 11 days ago

                                                                                                                                                                                                                                                      Send all complaints over to hn@ycombinator.com.

                                                                                                                                                                                                                                              • hyperpallium 11 days ago

                                                                                                                                                                                                                                                Intel has had the smallest nodes since the beginning.

                                                                                                                                                                                                                                                With Moore's law's death, everyone is catching up.

                                                                                                                                                                                                                                                Isn't it that simple?

                                                                                                                                                                                                                                                • trynumber9 11 days ago

                                                                                                                                                                                                                                                  I don't think so, because TSMC now has the highest transistor density. It is more that Intel fumbled 10nm so badly that others have passed them. Intel's 7nm had better be good and timely, or they're in some serious trouble. TSMC's 5nm starts early next year, with 1.84x density scaling.

                                                                                                                                                                                                                                                  • hyperpallium 11 days ago

                                                                                                                                                                                                                                                    But nm figures are marketing terms now; TSMC's 7nm is said to be roughly equivalent to Intel's 10nm (both are quoted at around 90-100 million transistors per mm²).

                                                                                                                                                                                                                                                    Nodes are still shrinking, but not at the rate implied by their nm names. In addition, thermal constraints mean they can't actually be used at their full theoretical density.

                                                                                                                                                                                                                                                    • trynumber9 10 days ago

                                                                                                                                                                                                                                                      Intel's 10nm is finally shipping now, according to Intel, but it seems small-scale yet. TSMC's 7nm has been shipping for over a year, in genuinely ubiquitous devices (iPhones).

                                                                                                                                                                                                                                                • known 11 days ago

                                                                                                                                                                                                                                                  I think China will try to acquire AMD

                                                                                                                                                                                                                                                  https://en.wikipedia.org/wiki/Advanced_Micro_Devices

                                                                                                                                                                                                                                                  • rgbrenner 11 days ago

                                                                                                                                                                                                                                                    Congrats to AMD, but I'm still very pessimistic about their long-term prospects. It seems like Intel's advantages come from a system that produces improvements over years; you can see this just in their R&D spending: Intel spends nearly 2x AMD's entire revenue on R&D alone. Whereas this development from AMD was thanks to Jim Keller (who now works at Intel)... It was a one-off event, and once they've extended it as far as it'll go, then what? Unless they develop their ability to innovate (they've had issues keeping top talent at the firm), this will probably be another of AMD's boom-and-bust cycles.

                                                                                                                                                                                                                                                    • jsf01 11 days ago

                                                                                                                                                                                                                                                      Intel decided to invest more heavily in share buybacks than in R&D as of their recent earnings call.

                                                                                                                                                                                                                                                      But that's only half of the story. They need that R&D budget because they have the (massive and growing) expense of building and upgrading their own fabs, which have undergone a series of costly mistakes in the last decade. I wonder what percentage of Intel's R&D budget is actually comparable to AMD's if you exclude the amount poured into fabs; I bet those figures are much closer, despite the enormous difference in market cap. TSMC, which along with GloFo fabs the AMD chips, is basically all-in on R&D investment, taking on debt to build the most advanced fabs to date. And their prior investments in 7nm have scaled rapidly and panned out well; I think it was their fastest ramp yet for a node shrink.

                                                                                                                                                                                                                                                      Oh, and Keller is definitely smart, but you imply that he's got a monopoly on talent in the semiconductor industry. There's no way that's the case lmao

      • jcheng 11 days ago

        But in 2019 you need to compare Intel's R&D with AMD + TSMC. What's different this time is that Intel has lost its process node advantage.

        • Symmetry 11 days ago

          Intel spreads their R&D spending across many more product categories than AMD does. In addition to CPUs, they've got their new GPU, their SSDs, their entirely new form of NVRAM, wireless networking, and many other projects. And they run their own fabs, which is a huge investment that AMD doesn't have to make.

          AMD just makes CPUs and GPUs. Also, Intel's CPU designs tend to be much more custom than AMD's, trading extra engineering effort for a bit more performance.

          • tuananh 11 days ago

            You gotta give credit to the team as well; it wasn't just Jim Keller. BTW, this kind of architecture can give AMD a few years to catch up. If they can keep the momentum, they have a very good chance of being relevant again.