>Number one — and this is a big one these days, especially for this product — is that it’s not any less useful or versatile than the outgoing Mac Mini, including the generous assortment of ports. If the previous one served a role for you, the new one can probably do it just as well, and probably better and faster, with minimal donglage.
Wow. This is what it's come to. "It hasn't gotten worse! Yayy!"
I work at a company where most of us use Macs. Everyone who has upgraded from a 2013-2015 MBP to the touchbar model absolutely hates it, including me. There's gotta be someone around here who actually likes it, but I haven't met them yet.
It's OK if you like it. I'm sure lots of people do, but you know damn well significantly more people have had issues with this laptop than normal, so I'm not sure why you're implying otherwise.
The keyboard is by far the worst I've ever used, and I've owned a variety of $200 Walmart Black Friday laptops and a couple of netbooks, in addition to some high end stuff. I had a friend with an older MBP attempt to show me something using my computer and he could barely type on it without frequent mistakes. Within a minute he was getting frustrated. It just feels terrible. I never thought anything would be worse than typing on a touch-screen, but Apple's engineers have accomplished a horrifying miracle. It's like they intentionally tried to design something that's as loud as a mechanical keyboard while still having worse tactile feedback than a $5.00 rubber dome keyboard.
On top of that, it's not noticeably faster, and after 4.5 years it still maxed out at 16GB of RAM (they fixed this in 2018, but it's too late), which is not enough for my use case. Plus, it died after 3 months (not the keyboard; it was a power issue).
This is both the most expensive and the worst computer I've ever owned. The 2013 MBP I'm using now as a loaner while I get my new one fixed is, to quote Steve Jobs, "like getting a glass of ice water in hell." It just works.
The touchbar made me realize that I have a habit of resting my fingers up there. I came to this realization because when I didn't think I was typing anything, stuff would happen. Eventually I realized I was touching virtual buttons on the touchbar.
I don't hate it, but I wish more of the tools I use took advantage of it.
I think it could be massively improved by shortening it a little on the left to make room for a physical escape key. That's about 95% of my problem with it.
I prefer the Magic Trackpad 2 (or the native MBP 2015 trackpad) to any other way of navigating, but Vi keybinds (if available) work as well.
I currently use Karabiner for this as well, but a slightly different configuration.
The two rules I use (sketched in JSON after the list) are:
* R-Cmd + hjkl are arrows (which works great with an HHKB, but even on the native MBP keyboard it means less hand movement from the trackpad or typing position than the arrow keys)
* Caps solo is Esc while Caps with another key equals Ctrl.
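Roughly, as a Karabiner-Elements complex modification (a sketch rather than my exact config; j/k/l follow the same pattern as h):

```json
{
  "title": "R-Cmd + hjkl as arrows; Caps as Esc/Ctrl (sketch)",
  "rules": [
    {
      "description": "Right Cmd + h -> left arrow (repeat for j/k/l)",
      "manipulators": [
        {
          "type": "basic",
          "from": { "key_code": "h", "modifiers": { "mandatory": ["right_command"] } },
          "to": [{ "key_code": "left_arrow" }]
        }
      ]
    },
    {
      "description": "Caps alone -> Esc; Caps held with another key -> Ctrl",
      "manipulators": [
        {
          "type": "basic",
          "from": { "key_code": "caps_lock", "modifiers": { "optional": ["any"] } },
          "to": [{ "key_code": "left_control" }],
          "to_if_alone": [{ "key_code": "escape" }]
        }
      ]
    }
  ]
}
```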
Is there a way to do this in Linux as well? I currently have to use Linux regularly, and I rebind Caps to Ctrl; however, for Vim that isn't ideal. So I'd like to have the same functionality I have with Karabiner on Linux (Xorg / console).
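The closest thing I've seen suggested for Xorg is setxkbmap plus xcape (untested on my end, and it wouldn't help on the raw console):

```sh
# Make Caps Lock act as an extra Ctrl while held
setxkbmap -option ctrl:nocaps

# Have a solo tap of left Ctrl (which now includes Caps) send Escape
xcape -e 'Control_L=Escape'
```

but I don't know whether it matches Karabiner's behavior exactly.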
You should snag an app called BetterTouchTool. Makes the TouchBar an amazing addition to the laptop. It's just a shame that Apple didn't build in a tool like it but I hope the developer gets lots of love.
> It's OK if you like it. I'm sure lots of people do, but you know damn well significantly more people have had issues with this laptop than normal, so I'm not sure why you're implying otherwise.
Yup, that's exactly what the HN groupthink would like to believe. What is by definition a small user demographic complains repeatedly/loudly that their problems are the most important, indicative of "everyone", and Apple is doomed because so-and-so bought a Surface Pro/Dell/System76. The hyperbole is kicked up a notch here, as clearly any $200 Walmart laptop keyboard is better.
As is typical, there's never any data to support the claims and a significant number of counter anecdotes are dismissed absent critical reasoning ("you know damn well...!"). Instead, the echo chamber resonates unabated by logic.
Throw in an out of context Steve quote and you can identify this drivel pretty uniformly. It's usually best to ignore, though at times a response is merited when it's completely off topic and unhelpful (as it is here, the new Mac mini seems awesome regardless of a hater's two year old take on the tbMBP).
I like the feel of the keyboard. I did not like the trip to the Mac store at 8 months whereby they very gingerly lifted the spacebar and removed whatever crumb was under there making it mushy.
I also like the touchbar. The touch-slide volume is a nifty improvement...I use the touchbar for the occasional screengrab...and that's about it. As someone mentioned above, I too would really like a physical Escape key.
If you go back and read my comment, I didn't say that out of Apple's entire customer base, more people dislike the keyboard than like it. I said that more people are having issues with this computer than a normal Apple MBP.
Just like your comment, this is an anecdotal opinion on the Internet. I didn't read your response and come away with the conclusion that you were attempting to represent it as a peer-reviewed white paper, so I'm not sure why you're holding my random Internet comment to that standard.
My anecdotal evidence is that all of the typical places people go to talk about technology on the Internet (reddit, hacker news, blogs, etc.) seem to have more complaints about the keyboard on the new MBP than I recall seeing about the old model, which was almost universally hailed as the best laptop on the market.
Included with that anecdotal evidence is my own experience in a company with hundreds of people that use MBPs. There might be other companies where everyone loves them.
It would be really helpful if we could trust Apple to publish accurate, relevant data on the new MBP vs the old one, but they have a history of hiding, denying and/or lying about issues with Apple products.
>The hyperbole is kicked up a notch here, as clearly any $200 Walmart laptop keyboard is better.
This isn't hyperbole. It's just my opinion and it was presented as such. I've been using computers since the early 90s. I've never used a keyboard that felt worse to me than the MBP. When I say that, I don't mean that it's one of the worst keyboards I've ever used; I mean that it's THE worst keyboard that I have ever used, which is obviously just my opinion.
>Throw in an out of context Steve quote and you can identify this drivel pretty uniformly
This isn't a case where reusing a quote is changing the context of what he meant. I wasn't saying that Steve Jobs agrees with my opinion of the new MBP. This should be pretty obvious. The original context was that Steve Jobs felt one product was so superior to another one that getting the former was like getting a glass of ice water in hell. It's a perfectly relevant quote used in the same way that he did. I just happen to be comparing different products.
>It's usually best to ignore, though at times a response is merited when it's completely off topic and unhelpful (as it is here, the new Mac mini seems awesome regardless of a hater's two year old take on the tbMBP).
1. It's perfectly on-topic to discuss the quality of Apple products in a post about an Apple product. The reputation of their products is pretty relevant when people are discussing whether to buy a newer product.
2. I didn't bring up the MBP, someone else did and I responded.
3. Not liking a single Apple product doesn't make me a "hater". I loved my 2013 MBP, I loved all of my iPhones/iPads, and I love MacOS. Apple released a product that I have an issue with. Lots of other people are having the same issue. I'm not sure why you have to react to that like a personal attack.
There's this phenomenon where if lots of people have an issue with something and then a random person buys the product and has a good experience, that person decides to dismiss everyone that has had an issue, pretend it's impossible the issue existed, and then act like there must be something wrong with the people that had the issue. I don't really understand that, because with any product that sells thousands or millions, it's typical for some people to have issues even if most like it. In this case, it just happens to be a product where a slightly larger percentage than normal is having an issue.
No it wasn't. Define "serious". What makes one computer user more "serious" than another? The point of the no true Scotsman fallacy is that the term is never defined by the statement, and so it's completely subjective.
I don't see how much more clearly I could break down what happened. Their entire argument was predicated on the fact that most "serious" (replace "true") computer users don't use the Touch Bar because their lids are closed.
I'll compromise. We'll call it the no "serious" computer users fallacy instead of the no true Scotsman fallacy.
Do you have to look down to use those icons on the Touchbar? This is something I don’t have to do as a touch-typist. Repeatedly craning your neck to look down can result in RSI. I spend a little effort to learn new hotkeys every so often and the benefit far outweighs the cost: I’m much faster and keep my good ergonomics.
Comparing the keyboard with a $200 Walmart netbook is a joke. It might not be to everybody's taste, but it's far from terrible. I actually like it, and at my company, where 95% of engineers have one, there are no more complaints than usual about the new model. People who really can't stand the keyboard will usually use external peripherals at their desk.
What kind of work do you / your teammates do? I walked into an Apple store last week prepared to hate the keyboards and touch bars, but was very pleasantly surprised. My ladyfriend wanted to see the Air, but was so taken with the Touch Bar for photo work that she will be buying an MBP instead. While it might not be everyone's cup of tea, I think Apple may have figured its demographics correctly on this one.
I also think the new MacBook Pro is the worst laptop I've ever had. I hated it so much I switched to a Thinkpad. Thankfully these are company issued laptops and I could do this pretty easily. Had I bought a personal one I'd be quite distraught.
Anyone who hates the touchbar has never chased a chat window for the mute button (it's right there on the bar). Also, there's no need to leave the keyboard to click on dialog box buttons - they show up there too.
It could be better - it could require a bit more pressure to press buttons and could have haptic feedback like the touchpad, but I bet someone is working on that, even if it means extending haptic feedback to the whole chassis (which is not a bad idea anyway).
I have a 2013 15" MBP, a 2015 13" MBP, a 2016 15" MBP w/ Touchbar, and a 2017 15" MBP w/ Touchbar. I vastly prefer the 2013 and 2015, even though they are bare-bones specced and the newer MBPs are top-of-the-line. The touchbar MBPs are provided to me by my employer. I just bought my 2015 13" this year, after considering getting a newer model, and I'm planning for it to be my main personal laptop for years to come.
I don't like the touchbar, I don't like the new keyboard, and I don't like the new ports.
I think there is something wrong if, now two years later, there are still people like me who not only don't see a clear benefit to upgrading, but see it as a net-negative.
Ideally, there should be nearly no one (if anyone) who prefers the previous iteration.
It makes things universally worse imo. Everything was by touch for me previously, including adjusting sound. I had the keyboard memorized. Now I constantly have to look down. A keyboard shouldn't require me to change my focus, that defeats the entire purpose.
The dimensions aren't really a selling point for me at all. Both laptops are thin enough and light enough for me to carry them to and from work comfortably.
Performance is better across the board (6-core i9 vs 4-core i7, 32GB vs 16GB, faster SSD). It has a larger, pressure-sensitive track pad. The display is better.
I have a slight preference for the old keyboard, but I don't dislike the new one. I could take or leave the TouchBar. I was never a big user of the F keys and IntelliJ, where I spend a lot of my time, has good TouchBar support. I like having Touch ID. I'd prefer to have a real escape key, but the button is still in the same place so I haven't had to retrain my fingers to hit it.
USB-C with a Thunderbolt 3 hub is marginally more convenient than a Thunderbolt 2 hub plus a MagSafe power cord. I do miss MagSafe though.
These are almost exactly my thoughts as well coming from a 2015 MBP. I would only add that for me, I actually like the new keyboard. The low/firm travel of the keys is actually preferable for me. I felt like the old keyboards were "mushy". That's obviously a personal preference, however.
It's pretty common to make sweeping generalizations in casual conversation, which this is. I find it really annoying when someone says something like "everyone likes cats!" and then someone, inevitably, will hop in and reply with "ACTUALLY, I hate cats." This isn't a mathematics proof.
Sure, it's not necessary to be completely accurate in casual conversation, but you can say "heaps of people like cats!" or "most people like cats!" or "almost everyone likes cats!" and make your point just as effectively, and without trying to invalidate or dismiss the dislike of cats.
I think the goal really was to replace the top row in the first place - the function keys are generally useless except for ancient legacy compatibility, advanced users' macros, and media control.
But the ends of the function row are the escape and power keys, so the row itself couldn't go away.
I wish the Touch Bar was farther offset, 50-100% taller for a better display area, and that the escape and touch id/power buttons were distinct. I also wish there was a standard way for applications to advertise functions for it (I have some really useful tools that are third party, such as mic mute).
Hate it or not, Apple has to make it relatively commonplace if they want macOS apps to bother developing for it instead of ignoring it. Their short-term goal is 100% of MBP users, and surely, eventually 100% of all macOS laptop users.
Keeping it optional indefinitely, then, defeats this goal.
The 12-inch Macbook has an F-key row which is what gets replaced with the touch bar.
I only estimated that they were going for 100% laptop coverage. But I can imagine a future where Apple keyboards have a touch bar option. Though not as important because Macbook touch bar penetration can drive developers to integrate with it, alone.
At which point it's not much different than gestures when it comes to answering your questions. What happens when you use a Logitech mouse on your macOS desktop instead of a Magic Trackpad?
Note that touch bar integration cannot have unique features, so it's never required. The challenge is to get developers to care about it which is the prerequisite for users to care about it.
Probably because it can't be made to be "just an option".
Having seen teardowns of a MacBook, I'm pretty sure that one without a touchbar would be more-or-less a completely different computer - different keyboard, yes. To accomplish that, though, you'd need to also make a different housing to accommodate it, and a different motherboard, too, because this stuff's all soldered together as a single unit these days.
And for all that, people would still be griping about the keyboard and the monoport.
They ignored those for the 2018 MBP refresh, probably because the new Macbook Air was coming. I'd bet on the full keyboard MBP getting discontinued now that it's out, or best case having the 2017 hang around while the touchbar models continue to see updates.
Right. But, echoing what I said, it's not "just an option" - it's a whole different model that they happen to be selling under the same name. Different CPU options, different monitors, different port configuration, etc. They don't even have the same number of microphones.
It's taking me some time to get used to, I'll grant you that.
I'll be getting a 2018 Mac Mini once they're available for order here, so this machine will see much less use then, just basically if I need to be able to do something while travelling (which isn't that frequent).
Well, tbh, if it's simply a choice of what to map to the current Caps Lock key, then Escape should be the winner, even if exiling Caps Lock to the Touch Bar would cause significant annoyance to some heavy users.
However, while users are mostly just faced with perhaps-hard choices about how to remap the given keys, the keyboard manufacturer and integrator Apple had many other options. Caps Lock is currently 2U wide on both ISO and ANSI layouts https://i.ytimg.com/vi/3tJagPz-xIw/maxresdefault.jpg , so it could fairly comfortably be split into a 1U Esc and Caps Lock. You could even give Escape the outside position: that's not a big reach, just the mirror-image of the ISO English # key and more convenient than Esc's existing position in both ANSI and ISO, and there's an arguable case for not putting Esc on too much of a hair-trigger position near the home row anyway. Alternatively you could carve a 1U key out from the right of the right Shift, which is pretty uselessly overlong on both ISO and ANSI.
I don't think that really changed the implication. Regardless of whether he's speaking specifically about the Mac Mini, about Apple products in general, or about the entire industry; he's still saying that it's bucking the trend of "being a series of compromises and disappointments sold as innovation" and that almost everything is genuinely an improvement.
I'm using a 17" 2010 at work with an i7, SSD upgrade, and 16Gb of RAM and its plenty fast still. At home I've got a 2010 Air that begs for more RAM yet still runs as well as when I bought it including 3+ hours of battery life depending on if I am surfing or writing. 2010-2012 was a high point in Mac laptops.
It's actually a reference to the USB-C ecosystem for the laptop line; Marco and his cohosts on the Accidental Tech Podcast have burned a lot of time griping about the state of connecting things to a laptop. Not so much time on the headphone jack, which was mostly a no-op for them (since it was paired with the AirPods).
No, that quote is not out of context at all. The expectation that Apple would in some way ruin the Mac Mini in the next update is the topic of several of the first paragraphs in the article. The gist is indeed "It hasn't gotten worse! Yayy!"
Granted, that's not all of it. He is celebrating that it's gotten better, but he is also specifically celebrating that it hasn't lost many advantages.
Ha. I completely forgot about the SD card reader. Probably because I plugged it in and never looked at the back again. Having an SD card reader on the back of a device just isn't practical. I've just been using a USB hub with an external reader.
I missed that! So not only is 'not any less useful or versatile than the outgoing Mac Mini, including the generous assortment of ports' a shockingly low bar, it's also what might charitably be termed 'rather a bit of a fib'.
Pro audio for recording has been shifting to USB over the past few years. They were heavily on Firewire or PCI card formats before, but USB has finally gotten fast enough to do the throughput, and of course it's much cheaper and more standard.
I'm not a pro audio guy, but I think if you want to work with any external audio device and want that connection to be noise-free, your safest bet is still optical.
I have an external DAC/amp, and I had been using optical until I recently upgraded my workstation. Now I am using USB, because the new motherboard doesn't have optical; I assumed USB would be fine, so the lack of optical wasn't something I considered with the purchase.
Unfortunately there's an awful lot of hiss with USB on this DAC. I'm not knowledgeable enough to pinpoint what's causing it, but the standard remedies I've found (eliminate ground loops, try ferrite cores, get a USB filter) have not made a difference. It could be the USB interface on the DAC side, I have no idea.
While I don't _need_ an external DAC, I like using one for a few reasons: mine has a low output impedance which is good for some headphones, a convenient switch to toggle between headphone and speaker output, and a big volume knob with really smooth action. I like it. The alternative is to use the motherboard's ports; this motherboard does jack sensing such that the rear line out gets disabled if something is plugged into the front, so switching between headphones and speakers involves plugging/unplugging the headphones all the time. I also don't like adjusting volume through a tray applet.
The electrical interference can't do anything to the digital audio, but interference on the USB cable can potentially be picked up by the analog amplifier circuitry in the DAC. I had a particular combination of headphone, DAC and amplifier years ago that I could hear electrical noise on when no music was playing.
Absolutely. I work as a sound engineer and have used lots of things of this type, and it's particularly easy to understand how a USB connection to the computer would wreck the sound of a cheap (sub-$1000, not professional) DAC.
Optical digital means perfect isolation from the ground plane of the computer. It's that simple. The DAC can do whatever it needs to manage its own noise levels, but you're pretty much guaranteed a huge difference from entirely decoupling the DAC from the computer's ground plane. That electrical interference can do a surprisingly enormous amount of damage to the analog circuitry of the cheap DAC, which itself is probably not very resilient at rejecting any sort of electrical interference.
But it sounds like it's the same DAC, which was hiss-free with optical. So it's not the DAC "generating" the hiss. Rather, electrical "interference" is making it over to the analog side from the digital side.
So it's the DAC's fault, but maybe it's fair to say that it's harder to make a good DAC when USB is how the data is being delivered. I'm sure a high end USB DAC doesn't have this issue, but I'm pretty impressed with my cheap offbrand optical DAC that was $11. I'm guessing I would not be so impressed with the $11 USB version.
Logically, it seems like if you had a USB-optical dongle you would have no further trouble. I don't know if there is such a thing, but I do know that in that situation all the 'it's digital, so it should be perfect' talk becomes somewhat true.
You'd be converting from USB to optical, at which point you'd break the ground connection which would be where your hiss is coming from (assuming it's still noiseless when still being used with optical). Then, your concern would be jitter and whether the added conversion is adding lots of jitter to the equation. Your DAC might (or might not) be good at rejecting jitter noise. I've got a Lavry DA10 that's exceptionally good at rejecting jitter (in crystal mode), but that's mastering grade and maybe overkill for you.
It wouldn't add literal noise, but it's also possible for the USB connection to be more jittery than a different computer making the optical connection. That's partly hardware and partly software design (controlling how the data stream is buffered, and associated things that might slightly modulate the audio data clocking). So a change in computer feeding the DAC could also substantially affect the 'sound' of the DAC, as well as the noise issue you observed.
> Unfortunately there's an awful lot of hiss with USB on this DAC.
If you used optical before, it can be reasonable to use USB->optical. A lot of USB audio interfaces already have this. I'm not making use of it, but the USB interface I have on my Mini has an optical out.
Interesting. I bought a pretty cheap soundbar this year (from Yamaha) and I think it had optical as an option (along with 3.5mm and Bluetooth) but the main interface was HDMI-ARC.
And it works great over ARC! The TV remote's volume buttons get passed through automatically, and it turns itself off and on along with the TV. I don't even know where the soundbar's own remote is anymore.
The easy answer is: same thing I was using it for 10 and 20 years ago. Audio equipment doesn't age nearly as quickly as personal computers. I'm not going to replace my entertainment system just because there's a new Mac.
Also, I've had compatibility problems with HDMI between my old Mac Mini and my receiver. I don't know how to troubleshoot those sorts of issues. I do know that optical audio always works perfectly with every device I've ever used it with, though.
Headphones are stereo (okay, there are some exceptions), and optical S/PDIF can do stereo PCM, so there's no quality downside. But for home theater audio, you can't do any lossless multi-channel formats over optical S/PDIF, so the vast majority of enthusiasts will use HDMI.
Wireless (non-Bluetooth) Sennheiser headphones also take optical input (as well as 3.5mm). Since they're only doing stereo, and require a digital-to-analog conversion on the headphone side, it makes sense to avoid an additional analog-to-digital conversion on the transmitter side.
I'm probably in a tiny niche, but I use it so that I can have optical out to my speakers and analogue out to an extension cord with a plug next to my keyboard that I plug my headphones into. This means I can swap between headphones and speakers without touching cables or plugging things in, depending on whether I want to annoy my wife with my music or not.
Sonos, which might be one of the most hyped average-user audio brands of recent years, only offers optical inputs on their soundbars. Imho it's a pretty huge limitation on Sonos's side, but maybe they did it because HDMI ARC is still flaky in many setups and they prefer ease of use. Or simply because of cost savings.
I want to smash a Sony engineer with a hammer because my goddamned $5000 smart TV keeps insisting on switching from tv audio to external audio to tv audio to external audio to tv audio to external audio.
It doesn't matter that I have my audio preference set to "prefer external audio" because the effing thing decides to switch back and forth between 2 and 30 times on startup, and maybe 80% of the time it settles (properly) on the receiver, and 20% of the time it tries to use the TV speakers, which are off, and of that 20% of the time, at least half of the time I HAVE to power everything down and start all over again. The other half of the time I can go through the menu with the slow-ass on-screen menu, and manually toggle back to receiver.
I just want an effing option that says "never in hell try to use the fucking TV speakers, for the love of god".
Somehow HDMI audio is still less convenient for me.
I send HDMI to my TV which sends optical to my sound system, and it still works pretty well. Earlier this year I bought a nice sound system with a receiver for the reasons you mentioned, but it added extra lag to my video games, so I returned it. My current sound system also supports ARC but it causes most of the same problems. Decoding those fancy codecs creates lag on every system I've tried, so sticking to LPCM over optical seems to be the only way to enjoy games still.
Now that I think about it, axing the optical audio ports was probably a deliberate move by Apple to nudge home TV users of mac minis into using Apple TVs, where they have way more control over the interface and content.
edit: apparently Apple TV doesn't have optical audio either.
I doubt it -- the number of users who use a Mac Mini as an entertainment center is probably so small Apple barely thinks about them. More likely is that HDMI has superseded the optical audio port for most users, so they save money by removing a port few use these days.
It's still a little weird. I think a lot of people would have a separate A/V receiver and speakers; maybe this is changing as old equipment gets replaced, but I suspect there are still far more homes with optical-capable A/V receivers than HDMI-capable ones.
This is just a standard annoying thing with Apple though. Their designs are very 'forward looking' in that they don't consider what potential customers already have so much as they do what their own future peripherals are going to need.
At this point HDMI is not a particularly new thing; my Marantz receiver from ~2007 had HDMI connections on it. Audio-only digital connections aren't necessarily going the way of the dodo yet, but they're becoming progressively more niche. So are home theater PCs, of course.
Anecdotally: I do have a Mac mini with my receiver, which I replaced this year. But the new receiver not only has USB input, it has wifi and built-in clients for Spotify, Tidal, Pandora, TuneIn, Roon, whatever protocol Windows uses for media sharing whose name I'm utterly blanking on right now, and a dozen or so other services, with Apple Airplay 2 theoretically coming in an update.
The combination of the price and the absence of a user-upgradable M.2 SSD makes this product unviable in my opinion. The current price is fine if I could upgrade the drive later. A non-upgradeable SSD would be acceptable if the price for larger drives were more reasonable.
If the SSD could be upgraded, buying an i5/i7 would have offered amazing upgrade paths down the road. This machine misses the mark IMHO, but not by much.
You can upgrade the drive, externally through tb3. There will be tb3 enclosures that exactly match the mini’s size (historically there always have been) and it won’t even look weird. I did that on a previous mini (firewire).
Having said that, I’m still going to get one with 512gb ssd, because I just want everything neatly in one box.
Well, quite a bit faster, in ways that are relevant to my interests. Stands to reason designing a thing like this together onto a bespoke board allows for some optimizations: it's hilarious that it's apparently faster than any other Mac anywhere ever, for single-core things that suit it. Including Mac Pros and iMac Pros… I just think that's amusing.
I'm now wondering if VCV Rack is sufficiently multicore that it will perform better on more expensive Mac boxes, or whether this little thing has now set the bar for ability to run demanding modular synth software live. That would be really convenient.
Building a PC mothership is certainly cool as hell and I'm not knocking it, but there's something to be said for 'this is my live performance rig, it's stable as a rock and it fits in my pocket. And it's cheap enough that if I'm headlining Coachella with my modular jams, I'll buy a second one and clone it so I have a backup, there onstage ready to be plugged in if there's a problem'.
I'm with Marco on this one. Looks nice. I know what I'm getting with a Mac of this type, and this definitely looks nice to me.
And it still has real USB A ports so you don’t have to hop your audio interface through a C to A dongle thing. Though I don’t know if a C->A adapter just reroutes wires or has circuitry with its own latency - either way, it’s a bit of a mess you don’t need with one of these.
(Assuming you use a typical USB A audio interface rather than something more exotic)
FWIW, I also do electronic music, though just as a hobby.
Nah, my audio interface is more exotic. Thunderbolt MOTU 16 channel in and out, 192k/24. So I love that it has so many thunderbolt ports, because they're spoken for. I think you might well be able to run two MOTU 16As and have 32 in, 32 out. You'd start to have issues with drive space but the computer can probably handle it.
I just customized a Lenovo ThinkCentre with a Core i7 (I’m assuming it’s the six core), with a 128GB SSD, 8GB of RAM, Windows Pro, and a WiFi card, and the price was $979. The equivalent Mac Mini is $1099.
People are missing the point about the integrated graphics. For me this is a feature not a bug because it has USB-C instead.
That means that instead of hopelessly compromising thermals and power supply with some power hungry, yet limited GPU (because of the small form factor) you simply buy the CPU and memory that you need. It simplifies the job of keeping it cool.
CPUs are evolving a lot slower these days. This one should last you many years before it becomes a problem. Having upgradeable memory all the way to 64GB means that too is not going to be a problem any time soon. It means that a mac mini should have a serviceable life of 3-5 years or more if you are less of a power user.
Additionally, you plug in an eGPU of your choice and additional storage via USB-C. Better, when improved eGPUs become available, you can sell your old one and buy a new one without having to tear the machine apart. They also don't overtax and compromise your power supply or cooling.
I currently have an iMac 5K that is nearly 5 years old now. I maxed it out at the time with all the bells and whistles and it has served me well. So, money well spent despite the initially shocking price.
I'd totally consider spending 3-4K on a setup with a Mac mini, decent eGPU + monitor, and external SSD storage (I actually have a Samsung 2TB T5 already). The new iMac Pro would cost more and deliver less value. What I like about this setup is that it is completely modular and I can replace individual components without having to worry about breaking the other components.
I imagine the Mac Pro next year will also emphasize expansion and upgrades through USB-C rather than internals. Basically, the old model minus the dedicated GPU, with upgradable Xeon CPUs + memory, would be exactly the right product right now. eGPUs can be replaced easily, and with a solid base configuration a pro machine should have a long productive life.
Sure, except the 'trashcan' Pro doesn't have room in it for anything either, it was also designed to daisychain drives etc. via TB. My very elderly 'cheesegrater' Pro of course does have lots of room for 3.5in drives and PCIe cards, but I have wrung about as much as I can out of it, so the new Mini has some appeal. I am skeptical that the new-new Pro of 2019 (assuming it materializes) will be designed with extensibility in mind...
It was a sarcastic attempt to point out that scattering the components of a computer into 3 or 4 units with separate power cords is an inferior design to the standard desktops we’re used to. I want a Mac desktop for under $3,000 that doesn’t require multiple accessories plugged in to behave like a desktop.
> If only they could make a single case to fit all these various computer components in
They do. It's called an iMac and comes free with an excellent monitor.
Now, seriously, unless you plan to do a lot of GPU computing (I don't) the internal Intel graphics are pretty reasonable. I can't see how a 64 gig mini would not be able to be my main machine for 5 years or more. With an updated SSD, my previous mini is a pretty good general purpose computer.
It's not so much that I want a gigantic GPU, but I'd like something that could just run two external 4K monitors without stuttering. That would make this an excellent general-purpose desktop machine. The 15" 2017 MacBook Pro does this fine, so it's hard to accept that it would be difficult to engineer something into place. Unfortunately, an eGPU is gigantic external box costing a large chunk of change, so it's not really a good option.
With a laptop, an external GPU is indeed a big issue in terms of form factor. However, in your case, your big 4K monitors are comparatively large as well, so having an eGPU that you can also use as a docking station is not necessarily the end of the world. Likewise, with a Mac mini this should be less of an issue. I'm also guessing this market will develop over the next few years with more attractively priced options aimed at people who just want a middle-of-the-road setup that works reasonably well for games and VR. Right now all the available options are targeted at people who in any case spend too much money on HW.
Besides, with Apple you are paying a premium for a dedicated GPU in a macbook, which makes the price of an external eGPU less of an issue. Because if the choice becomes paying 500 euro extra for a mac mini with a last generation laptop grade GPU or spending the same on a eGPU with modest specs (or a bit more on something fancier), that's a lot easier to defend. Especially if you can swap it out once every other year for a new one.
I'm guessing that with the mac pro they may offer some options for a dedicated, non upgradable GPU as well as eGPU options. Most professionals would probably prefer investing in the latter because it provides them performance without much compromise and an easy way to upgrade. E.g. I'm still waiting for Apple to make a move with AR/VR, which they sort of support but which they don't really actively promote currently.
Anyway, the Mac mini is not necessarily that interesting for people looking for cheap options. The starting configurations are nice if you don't need much, but more serious users are going to want bigger SSDs, more CPU, and more memory. That creeps into the Mac Pro use case already. I'm guessing that in terms of price segment this thing is designed to fill the gap below the Mac Pro, which will likely have a starting price of around 5K, just like the last one.
Well, THAT's interesting. I've been looking at some Blackmagic stuff for streaming and camera work, that would take burden off my desktop computer. It looks like the new Mini plus this plus outboard x264 encoding (also includes ability to run a good dynamic mic into the encoding) would add up to an insanely flexible livestreaming setup that could do both GPU-needing things and camera-based things without loading the computer, and be expandable later with SDI camera inputs and the ability to do production video switching.
It… didn't occur to me that I might be thinking about running such a rig off a Mac Mini. But then it didn't occur to me that a Mac Mini would come out and be in some conditions faster than any other Mac currently made. Interesting times…
I have an HP Omen Accelerator with an RX580 and a 2016 MBPr and it's plug and play in macOS 10.14. Last I checked, you could get most Nvidia GPUs working, but it required some spelunking. It may be different now.
I've been very happy with it overall. I have a 4K 28-inch monitor and a 30-inch Cinema Display plugged into the GPU, and the eGPU enclosure has a SATA port + Ethernet port. There are some gotchas; eGPU.io is a great resource.
Also, bootcamp with windows 10 works with my eGPU, but required some work. If you want more info, let me know.
The pricing is not really justifiable, especially given that most use cases for the Mac mini (home theater, NAS, backup, etc.) are better handled by other OSes.
The NUC8 with comparable specs (sans the 6-core CPU) comes in at under half the price! If you need the 6 cores, sure, but most workloads other than encoding don't - so it's questionable. Even then, the 6-core processor doesn't cost enough to make up for the difference.
What OSes are better for a "Home Theater" TV computer?? That's a serious question; I recently replaced a creaking old Mac Mini (that couldn't do 4K) with a new Dell, after going a little bicurious with respect to Windows 10 (for software development work)... and, surprisingly, it's a complete and utter shitshow.
I was astonished, actually; I had assumed Windows would be better than macOS as a TV computer, other than integration with Apple services (Apple Music, my kids' photos as a screensaver).
Nothing could be further from the truth. It's a shitshow. 100% of Windows media players are garbage (VLC included, and there is no Movist). They can't play high-bitrate video without stuttering (on way better hardware), they show some ungodly mishmash of scaled UI and tiny unreadable UI on a 4K TV, for each player you install (about 10, so far, for me) you have to google for an hour to make sure they aren't malware (and of course almost all of them nominally are, trying to install all sorts of insane adware shit during the install phase, although that is par for the course on Windows)...
It's been 2 months and I saw the new Mac Mini and despite the 200%+ markup on storage I couldn't help but think Hmmm....
My TV also has at its disposal Xbox (OK but not great), PS4 (pretty shit), iOS (Apple TV, pretty shit), and Nintendo Switch (has no TV computer features at all, basically).
So what OSes are you talking about? Linux?? Android?
> What OSes are better for a "Home Theater" TV computer
I use an Apple TV 4K. For non-netflix/hulu/primevideo stuff, it streams off a Synology NAS (using Plex or VLC for Apple TV)
edit: it's funny how your experience pretty much mirrors mine almost 10 years ago. A Mac guy outfitting a machine for a home media center. I was used to Mac media stuff always being behind on codec support and performance so I bought some HP PC. It never worked right, always had to mess around with audio outputs (HDMI audio output worked maybe once) and codec stuff kept breaking since either Windows or the Nvidia drivers hated trying to do hardware acceleration and you just got a green screen. We mostly ended up plugging in our MacBooks...
I think it's a philosophical difference. iTunes is more database driven (which I really liked at the time for music). Let it control your filesystem, drop in your files and it'll load from metadata. Any changes will only modify iTunes' DB. Use iTunes to decide what to sync to devices.
A lot of people were super annoyed coming from WinAmp. They organized their library by filename and the metadata was often a mess. They also wanted to treat iPods as USB drives and drop files on there.
Plex is different again in that it only looks at the filename and pulls metadata from the Internet.
Not sure if you are aware of this, but you have to keep your TV shows separate from your movies in Plex. When Plex scans files, it uses different logic for TV show and movie file names. I had a similar problem where I imported my TV shows as movies... some worked, but most didn't. After moving my shows out of my movies directory and re-importing them specifically as TV shows, everything detected perfectly.
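For reference, this is roughly the layout the Plex scanners are documented to expect (the titles here are made-up examples):

```
Media/
  Movies/
    Some Movie (2008)/
      Some Movie (2008).mkv
  TV Shows/
    Some Show/
      Season 01/
        Some Show - s01e01.mkv
```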
We've moved over to Plex from iTunes. The iTunes Media library folder is already sorted into separate Movies and TV Shows directories, so we just added those to Plex separately.
We're running the Plex app on an AppleTV (4th gen, not 4K), and it all works really well. The only issue is that it loses connection to the server maybe once every month or two, but I think that's related to my really crappy modem.
edit: Plex server is running on the 2012 Mac mini "Server" edition. Quad-core, baby!
I'll have to try this again. I put VLC on my AppleTV4, and I don't remember having luck playing things off my Synology. I'm using Kodi on my Android TV-equipped Sony to stream from the NAS, which works, but the UI is a disaster for adding sources. I finally got it to work, but I never want to have to change anything again.
For me, using Plex is an utter and complete nightmare. Anything with transcoding and library reindexing is a no-go. I've tried to be happy with it at least 6 times, and each try was an unmitigated disaster. I just want a thing to play files from storage.
Plex sucks on the Apple TV (and likely most other devices) because unless the video fits in a very narrow band of codecs (AVC, maybe HEVC with DD audio) it's going to insist on transcoding the video.
The absolute best setup I've found is Infuse (https://firecore.com/infuse) with Plex as the backend. Infuse will ALWAYS request the raw video/audio stream and does all the decoding on the Apple TV, so your Plex machine can be very modest (it's never transcoding, after all).
You can also forego Plex and just have Infuse index all the content, but I've found that it's nice to have a single Plex backend that can be shared amongst multiple Apple TVs, iPads, etc.
Which codecs do you mean? Guess I'm weird; I have only ever used h264 and now hevc, any others are only ancient lower-than-SD quality files which transcode without any issues. (Many files are .mkv; they still work without a transcode.)
Well, DVDs are MPEG-2, and some really old Blu-ray releases are as well. You also see VC-1 in older releases. I probably should have been a little more specific: Plex will insist on transcoding if the client can't handle the codec natively. Generally speaking, that means AVC or HEVC (on newer devices) only.
True, thanks. There's AV1, VP9 and related codecs too, but I've not seen many sources that only provide those and not one of the MPEG codecs as well. I've moved towards HEVC for anything I want to hang on to long-term, now that all my devices can handle it.
My Apple TV 4 natively plays 1080p H.265 with 5.1 audio via Infuse 5 Pro, served over the LAN by my Pentium Plex server with 8 gigs of RAM. Strangely, I am sure an old version of the Plex app did the same. I think it's more an issue of the way the videos are muxed and the audio codec.
Plex is the least nightmarish of the video library apps I've used in the last few years (Plex and Kodi are the only ones I've ended up using extensively, though I've tried out several others).
Transcoding is obviously a last resort, but I've used it on rare occasions for streaming live and recorded TV from home to the Plex app on my iPhone and it works reasonably well. For my home theater stuff I absolutely don't want any transcoding at all, and Plex is generally smart enough to figure out what formats my playback device is able to decode on its own. When I first got a Chromecast Ultra, I had to fiddle with some XML config files in Plex because Plex was only aware of the non-4k Chromecast and thus would try to transcode 4k content, but I believe that Plex resolved that issue fairly quickly.
The issues I had were that, first of all, Plex server wouldn't run on my ARM-based Synology, and then later, I think there was a build for it, but it didn't have the horsepower to transcode, and it seems like it ALWAYS wanted to transcode things.
I've also tried running Plex server on my Win7 HTPC with the volume mounted over CIFS, but I had various issues with that; for one, having to reindex the volume means I have to either wait 15 minutes to watch something I just procured, or manually log in and force a refresh, which is annoying when I just want to browse files. It's been a while, but I have tried a number of approaches (including Plex server on OS X) and none of them were satisfactory. Furthermore, I could never figure out why any of it was preferable to simply having a media player read files from a network store and play them. I don't care about "album art" type stuff; I don't download/store/play hollywood movies or music, so those features would be irrelevant even if I cared about them in general.
The one nice thing about Plex is I now have a few friends sharing their libraries with me over the internet. It's a cool feature, but I'm only using it from the client side, and I typically forget to even consult their libraries before paying to rent content.
Yeah, I just have mine running on my primary desktop (an iMac), but I've done a bit of research on consumer NAS products and the Plex support definitely seems spotty.
As for Plex wanting to transcode things, I mentioned having some issues with my Chromecast Ultra, but even that wasn't too bad. There should definitely be an option to not transcode video at all and let my playback device tell me if it really can't decode the file. As far as I could tell, that option doesn't exist.
I haven't had any issues with library refreshing with Plex on my iMac and movies shared over SMB from my Linux file server. Plex seems to watch the shares and do small incremental refreshes for new content very quickly.
> I don't care about "album art" type stuff; I don't download/store/play hollywood movies or music, so those features would be irrelevant even if I cared about them in general.
That's definitely valid. Wouldn't basically any playback device that supports DLNA (and can decode your files) work fine for that? For me, the library features are pretty important. I had done a lot of work getting Kodi on my Raspberry Pi setup how I liked it, then I switched to Plex which is just as good or better but with a lot less work and maintenance.
I haven't used any Apple stuff, but I've gone through many iterations of this over the past 15-ish years, going from MythTV to SageTV to Plex+Netflix, and using various iterations of Linux, Windows 7, 8, and 10 on everything from leftover PC part builds to the SageTV appliance to dedicated Mini-ITX HTPC systems to Raspberry Pis.
My main criterion is being able to use everything from the couch, using a single remote and no keyboard. I was close to that with the SageTV boxes, but Google bought SageTV and it started becoming obsolete, and there was never really a good Netflix integration anyway.
I find the trouble with all PC-based stuff is switching between things and inconsistency between apps. Netflix on PC doesn't have a good way to do keyboard-based control, so remotes don't work well -- you pretty much need a mouse-style pointer. Plex works GREAT but the keyboard works differently than it does when you're using YouTube, and there's not a great way to switch. There's also all kinds of dumb problems getting multi-channel audio working with Netflix since they require some type of certified system or something.
A couple years ago I gave up on PCs and bought an Nvidia Shield, and it's basically the perfect device for me. AndroidTV is a great launcher, it supports everything I use (Plex, Netflix, Youtube, Weather, Baby cam monitor) and everything works with a single remote (I use a Logitech Harmony to control it + the TV + audio). No compatibility, rights or audio issues. And bonus: it is a Google Cast device, and also works with an OTA tuner (Silicondust HDHomerun -- though I never actually use that anymore). My TV never changes inputs: it's basically just a dumb display for the Shield, which now does 100% of content playback in my house.
It costs maybe a bit more than a nice Mini-ITX HTPC build, but it just works (whereas an HTPC will take hours of time to get working semi-properly).
Yeah, similar feelings. MythTV and SageTV both got me entirely off non-DVR broadcast TV. The idea of 'show is on at x time' or 'show is interrupted by commercial breaks' is completely foreign to us at this point. Also, even the MythTV interface of a decade ago is superior to all the DVRs I've seen at my family's and friends' places locally: they all seem to do "live TV" first, then the DVR bit is an afterthought in a menu. To me, Live TV is the exception that I rarely use, and even then, I want to go to the Guide first -- not just start "channel surfing".
I did spend some time trying to get the SageTV backend working with Plex, but scheduling required an awkward web app (not wife friendly) and I could never get it naming everything in a way that made it easy to navigate with Plex. Around this time, my cable company also turned off their unencrypted QAM channels so I could no longer tune them, and that was really the nail in the coffin for me. I don't pick up enough OTA channels to make it worthwhile.
When Plex released their PVR thing, since I had the gear I set it up, but honestly, I just never watch it.
Kodi is recommended if you want a media PC connected directly to your TV, but personally I prefer Plex with a Roku and the media streamed to my TV. The only negative is that it's not useful for anything besides video.
The only downside to the NUC is that it maxes out at 32gb of ram. I have the fastest i7 model ($1,600usd) and it's an absolute beast. I offload my compiling and services to it and it handles everything like a champ, gpu stuff notwithstanding. I do hit the wall with some stuff at 32gb so a 64gb option would be ideal. I would never buy a mac to run servers and I would never buy a NUC to replace my Mac workhorse.
I personally use Xbox and Roku along with Plex running on a windows desktop computer which is powerful enough to do transcoding.
All of the video files reside on a local Windows machine and a cheap storage VPS in the cloud (located in NL). Both machines run Plex server and auto-transcode all content to an MP4 container (HEVC for 4K content). I force Plex on the Xbox to direct-play all content, and it usually works pretty well.
>I had assumed Windows would be better than macOS as a TV computer,
You assumed correctly, circa ten years ago. Unfortunately the best solution bar none was killed by Microsoft: Windows Media Center Edition. It had a killer ten-foot interface and actually supported CableCard digital TV tuners. I can only assume it wasn't a big revenue maker and that the support calls were pretty high. Not to mention it never had more than a lukewarm reception from cable companies, which didn't appreciate the competition for set-top tuner boxes.
These days if you really want to roll your own you can buy an HDHomeRun and let it stream to you. It doesn't install any malware, but I haven't seen the UI, so no clue if it's beautiful or absolute crap. Since it can stream via DLNA, that probably depends on the device you stream to as much as anything. I punted and used YouTube TV, which I'm about to cancel since hurricane season is over. Like the old song says, "500 channels and nothing's on."
I previously used a Mac Mini as an HTPC, but have switched to a separate NAS running Plex server plus a streaming box or stick as the client. (The server is currently a Synology, but anything with Docker and maybe hardware-accelerated transcoding should be good; the client is an Nvidia Shield, but the 4K Apple TV and Roku are supposedly OK too.)
I like the flexibility of having them separate (upgrade separately, keep in separate rooms, etc), and having a unified “10 foot” UI I can control with a remote to access Plex, Netflix, etc.
You could easily get a 2 or 4 bay NAS + HDDs + Shield/Roku/AppleTV for less than the $800 base price of the new Mac Mini. Unless there’s a reason you need Mac OS I wouldn’t get the Mac Mini.
If you’ve already got the Dell I’d consider just using it as a server and getting a Nvidia Shield as the client.
It's not particularly great with the Pi III either. I have one, and I find it's underpowered for the "media server" role, especially if more than one person connects. I like it for lots of other things, but this isn't one of them.
Am I the only one, though, who uses the Pi with a (Synology) NAS running MySQL (MariaDB) to store the Kodi data, and the NAS (NFS) for the content? You can pick up where you left off from your Pi, your laptop, or your phone.
I know...it's like Plex in architecture.
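The shared-library part is just a few lines in each device's advancedsettings.xml pointing Kodi's video database at the NAS (a sketch; the host and credentials are placeholders):

```xml
<advancedsettings>
  <videodatabase>
    <type>mysql</type>
    <host>192.168.1.10</host> <!-- the NAS running MariaDB -->
    <port>3306</port>
    <user>kodi</user>
    <pass>kodi</pass>
  </videodatabase>
</advancedsettings>
```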
As for the remote, CEC will let you control Kodi from your TV's remote. Or if you have separate audio, like a soundbar or AV receiver, a Harmony remote will do.
I used Linux Mint on a Lenovo Thinkcentre micro-PC for a while, at 2x scaling on a 1080p TV. Now I've changed over to KDE Neon (because the KDE UI is more flexible) at around 2.5x scaling, and it's a bit more comfortable. The main input device is a Logitech wireless keyboard with a touchpad. I mostly play videos through SMPlayer/VLC, so it doesn't need anything fancy.
A dedicated 10-foot UI would maybe work better, but as a low-cost low-effort solution, this works great. I got the PC for free (somebody was throwing it out at work) and the keyboard plus a DP to HDMI cable was $35 total. It plays 1080p x265 just fine.
Have you considered any of the ELEC releases? These are Kodi on minimal Linux with a diverse plugin ecosystem, with Docker support. I am using LibreELEC. There is a live option in the installer to give it a whirl.
A couple of years ago I would have argued with you, but we recently added a WebOS TV to the household: 4K HDR+ Netflix and Amazon Prime, perfect playback of seemingly everything on the NAS box (which is just a hard drive hanging off the router), all effortlessly and with zero mental effort or maintenance.
To reiterate what I said elsewhere, this Mac Mini is meant to be a desktop, not a media device (or some sort of perverse NAS). It's being pushed into the home theater niche by people who probably have a big trashcan Mac Pro at their desk and try to justify this lesser device. But it's an extremely competent legitimate PC for most people.
Yes, this precisely. Smart TVs of all flavors as well as Roku boxes voraciously mine your viewing habits, connected devices, network layout, mic input, and practically anything else they can get their dirty little hands on. They’re usually running out of date OSes and are full of holes to boot, making you a potential vector for smart device botnets.
Smart TVs and Roku boxes simply cannot be trusted. I have a brand new, relatively expensive Sony smart TV and while its Android TV support is more than adequate I don’t use it at all and keep it off the internet at all times. Instead, I use an Apple TV 4K.
Which is what I'd do, but unfortunately Google insists on using its own video codec (VP9) for 4K instead of H.265, meaning your 4K Apple TV can't play YouTube videos at 4K. My smart TV can, though... We use the Apple TV for everything else.
I personally use Chromecasts, but HTPCs are still a thing, and for multiple reasons:
- gaming (console "alternative", emulators, etc)
- streaming a personal library of media files (e.g. through Plex)
- it's way more versatile than the usually pretty terrible SmartTV onboard software in terms of what content you can access - like Amazon Video that only streams to their own crap stick
Media PCs never stopped working, and they're still better at some things. They're just less "convenient" in some ways, and require more investment, both in time and money, which is why some people switched to streaming boxes or simply to their TV's software.
>it's way more versatile than the usually pretty terrible SmartTV onboard software in terms of what content you can access - like Amazon Video that only streams to their own crap stick.
The sad thing is: this has reversed. I gave up trying to watch Netflix and Amazon Video from my HTPC. The former at least has an app, and now even supports all formats (Dolby Vision 4K, Atmos), but the controls are terrible for a remote.
The latter doesn't even have an app, and all you get is HD + stereo.
And don't get me started on Blu-Ray.
Because it's 2018 and media format support in smart TVs is still garbage. I have relatively new Samsung and Sony TVs, and they are both really finicky about what they support - various combinations of container format, video codec, audio codec and bitrates work on only one or the other, and plenty works on neither (including 10-bit video).
I went through this when I built a Win7 HTPC perhaps 5 or 6 years ago. I was so happy to no longer have to use Connect360 on my Xbox 360 to stream transcoded video from my Mac Pro (which I had to power on to run the whole thing). Finally I'd be able to just play files directly from my NAS and tap into the endless world of Windows media support. No more transcoding from whatever stupid codec/container/disaster the 'scene' is using these days. It would Just Work, because all of these people use Windows and it plays everything, right?
Nope. Unmitigated disaster. Same thing: navigating endless spy/mal/adware-laden sites to download CCCP codec packages and plug things together, and every time I installed one thing I found another thing that didn't work right. I have some weird Jockersoft tool to force some other application to keep running and restart if it fails, and even though I no longer use whatever that application was, I can't uninstall the goddamned Jockersoft thing to save my life.
Windows Media Center itself was actually a joy, though, I really enjoyed how well it worked for the things it could do out of the box which were far fewer than I had been led to believe.
I also realized that a decade+ of using OS X had left me unprepared to navigate the latest online threats in the world of not-entirely-legitimate download sites for Windows utilities. It got really bad while I was away.
I would say most uses of the Mac Mini are simply as a desktop computer. It's what I use as my primary OSX workstation -- I have an old late 2012 model, upgraded with 16GB and an external SSD that I made the primary drive -- and it works fantastically. I plan on getting the new model.
Two monitors. Monitors that are exactly what I want.
The i7 (3615QM) in it holds up extremely well. The storage was catastrophically slow, but the system let me effortlessly migrate to an external SSD (the internal storage now just holds Time Machine backups).
Finally considering upgrading as it has been a while.
The low power consumption of the thing is one thing that always just blows me away. No noise, at all. My UPS seems to last forever when the power does go out.
Yes, you only have to compile once, but as soon as you add new native modules, you have to recompile the whole app. You can basically use any other OS to write RN code: if you don't plan to add new native modules, you can compile/deploy your app once from a Mac, then use CodePush to remotely update the JS from any other OS.
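For what it's worth, the client side of that is tiny - a rough sketch assuming the react-native-code-push package (the deployment key and native setup are done once, from the Mac):

    // index.ts - minimal sketch; "App" is your existing root component.
    import codePush from "react-native-code-push";
    import App from "./App";

    // Wrapping the root component makes the app check for a pushed JS
    // bundle on every launch and apply it without an app-store release.
    export default codePush({
      checkFrequency: codePush.CheckFrequency.ON_APP_START,
    })(App);

After that one-time native setup, pushing an updated JS bundle is a CLI call you can run from any OS; only new native modules force you back to a Mac for a full rebuild.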
An Audi SUV's price isn't justifiable given almost every use case for a consumer automobile, but a lot of people enjoy the experience and find it worth paying tens of thousands of dollars more than a Camry. By comparison, if you work on a computer for several hours every day and prefer the ergonomics of a Mac, spending an extra $1,000 is a pretty minuscule premium.
And I say that as someone that happily switched from a Mac to a PC I built a few years ago - I like getting the maximum grunt for my money.
The NUC8, like pretty much every small form factor PC, uses a mobile CPU, whereas the new Mini uses desktop CPUs. You have to look at the top-model NUC8 with the i7 to outperform the entry-level Mac Mini with the i3, and at that point the price advantage disappears.
That CPU power comes in handy for most of the Mini's use cases: transcoding for home theater / NAS duty, or build times for a build server.
>The NUC8 with comparable specs (sans 6 core CPU) comes at under half the price!
I'm actually very interested in these - could you point me at some? I've been interested in getting a NUC8 machine, but the cheapest I can find one for (granted, in the UK) is £750 before RAM & SSD... and I have been seriously considering the new Mac Mini as a result because the final price doesn't seem like it'll be much different.
I have a NUC7 (NUC7i5BNH) in a silent Akasa Newton case. It's great, I love having a completely silent computer.
I tested the original case, and it was fairly noisy, like a busy laptop. I don't recommend it.
You may find it slightly cheaper if you install the RAM and SSD yourself. This would be easy with the normal case; installing the silent case was trickier, since it requires using thermal paste to connect the case to the CPU.
In case it wasn't clear, user 21 was responding to user garblegarble with alternatives to the Intel NUC8i7BEH (~$484 in the US), not the Mac Mini.
But yes, the decked-out i7 Mac Mini is really expensive! As noted elsewhere in the comments (and in the article), a big chunk of that cost is in Apple's 1TB SSD pricing - ticking that option adds $800 over the 128GB SSD for some reason.
I have the 5i5 (Skylake) NUC running as a HTPC, and it's pretty nice. At the time I think I paid about $600 with the RAM and an SSD. I use it for the typical things, playing music and movies, but also for some lean-back computing on the couch when I don't want to sit at my desk to do email or whatever.
A quick glance at Amazon (US) says that a 7i5 NUC with 8GB of RAM and a 250GB SSD is $540.
If you're bought in to the Apple ecosystem, you can get a good HTPC by buying a 5+ year old Mac Mini from Craigslist. I've seen 'em for around $100, and unlike an Apple TV, you can run whatever additional Mac software you want. My 2010 is running great, recently upgraded to an SSD, and thanks to a kind soul here on HN pointing me to a link, it can even run Mojave even though it's technically unsupported.
Meh… the things I'm looking to do require some fancy outboard and don't ask all that much of the host computer. I've been looking at getting a fairly spiffy newer iMac or medium-old Mac Pro to hook the fancy outboard to (things like video production gear that'll take SDI inputs, external hardware x264 encoding, my MOTU 16A multichannel interface).
It's looking very much like all that can be done just as well off the back of this tiny computer that's less than $1000. I was expecting the computer side to have to be more than $2000 to properly handle the fancy outboard. What with the Mini doing certain single-core processing tasks faster than ANY other Mac, and even having some RAM upgradeability (?!?) there is no question that it'd be able to work with the gear I mean.
I don't need my host computer to be a quarter the cost of all the outboard gear that plugs into it. It'll be roughly the same cost as each of the pieces of outboard gear. That doesn't seem too expensive at all.
I agree with you, but somehow I feel like Marco gets a pass - he's not really trying to run a tech review site to give this thing an objective score; it's his personal blog, this is his opinion, and this is the model he finds interesting.
I know it's probably not fair to give him a pass on reviewing the top-end, because really it is a review and should be held to the same standard as other reviews, but something about his tone makes me not care so much.
But more than that, Marco Arment actually uses high-end rigs in his daily work and describes pushing them to the limit. It is highly appropriate for him to review a Mac Mini with high-end specs, and I find it informative.
It outperforms every previous Mac Mini, and doesn't benchmark that far below the 2018 i7 MacBook Pro. This is a desktop i3, and the desktop CPUs are in a different performance class from the mobile line.
He did test one thing to match a lower end configuration:
>Interestingly, I disabled Turbo Boost to simulate the base i3 model’s thermals, and couldn’t get the fan to spin up audibly, no matter what I did. Those who prioritize silence under heavy loads should probably stick with the i3.
Watching on iOS I had no sync problems. Maybe it’s a transcoding problem caused by YouTube? At 1:25 you can clearly hear how when he turns away from the camera the audio changes as well. I’d say the audio is recorded live, not dubbed.
He makes Overcast, the best iOS Podcasting app. Many good features to handle all the fiddly edge cases. Almost no complaints and he keeps making it better.
The one thing I wish he'd add is the ability to limit concurrent downloads, to prioritize at least one of them finishing. I routinely get bad wifi, and seeing 8 downloads at 55% is maddening.
I do agree with the wasted-space part, but why would you call it ugly? Also, I don't find the UI intuitive at all: for many actions, like getting to the list of episodes of a podcast, you have to kind of start from the beginning. But ugly? No. In fact I find it pretty clean.
Boy oh boy, I can't stand that sound. Must be years of being trapped in other kids' parents' cars, who were educated and listened to NPR. I just wanted to listen to Big Douche and the Boys talk over the intro to Enter Sandman, like my high school dropout parents would.
The mic has a switch on it that imposes a hardware high-pass filter to counterbalance the proximity effect you get from being close to it. Since the mic is a cardioid mic (directional sound pickup) it would normally pick up a great deal of extra bass due to the geometry of the cardioid capsule (radio announcer sound, big and deep and bassy). The switch lets you take away that bassiness and the pop filter prevents the mic from going 'THUMP' when you say 'P' into it, and the combination of these things gives you that super-present, 'NPR interview' sound.
Most studio mics have a setting to taper off the bass end of the spectrum. A pop filter is a grate/net that sits between the mic and mouth to stop letters like ‘p’ sounding like a blast due to the air flow.
The prices for those SSD upgrades are more than double those of current retail pro-level SSDs at the same capacity and similar speed - and that's despite Apple saving on the SSD controller, getting better NAND pricing, and paying no retail or distributor margin. Even though I expected the Apple SSD upgrades to be expensive, those price levels are simply ridiculous.
The DRAM is overpriced too, but at least I can upgrade that myself to 64GB.
Ugh, all the comments about how it's overpriced in this thread.
"This is crazy! You can just build your own for half the price!"
Snooze. You could have made this comment about most of Apple's products for decades. Some people apparently just don't get it, so here it is from my perspective:
As long as Apple does what I need, I'm not going to buy anything else. I'm definitely not going to build my own PC to save a few bucks. I recognize that not everyone is in this position, but my time is valuable and the difference between "click buy" and "research and buy a ton of parts and assemble them and install software and drivers and blah blah blah" is worth thousands of dollars to me.
I rely on a Macbook Pro for work every day. I generally keep them for 2-3 years, and given my preferences, work setup, software, and how much money I make with these things, I'd pay triple (or more) what Apple currently charges for them vs. getting some shitty Dell or Lenovo and hassling with Windows or Linux. That's not to say everyone should feel that way, but that's my situation, and I'm not alone.
But Apple's strategy has never been to go after the price-sensitive market. Complaining that their prices are way higher than commodity parts you can piece together yourself makes zero sense. That's like complaining that a Ferrari is way more expensive than a Honda.
Yep, good point. I read someone online commenting that Apple's "price sensitive" option is second-hand devices, as they (historically) remain updated and working well for a long time. Hopefully the new 3rd-generation keyboard will stand up to years of use - I'm not at all confident that my 2017 will have much resale value due to the keyboard issues.
In late 2014 I replaced the disks in my 2008 Mac Pro at work and my 2009 Mac Pro at home with Samsung EVO 840 and 850 SSD disks. I kept track of accumulated writes.
Both machines were heavily used for development and consumer type stuff. No big data stuff or big media stuff.
Samsung rates these things at 150 TB write endurance for the 850 and something around 120 TB for the 840.
My projection based on that usage is that it will take over 35 years to reach 100 TB.
Samsung's ratings are quite conservative. Reviewers that have put these things through write-torture tests to the point of error have typically gotten several times Samsung's rating - around 170 years' worth of writes at my usage rate.
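To make the arithmetic explicit, here's the back-of-the-envelope projection (the write total below is a placeholder consistent with my numbers; substitute your own SMART counters):

    // SSD endurance projection from observed write volume.
    const yearsObserved = 4;    // late 2014 to late 2018
    const tbWritten = 11.4;     // placeholder: accumulated host writes, in TB

    const tbPerYear = tbWritten / yearsObserved;  // ~2.9 TB/year

    const ratedTB = 150;        // Samsung's 850 EVO endurance rating
    const tortureTB = 500;      // assumption: "several times" the rating

    console.log(`rated:   ${(ratedTB / tbPerYear).toFixed(0)} years`);    // ~53
    console.log(`torture: ${(tortureTB / tbPerYear).toFixed(0)} years`);  // ~175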
I haven't seen numbers for Apple's durability, but if it is within even distant sight of Samsung's EVO performance you can expect to have retired the computer for other reasons long before the SSD dies.
I'd expect that the only reason the SSD might lead to getting a new computer is that you want a bigger SSD, but even that might not be an issue because Thunderbolt 3 gives pretty good performance for external disks.
My Mac Pros have both been retired, replaced with a 2017 iMac (one iMac could replace both office and work computer because we switched to working at home). My Samsung EVOs are now on the iMac in an AKiTiO Thunder3 Quad Mini enclosure.
The internal SSD in the iMac gets just short of 2000 MB/s for both read and write, according to Disk Speed Test from Blackmagic Design. The Samsung EVOs get around 465 MB/s write, 520 MB/s read, so 1/4 the internal disk but still fast enough for most purposes.
Note that the Samsungs are SATA drives. Something that made more direct use of TB3 could probably go much faster.
This article makes me happy - it suggests that Apple is listening. I'm super excited for the next MacBook Pro, and I really hope we get something that is better, more reliable, and more practical. I'll be thrilled if they can find their groove again with the MacBook Pro.
I'm in the market for a new machine, but unfortunately that GPU is too anaemic to make sense. It's a shame, because otherwise it looks like a pretty great proposition. An eGPU is an option, but at £500+ and twice the size of the machine, not a very attractive one.
I guess the options are throwing down the cash for a 15" MacBook Pro, or waiting to see what they come up with for the Mac Pro next year…
Is that really an issue with Intel's integrated GPUs? Honestly asking.
I'm considering a Mac Mini, but there's no way I'm going back to low PPI displays and the lack of affordable 5K / 27" options concerns me (in my opinion 4K/UHD at 27" is a borderline unusable combination with its effective resolution of 1080p - feel free to convince me otherwise).
If those few 5K options can't be driven smoothly by an Intel GPU, it's settled and I'm waiting for an iMac update instead. Ideally I'd like a multi monitor setup though.
It's better than 1080p as it is much sharper and crisper, but 1080p at 27 inches results in rather large UI elements. An effective resolution of 2560x1440px (as seen in any iMac 27) seems like the proper resolution for a display of that size.
I'm curious about this too: does the Mac Mini offer "scaled" options, like MacBooks do with their built-in display, when connecting to an external 4K monitor? I think it does some supersampling, resulting in a higher effective resolution.
Sure, it does offer scaling, but you really want 2x scaling - I tried non-integer values on a 5K display, just to get a sense for it, and it was as disappointing as expected: slightly blurry text, rather obviously not the recommended native setting.
In my opinion 2x scaling is the only option, and that leaves you with an effective resolution of 1920x1080 points - sure, it's sharp, but far from true 5K, and more importantly the same screen real estate we used to fit into 20-24 inch displays for years. It's the wrong resolution for 27".
Yeah, I was doing 2x4k plus the internal screen off of the MacBook with a 555, and it was totally fluid. I would have happily forked out for the Mini if it had that GPU; it seems like a bit of a strange decision for such an otherwise powerful machine!
Is anyone worried that Apple's transition to their own CPUs will render these x86 machines obsolete very fast? How long did it take for the pre-x86 Macs to fall behind when the software started focusing on x86 instead?
I remember when they went to x86, but I think the landscape is different now. I was working in graphic design at the time, and I didn't know a single person outside of that field who used a Mac. Now I'm surprised when someone's laptop isn't a Mac. So I imagine there's a lot more critical software that needs to be ported.
One area I can speak to now, which I'm heavily invested in, is audio applications. It's the main reason I still run a Mac. Upgrades are usually slow and cautious in this domain. I'm still on 10.12.6, and have no plan on upgrading anytime soon.
I, for one, am quite concerned. It used to be about performance, but now it's all the applications and plugins that will need to be ported. There's a very real chance I'll go to Windows if it looks like a nightmare. I would love to see a Ubuntu variant come out of the ashes of such a thing, but that's probably a pipe dream.
Yep, it's time for me to change my laptop at work. I currently have an old i5 MacBook Pro and it has held up for 5+ years. Even my personal MacBook Air is from late 2010 (a reddit + Spotify machine). I doubt that any Mac today would hold up for another 6 or 7 years.
My personal issue with it is the lack of standard inputs, which makes it impossible to use with an eGPU setup (excluding the Blackmagic, but that's a whole other kind of subpar). The data ports being exclusively USB-C is probably an issue for some as well.
It was partially the 580, partially the inability to swap out the card. It's nice that they're offering Vega options now, but for that model I'd be paying a $500 premium over a comparable setup, with TB3 out being the only benefit. I'll stick with my Dell P2415Q + Akitio Node for now.
All of my experiences with Mac have been like this.
Got a MBP for college in 2009. Still running as my mom's laptop in 2018. Meanwhile my dad's 2-year-old Dell has endless problems and feels incredibly cheap, despite having better internal hardware: it's slower, needed tons of configuration out of the box, and Windows 10 is just awful. My dad's laptop constantly has issues. My mom's? Never once. It just works.
I will gladly pay for a product that is like that. I don't care what an SSD costs on Amazon. I don't want to spend time looking up components and motherboards and bargain hunting. I don't care about gaming and therefore the GPU. People on here look at numbers and costs but they never consider the customer experience.
It's far more damning of Apple to have the bad keyboard on the new MBPs than have some overpriced hardware. Ask 99% of people on the street if they even know what an i7 is.
I've bought a number of Macs since I switched in 2007 and I've had a number of tragic stories.
My wife's previous MacBook Air died during its second year. It was working fine and one day it didn't turn on. Apple Mexico asked for close to $1,000 at the time to replace the logic board, which was simply ridiculous.
My top of the line 2011 MBP died when it was 2.5 years old because of a known GPU defect. Apple fixed it over a year later but it was too late. I already had a new machine and the second hand value of the 2011 plummeted. I ended up giving it away to a junior dev in my team a couple of years later.
My current laptop is a 2014 13'' rMBP. I wanted to change the battery, and Apple Mexico asked for close to $400, arguing that the complete top panel had to be replaced. I ended up doing it myself for less than $100.
I still prefer Macs for working because of macOS, but I don't know what I will do when my current laptop dies.
I'm sure lots of people have stories like ours, one way or another. Things break. Not everything is perfect, but sometimes it is. How often are Macs really failing? We hear stories because Apple is hated and YouTubers love scenarios where they can make an attack-Apple video. But without actual numbers we have no idea if this is a trend or not. Consumer Reports regularly ranks Apple as having the lowest failure rates.
For a long time, smart car buyers never bought freshly redesigned vehicles. Why? Because their reliability was unknown. As hardware gains continue to diminish, perhaps it'd be wise for us to take the same stance with electronics - and wait a few years.
> I still prefer Macs for working because of macOS, but I don't know what I will do when my current laptop dies.
Sadly it's not much better on the other side. Premium Windows machines still can't get basic things like the touchpad right, and Windows 10 is pretty bad. I'd gladly get another machine, but nothing offers what I like about my Macs. The Linux options aren't worth bothering with for me: most people don't want to deal with the limits, and there aren't a lot of manufacturers.
My biggest gripe is not that things fail (that's completely expected) but how Apple reacts to that.
For example my 2007 MBP suffered from Nvidiagate. The GPU died during its third year, many months after the warranty had ended. Apple fixed it, no questions asked.
The 2011 Radeongate affair was ridiculous. There were thousands and thousands of users complaining online. It took Apple 2 years from the first machines failing to start a repair program AFTER a couple of class action lawsuits. It was a massive fuckup.
I haven't bought any of the redesigned MBPs with the butterfly keyboard, but again it took a couple of years to get a repair program after a couple of class action lawsuits. Also, in the US Apple is all fine and dandy, but in Mexico I've personally witnessed cases of Apple refusing to repair the keyboard because apparently they couldn't reproduce the issue.
> But without actual numbers we have no idea if this is a trend or not
Yeah, Apple is as opaque as things can be - even more so now that they will not even share the number of units sold in future reports.
I mean, if we're giving anecdotes, I bought a Dell laptop in 2006, refurbished for something like $600. I was able to upgrade it over the years all the way to Windows 10 (I think I paid $40 for the Windows 7 upgrade at one point). The only thing that failed for me on that machine was the built-in wifi. I only ended up getting rid of it two years ago because I really had no use for it, and had long since replaced both my laptop and my desktop machines by then.
Topically: at one point I was driving my television off this old E1505, then got a 2010 Mac Mini as a Christmas present and hooked that up instead. Netflix and Hulu chugged on the Mac Mini, which also locked up for no reason from time to time. I literally installed nothing but Flash, for playing back videos.
Hulu and Netflix ran without hiccups or lag on the 2006 Dell laptop, so we put the laptop back.
Yup, I have a 2009 Core 2 Duo Mac Mini and it does Netflix just fine thanks to HTML5 video. It still does everything I need reliably (web browsing, non-VM web dev, MS Office), but I will upgrade to the new Mini and hand this one off to the kids.
Meh. I had a 2007 Mini that crapped out one month after the warranty expired. Took it in for support and they told me to buy a new one; they weren't interested in taking it on. To be fair, this was a specialist Apple dealer - at the time there was no local Apple store.
My general experience with Apple products has been 50:50, some good, some bad. Apple is nothing special when it comes to quality; better than some is about all I can say.
1 - it wouldn't be suitable for me now (nor for the past few years).
2 - I remember there being a big differential between equivalently priced machines back then. One of my current work Dell workstations was a much cheaper option than this Mac, is much better specced, and I've had no issues with it for over two years.
3 - It's £800 for the thing, which seems way out of line with inflation compared to what I paid back then.
Looks promising. I've ordered one to replace my ancient cheese grater Mac Pro, and it should be quite an update.
I've always been fond of the old Mac Pros (still one of most beautiful machines from Apple), but the extensibility through TB3 has rendered many of the advantages moot (and my Mac Pro is stuffed to the gills).
Side discussion: What are everyone's thoughts when it comes to external HiDPI monitor options to use with this Mac Mini?
Ideally I'd want 5K displays (I'm used to the iMac 5K) but it seems that LG's UltraFine is the only real option and the rest of the industry has settled on 4K/UHD at 27 inches, which results in a) limited screen real estate due to an effective resolution of 1080p at 2x scaling and b) slightly less pixel density.
I may be fine with b), but with both issues considered I'm seriously wondering why UHD displays at 27" are so popular - it seems like a subpar and regretful combination. Are my worries unwarranted?
I've been using Dell P2415Qs. I run 3 at the HiDPI "looks like 2560x1440" setting. It's not perfect 2x scaling, but it is close enough (185 dpi), and the price ($300-400 each) can't be beat. I've considered switching to the 21.5-inch LG UltraFines (220 dpi), but they are $700 and have very limited port choices compared to the Dells. I use the Dell displays with multiple other machines that don't have USB-C graphics out, and I want to be able to use the displays with eGPUs that won't have USB-C out.
I just hope the default GPU of the mini can drive 3 UHD displays without choking on the dock animations :|
Well, at least 24" is more reasonable for UHD, i.e. 1080p at 2x scaling. However, I currently use a non-Retina iMac 27 at home (which I'm looking to replace) and an iMac 5K at work - I really want both: screen real estate and Retina-level pixel density.
Concerning the UltraFine: yes, it's too expensive, and since I'd need to connect it to another PC without Thunderbolt, it's not an option for me.
I love my P2415Q, will probably get a second one as well. My only gripe with it are the rather large bezels, compared to some of Dell's other offerings. It really is a shame that basically no one is focusing on HiDPI monitors, especially when 4K laptop displays are all the rage now.
I have a 31.5" 4K display I'm using at native resolution with my MacBook Pro and honestly I can't wait for 8K... I wouldn't want to trade off any of this screen space but every day the low ppi bothers me.
macOS only works properly with integer scaling factors - 1x, 2x, theoretically 3x. Everything in between will lead to blurry text/assets and decreased performance, since the output as a whole will be rendered larger and then downsampled by the GPU.
I'm working a lot with text and a sharp/crisp display is important to me. 2x scaling is the only proper option in my opinion.
>Everything in between will lead to blurry text/assets and decreased performance, since the output as a whole will be rendered larger and then downsampled by the GPU.
I run 2 27" 4K displays off a MacBook with integrated graphics, both scaled at 1.5x to "look like" 2560x1440. It works fine; text is not quite as sharp as it would be at 2x, but it's far better than it would be on an actual 1440p display. I haven't noticed decreased performance, but I'm not doing heavy graphics work.
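To make the mechanism concrete - as I understand it, macOS renders every scaled mode at 2x the chosen "looks like" size and then downsamples that framebuffer to the panel's native pixels. A quick sketch:

    // macOS scaled modes: render at 2x the logical size, then downsample.
    function scaledMode(looksLike: [number, number], panel: [number, number]) {
      const render: [number, number] = [looksLike[0] * 2, looksLike[1] * 2];
      const factor = render[0] / panel[0]; // downsample ratio to native pixels
      return { render, factor };
    }

    // A 27" 4K panel set to "looks like 2560x1440":
    const m = scaledMode([2560, 1440], [3840, 2160]);
    console.log(m.render, m.factor.toFixed(2)); // [ 5120, 2880 ] "1.33"

So the GPU is pushing a 5K framebuffer, and the non-integer 1.33x downsample is exactly why text looks slightly softer than at true 2x.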
Agreed that display manufacturers are messing this up. 4k displays should be either 20-24" for 2x scaling or 40+ inches for unscaled; 27" displays should be 5k.
There is also the Philips 275P4VYKEB, but you need two DisplayPort connections to power it. IIRC only some newer, not-widely-adopted DisplayPort standards (on both displays and GPUs) support 5K over a single cable. Apple's iMac 5K uses the same dual-DP setup internally to drive its display.
Not high enough DPI for retina, not big enough to avoid scaling. The monitor market seems to be messed up at the moment; there are LG monitor models which were shown at CES 2017 that are barely appearing in some countries today.
The Mac user consensus seems to be that the panel is great, but everything around it is flakey (at least compared to previous Apple displays) - plugging and unplugging doesn't work 100% of the time, stuff like OS integration of brightness and volume keys isn't seamless etc. Looking at user ratings it seems like reliability isn't great either but that could be skewed.
Anecdotally... I bought an LG Ultrafine 5K when they came out - I believe I've got one of the initially faulty ones that can't go too close to a WiFi access point due to lack of shielding.
It was flaky as hell when it first came out, but within about a month of receiving it the problems largely stopped occurring.
I typically unplug and plug it in once per day when I take my Macbook Pro away to sit somewhere else in the evening. But when I'm working during the day I use it as my primary display. For my own experience the integration of brightness and volume is seamless. The speakers are good (although I use some old Genelec monitors instead for this), and the quality of the display is absolutely fantastic as you said.
About one in fifty times plugging it in to the laptop doesn't immediately work, and then I plug it into a different USB-C port on the laptop, and then it does.
Just one data point: I have two of those screens connected to an iMac Pro. This is a setup officially supported by Apple, but sometimes when waking up the Mac one of the displays doesn't turn on and the only option to get it back to work properly is disconnecting either the power or the USB cable.
This is reproducible by connecting both screens to a 15" 2016 MacBook Pro.
If the screens at least had an ordinary power switch, so that I didn't have to crawl over my desk once every two weeks, my life would be a lot better...
There is zero justification for the pricing; it needs to start around $399. Apple is still working the privileged-pricing model. Consider that it has a year-old processor and 8GB of RAM - for $800, an i5 and 16GB should be standard. I can forgive the 128GB SSD, but not the CPU or RAM.
Side note: why is a computer only for games, streaming, and mining bitcoin? Is that all the end-user experience is limited to?
There is a justification for the pricing, but you probably won't like it: More than enough people will buy the product at a high margin price to make up for the price-sensitive people who won't buy it.
Fair enough, but it "values" the Mac Mini at $799 with a year-old i3 and 8GB of RAM? It valued the last one at $499, and that one was 5 years old. I guess "what the market will bear" really is true... for Apple.
Keep in mind that the old Mini, priced at $499, only had a dual-core laptop i5 CPU. The new ones all have desktop-grade quad-core (or better) CPUs in them. That doesn't totally justify the price increase, but it is seemingly a large performance increase.
>> I can forgive that 128GB SSD but not the CPU or RAM.
I had the curse of having to use a 2017 MacBook Pro 13” with a 128GB SSD...it’s not forgivable, it’s not usable.
By the time I put Xcode on there, as well as Office, et cetera, and project files, I was left with 20-30GB, and anytime I dipped below... 20? it bombarded me, literally continually, with a "you are low on disk space" message.
You know what is never forgivable? Soldering a hard drive to the motherboard on a $1500 device.
I'm fairly new to this NAS stuff. Is there a reason to have one hard drive per server, or is that just a limitation of this one in particular? I'm sure that RAID over a network of hard drives is a solved problem, but I wouldn't know where to start.
This is a really weird device. The Mac Mini has long been used as the home theater PC of the Apple ecosystem, due to its form factor and being the one Apple device that didn't come with its own 20-inch screen or 3-foot-tall tower. The Apple TV, for all its shortcomings, has sort of supplanted that, and you're much better off sticking one under your TV and running your Plex server somewhere else. So now we have a Mac Mini which is too expensive to be a home theater PC or to be stuck in a closet serving media, and which is comparably priced with Macs that have actual screens. So it's good for someone who for some reason doesn't want to carry a MacBook around but wants a semi-portable workstation? Or someone who wants an iMac with a bigger screen? It's not useful as a Mac Mini, and it's pretty redundant, since every use case is covered by an existing Mac. Who is this device for? Who is buying this over a MacBook or iMac?
I've bought two Minis as dev machines in the past. Portability was part of it; I also saved a bunch of money because I already had a full set of peripherals. Edit: upgradable RAM and the ability to install a second drive were also great in retrospect - I wonder if the latter is still possible with the 2018 model.
And I'm not sure whether glossy Retina screens are really the best bang for the buck for developers. I often wish I had dual 1440p or a ginormous 21:9 screen when I'm dealing with complex projects.
But is this a paid review? He was lent the hardware, which means he had to return it after making his review. While there's certainly some conflict of interest there, I don't know anyone who would consider that a payment.
Most reviews are done on devices _loaned_ by the vendor; Consumer Reports and a couple of others buy their own stuff but they're very much the exception. A paid review is where the vendor pays for the review; that's very different. Basically any professional review of a computer you see will be on a review unit loaned by the vendor.
Not saying the price is not outrageous, but I found it ironic that you spec maniacs can't tell the difference between a SATA SSD and a PCIe SSD. You will be hard pressed to find a retail SSD that comes remotely close to the performance of the ones in these new Macs.
For users that care, a 1TB PCIe SSD currently costs a mere $230 (for the Samsung 970 EVO on Newegg, which has pretty impressive performance). Sure, that's slightly more expensive, but it's nowhere near the premium Apple are charging. On the other hand, users who don't need the extra performance still have to pay through the nose for it - and arguably the difference between SSD and HDD is more important from a user experience perspective than the difference between SATA and PCIe.
If this was a PC then a 1TB SSD would be about $200, but the 2018 Mac Mini has a non-upgradable soldered-on SSD on the motherboard so you have to pay Apple's build-to-order prices if you want one. (The RAM is technically upgradable but requires a security Torx bit to access just to be annoying.)
It's accurate - maybe not relative to the market in general, but it's how the pricing on the BTO works. I tried building a similarly specced Mac Mini just now: upgrading to the 1TB SSD is +$600, whereas 2TB would be +$1,400.
> Apple lent me a high-end configuration for review — 6-core i7, 32 GB RAM, 1 TB SSD
The problem is that the i7 has hyper-threading, which creates a giant security hole in your system. I'd be much more interested in the i5 benchmarks, since the i5 (supposedly) does not have hyper-threading.
The "giant security hole" I assume you're referring to isn't particularly worrisome for your own hardware running trusted software. It's more of a problem in a shared environment where you don't know who else is running code.
I've been an avid Apple user and supporter for many years, and have preached about the stability and the superior (tongue in cheek) hardware. I was waiting for a decent replacement for an ageing Mini, but seriously, Apple, at these prices?? It's insane. I made up my mind after seeing the Apple event and reading/following the news of them hiding their sales numbers from now on.
There probably is, but it will be built on top of Mac Mini or Mac Pro farms, as Apple does not allow virtualization of macOS on any hardware not carrying an Apple logo (I believe that is roughly how it is stated in the terms). So if it is available, it might not be a cheaper option.
I believe Travis CI does have macOS workers available for free, but they might be scarce in available time slots.
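If anyone wants to try that route, switching a Travis build onto a macOS worker is just a couple of lines of .travis.yml - a sketch, where the image name and scheme are placeholders you'd adjust:

    language: objective-c
    os: osx
    osx_image: xcode10        # placeholder: pick whatever image is current
    script:
      - xcodebuild -scheme MyApp build   # "MyApp" is a placeholder scheme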
I've been using MacInCloud at work for the past 2 years or so, renting a single server for use as a VSTS/Azure DevOps build agent for iOS builds. I'm very happy with it - the server we have is fast enough, both CPU and disk, and it's been rock solid.
I've only contacted their support team about billing stuff, but they were responsive and helpful.
It's the fastest SSD on the consumer market by a good margin. My main gripe with it is that Apple doesn't provide any affordable option for large, slow storage. Of course you can always go external, but then it won't be so much mini anymore.
Except... it's not. We know that larger SSDs are faster... but if we compare a 1TB Apple SSD in the iMac Pro to a 500GB 970 EVO (around $120)... the 970 EVO is at least equal, if not faster in performance.
I don't know anything about SSDs (hint) but that SSD looks like it's on sale at Amazon right now for $393. I'm more than happy to pay Apple a premium of $200 or whatever to just get what I want out of the box and not have to think about this, research options and compatibility, worry about drivers or who knows what else, etc.
...which is why Apple is more than happy to charge you said premium. As long as there are enough consumers like you who prefer to pay more there will be suppliers who will charge more.
Those who prefer to pay less, have more freedom and get a higher-performance system which is more tailored to their needs will, for the (small) price of some more up-front thought, choose other suppliers. They will have the added advantage of being able to partially upgrade their system by swapping the SSD while those who choose the Apple option will have to wait for the next iteration of this product.
Plus I don't really see why a crazy-spec SSD is really key here. It's not going to be the bottleneck for most use cases in a Mac Mini that has an integrated graphics card (at $2.5k...) A regular SSD would do.
Having a user serviceable M.2 slot would have made this the perfect Mac for me.
Let's say I have AppleCare for the first 3 years and the SSD dies in the 4th year - what are my options? Replace the entire logic board, with a new CPU & SSD?
I'm impressed, but I am also not sure how accurate the benchmarks he's using are for replicating real-world load.
I actually think it's pretty cost effective if you go for 32GB RAM instead of 64GB, and keep the baseline 256GB SSD storage. It comes out to $1,899. With 64GB RAM and 2TB storage, it comes out to $4,099, which is a huge markup for things that can be self-upgraded on a PC.
Even $2,699 for the 64GB version doesn't seem that bad for what is effectively the best macOS-running computer you can currently get.
I have the Hades Canyon's predecessor. One advantage of this line is a pair of M.2 slots in RAID that will later hold 4TB, 8TB, etc. NVMe sticks, allowing massively more capacity at similar speed to the Mini's storage.
The AMD graphics in the Hades Canyon are vastly better, but still significantly inferior to anything you could connect via eGPU. A lot of casual usage will not require the Hades Canyon's GPU, but you'll always pay a power/thermal price for it. As resolutions increase, or depending on your requirements now, you'll probably be forced to pair it with an eGPU, because it really only excels at 1080p for 3D stuff, AFAIK.
One nice thing about the predecessor was the availability of fanless cases; I don't know if they've emerged yet for the Hades Canyon, but going fanless is really amazing. Prior to that it could be loud, and the Hades Canyon added a second loud fan for the GPU.
I bought an iMac last year and regretted it. The performance was fine, but I ended up disliking the fact that it was an all-in-one. The fact that you can get iMac performance without a display attached is a big feature of the new Mac mini for me.
Yeah. I guess if portability is a plus, that would be better than an iMac. I move once a year so that wouldn’t really be a problem for me, but if you need a Mac for portability, why not just get a MacBook?
I think the Mac Mini is at an awkward price point: you need a monitor, keyboard, and mouse, which cost you extra, and at that point you can get a base MacBook Pro.
I think it would only really make sense for businesses.
I have the 2014 Mini with 4GB. It is depressingly slow, and the horrible part is that it replaced a slightly older Mini that died - one that had 16GB of RAM in it and was my little workhorse. When I bought the 2014 one, I didn't know the RAM was soldered on. I have suffered for three years like this (I have other computers, but this Mini is the main "family" computer for storing photos and documents and gets backed up daily). It's relatively important even though we all have other computers at our house.
What is most sad is that a colleague gave me a 2010 i5 Mini with 8GB of RAM, and that one is actually faster than the 2014 Mini with 4GB of RAM - OSX needs to manage memory better! WTF, Apple? Why sell computers with a RAM spec that doesn't work, and then make them non-upgradeable? It's inexcusable.
My question is, should I get one of these Minis and is the RAM upgradeable? Or should I buy a refurb Mac Pro and get off the Mini ecosystem entirely? Again, I need a little desktop workhorse that isn't an all-in-one kind of dealie. Laptops are great, but I want a desktop type machine with ports for what I do with it.
I had a 2015 MB Pro in 2015/2016, loved that one. My current work machine is the 2017 Pro laptop, it's not quite as good. Dongles suck!
I was able to re-utilize that RAM from the defunct Mini by getting a refurb 2014 MB Pro for my daughter and adding it to that one instead. That one has also been a great little workhorse at our house.
Apple's new strategy seems to be: instead of having products that sell themselves on their clear benefits, just flood either YouTube casuals or Twitter "celebrities" (depending on the target segment) with the highest-end config machine and expect praise.
Please point me to a successful hardware company that has ever let its products sell themselves. I simply don't understand what your gripe is. Providing review units to members of the press is completely standard; every company does it, and Apple always has. The only difference is that it is now bloggers and YouTubers that customers look to for buying recommendations, instead of the Walt Mossbergs of yore.
True. I used to go out of my way to find unbiased and honest reviews from "real" people who weren't "seeded" by manufacturers.
The problem is that many others did the same, so these honest reviewers gained followers; that gave them exposure, and eventually they got approached by the PR departments, which would offer to "lend" them top-end review samples for an undisclosed amount of time. Well before the press NDA, of course.
Lately I find myself buying paper magazines again.
That being said the new Mac Mini does seem like a great product.
Sure, but the authors of larger magazines typically do not get to keep (or even use) the product for their own private purposes, so they're not nearly as invested and have far less reason to be biased.
I've heard many companies "forget" to ask for their seeded samples back, particularly those sent to so-called influencers.
Pretty much every computer manufacturer provides review units, and always has. In fact, if anything, Apple's known for being a little on the stingy side here; for some of their product releases only the biggest publications have gotten early review units, whereas some vendors hand them out far more freely.
Well, you know, we waited years for a new Mac Mini, and we got one that is very disappointing on the low end, very expensive on the high end, with no clear future support and missing features from previous iterations - but hey, it's an amazing machine, because Apple finally graced us with it and I got it for free, so let me overlook that $2,500 price tag.
There is still massive conflict of interest. They want to continue getting these review units, and Apple is known for "being offended" and blacklisting people from media events and review units.
None of the reviews I've seen from "general public" "reviewers", in the year or so Apple has been doing this, have provided real criticism. It's always "fine", always "great"; big issues are glossed over.
At the very least, Apple picks their target "reviewers" very well based on their bias for Apple products. That's fine in and of itself, but it creates a bad image when these people get their review units before even the media reviewers.
If you listened to Marco's podcasts, you'd know that he has been very critical of most of the Macs that have come out in recent years, and he remains critical of most of their laptops. That doesn't mean there is not still a conflict of interest in principle. But in practice, I would say he has a track record that proves he is not afraid to bite the hand that feeds him. With or without conflicts of interest, there's really no substitute for getting to know a reviewer, to find out whether you generally share that person's sense of what is valuable and what is not.
Maybe it really isn't the hand that feeds him - Marco's primary gig is his podcast player for iOS.
I really wish that devs out there, including the HN community, would give PCs and Windows another chance.
The PC specs are amazing, value for your money is excellent, and the OS is beautiful - the biggest obstacle now is that so much open source documentation explains how to install or build on a Linux-based machine, ignoring Windows users and making them feel like sh*t for working on a Windows box.
If more devs offered docs on building/compiling on Windows, and Windows support generally, that would be excellent...
Pity that it's so hard to source W10 Enterprise LTSC (previously LTSB): you can bypass pretty much all Windows Update woes that way, miss out completely on Cortana and Edge, and avoid having "features" no-one asked for rammed down your throat every few months.
Despite Microsoft's FUD ("The Long Term Servicing Channel, which is designed to be used only for specialized devices (which typically don't run Office) such as those that control medical equipment or ATM machines"), it also appears to run Just Fine[tm] on the latest desktop hardware.
I do prefer the macOS ecosystem overall and distinctly dislike the UX mess that is Windows 10. I have been using Windows for decades now and feel very comfortable with it, and I still maintain a desktop tower for gaming and intensive work, but if I had the option to switch to a macOS machine I would. There just isn't Mac hardware out there that fits my needs. But each year I feel Apple is making the Mac ecosystem worse and worse. At some point the software advantage will not be worth it.
That being said, if someone likes Windows 10 now, there really is little reason to remain on Mac hardware. Practically, most creative software exists on Windows.
It's my impression that most Mac (and Linux) users have given Windows a chance. Maybe not the very latest versions, but most Mac users I know (including myself) have switched from Windows at some point in the past. I don't doubt that Windows is less infuriating than it was when I left in 1998, but nothing in its more modern incarnations appeals to me in a way that would even make me consider switching back. "Not so infuriating anymore" is a weak value proposition. There is also an element of "fool me once, shame on you; fool me twice, shame on me".
At the same time, I am perfectly happy with my Macs; I still absolutely love macOS, my work 5K iMac is a delight to use, and my private 2010 MacBook Pro is still going strong (although as of Mojave, it can no longer run the latest OS). My 2008 Mac Mini is nearing end-of-life, but only because I can't justify upgrading its internal storage to an SSD when the computer is stuck on Mountain Lion. I worry about the price hikes, but OTOH, as long as I can expect to get 8-10 years of useful life out of a system, I don't mind paying the Apple Tax.
I think it must strongly depend on your use case and exact hardware, because those exact reasons in my case are an argument against windows. I have to download and install what in order to get driver support? How many tray icons can one person possibly need? It's the worst in my experience with printers and scanners; "please install this 300MB package which will constantly run in the background and annoy you at the least convenient time to replace your ink - oh, and in three months we're going to completely change the interface and replace it with something that doesn't even support feature that you're using". Or, hear me out, I could install CUPS and xsane (or another sane frontend) and be done. Driver support is indeed hit-or-miss, but when it hits there's absolutely no work at all. (Exception: if your printer isn't already supported, you can often find and download a single small file to add support; CUPS is beautiful)
I agree with you that a lot of vendor software is garbage. If the hardware you use is supported by your Linux distro of choice, I'd agree it can be a better experience than installing any driver. But at least there is always a Windows driver; that has not been the case for the desktops and laptops I have had in recent years. There's always something missing across the plethora of Linux distributions, and the ways to solve those issues are not straightforward even for a software engineer with 10 years of experience. That is not something I want to think about when buying new hardware.
Yes - especially compared to the Mac, high DPI on Linux is still garbage to this day. It's not really good on Windows either, but at least with Windows 10 it is somewhat serviceable. On Linux, support is so abysmal it is really comical in 2018. Moreover, it seems the dev community still hasn't "seen the light" on high-DPI displays, and will often dismiss or backlog the changes required to support them. In 2018.
HiDPI worked / works great in Ubuntu GNOME. I spend most of my time in a text editor, terminal, or browser, but even things like games for my kid (Tux and Tux Kart) supported HiDPI without requiring any magic tricks.
Edit: setting a different scale factor on my external (non-HiDPI) monitor was frustrating, in that it worked... sometimes.
That's if you are on supported hardware. If your sound/wifi/disk storage/modern GPU lacks support, you are SOL for a long time and have to resort to experimental drivers that may or may not be stable and may or may not be working as expected. Meanwhile, practically all hardware has Windows drivers.
Tired meme. Windows drivers are the only ones you have to "fiddle" with out of band; I can't even think of another OS where going to a third-party website to download an executable is how you install drivers for the machine. (Not OSX, not BSD, not Linux.)
Buy any desktop in any supermarket in the world and it's going to work out of the box with the latest Ubuntu. This has been the case for 5 years now.
If you have some weird piece of hardware that needs a kernel module that isn't enabled by default or packaged as a DKMS module for your distro, then sure, I can definitely see some awkwardness there - but, for instance, I haven't had so much as a wifi problem in 10 years.
Definitely a stark contrast to Windows where you have to go to the manufacturer website for each component of your machine if you're not using the bloated OS that comes pre-installed.
I think the advantage is largely in a few select sectors right now, and mainly due to ecosystem.
Music production is a big one from my perspective -- currently, if you are looking at the tools and plugins that are most popular in the professional world, your practical choice is basically between Macintosh and Windows.
The majority of DAW plugin synths / effects are currently not compiled for Linux, and I do not believe many popular DAWs (stuff like Ableton Live, Logic, Cubase, etc.) are supported there yet either. I have heard that if you keep it lightweight, some audio plugins will run just fine under something like WINE. But for a heavy-hitting plugin like, say, Spectrasonics Omnisphere, I have heard that running under WINE is too slow to be practical.
There are certainly native Linux DAWs and plugins out there - you can probably go quite far with Ardour or the Linux version of Reaper, and there are a few plugins too (u-He has native Linux builds of their excellent synthesizer plugins, for instance). It's just that the native ecosystem is quite a bit smaller, unfortunately.
That has always been Apple's MO, ever since their "Think Different" marketing campaign, when they realized they could convince the sheep to follow if they associated the brand with the brightest minds and celebrities (e.g., put an Apple logo next to Einstein, next to an astronaut, etc.) without actually offering a better product.
It's always been a fashion statement to own an Apple product...
What bothers me is how the technical community, both software engineers and academics, have fallen into this trap.
A new Win10 PC is a much better development machine. Sorry, but I do prefer to Think Different, and I don't care about what is fashionable - I make my choices based on specs, utility, and value.
Since I am part of this aforementioned technical community, I will spend a few extra dollars in order to get: 1) a high-resolution screen, 2) a large trackpad that works well, 3) a native "Unix-like" shell, 4) sleep that works 100% reliably when I close the lid.
I don't consider it a fashion statement; I just want it to work. After 20+ years I don't like to tinker with my desktop anymore. I would go back to GNU/Linux for the desktop, but finding the right hardware with the above specs is a challenge, if not impossible (I have yet to find a trackpad driver that works as well as Apple's).
I have not reevaluated in 2018. If there is anything better, I would like to hear specific examples of hardware and OS combinations.
I unfortunately took a job at a company that thought they were being fashionable and progressive by forcing each employee to use a Mac. They saw it as a benefit. It was idiotic.
95% of their workforce was using Excel for reporting tasks. But, of course, this was Mac Office 20xx (08, 12?), when the rest of the world was on Office 2016+. Half the features weren't available, or required people to hold down 8 keys simultaneously to work.
Right. I guess, for me at least, it might have existed to an extent before, but since the machines were good value and it was celebs doing the marketing, it didn't bother me much. Now it seems like they are desperately trying to push those things on crowds that I do not want to see corrupted like this, namely the technical community and engineers.
Macs ship with their own POSIX subsystem; WSL lets you install a bunch of popular flavours of Linux on top of Windows. I personally run Arch Linux and have access to a lot more bleeding-edge packages.
The review was good, but I'm honestly surprised that Marco, of all people, would have such a badly dubbed video. Get Casey Liss to help you - his audio in videos isn't as good as your audio, but at least it matches his voice. #AccidentalBadVideoDubbing