I know this is about games, but I wanted to relay my experience using an eGPU (a Razer enclosure and an AMD RX Vega 64) in a professional context.
I had a good performance boost using Photoshop and Maya. But if you're using a portable MacBook Pro that you're connecting every day, I wouldn't recommend it. The connect/disconnect experience is really flaky, and you'll probably find yourself a few times a week trying to figure out why the eGPU won't connect or disconnect.
I switched to a Mac Mini at work (stock GPU build of course) with the same enclosure and graphics card, and the experience has been pretty delightful. If you're doing visual stuff, it's the budget version of a Mac Pro. It seemed to improve overall OSX responsiveness too, but perhaps that's just the new Mac Mini in action.
I still use the old MacBook Pro at home, but it's not my daily machine. Now I don't have to carry a bag to work, and my home laptop can cover things in a pinch. Using a desktop machine at the office for work has improved my work-life balance, and my productivity.
I've been using an eGPU (15" 2018 MBP i9, RX 580, 10.14.x) for over half a year and only ran into connect/disconnect issues once. If you're having disconnect issues I'd urge you to see if killing the QuickLookUIService process causes the eGPU to immediately disconnect. If it does, you probably have a buggy QuickLook plugin with one or more apps. In my case there was a QuickLook plugin from a recently installed trial app that was preventing the eGPU from disconnecting. I uninstalled the app and everything went back to normal. I've googled around and it appears like other people are also having this issue with QuickLookUIService. Hope this helps.
I love my 2018 MBP + eGPU setup to the point that I chuckle whenever I see people claiming how awful these MBPs are. I've finally reached computing nirvana. The i9/32GB RAM in my MBP is powerful enough for my needs at home and on the go. The eGPU makes for a great dock and access to a more powerful GPU at home. It has built-in gigabit ethernet, an SSD, and several USB 3.1 ports.
The comment regarding QuickLookUIService is interesting. Not because I use an eGPU but because QuickLookUIService is always reported as using significant energy on my mid 2014 15” MBP.
This has encouraged me to dig deeper and to try to find the reason for this once more.
I found my problematic QuickLook plugin by running ```qlmanage -m plugins``` and doing a hacky binary search where possible. I identified the 3rd-party plugins, assumed the problem was there, then disabled half of them by temporarily removing the associated apps. It only took a few iterations over a couple of days of casual use to pinpoint the culprit, and the problem has not returned.
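For anyone wanting to do the same bisection without uninstalling apps outright, a rough sketch (the generator name below is just a placeholder; third-party generators can also live inside the owning app's bundle, in which case removing the app is the simpler route):

```
# See every registered QuickLook generator and which bundle provides it
qlmanage -m plugins

# Move half of the standalone third-party generators aside...
mkdir -p ~/Library/QuickLook.disabled
mv ~/Library/QuickLook/SomePlugin.qlgenerator ~/Library/QuickLook.disabled/   # placeholder name

# ...reset QuickLook, then re-test the eGPU disconnect and repeat on the failing half
qlmanage -r
```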
Mantiz Venus: https://mymantiz.com/products/mz-02-venus. It can do 87W power delivery (so you can run a 15" MBP at full tilt while charging up the battery), has a number of USB 3.1 Gen 1 ports on the front/back, GbE ethernet, and a SATA adapter that can house an SSD.
While researching eGPUs I couldn't find one that was spec'd out quite like the Mantiz, particularly with respect to the 87W PD. Do keep in mind that, like all the other eGPUs out there, the USB ports and anything connected to them share 5 Gbps of theoretical bandwidth. In the case of the Venus, that means the SATA SSD, GbE ethernet, and connected USB devices all share that bandwidth. But I have never seen my ethernet throttle while doing reads/writes to the SSD, and I have 1 Gbps up/down fibre optic internet. I just point that out because a lot of people were upset that all these devices hang off the USB bus on the Mantiz. But if they didn't, they'd just eat into the TB3 bandwidth available to the GPU itself.
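If anyone wants to check the sharing for themselves, a crude test is to hammer the enclosure's SSD while watching the GbE interface (the volume name and interface below are guesses; check Finder and `ifconfig` for yours):

```
# Sequential write to the enclosure's SATA SSD (volume name is hypothetical)
dd if=/dev/zero of=/Volumes/MantizSSD/ddtest bs=1m count=4096

# In another terminal: per-second traffic counts on the enclosure's ethernet interface
# (en7 is a guess; find the right one with `ifconfig`)
netstat -w 1 -I en7

# Clean up afterwards
rm /Volumes/MantizSSD/ddtest
```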
I actually run my setup in clamshell mode while connected to the eGPU and just use one big display. So I'm not a great person to provide a testimonial there. But I have periodically run with the MBP display open and enabled. The biggest difference is a subtle but noticeable improvement in user interface latency, which is interesting given the video signal is going out over an additional cable.
My understanding is that for the 15" MBPs, the built-in discrete GPU would always be used to drive an external display, so the following doesn't apply to them. But the 13" MBPs have only the built-in Intel graphics, and you can definitely feel a difference driving even one bigger external display with Intel graphics vs something more substantial like an AMD Polaris (RX 580/RX 590) or Vega card.
Surprised there are still no comments about the Blackmagic eGPU and BM eGPU Pro! And these are not even mentioned in the blog entry.
These are the sleek, semi-official eGPUs for the MacBook line that Apple contributed design input to. Like the LG monitors Apple helped out with, but with less scandal.
I have been using the Blackmagic on my Macbook Air 2018 for several months and I really, really like it.
I have a _single_ USB-C cable from the MBA to the Blackmagic. This powers the MacBook, but also offers a bunch of ports and HDMI out to my P2715Q Dell. The ports let me use HD Audio USB speakers, a webcam, a pro mic, and a few cables to charge stuff with.
There are almost no stability issues with yanking the cable and putting it back in. I do this almost every day--often not having the patience to use the official "disconnect" process on the menu bar icon.
Not only does it improve OS graphics stuff, like moving between Spaces at 4K, but small things like Messages animate more smoothly on this large high-res external monitor.
I only very occasionally play DOTA 2, and you can get some really great graphics performance out of this eGPU that is not possible on the machine alone.
Plus, and very important to me--the Blackmagic eGPU is silent!
It is not a cheap add-on, but if you have a home office and only go portable sometimes, an eGPU--especially the Blackmagic--is really awesome.
It is more expensive, but it just works, looks great and idk I dig it. Glad to have a thread I could pipe up about it in.
I heard about this bottleneck a lot, but I just haven't noticed it--I do not have thermal issues. I run in clamshell in a cheap vertical laptop stand. I rarely hear the fan come on when I'm docked with the standard BM eGPU.
I run scaled 'looks like' 2560 x 1440. I run tons of apps and browser tabs in multiple spaces.
I only notice the CPU is weak when I'm encoding with something that isn't aided by GPU.
One case where I do hear the fan come on is near the end of a 200+ step unit test run in PyCharm on my primary personal web app.
But that is just about the only time I can think of that it comes on, and it shuts back down as soon as my tests complete (hopefully all passing!).
A bit more on my specific setup:
- 2018 MBA (maxed out)
- Dell P2715Q monitor
- Blackmagic eGPU
- Edifier S880DB (USB HD Audio)
- AT2020 USB+ mic
- Logitech C920S webcam
- Vertical Laptop stand
- Magic keyboard / trackpad
This setup is probably the best I've ever had, because the portability is great and the docked situation is just very, very good. All those peripherals off a single connection to the laptop.
I am hoping that the ARM Macs will have much better graphics, but I think it is likely that at least the first two generations of them will still be majorly improved even by the RX 580 in the current Blackmagic.
It is worth mentioning that I do try to use the software disconnect of the eGPU and have had that just stall out or not complete. I have not tried the troubleshooting someone suggested here, though--and it is so stable with the BM that I don't worry about yanking the cable anymore.
I've been using an Akitio Node + WX8200 (Vega 56 Pro, needed to drive 2x 5k monitors) for about 7 months now on my daily driver (2018 i9 32gb 15" macbook pro) and I haven't had any real issues with disconnecting. There are only two annoyances:
1) Sometimes an app that's using the external GPU won't quit, preventing me from safely disconnecting. This is annoying when you have to run to a meeting and don't want to risk the system crashing.
2) If you connect the eGPU when apps are already running, they'll continue to be rendered on the internal GPU. This is probably OK for most people, but the internal GPU can't drive 2x 5k displays, so although it works they're really laggy until they're relaunched.
Other than those two issues, I highly recommend this setup to everyone. I actually have 2 eGPUs; the other has a 1080 Ti that I use for light ML work on Linux and gaming on Windows.
I'm considering purchasing an Akitio Node for my Mac mini 2018. I noticed there is only one Thunderbolt port on the enclosure, which means I cannot connect an external display directly to it. Will the performance loss be significant?
I connect my (non-Thunderbolt) monitor directly to the GPU. If you need to connect to the computer directly you'll take a small hit in some cases, but a lot of GPU work isn't constrained by PCIe bandwidth.
I would say it's higher FPS due to the better GPU. It's hard to notice when you are using it, but if you use an older computer and then one with a better GPU you'll notice the difference immediately.
It's also a matter of cost alone. I'm currently looking to build a GeForce 1660-based light gaming environment. Building a desktop PC the cheapest way possible using that card plus a Ryzen 1200 comes out at only ~$70 more (~$550 USD, 16GB RAM, not including an SSD) than buying the cheapest eGPU enclosure plus the card. For that $70 I get considerably better performance (since my laptop is ultralight) and less hassle.
I don't see how an eGPU is worth it except for very narrow use cases, like doing some light 3D modelling and wanting to use an already available Mac (since buying the appropriate integrated Mac hardware is way too expensive in comparison).
Just build a PC and run Linux on it. A lot more games will be playable that way, thanks to Wine, dxvk and vkd3d which macOS can't run well due to Apple sabotaging Vulkan support.
Perhaps you are thinking of Apple's Metal instead of Vulkan. As far as I know, Apple has been quite staunchly against Vulkan. I too would love a link to any WWDC where Apple has discussed bringing it into macOS, that would be great news!
Yes, but most of the games are on Windows, and eGPUs on Boot Camp are not so plug-and-play. Been there, done that. Lots of wasted time :(
[edit] Especially annoying is "Error 12". As the article mentions:
There is a really good chance, however, that you'll see a yellow exclamation point over the GPU's icon, and instead of "This device is working properly", you'll be greeted by "Error 12" and a message that the device doesn't have sufficient resources (read: either memory addressing or bandwidth) to be used. If you see this, your GPU won't engage at all, and you'll have to troubleshoot it. There are a bunch of ideas in this thread, but what will work for you depends on a combination of factors.
If you do not mind being locked into AMD cards only, I suppose it is a good solution. The eGPU was probably the only way to save face once Apple's fixation on slimness limited their choice of onboard GPUs.
If you look at the situation Apple's design has locked them into--the inability to swap out components, and on many models you cannot even upgrade the memory--it has forced users to rely on external boxes for storage and graphics, which pretty much brings back, in worse form, all those boxes Apple was "designing away".
I like my current Apple computer, but I am discouraged from ever buying another: first for their support of manufacturing in China, and second because every time I hit some limit imposed by their fixation, the sycophants chant "just use external" and I am back to square one.
Gaming on Mac is okay if you enjoy manually editing files with magic strings from the internet to trick the OS into using an appropriate resolution/refresh rate, and ultimately ending up with noticeable input lag.
I went from gaming on a $6k Mac Pro to gaming on a $1k Alienware and realized I'd been playing in the kids' pool the whole time.
I would refrain from saying "gaming on MacBook" because it creates confusion about whether one is using Windows or macOS.
When playing on Windows (Boot Camp) the MacBook is just regular Intel-based hardware; the experience is no different from using any other laptop out there (aside from the Windows install process).
Playing on macOS, that's another story. Mouse acceleration cannot be disabled, switching to and from fullscreen games is sometimes buggy, and not many popular names are available natively on macOS.
Tangentially, I know of killmouseaccel[1] which does exactly what it says: disables mouse acceleration on macOS. However, it needs to be executed at every restart, or daemonized.
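One way to daemonize it is a per-user LaunchAgent; a minimal sketch, assuming the binary lives at /usr/local/bin/killmouseaccel (adjust the path and any arguments for your build):

```
# Write a LaunchAgent that runs killmouseaccel at every login
# (the binary path below is an assumption; point it at wherever yours is installed)
cat > ~/Library/LaunchAgents/local.killmouseaccel.plist <<'EOF'
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
  <key>Label</key><string>local.killmouseaccel</string>
  <key>ProgramArguments</key>
  <array><string>/usr/local/bin/killmouseaccel</string></array>
  <key>RunAtLoad</key><true/>
</dict>
</plist>
EOF

# Load it now; it will also load automatically at the next login
launchctl load ~/Library/LaunchAgents/local.killmouseaccel.plist
```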
SteerMouse, among other things, has a nice total bypass for it. Here, I am just using the raw DPI and linear movement of my mouse: https://i.imgur.com/xEEjjVD.png
As I understand it, Windows too has mouse-acceleration that remains (rather subtly) even after disabling "Enhance pointer precision" in Control Panel. When I was into competitive gaming, there was a Registry fix doing the rounds that got rid of it completely.
The sheer frustration and potential undiscovered bugs of getting gaming to work with a GPU on a Mac caused me to give up trying to game on a desktop.
Which is actually perfectly fine unless you need 4k60fps or 1440p144fps, which I do not. I originally started gaming on the PC a decade ago when many games and sales were PC only: the gaming market is at a place now where the game selection and prices between consoles and PCs are similar. There isn't anything on a PC/Mac I'm missing that I can't get on my PS4 Pro (or even a mobile device!).
I prefer PC because i have real ownership of the games i buy (i mostly get my games from DRM-free stores like GOG) and no arbitrary backwards compatibility breakage (i have many games on CD from the 90s and 2000s that work on my brand new PC... though some need several workarounds, but still much preferable to not being able to play the game at all - nowadays i rip my CDs and DVDs to my external HDD alongside community made patches/fixes though so i do not have to deal with hardware issues and scratched disks).
Also the sort of games i prefer to play (FPSs, CRPGs and to a lesser extent simulation games) play much better on a PC than a console or mobile.
Note that with PC i do not mean Windows, although i do use Windows. But with Proton evolving i think that in a few years i should be able to play any game i want on Linux or even Mac (assuming Apple hasn't gone full retard in the meanwhile).
I was a PC-gamer for 25 years, and recently switched to a PS4. I've also been a pretty avid Battlefield player. One of the pleasant surprises of the platform switch is the absence of cheaters. There always seemed to be at least one. My strategy was to bide my time, and switch to the cheater's side when I could. At least that way, he wasn't griefing me. Now I don't have to worry about it.
I'm not going to say that getting used to the controller from a keyboard and mouse was quick and easy, but it was quicker and easier than I thought it would be, at first. My rankings are almost what they were on PC now, so it doesn't bother me any more.
I don't know if they're a joke. There are quite a few players out there who seem to like it just fine. But I, on the other hand, am absolutely a joke trying to play FPS games with a controller. I need a keyboard and mouse or I can barely play at all. There are ways to hook up a keyboard and mouse to some consoles and some games, but it's generally considered "cheating".
Doesn't matter anyway. My reflexes aren't what they were when I was a teenager.
I feel like auto-aim has always been kind of a sad hack, in that it's done on consoles but not on PC, so it effectively isolates console multiplayer from PC multiplayer.
I feel like a more "proper" solution would be to come up with shooty-shooty bang-bang game mechanics where "ability to aim with finesse at the things you're shooting at" is mechanically irrelevant, so that "aiming at things" isn't really a part of the console or the PC experience. I.e., design the game so that, despite complete ignorance of your input method, there is still absolutely no advantage in using a mouse compared to a joystick.
Any ideas on how such a thing could be accomplished?
• Maybe a game could snap all the people to a 3D voxel grid. You'd move smoothly, but your "shadow" would always exist within a certain voxel; and you'd always see everyone else over the network as popping between voxels. To shoot someone, you just have to target the right voxel.
• Maybe a game could give people, instead of hitscan line guns, extruded hitscan beam weapons of some diameter. Everyone is blowing 1m-wide holes in one-another [and the terrain] with giant laser cannons. I'd play that!
> I feel like a more "proper" solution would be to come up with shooty-shooty bang-bang game mechanics where "ability to aim with finesse at the things you're shooting at" is mechanically irrelevant, so that "aiming at things" isn't really a part of the console or the PC experience
The thing is, even if you remove the finesse, mouse may well STILL have an advantage. I can whip a mouse around to do a 180 far quicker than I can on a controller.
Splatoon's motion controls are so well made (to the point that motion controls in any other shooter feel terrible by comparison) that they really do feel like mouse-level precision.
On the other hand, Splatoon is a third-person shooter, so it may not translate too well to a first-person shooter, although you do get a first-person perspective when using a scoped Charger (the game's term for a sniper weapon). Not only that, but I remember the addition of motion controls to the Switch port of Doom being considered such a huge upgrade it might as well have been a new game.
I'm with you on auto-aim being a sad hack. Having been a rabid Splatoon player for the past four years, I tried Overwatch on PS4 and just gave up after a while — I don't want my performance tied to arthritic finger bending to try and convince the game to auto-aim for me, I just want to point and shoot as I've been able to do very accurately with the Wii U GamePad and any Switch controller, especially when the PS4 controller supports motion control.
It's generally an option you can set. When playing on higher difficulty levels, auto-aim gets turned off. It's actually amazing how well some people can aim with the controller and no auto-aim. I am not one of those people.
The price of consoles is subsidized by the fact that the games are super expensive. On PC the prices of games are usually lower at full price and massively lower on sale, which is a quite frequent event. When I look at console games I almost never see any significant sales. It's not uncommon to see AAA games at 90% off when they are a few years old. On console they just stop selling the game.
> On PC the prices of games are usually lower at full price and massively lower on sale, which is a quite frequent event
That's no longer true; sale prices for the same game on console and PC are typically the same. Additionally, retail sale prices are often synced with digital prices.
> When I look at console games I almost never see any significant sales.
On the digital storefronts, all 3 major consoles have weekly sales, and often deep discounts for special events.
At least on Nintendo's platform this has not been my experience. The same game is often cheaper on PC than on the Switch at full price, and for first-party titles I don't think I have ever seen a sale on a Nintendo game.
- Factorio
- Minecraft, w/so many hundred different mods
- Kerbal Space Program
- Oxygen Not Included
- Mods in general, actually.
- Elite: Dangerous. Or things which require joysticks in general.
MoltenVK helps with some of that, but I think it goes beyond supporting OpenGL. OSX just has so many things that cripple the gaming experience. Here at Atlassian we're pretty much a Mac-only shop, and needless to say a lot of us play games as well. The number of bugs we encounter with OSX-compatible games is stunning. The same game on the same laptop model on the same version of the OS will have two entirely different issues. We've gone so far as to dedicate the first hour of every LAN session we do to fixing OSX bugs.

The consistent ones we experience, though, are input lag issues (i.e. to click and drag to select units, you have to click, wait half a second, then drag, otherwise the drag action will only select the last half of what you were trying to select), the inability to minimize full-screen apps, meaning you have to close the game to change any setting or join a different Discord channel (or use non-exclusive fullscreen, leading to a large drop in FPS), and a small subset of machines (about 1 in 10) that simply cannot go over 20 FPS regardless of resolution/rendering settings/game selection.

It's surprising given how good the mobile gaming experience is on iOS devices. It just isn't a priority on OSX whatsoever, and the UX of it shows.
OpenGL is in maintenance state on all platforms basically. The only alternative they had was Vulkan. And Vulkan is not perfect either. Personally I prefer Metal to Vulkan. It's very easy to transition from OpenGL to Metal; it feels like an improved API without the global state and without packing everything into structs. Meanwhile Vulkan opens the whole low-level can of worms on you and you suddenly have to worry about memory barriers, image transitions, semaphores/synchronization, even in the simplest of applications.
> OpenGL is in maintenance state on all platforms basically.
The OpenGL 4.6 spec was released four months ago with new features, and extensions come out all the time.
> Vulkan opens the whole low-level can of worms on you
Vulkan's API is unnecessarily ugly; it gives the impression of a low-level API with all the structs and such, but in reality the driver just picks up data from those structs to move around. In real low-level APIs you use structs because they are memory-mapped and/or aligned pretty much as the hardware expects, but Vulkan is a hardware-agnostic API.
And yeah, Metal is probably a better API but it is also only available on Mac which makes it a no-go for pretty much everyone outside the Apple bubble.
Personally i'll stick with OpenGL for the time being. I do not need much of the fancy stuff newer OpenGL versions have anyway (i don't do AAA-game level rendering) and i do not see OpenGL being dropped any time soon from any sane platform.
When it comes to Apple's stuff i'll decide between writing a small OpenGL wrapper myself for the functionality i use, or just dropping the platform altogether when the time comes and they drop support for the API. Considering they still support Carbon (even if only for 32-bit programs), it might be a while until they drop OpenGL though.
It's such a weird narrative. OpenGL has been outdated for a while now; the Khronos Group-sponsored alternative to things like Metal is Vulkan, which wasn't even announced until after Metal was already shipping, much less finalized.
Mantle came first, and nobody was ever going to adopt Mantle because it was an AMD product and nobody wanted to adopt a single GPU vendor's API wholesale like that. The two biggest platform vendors, Microsoft and Apple, managed to get Mantle-inspired APIs out years before Khronos got Vulkan out the door. What were Microsoft and Apple supposed to do? Wait? Force everyone to switch over as their APIs were gaining adoption?
Not supporting Vulkan on Windows (only the legacy ICD allows for it, outside the Windows 10 sandbox), Xbox, PlayStation (all models), or Nintendo (all models other than the Switch, where it is the 2nd API, the 1st being NVN) doesn't seem to affect game quality.
Mantle was inspired by what game consoles have been doing for years, so it isn't like Apple and Microsoft needed to wait for it anyway.
Lack of tooling has been a common issue with Khronos and OpenGL ARB before them.
Pretty good at releasing paper standards, not so much at creating SDKs with all the required infrastructure that other APIs offer out of the box.
To be clear, I think you and I are mostly in agreement. The OpenGL/Vulkan advocates are consistently pushing things that really only benefit people who have decided to adopt OpenGL/Vulkan without a lot of real-world support for their positions.
(Heck, I would go so far as to say that OpenGL as a vendor-agnostic API for consumer graphics cards might not have been possible if Microsoft hadn't forced some level of standardization across graphics card vendors in order to get Windows logo certification, and if Microsoft hadn't pushed Direct3D, we could all be stuck using Glide for some cards and a different API for other cards.)
I'm starting to get "this will stop working soon" messages on a lot of my Mac games, I assume because of the 32-bit deprecation. Expecting to lose 50-75% of my library soonish, maybe more. And that's out of the ~60% of my total library that worked on Mac to begin with.
MacOS was surprisingly good as a gaming OS as far as game selection goes, but it's about to go back to the "we have... Breakout! and, uh, Super Breakout!" olden days, it seems.
The middleware that matters has supported Metal for about two years now.
Removing OpenGL isn't far away, as one of the WWDC 2019 talks was exactly about how to migrate from OpenGL to Metal, while all Apple frameworks now use Metal by default, although some of them still offer the old OpenGL support as a fallback.
And the Mac market is smaller than other platforms for the users of such middleware - and due to physical and power differences it usually isn't running the same game but a modified one. A lot of porting to Mac depended on OpenGL being an API that you could start debugging on other platforms as well.
This mostly benefits new games. If you want to update to (for example) Metal-compatible Unity, that probably means updating to the latest release, and bumping forward from 2017 Unity to 2019 Unity often breaks games. It's a common problem for indie developers who have to upgrade every few years since Apple loves breaking old games.
While there are definitely games that will be lost when 32-bit support is deprecated, I think it's harsh to say that we are going back to the olden days.
If you look at the top sellers on Steam that have Mac support you'll see a very large library of great games that I don't think anyone would have guessed would be supported on Mac 5-10 years ago.
How many of them are 64-bit ready? How many that aren't are likely to ever get recompiled for 64-bit? I'd bet that pretty much nothing but games currently getting regular updates are going to make the jump.
Especially in certain genres. For example, in "builder"/4X games, Civ 6, Tropico 6, Hearts of Iron IV, Cities: Skylines, Don't Starve, and Parkitect all have native Mac versions, and that's a substantial chunk out of the Steam top selling charts for the Simulation tag.
I just upgraded from a 2014 MBPr to a 2019 MBP. I didn't pay for the GPU before, and playing Civ V hurt. Civ VI was impossible. This time, I got the Vega 20, and I am not disappointed. Both games run great.
Probably because it's the wrong tool for the job. If you want to play games, a Mac is easily one of the worst ways to do that.
The eGPU method of bolting functionality onto a platform which doesn't want it is only neat on a technical level at best. It is essentially the world's biggest dongle.
> The eGPU method of bolting functionality onto a platform which doesn't want it is only neat on a technical level at best.
It's neat on every level. To quote the article:
> "…you are now able to take a top-of-the-line gaming GPU, seat it inside an external box, plug that box into your computer, and—using a single high-bandwidth cable—push the necessary instructions to render 4K games at 60 frames per second on the card before (over the very same cable!) pushing those frames back to your notebook's built-in monitor without introducing any perceptible latency."
I know God intended for us to use PCI cards by opening up cases and inserting them into slots on a motherboard, but it's incredibly cool to be able to just plug one in like a USB device.
But "just plugging in one USB device" wasn't even what happened in this post. The amount of steps described inside literally were 10x what it takes to just build a PC, boot it, install games and play ️
For me, it's a dongle that does exactly what I want: max graphics power at home for games, less weight and better battery life when out and about for writing and light browsing.
The only reason I don't have an eGPU yet is that most of the market still seems to be making weird compromise-driven design decisions, like sharing a single Thunderbolt controller between the card and any added USB ports.
Not if you're bored to tears by AAA games. Plenty of nice indies available cross-platform. And for benchmarking etc I see Aspyr has ported a Battlefield of Honor of Duty title...
Won’t disagree with the “wrong tool” characterization, but there are many situations where using the right tool (a dedicated windows pc or game console) comes with many other problematic side effects.
I do all of my gaming on a Mac with an eGPU and find it convenient and practical for my needs. It is much better (for my use case and constraints) than my previous solutions of hackintoshes and dedicated Windows PCs.
A dedicated Windows PC is a pain. I have separate Windows installs, but rebooting to Windows to play a game is also a huge disruption since all my other stuff is on Mac OS.
Thus, I've ignored games that don't run on Mac OS for a while, in spite of buying one or three on sale.
I actually feel bad about never finishing Witcher 3... ;)
I'm playing through Witcher 3 for the second time... on a PS4 Pro. For $400, it's an AWESOME and HASSLE-FREE gaming experience. (Based on my experience, and what I've heard announced, the PS5 will be a day-1 purchase for me.)
Sure, it won't look QUITE as nice as running on the $8,500 rig my son now uses for Fortnite and Minecraft, but it still looks GOOD. AND you don't get surprised by Windows 10 random breakages and forced updates. AND you don't have to put up with needing to register accounts to download your mouse, keyboard, and video drivers.
Whatever bad things can be said about gaming on a Mac, I'm DONE with gaming on Windows. No mods? Whatever. No problems.
The original purchaser paid that. Lightning hit it. I replaced the mobo and 3 of the 4 hard drives. I've had it for 10 years, upgraded to an SSD, and upgraded the video card 3 times. I played on it for years. It's now my kid's box. My point is that this is all Windows is useful for, for me, now.
Oof - I left it unfinished for more than a year. Finally got around to picking it up a little, maybe a month ago, and got completely hooked until I finished it + the DLCs. It was a true gaming delight - the DLCs are amazing as well, and, as a whole, it's easily the best game of all time for me - and I believe I've played almost every significant RPG ever... I found it so much more worth my time than movies or playing anything else.
I had an eGPU box with a GTX 1060 for months, but was using it for Windows 10 and a ZBook 15 notebook. Unfortunately there is no Mac support for NVIDIA, though some users get it working themselves - I never tried it as it looked a bit too cumbersome for my taste. On Windows 10 the setup worked great. I was able to play games like PUBG or Quake Champions with decent (Medium) settings at Full HD and a reasonable framerate (I believe usually 70-90 FPS).
But in the end I decided that I wanted to switch away from Apple, and also sold my laptop to buy a desktop PC with more horsepower. My wife still has a rather new MacBook Pro, so if I want to fire up some Final Cut Pro or GarageBand I can still do it.
I'm not sure the whole eGPU setup was worth it for me. It surely allowed me to play those titles which were not playable without the eGPU, but I believe it would have been wiser to spend the money directly on a new PC.
But of course, with switching away from the Mac my case is a bit special.
A very nice how-to. For some good real-world numbers I always look to https://barefeats.com/ which has been testing eGPU setups on different Macs for quite some time.
I've yet to see an eGPU review measure latency. I've always been skeptical of their ability to render frames out without adding meaningful latency to the system.
I know it's theoretically possible for it to work just fine. But it's a long pipeline with lots of opportunity for inefficiency and connection bugs.
I love my 144Hz gaming monitor on my desktop. I’d be thrilled to connect my laptop to an eGPU if it can produce an identical gaming experience.
But my spidey sense says eGPUs are a convoluted, buggy mess. And that's unlikely to change in the foreseeable future due to the lack of users.
Can anyone chime in on WHY Apple doesn't care about the gaming segment? Is it just elitism from the design/dev team, or do they truly think no one wants to game on a Mac? It's strange behavior for a company.
I wonder how much of it is history. Steve Jobs initially rejected the idea of gaming on the iPhone. They opened their arms only when mobile game revenue became impossible to ignore.
"The truth is Steve Jobs doesn't care about games. [....] He's not a gamer. [....] It's difficult to ask somebody to get behind something they don't really believe in. I mean obviously he believes in the music and the iTunes and that whole side of things, and the media side of things, and he gets it and he pushes it and they do wonderful things with that, but he's not a gamer. That's just the bottom line about it." ~John Carmack (2008)
I've been using an eGPU for about 6 months now with my 2018 13in Macbook Pro and I love it. It drives my 4k monitor so much better for work tasks, and gaming is fun.
The only issue is the eject never seems to actually work, so I just unplug the eGPU every time rather than safely eject. It's a bit annoying to have every app reboot when unplugging. I'm hoping the next version handles it a bit better.
One change I've made from the article is installing Windows on an external M.2 drive over USB-C (I don't believe it's Thunderbolt, just standard USB) because my MacBook was running out of space between Windows and macOS... looking at you, Ark, and your ridiculous 100GB install with another 100GB of scratch space required to update. Everything seems to work at full speed!
This NVidia stuff is getting ridiculous. Apple basically killed off any deep learning enthusiasts, and nobody can do any serious compute-intensive stuff on their platform. A polished toy.
The extra-painful part about this whole situation is that it's apparently only due to Apple blocking NVIDIA from signing and publishing their drivers. (for whatever bizarre reason they may have to do that)
I know it stinks for the people who are accustomed to Nvidia's workflow, but it is quite hard to feel bad for Nvidia. A bigger fish is doing to them what they do to smaller players.
I think there’s more to the story than Nvidia is letting on. It wouldn’t surprise me a single bit if Apple is blocking Nvidia due to something like QA issues — Nvidia’s web drivers are notoriously buggy under macOS and Nvidia seems completely unrepentant about it. Pair this with Nvidia’s unwillingness to share source and collaborate with other companies on development and you’ve got problems.
I currently run an EVGA 980Ti Classified in my hackintosh. It’s great hardware, but with the frequency of glitches that the drivers bring I’m increasingly inclined to sell it and replace it with an AMD card.
Hopefully at some point deep learning and other applications for GPU technology will become vendor-agnostic, as with CPUs. While I can run Linux on either an Intel or an AMD CPU (and many others), it seems awfully silly that I can't run a CUDA application on an AMD GPU.
A bigger problem than the performance is the very poor software support. None of the top frameworks officially support AMD or OpenCL. Only broken patchsets are available for both TensorFlow and PyTorch.
Performance is an issue, but the bigger issue is that your model just won't work.
NVIDIA is really the only game in town right now. Google's TPUs are somewhat of an exception but they only sort of count, because you can't just go buy a TPU for training. Makes TPUs a non-starter for a lot of businesses.
What's interesting is the slow pace of change. 3 years ago we were in the exact same place. I don't see it changing overnight.
Apple abandoned OpenCL years ago. It's now officially deprecated on macOS. The most recent version they implemented is OpenCL 1.2 (released 2011). For reference, the current version is OpenCL 2.2 (released 2017).
It's both poor software support and poor performance, e.g. Vega 20 has 15 TFLOPS in theory, but in practice one gets around 7.5 TFLOPS. And you can only use models supported by ROCm; custom CUDA kernels used by many models nowadays might not work after cross-compilation to AMD instructions.
You are implying that there is a significant user base that does deep learning on a single node using MacOS. In my experience supporting data scientists for 10 years this is almost never the case.
Not pros, but when you are a macOS user and want to learn deep learning or run some smaller state-of-the-art models, you are simply out of luck. Or if you are a data scientist on a plane/traveling with no Internet connectivity needing to quickly change something in the model and retrain. With Linux/Windows you have no issue if you have proper NVidia hardware (1070 or better).
OpenCL is what the rest of the world uses and will continue to use. All this means is that many programs will drop Mac support, like they have already done, and some will be created as Mac-exclusive, because no one wants to support both.
OpenCL is only used by those that would rather use C, instead of C++, Fortran, Julia, .NET, Java,... to target their GPUs, thanks to PTX.
Yes, OpenCL 2.x woke up to the fact that most researchers would rather use something other than C, but too late, and the driver support and debugging tools are still found lacking versus what NVidia and their partners offer.
Even on mobiles, OpenCL doesn't have a say.
After the initial support, iOS is all about Metal Compute, and Android has Renderscript.
That was what I remember being the case years ago. I used to track the progress of the Blender project and they were making some good progress on getting everything running on OpenCL because it works on all cards.
I've heard in general OpenCL is a lot more generic because it's designed for diverse compute platforms like FPGAs, but could you list any specifics? I find this very interesting.
The writer is pretending that eGPUs are some sort of hidden secret that no one wants you to know about. I don't understand why, since this is not the case - many companies are actively marketing eGPU solutions.
It also falsely presents the eGPU as a silver bullet; in particular, it claims that there is no latency even when using the laptop's built-in monitor (as opposed to an external monitor plugged into the GPU), which is absolutely not true (an external monitor will perform better).
Many reliable sources have published independent benchmarks comparing the 3 solutions (internal GPU, external GPU over the internal monitor, external GPU over an external monitor); you should look at the benchmarks if you want to make up your own mind about this topic.
“Why an external monitor? While both macOS and Windows can "loopback" frames rendered by the eGPU over the cable and onto the computer's built-in monitor, operating system support for this is very recent and it introduces a nonzero bandwidth cost which can result in lower framerates as well as frames being dropped.”
Fair point, there is another section of the article that seems to suggest otherwise, but this section that you quoted addresses it so I take my comment back.
Not brought up here, but the Apple Thunderbolt 2 -> Thunderbolt 3 adapter is one of the only ones that actually works in reverse, too. With some hacks from eGPU.io, I ran an Nvidia GTX 1060 on a Mac Mini for a while over TB2 for hashcat work.
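For anyone curious to try the same thing, a rough sketch of the checks involved (the hashcat device index is a guess; pick whatever the device listing reports for the card):

```
# Confirm macOS actually sees the card in the enclosure
system_profiler SPDisplaysDataType

# List hashcat's compute devices, then benchmark the one matching the eGPU
hashcat -I
hashcat -b -d 2   # device index is a guess; use the one reported by -I
```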
I gamed a bit on my previous MBP, a 15" from 2012. The CPU and GPU got so hot, I suspect it reduced the lifespan of the laptop battery. Does anyone else have concerns about this for gaming on laptops? Would offloading to the eGPU mitigate this issue?
This was a concern for me trying to play ~2005-era games on macOS on a 12" MacBook. One of the reasons I offloaded to Steam streaming and stream from my Windows HTPC now.
I tried this with a Mac eGPU Dev Kit and a 2014 MBPr... right before they removed the ability to run it with a TB2 cable. Jerks. Anyway, the amount of hassle with trying this is high, and I say that as someone who ran Linux on all my desktops and servers from '94-'14. Installing Windows on Bootcamp to do this? That's a hard pass from me, dawg. I'm done with Windows. Even for gaming. It's all Mac-native or Playstation for me now.
What's the easiest way to turn your desktop into a remote machine for SSH (e.g., to run ML models)? The dynamic IP is the only weird part. Which service or solution do people use?
Not sure if I understand the issue completely, but if you have a problem with dynamic IP addresses on the LAN and don't want to fix the addresses, you can use .local domains (this works in the Apple and Linux worlds; on Windows you need LLMNR), i.e. connect to yourhostname.local instead of 192.168.x.somerandomnumber.
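A minimal sketch of that, assuming the desktop is a Mac (the hostname and username below are placeholders; on Linux, avahi-daemon advertises the same kind of .local name):

```
# On the desktop: check its Bonjour name and enable Remote Login (sshd)
scutil --get LocalHostName
sudo systemsetup -setremotelogin on

# From the laptop: connect by name instead of the changing IP
ssh yourname@yourhostname.local
```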
Apple doesn't seem to have its ear to the ground. Ever since Windows 10, the software and design advantage Apple had over PCs has arguably narrowed to irrelevance.
Windows today boasts good design, higher security than previous versions, a Microsoft-supported full-blown embedded Linux (WSL), and a much larger selection of apps/tools.
It all comes down to specs and build quality in 2019. If one is paying top-of-the-line prices for Apple, they had better ship decently specced hardware, including the GPU.