Having set this up, this is going to be absolutely huge for ML. A lot of nonsense is getting cut down by this:
1. You don't have to install special CUDA-specific drivers that are behind the normal gaming drivers anymore. You'll soon be able to just use the Regular nVidia Drivers (and even now, all you need to do is install a Beta driver version). That's huge for someone just starting out, they don't have to have "the dedicated CUDA machine / OS"
2. At least in Arch, the Linux side of this is literally as simple as `yay -Sy python-tensorflow-cuda`, that's it. Can't get any easier. (A quick sanity check is sketched right after this list.)
3. VS Code's WSL integration means you can use a real editor, with real linting / auto-complete, but without the technical issues of Desktop Linux (aka the Wayland Nightmare and DPI issues everywhere)
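For anyone who wants to confirm the whole chain actually works, here is a minimal sanity check, assuming a CUDA-enabled TensorFlow build (like the package from point 2) is installed:

```python
# Minimal sanity check that TensorFlow can see the GPU through WSL2.
# Assumes a CUDA-enabled TensorFlow build (e.g. python-tensorflow-cuda) is installed.
import tensorflow as tf

gpus = tf.config.list_physical_devices("GPU")
print("Visible GPUs:", gpus)

if gpus:
    # Run a tiny op on the GPU to confirm the driver/runtime path actually works.
    with tf.device("/GPU:0"):
        x = tf.random.normal((1024, 1024))
        y = tf.matmul(x, x)
    print("Matmul ran on GPU, result shape:", y.shape)
else:
    print("No GPU visible - check the WSL2 CUDA driver install.")
```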
> (aka the Wayland Nightmare and DPI issues everywhere)
The picture is more mixed than you present.
I use Fedora at 2560x1440@14" and two 3840x2160@27", and the DPI thing isn't an issue; set your font scaling to 1.5 and that's about all you need to do.
I dual boot windows and fedora (for gaming in the former and dev in the latter) and Windows 10 is by far more unpleasant to deal with, updates that tie your machine up for significant time, drivers are still "track down random binary from random vendor site and hope it works" vs "sudo dnf update" is (for me) no comparison at all.
Into the territory of personal preference, Cinnamon is a better desktop than Windows - it's more responsive, more cohesive, closer to the classic WIMP interfaces I prefer.
As for wayland, I don't use it yet, I will when it's been stable for a year or two.
In my experience high DPI actually works somewhat decently, and the fractional stuff can be fixed with font scaling.
At least if you only use one monitor. I personally have two that I would use with 1.25x and no scaling, which is just not possible with the X11 model.
Additionally having two monitors completely breaks vsync with very ugly tearing lines.
And Wayland on Nvidia has been very unsuccessful for me. Even when I got it to run the performance in e.g. firefox was absolutely terrible. Sadly Wayland is necessary if you want hardware video decoding.
Xming on Windows isn't exactly a walk in the park either -- I'd still rather use the native, proprietary build of VSCode on a proprietary OS with the SSH extension if I'm on a high-DPI screen.
Many many many apps on Linux still have massive issues with DPI, especially with mixed-DPI environments (which are no longer an edge case, they're the Common Case with a laptop attached to a monitor).
Even accessing machines remotely via Xrdp has huge issues, because once you create the session with a certain DPI, logging into the session from a different-DPI machine means you're stuck reading either extremely tiny or extremely huge text.
They are totally an edge case.
If you care about hidpi why do you still have a bad external screen?
If you don't care about hidpi simply set a lower resolution on the one hidpi screen.
It's really not, only Linux makes it that way. If it's done right (as it is on Mac, and to a _very_ slightly lesser extent, Windows) there isn't 'blurriness' at all. One monitor will just happen to be sharper than the other.
I work every day with a 4K monitor and a 1080p monitor side-by-side and it works well. Linux couldn't handle it in any configuration, especially with mixed Intel integrated+nvidia graphics, but Windows is a champ.
Wayland+Intel is working quite well in my mixed 4K+1080p setup. Qt apps handle it like a champion. Gtk seems okay too.
Obviously it's not so great with Xwayland, definite blurriness there. But to compare, the only workable X11 setup was one that ignored my 4K screen entirely and the whole thing was blurry.
? There's an enormous effort to fix blurry Chromium on Wayland (and by extension Electron and VS Code), codenamed Ozone, and VS Code makes me painfully aware of it every day.
That’s an effort to have Chromium work natively on Wayland. It works just fine on X.org and even under Wayland with XWayland if you use something like ChromeOS’s Sommelier or just set window.zoomLevel/--force-dpi-something=2 (and not have XWayland windows scaled).
I mean, it's just an X server for Win32 - I'm assuming it's a fork of Xming, just polished and neatly packaged for usability. It's fast enough to use interactive apps like text editors.
Just for fun, I tried loading a Word document in LibreOffice - I can see the redraw lagging behind scrolling, but it's still usable enough even when typing.
Installing CUDA specific drivers was always optional, in case you wanted to be sure of being on a tested configuration. The runtime parts of CUDA have always been included with the gaming drivers. The CUDA install was mainly for the development SDK, nvcc, etc.
> (aka the Wayland Nightmare and DPI issues everywhere)
It’s more of a pick-one. Wayland has pretty good per monitor scaling; your X11 apps will be blurry but that’s about all I’ve experienced.
The Windows DPI nightmare long surpassed it anyways. On a typical Wayland desktop the usual biggest problem is the DPI being set wrong for a monitor. Even on X11 the worst you get is applications that don’t scale correctly.
On Windows mixed DPI is damn near unusable. I don’t mean “sometime in the past.” I mean, now. I have a laptop with a HiDPI screen, that I sometimes connect to two monitors that are normal DPI. The process of connecting them causes Windows to throw a fit, jumping between multiple DPI levels before actually settling on the correct DPI. This often screws up apps that are slow to respond to DPI changes, like Explorer, which sometimes manages to resize my locked taskbar during this process, and it frequently leaves applications in a totally broken state. Here’s an explorer window that didn’t make the transition. https://m.imgur.com/VgrStHQ
Some builtin Windows apps don’t handle DPI switches correctly even then. You have to entirely sign out for Explorer to enforce minimum column widths correctly on a new DPI, and it just won’t on mixed DPI; it appears to adjust them on startup and never again. I filed a feedback item about this one.
As for third party apps, it's a crapshoot. Qt supports scaling, but on Windows mixed DPI setups it's broken. IDA Pro launches with enormous window borders and some controls still mis-scaled when loaded after a DPI change. You can of course use compatibility mode to force System scaling, and that makes it work at the cost of blur.
Why are third party apps such a crapshoot? Partly, history. Windows has undergone no less than 4 different iterations of DPI scaling (DPI unaware, System DPI aware, Per-Monitor DPI aware, and Per-Monitor v2). No, really.
There’s plenty of good information on this issue, but as someone who has tried to adhere to per monitor v2, I can say it is not terribly easy unless you limit yourself to dialogs. Builtin resources like fonts do not appear to update, so you need to create your own to support scaling a window with Win32 controls.
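For what it's worth, merely opting a process into Per-Monitor v2 is trivial; it's everything after that (handling WM_DPICHANGED, recreating fonts and other resources) that is the hard part described above. A rough ctypes sketch of just the opt-in, assuming Windows 10 1703+ where SetProcessDpiAwarenessContext exists:

```python
# Rough sketch: opt a process into Per-Monitor v2 DPI awareness via ctypes.
# Assumes Windows 10 1703+ where SetProcessDpiAwarenessContext is in user32.
import ctypes

# Documented pseudo-handle for DPI_AWARENESS_CONTEXT_PER_MONITOR_AWARE_V2.
PER_MONITOR_AWARE_V2 = ctypes.c_void_p(-4)

user32 = ctypes.windll.user32
ok = user32.SetProcessDpiAwarenessContext(PER_MONITOR_AWARE_V2)
print("Per-Monitor v2 opt-in:", "succeeded" if ok else "failed (pre-1703 Windows?)")

# Opting in is the easy part; the real work is handling WM_DPICHANGED in every
# top-level window and re-creating fonts and other GDI resources at the new DPI.
```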
And that’s not all. I close my laptop lid when my laptop is docked. Windows helpfully sleeps the laptop even though external monitors, a mouse, and a keyboard are connected. There is no fix. Of course, I tried to make my own solution. https://github.com/jchv/SwitchOnDock
All in all, the Linux desktop sucks, but the Windows desktop gets a free pass because everyone has internalized the ways that it sucks. Meanwhile Linux with Wayland has progressed meaningfully and is still being regarded with unwarranted disdain nonetheless. It isn’t quite to the level of macOS with scaling, but it’s close. And that docking issue is also something that is not an issue on either Linux or macOS. So YMMV.
One thing I cannot do is allow someone to shame Linux for its DPI problems without acknowledging Windows DPI problems.
> And that’s not all. I close my laptop lid when my laptop is docked. Windows helpfully sleeps the laptop even though external monitors, a mouse, and a keyboard are connected. There is no fix.
Right click the battery icon in the system tray > Power Options > Choose what closing the lid does > Do nothing.
Of course, then lid close will never work as intended. What I want is for lid close to sleep IFF no external monitors are connected, regardless of whether the machine is on AC power or not. Thank you for the suggestion, and in my actual solution, I basically programmatically switch this setting on and off based on whether or not external monitors are connected.
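Roughly the shape of that idea, sketched in Python rather than the actual SwitchOnDock code; the SUB_BUTTONS/LIDACTION powercfg aliases and the values (0 = do nothing, 1 = sleep) are assumptions to verify with `powercfg /aliases` on your own machine:

```python
# Very rough sketch of "sleep on lid close only when undocked".
# Not the actual SwitchOnDock implementation - just an illustration.
# Assumes the SUB_BUTTONS / LIDACTION powercfg aliases exist on your system
# (verify with `powercfg /aliases`); 0 = do nothing, 1 = sleep.
import ctypes
import subprocess

SM_CMONITORS = 80  # number of display monitors on the desktop

def external_monitor_attached() -> bool:
    return ctypes.windll.user32.GetSystemMetrics(SM_CMONITORS) > 1

def set_lid_close_action(action: int) -> None:
    for switch in ("/setacvalueindex", "/setdcvalueindex"):
        subprocess.run(
            ["powercfg", switch, "SCHEME_CURRENT",
             "SUB_BUTTONS", "LIDACTION", str(action)],
            check=True,
        )
    subprocess.run(["powercfg", "/setactive", "SCHEME_CURRENT"], check=True)

if __name__ == "__main__":
    # Do nothing on lid close while docked, sleep otherwise.
    set_lid_close_action(0 if external_monitor_attached() else 1)
```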
This is what Windows Custom Power Plans were built for. Have a plan called "Docked" that disables lid close, and leave it as sleep in other plans. Custom Power Plans have been buried in Windows 10 as confusing power user/OEM features, yet they still exist.
I think there's an option in settings to select the primary display - if you set that to the external display, theoretically it should work. My laptop stops sleeping on lid close with this.
What the hell. It’s one thing to say you prefer nothing to happen on lid close. That’s cool. I don’t want that. This is like saying, “I would like a mocha” and having the barista say “Wrong, you want an iced coffee.” No, I do want my laptop to sleep on lid close, just not when it’s docked. This is clearly not much to ask for considering Linux and macOS do it just fine by default and I can still disable sleep on lid close if I want to.
Why is it better for Windows to do this wrong? It is clearly a side effect of the fact that the Power Policy system only differentiates between AC and DC and no other conditions.
What happens when I move rooms? Well, I only have one docking setup. But also, it is not a problem if my laptop spuriously goes to sleep. (My solution does not automatically sleep the laptop when it is undocked, either, so this is also still not an issue.)
It came out harsh, sorry. But it is more like asking for a 33cl beverage but getting 35cl.
There are tons of issues with a computer spuriously going to sleep. Network connectivity dies for one, as well as lots of minor quirks such as music stops etc. Having control of that is quite essential.
Being forced to manually sleep despite ones preferences is not a major issue. It is a sub-second operation that comes with lots of benefits. One of which being explicit with what you want to do.
On the flip side, a laptop failing to go to sleep can be a fire hazard or at least permanently damage your laptop, depending on where it’s stored. This killed one of my X1 carbons.
And yet, my Ubuntu does the right thing in both cases:
- When the lid is closed and there are no external monitors, the laptop goes to sleep.
- When there is an external monitor, it switches to it entirely.
And when the external is disconnected after that (the laptop is still closed), it goes to sleep fine.
> Meanwhile Linux with Wayland has progressed meaningfully and is still being regarded with unwarranted disdain nonetheless.
That's because things don't work.
I moved from i3 to Sway and back to i3 because Sway didn't deliver on being compatible with i3.
Then there is the xdg portal mess which makes screen sharing through a web browser weird.
And also the whole X forwarding mess, which makes all apps using it blurry on HiDPI. I use my computer for work, and 99% of it happens in emacs. The remaining 1% is using Chromium, which also does not support Wayland yet, so for me 100% of my Wayland experience on HiDPI amounts to "all text on Wayland is blurry". There is no "HiDPI per-app" setting to fix this, scaling to 1.5x is also bad, etc. And that's without getting into the whole "wlroots will never support nvidia" thing. I need to use both AMD and nvidia hardware for work. I don't care whether Wayland's desktop war (e.g. Sway) against nvidia, who has 50% of the GFX market and 99% of the compute market, is honorable or not. But I do care about not being able to work on Wayland while X works just fine.
Sway devs say that this is 100% nvidia's fault. From my POV, all other desktops manage to work with nvidia's hardware just fine and Sway does not, so I see this as being 100% Sway's fault. Without an i3 replacement I can't use Wayland, so until this fanaticism ends, I probably won't be able to use it.
I scanned through the comments. There is no editor war anymore these days ;) For development I wouldn't have issues as I just use VIM and CLI. Neovim currently has linting and auto-complete and works so much better than before. It's actually also using the language server protocol, same as vscode. So no issues with DPI. The other tool I use is Chrome, and that handles DPI quite nicely on Linux.
But I switched to Mac a long time ago. Because I agree it handles text and HiDPI displays much better. Maybe Windows has improved on this, but reading text on it was such a downgrade from a Mac. If Windows had text rendering as nice as the Mac's I would switch instantly. Cheaper hardware at better prices and all. Now I'd have the best of both worlds, Windows and Linux all in one.
The only time I see any HDPI issues on Win10 is when running ancient apps. It certainly works great in VSCode.
I'm not sure what you mean by "nice text rendering" - are you referring to ClearType vs macOS anti-aliasing? If so, that doesn't really come up for HDPI, because you can't see the artifacts either way.
I'd hope that the barrier to entry for starting ML practitioners isn't easy environment setup, as Google cloud, AWS, even colab exist with easy to use environments.
Anyone recommend a good guide for quick setup and noodling on one of these platforms? I tried a Digital Ocean machine learning droplet awhile back with one of their guides and nothing on it worked.
I liked using colab to test certain things out, but it depends on whether or not you enjoy using tensorflow.
Edit: I should also add that if you enjoyed the colab experience but want to scale up, you can attach the same notebooks to a Google cloud compute VM and run them on that.
Throwing out a random data point: Until now, NVIDIA has been very anti-virtualization (on the consumer side), going as far as to engineer its drivers to detect in-use virtualization (Hyper-V, Xen, QEMU, etc.) and fail on purpose [1]. I'm curious to see how they now handle this scenario (given WSLv2 runs in a virtual machine). Perhaps they just commented those checks out in their 'specialized' drivers, an interesting development for enterprising individuals looking to enable consumer GPU pass-through for general purpose virtual machines and containers.
Yes, instead of using a virtual GPU driver that simulates the behavior of a real device, a paravirtualized driver is a shim that connects a device in the virtualized operating system to a real device on the host.
In summary:
* Full virtualization is a complete, in-software implementation of a device. Early virtualization technology was typically of this nature.
* Paravirtualization typically requires cooperation between the host and the guest, with a special communication layer (in WSL2's case, provided by Hyper-V) between a guest device driver and a host driver.
There are at least two more methods of passing a host device through to a VM.
* Called "GPU passthrough", "PCIe passthrough", or "VFIO passthrough" depending on the source; Microsoft lumps these all together and calls it direct device assignment, or DDA. In this mode, the guest OS is given exclusive access to a device or a device hierarchy (defined by the layout of the motherboard itself). This uses the MMU and IOMMU of the host to allow a VM to run a native driver, e.g. nvidia's CUDA driver, and it will see a real physical device. (Nvidia's driver has historically blocked this by detecting that other parts of the guest OS are virtualized, because from the driver's perspective the device is a real, authentic Nvidia device, but the rest of the OS devices are virtualized and there are ways to detect that.)
* SR-IOV (https://en.wikipedia.org/wiki/Single-root_input/output_virtu...) is a PCI-express native method of splitting a device into virtual functions which can be mapped into a guest. I think the first real use for this was network adapters, which allowed VMs to get 10-40Gbps network adapters working at native speeds by passing through virtual functions so that hardware offloading worked. Nvidia supports this on some of their server platforms, with GPUs offering up to 7 or 8 "virtual functions" which allows a single GPU to be partitioned and assigned to separate VMs. Once split up in this fashion though, I think it can be tricky to present the full device as a unified GPU.
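On the Linux host side you can poke at SR-IOV through sysfs; a small sketch that just lists which PCI devices advertise virtual functions (the attribute names are standard, but what you actually see depends entirely on your hardware):

```python
# Sketch: list SR-IOV capability of PCI devices via sysfs on a Linux host.
# Only devices that expose sriov_totalvfs support SR-IOV; the attribute names
# are standard sysfs entries, but the devices present will vary per machine.
from pathlib import Path

for dev in sorted(Path("/sys/bus/pci/devices").iterdir()):
    total = dev / "sriov_totalvfs"
    if total.exists():
        enabled = (dev / "sriov_numvfs").read_text().strip()
        supported = total.read_text().strip()
        print(f"{dev.name}: {enabled} VFs enabled out of {supported} supported")
        # Enabling VFs is a root-only write, e.g.:
        #   echo 4 > /sys/bus/pci/devices/<addr>/sriov_numvfs
```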
The name “paravirtualization” is sort of misleading. It has more to do with history of virtualization than anything to do with technical aspects of it. The intention was to separate it from full on emulation, which is much slower. All modern virtualization is paravirtualization, unless you’re trying to emulate something specifically, e.g. develop for an ARM µc on an x64 workstation.
Technically it just means there are client and server components and they communicate through some IPC. In this example you're talking to a GPU driver running on the host, instead of the GPU directly. The communication goes through a VM boundary, the host driver accesses the device, then returns the result through the boundary, once more. You could just as well do all of this via a network socket, it would just be a little slow.
When you “pass-through” a device the guest is talking to it directly, all the host has to do is set up the IOMMU to limit the guest and the device from accessing the host’s or other guests’ memory. Basically VLANs for PCIe and the IOMMU is a firewall.
Pity. CUDA was/is proof positive that Linux on the desktop is perfectly feasible and that you can use it to both do a UI and do meaningful computation on the same machine without getting tied down into all kinds of licensing schemes. Opening this further up to Windows gives fewer people a really good reason to try out Linux as their daily driver. I personally don't get why any developer would prefer Windows over Linux with its near infinite software repositories related to all things developers would like.
> I personally don't get why any developer would prefer Windows over Linux
Because windows now literally has all the stuff Linux has plus a lot more.
The only reason for a well paid developer to prefer linux over windows for a desktop OS is that you don't need any of the extra stuff (a decent desktop experience, hardware support and a commercial end user software eco system) or that you have some fundamental objections concerning Microsoft's business practices. These are good reasons, but experience shows that most people don't care enough about things like privacy to give up convenience or shiny things.
Apple may be in for a tough awakening. They profited enormously from providing the desktop OS that a very large fraction of the technical elite has been using for a long time. macOS is still smoother than Windows in some regards, but the writing is on the wall: Linux completely dominates the server but is as hopeless as ever as a general purpose desktop OS, while Windows no longer sucks for software development. It used to be that Macs had the unique selling point of offering a nice desktop experience on good hardware coupled with something close enough to Linux that you could use it for a very wide range of software development tasks. But new hardware is developer hostile and macOS drifts further and further away from Linux (ever more lock-down that directly breaks developer workflows and tooling, and an ever more diverging user land with a mix of ancient GNU and irrelevant BSD tooling). By contrast Windows now includes an ultra-high-fidelity Linux layer with excellent tooling and integration with "core" Windows, and Microsoft also owns the world's most popular editor and code hosting solution.
I'm amazed how well Microsoft has managed to find its way back to its embrace, extend, extinguish roots. It is both a technical and a strategic marvel.
According to the Stack Overflow survey, Linux Desktop marketshare for developer has been growing constantly over the years. 23% in 2018's "primary desktop OS" question, 26% in 2019, 27% in 2020. Prior to that the question was asked differently, allowing multiple choices, but Linux was even lower back then. WSL was released in 2016 so much of this growth has happened while it was out. There might be some delay in the effects though, and maybe it would have been a much faster trend had it not been for WSL.
WSL2 came out (in non-preview version) last month and is a vital improvement. Windows Terminal is still pre-1.0 as is the CUDA and DirectX on Linux stuff. Then give it a bit of time to percolate through, technically and socially.
I guess the writing’s on the wall now. The latest lines of Apple products have been all been disappointing to me: laptops and phones without ports, keyboards that break, silly gimmicks all about — and a smartwatch that I developed a skin rash from so bad I still have it now, four months after having stopped using the watch.
CUDA was always the big thing for me, I use CUDA and develop with it — AMD and Apple have missed their shot on this, in an absolutely massive way. Alienate developers with centralized app stores and huge fees, then alienate researchers by picking a fight with the biggest name in the GPU industry while providing no alternative. Silly.
I haven’t used Windows in years and it will be so so bittersweet the day I boot it up on a Dell laptop again.
I'm exactly the audience Microsoft is winning over. For the last 3-5 years, I've been an avid Mac user. The primary reason has always been the familiarity of the terminal.
Now, Apple is making a lot of hardware decisions that I don't like. Having used WSL daily for several months now, it's the game changer.
My next laptop will probably be a PC, and I can only say this because of WSL.
"Personal Computer" has long been specifically associated with Windows machines and specifically excludes Mac and Linux. It started with the "IBM Personal Computer" and its branding caused non-IBM machines (eg Macintosh) to use other branding ("a mac"). When IBM partnered with Microsoft to make home computers, this branding really solidified that "PC = Windows".
And then of course, there was the very successful advertising campaign from Apple where they specifically distinguish Macs from PCs: https://en.wikipedia.org/wiki/Get_a_Mac
Essentially, Macs are "personal computers" but they aren't "Personal Computers", if you know what I mean.
I have been rather miffed that the lack of corporate branding/identity for Linux at the time when Macintosh decided to "distinguish" itself now means that GNU/Linux machines are apparently not "personal computers" anymore.
But if PC = IBM PC, and those basically always came with Microsoft software, I guess history supports that interpretation.
More like familiar experience. Most people don't care about things they have not experienced, Blub paradox.
It is amazing how Mac users stick with Apple through a lot of inconvenience - x86 transition, USB type C, adapters, no audio jack, touchbar, breaking updates, walled garden iPhone. But they also brought a lot to community - good chassis, quality touchpad, HiDPI, wide gamut, capacitive touchscreens with gorilla glass.
Microsoft users have had no less inconvenience - ribbon, metro, viruses, ever changing frameworks, ever confusing names (winrt), wrecked settings, custom theme on every application. And community gains - Word, Excel, up to IE6 innovations, XBox, VS Code, language servers, maybe crossplatform .Net some day.
I do not want to wreck them, I want to be objective. Microsoft of the 80s rocked, in the 90s it was big and everpresent. Today they excel in PR. Hell, they got a package manager 20 years late.
> The only reason for a well paid developer to prefer linux that you don't need any of the extra stuff (a decent desktop experience, [...list of several more "linux sucks" opinions])
I think you'll find many Linux devs who have previously used or considered Windows have their own list of "windows sucks" reasons.
(By "Well paid", are you referring to people who are willing to spend money on big ticket desktop sw, or people who are in it for the money and don't have the curiosity or interest about how the system works? I think for most it's not about not being able to afford the SW.)
Direct rendering for X. This means the maximum OpenGL version that can be supported with hardware acceleration is 1.4, so many modern X programs don't work or will use software rendering.
Multi-monitor support. Under X, windows are by default positioned and sized by the window manager and it's hard for application developers to screw it up, but under Windows each application developer has to be careful to make their program work well on multiple monitors. This shows, and even some built-in Windows applications get confused by multiple monitors, usually by having an action performed on one monitor open a window on a different monitor.
Just a small correction: the only "developers" that find Apple's offerings hostile are those using macOS as a Linux replacement instead of paying Linux OEMs.
Developers targeting Apple platforms are doing just fine.
>The only reason for a well paid developer to prefer linux over windows for a desktop OS is that you don't need any of the extra stuff (a decent desktop experience, hardware support and a commercial end user software eco system) or that you have some fundamental objections concerning Microsoft's business practices. These are good reasons, but experience shows that most people don't care enough about things like privacy to give up convenience or shiny things.
Sometimes it is simply the problem of linux being utterly broken. Linux users wonder why people won't adopt linux, and it is because it is full of problems. I spent a full day this week trying to get an internet connection for my linux desktop working (sans ethernet cable) and was unable to. USB tethering was broken for unfixable reasons and wifi adapters are a nightmare with linux. So that drove me back to windows, I tried WSL, and now I probably just won't ever bother trying to run linux on desktop again. I didn't want to tinker with my software, I wanted to do my personal projects. Windows at least lets me get the job done.
Well, same could be said for Macs. I have both a Macbook and Surface. Windows has surpassed Mac at this point in dev friendliness due to WSL. I've got VS Code and Docker Desktop running remotely to WSL 2 and my Surface Pro is silent. Meanwhile, my Macbook's fan is raging right now with just a single VS Code window. That's very anecdotal, but I have a nice, native sandbox for all my work. It's honestly the best of both worlds. With Mac, every time I upgrade I have to re-install XCode just to compile some Node binaries and use Git. I could go on and on about how little Apple cares about devs, but we are clearly not the target audience. I haven't tried Linux in a while, but on its own it is still the worst of the three for everything that's not software development.
This is exactly what I have found. I know a bunch of developers (myself included) that went from Macs to Microsoft's surface line. WSL2 has been an absolute game changer, there's something liberating about being able to launch distros of Ubuntu, Debian, Fedora etc all faster than the computer wakes up from sleep, it's literally as fast as opening the command line. Microsoft has been making big moves for developers over the past years through open source contributions, Apple literally does not care about them and this sentiment can be seen explicitly on their respective GitHub pages.
Well, CUDA is about the opposite of open source. It needs special hardware and a closed source driver.
CUDA on Windows was already a big deal in engineering. Scientists and academics preferred Linux (and python, and docker) and Windows is playing catch up there to become compatible with the tools that have grown in popularity over the last 5 years.
Personally, as an open source advocate, I think competition (even from MS) is healthy. It drives everyone to build a better product.
Yes, but that doesn't really contradict what I said. As long as manufacturers are willing to support Linux for their products, even those that require special hardware and in spite of having a closed driver, this is possible and will work very well; so well that an entire profession can be better supported on Linux than on Windows. And that's very interesting, especially given that CUDA is an outflow of gaming hardware which was better supported on Windows.
WSL2 and all these features don't have anything to do with open source either. It's basically a shim for communication between two proprietary bits of software on both sides. Fortunately all these graphics and compute shims will never get merged into the upstream kernel, so they'll be second-class citizens like forever.
Dave Airlie, who is a maintainer of the Linux graphics subsystem, will never allow it to be merged since it can only be used for proprietary software.
It's incredible that people like you are bitter because others are given a choice and they don't choose what you would want them to. And all in the name of freedom, no less.
I have no bitterness here, just facts. It's important to understand that none of these WSL features has anything to do with open source. It's exactly your standard "extend" stage.
See: proprietary hypervisor runs OS with a patched non-upstreamable Linux kernel and then there is proprietary userspace drivers and libraries that able to talk to their proprietary parts inside Windows.
Can any of it be used with a different hypervisor? No.
Can it be used with open source drivers? Again, nope.
If Microsoft or Nvidia wanted to use open APIs they would just support technologies that already exist, e.g. there is GPU virtualization with Virgil 3D. There is also VirtIO, which is a standard for all kinds of virtualized interfaces, but again Microsoft is only interested in using their proprietary tech.
Microsoft could also just use Vulkan instead of bringing the proprietary Direct3D userspace to Linux. Or maybe they could actually open source parts of their stack, but that would work against their vendor lock-in.
This custom kernel only contains a shim that is needed to pass commands from the proprietary component on Linux to the proprietary drivers on Windows.
When it comes to graphics drivers, Linux upstream doesn't accept any code that can only be used for proprietary components. So all these Direct3D and CUDA shims are never going to be merged.
The only proprietary part in the AMD and Intel graphics stack is the GPU firmware. Both companies have a few closed-source components that might work on top of the same open-source kernel drivers, but they all have open source alternatives.
Also Linux is becoming the new Windows, and that is one of the reasons I stopped using it. What is the point of using a system that on every release is more and more bloated and full of crap that breaks?
Docker for example, I hate it, it serves no purpose besides wasting disk space and slowing down your machine. BTRFS is a bugged imitation of ZFS. Wayland is not only bugged but even does fewer things than Xorg.
Most scientists and academics then just don't know how to write software or administer operating systems, and if they program they write the worst crap of spaghetti code possible, no wonder they like Linux, Docker and python.
I use macOS, at least I have a system that has a decent user interface, and for my server I started using FreeBSD whenever I can: much more stable, much simpler, good documentation, a good operating system.
Okay, I guess I just need to sleep and should make it somewhat clearer: I've never seen anyone using Docker for deployment of any software for the Linux desktop. There are superior channels/formats for this purpose.
Web developers usually need Docker on their localhost just so it works the same way as the production environment. Though when you compare Windows on the desktop and Linux on the desktop, you're not going to use Docker at all.
It's funny that you list docker and wayland as linux bloat that freebsd doesn't have, when docker works just fine on freebsd and wayland support is also in the works.
Personally I use Ubuntu as a daily driver. No docker unless you install it and comes with Xorg (as well as wayland, but you're not forced to use it).
I do agree that Linux has been adding complexity at a good pace the last 10 years.
I disagree about Docker. It's a good tool for reproducible development and production environments. What else should we use? virtual machines?
Scientists and academics are not, in general, professional software engineers. At least they are using open source tools! What would you rather they write their spaghetti in? MATLAB?
FreeBSD does not have the mindshare and ecosystem that Linux does. That actually does matter.
First, as a developer you don't only do programming: you may have to work with documents, do a presentation or a spreadsheet, and thus use Office. You may have to make some icons or logos and thus need Adobe Illustrator. You may need to connect to a printer, like the one we have at my office, and good luck doing that on Linux. You may need CAD software to design a PCB, or whatever other software that is Windows-only.
Second, there are situations where as a developer you are forced to use Windows. A lot of SDKs, compilers, debuggers, programmers for microcontrollers, FPGAs, etc. run only on Windows, and if you have to deal with firmware development you are basically forced to either use Windows or spend a week just to make an LED blink (if you manage to do that).
Same reason for using macOS: I use a Mac mainly because I need to make iOS software and the only way is to use a Mac. By the way, macOS to me is a good compromise: you have all your UNIX tools in a system that you can also use to run proprietary software.
On my server I use Linux, even if recently I started using more and more FreeBSD and I'm liking it a lot (recently they added too much useless bloat in Linux like systemd and other crap, while on FreeBSD you have a simple system that is stable as a rock)
So, I guess when I use Office365 on my Linux laptop, which is connected to a HiDPI monitor, I can't do what you say.
Ok.
Here's a bit of a reality check. You can do, with very limited corner cases, everything you want to do on windows, on linux (and Mac) today, via the web or similar clients.
One exception to this is skype for business. Which is complete and utter trash anyway. But there is a commercial linux version, not by Microsoft, that works quite well for it. I have it and use it.
Teams for linux from Microsoft is in preview/beta. But, the port using electron was out first. And is fully operational.
For things that just absolutely require windows, I boot my licensed windows 10 home which was installed on a small SSD when I bought the laptop, in a qemu kvm window. Everything works on it.
I am amazed with all the hate on Linux desktops. I've been using Linux machines as my primary desktops, and linux laptops for more than 20 years. My current laptop is a Sager (Clevo) unit with 64 GB ram, 1.5TB SSD, GTX1060. Win10 was installed when I bought it, on an SSD. I changed boot order, installed Linux Mint 17.x at the time, and everything just worked.
Once I realized I could use windows from the qemu kvm session, I set that up. Took about 10 minutes.
Something I've learned from a long and painful life with windows, is that one should never let it touch real hardware. When you have to manage it versus fleets of linux machines, you understand why keeping it virtualized is the best option.
FWIW, I don't think WSL2 is going to change anyone's game. This is my opinion of course, but most of the developers I know are either on Mac books, or linux machines. About 60-40 split. In my field (HPC) its 25-75.
Microsoft's best play at this moment would be to buy Canonical, and offer that as MS Linux.
A very fair and practical judgement. I may use a Mac and remote into Windows, as some Linux eGPU setups have driver issues but Windows doesn't. Remote VS Code from the Mac?
Just to add to this: Linux is happy to work in whatever way is convenient to you. WSL2 is a case in point here. Windows is less flexible. If I can get everything I want out of one environment (stable gui, business software, and a native-like linux shell), that will save me the hassle of multiple machines, traditional VMs, or dual booting.
I don't think we will see the Linux Subsystem for Windows 2 ever. I'm also not enamored with the GUI side of the Linux world. WSL2 seems like a not bad place to land.
In my ideal world, Microsoft would base Windows on a mainline linux kernel. That would make me so happy.
WSL1 emulates Linux's syscall interface. WSL2 is a lightweight VM. In either case you can run a regular distro, so I'm not sure what you're on about with a package manager.
I understand what they do. I just dont understand why you would do it.
What is the benefit compared to the much simpler, better-performing and cheaper option of just running Linux proper and using Wine for Windows things?
You call it opening up to Windows, I call it the final word on the question of a proper Windows port. Once WSL is established as the one way to use a piece of software on Windows, it stays in the half-world of WSL forever.
Perhaps WSL is a perfect tie in terms of user migration: for every person who decides that WSL is their most convenient way of using Linux, there might be another who decides that if they are spending so much time in WSL anyway, they might as well try skipping the surrounding Windows. And everybody wins, even pure Linux users who will over time use more software that isn't developed with the overhead of Windows port ifdefs.
That's quite a funny sentiment: basically you are saying the only reason for some people to use Linux is if they are forced to use it by some library. An operating system shouldn't rely on that, if it wants to be truly general purpose.
> I personally don't get why any developer would prefer Windows over Linux
I'll offer you one data point, for what it's worth.
We do multidisciplinary work on our workstations. That means electrical engineering (Altium Designer), mechanical engineering (Solidworks+CAM), embedded (Keil, Xilinx, Linux, NVidia Jetson, etc.), FEA (Solidworks Simulation), server-side development (Linux, Python, Django, etc.) and the myriad business documents that can go with all of that.
The traditional setup has been to custom build powerful Windows machines with as much memory as the mobo will support and half a dozen SSD's assigned to specific functions (system, data, library, hardware dev, software dev, virtual machines) and then build a bunch of task-specific virtual machines.
Using multiple (3+) monitors it is easy to live in this multi-OS/multi-paradigm world once you get it all setup.
Not quite using WSL yet but watching its development with great interest.
If all someone is doing is web development it used to be that either an Apple laptop or a dual-boot Windows/Linux machine was the top choice. WSL seems to be displacing Apple in this domain.
You can use those software repositories through WSL2, using apt as per usual. And Windows now has WinGet so you can install software just like you would with apt/yum/pacman etc.
It's definitely not as robust - you can't uninstall stuff using WinGet but the concept is there.
There’s a rapidly growing third-party ecosystem around WinGet, as well. It’s promising. Here’s an example of a WinGet webgui thingy: winstall.[1][2]
Do you or wider HN users have any picks for WSL/2 and/or WinGet tools?
Because Linux is still crap regarding audio and 3D graphics programming, plus it has a culture of doing everything the PDP-11 way instead of embracing the desktop and GUI based workflows.
I have the opposite setup: Linux is my primary os and I have windows on a separate ssd that I can either dual boot into or use in qemu from Linux. It gets used once a month or so. I don’t game, so maybe I am the wrong demographic, but I don’t miss anything from windows. All my document work (even right-to-left languages) works well on Linux, default Ubuntu gnome. I use two monitors, with different resolution. What am I missing? I spend my time in vim/cli, slack, github in Firefox, and email on thunderbird.
Last time I tried it, I installed Ubuntu 18.04LTS, updated, installed the recommended graphics driver, and then the machine would not boot after that.
I recently tried 20.04LTS on a laptop, with an external GPU over tb3 - it works! However, I really struggled to rectify the display density of my external 4k with the laptop's built in 1080p. I had other show stopper bugs, besides bad scaling, however. :(
What you're missing is the ability to open a game for a quick session while keeping your work open in the background.
What you're missing is that desktop linux is famously unpolished, and worse, brittle. There's a list of known(!) issues with desktop linux that some guy updates - currently it has over 100 entries.
What you're missing is your system continuing to work after major updates (Windows 10's relatively recent and rare "big oops" bugs aside lol). Personally, I use w10 pro with deferred feature updates to mitigate this, which is kinda messed up but there you go. I simply can't afford an update to knock my desktop out, which has happened to me a few times with desktop linux.
What you're missing is that "works for me" works for you, but not for me.
In case you're wondering why I keep trying desktop linux, it's because what I'm missing is an open, free, libre, privacy-respecting desktop. These are values I take seriously enough that I pay yearly estimated license fees to the open source components I use in the form of donations.
For work there's a bunch of random applications I need to use. Most documentation/IT support assumes you are using the standard macbook/windows machine. I'd rather not do the work of trying to use linux as my main OS. Plus I'm regularly writing code for mac/linux/windows hosts.
Overall I like Windows as an OS, but I like using a linux command line for development. Windows with WSL2 meets most of my dev needs.
>Most documentation/IT support assumes you are using the standard macbook/windows machine.
Frankly, learning and intuitively understanding how many software components work is a valuable skill. "do the work" is steep up front but yields dividends, imo.
I've always been confused that nvidia pushes cuda and ML but shuns linux like it is a leper. I get that gaming is M$ based, but cuda and ML is often done on linux machines.
I can just hope that with AMD's competition that nvidia makes some things easier for us practitioners. I just want stable and relatively up to date drivers.
Does kind of feel that way. Having a full Windows Desktop with good graphics drivers and software like Microsoft Office, Photoshop, commercial games etc. and Linux for the programming/server-software side of things.
I have literally no reason to dual-boot with Ubuntu anymore. WSL2 and Visual Studio Code fulfils all my Linux coding requirements.
Maybe (probably?) a bad thing but it's damn convenient.
A Linux environment is superior for doing development (for everything I care about), and nowadays I've been able to play every game that doesn't have a native Linux port in Proton.
Be aware of the fact that WSL 1 and 2 can coexist. Running wsl -l -v from PowerShell should tell you which distribution is using which subsystem. You can change between the two.
If you're using WSL 2 and you put your Linux files on the Windows filesystem IO performance becomes spectacularly awful. If you keep your Linux files on the Linux filesystem it works rather well.
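If you want to see that gap for yourself, something like this run inside WSL2 makes it obvious; the /mnt/c path is an assumption about where your Windows drive is mounted:

```python
# Quick-and-dirty timing of many small file writes on the Linux filesystem
# vs. the Windows filesystem mounted at /mnt/c. The /mnt/c/Temp path is an
# assumption - adjust to an existing, writable Windows directory. Run inside WSL2.
import os
import tempfile
import time

def time_small_writes(base_dir: str, count: int = 500) -> float:
    start = time.perf_counter()
    with tempfile.TemporaryDirectory(dir=base_dir) as tmp:
        for i in range(count):
            with open(os.path.join(tmp, f"f{i}.txt"), "w") as f:
                f.write("x" * 1024)
    return time.perf_counter() - start

print("Linux home:", round(time_small_writes(os.path.expanduser("~")), 2), "s")
print("/mnt/c:    ", round(time_small_writes("/mnt/c/Temp"), 2), "s")
```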
Docker should be installed on the Windows side with Docker Desktop and then associated with your Linux machines under Docker Desktop settings. This makes it available on both Windows and Linux. If you install it directly on Linux it only becomes available on that Linux installation and you get no interoperability.
Systemd tools will yell at you. I encountered issues when running systemctl. There are some fixes but I don't know if they work because I'm lazy so I worked around the issue instead.
I don't know if accessing the Linux files from Windows breaks them. I'm using VSCode remote extensions instead of mounting \\wsl$ and accessing it directly.
To some extent, yes. However, you will get the best performance (and improvement over WSL1) when working with files contained within the Linux file system.
This is BIG for machine learning adoption! 2 years ago it was a nightmare to setup everything on a Windows box, and wasn't working in any VMs (WSL or VirtualBox). Very excited about this.
If I can’t ask about the budget available for any Windows project for tooling and libraries, I never really know how much of a problem it was being a Windows project. Even then, I used to like to ask about experiences with a handful of different parts of the Windows systems, which frequently suggested that either the programmer was pretty new to Windows or that something was amiss. It was usually my experience to find more Unix talent avoiding Windows by default, and at least with a BSD TCP stack and any communications project, Windows jobs kept turning out to be for executive management demonstration purposes, which was a very important opportunity for getting into more significant financial programming work.
I tried Windows 10 Pro and Windows 2019 Server about 2 months ago on a self-built computer (my first in ~15 years). What I found was:
- driver support was atrocious for Windows server, and also for Windows 10. Almost every component from a fairly standard Ryzen build installed some form of custom skinned crapware to get basic functionality
- spyware and advertising was everywhere.
- the UI is a mess of old and new. It's unforgivable that this far from Windows 8 there is still such a mix of metro and older UIs.
- mixed DPI support is poor leading to screens where elements are quite different sizes
These are not specific to coding, but in particular the advertising and spyware removes Windows as a contender for _any_ use from me.
After a brief foray into desktop Linux, I returned all the parts to Microcenter and went back to OS X.
On the spyware part it can be all turned off. They give the Group Policy tools to do it to 'Enterprise Customers' but it is all there in the settings/registry and tools like this will flip it all off for you quickly and easily. https://www.oo-software.com/en/shutup10
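If you'd rather see what such tools actually flip, most of it comes down to documented policy registry values; one example, assuming the documented AllowTelemetry policy key (which is only fully honored on Enterprise/Education SKUs):

```python
# Example of the kind of setting such tools flip: the documented AllowTelemetry
# policy value (0 = Security level, fully honored only on Enterprise/Education
# SKUs). Run as administrator; this is one of many toggles, not a complete
# "disable everything" script.
import winreg

key_path = r"SOFTWARE\Policies\Microsoft\Windows\DataCollection"
with winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, key_path, 0,
                        winreg.KEY_SET_VALUE) as key:
    winreg.SetValueEx(key, "AllowTelemetry", 0, winreg.REG_DWORD, 0)
print("AllowTelemetry policy set to 0")
```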
> - driver support was atrocious for Windows server, and also for Windows 10. Almost every component from a fairly standard Ryzen build installed some form of custom skinned crapware to get basic functionality
SDI Tool Origin. Don't waste your time with everything else.
Command line navigation and support was awkward and painful unless you go through a complicated rigmarole of figuring out which combination of package managers and power shell scripts lets you run something that sort of approximates Linux.
Python and virtual environments were bizarrely difficult to get working across my and my teammates’ computers.
Many useful packages weren’t on Windows, or if they were, they were awkward to install or just weren’t as fast.
Sure I can install WSL and just use that for everything, but then you get to a point where everything is kind of duplicated, and I’m doing everything in Linux anyways, so why not just use that and be done with it?
Windows never felt as fast as Linux, and using it just perpetually annoyed me.
Some of these are fixed or there's progress being made:
- IMHO PowerShell is not that bad, but the newer PowerShell Core works better and with WSL you can use your favorite linux shell (and mix and match Windows and Linux binaries)
- WinGet is in preview
- No search results: my experience is 50/50 on that
- second class citizen: I wouldn't say most, definitely for Golang
- Docker now runs on WSL2 (which is still a sort of VM, but in my experience it's night and day with the old VM and especially compared to the default Docker experience on macOS)
- Why? You can use Python, and scripts are where Powershell really shines as a language
PowerShell is horrible and confusing to use. Maybe I’m just dumb, but it was confusing to learn and super unintuitive.
If I’m running everything in WSL/2 why not just run Linux? I don’t remember the last time I had to use something windows specific - and word is available online for the sporadic times that needs to be used.
What I've found to be most alien to people used to bash is that PowerShell is fundamentally different because of its object-passing style instead of dealing with text streams. Usually, once someone gets a grasp on this, their experience using PowerShell improves significantly.
About WSL, you're right: if you only use WSL it doesn't make sense to use Windows, but it's a good compromise for anyone who needs both at the same time (at least it's a better compromise than a VM)
This feels like it's gonna be terrible for people already doing ML.
Now, the following has more than a bit of elitism, I'm aware of that, so please don't comment to point that bit out...
I'm afraid that searching for answers to specific ML questions will start feeling like trying to google some Windows problems (every few years when I make the mistake of trying to help someone out) where it's all "download this, then click here..."
I hope I'm wrong and this will only be used by people willing to go the extra mile...
What's scary about Googling for specific ML questions beginning to look like googling some Windows problems? This kind of dichotomy already exists between people who interact with ML only through keras and those who use a more flexible library. If you fall into the latter category it's still easy enough to find the answers to ML questions that mostly pertain to the deeper stuff. Don't even get me started on the difference between people who do and don't know statistics theory. But the point is that all of these groups do ML and they coexist just fine.
On another note. I don't think technical proficiency (or whatever your elitism-metric is) is very correlated with using Windows as the primary platform. You can Google stuff related to a Windows problem and through that learn how to work with the registry, and yes, some of the things you need to do require you to RTFM. So this elitism also suffers from being off the mark.
Hope we don't see things like: "Google announces today that to work around Windows' file path length limitations, Tensorflow's directory structure is being rewritten to use three characters or less per directory."
I recently helped a friend with their museum's collection management software. The server and client were Windows based, and even with the proper dependencies and provisioning, the installs kept failing.
Some of the paths were too long and the installer could only be executed after being unpacked to root.
It's been a while since I more than casually gutted and used Windows as an app container, but I hadn't realized how much has remained the same after more than two decades.
Windows itself can deal with long paths for a long time now. The problem is the apps - they need to use newer APIs that don't e.g. deal with structs with wchar_t[MAX_PATH] fields in them. For apps written in higher-level languages, it's usually the standard library that needs to be updated - e.g. Node.js and Python already have such support. But if it's C++, then it depends on how much the app developer cares.
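For what it's worth, the old escape hatch still works too: the `\\?\` prefix bypasses the legacy MAX_PATH limit without any system-wide opt-in. A small sketch (using C:\Temp as the base is just an assumption):

```python
# Sketch: creating a path longer than the classic 260-character MAX_PATH on
# Windows by using the \\?\ prefix, which bypasses the legacy limit for APIs
# that accept it. Recent Python builds also honor the system-wide
# LongPathsEnabled setting, in which case the prefix isn't needed at all.
import os

base = r"\\?\C:\Temp"  # assumption: C:\Temp exists and is writable
long_dir = os.path.join(base, *["subdirectory_with_a_fairly_long_name"] * 10)

os.makedirs(long_dir, exist_ok=True)
file_path = os.path.join(long_dir, "hello.txt")
with open(file_path, "w") as f:
    f.write("long paths work here\n")
print(len(file_path), "characters:", file_path[:80], "...")
```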
What sold me on NT 3.1 was the in-the-box Microsoft Transaction Server.
MTS is the single largest reason why SQLServer_Linux was a shoehorn refit, not anything like the job that it could have been.
Out of the box CICS interop was bigger than my memory of the first browsers for W/NT. Want to trade with the world and speak EDIFACT, or do X.500 real money movement? You had codes running before you could put the screws on your local IBM account rep to get out of bed for you.
I last looked at HN just as Lockdown was announced in England. The last discussion I perused was showing a project that provided secure remote access to sensitive ports. Which I think has been stable CICS fare since about ‘85 on the cross platform side; I would guess at least twenty years older for anything/360.
I mean there are services like Amazon AWS, Azure, Google Cloud that will set up everything for you. The money you pay them to me is worth the time you otherwise would have spent doing the setup yourself.
This is pretty wild. The WSL2 upgrade really made the terminal way faster than the original WSL, so I'm curious how this runs. Looking forward to trying it!
Indeed. I'm not sure why WSL has consistently required some bleeding edge version of Windows to run. I remember trying to get on a managed desktop at work a couple of years ago and apparently the LTS version of Windows used in the enterprise is even farther behind.
Ironically, I'm super excited about this because I run a Mac. I didn't feel like changing anything in the bootloader using ReIT so I use an old laptop for Linux stuff. Mac + Bootcamp Windows + WSL2 + eGPU sounds like my dream machine.
This is finally the year of the Linux desktop. However, it is running on Windows as WSL.
With this support for CUDA now, it can be argued that the best, most versatile developer experience is on Windows. You now have access to all the Windows specific tools (such as Visual Studio) as well as all the Linux tools in a very seamless environment.
> This is finally the year of the Linux desktop. However, it is running on Windows as WSL.
One more thing. Given that there's lots of distros, which Linux distro/desktop has won this year? Right, the winning Linux desktop is Windows, thanks to WSL2.
Actually, this might be a great price discriminator for NVidia. They have tried to differentiate their consumer and professional GPUs so they can charge different prices. I could see NVidia only providing WSL/Windows drivers for their consumer GPUs. If you want native Linux, then you would have to buy a professional GPU at higher cost.
I'd say it's going to be great for Linux, since usage of Nvidia will drop and progress will accelerate, because no one will have to deal with blob idiosyncrasies caused by Nvidia refusing to upstream their driver.
With Intel joining AMD on high end GPU scene with open drivers, it's Nvidia who will be the loser with their dinosaur blob approach.
As games are running both on AMD and NVIDIA GPUs, but CUDA is NVIDIA only (and supported by lots of languages and libraries), as a programmer I don't see how AMD could disrupt NVIDIA's developer friendliness/lock in (unless AMD provides a great CUDA implementation).
Yeah, it's very confusing. I've heard it explained but I still think it doesn't really make sense. The core of the argument that I've heard is that Windows has multiple subsystems so they say it's a "Windows Subsystem", but the 'for' in "For Linux" still makes no sense.
The arguments for not calling it something like "Linux Subsystem [for/on/in] Windows" don't seem to match up with the fact that there used to be a similar thing which was called "Microsoft POSIX Subsystem" also.
The spiritual predecessor to WSL was something called (at different times) Windows Services for Unix, and Windows Subsystem for Unix-based Applications.
It was a legal / trademark concern. Naming the thing Linux something something was deemed too risky so they went with Windows something something instead.
The new release of VMware Workstation now works with Hyper-V as its hypervisor, but a bunch of people (and I'm one of them) have experienced massive, unacceptable performance regressions for VM guests. I had to disable Hyper-V and hence stop using WSL2 until it's resolved. If the performance issues are intrinsic this will probably be a permanent showstopper for people who rely on VMware or VirtualBox. Hopefully this gets sorted out one way or another.
This looks very cool if it can deliver up to its promises! As a Mac user and ML developer I’m starting to look more and more jealous towards Windows - it is starting to make sense this way.
But I’m also a bit afraid. Has anything related to CUDA ever been easy to install and setup? Anyone who tried this have some pointers on this? For example, I don’t know how many times I’ve googled the CUDA, cudnn, TF compatibility matrix but it must be close to 100. Is this helping fix that as well?
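One thing that takes a bit of the pain out of the compatibility matrix: recent TensorFlow 2.x builds can tell you which CUDA/cuDNN they were compiled against. The get_build_info call and its exact dict keys are an assumption for your particular version:

```python
# Print the CUDA / cuDNN versions a TensorFlow build was compiled against,
# plus what is currently visible. get_build_info() exists in recent TF 2.x
# releases; the exact dict keys may differ between versions.
import tensorflow as tf

print("TensorFlow:", tf.__version__)
build = tf.sysconfig.get_build_info()
print("Built for CUDA:", build.get("cuda_version"))
print("Built for cuDNN:", build.get("cudnn_version"))
print("GPUs visible:", tf.config.list_physical_devices("GPU"))
```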
This is quite light on detail. I have just installed it and was thinking that nvcc would be installed into WSL2 as part of the process. It wasn't, so I did it via apt, which worked fine. Built a sample (with some hacking of the SM versions, which should be higher as I have an RTX) and when I run the basic matrix mult demo I get code=38 (cudaErrorNoDevice). Anyone else had any luck?
Just done some more reading around and it seems most people are using Docker in WSL to get things to work! This seems overkill to me. I usually just write CUDA code in Linux and use it. Am I missing something?
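Before reaching for Docker, it might be worth asking the CUDA runtime directly whether it can see a device at all; a rough ctypes sketch, assuming libcudart.so is on the loader path inside WSL2:

```python
# Minimal diagnostic: ask the CUDA runtime how many devices it can see.
# Error code 38 is cudaErrorNoDevice, matching the error from the sample above.
# Assumes libcudart.so is findable inside WSL2 (adjust the name/path if not).
import ctypes

cudart = ctypes.CDLL("libcudart.so")
count = ctypes.c_int(0)
err = cudart.cudaGetDeviceCount(ctypes.byref(count))

if err == 0:
    print(f"CUDA runtime sees {count.value} device(s)")
else:
    # cudaGetErrorString returns a const char*.
    cudart.cudaGetErrorString.restype = ctypes.c_char_p
    print(f"cudaGetDeviceCount failed with code {err}:",
          cudart.cudaGetErrorString(err).decode())
```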
Why would it matter if it's upstream? Upstreaming something like this is not really useful to anyone but Microsoft; it is still something that is inextricably tied to Windows and WSL.
The point is not "extinguishing" Linux per se, it's achieving enough lock-in that the only Linux Microsoft customers can use is WSL.
WSL is not a Linux. It is a Windows subsystem for running Linux userlands. (Almost as if the WSL name isn't totally nonsense!)
The "default" is Ubuntu. But Debian is supported, OpenSUSE is supported, Kali is supported. Unsupported but available, you can get Alpine, CentOS, Fedora, Arch, lots of distros.
That was WSL1, WSL2 is a micro-vm running the Linux kernel and whatever userland you want. Ubuntu is just the most advertised one, but all are equally unsupported by MSFT, support is provided by the distro "vendor"
By "supported" I mean "available in the Windows Store." I believe that those are submitted by the distro vendors themselves.
And from a lock-in perspective, the userland is all that matters, yeah? If an app runs on Ubuntu the same way whether it's on WSL 2, in Docker, in a VM, or on bare metal, then it's not a Microsoft Linux, it's just Ubuntu. Or whatever Linux you want.
You could say that WSL is not a GNU/Linux. But since WSL comes with its own kernel, part of it is definitely a Microsoft-extended flavor of the Linux kernel.
Even if this driver went upstream, it wouldn't be any less tied to the WSL virtualization platform.
The only part they're upstreaming is some ioctls to send opaque blobs from closed-source binaries on the Linux side to closed-source binaries on the Windows side.