What is the best single-PC setup for a Linux enthusiast who still likes to play Windows-only games and run other Windows-only software?
Not WINE, because despite working surprisingly well in many cases and being in general better than you might expect, it's often flaky and unstable, and is absolutely not guaranteed to run every program. Not dual-booting, because then you have to constantly reboot and context-switch. Not a Windows virtual machine, because those don't let you run anything graphically intensive. Not a Linux virtual machine on a Windows host, because then you're not really using Linux (especially evident when Windows blue-screens and takes your poor VM along with it).

What you want is a way to run Linux and Windows simultaneously, giving each operating system access to the hardware it needs: Windows gets the graphics card and everything else required for playing games, and Linux gets the rest for doing everything else.

Solution: set up a Linux host with a Windows virtual machine that uses modern virtualisation technology (VT-d & IOMMU) to directly access the graphics card.

I set this up over the last week or so, and it was actually a lot easier than I thought it would be. A year or two ago you needed to compile custom kernel images to get it right, but with the latest version of Ubuntu (16.04) running Linux kernel ~4.4, all I had to do was:

1. Make some minor config changes to assign my graphics card to the pci-stub driver on boot instead of the default radeon driver.
2. Set up a Windows virtual machine with Qemu-KVM and assign it my card (there was even a neat GUI for this, with only one point where I needed to abandon it and dip into config files).
3. Install graphics card drivers on the Windows VM.

I had to do a bit more futzing around to get keyboard and mouse sharing (Synergy), some further futzing to get mouselook in FPS games to work (by default Synergy sends absolute mouse coordinates, but games want relative ones, so you end up with a madly spinning screen or a crosshair that simply refuses to move), and a little more mucking around to get my onboard graphics card to play nice so I could use it for the Linux host. All in all, easier than getting graphics card drivers to work on Linux itself. /s

Under my setup, both monitors are wired to both graphics cards. So I can start up my Windows VM, move my mouse up, switch the input on my monitors, and then feel exactly like I'm using Windows on a normal Windows PC. And if I don't want to switch over, I can use Steam's in-home streaming to literally play games on Linux.

Graphics settings in games are exactly the same as when I was running Windows, and performance seems the same too (though this may take a few weeks to fully assess). Startup is obviously faster, and I can now spend most of my time using my favourite minimal tiling window manager.

Overall, this worked a lot better than I expected, and was almost entirely painless to set up. My thoughts could change in the coming weeks (during which I plan to do a full writeup of how I set things up on my other blog), but so far, so good. Really worth a shot if you think you'd like it and have compatible hardware (you need to be able to enable VT-d in your BIOS and will also need a reasonably recent graphics card). For reference, the three main resources I used were this Linux Mint forum thread, this Arch Wiki page and the five-part guide on the VFIO Tips and Tricks blog.

EDIT 2016/09/09: And here at long last is my comprehensive guide to setting this up, as promised months ago.
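To give a flavour of step 1, here's a rough sketch of the kind of boot configuration involved. The device ID below is a hypothetical example, not my actual hardware; you'd find your own with `lspci -nn`.

```shell
# Sketch of step 1, assuming an Intel CPU and a graphics card with the
# (hypothetical) vendor:device ID 1002:6810.
#
# In /etc/default/grub, enable the IOMMU and have pci-stub claim the
# card before the radeon/nvidia driver can grab it:
#
#   GRUB_CMDLINE_LINUX_DEFAULT="quiet splash intel_iommu=on pci-stub.ids=1002:6810"
#
# (use amd_iommu=on instead on an AMD CPU), then regenerate and reboot:
#
#   sudo update-grub && sudo reboot
#
# After rebooting, `lspci -nnk` should show
# "Kernel driver in use: pci-stub" next to the card.
```

This is only the broad shape of the change; the exact parameters depend on your distro and hardware.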
Thanks for the work and this great writeup. Just wondering if in the intervening period you've had any problems and if so how did you get around them.
Thanks in advance.

One issue I had was with Windows Update, but that wasn't actually entirely related to the passthrough. At some stage last year a Windows 10 update pushed new Intel microcode, and that update failed for a lot of people. Because I only use Windows for games, I didn't really notice or care until I realised I hadn't received the Anniversary Update. Long story short, I had to change the VM's CPU to a Core2Duo temporarily to get the updates to install.
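For reference, that workaround is just a small change to the VM definition. A hedged sketch, assuming a libvirt-managed VM with the hypothetical name "win10":

```shell
# Hypothetical sketch of the CPU-model workaround for a libvirt VM
# named "win10". Open the domain XML for editing:
#
#   virsh edit win10
#
# and swap the <cpu> element for QEMU's core2duo model:
#
#   <cpu mode='custom' match='exact'>
#     <model fallback='allow'>core2duo</model>
#   </cpu>
#
# Run Windows Update inside the VM, then restore the original <cpu>
# element (e.g. <cpu mode='host-passthrough'/>) for full performance.
```

The VM name and the original CPU configuration are assumptions; the point is just that the guest's CPU model is a single line of config away.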
So that's not really an issue; in fact, it's something that running Windows in a VM made a whole lot easier to fix than it might otherwise have been.

Apart from that, I've had my PC lock up once or twice because of running out of RAM, but that hasn't happened since I slightly reduced the number of hugepages reserved for the VM.

I also had a strange situation with a game of Age of Empires II HD: four of us were playing, me on my VM and the other three on Windows boxes, and for some reason there was extreme lag between me and one of the other guys. I tried bridging my VM onto the main network, thinking that was the issue, but it didn't really seem to help. It could have been a problem on his end though, because my connections to the other two guys were perfect.

Um, yeah. Otherwise it's worked like a dream.

I'm finally getting close to finishing the full writeup on this I promised back in May, so expect that on https://davidyat.es in the coming days. Some of my friends and colleagues have also set this up, so I'm getting their input to try to make a really comprehensive guide and reference point.

And here it is: https://davidyat.es/2016/09/08/gpu-passthrough/
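On the hugepages point, the arithmetic is simple but easy to get wrong in the over-reserving direction. A rough sketch, assuming the default 2 MiB hugepage size on x86 and a hypothetical 8 GiB VM:

```shell
# Hugepages reserved for a VM are carved out of host RAM up front, so
# reserving too many starves the host, which is what caused my lockups.
# With 2 MiB pages, a VM given 8192 MiB of RAM needs:
vm_ram_mib=8192
hugepage_size_mib=2
pages=$((vm_ram_mib / hugepage_size_mib))
echo "$pages"   # 4096
# That reservation would be applied (at runtime, hypothetically) with:
#   sudo sysctl vm.nr_hugepages=4096
```

Leaving a comfortable margin of ordinary RAM for the host on top of this is the lesson I learned the hard way.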
Thanks for all the work put into this. I have flipped through the documentation as well: again, super. However, before jumping down the rabbit hole I would like to know how 'clonable' this is. For instance, if I get a different type of computer or a different video card, do I have to do it all by hand from scratch, or can I somehow clone the setup and just change the drivers for the video card/motherboard? Basically what I am getting at is deployment management: being able to migrate to different hardware easily, get a second machine or a new one, get my friends onto this, etc.
@arlesterc: As long as your hardware supports passthrough, everything should be reasonably straightforward. Everyone I know who's done this has followed basically the same procedure, and the many online guides seem to largely concur on most of the setup.
If you were to upgrade your GFX card, I imagine all you'd have to do would be to change some IDs in your pci-stub loading and VM setup. If you moved from Nvidia to ATI or vice versa, you'd have to change a couple more details, but most everything would remain the same.

For setting it up on new PCs: IOMMU group numbers and device IDs will probably be different on different machines, and obviously there's a bunch of differences between AMD and Intel CPUs and between Nvidia and ATI cards. Much of this could likely be handled with a bash script. I haven't actually set this up on multiple PCs myself, so I can't say for certain how easy it would be, but I think it's doable: passthrough is much easier than it was a couple of years ago, when you had to compile custom kernels.

I need to do this sort of setup.
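To give a flavour of what such a bash script might do, here's a minimal sketch of the machine-specific step: pulling the vendor:device IDs for the graphics card out of `lspci -nn` output. The sample line is hardcoded and hypothetical, standing in for a real machine's output.

```shell
# Extract vendor:device IDs for VGA controllers from `lspci -nn`-style
# output -- the per-machine detail a portable setup script would need.
# A hypothetical sample line stands in for real output here.
sample='01:00.0 VGA compatible controller [0300]: AMD Radeon HD 7870 [1002:6810]'
ids=$(printf '%s\n' "$sample" \
  | grep 'VGA' \
  | grep -o '\[[0-9a-f]\{4\}:[0-9a-f]\{4\}\]' \
  | tr -d '[]')
echo "$ids"   # 1002:6810
# On a real machine you'd pipe in `lspci -nn` instead of $sample, and
# write the result into pci-stub.ids=... on the kernel command line.
```

The class code `[0300]` survives the filter because it lacks the `xxxx:xxxx` shape, so only the vendor:device pair comes out; the rest of a cloning script would be substituting that pair into the boot config and VM definition.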