(this is a guest post by Tenox)
I spent a day evaluating the NVIDIA GRID K1 card, a GPU for high-end, graphics-intensive desktop virtualization (VDI) deployments, otherwise known as vGPU. What does that actually mean?
As you can see on the stock photo, the card doesn't have VGA, HDMI, DVI, DP or any video output port whatsoever. The output happens purely through a Remote Desktop Protocol (RDP) extension called RemoteFX. On VMware and Citrix it works a little differently, but I will be covering a Windows / Hyper-V installation only.
The GRID K1 is somewhat similar to a Quadro card, so the driver is not your usual GeForce package, but the experience is quite similar nevertheless. Upon installation you see 4 different physical GPUs in Device Manager:
This works similarly to having multiple CPU cores that show up as separate processors in the OS. Here is the first fun fact: you can't actually use any of these directly, as they simply have no output ports and can't display any graphics… Instead, you have to use Hyper-V with the RemoteFX extension:
Then for each guest machine, you add a RemoteFX graphics card as hardware:
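The same step can also be scripted from PowerShell on the host with the Hyper-V module. A minimal sketch, assuming a VM named "Win7-VDI" (the name is mine, purely for illustration); the VM must be powered off when the adapter is added:

```shell
# List the physical GPUs the host considers usable for RemoteFX
Get-VMRemoteFXPhysicalVideoAdapter

# Attach a RemoteFX 3D video adapter to the guest
Add-VMRemoteFx3dVideoAdapter -VMName "Win7-VDI"

# Optionally tune monitor count and maximum resolution for the adapter
Set-VMRemoteFx3dVideoAdapter -VMName "Win7-VDI" -MonitorCount 1 -MaximumResolution 1920x1200
```

Scripting it this way is handy when you're rolling out more than a handful of VDI guests at once.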
In order to use RemoteFX you need to Remote Desktop (RDP) into the guest machine. The feature is fortunately available since RDP version 7.1, so even Windows 7 can use it. However, only Enterprise editions of Windows support it.
Inside the guest VM you see a virtual RemoteFX Display Adapter in Device Manager:
And as you can see, Direct3D is available and enabled. Note that this is over RDP to a VM! The VM's console curiously displays the following message:
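If you'd rather confirm the adapter from a prompt than eyeball Device Manager, a quick WMI query inside the guest works too. A sketch of one way to check (run over the RDP session; Get-WmiObject is used since a stock Windows 7 guest may only have PowerShell 2.0):

```shell
# Inside the guest: list display adapters as Windows sees them
Get-WmiObject Win32_VideoController | Select-Object Name, DriverVersion, Status
```

On a working setup the RemoteFX display adapter should show up in that list.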
It's hard to show in static screenshots, but I have to say that the RemoteFX user experience is noticeably better compared to regular RDP. Everything works smoother and faster; scrolling pages and moving windows is a snap. You can play videos, YouTube, etc. But I was more interested in the real use case, which is high-end 3D applications. So I proceeded to install Steam…
Yes! This is GTA V running over Remote Desktop in a VM!
The frame rate sucks quite considerably, even in safe mode, but it was playable and quite responsive (no lag). I actually spent a couple of hours going through it and, except for the low FPS, had no issues.
I also spun up this Wyse Thin Client terminal:
disconnected my RDP session and reconnected from the terminal… poof, the game was still going:
I even got sound out of the little thing.
I suspect that the low FPS has more to do with the small amount of GPU horsepower and vRAM assigned than with remote viewing or NVIDIA itself. Unfortunately, in Hyper-V it's impossible to control or fine-tune the assignment of GPU resources to a particular VM beyond a simple on/off switch. The K1 card supports 32 users, so I was only getting 1/32nd of the power and RAM. Perhaps I could have spawned 32 VMs with GTA. Or Call of Duty multiplayer…
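For a rough sense of the per-user slice, here is the back-of-the-envelope math, assuming the K1's published specs (4 Kepler GPUs with 4 GB of DDR3 each; these figures are from NVIDIA's spec sheet, not measured here):

```shell
# Assumed specs: 4 GPUs x 4096 MB each, shared by up to 32 users
$totalMB   = 4 * 4096            # 16384 MB on the card
$perUserMB = $totalMB / 32       # even split at full load
"{0} MB of video memory per user" -f $perUserMB
```

Around half a gigabyte of video memory per user goes a long way toward explaining the frame rate.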
In closing, I have to conclude that this is a rather interesting technology. According to NVIDIA, the cloud is the future of gaming. In fact, they already have a cloud game streaming service:
http://shield.nvidia.com/game-streaming-with-geforce-now
Beware of campers, who will now be able to disconnect from online games for months at a time.
Thanks for this post, this sounds quite interesting. Presumably this would also work using Server 2008 R2 as a host and Windows 7 VMs wouldn’t it?
Yes, provided that the Windows 7 VM is the Enterprise edition. This technology has been around for quite a few years, and the K1 card is in fact old by today's standards. It's just that I never had a chance to play with it. The new Tesla M60 is supposed to be way better.
Wow, really cool, but $3,000 USD for a virtual video card is a bit much… But like everything else in this field in 5 years this should be commodity.
Supposedly it's possible to mod certain GeForce cards with certain model chips to become Quadros, NVSs, or in this case GRIDs. It would work well enough for testing ("don't use this in a production environment"™).
http://www.eevblog.com/forum/chat/hacking-nvidia-cards-into-their-professional-counterparts/
This EEVblog site is full of electronic wisdom 🙂
I think you can use almost any DirectX 11 class card with RemoteFX. There’s an outdated list at http://blogs.msdn.com/b/rds/archive/2013/11/05/gpu-requirements-for-remotefx-on-windows-server-2012-r2.aspx
I've also toyed with PCIe passthrough to VMs in VMware. As long as you're using a Quadro or AMD card it seemed to work well (though obviously limited to one card per VM).
But the GRIDs have the memory and hypervisor (namely VMware) support to scale this properly.
That's very interesting. I will try to see if I can use a generic GeForce or Quadro with RemoteFX.
reminds me of this http://lg.io/2015/04/12/run-your-own-high-end-cloud-gaming-service-on-ec2.html
Oh wow, I didn't know you could get a GPU instance on EC2. I will try this right away. I would probably still prefer to do RemoteFX rather than VPN and Steam streaming, but I can try both.
What server did you install this in? I have been trying to get the K1 to work on our server but when I get to the RemoteFX settings, the computer doesn’t see the K1.