> I'd be interested in how your desktop alternative works with the VR displays.
"your desktop alternative" -> collection of crufty exploratory kludges. :)
I'm running custom stacks on Linux and X: browser as compositor, with three.js and React. Tracking comes from a low-level Vive lighthouse driver[1], or laptop webcam optical tracking, or nothing. Camera passthrough AR.
Most recently, I'd just plug a Lenovo WMR HMD with a duct-taped-on camera into an old laptop with integrated graphics; run a browser full screen on the HMD; run xpra to put emacs and xterm on laptop and HMD; with the camera AR in background; and sometimes track head motion using the laptop webcam and yellow duct-tape HMD marks. Boring and crufty. Though emacs looks kind of "hip" with text changing depth.
> Do you edit text, code, email and surf
Desktop is just xpra[2]. A remote desktop that pulls in individual X programs. No "plug in a null display device" Microsoft silliness. Text is OK. Video is low fps (though I've not tried to improve it). I'd not want to surf in such a small window - think a 900 px square.
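For the curious, a minimal sketch of the kind of xpra session described - the display number, hostname, and started programs here are placeholders, not the actual setup:

```shell
# On the machine running the programs: start an xpra session that
# launches emacs and an xterm as individual, forwardable X windows.
xpra start :100 --start-child=emacs --start-child=xterm

# On the laptop driving the HMD: attach over ssh; each program shows up
# as its own window - no virtual "desktop" display device required.
xpra attach ssh:user@host:100
```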
Because of resolution (and budget) limits, the UI is more 2D-on-sphere than 3D. Just picture normal desktop windows. Vive resolution was unusably low and PenTile. LCD Windows MR resolution is tolerable with individual pixel control (thus the 2D). 3D might be OK with subpixel rendering, but I don't yet have a laptop with a dGPU, so I've been putting it off. Given 2D, I'm still just using xpra's kb/mouse handling. Bits of a React-and-three.js approach. For hand tracking, Leap Motion is unusable, my fiducial-marker finger tracking is currently too slow, and gloves with IMU fingers still run $4k+. So I'm basically just doing exploratory spikes, waiting on late-2018 hardware availability and prices.
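The "2D on sphere" placement above boils down to a little spherical trigonometry. A sketch of a hypothetical helper (not from the actual stack) that maps a window's angular position to a point on a viewer-centered sphere, using the three.js convention of -Z as straight ahead; a three.js scene would consume the result via something like `mesh.position.set(p.x, p.y, p.z)`:

```javascript
// Convert (azimuth, elevation) in radians to a point on a sphere of the
// given radius, with -Z as "straight ahead" and +Y as up.
function windowPosition(radius, azimuth, elevation) {
  return {
    x: radius * Math.cos(elevation) * Math.sin(azimuth),
    y: radius * Math.sin(elevation),
    z: -radius * Math.cos(elevation) * Math.cos(azimuth),
  };
}

// An emacs window dead ahead, an xterm 30 degrees to the right:
const emacs = windowPosition(2, 0, 0); // { x: 0, y: 0, z: -2 }
const xterm = windowPosition(2, Math.PI / 6, 0);
```

Each window stays a flat plane billboarded toward the viewer, which is what lets plain xpra keyboard/mouse handling keep working.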
> desk chair?
Desk chair, conference room, classroom, subway. All sitting with laptop keyboard. I've explored room-scale UI before, but for this, just emacs and xterm in space. Not even in space, just in your face. I'm tired of burning life fighting ephemeral display and input limits. I'd like to do software dev and collaborative compilation and category theoretic type systems in 3D. But I'm going to wait for the needed hardware, rather than struggling against the glacial pace of tech progress.
Wow, thanks for this response. It sounds like you should go for a PhD in VR/AR non-game HCI. I like that you added AR or something like it. Do the passthrough cameras do edge detection with motion-compensated infill (more pixels come through the closer/faster they move), so one doesn't feel cut off from the outside world?
Too bad the IMU finger gloves are so expensive; it doesn't make sense, since the sensors are only $9 each at qty 1. Using gloves like these [0] along with an HMD that had integrated wide-field cameras could give great finger-tracking support.
Just simple passthrough video for now - no vision. And mono, in a bid to reduce eye strain from long hours of use.
> do the passthrough cameras do edge detection with motion compensated infill (more pixels come through the closer/faster they move) so one doesn't feel cut off from the outside world?
Do you mean making "big vision-occluding windows" more transparent when the world behind them changes and/or head spins?
> expensive, doesn't make sense
Existing small high-end market; immature big low-end market; limited ability to do market segmentation; no HomebrewComputerClub-like market-bypass; lack of incentives to avoid collateral damage to rate of progress.
My hope is Xmas 2018 will see both finger and eye tracking get products priced for consumers.
> [color] glove
Yeah. Sigh. They did a startup... and were bought by Oculus. I don't know of an open source release. So here we are, literally a decade later, and you can't easily get one.
It's been interesting to watch VR's widespread innovate-startup-acquisition-unavailable dynamic, and contrast it with, say, the ferment of the Homebrew Computer Club. It attracts investment, but it devastates the market ecosystem and cripples research.
"your desktop alternative" -> collection of crufty exploratory kludges. :)
I'm running custom stacks on linux and X. Browser as compositor, and three.js. React. Tracking from low-level Vive lighthouse driver[1], or laptop webcam optical, or none. Camera passthrough AR.
Most recently, I'd just plug a Lenovo WMR HMD with a duct-taped-on camera into an old laptop with integrated graphics; run a browser full screen on the HMD; run xpra to put emacs and xterm on laptop and HMD; with the camera AR in background; and sometimes track head motion using the laptop webcam and yellow duct-tape HMD marks. Boring and crufty. Though emacs looks kind of "hip" with text changing depth.
> Do you edit text, code, email and surf
Desktop is just xpra[2]. A remote desktop that pulls in individual X programs. No "plug in a null display device" Microsoft silliness. Text is ok. Video is low fps (though I've not tried to improve it). I'd not want to surf in a such small window - think a 900 px square.
Because of resolution (and budget) limits, the UI is more 2D on sphere than 3D. Just picture normal desktop windows. Vive resolution was unusably low and PenTile. LCD Windows MR resolution is tolerable with individual pixel control (thus the 2D). 3D might be ok with subpixel rendering, but I don't yet have a laptop with a dGPU, so I've been putting it off. Given 2D, I'm still just using xpra's kb/mouse handling. Bits of a React-and-three.js approach. For hand tracking, leapmotion is unusable, my finger tracking with fiducial markers is currently too slow, and gloves with IMU fingers are still like $4k+. So I'm basically just doing exploratory spikes, waiting on late-2018 hardware availability and prices.
> desk chair?
Desk chair, conference room, classroom, subway. All sitting with laptop keyboard. I've explored room-scale UI before, but for this, just emacs and xterm in space. Not even in space, just in your face. I'm tired of burning life fighting ephemeral display and input limits. I'd like to do software dev and collaborative compilation and category theoretic type systems in 3D. But I'm going to wait for the needed hardware, rather than struggling against the glacial pace of tech progress.
[1] https://github.com/mncharity/node-webvr-alt-stack [2] https://xpra.org/