Show HN: Weylus – Use your tablet as graphic tablet/touch screen on Linux (github.com/h-m-h)
176 points by HMH on June 7, 2020 | 42 comments



Hi HN, I created this program out of my personal need to be able to do some handwriting on my Linux system. I know I could just have bought a graphics tablet, but as I already have a tablet with the necessary hardware for note taking lying around, I was unwilling to do so. After quite some fruitless searching for a solution to this problem, Weylus (= Web + Stylus) was born.

It is written mostly in Rust, some C and a tiny bit of TypeScript, and works by serving a webpage that uses browser APIs to capture pen, touch and mouse events from the tablet. Additionally, the desktop or a selected window is captured and encoded into a video stream via ffmpeg. This stream is played in the browser using the Media Source Extensions API.
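For illustration, here is a minimal TypeScript sketch of what browser-side input capture along these lines can look like; the WebSocket path and the JSON message format below are made up for the example and are not Weylus' actual protocol:

    // Forward stylus/touch/mouse input from a full-screen element to the server.
    // Hypothetical endpoint and message format, for illustration only.
    const socket = new WebSocket(`ws://${location.host}/input`);
    const surface = document.getElementById("screen") as HTMLElement;
    surface.style.touchAction = "none"; // keep the browser from scrolling/zooming

    function forward(ev: PointerEvent): void {
      if (socket.readyState !== WebSocket.OPEN) return;
      const rect = surface.getBoundingClientRect();
      socket.send(JSON.stringify({
        event: ev.type,                    // "pointerdown" | "pointermove" | "pointerup"
        pointerType: ev.pointerType,       // "pen", "touch" or "mouse"
        x: (ev.clientX - rect.left) / rect.width,   // normalized to 0..1
        y: (ev.clientY - rect.top) / rect.height,
        pressure: ev.pressure,             // stylus pressure, 0..1
        tiltX: ev.tiltX,
        tiltY: ev.tiltY,
      }));
      ev.preventDefault();
    }

    surface.addEventListener("pointerdown", forward);
    surface.addEventListener("pointermove", forward);
    surface.addEventListener("pointerup", forward);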

This should also run on macOS and Windows but with very limited features.


I have to say it is quite cool how easy it is to get set up. However, for me the screen updates on my tablet were too laggy. I wonder how much it would improve if the stream had a lower resolution and limited colors.


Thanks for giving it a try! If you are on Linux you could try selecting a window and check whether making it smaller still results in lag.

Unfortunately the ability to play back video via the MSE API with low latency seems to vary quite a bit from device to device. You could also try an older release (v5.*). Those work by repeatedly sending PNGs and updating an image tag; the GIF in the Readme is actually from that version. If this works better for you, I will probably add it back as a fallback in case video playback is too slow.
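The image-tag fallback amounts to something like the TypeScript sketch below; the /screen.png endpoint and the cache-busting query parameter are assumptions for illustration, not the actual v5 API:

    // Repeatedly fetch the latest screen capture and swap it into an <img>.
    const img = document.getElementById("screen") as HTMLImageElement;

    async function refresh(): Promise<void> {
      // Cache-bust so the browser always fetches a fresh capture.
      const resp = await fetch(`/screen.png?t=${Date.now()}`);
      const blob = await resp.blob();
      const url = URL.createObjectURL(blob);
      img.onload = () => {
        URL.revokeObjectURL(url);                         // release the displayed frame
        requestAnimationFrame(() => { void refresh(); }); // then fetch the next one
      };
      img.src = url;
    }

    void refresh();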


I'm guessing it's slow because capturing the display and encoding it takes a while?

I wonder if you could integrate this HTML5 VNC client: https://github.com/novnc/noVNC , since the complicated stuff above will be taken care of by the VNC server.

I feel like the boss in Office Space saying "if you could just go ahead and...".


> I'm guessing it's slow because capturing the display and encoding it takes a while?

This is a possibility, but encoding is already pretty optimized (ffmpeg and libx264 with the ultrafast preset) and is faster than my previous approach of just streaming PNGs. And streaming PNGs was not that slow either, even on a rather old laptop. That's why I suspect there's something wrong on the tablet.

Regarding noVNC: what latencies do they achieve? I am not a webdev, so I guess they might have some more tricks up their sleeves; I will probably have a look. As far as I can tell noVNC does not support things like multitouch and pressure sensitivity, so that rules out just using it (at least for my use case).


Have you thought about having a companion mobile app instead of using the browser? If so, what made you go for a web based solution?


Yes I have. I decided to go with using a browser for now as I simply do not have the time to support multiple mobile apps, and I am not too inclined to pay Apple $99 for a developer account. Personally I need this to work on my iPad, so you could consider using the browser as my way to break free of Apple's walled garden :)

If someone wants to build a mobile app I am more than happy to provide other means of connecting to Weylus than WebSockets. This might also bring the latency down, as UDP streaming and more control over video decoding and playback would then be possible.


Hi, -H-M-H- . Total newbie here. I tried installing the deb package. I set up uinput, but I still can't make it work. I'm using Xubuntu along with an iPad. Do I need to change the "bind address" in Weylus? What could I be missing here? Thx


If you're on macOS with an iPad, this is exactly what the 'Sidecar' feature does. It needs recent hardware though, since it uses the video encoder built into recent Intel CPUs to make it less laggy.


I've used Sidecar, and I thought it was more about extending your screen to the iPad, with some basic input controls on the device as well.

It looks like this is more of a mirroring setup where you can use the tablet for touch-enabled input.


Sidecar works great with touch-enabled input as well—I personally use it for Photoshop/Illustrator projects, and it has completely replaced a traditional tablet (one less thing to carry).

Regardless, this project looks incredibly cool. The cross-OS support is a killer feature.


You need to sign in with the same Apple ID, which is such a joke. I have an iPad that I created a unique Apple ID for and I can't use it with my maxed out MacBook Pro without losing the purchases. Truly infuriating


Why did you do that?


Because my personal devices are personal devices, and I got the iPad in order to do research for a business I'm building on the side. I didn't think I wouldn't be able to use both seamlessly just because of an arbitrary decision on Apple corporate's side


I'm so mad about Sidecar. Works great, unless you use the wrong VPN (in my case, required for work). Why having a VPN affects a connection over USB, I have no idea.

Anyway, got a Luna Display and couldn't be happier.


Did you know about GfxTablet* (limited to Linux)? Any thoughts?

* https://github.com/rfc2822/GfxTablet


Yes I do! That is where I got the idea to use uinput from. But as it is no longer maintained and does not mirror the screen, I decided to create Weylus. Something that is different in GfxTablet is that inputs are not directly mapped to screen coordinates; instead it works like a regular touchpad.


This is a pretty damn cool idea. Do you only use this at home? I'm thinking about how I would use this when I'm not home, I guess the wifi hotspot on my tablet would suffice?


I have already used it the other way around, with a hotspot created on my laptop. As long as you can connect your browser with decent speed it should work just fine.


VNC would be the ideal solution to this on a fast enough network... if the VNC client could send pen input to the host and the host could see it as a proper HID device.


> ...if the VNC client could send pen input to the host and the host could see it as a proper HID device

As far as I can tell this is not possible: https://tools.ietf.org/html/rfc6143#section-7.5.5

Also, selecting a specific window, which is possible in Weylus, does not seem to work with VNC.


Thank you!

This is awesome.

This is important enough that I'd donate right now, if (1) there were a donate button and (2) it'd fund further development to the point where this was production-ready.

Bringing this from proof-of-concept to mass-market would be a huge amount of work:

1) From the video, latency would need to go way down.

2) Pressure support is important, if not critical.

3) Packaging. To be impactful, this would want to be distributed in Apple's store, and in Debian/Ubuntu repositories (or at least a .deb). Paid on the Apple side is fine (and not contrary to open source -- I'm paying for convenience, and that's a payment I'd gladly make).

Next step would be to go from mirroring to second display. This is less bad than it might seem on the surface; it doesn't involve a whole new X driver, so much as outputting from the current one to a phantom device and mirroring that phantom display (the same thing as if I unplugged one of my monitors: the HDMI signal keeps going out, and I could continue to mirror it).

One possible way to get there is to get the architecture to a place where others can do that work (e.g. leave appropriate places for people to plug in, and appropriate breadcrumbs where others can finish the work).

Disclaimers: Of course, one donation isn't enough to fund further development. I'm not rich, so we're not talking about a massive donation. Aside from that, by the time you set something up, I'll likely be long gone from this discussion (you don't have my contact info to follow up). But that's also not an abstract comment; I donated to Digimend before, and this would address part of the same major gap. Take the offer more as an indication of the massive value this would provide to at least one person.


Thanks for your feedback! I honestly did not expect Weylus to get this much attention. Unfortunately my time is limited, so I am not too comfortable taking donations.

> 1) From the video, latency would need to go way down.

On the encoding side the only thing I can think of to improve latency is to try hardware encoding if available; something like NVENC might bring it down. That leaves optimizing the network protocol: something based on UDP is likely better suited than WebSockets. And finally, mobile apps probably leave more room for improving playback latency (and would enable protocols other than WebSockets).
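For reference, browser playback via Media Source Extensions roughly follows the sketch below; the WebSocket path and the codec string are assumptions, and the "stay near the live edge" seek is one common trick for cutting accumulated playback delay, not necessarily what Weylus does today:

    // Feed video segments received over a WebSocket into an MSE SourceBuffer.
    const video = document.getElementById("video") as HTMLVideoElement;
    const mediaSource = new MediaSource();
    video.src = URL.createObjectURL(mediaSource);

    mediaSource.addEventListener("sourceopen", () => {
      // The codec string must match what the server encodes (H.264 baseline here).
      const buffer = mediaSource.addSourceBuffer('video/mp4; codecs="avc1.42E01E"');
      const ws = new WebSocket(`ws://${location.host}/video`);
      ws.binaryType = "arraybuffer";

      ws.onmessage = (msg) => {
        // A real client would queue segments while the buffer is still updating.
        if (!buffer.updating) {
          buffer.appendBuffer(msg.data as ArrayBuffer); // fragmented MP4 segments
        }
        // Stay close to the live edge: if playback has fallen behind the newest
        // buffered data, jump forward to drop the accumulated delay.
        const ranges = video.buffered;
        if (ranges.length > 0 && ranges.end(ranges.length - 1) - video.currentTime > 0.5) {
          video.currentTime = ranges.end(ranges.length - 1) - 0.1;
        }
      };
    });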

> 2) Pressure support is important, if not critical.

It is pretty straightforward to add new input methods. I will add some instructions to the Readme; maybe someone will step forward and add support for macOS and Windows.

> 3) Packaging.

There already is a .deb, although I still need to fix some dependency issue [1].

> Next step would be to go from mirroring to second display.

Apparently this is almost possible right now (at least with X11), see [2].

> breadcrumbs

I think this is the way to go, I will add some more info on how to contribute soon.

[1]: https://github.com/H-M-H/Weylus/issues/6 [2]: https://github.com/H-M-H/Weylus/issues/5


Thank you!

On latency, the thing other tools used to really bring it down is a wired USB connection.

And yes: This is something in which there will be an almost desperate amount of interest in some communities. I think the bottleneck on how much attention it receives is, quite frankly, whether individuals in those communities find out about it.

I'm committed at about 160% capacity for the rest of the fiscal year, but if this were a more normal time, I'd probably hack on this myself.


You can do this with Steam Link, which is available on iOS and Android devices. You do not need any Steam hardware.

Just install the Steam Link app, and then minimize the Steam app. Now you have access to your computer from your tablet.


That doesn't sound like it's open source, which is a problem for many reasons.

Also, there are many devices (including some smartphones) where the web will work but that app might not.


I agree it might not be ideal that it is not open source.

But it is by far the easiest solution to install at this moment, with predictable results.


Works great for me as is. The fact that no Android app is necessary is so great. The improvements I would see:

- show a QR code when running the server instead of displaying the URL; much more practical

- add a "relative cursor mode" that behaves like a usual trackpad, where you move the cursor from its current position. I am not interested in seeing my display on the phone; I just want to use it as a trackpad while looking at my display.


This is a brilliant hack!

I researched the possibilities for something similar a few months back, but my idea was far, far simpler: just avoiding the need for an app by scribbling things down in a web app. Also, unlike you, I didn't finish.

Hats off to you; for me this is one of those twice-a-year moments on HN where I see something utterly brilliant that no one else thought of!


Thanks for making it - this seems like a nice solution for me, as I just bought a stronger laptop again after dailying a Dell 5285 for a while (almost perfect, but screen and power are just not enough and I barely ended up using the pen for notes anyway).

This seems almost perfect for such a setup. You can even get a graphics tablet for little money, since those 8-10" Intel Atom Windows tablets from 2013 or so are super cheap now! Lag doesn't look so bad. Wayland support will be a huge headache, but it is not a huge problem yet as long as you can run many apps via XWayland.


This is excellent. I've been searching for ways to use modern hardware with my Xournal/Xournal++ workflows, and this is a promising approach. I am, however, a little concerned about the robustness of video streaming.

In an ideal world, Xournal++ would have a modern web-based front-end as opposed to GTK (which currently means it's de-facto Linux-only until GTK resolves issues with Windows pen/inking APIs).


> In an ideal world, Xournal++ would have a modern web-based front-end as opposed to GTK

Weirdly, GTK can do web UI, although I don't know what gotchas may exist. https://developer.gnome.org/gtk3/stable/gtk-broadway.html


Honest question: Why is a web-based front end preferable for a desktop application?

Thinking about it, the only advantage would be if one anticipates a multi-head scenario—like drawing to a common canvas from a mobile OS and a desktop OS.

Would this look any different if we could run a desktop OS like GNU/Linux on all of our computing devices?


Yes, I am thinking mostly about the ease of porting the app to different platforms. This is particularly relevant to Xournal/Xournal++ as they are terrific tools, but the hardware that these tools are optimized for typically does not run GNU/Linux. If you think that finding a good Linux laptop is a challenge, try to find a good Linux laptop with a stylus. And there's nothing in the form factor of an iPad, for example.


I have been using Weylus and Xournal++ a lot by now, and video streaming is pretty stable (at least on my machine :P). In case the connection does fail, a simple page reload will restart the stream.


Love the concept but the lag seems terrible. This is what kills any stylus for me. I'm so used to the instantaneity of paper that the smallest millisecond of delay between action and effect makes my brain hate the whole process.

As a Linux user, the iPad Pro writing feedback is probably the only thing I envy Apple's users for. What they did is black magic.


I think it depends on the stylus and the software. This drove me batty for a long time. I now have a cheap AliExpress drawing tablet (no display), Digimend, and MyPaint under Ubuntu.

It does as well as pencil and paper.

It's not as good as a fancy pen and nice paper, mind you, but it's good enough.

My major problem is straight lines. I haven't got those down yet (albeit with just a few hours' practice). But writing has no lag, and comes out fine.


I see you used X. From what I remember when I looked at this years ago, X was really bad at highlighting dirty pixels. It couldn't tell you where your pixels had changed, so you had to flush the entire screen out over the network (VNC) every time something changed. This inefficiency made it unworkable in some cases.


Exactly what I needed for doing some whiteboarding during a teleconference. Thanks!


From the name alone I thought this would have support for Wayland. Apparently that is not the case. Nice program though ;)


Fantastic job, I don't know why this didn't already exist, but I'm glad it does now.


Cool project.

Need to see if this works with PostmarketOS, and then I'm off to buy an old tablet.



