[dupe] The growing image-processor unpleasantness (lwn.net)
89 points by pantalaimon on Sept 2, 2022 | 10 comments



A previous post of the same article (via a different URL) from 15 days ago:

https://news.ycombinator.com/item?id=32516826


IMHO the key comment from that post is this from kkielhofner:

"I worked on a embedded hardware application using Linux and "cameras". We (like many people) were using sensors from OmniVision[0] and there was a dedicated contractor who's entire area of focus (and life's work) was getting photons from the sensor of the camera into something usable by software. In our case that was an unholy combination of vendor software, custom code derived from very obscure NDA-only datasheets, and a bunch of custom plugins for gstreamer that could actually make use of the sensor, interfaces, and hardware encoding silicon.

It was one of the more eye-opening "WOW, so that's how the sausage is made" experiences of my life."

This matches my experience: bringing up a new MIPI camera on any of the big-name SoCs is a nightmare. I've also seen a lot of failures with gstreamer, where for one reason or another gstreamer doesn't work, so the software just uses the image sensor driver directly (usually V4L2). But for many SoCs this means you get no IPU support, since it's only available as a gstreamer plugin.

(try to make a webcam using gstreamer: ffmpeg works better...)
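For concreteness, here's a minimal sketch (mine, not from the thread) of what "using the image sensor driver directly" can look like: open a /dev/videoN node and talk to it through the V4L2 ioctl interface. The device path and requested format are assumptions; on a raw MIPI sensor with no working IPU/ISP path you'll often only be offered Bayer formats at this step, which is exactly where the missing plugin hurts.

    /* Minimal sketch: query a V4L2 capture device directly, bypassing gstreamer.
     * The device path /dev/video0 and the 1280x720 YUYV request are assumptions. */
    #include <stdio.h>
    #include <string.h>
    #include <fcntl.h>
    #include <unistd.h>
    #include <sys/ioctl.h>
    #include <linux/videodev2.h>

    int main(void)
    {
        int fd = open("/dev/video0", O_RDWR);
        if (fd < 0) {
            perror("open /dev/video0");
            return 1;
        }

        struct v4l2_capability cap;
        memset(&cap, 0, sizeof(cap));
        if (ioctl(fd, VIDIOC_QUERYCAP, &cap) < 0) {
            perror("VIDIOC_QUERYCAP");
            close(fd);
            return 1;
        }
        printf("driver: %s, card: %s\n", cap.driver, cap.card);

        /* Ask for a common packed format; a raw sensor node may refuse this
         * and only offer Bayer formats if no ISP/IPU sits in the pipeline. */
        struct v4l2_format fmt;
        memset(&fmt, 0, sizeof(fmt));
        fmt.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
        fmt.fmt.pix.width = 1280;
        fmt.fmt.pix.height = 720;
        fmt.fmt.pix.pixelformat = V4L2_PIX_FMT_YUYV;
        fmt.fmt.pix.field = V4L2_FIELD_NONE;
        if (ioctl(fd, VIDIOC_S_FMT, &fmt) < 0)
            perror("VIDIOC_S_FMT");
        else
            printf("negotiated %ux%u\n", fmt.fmt.pix.width, fmt.fmt.pix.height);

        close(fd);
        return 0;
    }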


Yeah, as someone in this space... what a nightmare it is. Even with off-the-shelf components with standard interfaces like USB3 or Gig-PoE. The supporting documentation, drivers, and software are... well, not so supportive.


i. can't. agree. more.

I've been building sensors for research for 5 years with all kinds of SDKs; even a 30k camera is a nightmare (that one got decent support, though).


Michael Matter at Sanstreak is fighting against the lunacy. If high-speed cameras with solid platforms are ever important to you again, I hope you check out his project.

https://edgertronic.com/


A sheer nightmare, yes: I did this for a research project (3 MIPI modules on a Jetson TX2) and worked 80-hour weeks for 6 months getting the setup working.

Ah, the joy of guessing device tree parameters (support from the sensor vendor was almost nonexistent).


I ran into trouble in this domain just the other day. I was porting a video project from an Intel system that used QuickSync for hardware H.264 decode to a Raspberry Pi 4 using the V4L2 hardware decode API (the Intel system predates the breaking IPU6 change in Alder Lake discussed in the article).

In both cases, the goal is to hardware-accelerate FFmpeg decoding on Linux. This was trivial to accomplish on the system with Intel QuickSync; I still don't have V4L2 working properly on the Pi 4 and have had to wade through several complex issues.
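As an illustration (not the poster's actual code), here's a small libavcodec sketch of the decoder selection involved. "h264_qsv" (QuickSync) and "h264_v4l2m2m" (the V4L2 mem2mem path used on the Pi 4) are stock FFmpeg decoder names, but they are only present if the ffmpeg build enabled them, which is where much of the trial and error tends to come from.

    /* Sketch: pick a hardware H.264 decoder by name with libavcodec, falling
     * back to the software decoder if no hardware path is available. */
    #include <stdio.h>
    #include <libavcodec/avcodec.h>

    static const AVCodec *pick_decoder(void)
    {
        const char *candidates[] = { "h264_qsv", "h264_v4l2m2m", NULL };
        for (int i = 0; candidates[i]; i++) {
            const AVCodec *c = avcodec_find_decoder_by_name(candidates[i]);
            if (c) {
                printf("using hardware decoder: %s\n", c->name);
                return c;
            }
        }
        /* Neither hardware decoder compiled in: use the software one. */
        return avcodec_find_decoder(AV_CODEC_ID_H264);
    }

    int main(void)
    {
        const AVCodec *codec = pick_decoder();
        if (!codec) {
            fprintf(stderr, "no H.264 decoder found\n");
            return 1;
        }
        AVCodecContext *ctx = avcodec_alloc_context3(codec);
        if (!ctx || avcodec_open2(ctx, codec, NULL) < 0) {
            fprintf(stderr, "failed to open %s\n", codec->name);
            return 1;
        }
        printf("opened %s\n", codec->name);
        avcodec_free_context(&ctx);
        return 0;
    }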


Yeah, in my experience getting V4L2 hardware acceleration working on Linux is complicated and requires recompiling a few things. We eventually got ffmpeg hardware acceleration working on Ubuntu + RPi, but it was a lot of trial and error, and I didn't document it very well... I can't imagine this won't come back to bite me...


Does this mean the computer won't be able to record video from the peephole camera when running Linux (or will be able to, but only very poorly)?


Only for narrow definitions of "Linux". As tangentially noted in the article, there are dozens of people working on this problem in the context of ChromiumOS. IPU6 already has support in ChromiumOS, via a binary media driver. If you are ideologically opposed to such drivers, you can expect poor or no support for the latest peripherals. There is a sweet spot for 100% free software peripheral support and that sweet spot varies between 2 and 25 years in the past, depending on the peripheral class in question.



