I wrote a simple-sounding script to record a PulseAudio null sink on one device and send the audio to another. I tried a bunch of different ways:
parec | opusenc | nc
parec | ffmpeg | nc
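Fleshed out, the opusenc variant looked roughly like this (the sink name, host, port and rates are placeholders, and depending on your netcat the listen syntax may be -l 9000 rather than -l -p 9000):

    # sender: capture the null sink's monitor as raw PCM, encode to Opus, ship over TCP
    parec --device=null.monitor --format=s16le --rate=48000 --channels=2 \
      | opusenc --raw --raw-rate 48000 --raw-chan 2 - - \
      | nc receiver.local 9000

    # receiver: accept the stream, decode (opusdec emits WAV), play it
    nc -l -p 9000 | opusdec - - | paplay /dev/stdin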
Then I discovered that ffmpeg can record directly from PulseAudio and serve it over RTSP. But after a few days of experimenting I realised I could still fuck up the command and be left scratching my head, not understanding why it didn't work. Positional arguments in multiple places affect different things depending on what precedes them, so you're forever trying to figure out which part of the command line applies to the input, the cloned streams, the output(s), and so on. ffmpeg is a great tool, but seriously hard to use.
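For reference, the attempt was something along these lines (sink name and URL are placeholders, and it assumes an RTSP server such as mediamtx is already listening to accept the publish):

    # options before -i configure the pulse input; options after it configure the RTSP output
    ffmpeg -f pulse -i null.monitor \
           -c:a libopus -b:a 128k \
           -f rtsp rtsp://server.local:8554/audio

Move a flag to the wrong side of -i and it quietly applies to a different part of the pipeline, which is exactly the failure mode I kept hitting. So I ended up not only dropping ffmpeg for recording and sending over the network, I threw it out completely and ended up with: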
parec | flac | mbuffer
... which provides the best quality and consistency, and the lowest latency. But the best part is that the actual command line isn't much more complex than the pseudocode above.
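Concretely, the working version is close to this (device, rates, host and port are placeholders; flac encodes stdin to stdout when given - as the input, provided you describe the raw sample format, and mbuffer can ship its input straight over TCP):

    # sender: capture raw PCM, compress losslessly with flac, buffer and send
    parec --device=null.monitor --format=s16le --rate=44100 --channels=2 \
      | flac --silent --force-raw-format --endian=little --sign=signed \
             --channels=2 --bps=16 --sample-rate=44100 - \
      | mbuffer -O receiver.local:9000

    # receiver: buffer the incoming stream, decode back to raw PCM, play it
    mbuffer -I 9000 \
      | flac --silent --decode --force-raw-format --endian=little --sign=signed - \
      | pacat --format=s16le --rate=44100 --channels=2

mbuffer's buffering is what keeps the latency consistent: it absorbs network jitter instead of letting the pipeline stall.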
> InterPlanetary File System (IPFS) protocol support. One can access files stored on the IPFS network through so-called gateways. These are http(s) endpoints. This protocol wraps the IPFS native protocols (ipfs:// and ipns://) to be sent to such a gateway. Users can (and should) host their own node which means this protocol will use one’s local gateway to access files on the IPFS network. If a user doesn’t have a node of their own then the public gateway https://dweb.link is used by default.
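If I'm reading that right, usage is something like the following (the CID is a placeholder; without the -gateway override, ffmpeg should go through a local node's gateway if one exists, falling back to the public one):

    # pull a file off IPFS; the ipfs:// URL is rewritten into an HTTP(S) gateway request
    ffmpeg -i ipfs://<cid> -c copy out.mkv

    # or force a specific gateway
    ffmpeg -gateway https://dweb.link/ -i ipfs://<cid> -c copy out.mkv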
And for the avoidance of doubt: yes, ffmpeg is used on multiple planets. Not many things are, and many a framework-du-jour at launch is obsolete by the time the probe actually lands on Mars, let alone by end of mission.
After almost 20 years of Linux I couldn't type a correct FFmpeg command with a gun to my head.