Indeed, if you search for "ceramic antennas" you'll see that they're already in use and are smaller than equivalent PCB antennas. They rely on a dielectric material with a high ε_r, which means the speed of light (and hence the wavelength) is lower inside it.
Lots of portable devices use them nowadays!
Yes, in the games category you can find some. But for playing the classic full-fledged games I suggest something like https://flashpointarchive.org/ , which is more organized.
I've done a lot of work with barcodes, barcode scanners, and symbologies. Many of the cheap ones absolutely will. They just don't use the tracking laser to get a reflection, since the light from the screen is literally being shone into the sensor. The bars don't have to be perfectly black, just dark enough relative to the "quiet" areas to have recognizable contrast.
I don't get the point of designing and building a 3.3V-to-5V booster instead of just wiring a cable to one of the existing USB-C VBUS 5V pins. Am I missing something?
It's the safe thing to do. If you source power from some other place, you have to worry about accidentally back-powering something by mistake. Granted, you can also manage that just by carefully ruling out the possibility; some people would rather not risk making a mistake here.
I'm not an electronics engineer, but is it safe to shift 3.3V to 5V if the underlying hardware wasn't designed for that? Is there a risk of putting too much strain on the power source?
You've got to stay under the engineered current limits of the provided 3.3V rail, including how much current is reasonable through all the wiring to your boost converter. But chances are good that the mouse transceiver uses much less than the typical USB bus-powered maximum of 500 mA, and even at 75% converter efficiency that worst case is only about 1 A at 3.3V, which doesn't need thick wires or traces, or a big power supply.
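To make that arithmetic concrete, here's a quick back-of-envelope sketch (my numbers: the full 500 mA USB budget as a worst case, and an assumed 75% converter efficiency):

    # Worst-case input draw of the 3.3V-to-5V boost converter.
    # Assumes the dongle pulls the full USB bus-powered budget
    # (it almost certainly draws far less) at 75% efficiency.
    V_OUT, I_OUT = 5.0, 0.5   # 5 V at the 500 mA USB maximum
    V_IN, EFF = 3.3, 0.75     # 3.3 V rail, assumed efficiency

    p_out = V_OUT * I_OUT     # 2.5 W delivered to the dongle
    p_in = p_out / EFF        # ~3.33 W drawn from the 3.3 V rail
    i_in = p_in / V_IN        # ~1.01 A -- the "about 1 amp" figure

    print(f"input current from the 3.3 V rail: {i_in:.2f} A")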
The transceiver probably does use more power than a fingerprint reader, especially if the reader is idle, but likely not enough to worry about.
All that really matters is how much power (watts, i.e. voltage * current) the device draws compared with how much power the laptop was designed to output. Shifting the voltage doesn't really affect anything other than losing some power to conversion inefficiencies.
The computer doesn't know that it has a 3.3v line externally boosted to 5v. It doesn't care about that at all.
It can care about how much power is being used, since the upstream 3.3v power supply -- whatever it may consist of -- is a finite thing.
But power is not the same as voltage. A device running from 5V does not necessarily use any more or less power than one that runs from 3.3V. We don't have enough data to quantify whether power consumption is problematic for this particular instance, but I very strongly suspect that it is not an important concern here.
Finally, computers (and computer-like things) definitely do care about signal voltage. But USB signalling voltage is the same regardless of supply voltage: USB 2 high-speed signalling works between ~0V at the low end and at most 440mV at the high end, and receivers tolerate up to 3.6V for compatibility with previous versions -- by specification. So that's not an issue.
Tl;dr, it's fine. And the author did a fantastic job of executing this hack very, very cleanly.
I think I myself would have taken the easy route and found a good place to run a bodge wire for 5V, and maybe even stripped the Logitech adapter out of its housing for some good old-fashioned soldering fun, but everyone has their own proclivities.
The "strain" would depend on how many milliamps your 5V device draws. I don't know the current consumption of these logitech dongles, but it seems to be adequately low for this hack to work.
If I understand right, the existing USB ports are only USB-C. I'm not sure what implementation the Nano has, but if it supports USB-PD it may be able to output up to 20V if the "main" connected device asks for it.
Fair enough; if it supports being charged through any port, that's reasonable. I would assume the notebook has some other internal 5V regulator anyway, maybe for the embedded controller or some other legacy device.
In the "For a more concrete example" paragraph, since we're assuming that each byte of 8 bits must have 4 zeros and 4 ones, we always know the 8'th bit after only seeing the first 7. Indeed, many times we would know the 7th and 8th bits after seeing the first 6. So this source is not IID (at the bit-by-bit level) and Shannon's result does not apply there.
As other commenters have said, the source could be IID at the byte level. As noted, it would have 8-choose-4 = 70 symbols. And then Shannon's results would apply to the bytes, not the bits.
The article doesn't give enough information to say whether the source they have in mind is or isn't IID at the byte level. But the obvious choice (all 8-choose-4 symbols are equiprobable) does allow us to compute the Shannon entropy, which is of course always:
H = -Σᵢ p(i) log(p(i))
where p(i) is the probability of the i'th symbol, which is always just
p(i) = 1/(8 choose 4) = 1/70
Of course, then
H = log(70) ≈ 6.13 bits per byte (taking log base 2)
So unsurprisingly the Shannon entropy gives the right answer.
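As a quick sanity check of that number (assuming, as above, that all 70 balanced bytes are equiprobable):

    import math

    n_symbols = math.comb(8, 4)        # 70 balanced bytes
    h_per_byte = math.log2(n_symbols)  # ~6.13 bits per byte
    h_per_bit = h_per_byte / 8         # ~0.766 bits per raw bit

    print(f"{n_symbols} symbols, H = {h_per_byte:.3f} bits/byte "
          f"({h_per_bit:.3f} bits per raw bit)")

So under this model the source compresses to about 6.13/8 ≈ 77% of its raw size, which is exactly the gain a bit-level IID model misses.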
Cover and Thomas is really good and intuitive on this; see section 3.2.
The thing that's wild is that, in the asymptotic case, "everything is in the typical set". The OP is kind of riffing on this, taking it very literally!
Whenever you're compressing a file on your computer, the frequencies are known beforehand. Agreed, this does not apply to noisy-channel coding, only to Shannon's source coding theorem in lossless compression. Good point, though I'm not sure whether this would be considered an entropy encoder, as previous values do have some impact. Any more info I should look at for that?
Got it. I was thinking of the locations as being i.i.d. in the sense that you don't have any prior information about them. You and others are saying that adjusting the symbol counts as you go would give the same performance as other algorithms. Looking into that, thanks.
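For what it's worth, here's a minimal sketch of that adaptive-counts idea (my own illustration, using Laplace smoothing; the names are mine):

    import math

    # Adaptive model: estimate p(symbol) from the counts seen so far
    # (starting every count at 1), charging the ideal -log2 p per symbol.
    def adaptive_bits(data, alphabet):
        counts = {s: 1 for s in alphabet}
        total = len(alphabet)
        bits = 0.0
        for s in data:
            bits += -math.log2(counts[s] / total)
            counts[s] += 1
            total += 1
        return bits

    byte = [1, 0, 0, 1, 1, 0, 1, 0]     # one balanced byte
    print(adaptive_bits(byte, [0, 1]))  # ~9.30 bits

That ~9.30 bits is exactly log2(70) + log2(9): the counts-known optimum plus a small overhead for learning the counts, which vanishes per symbol as the message grows.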
No-name phones end up using chipsets/radios from various manufacturers: Unisoc, Broadcom, MediaTek, Rockchip, ...
They can just grab a reference schematic and PCB, tune it a bit, and get it manufactured. They all use mostly the same interfaces and protocols, like MIPI DSI for LCDs, so they have many almost-drop-in alternatives to choose from depending on availability.
For printheads, it's likely that there are no common standards; each manufacturer has its own head-cartridge interface, so it's probably more expensive to remain open to alternatives, reducing the potential profits.
Well, that's on me for saying "smartphone components" when I really wanted to emphasize the screens. Can smartphone screens be made in a significantly less expensive factory than print heads? Why or why not?
Because of scale. LG, Philips, and Samsung sell screens really cheaply thanks to large-scale production, much more cheaply than a small-scale factory could.
The equipment to produce screens is not from another world; it isn't cheap, but it is achievable for a business. At small scale, though, you won't beat the price of an LG or Philips screen, even before considering taxes and marketplace fees (which can add 2x-8x to the cost).
If some small factory tried to enter the market, it would immediately go bankrupt, because nobody will buy screens more expensive than LG's or Philips's.
Possible exceptions exist: some industrial, space, military, or, for example, medical applications, where the customer has very specific needs and will pay extra for them.
They cannot; just like chips, they are made in very large, expensive plants. There are only a handful of display manufacturers, especially as you get up to big panels.
Your foundational analogy is faulty: smartphones are not easy to make.
You can buy some chips for cents; that doesn't mean the factory that made them is any less amazing or difficult to build and operate. The cost of the individual product doesn't necessarily track the cost of the plant.