Kind of. If you look at the specs of the Tensor chips, they appear to copy almost exactly what Qualcomm is doing: off-the-shelf ARM reference-design cores packaged together the same way QC does. They're also about a year behind QC's latest offerings.
By comparison, Apple (and the very latest QC chipsets) use custom ARM cores. Google has yet to do this.
If you want books that can help you learn to think, you can't do better than the classics. I suggest Plato's dialogs or the Analects of Confucius. You may find them more accessible than you expect. They were excellent teachers.
Both translations are from the 19th century and so have entered the public domain. Yet both are also often considered the best English versions of these works, even today.
(Note: you can download a facsimile PDF of the original book and print it out as you go. I prefer to read anything long-form that way; better for concentration, in my opinion.)
More likely, future LLMs will mix ads into their responses. ("Your air filters are in stock in aisle A6. Have you heard about the new membership plan from Lowes...?")
If it were a real personal assistant, I would just have to say, "I want to pick up my home air filter at Lowes today," and it would 1. know what brand/model air filter I needed, 2. know which Lowes is my local one, 3. place the order for me, and 4. let me know when it will be available to pick up.
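The four steps above could be sketched as a toy tool-calling flow. To be clear, everything here is invented for illustration: the class names, the store, the filter model, and the stubbed ordering call stand in for whatever memory and retailer APIs a real assistant would use.

```python
from dataclasses import dataclass

@dataclass
class Order:
    item: str
    store: str
    pickup_eta: str

class PersonalAssistant:
    """Hypothetical assistant that remembers user context and acts on it."""

    def __init__(self, purchase_history: dict, home_store: str):
        self.purchase_history = purchase_history  # step 1: remembered brand/model
        self.home_store = home_store              # step 2: remembered local store

    def pickup(self, item_kind: str) -> Order:
        item = self.purchase_history[item_kind]   # 1. resolve brand/model
        store = self.home_store                   # 2. resolve local store
        eta = self._place_order(item, store)      # 3. place the order
        return Order(item, store, eta)            # 4. report when it's ready

    def _place_order(self, item: str, store: str) -> str:
        # Stub: a real assistant would call the retailer's ordering API here.
        return "today after 2pm"

# One spoken request, everything else filled in from remembered context:
assistant = PersonalAssistant(
    purchase_history={"home air filter": "AcmeFilter 20x25x1"},  # invented product
    home_store="Lowes (Main St)",
)
order = assistant.pickup("home air filter")
print(f"Pick up {order.item} at {order.store}, {order.pickup_eta}")
```

The point of the sketch is that all four steps are filled in from stored context, so the user supplies only intent, not details.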
No one seems to have mentioned what is most obvious to me: if this goes through, Google will corner the market on mobile (on-device) AI. They already have Android users; this would give them iOS users too. It's not a large market right now, but the way things are heading, it could prove significant in the future.
https://github.com/dotnet/runtime/pull/115732#issuecomment-2...