Machine learning feels a lot like CPU fabrication in that a homegrown solution is almost certainly going to be inferior to just taking what's on the market and configuring it to fit your needs. If you aren't going to specialize in this field, is there a point in learning how to roll your own ML anymore?
I'm not sure if these are examples of dropped support, but I run into issues on websites that prevent me from doing something I really need to do:
- I could not unsubscribe from Amazon Prime yesterday using Firefox. The page where you select the option was not rendering correctly: half of it was blank vertically, and the link/button I needed to press was absent.
- About 6 months ago I could not sign in to Apple ID on Apple's site in Firefox (or something like this, I forget exactly what I was trying to do).
- About 6 months ago I could not sign in to Nintendo's site to cancel a subscription.
So it's not super frequent, but every few months there is something important I can't do in Firefox.
In my experience, problems like that are almost always a matter of a cookie or cache entry sticking around when it shouldn't, or plugin interference. The only sites I ever have blocking trouble with in FF are shitty web interfaces for local device configuration and old automatically generated webpages, like something out of MS Access or some other ancient enterprise abomination. I worked on a team of web developers that generally developed in FF and then tested heavily in Chrome -- everything from simple pages augmented with JS to complex SPAs -- and the differences were pretty minimal.
Yeah, that could be it. If I run into it again I'll try clearing the cache. I also wanted to mention that since Firefox is what I use daily, of course I will mainly see issues there.
If I used Chrome daily, perhaps I would see the opposite (broken on Chrome, works on Firefox).
I get the same feeling from DuckDuckGo. I use it until it doesn't work, then switch to Google. Of course Google appears to perform better, since I only use it for the cases where DDG fails.
Yeah, with one exception it has always been Dark Reader that caused a page to render wrong in Firefox. The exception was some misconfigured OAuth stuff that didn't work.
It's sad that there are browser-specific websites (and development processes, obviously) instead of standards-oriented ones.
(Sure, these days it's Chrome-oriented ones. We saw the same with IE, by the way.)
We have standards for the web. Real ones: specifications that are discussed and approved across the industry. We have had them for a long time!
So if some browser does not comply with the standards, it's really not the best strategy to adapt a site to the browser instead of the standards.
We are in a situation where (effectively) one company (Google/Alphabet) can push anything onto the whole market, step by step, even when the changes contradict the web standards that are in place. The market in question is not the browser market, of course, but the market for internet ads delivered through control of the browser, which brings Google most of its money. By projecting its power into each and every aspect of the web, Google ensures uninterrupted control of that market for years ahead. So Google will continue to do this.

In the long run, we need to rely on standards instead of specific browsers. Otherwise it's just a Google monopoly, and the web tech "market" is just their own backyard. That will bite us all hard.
Having a standard is not even technically possible when you have one player that is too good. Due to Hyrum's Law, any small divergence from the spec will be observed and relied on. Why would you code against the spec, which is nebulous, when you could be testing against the browser 99% of your users use?
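To make that concrete, here's a small TypeScript sketch (my own illustration, not from this thread): Chromium exposed a non-standard event.path property for years, and pages that relied on it instead of the standard composedPath() quietly broke in other engines -- exactly the dynamic Hyrum's Law predicts.

    // Walk up the event's propagation path to find an ancestor element.
    function findCard(event: Event): HTMLElement | null {
      // Non-standard: only Chromium ever exposed event.path (removed in Chrome 109).
      // Code that read it "worked" for most users while silently breaking Firefox.
      const chromeOnlyPath = (event as any).path as EventTarget[] | undefined;

      // Spec-compliant equivalent that works in every engine:
      const path = chromeOnlyPath ?? event.composedPath();

      for (const node of path) {
        if (node instanceof HTMLElement && node.classList.contains("card")) {
          return node;
        }
      }
      return null;
    }

Nothing forced anyone to read the non-standard property, but once it shipped and worked, it became load-bearing.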
It's not too good. It's just the wealthiest, because it holds a monopoly on the ads market and happens to be the most popular search engine at the same time.
But it's not the best. Firefox is on par (I know they get some (most?) of their funding from Alphabet). And people were using the Firefox/Netscape browser long before Google existed.
Kernel features were integrated in unexpected places. For instance, I could run one command on my Solaris server and create a resource pool of processors and memory, and it was so integrated that the contract could be applied to individual network connections. I could allocate half a processor or some amount of memory to, say, an Oracle process for licensing reasons right within SMF, with one command. Sure, I can do these things on modern Linux and BSD, and I make my living doing complex things with k8s these days, but it all feels like such a hacked-together mess compared to what we had.
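From memory, the pools workflow looked something like this (a rough reconstruction, not copied from a real system; the pool, pset, and service names are invented, and exact syntax varies by Solaris release):

    # Enable the resource pools facility
    pooladm -e

    # Define a processor set, a pool, and tie them together
    poolcfg -c 'create pset oracle-pset (uint pset.min = 1; uint pset.max = 2)'
    poolcfg -c 'create pool oracle-pool'
    poolcfg -c 'associate pool oracle-pool (pset oracle-pset)'

    # Commit the configuration
    pooladm -c

    # Bind a running process (pid 1234) to the pool...
    poolbind -p oracle-pool 1234

    # ...or pin an SMF service to it through its method context
    svccfg -s svc:/application/database/oracle setprop start/resource_pool = astring: oracle-pool
    svcadm refresh svc:/application/database/oracle

The fractional-CPU part went through the fair share scheduler and CPU caps rather than the pool itself, if I remember right.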