Nobody cares about [de]centralization. For more than 99% of all internet users it does not matter whether the internet is the internet or a single server sitting in someone's basement. They want to use services - chat, write emails, watch videos, have a website, buy stuff, sell stuff - not run infrastructure of any kind. So nobody is going to run their own servers; they will all use existing services. And because it is easy to switch, everyone will be using the best service - for some definition of best: easy to use, cheap, functional, ... - and everyone else will go out of business. That also makes the internet simpler: there is one place for one kind of service, and everyone else will also be there. And this does not only apply to end users; the move of IT into the cloud is fundamentally the same thing, nobody wants to run the infrastructure.
You can maybe argue that everyone has their preferences wrong and they are hurting themselves in the long run, but good luck fighting that battle.
> And because it is easy to switch, everyone will be using the best
But it's not easy to switch.
E-mail was easy to switch "back then", when you didn't have a bunch of accounts tied to it. IRC was easy to switch, because most of the servers were interconnected into a few large networks, and all the clients used the same protocol.
And now? Your grandma only knows how to use WhatsApp? Well, you're not switching away from that, and Facebook is getting all your data.
Email is worse than that: running your own mail server means pleasing Google and Outlook into accepting your email. And you may also have to pay to get off blocklists that have turned extortionist.
Actually, it exemplifies what goes wrong when an infrastructure monopoly is created.
Only a sucker would do that. Google, Yahoo and Hotmail, the providers of nearly all email addresses, don't rely on public blocklists. Those are very much a noughties thing.
Right so to get users to switch you need to invent a new mode of communication. We have had postal mail, telegraph, telephone, fax, email, and instant messaging. What's next? Find an opportunity for disruptive innovation in communications that isn't already dominated by established competitors.
But it's not 'everyone', it's just grandma. Auntie is using Viber. Grandpa from the other side is using Google Hangouts (Chat? something). And your cousin is using Telegram.
Most people don't have friends on other continents. At least for Germany I can confidently say that there is nothing (except rounding error) besides WhatsApp. No matter the demographic.
I have never experienced a debate which messenger to use, and I have joined about twelve study-related group chats over the last two years. Same for personal messages. Some people have Signal or Telegram installed. After two or three messages for the novelty factor, everybody is back on WhatsApp. Because that app is open all the time.
I live in Slovenia, and have Instagram, WhatsApp, Viber and Telegram for "normal people" who only have one of those (and can't be reached elsewhere). I can't reach some people (besides SMS/call), because I don't have Facebook (Messenger). Also I have a few relatives using Skype only; those are a pain, because the client sucks.
I have friends in different continents and the three main messengers I need to communicate are WhatsApp, Telegram and WeChat. Then most people also would like to use Instagram Direct Messages but I don't have an account there.
People in eastern Europe largely use Telegram and people in China use WeChat. Everyone else mostly WhatsApp. I don't exactly know how it is in the US though, perhaps iMessage is more popular there.
You're an outlier. As I said, most people don't have intercontinental friends. So it doesn't matter that different messengers have "won" in different regions. It's still one main messenger for almost everyone.
My friends used to use a mix of iMessage and Facebook Messenger, while my extended family relied mostly on Facebook Messenger + SMS.
I didn't like this. I didn't have or want an iPhone, meaning I was excluded from certain iMessage groups, and I hated using Facebook, so I was implicitly excluded from discussions via Facebook Messenger. Most of my communications were over SMS as a result, so not ideal for my social life.
Seeing this as a problem I researched alternative message apps that had feature parity with iMessage, I figured any attempt to get people to switch would fail if I couldn't get this much. I also decided to bank on the latent frustration people had with Facebook, the company, meaning I had to scratch WhatsApp off the list.
I ended up with Signal vs Telegram. Telegram had the sleekest interface and good feature parity with iMessage. While Signal fell short feature wise, it supported SMS (at the time) and some of my friends were interested in it from a privacy angle.
Ultimately I decided to be realistic, so I scratched off Signal and chose Telegram. The goal was to get everyone to switch not just a few who were "interested", so feature parity had to stay the priority, privacy be damned. My pick was very important because I figured the likelihood of a successful migration would decrease with each attempt I made.
Finally having made my choice, I consulted 1 on 1 with my individual friends and family members who usually organize events, and convinced them to install Telegram + join my premade group chats. I then nagged them to notify everyone that all event planning would now be via Telegram and that everyone needed to install it now. I think the Telegram invite link really helped grease the wheels here.
After setting the stage with that, I individually convinced each friend and family member 1 on 1, via a call or in person, to install Telegram and join the new group chats. I made arguments such as: it would unify our communication under the same platform and make everyone's lives easier, we can use Telegram surveys to more easily schedule stuff, it has all the same features as iMessage + more, I already got X and Y to join so I really don't want you to be left out, I can help you install it, etc. I found it was important to take full initiative during all this.
In the end, within only a few days I got two distinct groups migrated onto Telegram. We have continued to use it for 2-3 years now, so it's safe to say the migration stuck. The only one I couldn't get to join was my cranky uncle who wanted Signal instead (first I had heard of this from him), but his wife joined so it didn't really matter anyway; he is simply excluded from discussions now.
So ultimately, you can get people to switch if you put the work in :)
> Finally having made my choice, I consulted 1 on 1 with my individual friends and family members who usually organize events, and convinced them to install Telegram + join my premade group chats. I then nagged them to notify everyone that all event planning would now be via Telegram and that everyone needed to install it now.
So nobody but you ever got a say, it was all "me me me". You sound insufferable.
I can assure you, your acquaintances still use whatever they used before, and Telegram is the "weird person messenger" now.
What a really nasty thing to say. I was genuinely offering you advice on how to negotiate with people you should already be getting along with anyway.
I can assure *you* that my friends, not "acquaintances", are much happier using Telegram than they were planning everything over a soup of Facebook Messenger and SMS. If I had it my way we would all be using IRC or Signal, but my compromises to ensure feature parity with iMessage were made out of an understanding of what everyone desired, which was the core of what I was trying to get at here, but I guess you missed that, not that I'm surprised considering your shitty attitude. In truth, an insufferable person simply would not be capable of convincing 18 different people to switch to a new messaging app, no matter how badly they nagged them.
Many of my friends have actually thanked me for fixing the situation since it has greatly improved our ability to make plans and hang out together, which is a tricky thing to do in adult life where everyone is on different schedules. But sure I guess my desire to improve how me and my friends communicate makes me selfish huh?
When I was at uni for CompSci we had those discussions, including the oddball with SMS-only phones. We also had Threema and Signal, none of which prevailed.
They don't care when it works. Then they get locked out of their Google account, lose a lot of things at once, and only have a few alternatives to choose from, and may return to Google yet again due to lack of choice.
> You can maybe argue that everyone has their preferences wrong and they are hurting themselves in the long run, but good luck fighting that battle.
That battle has been won many times. We don't let people run blind into open knives in many contexts.
I mean, there is of course a way to fight and win that battle, and it is regulation: write things like interoperability and data portability into law and enforce it. In the initial comment I was thinking more of things that can be done without enforcement.
>write things like interoperability and data portability into law
This would be nice for sure, but I think the solution could be even simpler than that. The only successful way these centralized service platforms have managed to monetize is by gross privacy violations in support of pervasive advertising. Strong privacy laws would essentially outlaw their business models and leave a hole that a network of decentralized hobbyist services would fill.
I do agree that 99% of users don't want to run infrastructure.
I think there's a difference between Fediverse-style federation/decentralisation and true P2P/BitTorrent-esque decentralisation. BitTorrent, in its current iteration, does have many semi-technical users, but perhaps your grandparents would struggle to use it. I think a much more friendly UX could be built; maintaining its decentralised properties would be more difficult, but not inconceivably so.
I actually think transparent decentralisation is possible but the current policy settings (copyright, surveillance and advertising) somewhat disincentivise people from working on it, to the extent that most of the current projects are hobbies, crowdsourced or funded by research grants.
While we're being so damn real, it is the same with climate change. 99% of people don't care whether the product they buy has bad CO2 emission stats or not. All they care about is the product, the price, and the use they want to put it to. Nobody really cares about the rest. If you can buy it in a store, people will do so.
I'll go a step further: any theoretical benefit that decentralization has (except ownership) can be emulated by a centralized architecture.
That said, I'm looking forward to reading this RFC when I get a chance. I hope there's some good ideas in it.
I think we're heading for a two-tier internet, though, in many ways. Look at the post yesterday about a facebook drenched in AI-generated dog sculptures.
I think that's a second order effect of ownership.
You'll bring the same heat down on yourself (eventually) if you use a distributed protocol but rent your server from Amazon. Therefore I think it's ownership of the hardware that is the defence against censorship, not the protocol you use.
This has never proven true in practice. There is so much people are not willing to communicate when third parties are present. This is why the behaviors and availability of features are wildly different on closed networks versus the web.
Few people seem to care about the environment either, or the exploited foreign worker class despite their impact on humanity, and besides it would be really hard to fight that battle, so we should just be apathetic and carry on with business as usual.
I feel the 99% part is important to repeat regularly among tech-savvy groups: we are the vanishingly small exception. Most users don't know and don't care when it comes to the technical merits of implementation.
> chat, write mails, watch videos, have a website, buy stuff, sell stuff
Why can't every computer do this without servers? It sounds like a ridiculous idea but it can't be impossible to achieve. My 24 core laptop has more power than my 4 core server.
If everyone runs their computer 24/7 or if you are willing to live with the limitation that somebody can only write you an email when your computer is turned on, then you can do that.
And of course if everyone runs their computer 24/7 and exposes it to the internet, then we are back where we started: everyone runs, maintains, patches, and secures their own server. Which nobody wants to do, even if they were capable of doing so, which they are not.
Email already supports delivery to destinations which are intermittently connected. The legacy MTA behavior is to retry automatically every few hours for several days, with failure resulting in a notification to the sender. Operators have adjusted the behavior over the years to support grey-listing so the retry period and interval are both shorter -- but still in use!
While communication between two hosts which are rarely simultaneously connected is best facilitated by a third party, some tweaks to your MTA's delivery retry interval and period could enable reliable communication between email users whose machines are connected simultaneously on a regular basis.
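To make that concrete, here is a minimal sketch of the retry behaviour in Python - the interval, queue lifetime and bounce handling are only illustrative, and a real MTA like Postfix or Exim does all of this (and much more) for you:

    import smtplib
    import time

    # Hypothetical illustration of the legacy MTA queue behaviour described above:
    # keep retrying delivery to an intermittently connected destination, and only
    # give up (bounce) once the overall queue lifetime expires. The intervals and
    # hosts are made up.
    RETRY_INTERVAL = 4 * 60 * 60       # retry every 4 hours
    QUEUE_LIFETIME = 5 * 24 * 60 * 60  # give up after 5 days

    def deliver_with_retries(msg, sender, rcpt, host):
        deadline = time.time() + QUEUE_LIFETIME
        while time.time() < deadline:
            try:
                with smtplib.SMTP(host, 25, timeout=30) as smtp:
                    smtp.sendmail(sender, rcpt, msg)
                    return True                    # delivered
            except (OSError, smtplib.SMTPException):
                time.sleep(RETRY_INTERVAL)         # destination offline, try again later
        return False                               # would generate a bounce to the sender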
Decentralization is about enabling builders - users get the indirect benefits which follow.
What you have done is justified 'everything under the sun' so long as it technically operates in a free market. But the ability for users to switch does not guarantee that the incentives to compete are at all healthy or robust.
You are correct when you say centralization affects users indirectly and they will simply use services which are most immediately convenient. But competition is not giving users much benefit, because every centralized service has a monopoly on its own instantiation - it's not like you can make a few tweaks and give everyone a moderately better experience - you have to start from scratch and build yet another siloed and extractive platform for any and every improvement. And if you do, they can easily copy you back before you build a fraction of their momentum.
X isn't going to let you improve the experience and just take users; they're going to say: "Have fun building up user trust and security infrastructure - also you're never getting our users." That's the difference between a protocol and a platform. The reason companies build platforms is because they need to fund infrastructure and opsec at scale so they effectively need to build monopoly protocols i.e. 'platforms.'
So "switching is easy" is meaningless. Building successful competing platforms is, by design, very difficult; it takes large investments and huge risks and a lot of rebuilding what's already been done just for the sake of catching up to platforms who have obvious incentives and built-in methods to discourage competition via their tight, centralized structure. Even if the platform is better, it will probably fail relative to its predecessor.
Decentralization most directly helps builders. If the basic requirements of a service are sufficiently decentralized security, networking, front-ends, then a builder who wants to compete via small (or any sized) improvements *does not need to rebuild the entire service.* Small builders who would otherwise not have access to startup capital, risk tolerance, or excessive build-hours would be equally able to compete because their decentralized access to the basic requirements cannot be locked behind an extractive economic scheme.
Imagine every talented programmer making open source software could leverage it atop secure, robust and interoperable networks. And they could earn money from it.
There is a massive difference between being a passive proponent of the free market and a maximalist for market competition. Your justification for the state of the field is a passive retreat to free-market capitalism. Users certainly have the ability to choose, but there are all sorts of schemes and situations in a free market where competition is choked. Honest proponents of decentralization are maximizers of opportunities for competition - they recognize that the free market is a gradient and not some binary quality which automatically imbues every operation inside it with good accountability.
It is my opinion that there is one singular powerful force driving centralization: the need for moderation.
I remember when CmdrTaco left Slashdot. The place went to ruins almost immediately.
If dang didn't do what he did or someone like him, we would all probably leave here as well.
Joel Spolsky managed to create a self-moderating site in Stack Overflow.
Say what you will about the platform being toxic, and you're not wrong. Content from the uninitiated is not treated well, but on the other hand, the quality of the content is generally very high. People have a lot to say about this, but I will remind them that self-moderation like what Stack Overflow achieved is a monumental achievement! This is a really hard problem!
In the early years of Mastodon the moderation wasn't as up to scratch as it is now. It has gotten so much better. It does give me hope that moderation can still be done on a decentralized platform. That said, it can't be denied that Mastodon is fighting an uphill battle precisely because it is decentralized when we speak of moderation.
Once we start to have more mature tools around moderation in the face of decentralization, I think we'll start to see a groundswell of decentralized services finally coming out of the shadows. We see this starting to happen already.
I guess it depends what you mean by decentralised here. For peer to peer networks, moderation is indeed difficult, since by definition no one singlehandedly controls what's shared there.
On the other hand, federated services like Mastodon really work fine in the same way old school forums and personal sites did; the community decides what's acceptable, and content that goes against that gets blocked/removed.
The real issue is that moderation doesn't scale; you can't easily automate it if you want it to work well, and hence both large centralised networks like Facebook and Twitter and large decentralised ones struggle to keep things under control.
I'm not sure moderation scales down well either. It's easy to say "the community" decides but in practice that means it's down to the handful of people who step up and do all the work. And people interested in volunteering their time for free to moderate don't always have the personality type we'd prefer.
That's why it's so concerning to me that Mastodon admins can interfere with your ability to leave an instance. Like email, you're to an extent dependent on your old provider to forward.
Yeah that's another issue. You're also dependent on the moderators on your instance being reasonable people and being fair to those using it.
But sadly that's gonna be a problem with instances of any size. If Twitter or Facebook or Instagram close your account, your data is probably just gone as well.
What's interesting to me is that Facebook, Google etc. haven't solved moderation. Big centralized servers have awful automated solutions with no way to appeal.
It's a problem with scale and incentives. At their scale, hiring people like Dang is not going to be particularly profitable, since you'd need thousands or tens of thousands of them to moderate the platform (and even then, it gets difficult since you really don't want too much moderation for private messages among family members and friends, companies, etc).
Add this to the difficulties of finding people that good in the first place (even more so said people willing to work for a large company and moderate at scale), and you've got a bit of a problematic situation:
- If you try to hire 'quality' moderators, then you likely spend millions of dollars a year on moderation, without any real guarantee that the moderation will actually be good (since different communities and groups disagree on it, cultural differences come into play, etc)
- If you try to outsource the moderation to some low cost region, then you get poor quality moderation due to cultural differences and a lack of understanding of the subject matter and community.
- And if you try to automate it, you arguably get an even worse situation, since false positives and negatives are everywhere and often there's no way to appeal the situation when they get things wrong.
Then things get even worse when you realise that different communities interact and have different standards...
Moderation is a problem of gray areas; there is never a clear line between what is acceptable content and what isn't. Not only are the community and their opinion on what's acceptable changing, the moderation rules are a moving target, and at scale you inevitably have government pressure to moderate certain content that the government doesn't approve of.
From every attempt I've seen tried over the years in tech, moderation is a losing battle and the main reason building large platforms for user-generated content is such a nightmare.
Yeah this is what I mean that moderation isn't "solved", and maybe it can't be. I personally think that it's an issue of having so many big companies that try to control what gets shown. Ideally it should be up to the local communities what they want to see and not see, which is something a decentralized system can offer.
Totally agree that the centralization by massive companies is a problem here. I don't think decentralization will fix it though.
Mastodon tried this, specifically with federation as an attempt to decentralize. As long as people try to use it like a centralized platform such as Twitter, the moderation problems only get worse. You end up with more moderators with less accountability, all trying to moderate both their smaller circle and everything from the outside world too.
The only way I see of fixing it with a different network model is by going for a fragmented system. Think 90s-era forums: every group is its own small bubble and is never attempting to be part of some unified global town square.
Slashdot's moderation system didn't depend on CmdrTaco, that was famously the point of it. The userbase moderated itself. Slashdot never really made money and got kicked around between different companies a lot. There were some website redesigns that upset people and the visual design limited the number of stories that could be posted.
Moderation can be done in a decentralized way. Email spam filters are an example of that.
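As a toy illustration of that analogy (made-up training data and threshold, nowhere near a production filter): every receiving node keeps its own statistics and makes its own accept/reject decision, so nobody has to run a central moderation service.

    import math
    from collections import Counter

    # Toy sketch only: a node-local "Bayesian-ish" spam score. Each node trains
    # on its own mail and decides for itself, so the moderation is decentralized
    # by construction.
    class LocalFilter:
        def __init__(self):
            self.spam = Counter()
            self.ham = Counter()

        def train(self, text, is_spam):
            (self.spam if is_spam else self.ham).update(text.lower().split())

        def looks_spammy(self, text):
            n_spam = sum(self.spam.values()) + 1
            n_ham = sum(self.ham.values()) + 1
            score = 0.0
            for word in text.lower().split():
                p_spam = (self.spam[word] + 1) / n_spam
                p_ham = (self.ham[word] + 1) / n_ham
                score += math.log(p_spam / p_ham)
            return score > 0   # this node's own moderation decision

    f = LocalFilter()
    f.train("cheap pills buy now", is_spam=True)
    f.train("meeting notes attached", is_spam=False)
    print(f.looks_spammy("buy cheap pills now"))   # True, on this node's data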
The big problem is people have been trained to think that centralized authority is a necessary precondition for security. Just look at the state of browsers gating access to APIs for sites on local sub networks. We have created a two tier WWW.
But the security concerns are not without basis, yet doing things like coming up with a secure replacement for mDNS is not exactly aligned with the interest of organizations that want all information to go via the cloud, and they will fight it tooth and nail.
As tech people we really should put our money where our mouths are on this and stop using github, and in doing so fix the pain points.
> Just look at the state of browsers gating access to APIs for sites on local sub networks.
Given we're on HN I think it's fair to nitpick a bit - what do you mean here? What API are you accessing through a browser? If it's a control panel or something through an API, you can install a cert into the browser, or get a wildcard cert signed for a local domain.
> The big problem is people have been trained to think that centralized authority is a necessary precondition for security. Just look at the state of browsers gating access to APIs for sites on local sub networks. We have created a two tier WWW.
Most people wouldn't in any way think like or even about this. Which people do you mean?
> The big problem is people have been trained to think that centralized authority is a necessary precondition for security.
I think that's inverted, but not in the way that you think it is inverted. I think the map that fits looks more like: centralized authority builds the systems that serve its needs (and they're identifiable as such because they were built by a centralized authority with no inclination to hide its efforts, and maybe even incentives to advertise them).
I'd like to know more about this though:
> Just look at the state of browsers gating access to APIs for sites on local sub networks. We have created a two tier WWW.
Because I don't see it (the first part). I don't agree wholly with the second part either, because I do defense in depth and I don't entirely trust my own network. But barring those measures / tastes yes there would be two classes of services, internal and external. This is pretty old school, along with the DMZ third wheel.
> As tech people we really should put our money where our mouths are on this and stop using github, and in doing so fix the pain points.
Github is (in my mind at least) just one manifestation of this, but yeah, I host my own Forgejo (a fork of Gitea) instance for personal projects. Also trying to get the company to switch to Gitlab (especially since I strongly prefer its CI/CD to TeamCity), but I'm against a lot of organizational inertia there so that's not really a fight I expect to win.
GitLab is fine-I-guess, we use it at work--but at home I use GitHub and self-hosted runners because GitHub Actions is great and GitLab CI isn't nearly as comfortable to use. And I don't want to host GitLab and another CI, which also involves learning a third CI platform other than the one I use for work and the one I like.
GitHub Actions turns out to be maybe the best CI out there these days for low-friction, get-it-out-there stuff, and there's probably a lesson or three to be learned in there too.
I'd put a lot of the fault on the internet's search and discovery mechanisms, which almost all favor popularity-based ranking.
This works well in a vacuum, but as they start to direct traffic, they feed into themselves to increase the popularity of what's popular and the obscurity of what's obscure, and inevitably create an extreme Pareto distribution where the internet seems to consist of only a handful of different services.
The easy fix would be to do something familiar from genetic algorithms - fish out things that are unpopular at random and give them 10% of results. On facebook or the like, do that for things that people DON'T want to see (conservative content for a progressive, for example).
This doesn't require a substantive change to the existing algorithms, so it shouldn't break (many) things.
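A rough sketch of what that could look like, assuming you already have a popularity-ranked result list and a pool of long-tail items to draw from (the names and the 10% ratio are only illustrative):

    import random

    # Sketch of the "10% exploration" idea: take the ranked results a
    # popularity-based algorithm already produced and swap a random 10% of the
    # slots for items drawn from the unpopular long tail. Not a real
    # recommender API.
    def with_exploration(ranked, long_tail, ratio=0.10, seed=None):
        rng = random.Random(seed)
        results = list(ranked)
        n_swap = max(1, int(len(results) * ratio))
        slots = rng.sample(range(len(results)), k=min(n_swap, len(results)))
        picks = rng.sample(long_tail, k=min(len(slots), len(long_tail)))
        for slot, item in zip(slots, picks):
            results[slot] = item   # surface something obscure in a popular feed
        return results

    popular = ["bigsite%d" % i for i in range(1, 11)]
    obscure = ["tiny-blog", "hobby-forum", "indie-wiki"]
    print(with_exploration(popular, obscure, seed=42))

The nice part of doing it this way is that the exploration step sits on top of the existing ranking, so the ranking algorithm itself doesn't have to change.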
>fish out things that are unpopular at random and give them 10% of results. On facebook or the like, do that for things that people DON'T want to see (conservative content for a progressive, for example).
I don't think that's nearly as easy a fix as you suggest. Sure, some political views may have reasonable opposing views, but the other side of the story when trying to look up the history of the measurement of the diameter of the earth is probably flat earth garbage.
That's already happening. OTOH, people who are only ever exposed to flat earth garbage (or, let's say, young earth creationism, or homophobia) might get a glimpse of what the real world looks like, and look into it.
All this is shadowed by the SEO industry. What small sites have in their favour is the long tail of new keywords. In tech there are a lot of them. I had a blog that came up in a colleague's search results; I checked and I was in place 20 for a decent keyword. I don't do SEO, so there is hope for the little guy!
One of the things that bugs me is the design of the current DNS system which is currently extremely centralized and under the control of a small number of organizations. It's ridiculous that nobody can truly own a domain name and instead, we're all just renting them and renewing them and have to keep forking over money.
I really like the concept of Unstoppable Domains, which lets you buy and own domains forever, but I'm wondering why browsers don't support them broadly as an alternative. Blockchains are optimized for high availability, which makes them ideal for the DNS use case where you want lookups to be free. Also, it is acceptable for updates (e.g. ownership changes or changes to the 'zone file') to incur a cost that is proportional to the utilization of the network for that purpose; it would guarantee that each update action would incur the lowest price possible.
I believe we need creative thinking for a decentralized DNS, though I'm not of the belief it would be incentive-compatible to do so on a blockchain. ENS and similar systems are a neat parlour trick, but as a simple example, who's paying the bill for storing and serving that data to users? How does your client know that the data you are being fed isn't just a DNS injection? You could use PoW to validate the authenticity, but then data creation must be throttled to maintain high enough fees. You could use proof of stake, and find DNS more centralized than ever.
Proof of Stake has low energy use and a simple design which aligns network usage with token price, and it's not as centralized as people claim (that's FUD from Bitcoin maxis; in reality it's more resistant to centralization than PoW). So long as the initial distribution is sufficiently spread out and it's set up correctly, it tends to stay decentralized even when under attack. I've seen PoS blockchains maintain good decentralization even in extreme cases after massive price drops while someone is actively trying to buy up a majority of tokens (what happens is the price shoots up temporarily and the attacker loses money after it crashes back down to the original level).
Also, the fees for updating DNS records go straight to the validators (people running the nodes) and this list of validators can change over time so it's not a fixed set of entities; yet whoever they are, they always have a substantial personal stake in the reliability of the project.
You don't need to throttle data creation, with PoS, the price of the token goes up with market demand for it. You just need to make the token the main mechanism for payment for purchasing domains and doing zone file updates. This coupling can be built into the code. Many projects have done similar things. E.g. FileCoin coupling token price to storage space.
...Problem with FileCoin is that while it's good for high-read, low-write scenarios (like DNS), it's not so good if everyone needs to update files frequently as update costs become prohibitively expensive. I guess FileCoin could be great for CDN use case.
It's anything but simple, which is why it's proven subject to several attacks; the attack surface is so broad it's not even mappable. It's a system of systematic centralization, it's in the DNA: it rewards people on the basis of their wealth. It's not like this system hasn't been tried before, it's the basis for the operation of the Federal Reserve after all, and had you proposed it on the cypherpunk message boards it would have been ripped to shreds. I would rather live with no central mechanism for DNS, where browsers make you select from a variety of DNS providers that are privately hosted, than turn it over to a kleptocratic proof-of-stake system.
I had heard of ENS, but yesterday I discovered that SNS is more affordable so I tried it out. Now you can type gushinggranny.sol into Brave browser and it redirects to my PeerTube instance. SNS has A records too but I have not tried them yet. I am grateful to see that Brave browser is so forward-thinking as to support a decentralized DNS right out of the box, and I am grateful to have found a use case for NFTs that isn't completely stupid.
Yes, but in practice the whole system would be broken if different countries had different records for the same domain. With blockchain, you can enforce consistency.
I think you misunderstood me; I was making an analogy.
All systems have some form of common ground or central point. That does not make every system “centralized”. With the example given upthread, blockchain, there is the development team, or standards organization, which is a central controlling point of the blockchain. Does this make all blockchains centralized? No. In the same way, a common root zone does not make the DNS centralized.
Both suggestions ("blockchain" and "DNSSEC") are nonsensical here.
As you note, "blockchain" is a technological concept. Implementations of a blockchain have varying qualities, some of them are centrally managed by an entity, some of them are based on participant consensus in varying degrees. So suggesting "blockchain" be used to solve this problem is a bit like suggesting "nails" be used to build a house. Sure, it's a component you could use, but you haven't really nailed down any specifics about how the house is going to be structured.
DNSSEC, by contrast, is 100% a centralized PKI system. Leaf records are signed up and up and up, and the root of it all is signatures on the root zones by the operators of those zones. Having a hierarchical structure where an entity at the top of the pyramid signs off on the records beneath is about as canonically centralized as something could be.
The suggestion to use “blockchain” came from user jongjong, not me. I therefore cannot elaborate on it.
> DNSSEC, by contrast, is 100% a centralized PKI system.
Again, my point is that just because something can have, or has, a central point, does not mean it is “centralized”. The DNSSEC root cannot override individual DNS records on the domain or subdomain levels. DNS (and DNSSEC) is federated; i.e. it delegates absolute authority over sections to others, which can in turn delegate authority for domains to yet others, which can in turn delegate authority over subdomains, and so on.
To make an analogy again: The US federal government does not possess absolute authority over the states. The UN does not possess absolute authority over its member nations (or other nations). Interpol does not possess absolute authority over every police force in the world. And so on.
> Self-hosting with a dynamic IP seems difficult if not impossible
You just need something on your network that monitors your IP and updates your DNS when your home IP changes. "Dynamic DNS Update Client" yields results in Google that will be a good start to understanding.
> There are some dynamic DNS services but that kinda defeats the self-hosted part.
No, selfhosting at home doesn't mean you have to host a public DNS server. You will definitely need some external DNS pointing to your home network. There are multiple free providers.
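For the curious, such an update client boils down to something like the sketch below - the update endpoint, hostname and token are placeholders, since every provider has its own API, and ready-made clients like ddclient already exist:

    import time
    import urllib.request

    # Rough sketch of a dynamic DNS update client: poll a what-is-my-IP service
    # and, when the home address changes, hit the DNS provider's update endpoint.
    # CHECK_URL is a real public service; UPDATE_URL, HOSTNAME and TOKEN are
    # placeholders - every provider (DuckDNS, a registrar API, ...) differs.
    CHECK_URL = "https://api.ipify.org"
    UPDATE_URL = "https://dyndns.example.net/update"   # hypothetical endpoint
    HOSTNAME = "home.example.org"
    TOKEN = "REPLACE_ME"

    def current_ip():
        with urllib.request.urlopen(CHECK_URL, timeout=10) as resp:
            return resp.read().decode().strip()

    def update_dns(ip):
        url = "%s?hostname=%s&ip=%s&token=%s" % (UPDATE_URL, HOSTNAME, ip, TOKEN)
        urllib.request.urlopen(url, timeout=10).read()

    last_ip = None
    while True:
        ip = current_ip()
        if ip != last_ip:          # only call the provider when the address changes
            update_dns(ip)
            last_ip = ip
        time.sleep(300)            # check every five minutes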
You will still need a static IP for SMTP. Dynamic IP assignment and NAT traversal are the largest hurdles to self-hosting reliably. DNS updates are great, but then you're still relying on a large centralized DNS provider. You can host nameservers yourself, but then you're back to the static IP issue.
Self-hosting SMTP servers is downright impossible these days. The big names - Google, Yahoo, Microsoft, and others - "fight spam" (and competition) by blacklisting little-known SMTP servers. If you try to do it from a VPS, they have usually blocked entire segments of the company's IPs (because someone, a lot of someones, came before you). There are good reasons to fight spam, but the efforts of the big names are forcing people to use their services. (It's free after all, right?)
You have to rely on a registrar for your domain name anyway. For everything else, a cheap VPS with a static IP is easy to manage and can delegate to your homeserver behind the scenes. The VPS also can be easily replaced by a different one when needed just by managing the DNS entries.
I tried this two decades ago. We had just gotten broadband for the first time and I installed some web service application on my Windows PC. My ISP had what appeared to be a static IP, so I manually set up a free DNS service with that IP. The experiment didn't last long enough for the IP to change, but I did learn that it eventually did, because long after I took the service down, my domain was pointing to someone else's IP.
In what I now realise was a bad idea, I was writing my web server code in C with CGI. I was supposed to use Perl, but I didn't want to spend time learning that as I already knew C.
The best way I've found to get around the DNS problem is to take a cheap VPS (I use Hetzner) and run a reverse proxy like Caddy. Then you VPN that VPS to the machine at your house.
Standards bodies like the IETF and the W3C also play a crucial role and are in some respects the worst offenders when it comes to promoting centralization via standards. So seeing an RFC that's the product of multiple layers of centralization talk about that topic seems very ironic.
They do address this in the RFC, but only briefly, and I don't think to a sufficient extent.
If you want to get rid of centralisation, one of the most effective things to do is ignoring standards bodies. Of course that may be detrimental to end users in other ways (e.g. non-interoperability).
If you want to communicate, you just have to agree on some protocol for communication. That's why we need standards and standards bodies. Having to agree on a protocol level is much better than everyone having to agree on a single entity who provides the apps and runs the servers.
Centralization happens because it's easier and cheaper to implement services in a centralized fashion. Since users care more about features and price than about centralization, it seems pretty obvious that this problem can't be solved with standards, since standards are unenforceable.
Standards can be enforced through government mandates. We've seen that in the US health IT space where CMS/ONC now mandate that payers, providers, and vendors implement certain HL7, X12, and DirectTrust open interoperability standards. Compliance is fairly high.
But there are also drawbacks to government mandates in terms of slowing down the pace of innovation and raising compliance costs.
This is true, but the closest example of this happening that I can think of is mandating USB-C for phones. I don't think governments have ever regulated anything like internet protocols.
Also the standard has to have wide adoption before governments would consider mandating it.
Users could also make interoperability a requirement and thus drive standardization. Nobody is interested in an email provider that cannot send emails to other providers. After WhatsApp changed its ToS in 2020 and I didn't want to agree to the new terms, I decided to delete all IM apps which depend on a single provider and only use a single one which is compliant with the XMPP internet standard. I also wouldn't sign up for any social network that isn't ActivityPub compliant anymore.
Right. This is a convenience problem because the web has no analogue to the browser for publishing content. If publishing content would be as convenient and simple as consuming it, there would be no need for users to flock to publishing services. Most of the centralized services today simplify publishing. Any "community" that forms around them is entirely incidental, and in the case of global social media, mostly harmful.
I think this is a problem the early web should've solved[1]. Now that the centralized model is dominant, most people wouldn't see a reason to change their habits, even if a new solution would be easier and simpler, and not just better on a technical or privacy level (which most people don't care about anyway).
Netscape 4 came with Composer, Windows/Office used to come with FrontPage. People had content creation tools integrated with their browsers, but didn't use them and so they went away.
I think it's because the tools were too un-opinionated and unconstrained, so if you wanted to do e.g. a blog then generic HTML editors were too much work even though they were visual. Also, static content publishing is almost never enough. You at least want(ed) search and to understand if anyone is visiting, but then you're into the realm of needing databases and such.
> Netscape 4 came with Composer, Windows/Office used to come with FrontPage.
Those are WYSIWYG tools only, which besides being a nightmare to work with, don't solve the actual serving problem. The earliest product that came close to that use case AFAIK was Opera's Unite in 2009, but it didn't last long. By that point it was already too late, since users mostly had asymmetric connections, so serving any type of content from their home network was infeasible. (Had the web launched with easy publishing tools, ISPs would've been forced to offer symmetric connections from the start, and this wouldn't have been a major issue.)
> Also, static content publishing is almost never enough.
True, but there's no reason these tools couldn't have evolved to allow dynamic content as well. Nowadays we have many different approaches that could make this possible, cobbled together from bits and pieces of native and alternative web technologies. Centralized services do make this easier, and who knows, we might've settled on them anyway, but I believe the general perception about user data would've been much different for the better had these tools existed from the start.
I don't think there was ever much chance of people running servers from home, if only because back then computers were noisy and power management didn't work well, so it was normal to turn them off at night. But most ISPs offered some web serving space via ftp so that wasn't an issue. It was more just that the tools were too much work. Obviously they were still a lot easier than coding it all by hand!
I don't know. Sticking everyone's data in a single DB behind private business logic can do anything decentralised protocols can do, but not vice versa. At least in respect to building the addictive, attention-seeking experiences that are needed for "success".
Decentralized has one advantage though. You choose your user agent. For example your browser, your bitcoin client, your email client and so on.
I don't think that's sufficient. Take Mastodon for instance. The user count surged recently, demonstrating that it's not actually difficult to adopt or use as a Twitter alternative. But people tend to prefer algorithmically-driven social media that's optimized not to give them agency, but to be maximally attention-grabbing and sticky. The centralized and profit-motivated product is worse in the sense that it wastes more of the user's time and makes seeing the things they've explicitly followed more difficult, but that particular kind of "worse" by design is more popular. I don't really have a solution to propose here, just want to point out that sometimes the worse option wins by virtue of being worse, so being "as convenient and simple", or even being better is not always enough to attract people.
I wanted to set up a security camera to watch my pets when I was gone. The only requirements were that it had to have an iPhone app, and I wanted the video to stream from my home network and not be funneled through a cloud provider. I couldn't find anything that did what I wanted, beyond a bunch of DIY solutions which I did not have the time or energy to implement.
One aspect of this that independent site admins often have to deal with is CSAM filtering. Many jurisdictions require it in some form but by definition you can't "roll your own" automated solution because how could you? Centralized solutions like PhotoDNA are not available to most people.
With most other things I can see open source solutions prevailing, with this I can't. This will likely be the thing killing the decentralized Internet of ye olden days for good.
It's prohibitively expensive to access these tools. I have to identify myself and pay over £1000 to be allowed to access them. The agency either doesn't care about independent sites catching CSAM, or they don't want independent admins finding out that these tools aren't very effective. Alex Gleason, the fediverse dev, tried contacting them about this problem and they didn't care at all.
Things used to be decentralized well when I think back to the early days of the internet. When I think about why, it's because links were expensive or unreliable - what centralization brought was a kind of predictability and ease of use. The challenge for new decentralized systems to address, I think, is to offer a similar level of convenience as the centralized systems. Just being decentralized doesn't seem enough.
Granted, I've not lived long enough to have witnessed the birth of the internet with adult eyes; what I have personally observed is a constant battle between centralisation and standards forming.
For a common example that many people may remember, AOL was at one point pushing towards a centralised model and was succeeding - until they completely collapsed and the open web resumed being a thing.
Similar for the times where there were no standards on video playback on the web, a bunch of companies competed with completely incompatible systems that had various pains associated until eventually standards emerged that sunk the majority of it. (ironically one of them with a healthy helping from Apple).
The web was never really decentralised - it's a fond memory and a legend we tell ourselves; this battle seems to have been waged from almost immediately after it was conceived.
I found a much more articulate article on the matter while I was writing: https://archive.is/UUgl7
I was there, AOL was certainly not the internet.
It's an interesting article but it seems to cherry pick a bit. AOL was like a huge BBS that gave its users a limited portal into the internet, and only on a very limited basis starting in 1993. They never really had proper access as far as I know, and for those of us not accessing through AOL things worked perfectly fine and I never accessed anything through or hosted by AOL.
I'm not really sure how to convince you in the end, but it's funny now hearing my own experience was only a legend. I feel like I've achieved something and need a plaque indicating I've successfully moved into some new life phase.
For one thing, stop cargo-culting? "Centralization" is a vague term that isn't inherently bad. Use specific terms, identify specific problems, and it'll be easier to find solutions for them.
> "Centralization" is a vague term that isn't inherently bad. Use specific terms, identify specific problems, and it'll be easier to find solutions for them.
Thank you for reading the RFC and succinctly summarizing its main thrust.
I don't see it mentioned yet, but I think it's pretty important to address that centralization goes far, far lower, to the most basic Layer 1: asymmetric (i.e., centralization-promoting) WAN links, the stickiness of IPv4, the secure trust foundation, maybe DNS or other equivalents, and the lack of IP-level auth or other core mitigations for DDoS. These are core foundations, and whatever the internet standards above, it's much harder when the foundations are shakier.
1. To the first, for a solid stretch of decades, WAN links in much of the world that might otherwise have supported more decentralization have been tilted towards consuming from the center vs providing anything oneself, and of course have just fundamentally stagnated as well at the impetus of powerful monopolies and regulatory capture. Until a few years ago I had the exact same 5/1 ADSL link I'd gotten in, I think, 2000 or 98/99. It was really something at the start, and I was able to run some fun stuff off it. 5, 10, 15 years later? Not so much. The US in particular put hundreds of billions into promises of big fiber networks, which then instead got used to just consolidate and profit. Cable and big telecoms are still fighting tooth and nail to prevent efforts like municipal fiber. But once you have symmetric 100/1000 or more on an extremely reliable link, suddenly a lot of new possibilities open back up again. Of course, in those decades a lot of effort has naturally gone into centralized services, because what would even be the point of designing for something without much of a potential user base, when said users were stuck on crap connections - either slow, period, or with decent download but utter garbage upload? So the ecosystem isn't where it might be on that front either, even though it's decent for more technical users. But I don't think we should forget the most fundamental issue: if you want to serve bits in 2023, doing so at .5/1/2 Mbps with mediocre latency is pretty limiting. Lifting that isn't sufficient but it is necessary, even if the decades of mindset and ecosystem will lag.
2. To the second, I did at least get in early enough that I could still get, for free (as it should be), a static public globally routable IP address. That has also been a major boon even back when I was stuck sipping through a narrow straw. It's hard to internet if you can't do the inter part. Workarounds to coordinate via an IP elsewhere of course exist, but it's an extra layer vs "hey I can just talk directly home (or barn or office)!" IPv6, despite its flaws, should help bring that part back as well, but the flaws and slow adoption have also delayed things.
3. To the third, how to authenticate is a perennial problem of decentralization efforts. If we at least had universal, highly reliable, fully trusted secure DNS, preferably with better registrar governance as well, then that would be a somewhat practical way to bootstrap something. I could put my own domain-restricted root CA public cert in DNS, and everything could then just trust all certs issued by it for that domain only at a basic level, and it'd all just work (a rough sketch of the client-side check is at the end of this comment). Add a few cross-signing options and an ecosystem for turnkey CA management appliances into the mix and it's possible to envision something pretty approachable that would at least match and slightly exceed everything Let's Encrypt offers. That's another sandy foundation that really stings.
4. Finally, if everyone has decent pipes and is running in a decentralized manner, there is of course the potential for more and even bigger DDoS. It would be helpful if there were standards for all the various tiers of operator, from core straight back to residential ISP, so that attacks could be automatically reported and followed right back out the stack to whatever WANs worldwide were involved and cut off right there, or at the interconnects of ISPs who wouldn't comply. Having to layer in providers like Cloudflare, however hard and nicely they work, has papered over it but remains suboptimal. Granted, this doesn't hurt dark/gray types of decentralization, where rather than decentralizing services or communications to the world one is doing it to other trusted networks exclusively. And that's definitely still very useful.
I'm sure there's others, but at the very least continuing the fight for a really good physical layer seems pretty critical to me.
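On point 3, here's a rough sketch of what the client-side check could look like if a cert pin for your domain were published in DNS. The _certpin TXT record and the hostname are made up, and it assumes the dnspython library; the standardized version of the idea is DANE/TLSA, and of course it only helps if the DNS answers themselves are trustworthy (DNSSEC again):

    import hashlib
    import ssl

    import dns.resolver   # third-party: dnspython

    # Hedged sketch: a pin for the server cert (or the domain-restricted CA) is
    # published in DNS, and the client checks what the server actually presents
    # against it. The record name is made up for illustration.
    def pinned_cert_ok(host, port=443):
        pem = ssl.get_server_certificate((host, port))
        presented = hashlib.sha256(ssl.PEM_cert_to_DER_cert(pem)).hexdigest()

        answers = dns.resolver.resolve("_certpin." + host, "TXT")
        published = {txt.strings[0].decode().lower() for txt in answers}
        return presented in published

    # print(pinned_cert_ok("barn.example.org"))   # hypothetical self-hosted box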
I worked on Bitcoin in the early days, and developed decentralized software and protocols as part of that. I also did most of the design on an "enterprise blockchain" system later which is basically (in my view) a peer to peer database run by competing 'frenemy' businesses, i.e. with mostly untrusted nodes. So I feel like I have a lot of practical experience in this domain.
The RFC is decent enough. It's moderate and reasonable, and cites Marlinspike's "ecosystem is moving" essay which is a very important piece of thinking in this space. I'm not a fan of the RFC's friendliness towards regulation. Governments often achieve the opposite of what they want when they try to regulate the internet, and they don't care about centralization at all or in fact prefer it because it makes it easier to engage in control when there are only a few big players vs thousands of smaller players (see how modern EU regulation is explicitly targeted only at "very large platforms" and ignores the rest).
But the RFC lacks specific suggestions. For engineers who want concrete and achievable ideas that can be worked on with minimal cost, here are a few I'd pick:
1. Support IPv6. Getting flat end-to-end routing working again is one of the lowest lift ways to improve decentralization on the modern internet, in both obvious ways (reducing CGNAT) and less obvious ways, for example it's conceivable that Android could be extended to support socket activation. That would allow apps to bypass push notification and centralized reflectors in some cases. I'm not sure how commercially strategic push services are to Apple and Google these days - it costs a lot of money and it was revealed recently that governments are wiretapping supposedly e2e encrypted messengers by grabbing the push messages. So whilst I doubt Apple would allow it, in theory someone could write a patch for Android to enable it and contribute it upstream.
2. Support confidential computing. A lot of centralization happens because we need a program to be run on a server somewhere to do something sensitive, which means we need to trust the server operators (cloud+admins). So we gravitate towards big brands that everyone can agree on, like AWS. Confidential computing lets client apps (phones, desktop apps, less easily also web apps) verify that the server they're connecting to is untampered with and running the expected software. It takes cloud and root out of the trust equation, meaning you can in theory do things like have a P2P network of anonymous operators who offer their services without needing horrifically complicated and ad-hoc app-specific cryptography. The tech works today, but very few people are aware of it or use it, and it's not integrated well into our tech stacks. But it should be!
3. Write smartphone, tablet and desktop apps. Web apps are inherently very centralized. The name of the app is conflated with its hosting location, browsers practically force you to delegate most of the app's work to the server, and user data ends up tightly bound with the operator and implementation. You can't even do tricks like confidential compute with them really, because browsers don't understand the remote attestation protocols. If you write client-side apps you can dodge all those problems and loosen the bindings between user data location, software distribution location and compute location.
Still, you have to be realistic. After some years I realized that centralization happens because decentralization is in some sense like communism. If you take away ownership over private property then people lose the incentive to improve it. It becomes a commons and the usual tragedy follows. Centralized services are private property, and so the owners make sure they are well kept and improved. Also private property and profit is mentally grounding - projects that lack these things have a habit of going crazy and losing interest in what users actually want. These days I'm not quite so interested in pure open source p2p systems anymore because of that problem, but there's a lot of scope to find interesting corners where private property can be combined with more decentralized implementations. After all, Office 2000 was owned by Microsoft yet still much more decentralized in practice than Office 365.
> After some years I realized that centralization happens because decentralization is in some sense like communism. If you take away ownership over private property then people lose the incentive to improve it. It becomes a commons and the usual tragedy follows.
It's really interesting to me that this was one of your takeaways; I actually would have seen it the other way around, but I haven't worked in the space nearly as much as you have.
The way I see it, centralization of the internet is the analog to communism and the argument for it would be that the internet and the services we use every day are so vital to daily life that one authority needs to own it to make sure everyone has access. In that view, decentralization would lead to things being poorly maintained and abused for selfish gain. Centralization (communism) would benevolently protect the common resources on everyone's behalf and make sure those resources are fairly made available to all.
From that angle, centralization of the internet is likely to follow the same road as historic examples of communism. We would see corruption, censorship, and power/money being syphoned off to the few in charge. That sure does feel like the centralized internet we have today.
I was referring to theoretical communism, the one where there's no private property for real, where everything is communally managed. Perhaps anarcho-communism is a better term. Tragedy of the commons gets the issue across just as well. In practice communist countries were highly centralized, agreed. All property was the private property of the state.
> benevolently protect the common resources on everyone's behalf
A resource is too abstract a notion. The things we're talking about here are services, which can adapt and improve. In his essay, Marlinspike was trying to communicate that you can't federate or decentralize because "the ecosystem is moving", i.e. your centralized competitors are innovating and you have to keep up with them or ideally even exceed them. Mere protection here isn't good enough; it requires active change that may upset some stakeholders. Collectivism fails here because of its totalizing nature: there's one of everything, which is, theoretically at least, communal property. But then you have to please everyone, so the only changes you can make are the ultra-low-risk ones, and because you often don't know the risk, in practice you're forced to simply clone what is observed to work elsewhere. So you end up permanently behind, and with time it gets harder and harder to keep up.
With competition that's less likely to happen, because there's an incentive to take risks and do things that may upset some existing users, if you think it'll please even more people who aren't your users today.
This is a core tension that appears whenever people talk about decentralization. It's the way the Bitcoin community lost the plot as well. Some people interpret it to mean "one universal totalising system which is collectively owned". Other people interpret it as "an interoperable system of many competitors that can innovate and diverge from each other when needed".
What Moxie argues for in that essay is effectively a setup where the service and data are centralized but, importantly, the control/power is kept out of the central authority's hands as much as possible.
I can definitely see an argument there for that centralized model being akin to theoretical communism.
>Still, you have to be realistic. After some years I realized that centralization happens because decentralization is in some sense like communism. If you take away ownership over private property then people lose the incentive to improve it. It becomes a commons and the usual tragedy follows. Centralized services are private property, and so the owners make sure they are well kept and improved.
Counterpoint:
There are plenty of examples of poorly stewarded private property - probably even more so than poorly stewarded communal property. You're thinking about the problem at the level of collective action while completely missing the individualistic level. Care is a finite resource and exists only at the individual level. It is further fundamentally constrained in bandwidth: a person can only exercise care on so many things at a time, and there is a sizable list of more important things to worry about, often constantly being added to throughout an individual's life. Yet we value the individual's ability to manage their own priorities and decide where to allocate that bandwidth, in spite of other work needing to be done.
Centralization vs. decentralization has no major impact on whether work gets done, only on how the work gets done and on the tradeoffs we accept as a result of working in that modality.
In point of fact, the centralization you've observed winning out in the end isn't some inevitable consequence of existence; rather, it's a result of the web having been spawned and matured in a market that first and foremost packages up and organizes the tasks and rewards of labor through centralizing legal fictions in order to coordinate itself.
A society with a hammer, like an individual with one, is prone to deciding that the solution to every problem is the pounding of a nail.
I give away technologies / products which favor / foster decentralization:
* De-anonymize cloud services and make PTR records work again.
* "Track the trackers" email aliasing; since none of the big providers do this well, you'll need to run your own Postfix mailserver to take advantage of it. Maybe you should set up DANE while you're at it?
* A decentralized telemetry / SIEM-like platform using the DNS (a rough sketch of the query-encoding idea follows this list).
* "Internet in a box" on your laptop: a wifi access point which doesn't have broader internet connectivity and which takes over all DNS (as well as DHCP) and allows you to run apps on that hotspot that are only accessible via that hotspot. Beyond the utility for remote / mobile de facto geofenced applications, imagine a world where "routers" were nodes with multiple wifi radios and which joined multiple "boxes".
If the three phases of consensus are kook / lone gunman, co-conspirators, movement, (...consensus) I'm past the kook phase and into conspiracy. Happy to get on a video call and give anyone an hour of help with any of the above.
One thing the RFC (and frankly nobody, really) doesn't cover is that centralization is overwhelmingly good for "free", but that has two sides. Which two sides do you think I'm going to point out?
There are absolutely awesome proprietary offerings (distributed doesn't necessarily mean open source or free) for doing no-code responsive mobile development by an "army of one" or a savvy user, for 1000 users or fewer (designed for scenarios where a UI needs to be built for a single person or maybe a half dozen at most). Centralization favors the Procrustean (I still like that word, thank you commenter for gifting it to me): if the user doesn't fit, kill it trying, and if it dies another one will take its place.
I'm not sure I need to see Coca-Cola ads intended for an Argentinian or Ukrainian audience. Just let that sink in. I saw a pro-centralization argument the other day that $grandma in a $third_world_country can't view her grandson's football league games without the wonders of centralization (conveniently conflated with not regulating those centralized entities' traffic across international borders)... but this chauvinistically fails to grasp that it requires an international data plan (oftentimes lacking on phones in third-world countries) and that grandma has to register with some global nexus of surveillance capitalism that otherwise has little relevance to her life in order to do so.
"But how can an 'international data plan' be a thing, since the whole point is that the Internet doesn't have borders?" (some rando I mumbled to about this)
That's because nation states aren't the only gatekeepers. Those centralized services are also gatekeepers, and not only the services but also the people who put the brick in your hands.
A lot of free or low cost phones in places where people don't have much income are essentially tethered to some AOL-like service, like Facebook. Really, your internet connection is to (or optimized for) Facebook: it's e.g. a "Facebook phone". Anything that's not "Facebook" counts against your data plan.