I've used Red Hat Linux (first Caldera Linux, a 90s-era clone), then Fedora, as well as RHEL and its clones (Scientific, CentOS, Rocky) since the late 90s. I've used it personally, pedagogically, and professionally. I recommended it to others for their desktops and servers. If you wanted to use Linux, I'd tell you there are two options that are stable and well-supported: Red Hat and Debian. Red Hat has died recently. RIP.
I'm no longer confident that Fedora will continue to be a thriving open-source project. If IBM is killing RHEL clones now, why risk Fedora being next?
I hear you saying "RHEL is derived from Fedora" and "Fedora is a separate open-source project". The thing is, RHEL is derived from CentOS Stream *now*. Why does Fedora need to exist? Or rather, why does Red Hat need to provide engineering and infra to Fedora? It's true that Fedora is a separate project, but most of its contributions come from Red Hat employees. If IBM decides tomorrow to stop paying for Fedora because CentOS Stream is enough, can Fedora stand alone?
Maybe this is just FUD, but I no longer have confidence in an IBM-owned Linux distro. I've already moved to Debian for personal machines, and I've been advocating for it professionally as well. It's just irresponsible for an org that spends significant engineering time customizing its Linux installations to build on an uncertain Red Hat/IBM-backed distro over a fully open-source project with no strings attached. Fool me once, shame on you; fool me twice and, well, IBM isn't going to get me again.
> each release of CentOS Stream is a frozen Fedora release
This is the current state. Why should IBM maintain rolling updates into Fedora ("Rawhide"), then cut a Fedora release, then bless a Fedora release as CentOS Stream, then backport from Rawhide to CentOS Stream as needed until they decide to cut a RHEL branch? If I were an exec at IBM, that would seem like a lot of extra releng salary that could be better spent directly on the core product (which is RHEL, which comes from CentOS Stream) or on billable hours. Someone's bonus can go up a few bills by eliminating a line item while improving efficiency.
Why should IBM continue paying for Rawhide -> Fedora release -> CentOS Stream -> RHEL? How long until the Fedora release step is put out to pasture and eventually culled?
You're basically suggesting that doing a complete step change every 3 years might be considered "more efficient", but literally nobody would consider that more efficient; it would just add massively more risk to the whole process, which would be paid for later in delays. That is agile vs. waterfall 101, and even IBM is perfectly capable of understanding that.
That is to say nothing of the fact that Fedora is used very extensively internally, that there are hardware partners like Lenovo that sell it preinstalled, and that Fedora Asahi is the new home of Apple hardware enablement (and thus the best AArch64 development platform). The Fedora ecosystem is extremely valuable to Red Hat.
IBM can't "kill" Fedora. All the specs files, the assembly mechanisms for the distribution is available online. The community can (and has) forked fedora and keep it going under a new name, if they wanted to.
But this is all backwards.
> Why should IBM maintain rolling updates into Fedora
The better question is: why is IBM maintaining rolling updates into Fedora? Because they have a business incentive to do so. As other comments have opined, they get free testing and free feedback on upcoming changes. The community gets a lot in return. Red Hat funds so much open source/free(dom) software development it's ludicrous. There's reciprocity here. There are a lot of other distributions also benefiting from Red Hat's work, like Debian (and vice versa).
It seems like the community here on HN has gone completely bonkers over this Red Hat business decision. We've reached the next level of open source entitlement syndrome.
We are not entitled to use RHEL for free. We are not entitled to repackage RHEL and resell it either. We are only entitled to those parts of the source code which are covered by copyleft licenses. Are we entitled to the SRPMs for those programs? We don't know. The patches applied, sure. How about the spec files used to assemble the RPMs? Who knows. But it seems like everybody is demanding that Red Hat keep doing this work and ensure that third parties can keep their business model of reselling said work.
This is probably not a popular opinion, but the community seems to have a deranged take on this situation.
> We are only entitled to those parts of the source code which are covered by copyleft licenses.
No, we're not. Even for code covered by copyleft licenses, we are only entitled to the source for which we have received a binary. However, once we have received those sources, we should be free to do with them whatever the license allows.
But that last part is what Red Hat is violating, and that's why some people have a deranged take, thank you very much: they use their sales contracts to specifically deny freedoms granted by copyleft licenses.
If you as a Red Hat customer redistribute the source code, Red Hat won't sue you. You didn't do anything illegal, you are doing what you are allowed to do.
Nothing in the GPL entitles you to future support from Red Hat, source code of future versions or anything of the sorts.
You're free to do whatever you want with the sources. Red Hat is free to stop business with you at any time. There's nothing contradictory here.
No, this is the conflation everybody is making. You are free to do with the received source as you like, maintaining your rights granted by the GPL.
But the GPL does not compel Red Hat to keep you as a customer. There are two separate legal instruments: the license for the source and the terms for RHEL. Red Hat is free to choose its customers. We can't force a company to take us on as a customer.
> They're getting free community testing of early software in Fedora.
I don't think you realize just how much RH focuses on releasing stable software. They have SLAs with government agencies and healthcare providers, and they need to live up to them. This is why CentOS Stream is in fact more secure than RHEL: they spend so much time testing the patches that go into RHEL that CentOS Stream now gets them before RHEL does. A reversal of the state of affairs with the old CentOS.
It is also why they were so keen on killing any project that claims to be a RHEL clone, because they want their customers to know what they're downloading, and know what they're running. RHEL comes with guarantees.
Of course I'm the first to admit that most of their motivations to kill the CentOS source repo were financial. That doesn't change the fact that Red Hat is a very enterprise-minded company, and their perspective isn't always obvious to the rest of us.
And they can do that work internally, privately, in the dark, where they cannot benefit from any community engagement, or they can do that work in the open, engaging with the community.
I suspect the difference in cost between the two is not significant enough to be relevant, and the open variant has more benefits.
Note that this was RH's stated intent when the whole CentOS non-Stream EoL shenanigans went down - they said that Fedora & CentOS Stream benefits RHEL, CentOS non-Stream does not.
[After writing this, I was curious about Fedora project costs; they are lower than I thought: https://budget.fedoraproject.org/budget/FY20/overall.html - Red Hat's contribution was ~$196k for FY2020 in cash; there are probably more 'soft' costs in terms of staff and hardware time/resources.]
I share this sentiment, even though I joined the Linux club much later and began the journey with Ubuntu. I have arrived at Debian for similar reasons though. It just seems the purest distro while also being very popular and well supported.
For anyone who doesn't follow the Linux kernel closely, Maddog is an absolute legend.
His perspective is very interesting and appreciated; I especially love how he frames things historically. Knowing the history really helps put things in context. TFA is long, but worth reading.
Jon Hall was essentially Linus Torvalds' agent in the 90s. While at DEC, he got Linus to the right places to meet the right people, so that his "hobby, nothing big or professional" OS entered the trajectory to become what we know today.
I'm about 30 years younger than Jon Hall, so I couldn't be familiar with his accomplishments other than through oral and written accounts. Since he hasn't written big hit books I could read or software I could use (alright, alright, Linux for Dummies), I constantly saw people calling him a legend but never understood why. I finally asked around in the Linux kernel community and had the extent of his contribution explained to me: he was Linus' mentor, in a way. When they met in 1994, Linus was a 25-year-old student and Jon a 44-year-old DEC marketing manager. I like to think of their conversation as something like "Listen to me kid, this is what you're gonna do".
With this in mind, a line in Jon Hall's Wikipedia bio stands out: "It was during his time with Digital that he initially became interested in Linux and was instrumental in obtaining equipment and resources for Linus Torvalds to accomplish his first port, to Digital's Alpha platform." Another one in his LinkedIn work history reflects this view: "Senior Marketing Manager, DEC, 1983-1998: In 1994 met Linus Torvalds, recognized commercial value of Linux, obtained funding for port of Linux to 64-bit Alpha processor, opening up a billion dollar line of Linux-based High Performance Computing Super Computers."
While on an Alpha team at DEC, I was in a meeting when maddog walked in, and I fanboyed a bit - he was the first famous "computer person" I'd ever met. My boss was with me, and I had to explain to him who maddog was. I've also seen him on the highway with his New Hampshire "Live Free or Die" UNIX plate.
You are correct - I have the same book with the CD of RHL still stuck to the back. I don't think I ever installed it, but I did install Mandrake Linux and used the Dummies book to learn it. Why such an odd distro like Mandrake? I literally found it on the shelf in a Best Buy. I was already a heavy Unix user at work (we all had dumb terminals running Solaris) and I really wanted that experience on my personal system; thus started my life-long love for Linux.
My first desktop Linux was Mandrake too. Reason? It had KDE. It looked like Windows. I've recently come back to KDE by way of openSUSE. Both decisions surprise even me. I never thought I'd use an RPM-based distro again. (I am using Tumbleweed. Rolling updates + snapper snapshots on btrfs is what sold it to me. I would prefer ZFS, but I've had no issues with btrfs in two years.)
Hey, I also first installed Mandrake. I got it on a magazine cover disc. I always thought it was a common distro, kinda like Ubuntu some years later when they started shipping discs.
I’ve always admired Maddog and I was lucky enough to meet him when I was in my early twenties at a Linux conference in Birmingham, UK. I bought him a coffee and even though I earned a pittance and couldn’t afford my own coffee at the time, I felt it was a badge of honour and afforded me 5 minutes of chat with this giant of computing.
> However I will say that there are many people who use these clones and do not give back to the community in any way, shape or form who I consider to be “freeloaders”
Yes, however that's always been something you just kind of accepted with free software. That's part of what giving something away for free entails: not everyone is going to give something, or anything at all, back.
With the context of the previous and subsequent sentences he's saying that "freeloaders" are a subset of those that don't contribute: "the people who sign a business agreement with IBM/Red Hat and then do not want to live up to that agreement".
Yeah, if you read later, he mentions that something as simple as telling your company “hey, maybe we should try using open source” is sufficient to be giving back to the community in his opinion.
“Freeloader” here really means someone like the IT guy that occasionally spins up a microservice in a Linux Docker container because that’s what he saw suggested on Reddit, but is otherwise proudly promoting his Microsoft certs on LinkedIn.
No, that's not enough to count as reneging on the business contract. It's someone who installs 10 RHEL instances that generate an insane number of tickets, because the issues with the other 10,000 CentOS systems are redirected to those 10 RHELs.
That’s what everyone does with FreeBSD. Most network gear is some stripped-down BSD variant.
The difference with, say, Oracle Awesome Linux or whatever is that they are selling RH’s value-add work and ecosystem as a service. As in, they literally come in with “Buy this stuff to be supported with our other stuff, we ship whatever RH ships.”
IMO, Red Hat’s changes are really making it more expensive to create exclusive “Oracle Red Hat for X” or similar business models. The people who can’t afford RHEL for their academic HPC still can’t, and just pick something else. People here quacking about Fedora are just quacking - Fedora is a project with a commercial purpose that serves the company’s needs. It will exist until it doesn’t.
RH isn’t Microsoft - everyone acknowledges that lots of distros exist and many are excellent. You can start a distribution today. But “fine, I’ll just use Debian” isn’t really a valid answer either. Debian is an awesome project and product, but if you chose Red Hat in the first place for reasons other than “I got a CD with a magazine in 1998”, Debian probably isn’t ideal without you doing a lot of work. Then it’s a money discussion. (Cost of humans doing stuff vs. paying RH to do stuff and humans to do other stuff.)
> not everyone is going to give something, or anything at all, back.
I would extend that to almost no one.
99% of people won't give anything back. 0.9% will file complaints or bug reports or donate money or whatever. 0.1% will actually give back any functional code.
It's the same with absolutely everything. How many people give money to charities, and how many help them by volunteering? How many people actually take the time to get involved in political parties, march in the streets, or hand out leaflets? Etc.
Your comment reminded me of something my economics professor taught us: he told us how small things can add up and gave the example of giving to charities. So when I started working I started giving a couple percent of my income to charities. Now, over the years, it really has added up.
This all reminds me of the time I was donating a server+hosting to the Fedora project. I was setting up the server and asked my contact "I presume you'd like to be running RHEL on it?" "Yep." "Do you have a license key I can use?" "I can get one but it's annoying, just install CentOS."
My former company sent 10 of us to a Red Hat kernel internals class to get us prepped for moving away from HP-UX/Solaris.
The instructor advised using CentOS to play around with after the class (before CentOS was bought). We ended up using Red Hat on one of the next projects.
Red Hat used to be a lot more lenient with the clones, just like Microsoft didn't care whether you used a genuine copy of Windows/Office or not. They probably assumed that once you became accustomed to CentOS, you would naturally buy RHEL when an enterprise project came along.
Maybe that's still a reasonable assumption to make. Maybe it isn't. Maybe it was in the past but isn't anymore due to the proliferation of clouds, serverless, saas, whatever.
In my recent use, the Red Hat subscription management experience is night and day compared to even a few years ago. It's seamless and very much out of the way for most cases because of Simple Content Access[1]. And producing custom RHEL images for deployments in the cloud (or otherwise) is now supported by Red Hat through their Image Builder service[2] (which you can deploy locally on a RHEL box too[3][4]).
In general, there's been a huge revamp of the user experience for RHEL and associated services. And it's been for the better. The experience on console.redhat.com is stellar and worth it alone, in my view.
I have 0 love for what Red Hat is doing, but one thing I have noticed: I have developed enterprise software for Linux for the last two decades, and the last time I saw one of these (huge) companies having an actual RHEL license was with RHEL 6. It's been CentOS and clones since then; for a while I think people even _meant_ CentOS when they said RHEL.
Same story with Qt, where they'll stay with Qt 4.x to avoid paying Digia even though practically all their software depends on it.
A RHEL subscription is a license in all but name. “A subscription key you add to your account, which you then use to log into the subscription management service on the deployed server to make use of the privileges the subscription grants” is effectively one or two steps abstracted away from “enter this key directly on the system to access the privileges of the fully licensed software”.
As a Red Hatter, I appreciate maddog’s view. I wish others had his wide perspective of what companies do for open source. Seeing how the sausage is made changes your perspective. And I wish we could give folks a lesson on what we actually sell because it’s not software or software licenses. We’re not skirting around open source licenses or even changing our licenses. We’re trying to keep our folks doing valued work for our customers. And so much of what we do makes it to the upstream projects that are shared with competitors, communities, and users.
It's great to hear such a great historical account from maddog. I was working at the LTC when maddog made one of his trips to Austin that he talked about in the article, so I had the pleasure of chatting with him over lunch. This was somewhere around 20 years ago.
It's always fascinating to hear what a septuagenarian considers to be the "inflection points" of their career. It's especially striking how my own inflection points interweave with theirs. I was right in the middle of the SCO bullshit, but I'm not sure I'm allowed to talk about some of it. Maybe enough corporate lawyers have retired and/or died by now that I can finally say what I want anyway. If I can find the motivation.
I'm glad maddog is still sharing his story, as I find as I get older that I care less and less about sharing anything with anyone on the Internet. Any conversation of any significance to me tends to happen face-to-face these days, so it hardly seems worth it to share or engage with anyone I don't know personally.
I work for a company that tried to put RHEL on the high performance server equipment we build, and we were quoted about 1 million dollars per license, non-negotiable. That would have made our product far too expensive. So we ended up going with CentOS, and then Rocky, and hired a team of about 30 people to manage it (and some other projects). I appreciate Red Hat and their right to manage software as they see fit, but given my experience over the past several years, it feels like they are just trying to squeeze the brand for as much as they can get. I’m curious to watch what will happen to it over the next five years.
I wasn't practicing law back then, but my secondhand understanding is that while it wasn't clear that copyright would apply to software, or how, savvy players largely expected some kind of protection for written software beyond trade secrecy.
There were all kinds of questions, theories, and proposals about whether that would happen under copyright law or perhaps through some software-specific regime. The US answer was clear when "computer program" was written into the scoping definitions of the Copyright Act. We still cite back to the commission that pushed that recommendation, CONTU, when debating loose ends.
CONTU wasn't established until 1974, and the US Copyright Act wasn't amended to explicitly affirm the copyrightability of computer programs until 1980 – but the US Copyright Office was already accepting copyright registrations of computer programs by 1972 [0]. While the Copyright Office's interpretation of copyright law is not always affirmed by the Courts or Congress, more often than not it is – so, it would not have been an unreasonable assumption in 1972 that the answer to "Are computer programs copyrightable under US law?" was likely "Yes". I don't know whether the same answer would have been true 3 years earlier or not, but quite possibly it would have been – it is an interesting historical question, when was the first copyright registration for a computer program which they accepted?
I also found a journal article [1] which says (p. 1748, my emphasis):
> There was considerable debate in the 1960s, during the gestation of the legislation that became the Copyright Act of 1976, about whether computer programs could, or should, be protected by copyright law. Although no one seriously questioned that source code forms of programs could be copyrighted as written texts, there were two principal concerns about applying copyright to machine-executable forms of programs...
So, according to that, the debate was primarily about whether object code was copyrightable, as opposed to source code. At the time, distributing software as source code was extremely common – indeed, configuration files were rare, configuration was commonly hardcoded in the source code, making compilation a necessary part of the installation process – which meant that source code being much more clearly copyrightable than object code would have been less of an obstacle to commercial software distribution than it would have been in later decades, when object code only distribution became much more common.
[0] Catalog of Copyright Entries. Third Series: 1972: Title Index. Books: July-Dec. page 3926 which lists "CILA Mark-1 system (casualty insurance logistics automated) source program listing. NETWORK DATA PROCESSING CORP" – https://books.google.com/books?id=4kAhAQAAIAAJ&pg=RA1-PA6 – note there are many other references to "computer programs" in that index, but it is sometimes unclear whether they are manuals or source code; this particular entry is rather clearly source code.
> While the Copyright Office's interpretation of copyright law is not always affirmed by the Courts or Congress, more often than not it is
I don't think I've seen that asserted before. I certainly wouldn't bank on it these days.
As for timing, there were companies speculating well before '72 that copyright would be the game. Archival work found a copyright-based license agreement from IBM from as early as 1969. See https://www.create.ac.uk/blog/2018/11/14/the-first-software-....
That's not the same as saying the question was settled. After the Copyright Act amendment, it sure was.
> I don't think I've seen that asserted before. I certainly wouldn't bank on it these days.
Let me ask the question historically: in previous decades, how often have the legal interpretations of the Copyright Office ended up being affirmed by the Courts and/or Congress, versus how often have they been overturned by them?
Also, the Courts owe a certain degree of deference to regulatory agency statutory interpretations, as established by the Supreme Court cases Skidmore v. Swift & Co. (1943) and Chevron U.S.A., Inc. v. Natural Resources Defense Council, Inc. (1984)–the second of which came after the time period we are discussing, but the first came before it. There is no obvious reason why that deference would not also apply to the US Copyright Office's interpretations of copyright law, and indeed there is case law applying those decisions to it.
> That's not the same as saying the question was settled.
The journal article I cited says that the question was settled all along for source code, and the legal doubts were only about object code.
The assertion about the Copyright Office's batting average was yours. If you want to make the assertion, the research question's also yours!
As I recall, Skidmore held that what agencies say laws mean gets only the deference it deserves. In other words, the courts will reconsider for themselves how persuasive their arguments are.
Chevron starts with the question of whether the administrative agency's decision was made in a way that a statute gives the force of law. The Copyright Act gives the Copyright Office that power in administering some processes, like copyright registration. But last I checked, which was well after Chevron, questions about whether an application followed the registration process got deference, but the more basic question of whether something's copyrightable in the first place remained with the courts. Just because it's called the "Copyright Office" doesn't mean the courts will defer to it about the whole Copyright Act.
This difference could very well matter for some ongoing issues, like the Copyright Office's recent rejection of some artwork created with the help of generative AI. I would be very, very surprised to see appeals courts handing that legal question over to the Copyright Office.
> The assertion about the Copyright Office's batting average was yours. If you want to make the assertion, the research question's also yours!
I can't claim to have researched it formally, but my impression from reading a number of cases on the topic is that most of the time, the Courts end up agreeing with the Copyright Office. And more often than not, when they disagree with it, the disagreement gets reversed. For example, in National Broadcasting Co. v. Satellite Broadcast Networks, Inc (940 F.2d 1467 (11th Cir. 1991)), the 11th Circuit concluded that satellite broadcasters were cable systems. After oral argument, but before the decision was handed down, the Copyright Office issued a rule that they were not. The Court decided that the Office's rule was not retroactive, and hence did not apply to the case; they expressed doubts about whether they owed it deference, but avoided deciding that. Subsequently, in Satellite Broadcasting & Communications Ass'n of America v. Oman (17 F.3d 344 (11th Cir. 1994)), a different panel decided the Copyright Office rule was owed Chevron deference, and reversed a District Court decision applying that 1991 decision.
> As I recall, Skidmore held that what agencies say laws mean gets only the deference it deserves. In other words, the courts will reconsider for themselves how persuasive their arguments are.
You make Skidmore sound weaker than it actually is – in Skidmore, SCOTUS reversed the District Court and the 5th Circuit for failing to give sufficient deference to the statutory interpretations issued by the Department of Labor. Skidmore instructs Courts to evaluate an agency's "rulings, interpretations, and opinions", in light of "the thoroughness evident in its consideration, the validity of its reasoning, its consistency with earlier and later pronouncements, and all those factors which give it power to persuade". In other words, if a Court wants to reject an agency's interpretation, it has to present a persuasive argument that it is flawed on one of those grounds, or else the rejection has significant odds of being overturned on appeal.
> But last I checked, which was well after Chevron, questions about whether an application followed the registration process got deference, but the more basic question of whether something's copyrightable in the first place remained with the courts
In Varsity Brands, Inc. v. Star Athletica, LLC (799 F.3d 468 (6th Cir. 2015)), the Court of Appeals ruled that copyright registration decisions were owed Skidmore deference, and overturned the District Court for failing to extend that deference – in a dispute about copyrightability. It said that (in the 6th Circuit at least) when the Copyright Office judges a work to be copyrightable, as expressed through its decision to register the work, there is a rebuttable presumption that the Copyright Office's judgement is correct. That decision was upheld by the Supreme Court on appeal, but all three of majority, concurrence and dissent dodged the issue of deference entirely.
Star Athletica's holding turned on just the kind of "force of law" question I mentioned. The answer there was "no", so the relevant rule was Skidmore, not Chevron. Of course Skidmore's still law. But what does it say?
Looking back at the Athletica opinion, the 6th followed the Supreme Court in referring to Skidmore as "the power to persuade", as distinct from "the power to control". There's an abstract rubric for courts to use in assessing agency interpretations that don't have the force of law. But the door is very much left open for courts to adopt interpretations they find more thorough, better reasoned, more consistent, &c. You could squint and see an outline of how appeals courts review all decisions there.
With "presumption", I think you may be confusing terms. There's a statutory presumption you get in your favor once you successfully register copyright with the Copyright Office. That's a presumption for a party challenging the validity of a copyright to overcome---say, a defendant in a copyright infringement suit. Judges and courts don't bear "burdens" under "presumptions". They follow the rules that put them on litigants, or review the decisions of lower courts that should have.
Consider the opposite case where the Copyright Office refuses registration because it says the subject matter's not copyrightable. Perhaps because the artist created the artwork by prompting Midjourney. There's an appeal process for refusal to register within the Copyright Office, under its regulations. If you appeal twice and lose twice there, that's "final agency action" courts can look at.
If the issue ends up in court, the statutory presumption of 410(c), by its terms, doesn't apply. No issued registration, no presumption of validity. But there is still Skidmore. In the Sixth Circuit that's clear now. The court couldn't ignore the Copyright Office's reasons. But if it weren't persuaded, it could rule otherwise. It would have to read the Copyright Office and grapple with it, but not agree with it. Especially if it heard a better argument in briefing.
> With "presumption", I think you may be confusing terms. There's a statutory presumption you get in your favor once you successfully register copyright with the Copyright Office. That's a presumption for a party challenging the validity of a copyright to overcome---say, a defendant in a copyright infringement suit.
There are two different rebuttable presumptions here – one is the statutory rebuttable presumption under the Copyright Act; the other is that Skidmore deference is itself a rebuttable presumption. They are distinct, but in copyright cases, is there a clearcut boundary between them? If you read the 6th Circuit's decision, you will find that they deal with both the statutory presumption issue and the Skidmore/Chevron deference issue in the same section, and treat them as closely related as opposed to clearly separable. Maybe, if you think I'm confusing the two, you might think the Sixth Circuit panel was too?
> Judges and courts don't bear "burdens" under "presumptions".
I think we are using "burden" here in different senses. You are using it in a narrow, technical legal sense, and I agree with you that in that sense, the parties bear "burdens", not the Court.
However, in a broader sense of the term "burden" – in the sense of (informal) logic, philosophy, discourse analysis, etc – lower courts do bear a persuasive burden, of convincing the appellate courts to uphold rather than overturn their decisions, and rebuttable presumptions can work to shape, even increase, that burden.
> The court couldn't ignore the Copyright Office's reasons. But if it weren't persuaded, it could rule otherwise. It would have to read the Copyright Office and grapple with it, but not agree with it. Especially if it heard a better argument in briefing.
The District Court erred by simply engaging in a cursory dismissal of the Copyright Office's interpretive position, as opposed to engaging in a detailed analysis of that position against the Skidmore factors. If it had done that, the 6th Circuit would have found it harder to reverse the District Court's decision, even if it had ultimately arrived at the same result. Of course, if the 6th Circuit really wanted to overturn the decision, it could have (especially given de novo review)–but the District Court could have made the 6th Circuit's work cut out for it, as opposed to giving it easy grounds for a reversal.
You could say the lower court both failed to meet its persuasive burden with respect to the appellate court, and simultaneously failed to impose a greater persuasive burden on the appellate court (in reversing) than it could have. But for both, these are added persuasive burdens which only exist because the rebuttable presumption of Skidmore created them.
It was very unclear whether software source code could be copyrighted until the CONTU recommendations were written into law in 1980. This is a major reason why IBM established software licensing for some of the System/360 software.
> Strange you don't believe Jon Hall, but willing to believe some random stranger on social media.
I'm not aware that Jon Hall would claim any particular expertise or interest in legal history, whereas there may well be "some random strangers on social media" who do.
Also, this is a matter which can be established by citing sources which Jon Hall might be unfamiliar with (for the simple reason that the topic might never have interested him sufficiently for him to research it.)
I met maddog circa 2000 in Paris for an event I had co-organised. We asked him a lot of questions, and got some insightful answers (sometimes wrapped in long stories, like in the linked article).
The one I remember most: "Q: how can we help Linux gain market share on the desktop, beyond the few percentage points it already has?" "A: that won't work. Find a market that doesn't currently use computers, create a product that can address the needs of this market, and make sure that it uses Linux. For instance, think about smartphones."
Nice to see maddog is still actively involved in FOSS.
At one COMDEX in Chicago, I think it was the late 90s, he talked me and a friend into manning his booth for a couple hours so he could go check out the show. I don't remember if it was LPI, LI, or some other Linux related thing. We just handed out distro CDs and told people about Linux IIRC.
It was a crazy time when Linux was taking the PC world by storm, Tux was everywhere.
I forget where it was that I first heard maddog speak. It was in the Linux dark ages though: Solaris, AIX, HP-UX, etc. were still grinding along, and then there was this upstart Linux. He was talking about how the GNU tools were native on Linux, whereas the first thing all the admins did on those other platforms was build them, because UNIX is like playing a piano, but the GNU tools turned it into a fine piano: the keys just felt better, and the music was better because of it. I have always loved that analogy.
It's all true AIUI. I had some exposure to Sun boxes early in my Linux career and the stakeholders of those machines always installed the GNU userspace to make them more ergonomic to use and just get everything consistent.
I think it can't be overstated how significant GNU was in making Linux successful. Everyone basically already knew how to use its userspace because GNU had been around for so long and people were already deploying it on the proprietary unices.
It's a bummer to see what's been done to rms/fsf since... practically tarred and feathered.
The free software community, at long last, finally have decided humans are more important than bits and have excised people from their midst who treated bits as more important than humans. This was the right thing to do. I should know. (Sigh.)
When you're paying for RHEL you're not paying for the software that you are installing. All of that can be downloaded.
You are paying for:
- A reproducible target. You know EXACTLY what code you are running. If you manage more than three installations this is the only way you can diagnose and fix whatever issues your installation has.
- Support. The very few times I used RHEL support I always got timely and thorough assistance. Even when chasing a hardware bug or issues with third party device drivers.
- Backwards and FORWARDS compatibility. Red Hat systematically backports kernel bug fixes and support for new hardware to old kernels. We ran 2.6 kernels on Intel hardware released long after the 2.6 series was EOL.
- Device drivers. No, not for your five dollar mouse, but for hardware that costs the same as a small SUV.
If you're avoiding RHEL due to cost, have a look at their SKU list and talk to your local sales org; they have a wide range of options.
(Not affiliated with Red Hat or IBM, but RHCE since 2004)
Dude deliberately ignores the whole category of RHEL users -- integrators. Say, an integrator needs to make sure some program works with RHEL, and, let's say that program works with big clusters of computers (well, think something like Ceph, or maybe Slurm etc.) Now, the integrators will need hundreds of machines running RHEL... and if they have to pay license fees for that, well, that ain't gonna happen. If cornered, integrators will just go away and not use RHEL at all, and then the value of RHEL will also drop.
Of course, any software developer targeting RHEL is also kind of an integrator, and if their company also needs to buy hundreds or thousands of licenses, well, the same thing applies.
This gets even weirder and worse if you think about using public cloud providers, who act as RHEL redistributors -- but you need to somehow have your own VM images... with RHEL. (I work for a company in this exact situation r.n.) Are we "freeloaders"? Because, I guess, if we are, then we might stop, but I don't see how that would benefit RHEL, because we'd have to drop support for RHEL instead of paying so much in license fees.
That's not how we partner with Integrators, VARs, ISVs and other Partners. We have several programs to assist our Partners in making their development environments, production systems or SaaS platforms cost effective and sustainable. Even individuals, hobbyists, OSS developers have options.
Integrators and partners usually pay to be part of the game; that is one reason why the whole certification industry exists, and why a company must have a certain number of certified people to be part of the game.
Well, the company I work for today has some sort of an "exclusive" contract that allows us a different use pattern than individual/corporate "end users" have. But this is because we "wanted" to be "partners" (I don't know the legal details), i.e. we made some effort to acquire this special status, and I don't know anything about the financial workings that were involved in becoming this kind of "special friend".
On the other hand, I worked for another company before, which was making a distributed storage product that targeted, besides other things, RHEL (well, we really only used CentOS) because that was perceived as potentially the most common customer profile. None of us wanted CentOS for our development needs, nor were we using CentOS for our own infrastructure, etc. So we paid nothing to Red Hat. I don't know how much this "special friendship" would cost if we had to buy any kind of a license to simply target RHEL (but we'd need a lot of machines running it -- during my time there we had ~20 ESX hosts for testing, so... idk, that'd be what... a couple thousand VMs? -- something like that).
> For years Unix programmers measured their “age” in the Unix community by the generation of John’s book which they owned. I am proud to say that I have a third generation of the photocopies.
...
> However, some of these FOSS people condone software piracy and turn a blind eye to it.
> I am not one of those people.
Mr. Hall, as a self-admitted proud pirate, what right do you have to condemn anyone else? I myself do not condone piracy either, of course. Piracy is vile and pirates should get many decades in prison. But to say that you are proud of being a pirate and a few paragraphs later condemn the practice... that is the height of hypocrisy!
dd-wrt (alternative wifi router firmware) was one of the first to use such a development model. It did piss off a lot of open source developers -- including me at the time.
> In fact I had a professor who taught programming that told me I would NEVER be able to earn a living as a “professional programmer”. If you wrote code in those days you were a physicist, or a chemist, or an electrical engineer, or a university professor and you needed the code to do your work or for research.
[ ... ]
> I remember negotiating a contract for an efficient COBOL compiler in 1975 where the license fee was 100,000 USD for one copy of the compiler*
LOL, probably five times that blinkered professor's salary.
Jon supported me directly at a startup in the late 1980s when Ultrix was new. He was generous with his time and direct with his answers. I'm quite grateful all these years later.
From the comments in that thread, one can find the other side¹ of the story:
> The first release of GNU Emacs (numbered 15.34) was in 1985, and still incorporated some of Gosling's display code. The dispute continued, with Unipress announcing that it wanted to "inform the community that portions of the GNU Emacs program are most definitely not public domain, and that use and/or distribution of the GNU Emacs program is not necessarily proper." This was countered by Fen Labalme and others who claimed that Gosling had included their code in the sale to Unipress.
> Stallman solved the problem in characteristic fashion by announcing:
> > I have decided to replace the Gosling code in GNU Emacs, even though I still believe Fen and I have permission to distribute that code
Wikipedia gives the first release of GNU Emacs as March 1985. In that version of the story, at least, all of Gosling's code was gone by August.
A way out of this crisis could be an Enterprise Linux Standard: a document describing in detail a standard installation, together with a vast set of conformance tests.
It would be a supremely boring project, but if interested orgs such as SUSE, Oracle, IBM, etc. support it, it might be feasible.
It could start from a description of RHEL, but ideally it would contain some extension of the file system standard to allow compatibility with Debian.
Then you would have a reference platform that multiple distributions could provide, and there would be competition, not just one major provider of a system `as is' and some more or less incompatible others.
Yes, it would be a continuation of POSIX. But POSIX covers only a small part of a Linux distribution.
There was the LSB, which was a bit more extensive. Then there was the file system standard, and perhaps others that I don't know about. But a standard covering what one should expect from a modern 'base system', in full detail, updated every 5 or 10 years, does not exist. There is a de facto `standard', RHEL, but it is a standard by fiat. A proper standard requires documentation (the `standard') that defines what is essential to be found, _and_ an implementation -- or, ideally, more than one -- _and_ a means to check that a given implementation matches that documentation.
If you have that documentation, it works both in space (alternative implementations) and in time. In the future, you can be sure that a new implementation, perhaps of a future version of the standard, is still exactly compatible with the previous version.
This has been done for languages (C, Ada, Lisp, Fortran, etc.), for libraries and kernels (that is POSIX), and for processor ISAs, but not for Linux distributions AFAIK.
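To make the idea of conformance tests concrete, here is a minimal sketch in Python of what one automated check against such a standard might look like. The required paths, commands, and glibc floor below are made-up placeholders, not taken from any real standard; a real suite would be generated from the standard document itself and would also cover ABIs, packaging behaviour, and service management, not just file presence.

    #!/usr/bin/env python3
    """Minimal sketch of a conformance check for a hypothetical
    "Enterprise Linux Standard". The paths, commands and version
    floor are illustrative assumptions, not from any real standard."""
    import os
    import shutil
    import subprocess
    import sys

    REQUIRED_PATHS = ["/etc/os-release", "/usr/bin", "/var/log"]  # assumed
    REQUIRED_COMMANDS = ["sh", "rpm", "systemctl"]                # assumed
    MIN_GLIBC = (2, 28)                                           # assumed baseline

    def glibc_version():
        # On glibc systems, `ldd --version` prints the version at the
        # end of its first output line, e.g. "ldd (GNU libc) 2.34".
        out = subprocess.run(["ldd", "--version"],
                             capture_output=True, text=True).stdout
        major, minor = out.splitlines()[0].split()[-1].split(".")[:2]
        return int(major), int(minor)

    def main():
        failures = []
        for path in REQUIRED_PATHS:
            if not os.path.exists(path):
                failures.append("missing path: " + path)
        for cmd in REQUIRED_COMMANDS:
            if shutil.which(cmd) is None:
                failures.append("missing command: " + cmd)
        if glibc_version() < MIN_GLIBC:
            failures.append("glibc older than assumed baseline %d.%d" % MIN_GLIBC)
        for failure in failures:
            print("FAIL:", failure)
        print("PASS" if not failures else "%d check(s) failed" % len(failures))
        sys.exit(1 if failures else 0)

    if __name__ == "__main__":
        main()

The value of such a document is that a check like this can be run unchanged against any distribution claiming conformance, which is exactly the "in space and in time" property described above.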
1. Younglings will gain by reading Maddog's article because it contains a lot of insight that was not previously available online.
2. Maddog did miss a major point (of which I'm certain he is aware): THE ENTIRE COMPUTER INDUSTRY WAS PUSHED TO ADOPT UNIX/POSIX APIS BY THE FEDERAL GOVERNMENT AS PART OF ITS EFFORTS TO PROTECT ITS SOFTWARE INVESTMENTS FROM HARDWARE VENDOR LOCK-IN. Even though POSIX could be added to any platform, it was really crummy on some highly structured systems like VMS, leading all major hardware manufacturers to begin supporting UNIX variants. POSIX was so "successful" that the industry came together (with CDE, X/Open, etc.) to attempt to further expand the functionality of such applications to be more competitive with what Windows NT-based apps could offer. Heck, Windows NT, which also had to awkwardly adopt POSIX APIs (alongside Microsoft's Xenix UNIX variant), even adopted one or two of the optional X/Open APIs.
3. WHY IS IBM INVOLVED IN LINUX? IBM wasn't (really) in the dot-com boom that made Linux what it is, but IBM was deeply affected by the dot-com bust like all other tech companies. One of the first things IBM did to survive was reduce costs by firing more than one hundred thousand highly paid engineers and shipping their jobs to India, putting them to work replacing legacy mainframe systems and setting up cheap websites in LAMP. IBM poured resources into LAMP to support that staff.
4. WHY DID IBM BUY RED HAT? As enterprise systems began to rely heavily on free software, Ginni Rometty, the former CEO of IBM and, unbelievably, a one-time system administrator, bought Red Hat to "corner the market" for providing services to support free software within enterprises.
5. WHY IS IBM RED HAT GOING SIDEWAYS? Because IBM isn't ever going to be able to corner the market for free software support by locking its customers into Red Hat. Those customers will continue on with their current Red Hat distributions for a while, and a new generation of workers will migrate them to distro-agnostic job control systems like Kata Containers (on AWS, Azure, or on-prem Debian, etc.) even as the customer application providers release updates pre-packaged as containers that specify alternate Linux distributions (not CentOS/Fedora/RHEL) in their image/pod spec files. Note that while IBM Red Hat may go sideways, IBM will still be able to provide free software support for non-Red Hat initiatives and will kill off (or maybe spin off) Red Hat as that becomes more apparent.
>So recently Red Hat/IBM made a business decision to limit their customers to those who would buy a license from them for every single system that would run RHEL and only distribute their source-code and the information necessary on how to build that distribution to those customers.
Is that what the contract says? Does it say that if a customer installs CentOS, or some Red Hat downstream, they are now in breach? I thought it was about distribution of source code. What is going on?
Also, is Red Hat fine if the other systems run a different Linux, like Debian? Where's the line?
1. Some customers used to buy one RHEL license and install 1,000 copies of RHEL. That loophole was closed years ago. It doesn't prevent using CentOS.
2. Red Hat has to give you the RHEL source code, but if you redistribute it they'll also cancel your subscription. This makes it harder for Alma/Rocky to get RHEL source code. That's the new controversy.
>1. Some customers used to buy one RHEL license and install 1,000 copies of RHEL. That loophole was closed years ago. It doesn't prevent using CentOS.
It's not really closed, because you also have users buying one RHEL license and using 20,000 clone deployments, and then reporting any issues they see on clones against their one copy of RHEL, demanding customer support for dozens of issues against one RHEL license.
Have you ever been through one of those? How does it work? If a vendor turned up at my business and demanded access to my systems for some kind of audit, I would tell them to go and fsck themselves; they wouldn't get past reception.
It's legal and customary in every place Red Hat does business. Have you ever actually worked with an enterprise software vendor before?
You signed an agreement with Red Hat to allow them full access to all your systems to audit usage. If you refuse, you will be in breach of contract, and you'll get to meet Red Hat's lawyers.
The point is, license audits suck for everyone and present their own kinds of reputational risks (e.g. Oracle), and making them less necessary by using other approaches towards the problem is rational from Red Hat's perspective
This article is actually very good at explaining exactly what it is Red Hat is doing and why. And I find myself in agreement with the author's conclusion.
I started my Linux journey with Red Hat Linux 5.1 (Manhattan), which I got on CD-ROMs included with a paper computer magazine. This was in the late 90s, long before anyone came up with the name "Red Hat Enterprise Linux". Red Hat was the distribution that took Linux and packaged it, along with a huge library of software, on CD-ROMs. It was free. It was popular. It brought Linux to the masses, including to myself. It also provided tremendous value to people like me (at school at the time) who couldn't even afford Internet access at home. It gave me free access to a huge library of quality open source software, which was the basis of my first tiny side IT business (small office servers and support). That meant I could now afford to buy better hardware, dial-up for as long as I wanted, and eventually a 128k DSL line to the internet. When RHEL came out with its licensing I considered it a sort of "step back" towards the "old" paid software business model. I didn't like it at all. I turned away from Red Hat for many years, favoring Debian.
Fast forward a couple of decades, and I'm no longer doing small office servers as a side hustle; I'm working full time consulting for Fortune 500 companies. In this environment RHEL is seen as a safe choice. Whenever an important physical Linux system is deployed, it's running RHEL, fully licensed with the best support. I can count on the fingers of one hand the times my employers actually used RHEL enterprise support during my entire career, but they still paid for it. Why? Because it limited the risk. What about dev and test systems? What about VMs no one really cared about? All of them ran CentOS. Why? So the people running these systems, which didn't have to be so highly available, could use the same tools to manage them, and so we would know that if something worked on CentOS it would probably work on RHEL, due to the same versions of software, etc. It was neat, but I'm sure it really cut into Red Hat's bottom line. Consider that Microsoft was paid for every single server, regardless of whether it was just a developer sandbox or a highly available email server. MS also made you pay for support for all of them. Yes, you could have different levels of support for various servers, but beyond a certain number of servers/users the only way to buy MS software was with very expensive support. Using Linux in general was seen as a money-saving method precisely because CentOS was available for the less important stuff and you could buy RHEL for production systems. In a way, CentOS was a marketing vehicle for RHEL.
But about 5 years ago this started to change. More and more of the big companies I worked with moved to "the cloud". Although you could use RHEL there, almost no one did. AWS had their own Linux distro (based on RHEL too), and now that too was seen as a safe choice. Also, containers, autoscaling, ease of provisioning, and the general robustness of Linux created an environment where the value of RHEL support to businesses was much lower. Cloud technologies have seriously started eating Red Hat's cake in large companies, IMO.
So Red Hat had to do something to stay afloat. And they did. Is it enough to save them? I don't know, but for sure they don't deserve the hate they get for it.
Would I recommend RHEL to anyone other than a huge business that can absorb the cost? Of course not.
I don't think we can simply forget that RH are the ones who made CentOS an internal project, promised to grow and support it, killed it off, and then called folks who are upset about the bait and switch "freeloaders". That seems rather dishonest, considering it is the polar opposite of what they initially promised to do.
You writing "in before" doesn't make it less true. We don't call free-clone users "freeloaders".
At the very best, "freeloader" is a term used by someone internally, but I've never ever heard it used in over 10 years in the company.
Full disclosure: Red Hat employee, not writing in the name of the company, views my own and all that. I'm also pretty pissed about CentOS dying, and like many Red Hatters, have voiced my concerns internally. I also happen to think that a lot of the online criticism is unfair, incorrect, and sometimes done in bad faith. So I'm trying to be accurate. As far as I know, nobody in Red Hat calls or even thinks of CentOS/Rocky/Alma/Oracle/... clones as "freeloaders".
That sounds unlikely. Have you been taken in by the name squatting of "CentOS Stream" sounding like CentOS while being a totally different thing? That was basically the point of IBM choosing that name after extinguishing CentOS.
CentOS was a de-branded Red Hat rebuild, produced by a community effort, that behaved the same as RHEL, so you could use it to develop and test software for deployment on RHEL.
CentOS Stream is an unstable upstream that RHEL is partially derived from, run by IBM. It could have been "Fedora", but that wouldn't have the same obfuscation properties. Presumably IBM doesn't really need two unstable upstream distributions and will drop one of them at some point.
There are a couple of new community Red Hat rebuilds, none called CentOS.
It's not a "totally different thing". It's different, yes, but not that different.
The name "Community ENTerprise OS" is still perfectly accurate. The release model is a little different from CentOS and RHEL, but you still get LTS support comparable to every other LTS distro, you are still guaranteed no ABI changes, etc. RHEL is built from CentOS Stream, so nothing can go into CentOS Stream that wouldn't go into RHEL.
>RHEL is built from CentOS Stream, so nothing can go into CentOS Stream that wouldn't go into RHEL.
This isn't strictly true. The main problem with CentOS Stream is that embargoed security updates don't go into Stream until after they ship in RHEL. This means that if the developer responsible for the patch forgets to commit it to Stream, it can take weeks or even months until somebody at Red Hat notices and the patch goes out to Stream users. As one example, a couple months ago basic packages like httpd and php were 4 and 5 months behind RHEL, respectively.
Yeah, Stream is okay if you don't require bug-for-bug compatibility with RHEL and just want a familiar, relatively stable, rpm-based distro. I would say its role corresponds loosely to Debian testing, whereas Fedora is more like sid.
In reality, though, I've been advising clients to treat Stream as nothing more than a stopgap solution until they can migrate to something Debian-based. We've been burned once by the untimely EOL of CentOS 8, and the continuing drama just doesn't instill confidence in the future of any free-as-in-beer distro related to Red Hat.
> RH is a for profit company. If something they do is a money sink it would be stupid to keep it going.
It was their choice to take on the "money sink" in the first place. Failing to plan ahead and deliver on your promises is bad business and damages your reputation in the community as well.
> If you perhaps dislike for-profit companies
Now you're just jumping to wild conclusions, all because I think RH is pants on head stupid (from both a community and business perspective) for shutting down CentOS and going back on their past promises. Essentially, Red Hat is taking a sizeable business risk by shutting down their inbound funnel's flywheel to capture as much value as possible in the short term. This screams of short-term thinking. We will see if their gamble pays off.
> you are only entitled to the source if you are a user of said software
Sure. But you are entitled to have certain freedoms with that software. These freedoms include running as many instances as you'd like for whatever purpose, and giving away copies of that software for other people to use. So the constructive result of these freedoms is that the cost of the software is anchored around the cost of distribution, rather than an arbitrary cost set by the developer.
Now personally, I think it would be fantastic if someone were able to come up with some sensible pay-to-use libre license that still retained most of the freedom. The libre community is desperately missing the kind of straightforward funding the proprietary software ecosystem has. But it would be necessary to reduce freedom to do that, and it's not particularly clear how best to do that specifically. Any concept of a "license instance" to charge against results in assumptions about how the software is run, which basically undermines the ability to run it on a new type of thing. If the license is per core, then what about "serverless"? Or let's say someone forks the software and makes substantive changes - does the original maintainer get to stick around charging rent despite not continuing development?
But what Red Hat is doing here isn't really a full-on attempt at that. Given the works are still GPL and whatnot, I don't think they'll end up creating any more than a speed bump for a new community bug-for-bug compatible distribution to gain popularity the way CentOS did.
And heck, it still remains to be seen whether threatening to terminate an orthogonal contract for exercising one's rights under the GPL ends up constituting an "additional restriction". If that was instead a monetary penalty for breach of contract, we would certainly say it does.
> Where does it say in the GPL, that you would be entitled to perpetual updates for every copy?
I never said that it did?
I'm saying that "unlimited free shit" [0] is seemingly a direct side effect of software freedom, as the FSF has defined software freedom.
I actually then went on to say that it would indeed be interesting to see a slightly less libre license that didn't anchor the use-instance price to within an epsilon of the distribution cost. So I don't really know what you're trying to argue with here.
[0] actually: widespread use of libre licensed software that has been distributed to others at arm's length
"you are only entitled to the source if you are a user of said software."
Correct but incomplete. One of the rights you automatically get is the right to share said source code. RedHat is forbidding this with a contract.
IMO this is the only thing Red Hat is doing wrong
(That and changing the end of support date for Centos 8 after they had already committed to a longer timeframe. But this is a different topic.)
This is not difficult to understand at all. There's actually nothing to understand. It's facts.
How people will respond to it is where you'll have variety. I don't think it's proper to use a license then add an auxiliary thing that goes against the spirit of the thing. This is my opinion.
As long as you keep basing your product on GPLed code, of course. You can always write everything yourself, take software with a more permissive license, or not provide updates to anybody.
>You distribute binaries you have to distribute its source as well.
Binaries are distributed only to those with an active contract; if you lose the contract, you lose the binaries and thus the source. Of course, if you made a local backup you can still use everything for your own benefits
> if you made a local backup you can still use everything for your own benefits
This is GPL code. You can not only use both the binaries and source "for your own benefits", you can also share both forever with whomever you want. There is no revoking the GPL, nor taking the binaries or source back.
Nobody is taking anything back. If your contract with RH stops you lose access to their servers and cannot download their binaries or their source anymore. You can freely share everything you managed to download.
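As a concrete (hedged) sketch of "everything you managed to download": while a subscription is active, reposync can mirror both binary and source repos locally. The repo IDs below are only examples; list your own with subscription-manager.

    # Needs dnf-plugins-core; repo IDs vary by release and architecture.
    subscription-manager repos --list-enabled
    dnf reposync --repoid=rhel-9-for-x86_64-baseos-rpms \
        --download-path=/srv/mirror --downloadcomps --newest-only
    dnf reposync --repoid=rhel-9-for-x86_64-baseos-source-rpms \
        --download-path=/srv/mirror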
I don't know if you are trolling or actually think this is a clever argument, but it's not.
Clearly Red Hat's threat of cancelling the support contract is intended to discourage source distribution. The point of the GPL is to ensure source distribution is possible. These two goals are inherently opposed.
This is incorrect. Sharing source you receive from Red Hat as part of your Subscription is not limited by our Subscription Agreements, and there is no penalty for sharing it. See Section 1.4 here.
This Agreement establishes the rights and obligations associated with Subscription Services and is not intended to limit your rights to software code under the terms of an open source license.
This is interesting. If this is indeed the case then much of the discussion around it could be considered FUD. Why is redhat not addressing it? Why did the false idea get traction in the first place?
I do think Mike McGrath tried to communicate this. It got missed somehow.
Maybe we now have a case of foot (or even two feet) inserted into mouth. It's a highly emotional topic, even internally. I don't know what the communications strategy is. Not my job. I will say that we're encouraged internally, "don't feed the trolls", and to let our various communication teams and leaders do their jobs.
I just hope others like maddog guide the community back to sane discussions.
Both commenters to my comment used the word "simply". I don't think this is simple at all.
How is it simple for a company that wants support from a vendor to "simply" lose that support for exercising the very thing the license of the code was specifically designed to ensure?
another way to frame the "entitlement to free stuff" is to consider the real costs of duplicating and copying digital assets: they are essentially negligible (and less than marginal)
though I suppose there is a bit of a difference between 'static' digital assets that we don't expect to change, like digital art, or games, or mp3s and so on... and 'runtime software', which we expect will have to continuously adapt, be improved, and get updated. though this is a fine line.
why should only a select few get to capture the enormous productivity boon enabled by digital copying? that I ask this question doesn't mean I hate capitalism; I think there are things it's great at, and things at which capitalism sucks.
digital assets are exactly where capitalism becomes the worst, whereas material assets are exactly where capitalism is at its best.
but how long should we pay for digital assets that have already been created?
to pay for stuff already made is not an economically productive activity. it doesn't create any more value. and this is even worse over digital assets;
also, this is why I was observing a distinction between digital goods which won't change (most media expressions) and software which needs updating
I consider this an open problem; but I have a minority view, as I always get downvoted when stating this position.
as I see it, the current hollywood strike is a consequence of a failure to acknowledge what I'm saying. and it'll keep getting worse. the problems in academic publishing are also an expression of these things I keep saying. I'm starting to get tired of saying this shit. not only do things not change, but they keep getting worse.
I suppose property over [blank] asserts itself over the digital to the detriment of most and the advantage of a few. I see it slipping away... what could have been, what we could have had with the internet, as we seem to choose (and I'm dragged along) a world where everything is a service subscription and a few lucky ones collect rent forever; I guess they already do and always have, but now they will also do it over digital assets, capturing for themselves the technologically provided boons of digital and internet technology
> but how long should we pay for digital assets that have already been created?
That's a mis-characterization of what's going on here.
Red Hat makes RHEL. Then it used to go *the extra step* of debranding RHEL, removing trademarks, and publishing it to git.centos.org.
Red Hat does not (and cannot) prevent you from getting your hands on the digital assets. It just stopped going out of its way to provide an easy, (almost) 1-click way to clone its product for free. You want to create a clone of RHEL? Sure, the upstreams are all available - do it.
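For a sense of the per-package mechanics the clone projects automate, here's a rough sketch (package name, version, and mock config are placeholders, not anything Red Hat actually ships):

    dnf download --source somepackage               # fetch the .src.rpm (needs dnf-plugins-core)
    rpm -ivh somepackage-1.0-1.el9.src.rpm          # unpack spec + sources into ~/rpmbuild
    # edit ~/rpmbuild/SPECS/somepackage.spec to strip trademarks/branding, then:
    rpmbuild -bs ~/rpmbuild/SPECS/somepackage.spec  # produce a new source RPM
    mock -r centos-stream-9-x86_64 --rebuild ~/rpmbuild/SRPMS/somepackage-1.0-1.el9.src.rpm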
but you are still not entitled to anyone's source code unless you are a user of the program.
do you want to pay for an RH contract and redistribute their code for free? well, then find someone to host a VPS that will receive petabytes of traffic each month.
you will soon find out, it's really not that "free"
You are just substituting one for another. Find enough people to chip in a dollar to fund a VPS? Find enough people to form a high-bandwidth bittorrent mesh?
Potato potato, good luck serving the global market with a bittorrent version of dnf update
Their CDN bills are paid (or rather sponsored by) Fastly, and they have a mirror network hosted by universities and other third parties. No third party is going to donate resources to RHEL the way they might for Debian, though.
I'll add that those CDN bills are not at all insubstantial. But still peanuts compared to the actual development costs.
I wouldn't say they're exactly hiding behind the FSF. The FSF displays its brand of crazy front and center, like with its statements that not giving out your source code is morally evil but you should charge as much as you can for copies of your software. They fit right in.
This is kind of a weird take unless you're some corporate lackey.
Where does the GPL only refer to users? Isn't anyone supposed to be able to get access? The AGPL does, for certain; it's what I license my work under to ensure it can't be rehosted somewhere and be proprietary'd.
Interestingly, Red Hat appears to be gatekeeping what constitutes a user of their systems in order to offer GPL terms only to those who are users. This requires payment, so it's a way for business to refuse open terms unless paid for.
Why is that okay? They accept patches from community members, what moral right do they have to repackage and then withhold source when asked?
The true freeloaders of libre software have always been commercial efforts. If they couldn't freeload off of GNU, Linux, and BSD, where would we be?
Quite insulting for Red Hat to declare others to be freeloading when they are the ones profiting on the community's work. They wouldn't have a company without free software.
Nobody's entitled to profit. I hope some rights holders sue Red Hat to keep the source shared. They're perfectly free to build their own non-GPL distro on top of Linux, nobody'll stop them, and they won't have to share.
> Where does the GPL only refer to users? Isn't anyone supposed to be able to get access?
the GPL always only gave rights to the users of the software, never the general public. only users have the right to get the source code. the users however also have the right to share their copy which is why it is usually pointless to not make the code public in the first place. red hat circumvents that by using a support contract that terminates when users share their copy.
Also, penalizing customers for sharing source code makes no sense when we already share it ourselves by upstreaming our work, and our internal policy is "Upstream First".
Skill? Funding? Effort? Nothing? That's why we have CentOS (which originated before we hired its developers), Oracle's Enterprise Linux, RockyLinux, AlmaLinux. We have invested massive amounts of dollars & human resources into our CI/CD, QA/QE, and performance testing. Maybe that worries people? That's speculation on my part.
What we changed (my opinion: doesn't represent what Red Hat is saying, has said, or will say) was an optimization (or a correction for an oversight) to our CI/CD process after making CentOS the upstream to RHEL. We used to sanitize the RHEL srpms of Red Hat trademarks, graphics, proprietary information, embargoed information or source, etc. Since a lot of the RHEL developers are also CentOS, Fedora, and upstream developers, this helped especially those CentOS developers. But if RHEL is downstream and already a git pull of those CentOS, Fedora, and upstream repos, then sanitizing srpms is redundant, duplicative, and even unnecessary for CentOS's benefit.
i am not doubting your words, but if it were that simple then we would not have all this brouhaha about RHEL clones not being able to continue. or well some of them did. there was one who said that they would be able to continue as before which would confirm that this is actually possible.
note that i am not arguing whether RHEL clones should exist, but only whether they are able to exist without anyone breaking any kind of contract. i do hope you are right, but the current public sentiment seems to be that this is not the case.
I agree public sentiment does not seem to align with the change. And I believe a lot of sentiment was influenced by social media personalities, leaders of the post CentOS upstream projects. Even our competitors like Oracle and SUSE jumped in.
Almost all my accounts have some CentOS or other RHEL clones. We’re not suing those customers because they chose another Linux for a part of their footprint. We’re also not suing the clone makers. You’d think with all the lawyers we have access to, that would be an easy way to kill off clones. But we’re upstream first. Best ideas win. They tell you that on day one.
Also, thinking of this a bit more. I must admit, if my business plan was to clone Red Hat’s srpm repos and rebuild them (overly simplified), then losing access to Red Hat’s srpm repos would cause a lot of emotions. That srpm repo was taken away. That said, the access to the upstream git repos hasn’t changed.
exactly that is the crux of the problem that is getting everyone upset. do customers still have access to that?
> the access to the upstream git repos hasn't changed
but that does not allow building a 100% RHEL clone. although it is probably enough to build something sufficiently compatible. which is i think the way one of the former clones went, and which is probably good enough for at least a subset of clone users. like for me, i just want a distribution that has long term stable releases. RHEL compatibility is secondary, or not important at all, in my use case.
btw: i am happy we can have this discussion without the emotions that usually go along with it. it can be difficult to get something across with all the noise everyone else is making.
>> That srpm repo was taken away
> exactly that is the crux of the problem that is getting everyone upset. do customers still have access to that?
Customers will always have access to srpms. They will, however, contain legally encumbered artifacts.
>> the access to the upstream git repos hasn’t changed
> but that does not allow building a 100% RHEL clone. although it is probably enough to build something sufficiently compatible.
Taking away sanitized srpm drops probably does affect those distros who would like to be a downstream of RHEL. Playing devil's advocate to downstreamers: But why wouldn't you want to collaborate with us in the upstream? Are they maybe implicitly recognizing the value in the engineering that Red Hat does? Or maybe even just the name Red Hat Enterprise Linux?
The GPL is a license and contract between two entities. We have no such contracts with non-customers. Maybe that is unfortunate.
In 2019 before IBM completed the acquisition, (https://www.sec.gov/Archives/edgar/data/1087423/000108742319...), we had Sales & Marketing expenses of $1.38B. We spent $668.5M on R&D. About $2B spent on employees selling, marketing, teaching, documenting, building, maintaining, certifying, supporting, enhancing, testing OSS, including contributions to conferences, foundations, consortiums, of many, many different OSS interests. Over 200 upstream projects (https://redhatofficial.github.io/#!/main). I think about all the folks I've worked with in Consulting, Support, Engineering, Sales, Product Management, Technical Marketing, Recruiting. I think about their families and loved ones. ...And I feel like we're doing alright by OSS with what we're getting out of it. It might not be a perfect or ideal model for everyone. Most of us are willing to improve where we can.
> btw: i am happy we can have this discussion without the emotions that usually go along with it. it can be difficult to get something across with all the noise everyone else is making.
Agreed. I do wish the discourse was less emotional than it has been. Many folks are making assumptions without taking the chance to ask clarifying questions.
> Taking away sanitized srpm drops probably does affect those distros who would like to be a downstream of RHEL.
when CentOS started, there were no sanitized srpms either as far as i know. it was part of the CentOS project to sanitize them. the only difference then is that those unsanitized srpms were previously public, and now they no longer are. is that correct?
if that is the case, and any clone project can get access to these srpms by paying for a single license, i don't see what the big deal is. there is nothing red hat hasn't already been doing "worse" ever since RHEL started, with the only possible exception of making srpms no longer public.
again, assuming this is correct, is there really any downside to making those srpms public?
> But why wouldn't you want to collaborate with us in the upstream? Are they maybe implicitly recognizing the value in the engineering that Red Hat does? Or maybe even just the name Red Hat Enterprise Linux?
as far as i can tell it is the promise of long term stability and security updates. and the compatibility.
small businesses who can't afford a RHEL license but need that kind of promise without the other support features that red hat offers. or they develop applications that need to be able to run on RHEL. there is a market for that, and the current CentOS stream or any distribution based on it can't make the same promise as a clone.
but time will tell, alternative distributions based on CentOS stream previously didn't exist. there is one now, if i read that right, and it should be able to take some of the market that RHEL clones are in. and maybe eventually also show that their releases have an almost equal level of longterm stability and updates, as well as sufficient RHEL compatibility.
the only drawback of a distribution based on stream is the security updates that don't come until they are in RHEL. but then how long should it take for an RHEL security update to make it into CentOS stream?
>> Where does the GPL only refer to users? Isn't anyone supposed to be able to get access? The AGPL does, for certain;
The AGPL was designed (partly) to overcome this specific limitation in GPL3.
The preamble and license [1] read:
Preamble
"By contrast, the GNU General Public License is intended to guarantee your freedom to share and change all versions of a program--to make sure it remains free software for all its users.
...
"if you distribute copies of such a program, whether gratis or for a fee, you must pass on to the recipients the same freedoms that you received. You must make sure that they, too, receive or can get the source code.
The license itself is best read by a lawyer, but you'll notice when talking about object code (section 6) the source is "bound" to the object code, and its recipients. There's no requirement to make the source generally available.
Aside- the Linux kernel uses GPL2 which has even fewer requirements than v3. But the concept of supplying code "to the user" is there as well.
So, in context, the recipient is the purchaser of a RH support contract, and they would argue that it should be ONLY the recipients who receive the source?
The primary problem I see here is that a purchaser is receiving source in return for giving up, in practice, the ability to distribute it, if they want to keep support. Quite a one-sided agreement that seems to dilute the spirit of libre software. You still have the freedom, sure, but choosing to use it ends the contract.
RH are complying with the terms of the license. They equally have the right to fire customers if they choose. Given that the customer really has no major incentive to distribute the source, it's not a terribly hard decision to make.
On the one hand this is a matter of law, and the law is being upheld.
On the other hand it's a matter of "spirit" and "community" and other non-legal arguments. Which IMO is neither here nor there. There's always someone who wants you to do things another way; you can't please everyone all the time.
What I don't get is the fuss. If you don't like the red hat policy then pick another distro. There are plenty to choose from.
What do you mean by "fuss"? Left to interpretation, anything not aligning with your view could be classified as fussing.
I don't use RH so it doesn't affect me personally, but they still hold sway in the ecosystem at present and their practices may inspire others to use orthogonal contracts to disincentivize the use of rights granted by the GPL.
We can choose non-RH today, but can you not picture a future where this is practiced by more than just them? When social practices change in influential places, it can have far-reaching consequences that don't appear related at first. If others didn't look to RH for guidance or examples of how to do business on Linux, I'd be much less concerned.
This happens a fair bit in the OSS space. Non-users have very strong opinions on how companies that have OSS offerings should behave. Because the software is OSS, customers, or more often not-even-customers, feel the urge to comment.
Clearly this case is not a legal issue. It's a business issue. And the business case being made by RH is that they choose not to do business with folk who distribute their source. That's novel, but not illegal.
Regarding "business on linux", to all intents and purposes it doesn't exist. Apart from RH, The percentage of users who have ever spent a dime on -any- Linux OS or Linux program is a rounding error from 0. Every second week there's a "show HN" on some new startup or scheme to somehow pay OSS developers.
The "business" of OSS, and business if Linux, is an unsolved problem. RH is at least innovating in the space, although very (very) few have been successful using their model.
Your concern for this innovation is noted, but, with respect, it only affects users who are not customers. Which suggests you are less concerned with the business of Linux, and more concerned with getting stuff for free. (Which I get; we all like free things.)
If OSS -could- figure out a viable business model though, then that would allow many more OSS projects to exist, suck less, and make better progress. This would be a huge win for users to have the ability to control their own machines.
Thank you for the thorough reply. It took some time to have the opportunity to return the same respect.
Perhaps I'm not 100% in the group you're referring to, since I'm not really looking to get RH code or packages for free, or use any RPM distro. I'm interested in DIYing it with LFS, KISS, or something else suitable for a single maintainer. So, yes to non-customer, but no to wanting their code for free. However, I don't see any problem with criticizing a business decision while not being a customer. How do you shop for services if not by getting a feel for them from the outside? If I were looking for Linux support, I now know that a RH contract would be legally fragile (i.e. easy to breach and cost myself), and require me to surrender freedoms granted to me by the original software license, which is a net reduction in software freedom.
I think I see the point you're getting to with it being the sole choice of customers whether they want to breach contract to exercise the rights of a license; that choice doesn't affect non-customers per se, but it's a sinister loophole that leads people to trade their freedom to share for support. That has social ramifications, especially if done at scale. One business doing it is different from the majority, or even a trend of businesses doing it.
It would indeed be beneficial to find a fair, equitable, and sustainable business model that allows programmers to put food on the table with their technical work (former package maintainer and bug buster, myself), but how will this market differ from proprietary, if there are orthogonal contracts surrounding the works, providing different benefits with counter-incentives to exercising freedoms? Maybe the big contrast is the "exit" button comes with the source to fork oneself with. :P
I think the closest we've come to workable are sponsorships from companies who have an active stake in whatever software (or related stack) that their revenue depends on. It seems fair at first, but there's no enforcement model, nor does it seem feasible or desirable to form one. There's also the side effect of active development tending to focus on corporate concerns, due to the money coming from them. Some people have been successful on crowdsourcing sites like Patreon. Flattr had a neat model (your monthly budget would be split between every project you tipped/subbed that month), but I never looked into financial reports to see how effective it was at getting funds to projects.
If I look at this contract change through the lens of someone less savvy, then I see the value. That type of user isn't interested in even reading the source, so the support holds more value to them than the freedom to share.
So it goes, September is approaching.. :/
Thanks again for sharing your perspective, it's one of the first decent online conversations I've had in a while. Financially sustaining freedom-related software is a long-standing puzzle, and each solution seems lacking in one way or another.
Did Red Hat write perl? Any of the GNU coreutils? Any part of the compilers used to produce the binaries?
There is this view that RedHat is somehow doing something so indispensable; but really they are curating a collection of software, written wholly and completely by other people and released under GPL2 or similar licenses.
That somehow this gives them the ability to supersede the licenses that Red Hat, themselves, agreed to when downloading, compiling and distributing the Linux kernel and related software... just seems ridiculous.
So many are commenting, but forgetting that the full measure of legality is seen in courtrooms. It's unlikely that a successful defense in court can be mounted due to IBM's deep pockets, if we are realistic about it.
My hope is that Red Hat gets the full 'Bud Light' treatment instead, and simply ceases to be much of a force in the marketplace.
I don't know about Perl, but there was a long stretch of time where the only full-time paid developer of the CPython core team was a Red Hat employee, with all other core team members only able to contribute a few hours a week. I don't think that's the case anymore because IIRC Microsoft is now employing a few, but as far as I know Microsoft and Red Hat are the only two companies with any employees working on Python as a full-time job.
>Any of the GNU coreutils?
Yes
>Any part of the compilers used to produce the binaries?
Red Hat is the primary corporate contributor to GCC, yes. I would guess it's probably also the largest contributor to glibc as well, but if not, it's definitely top 3.
You'll notice that for instance GCC is ahead of Clang on C++23 adoption, and that's largely because Red Hat is paying people to work on it, whereas Google who was a major sponsor of Clang/LLVM work has been decreasing their contributions there.
You've pretty well debunked this, but I took a stroll through the git repos; the commit logs tell the truth.
Python: the latest commit was from a Microsoft employee. Not to mention the creator of Python works for Microsoft now.
GCC: The commit log is overwhelmingly from: RedHat, Suse, Intel, Arm, Oracle (!), IBM and a few indie developers scattered about. I saw maybe one GNU person in the logs at the most.
Sounds like Free Software has gone corporate.... RMS might want to get his suit dry cleaned.
it should be clarified that red hat is not violating the gpl. they are free to repackage the software, sell it, and distribute the source only to their paying customers. this use-case is explicitly endorsed by rms and generally follows the spirit of free software.
That is fine. The problem is Red Hat is allegedly making business agreements with their customers requiring no redistribution of the code. If true, that would be a direct violation of the GPL, denying one of the freedoms it guarantees.
Isn't Red Hat's position, "You have the freedom to redistribute this code but if you do so, we will sever our relationship with you as a customer of ours, so you won't get any more code." Which might go against the intentions of the GPL but not its actual wording.
I'm sure their lawyers have thought out a good defense, but it very much does sound like an additional restriction placed on the right to distribute the source. Hopefully Red Hat will be proven wrong before more companies jump on this.
but how does red hat structure their stuff? I know that gnu projects must surrender their copyright to gnu, in order to prevent future shenanigans. if red hat had their core infrastructure (package manager, package definitions, etc.) copyrighted to red hat, then they could change the license on future releases of red hat to make it restrictive. you'd then be free to release a gpl source of the packaged code and the red hat specific modifications, since they fall under the gpl, but you couldn't release the scaffolding that makes up the rest of the red hat system anymore. I'm not a lawyer, but I've seen this kind of trick pulled on gpl projects before, where version 2 is now bsd/proprietary, while gpl version 1 remains in public access.
(edit: I'm reading the rest of the thread, and it seems there's some confusion about what exactly is in the new red hat contracts.)
Only one historical exception: Cygwin, basically inertia from the Cygnus acquisition.
Maybe more significantly, Red Hat has only made limited use of CLAs in the past and hasn't used any CLAs for many years now. It's basically corporate policy.
There was one other Cygnus-era one... libgcj, the gcj runtime library. However, Red Hat assigned copyright to the FSF in exchange for them adopting a more permissive license for the GNU Classpath project; one that was eventually selected by Sun when they open-sourced Java.
What is your interpretation of the following text from GPLv2 (I added a link to the official text above):
"2 b) You must cause any work that you distribute or publish, that in whole or in part contains or is derived from the Program or any part thereof, to be licensed as a whole at no charge to all third parties under the terms of this License."
That language is referring to copyright permissions (i.e. promising not to sue people for copyright infringement for using or distributing the software in accordance with the GPL). It doesn't refer to providing access.
A trickier thing for Red Hat might be
> You may not impose any further restrictions on the recipients' exercise of the rights granted herein.
One may argue about how to interpret that with regard to, for example, terminating a business relationship as a result. (I could see arguments on both sides.)
Also, Red Hat's method for complying with section 3 could be a subtle issue, because one of the options for compliance requires promising to provide "any third party" with the source code upon request. I don't know whether Red Hat is using that option or a different option.
>One may argue about how to interpret that with regard to, for example, terminating a business relationship as a result
I think there's been far too little focus on this, which is the crux of it. IMO (and IANAL) it seems pretty clear that even though this is not an explicit new license condition, it de facto does prevent redistribution: i.e. "you can do business with us, but only if you do not exercise one of your rights", de facto limiting that right.
I would love to see this adjudicated. Is there a company that builds statically against RHEL and resells, or modifies RHEL as a paying customer and resells that could show material injury by this move?
>> You may not impose any further restrictions on the recipients' exercise of the rights granted herein
Is it all that obvious or clear that "if we don't like what you do with our stuff, we will not renew your contract next year or sell you anything anymore" is a restriction on the software they already delivered?
The software to which that clause applies was already delivered, and redhat is not, as far as I know, applying any restrictions on that software.
As a company, they are not obliged to sell to anyone who shows up with the sticker price, so this doesn't look, to me, like a restriction on the software itself.
After all, the right to choose your customers is a very basic one that only has few exemptions related to individuals in protected classes.
> Is it all that obvious or clear that "if we don't like what you do with our stuff, we will not renew your contract next year or sell you anything anymore" a restriction on the software they already delivered?
Red Hat is not using that option. In fact GPLv3 restricts that option to physical products, which RHEL is not. By distributing the packages via SRPMs, Red Hat has a single method that complies with GPLv2, GPLv3 and also with the attribution requirements of permissive licenses.
Recent versions of RHEL actually include a GPLv2-oriented written offer for source, but this is in addition to Red Hat simultaneously making corresponding source available along with binaries (for all packages regardless of license).
It's explicitly valid for any third party. I would assume that a customer could use it even where (as should normally be the case) the customer would have source code access under 3a.
There are a lot of drawbacks to use of the written offer option so I'm not sure if Red Hat will continue to use it with RHEL in the future.
I suspect that RH is not complying with the GPLv2. I can use yum to install a package from RH’s repo and it does not result in me having the source, so 3a is out. They don’t offer to distribute source code at cost to any third party, so no 3b. And 3c is non commercial distribution, so that’s out. There is no 3d.
You only need to enable the companion source repo(s) to get access to the source. To be able to access the binary & source repos via our CDN, you need to be a registered “customer” of Red Hat (which includes no-cost developer account agreements) which then gives credentials to access our CDN. If you have a valid credential to pull binary RPMs, you also have access to pull source RPMs.
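Concretely, on a registered system that is roughly (the repo ID is only an example; exact IDs vary by release and architecture):

    subscription-manager repos --list | grep -i source     # find the matching source repo id
    subscription-manager repos --enable=rhel-9-for-x86_64-baseos-source-rpms
    dnf download --source httpd                            # pulls httpd's corresponding .src.rpm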
> They don’t offer to distribute source code at cost to any third party, so no 3b.
If you are a customer of RHEL, then you do in fact have the ability to request a copy of the source code, including on physical media, and the ability to download it yourself from the customer portal, or from the srpm repositories.
The entire change is that the source code is now only being published in 2 places (CentOS Stream and the customer portal) instead of 3 (those two plus git.centos.org). I suppose it's 3 places instead of 4 if you include the srpm repositories.
Maybe that's covered as a 3a distribution by this additional language?
> If distribution of executable or object code is made by offering access to copy from a designated place, then offering equivalent access to copy the source code from the same place counts as distribution of the source code, even though third parties are not compelled to copy the source along with the object code.
it's the recursive clause that applies gplv2 to derivative works. the prescriptive part is "must cause … to be licensed", and nothing else. what it says, paraphrasing, is "the work based on the program must be licensed at no additional fee under similar terms as the original program". this part is self-contained and doesn't say anything about e.g. copying and distribution.
the copy and distribution clause is only part 3, where you "may" copy and distribute the program (or its derivative work, as per part 2) provided that you either "accompany it with complete … source code" or some means to get the source code from you on demand.
I can't claim this just from reading the license, because I'm not a lawyer, but in rms's reading and in Lessig's reading, the combination of part 1 and part 2 means, paraphrasing, "if you make a derivative work, compile it, and distribute it, you must also provide the source code".