For a professor to use only FLOSS when teaching CS students about software engineering seems entirely appropriate to me, and a good on-principle exercise.
Consider the goals of universal accessibility of education, reproducible systems research that can be built upon, allowing students to explore and improve systems, techie obligations to promote privacy and security in information systems, and simply setting an example to software engineering students that this is doable.
Also, MIT is one of the original homes of various FLOSS ideas, and if they can't manage to use FLOSS, who can? So maybe there's an additional sense of professional obligation. And after he did it, it was written up to encourage others to try it.
Sussman is not just "a professor at MIT." He's one of the inventors of Scheme and one of the authors of SICP. And perhaps most apropos for this article, one of the founders of the FSF, along with of course RMS.
Your comment could be taken out of context here. I am no language expert, but it's better to use the form "Sussman is not only a professor at MIT, but also ...".
"just" is probably overused these days. It has replaced words like "please" and structures like above.
My intent was to elaborate on the parent's phrase "For a professor to use only FLOSS when teaching CS students...", not to accuse him/her of ignorance. If my use of the word "just" sounded argumentative, I apologize.
Sussman is awesome. But I'd like all professors, everywhere, to consider FLOSS as a possible good match for their missions -- and not dismiss it as something for only a handful of exceptional specialists to do.
My approach in all areas - work included - is to use FLOSS by default unless I can't for some reason. Usability is one reason, and as a developer it's good to observe the flaws that drive me to commercial software at times. Why would anyone not choose FLOSS by default?
For some industries where the software is doing something mission critical, it's important to have complete control over the software and, even more importantly, to have someone to sue.
Is the importance of having complete control over the software an argument against using free software? Surely the best way to have control over it is to be able to read what it does and make it do what you want.
I think the interesting thing here is what I would call the tragedy of overregulation.
Most regulated businesses are big corporations with attached bureaucracies.
When they get a lawsuit because somebody was injured, what will happen is an analysis of how this bug could occur.
If it is found that the problem was caused by a library or third party that could get sued, the corporation will sue them and get their money back.
If they find there's no one to sue, as with FOSS, they will likely start regulating the use of FOSS.
This has the perverse effect that after a lot of iterations of this cycle the whole toolchain is designed for "sueability" not for quality, performance, or any other worthy goal. Further the toolchain becomes increasingly opaque and proprietary.
Even though the proprietary software has more bugs, and they're harder to find due to their closed source nature, the leaders of Big Corp have covered their asses. The engineers build more workarounds and spend less time improving the quality of Big Corp's code base. The quality of the product suffers. But none of it is the fault of anyone. That's what's important.
You'll be dealing with multiple contributors making changes if you want to upgrade to the latest version, with pull requests from all over the place. Nobody has time to read the entire codebase, so you have to audit and qualify all the random open source contributors instead of just the one group writing the code.

You could start with a FLOSS codebase and then just keep any additions/modifications you make proprietary, never upgrade, and try to fix security patches and things yourself. But that can become difficult, and if you find yourself actually tapping into the benefits of open source to be able to benefit from the collaborative work of thousands of coders, you're stuck having to trust lots of random people again.

An old school finance firm could use R or Python, but a lot of them use SAS because you only have to qualify one provider, and if something goes wrong, you can sue them. You don't need to have programmers on staff to evaluate the codebase, you just need programmers that can use SAS. Newer firms in less regulated industries are more comfortable breaking away from these to get the competitive advantage of better tools, but it's not for everyone.
So you're really comparing writing something yourself with using an open source solution, rather than comparing an open source solution with a proprietary one.
No, I'm comparing R and SAS, for example, in the above post. The same arguments apply. And again, these aren't general to all cases, just a subset of highly regulated/conservative industries.
In that case you're just not able to audit the closed source version, which I see as strictly worse than the situation with open source software, which you could audit if you put effort into doing so.
By audit I'm referring to the people that worked on the code, not the code itself. Running background checks on a firm and having a strong contract with a firm is easier than hiring people to audit the underlying source code. It's not better. It's just easier.

Based on the reaction to my post, people seem to think I'm arguing that closed source is better. I'm not. I'm providing an explanation, from personal experience, of the thought process behind why some companies in some industries stick with closed source. I'm not saying the reasoning is correct and leads to actually reduced security vulnerabilities/risks etc - it almost definitely doesn't. But people think it does: the legal liability is easier since you just have to sue one company, and auditing is easier since you just audit one company (not the tech, the company - these are not tech-savvy enough managements and firms to audit the codebase; as far as they are concerned, clear background check = code is OK to use for critical stuff).

I agree with you that it's strictly worse. If you have better luck than I do convincing a conservative financial services firm that using R is better than using SAS, please do let me know how you pulled that off.
The answer to your question is in your comment.
When you consistently have to pick commercial software over floss because of the usability, it makes sense to skip that step and go to the commercial software right away.
It's not the role of most universities to teach a specific word processing program--or even a specific programming language, etc.
And even if someone has only used Libre Office in school, I'd say the school has done a pretty piss poor job in general if the grad can't pick up MS Word (or GSuite etc.) pretty quickly. Companies change the apps they use for specific purposes all the time and employees need to be able to adjust.
...but he doesn't really talk about what trade-offs he was forced to make.
He mentions that classes were broadcast with Jitsi Meet - but has anyone else used that and can comment on quality for a large class - say when doing Q&A in class?
I've been in a couple meetings with Jitsi with up to about 20 people and it seemed to work fairly well although they were just interactive discussions and didn't involve screen sharing or other features along those lines. You get much bigger than that and you're tending to lean on chat/Q&A in a classroom/presentation format anyway.
I teach at a world-ranking university in the UK which has decided that, in order to have a 'standardised student experience', teachers are prohibited from running their own FLOSS setup when there are proprietary contracts in place. Sorry, but MS Teams is not designed with teaching in mind. I've been advocating for Big Blue Button and Jitsi but have effectively been told to shut up. It's infuriating.
You bring up a fascinating point. In many large enterprises -- even those that make money in some aspect of computing -- the centralized IT organization controls everything related to computers. I think we would find it odd if, say, the John Deere Corporation was forced by their "Transportation Support" organization to use Honda ATVs for moving people and goods around their factory campus. Or if Yale Law School was not allowed to purchase books written by their faculty. Or if the Stanford EE department was not allowed to equip their buildings with low-voltage LED lighting they had invented.
And yet we just accept that a leading computer science department at a major university can be forced to use crap enterprise software that's vastly inferior to anything they themselves could have written.
Some of my examples might be hyperbolic. But the power of IT departments to mandate a dumbed-down status quo still seems very weird to me. I believe it's one of the factors that keeps computer science and engineering from making more forward progress.
I think your examples actually show exactly why IT controls everything, and not people who want to do stuff because they feel like it that day.
John Deere does not make ATVs; they don't make any tier of people-moving equipment. The idea that they should use their own is ridiculous. If they didn't enforce the Honda rule, people would be riding around in the front of dozers.
Much in the same way that if Stanford invented new bulbs, they would be used in a lab. With safety standards applied. Why don't you want these newly invented bulbs used all over your building? Well, what happens when they burn your building down? What happens if the people in the lab who invented them decide to make a company selling them, and are busy with that, and now you need to pay your maintenance people to deal with these new bulbs they don't know how to use.
Being slow to change in a larger organization is a feature, not a bug.
You're right, I didn't see these when I checked their site.
At the same time, I stand by my argument that there is often a great reason why you want to standardize. For example, do you want a cyclical dependency in your production stream? If there is a defect in your people movers, and you need those people movers to operate, you now have to split the newly produced people mover parts between fixing your production equipment and getting them out to customers.
The point I am making isn't that IT is some bastion of brilliance and operational excellence. They're mediocre at it. And this is a good thing, not a bad thing.
As orgs scale, you want to be less nimble because any given success or failure is amplified. If a 10 person company screws up and goes out of business it sucks, but it's not a big deal. 800 people? That's enough to get a presidential candidate to visit your campus to speak about the importance of retaining jobs.
People underestimate the impact of the work we do in tech. Another thread on HN today pointed me to https://medium.com/better-marketing/pepsis-40-billion-typo-c... which I think is a great example. A simple software bug led to $18 million in losses, huge brand damage, and the deaths of people involved in the protests.
I understand your point. There's value in standardization. But there's also huge value in eating your own dog food. When IT prevents dogfooding entirely, it's gone too far.
You could try to calculate some numbers on savings or "wasted hours because the corporate support is slow and won't fix our problems fast enough" to turn their heads around. Another pointer could be that the students could be involved in running some of the systems.
Usually, in situations like yours, turning complaints or proposed solutions into numbers helps a lot.
The problem is that Microsoft are good at selling a product which just works out of the box and is just about sufficient, even if it's not tailored to your use-case. Plus there's so little transparency as to how much they're charging my university, or how many hours are wasted, so it's hard to make a price comparison.
Student participation in some of the projects could work for certain modules, but only as experiments -- again we've been specifically prohibited from spinning up our own solutions.
> Usually, in situations like yours, turning complaints or proposed solutions into numbers helps a lot.
I agree that ultimately what needs to happen is that those of us who care about FLOSS need to organise and try to chip away at the corporate one-size-fits-all dependency syndrome at the university. The kinds of data ownership debates happening in Germany seem very far off here. The British university is in retreat. [1]
> You could try to calculate some numbers on savings or "wasted hours because the corporate support is slow and won't fix our problems fast enough" to turn their heads around.
If the UK universities are anything like the US universities, this is nearly impossible to do. Maybe COVID has changed some things, but when I worked in the university system (state run) it was more about who could woo what administrator. The amount of waste and nepotism would make your head spin.
One thing about Big Blue Button though: a friend teaches in a smallish university and has been trying to find out how to get simple, turnkey hosted installations of it (the IT department is not interested). It seems inordinately difficult to simply create an account with one of the listed hosters on the page and start paying. The big draw of BBB is the whiteboarding feature.
The alternative is Jitsi Meet with the "Presenter" mode sharing a LibreOffice window, which actually works pretty well, but it would be better if it were possible to toggle on/off the thumbnail of the presenter (it currently occupies about the lower-right ninth of the screen).
So far, out of Teams, BlueJeans (no Linux desktop client and therefore no whiteboard), and BBB, Jitsi Meet is the best.
If some of the faculty researched the relevance of FLOSS to university missions, and then made a good argument for it, do you think the university would change this IT policy?
Senior management are the ones making these calls about enforcing standardised systems, and they do not take advice from us pipsqueaks below.
Right now everything's in emergency mode anyway, so as far as they're concerned, outsourcing everything to Microsoft is one less problem. Everything can be justified by the state of exception.
Kudos to MIT for leaving their staff to make such calls for themselves. I know Sussman is a superstar but still. I'm sure a culture like that helps contribute to MIT being the world's most highly ranked university.
Obviously he is a great teacher (one of the best?). Having the lecture use only free software shows he also cares about ethics in his work. I think that makes him a better teacher. Far too few people in CS care about the ethics side as well.
> Far too few people in CS care about the ethics side as well.
Unrelated to this story, but on the topic of ethics in CS: I found the decision of Joe Redmon [1] extremely brave. (If you don't know who he is, he's the principal original author of YOLO, and could easily have had a stellar career. Also, his papers are just amazing reads with tons of humor.)
Yeah right, because MIT is such a cheap school, it totally is required to use free software, because that is going to make a difference. Instead, MIT should make sure that students get their software and materials free of charge, not that the material or software itself is free (if only because students on scholarships or the like may not have the money). They should use what is best for the students and not what is best to support some ideology.
Free software is a matter of freedom, not price. But your comment contradicts itself.
Your posited opposition between "what is best for the students" and "what is best to support some ideology" is without foundation — different ideologies differ precisely in that they make different claims about what is best for people, such as students. Whatever set of claims you endorse about "what is best for the students" constitutes an ideology.
Now, it may be that there is no objectively correct ideology — that, for example, it's just as valid to celebrate the mass human sacrifice of the Khmer Rouge killing fields as an inspiring example of class struggle, as Pol Pot did, as to deplore it as a violation of fundamental human rights. I do not believe this, but some people do.
But you do not seem to be taking such a purely moral-relativist position — instead, you are arguing that MIT "should make sure that students get their software and materials free of charge" and "should use what is best for the students". That is, you are attempting to promote your own ideology about how MIT should teach its classes, arguing that MIT should prefer your ideology to Gerald Jay Sussman's ideology and, implicitly, that MIT's administration should order him to choose different software with which to teach his classes. You are attempting to camouflage your attempted imposition of your own ideology on MIT under a dishonest implicit claim that your own point of view is free of any ideology.
As it happens, MIT does not adhere to your ideology; instead it adheres to an ideology known as "academic freedom", which holds, among other things, that professors and other instructors have fairly wide latitude to choose their manner of teaching, the material they will teach, and the points of view they will express, which easily extends to the choices in question. When the modern ideology of academic freedom was forged in, mostly, the German universities of the 18th and 19th century, it brought them to the frontier of human knowledge and made them the leaders in advancing it; nowadays many of the universities most faithful to this ideology are in the United States, but the principles are the same.
Your call for MIT to abandon its principles and suppress academic freedom, mendaciously cloaked behind a spurious claim of ideological neutrality, is deplorable.
You should not have posted it.
(To preempt some comments, not only do I not teach at MIT or any other university, I've never attended MIT and I didn't even graduate from college; and MIT, roughly speaking, bullied a friend of mine to suicide. This is not about group loyalty.)
> celebrate the mass human sacrifice of the Khmer Rouge killing fields as an inspiring example of class struggle, as MIT professor Noam Chomsky did
This obviously never happened. What Chomsky and Herman instead did was to criticize the media portrayals of the Khmer Rouge vs. the US bombings that took place at the same time, killing 600,000 civilians in Cambodia (https://en.wikipedia.org/wiki/Operation_Freedom_Deal).
That article was written before the worst of the killings happened in 1978, though it still seems outrageous to me that it blames the bad conditions in Cambodia on US bombings killing water buffalo; Chomsky touts "the destructive American impact on Cambodia and the success of the Cambodian revolutionaries in overcoming it" and describes reports that “virtually everybody saw the consequences of [summary executions] in the form of the corpses of men, women and children rapidly bloating and rotting in the hot sun,” as "fallacious", saying that they "collapse[] under the barest scrutiny".
The US bombings were indeed terrible, but (as the page you link explains) they did not kill anywhere close to the 600k people you claim, and they happened earlier than the Khmer Rouge killing fields, not at the same time.
I have corrected my comment to instead make the more defensible, though still perhaps controvertible, claim that Pol Pot celebrated the sacrifices in that way.
I found your response, regardless of the stance, absolutely rude.
It's more important to understand and try to find out why that person is thinking this way than to shut them down in the manner you have, again with the same subjective ideology that the parent is commenting on.
On HN, as the topic gets more divisive, please be nice and respectful.
Think about it: the other person is not stupid to feel so passionately or strongly about something. There must be some reason. Peel the layers until you get to the bottom of it. IMO, that's so much more interesting to study than to ignore them.
Humanity gets better when we try to get out of a local optimum. When we don't explore radical voices, and instead ignore them, we have no possible way to wiggle out of the uncanny valley. This refactoring, if you will, of human progress - guided by logical reasoning, understanding of trade-offs, gathering empirical data, and studying behavior - is paramount to a peaceful and harmonious society.
I urge you to please listen to others and ask them why they think that way. What are the pros and cons of a particular approach? Be honest and seek truth.
My criticism of our other interlocutor is certainly not that they are stupid. It is precisely that their rhetoric is dishonest, sabotaging precisely the dialectical process you claim to be concerned with fostering. If that's your concern, shouldn't you be criticizing them for their dishonest framing, not me?
Moreover, what they are attacking is that Sussman is enabling his students to study the software the course is run on, so they can understand its tradeoffs and guide human progress by logical reasoning, rather than treating the software as impenetrable black boxes they are forbidden to investigate and powerless to change; and they are attacking MIT's adherence to the ideology of academic freedom, one of the most effective ways to explore radical voices, get out of local optima, and seek truth.
I notice that you still haven't posted a response to their comment, whether criticizing their dishonesty as your stance implies you should, or attempting to understand their point of view as you are urging me to do. If you think it's important to find out why that person is thinking this way, then why are you making no effort to do so, instead attempting to influence me?
My best guess is that you're just feeding me a line of bullshit that you think will persuade me, rather than saying anything you sincerely believe, since your behavior in this thread is precisely the opposite of the behavior you are advocating.
It's not about the price, it's about the freedom I assume. Having free of charge proprietary software for students is a different approach. I wouldn't call it unethical though.
Why wouldn't you? Using software that restricts your freedom to use, modify, and share in the pursuit of advancing the human understanding of the universe around us seems to be short-sighted. Free Software gives everyone the equal opportunity to leverage the same tools and practices to do research, development, and further our understanding of our world. In most cases, proprietary software does not.
Wow. Please explain how writing a book in LibreOffice instead of, say, Pages changes some physicist's ability to do research and development and further our understanding of our world.
And no, free software does not give anything to everyone. A very small percentage of the world's population could make use of the benefits of it being open source, and an even smaller share ever will.
Despite all the advantages of free software, it is often very useful to use a particular proprietary tool, and often it is very useful to learn (and thus, teach) how to use a particular proprietary tool, and there's nothing unethical about doing that useful thing instead of refusing to do so because it involves proprietary software. Some people would refuse to use proprietary software as a matter of principle, and that's also a valid ethical choice, but it's not the only ethical choice.
Calling an action unethical is a strong accusation. It's very rude to call an action or a person unethical unless the wrongness of that action goes way beyond merely not taking advantage of an opportunity to facilitate some ethical goal; calling it unethical implies some active wrongdoing, not merely failing or refusing to make what you consider the most beneficial choice.
You seem to be committing the same error I called out in https://news.ycombinator.com/item?id=23475012, although far less egregiously: advocating a form of moral relativism almost everyone would reject, while urging people to take some actions and not others based on normative arguments. If you applied the same reasoning consistently, you would end up saying, "Some people would call an action unethical despite the potential of offending the actor, and although thus speaking out is also a valid ethical choice, it's not the only ethical choice." Unless you're making some kind of fine distinction between classes of undesirable acts that I'm missing?
Okay, I'll try to be a bit specific on what I'm asserting:
1. For any scale of "ethicality", what matters is not only the actual scale, but also the "zero point". On a hypothetical scale of -10 to +10, there are a lot of choices that are not the most ethical choice - the one providing the most utility, or committing no sins, or whatever model of ethics is used - but are "above zero", so they are ethical actions (ethically permitted actions) despite not being the most ethical choices. Most things that we do fall in the range between, say, 0 and +5 on that arbitrary, hypothetical scale - they're ethically permitted but far from the "most ethical" possible acts. E.g. it's ethical to try and follow the effective altruism movement's principles, but it's also definitely ethical to perform ordinary altruism; it's ethical to abandon your life and go to a poor country to feed starving people, but it's also ethically permissible to not do that and simply live a good life. Otherwise we might as well say that everyone who's not devoting their life to charity is unethical, and that's not what the word means.
2. To adjust for moral relativism - there are many moral standards, however, even in moral relativism we (or at least I) expect them to be mostly aligned. E.g. what's +7 for you might be +10 or +5 for me, but it's very unlikely to be -7. If someone's personal ethics or religious persuasion allows and even mandates them to, for example, rape and kill babies, then we simply mark their relative morals as unacceptable (unaligned?) and invalid despite generally accepting some relativism. And relativism is tricky - we do accept some relativism - if someone asserts that not following their exact moral code to the letter (which most of the society doesn't do) automatically makes someone (e.g. most of the society) immoral and evil, then we consider their morals, or at least that part of their morals, as not aligned with widely accepted morals, extremist, not valid, and ignore it. I.e. the "privileges" of moral relativism seem to be granted only to those who also grant others the same privilege.
So I'm working with the expectation of not-absolutely-objective but still somewhat aligned moral principles, making the assumption that if something seems definitely permissible to me (i.e. not even close to 0, far from the admittedly fuzzy line of what's permitted and required, in my opinion), then for others with different-but-still-reasonable ethical priorities it might be, at worst, mildly discouraged, but not breaking any major taboos.
3. Calling someone unethical or immoral is a strong accusation. It is justified if and only if the action goes "below zero" on that scale, if the agent has broken some taboos or significant moral principles. It's not appropriate if the agent has merely acted suboptimally, or "less ethically" than they could have. It's appropriate if the agent has failed in some ethical duty, if some evil act was done, but it's not appropriate if the agent has failed in some ethical "opportunity", if they did not do something that is nice but not morally mandated.
4. Teaching someone how to use a useful proprietary tool is not unethical. Teaching someone useful skills is a good act that helps that student, does not impose any undue harm, and does not violate any person's rights or moral imperative; it's strictly "above zero" on a moral scale even if it would have been more good or "more ethical" to do something else, e.g. teach some free software instead. All the arguments made above for the ethicalness of teaching free software instead of proprietary tools go into the category of "it would have been better to do that instead"; there was no argument made that teaching proprietary tools is actually harmful or evil, or that there's some specific harm to society that outweighs the benefit to the student. So I did not see any actual justification why the act should be considered unethical (only the assertion that doing something else might be better), but there was an assertion that anyone doing so is unethical.
5. Unjust accusations are harmful. It's offensive, harming people without appropriate reason or justification, and harming people does violate most moral principles. I assume that this is not what's being debated here - I would assume that the grandparent poster would agree that unjust accusations are harmful but would rather contest/debate the position is that these accusations are just; they might question my #4 assertion, but not this one - however, I might be mistaken, of course.
6. I'm not asserting that you should not call out unethical actors because that would offend them; there's nothing ethically wrong with just or justified accusations even if they turn out to be erroneous because of an honest mistake. But I am asserting that in any reasonable debate it's appropriate, polite, and even an ethical imperative to give some benefit of the doubt (not an "innocent until proven guilty" beyond-all-doubt criterion - we're not proposing to execute or imprison someone - but at least some reasonable benefit of the doubt) before making any accusations. This requires some serious consideration of whether the act is actually unethical (according to the criteria of #1) or merely less-than-optimal, e.g. failing to signal some support for an ethical movement.
7. I am asserting that making unjust accusations of unethicality is itself unethical (i.e. not just suboptimal, but "below 0" on that scale, breaking moral imperatives). This is a much wider issue than this debate on free software; I've seen such accusations of unethicality very frequently misused (IMHO even intentionally) in recent political debates on both sides, and I believe that this misuse of accusations is harmful behavior.
Going back to your particular example of the statement "Some people would call an action unethical despite the potential of offending the actor, and although thus speaking out is also a valid ethical choice, it's not the only ethical choice.", it is something that seems reasonable to me - it's permissible to call out unethical actors (though note the abovementioned difference between calling out actually unethical actors versus claiming that an actor is unethical without any grounds to do so), and it's permissible to not call out unethical actors; I don't see any contradiction there.
Well, first off, in this context they're using free=libre not free=gratis. (Free as in freedom, not free as in free beer).
Second, there are a lot of contexts where MIT's educational materials are available for free (gratis). I've taken a lot of MIT courses over the years and never paid a dime except to purchase a hard copy of SICP.
Honestly, imho of course, open standards are even more important than open source. At least with open standards anyone can make their own implementation.
At least until we reach the bloat level of the web.
Honestly, it's not necessarily feasible to run all the tools one might have at the university, and if the only way of doing science is by using expensive apps only available to rich individuals or companies that are well off, well, it's quite limiting in who can gain an attractive amount of experience working with it.
It also sets a baseline for every student and prepares them for a future where science is made to be shared and part of that is using tools and work flows that can be copied around the world.
Personally I think it's a great practice to default to FOSS, whether community bits or commercially supported for a variety of reasons. However:
>if the only way of doing science is by using expensive apps only available to rich individuals or companies that are well off, well, it's quite limiting in who can gain an attractive amount of experience working with it.
The same might be said of all the expensive hardware and other facilities required for many STEM (and even other) courses. Electron microscopes and MRI machines (and exotic computer hardware) are scarce resources however you look at it.
Machines are naturally scarce; you need resources and labour to build each machine. Software is artificially scarce; the reproduction cost for software is almost zero.
So while it is understandable that hardware will be limiting and that cannot really be helped, software should never be a limiting factor in education imho. One can never remove all obstacles, but the easily avoidable ones should be avoided.
Btw., similar arguments apply to books, there is no reason digital versions of books used in education shouldn't be free (as in beer and freedom).
I am the product manager for BigBlueButton. While we implement most of the capabilities you would expect in a web conferencing system, we focus on giving the instructor many ways to engage students for learning. Being open source has enabled many schools around the world to set up and run their own BigBlueButton servers. Thanks to our community, we're localized in over 25 languages, provide a pure HTML5 interface, and have been deeply integrated into many of the most popular learning management systems. Our road map will continue to focus on teacher/student engagement. Needless to say, Covid-19 made a lot of people take a closer look at BigBlueButton. We've been working on it for 10+ years now, and we're very determined to make it the most effective platform for virtual classrooms and to build upon our community.
Some of the folks in the Debian community have been trying to package it up in Debian and found it a bit challenging that some of the dependencies of BBB work only with specific Ubuntu versions. Would you please help address that and help them move forward with packaging it up and include it in Debian? Thanks again.
We're in the process of moving away from an internal build system for the packaging to having the debian package scripts as part of the repo. This work is underway for BigBlueButton 2.3 (the next version) and beyond. Once we get them released, it's going to be a lot easier for others to build and contribute to the packaging.
I think one of the nicer points to take away here is that prof. Sussman worked around remote teaching problems not by getting bogged down in meetings, but by calling a friendly admin and by himself installing a piece of free software on a computer he had lying around in his lab.
Free Software gives you back the agency to solve your problems in any way you see fit (be they hacks or not). It doesn't leave you helpless and dependent on the goodwill of third parties.
"The class used a draft textbook that Chris Hanson and I have written. The book is entitled “Software Design for Flexibility (how to avoid programming yourself into a corner)”; it will be published by MIT Press soon, with a Creative Commons Share Alike license (and all the code in support of the book is under the GNU GPL)."
It honestly would be great to have some competition, FOSS or not, in the distance learning department. Blackboard and friends are horrendous, and as we know Zoom has myriad problems. For a small group meeting I have each week that is now online, I've started using Jitsi too.
This actually surprises me; with how cruddy it is, I assumed it was proprietary... My uni switched from Moodle to Canvas, to the general detriment of anyone without fast internet and a fast PC.
I brought this up once and a couple of university employees were quick to mention that features that actually let them use it at scale cost a bunch of money and there were fairly large support costs even for open source solutions.
Blackboard has some really shady IP practices as well. In 2006 they got a patent essentially for LMS (learning management systems) in general and immediately sued a competitor.
Not only that, I’ve discovered some minor exploits in my time using it. For what it is, it’s horrible software and none of my professors were ever happy to use it.
My university's computer science, physics and math departments use Sakai [1] instead of the university Blackboard and I quite like it. It's FOSS and works well on mobile (in contrast to the university's Blackboard system).
My university uses ILIAS and in my experience it's way better than Moodle, but I haven't used BB. Although I haven't seen any glaring issues in the English translations, it seems to be primarily targeting German audiences. https://github.com/ILIAS-eLearning/ILIAS
A bit tangential to the main topic, but I can't wait for that book to be out. IIRC (I read it somewhere??) it's based on his 'Robust Systems' thoughts and experiments.
> The class used a draft textbook that Chris Hanson and I have written. The book is entitled “Software Design for Flexibility (how to avoid programming yourself into a corner)”; it will be published by MIT Press soon, with a Creative Commons Share Alike license (and all the code in support of the book is under the GNU GPL)
Thanks to open source projects, students are able to learn, experiment and grow. I was lucky enough to have one of our professors introduce me to Asterisk for VoIP calls. Being able to learn and modify the code base and contribute back to the community are skills which helped me grow as a professional.
It turns out Google isn't as good as talking to people. If you'd like a confirmation, e-mail a one-line question to him:
"Hi. I'm interested in how elite schools hire people. Would MIT hire someone like you (as you were when you were hired) today?"
(1) Please don't reference this conversation. (2) Please keep it short. Jerry's super-approachable, but MIT professors get a ton of cold-calls. 40 words tops. (3) If you are MIT-affiliated, mention that. You'll be more likely to get a response.
Or if you'd like more background, call him on the phone (yes, those exist) and see if he has time for a chat. He'll always find time to talk to MIT students (current and former), but for others, it depends on how busy he is.
Life skill: Cold-call interesting people. You can learn a lot that way.
Don't take this so much as being about ideology as about a mode of pedagogical thought. Free software is decomposable to first principles, and it makes perfect sense to use it in a CS class.
If this were some other domain like surgery or structural engineering, using free software would add no value to the process (the domains are already so deep that the students treat all software as black boxes anyway, since their own domain is difficult enough for one person to cope with).
So here, I think, is the line that determines whether it makes "sense" to use free software in a university teaching setting. If a considerable percentage of the students are likely able to move beyond the "black magic box" model of software to actually investigating the CS principles behind it, then using a free stack is definitely beneficial for the education.
If the students anyway treat the software as a black box, then it makes sense to use a black box that is pedagogically most prudent, free or not.
I'm from a developing country and the quality of commercial EMR systems varies from acceptable to abysmal. They usually have very poor usability and security. I once used a system which backed up the database by making a copy of the MySQL directory on the same machine. This other system would fetch all patient data from web APIs without using HTTPS despite the existence of GDPR-like legislation.
So yeah, there's a lot of room for improvement. Hospitals are unlikely to switch to a new system but new doctors might be open to free software. They need expert support for it though. Encrypted cloud storage services for medical data and images would add a lot of value but I'm not sure if that's legal.
I used BigBlueButton between 2010-2013 for remote training classes on OZ Technology.
The infrastructure was built over AWS, automatically stopping the servers after everyone left the channels. For starting, it monitored the training schedule and 'opened/loaded' the channel a few moments before the class started.
That is nice and all, but I fail to see how it is special: the situation described here is also what most of my colleagues and I did during this period (except we used instances of this software hosted by our own university). I believe it is the same in most universities.
What’s wrong with paying for a product or a service if it’s better for my students? Isn’t the teacher’s job to find the best available tools for their students and not engage in some kind of open source software usage high score?
Edit: People are getting downvoted left and right. Why is this such a polarizing topic?
> Edit: People are getting downvoted left and right. Why is this such a polarizing topic?
Likely because there's only so many times someone wants to reply with "Free as in free market, not as in free lunch."
And it's not that much of a polarising topic, but if Gerald Jay Sussman, professor at MIT, co-writer of SICP, board-member of the FSF, writes about "Free Software", and multiple people start off with complaints or comments about "paying", even if there have been multiple posts and corrections in the commentary already, it becomes very hard to retain good faith or discern any value in the post.
>What’s wrong with paying for a product or a service if it’s better for my students? Isn’t the teacher’s job to find the best available tools for their students
There is nothing wrong with paying for something, but what's better is relative.
Isn't it better to not give away students' private information? Isn't it also better for CS students to be able to look at the source code of the tools they are using? Inevitably you will need to choose some metric with which you'll measure the quality of software, and that metric is subjective. This guy chose one which valued the aforementioned qualities more; if you teach your own class, you are free to choose software which better fulfills your subjective criteria.
I pay for SublimeText because I genuinely feel like paying for it - think about it - why would someone go out of their way to spend money on something they don't feel is "better"? I agree that the criterion of what's better is subjective, but regardless, I don't think it says anything about my stance on open source software.
>Don’t we buy oscilloscopes, lathe machines, incubators, scientific glass, etc for our student labs?
Of course, because there's a material cost for manufacturing physical things. The marginal cost of "manufacturing" copies of digital information after the information is created the first time is the price of the electrons it takes to perform that copy, so very close to zero.
I think everyone agrees there's a real weight of responsibility on teachers. Some feel that getting students onboard with FOSS is the responsible choice.
Free software is like physical goods with Right to Repair. In the olden days, if you bought a radio, it came with a service manual and a schematic.
It's how many people learned EE, and it was a huge loss when that went away. People maintained their own stuff. People tinkered. That's how Sussman learned EE too.
That's an analogy to free software exactly, 100%, and spot-on.
Turns out I've fed my family just fine working on free software for most of my life (sometimes as a software engineer, but more often as a researcher, entrepreneur, executive, and otherwise). Probably 75% of my jobs.
Even in an extreme hypothetical -- if the government were to mandate that all software be free software -- only a minority of software developers would lose their jobs. Banks still need to manage transactions. Employers still need to manage payroll. Google still needs to serve up ads and search results. And I don't want to host AWS myself. Those organizations will continue to pay to build software.
There's a huge bit of confusion that the word 'free' somehow means you don't get paid. It doesn't. It turns out if the source code pops up on github under a GPL license, most of the time, the world just keeps on ticking.
There are exceptions, of course -- companies like Adobe would likely disappear -- but for 90+% of jobs in software, whether it's free software or proprietary impacts your ability to make money not-at-all.
The most successful organization I helped found was almost exclusively free software. There were hundreds of people using our platform as open source, and zero of them competed with us head-on. The only differences were: (1) our customers trusted us a lot more (if we went away, they wouldn't be SOL) (2) we had a massive amount of engineering work done on someone else's dime.
In more senior roles, or even being more assertive in most junior roles, I could usually release what I was working on as free software by asking. Right now, of the programming work I do, about 90% is free software. The organization I work at is probably 95% proprietary software. The value of keeping me around + good PR + possible contributions + ... is much higher than the value of having exclusive rights to source code that I write.
I partially agree with your points. I've worked in deeply esoteric fields - from layout editors for semiconductors to wind-turbine simulation packages. There is a whole world of software out there besides the tools most software engineers use - things like PostgreSQL and your favorite web framework.
A turbine simulation package costs $250k a year in license and it is specifically tailored for our wind turbine nacelle loads, configuration and wind farm layout.
If they just gave out this software for free but provided support, I am not sure that would be sustainable. They can provide the source code for inspection ("visible code", not "open source code") if that's your concern. Most esoteric software vendors don't care about the visibility of their code. It is just the right to use it freely that they oppose, and I feel rightfully so.
That criticism sounded a bit harsh and hasty. Sussman has done a lot of good, and one groundbreaking instance you might've heard of is giving away a top textbook, SICP, on the early Web.
I will probably get downvoted a lot, and this is only more or less pertinent, so here's a rant:
I am quite sick of making software work these days: someone coded something using Python 3.6 and numpy 1.18.5, then three years pass. Now it's Python 3.8.9 and the software is incompatible with it, so now I have to download Python 3.6.5 (not Python 3.6.1) and try to make things work. But then TLS/SSL support was discontinued and I'm getting weird messages. Download the source files, ./configure, make, make install multiple times adding different parameters, add new repositories to apt, try to download the right packages, end up installing the latest Python 3.8.9 and other things I don't want. And the software still doesn't work.
Oh boy. Something has got to change in how we make software.
Really though, with the nix package manager (https://nixos.org/), if anyone at any point in time had a working nix package for a given program, and as long as all the inputs (source code) are still either online or cached somewhere with the same sha256, it's possible to get the exact same output.
This works with very few exceptions. Some things, like systemd dbus calls, runtime calls to an internet api that might have changed, other runtime impure things, will of course be exceptions. But in general, there aren't other exceptions. It doesn't matter if your computer has python3.9 installed, nix doesn't mind using python3.6.5 for one old package.
So yeah, nix solves exactly the problem you're complaining about. I agree something's broken in other distros, but nix fixes it.
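To make the "same sha256 in, same output out" idea concrete, here's a toy sketch in Python (nothing to do with nix's actual implementation; the URL and hash below are placeholders, not real pins) of what pinning a source input by hash looks like:

```python
# Toy illustration of the "pin every input by hash" idea that nix builds on:
# a build step only proceeds if the fetched source matches the recorded sha256,
# so rebuilding years later either yields the same bits or fails loudly.
import hashlib
import urllib.request

def fetch_pinned(url: str, expected_sha256: str) -> bytes:
    """Download a source archive and verify it against a pinned hash."""
    with urllib.request.urlopen(url) as resp:
        data = resp.read()
    actual = hashlib.sha256(data).hexdigest()
    if actual != expected_sha256:
        raise RuntimeError(
            f"hash mismatch for {url}: expected {expected_sha256}, got {actual}"
        )
    return data

# Hypothetical usage (placeholder URL and hash, for illustration only):
# src = fetch_pinned(
#     "https://example.org/numpy-1.18.5.tar.gz",
#     "0" * 64,
# )
```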
Note, others will say containers solve this, and that's also sorta true. If someone had a docker image laying around with exactly that version of software installed and working, and you can still download said image, you'll get a working setup that way. But actually rebuilding the container (unless it's built with nix) is unlikely to be as nicely reproducible, and it misses some nice properties as a result.
Also, I can see why you might get downvoted, since this isn't really that relevant to the post. It's not just free software that suffers from this problem; shipping and packaging software is a space full of unreproducibility and incompatibility.
That doesn't fix anything but just pushes the problem further down the road. You are simply not upgrading the python version and running the identical three year old version.
It's kinda relevant: I imagine sending kids a 3-year-old assignment and making them run it with current software versions... A lot of them would be disgusted by this back-and-forth of installing, tweaking, re-installing, virtual envs, all of that, and will only use Windows, let's say, which hides almost everything from the user these days.
I typed "python" on windows the other day and it open the MS-Store, asked me to install Python3.8.3. I said OK. Then: where did it put it? The folder is like a crazy sequence of numbers somewhere in the structure. Is this right? For users, maybe, for developers, a nightmare.
I know! That's exactly my point! I didn't like it when it landed somewhere in my drive. I'm not saying proprietary is good, I'm saying it's better (these days), despite these little things that they do for high-level users. In a nutshell: they don't know who their users are: beginners, intermediate, advanced. They have to appease a lot of users, so they do these things and usually get away scot-free.
Proprietary sw: easy to use/maintain, fun, works most of the time, professional developers behind it, profit.
Open source sw these days: several versions, constant tweaking, works so-so (depends on the software), semi-professional developers working on it (I am not saying they're not good) or amateurs (with amateur designs), not for profit, of course.
EDIT: remove double 'multiple versions'.
I think an example might be in order, because the case you mentioned sounds quite unlikely. Python 3.6 had some of the changes that most frequently affect how one writes code in Python, while 3.8 did not have that many such changes in my experience. Although I have not used async much yet. I am using Python almost every day at my job and have no issue with 3.8 at all, and also did not have to port anything, although I was writing for 3.6 before. I think they are doing a great job of avoiding incompatibilities.
For the case of numpy, some pretty obscure functions must have been used then. Usually there will be deprecation warnings ahead of time, and a bit of searching will clear up what you should use instead when going with later versions. Also, they don't simply remove the dot product or something.
So that's why I think an example would help your post and make people understand that specific case.
What's wrong with having to pay for software? Or learning to accept that some software is proprietary? Or even with learning to use the right tool for the job, even if that "right tool" may sometimes come at a cost?
Students are required to pay for their education at MIT. Were the costs of this course offset with the costs of the non-free software used in an otherwise "standard course"?
People put immense effort into developing software. Is asking for compensation for one's time and effort somehow wrong?
And in many cases, proprietary/commercial software really does outperform the equivalent FOSS/Libre solution. Why are we teaching people to reach out for the suboptimal tools in these situations?
The second of the "four freedoms" of free software is
> The freedom to study how the program works, and change it so it does your computing as you wish.
That is particularly important for students of computer science -- what better way to see how all kinds of software works than by reading and modifying its source code?
I'm surprised if a "standard course" doesn't mostly use free software. Mine certainly did, I remember only a single module where we used a commercial software package (something for hardware simulation). I've never felt that this has limited me in any way. One module had us modifying the Linux kernel.
You find people willing to pay for your work, not for a license. If your ongoing work is not worth anything, why should you be paid extra for work you did in the past?
>If your ongoing work is not worth anything, why should you be paid extra for work you did in the past?
If you write a piece of software that took you a year to write but buyers are only willing to pay $500 for, then do you (a) live on $500 or (b) try to sell to many buyers, including some in the future?
You seem to be saying (b) is out of the question but if it is who will write the software these people wish to buy?
You can find multiple buyers before you write the software, you know - you don't have to have just one person paying you. If it's consumer software, things like Kickstarter or Patreon are how this is often done - if it's b2b, this is how a lot of software development is done anyway. Examples of software successfully funded via crowdfunding include Spine, Magit, and Diaspora - although Spine appears to be closed-source anyway. You can write an MVP, pitch it to potential clients, and write software that provides value to them. That's what I'm doing.
Yes, copyright is unethical, as is closed-source distribution. Sell actual work - there's always work to be done on or around software. My project is livestreaming software, but it turns out there's plenty of folks who don't want to set up livestreaming software, they just want to pay someone for the result of having a solid branded stream - so my business model is providing them that end. If I could do it reasonably with existing software, I would, but I can't, so I'm writing software for it.
And there's nothing to stop someone else letting you write all the code and getting paid for the bit you charge for, right?
In fact they could work full time on that while you have to spend time doing software updates/maintenance etc.
If I'm not making money off it, I'm going to stop writing it and find something else to do, obviously?
Being able to say "I have deep technical knowledge of this domain, proven by the fact that I literally wrote the software and can customise it to your needs" is worth something, unsurprisingly.
I think you're assuming that most software work is confection.
In reality, or at least in my case, pretty much all the money I've ever earned was in doing bespoke work.
You _could_ see contributions to FLOSS as loss leaders; though that wouldn't be accurate, since there are definitely benefits beyond just advertising your skills.
A key benefit: if there is a set of freelancers working around a single FLOSS code-base, each of them actually benefits by contributing back; because the shared code-base increases in quantity and quality, and thus leads to competitive advantage for all.
How could an architect of great ability ever become wealthy if they weren’t able to just sell the same building design over and over again with no work for decades? Oh wait, that’s the norm.
Earlier you said copyright is unethical and also why should you be paid extra for work you did in the past.
Architects have copyright to their work. They do sell the same plans over and over. No one can just copy their work without paying royalties or at least getting permission.
Wealthy, great architects make their fortunes on bespoke work. Even the plans they sell often have to be changed for each location - it's uncommon that the exact same building can be replicated in different locations over time, due to local planning constraints, preferences of their clients, new regulations, and so on.
Great software developers deserve to make their fortune too and should be free to do so in a manner that suits both them and their (consenting adult) clients without name-calling from developers with different philosophy/orientation.
Copyright is a legal fiction - it doesn't exist except by force of the state, and there's plenty of evidence that it has a large variety of downsides. I think it's entirely reasonable to argue for alternative methods of providing benefit to society that don't rely on having the state threaten people for you.
Or, in other words - the default state of things is that copyright does not exist, not that it does. It's on copyright proponents to prove that we have a better world with it than without.
Great software developers already largely make their fortune doing bespoke work for clients with a need for it. So do great lawyers, great doctors, great system administrators, great technical writers, and so on. This isn't a new idea.
Not all software is a line-of-business bespoke job for a megacorp. Family businesses sometimes need software specific to their vertical market, but can't afford to employ someone for a year to build it.
Indeed, but who's making their fortune selling software to family businesses? If you're going to make "a fortune"... sell to people who have money. You can make a living though, quite happily. If there's a pile of small businesses which all need more-or-less the same software, set up a crowdfunding campaign and advertise it wherever they hang out. Sell a support contract. Sell a training course. When someone wants that one extra feature, quote them for it.
Or even better, grab some other people with knowledge of the domain and go find capital so you can do what the other businesses are doing, but better, because you're backed by deep knowledge of the software that runs your company and the other companies have no clue and no money.
Why do you think so? Purism created its products using crowdfunding, i.e. they sold their devices (and the free software on them) before anything was created.
In the particular case of Purism's Librem 5 phone, many people decided to buy the device in order to support the development of free software for mobile phones. As opposed to the PinePhone, whose makers do not develop any software and rely on "community developers".
Really? If the car contained no software, people would still buy it over alternatives that contain software at various levels for a better driving experience, comfort, and entertainment?
Becoming employed by a car company that wants to improve the experience of their vehicle is a perfectly reasonable option though. Or writing software for someone else's car - there's more than enough gearheads who'd spend a ridiculous amount of money on custom code for their vehicle if only they were permitted to.
Develop domain knowledge of the motor car? These are ever more outlandish scenarios for making money from software in an open-source way.
Well I want to develop software for devices people already have, their PCs.
Nobody but some software developers and those who don't want to pay anything cares if the software is open source or not.
Proprietary software has worked well for business for decades. It's more likely to be maintained and extended because it's paid for.
In my market there's cut throat competition between various proprietary software companies. All have developed software to migrate each others' data.
Don't like software A then go with software B or C or D.
I'm struggling to understand your reasoning. If the author had said "I only drink fair-trade coffee" would you also object if it happens to be the case that the fair-trade coffee is the cheapest type at his local supermarket?
We're discussing an educational institution with power and authority (MIT), which promotes drinking "free coffee" whilst simultaneously portraying its consumption as somehow morally superior to drinking "coffee one has to pay for".
Growing and selling coffee takes time, labour and effort - yet none of that is being reflected or accounted for when we choose to not pay for the coffee we consume.
Is this a sustainable approach? Does it promote "choosing the right tool for the job", or does it promote blind idealism ("free is better")? And why does a university, which takes exorbitant tuition fees, not prioritize the best software for the course (over the one that's merely free)?
You seem to fail to grasp the difference between "free as in beer" (gratis) and "free as in freedom" (libre).
Your coffee response is all wrong: this is not about "free" coffee, about (not) having to pay for coffee, but about a moral stance on coffee: e.g. demanding that the cafeteria only serve Fair Trade coffee, regardless of the price.
This is not about "having to pay for it" at all. The opposite really: running your own jitsi or BigBlueButton is probably more expensive than using the free tier of Teams, Zoom or Hangouts.
I run a large-ish jitsi instance: approx €50/month just for the VPS, my hours probably add another €2000/month to that.
> "You seem to fail to grasp the difference between "free as in beer" (gratis) and "free as in freedom" (libre)."
Why are we going back-and-forth about this again?
I previously asked whether there were concrete examples of software used during the course that were "free as in freedom" but NOT "free as in beer", and the response to my question was that no such examples were available.
It's the actions that matter here, and the bottom line is that they weren't intending to pay for anything to begin with. And by that, they were forfeiting the "it's not about the money" argument, in my honest opinion.
> "I gave you an example by quoting how expensive it is to run a "free" jitsi service."
That's an entirely meaningless argument. Here's what you're doing:
> "I run a large-ish jitsi instance: approx €50/month just for the VPS, my hours probably add another €2000/month to that."
Here's what Sussman is doing:
> "I used a Jitsi Meet server that I installed on an obsolete and otherwise useless computer that was sitting idle in my laboratory, on its way to the electronics junk heap."
The two scenarios are not comparable.
I'm not making exaggerated claims here. The University can absolutely afford to do better, the students (who pay exorbitant tuition fees) deserve better, and any "libre software" idealism here is simply people trying to cut costs, jeopardizing the quality of education and the overall experience, while touting moralistic superiority...
Distance learning, for example, could've been a much more widespread and accepted thing, had it not been for instructors cobbling together scrapyard-bound hardware to use as a chat server. Coming up with a proper solution takes investment (time, money, expertise) - which some people will evidently avoid at all costs...
"It made available licenses for various nonfree programs, but I objected to them on grounds of principle.".
Which is to say Sussman could effectively, for the purposes of this course, get _any_ license for free-as-in-beer.
So the actual decision would purely be on some other grounds. Since Sussman is a world renowned CS teacher, I choose to believe he made his choice based solely on whether it was most suitable to teaching CS.
(This is not an unreasonable belief: The concept of "Free Software" guarantees that the student is able to take the software apart to see what makes it tick. That is obviously a very valuable property when learning how things work!)
> (This is not an unreasonable belief: The concept of "Free Software" guarantees that the student is able to take the software apart to see what makes it tick. That is obviously a very valuable property when learning how things work!)
And something that resonates very well with the basic attitude and culture of academia/research.
The article you point to illustrates that that basic attitude and culture is under threat. I agree. Openness, being able to build on the works of others, and learning "how something ticks on the inside" are still basic to science though.
I, too, wish that what the article illustrates weren't happening, but are you really arguing that you can't take the idea from a published paper, understand it and build your own work on it? The FOSS philosophy is the closest equivalent for code.
> promotes drinking "free coffee", whilst simultaneously portraying it as somehow morally superior to "coffee one has to pay for" to consume.
The price does not enter into Sussman's argument. Maybe it could, given how students tend to be short on money and abhor paying for expensive textbooks, but it doesn't.
> Growing and selling coffee takes time, labour and effort - yet none of that is being reflected or accounted for when we choose to not pay for the coffee we consume.
In my coffee analogy, the promotion of the fair-trade alternative would be based on the fair-trade mechanism for ensuring the lower levels of the production chain receive a fairer share of the income. Whether or not the author pays less or more at the store is irrelevant to the argument.
> Is this a sustainable approach?
For this class? Almost surely!
For some people (e.g. me)? To a large extent (things aren't black and white). Apart from (the admittedly large chunk of) non-free Javascript run by the websites I visit, and some firmware, my computing world runs entirely on FOSS.
For absolutely everyone in every situation? Surely not. That's OK.
Really, the only point that matters a lot here is the first one.
> Does it promote "choosing the right tool for the job", or does it promote blind idealism ("free is better")?
It seems to me to also promote the fact that FOSS is far more compatible with academic culture and behavior. While indeed you may have to pay publishers for access to articles (luckily a practice that's in decline!), you certainly have complete freedom to build on the work presented in those articles for your own research!
I'd go so far as to say that no closed tool can ever be "right for the job" in an academic research setting! (Although one sometimes does have to compromise when no adequate alternatives exist, especially when it comes to lab equipment – but in the CS world things are a lot better.)
> And why does a university, which takes exorbitant tuition fees, not prioritize the best software for the course (over the one that's merely free)?
I don't understand what MIT's tuition fees have to do with this.
> "In my coffee analogy, the promotion of the fair-trade alternative would be based on the fair-trade mechanism for ensuring the lower levels of the production chain receive a fairer share of the income. Whether or not the author pays less or more at the store is irrelevant to the argument."
But for there to be any form of income trickling down production chains, someone has to be willing to pay something for the services they consume. Is this controversial in any way?
> But for there to be any form of income trickling down production chains, someone has to be willing to pay something for the services they consume. Is this controversial in any way?
No, but it is also entirely orthogonal to the blog entry we're discussing. At no point does the monetary cost of something enter into what Sussman is talking about. You seem to be conflating free-as-in-beer with free-as-in-freedom (man, I feel nostalgic writing those phrases again – I don't mean this as an offense, but the difference seems to be so much more well understood among the tech savvy these days than it was 15 years ago).
This is where the coffee analogy becomes relevant. While indeed the cost of coffee affects a lot of things in the coffee value chain, the idea of fair trade (for all its flaws, let's not digress into those here) is not about the end product's cost.
Your question is missing the point, but to give an answer:
Sussman taught the first edX course, 6.002x. Agarwal took credit for it (since he shot the videos and was the face), but Sussman did the plurality of the work, followed by Terman, by Mitros, and then by Agarwal.
Open edX is free-as-in-freedom but not free-as-in-beer.
I actually think that free software is an excellent choice for a classroom because not only can the students examine how everything works, they can take a copy home with them and modify it.
I wish I had had that opportunity when I studied computer science, because my tools were treated as a sort of "magical black box" with no visibility.
The programs we wrote in class were toy programs, and there was no sort of "reality check" for how our editor or compiler or any other tool was actually written.
A large part of this is "sales". Commercial software, almost by definition, has a sales team that ensures the software gets bought by universities, companies, governments, etc.
Non-commercial software has nothing like that. It may have advocates who, out of sheer enthusiasm, bring software into such universities, companies and governments, but it hardly ever has an actual sales team.
And since non-commercial software is largely FLOSS, whereas commercial software is most often proprietary, we see that, simply through sales, the proprietary alternatives are used far more often.
Regardless of any technical merits. Technical superiority hardly "sells" software; sales teams do that. Unfortunately, I might add.
No one objects to people being paid to make software.
What if people were paid through public grants, or paid well through the welfare state, instead of through the legal scarcity provided by intellectual property?
You're assuming a generalization which is not present in the article. People can disagree on the merits of paid software and still agree that teaching using open solutions has benefits. One does not necessarily follow from the other.
While commendable, I don't think this approach is useful. I also had a professor who wanted to use only FOSS with his students, but the reality is that university should prepare students - at least to a certain degree - for work.
Taken to an extreme, this approach produces students who land their first job without knowing how enterprise commercial software works, and who therefore lack a very useful entry-level skill.
I do not think the job of universities is to teach how to use any specific piece of software - be it enterprise, commercial, proprietary, free or not. That software changes and often differs between companies, so what should be learned is knowledge that applies regardless of the software.
And TBH I do not see why a university should provide free (or worse, paid[0]) advertisement for a commercial product.
[0] I mean paid for by the software companies, and it is worse because the students either paid for admission to the university, meaning they paid to be advertised to, or they enrolled in a public university, meaning the taxpayers paid to have their children advertised to.
It is the job of universities to participate in shaping our future. In that sense, I'm all for universities pushing students in one direction or another (provided that direction doesn't put them in difficulty later in their lives).
IMHO, you don't advertise free software, because free software is not commercial per se. Advertisement is for commercial products (and, with a bit of sarcasm, I'd say that most commercial software needs advertisement either because it doesn't offer enough value or because its makers just want to be bigger than the others; in both cases, mankind is not well served).
>>> the taxpayers paid money to have their children advertised to
You cite "a professor", and this article is also about a singular professor. That implies that this is the exception, not the rule. Students have ample opportunities elsewhere for exposure to "enterprise commercial software".
The delta between Jitsi Meet and Zoom or Teams for video-conferencing is minimal from a user experience perspective. The idea that a CS graduate from MIT would struggle to figure out Zoom or Teams at their first job is laughable.
Universities do not exist so businesses can outsource job training. Academia is an end in itself, not a means to an end. It's about teaching different ways of thinking, and performing research.
Vocational schools exist if you want to be spoonfed how to do something cut-and-dried.
>Academia is an end in itself, not a means to an end.
Sorry, my overly idealistic friend, but as long as every job posting "prefers" (i.e. requires) a BS in CS, it's fine to see the pursuit of that BS as a means to an end.
Sure, but then don't put your entitlement on the university. The university offers you an academic degree; if you want to use that to get a job, play by its rules, or don't go. Free market and all that noise.
Why? It sounds like a symbiotic relationship still.
Academia trains people for academia, which is an end in itself. Industry happens to like the skillset that people with that training have. This motivates people who do not wish for an academic career to undergo academic training.
Win-win. It doesn't mean that said academic training has to adapt to the whims of industry, or turn into trade school.
Is there maybe room for a less academic higher education system next to the more academic one? Sure. But getting industry to accept that (as in e.g. Germany or Switzerland) is not academia's concern.
Academia doesn't only train people for academia, for the simple reason that there are clearly not enough academic jobs for all of the students who come through.
And the teachers and professors know that.
Here's one CS department which explicitly states how it can help students improve their career chances in the professional world, quoting https://cse.unl.edu/focus-areas :
> The Department of Computer Science and Engineering at the University of Nebraska-Lincoln introduces Focus Areas for its Computer Science and Computer Engineering majors. The goal of the Department is to equip our graduates with advanced skills focused in specific areas to better position them for successful careers. In today’s professional world, computing and computational problem solving skills are ubiquitously in demand in a host of advanced technology and scientific applications.
> Of course, the first thing people imagine when they think of getting an English degree is teaching high school. Teaching is certainly a noble profession, but there are many career paths one can pursue with an English degree outside of teaching due to the excellent training provided in this field.
I must be doing something wrong, because all of my work over the past decade was done using mostly (like 95%) free software. And as a hiring manager I cared not at all whether my hires knew about "enterprise commercial software". In fact, I'd also add that the time I spent learning commercial dev tools (the Microsoft stack) turned out to be absolutely useless in the long term. Worse than useless, in fact, because I could have spent that time learning tools and libraries whose vendor doesn't pull the rug out from under me every 2-3 years to force upgrades, as Microsoft has a habit of doing.
Seriously? In 2020, the idea of writing software using free tools is "radical"?
> without knowing how enterprise commercial software work
What software specifically? I mean, you want to write web clients? Almost entirely free tools. Android or ChromeOS drivers? Ditto. Backend cluster deployment paradigms? Free. Docker? Free. Kube? Free.
I mean, it's not like you can't find some worthwhile "enterprise commercial software" out there to buy. But to pretend that the bulk of the most exciting work in software isn't almost entirely done with open source tooling (and has been for more than a decade!) is... just very strange.
I am a fan of open software, and also a realist. I think what he did is great. My only concern is that the source code in the book is GPL-licensed (and not MIT-licensed or, better, public domain). This means a student taking the class technically isn't allowed to reuse any of it later in their working life (unless, of course, they work at an open-source company, which only a few do; a bigger exception is if the company uses the software only in-house - but this is the EE department, so most students will probably work on products later).
Most coursework is (implicitly) proprietary, and that's not a problem, so I wouldn't expect an issue there. I suspect the key issue here is that the students are unlikely to be copying the material at a level which would qualify as a derivative work.