Facebook Announces React Fiber, a Rewrite of Its React Framework (techcrunch.com)
851 points by apetresc on April 18, 2017 | 406 comments



I'll repeat the comment I made in the "React 15.5" thread a couple weeks ago (https://news.ycombinator.com/item?id=14063818):

For those who are interested in some of the details of the work that's going on, Lin Clark's recent talk on "A Cartoon Intro to Fiber" at ReactConf 2017 is excellent [0]. There's a number of other existing writeups and resources on how Fiber works [1] as well. The roadmap for 15.5 and 16.0 migration is at [2], and the follow-up issue discussing the plan for the "addons" packages is at [3].

I'll also toss out my usual reminder that I keep a big list of links to high-quality tutorials and articles on React, Redux, and related topics, at https://github.com/markerikson/react-redux-links . Specifically intended to be a great starting point for anyone trying to learn the ecosystem, as well as a solid source of good info on more advanced topics. Finally, the Reactiflux chat channels on Discord are a great place to hang out, ask questions, and learn. The invite link is at https://www.reactiflux.com .

[0] https://www.youtube.com/watch?v=ZCuYPiUIONs

[1] https://github.com/markerikson/react-redux-links/blob/master...

[2] https://github.com/facebook/react/issues/8854

[3] https://github.com/facebook/react/issues/9207


I admire your exceptional work cataloguing these resources. However, just one look at that giant hoard of links is, to me, a perfect demonstration of why the front-end development ecosystem is way out of control.

It's amazing to me that the autoconf/automake/libtool system for making write-once-run-everywhere *nix applications is downright simple by today's standards.

Every year the hot libraries change, the build tools change, the metalanguages, the data formats, even the damn paradigms change. Each generation requires more layers trying to fix up the underlying evil that nobody who soaks themselves in the matter will admit: the browser is a bad app platform and JavaScript is a bad language. Both have been pressed into serving a purpose for which neither was designed.


Your comment seems to be a bit of a non-sequitur.

Application development is complex, period. Posix development is complicated. Qt is big and complicated. Makefiles are complicated. Autotools is complicated. Big Java server apps are complicated. C++ is complicated. We've just hit the point where people are trying to do the same thing in a browser.

Would you prefer to build your application using Make, Scons, Waf, CMake, QMake, Tup, Bash scripts, Maven, Gradle, Ant, Buck, or something else? The build tool question is _totally_ solved, right? :)

Yes, the JS community has been reinventing a number of wheels. Yes, the browser environment and JS standard lib have weaknesses. On the other hand, it's also allowed developers to build useful applications that are immediately distributable to pretty much anyone who has an internet connection.


The benefits you acknowledge (distribution) are not because of the tech but in spite of it.

At the end of the day all it would take is one of the 3 major browser vendors to offer an alternative. That alternative could be distributed to a large percentage of desktops in a relatively quick manner saving billions if not trillions of dollars in wasted productivity.

It will never happen though.

* EDIT: This comes from someone who uses React, Redux, webpack, etc. daily


No one could even agree on making HTML5 valid XML, in order to leverage all the tools and knowledge that exist in that space.

Not that I even like XML. But having another XML dialect that's not actual XML shows that the only hope is to build valid tools and abstractions on top of all the crud.


There's nothing "not actual XMLish" about HTML. HTML is formally describable in its entirety using SGML (see my project at http://sgmljs.net/blog/blog1701.html), just not by the XML fragment of SGML.


Is it possible to scrape random HTML5 pages off the internet and transform them with XSLT 2?

Or write content/metadata schemas (distinct from a DTD schema that simply describes HTML5) using XSD?

Unlike XHTML, my understanding was that the current tooling around XML won't just work out of the box for HTML5.


You're missing the point. HTML wasn't XML because XML didn't exist then. SGML did. XML is "an SGML dialect that's not actual SGML," not the other way around. The people who designed XML knew the SGML spec, as they wrote XML as a subset of it, and they knew what was in HTML. They could have written XML so that HTML conformed. They chose not to. But the HTML spec at the time was absolutely conformant SGML.


What cwyers said. Plus, XML was designed to replace HTML as XHTML, but it didn't happen. That doesn't mean you need to give up markup processing altogether. Just use full SGML, which has additional features not in the XML profile, such as tag omission/inference and other markup minimization features for text authoring, including markdown and custom Wiki syntax support. You might actually like it.

You can parse HTML 5 into XML using my HTML 5 DTD [1] and e.g. OpenSP's osx tool or my upcoming sgmljs.net tool.

What do you have against DTDs? They do the same as XSD, and then some.

[1]: http://sgmljs.net/docs/html5.html


> But having another XML dialect that's not actual XML…

HTML is based on SGML, it was never an XML dialect.


>"At the end of the day all it would take is one of the 3 major browser vendors to offer an alternative."

You mean like Dart?

Besides, languages that target WebAssembly are almost certainly going to evolve to fit that space.


I have serious doubts that WebAssembly is really going to come to dominate the web the way many people here seem to believe it will.

The idea that all of a sudden there is going to be a rush to push compiled/closed sites after so many very successful years of using free-and-open stacks seems rather illogical to me.

Time will tell.


The success of the JavaScript stack was not dependent on source being shipped to the browser, and the push for WebAssembly is not motivated by keeping source away from the browser.


What? It absolutely was. Whole generations of today's JS people trained themselves with View Source.


....decompiling minified Javascript?

Sorry, this ship has sailed.


That was before minification. I think that ship already sailed.


> The idea that all of a sudden there is going to be a rush to push compiled/closed sites after so many very successful years of using free-and-open stacks seems rather illogical to me.

Well, then we have minified JS that isn't exactly readable. Similarly with multiple-compiled JSX-to-ES6-to-ES5/JS code.

Also, there's a number of JS obfuscators, whose main purpose is to smuggle a piece of closed code into those theoretically “free-and-open” stacks.

Thus, I think, after wasm becomes generally supported, there will be more incentive to write code in your language of choice and compile it to a common platform, than to write code in JS or one of its derivatives and minify/compile it anyway.


I see your point. But I'm more optimistic. We devs like to jump on hype trains. We only need one good reason to do so.


>"compiled/closed sites"

It's worth noting WebAssembly isn't compiled into a form only computers can understand. There are plans to offer 'View Source'-style functionality for WebAssembly. WASM is a compressed AST rather than hardware-specific bytecode.


> Besides, languages that target WebAssembly are almost certainly going to evolve to fit that space.

And they already are, just targeting JS for now. Like PureScript, ReasonML, Elm, and even TypeScript to some extent.


If this is such a good and valuable idea, why will it never happen? Seems the major internet-interface makers (Google via Chrome, FB via their app, for ex), are vying for control over the way people develop web applications too. Making an alternative that gets adopted would be invaluable to them.


It won't happen because none of them will implement the other guy's new thing. Chrome would do the Google thing, Edge would do the Microsoft thing, and Firefox would sit back and balk that none of that junk is standardized. Then webdevs would just stick with Javascript and friends because it's the only thing that mostly works everywhere. That's exactly what happened to Dart, right? It had aspirations of hopping into Chrome along side JS but who else was going to put it in their browser? Now it's just another thing that compiles to Javascript because that's the only way you'd get anybody to adopt it.


You could also see it the other way around. That we've come pretty far with standardization based on JavaScript and HTML5 (and CSS and SQL and POSIX and IP ...). What do you expect exactly?


Mozilla is probably in the best position to do something. If it's good enough to get developers to switch and rave about it, ms and google will follow to keep mindshare.


JavaScript was invented by Netscape, the predecessor of Mozilla. Also, Firefox came out in a phase of browser stagnation, but Chrome in particular and the other browsers are so powerful today that I'm more worried we're going to lose web standards to "whatever WebKit does". So I wouldn't hold my breath.


Not true, see webassembly.


The parent and grandparent were talking about unilaterally developed tech, "At the end of the day all it would take is one of the 3 major browser vendors to offer an alternative." etc

The correct comparison there is Dart. WebAssembly is a joint effort being developed by a team comprised of folks from every major browser vendor, so it isn't really an example of what they're talking about.


That already happened and is called webassembly. And it is backed by not one, but all of the major browsers.


Dart is shipped with Chrome, isn't it? I don't see any huge success :)



> Would you prefer to build your application using Make, Scons, Waf, CMake, QMake, Tup, Bash scripts, Maven, Gradle, Ant, Buck, or something else? The build tool question is _totally_ solved, right? :)

Yes.

I've worked with CMake, Gradle, Maven, Ant, with webpack, makefiles, and the entire rest. And you know what? I went back to ES5, just because it's easier than fighting the JS environment. Comparing your average build.gradle with a webpack config is downright insane.

Why does the web environment have to be so much more complicated than the native world? The web could be so much more powerful if they'd study history, and learn from it.


With what webpack config though?

Half the problem with the web stack is inexperienced people making things way more complicated than they need to be. A minimal webpack.config is pretty damn simple.


> Qt is big and complicated.

I'd encourage every developer considering Electron to give Qt/QML a try. It merges the declarative nature of WPF with a nicer syntax and the ability to use JS directly in the QML file, obliterating the need for the converters or formatters that plague XAML/WPF and JavaFX. Writing custom components is trivial and intuitive, with zero syntactic overhead (if a QML file "MyCompX.qml" is present, you can instantiate its contents with "MyCompX" without any additional setup).

For Android, it has the advantage of being able to hot-reload the UI without jumping through hoops (simply serve the QML files over HTTP and install a small reload handler in your app that can be triggered either from the app or by something like a UDP message from the host). (Change one line in the app's main and you are able to pull the QML files from a local directory, so you can modify them on the tablet without a PC or internet connection ...)

I'm actually reluctant to mention Qt/QML because I consider it to be my secret magic silver bullet for portable (non-web) UI development.

Regarding the license (there's a lot of misinformation, including Qt's main site trying to coerce you into using the commercial license): if you link dynamically, commercial usage is not a problem (LGPL). (This is usually only an issue on iOS.)

The recent Electron situation (needed on one hand, loathed on the other) is a unique opportunity for Qt to gain (back) some developer mind share. But to avoid alienating web developers with scary C++ stuff, it would help to decouple QtCreator from the Qt C++ API: offer a version that only exposes the QML editor and interface generator, ships qmlscene as the execution runtime, and provides an option to deploy the QML-only application with a rebranded qmlscene (embedding the QML resources and a custom icon). To be clear: this all exists right now, but with the C++ part enabled (which will scare off some developers even if they can create applications without it).


>The recent Electron situation (needed on one hand, loathed on the other hand) is a unique opportunity for Qt to gain (back) some developer mind share, but to avoid alienating web developers with scary C++ stuff it would help to decouple QtCreator from the Qt C++ API

For me it's exactly the other way around. If I'm using Qt it's because I want to use C++ to make my app as efficient as possible in terms of CPU and memory usage. Now that JavaScript is in the picture, I'm less likely to consider Qt for a cross platform app because I can just as well use any of the other more popular options like Electron or ReactNative.

Qt used to be a C++ GUI framework. Rebranding it as something else won't work. They won't get the web crowd and they will alienate people who want to avoid web technology.


> Application development is complex, period. Posix development is complicated. Qt is big and complicated. Makefiles are complicated. Autotools is complicated. Big Java server apps are complicated. C++ is complicated. We've just hit the point where people are trying to do the same thing in a browser.

Actually, Java, C++, Qt and co. are going to great lengths to make development simpler and all the compilation pipelines easier. Take Java, for instance: Groovy, Maven, and WAR packaging considerably eased Java server development. Java frameworks like Spring are trying very hard to make servlet development easier by making complexity opt-in. Java is getting constructs like modules, which will help with library development; C++ too. Qt is streamlining its API ... So I'd totally disagree with your statement. The success of Go is yet another proof of this. People are tired of unnecessary complexity and are ready to give up on "elegance" for the sake of simplicity.


And the Javascript world has gotten a defined syntax for modules, package managers, and build tools that implement build pipelines, compilation, and linking. Same concepts, new language and environment.

My point is that building complex software inherently involves _some_ amount of complexity on the development side, which is reflected in the language syntax and tooling used with that language. Now that people are building complex apps in Javascript, we're seeing catch-up in the tools used to build Javascript apps to match the tools that have existed for years on the desktop/native side.


I don't know about Java frameworks and build tools really making things easier. They just shift problems around, so that instead of having to deal with the Servlet API (say) you have to deal with Spring idiosyncrasies, IDE integration troubles, cognitive overload from too much tooling, and endless discussions about the "right" way to use a particular framework or design pattern. Saying this from 20 years of Java experience.


Putting the word Java next to "simple development" is just absurd for any real application, much less most enterprise applications.


Is it that absurd?

Building Java applications using Undertow with a build setup in gradle is not _that_ different from a node.js service using Express and package.json for dependencies, or a Ruby, Sinatra and Rake stack. After having tried all three, I ended up going back to the JVM. It's not a perfect ecosystem, but it's a damned good one.


If you're on gradle you're already pretty cutting edge compared to most enterprise java setups. Probably most projects are still using maven.

Also there's definitely been some cross-pollination of ideas from at least the Ruby world to the Java world (Rails inspiring Grails, and probably Play too), so it's getting better. Add Java 8 lambdas, and it's quite nice.


> Probably most projects are still using maven.

Aren't you optimistic.


Maven's pretty big still.

https://github.com/search?l=Maven+POM&q=pom.xml&type=Code&ut...

https://github.com/search?l=Gradle&q=build.gradle&type=Code&...

And that's public github, which is supposed to be more cutting edge than the legacy code that powers most companies. I started a project with gradle on my last team and everyone complained because they had never used it before.


Yeah, I would love me some maven...


Just got rid of a project using a 10-year-old custom-patched Ant.


Rust, PHP had simple af builds/packaging. So no, not everything is complex, just old dinosaurs AND web.


I would never want to use php's "just recompile it with the new dependencies" approach ever again. It's a complete mess as soon as you go off the rails of what's included by default.


I will never believe people recompile PHP binaries every day.


PHP's packaging was only "simple af" until you had a binary dependency that wasn't part of your operating system.

Complicated things exist.


But now you're hitting on one of the main points that was always criticized in PHP: the fact that it came with a lot of batteries included. The upside of this is that it makes PHP applications very easy to deploy, because often, you either don't need to resort to binary dependencies outside the realms of your OS's packaging system, or there will be a `php[major-version]-[library_binding]` package for your OS probably already in your OS's official package mirrors. Say all you want about PHP, but of all the application stacks I've had to deploy, PHP was and is probably still the simplest. The advantage of that shouldn't be underestimated. The only serious competitor in that space that I've ever touched is probably Go.


Yeah, you can recompile PHP 2-3 times per year, and it's just a long line of args, nothing more. Compare that with the every-build compilation of modern JS apps. What an equation! And you forgot to mention Rust, but let's pretend that's a normal way of arguing.



The most disappointing part of all of this is that in the JS environment no build step is required, but it is often there anyways...


What do you mean? Are you referring to gulp and other similar tools?


I am referring to any build step. Do you need a build step to execute JS?


Browsers and JavaScript are what we have, we can't magically replace every browser and JS implementation in the world. We have to figure out where we want to go and slowly get there. That is exactly what is happening in the frontend world and that is why there is so much churn.

Browser apps have to work across 4 major browsers with versions going back almost 10 years, an infinite amount of screen sizes, and a near infinite amount of devices. There is no other platform with the capabilities of the browser that has this much reach.

If we had the ability to start from scratch and build the ultimate web platform, yes, it would probably be way better than what we have, but we don't have that ability nor will we ever have it, so talking about it is a waste of time. We need to look at our current capabilities, where we want to be in the future, and iterate until we get there.

All of the tooling we have serves a purpose and solves a problem, it's not like frontend devs are adding tooling for the fuck of it.

Take a common stack nowadays: React, Redux, Webpack, Babel, Eslint, Yarn, Styled Components

Every single one of those tools in the stack make my life easier. I don't have to use all of them. In fact, there was a time where I didn't, but I would never want to go back to that time because these tools allow me to build way more complex apps way faster than I used to. And the other great thing about these tools is you can add/remove them as needed. Take something like browser prefixes, as browser support for certain prefixes or features are implemented at a large enough scale, the tooling for that becomes obsolete and you drop it.

And as far as JavaScript goes, pre ES6 I would agree, it sucked. Post-ES6, I will have to disagree. I love writing JS nowadays.


> And the other great thing about these tools is you can add/remove them as needed.

In order to get that freedom, you need to build the tooling in such a way that it aligns with the web standards, so that when the standards catch up you can drop what you don't actually need anymore. We all switched from CoffeeScript to ES6 and Babel because ES6, being standards-based, will eventually be supported in browsers, right?

And yet, with JSX we have a gratuitously non-standard dialect of JavaScript that isn't on the standards track and will never be natively supported in browsers. So, even when browsers catch up and offer full support for modern ES6 syntax, we're still stuck with transpilers because Facebook thought it was a good idea to tightly couple their custom templating language to JavaScript itself instead of wrapping it in template literals, which is the standards-based approach that ES6 actually provides for embedded DSLs and templating.

Today, in Europe and North America, roughly ~85% of users run a browser version that has native ES6 support. IE marketshare fell off a cliff in 2016, and now IE11 is at less than five percent and still falling. I don't actually need to transpile during development today, and I won't need to do it in production either in a year or two, and yet everything written with JSX will need a transpiler forever.


> And yet, with JSX we have a gratuitously non-standard dialect of JavaScript that isn't on the standards track and will never be natively supported in browsers.

It just gets compiled down to native javascript. JSX is also optional in React - meaning that you don't have to use it. And having an actual client-side data-structure to represent DOM is arguably better than using string based templates which is what folks have been using for decades.
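To make that concrete, here is a sketch of what the compiler emits, with a toy createElement standing in for React's real one (React's actual function returns React elements, not plain objects like this):

```javascript
// Toy stand-in for React.createElement, just to show the shape of the call.
function createElement(type, props, ...children) {
  return { type, props: props || {}, children };
}

// The JSX  <div className="greeting">Hello</div>  compiles to a plain call:
const element = createElement('div', { className: 'greeting' }, 'Hello');
```

The "template" is an ordinary JavaScript data structure, which is why it composes with functions and variables instead of a separate template language.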

Also, compilation is not a dirty word - when you consider just how primitive the standard language is. A static type-checking compilation pass also helps keep large codebases/modules manageable, and aids refactoring by being able to catch errors - without actually having to run the code to figure out what it does.

It's true that there is unnecessary fragmentation. But the broad trend reveals that front-end development is actually starting to catch up with best-practice.


> We all switched from CoffeeScript to ES6 and Babel because ES6, being standards-based, will eventually be supported in browsers, right?

No, a lot of us switched to ClojureScript, TypeScript, Elm, etc and don't look back.

As long as you have to transpile, why not transpile from a good language? (Not to say that native ES6 support would compensate for the advantages of the above-mentioned languages)

Standards are not generally virtuous in programming languages. They tend to come about to mitigate language fragmentation between rival factions, something that most good and popular languages don't have a problem with.


Or, you know, allow multiple implementations without someone having to reverse-engineer what's meant to happen in edge cases from the reference implementation...


In PL context standard usually implies a committee design, whereas language specification is used to mean an engineered formal definition to define semantics. I agree having a spec without the committee design is a mostly good thing.

Though there's a good argument for just blessing the most popular implementation as "the spec". Paper specs have bugs too, and they always live in a kind of shadow world vs real life.

Things that aren't theoretically specified are relied upon in real life. See eg the Java and C standards. In Java's case nobody cares what the standard is, all implementations just have to be Sun compatible realistically. In C's case it's really opened a world of hurt with all the undefined behaviour and resulting security problems etc.


Like Python? Sorry. Couldn't resist.


Browsers catching up won't be the end of transpilers. If you want to write modern JS and have it support old Android phones or Safari, you're stuck transpiling. And browsers will never catch up; there will always be new features to be trying. It's really beautiful: we get to try different possible future versions of JS and make an informed decision on what and what not to bring into the language.

Also I have a feeling we'll be seeing a lot more to-JS languages, not to mention apps written in Rust, etc that target webassembly.


> which is the standards-based approach that ES6 actually provides for embedded DSLs and templating.

Template literals could not possibly be a replacement for JSX. JSX is Javascript, template literals are strings. JSX isn't a templating language by design, it's one of the primary advantages.


JSX is not JavaScript, it's a non-standard bastardization that will never be supported natively in browsers. Its reliance on JavaScript for flow control is so problematic[1] that people have started working around its many glaring inadequacies by adding new language primitives to JavaScript itself via Babel plugins[2].

I do my templating in template literals because using standards-based JavaScript means that I don't need 50MB of fragile packages from npm or a largely superfluous build step during development.

[1] https://github.com/facebook/react/issues/690

[2] https://www.npmjs.com/package/babel-plugin-syntax-do-express...


Actually you can get pretty close to JSX with es6 template strings and no external transpiler dependency.

https://github.com/trueadm/t7


It was never intended to be supported natively by browsers -- it's syntactic sugar for the React.createElement API. That's what I mean when I say JSX is Javascript.

That issue you linked to is 4 years old and is clearly a reflection of the poster being unfamiliar with React/JSX. The community has since unambiguously decided to avoid inserting more control flow in JSX. Also, I'm not entirely sure what the do/while plugin has to do with JSX.

> I do my templating in template literals

I would argue if you can write your code as easily with template literals as React/JSX your application isn't complex enough to warrant using a framework.


> That issue you linked to is 4 years old and is clearly a reflection of the poster being unfamiliar with React/JSX

And yet the underlying issues still remain unaddressed. There aren't better ways to handle the cases raised in that issue, which is why people resort to adding additional non-standard syntax to work around the warts.

> Also, I'm not entirely sure what the do/while plugin has to do with JSX.

The "do" keyword is not do/while, it's a non-standard language feature that you can use to turn statements into expressions by wrapping them. It's like using an IIFE with a return value so that you can use a conventional "if" or "for" statement in place in JSX.
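In other words (a minimal sketch, independent of React): the IIFE turns a statement into a value-producing expression, which is what `do` would provide natively:

```javascript
// An if-statement used where an expression is needed, via an IIFE.
// With the proposed `do` expressions this would be:  do { if (...) ... }
function greeting(hour) {
  return (() => {
    if (hour < 12) return 'Good morning';
    return 'Good afternoon';
  })();
}
```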

> I would argue if you can write your code as easily with template literals as React/JSX your application isn't complex enough to warrant using a framework.

I don't think you really understand how template literals work. Tagged templates can be used for more than just basic string interpolation. You can read more on the subject here: https://appendto.com/2017/02/advanced-javascript-es2015-temp... Or have a look at t7, which uses tagged template literals for virtual DOM templating: https://github.com/trueadm/t7
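As a small sketch of what a tag function can do beyond interpolation (the `html` tag below is hypothetical, not a library API): the tag receives the literal strings and the interpolated values separately, so it can, for example, escape the values before joining:

```javascript
// A tagged template literal: `html` gets the raw strings and the values
// apart, so it can HTML-escape every interpolated value before joining.
function html(strings, ...values) {
  const esc = v => String(v)
    .replace(/&/g, '&amp;')
    .replace(/</g, '&lt;')
    .replace(/>/g, '&gt;');
  return strings.reduce(
    (out, s, i) => out + s + (i < values.length ? esc(values[i]) : ''),
    ''
  );
}

const user = '<script>alert(1)</script>';
const markup = html`<p>Hello, ${user}!</p>`;
// markup === '<p>Hello, &lt;script&gt;alert(1)&lt;/script&gt;!</p>'
```

The same mechanism is what libraries like t7 build on to return virtual DOM structures rather than strings.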


Ah, sorry I misunderstood the plugin. But to the point -- people do all kinds of crazy things in all kinds of languages with all kinds of tools. That doesn't mean there's anything fundamentally broken about them. Now that I know about the existence of the Do plugin I can also definitively say I have no use for it. It's a band-aid for bad code, not a bad framework.

> I don't think you really understand how template literals work

I absolutely do. I use them to internationalize my apps. They're a powerful tool that is not comparable to React/JSX. JSX is not just a template engine which is what you seem to be implying.


To be honest, its absence from JavaScript aside, statements-as-expressions is a useful thing: no need to remember the semantics of `if {} else {}` vs `? :`, the ability to have very complex (multi-line) expressions, and in languages with RAII, blocks being expressions means it's easy and clear to acquire a resource in order to calculate a single value.


Last I heard (from a twitter thread with those in the know), the "do" keyword is considered highly likely to become standard.


Flow control using js is very simple. For the example you linked it should have used smaller components. Jsx is all about components. I cannot go back to string literals.


The place for templating in the web stack should be the markup language, not JavaScript IMO, and in SGML (on which HTML is based) there has existed a standard for it since 1986.


The thing that backend devs don't realize is that frontend is not easy. A frontend dev builds something that will be used by a HUMAN, while a backend dev builds something that will be used by a PROGRAM. Related to the comment: things change a lot because there is no single way to do things, as there is on iOS/Android. There is no standard imposed by a huge private company like Apple or Google. Everyone is free to reinvent the wheel.

Facebook did something super valuable, that is ReactJS, and this is becoming the _de facto standard_. ReactJS is simple, opinionated, and supported by a large corporation. Thanks to ReactJS a new ecosystem was born, and now finally you have many projects gravitating around it, and this is good. It's just a matter of time IMO.

I strongly disagree with _"the browser is a bad app platform and javascript is a bad language"_. JavaScript has its quirks, but it's also really pleasant to work with if you don't do weird shit like `isNaN('wft' - 1) // it's true`. The browser is a great virtual machine that is democratizing the web. If you are doing things right, you can have graceful degradation (it degrades gracefully to a lower level of user experience in older browsers) to increase the audience of your web app, from the dev in SF with gigabit internet to someone living in a rural area in a not-so-tech-advanced country. This of course has a cost, namely the time spent developing your app.
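For what it's worth, that quirk comes from implicit numeric coercion, and ES6's Number.isNaN sidesteps the coercion entirely:

```javascript
// 'wft' - 1 coerces the string to a number: Number('wft') is NaN, and NaN - 1 is NaN.
const result = 'wft' - 1;

// The global isNaN coerces its argument first; Number.isNaN (ES6) does not.
isNaN('hello');        // true ('hello' becomes NaN under coercion)
Number.isNaN('hello'); // false (it's a string, not the NaN value)
Number.isNaN(result);  // true
```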


What's so unusual about the current web platform is that none of the technologies involved were actually created for the task they are being used for right now.

HTML wasn't meant to build interactive application UIs. CSS wasn't meant to do layouting (or if it was it was done so badly that the intention is unrecognisable). HTTP wasn't meant to carry fine grained API calls. JavaScript wasn't meant to do large scale application development.

It's a long list of wrong tools for the job that are being cobbled together to supposedly form the right tool for the job. It's working to some degree but it's messy.

That's not to say that creating a technology for a specific purpose guarantees that it will work well. EJB, SOAP, WS-*, there are many examples of failed purpose built technologies.

But having an entire platform built exclusively from components that are unfit for purpose isn't something that happens very often.


This is a really interesting reply. I have lots of questions regarding your comment:

1) What was CSS meant to do if not layouting?

2) What should we have used instead of HTML for building interactive application UI's?

3) What advantage does using the right language for the right job confer?

I'm a beginner so please excuse any ignorance on my part.


As a fullstack dev, it amuses me when my underlings claim that backend is hard and they want to learn it. They're already doing front-end. (And yes, we're working them towards more and more backend, and even mobile.)

It's just different, IMO. Programming is programming, but different specialties have different things you need to worry about.


Absolutely. I've been doing "frontend" work before it was even a term (ie. html tables for layout), and it's still the #1 thorn in my side. Even with all the tools available, developing a functional, aesthetically pleasing GUI is not an easy task.

Plus, organizing your code so it's not a 15-nested-div mess with inline CSS just to make the damn thing look right.

It's probably why most of my personal projects have a CLI, and that's it.


"The thing that backend devs don't realize is that frontend is not easy." - I think that's a false dichotomy, one that describes no professional programmer I have ever encountered except the most junior, or the most foolish of hopefully-soon-to-be-ex colleagues. In my experience it is product managers, architects with a mostly-infrastructure background, and nontechnical stakeholders, who tend to assume that frontend is easy. All developers who prefer to work in the backend will have built a user interface at some time, and learned that programmatically fulfilling human needs and intentions is a tough ask.

"ReactJS ... is becoming the _de facto standard_ ... and this is good" - strong disagree; "standards" that aren't real standards but are controlled by a proprietary actor have rarely been a good thing. The woes of dealing with Microsoft Office file formats, or Adobe Flash, or Google Reader pay testament. Heck, QWERTY keyboards. Or Edison's AC/DC feud with Westinghouse, a struggle over proprietary preferences that can be directly blamed for the extended length of my tea breaks when visiting the US.

I feel like there's something missing from core ES, something conceptual that would encourage an ecosystem to develop that isn't framework-specific. We just don't know what that is, yet.


I've been a web developer for 15 years now. Building the UI is always hard and where the waste of time is felt the most. That's why i prefer to work on the backend as much as i can.

I'm against the front x back end split. I think any web dev needs to know HTML, CSS and JS. That's mandatory. Besides that, you should definitely learn the server side too, especially databases, since they're by far the biggest bottleneck and performance hog you'll need to optimize.

But I'm getting old and I know that because I think everything new is stupid. Once upon a time, we had RAD tools to build UI's, but unfortunately, that's lost in the past...


> a perfect demonstration of why the front-end development ecosystem is way out of control.

The answer to this is simple but horribly disappointing. In the military we describe this behavior as "justifying your existence". There are three separate causes to this problem.

First, solving valid problems is hard in any environment. Dicking around with tooling and configuration is easy. This includes frameworks, abstraction libraries, and so forth. Real problems are often tied to products, services, or end-user problems (opposed to these invented developer problems).

Second, writing open source software takes initiative and discipline. It is easier to have the energy for hard problems when people are telling you what to work on and you are getting paid. In this case failing to complete the hard problems brings retribution and eventually termination. When you are on your own or volunteering your time you have to really want it, which completely compounds the first point.

Third, there is a low barrier to entry to do front-end work. You can copy paste some code from the internet into your browser console and watch it do stuff. You don't need a compiler, special environment to get started, or any kind of fancy software. This problem is compounded in the legacy corporate world where web technologies are still viewed as inferior child's play for incompetent weak-minded children. After all real developers write ALL of their web related code in Java 6 (sarcasm).


So true... At my Fortune 50 company we force IE to render in the IE7 engine for our internal reporting. Yes, that IE7...


It's quite possible to use react with nothing else: no build tools, no transpilers, no fancy JSX, so in terms of tooling complexity, it can be as simple as you want or as complex as you want.

Regarding the complexity of the underlying system, I think this is not a concern of the typical frontend developer. It has nothing to do with JavaScript or the web. If you are writing a "simple" C program on Linux, would you be concerned about the complexity of the underlying C library (yes glibc is rather big and complex) or the Linux kernel? You won't. This is the point of abstractions. If you want, you don't have to learn about any of these; treat the implementation as a black box and start writing actual frontend code.


Any resources you can provide about how to use React without JSX? JSX is a blocker for me.


> JSX is not a requirement for using React. Using React without JSX is especially convenient when you don't want to set up compilation in your build environment.

> Each JSX element is just syntactic sugar for calling React.createElement(component, props, ...children). So, anything you can do with JSX can also be done with just plain JavaScript.

Source: https://facebook.github.io/react/docs/react-without-jsx.html


Sure. At the bare minimum, you write "manual" `React.createElement()` calls. To simplify or shorten things, you can alias `React.createElement()` to something like `h()`.

There's also a variety of libs to help with that process. We have an existing ES5/Backbone app that's built with Require.js, and we don't have a "compile" step in there. We just added our first couple React+Redux-based features into that codebase, but can't use JSX due to that lack of compilation. I opted to bring in https://github.com/Jador/react-hyperscript-helpers as an alternative to JSX. Not perfect, but it's a suitable alternative given our limitations. There's several other "virtual DOM/hyperscript"-ish packages out there as well.
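As a rough illustration of what the aliased-`createElement` style looks like, here's a toy stand-in for `React.createElement` (the real function returns richer element objects with extra bookkeeping; the names below are just for demonstration):

```javascript
// What JSX compiles down to: nested createElement calls.
// <div id="app"><h1>Hi</h1></div> becomes roughly:
//   React.createElement('div', { id: 'app' },
//     React.createElement('h1', null, 'Hi'))

// A simplified stand-in, just to show the shape of the element tree:
function createElement(type, props, ...children) {
  return { type, props: props || {}, children };
}

// The common trick: alias it to something short, hyperscript-style.
const h = createElement;

const tree = h('div', { id: 'app' },
  h('h1', null, 'Hi'),
  h('p', null, 'No JSX needed'));

console.log(tree.type);             // 'div'
console.log(tree.children.length);  // 2
```

With a real React build you'd write the same calls against `React.createElement` directly; the nesting mirrors the JSX structure one-to-one.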


It's basically just to use the React.createElement API that JSX transpiles to. I say give JSX a chance though.


I had an initial negative reaction to JSX, as well. It grows on you, though.


Might I suggest .dom[0] instead of React if you are planning to use plain JS? It is tiny and has a slightly simpler API, which is nice for plain ES6.

[0]: https://github.com/wavesoft/dot-dom



And thus, yet another insipid "hey let's argue about JS fatigue!" comment clobbers another HN comment page, with 100+ replies not one of which has anything to do with TFA.


If it wasn't a problem, it wouldn't be upvoted. That is how voting works. I'm not a JavaScript programmer, although I have dabbled extensively in the past. My impression of this thread is that not much has changed in that world. Sencha, Titanium, and PhoneGap were super hot back in the day, along with jQuery, SproutCore, Knockout and Backbone. Now it is all different for JS. That fucking sucks.


> If it wasn't a problem, it wouldn't be upvoted.

It's upvoted because it's HN's current favorite holy war. "Front-end development is out of control" is something everyone has an opinion about, in a way that the announcement of React Fiber isn't.

That doesn't mean the same debate needs to be re-litigated every two days. Occasionally would be fine, but when it happens in every tangentially related thread it sucks the oxygen out of everything else.


Because it is out of control! You have to have some serious "nalgas" to dive in... I tried some years back and said, "F this, I am going to learn Python and Java." I am not even paid to program in most cases. There is a LOT to be said for stability.

It does need to be reignited or it wouldn't be. It is like being a climate change denier at a climate conference. "Uhhggg, why are we talking about this again? I don't believe it's a problem..." Well, shit loads of people do!!!!!! So suck it up, buttercup.


"Look, I know it's an off-topic rant, but this issue is so important than it warrants off-topic rants!", said everyone who's ever posted an off-topic rant.


It's such a dumb argument that boils down to "things are changing and I don't like it!". Javascript fatigue is not real. I am completely convinced anyone who says otherwise has never had to maintain a non-trivial web application in Backbone/jQuery -- meaning they're complaining about frameworks that weren't built for their use cases.


Javascript fatigue is real. Shit in front-end land changes so fast it is impossible to keep up with.

We just finished a rewrite from jQuery to Angular 1, with updates to all the tooling that goes along with it. We're now at least three years behind the curve... Hooray!

Meanwhile, the backend frameworks we've been using have had one minor version Nuget package release in the same timespan.


Angular 1 is 7 years old. And guess what? It's still a perfectly valid framework to build a web application with. There is no "curve".


Of course it's a valid framework. But the cool kids have all moved onto Angular2, or React, or some other thing I'm too stuck in the mud to even know about.


But why does that matter? Does your Angular 1 application work? Does Angular 2 existing prevent it from working?


Why did you rewrite the app from jQuery to Angular 1?


Any sufficiently large jQuery webapp has a strong tendency to become indistinguishable from a plate of spaghetti, without strong efforts to keep things sane. Angular, once we got over the learning curve and converted over the old functionality, made it a lot easier to make changes and add features.


No. It's "things are changing for no other reason than some overpaid FB brogrammers with too much time on their hands require Internet attention and need something to gloat over for the next conference."


It's funny how, using a bad language and bad platform, I can put together a robust, networked, maintainable, cross-platform application faster than I could with any prior set of tools. Weird.

And, by the way, I'm not defending react.


> However, just one look at that giant horde of links is, to me, a perfect demonstration of why the front-end development ecosystem is way out of control.

This has got to be the biggest jump to a conclusion I've seen in HN comments this year, without a close second.

Sincerely, a C programmer who thinks there's still something to be learned from some of the architectural underpinnings of a library like React.


Good troll.

Ask your users to install your app and qualify as extra features (with associated budget) the capabilities of bookmarking, sharing urls, having multiple tabs, "remember me" and forms autocomplete, going back, having a history, or auto-update.

Justify it to them it's because their browser is a bad frontend platform. Like this horrible, horrible mobile twitter lite ;).

In my first year of programming, html was such a joy to use compared to gcc. Could get a button with a background color and an image displayed under 2min. Just saying.


The web is a great platform, and so are browsers. JS apps running in the browser? Not so much.

How many of those features you list are routinely broken by SPAs?

Bookmarking and sharing URLs? Almost always.

Going back and having a history? Often.

Having multiple tabs, and forms autocomplete? Sometimes.

I hate web development mainly because I love the web so much.


> the browser is a bad app platform and javascript is a bad language

Yeah, and we're finally making real progress towards making it better. I used to do web development because nobody else wanted to, now I do because it's really fun. This is a forum for hackers -- we should be excited about the unsolved problems instead of complaining that things are changing.


I'm not complaining that things are changing, I'm complaining that things are changing for the worse.


And you are totally wrong about that. This is the most exciting time to be a web developer since jQuery first came out. I promise it only looks that way from the outside.


I am very much not looking at it from the outside.

ES6 is a step backward. Prototypical inheritance was one of the few things JavaScript had going for it, but because CS majors who only learned Java in undergrad couldn't be bothered to learn prototypical inheritance, now we have two incompatible inheritance systems instead. Promises are minor syntactic sugar over callbacks which could be implemented as a library. Meanwhile `this` still doesn't mean "this", there are still no integer or boolean types, accessing a nonexistent variable silently fails but happily continues execution, only reporting an error when it's far away from the problem. There's no threading, no security primitives, and no decent debugger. None of the problems are being fixed and the new features are either lackluster or make things worse.

And that's just the language itself. The ecosystem is in more of a shambles. Web development in the age of left-pad and global shared state being normal is exciting in the same sense as plagues of frogs and locusts are exciting: it's not boring but if it were boring that would be better.


> now we have two incompatible inheritance systems instead.

JavaScript still only has prototypal inheritance; class syntax is just sugar[1].

> Meanwhile `this` still doesn't mean "this"

Even with arrow functions, which don't change the binding of "this"[2]?

> there's still no... boolean types

typeof false === "boolean"

> accessing a nonexistent variable silently fails but happily continues execution

Trying to access a nonexistent variable throws a ReferenceError[3].

> no decent debugger

I disagree, but regardless — doesn't the fault here lie with browsers, not the language itself?

We also get a module system, object and array destructuring, async/await, template strings, generators… it's pretty difficult to say the new ES features are "a step backward".

[1] https://developer.mozilla.org/en-US/docs/Web/JavaScript/Refe... [2] https://developer.mozilla.org/en-US/docs/Web/JavaScript/Refe... [3] https://developer.mozilla.org/en-US/docs/Web/JavaScript/Refe...
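A small sketch of the points above (class syntax as prototype sugar, lexical `this` in arrow functions, and the boolean primitive), runnable in any modern JS engine:

```javascript
// class syntax is sugar over the existing prototype machinery:
class Animal {
  speak() { return 'generic noise'; }
}
const instance = new Animal();
console.log(typeof Animal);  // 'function' — a class IS a function
console.log(Object.getPrototypeOf(instance) === Animal.prototype); // true

// Arrow functions capture `this` lexically instead of rebinding it:
const counter = {
  count: 0,
  incLater() {
    // `this` inside the arrow is the same `this` as in incLater
    return () => { this.count += 1; };
  },
};
counter.incLater()();
console.log(counter.count);  // 1

// And booleans are a real primitive type:
console.log(typeof false);   // 'boolean'
```

Had `incLater` returned a plain `function` instead of an arrow, calling it detached would have lost the `counter` binding — which is exactly the old `this` complaint the arrow syntax addresses.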


> JavaScript still only has prototypal inheritance; class syntax is just sugar[1].

Sure, but the patterns you use with the class syntax are incompatible with the patterns you use when using object literals, so libraries pick one or the other, typically the class way.

> Even with arrow functions, which don't change the binding of "this"[2]?

Even with arrow functions, `function` still exists and is more frequently used.

> Trying to access a nonexistent variable throws a ReferenceError[3].

var a = {};

var b = a.c;

Doesn't throw a ReferenceError for me.


> Sure, but the patterns you use with the class syntax are incompatible with the patterns you use when using object literals

How so? You can still access the prototype. It's just that the most common reason to — declaring instance methods — can be accomplished with the easier-to-read class syntax now.

> Even with arrow functions, `function` still exists ands is more frequently used.

ES6 is a non-breaking change. You can use arrow functions for literally every function you write other than generators.

> var a = {};
> var b = a.c;

That's a nonexistent property, which probably should throw some sort of error — but that's why we have tools like TypeScript and Flow that can detect this.
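To make the property-vs-variable distinction concrete, here's a minimal sketch (the `doesNotExist` identifier is deliberately undeclared):

```javascript
// Accessing a missing PROPERTY quietly yields undefined:
const a = {};
console.log(a.c);   // undefined — no error thrown

// Accessing a missing VARIABLE throws a ReferenceError:
let threw = false;
try {
  console.log(doesNotExist);
} catch (e) {
  threw = e instanceof ReferenceError;
}
console.log(threw); // true
```

This is why a typo'd variable name fails loudly, while a typo'd property name silently propagates `undefined` until something downstream breaks — the case static checkers like TypeScript and Flow are good at catching.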


>You can use arrow functions for literally every function you write other than generators

But not for every function I read, and that is the problem with all backward compatible fixes to programming languages. You're just piling on more (sometimes better) features on top of all the old brokenness and everybody has to know all of it.


Yeah but that's in no way unique to Javascript. Java developers still have to deal with type erasure due to decisions that were made 20 years ago.


True. I'm not trying to say JavaScript is or was great, just that ES6 is pretty clearly not a step backward (which was the original claim).


Javascript has had a Boolean type at least since ES5; I'm not sure what exactly you're trying to convey there. The lack of an integer type is pretty frustrating for many types of computation, but at least some of them are covered by typed arrays.

I'm not sure what you mean by "no decent debugger" either, there's debug tools in every browser plus quite a few for node.

"Accessing" a non-existent variable has been a ReferenceError since ES5(http://es5.github.io/#x8.7.1). Assigning to a non-existent variable has been a ReferenceError since ES5 as well, in strict mode.

What makes you dislike JavaScript so much you'd go on such a rant about it?


> What makes you dislike JavaScript so much you'd go on such a rant about it?

Writing it every day.


That's because no one really, truly, honestly wants to RTFM for a given stack, library, etc.

It's more fun to roll your own or to pick up a new one.

I mean, how many ways are there to organize code that manipulates a DOM tree, pulls data, massages data, and updates the DOM tree? Well, the more the merrier :)


I fear that this comment embodies more than a little bit of truth.


There's no need to actually use all these complex tools if you don't want to. We wrote a pretty large (>20kloc) React application with a simple require.js based build flow using a Makefile that invokes a couple of tools for bundling the JS and generating assets like CSS. We used our own resource loader framework based on jquery (Redux didn't exist at the time) and a simple routing library (director.js). Still works like a charm!

In my opinion, the important thing in frontend development is to know which tools and libraries to adopt and which ones to ignore.

As a platform, I find JS+HTML5 really appealing and I'm happy that there are many ways to use and creatively remix these technologies, so I don't see the large churn in the ecosystem as a disadvantage but rather as proof that browser-based JS works very well as a platform to develop applications on.


Why is an active ecosystem with healthy competition a bad thing?

Personally, I can't stand javascript, and why you'd ever elect to use it for backend/server side code mystifies me. So I don't use it (or work in web development), and couldn't care less about what platform the hip web devs deem acceptable.


I mostly agree, however it does seem to make some sense to dive deep into one ecosystem, instead of learning & keeping up with two. (Let's not pretend that Javascript is the only one advancing--Django of a few years ago is not compatible with Django of today, as one other example).

I have always used python/ruby on the backend, and I prefer them, but I recently worked on a couple of javascript backend projects, and I found one abhorrent and the other quite pleasant.

As could probably be expected, the practices used to develop the latter were just better. (These were even done by the same developers, in large part, and the differences were mainly lessons learned).

Anyway, if you're already working in a javascript-heavy front-end like React, it doesn't seem like such a stretch to use it on the backend, too.


Would you mind elaborating on some of those practices (specifically, the problems that made working on the first codebase abhorrent)?


Healthy competition is robust and tested. A short-lived fad that's going to be abandoned within a year or so doesn't qualify as healthy competition.


Old school devs and those who prefer to invest the bulk of their effort on the backend might be delighted to discover Intercooler.js

https://news.ycombinator.com/item?id=12885980


So much negativity here. Constantly pushing the boundaries of existing technology is how we got here in the first place. If we were content with the status quo, we'd still be looking at a four-month journey from New York to San Francisco on horseback.


Ironic analogy. Because what used to take a day in Rails now takes a week in [pick your back-end]/[pick your JS frontend].


No it's more like we've got a 2 month journey on the Hindenburg.


React has been "the yearly hot library" since 2013 for anybody counting along.


It's like the engineering teams at Google and Facebook have never heard of Microsoft or Sun or Adobe. There are so many lessons (learnt the hard way) in the evolution of .NET, Java and Flash that these guys seem to be completely oblivious to.

Google, Facebook devs please go spend some time with the language and platform greybeards at Microsoft.

EDIT: Feels like Software Engineering needs a set of Software History textbooks to keep reminding us of what has already been built.


To be fair, I'm pretty sure some of the engineers at Google and Facebook have spent some time with the language and platform graybeards at Microsoft, Sun, or Adobe. Some of them are those "graybeards" (in quotes, because they don't literally have gray beards). [1]

[1] : https://en.wikipedia.org/wiki/Lars_Bak_(computer_programmer)


Any specific examples? Nodejs seems like it's managed to avoid the early java mistakes of XML everywhere and one-thread-per-request servers.


Amen. I rolled an Angular app 3 years ago, and now most of the libs I used are unsupported, and it's impossible to upgrade anything because of dependency hell.

Now we don't use Yeoman to create React apps, and we have a thousand starter kits that get you off the ground. Your node_modules directory probably has 30k+ files before a single line of your app is written. Yarn.

Just imagine you're an embedded programmer, and your boss asks you to write a hello world for the next device, and you come back with a React app with 30,000 dependencies to get the job done. Then you find a bug, which FB just fixed in the next release. But you can't upgrade to the latest version, because that conflicts with a library that is used by a 3rd-level dep, which is used to add padding to a smiley somewhere.


I'm reading some negativity into your take.

I see an incredibly complicated problem space- presentation of multimedia content and execution of code on billions of 'fragmented' devices over which the content creator has no control- and the minds of humanity iterating on approaches to the problem.


> Every year the hot libraries change

Front-end development is fashion.


Would you rather have less documentation and fewer tutorials? I wouldn't call four links about a new library excessive.


Let's advocate the other extreme. Get back to mainframe and Cobol. No more fuckups dealing with endless changes. No more need to deal with youngsters and their framework of the year.

Of course, it comes with its own issues, but you'll be blind to them for the first few years while you enjoy the cure of stability :D


Or, you know, let's advocate reasonable positions and not pretend that the only alternative to Javascript is Cobol.


I was having a similar discussion just today.

I was told I should be using Swagger, and upon learning more about it I started asking why they didn't just build tooling around WSDL. I couldn't see anything enabled by Swagger that isn't enabled by WSDL.

About the only thing I could think of was simplicity due to it being more specialized than WSDL.

I was told javascript can consume swagger easier, which may be true, but I don't know why that's a significant problem with WSDL seeing as how you could autogenerate the javascript interfaces anyway.

I still can't really wrap my head around what problem Swagger is solving that wasn't already solved by WSDL.


The problem with WSDL, at least at its inception, was that it tied in with the whole WS-* extravaganza, a set of standards that are built on top of each other and heavily based on XML. While WSDL has evolved to support RESTful services, early WSDL was closely tied to SOAP.

The XML thing is a slippery slope. Suddenly your toolchain has to be aware of namespaces, XSD schemas, and so on. It's a largeish universe of standards that sound good on paper but don't contribute meaningfully to anyone's productivity. And XML has historically been (and still is) popular with the enterprise Java crowd — J2EE, Spring and so on — so there's a pretty steep divide between that and the more agile world of web devs and startups. If you're not buying into the XML stack, WSDL will seem like a nuisance more than a productivity multiplier.

That said, I don't think Swagger/OpenAPI is very good, either. It's replacing XML with YAML, and introducing its own set of issues. And the community and tooling ecosystem aren't quite there.
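For comparison's sake, this is roughly what a Swagger/OpenAPI 2.0 description looks like; the endpoint and field names here are invented for illustration:

```yaml
# A minimal, hypothetical Swagger 2.0 spec describing one endpoint.
swagger: "2.0"
info:
  title: Petstore (example)
  version: "1.0"
paths:
  /pets/{id}:
    get:
      parameters:
        - name: id
          in: path
          required: true
          type: integer
      responses:
        "200":
          description: A single pet
          schema:
            type: object
            properties:
              id:   { type: integer }
              name: { type: string }
```

Functionally this covers much the same ground as a WSDL port/operation/message description — which is the parent's point: the notation changed more than the capability.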


The question I posed in the discussion is why they didn't extend WSDL instead of trying to rebuild the entire ecosystem by creating Swagger.

To me the XML/JSON thing is a non-issue. I've never really gotten the unreadability of XML argument given a decent prettifier (which also applies to JSON), especially for something like a web service description.

I guess my thing is, we already have a web service description language, it's called WSDL, and it's already been extended to support REST, why not extend it to support whatever use case you have, or build the tooling around it to support said use case?

You could have easily built the Swagger Spec to spit out WSDL instead of JSON. It means you would have been able to lean on the existing tools that can swallow a WSDL file instead of having to recreate the entire ecosystem yourself.

What I kept getting back was "JSON > XML" and to me that's just a non-issue. It's so far down the list of what's important that I can't imagine making the decision to try and recreate all the tooling around WSDL for it.

And I guess I've never really had your experience with WSDL because I've never been forced to be aware of namespacing issues any further into my system than the edges. Perhaps I've been lucky, but I've been using WSDL for years to ease integrating into web services and I've never stopped and thought to myself that it would be better in JSON with a YAML spec. And I rarely deal with SOAP nowadays, and when I do, I just use a library to swallow the WSDL and do the right thing.

I guess that's really where I'm coming from. It smacks of religion to me.


I don't know about others but I tire of these comments chiding the "out of control" front-end ecosystem. The situation has already changed years ago. We aren't going to get off of your lawn while you continue to beat a dead horse. The web is built on open standards and there are billions of pages and apps out there. To expect web development to be a perfect monoculture is a failure of your imagination and ability to adapt rather than a failure of the ecosystem.


I get the Javascript fatigue fatigue part, but this comment breaks the HN guideline against name-calling in arguments, and even crosses into incivility:

> We aren't going to get off of your lawn while you continue to beat a dead horse.

> a failure of your imagination and ability to adapt

Please don't do this in HN comments regardless of how tired you are.


Fair enough, though I don't see any name calling here. I will try to be more civil in future comments.


Not the OP, but I expect the web to be a thing where documents (i.e. mainly text) don't have any issues rendering on my 5-year old phone or on my 8-year laptop (both of which work very well, still, and which I don't plan to replace anytime soon).

The recent web practices (I'm talking mostly of the last 2-3 years, since more and more people started copying Twitter after the launch of its "every website needs to be a JS app" thingie) have practically destroyed how most websites are displayed by the 2 devices I own. Sites like Hacker News or Wikipedia, which still work perfectly fine, are few and far between. I sincerely deplore this.


A web limited to textual content is a pretty quaint and uninspiring vision, IMO.

The web started that way because static documents are relatively easy to represent, but the future (and present, for that matter) is rich experiences that can be distributed in the same way.

But there are many steps left to get there, so either buckle up and help build that future, or get used to an ever-shrinking ecosystem of the purely textual Web.

I'm personally very excited by the convergence of mobile and web development: PWAs, improved functionality for layouts, self-hosted app architectures (like Tent and Sandstorm), language-agnostic runtimes via WebAssembly, better serving performance via HTTP/2, low-level multimedia APIs, encapsulation of rich UX via Web Components, and so on and so on.

Sure, it's bewildering right now, but in the future, this will all be knit together in a cohesive way.


My opinion happens to differ. Textual content is incredibly rich and is likely a target for all content in the future. Consider, we are literally training models to accurately caption things today.

Why? Because language is a hell of an abstraction. More, it is approachable and can often be bounded. Consider, I am not even remotely skilled in art. However, I can adequately describe, for me, a picture by saying who is in it and the event. Try doing the same without language. Often "quaint and uninspiring textual" language.

Do I pine for what we had in the 90s? Not necessarily. But I do grow weary of folks reinventing frameworks that merely further complicate the very abstractions they created, which now require these frameworks.


> Sure, it's bewildering right now, but in the future, this will all be knit together in a cohesive way.

Could have said those same words literally a decade ago when we were all struggling to figure out all this Rails and MVC stuff and how automated ORMs worked above MySQL. Seven or eight paradigm shifts later, we're all still confused and I see no cohesive knitting.


Do you want to bet on that last one?


Sure, but in the meantime the web is a much less accessible place for screen readers, underpowered mobile devices, and those without good internet connections


> Sure, it's bewildering right now, but in the future, this will all be knit together in a cohesive way.

You'll get over that optimism when 20 years pass and it's a worse mess than it is now; that's what's going to happen.


The fact is that 99% of what's done on the web is (mainly textual) document management/display. Facebook (the website responsible for React!) is 99% document display.

The truth is that "rich experiences" are mostly made by enhancing the basic experience of reading rich documents. If you want an inspiring vision of a document-centered future, look at IndieWeb.org. And/or roca-style.org. It's a vision of how the web can be knit together into a cohesive whole based on URLs, rather than segmented into a bunch of unaddressable silos.


About self-hosted app architectures, Cloudron is really good in terms of stability. Just got a newsletter from them that they are hitting 1.0 soon.


You could say the same thing about flying cars. A car with four wheels is a pretty quaint and uninspiring vision, IMO. However it happens to be infinitely more practical.


It is what has worked for almost 5 thousand years, so I am going to go ahead and say it is not quaint, it's not uninspiring, and when you say "in the future it will all be knit together in a cohesive way" I laugh heartily. It never has been, and it won't be.


And I would love if my 2001 Honda Accord was compatible with Tesla's autopilot, but I understand it is not a realistic expectation.

I'm not sure why you'd expect the web to be a) mostly text and b) able to render easily on obsolete devices.

The web is becoming a robust application delivery platform. That is so, so awesome. Most people do not want to be stuck with shitty-looking, text-only websites. Moving the platform forward necessitates that it will use more resources. Increased resource availability and consumption over time is fairly consistent across most aspects of consumer computing.


The point isn't that the web should be mostly text, but that it shouldn't consist of layers and layers of unnecessary fiddle-faddle that doesn't add anything useful to the end-user's experience.

If you can convey what you're trying to convey with a JS-less (or even just JS-lite) 2-page website, then don't build a monolithic, scroll-hijacking, background-parallaxing fireworks show of a website, tied together with the most tenuous of Javascript libraries.

I'm all for the web as an application delivery platform, but not every website, or application, needs so much bulk.


That's not a problem with the ecosystem though. That's a problem with bad developers. The same is true for any technology and programming language.


They may be robust but the user experience still sucks. JS heavy web sites are unresponsive and turn my 3 year old MBP into a vacuum cleaner. Facebook is the best example. I can barely load the web site without the fans spinning. Firefox can't handle it at all. It is unusable for me.


That is crazy. I use Facebook on a 5 year old MBP with no problems. I have Ad Block Pro and U-Block, but even without them, my computer can handle Facebook just fine.


uBlock Origin is the only ad blocker you need. Adblock Plus does the same thing but is less efficient and lets some ads through by default, and plain uBlock (without Origin) is abandoned.

Don't run two ad blockers, they will just use more resources for no benefit.


Something is wrong with your MBP.


It isn't just him. Facebook slowed for whatever reason on my PC too in the last half a year.

4.5 GHz 4670K, 16 GB of RAM at tight timings.


Might be more Firefox than your macbook or the web. I've noticed (while developing an extension) that Firefox feels noticeably more sluggish than Chrome. Safari somehow feels even faster than Chrome, but I'm too tied to the extension ecosystem of Chrome to switch.


I have the same experience with Firefox on OSX. I noticed that it isn't an issue with the dev version of Firefox with multiprocess turned on.


Firefox is definitely partly to blame. Chrome does work better and Safari is probably the best of the lot.


Safari is definitely faster than Chrome on Mac.


It also uses significantly less power.


Alas I dream for the day when Safari adopts WebExtensions


I'm using Chrome but I tend to agree. A particularly painful area of Facebook is Buy/Sell pages. It slows to a crawl if you scroll through too many listings. Even on my 6 core X99 system.


This experience is not unique to the web, from what I've seen. Apps that aren't well made can easily drop frames on a 3 year old iPhone.


Ubuntu-on-Chromebook user reporting. I can't complain much on the latest Chrome, even with my 2 gigs of RAM and much less powerful processor.


> Increased resources availability and consumption over time is fairly consistent across most aspects consumer computing.

Worth remembering that this is the case for people interested in tech. The local library still runs Vista on a 10-year-old system. My parents will use Android 2.x until the phone does not turn on anymore. Bandwidth upgrades don't apply to many people living outside of towns. Etc. We reached a point long ago where an average person shouldn't need more bandwidth and power just to look for information online.

And BTW, you can have beautiful text-only websites. These two properties are not related.


> The web is becoming a robust application delivery platform.

This was our mistake. I don't want garbage shoved in my face. I want to read. And that's it.


Amen! React might be the best tool for building hybrid mobile or even cross-platform Electron apps, but the truth is, it sucks for web development.

Bundling the entire website inside a single JS file, one that needs to be re-downloaded every time you add a div or change a single CSS class, is stupid, sorry.

Your website doesn't need to be pure text. It can be beautiful, modern and responsive. And it doesn't need much js for that.

The world has become mobile first, not js-centric. Pushing spa everywhere is just wrong.


React is great for web development (using Next.js) based on my recent experience.

Citing only one side of any architectural trade-offs isn't particularly interesting either. The other side is that we can now easily build sites using Universal JS (serverside-rendered combined with SPA).

Delivering a website with seemingly instantaneous repaints even on flaky internet connections is just a superior end-user experience.

Just click through the primary nav of a Universal JS site vs an old-school one, and it feels like we've been putting up with the equivalent of the old 300 ms tap delay, but for all websites and website links.

Not engineering away that latency penalty will tend towards foolish for website properties that want to remain competitive.

Users will become increasingly accustomed to very responsive sites and the latency penalty that is currently normalised will become more glaring and unacceptable.


What has been your experience with server-side rendering? We are very concerned about SEO, etc. Have you seen any impact of using Next.js on SEO performance?


We have a number of Next sites in development at the moment, but none in production (soon!).

SEO shouldn't be a problem, especially as the initial page is serverside rendered.

The only slight complexity is in returning sitemap.xml I believe, which requires a little bit more configuration currently. If you search the Github repo for 'SEO' you should find some tickets (open and / or closed) that discuss this.


I'm with you, but it's just part of the growing pains. The web is successful because it's easy to layer businesses on top of each other... I can have a Tumblr with a Shopify backend and Google Adwords on top. Apps are walled gardens so they can only enforce one business model at a time. That can make things nice and tidy, but it walls you out of the messy, compositional business models of the web.

Because business models are composed on the web, it's just harder to settle on unified design standards. It takes time for everyone to agree on a separation of responsibilities on the page. This is compounded by the sheer newness of the business models. My webcomic about fancy cheeses has a segment marketing a line of tractors now to industrial buyers at a conference in Wisconsin this week? OK. That's an opportunity I probably wouldn't have had selling Java apps.


Well, what's unique about HN and Wikipedia? They're largely non-monetized. If Buzzfeed can make more money off of a flashy website, it's hard to argue with.


HN doesn't do much, but Wikipedia does a decent amount of stuff with JS on their website despite people taking it as an example of "the web as documents"

meanwhile, I don't know how FB or Twitter are nice user experiences when you operate on a "paginate 20 tweets at a time" philosophy.


I don't think it's either/or. You can make a "flashy" (in the sense of Buzzfeed) without it being burdened by huge amounts of js. Likewise, monetized sites are still frequently not like Wikipedia or HN.


Ten years ago MapQuest was state of the art, and you'd click "next tile" over and over to traverse a map. Then Google Maps showed up with its voodoo ajax and infinite scroll and ushered in the modern web app era. Sure, some folks overdo it, but I'm not going back.


That is a vision that is incompatible with the reality of computing from the past 10 years.


I would love HN to provide an easy way to see where the reply thread to this first comment ends. Some js might enable that. Basic html/css - I think no.


It's that [-] button next to the username above each comment. It collapses all the cascading comments. It is enabled via JS. If you disable JS, then it disappears.


I suspect you could hack a similar collapsing thread UI together with styled radio buttons and no JS, if you really wanted to prove a point.


Thanks, I didn't see it


But I seldom hear people saying the old web pages can't be shown on today's browser.


Huh? Who is talking about monoculture? OP certainly isn't, and I don't think anyone else is seriously suggesting such a thing.

It's not unreasonable to expect that the ecosystem doesn't operate under an ever-expanding set of wrappers, frameworks, layers, and models, and it's certainly not unreasonable to expect that our tools don't suck.

The open standards you talk about ARE already a part of the ecosystem, and well-established: HTML, HTTP, DOM, and ECMAScript. The new JS library of the week is not a part of that.


First, let's be clear here: React was first open sourced in 2013, being used at Facebook extensively before that. Sure, it's no Win32 API, but it's not exactly the new kid on the block.

But the point of the web already having these standards is exactly the point - these frameworks are built on top of and contribute to the underlying universal standards, so what's the problem?


There's this cargo cult complaint where people talk about 'layers and layers of bad abstractions' while rarely offering a superior solution or describing exactly which abstractions are bad and why they are bad. Mostly just hemming and hawing about things being "bad" and "overcomplicated".

It's basically the same instinct as complaining about code written by a previous dev--because you didn't write it, you weren't there to think about the very real tradeoffs that always have to be made when actually implementing something, so it's easy to just complain about the previous person's implementation.

I never know which abstraction people are complaining about. Is it HTTP? HTML? The DOM? JavaScript? Wrapping libs like jQuery? View frameworks like React? Routing frameworks? Complete frameworks like Angular? What's the solution? Get rid of all of those things? Use different UI abstractions? Get rid of JavaScript? (Oh wait, you can already.) The truth is, it's just easier to complain than it is to implement, so that's what people do.


Completely agree here. I think many people realize this and have built awesome tooling to be able to manage different ecosystems and environments.

One that I've been particularly keen on over the last year is GraphQL. If you're interested in simplifying frontend development while giving it more power and control to dictate the data that it needs for each component / view, you should check out GraphQL. I know what you're thinking... "Ugh, another dev tool to learn etc etc". But it's definitely one that's here the stay and a bigger idea than even REST itself. The OPEN STANDARD of it is KEY.

The idea is that you have a type system at the API level and one endpoint. Querying works using the standard GraphQL language, so it's cross-API compatible: you don't need to learn a whole new API convention at each new place or project you work on, whereas with REST they're probably all built in different ways. And GraphQL works across all platforms (i.e. no more proprietary SDKs). All you need are things like Relay and Apollo and that's it.
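To make that concrete, here's a sketch of the kind of single query a client sends. The schema fields are hypothetical and the `/graphql` endpoint URL is illustrative:

```javascript
// Hypothetical schema: ask for exactly the fields a component needs,
// in one round trip, instead of stitching together several REST calls
// (e.g. /users/1 followed by /users/1/friends).
const query = `
  query {
    user(id: "1") {
      name
      friends { name }
    }
  }
`;

// The query is POSTed to the single endpoint; every GraphQL API
// speaks this same query language, which is the cross-API part:
const body = JSON.stringify({ query });
// fetch("/graphql", {
//   method: "POST",
//   headers: { "Content-Type": "application/json" },
//   body,
// }).then(res => res.json());
```

The server responds with JSON shaped exactly like the query, so the component never over-fetches or under-fetches.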

I've actually been working on a platform called Scaphold.io (https://scaphold.io) to help people learn more about GraphQL. It gives you a cleanly packaged server and manages your data for you, so you can feel the benefits of GraphQL right off the bat. Because it uses GraphQL, it comes with the best of both worlds of Parse and Firebase: awesome data modeling tools, real-time capabilities, and of course it's built on an open standard.

Truly is the future of APIs that works perfectly with React, and any other framework that comes out down the road for that matter. Facebook's tooling is really neat cause it's built from first principles as well as from their experience running one of the largest distributed systems in the world. So they definitely know a thing or two about building developer tools that are cross-platform compatible and developer-friendly.


GraphQL existed a decade ago, it was called OData and it wasn't as revolutionary as everyone thought it would be at the time.


OData - which was essentially LINQ over URL - was a nice (but incomplete) idea with a terrible realization. We used it in two projects in the past and backed away from it in both cases:

- OData never got wider adoption outside (and even inside) MS. GraphQL is already adopted on every major platform.

- OData had a shitton of documentation and format changes with every version. Backward compatibility not included. GraphQL has a clear specification which has been pretty stable since its first publication.

- OData queries had no validation. You could send almost every query just to get runtime response that your framework/library doesn't support that functionality - even if it was developed by Microsoft itself. On the contrary all major GraphQL server implementations have full feature support.

- Thanks to Introspection API GraphQL has self-descriptive metadata model, which is great for tooling and compile time query verification. Also versioning, schema and documentation are all included into GraphQL from start.

- And finally GraphQL focuses on exposing capabilities of your service into well defined format understood by frontend, not on exposing subset of your database to the user - like OData did.


Monocultures develop because one tool is far better than all the others.

Javascript isn't a monoculture because all the tools are bad.


Monocultures develop because of corporate backing. The web being open and unclaimed is what leads to the great diversity. As someone who does both server and JavaScript development, the tools are not bad. Often times they solve real problems well.


That's silly - if anything it's the other way around. Web tools have a ridiculous degree of corporate backing, especially with the countless libraries from Facebook and Google (React, Angular, Go, Dart, Flow, Immutable.js, etc.), and even Microsoft with TypeScript and others. Meanwhile, Make and friends don't have any corporate backing.


None of them can lay claim to the web though, contrasted with other ui platforms like windows, ios, etc.


Plus IE and Chrome, fwiw


The web is pretty much corporate. All the major tools are built by major corporations, all the browsers are developed by major corporations (with the exception of Firefox, but Mozilla gets most of its money from major corporations) all the major web players are major corporations, all the major web monetization is rented by corporations.

The web's no longer an open platform. Yeah you can put up a blog on your own server using only FOSS. But if you want to drive traffic to it, you're using a huge corporate ad network or some SEO company. If you want to scale it, you're using a huge corporate cloud. If you want to make money on it, you're using a huge corporate ad network.


Firefox started out as Netscape so you can trace it back to big company.


Ah and so what good monoculture can you direct us to?


Rails? Ruby isn't a monoculture, but Rails does seem pretty dominant.


If you like the Rails monoculture, you may like Ember as a front-end framework. Most of the choices are made for you, and on the whole they're very good default choices. If you choose Ember, you get: a CLI tool (ember-cli), a templating/rendering engine (Glimmer), a router, a persistence layer (ember-data), a unit testing tool (qunit), a server-side renderer (fastboot), etc. If you want to swap one of things out you can, but the default has great support. There have been some bumps along the way, but everything fits together and works.


By the way, the OP I responded to said "front-end", not "JavaScript". How does your attitude jibe with WebAssembly, which will allow you to compile C++ or Rust for the browser?


I've been struggling to understand how WebAssembly will change things on the web. Will you be able to compile an arbitrary C/C++ binary to run on a browser? Will syscalls be emulated? What about the filesystem?

It looks like to get such a thing working we would need an OS running on top of JS. But maybe I'm missing something.


There is clearly some sort of mapping available to a JavaScript FFI for doing I/O, as both the Unreal engine and Unity have been ported to WebAssembly. If you can run a video game engine in the browser you can run anything. I believe there are proposals in place as well to add opcodes to directly interface with browser capabilities rather than going through a JS FFI.


> If you can run a video game engine in the browser you can run anything

no


enlighten us


ok, assuming your request is not sarcastic:

- operating systems

- hardware drivers

- a networking stack

- trading algorithms

- fighter jet firmware

- ultra low latency DSP

- processing data from LHC collisions or imaging satellites.

- etc

- etc

- etc

just because webgl enables fast gpu processing (by basically writing shader code in a small subset of C) doesn't mean the web platform can now be used for everything.


The implication was applications. But even then, everything on your list is emphatically not true:

- operating systems

Ironically, Linux was one of the first things to ever be compiled to run in the browser, using Emscripten.

- hardware drivers

Linux doesn't run without hardware drivers. In this case, the hardware drivers were wrapping JS FFIs.

- a networking stack

Yes you can.

- trading algorithms

Why not?

- fighter jet firmware

It's not a fighter jet, but the SpaceX Dragon 2 space capsule's UI is built using Chromium.

- ultra low latency DSP

Why not?

- processing data from LHC collisions or imaging satellites.

Why not?


You can emulate a network stack and hardware drivers, but your "network stack" can't directly send packets outside of HTTP/Websockets/WebRTC/etc. and your "hardware drivers" just emulate hardware that doesn't actually exist.

Trading algorithms, ultra low latency DSP, "processing data from LHC collisions or imaging satellites" are I think references to performance limitations. WebAssembly requires a fair bit of runtime checks (though some can be pawned off to the hardware if you're clever), and has a number of other quirks that hurt performance, like the fact that most languages end up needing two stacks when compiled for wasm.

The issue is even clearer when you move to processing large data sets because WebAssembly only allows a 32-bit address space at the moment. Add to that the lack of concurrency or native SIMD, and it's pretty clear it is way too early to dance on the graves of native executables.


Runtime checks don't stop you from doing heavy data processing or having microsecond-level response times. It seems like your objections fall into two categories. One is caused by it running in ring 3, which can be solved by correcting "anything" to "any user program". The other is moderate performance limitations that don't stop it from running full video game engines. Those may not be ideal but they won't stop you from running anything. Video games are among the most demanding things you can do in most ways. The single core of today is better than the multi core of a few years ago, and nobody claimed that those couldn't run all programs.


The memory limits will definitely completely prevent you from running some real-world programs. Programs that don't fold well into a single threaded event loop are also a problem at the moment (including Unreal/Unity games that use background threads).

Also, unless your definition of "can run a game" includes "can run a game at 3fps", I'm pretty skeptical that the entire repertoire of games for those engines can make the jump to WebAssembly today.


Concurrency and SIMD support are in the works.


I'm aware, but "If you can run a video game engine in the browser you can run anything" isn't exactly hedged on the future is it?


wish i had more time for the multitude of your generic "why not?"s, but i don't :(

any language (high or low level) that allows you to write and execute deterministic assembly or machine code can be used to implement whatever you want. whether WASM ever reaches this type of broad security-sensitive wild-west remains to be seen, but my money is on "never".

wasm will certainly expand into additional areas, but they will all be security-gated and will not have unfettered, raw access to all hardware (even with some additional user-grokkable permission model), because it has to run on the web: "W"asm


That's entirely addressed by "The implication was applications."

Nobody was trying to make a claim about web assembly drivers. Just applications.


A tool called Emscripten provides a runtime environment. You can include things from that, like a virtual filesystem. That said, a lot of WebAssembly code doesn't have a dependency on file APIs. For example, you can check out the web-dsp project: https://github.com/shamadee/web-dsp

If you're looking for an intro, I wrote a series of posts on WebAssembly (there are 5 links at the bottom of this intro): https://hacks.mozilla.org/2017/02/a-cartoon-intro-to-webasse...
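For a feel of how small the core format is, here is a minimal module assembled by hand (the bytes encode a single exported `add` function), instantiated straight from JavaScript with no Emscripten or file APIs involved:

```javascript
// A minimal, hand-assembled WebAssembly module exporting add(a, b).
const bytes = new Uint8Array([
  0x00, 0x61, 0x73, 0x6d, 0x01, 0x00, 0x00, 0x00,       // magic + version
  0x01, 0x07, 0x01, 0x60, 0x02, 0x7f, 0x7f, 0x01, 0x7f, // type: (i32,i32)->i32
  0x03, 0x02, 0x01, 0x00,                               // one function, type 0
  0x07, 0x07, 0x01, 0x03, 0x61, 0x64, 0x64, 0x00, 0x00, // export "add"
  0x0a, 0x09, 0x01, 0x07, 0x00,                         // code section header
  0x20, 0x00, 0x20, 0x01, 0x6a, 0x0b,                   // local.get 0/1, i32.add, end
]);

// Synchronous instantiation (fine for tiny modules like this one):
const instance = new WebAssembly.Instance(new WebAssembly.Module(bytes));
// instance.exports.add(2, 3) === 5
```

In a real project you'd compile C/C++ to these bytes, fetch them, and use the async `WebAssembly.instantiate`, but the instantiation model is the same.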


> It looks like to get such a thing working we would need an OS running on top of JS. But maybe I'm missing something.

You're not missing anything, that's exactly what's happening. WebGL/VR, WebCL, WebRTC, Filesystem API... the browser is becoming a simple feature-rich client OS for the cloud, and WebAssembly is the machine language.


Javascript fatigue fatigue.


It won't change the fact that a lightweight scripting language is being coerced into running increasingly heavier apps. Defending it comes off as Stockholm syndrome, really.


What does "lightweight scripting language" mean? So much hand waving in this thread. ES6 is an advanced programming language rivaling most others and JS VMs are some of the most optimized runtimes in existence.


https://en.m.wikipedia.org/wiki/Scripting_language

Arguing grammar and references to adoption rates just illustrates the rest.


> To expect web development to be a perfect monoculture is a failure

Except, it's already a monoculture of one language (JS) and one display algorithm (DOM) and everything is just compensating for that.

The "runtime" is not general enough.


WebAssembly will remedy that.


People wouldn't complain about it so much if it didn't suck.


> I don't know about others but I tire of these comments chiding the "out of control" front-end ecosystem

Old people also get tired of all these "apps" vying for your attention; they want "just a phone". Young people grew up with it. Note to self: I'm getting old.

Old people were used to buying development toolset (compiler+tools+IDE), usually including a physical book, with which they can create software. Young people are used to assessing open source projects, and piecing a project together on top of many freely available libraries/frameworks.


I'd be careful of making such generalized statements, whipper-snapper... some of us senile old devs actually pieced together some "wild and free" C libs (I think we used duct tape and hairpins, but I'm losing my memory in my advanced age) and even used dynamic loading (DLLs of course) back in the day.


>Old people also get tired by all these "apps" wining for your attention; they want "just a phone". Young people grew up with it.

https://www.youtube.com/watch?v=z-194bOCJnE


Finding my "State Architecture Patterns in React" article series on this excellent list of yours was a nice surprise. Thanks!


Heh, sure. I think you took it in a couple directions I didn't necessarily agree with myself, and as a maintainer of Redux I'm _very_ biased in its favor. But, the writeups were excellent overall, and definitely worth including as relevant reading material.


Thanks for such a high quality comment with links to even higher quality resources.


Your link [1] seems to be broken (https://github.com/markerikson/react-redux-links/blob/master...) just a heads up.


Whoops, that's what I get for literally copying and pasting my comment :) Fixed - thanks!


> The company hasn’t previously talked about React Fiber

Except, you know, the entire presentation on it at React Conf back in March.


It's TechCrunch... I won't say anymore because I'm also a member of the press... but it took me 3 seconds to find this on Google: https://gist.github.com/duivvv/2ba00d413b8ff7bc1fa5a2e51c61b...

Somebody forgot to fact-check his article. That's for sure.


> Somebody forgot to fact-check his article. That's for sure.

I think that's just called "research," and it's usually step #1 for this kind of article...


Most reporters these days don't appear to fact check anything anymore. It's too important to get the first story out.


I always imagined that a company of TechCrunch's size would have interns that do all the fact-checking... similar to VICE: https://motherboard.vice.com/en_us/article/editorial-intern-...


they could just follow Dan Abramov on Twitter and they would have known about it for a good while -.-


They've talked about it so much they got sick of answering fan mail and stood up a site for it:

http://isfiberreadyyet.com/


Or on twitter, or on github :)


Or the dedicated website on Fiber's progress.

http://isfiberreadyyet.com/


They changed the article to "The company hasn’t previously talked MUCH about React Fiber" (emphasis is my own).



> This page is rendered with it.

Yep, I can tell from the 5 seconds of blank screen while my phone browser loads all that JS.


I think it runs the test suite live... It's not meant to be fast, but to demonstrate compatibility.


Are you sure? Because that's a waste of resources. I'd imagine they would cache the results, which would make the GP's concerns about speed valid.

Edit: Found the github repo - https://github.com/tomocchino/isfiberreadyyet

Looks like it fetches from the umbrella issue[1]

https://github.com/facebook/react/issues/7925


It fetches from GitHub. The website was built in a weekend by our manager to make the task of a complete rewrite a little more fun for the team. Sorry it’s not perfect! ;-)


This app was written by a manager, don't read into it too much ;)


The page is completely blank with javascript disabled (which is how I browse the web). So I guess it's not ready yet.

Do any of these fancy frameworks actually make a HTML fallback anymore? Most of the web pages I visit still renders fine without javascript.


That's the paradigm of single-page apps (SPAs). Rather than the traditional client/server model where the server renders the page and sends it back to you, SPAs do all the rendering on the client side and simply pass data back and forth to the server through APIs. It's (in my opinion) a superior model for apps, but it has a lot of disadvantages that make it unsuitable for websites—mainly that SPAs take longer to load initially and don't work without JS enabled (which also affects search engine crawlers).

Having said that, it's entirely possible (though not easy) to have your React app work without JS enabled by rendering it on the server before sending the page to the client. This way, you get the best of both worlds, but it's a lot of work to implement this properly for a remotely complex app. The answer in the short-term is that you shouldn't be using a "fancy framework" for anything for which the above disadvantages are a deal-breaker (like most public sites) until the server-rendering story matures and becomes more compelling.
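The core idea is simpler than it sounds: keep rendering as a pure function of state and run that same function in both places. A toy sketch of the concept (not React's actual API; all names here are illustrative):

```javascript
// The same pure "state -> markup" function can run on the server (to
// send a complete, JS-free page) and on the client (for in-place
// updates after the JS bundle loads).
function renderProfile(state) {
  return `<div><h1>${state.name}</h1><p>${state.bio}</p></div>`;
}

// Server: respond with fully formed HTML, so crawlers and no-JS
// browsers see real content immediately.
const html = renderProfile({ name: "Ada", bio: "Likes engines." });

// Client: after hydration, later state changes call renderProfile
// again and patch the page without a full reload.
```

The hard part in practice is everything around this function: data fetching, routing, and making sure the client picks up exactly where the server-rendered markup left off.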


God, that page makes me want to drop everything and work to make tests pass. My team needs that.


Yay my tests pass. Wait, what were we building again?


A web page to show our unit test results. I forget why.


I love the way the graph shows they actually broke some tests recently


They didn't break tests, instead a whole load of new tests were added, specifically around server side rendering, which exposed some gaps in coverage.

This is exactly the type of openness you want to see for such a widely used project like React


That must be the slowest-loading page I've visited today, though.


I mean, what did you expect?

Won't be long until someone upgrades their 16-thread, 32 GB machine because Facebook was "too slow".

I type this on a dual core phone with 2GB of RAM that's faster than most desktops 15 years ago thinking I need a faster device.


React Fiber was beautifully described by Lin Clark at the last ReactConf: https://www.youtube.com/watch?v=ZCuYPiUIONs Congratulations on the release.


It is one of the most accessible talks for someone like me who doesn't even know anything about React.


Yes, this presentation is so good.


Great. I was waiting for React to stabilize before I learned it.

I've been cleverly waiting since 2006 to learn jquery. Will start next year.


> Great. I was waiting for React to stabilize before I learned it.

But once it's stable, it's outdated.


It's been stable for more than a year now.


Best practices will come out in 2025!


Not sure if serious, but this update swaps out the internals without affecting users/developers, the API is fully backwards compatible


It's actually not. There are some lifecycle changes, potential modifications to setState (async only), etc. That's why it's being released as a new major version.
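A toy illustration of the kind of code that breaks when updates become batched/asynchronous (this is not React's implementation, just the shape of the problem):

```javascript
// Reading this.state right after setState no longer sees the new
// value once updates are queued instead of applied immediately.
class TinyComponent {
  constructor() {
    this.state = { count: 0 };
    this.queue = []; // pending partial states
  }
  setState(partial) {
    this.queue.push(partial); // deferred, not applied synchronously
  }
  flush() {
    // a scheduler would do this later, between units of work
    for (const partial of this.queue) {
      this.state = { ...this.state, ...partial };
    }
    this.queue = [];
  }
}

const c = new TinyComponent();
c.setState({ count: c.state.count + 1 });
const readTooSoon = c.state.count; // still 0: update not flushed yet
c.flush();
const afterFlush = c.state.count;  // now 1
```

This is why the updater-function form (`setState(prev => ...)`) is recommended over reading state immediately after a set.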


setState is not async-only in React 16. We keep the old behavior and just change the engine now. Later we will look into enabling new behavior.


I think we may be splitting hairs over the definition of API compatibility


It's one of those "99% compatible, except for a few small quirks" situations that only developers using it on large applications will notice.

The React devs have also been great with releasing codemods for breaking APIs.


No API changes doesn't mean that your code will work 100% as it was before.


Lin Clark's talk makes this sound like they implemented a scheduler in React -- basically JS is single-threaded, so they're implementing their own primitives and a scheduler for executing those on that main thread.

Sounds neat, but it also seems like an explosion in complexity -- perhaps there will be a bunch of weird scheduler bugs the operating system folks more or less figured out a long time ago?
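Roughly, the trick is a cooperative work loop: split rendering into small units of work and yield back to the browser when the frame's time budget runs out. A toy sketch (not Fiber's actual code; the deadline object mimics `requestIdleCallback`'s):

```javascript
// Process units of work until the time budget for this frame is
// spent, then return what's left for the next idle callback.
function workLoop(units, deadline) {
  while (units.length > 0 && deadline.timeRemaining() > 0) {
    const unit = units.shift();
    unit(); // perform one small unit of rendering work
  }
  return units.length; // units deferred to the next frame
}

// Simulate one frame with a ~5ms budget, each unit costing ~1ms:
let budget = 5;
const deadline = { timeRemaining: () => budget };
const done = [];
const units = Array.from({ length: 10 }, (_, i) => () => {
  done.push(i);
  budget -= 1;
});
const remaining = workLoop(units, deadline);
// 5 units ran this frame; 5 remain for the next one
```

Between frames, the browser gets the main thread back to handle input and paint, which is exactly the responsiveness win the talk describes.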


This is a valid concern, and we'll do our best not to screw up. :-)

It's not as hard because there is no parallelism (at least now) and any mutations of the internal data structures happen in two or three tightly controlled places. But yea, debugging these is harder during development (for us) and we'll need to make sure our testing story is rock solid before shipping the asynchronous mode by default.

One thing that really helps here is we test all updates on Facebook which uses React in many places and has a giant user base. So regressions that lead to a drop in metrics (e.g. a crash or a race condition) automatically get reported to us, and this adds a second line of defense after unit and integration tests.


New code certainly carries risks that old, known code doesn't. And yes, this does carry over towards OS scheduler territory.

That said, the early feedback from others in the React community is that the Fiber implementation is simpler overall and easier to work with than the "React Stack" implementation. Fiber is also explicitly built to support additional renderers on top of the core reconciler, while building renderers on the Stack implementation apparently required some levels of monkey-patching or reaching into internals.


I'm an author of a custom renderer and yes, it's so much easier to make a renderer work with Fiber :) I'll do a write-up on it eventually, but I'm sure others with more time will beat me to it.


... that was pretty uninformative. For the first 4 paragraphs the author is just talking about nothing.

React Fiber is nothing new. It's more than a year old. Facebook has talked about it many times. Anyone who uses React would know this.

Shouldn't we prefer official sources instead of articles written by third party content writers that are clearly not that knowledgeable? Seriously, this article looks like the author was being paid by character count and not quality.


See, Google should learn from Facebook: total API compatibility with the previous version.

They saw what happened with Angular and Angular 2—which Google named the same despite it being a totally different framework—and made a smart move.


That's a luxury you only get when your API is good. With Angular, the problem was the API.


Angular 1.x was never meant to be an MVC framework initially. It was mainly for templating; then they started adding stuff to it (services, directives, the digest cycle, dependency injection...), leading to API bloat.

By the time Angular 2 was released, Ember, Vue and React had devoured the market.


Unfortunately that is also the case with Angular v2. The API surface is enormous.


What does that have to do with whether or not to completely change the name of the rewrite?


Facebook's motto is "move fast and break things." I'm glad they didn't follow this when making React Fiber. I suspect they learned from Angular 2's failure.


Is this the rewrite of the diffing engine that we've known about for a good while?


Yep - if you are interested in learning more technical details, check out https://github.com/acdlite/react-fiber-architecture .


Is this as big a change as Angular 2 for you react devs?


Not at all, nothing incompatible changes on the surface. There are more features and no breaking changes. It's mostly all under the hood!

There will be new public APIs to build your own custom rendering engine (e.g. how there is react-dom for browsers and react-native for mobile), and it will get easier to target new platforms thanks to Fiber.


Those who have side effects in their component lifecycle methods might have some problems when they turn on the async mode.


Which, in Facebook's defense, they've explicitly and often warned against.


This will not be default until React 17 I believe.


Thanks.


Most likely not, as it's supposed to be backwards compatible.


No, this is mostly an internal change, the API is not changing.


Some of the API semantics are changing, e.g. componentWillMount can't be trusted any more (it can fire many times), but the changes are going to be pretty small.


Whoah, hang on, really? Do you have a source on this or somewhere I can find out more details?

This will completely break a lot of libraries, components, etc. that rely on the React lifecycle contracts. Certainly doesn't feel like a small change! :-/


Several relevant pointers:

- https://github.com/facebook/react/issues/7671

- https://twitter.com/dan_abramov/status/790590733468241920

- https://www.reddit.com/r/reactjs/comments/5fg7iq/why_should_...

- https://daveceddia.com/where-fetch-data-componentwillmount-v...

Also see Lin Clark's "React Fiber" talk, linked upthread. Basically, pieces of work can start and be interrupted, and then be restarted, which will mean that `componentWillMount` can be called several times before the component _actually_ mounts.
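A toy sketch of the hazard in plain JS (illustrative only, not React's actual code): because the async renderer can discard and restart render-phase work, a "will mount" hook may run more than once, while the commit-phase hook still runs exactly once.

```javascript
// Toy simulation: an async renderer may start, throw away, and restart
// render-phase work, so a "will mount" hook can run more than once.
let fetchCount = 0;

const component = {
  componentWillMount() { fetchCount += 1; },  // side effect here is unsafe
  componentDidMount() { /* safe: runs once, after commit */ },
};

// The renderer begins work, gets interrupted, and restarts it:
component.componentWillMount();  // first attempt
component.componentWillMount();  // restarted attempt
component.componentDidMount();   // commit happens once

console.log(fetchCount); // 2 -- the "fetch" fired twice
```

This is why the advice in the links above is to move data fetching into componentDidMount.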


To be clear this is irrelevant to React 16 which works in sync mode for compatibility. We'll share more about any changes when we start enabling the async mode in Facebook products and figure out the best migration strategy.


Fortunately, componentWillMount is largely made redundant by the use of constructors.


Except it's not when you actually want to know when the component was put into the DOM, or when you're running React server-side.


> when you actually want to know when the component was put into the DOM

You can use componentDidMount for that and its semantics are still the same.


Yes, componentDidMount is the place to work with DOM. componentWillMount never worked this way—it is basically a glorified constructor from pre-ES6 era.


Oh, apologies - I read the comment as componentDidMount. My bad.


Not in terms of it being a breaking change. It's meant to be a big speed improvement but the API is largely the same I believe.


Please note that Fiber is not meant to be a “big speed improvement”. (React is already pretty fast, and most speed improvements you can do in this space are marginal for most apps.)

Fiber is a rewrite that makes it easier for us to add new features to React (and it adds some). It also adds the foundation for enabling better perceived performance in the future releases thanks to async pre-rendering and more control over scheduling. That part is still work in progress and won't be enabled in React 16 by default.

So we do think eventually it’ll enable us to build more responsive apps, but we’re still working out the details and experimenting with what it can do. Don’t expect any drastic changes on this front in React 16.


I've been working with react for a bit more than a year, and I am trying elm. I think elm is the best thing that happened to the web in general. It's really daunting at first, but I highly encourage you to try it.


Elm blows me away. It's everything I saw that React was trying to do, but bound up in an actual language that has strong Haskell-ish types, is fully functional, and mind-bendingly fun to learn.

Elm really could be the future of bullet-proof web development. It has everything it needs except for runaway popularity.


"Mind-bending" being presented as a feature could possibly explain the lack of popularity.


I dismissed elm about 10 times. But once I tried it, it was great. The pure functional approach of elm is like solving an equation. Once it is solved, it's done, you can go home. In the sense that you write elm code only once: if it works, it will always work. And if it doesn't work, it won't compile anyway.


Mind-bending in a good way, like the way that Haskell lets you express constructs that you may never really have used in other languages.


Elm is amazing! If you have all the pieces in place (model, view and update), once it compiles successfully... it just works. It's mind-blowing. This also makes refactoring a joy.


Neither the comments nor the article mention this, so I figured I'd ask here -- the React team has been promising performance improvements for stateless functional components (I think because they require less/no overhead and lifecycle management) for some time now, but I don't think it's moved beyond that.

Does Fiber introduce any of those optimizations, or nah?


Not yet. Seems like there is a loose plan to start implementing some of those optimizations once fiber lands in React 16.


Yes, functional components take a separate, simpler code path that should be faster.


I'm excited for relay's new core api to be released, as well.


Me too! More info is available in that blog post: https://news.ycombinator.com/item?id=14142196.


Anyone have a link to some benchmarks? As much as I enjoy the hype train sometimes, I'm not finding hard data on how much faster this is than the current React/Preact/Inferno.

So much hype and no word about performance... I'm wondering if it's slower.


Please see my reply in https://news.ycombinator.com/item?id=14144042.

Regarding benchmarks, you might enjoy reading this: https://medium.com/@localvoid/how-to-win-in-web-framework-be...


Sorry, it wasn't entirely clear to me from the article. Is React Fiber going to be a completely new project, or is it just a name for the new release? It seems like the latter to me since there's no API changes...


It's a rewrite of the internals, particularly the diffing/reconciliation algorithm. React Fiber involves effectively re-implementing the JS call stack using a data structure called "fibers" to track work that needs to be done.
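A rough sketch of that idea (names are illustrative, not React's internals): each unit of work lives in a heap-allocated node with a pointer to the next one, so the work loop can stop after any unit and resume later instead of being trapped in one deep recursive call.

```javascript
// Build a linked list of "fibers", one per unit of work.
function makeFibers(names) {
  let head = null;
  for (let i = names.length - 1; i >= 0; i--) {
    head = { name: names[i], next: head };
  }
  return head;
}

// Process units until the budget is spent, then return the remaining work.
function workLoop(fiber, budget, done) {
  while (fiber && budget > 0) {
    done.push(fiber.name); // "perform" one unit of work
    fiber = fiber.next;
    budget -= 1;
  }
  return fiber; // resumable later -- this is what the call stack can't do
}

const done = [];
let rest = workLoop(makeFibers(["App", "Header", "List", "Item"]), 2, done);
rest = workLoop(rest, 2, done); // resume where we left off

console.log(done.join(",")); // App,Header,List,Item
```

The point is that between the two `workLoop` calls the main thread is free to handle something more urgent.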


The question still stands: will we be bumping a version number or changing a package name?


It's a major version release. v15.5 added deprecation warnings, v15.6 will add a few more warnings and fixes, and v16.0 will be a major release that includes the Fiber internals turned on. (This release path is described in the issues I linked in my earlier comment, particularly https://github.com/facebook/react/issues/8854 ).


It seems similar to what HopJS does to handle tail calls [1]. It is essentially a trampoline, if I understood the docs correctly.

[1] https://pdfs.semanticscholar.org/d10d/aba5df8b4c563b62fd01a3...
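For anyone unfamiliar with the term, a generic trampoline looks something like this (a textbook sketch, not HopJS's or Fiber's actual code): each "recursive" step returns a thunk, and a flat loop keeps invoking thunks until a plain value comes back, so the native call stack never grows.

```javascript
// Wrap a step function so repeated thunks are unwound in a loop.
function trampoline(fn) {
  return function (...args) {
    let result = fn(...args);
    while (typeof result === "function") result = result(); // keep bouncing
    return result;
  };
}

// Sum 1..n without deep recursion: each step returns a thunk, not a call.
const sumTo = trampoline(function step(n, acc = 0) {
  return n === 0 ? acc : () => step(n - 1, acc + n);
});

console.log(sumTo(100000)); // 5000050000 -- no stack overflow
```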


They "announced" this ages ago, didn't they?


Yes.


So they have created a kind of map/reduce system to distribute computation over a tree, except everything still runs in the same Javascript main thread.


Not at all. Fiber is a method of divvying up the work the reconciler is doing so the main thread can process higher priority updates ahead of ones that matter less. There are other benefits, but this is the central one.


UI updates seem to be only possible in the main thread


My understanding is that it's a prioritisation system. If all of the UI changes can't be completed before the next render, the lowest priority changes are deferred to the next render.

It should help with things where perceived lag is undesirable, like typing in an input field.
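As a crude sketch of that deferral idea (illustrative only; React's real scheduler is far more involved): pending updates carry a priority, each "frame" processes as much high-priority work as the budget allows, and whatever doesn't fit stays queued for later.

```javascript
// Pending updates, tagged with a priority (names are made up for the example).
const HIGH = 0, LOW = 1;
const queue = [
  { name: "analytics-rerender", priority: LOW },
  { name: "keystroke", priority: HIGH },
  { name: "offscreen-list", priority: LOW },
];

// Process up to `budget` updates this frame, urgent work first.
function runFrame(queue, budget) {
  queue.sort((a, b) => a.priority - b.priority);
  return queue.splice(0, budget).map((u) => u.name);
}

const firstFrame = runFrame(queue, 1);
console.log(firstFrame.join(",")); // keystroke
console.log(queue.length);         // 2 -- low-priority work deferred
```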


Correct me if I'm wrong, but Fiber sounds like a micro-tasker for React? Ember has been doing this for years through the Backburner run loop.


With the ability to pause large updates in the middle without committing them to UI, doing some higher priority updates, and then "rebasing" lower priority updates on top of them and continuing the work. So it's a bit more sophisticated.


Does Fiber yield control back to the js engine when doing the "rebasing"? Or are all of these synchronous in one frame?


Yea, it does. We are using requestIdleCallback to do the work "while we can", and then yield the control back. Again, this won't be a part of React 16, but it's part of the bigger picture we are moving towards.

Relevant code if you're curious: https://github.com/facebook/react/blob/233195cb6bc632ade61a8...
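For anyone who hasn't used requestIdleCallback, the cooperative pattern looks roughly like this (the deadline is stubbed so the snippet runs outside a browser; in the real API the callback receives an IdleDeadline object):

```javascript
const work = ["a", "b", "c", "d", "e"];
const processed = [];

function workLoop(deadline) {
  // Do one small unit at a time, re-checking the clock between units.
  while (work.length > 0 && deadline.timeRemaining() > 0) {
    processed.push(work.shift());
  }
  // In a browser: if (work.length) requestIdleCallback(workLoop);
}

// Fake a deadline that allows exactly three units of work this "idle period".
let ticks = 3;
workLoop({ timeRemaining: () => ticks-- });

console.log(processed.join(",")); // a,b,c
console.log(work.join(","));      // d,e -- control yielded back
```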


Interesting...

Since requestIdleCallback leaves the scheduling out of your hands, how do you ensure nodes are still updated when you need it? Like in tests?


This is great news. Definitely, merging CSS animations with React prop/state changes has been the biggest friction point with React (i.e. managing DOM state from two different sources).


Feels like every month they [re]announce React Fiber... there's no reason why this needs to be repeated again and again. And every time it shoots up to #1 on HN.


Sorry! Definitely no intention to bombard you with the same news. It's true that we had a couple talks at last month's React Conf but Facebook's conference F8 is this week which comes with a wide publicity push. Hopefully the next time you hear about it will be the final open source release later this year!


Any news on bundle size? And maybe Closure Compiler compatibility for dead-code elimination?


No final numbers yet, but we recently switched to Rollup for our own builds which will be part of 16.0, and we'll do some more work on file size in the next few months.

For DCE, we've started splitting some features like createClass out of the main package (https://facebook.github.io/react/blog/2017/04/07/react-v15.5...) which will reduce the bundle size for anyone not using them, regardless of how they compile their JS.


I'm excited! React-dom is currently one of the heaviest libraries that I'm using.


What bundle size range can we expect?

Many mobile users are on throttled connections, due to low data caps, and research has shown users usually abandon a site after 1 second of loading time.

That means with a throttled connection, less than 8KiB overall CSS, JS and HTML can be transmitted, and with a 1Mbps connection, 128KiB can be transmitted.
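The arithmetic behind those numbers, for reference (assuming a 64 kbps throttled line, and reading "1 Mbps" as 2^20 bits/s, which is how the 128 KiB figure works out):

```javascript
// Bytes transferable in one second, expressed in KiB.
const kibPerSecond = (bitsPerSecond) => bitsPerSecond / 8 / 1024;

console.log(kibPerSecond(64 * 1024));   // 8   KiB/s on a throttled line
console.log(kibPerSecond(1024 * 1024)); // 128 KiB/s at 1 Mbps
```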

How many minutes will users have to wait with React Fiber? And can it be improved in any way to get below a few seconds? Or will I have to continue to write vanilla JS to get responsive apps?


Answering @vote-for-pedro: Yes, currently, a working React installation takes around 6-10 seconds to load on dialup or throttled internet, without caching. Which is pretty good.

But React Fiber is doing a lot more than legacy React, so it'd be interesting to know how much larger it will get - and if it would still be performant enough.


Will React Fiber be utilising WebAssembly for a performance increase, particularly for the virtual DOM? (falling back to JavaScript when necessary, of course)


WebAssembly isn't some magic bullet that's going to make every piece of code faster. Anything strongly connected to the DOM is still going to be slow, because the DOM slows things down a lot more than JavaScript does.


No, there are no plans to use WebAssembly in the next few releases although we'd like to explore this eventually.


> It’s already in use on Facebook.com today

What's preventing us from using it in our apps, then? Do they limit the use of the API to what's implemented in Fiber in some parts of the website?


Not limited at all! You can install react@next from npm and use fiber today. It's just not released as a production ready version.


As far as I know the server-side rendering isn't done yet.


Server rendering is missing, and a few other things. (You can check React 16 umbrella issue in https://github.com/facebook/react/issues/8854.)


Is Fiber to React what Glimmer is to Ember? Is it a big enough improvement to reconsider using React instead? (I was stoked on the Glimmer stuff.)


React is mostly an API contract (React.Component), and it didn't change. If you didn't like it for that reason, you won't like it any more now.

If you left React for performance reasons especially related to real-time updates or animations, well this version is better.


Since Glimmer is implemented as a bytecode VM, they were able to go ahead and pull in many of the improvements that Fiber brings to React:

https://thefeedbackloop.xyz/designing-and-implementing-glimm...


Good to read that React is faster now.

Can we write an efficient rich text editor like Google Docs in React? Or would it still be too slow?


How will this affect React Native?


Looks like it will be a big help to RN. However, I can't confirm that it's a drop-in replacement.

https://facebook.github.io/react/contributing/codebase-overv...


Awesome sauce


How does it compare with Vue.js? I've never worked with any modern JS framework like this, but on paper I did not like writing CSS in JavaScript; Vue.js had a normal way to write CSS, and to me it looked superior in other ways as well.


Check out styled-components. It's 100% CSS and it's awesome.

https://github.com/styled-components/styled-components

Edit: for the record, there's nothing stopping you from using CSS files with classnames on your tags as usual, but libraries like styled-components allow you to easily scope and colocate your CSS with your components.


Hey, that's us!

Thanks for posting this Scott, if anybody has questions about styled-components feel free ping me here or on Twitter. (@mxstbr)


I've been using React with CSS/SCSS files. You do not need to write CSS in javascript.


What was your objection to writing CSS in JS? It would seem like a fairly trivial syntactic transition to go from one to the other? It's just selectors with a dictionary of properties, right?


Writing CSS in JavaScript brings back memories of pre-2000 web development: table-based layouts and, more horribly, inline styles inside HTML tags. CSS was a breath of fresh air; it meant separation of concerns, with HTML for the content and CSS for the visual representation. With the advent of jQuery it allowed separation of behavior as well.

With frameworks like React these days, content+behavior is in JavaScript already, and with writing CSS in JavaScript, we've gone back 1.5 decades I think.


I had this same initial reaction, but concluded as others have that this architectural approach was not actually a separation of concerns but merely an unwieldy separation of technologies: http://blog.andrewray.me/youre-missing-the-point-of-jsx/

When you start thinking in components it becomes much nicer to have the styling, logic and layout all in one file, isolated from the rest of the application and easy to edit, move around and apply to any page.

The previous 'best practice' of having these coupled aspects in separate files now seems quaint and tedious.

There are of course trade-offs, and CSS-in-JS has not triumphed yet, but if you've held off from considering this approach simply because you've had 'separation of technologies' drummed into you in the past you might be missing out.


CSS isn't separation of concerns, it's just separation of filetypes. The problem with inline styles was that you had to repeat them all over the place. This is not the case with Javascript, since it's a programming language and it has an import system. I feel like the sentiment that JSX and JS CSS break separation of concerns is cargo-culting what separation of concerns is all about and why it's useful.


It's another new proprietary syntax/API for CSS.


My objection is: I already learned how to write Less and then later switched to SCSS. I like the syntax because it's basically CSS, and I even avoid heavy nesting, taking Bootstrap as a good example. I use scss-lint (now deprecated but still works great) to live-lint my SCSS in Atom and spot errors, and that includes property order! Can I do this with CSS that is embedded into stupid React files that mix JS logic with the visuals? My guess is there is probably something out there that can do something very basic, but nothing close to real, feature-complete linting. And yes, I don't think style and logic should be mixed together. What about Autoprefixer? How do I do all my SCSS/CSS processing if my CSS sits in JS files?


The style I like to use is the css/scss webpack loader. Basically, you import a .scss file into your .jsx. The SCSS is compiled and at the same time imported into the .jsx as a hashmap, where the key is your SCSS class name and the value is the compiled class name. I think this sounds like what you're looking for, since it keeps the JSX and CSS in their respective files while still coupling them into their own component, which is easier to reason about.

http://javascriptplayground.com/blog/2016/07/css-modules-web...
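For reference, a minimal sketch of that setup (loader names follow the webpack/css-loader/sass-loader convention; option shapes vary between major versions, so treat this as a starting point rather than a drop-in config):

```javascript
// webpack.config.js -- CSS Modules with Sass (illustrative sketch)
module.exports = {
  module: {
    rules: [
      {
        test: /\.scss$/,
        use: [
          "style-loader",                                       // inject <style> tags
          { loader: "css-loader", options: { modules: true } }, // hash class names
          "sass-loader",                                        // compile SCSS first
        ],
      },
    ],
  },
};
```

Then `import styles from "./Button.scss"` gives you an object of hashed class names, used as `className={styles.primary}`.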



Vue is pretty nice. Simple, high performance, and opinionated just enough to enforce some consistency in use.


Vue is much, much faster to load; it is possible to use Vue applications over 3G where React apps choke. Vue is also, in my experience, much easier for back-end developers to contribute to without making app-breaking performance mistakes. It has a canonical event-driven store, Vuex, with much nicer debugging tools than any of the Flux frameworks (it includes getter values, for example). I have also found the error messages more meaningful across the ecosystem; React can sometimes produce very unhelpful errors.

What it lacks is some of the maturity. For example, dynamic loading of assets only works if they are static, and the CLI isn't trivially deployable. It is also less standard among front-end developers, which mostly comes through in the selection of libraries available (Vue is close enough to vanilla templating that the learning curve is shallow).


>I have also found the error messages more meaningful across the ecosystem; React can sometimes produce very unhelpful errors.

Can you file an issue next time? We're happy to fix confusing errors.

>much, much faster to load

I would love to see some data backing it up. I thought they are roughly in the same ballpark with Vue having slightly more built-in APIs but I might be wrong!


>> much, much faster to load

> I would love to see some data backing it up.

In Stefan Krause' js-framework-benchmark, the vue implementation starts in 55ms vs 89-113ms for the various react implementations. See the 'startup time' row here: https://rawgit.com/krausest/js-framework-benchmark/master/we...

I haven't profiled vue and react on this particular benchmark, but I have done so for some of the other implementations. Barring anything exotic during startup, the 'startup time' benchmark comes down to package size. It takes time to fetch, parse, compile and load javascript. React is just a bigger package.

All disclaimers about benchmarks, microbenchmarks, n=1, YMMV, etc apply.


I switched my complex app from React to Vue and the speed difference wasn't really noticable. Also, React had far better error messages. Vue's are often useless. This is actually the only thing I miss about React. Vue has been a better experience in every other way.


What I don't like about Vue is that it still has a clumsy templating pseudolang, whereas JSX lets you do what you need to do with inline JS statements.


Building your project upon a framework means abstraction, idioms on top of the programming language one has to learn, and the pending migration horror when the framework changes its API or gets deprecated.

Often it makes sense to at least think about alternative solutions, like using libraries instead of a framework. Libraries are a different concept that is more flexible. In any case a stable API is preferred to a moving target. A real problem is when a project moves so fast that one has to update to every new minor version of the framework/library; skip one and you might be left out of the supported upgrade path (with frameworks) - lots of maintenance work that could be spent elsewhere (coding new features, etc).


Does this mean something for someone starting in React Native, or are the changes in that regard just under the hood?


The maturity of the code base has been reset.


Under the hood. (We'll update RN to use it soon.)


Another day another JS Framework.


another day, another dude who didn't read the article in the link, and also didn't read the link title.


Does this mean the frontend ecosystem has yet another new framework? Do we have to retire the old one and learn the new one all over again?


No, because React Fiber is more or less a rewrite of the internals with no API changes.


This is much like the Glimmer announcement. They're simply optimizing and refactoring the rendering engine, with the API not changing, unlike with Angular.


No thanks, I'll stick with plain HTML and jQuery for the frontend of my small projects. Plain JavaScript works perfectly fine as long as you namespace your functions and separate them into different files. All these frameworks end up creating much more problems than they solve.


As someone who has worked on a truly large React project, I can assure you this is not true.

I cannot imagine the shitshow that project would be using jQuery to track the state.

Nearly as soon as you progress from 'backend rendering html' to single-page-app, it behooves you to use React.


What's the major advantage of moving from "backend rendering html" to single-page-app?


There are trade-offs. In very broad terms, single-page-apps are better for something that behaves more like an application (has a lot of user interaction or real-time behavior) than something that behaves more like a website (a page is loaded and the user views that now-static page for a period of time).

I try to fit the more traditional paradigm when I'm allowed to architect an app myself. I like rendering a page on the backend.

Anyway, you can read arguments by smarter people than myself: http://stackoverflow.com/questions/21862054/single-page-appl...

That's just one thread of many.


> As someone who has worked on a truly large React project, I can assure you this is not true.

Worked on or maintained an existing app for a number of years? There is a very big difference.


> As someone who has worked on a truly large React project, I can assure you this is not true.

There's no such thing as the best approach for all needs. React brings you loads of advantages, but it assumes stuff you might not be OK with:

- spend time maintaining your project over time, fixing stuff on version changes that won't bring any direct value to your project

- you're ok with the licence and with the company maintaining it.

- hard to debug stack trace

- ship code that you have no control over, increasing the payload size (at least for small to medium size projects)

> I cannot imagine the shitshow that project would be using jQuery to track the state.

First, nobody forces you to use jQuery. Manipulating the DOM isn't what it was when people had to maintain stuff on IE6. Dealing with the DOM directly isn't a "shit show" anymore, and if you understand the concepts behind Redux you can apply those general concepts in your project to manage state. Plain JS simply assumes the DOM isn't a magical piece of software; it just has prototypes and stuff people usually don't like around here, but hey, if you're coding stuff for the web, it's better to know how to deal with it than to deny it.

ES6 brings lots of great features you can use today; you don't need React for that.


React is smaller than jQuery and can go down to almost nothing by aliasing to react-lite, which strips out propTypes and other things.

React and Redux make debugging the easiest it has ever been. While it's true that crashing stateless components can cause some headaches (this will be addressed in Fiber), inspecting state and the flow of actions - how something rendered or changed and why - I haven't had such an experience before with any other tool or language.


> All these frameworks end up creating much more problems than they solve.

For small projects, sure, they might. But working at larger scale falls apart using plain html and jquery.


I assume by scale you mean many pages and not many visitors. For something like Hacker News, plain HTML and jQuery would work perfectly fine.


Yeah, I didn't mean scaling to users.

Scale was a poor choice of words. Front-end complexity is a better metric, and in particular complexity which relies on manipulating state.


Even for small projects libraries such as Vue or React are very useful.

Of course it feels easier to stick with what you know, but managing the DOM with data is so much easier than using jQuery + Handlebars.

Seriously, spend some time learning Vue. You won't regret it.

It has to be said that you don't need any build process (webpack, gulp, etc.) to use Vue. You can simply write your templates in your index.html and import it using <script>.


Just when I thought that we had enough JS frameworks and there was nothing fundamentally different left to do, along came Vue.js ...


That's what I thought until I decided to use Vue for one of my smallish projects and subsequently realized the cognitive benefits are enormous. Now I'll only revert to plain old JS for VERY small projects where the mild overhead of a framework really doesn't make sense.


For brochure websites, yes. For large projects, hell no.


Well that's great, because you should always use tools that are appropriate for the task.


React will die under the weight of the frameworks and libraries that have duplicated effort and created more confusion and division amongst the developer community.


“React will die”

[Looks at facebook.com]

[Looks back at statement]

[Keeps building in React]


If you don't understand how it works and how it can be combined with tools like Redux to create a highly debuggable and predictable app, it certainly looks like a bunch of newfangled framework stuff for the sake of nothing.


It's made my life infinitely easier writing front-end apps that need to keep state and update when the data changes. Remember the old days of Backbone and Marionette? Remember even older days, having horrible jQuery sprinkled around everywhere?


jQuery is just a library, it never told you how to write code.


Everyone working on JavaScript right now lives in a bubble. Take a look at how back-end languages have evolved; that's your future. You can try to deny it, but you're going to go the way C and Java did. You can cite openness and the web and all these other things that make some kind of sense, but history is against you.

In reality those arguments don't hold water. Our field is 30+ years old and there is some maturity in the way things are done. All this crazy shit of rewriting everything all the time will eventually tire people out, and people will cut out this habit of redoing an entire framework just because.

JavaScript is one of the most important languages around right now and pretty soon the adults are going to be taking it away from the kids. Just as an FYI, this is another example of that. Too many important decisions are happening without much thought about what's there already.

I don't know, maybe I'm wrong and maybe this time will be different, but I've been around long enough. I learned the ins and outs of Angular only to be told "fuck that" shortly after. It all seems very wrong to me, but maybe you can tell me otherwise?



