In school, 25 years ago, I was taught the "circling balls" atomic model, and to me physics seemed kind of boring and a very stable field of science, as if we already knew almost everything.
Nowadays, thanks to superb youtube channels[1], I've learnt that we are rather at the beginning of the journey of solving the mystery of the universe, and that reality is much more exciting than those boring "circling balls"; actually there are no balls at all, just "fields" (which of course is also just a mental model).
> In school, 25 years ago, I was taught the "circling balls" atomic model, and to me physics seemed kind of boring and a very stable field of science, as if we already knew almost everything.
School-level physics is pretty stable. There hasn't been anything radically new in decades. Most things pupils learn are around 100-300 years old, because everything else is too complicated for them and mostly unnecessary. Even the physics I learned at university took several years to reach more modern levels.
I hate the circling ball model because it gives so many people completely wrong intuitions about what is actually happening in the world of particles.
After that kind of education, people tend to imagine particles as little balls traveling through space, bouncing, and occasionally doing something magical that normal balls don't do (like not having a radius or interacting with themselves).
While what actually happens (according to better models) is that a particle is a nebulous object that evolves, moving and reshaping, and when we interact with it with our measurement devices we reshape it and get results as if there were, at a given point in time, some mass with some charge and spin and whatever in some region of space, with some momentum and energy limited to some range. And to guess the similarly vague result of the next interaction with that object, in many cases we can't draw a line and say "the ball flew through there". And the lines we draw, when we can, represent the motion of the whole fuzzy cloud that actually is the particle as it evolves in space.
I think we should teach the model of the atom starting from the orbitals, and treat the classical model of the atom only slightly better than the "raisin model" of the atom, because what it gets right, it gets right only because wave function evolution equations in some very specific cases simplify to classical equations of motion, and we learned those first by observing macroscopic objects, which are that special case of motion.
The image of a p orbital should suffice to explain to people why the circular model is wrong.
The circling ball/shell model is extremely useful and intuitive for high school chemistry though, so changing one without the other will probably cause more harm than good. I am a physicist, and to be perfectly honest, when I have to understand some kind of chemistry, my mind still uses the shell model.
Real chemistry starts with orbitals. Everything below that is just excessive simplification that makes no sense when you go beyond very simple cases (taught in my country at the primary school level).
You really could start with orbitals instead of shells; it would be as simple as the shells but make more sense, and getting familiar with them early would give you the right intuitions for tackling those more challenging cases.
Can you give me an example of a chemical reaction that is more intuitively understood with orbitals than with shells?
One aspect relevant to chemistry I can think of that can be better understood with orbitals is why shells have the size they do, but for that you need to understand Legendre polynomials, which I only learned in my second year of university, I think.
Orbitals tell you why the number of valence electrons is what it is, why bound atoms form certain angles, and why aromatic molecules make sense. They can even tell you why the periodic table is the way it is. [1]
To get introduced to them you don't need to know the math they are ruled by. Just seeing images of that model rather than the shell model gives you better intuitions about what's happening, and the more complex stuff isn't surprising and counterintuitive.
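To give a flavor of what I mean, here's a toy Python sketch (assuming the idealized Madelung n+l filling rule, which real atoms like Cr and Cu occasionally violate) that recovers the shell capacities from orbitals alone:

    # Orbitals (n, l) filled in order of increasing n + l, ties broken by lower n
    orbitals = [(n, l) for n in range(1, 8) for l in range(n)]
    order = sorted(orbitals, key=lambda nl: (nl[0] + nl[1], nl[0]))

    labels = "spdf"
    for n, l in order[:10]:
        print(f"{n}{labels[l]}: holds {2 * (2 * l + 1)} electrons")
    # 1s:2, 2s:2, 2p:6, 3s:2, 3p:6, 4s:2, 3d:10, ... -> the familiar
    # period lengths 2, 8, 8, 18, ... fall out, instead of being memorized.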
It seemed to me as well that my physics teacher's mindset was unchanged, frozen in the classical deterministic mindset of pre-quantum physics.
It was actually believed in the 1800s that theoretical physics was almost solved.
But somehow this very wrong idea prevailed and made it to my school as well.
It's deliberate: you get taught different models of the atom etc. the deeper you get into physics/chemistry. That way you also learn about models and the scientific method and how the advances have been made.
In the same way that Newtonian physics is still useful for a lot of stuff, the circling balls model of the atom can still be useful in some areas.
I don't think the disapproval comes from the process so much as from the outlook that's taught to younger students.
It's presented as a much smaller frame of study than it really is. And often it's treated so mechanistically that, rather than teaching you that understanding is important and will feed into your very worldview, it's presented as something you study if you want to build assembling machines or something.
My and OP's gripes might be more endemic to underfunded midwestern rural schools like the one I attended, but I'm sure it wasn't that rare, unfortunately.
It would have been nice to know there were super advanced and interesting contemporary models. That would be inspiring. Do they even mention quantum mechanics at that level, if just to say, hey, look, this is only the beginning? It was comical discovering in my early twenties that my understanding of physics and chemistry was outdated by a hundred years. Isn’t this the future?! Kids should be walking around the Enterprise complaining about their calculus classes.
A good way to teach this physics to students is to lead them through the ages: teach the old models, then let them discover those models' incorrectness in experiments, then teach them the newly discovered models that more correctly predict reality, and continue.
You can do this with the atom, you can do it with Newtonian physics (and teach relativity as a result). You can do it with the classical physics of matter, then teach the double-slit experiments, and then teach quantum mechanics as a follow-up, etc.
Because this way, the students learn our concept of physics from the ground up: how it was developed, the limitations encountered each time, and the struggle to make further sense of it.
If done right, this leads to a much better understanding than just presenting the latest model.
But that is not "from the ground up", it's just repeating the same mistakes people made in the past. It's putting a lot of effort into learning something, only to be told over and over that it's wrong. That can be very demoralising.
> But the question was, is that actually a good way to teach?
This raises the question: what makes for a good way to teach in the first place? And what is the purpose of the teaching?
In high school, and in first-year university, the teaching is meant to garner a good understanding of the basic concepts - especially in high school. Nobody expects a high schooler to be able to compute forces for real-life applications after having learnt physics.
This is why I claim that teaching the history, and the "incorrect" models that were discovered and corrected throughout history, would give the students a deeper understanding not only of how science is done, but also a deep impression of how to advance their understanding by noticing inconsistencies or incorrect predictions from old models.
Contrast that with just teaching them the "correct" model, without the context, or the history of how such models came to be. It would just be a set of dry formulae, told to the students like gospel.
Sure, but, is teaching the specific mistaken understanding from the past the way to go there, or are there better simplifications that could be used instead?
There may not have to be, as far as we know. From the Wikipedia page[1]:
> Many theoretical physicists believe these fundamental forces to be related and to become unified into a single force at very high energies on a minuscule scale, the Planck scale, but particle accelerators cannot produce the enormous energies required to experimentally probe this. Devising a common theoretical framework that would explain the relation between the forces in a single theory is perhaps the greatest goal of today's theoretical physicists. The weak and electromagnetic forces have already been unified with the electroweak theory of Sheldon Glashow, Abdus Salam, and Steven Weinberg for which they received the 1979 Nobel Prize in physics. Some physicists seek to unite the electroweak and strong fields within what is called a Grand Unified Theory (GUT). An even bigger challenge is to find a way to quantize the gravitational field, resulting in a theory of quantum gravity (QG) which would unite gravity in a common theoretical framework with the other three forces. Some theories, notably string theory, seek both QG and GUT within one framework, unifying all four fundamental interactions along with mass generation within a theory of everything (ToE).
Another related thing I've wondered: similar to how the electroweak force un-unified into the electromagnetic force and the weak force, is there a scenario where the electromagnetic force (or the weak force) un-unifies into some other sub-force? So I guess like the reverse of the GUT.
It is certainly possible. What we think of as a long timescale for the age of the universe may be very short in some future epoch where the fundamental forces further split, just as we see the epoch of electroweak unification as being short.
That isn't quite the same thing. Both electrostatic and magnetostatic interactions are mediated by the same boson (the photon). Whereas each of the four fundamental forces has its own set of bosons (W and Z bosons for the weak force, gluons for the strong force, and the theorized graviton for gravity).
I thought about this as well when I first learned about the unification of forces; it seems like an obvious thing to try to reason about, so let's go!
[Caveat - I'm not a physicist, but]
We know forces unify at higher temperatures/energies, and there is no limit to how high temperature can get (maybe? I guess temperature tracks the mean kinetic energy of the particles rather than their velocity, and kinetic energy is unbounded even though speed can't exceed c, right?). However, the high-energy zone is inaccessible with current technology above a certain level.
But in the other direction, we hit a floor at zero kelvin, which we can (very nearly) access and can therefore test. So we know that going right down toward zero kelvin / electron-volts / joules / meters per second does not split any of the known forces further. This is how we know that they are the fundamental forces, I think.
There are phenomena explained by the particle model, and there are phenomena that are not. This is true of all models, and it's a strong claim that we could eventually land on a "correct" model at all.
To be fair, a "particle" as the term is used in quantum field theory doesn't refer to a billiard ball, it's a perturbation in a "field" and encapsulates behaviour which could be described as wave or particle or neither.
It is currently understood that probing the Planck scale is practically impossible, or merely implausible, depending on your level of optimism.
That scale is the point at which we might gain some level of confidence that we know what "really" is a particle (or its constitution), whether it is truly a point particle or if it is just convenient to assume that at the present time.
Not just practically impossible, it is theoretically impossible. Given Heisenberg's uncertainty principle, observing a position with a certainty of the Planck length would require the probe to have so much energy that it forms a black hole. Since you cannot get information out of a black hole, this makes observations below this scale impossible.
Of course, the scales involved are far beyond anything we have been able to probe. And this argument relies fundamentally on the interaction between gravity and quantum mechanics, even though those theories are famously not compatible. So the 'theoretically' in theoretically impossible is doing a lot of work.
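A back-of-envelope version of that argument (a sketch in Python with rounded constants, assuming the naive formulas even apply at this scale):

    import math

    hbar = 1.055e-34   # reduced Planck constant, J*s
    G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
    c = 2.998e8        # speed of light, m/s

    # Planck length: the scale where the two estimates below collide
    l_planck = math.sqrt(hbar * G / c**3)

    # Uncertainty principle: probing a region of size dx needs energy ~ hbar*c/dx
    E_probe = hbar * c / l_planck

    # Schwarzschild radius of that much energy packed into one spot
    r_s = 2 * G * E_probe / c**4

    print(f"Planck length:        {l_planck:.2e} m")
    print(f"Energy to probe it:   {E_probe:.2e} J")
    print(f"Schwarzschild radius: {r_s:.2e} m")

The black hole you'd create (~3e-35 m) is about as big as the region you tried to look at (~1.6e-35 m), so the measurement defeats itself.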
Are you asking if a photon is a particle or a wave? It is widely known to be something that our brains have trouble understanding, since it behaves as both at the same time. Thus, "shut up and calculate!" as the response when students try to figure out what quantum physics "means."
We used to think that electricity and magnetism were different types of interaction, but it turns out that they are more coherently described as a single “electromagnetic” force.
I'm not very knowledgeable of physics, but my intuition is that a fractional type should just be a type that falls between the characteristics of two integer types. So if A is the electromagnetic force and B is the strong nuclear force, A.5 should be some interaction that exhibits characteristics of both, is fully described by neither, yet exhibits no properties which can't be typified by some combination of the integer endpoints.
Of course that raises the question of "shouldn't A.5 then just be considered its own type?" at which point I suppose we'd have to refer to how these "types" are constructed, which seems more like a mathematical/computational (ontological?) question than a purely physics question. Then, I suppose, the question further resolves to: which assumption makes our equations easier to work with?
> some interaction that exhibits characteristics of both, is fully described by neither, yet exhibits no properties which can't be typified by some combination of the integer endpoints
Let's say I pick electromagnetic and gravity.
If I start using this 'electrogravity' 'half/mixed force' to describe the trajectory of charged objects I'm throwing, it seems like the correct response is not "shouldn't that be its own type?" but "you're just combining two forces for no reason". If you can look at a web of interactions and separate it into two completely independent mechanisms, then there's no fractional force.
And the scenario of throwing a charged ball fully fits "interaction that exhibits characteristics of both, is fully described by neither, yet exhibits no properties which can't be typified by some combination of the integer endpoints"
I appreciate your input. I don't think that's unreasonable.
However, this "theory" I proposed wasn't essentially about physics; it was ontological in nature, encroaching more on the field of CS's type theory as it applies to conceptions of physics.
Except it doesn’t, as there is no actual physics in what you wrote… Without it, there is no more correctness than in any other random string of words.
Don’t worry though, you aren’t alone. There is a thriving community of people coming up with their own “theories” or proving that pi is exactly equal to 3.125.
Listen guy, you're the one that called it a theory to begin with. I didn't make any claims, and you're right that there's no physics in what I wrote: I was interested in the ontology of fractional types, however and wherever they might be applied.
Do _you_ have an actual point to make, or are you being mean for the sake of it? I would sincerely like to engage with you, if you're quite done accusing me of trying to trisect the angle.
The act of coming up with hypotheses and then testing them is kinda how knowledge is generated, though. And often it can be effectively done by people who aren't thoroughly knowledgeable in a subject, especially as a means to becoming so.
The problem here is the conflation of "mathematical model" and "theory" in physics, which doesn't hold in general language. Physicists have gotten stuck on the idea that the math doesn't just predict reality, it "is" reality. In truth, a mathematical model can be perfectly correct and also nonsensical, just look at any neural network. A theory should really be a philosophical/ontological thing, and a model should be a mathematical thing.
But isn't that exactly the process by which one obtains knowledge about a subject? I.e., I don't know much about X, I propose theory Y that may explain part of it, I test theory Y to see if it holds, and if it does, I now know more about X, namely that Y explains it.
Sorry, but physics is more than just words and “intuition.” There is a common belief among crackpots that scientists and mathematicians use math because they just want to obscure things.
John Baez is actually quite level headed, one of my idols, and I see nothing wrong with that list.
However OP wasn't even asserting anything of a discovery or a revolution. He actually ended his post by asking a question: "Then, I suppose, the question further resolves to: which assumption makes our equations easier to work with?"
Any decent man would have explained to OP how the best fitting generalizations/abstractions in mathematics and physics also fit the most specific cases. Instead we had a gatekeeper put OP down like a wild fox in a henhouse. It's shit like that that makes humanity stink of arrogance and petulance. It's people like that that discourage positivity while dispersing platitudes. Their subscription to authority has no basis in merit; it is exactly like John Baez says: "Crackpot Index #9) List your credentials"
This is what epistemic status tries to make explicit. There’s a difference between serious conjecture and playful speculation. The former, I think, involves claiming an amount of status for oneself. People view the latter as the former and attack speculators for having insufficient status. It’s a shame, because speculation can be a useful learning activity/opportunity.
> the best fitting generalizations/abstractions in mathematics and physics also fit the most specific cases
Out of curiosity, how does this differ from the criterion of choosing the assumption which makes the equations easier to work with?
I feel as if they are the same, though perhaps lacking a careful qualification on my part: "which assumption makes our equations easier to work with (without introducing incorrect solutions)?"
Given the choice of abstractions, assuming each abstraction is as apt as the next and none of them is "more wrong" than any other, wouldn't the choice of abstraction come down to ease? (Or aesthetics, possibly.)
I'd love your take, and doubly so if I seem to be coming at this backwards.
Yes, if there is no difference, then we'd likely use the one that is easier or more aesthetic. The heuristic would differ depending on the discipline and the culture of the science.
For instance, matrix mechanics is equivalent to the Schrödinger wave formulation, but it did not catch on, for reasons listed here.
Another system that did not catch on is nonstandard analysis, which uses infinitesimals. It is equivalent to the standard-curriculum analysis that uses limits.
It's hard to pinpoint exactly why these systems weren't chosen. It's not just aesthetics or ease of use. It's a bit of arcane history too. Nonstandard analysis took a while to make rigorous. And by that time, standard analysis had already enveloped the "cult of science." Once standards are set, they rarely change if the current methods are "good enough." I've always sought intuition with everything, so I know about these alternative methods.
I find it sad that so many in academia resign themselves to symbol pushing without real understanding, and then pass the same failings on to their pupils. If methods do exist to achieve better intuition, then we should promote them. Often, alternative yet equivalent formulations do provide that intuition.
HN is a bit of a cult of personality itself, but ironically, about every month on the dot, a submission gets front-paged about geometric/vector algebra or quaternions, and how they simplify and clarify the intuition behind 3D transformations.
Yet the same HN has curmudgeon gatekeepers that also pop up like clockwork in any science thread, just to make sure all lines of thought correlate to the rote symbol pushing they learned. They didn't gain intuition, so they must feel that it's either impossible to, or that no one else has the right to intuition either.
You did, thank you for the discussion! I actually have a first edition of Keisler's "Elementary Calculus," well-worn I assure you; throughout my mathematics degree, I sought various other methods of approaching the curricula. I'm very much a fan of Robinson's program of infinitesimals; occasionally, I'm treated to even more rarefied notions of these beloved ghosts of departed quantities.
Here's one I find particularly fun, wherein you disregard the law of excluded middle to use nilpotents!
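For a computational taste of those nilpotents, here's a toy Python sketch (dual numbers, where eps^2 = 0; this is the computational shadow of SDG's nilpotent infinitesimals rather than SDG itself) that makes derivatives of polynomials pop out exactly:

    # Toy dual numbers: a + b*eps with eps**2 == 0
    class Dual:
        def __init__(self, a, b=0.0):
            self.a, self.b = a, b  # value, infinitesimal coefficient

        def __add__(self, other):
            other = other if isinstance(other, Dual) else Dual(other)
            return Dual(self.a + other.a, self.b + other.b)

        __radd__ = __add__

        def __mul__(self, other):
            other = other if isinstance(other, Dual) else Dual(other)
            # (a + b*eps)(c + d*eps) = ac + (ad + bc)*eps, since eps^2 = 0
            return Dual(self.a * other.a, self.a * other.b + self.b * other.a)

        __rmul__ = __mul__

    def f(x):
        return x * x * x + 2 * x   # f(x) = x^3 + 2x, so f'(x) = 3x^2 + 2

    y = f(Dual(2.0, 1.0))   # evaluate at x = 2 + eps
    print(y.a, y.b)         # 12.0 14.0 -> f(2) = 12, f'(2) = 14

No limit process anywhere: the derivative just rides along in the eps coefficient.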
I'm currently trying to digest "Geometric Algebra for Computer Science: An Object-Oriented Approach to Geometry." It is a real pleasure to see you mention some of my pet favorites; thank you, thank you, a hundred times over friend! You've put a song in my heart today.
EDIT: I was so wrapped up in having met a fellow traveler that I forgot to leave an "in," should you wish to continue this thread.
I also find it a tragedy that my foray into the constructivist and sundry corners of math was relegated to self-study. Having approached a few professors on this topic, the general consensus seems to be "why waste your time?"
To what do you think the intuitive power of these equivalent-but-alternative foundations is owed? Personally, I think it has something of the basis that colors the division of the analytically- vs algebraically-minded; yet even the cause of this is a mystery!
(However, let me be careful here: I don't wish to give the impression of undue competence. I'm broadly-read, but woefully underskilled.)
Thank you for sharing SDG. I haven't heard it mentioned in many years. The simplification power it provides is beautiful! [1]
I'm elated over having delighted and filled you with music. Your kind words have made me beam from ear to ear as well :-)
I think the intuitive power of these other formulations comes from the spatial/geometric imagery that these disciplines naturally provide. Another aspect is the conversion of unwieldy processes into simpler objects. Like nonstandard analysis replacing the limit process with the infinitesimal object.
The paradigm of replacing large processes with things that can be intuited might not seem very profound if, as programmers, we bring up first-class functions. A function is just an object anyhow! True, but the human mind seems to be less efficient at composing functions than composing objects. So though it's all interchangeable, we seem to work better when we are given mentally ergonomic foundations.
Back to what you said in your previous posts about the ease of manipulation in these alternative theories: it probably borrows a bit from this exchange of complicated processes for simpler things.
No, I'm saying that griping about how things in social land are polarized isn't going to get us off this planet. If you have the brains to write these things, you have the brains to work on a solution that doesn't give a fuck about what some douchebag poser thinks about how reality works.
I'm not conversant enough with physics to do more with this article than say "wow, that's cool!" and let my mind run into all sorts of fun science-fictional speculation... but wow! Fractals are cool, and fractals in _physics_ are _very_ cool!
Good question. Consider the interesting fact that there are discrete energy levels instead of a continuum of energies for an electron in an orbital. This was a profound discovery at the time because it does seem like things should be more analogue than digital :)
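Concretely, for hydrogen the allowed bound-state energies come in a discrete ladder (a quick sketch using the textbook Bohr/Schrödinger result):

    # Hydrogen bound-state energies: E_n = -13.6 eV / n^2, nothing in between
    for n in range(1, 5):
        print(f"n={n}: {-13.6 / n**2:6.2f} eV")
    # n=1: -13.60 eV, n=2: -3.40 eV, n=3: -1.51 eV, n=4: -0.85 eV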
I think if there were a spectrum of interactions between the two forces, the two forces joined by this spectrum would be considered one force. Of course, there is a history of forces that once appeared to be distinct turning out not to be distinct, but one and the same force instead. There is a line of thinking that speculates that all four known forces are to be joined in one.
> Each neutron in an atomic nucleus is made up of three elementary particles called quarks.
Well, plus tons and tons of virtual particles popping in and out of existence. Only a little over 1% of the mass of a neutron is the three quarks normally listed.
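A quick sanity check on that number, using rough current-quark masses (values approximate, PDG-style figures):

    # Neutron = u + d + d; all masses in MeV/c^2
    m_u, m_d, m_neutron = 2.2, 4.7, 939.6
    quark_sum = m_u + 2 * m_d                # ~11.6 MeV
    print(f"{quark_sum / m_neutron:.1%}")    # ~1.2% -- the rest is binding energy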
While they are real, the name is a little misleading. Virtual particles are not quite the same kind of disturbance in a field as full-blown, bona-fide, certified particles.
They're two different kinds of objects that come out of the equations, and they happily interact or turn into each other sometimes.
But "virtual particle" doesn't mean they're not real, and also doesn't mean they're particles :)
Disclaimer: I'm ignorant, and this is my summary of Matt Strassler off the top of my head. Mistakes are mine.
Virtual particles are the universe's equivalent of local variables inside a function. Parameters go in, results come out, but what stays in that local context still existed even if it didn't have global scope.
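Something like this, just to make the scoping metaphor concrete (illustrating the analogy only, not the physics):

    def interact(p_in):
        # 'virtual' exists only inside this scope -- like an internal line in a
        # Feynman diagram, it is never observed but it shapes the result
        virtual = p_in * 0.5
        return p_in + virtual

    print(interact(4.0))  # 6.0 -- the caller sees the effect, never 'virtual'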
I realize the above is an analogy, but can someone more informed than me confirm the correctness of the above description? Because I find it to be a particularly good one.
I can think of some analogous ways to recover some semblance of the analogy, but it's not really a great one.
A virtual "particle" is really more of a "disturbance" in a field: think splashing in a pool vs a wave. And in the context of the contents of a proton, imagine trying to differentiate a well-behaved 1hz wave from the other noise in the liquid, when the liquid is in a blender.
edit to add:
To recover the local/global variable analogy, imagine the compiler was continuously recompiling the code, inlining (or undoing that inlining), hoisting variables out of loops (or inserting them back), while you try to determine where a particular temporary value is being stored. Again, it's not a _great_ analogy, because I have a hard time determining how understanding the analogy would lead you to correct conclusions about virtual particles, but yeah.
Just as real as the concepts of particles at all. They're just really likely to recombine because they form in pairs of opposite charges and will attract and annihilate. Hawking radiation is adding another force with a gradient (extremely strong gravity near an event horizon) that'll absorb one of the particles of the pair, and then the other flies off into space as the real, bona fide particle it always was.
> Only a little over 1% of the mass of a neutron is the three quarks normally listed.
Pardon my layman ignorance. When the particles pop in and out of existence, how does the mass manage to remain the same? Or does the mass keep changing, and is it not a fixed quantity but rather a range?
The mass of the other 99% comes from the (kinetic?) energy that the quarks have as they come together to form a neutron. This system has some energy, like a rope moved by a standing wave. A standing wave is a sum of waves traveling back and forth, sometimes adding up and sometimes cancelling each other into nothing.
Gluons pop up out of nothing to carry this energy that the struggling bound quarks have, and they cancel back into nothing, passing their energy back to the quarks.
If you pump even more energy into this system (for example, by smashing something into it), even something like a new quark-antiquark pair can pop into existence, and one of the original quarks might fly off with one of the new particles to form a separate meson. The other particle from the created pair stays and changes the identity of the three-quark particle, so it's no longer a neutron but some other three-quark particle.
It's a confusing force. When two particles interact via love they possess this force, which always equates to 1 LUV. The tragedy is that it's unstable and one always has more than the other; a 50:50 split is rarely seen. It's even possible that during an interaction one has all the love and the other none, and they still bind together in a way.
It’s been a while since my last physics lectures, so this might be wrong, but the way I understand it:
We don’t have a good model of quantum gravity yet but our best guess is that the force carrier of gravity might be a particle called graviton. This hypothetical particle has no mass and therefore the length scale of gravity would be infinite. This matches the Newtonian and the general relativity model of gravity.
This is different from the source of gravity which would be the (gravitational) mass of an object (or more accurately the components of the stress energy tensor which describe the density and flux of energy but that’s also the point where I have to start with the hand waving because my knowledge becomes very fuzzy there)
It’s also true for the electromagnetic interaction: the force carrier here is the photon which is also massless and the length scale is also infinite here.
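The usual back-of-envelope link between carrier mass and range is the Compton wavelength, lambda ~ hbar/(m*c). A rough sketch with rounded values:

    # Range of a force ~ hbar / (m c); with hbar*c = 197.327 MeV*fm this is just
    # hbar*c divided by the carrier's rest energy
    hbar_c = 197.327   # MeV * fm
    m_W = 80_379       # W boson rest energy, MeV
    print(f"weak force range ~ {hbar_c / m_W:.4f} fm")   # ~0.0025 fm
    # Massless carrier (photon, hypothetical graviton): rest energy -> 0, range -> infinity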
That is the big question in physics today! There is the hypothetical "graviton" which, as you should expect, has zero mass. There are theoretical problems due to the poorly understood interaction of general relativity and quantum mechanics -- if you could resolve those problems you would have a "theory of everything."
Pardon my following likely idiotic statement from a layperson, but the surprising matter of the charge distribution within the neutron suggests that the nuclear "weak force" is actually just electrostatic force. So that is to say, if the neutron is slightly negative near the surface, what if that's the entire basis of the force that makes neutrons able to bind protons in the atom nucleus. We would expect this to be a kind of dipole whose field drops off rapidly with distance.
First of all, the nucleus is held together by the strong nuclear force, not the weak nuclear force (the weak force causes certain kinds of decay, such as a neutron to decay into a proton and electron).
Secondly, as soon as you have multiple protons, the repulsion from the positive charges will be much, much greater than any dipole attraction.
Finally, physicists have done a lot of really precise measurements with subatomic particles, and I don't think a dipole interaction like that would match the observed results.
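A back-of-envelope version of the second and third points (rounded constants; the 8 MeV figure is the typical binding energy per nucleon):

    # Coulomb energy between two elementary charges at nuclear separation
    ke2 = 1.44          # e^2 / (4*pi*eps0), in MeV*fm
    r = 2.0             # typical nucleon separation, fm
    print(f"charge-charge energy at {r} fm: {ke2 / r:.2f} MeV")   # ~0.7 MeV
    # Typical nuclear binding is ~8 MeV per nucleon, and a charge-dipole term
    # falls off as 1/r^2 instead of 1/r, so electrostatics is at least an
    # order of magnitude too weak (and too short-ranged) to bind nuclei.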
Actually the relationship between the weak nuclear force and the electromagnetic force is well understood. It would have to be related, given that the weak interaction is the cause of beta decay, which results in a neutron becoming a proton (which has a positive electromagnetic charge) and the creation of an electron (which has a negative charge). The force carrying particles of the weak force can also have electromagnetic charge, further demonstrating the connection between the two phenomena.
You can reason about all the forces being related in some way. For example, beta decay happens in free neutrons, but does not happen in a helium-4 nucleus; there should be a connection with the strong nuclear force that explains this. In some sense the fact that we can observe or measure an interaction implies that it must be related to other forces, since our ability to make an observation is itself dependent on such connections (in the end you need some electromagnetic effect that your eyes can perceive).
If this came from almost any other group it would be an easy write off. But NIST really does have some of the best analytical chemists and physicists around.
Headline is misleading. The "new details" are of the sort "if there is a fifth force, you won't find it over here."
> The scientists’ results improve constraints on the strength of a potential fifth force by tenfold over a length scale between 0.02 nanometers (nm, billionths of a meter) and 10 nm, giving fifth-force hunters a narrowed range over which to look.
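For context on what "strength over a length scale" means: fifth-force searches are conventionally parametrized as a Yukawa-type addition to the known potential, and limits are quoted as a maximum coupling alpha for each assumed range lambda. A sketch (the numbers below are placeholders, not the paper's values):

    import math

    def fifth_force_factor(r, alpha, lam):
        """Fractional correction in V(r) = -(G*m1*m2/r) * (1 + alpha*exp(-r/lam))."""
        return alpha * math.exp(-r / lam)

    # e.g. a hypothetical coupling probed at r = 0.5 nm with range lambda = 1 nm
    print(fifth_force_factor(r=0.5e-9, alpha=1.0, lam=1e-9))  # ~0.61 * alpha

Tightening the constraint means shrinking the allowed alpha across that 0.02-10 nm window of lambda.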
This is not surprising, and it would be possible to believe this sort of thing from a variety of qualified groups.
I’m not sure what’s misleading about it. “If this force exists, you’ll find it in this range” seems to be a valuable detail to know when searching for this force.
It feels to me like this is very similar to the trend of only caring about positive experiment results and thinking negative experiment results aren't interesting. But they are! Negative results are useful and give us information! And are often crucial contributions toward positive results from later experiments.
> the trend of only caring about positive experiment results
Probably starts in school. Negative results are just a loss of marks rather than a potential point of interest. Even if the lab report stated that the results were unexpected and gave possible reasons, it was an automatic fail. Never was it considered, at least at my alma mater, that a negative result reasoned about might actually be interesting on its own and worth the time. Since ain't nobody got time for that, said trend will probably continue for a long time.
Yes, technically "it's not over there" is a "new detail about a possible fifth force", but it's not exactly what first jumps to mind, is it?
Consider "New details about a possible Game of Thrones book release date" being a similarly unsatisfactory headline if the article content is "it's not in the next 12 months". Technically true, that is a new detail, but is it really what the headline implies?
I think that's not the best example since it is assumed this book is eventually coming out. Maybe if it was "New details about a possible Game of Thrones seasons 5-8 do-over" essentially being "no plans in the next 12 months".
A constraint on the domain and range of values is a detail. Restricting the possible range of action for a force is certainly something physicists look for.
It absolutely is very interesting science, but the headline is arguably misleading. It's like saying "we have learned new things about where the body is buried" after searching an area and not finding the body: it's absolutely true, but misleading.
No, it isn’t misleading at all. Narrowing a constraint is learning something new. It could very well have gone the other way. Or in a way completely unexpected.
It is misleading because it suggests that they have found new signs, previously unknown, that such a force might exist, rather than new details that demote it from "possible with certain limitations" to "possible with narrower limitations".
This is presumably why dexwis seemed to think that it was so remarkable and that it might easily be written off as too-fantastic.
Isn't this pretty much the description of how the Higgs boson was found? Fermilab kept finding where it wasn't, narrowing in on where it could be, so that the first time the LHC looked, it pretty much looked directly at it. Just because an experiment doesn't find proof of something doesn't make it worthless; the proof of where it is not is also valuable.
From what I read in this article, they didn't prove that the force exists at all; they actually showed that it wasn't present in several ranges. This helps other people who are doing experiments in the area by cutting down the range of sizes they need to look in, but it doesn't provide evidence for its existence.
Indeed, and in addition, their research programs focus on improving the science and technology behind making measurements. So they have the best metrologists around too.
It makes me wonder how many things are still "hidden" from our understanding. Since silicon is widely used in our modern society, we have spent plenty of time studying its properties. What about other materials? There is potentially a "universe" in everything, and that is incredible.
I understand the interest in a fifth force is to explain the dark matter anomaly discovered by the astronomer Vera Rubin. The orbital velocities of stars at increasing radii from the center of a galaxy can't be accounted for by the current understanding of gravity and visible matter.
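To illustrate the anomaly (a sketch with an illustrative enclosed mass; real curves are measured, not computed like this): Newtonian gravity predicts orbital speeds falling as 1/sqrt(r) outside the visible mass, yet the observed curves stay roughly flat.

    import math

    G = 6.674e-11      # m^3 kg^-1 s^-2
    M = 1e41           # enclosed visible mass, kg (illustrative, ~5e10 suns)
    KPC = 3.086e19     # meters per kiloparsec

    for r_kpc in (5, 10, 20, 40):
        v = math.sqrt(G * M / (r_kpc * KPC))     # Keplerian prediction
        print(f"r = {r_kpc:2d} kpc: v ~ {v/1000:.0f} km/s")
    # Prediction falls ~208 -> 74 km/s; Rubin's measured curves stay ~flat.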
(In Ancient and Medieval philosophy) Ether, the fifth and highest essence or element after earth, air, water, and fire, which was thought to be the constituent matter of the heavenly bodies and latent in all things. [C15: via French from Medieval Latin 'quinta essentia', the fifth essence]
The headline is misleading. The work tightened the range of possible strengths of a fifth force by a factor of 10. In other words it ruled out the existence of a fifth force within a wide range of parameters that were previously open.
Note that the original article's headline is "Groundbreaking Technique Yields Important New Details on Silicon, Subatomic Particles and Possible ‘Fifth Force’" (which is too long for HN), which makes it clear that the focus is on the technique. It's the abbreviated HN title that sort of makes it sound as though the focus is on the force.
It seems like the title length constraint on HN is a bit too limiting and often results in these types of clarifying comments.
Couldn’t we expect more accurate and higher quality titles by relaxing the length constraint? I’m sure it’s been discussed before here, but I’m struggling to think of downsides from such a change.
Having short titles where you can't cram a lot of information is a feature IMO, since it means that you generally have to actually open the link and see what it's actually talking about.
I suspect (without having any evidence) that the longer the titles are, the more likely you are to end up with people commenting without having even bothered to open the link.
Which, hypocritically, is what I just did, but admittedly this comment isn't about the article itself.
In my view, the primary function of the title is to help a viewer understand whether the topic is of interest to them and therefore it is worthwhile to click the link. A more-informative title will better serve this critical function.
If some HN participants are prone to making off-base comments based solely on a title, let's address that directly (e.g. via guidelines + voting), rather than by nerfing the titles. Otherwise we're just throwing up our hands and saying "this is why we can't have nice things". I'd rather work toward having the nice things.
Incidentally, it's not just (or even primarily) about length; an article's "native" title often makes sense only in local context, and thus does not communicate well when seen out-of-context on the HN front page.
It’s good to make HN users dig for information. HN is designed to gratify intellectual curiosity; it’s the same reason TL;DRs are tolerated but discouraged. Dan has written about this many times over many years — I imagine he might drop by with a few persuasive references.
You’d be right, by the way, if HN were designed solely for growth. It might even have been good in this one case, too. But it’s designed to spark curiosity, which is a much more delicate thing. Most long titles are long because they’re noisy. They don’t usually add precision.
I'm honestly baffled by this argument. Do you click through 100% of the links that reach the front page? More to the point, do you expect/desire that most HN readers should do so? Presumably not; so, we must expect that readers -- no matter how curious -- are exercising some decision algorithm. Why deliberately cripple the input to that algorithm? Shouldn't we trust readers to exercise curiosity even when provided with a good descriptive title?
It's fine to be baffled. For what it's worth, I was also skeptical for a long time.
Mistaken titles should be corrected. Clickbait titles should be reduced. But that's not what's being argued here.
What's being proposed is to change a longstanding community rule. Such things are known to happen, but you have to be careful about doing so. It's almost irresistible to propose changes. I recently proposed one too: that all links in selfposts should be clickable. I still feel that was a decent suggestion.
But we don't have the experience or the information to see all the possible implications of such proposals. Having been on Ye Olde HN for... 2021 minus 2007 years, I think the central question is whether the rule breaks down at scale. Because the title length limit has been in place since 2007, the only reason to change it is that it no longer works, presumably due to HN scaling.
My skepticism alarms go off at such proposals. The clickable links in selfposts are a decent example of a proposal that seems to fit: no one had clickable links in 2007, even for posts from YC. But as HN scaled, that changed. Presently, most posts that make it to the front page get clickable links, so those that don't feel like obvious outliers – the shunned posts. Why not let everyone participate in a fair way?
The title length proposal is different. It's true that it might make some posts more accurate, like this one. But it's also true that a sufficiently creative person can pack a lot of information into 80 characters. Are you sure it's a good idea to change such a longstanding rule, especially when there's no pressing need to do so? Doing nothing is often the best course of action when running something – look how freenode turned out.
The point is, each proposal like this needs to be carefully thought through. It might seem entirely obvious that it's a good idea, much like the selfpost proposal seems like a good idea to me. But we should try to feel skeptical - how much money would you wager that your proposal won't go wrong? Would you place $50k on it? I wouldn't.
But we're asking them to bet far more than $50k on each change like this, because HN is literally the key to YC's power. It always has been. That's why I'm not too bothered if things stay mostly the same – there have been a lot of changes since 2007, but the substance of the site (such as the 30 link limit on the front page) has remained the same.
In fact, one could ask oneself "Why not show more links on the front page? After all, 40 would be more informative than 30." Many of your same arguments would apply. Yet there are subtle but important reasons not to.
Maybe a different limit for links versus "ask HN"s?
External links have some sort of constraint, weak as it may be. The limit tends to force people to editorialize rather than focus their thoughts toward concision.
The front page is a treasure. Many redesigns are attempted and they never work (for me, YMMV). Don’t jiggle what obviously works and everyone is very used to, unless you test the heck out of it.
I read the short headline, and the top comment gave me the clarity I need. As is often the case I don’t need to read the article.
You know, I just quickly zoom with a double tap and nail the UX element.
We have better elements to use, but I must also say those tend to come with costs.
Right now, HN is so lean, fast and clean, I will gladly work a little to vote or do some action in return for what is otherwise one of the best "just read the discussion" presentations on mobile. It's a pleasure.
People abuse the code formatting for block quotes. It looks bad on mobile, but it also looks bad on desktop, because it's a fixed-width font, which is not what you want for a quote. The solution is not to change the style – it's for people to stop using code style for quotes! (The style does actually seem to have changed on mobile in the last year or so: I think it wraps now, whereas I think it used to have a horizontal scroll bar.)
To quote something else, manually insert a greater than symbol at the start of every quoted paragraph,
> Like this
Which obviously doesn't indent nicely but is perfectly clear, and works well with HN's low formatting style.
Think for a moment why an app should ever be considered necessary to render a web site. Especially one with little in the way of demanding UX requirements.
Well there isn't a mobile stylesheet. It's the same as the desktop styling (and that code block issue affects desktop too).
There are multiple issues with using the desktop styling, but most are related to Fitts's law[1].
The whole UI is terrible for finger interactions, but the best example is the tiny upvote/downvote buttons immediately above/below each other, well within the diameter of a normal finger. It's literally impossible to use that without zooming, and if you try, then there is no way to know whether you voted up or down. It should be used in textbooks as an example of how not to do a mobile interaction.
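For the record, Fitts's law says pointing time grows with the index of difficulty, ID = log2(D/W + 1). A quick illustration with made-up pixel sizes:

    import math

    def fitts_id(distance, width):
        # Shannon formulation: index of difficulty, in bits
        return math.log2(distance / width + 1)

    # Hypothetical numbers: 300 px of thumb travel to a ~10 px vote arrow,
    # vs. the same travel to a 40 px touch target
    print(f"{fitts_id(300, 10):.1f} bits")   # ~5.0 bits: slow and error-prone
    print(f"{fitts_id(300, 40):.1f} bits")   # ~3.1 bits: noticeably easier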
> It's literally impossible to use that without zooming
Let's keep the discussion honest. I have a high DPI phone screen, I'm at the normal text size for my device, and I regularly use those buttons without zooming with reasonable accuracy.
The hyperbole is not warranted here. While I agree touch targets could be bigger they are certainly not impossible to use reliably.
There is nothing to disappear. There isn’t a fifth force to within experimental precision. TFA is about an increase in that experimental precision which further reduces the possibility (or at least the magnitude) of any yet undiscovered “fifth force.”
Let's say that they've constrained this hypothetical force to have a strength less than 5 doodads.
And then let's say some theoretician comes up with evidence that if a 5th force exists, in order to be consistent with the laws of physics, it must have a strength of more than 6 doodads, for instance.
With the first bit of evidence, we've managed to rule out the existence of a fifth force, which we wouldn't have been able to do if that evidence didn't exist. There is no way to use that piece of evidence to rule in the existence of a fifth force.
“A vastly improved understanding of the crystal structure of silicon, the ‘universal’ substrate or foundation material on which everything is built, will be crucial in understanding the nature of components operating near the point at which the accuracy of measurements is limited by quantum effects,” said NIST senior project scientist Michael Huber.
It's not literally false, but I got an inaccurate impression based on the headline. To me it implied a positive discovery of evidence rather than a ruling-out.
If they'd said "determined new constraints on a hypothetical 5th force" or something I would have gotten a correct impression.
[1] E.g. https://www.youtube.com/user/TheScienceAsylum