It’s a motte and bailey thing, where “it costs nothing to be nice” is the motte and “you’re a nazi” is the bailey. The people in the motte aren’t necessarily the same as those in the bailey, but this is roughly what has been happening for the last 15 years. It’s like jaywalking. You keep pushing new offenses down the treadmill, so you can selectively persecute anyone, because nobody is gonna keep up with all of them.
It’s nice to be educated about common misconceptions and potentially hurtful language, but unfortunately society doesn’t have enough slack right now to be sufficiently chill about it and not devolve into tribal bullshit over it. At least that’s the concern.
What? When did I call anyone a nazi? Whatever you perceive to have happened over the last 15 years has nothing to do with my comment. I said what you yourself said: let’s recognize what is potentially hurtful.
That used to be called CIS. When I was at Harvard they still took the algorithms class. Yeah, it was the ezmode version, but it still covered stuff like how to build a hash table and a trie.
I knew instantly, as soon as I saw the image: it's the Magic Eye trick, the same one you use to view stereoscopic images. The differences just flicker.
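For the programmatically inclined, here's the same "spot what flickers" idea as a pixel diff, a toy sketch with "images" as 2D lists (purely illustrative, not how the eye trick actually works physiologically):

```python
# Overlay two near-identical "images" and report the cells where they
# disagree -- the programmatic analogue of the spots that flicker when
# you cross-view the pair.

def diff_pixels(img_a, img_b):
    """Return (row, col) coordinates where the two images differ."""
    return [
        (r, c)
        for r, (row_a, row_b) in enumerate(zip(img_a, img_b))
        for c, (pa, pb) in enumerate(zip(row_a, row_b))
        if pa != pb
    ]

a = [[0, 0, 1],
     [1, 0, 0],
     [0, 1, 0]]
b = [[0, 0, 1],
     [1, 1, 0],  # one pixel changed
     [0, 1, 0]]

print(diff_pixels(a, b))  # -> [(1, 1)]
```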
Most tests for potential are easily gamed by people who are taught how to pass the test, or simply avoided by people whose wealth and social status allows them to avoid the test.
For example: When I was 18 I was completely overlooked by the NFL because I had never played gridiron football. Had I been coached professionally for 10 years I may have been a star.
I sat in an interview for an army officer scholarship once, acutely aware that the man testing me had an accent that made it clear he was from a higher social class than me. He mentioned that I was not properly prepared for the meeting, but I had been given no notes on what to prepare. I was told later that the private schools that feed the majority of candidates to this route coach their pupils specifically for this test.
So I would like to hear of a test for potential that is not easily gamed by wealthy people.
One thing that many mathematicians today don’t think about is how deeply intertwined the field has historically been with theology. This goes back to the Pythagoreans at least.
That survives in the culture of mathematics where we continue to see a high regard for truth, beauty, and goodness. Which, incidentally, are directly related to logic, aesthetics, and ethics.
The value of truth in a proof is most obvious.
The value of aesthetics is harder to explain, but there's no denying that it is in fact observably valued by mathematicians.
As for ethics, remember that human morality is a proper subset thereof. Ethics concerns itself with what is good. It may feel like a stretch, but it's perfectly reasonable to say that for two equally true proofs of the same thing, the one that is more beautiful is also more good. Also, obviously, given two equally beautiful proofs, if only one is true then it is also more good.
> That survives in the culture of mathematics where we continue to see a high regard for truth, beauty, and goodness
As a non-mathematician, I've noticed this as well, and I have a suspicion the historical "culture" is holding the field back. Gödel proved there are an infinite number of true arithmetic statements unprovable within any (consistent, sufficiently powerful) formal system. But our "gold standard" formal system, ZFC, has about as many axioms as we have fingers — why is finding more axioms not the absolute highest priority of the field?
We struggle to prove facts about Turing machines with only six states, and it's not obvious to me that ZFC is even capable of resolving all questions about the behavior of six-state Turing machines (well, specifically just ZF, as C has no bearing on these questions).
Yet Turing machines are about as far from abstract mathematics as one can get, because you can actually build these things in our physical universe and observe their behavior over time (except for the whole "infinite tape" part). If we can't predict the behavior of the majority of tiny, deterministic systems with ZFC, what does that say about our ability to understand and predict real-world data, particularly considering that this data likely has an underlying algorithmic structure vastly more complex than that of a six-state Turing machine?
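To make "observe their behavior over time" concrete, here's a minimal simulator sketch. The machine encoded below is the standard 2-state busy beaver (it halts after 6 steps leaving four 1s); the point is that for 6-state machines, running with a step budget is often all we can do, since proofs of (non-)halting are out of reach:

```python
# Minimal Turing machine simulator with a step budget.

def run(delta, max_steps):
    """Run a TM given as {(state, symbol): (write, move, next_state)}.
    Returns (steps_taken, ones_on_tape) on halt, or None if the budget
    runs out -- in which case the halting status remains unknown."""
    tape, pos, state = {}, 0, "A"
    for step in range(1, max_steps + 1):
        write, move, state = delta[(state, tape.get(pos, 0))]
        tape[pos] = write
        pos += move
        if state == "H":  # halt state
            return step, sum(tape.values())
    return None

# The classic 2-state busy beaver.
BB2 = {
    ("A", 0): (1, +1, "B"), ("A", 1): (1, -1, "B"),
    ("B", 0): (1, -1, "A"), ("B", 1): (1, +1, "H"),
}

print(run(BB2, 100))  # -> (6, 4)
```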
More formally, my complaint with the culture of mathematics is:
1) We know that for any string of data, I(data : ZFC) ≤ min(K(data), K(ZFC)) + O(1)
2) K(ZFC) is likely no more than a few bytes. I think the best current upper bound is the description length of a Turing machine with a few hundred states, but I suspect the true value of K(ZFC) is far lower than that
3) Thus K(data) - K(data | ZFC) ≤ K(ZFC) + O(1), i.e., at most "a few bytes"
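Spelling out how (3) follows from (1) and (2): by Levin's symmetry of information, the algorithmic mutual information can be written two ways, and the second form bounds it by K(ZFC). A sketch, using the same symbols as above:

```latex
I(x : y) \;=\; K(x) - K(x \mid y) + O(1) \;=\; K(y) - K(y \mid x) + O(1)
```

Taking \(x = \text{data}\), \(y = \text{ZFC}\), and using \(K(\text{ZFC} \mid \text{data}) \ge 0\):

```latex
K(\text{data}) - K(\text{data} \mid \text{ZFC})
  \;=\; K(\text{ZFC}) - K(\text{ZFC} \mid \text{data}) + O(1)
  \;\le\; K(\text{ZFC}) + O(1)
```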
Consider the massive amounts of data that we collect to train LLMs. The totality of modern mathematics can provide no more than a few bytes of insight into the "essence" of this data (i.e., the maximally compressed version of the data). Which directly translates to limited predictability of the data via Solomonoff induction. And that's in principle — this doesn't even consider the amount of time involved. If we want to do better, we need more axioms, full stop.
One might counter, "well sure, but mathematicians don't necessarily care about real-world problems". OK, then apply the same argument to the set of all arithmetic truths. Or to the set of unprovable statements in the language of a formal system (that are true within some model). That's some interesting data. Surely ZFC can discover most "deep" mathematical truths? Not very likely. The deeper truths tend to occur at higher levels of the arithmetic hierarchy, and the higher in the hierarchy, the more interesting the statement. And these are tiny statements, too: ∀x ∃y ∀z [...]. We're already in trouble, because ZFC can only decide a small fraction of the Π_2 statements that can fit on a napkin, and it drops off very quickly at higher levels than that. Again, we need more axioms.
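To illustrate the Π_2 shape ("for every x there exists y ...") concretely, take the twin prime conjecture as the standard example (the bound 200 and the search cap here are arbitrary, just for illustration). Computationally, all we can ever do is verify finitely many instances; deciding the statement itself requires a proof:

```python
# Twin prime conjecture, a Pi_2 statement: for every x there exists a
# prime p > x with p + 2 also prime. We can only check instances.

def is_prime(n):
    if n < 2:
        return False
    d = 2
    while d * d <= n:
        if n % d == 0:
            return False
        d += 1
    return True

def twin_above(x, cap=10**6):
    """Smallest p > x with p and p+2 both prime, searching up to cap."""
    for p in range(x + 1, cap):
        if is_prime(p) and is_prime(p + 2):
            return p
    return None  # search exhausted; tells us nothing about the conjecture

# Verify the "exists" part for every x up to 200:
assert all(twin_above(x) is not None for x in range(200))
print(twin_above(100))  # -> 101 (101 and 103 are twin primes)
```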
> Yet Turing machines are about as far from abstract mathematics as one can get, because you can actually build these things in our physical universe and observe their behavior over time (except for the whole "infinite tape" part)
The infinite tape part isn't some minor detail, it's the source of all the difficulty. A "finite-tape Turing machine" is just a DFA.
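A quick pigeonhole sketch of why: once the tape is bounded, the set of full configurations (control state, head position, tape contents) is finite, so every run must either halt or revisit a configuration and loop forever. The specific numbers below are just illustrative:

```python
# Count the configurations of a bounded-tape machine: with a finite
# configuration space, any run longer than this count must repeat a
# configuration, so the machine's behavior is finite-automaton-like.

def config_count(n_states, tape_len, n_symbols):
    return n_states * tape_len * n_symbols ** tape_len

# A 6-state binary machine confined to a 10-cell tape:
print(config_count(6, 10, 2))  # -> 61440
```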
Oh is that all? If resource bounded Kolmogorov complexity is that simple, we should have solved P vs NP by now!
I debated adding a bunch of disclaimers to that parenthetical about when the infinite tape starts to matter, but thought, nah, surely that won’t be the contention of the larger discussion point here haha
No, an LBA in general doesn't have a finite tape. It still has an unbounded tape, to accommodate arbitrary-length inputs; it's just that the tape cannot grow beyond the length of its input (or a constant multiple of it, which is equivalent by the linear speedup trick).
> > That survives in the culture of mathematics where we continue to see a high regard for truth, beauty, and goodness
> As a non-mathematician, I've noticed this as well, and I have a suspicion the historical "culture" is holding the field back.
Holding the field back from what? If the goal of the field's practitioners is to seek mathematical beauty, then, well, that is what they will focus on.
Besides that, I don't really follow your argument about Gödel, information theory, and the idea that adding more axioms is the key to moving math forward. In the vast majority of cases, the difficulty in proving a statement is not that it's unprovable in the given formal system; it's that we simply can't find the proof. But maybe I misunderstand you?
I can’t tell if this is crazy or brilliant. Math has been working diligently for a long time to reduce the axioms. Most of the obvious Gödel sentences are stupid things like "there is a number that encodes a proof of itself". The whole project is to derive all of the structure of mathematics, with its high information complexity, from basic axioms but also from complex definitions. I think the definitions (natural numbers as sets, integers as equivalence classes of pairs of natural numbers, etc.) pump up the information complexity beyond the axioms. Like an initial state of Life allowing arbitrary computation from the simple Life rules.
The idea that there might be more axioms that would let one deduce more about computable complexity classes or the like seems pretty unlikely.
The number of provable statements and the number of unprovable statements are each countably infinite, and we aren’t lacking the ability to prove things due to obviously true missing axioms.
There are plenty of mathematicians - mostly set theorists - who are actively working on finding new axioms of mathematics to resolve questions which can't be resolved by ZFC. Projective Determinacy is probably the most important example of a new axiom of mathematics which goes far beyond what can be proved in ZFC, but which has become widely accepted by the experts. (See [1] for some discussion about the arguments in favor of projective determinacy, and [2] for a taste of Steel's position on the subject.)
I suggest reading Kanamori's book [3] if you want to learn more about this direction. (There are plenty of recent developments in the field which occurred after the publication of that book - for an example of cutting edge research into new axioms, see the paper [4] mentioned in one of the answers to [5].)
If you are only interested in arithmetic consequences of the new axioms, and if you feel that consistency statements are not too interesting (even though they can be directly interpreted as statements about whether or not certain Turing machines halt), you should check out some of the research into Laver tables [6], [7], [8], [9]. Harvey Friedman has also put a lot of work into finding concrete connections between advanced set-theoretic axioms and more concrete arithmetic statements, for instance see [10].
I wonder if he’s familiar with Peirce’s alpha existential graphs. They are a complete propositional logic with a single axiom and, depending how you count them, 3-6 inference rules. They use only negation and conjunction.
They also permit shockingly short proofs compared to the customary notation, which, incidentally, was also devised by Peirce. Peano freely acknowledged that all he did was change some of the symbols to avoid confusion (Peirce used capital sigma and pi for existential and universal quantification).
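That negation and conjunction suffice is just the functional completeness of {¬, ∧}: every other connective is definable from those two. A quick sketch (the function names are mine, not Peirce's):

```python
# {NOT, AND} is functionally complete: OR and IMPLIES (and hence the
# rest of the propositional connectives) reduce to it.

def NOT(p): return not p
def AND(p, q): return p and q

def OR(p, q):      return NOT(AND(NOT(p), NOT(q)))  # De Morgan
def IMPLIES(p, q): return NOT(AND(p, NOT(q)))

# Check against the usual truth tables:
for p in (False, True):
    for q in (False, True):
        assert OR(p, q) == (p or q)
        assert IMPLIES(p, q) == ((not p) or q)
print("ok")
```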
John Sowa is a good resource. Here is his annotation of Peirce's tutorial[1]. Another paper explores the influence of EG on Sowa's Conceptual Graphs[2]. I happen to find the juxtaposition of Frege's notation with Peirce's interesting. Sowa's commentary on yet another Peirce manuscript has some fun historical tidbits about the influence of Peirce on the design of SQL[3]. Here is another reference that mentions Peano's adoption of Peirce's notation[4].
That should be plenty to get you started! Digging through the references in those papers and the bibliography on Sowa's site will find you plenty more modern Peirce scholarship. I think Peirce would be pleased that his seemingly abstract approach to logic ended up inspiring one of the most pragmatically useful classes of software ever, the relational database.
First off, Poland benefitted immensely from EU funds. Infrastructure is better in Poland for the same reason it is better in Spain; Spain just got those funds earlier.
As for the migration policies, I think you are pointing in the right direction but the reality is more subtle.
Old European countries consider work to be a privilege; Poles, having lived in a Warsaw Pact country, still perceive work as an obligation.
A refugee arriving in Germany can't start working and lives off welfare while his case is settled, which may take years.
A similar refugee arriving in Poland has to find a job both to survive and to obtain legal residency.
This has the downside of driving down wages but the upside of creating even more jobs.
Amazon, Lidl, Zalando and many other labor-intensive businesses set up shop just across the border for a reason.