I used this recently to speed up some symbolic manipulation. I was using SymPy, but its simplify is really quite slow, and the documentation basically just says "work out exactly which simplifications you need and apply them"; in my case, though, I needed something completely general. Now I check whether wolframscript is installed and then just call it on the command line (sketched below). It's much faster for complicated simplifications, but also much slower for simple stuff (I guess it has a long start-up time).
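The call I shell out to is basically just a FullSimplify with InputForm tacked on so the result is easy to parse back on the SymPy side. A minimal sketch (the expression is only an illustration):

    (* invoked as: wolframscript -code "<the line below>" *)
    FullSimplify[(Cos[x]^4 - Sin[x]^4)/(Cos[x]^2 - Sin[x]^2)] // InputForm
    (* => 1 *)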
SymPy has a built-in converter to/from Mathematica (although it can't convert from Mathematica lists yet), so that's helpful
I’m a Mathematica addict and (as I have remarked elsewhere) rely predominantly on iPads as my platform of choice. Sure, Wolfram Online works fine when a network connection is available, but I’m really, really, really hoping that sooner rather than later they’ll release a fully fledged notebook & back-end kernel version of Mathematica for the iPad.
This shows some flexibility on their behalf, but it’s moving in the wrong direction (as far as I’m concerned, at least).
I would say that porting Mathematica to the iPad would require them to be able to run the Mathematica engine on iPadOS. And with the strict rules Apple has around arbitrary code execution I would highly doubt that is going to happen.
That’s actually a misconception: Apple has allowed interpreted languages (e.g. Python, as in Pythonista) to run on iOS/iPadOS for several years now. The Wolfram Language is definitely an interpreted language and therefore would not fall afoul of their rules.
Other people below are showing 6-8 GB on their systems, so it's lower than what I recall, but I'd bet it's still higher than your figure. Are you only counting the 1.6 once?
Regarding running the engine on iOS: technically they do that now with their Wolfram Player app on iOS. The calculations are done on the iPad and not sent to a server, unlike with the Wolfram Cloud app.
> Apps should be self-contained in their bundles, and may not read or write data outside the designated container area, nor may they download, install, or execute code which introduces or changes features or functionality of the app, including other apps.
Oh, yes, please. The iPad (especially the Pro ones) is such an incredible machine — thin, light, fantastic battery life, and faster than most laptops. It would run Mathematica really, really well.
Wolfram Language is insanely great. I've been using wolframscript as (i) a shell replacement and (ii) a piping/glue language for larger projects.
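For a taste of the glue style, a script this small (hypothetical file name total.wls) already works as a little command-line tool:

    #!/usr/bin/env wolframscript
    (* total.wls: sum the numbers passed as arguments, e.g. ./total.wls 1 2 3 *)
    Print[Total[ToExpression /@ Rest[$ScriptCommandLine]]]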
Wolframscript makes this hard, though, for the following two reasons: (i) wolframscript requires users to log in before any script can be run; (ii) only one wolframscript REPL can be open at a time.
(i) makes it impossible to ship wolframscripts to users who don't know what wolframscript is. For example, if I'm using wolframscript as a build tool for a larger Haskell project and some user downloads my software, they'll first have to log in to Wolfram Cloud before the script can run. There's no way I'm going to force people to do that, so it hinders the growth of the language.
(ii) makes it hard to use wolframscript as a complete terminal replacement.
People often use multiple terminals at once, at different locations in the file system. Wolframscript only allows one at a time.
Wolfram Language is insanely good. It should be more popular than Python. I think fixing some of these licensing/login issues could help it grow.
It's hard to come up with a short answer to explain the greatness of the tightly integrated system, with its tons of functionality and highly sophisticated function/rule-based programming style, but let me give you a quick taste of it: the code sketched below will create and deploy a publicly accessible web form that asks for 3 coefficients and plots a parabola. How many lines of code / external libraries would you need to do the same thing in Python?
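Reconstructing the snippet from the description, so take the exact field names and options as illustrative:

    CloudDeploy[
      FormFunction[
        {"a" -> "Number", "b" -> "Number", "c" -> "Number"},
        Plot[#a x^2 + #b x + #c, {x, -5, 5}] &,
        "PNG"],
      Permissions -> "Public"]

CloudDeploy hands back a public URL for the form; that's the whole program.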
Cool! Is it because the Wolfram libraries and infrastructure are great, or is there something in the semantics of the language that makes this so nice?
For me, it’s a combination of many, many convenient built-in functions and high-level constructs for functional and rule-based programming. I recently played with the language to get functionality similar to what R’s dplyr provides; it gives a sense of what kind of advanced “meta-programming” is possible.
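Not the actual code I wrote, but a minimal flavor of the dplyr-style group-and-summarise idiom on a made-up Dataset:

    data = Dataset[{
      <|"species" -> "setosa",    "petal" -> 1.4|>,
      <|"species" -> "setosa",    "petal" -> 1.3|>,
      <|"species" -> "virginica", "petal" -> 6.0|>,
      <|"species" -> "virginica", "petal" -> 5.1|>}];

    (* roughly dplyr's group_by(species) followed by summarise(mean(petal)) *)
    data[GroupBy["species"], Mean, "petal"]
    (* => <|"setosa" -> 1.35, "virginica" -> 5.55|> *)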
In my opinion, while Mathematica is the world’s best general-purpose calculator, it does more harm than good in producing reproducible and explainable science and math. The topic of math software being closed source has been beaten to death (by me and others), so I’ll veer down a different path.
As a tiny anecdote, a colleague of mine recently published a paper in quantum computation and computed a representation of some mathematical object, specifically the exact value of an angle which modulates a particular quantum gate. In the paper was this bizarre continued fraction along with arctans and pi’s that no experimental physicist in their right mind would have derived themselves.
“Where in the world did this come from?” I asked, while reviewing the paper.
“Oh, it’s just what Mathematica gives,” he responded. “I don’t even fully understand it. But I do use it as proof that the angle is an irrational number and thus breaks its inclusion in some known finite group.”
You might call out this otherwise accomplished physicist for what appears to be misusing the tool—taking results at face value without understanding them—but I guarantee scientists and engineers are doing this stuff daily, in more consequential places than abstract quantum physics papers.
Wolfram and his team have built something amazing, and built an incredibly successful business, but I hope he uses his business acumen to find a way to contribute to the “greater good” while he continues to enjoy running a company and making money.
what is the alternative? using some open source project that has no clear steward and nearly zero incentive (financial or otherwise) for tracking down esoteric bugs?
mathematica is built by a company that seems to take a lot of pride in quality. they are also clearly incentivized by keeping the company afloat and thus by addressing major issues.
i hear people barking incessantly about open source software in mathematics and science, but i just don't see what would actually be successful. in most people's daily jobs in engineering and even in large science projects, proprietary code is used everywhere. it's the only way to get things done in many cases. in my experience, getting a bug acknowledged, prioritized, and then fixed in open source software is like pulling teeth. what are you going to do with open source mathematics or scientific software where only like a person or two understand the bug?

your story could easily have replaced mathematica with sage, and the story wouldn't be changed at all. okay fine, sage is open source. but what's more likely to be successful: (1) you are going to look through the likely millions of lines of code that makes up sage and find what's happening, or (2) you call up wolfram and ask what's going on since you have support for a product that you paid for?
how is open source software supposedly inherently better for math and science research?
Open source software first and foremost means reproducibility. With open software, anybody on earth with the right equipment can easily and trivially reproduce a result at no additional cost or hindering contractual terms. Reproducibility is paramount, and sits at the very core of science. Results that cannot be reconstructed hardly count as new knowledge.
(As a funny story, even Wolfram-the-company can’t run SMP, the predecessor of Mathematica. Why? They can’t decrypt it or unlock it, and can’t figure out how to work around it.)
Every major tech company on earth uses open source software at the heart of their business. So we know something sustainable is possible.
Open source, one-size-fits-all math software is a tough nut to crack. I think most people could get most of their algebra and calculus done with existing open packages like wxMaxima or Sage. Specialized computations, such as those in group theory, already have open source implementations that exceed any closed competitor.
As for the construction of a competing CAS, some organization would have to employ a team. That costs money, and so far no company (that I know of!) sees it as either a worthwhile investment or a charitable cause. If folks do know of such a company, I’d love to hear about it.
> Open source software first and foremost means reproducibility. With open software, anybody on earth with the right equipment can easily and trivially reproduce a result at no additional cost or hindering contractual terms.
why? why is proprietary software less reproducible? aside from cost, i might even argue that proprietary software is more reproducible. open source software often has dependency hell. how are you sure that the dependencies and software installation are the same between two computers?
how are results not able to be reconstructed with proprietary software, aside from cost (which is not your argument)? if someone has a result found in mathematica, then couldn't that same method be reconstructed in other software to compare results? if it can't, then mathematica provides some feature not found in other software.
and plenty of science operates by not being fully reproducible by independent parties due to cost and available equipment. that's often mitigated by people reproducing the experiments or results with different equipment and even methods. and even then, every regular joe still can't reproduce the results due to massive costs.
Proprietary software is less reproducible because the software artifacts literally cannot be reproduced in different environments. That's why I mentioned SMP as an example; even the vendors of the software can't reproduce their own artifact. Moreover, vendors rarely provide access to previously released versions, because "newer is always better", and nobody except the vendor is allowed to host or publish old or different versions. Making the source code available doesn't somehow prohibit binaries from being published, either.
> how are results not able to be reconstructed with proprietary software [...]?
Cost, legal viability, availability, shareability, etc. Cost is just one aspect. I also can't provide my copy of Mathematica along with my code that produces my results to you; I'm contractually obliged to not copy the software, and so even if I wanted to allow you to reproduce my results, I have no power to do so.
> and plenty of science operates by not being fully reproducible [...]
This doesn't mean we should be moving away from reproducibility just because it's lacking in some areas of science.
i feel like you're beating around the bush with a definition of reproducible that fits your preferences. and i have no idea what you mean by "artifacts".
here's a quote from the NSF: "reproducibility refers to the ability of a researcher to duplicate the results of a prior study using the same materials as were used by the original investigator. That is, a second researcher might use the same raw data to build the same analysis files and implement the same statistical analysis in an attempt to yield the same results…. Reproducibility is a minimum necessary condition for a finding to be believable and informative".
let's say someone has some data and a method to process that data. if they implement it in mathematica, then someone else can do the same implementation in mathematica and attempt to reproduce the results. so what if they had to pay for it? that doesn't affect the reproducibility. it's a barrier maybe, and a small one, but it doesn't make it inherently less reproducible. someone else could take the data and method and implement it in something else, say matlab, sage, octave, or whatever, and attempt to reproduce the results.

i know of a researcher who does this very thing. he likes mathematica, but his graduate students use a range of software to do their own work and produce results. he takes their methods and investigates and implements them in mathematica to see if they are reproducible. this is good mathematics and good science. just because mathematica is involved doesn't all of a sudden make everything un-reproducible and bad science.
if you're thinking "oh, but if it's open source, i could crawl through and understand everything the code is doing": nobody does that. it isn't even feasible in anything but the simplest of simple cases.
> This doesn't mean we should be moving away from reproducibility [...]
i didn't say we should. but in all the comments sections of big physics announcements, we don't see people whining about reproducibility. i can't go and reproduce the LIGO experiment in my backyard because it takes huge amounts of money, equipment, and engineering expertise. that's why many things don't compete that well with mathematica. it takes a lot of time, money, and expertise to develop such a software package. the opposing model (open source software) just barely drags along and could disappear at any time as maintainers come and go.
Why can't companies provide paid support for open source software? I believe a lot of open source products work on that model, including Linux. The bad thing with proprietary software is that many students and researchers from developing countries cannot afford it, as the pricing is always based on developed-economy scales. But if it's open source, everyone can use it, and more affluent researchers can pay for support so they don't need to hunt through thousands of lines of code or fix bugs themselves.
Your points are valid, but there are other considerations. With open source software, I can find and fix a bug in a routine and submit a patch for other people. With closed-source software, there is no guarantee that a bug I find will be prioritized and fixed if I submit it, even if the company values quality (they likely have more open issues than people to fix them). Now granted, I'm personally probably not going to track down some obscure bug, but others may.
It seems like the result of the paper should then be conditioned on the correctness of the chain of re-writes that Mathematica used to arrive at the continued fraction. Isn't there a way to tell Mathematica to output all re-write rules used to form an expression? I seem to recall something like this is possible.
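The thing I'm half-remembering is probably Trace, which records the chain of top-level evaluations, though my impression is it doesn't reach inside the built-in algorithms. For example:

    (* returns a nested list of every intermediate expression the evaluator produced *)
    Trace[Expand[(x + 1)^2]]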
This is not possible in Mathematica, especially since it’s not all based on rewrite rules. Wolfram even boasts about the hundreds of pages of C and Mathematica code used to implement even individual algorithms.
With regards to conditioning the result, it should be stated as such, but no researcher would ever hedge their result like that. Research is cutthroat.
I was a heavy user of mathematica 20 years ago, on a 486 with MS-DOS. Of course, it was inside a terminal. Has the ability to run mathematica textually been lost in the meantime?
I hope not; historically, you've always been able to run the MathKernel from a terminal and interact with it that way. Maybe this is the same thing, just more targeted at use as an interpreter.
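For reference, the bare kernel's text interface looks roughly like this (the executable name varies by version and platform; historically math or MathKernel, these days wolfram):

    $ wolfram
    In[1]:= Integrate[1/(1 + x^2), x]
    Out[1]= ArcTan[x]
    In[2]:= Quit[]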
This seems like they just polished some rough edges off of the existing command line version of Mathematica. I wish they would not call it Wolfram Language. His ego/talent ratio is already so high it's NKS level buffoonery.
Wolfram derangement syndrome, the phenomenon by which HN threads unintentionally get sucked into an arcane crowdsourced psychoanalysis project dedicated to a single person, has been off topic for years. The funny thing is that it's a mirror image of the self-obsession it purports to analyze.
Why do people feel the need to denigrate Wolfram at every turn? He is obviously a genius of the highest caliber, who has dedicated his life to making useful tools to advance science and engineering. He has made a tremendous positive impact. People should focus more on making their own positive impact instead of trying to tear Wolfram down for his awkward but ultimately harmless lack of self awareness.
.. because when I met him personally at a hack event in the 1980s, he was obviously in the "dominant and loud" mode, and directly commanded anyone within earshot, at most opportunities.. later, it turned out he forked his company and left the people who built Mathematica out of future profits. Silicon Valley bred people like that .. it is still common in some environments.
My grad math advisor was one of the folks who wrote significant parts of the symbolic core in Urbana; the behind-the-scenes team got legal settlements after that strange move. He's not from Silicon Valley; I think he'd done some early prototyping and coding himself while at Caltech (in Lisp, if I recall) and then came to Urbana-Champaign as a research prof, where he got the team of several to build it out. [I've been using it since 1988, and love it.]
My guess is that it's because his lack of self awareness generally precedes his demonstration of talent, which frequently leads to a bad impression. Also I hear he's nasty in person, though that's not relevant here.
I've known him for more than ten years -- we first met when I was a student in college. In my experience, he's always been helpful, kind, and encouraging.
Wolfram Language is developed by Wolfram Research. It’s a fairly sane name for a language and a company; the fact that Stephen Wolfram tends to come off as narcissistic is unrelated.
You're saying the fact that Stephen Wolfram comes off as a narcissist is unrelated to the fact that he names everything after himself?
Python isn't called "the van Rossum Language". C isn't called "the Ritchie Language" (or "the Bell language", for that matter). Lisp isn't called the "the McCarthy language". We should judge Wolfram Research's products in spite of their unfortunate names, but denying that they're the result of Wolfram's narcissism is silly at this point.
People name companies after themselves, and then they name their flagship product after the company–it's not that strange. (To be fair, in this case I would not be surprised if he named the company after himself, and then the language after himself, too. It's just that it's not relevant to keep bringing this up when it's not necessarily a strange practice.)
Half of the "Debian" name comes from Ian Murdock's first name. The other half comes from his then-girlfriend's name 'Debra'. I've never heard anybody castigate Ian/Debian for that.
Somebody naming something after themself seems a little tacky to me, but I don't think it's anything to get bent out of shape about.
Is the issue the branding, or is the issue FOSS vs proprietary? I'm much more sympathetic to the latter criticism, which really has nothing to do with the first, I think.
About NKS: granted, there's a bunch of self-promotion and inflated urgency in there. But I feel like there's a useful work in there, cataloging in exhaustive detail a large number of systems and patterns that are built into the number system and into nature.
Is there anything actually wrong with NKS aside from the tone?
I think there is consensus the tone is what's mostly wrong with the book.
Amazon seems to have removed it, but "a new kind of review"[0] was absolutely hilarious and kind of accurate :)
> "As the saying goes, there is much here that is new and true, but what is true is not new, and what is new is not true; and some of it is even old and false, or at least utterly unsupported."
The author seems fairly knowledgeable on the subject of cellular automata.
EDIT: I see this has been posted elsewhere in the comments
This article pales in comparison with a presentation where he described a new Wolfram Language feature (that was basically reflection) as a New Kind of Programming, no less. My theory is that this kind of marketing is very effective on old-school professors who used to know Fortran and are now in charge of spending.
It is one of my favourite works of crank science, the hardback has pride of place in that section of my library. It's beautiful, mind-expanding, awe-inspiring, and over-inflated in the grandiosity of its self-regard way past the point of parody.
Both of them certainly have their uses, but, especially for MATLAB, do they really? I'm working as an engineer in academia right now, and even with all the legacy holdover of MATLAB, many people I know (including almost every undergrad project I've helped with) are either using R or Python, and I know I don't go for it unless I'm absolutely forced. And that's to say nothing of the researchers whose eyes light up upon hearing of Julia.
Few of those people seem like they'll be advocates for buying expensive seat licenses if they become PIs down the line or join industry 5-10 years from now.
Mathematica seems a little different because I don't believe there's currently a viable alternative, let alone a commonly used one.
In my experience, the engineers who don't specialize in programming prefer MATLAB. Programmers will be comfortable in it, but will sooner reach for python etc.
Much like Excel, it's easier to get people to program if you can convince them it's not programming.
I definitely agree with this about Excel (some of the things I've seen 'people who don't like programming' do in Excel are positively mind-blowing), but I just haven't seen that same thing in MATLAB. That could definitely be just a limit of my own experience, though, and your point makes a lot of sense.
I mean yea, MATLAB is indispensable for the working engineer in controls or signal processing. Python is coming in vogue because of ML applications, but it's primarily used in the wild for scripting together builds and tests, which often are used for testing data generated by MATLAB models.
I've never seen a workplace use R or Julia or list them on a job posting outside data science or academia. Julia in particular is missing major components that make it useable, particularly in signal processing.
MATLAB also has the advantage of being useful as a CLI calculator for linear algebra, which is how a lot of engineers use it day to day. The REPLs of Python and Julia are far too slow for that kind of work, frankly; especially Julia, where the JIT takes forever to warm up. It really doesn't fit into my workflow at all, and I've tried to force it.
Your point about Simulink is fair--even Scilab's Xcos doesn't provide much of an alternative for doing good controls simulations.
And while I agree that Julia's JIT is pretty slow to come up, if I'm doing a lot of linear algebra work, I've typically just left a dedicated terminal open for it. I also haven't had much of a real speed issue with using a REPL or notebook, unless the data sets are unreasonably large. I think I disagree with your take on Python's most common uses, as well as the implication that it is just now "coming into vogue." I'm by nobody's measure a Pythonista, but my experience has been that it gets pulled in almost anywhere it can, from scripting to app development to scientific computing.
I am curious about your experience with signal processing, though--how have you seen MATLAB used where Python wasn't or couldn't be? I haven't done much in that area since undergrad/internships, and there I would have said that LabVIEW was much more prevalent as a tool, or even LabWindows. At least for many of the common functions like FFT, I don't recall MATLAB being any more common than any other language for that sort of thing (although, as I said, my experience there might not be representative).
I meant python coming into vogue for signal processing/controls, sorry if that was unclear. And that's mostly anecdotal.
There's a lot of internal tooling around MATLAB that would take a lot of work to replace with Python. For example, I know a company that has a Verilog compiler for their MATLAB models. That's not going to be rewritten for Python.
MATLAB toolchains are also easy to maintain, since they're so damn expensive and locked down. Compared to python, which... isn't always straightforward.
And like I said before, MATLAB is just a fancy and expensive calculator. It's number crunch first, script second, and language third. Julia and python have a different balance, at least to me.
If I'm doing something quick and dirty, reaching for MATLAB is the absolute fastest way for me to do it, and the easiest to record. With python that's not true at all.
PS: I don't think I've ever seen someone use Labview to write and debug a DSP algorithm, so we've probably experienced totally different industries.
How is this related to Flash? Mathematica is a powerful pattern-based symbol-manipulation language; Flash is media creation software. Which do you think reaches more end users?
Maplesoft's Maple was on par with Mathematica when I worked with both (ca. 2010). Not sure about the current state of affairs. Maple's embedded programming language is also quite a bit nicer.
I remember that back when I was in university, there was a set of simultaneous ordinary differential equations that Maple was able to solve but that Mathematica choked on.
I remember reporting this to my tutor. He looked at me with a steely gaze and said “That is a very serious allegation to make, young man. Are you willing to stand by it?”
P.S. I actually remembered one area where Maple is still absolutely superior: plotting. It doesn't require you to explicitly set the argument range; it tries to find the features of the function and adjusts the range to include them. AFAIK, Mathematica still doesn't do this.
https://news.ycombinator.com/item?id=22625370
Probably too different to merge the submissions.