30MB binaries can be a bit of a problem for some people.
I was developing a system from home and deploying to a VPS for testing and production. Transferring Go binaries was like waiting for a floppy disk to copy, thanks to poor Internet infrastructure, and compression didn't help much.
It turned out to be significantly faster to push to a remote git repository and have a hook do the build. Before you look at compressing a Go binary, consider whether there aren't other options.
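The hook itself can be tiny. A minimal sketch of a post-receive hook in a bare repo on the VPS; the paths, branch name, and binary name are placeholders:

```sh
#!/bin/sh
# post-receive hook: check out the pushed source and rebuild on the server,
# so only source diffs ever travel over the wire.
# /srv/myapp, "main", and the output path are hypothetical; adjust to your layout.
GIT_WORK_TREE=/srv/myapp git checkout -f main
cd /srv/myapp || exit 1
go build -o /usr/local/bin/myapp .
```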
30MB is on the small side. I've seen Electron apps go above 100 megabytes, and Android Studio is about 1.1 gigabytes. I feel sorry for anyone developing on less than 10 megabit down/up.
I've been throwing around an idea for a site called "Church of the Byte Savers" or some other silly name based around saving hard drive space and bandwidth. e.g. Dropbox is good because it has LAN sync, whereas OneDrive will send your data around the world just to get it to a second PC on your network.
I am in the same situation though my actual speed is more like 0.7 to 0.8 Mbps up. Uploading a 30MB Go binary would take nearly 6 minutes.
I guess I should be thankful I mainly work with text and not images, video, or large proprietary file formats; otherwise I think I would be out of the game.
My phone is now about 5 times faster than my home Internet for both uploads and downloads but also significantly more variable and expensive. The fibre internet upgrade that was scheduled for last year had the technology downgraded and the rollout delayed due to politics.
Copper being rare makes it more likely that they'll upgrade the endpoints to get 3 Mbps or more. I'd guess that holds true down under as it does for the other places I know.
I've ameliorated this somewhat (I have 18/1), albeit at the expense of space, by uploading things to a common place, then copying or (sym)linking them to the final spot on the target computer.
This gives me the ability to `rsync` them up rather than copy, using its ability to only transfer the pieces of the files that have changed.
We have a ~20MB .war file that I have done this with over airport wifi, and it ended up transferring only a few tens of KB (maybe 100) between compiles.
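Roughly, the workflow is a delta-aware upload to the common place, then a link into the final spot. A sketch, with the host and paths invented for illustration:

```sh
# rsync's delta algorithm only sends the blocks that changed since the last
# upload (which is why keeping the previous copy at the destination matters);
# -z compresses in transit, --partial keeps progress across flaky wifi
rsync -avz --partial app.war deploy@example.com:/srv/common/app.war

# on the target machine, (sym)link the shared copy into place
ln -sf /srv/common/app.war /srv/tomcat/webapps/app.war
```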
I do something similar. I have a database of all my files on S3 and their SHA-256 hashes. This not only means that I can skip uploading files which haven't changed, but I can do S3->S3 copies for files which are the same. But it's still too often that I have 10s or 100s of MB of data to upload. At 1 Mbit/s, every 100 MB takes over 13 minutes!
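A hedged sketch of that kind of manifest check; the file name, manifest format, and bucket are made up for the example:

```sh
# skip the upload when the file's SHA-256 already appears in the manifest;
# manifest.txt holds "<hash>  <path>" lines, my-bucket is a placeholder
f=report.pdf
hash=$(sha256sum "$f" | awk '{print $1}')
if ! grep -qxF "$hash  $f" manifest.txt; then
    aws s3 cp "$f" "s3://my-bucket/$f"
    printf '%s  %s\n' "$hash" "$f" >> manifest.txt
fi
```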
A few years back I thought it would be fun to do a Scala project (which ended up being http://nightchamber.com) while staying with my in-laws over Christmas.
It got much less fun when each deploy to the prod server took an hour to complete.
Note: I don't actually believe a 30MB static binary is a problem in this day and age
Depends what that 30MB does, or how much of it is actually needed. It's less of a problem if all those bytes are used in some form or another than if a significant portion of them are "dead code" that only serves to take up space in memory, on disk, and in communications channels whenever the binary is transferred somewhere.
and I would not trade (build time | complexity | performance | debug-ability) for it
On the contrary, smaller binaries are usually going to be better at all of those, except perhaps "debug-ability" if you keep the debugging symbols in the same file.
>> Note: I don't actually believe a 30MB static binary is a problem in this day and age
> Depends what that 30MB does, or how much of it is actually needed [....]
Moreover, let's consider a CLI tool suite: a dozen little tools, each doing perfectly what it was designed to do (think of coreutils, util-linux, sysstat, binutils, or iproute2). Such a suite suddenly weighs 300MB, more than X.Org or PostgreSQL (or the two combined). Does it really do more than either of those to justify the size? I doubt it.
The availability of huge disks has probably skewed the perception of sizes to some extent. 300MB is more than sufficient for an entire operating system, including a GUI and several dozen CLI tools. A full install of Windows 98 was 175MB.
On the other hand, internet speeds have not increased quite as dramatically, so we can still relate to stories of "this app is how big? And it only does what?"
Said this here a few times already but since you mention it: last month I booted a win98 box. The bliss of small binaries. Everything was snappy even on a Pentium 3. Bonus point: enjoying the Turbo Pascal suite (IDE, compiler) for a total of 800KB. I want to backport crypto, binary hardening, and some lisp to win98 and live in MB land (pun not intended).
Indeed. Just the other day I was trying (with no luck) to find a simple drawing app for Android (something like Paint on windows) that was less than 18 MB. Seriously, that sort of app should not need huge executables like that.
The amd64 Linux version of Terraform [1] unzipped uses 739 Megabytes on disk (`du -hc`). It is a collection of CLI tools, or one CLI tool that wraps many provider-specific tools (providers such as aws, azure, etc.).
I noticed this recently when trying to upload a container image that happened to include Terraform.
Eh, 300MB, who cares? There's plenty of disk space and usually plenty of bandwidth. If it's becoming a problem for you then sure, do something about it, but I don't think 300MB is a priori a problem.
"whole executable file is loaded to memory at startup" might be something you were going to do anyway, if you are running a latency-sensitive service. There's nothing worse than having to page in parts of the program after it starts serving.
Not entirely true. Most AV will only flag 'non-standard' UPX-packed executables as suspicious. Standard UPX-packed binaries are easily unpacked by the scanner and the contents scanned without notifying the user.
It's anecdotal of course, but a small application of ours went from about one false-positive per month to no false positive in over a year when I stopped UPX packing it.
More anecdata here, but I've experienced the same thing as well (not quite as frequent as one false-positive a month though). UPX just seems to occasionally trip up the antivirus heuristic scans until you can get your application whitelisted.
Just to be clear here: this is saying don't strip using the standard strip utility. As the author says in their post, Go has flags to do the stripping for Go binaries.
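For reference, it's typically done at build time via linker flags, along these lines (the binary name is a placeholder, and the exact savings vary by Go version):

```sh
# -s omits the symbol table, -w omits DWARF debug info;
# both are passed to the Go linker rather than run as a separate strip step
go build -ldflags="-s -w" -o myapp .
```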
Anybody who truly understands the technology and minutiae behind modern browsers strikes me as almost more interesting and impressive than somebody who understands Linux in this day and age.
This isn't belittling operations folks, this is more of a comment on how complicated, complex, and obfuscated modern browsers are.
Every instance of the program will have its own private copy of the uncompressed data. For normal uncompressed binaries the pages of the on-disk binary would be shared instead. (Except for any pages that get modified, which would trigger copy-on-write. A classic example would be load-time dynamic linker relocations).
Haha, the author linked to my poorly written, poorly conceived old grumbling post. I'm honored.
I should update the post, as the binaries come out even larger in the newer versions of Go that are themselves written in Go. I believe Hello World is up to at least 3MB.
Honest question: mathematically, how can something be 7x smaller if x is not pre-defined? I always find it weird that anything can be more than 100% less than anything else because, you know, zero.
Am I way, way off base? Should I perhaps retake high school?
Edit: the smaller number is assumed 1x. Got it. Still feels weird.
You might be overthinking it a bit :) I always see the x used as the multiplication symbol, meaning "seven times smaller". You know, like when in games you get a 2x multiplier. I doubt it's the algebra "x" ;)
You're right, but as another poster noted "2x smaller" generally means 1/2 the original size. That usage, to me, is a horrible misuse of English, and is a pet peeve.
You also see it with slower; "the unoptimized code is 2x slower". Huh? Do you mean "takes 2x the time"?
It makes perfect sense to me. I interpret "x" as shorthand/lazyhand notation of the Anglo-American multiplication symbol "×", which I verbally expand to the word "times". Furthermore, calling something "two times smaller" is equivalent to calling it "half the size", since "smallness" is simply the logical and mathematical inverse of "bigness", or "size". The same argument holds for unoptimized code being two times slower, or half as fast. (Slowness is the inverse of speed.)
> calling something "two times smaller" is equivalent to calling it "half the size"
This is the (IMO circular) assertion that is made by fiat. If you believe this, then you believe it.
I get the meaning, it just isn't logical to me. Using 'x' or '*' as multiplication should mean that many repetitions of something, not the reciprocal of that number.
This is the second time in about a week that a post on HN has included a variation of "this weird trick" (the other being jlongster's article dealing with React) in the title. I can't help but parse these articles as equivalent to all the horrible ad spam all over the web about "this one weird trick" that will flatten your belly, tone your butt, help you make millions, give you a bigger penis, eradicate your cellulite, and a host of other "improvements" to your life.
Let's maybe spend a couple extra minutes coming up with less link-baity titles.
About as fascinating as two commenters who think I can't recognize the mocking/parody, and still find it to be equivalent to all the shitty content on the web using this one particularly "weird" headline "trick"? Criticizing something is not a signal that one missed the joke.
It's cool. And you're not the only one. Of course, I'm over here, on the other side of the tubes, unable to avoid pointing out I never said "I can't help but think...", but instead used "parse as equivalent" intentionally to signify only what the title itself evokes, joke or not. It's nothing but a couple downvotes either way, which is pretty unimportant.
The fact that it's a parody doesn't preclude it being annoying. Also, it's not actually that good a parody in my opinion. It's a click-bait title which links to an article with a really obvious message (lose weight:eat less::shrink binary:packer)