
I've found that often people who have been deep in the JVM ecosystem really struggle to enjoy Go, and vice versa.

It's like the JVM ecosystem commits deeply to the complete opposite of idiomatic Go, and once you've built those neural pathways, it's hard to unlearn it all.

Kitchen sink stdlib, Exceptions, Verbose naming, Complex build system, @Deprecated...


I spent around 15 years doing Java exclusively, and after discovering Go (around 2012) I couldn't wait to adopt it and have never looked back since. I don't think Go is perfect, but I find it a much better default pick for most things I've worked on compared to Java. Of course Go has limitations, especially in the type system and error handling, but rather than go back to Java for those cases where it matters most to me, I tend to choose Rust these days. It shares some of the advantages of Java (compared to Go) and also comes with its own set of other major advantages too.


I hate Java just as much, but at least it’s trying to improve and grow beyond its past limitations, instead of promoting bad design decisions as good for the programmer.


I have worked with many languages - Java, Delphi, C/C++, Assembly, PHP, Go, Rust, etc. I don't really understand the hatred for Java. The build system in Java is not complicated - whether Gradle, Maven, or whatever - unless your requirements are very complicated (multiple builds, etc.).

In my opinion the most complicated ecosystem out there is JavaScript and all of the myriad tools needed to support it. It has approached the complexity of C++ and perhaps even exceeded it.


I'm a committed Kotlin shill and a very experienced Gradle user but I'm the first to admit that it is absolutely arcane for those new to it. Once you know all the incantations it's really flexible and performant but good luck writing a plugin for the first time.


My favorite thing about gradle is that the build file is written in the same language as your code. This gives you the great power of easily adding a little hack into your build file, so it's pretty rare you actually have to make a plugin.

The downside is that every build file is a little bit unique, but IME it's not that much worse than what happens in golang. In golang people usually slap a makefile on top of the go commands, and then you have to read the makefile to figure out what targets you need to run, and if it gets a little complicated then they start calling out to shell scripts and things like that. ech.


To be fair, I do think Gradle is more complicated than Maven or Cargo.


Very interesting. The bulk of my career has been writing Kotlin/Java, and after a year of Go I am _still_ really struggling. Nice to know I'm not alone!

I see the benefits (I've seen some absolute Java EE monstrosities that would be impossible to build in Go), but why am I getting paged at 3am because someone forgot to set a field on a struct? Similarly, all the codegen to work around the lack of inheritance - who cares how fast the compiler is if you've got to do 30s of codegen on top?


> but why am I getting paged at 3am because someone forgot to set a field on a struct

https://golangci-lint.run/usage/linters/#exhaustruct
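For illustration, a minimal sketch (the struct and field names are made up) of the situation that linter guards against: a struct literal that silently leaves a field at its zero value.

```go
package main

import (
	"fmt"
	"time"
)

// Hypothetical config struct for the "forgot to set a field" scenario.
type AlertConfig struct {
	Endpoint   string
	Timeout    time.Duration
	PageOnCall bool
}

func main() {
	// exhaustruct would flag this literal: PageOnCall is never set,
	// so it silently defaults to false and nobody gets paged.
	cfg := AlertConfig{
		Endpoint: "https://alerts.example.com",
		Timeout:  5 * time.Second,
	}
	fmt.Printf("%+v\n", cfg)
}
```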


This is very odd.

Java is my "main" language, but I do enjoy Go. I think it is very ergonomic, and it shines in some contexts where Java is lackluster.

The error handling in Go is kinda meh, but hardly a dealbreaker.


> Kitchen sink stdlib, Exceptions, Verbose naming, Complex build system, @Deprecated...

Now do drawbacks.


Every time I see verbose naming I can't help but think someone got possessed by Dr. Heinz Doofenshmirtz's spirit.

- Behold! The LongRangeAtomicStructureCompactinatingReconfigurator!

- (rolling eyes) So a ShrinkRay.

- No, no, it is named LongRangeAtomicStructureCompactinatingReconfigurator to improve readability and communication.

- (rolling eyes even more) Sure.


That's a culture issue, not a language issue. And no, the language doesn't attract this type of people, but rather the language is so good that it is prevalent in domains that hire those people.


A distinction that belongs in a dictionary perhaps. To a programmer the ecosystem and the language are one and the same. One cannot rewrite every piece of code out there to suit their fancy.


Java has Spring/Java EE, what people typically call Enterprise JavaBeans, and it has Vert.x, Quarkus, and Micronaut, which are the complete opposite.


That's not good: it means the language is effectively two languages now, and you do not know what to expect when you open a supposedly "Java" codebase.

If you've known Java for years from the Spring POV and the codebase is written in the "modern" style, you're lost in the woods now.

Worse yet, choosing between the two styles is now a decision that every single project and company needs to make. Endless, pointless bikeshedding akin to which code formatting is to be used or Maven vs Gradle.

Such decisions should be made higher up, at the language design/culture level. It should be obvious which options are the correct ones by just looking at the feature set of the language, the ecosystem of code already written in it, the style guides, etc. If changes are needed, the few language creators should bless a single way forward and push for the ecosystem to all move to it instead of deferring the decision to the millions of users of the language.


Sure, Dr, sure.


I'd rather have this than a variable called `l`.


Are you sure there is such an entity as "the JVM ecosystem"? What is the common thread running through Kotlin Android jockeys, Clojure scientists, and Enterprise Fizzbuzz Java 11 types?


Yes, who needs generics and a proper error handling system.

When you can have Go where the boilerplate is so bad you literally have to rely on generating source code files in order to maintain a sane level of productivity.

I use idiomatic Go every day and it's like being back in the 90s.


In our entire codebase, Go's generics have been used, like, 3 times, even though they were introduced almost 3 years ago. It's unsurprising: generics are most useful for writing your own custom containers, and generally, most projects don't need their own custom containers. It felt somewhat anticlimactic, considering all the anticipation.


You often don’t need custom containers, but you need containers! I shouldn’t have to write my own custom method to map or for each over a list of items, etc.
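With generics that helper is a few lines you write once; a minimal sketch (the function name and example types are just illustrative):

```go
package main

import "fmt"

// Map applies f to every element of in and returns the results.
// Before Go 1.18 you needed one copy of this per type pair, or interface{}.
func Map[T, U any](in []T, f func(T) U) []U {
	out := make([]U, 0, len(in))
	for _, v := range in {
		out = append(out, f(v))
	}
	return out
}

func main() {
	lengths := Map([]string{"go", "java", "kotlin"}, func(s string) int { return len(s) })
	fmt.Println(lengths) // [2 4 6]
}
```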


You use it every day, yet have not figured out that it has had generics for almost 3 years?


Its generics aren't as advanced as Java's, or any other language's with generics for that matter. Plus, many libraries are only now adding support for generics, so there's a fragmented ecosystem of code from before and after generics were introduced.


This. It's so frustrating that sync.Map isn't generic and reusing the builtin map API is impossible.
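About the best you can do today is a thin generic wrapper; a minimal sketch (the type name is made up), assuming you only need Store/Load:

```go
package main

import (
	"fmt"
	"sync"
)

// TypedMap restores compile-time types on top of sync.Map, whose own API
// still traffics in any/interface{} for both keys and values.
type TypedMap[K comparable, V any] struct {
	m sync.Map
}

func (t *TypedMap[K, V]) Store(k K, v V) { t.m.Store(k, v) }

func (t *TypedMap[K, V]) Load(k K) (V, bool) {
	v, ok := t.m.Load(k)
	if !ok {
		var zero V
		return zero, false
	}
	return v.(V), true
}

func main() {
	var users TypedMap[int, string]
	users.Store(1, "alice")
	name, ok := users.Load(1)
	fmt.Println(name, ok) // alice true
}
```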


It's actually a good question who needs generics. They make code unreadable and were never really useful in commercial products.


Try writing Go without generics. No typed maps, arrays, slices, channels or function types. Only `interface{}`. Go always had generics, you just weren’t allowed to write your own until recently.
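For anyone who missed that era, a rough sketch of what a user-written container looked like before type parameters (the Stack type is just an example):

```go
package main

import "fmt"

// Stack is the classic pre-generics pattern: store interface{} and
// push the type checking onto every caller at runtime.
type Stack struct{ items []interface{} }

func (s *Stack) Push(v interface{}) { s.items = append(s.items, v) }

func (s *Stack) Pop() interface{} {
	n := len(s.items) - 1
	v := s.items[n]
	s.items = s.items[:n]
	return v
}

func main() {
	var s Stack
	s.Push(42)
	n := s.Pop().(int) // type assertion required; a wrong type panics at runtime
	fmt.Println(n + 1) // 43
}
```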


Kotlin is interesting as a middle ground, but I still find it much less productive than Go for most tasks, and unsuitable for tasks where you'd reach for Rust.

In practice, Kotlin is extremely complicated, and you end up spending time being clever. There are 1000 ways to do things. Operator overloading. Proxies. Properties. Companion objects. Exceptions AND result types...

The build system (assuming you use Gradle) is tantamount to torture for anyone used to "go build".

The coroutines APIs feel simultaneously more complicated and yet more restrictive than Goroutines. More structured but less flexible and more effort to use.

Access control feels awkward. There's no way to make a type package-private -- it's file-private or available to the whole module. This leads to either a larger API surface than desired, or the inability to break up complexity into multiple files.

Kotlin/Native and Kotlin/JVM really are two different beasts too.

Kotlin/JVM is mature, but then you are running on the JVM, so that cuts out a whole class of use cases you might bust out Rust for.

There is a very weak ecosystem for Kotlin/Native, and it's poorly documented. There are some scary bugs in the bug tracker.

You can't publish source-only libraries for Kotlin/Native either, so you need a complex CI setup to build binaries for every OS and arch under the sun. Or just don't publish libraries at all, which probably feeds into the weak ecosystem...


Don't forget that it's made by someone trying to sell you an IDE!


There's a lot of valid critique of Go, but I've never found anything like it that lets me build lasting, high quality, bug free software.

Explicit error handling means that I actually think about and handle errors. No surprises in production. No random exceptions throwing deep inside some dependency. I've been running some Go services for years with no crashes.

The brain-dead simplicity means I am not tempted to waste time being clever.

The tooling means my code is trivial to build, even years later on a new machine.


It's not clear to me how this is better than Gradle. And I hate Gradle.

At first glance, Mill looks like it has many of the pitfalls of Gradle:

- Plugins: Creates the temptation to rely on plugins for everything, and suddenly you're in plugin dependency hell with no idea how anything actually works.

- Build scripts written in a DSL on top of a new language: Now I have to learn Scala and your DSL. I don't want to do either!

- Build scripts written in a language that can be used for code too: Versioning hell when the compiler for the build system needs to be a different version to the compiler for the actual project code. See: Gradle and Kotlin.


Author here! The issue here is that builds, and many other "just configuration" scenarios, are fundamentally complex. So many projects that start off as "just XML" or "just YAML" end up implementing their own half-baked programming language interpreter inside of their XML/YAML/JSON/whatever.

Examples:

* Github Actions Config Expressions https://docs.github.com/en/actions/writing-workflows/choosin...

* CloudFormation Functions https://docs.aws.amazon.com/AWSCloudFormation/latest/UserGui...

* Helm Chart Templates https://helm.sh/docs/chart_best_practices/templates/

There is a reason why Bazel went with Python/Starlark, and why Pulumi and CDK and friends are getting popular. Fundamentally, many of these use cases look surprisingly like programming languages: maybe not immediately, but certainly after you've dug in a bit. And having a properly designed purpose-built programming language (e.g. Starlark) or a flexible general purpose language (e.g. TypeScript, Kotlin, Scala) does turn out to be the least-bad option.


I agree that Bazel did pretty well with Starlark, but the reason that’s sane is because it’s not Python, though the syntax is similar. It avoids getting into trouble with people using Python language features that would result in upgrade hell and annoy other programmers who aren’t Python experts.

(Though, debugging complicated Starlark code can still be difficult.)

So why not use Starlark? :)


Starlark is great, but so is Scala. People underestimate how big the ecosystem is even for a niche language like Scala:

- Global publishing and distribution infrastructure

- IDE support in multiple IDEs

- A huge ecosystem of third party packages, both Scala and Java

- An excellent Scala standard library and Java standard library

- Good performance.

- Tooling! Jprofiler is great. Others use Yourkit or JFR.

- Mill leans heavily on Scala's FP/OO hybrid style with types, while Starlark provides none of that and is purely untyped procedural code


Just wanted to mention that there are much better config languages than Starlark by now: CUE, Pkl, etc.


Why do you call these other languages "better"? They're different, but I'm not sure why either of the ones you mentioned would be better for this use case.


Modern config languages offer strong validation and advanced IDE support, which is essential for a great user experience.

https://pkl-lang.org/intellij/current/highlights.html


I was going to mention Cue, but I’ve only read about it, not used it, and couldn’t actually say whether it’s better.


I'm afraid that no current config language is an obvious fit for Mill. That's because Mill is fully reactive and doesn't distinguish between build configuration and execution by design.


> end up implementing their own half-baked programming language interpreter inside of their XML

Greenspun's tenth rule.


There is basically no DSL. You simply write what a build needs, e.g. you write a function `collectCFiles()` that collects every file with extension `.c`. You then issue a command like `gcc ${collectCFiles()}`. And pretty much that's it - you can use shell commands, or do anything in Scala (or Java or whathaveyou). You simply have your functions return either a value (e.g. a checksum) or a location, which is the only Mill-specific logic. So your compileC() function will simply invoke your collectCFiles() function, and this invocation implicitly creates a dependency between these tasks. You have written literally the simplest possible description of your build logic. But in the background Mill will cache your functions' inputs and outputs, and re-run and parallelize only the tasks that need it, which is what a build tool should do.

The implementation may not be the theoretical best, but I think the idea is pretty much perfect for a build system.


The first advantage the homepage lists is:

> Mill can build the same Java codebase 5-10x faster than Maven, or 2-4x faster than Gradle

Speed per se can be a good selling point (having to wait for slow builds is really annoying).

I can't really comment on anything else though as I just stumbled upon it here in HN ;)


The goal should be more like 50x faster than Gradle. Gradle is ludicrously slow (at least in every single Gradle project I’ve had to work with).


The first invocation may be. Subsequent builds are very fast, unless someone decided to write random bullshit into the build scripts that executes at config time, making the config process impure.


I’m mostly thinking of Android projects. If I have time I’ll try some speed tests with a new basic project. But I don’t think I’ve even once done something in Android Studio and thought “huh, that was surprisingly fast”. Maybe some of the hot reloading stuff is okay (when it actually works).


Are we talking about Maven with its cache extension?

https://github.com/apache/maven-build-cache-extension

Because in my experience, this makes Maven very, very fast.


AFAIR the author made a quite unfair comparison: a simple compile vs a full Maven build (which executes a lot of additional stuff).


For Scala (of which this is probably the main target) Maven builds are especially slow. I would not be surprised if that was his early focus.


Mill's early goal was to be a saner sbt, incidentally also fixing the parts of sbt that are/were unreasonably slow due to questionable design decisions.

Maven has never been relevant to the Scala ecosystem given most of the community has pretty much moved straight from ant to sbt. Only a few Spark related projects stubbornly use Maven, which is a major pain given the lack of cross-building abilities. Slow dependency resolution and inefficient use of Zinc merely add insult to injury.


Yeah... that's my experience with Scala all around - it's abysmally slow, especially if you use any sort of "metaprogramming"... (one of the reasons I stay clear of the language)


I'm working on a project that encompasses both JVM (Gradle, Kotlin) and Golang.

My hot take: JVM build tools, especially Gradle, are a soup of unnecessary complexity, and people working in that ecosystem have Stockholm Syndrome.

In Golang, I spend about 99% of my time dealing with code.

In JVM land, I'm spending 30% just dealing with the build system. It's actually insane, and the community at large thinks this is normal. The amount of time it takes to publish a multi-platform Kotlin library for the first time can be measured in days. I published my first Golang library in minutes, by comparison.


You speak from my soul! I've been in the Java world for a really long time now, and I've been wondering for years why the build tools need to be so complicated and annoying. I know Go, Node.js, and a bit of Rust, and all have more pleasant, easier-to-use build tools! The JVM (or GraalVM) as an ecosystem is just fine and probably one of the best, but build tools might be its Achilles' heel. Maybe it would be a good idea for Oracle to invest in that area...


My experience of JS projects is that build tools are frequently ad-hoc. That is, there simply isn't a general build tool at all, but just a large pile of scripts calling under-documented libraries. Parallelization, caching and quite often even portability are just missing.

To justify this statement consider this blog post I wrote a while ago about porting GitHub Desktop (an Electron app) from its prior build/deployment system to Conveyor [1]. Conveyor is a tool for shipping desktop apps and is implemented as a single-purpose build system. The relevant part is this commit:

https://github.com/hydraulic-software/github-desktop/commit/...

The amount of code that can be deleted is huge! Some of it is in-process code that isn't needed with Conveyor (setting up Squirrel etc), but a lot is just shell scripts that happen to be written in JS. Replacing that with a real build system not only simplifies the codebase but means the build steps are fully parallelized, fully incremental, easier to debug, portable (the build can run on any platform), progress is reported in a uniform way and so on.

So whilst the JS ecosystem's approach to build tools may be "simple" in some way, in the sense that there's no dominant build tool like Maven or Gradle, that simplicity does cost you in other ways.

[1] https://hydraulic.dev/blog/8-packaging-electron-apps.html (Disclosure: Conveyor is a commercial product made by my company)


I'm in JVM land. I spend very little time dealing with the build system. It is actually insane how well it works.

Also, why does it matter how long it takes to publish a library for the first time? It sounds like a non-issue to me. I have written dozens of libraries and published them to a local artifactory instance because it simply doesn't matter if your company specific code is accessible to the world or not.


One note from having worked with both that I don't see mentioned: Golang dependencies are sources you basically pull and compile with your own code. In JVM-land dependencies are precompiled packages (jars). This adds one little step.


...or a big step, if cross-compiling is required (e.g. Kotlin Multiplatform)

I'm surprised there is no source-only dependency solution for JVM -- it'd solve this issue. Pull down the source and build on the fly. Perhaps there is and I'm unaware?


I'm afraid Java/Scala/Kotlin compilers are too slow to make that convenient. Even now, building pure Java projects can take minutes when compiling just ~300k lines. What if they had to compile millions of lines from all the dependencies?


The actual compilation step is 100% not the bottleneck - it can go as fast as 10k-50k lines per second! (According to the Mill benchmark, but that’s the Mill-independent part).

Comparatively, Go does “only” 16k lines per second based on some HN comments.


But you're likely comparing on different hardware. Go compiling only 16k lines per second is hard for me to believe. Maybe they meant on a single CPU core. Rustc compiles over 50k lines per second on my MBP in debug mode, and Go must definitely be faster, as everyone knows Rust is very slow to compile.

But anyway, you may be right. I just ran mvn install for the second time with no source change on my current project. It took 57 seconds.


The Java metric is also from a single core. You are probably right that it should only be taken as a rough ballpark, but Java is definitely in the same ballpark as Go in compile speed.


What issue would it solve? The fact that you can build a jar in any OS and then just use that anywhere else is actually a huge benefit of using Java, as you don't force everyone to re-compile your library source code.


Well, since the builds tend to be monstrously complicated for some reason, and there's no standard build tool, maybe source-based distribution is more impossible than possible. Or it would be like JavaScript, where you still need a build and publish step to turn "developer Java / other languages" into "vanilla source distributable Java".


> The amount of time it takes to publish a multi-platform Kotlin library for the first time can be measured in days. I published my first Golang library in minutes, by comparison.

It's a bit of an apples-and-oranges comparison: publishing a JVM-only Kotlin library is quite easy; it's the multiplatform part that takes time.


Last time I published a JVM library I had to Open A Jira Ticket to request the rights to publish a package on the main package registry. Then I had to verify I owned the DNS name prefix for my package by fiddling with the DNS records at my hosting provider. It took days just to get authorized! Not including the time needed to, like, figure out how to make JARs happen.

In go: `git push` to a public repo

In js: `npm publish` after making an NPM account


Merely as a "for your consideration," GitLab ships with its own Maven repository (along with npm, docker, Nuget, and a bazillion others)[1] so you have total sovereignty over the publishing auth story. I can appreciate going with Central can be a DX win if you're distributing a library, since having folks add <repository> lines to their pom.xml or settings.xml is a hassle, but at least you get to decide which hassle you prefer :-D

In fairness, GitHub also finally got on board the train, too: https://docs.github.com/en/actions/use-cases-and-examples/pu...

1: https://docs.gitlab.com/ee/user/packages/maven_repository/


Sometimes a barrier to entry is good. For example, both npm and cargo struggle with package name squatting and malicious packages that are misspellings of common packages.


This isn't an issue in the Go ecosystem, because the package name is the GitHub repo.

I don't think a high barrier to entry is overall good; in fact, I think it encourages larger, more complex packages to justify the maintenance burden.


Pedantically, that's only one way to resolve a go package - and for sure the more obvious[1] - but the most famous one I know of is gopkg.in/yaml.whatever that uses a <meta> tag to redirect to its actual GH repo, which only the deepest golang ninja would know how to use: compare view-source:https://gopkg.in/yaml.v3 with view-source:https://gopkg.in/yaml.v3?go-get=1

1: err, modulo that go.mod stuff that secretly adds a version slug to an otherwise normal github URL -- I'm looking at you, Pulumi: https://github.com/pulumi/pulumi/blob/v3.137.0/sdk/go.mod#L1


In rust: `cargo publish` after making an account on crates.io


I've been working on Java-based systems for about 20 years now, and I fully relate to that. Same experience.

This is so annoying that I prefer to use Rust over Java even in areas where things like better performance or better type system don't matter. But being able to start a fresh project with one `cargo init` and a few `cargo add` invocations to add any dependencies... well, this is priceless.


Interesting that you ended up going all the way to Rust land instead of just using one of the multiple tools that have been created to help with this, like:

* Spring Boot (it has a UI to create projects where you pick Java version, DB, build tool, some libs etc): https://spring.io/guides/gs/spring-boot

* JHipster - the nuclear option, pick what you want a la carte: https://www.jhipster.tech/

* JBang - a cute CLI for this: https://www.jbang.dev/

* Maven Archetypes - the old fashioned way (existed before "create-app" kind of tools appeared): https://maven.apache.org/guides/introduction/introduction-to...

And most IDEs also have "new project" wizards.


Are you aware of Maven Archetypes[1]? I believe they were the "cookiecutter" before cookiecutter existed, although I am 10000000% on-board that their discovery story is total garbage :-(

1: https://maven.apache.org/archetype/index.html and https://maven.apache.org/archetype/maven-archetype-plugin/us...


But I don’t want to copy a full project with prepopulated list of dependencies chosen by someone else. I want to start small and add dependencies I need.

It’s like LEGO vs Playmobil. I want LEGO. ;)


How does that differ from `gradle init`?


Init and then what? The story of discovering and adding dependencies is still much worse. Nothing like cargo add/remove or crates.io, where I can quickly search dependencies with their descriptions and standardized links to repos and documentation. Actually, even Python is nicer in this regard with PyPI and pip install, even though virtual envs are a pain.



Interesting. I spend nearly zero time with my maven setup and almost all the time is in coding. I am genuinely curious to know where that 30% time goes? Is it waiting for builds?


> the community at large thinks this is normal

Half are ignorant. Other half are like me and just stuck with no options.

But the tooling ecosystem on the JVM truly is horrific.


I think there are a lot of "JVM Lifers" who are so deep in the ecosystem they are unaware how much better things can be.

Anecdote: I wanted to publish a ~100LoC multiplatform Kotlin library -- just some bindings. I publish these sorts of things for Go with just a "git push".

Steps were:

1. Spend a few hours trying to understand Maven Central/Sonatype, register and get "verified". They're in the middle of some kind of transition so everything is deprecated or unstable.

2. Figure out signing, because of course published packages must be signed. Now I have a secret to keep track of too, great.

3. Discover that there is no stable Gradle plugin for publishing to the "new" Maven Central, it's coming soon... Choose one of the handful of community plugins with a handful of stars on GitHub.

4. Spend a few hours troubleshooting a "Gradle build failed to end" error, which ended up being due to signing not finding a signing key. The 3rd party plugin didn't handle errors properly, and a bug in Gradle meant that my secret wasn't picked up from local.properties.

5. Eventually discover that because Kotlin Multiplatform can't be cross-compiled, there is no way to actually publish a multiplatform library without spinning up a bunch of CI runners. And you can't just publish code -- JVM packages have to contain compiled artifacts.

6. Realise this now involves maintaining GitHub Actions and Gradle, which is an ongoing cost.

7. Give up.

The harm that this kind of complexity must be causing to the ecosystem is immeasurable.


Although a lot of it is generic badness, Kotlin Multiplatform isn't the JVM ecosystem. You don't need CI runners to publish a JVM library. The reason it comes up with Multiplatform is because Kotlin defines "Multiplatform" to mean platforms like JavaScript, or their own LLVM based compiler toolchain that bypasses JVMs entirely.


Very true, although it definitely feels like part of the ecosystem since it uses the same project structure, build tooling etc.


I’d just like to add, NPM gets a lot of flak (mostly deservedly) but it too is still vastly easier than anything in the JVM ecosystem.

Even with all the headaches around modules versus CJS, and JS versus TypeScript, NPM is a lot easier than Gradle. Notably, you have a choice of alternate tools (eg pnpm, yarn, bun) that interoperate pretty well.

I guess my point is, Gradle and Maven are specifically and outstandingly bad.


If you think gradle and maven are bad, you should try Mill! There is more to build tooling than gradle or maven, the field has evolved significantly since those tools launched 15-20 years ago, and Mill tries to do things better


I must be missing something here. Don't the tools you mentioned do a lot less than Gradle? Gradle knows test depends on compile, which depends on code generation (say protobuf) - with caching and change detection. Compare that to chaining up the commands in the `scripts` section of `package.json`.

EDIT: another comment making this point: https://news.ycombinator.com/item?id=41969847


I could be convinced if those features of Gradle actually worked well, or even worked properly, like dependency management does in e.g. Bazel.

In practice, Gradle really seems to fall down on the basic task of just being able to build stuff in the first place. It feels like you’re constantly fighting version hell just to find a Gradle version and plugins that work together, let alone your actual code dependencies.

And if you actually do need to do something slightly more complicated, like code generation, it’s very difficult to work with and the docs are really bad.


I have no complaints for the well trodden path (e.g. https://github.com/google/protobuf-gradle-plugin). I have also written some custom build steps, and indeed the docs aren't very helpful - but the final implementation is quite simple.


Npm also gets a lot of flak for the low bar it sets for introducing malicious code by impersonating an inactive maintainer or presenting yourself as a successor. The friction, the secrets to keep - they are there for a reason.


> I published my first Golang library in minutes, by comparison.

For what platform(s)?

Or did you really just push the source code?


That's the trick. You publish the source code. And it's still faster to build all dependencies from source than maven / gradle manages to resolve and download the binary dependencies ;)


That's true, Maven is ridiculously slow to resolve dependencies, while Gradle only really works with reasonable speed if you allow it to hog your system with a daemon.

I myself wrote a dependency resolver that matches Maven in functionality, and even a large project that uses Spring Boot and its dozens of dependencies can be resolved in a couple of seconds. About 10x faster than Maven or something like that. If you look at Maven's source code you'll see why. It's the worst kind of Java Enterprise overengineering you can imagine, complete with its own dependency injection framework, everything is pluggable (for no reason, really, do you really need to replace HTTPS for your protocols?? In Plexus you can), to the point that all the de-coupling results in lots of things duplicating functionality everywhere. I am not sure but I would bet Maven parses your POM at least 10 times to do anything due to the de-coupled nature of it.


Maven is actually pretty far behind in terms of JVM dependency resolution. Mill uses Coursier, same as my last company did, and when my last company switched from Maven to Coursier we saw a two-order-of-magnitude speedup, with resolution commands that used to take 30 minutes finishing in a few seconds and giving the exact same artifacts and versions.

I actually have no idea why these other resolvers are so slow, or why Coursier is so fast, but this slowness is very much a "maven" or "gradle" thing rather than a "jvm" thing. And Mill using coursier does significantly better!


Relatedly, Backblaze B2 is routinely blocked by corporate IT (and even Chrome's anti-malware list from time to time) for similar "bad neighbour" reasons.

It's bad enough that you basically have to stick a reverse proxy in front of it to reliably serve content at scale.


This also affects communication apps, like email clients.

It's a real bummer for the user experience, honestly. Yes, people can say "share all contacts", but the user experience is confusing, and many people won't.

This means that all 3rd party mail and messaging apps will be lacking contact information -- whereas of course Apple's own will have it by default.

Again, it's shameful API design by Apple, because they don't have to use their own APIs/permission systems.

This could be mitigated, by the way, by having a rate-limited "lookup" API where an app can say "Can I have the contact for bob@example.com, if it exists?". Most legit apps don't need a copy of your entire address book, but they may need to query it occasionally.


Another example of Apple further entrenching its monopoly -- Like other permission prompts, I bet Apple exclude their own apps from asking for this.

I bet iMessage doesn't ask you if it's allowed to access your contacts, in the same way that Photos doesn't ask you which photos you want Apple to know about. That would be an unacceptable user experience for Apple, but acceptable for 3rd party apps.

This seems to be a constantly overlooked part of the permissions discussion. I'm all in favor of Apple changing the rules on their platform to whatever they like, as long as their own apps have to play by the same rules.

Instead, they use permissions to advantage their apps over the competition.


No users think the Apple device with the Apple Contacts app is or should be hiding Apple Contacts app contacts from Apple Mail or Apple Messages app. If you don't want your contacts in the Apple suite, don't put them in the Apple suite.

Similarly, if you use Microsoft Contacts, you assume you see those in Microsoft Outlook and Microsoft Teams, and their devices using their OS.

Similarly for Google's suite, and their devices using their OS.

There are other Contacts apps, such as Clay (from clay.earth) that have other sets of contacts and can sync with still other contacts stores such as, say, LinkedIn. Those aren't visible to Messages without an affirmative action, so Apple is not advantaging itself.

If you're arguing that application suites aren't allowed, any number of users are going to be very annoyed with you.

If you're arguing that nobody can make both hardware and a productivity assistant suite combined, you're either saying the PDA doesn't have a right to exist, or that forcing the PDA to be open to other apps in turn means it isn't allowed to be an integrated suite now that it's open -- and, I guess, that Microsoft can't make Windows or Surface unless they spin off Office or damage what they make till none of it talks to each other seamlessly?

This entire line of thinking, that nobody's allowed to offer a seamless experience, seems like overregulation of what consumers are allowed to choose and buy.


The line of thinking here is that Apple should play fair. The power of defaults is very strong.

Most iOS users aren't going to be thinking of "Contacts" as "Apple Contacts". It's just the contacts on their phone. It's their contacts, not Apple's.

I think Apple should absolutely have to use the same permission prompts as 3rd party developers -- because this aligns the incentives to design a great user experience.

Instead, they have no incentive to design these prompts and APIs well -- in fact, a disincentive.


Rephrased: Users are not allowed to choose an integrated PDA.

And, still not even if it lets them make a different choice later.

Another implication: All first party apps must be interchangeable. I'm curious -- must third party apps also be?

And then, who decides what lowest common denominator functionality is, and what's OK to offer that others don't?

You've taken that choice away from the market.


The rules of the platform should be the same for all users of the platform. You can't play the game and be the referee.

I don't see how this prevents an integrated user experience. It's orthogonal.

If the user experience for permission management is well designed, and the APIs are thoughtful, this shouldn't be a problem.

It's a problem in iOS today because the user experience and APIs are an afterthought, and there's a disincentive for making them good.


> No users think the Apple device with the Apple Contacts app is or should be hiding Apple Contacts app contacts from Apple Mail or Apple Messages app.

I am a user and you are wrong.

I absolutely want every app, regardless of vendor, to be sandboxed from each other. Without explicit permission, I don't want Mail or Messages to know that I have a contact card for the peer.


Having worked with Swing recently, I worry that it will not work well in the not-too-distant future, because it feels frozen in time at about 2005.

There are a lot of assumptions baked in that aren't holding up today.

For example, high density and multi monitor aren't well supported. There's a bunch of stuff hard-coded in the JDK that no longer makes sense and you have to hack around.


I recommend using the JetBrains JRE, they have fixed lots of these sorts of issues, as well as greatly improved hot class reloading for development.


I haven't worked with FX or Swing lately but I could have sworn they delivered hidpi support. Maybe in this JEP? https://openjdk.org/jeps/263


I have been looking for something like this in Go for a while. I think there's a real opportunity for Go to provide a great developer experience for cross platform UI due to how simple the build process is. Speaking from experience, half the pain of cross platform development is managing build complexity, which Go basically eliminates.

I'm curious how you'll end up solving for cross-platform layout when native controls have different intrinsic sizes per platform?

This is something I haven't seen solved super well in cross platform toolkits.

Wishing you luck though.

