This is really exciting!!! I was a bit disappointed that the right-pad will be out only in 2017. I am looking forward to that release because there is a high demand for it now.
What kind of load balancing is being used on the back-end?
I called leftpad(str, ch, len) with the length I needed and noticed that it is not very scalable because it is blocking.
A better approach I would recommend to those using it is to call the API in a for loop. In my tests, it had performance very close to what I see in C or assembly.
I was a bit turned off that the free version can only handle strings up to 1024 in length. I know you need to make some money, but it is a big turn-off for a lot of my projects.
Edit: I finally signed up for it but still noticed that I am only allowed to use 1024. I called your customer support line and they said I was calling the API from multiple IP addresses and for that I need an enterprise license. Please help me with this issue; it is crucial at this point, as my project is at a complete stop because of this.
Best practice for performance for large left-pad jobs is to call the service recursively using mapreduce. Remember that left-pad(str,pad,n) is equal to left-pad(left-pad(str,pad,n/2),left-pad("",pad,n/2),n). This should run in logarithmic time and is highly parallelizable.
If you don't like the "" magic string in there you could replace it with a call to left-pad(null, null,0).
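Playing along with the joke: the identity above doesn't quite type-check (the second argument wants a pad character, not another padded string), but the divide-and-conquer spirit survives. A minimal local sketch, with made-up function names and no actual left-pad.io round trips:
function makePad(ch, k) {
  // build the padding in two "parallel" halves -- logarithmic recursion depth
  if (k <= 0) return "";
  if (k === 1) return ch;
  var half = Math.floor(k / 2);
  return makePad(ch, k - half) + makePad(ch, half);
}
function leftPad(str, ch, len) {
  str = String(str);
  return makePad(ch, len - str.length) + str;
}
leftPad("foo", " ", 7); // => "    foo"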
{"errorMessage":"len exceeds 1007 characters. Contact a left-pad.io sales engineer for an enterprise license","errorType":"Error","stackTrace":["exports.handler (/var/task/index.js:33:22)"]}
wow. they even have an enterprise sales engineer! :)
Can we also get a containerized on-premise version of this service? We'd like the ability to spin up a couple of these behind a load balancer to let us achieve web-scale.
As a very sarcastic person, I highly approve of this. This absolutely reflects my opinion of all this mess.
Thank you for making this site so that I don't have to write an opinion piece like everybody else seems to have to. Instead, if asked about the issue, I can just point them at this site.
Yes. This isn't constructive, but this mess had so many layers that I really can't point to a single thing and offer a simple fix as a solution.
As such I'm totally up for just having a laugh, especially when it really isn't being nasty against specific people but just laughing about the whole situation.
I don't understand why this community has to have a weekly cycle of bashing different programming communities. Every week there's a new drama thread bashing Java devs, Go devs, Javascript devs, etc. The thing that I come to this community for every week is to read about new developments in our industry; if you don't come here for that, then what are you coming here for?
And wasn't it just a few months ago that people were praising the innovation of Urbit for having a 'global functional namespace'? But because it's popular to hate on javascript devs for applying -- sorry, I forgot this was javascript-bashing week -- for reinventing concepts from other areas in computer science and software engineering, the HN community has to start hating on another programming community's work.
That said this is a pretty funny satirical page, apologies to the author for venting at the HN community.
> I don't understand why this community has to have a weekly cycle of bashing different programming communities.
It's funny if done in good taste. You should be able to take a joke with good humour, even if it's at your own expense, and this is a pretty good joke.
I don't think the general commentary is in good taste. To me the difference between a good faith joke and a bad faith joke is whether you actually like the people you are making fun of. I don't think that's the case here. My sense is that the people laughing actually think NPM and JavaScript suck. The tone I am getting is "haha, you are bad and I am a better programmer than your whole community".
Is that not how you perceive it? Am I just imagining contempt where there is none?
It's like, I can joke about you being an alcoholic if I think you're actually just a fun heavy drinker. If I think you're actually a violent alcoholic, it's not funny for me to joke about that.
I don't think people are making fun of other people here; we are laughing at a particular software engineering pathology, that's all. There are important lessons in the story of the left-pad library to be learned, and not only by JavaScript programmers; it's pretty universal. I remember someone asking about an API for min and max for Java (not JavaScript) on StackOverflow.
If you want to actually know the "why", I don't think any one person can answer for certain but I have noticed similar trends on HN and attribute it to the culture necessary among start-ups: brutally challenging assumptions and discovering failure points.
I think that kind of culture is necessary for quickly shutting down projects that have a high likelihood of failure before too much time is invested in them. Unfortunately, that culture also tends to leak into many other aspects of the community.
> brutally challenging assumptions and discovering failure points
Nicely phrased. I'd apply this to engineers in general too. People often speak of them (and me) as difficult, when this is really what is needed of people who make things happen. Especially on critical infrastructure. You can't really wing the details of the integrity of the bridge you are building.
It's sad, really. I've been coding for roughly 11 years, since I was 14 or so, and have always noticed the constant pissing matches between programmer groups. It's easy to fall into them, because when you do something a certain way for so long, another way looks so foreign and terrible. I tried to make an effort to always stay out of those arguments, but absolutely fell in here and there.
Lately I've stopped coding for work and moved into a PM++ position as I build up a small agency on my own, and it has been refreshing. I build little meteor apps for fun in my free time (omg meter doesn't scale noob lol) and can keep my blinders on to what the outside world thinks, and my devs can continue to do what they are comfortable with.
I can't speak for anybody else, but you highlighted the origin of my general annoyance.
Applying cross-domain concepts to improve your local domain is awesome.
Applying somebody else's invention from a different domain to your local domain and then acting like you invented a new thing is... annoying.
If various forms of "reinvention" didn't seem to come with a Twitter cult and a snazzy *.io website screaming, "Look at me! Look at me! See how awesome this totally new thing is that I'm pretending to have pioneered!", there'd be no inclination for anybody to respond with, "Um. You didn't invent anything. Please calm down."
If you throw yourself a parade every day celebrating the genius of having figured out how to wake up in the morning to attend your parade, then inevitably somebody is going to rain on it.
Each community has its own quirks and extremes as seen by everyone else.
Without the bashing, they would become even more extreme. At least after this a lot of devs will think twice about making "micro-modules".
There are lots of quirky cargo cults creeping into the .NET world, especially since the growth of Nuget. I've seen a few projects with DI/IOC everywhere, to the extent that I expect to see a convention test that fails if the new keyword is used in a project.
And the stubs and mocks. I recently had to fix a beautifully tested piece of code where the developer forgot to implement the actual functionality. The proliferation of async/await and the idea that no self-respecting app can exist without a REST API layer also grate.
Cargo cults seem to move in waves, and the frequency of those waves seems to be ticking up of late.
If that were true java would have started dying a few years ago, javascript frameworks wouldn't pop up every week, and most people would use shell scripts instead of build tools. This is exactly the kind of flame war stuff the guidelines seem to have tried to stop.
Just a data point, but our shop has been moving away from a lot of the more heavy Spring features, and splitting our Java code into simpler microservices. I think the points about overabstraction are getting through slowly.
>Without the bashing, they would become even more extreme. At least after this a lot of devs will think twice about making "micro-modules".
I don't think this "self-regulation" (I called this bashing, the guidelines call this flaming) is going to make people think twice about making micro modules. If they are useful they will be used, I think this community is better than to just bash these communities and call it 'self regulation.'
I'm pretty sure that attitude is called "paternalism". Self-regulation works better when it's self-directed, as opposed to other-directed (as in, someone else doing the regulation).
`I shouldn't do ${foo} because people will make fun of me` is a horrible attitude to foist on anyone else, and anyone who tries to propagate that attitude isn't helping the community at all.
I think there's a fundamental difference between, say [0] (thoughtful, nuanced), and [1] (just plain silly). The initial posts are all worthwhile and interesting, with the latter clearly being satire. It would be even funnier if the OP exposed a more robust string-handling library from ruby or python to the javascript runtime in node by way of a native lib binding. [2] is kind of just elitist, "get off my lawn" yelling, which misses the point fundamentally in this problem.
There are different flavors of comments on these threads. Some are pretty cogent and thoughtful. Others aren't. A "sky is falling augh, isArray is one line, javascript devs are teh worst" misses the underlying problem of javascript being addressed (interpreter fragmentation, not having a reliable way to test if something is an array on older platforms). I'd argue it doesn't contribute at all to the improvement (metacognitive or otherwise) of any javascript dev, but instead serves to harden people's opinions on one side or another of a mythical line.
The difference is whether the goal is to contribute to the discourse or just go for the shallow laughs. It's an important difference, because the former does lead to improvement all around, whereas the latter leads to high-fives and community decay.
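For what it's worth, the cross-engine check that an isArray micro-module wraps is roughly the following (a sketch from memory, not the actual published source):
function isArray(value) {
  // Array.isArray arrived with ES5; older engines need the toString fallback
  return typeof Array.isArray === "function"
    ? Array.isArray(value)
    : Object.prototype.toString.call(value) === "[object Array]";
}
Trivial once you know the trick, but knowing which trick works across old engines and frames is the part the thoughtful posts are actually about.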
There is another aspect that is worth considering: when someone is doing something particularly "stupid", this "stupidity" often isn't seen by the actor. Their self analysis can sometimes correct itself over time, but the usual biases can interfere. The point is it is far too easy to fall into really bad habits while blind to just how bad everyone else sees those habits.
While I'm speaking about a somewhat more general pattern of behavior, a very closely related phenomenon is the "normalization of deviance"[1].
Part of the problem is that without an outside assessment, the actor isn't really working with accurate information. Criticism can be a way of supplying that data. Quality "constructive criticism" is great, but the message is sometimes ignored or avoided. In those cases, a harsher criticism - possibly even a bit insulting - is necessary to convey the severity of the situation and to get the recipient's attention. This is what some of what you call "'get off my lawn' yelling" is about. Sometimes we've seen these mistakes before, and we've tried being patient and nice; when that didn't work, the occasional "get off my lawn" may be appropriate. As others have said, this is how a community self-corrects; when someone is being "stupid", they need to know that. They certainly aren't going to change without this necessary information.
Now I'll certainly agree that there is a very fine line here, and as long as fallible humans are involved mistakes will be made. I've certainly made mistakes in this area in the past. However, avoiding problems is not the answer, as unfixed small problems tend to become larger problems in the future.
I kind of disagree with your central claim that this is going on. But that's my sociologist head rearing up. There is, of course, a "wider HN community", but that's probably the community of people talking about the appropriate use of downvote-as-disagree, or whether we should have a black bar for so-and-so reason. Communities are defined by shared experience, values, and goals.
The community in question here is much more of a javascript dev community, and we can see the culture clash between them and the java, c, python communities throughout all these left-pad threads. So, I maintain: this is an attempt at other-regulation, namely people who are by-and-large not active parts of the javascript dev community making fun of the punchbag-of-the-week.
I'm just cautious any time someone frames an outright attack (and make no mistake, these are attacks) as "for your own good". It rarely is.
I think you should think ahead and change it to "any-pad" or something similar and offer adapters/plugins. What are users of your library to do when some industry-disrupting startup comes in offering left- and mid-pads?
Changing your npm package shouldn't be a problem; just unpublish it and create a new one.
Also, don't forget to check the trademark list before creating the npm package, because "they will come knocking on your doors closing all your accounts".
As a Java developer, I am a bit jealous. When people joke about us we usually only get a link to the Spring documentation of AbstractSingletonProxyFactoryBean (or maybe the enterprise hello world), but no one ever wrote that as a service. Maybe someone can do that? https://abstractsingletonproxyfactorybean.io seems to be available!
Not sure what about that is hard to understand, it's a convenient proxy factory bean superclass for proxy factory beans that create only singletons. Says it right there on the page.
But it's kind of disturbing that the class name is hiding that convenience from us. How are we supposed to know that it is not an inconvenient...?
It all starts getting really weird with Spring Data interfaces: a whole DSL written in camelcased method names, as if Spring deliberately turned a joke into a feature.
I'm late to the left-pad discussion. I thought it was considered a bad practice to depend on an external repo as part of your build process. At my company we use Artifactory to host our maven libs. Even if one were removed from Maven Central, our builds would continue to work fine (in theory).
I'm always surprised at how common this is. You check out some repository, and it ends up having to fetch more stuff from all over creation. Whether it's just git submodules or some more sophisticated dependency manager, it seems like an obviously bad idea.
It's OK if they're pulling from other repositories you own, but requiring external repositories as part of the checkout process seems like an obvious point of failure. There's all sorts of possibilities for malicious activities, and just plain downtime will affect you too. Storage is cheap, so it seems like there ought to be no reason not to pull in external dependencies and save them locally. Dependency managers should support and encourage this rather than defaulting to just referencing some random stuff off in space.
The reason is maintenance. Once you set up yet-another-repo-mirror (after your apt, rpm, nuget, pypi etc mirrors), someone needs to keep it up, back it up, secure it, refresh the packages, etc etc.
It's "cloud culture": rely on some else's maintenance effort and just work on your own problems. Like all things, it has drawbacks.
I don't buy it. You need a place to keep your own repository, and once you have that, keeping copies of other repositories is pretty much free. Keeping up, backing up, and securing N repositories is no more work than doing that for one repository. You can still outsource this, even. There's nothing wrong with using GitHub or Bitbucket or whatever for your repositories, just make sure that you're using your repositories, not relying on other people's.
The only thing that potentially gets harder is refreshing the packages, and I'd argue that's actually a good thing because that's really just saying that packages don't randomly change out from under you. In any case, your package manager should make it easy to refresh them when you want to, just say "go update that dependency to the latest available from the official source" and let it do its thing. If package managers don't offer this, they need to start.
You need a place and a person who knows all this stuff (or likely more than one). That's expensive and hard, as knowledge of all these services has to be correctly handed over across people and over time. That's on top of org-specific processes that are likely more important and more byzantine, which means they have priority. So the day your awesome_pkg_server breaks, nobody remembers how and why it was set up, and the whole thing is scrapped. Back to square one. This is assuming you actually have one or two people with enough free time in the first place...
Whereas standard "fetch package" scripts are trivial, and each individual stack is known well enough by developers working on it every day that there is zero administration and little need for extensive knowledge transfer... At least until left-pad disappears from under your feet; but that's rare enough that the trade-off is worth it overall (or people cannot even imagine it ever happening).
I'm not saying this is how it should be, I'm just describing why a lot of people do what they do.
Maybe the solution is an overall simplification and harmonization of all these services, so that the responsible option and the lazy option won't differ so much. You could have something like a prebuilt "repo server" image that can be downloaded and configured quickly, and which will then auto-cache all requested packages across different services. Cache invalidation would still be tricky but could be triggered selectively from an admin panel. Maybe all this should be packed into CI servers by default...
Terrible, terrible practice. But I think a lot of people got caught out as left pad was usually an indirect dependency, and using npmjs in your build is the quickest way to satisfy your dependency graph.
Hahaha - isn't it hysterical how everyone is using npm for small reusable code pieces! Aren't they morons! How stupid of people to trust their package manager to be consistent and correct and return packages they were expecting.
How stupid of people to reuse small often used functions that only do one thing well.
How does everyone taking the piss intend to protect themselves from this in their OS package manager, or PPM or composer or pip?
It's not javascript devs' fault that the standard library is so piss poor you need these short code snippets, and I've definitely included small 5-10 line packages via npm or other package managers rather than roll my own, because it's likely they have bug fixes I haven't considered. I can also use npm to share these snippets between the many projects I'm building.
* No I wasn't affected by this because I review the packages that I want to include, however the level of smugness here is absolutely ridiculous.
> How stupid of people to reuse small often used functions that only do one thing well.
This is a straw man argument. The reason so many people are criticizing left-pad et al is about the cost of adding a dependency.
> trust their package manager
That's exactly the problem: you're assuming that a package manager can be trusted. The network isn't reliable, packages can change innocently or maliciously, and mistakes happen.
The "stupid" part is pretending these have zero risk, and the part that makes some people angry is when they also have to depend on this zero-risk assumption transitively through another library.
> It's not javascript devs' fault that the standard library is so piss poor
Nobody is claiming it was. However, Javascript devs are responsible when they add critical dependencies on external components. They are also responsible when they publish libraries that extend those critical dependencies to the people that use their library without a very good reason.
Yes, I agree with you - there will always be some level of trust involved in your package manager. I think having a better standard library is probably the first desired solution, followed by npm keeping all old versions of a module around even after unpublishing and also some kind of review process build management system magic as part of your code review process.
It's all very easy to say, of course, and loads of people have been saying that including small pieces of code from your package manager is wrong. I'm saying it's not, and giving reasons why. Trust your package manager or help make it trustworthy, I guess.
Sounds like there's a market opportunity here for a company that does dependency verification. Send them a copy of your npm-shrinkwrap.json file and they ensure every dependency + sub dependency change is legitimate and has no obvious security holes.
There are probably enterprises that would pay good money for this, especially 5+ years from now when these apps are much larger and harder to maintain.
Not to mention, their package manager is governed by a single for-profit company, that could disappear at any time, taking the entire infrastructure with it (breaking everything...)
There's a lot of talk of decentralizing npm - but come on folks, what motivation would npm the company have to do that? It's the same issue as with GitHub (only that, with GitHub at least, if you're using git properly, GitHub disappearing tomorrow should have little impact).
What motivation does Sonatype have for sponsoring Maven? What motivation do dozens of web hosters have with providing a free mirror for Debian or Ubuntu packages? As far as I know, NPM is not a profitable endeavour, so making it distributed would decrease costs, and the company behind it can focus on consultancy or whatever it is they do.
> This is a straw man argument. The reason so many people are criticizing left-pad et al is about the cost of adding a dependency.
The cost of a dependency for a good package manager is zero, and the cost of not having that dependency is non-zero. So the problem is with NPM, not with adding a dependency.
"Zero" isn't a cost of anything -- there's no free lunch.
NPM, like any package manager, pretty much blindly accepts user input. I, as a module maintainer, could happily change every single function in my modules to
function whatever() {
return "WHHEEEEEEE!!!";
}
and check it in. Doing that will result in no fewer problems than just deleting the module entirely, and it's not NPM or any other similar package manager's responsibility to vet changes. Obviously curated package managers like those used by most Linux distributions do carry some responsibility, but even then, there's not ZERO risk.
Except a good package manager will let you specify exactly which version you wish to depend upon, so any changes you make upstream will have no effect on people who protect themselves against automatic version updates.
Seriously, the properties of good package management should be obvious by now. Any design that can break dependents and there's nothing they can do to protect themselves is broken.
But there already exist package managers that let you pin to a specific version like he's saying they should e.g. Maven, so I don't think your link quite applies here.
Npm allows that too -- you just specify an explicit version number, e.g. "1.5.3" instead of "*" or "^1.5.3" etc.
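To make that concrete, a package.json fragment (package names and version numbers here are just placeholders):
{
  "dependencies": {
    "left-pad": "1.5.3",
    "some-lib": "^1.5.3",
    "anything-goes": "*"
  }
}
The first is an exact pin, the caret accepts anything from 1.5.3 up to (but not including) 2.0.0, and the asterisk accepts whatever the registry serves up -- which is where the surprises come from.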
My comment was against this belief that "smarter" tools in general are some sort of panacea.
For example, pinning to a specific version won't help you much if the package is removed altogether (as in this case), and it's even worse if it's replaced afterwards by another (newly registered) package with an incompatible version under the same number. It won't try to overwrite your specific install of course (since the version already matches what you have), but you'll feel the pain when you try to duplicate/deploy etc. a new install with the same package listing -- suddenly the package either won't be there or, in the worst-case scenario, will have been modified.
So, then we opt for ever more features of a "sufficiently smart package manager", e.g. signatures, permanence of anything published etc...
And pinning versions just means you lose automatic updates, which is surely the point of using a package manager in the first place. Otherwise you might as well just download the package and include it verbatim.
No, it's not a "no true Scotsman" fallacy. And the package manager is mostly useful for tracking updates and simplifying the process; having it automatically update is completely optional, and personally, I never do.
Exactly. A notification that new updates are available would be nice, but I don't think automatic updates should be the default, even if "semantic versioning" says the update should be backwards compatible.
There's this "Tone Argument" Fallacy Fallacy. You can't win an argument by complaining that the other's tone is unpleasant. So that's a valid fallacy, but it only applies to a particular argument in a particular circumstance. Certain people on the Internet then make the illogical jump to the conclusion that they are therefore justified to use whatever tone they care to in any argument. (And lo and behold, they always "win" arguments.)
Without knowing much of how npm works, I'm going to assume you have ways of protecting yourself there too. If nothing else, good old "cp -r" can effectively let you pin to a version.
The point is that people use package managers and third party modules installed from them to automate tedious work, and anything automated will have the potential to do something other than what you want. Is that risk high? Not really, but it's not zero either.
Yes, but a sanely designed package manager will at least require that to be a new, versioned release. _Already published_ versions of a package should be immutable.
If you do that, won't you have to increment your version number? Because with npm, except for left-pad due to oddities of the npm-specific ^ semver operator, you always have to increment your version and you can always pin earlier versions. That said, the ^ semver operator would probably upgrade to your sabotaged version if you went from 1.2.3 to 1.2.4, and people would have to figure out what the other operators meant.
As to curated package managers, I'd consider them a great counter-example. Try installing a recent (5.x) version of node and postgres (9.4) on ubuntu. You can go through the rigmarole of adding an entirely new repository for postgres (apt.postgresql.org) since the debian distributed version of postgres is rather old (and ubuntu-version specific). You can also do the same thing with nodesource for nodejs, but if you install node on ubuntu, you'll get a packet radio service instead.
Curated package managers are very conservative and move very slowly. Javascript is a relatively fast-moving, pretty anarchic environment.
The lesson of left-pad is probably: vendor your node deps and check them in. And also: npm probably needs to be replaced with a write-once store with cryptographic signing of packages. You still have to trust the author, but that's the point of open source. We're built on trust.
Node has a history of kicking out a new release virtually every week, often with crazy breaking changes. Quite sensibly the Debian packagers decided not to try to keep up with that game. There is an easy-to-add on repo if you really want it, although Typescript works just fine in the packaged version so I haven't.
Not sure what your point is on postgres, since 9.4 is right there in the repo.
postgres-9.4 is available in ubuntu 14.10 and later. If you're in 14.04 or 12.04 (current LTS versions), you have to use the postgres-maintained source. The process isn't super-simple.
Node moves fast, sure. I'm not sure I agree with your "crazy breaking" modifier there: I regularly do dev work against the latest 5.x branch of node, and rarely if ever have to worry about breaking changes. Then again, I'm just doing plain old restful api and business logic stuff, so maybe if you're dealing with the c++ interface it's a different environment.
Even still, I think it's easy enough to see from our disagreement over "too fast" and "quite sensibly" that there's plenty of ways to be a programmer. Rapid improvement can be kind of exhausting (especially if you get behind) but it's also a great way to see new ideas emerging. One of the things I actually love about the javascript world is how quickly things move. Things evolve very quickly because there's so much churn, speedy releases, pressure for mindshare, and so on. It's fascinating, both technologically and sociologically.
It ought to be pretty straightforward to say "apt-get install nodejs-5.3" and have that Just Work, but by the time the curators get around to updating their packages, you may well be a month or three behind. (again, unless you've updated your sources.list.d to include the nodesource repo).
So yeah: curated package managers have a place in the world for sure. But the trade-off here is that you're not getting the Latest and Greatest. Joyent tried to Slow Things Down with the 0.10 branch of node, and you see where that got them.
Yeah, except nobody does it. It's really the case where stupid ideas should be made harder to implement. Otherwise you get the present web, where a lot of companies fetch all dependencies from GitHub every time their CI server wants to run a build.
About the only good thing that comes from this is that we get to laugh out loud every half a year when there's a random GitHub outage and those companies all scream bloody murder.
> The cost of a dependency for a good package manager is zero
The recent left_pad fracas has proved that that is not the case. Snapshot dependencies in-tree? Sure. But a package manager based dependency will be predicated on trusting a lot of things.
That still isn't enough to refute "The reason so many people are criticizing left-pad et al is about the cost of adding a dependency.". Sure, npm may be a bad package manager. If you do have a bad package manager, there is a cost of adding a dependency, and you should consider it. That's what pdkl95's point was. Most other ecosystems, with good or bad package managers, do not have tiny one-function packages -- it's possible to have an ecosystem not depending on it. If npm isn't good enough to have zero-cost dependencies, then your ecosystem should definitely not depend on tiny packages. This may be NPM's fault, but it's the fault of the ecosystem too.
For the record, I believe that npm is a pretty good package manager. And the dependency cost I talk about is universal -- any package manager that allows for easy package updates is open to critical packages being broken. Removing the "unpublish" feature (and using something like Cargo's yank) makes it harder for left_pad-like incidents to happen, but you still have issues with broken package updates, etc. You can further solve this with global version pinning, but not everyone will do this, so you're still stuck.
> If you do have a bad package manager, there is a cost of adding a dependency, and you should consider it. That's what pdkl95's point was.
That's great, and my point was that with a good package manager, that cost is effectively zero.
> And the dependency cost I talk about is universal -- any package manager that allows for easy package updates is open to critical packages being broken.
Automatic updates are a terrible idea, even if the developer purports to use "semantic versioning" and the update is supposed to be backwards compatible. Specifying a version should be mandatory.
Your package manager could easily notify you that updates are available, but it should never update for you. That's as "easy" as updates should get, because anything else inevitably introduces silent breaking changes.
> my point was that with a good package manager, that cost is effectively zero.
yes, which doesn't refute pdkl95's point. If you're not in the world with a good package manager, you should consider the cost of the dependency. The Node ecosystem seems to not have done that. That's the bottom line here, if there is a cost to dependencies (which you don't seem to disagree with, at least wrt NPM), then you shouldn't have dependencies for 11-line functions. Everything else is orthogonal.
It's not fully mutable. You can't change the code of a particular version, for example. The particular problem with left-pad was that npm is mutable in one way: deleting packages.
No, this is the fundamental engineering issue: there is no magical, mythical, perfect package manager, and even if one existed, there is a real intrinsic cost to adding a dependency. It is not a huge cost -- it is minor -- but the benefit of using a one-line package is also minor. So when you are making a minor trade-off over and over again, as with npm where it's not unusual to end up depending on thousands of packages, then this is not a trade-off that should be made without thinking.
I never said perfect, I said good. Some basic properties are that upstream changes don't break downstream dependents if you've protected against automatic version updates.
And yet, automatic version updates seem to be the first thing people bring up in defense of package managers when I mention that I generally think they're a poor strategy.
It's a lot like DLLs/shared libraries, actually: people marshal the same old arguments every time explaining why they are a great idea in theory, but in practice, there seems to be a whole hell of a lot of work being done to make up for the fragility you bring on board with a system like that, and I'd rather just embed the source or link it statically (depending on which context we're talking about here) so that I don't have to waste any time thinking about it.
> The cost of a dependency for a good package manager is zero
I disagree. Another non-zero cost that I don't see mentioned yet is a licensing cost. What if there is some GPL'ed code somewhere in the dependency chain, but you don't know it because you are using code that uses GPL code but didn't realize it? Using lots of dependencies is just asking for licensing hell
That's a good point that's not often considered. That sounds like it should be part of a package manager too, because all you need is a license compatibility matrix.
The laughter is mostly the same as when the crowd points out that the "new clothes" actually aren't any clothes at all. The modern, funky, hip js "devs" told us we didn't "understand": they are doing modularity, code reuse, all those good things. Then the larger crowd realised they are publishing gazillions of "modules", each of which is barely a function, some of which have horrendous runtime performance despite their wide inclusion by the crack A-team of js "devs".
I think it's less smugness (there certainly is some) and more disbelief / unfamiliarity.
JS isn't the first language without great libraries (C's stdlib, `math.h`, `string.h` are good but don't have string padding functions, for example), so there's precedent going back a very long way. Most solutions involve coming up with a utilities library that contains the basic things (like a math library, or a strings library).
There's nothing wrong with creating utility libraries for JS. I think people are surprised because the approach taken - that they are composable down to the function - is so radically different from the industry standard. Coupled with a package manager that requires fetching so many deps for such trivial functionality.
People do tend to form isolated communities around languages (look at the language for basic data structures between PHP, Ruby and Python) and this kind of thing probably helps shake up that isolation which is probably good for both sides.
Of course you can use `asprintf` to pad a string like you can use a for loop to pad a string in JS. But it's not a dedicated named function to pad strings like the `leftpad` we're discussing. Kind of missing the point of the argument.
Given how simple and popular the `left-pad` module is (indicating the principle the community uses to determine granularity), I'd guess that if `String.format` were available in JavaScript there would still be a library to provide `leftpad`.
A function is dedicated if it is well-suited for a use case. It does not require it to be ill-suited for other use cases.
Either you arguing that `left-pad` is not a "dedicated" function for padding by 42 characters, because it can also pad for 43 or 41, or I don't understand you at all?
It's a memset followed by a strncpy, hardly worth linking to a library for. You'd be done writing the function before you finish googling for an existing one.
I think the difference is that when a C programmer hits undefined behaviour, they blame the spec, the compiler, or themselves. A JavaScript programmer doesn't need a prompt to blame the spec proliferation and dilution, the necessity to write cross-platform by default, and their lack of foresight in not including a 3rd-party library that would have provided some vestiges of a targetable platform. The C code that started as a nice asprintf(3) for GNU/Linux will sprout a few ifdefs once it has to run on some embedded PoS without a sane malloc().
Even the Java ecosystem managed to get their crap together and make the excellent Apache Commons library collection to deal with a similar situation (anaemic to useless standard library). C++ has STL and Boost.
Only the PHP and JS communities seem to think that not having a solidly designed "base" library is fine (and even PHP seems to be getting better at it with 7).
PHP is absolutely the last language I thought anyone would accuse of missing standard features. PHP is massive and includes absolutely insane numbers of functions to do everything. If you can think of 3 basic, universal functions that PHP doesn't include, I'd be shocked.
Also, PHP 7 has absolutely nothing to do with standard libraries, as it doesn't really add anything in that regard.
I never said I like PHP. I use it daily for work and hate it. I was just refuting something on a factual basis -- PHP's "standard library" is absolutely enormous.
We run almost everything on Wordpress and still make fun of PHP. And JS. And Java. And ObjC. And Ruby. And Python. Every language has its share of problems and sometimes you just have to vent.
Sure, but: "we make fun of every language/every language has its share of problems, etc." is a completely different statement/message than "everybody makes fun of PHP" -- which singles it out as some huge abnormality...
The Java standard library is anemic? As in Java SE? That's rich. Java SE is an everything but the kitchen sink standard library, with functionality like collection classes, local and networking IO, windowing toolkits (more than one), database connectivity, the list goes on.
There is no language with a less anemic standard library, and Java SE is often criticized for being too large.
The difference is absolutely massive. When working on Java projects, I've always included commons/StringUtils reflexively, and even then Ruby is quite a bit ahead.
The same applies to collections / enumerables. There's a lot in Ruby that I don't like, but powerful "batteries included" standard libraries for the basics like strings and collections are the bedrock of any programming language. It only takes hard work to write them and test them, not any specific skill, so there is no excuse for all the languages out there that are still lacking, in 2016, a sensible collection of utility methods.
The JRE lib seems to be a compromise between "common types for everybody, but leave the convenience to external libraries" and "batteries included". This leaves both camps deeply dissatisfied, but allows for a great deal of integration between libraries. Maybe I have looked at Microsoft flavored C++ one time too often, but the fact that a single string type could reign unchallenged for twenty years makes it easy for me to accept that I need to use weird imports to tackle exotic convenience desires via statics (that, and the sad underutilization of CharSequence).
>There is no language with a less anemic standard library
PHP would like to see you out back in the alleyway, as some other posters in this thread have pointed out.
> Java SE is an everything but the kitchen sink standard library, with functionality like collection classes, local and networking IO, windowing toolkits (more than one), database connectivity, the list goes on.
Yes. However, there are a lot of design decisions held onto because of backwards compatibility that continue to cause massive pain points. Object.toString() comes to mind. The fact that String.format() has to be either statically imported or typed out every time. I'm so sick of lengthy string concatenation in Java.
"When I use format " + createSentenceStringFromList(work.getCoworkers(), "and") + " act like I've committed " + A_HORRENDOUS + " crime."
I don't consider C++'s Boost something to be proud about. Last time I checked, Google forbade its developers from using a significant part of it, and in my opinion rightly so. But anyone who likes everything in there should just as well understand this Node affair.
Boost has been incredibly successful as a breeding ground for new C++ libraries. Much of the functionality that was added to the standard library in C++11 came from Boost. This continues to this day.
It's a collection of libraries, no one is using all of it.
From my point of view, if we stick to the question of why not all of Boost is allowed, it isn't controversial at all, for the same reason that Linus (and I) believe it is a good thing that the Linux kernel is pure C and not C++.
And it's the same reason why some people now see that the popular Node practices on display aren't something that the battle-tested players consider reasonable to do.
Javascript has an excellent standard library - lodash. Here is its string padding: https://lodash.com/docs#pad.
Lodash also ships hundreds of independent npm modules for those who don't need the entire package. Here is pad as an independent package: https://www.npmjs.com/package/lodash.pad.
Lodash is here today, it works, it is reliable, and it doesn't pull the rug from under you.
Lodash only got padLeft in version 3.0.0 (released last January), and then that method was renamed to padStart in 4.0.0 this January, so it's not exactly the paragon of reliability (although I understand your sentiment)
dalton likes to name methods after ECMAScript itself, which I very much appreciate. padStart() was proposed as a standard after lodash 3.0.0 was released, so he changed it in the next major release.
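For reference, the 4.x behaviour looks like this (examples as I recall them from the docs; the standalone lodash.padstart package exposes the same function):
var _ = require("lodash");
_.padStart("abc", 6);        // => "   abc"
_.padStart("abc", 6, "_-");  // => "_-_abc"
_.pad("abc", 8);             // => "  abc   " (pads both sides)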
That's exactly what is up for discussion here.
One library with 1000 methods, of which you only use 2, is probably not better than a library doing just what you need.
I'm with Knuth here, a lot of the arguments for discrete function units sound like premature optimization. In the end, you won't get the equivalent of the MODULA-3 standard library or something equally well-defined, but PHP4's mudball of functions.
All this because a few bytes of functions would ruin your experience and/or dead code elimination isn't part of your pipeline.
> How stupid of people to trust their package manager to be consistent and correct and return packages they were expecting.
Obviously people shouldn't have trusted npm - that's the reason things broke in the first place!
> How does everyone taking the piss intend to protect themselves from this in their OS package manager, or PPM or composer or pip?
The usual method is to bundle dependencies rather than pull them down from the web at deploy-time. That or use a package manager with rather stronger guarantees on availability of packages.
> It's not javascript devs' fault that the standard library is so piss poor
This is very true.
> the level of smugness here is absolutely ridiculous.
Wouldn't it be a lot easier and more secure if you simply looked up the code in these packages you're using, reviewed the code to make sure it's legitimate, then copied it to say, your own utils.js package? I'm not even talking about the left-pad package, which shouldn't even be part of the discussion... If you're worried about including 2-4 lines of code on your own then you shouldn't be making an application other people will use in the first place.
Apparently it's not obvious that having a ton of dependencies on 3rd party packages, of which can change at any time on the whim of the owner, is pretty insane in terms of security and minimizing complexity of your application.
You do realize that even though you reviewed the 20 different packages you use at one time, they can change at any time?
I would love to hear what a senior dev at any reputable, long established technology company thinks about including tens, if not hundreds (and some apps, thousands?) of 3rd party packages from strangers on the internet with no validity checks in place. Sign me up!
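To be concrete, by "your own utils.js" I mean something as small as this hypothetical sketch:
// utils.js -- reviewed once, versioned alongside the rest of the project
function leftPad(str, len, ch) {
  str = String(str);
  ch = ch || " ";
  while (str.length < len) {
    str = ch + str;
  }
  return str;
}
module.exports = { leftPad: leftPad };

// elsewhere in the codebase
var utils = require("./utils");
utils.leftPad("7", 3, "0"); // => "007"
Reviewed once, checked in, and it can't be unpublished out from under you.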
I dunno; I think it's a balancing act between dependencies and code reuse. I think npm needs to not let people unpublish stuff without thought - that is where the problem is and I'm sure they are thinking about a solution to this.
I think a lot of this is implying "don't use micro packages on your package manager, copy and paste code in helpers.js FTW". I've seen people advocating this as if it's a better solution than have npm remain consistent even after people unpublish packages.
Well, I don't think we can trust NPM on being consistent. They gave away this guy's package to a corp, despite his protest just because they felt like it. I'm not arguing if it was the correct decision or not, it's just an arbitrary decision based on isaac's mood that day.
It's best to treat NPM as mutable and don't depend on it. For example, I check in node_modules to git, even though everyone says no, use shrinkwrap! Yeah, that would definitely help here.
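(For anyone who hasn't run it: `npm shrinkwrap` writes an npm-shrinkwrap.json that pins the whole resolved tree, roughly this shape -- names and versions here are only illustrative, and the exact layout depends on your npm version:)
{
  "name": "my-app",
  "version": "1.0.0",
  "dependencies": {
    "left-pad": {
      "version": "0.0.3",
      "from": "left-pad@0.0.3",
      "resolved": "https://registry.npmjs.org/left-pad/-/left-pad-0.0.3.tgz"
    }
  }
}
Which pins the version nicely, but still points at a tarball that can vanish -- hence checking in node_modules.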
NPM is to blame here and people are right to make fun of the entire ecosystem around it. Also, it's just easy and funny.
No one will start copying stuff around just because of this, I believe. The micromodule ecosystem emerged because it works for people, apparently. Devs will find ways to make it work better.
But rather because they didn't wish to lawyer up for a trademark dispute that they had no particular reason to want to be involved in. (One where there was a reasonable chance that they would lose.)
There is (obviously, I would say) no possible solution to the problem of the maintainer of a module you depend on deciding to mess with you. The only serious way to even make a go of it is to move your repository to a curated model where you (in this case, the NPM owners themselves) personally vet all contributions before allowing them in, and even then, I'm confident I can slip something into my own code that will cause problems if I want to -- it's code that I wrote and I've maintained for years and that you've looked at briefly every once in a while.
A sane package repository won't disappear overnight, or ever. I can still install packages distributed with Debian Potato, the first ever Linux distro I used, which is now 15 years old. The code that I wrote against the libraries distributed with it will still compile, and had I only burnt stripped binaries to a CD back then, they would still runtime-link and run today on a brand-new install. And because it is all free software, the people and orgs who wrote those libraries, compiler, and standards, had given up their rights to force unpublication.
In a sane package management system, and with version pinning on the consumer's side, your hissy fit would have had exactly zero effect. Assuming your package were free software, people could fork it, and still make improvements.
That wouldn't mess with me. It would just create a new version, while my dependencies tracking file is locked to the previous one. I only fetch new versions to my local development machine, and only update the dependencies file after testing it locally and on the staging server.
How about adding a line that alters the output randomly every 1000th second. Is your UI testing strategy good enough to pick that up? Or do you code review all your dependencies on version changes?
I was bitten by a JS function change in a package that has been named also in this thread. It was an "oh, we moved that functionality to a new function, but reused the old name for a new function that does nothing like the old". And no semantic versioning.
Slipped through UI testing and broke production functionality. I did not like dependencies before, and I like them even less now.
And people keep wondering why everyone's software is so ridiculously bad.
You think the only thing about the whole situation that is a problem is that npm let an author unpublish, rather than that all the projects came to depend on so many external functions -- sorry, got to get the npm terminology right -- "modules"?
Thank you. I thought I was the only one who thought the whole thing sounded like a smug "look, there are worse programmers than me out there".
Even more annoying, IMO, is the fact that this was a freaking denial-of-service over thousands of projects which was solved IN HOURS. Instead of celebrating the resilience of the duct-taped Jenga game that is the internet, we're complaining about how other programmers should be more like ourselves? What is this, vi vs emacs? :D
Well, Jane Street found OCaml's standard library lacking and went on to build a better one. To me that seems much better than building a "standard library" piece-by-piece from one-function packages.
My understanding is that Lodash started as a replacement for Underscore.js and was originally fairly monolithic. Only in the most recent version do they make it so that each function is separately requirable and published. They have modularized after starting monolithically (without verifying, I also think that Lodash develops in a single repository, but publishes a bunch of modules from that one repo).
It even includes a function for padding strings. I think you would be doing better for yourself using lodash over left-pad, and the cost of including a bigger library on server-side JS is negligible.
>Hahaha - isn't it hysterical how everyone is using npm for small reusable code pieces! Aren't they morons! How stupid of people to trust their package manager to be consistent and correct and return packages they were expecting.
How stupid of people to reuse small often used functions that only do one thing well.
I don't get the supposed irony.
Beyond a certain level it is indeed stupid of them.
And the examples we've seen were all below that level.
I think you're taking it too seriously. This criticism is a good thing for npm. Too much cheerleading inevitably leads to lower quality in most communities. I assume people are not going to quit using npm anyway (although I don't use it myself).
>It's not javascript devs' fault that the standard library is so piss poor you need these short code snippets and I've definitely included small 5-10 line packages via npm or other package managers rather than roll my own because it's likely they have bug fixes I haven't considered.
Correct, it's the fault of the devs who prioritize their ego and feeling trendy over the practicalities of software design. JavaScript is not a good platform for Real Work(TM). The fact that the stdlib is tiny and such basic functions have to be imported from some random guy's library is a testament to this fact.
Personally I can't wait for the JS-everywhere fad to die out.
In most other languages you end up with a poor approximation of npm's micro-libraries via stackoverflow copy-paste.
Actually thinking about it, an ideal solution would be a combination of stackoverflow and npm: all the solutions to a problem grouped together in one place with community votes and comments, and then you can subscribe to an answer from your code.
> How stupid of people to trust their package manager to be consistent and correct and return packages they were expecting.
Well... Maybe a better word is naive? Because every additional package dependency is a risk. It's a risk in a security sense (has anyone ever wondered what is in left-pad in your organization?) and in an engineering sense, as demonstrated by the latest JavaScript Bro of the Month's industry-crushing personal power fantasy, which you all cheerfully turned into reality by letting him control trivial but critical parts of your codebase.
It's true that to an extent, that risk is unavoidable if you want code reuse. Too little code sharing has the exact same problem! But the idea that any programmer in a position to need left-pad should be given edit rights to a non-toy project is what we're needling you all about.
The node community has gone too far in one direction and embraced reuse for reuse's sake, not to address any real complexity.
>How does everyone taking the piss intend to protect themselves from this in their OS package manager, or PPM or composer or pip?
By signing packages and not allowing literally anybody to re-upload a brand new package that does something completely different under the name of an existing, widely-used package. There's not much excuse for that in any respectable package manager.
I've added exactly 0 apt sources, precisely because relying on ppas and the like is terribly, terribly insecure. Allow some random person to run code on my machine as root — that's insane.
Yes, it's a smug joke, but I think it's a funny joke nonetheless (if you got the context). Don't take it too seriously.
So many good things happen in the JavaScript ecosystem. Nothing is perfect.
This kind of criticism can help. Even if it's a bit much these days/hours ;)
I can't imagine npm and many others won't learn something from this. What more can you ask for?
I think it's more that people writing libraries are depending on left-pad. You can write the function in almost the time it takes to add and install the dependency.
Furthermore, left-pad didn't really do one thing well. It did one specific subcase of one specific problem well. That's like a tenth of a thing, for a definition of well that basically means "written by a monkey taking an npm tutorial".
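For comparison, the entire "dependency" is on the order of this (a from-memory sketch, not the published module verbatim):
function leftPad(str, len, ch) {
  str = String(str);
  var fill = (ch || " ").charAt(0);
  var missing = len - str.length;
  // joining an empty array of length missing+1 repeats the fill character missing times
  return missing > 0 ? new Array(missing + 1).join(fill) + str : str;
}
That's the whole installation-time-versus-typing-time trade-off, in full.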
This also feeds into an extreme example of the continuing discussion around the cost/value/risks of extensive, transitive dependencies with the general consensus being that you would've been better off writing your own left-pad function 1,000 times rather than forcing the risk of this dependency onto your downstream.
This [1], discussed in another thread is an incredible summary of the issues; I hadn't even thought about packages that are required by other packages; it can lead to literally thousands of deps :-o.
Node/npm and the web need a standard library that can be included piecemeal, from a well trusted source and packages that only depend upon this small subset should be highlighted.
I'm impressed how fast someone was able to turn this into a Unicorn :-) VCs will be stepping all over each other over this!
Seriously, some languages have broken package systems and it looks like it's not the case for JS ... in fact it's working so well they are used ad nauseam, for trivial things even.
This is to be expected from a language that is very flexible, but does not offer 'standard' ways of doing things built in, or at least offers the functionality as part of the distribution of the language libs/modules.
JS just hit puberty, and will mature ...
The first is that every added dependency creates ongoing cost, and that in a world where it becomes typical for a single application to have hundreds of dependencies, this cost is non-trivial and is, at minimum, worth considering.
The second is that even before the left-pad fiasco, a lot of experienced individuals (myself included) believed that npm is fragile by design; and that the dependency situation means that if you want to avoid nasty surprises, you need to pin every version of every dependency and you need to keep your own local copies of all of the files in case something like this happens.
Overall, you're accusing others of failing to fully understand the situation and of being condescending and rude; but you're doing it in a way that is condescending and rude, and that fails to fully understand the situation. Responses like yours are completely unhelpful. They do not change hearts or minds. They do not help npm fix their problems. And your caricature of those who find the situation funny is so distorted from reality that nobody will see themselves in it.
The fact that your post is top of HN is just more evidence that Hacker News is no longer a useful website. It's kind of funny, YC gets better every year, but HN gets worse. I wonder at what point Sam and pg will realize that this cesspool is actually hurting their brand.
>>> a lot of experienced individuals (myself included) believed that npm is fragile by design
I agree with you!
>>> you're accusing others of failing to fully understand the situation and of being condescending and rude
I'm saying that a "massive helpers.js file, yo" is not a solution. Improve the standard lib, improve npm, and use lodash.
I'm also saying we all rely on other package managers, so don't think that just because you saw npm as flawed you get a free ride to say this couldn't happen with other systems. All of them have various ways of poking holes in whatever security you think they have.
At some point you have to put trust in your package manager. I'm railing against the people saying to copy and paste code everywhere.
>>> The fact that your post is top of HN is just more evidence that Hacker News is no longer a useful website. It's kind of funny, YC gets better every year, but HN gets worse. I wonder at what point Sam and pg will realize that this cesspool is actually hurting their brand.
Disagree, sure, but I think saying that my post is from a cesspool is a bit over the top!
var zero = require("number-zero");
var hundred = require("number-one-hundred");
var isDivisibleBy5 = require("is-divisible-by-5");
var isDivisibleBy3 = require("is-divisible-by-3");
var isDivisibleBy5and3 = require("is-divisible-by-5-and-3");
var numberToString = require("number-that-is-between-one-and-hundred-to-string");
var fizzbuzz = require("string-fizz-buzz");
var fizz = require("string-fizz");
var buzz = require("string-buzz");
var incrementByOne = require("increment-by-one");
var forLoop = require("for-loop");
var ifCondition = require("if-condition");
var elseIfCondition = require("else-if-condition");
var elseCondition = require("else-condition");
var lessThan = require("less-than");
var print = require("print-string");
forLoop(zero, lessThan(hundred), incrementByOne, function(i) {
  ifCondition(isDivisibleBy5and3, print(fizzbuzz),
    elseIfCondition(isDivisibleBy3, print(fizz),
      elseIfCondition(isDivisibleBy5, print(buzz),
        elseCondition(print(numberToString(i))))));
});
HELP!! The CEO heard about this new service and now my manager told me we need to upgrade all our packages to this new service ASAP! But there's nothing on stack overflow I can use to change our system! I need to get this pilot done STAT so we can plan the migration and send it out for bid!
There are rumors Twitter is talking to those guys about a 140-len version; they are likely to be acquired. We need to wait and see. Tomorrow we should know.
Can someone explain to me why I might need this? I checked the site and the documentation is horrible. The site isn't professionally done and there are no videos.
Can I rely on this service being around in 5 years? It just seems like this company might be, you know, a feature rather than a company.
I'm working on it now. All plans will be free except if you want to use negative numbers. AWS will charge me more for 1 + (-1) as opposed to a 1 + 1 operation.
My gut feeling tells me serious software engineers who look down on javascript programmers are feeling justified now. Brogrammers have been exposed, hence all the knee-jerking. Indeed, it is pretty funny, but dependency management still remains a hard problem.
Very nice! Any plans for integration with http://shoutcloud.io/ ? I would love to have my strings both left padded AND capitalized, but the APIs are incompatible. :(
{"message": "Could not parse request body into json: Unexpected character (\'o\' (code 111)): was expecting comma to separate OBJECT entries\n at [Source: [B@6859f1ef; line: 2, column: 22]"}
when using double quotes. It seems some JSON parsing fails. Not sure if this can be exploited, so I wanted to let you know.
this is fantastic! What is your stack? Are you NoSQL or relational? Redis? What is your test coverage? I am sure you hire only trendy developers. I see huge potential in your service; do you accept private investments? I would like to get in now, before Google or YC snatches you up! Again, keep up the good work, and I can't wait for right-pad next year!
Do you have any client libraries available for different languages?
I don't want to create a direct dependency between my code and your API. I'd rather create a dependency between my code and your client library's code, as I'm sure you will always keep that up to date with any API changes.
If left-pad.io goes down, will it take the rest of the WWW infrastructure with it? I'm missing a Q&A for important and apparently relevant questions like these.
I'm ready to get downvoted to hell for this comment, but here we go:
I feel like only non-javascript devs are bashing small modules and NPM. All the great javascript devs I know LOVE that mentality.
Let me offer some reasons why I (as a current Javascript dev having professionally coded in C/C++/Java/Python/PHP/Scheme) think this is great:
- Unlike most other languages, javascript doesn't come with a batteries-included standard library. So you're often left on your own to reinvent the wheel. I mean, come on, in Python you do "'hello'.ljust(10)", but AFAIK there is no such thing in javascript. Javascript is more like the wild west, where you need to reinvent everything. So having well-tested libraries that do one thing extremely well is really beneficial.
- Javascript, unlike most other languages, has some pretty insane gotchas. E.g. "'0' == 0" is true in javascript (a quick illustration follows these points). Most devs have been burned so badly in so many ways in Javascript that it's comforting to use a battle-tested library, even for a small feature, rather than reinventing it.
- And anyway, where should we put that function? Most big projects I've worked on have some kind of "helper file" that is 1500 lines long, and at some point different projects start depending on it, so no one likes to touch it, etc. So, yeah, creating a new module takes a bit more time, but remember that it's not about the writing time but about the maintenance time. I'd much rather have lots of small modules with clear dependencies than a big "let's put everything in there" file.
- I feel arguing about whether something should be in a separate module is similar to arguing about whether something should be in a separate function. To me, it's like hearing "Hey, learn how to code, you don't need functions, just write the code where you need it." And hey, I've worked professionally on projects that had no functions and it was TERRIBLE. I was trying to refactor some code by adding functions, and people would copy my functions into their 1500-line files. Let me tell you, I left that company really fast.
- It's fair to say that UNIX passed the test of time and that the idea of having lots of small programs is extremely beneficial. It forces common interfaces and great documentation. Similar to how writing tests forces you to create a better design, modularizing your code forces you to think about the bigger picture.
- As far as I'm concerned, I really don't care whether a module is very small or very big, as long as what it does is well defined and tested. For instance, how would you test if a variable is a function? I don't know about you but my first thought wasn't:
function isFunction(functionToCheck) {
  var getType = {};
  return functionToCheck && getType.toString.call(functionToCheck) === '[object Function]';
}
Who cares if it's a 4-line module? I don't want to deal with that javascript bullshit. Yes, I could copy-paste that into my big helper file, but I'd much rather use the one that the javascript community uses and tests.
- Finally, it seems like Node/javascript didn't start out that way. Not so long ago we had Yahoo's monolithic javascript libraries and jQuery. Even the first versions of the most popular node libraries (such as express) were written as monolithic frameworks. But they've since been refactored into dozens of small modules with clear functions. And now, other libraries can just import what they need rather than the whole project.
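To illustrate the coercion gotcha mentioned above (just a quick sketch of the kind of thing a small, battle-tested module saves you from re-deriving yourself):
'0' == 0    // true  (loose equality coerces the string to a number)
0 == ''     // true
'0' == ''   // false (so == isn't even transitive)
'0' === 0   // false (strict equality, no coercion)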
OK, so I told you about the good things. What about the bad things?
- Adding dependencies to a project is REALLY HARD TO MAINTAIN. I've had so many bad experiences with node because of that. E.g. I work on a project, it's tested and works fine. Two months later I clone and start the project and everything breaks. Oh, the X and Y libraries decided to fuck everything up; that other library now depends on a new version of Node, but I can't upgrade Node because yet another library depends on a previous version of Node. It's complex. I won't go into my solution to this problem, but suffice it to say that it's a problem, and installing random amateur libraries in a professional project can lead to disaster.
- It takes longer to code. I touched on that earlier. It's a tradeoff between writing now and maintaining later. Take a look at the segmentio github repo: https://github.com/segmentio. I'd personally love to have that as an onboarding experience rather than some massive project with everything copy/pasted a few times. But yes, it took them more time to create those separate modules.
>2 months later I clone and start the project and everything breaks. Oh, X and Y libraries decided to fuck everything
Are you not using semver? A common mistake I've seen is to depend on the "*" or "latest" version of a package, which obviously will break when the package releases a major update.
Also, applications should use npm shrinkwrap to pin their dependencies to the versions they've been tested with.
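For example (package names and version numbers here are purely illustrative), a package.json like this:
{
  "dependencies": {
    "left-pad": "^1.0.2",
    "some-risky-lib": "*"
  }
}
The caret range accepts any compatible 1.x release, while "*" will happily pull in whatever is newest, breaking majors included. Running npm shrinkwrap against a known-good install then writes an npm-shrinkwrap.json that pins the exact resolved versions of the whole dependency tree.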