The title of this article doesn't make much sense to me. Why would you get fired for giving feedback? Is this just a US thing? I give feedback to my superiors all the time, and expect my subordinates to do the same. In fact, as far as my team goes, you're more likely to get into trouble (not fired) if you rarely give feedback.
American workplaces are very toxic. Waves of managers ride on the coattails of the latest dude in charge. That they are completely clueless doesn’t prevent them from getting the job. If you point out their failures, you will immediately be labeled as part of the opposition.
In other words, this is pure Machiavellian politics. The truth doesn't matter. If you won't kiss the donkey's ass, the best advice IMO is to shut the hell up, collect your paycheck, and go home and kiss your wife. Your family are the only people who matter. The farce will go on; like all Ponzi schemes, it will eventually collapse, but that may take years.
When your boss is a one-upper who doesn't understand your team's product, they will do everything possible to sabotage your efforts and spin metrics to make you look bad. Attempting to give feedback to such a person will get you this response: "you need to earn trust and learn to disagree and commit".
I don't know if this is a US thing; it's mainly a thing where a manager is tasked with supporting a product that they don't understand. Managers in these situations treat their managerial role as if the product itself does not matter - the team has metrics to hit, and to them that's more important than the product.
> give feedback to such a person will get you this response: "you need to earn trust and learn to disagree and commit"
Sure, but you won't get fired, right?
Even so, this is just about dealing with people who have ego issues (for the sake of brevity). I don't understand why it should be framed as advice for dealing with senior leadership in general.
It's not only a US thing, it's an Eastern European thing too. We're on average rather alright developers, but culturally there's no skill of good management - we were destroyed in this respect by Soviet occupation and communism.
> Why would you get fired for giving feedback? Is this just a US thing?
Possibly: many places there have "at will" employment, where they can basically fire you at any time, for any reason, with little to no severance. There are limits, but compared to most of Europe that's the gist of it.
So yeah, you can be fired just because someone doesn't like you. And giving feedback is a good way to not be liked.
Every employment relationship should be at will!! Is there a constitutional right or birthright to employment at XYZ company? These discussions always look at one side, like "employer can't fire employee X without paying severance and other junk", but if I started a company and said "if you leave the company you have to pay severance back to the company for leaving", everyone would be up in arms about it…
I would agree in an ideal world where salaries were tied to a person's qualifications (a notion a tad more nuanced than just a diploma) instead of their job. While there is no (nor should be any) constitutional right to work at any given company, I think people should have an inalienable right to live. Which in practice means food, shelter, and health care at a minimum, regardless of their ability (perhaps even willingness) to work anywhere.
The compromise most EU countries have settled on is that once you're employed and past some probation period (in my line of work that can last up to 8 months), then they can't fire you without a damn good reason or a hefty severance (the better the reason the lower the severance, basically). But it cuts both ways: I personally can't leave immediately, I have to tell my employer 3 months in advance. It is in a way a kind of severance.
I think this arrangement would be fair if you had to pay back to the company the same amount they have to pay you. Getting notice for some period of time makes sense (both ways), but I can't see a reason why this is all not equal on both sides.
> I can’t see a reason why this is all not equal on both sides.
I can.
Think of what employment is for a second: shareholders (or company owners) own the capital, and the employee gets to follow orders. Structurally, the company pays workers less than their actual value: shareholders gotta hold, and they squeeze the margin out of the employee. The margin may be thin, but it's never meant to be zero (except for non-profits, but they're the exception).
Such a relationship is fundamentally asymmetrical, and such an exchange fundamentally unequal. If you want any hope of restoring fairness to this system, the termination conditions have to be asymmetrical as well.
And I can't see a reason why it should be equal. Even if you strip the relationship down to its most basic principle, there is asymmetry. One party provides work with the expectation of pay, the other party provides pay with the expectation of work.
But the asymmetries don’t end there. Terminating employment is a far greater threat to the employee than the employer. This creates a power imbalance which could easily be exploited by malicious or incompetent employers. That power imbalance is fundamental to this relationship and is reason enough (in the opinion of many countries) to bolster worker rights.
I once had an owner of a small business threaten to sue me for quitting because it would cause financial harm to the business. And that was with giving 4 weeks notice.
At-will employment laws protect employees that want to quit, not just employers that want to fire.
I'm dealing with related issues at work right now. Leadership is beating us over the head with the "innovation" hammer, without any consideration of why and where they want to innovate. This mandate is a recipe for disaster in the hands of inexperienced tech leads.
For example, there was an initiative to move from Talend to Azure Data Factory. Devs had to upskill, and it took them several months to deliver something that fails intermittently (with a huge cost behind it), and no one knows how to fix it.
I rewrote the pipeline in 2 hours (as a simple Windows Service that parses a file and writes data to a DB), and polished it in around a day or so. Added some informative and error notifications, and we immediately saw benefits. (Rough sketch of the approach below.)
Innovation is fun, but some things just don't really need much of it. We've known how to do stuff like ETL for decades. We really don't need cloud-hosted solutions behind client secrets and gated services to load data into a DB.
Ironically, my boring solution seemed more "innovative" because users can now get customised notifications in different Teams Channels.
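For the curious, here's a minimal sketch of what that kind of "boring" pipeline looks like. It is not the real code: the file layout, table name, connection string and webhook URL are placeholders, and the actual version ran as a Windows Service rather than a one-shot console app.

// A minimal sketch of the "boring" pipeline, assuming a flat CSV drop file and a SQL Server
// staging table. Everything named below (file layout, table, connection string, webhook URL)
// is a placeholder, not the real thing. Uses the Microsoft.Data.SqlClient NuGet package.
using System;
using System.Data;
using System.IO;
using System.Net.Http;
using System.Text;
using System.Text.Json;
using System.Threading.Tasks;
using Microsoft.Data.SqlClient;

class FlatFileLoader
{
    // Hypothetical connection string and Teams incoming-webhook URL.
    const string ConnectionString = "Server=.;Database=Staging;Integrated Security=true;TrustServerCertificate=true";
    const string TeamsWebhookUrl = "https://example.webhook.office.com/...";

    static async Task Main(string[] args)
    {
        var path = args.Length > 0 ? args[0] : @"C:\drop\latest.csv";
        try
        {
            var table = Parse(path);
            Load(table);
            await Notify($"Loaded {table.Rows.Count} rows from {Path.GetFileName(path)}");
        }
        catch (Exception ex)
        {
            await Notify($"Load FAILED for {path}: {ex.Message}");
            throw;
        }
    }

    // Parse a simple "id,name,amount" CSV into an in-memory table.
    static DataTable Parse(string path)
    {
        var table = new DataTable();
        table.Columns.Add("Id", typeof(int));
        table.Columns.Add("Name", typeof(string));
        table.Columns.Add("Amount", typeof(decimal));

        foreach (var line in File.ReadLines(path))
        {
            var parts = line.Split(',');
            table.Rows.Add(int.Parse(parts[0]), parts[1], decimal.Parse(parts[2]));
        }
        return table;
    }

    // Bulk-insert into a staging table; SqlBulkCopy is plenty for this kind of volume.
    static void Load(DataTable table)
    {
        using var connection = new SqlConnection(ConnectionString);
        connection.Open();
        using var bulk = new SqlBulkCopy(connection) { DestinationTableName = "dbo.StagingFlows" };
        bulk.WriteToServer(table);
    }

    // Post a plain-text message to a Teams channel via an incoming webhook.
    static async Task Notify(string message)
    {
        using var client = new HttpClient();
        var payload = JsonSerializer.Serialize(new { text = message });
        await client.PostAsync(TeamsWebhookUrl, new StringContent(payload, Encoding.UTF8, "application/json"));
    }
}

Run it on a schedule (or as a file watcher inside the service), point different webhooks at different channels, and that roughly covers the customised-notifications part.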
"Boring"/"exciting" tech isn't stifling or improving anything. These are orthoganal issues. Innovation can emerge from boring tech.
On a tangential note, one of my pet peeves is the way that many people (mostly Americans?) pronounce words like "processes" as "process-eez".
Words with Greek roots that end in -is generally form plurals in -es pronounced with the -eez sound, e.g. analysis -> analyses; thesis -> theses.
In the case of Latin, it's -ix or -ex, e.g. index -> indices, appendix -> appendices.
There are of course exceptions and outliers (suffix -> suffixes; octopus -> octopodes!?), but words like "process" and "bias" do not fall into the categories mentioned, so there's no reason to use the non-standard "processeez" and "biaseez". Unless - IMO - you want to sound like a snob... Think about it - how does one pronounce words like "successes" or "princesses"?
One could argue that language evolves - this is true, but in general language evolves to have simpler rules with fewer exceptions rather than the other way around.
Notably, it's only the noun plural that becomes "-eez" ("these processes"), while the verb present tense remains "-iz" ("she processes his application").
It seems to be going along with the gradual adoption of "often" with a "t" sound -- "off-tuhn" instead of "off-uhn".
Nobody said it with a "t" when and where I grew up (or on TV that I remember), because obviously the second syllable of "often" was the same as in "soften", "moisten", "hasten", "fasten", "glisten", and so forth. All silent t's.
But now it's at the point where probably a majority of people I hear on television and podcasts, as well as in my personal life, pronounce the "t". But only in "often" -- not in a single one of the other words I listed.
Both "often" and "processes" seem to fall in the category of hypercorrection, where people are trying to sound more correct.
I've heard both, and the "-esseez" plural just seems less ambiguous on poor videoconferencing lines and recordings. "-esses" is a mouthful to pronounce.
Really? I think it's the opposite. One does things like this because one is afraid of being assailed by pedants and made to feel inferior. This is why, I think, I hear people, mostly British, say things like "to so-and-so and I". They're afraid to use the wrong form of the pronoun and be scolded, so they overcorrect.
And about this:
> in general language evolves to have simpler rules with fewer exceptions rather than the other way around.
I don't think that's generally true. Rather, language changes in many ways, but one of them is the accumulation of exceptions to a formerly simple system. This gives us the complex paradigms of "be" and "go", for example.
I enjoy intentionally mispronouncing words to my fiancee, and this one is definitely going into the rotation, so thank you for that! (Now, to figure out how to get "tortoise" into casual conversation.)
I'm not sure what you're talking about. I'm referring to cases where pronunciation is generally based on the etymology. I don't think the word "adjective" falls into this category. One could perhaps make a case for aluminum vs aluminium (cf. platinum), but those are pretty much different words that refer to the same thing.
"Process-eez" is the same word as "processes" with a pronunciation based on a misunderstanding (presumably) of the etymological "rules".
By and large they do. English, like all Indo-European languages, used to have many grammatical cases and verb forms. Now we mostly retain cases in pronouns, and most verbs have about two forms per tense.
Latin used to mark all its cases with suffixes, and today's Romance languages have dropped nearly all of them.
English has simple verb and noun morphology, but very complicated syntax and phonology. Hard to say that it’s uniformly more or less complex than Latin.
I was watching one of his vids recently about C++ tricks and he talked about one time he submitted about 1m changed lines of Windows source code in a single commit, to a repo that didn't even support branching. It's crazy how devs used to work back in the day.
Is it really that hard to write sensible comments?
Well, after a few decades writing software, I've come to the conclusion that yes, yes it is. But just because it can be hard to write a good comment it doesn't mean you need to agonise over it.
Here are a few "simple" examples:
// Once we receive the cancellation ack, we should automatically send the updated flow
if (releasedFlow.Status == ReleasedFlowStatus.IndicativeCancelled)
{
    await _mediator.Send(new SendFlowCommand ...
It should be pretty clear from the context that that's what's happening, and the comment doesn't explain why. That's because it would take multiple paragraphs to explain. Could I just post a link to the documentation about this business rule? Yes, but the location of said documentation changes so often it can render the comment useless. Could I change it to say something like "Please refer to the documentation"? Sure, but then why don't I put that comment behind every piece of business logic?

No, the simple purpose of this comment is just that - a comment on what the business rule was at the time of writing, especially with respect to the other possibilities. You can spend ages overthinking this, but there's no point. Just read, understand and move on.
What about this:
if (foo)
{
    DoFoo();
}
else if (bar)
{
    DoBar();
}
else
{
    // Intentionally empty
}
Dead code with a comment that explains nothing. Is this code better with or without the dead code and comment? I'd argue that the code is better with it. In fact, this is a well-known technique: https://en.wikipedia.org/wiki/Intentionally_blank_page
Yes, you can argue that the coder should explain why it's blank. But this depends on the context, and who the intended audience is. Again, no need to overthink it. Read it, modify it if you really think it's necessary (keeping in mind that the alternative is often no comment or empty else statement), and move on with your life.
I've had to do this several times in the past. Honestly, my best advice would probably be to make several backups, then do as little as possible. If you need to make a small change, fine. Bigger changes? Consider whether you can do the bulk of the work in a technology or stack you understand and only make a small change to the legacy code base.
Most of the time I spend with C++ code revolves around figuring out compile/link errors. Heaven forbid you need to deal with non-portable `make` files that for some reason work on the old box, but not yours... Oh, and I hope you have a ton of spare space, because for some reason building a 500k exe takes 4GB.
Keep in mind, this advice only applies to inherited C++ code bases. If you've written your own or are working on an actively maintained project these are non-issues. Sort-of.
When I ask ChatGPT4 what the length of its context window is, it tells me (4096 tokens).
When I ask Gemini, it basically tells me "it depends" with a few paragraphs of things I generally don't care about and then suggests I ask for a ballpark estimate (1k - 3k tokens).
Beware of hallucinations with this kind of question. An LLM doesn't have knowledge about itself unless that was fed into a system prompt by the developers somewhere on the backend, or unless it's Internet-connected and does a search for itself. System prompts often do cover basic stuff like the model's name and the fact that it's an AI, but context window sizes start veering into advanced details, and I would much rather rely on the official documentation for the service in this case.