Best Practices – Why I Hate Them
> If you can’t explain why you’re doing something, and you’re just doing what you’ve been seeing others do, maybe rethink it and educate yourself on the topic.
This is very interesting to me, and it applies to our entire lives, not just IT. I started out with the above opinion about 25 years ago. I was as sure about it as the author, and I was similarly very outspoken about it. With time, though, I started noticing some problems with the theory/philosophy. I'll fast-forward and simplify so this doesn't get too long:
- people notice a pattern: if problem X then solution Y
- the Y solution spreads in adoption
- problem X is not really experienced any more because it is "presolved"
- future people end up doing Y without knowing why
At this point you have NO clear optimum path:
1. you can keep doing Y, but you will inevitably also do it when X is not present, so you expend energy for nothing and in some cases even make things worse
2. you observe point 1 and stop doing Y, but then you will stop doing it even when X is present, at least until you start noticing X reappearing as a problem and the X->Y pattern is understood again
Now you may say: just rethink everything before doing it. But that is not possible; it would take several lifetimes just to discover the absolute basics of everyday life. Most things you cannot deduce; you must go through trial and error for significant periods of time.
So what do we do about tradition? Do we follow it blindly? Do we reinvent the wheel at every step? Clearly those are just the extremes, but I do want to mention that if I had to choose between just those two, I would follow tradition blindly. 20-year-old me would absolutely think I am an idiot for this! But I have paid the price of thinking I could outsmart past generations so many times that I became very skeptical about how smart I am at reinventing things. And this is not me being stupid; it's just that most cycles of pattern recognition take generations to observe and optimize for, and no one person can do it.
As with everything, it depends on the context. Following a set of rules blindly just for the sake of it, when you and your team are producing good-quality code and have a good team culture, is absurd. However, relying on "best practices" guidelines might be useful in scenarios where you don't want to fall back on your own (or someone else's) subjective opinion, which doesn't necessarily lead to the best solution.
As a personal story: not so long ago, I was the technical leader of a team that had a tendency not to follow even the most basic set of common-sense standards. The code base's consistency and the practices around that code were an absolute mess. Chaos and opinions everywhere. Everyone had his/her own way of doing things. So in order to have everyone on the same page and producing something predictable and consistent, I had to heavily impose a set of best-practice guidelines. I felt kinda bad at the beginning, as I don't like to behave like a dictator, but the good thing is that, in the end, everyone felt the benefit, and new team members were able to catch up and become productive quickly because everything followed a pattern.
So, yes, best practices can be a useful resource for drawing a line when everything else is just an opinion with no technical support behind it (or poor support at best).
I don't agree with everything in the article (ML and blockchain are not "best practices" in any sense), but I do think cargo culting in software engineering is as alive and kicking as ever.
For example, there's been a definite shift in programming language trends away from dynamic languages and towards "new static languages" like TypeScript and Rust. In web development I've noticed that these languages, and all the promises of safety they bring, have also brought a pretty significant change in mindset. Pragmatism and "move fast and break things" are now totally out of fashion, and the new guardians of ultimate code quality would probably hate to admit that what they're practicing is a fashion. Compiler safety is heady stuff.
One of the key benefits of "Best Practices" is that, when they're provided by the documentation of something you're using, you can reliably treat them as good advice and follow them, even if you don't necessarily understand why they're the best practice.
An instance of this would be "Best Practices for Cloud Firestore"[1] which spends a whole lot of time discussing different ways of avoiding "hotspotting" without really going into any detail about what it is or what specifically causes it, other than that it adds latency. If my project manager asks me why we're not using sequential IDs, and I say it's a best practice according to the documentation, he'll happily accept that answer even if neither of us understand why it's a best practice.
[1] https://firebase.google.com/docs/firestore/best-practices
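For what it's worth, the usual explanation is that monotonically increasing document IDs concentrate writes on one narrow key range, while auto-generated IDs scatter them across the keyspace. Below is a minimal sketch of the difference, assuming the google-cloud-firestore Python client; the "orders" collection and the helper names are made up for illustration.

```python
# Sketch only: assumes the google-cloud-firestore client and a hypothetical
# "orders" collection; credentials/config are omitted.
from google.cloud import firestore

db = firestore.Client()

def create_order_sequential(counter: int, data: dict) -> None:
    # Sequential IDs (orders/00000001, orders/00000002, ...) cluster writes
    # on one "hot" range of the index.
    db.collection("orders").document(f"{counter:08d}").set(data)

def create_order_auto_id(data: dict) -> None:
    # A client-generated ID is effectively random, so writes spread out.
    db.collection("orders").document().set(data)
```

Whether the latency cost of the sequential version would ever matter at your write volume is exactly the context the "best practices" page doesn't give you.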
This article is written in an alien universe. "Best Practice" is the well-trodden path. It's typically easy to find, many people know about it, it avoids most of the pitfalls you don't know about, you can get help if you find yourself slightly lost, it's easy terrain to navigate, you can probably walk it or ride it, it gets you there without you putting in much more effort.
It is probably not the shortest path, nor the fastest, nor the most efficient path for a 4-horse carriage.
Here's a best practice: don't build Splunk infrastructure on Windows. Is Windows bad? No, it's fine, but fine-grained control, observability, the permission and logging framework, tooling: they're all a step down compared to being on *nix. Don't build Splunk infrastructure on BSD or Solaris either, even though they're *nix: not enough people use them, so when you run into trouble, there's no help. Does one have to "defend" and "explain" that every time? Does the author hate that principle? There is such a thing as best practice (no capitalization) whether one admits it or not. It's not for the old sea dogs, it's for the newly initiated; you don't need to know the whys at the very beginning. At the beginning, the best advice is not to fall into the ditch.
What I hate about "best practice discourse" is how it's used to shut down rational consideration of alternatives the "best practice promoter" doesn't like.
People at my org are currently attempting to forbid the use of a certain reactive framework because they claim it is incompatible with the "best practice" of MVC. In reality, reactive frameworks have been developed in part to provide a simpler, more predictable, more productive alternative to MVC.
I'll leave you with a long quote from Scott Adams that I'm always reminded of in this context:
> When you are trained in the ways of persuasion, you start seeing three types of people in the world. I’ll call them Rational People, Word-Thinkers, and Persuaders. Their qualities look like this:
> Rational People: Use data and reason to arrive at truth. (This group is mostly imaginary.)
> Word-Thinkers: Use labels, word definitions, and analogies to create the illusion of rational thinking. This group is 99% of the world.
> Persuaders: Use simplicity, repetition, emotion, habit, aspirations, visual communication, and other tools of persuasion to program other people and themselves. This group is about 1% of the population and effectively control the word-thinkers of the world.
A more accurate way to say “best practices” is to say “minimally acceptable”. This one trick is especially helpful when talking to management.
If everyone is doing it, then it's the bare minimum that you should be doing. In order to be the "best", you'll need to go far above and beyond "minimally acceptable".
When management hears "best practices", they think, "If we do that, we're done!", which is not true, but they don't understand that. When you try to improve things, management becomes an impediment, but they usually don't explain themselves, because they think it's obvious: we're already doing the "best", so why do we need to do anything else?
My life has become much easier since I figured this out. There definitely are best practices, but the phrase is often used by people as a tool to justify what they already do. It's also used a lot by people who don't actually want to do good work for their customers, only to impress their peers. "I have now refactored my code into proper modules to follow best practices. My design is now preventing me from implementing a feature my users actually need." The same goes for other stress-inducing phrases like "code smell" etc.
> When you want to specifically point out that maybe the code calculating taxes should not be referencing MongoDb, be specific: say it’s transgressing the Dependency Inversion Principle
I would argue that even that is still evading the actual argument. Why do we need the Dependency Inversion Principle?
Rather, I would argue that a good-faith argument would answer the following questions (a concrete sketch follows the list):
- What (potential) problem does it solve?
- How likely are we, specifically, to actually run into that problem?
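To make the questions concrete, here is a minimal sketch, with entirely hypothetical names, of what the Dependency Inversion Principle is supposed to buy in the taxes-vs-MongoDb example: the calculation depends on an abstraction it owns, not on a database driver.

```python
# Sketch only: RateRepository, TaxCalculator and MongoRateRepository are
# invented names for illustration, not anything from the article.
from abc import ABC, abstractmethod


class RateRepository(ABC):
    """Abstraction owned by the tax-calculation code."""

    @abstractmethod
    def rate_for(self, region: str) -> float: ...


class TaxCalculator:
    def __init__(self, rates: RateRepository) -> None:
        self._rates = rates  # depends on the interface, not on MongoDb

    def tax(self, region: str, amount: float) -> float:
        return amount * self._rates.rate_for(region)


class MongoRateRepository(RateRepository):
    """Infrastructure detail: lives at the edge, swappable in tests."""

    def __init__(self, collection) -> None:
        self._collection = collection  # e.g. a pymongo collection

    def rate_for(self, region: str) -> float:
        return self._collection.find_one({"region": region})["rate"]
```

Whether that indirection pays for itself is precisely the second question: how likely are you, specifically, to swap the storage out or to need the tax logic tested in isolation?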
In French we say "good practices" (bonnes pratiques) so at least there is room for improving them ;)
This post is written from my own point of view, where I often get annoyed by the rules people make for programming, engineering, and business generally. But to play devil's advocate: it's often very difficult to know exactly why we do some things, and when it would be bad to change a practice. And it's good, in my opinion, that most companies have a mix of the more independently minded engineers and more humble engineers that just want to follow the 'best practices' or trends.
Those people are going to CYA; you might not always appreciate it, but if everyone were always trying to pull the rug out, it would be chaos. They give engineers like us, who like to question these things, room to occasionally try new approaches to streamline or improve the 'best practices', without as much risk.
Best practices and received wisdom can be useful, but only when interpreted by someone with a degree of wisdom and judgement. As ever, the answer to whether you should do A or B is 'it depends'. Applying a best practice blindly is, ironically, a 'worst practice'.
Best practices are incredibly useful, and we need to learn from the accumulated wisdom of those who've gone before, but you also need to know when to modify or ignore them.
One example in networking of a "best practice" that quickly turned into a "worst practice" is disabling auto-negotiation. Back in the mid-1990s, when 100 Mbps NICs first came out, implementations of the new auto-negotiation protocol were quite buggy. The workaround at the time was to "lock down" network ports by disabling auto-negotiation. Somehow this quickly became canon, and an entire generation of network engineers swore this as their first commandment.
Meanwhile, the bugs with auto-negotiation were worked out within a few years, and by 2000 there was no reason to do this anymore and every reason not to:
Disabling auto-negotiation causes duplex mismatches. If not immediately, then eventually: it always does. There are too many ways one side of a link can get reset to the industry default (auto-negotiation enabled): replacing hardware, automatic NIC driver updates, staff turnover, and simply not checking each and every interface every time a change is made to the network. There is no practical way to keep such an inherently unstable system in sync long term.
Ironically, these inevitable duplex mismatches only seem to reinforce the belief that you must "lock down" every port everywhere, resulting in a vicious cycle of bad practice.
The best way to avoid duplex mismatches is to never disable auto negotiation unless you are absolutely forced to by some very old or very poorly designed device.
So the problem is that the practice is called "best"?
The idea is that, of all the practices for solving a problem, the "best practices" are those which often, if not always, proved best in as many contexts as possible (also in hindsight). Version control is usually considered a best practice.
I don't see a problem here. When you work with people who blindly follow what is called "best", cargo cult will not stop when you stop calling it best.
And besides that, I will always prefer "cargo cult" over "not invented here syndrome".
A great blog post echoing the same sentiments (that no longer appears to be there):
If you have ever dealt with me directly as a customer, attended one of my presentations, or even simply stomached one of my diatribes in a casual, technical conversation, you have no doubt heard about one of my pet peeves (no, not “Tips –n- Tricks” – we’ll visit that another time) but the term “best practices.”
I loathe that term. I know we are guilty of it at Microsoft – so are just about every single technology-based organizations. In our ever-evolving industry, to put something down in print as THE ABSOLUTE BEST PRACTICES goes against the very nature of the word “practice.” It can seem arrogant. It implies way too much finality. It also can mean serious disruption when a practice changes. I work in a consulting practice. As products, regulations, politics, and trends change, so do our recommended practices. This is why I purposely try to use “under the current recommended practice” or simply just “current recommended practices.”
Source: http://blogs.technet.com/b/gladiatormsft/archive/2014/08/25/...
"Best practices" is something you can advise on, write on, etc. - i.e. it is sellable.
Following (mandating) best practices immunizes (management) against being accused of doing things wrong. Might still fail, but then it failed despite doing everything "right".
Also, following "best practices" is much faster that creating solutions on your own: It will only get you to some sort of "average" in most cases, but that is what most places settle for these days.
Very often so-called best practices are really quite bad and ended up considered best practices more for marketing reasons than because of merit. Let me list some 'best practices' that are actually quite bad in many, if not most, circumstances: git flow, comments for every method parameter, microservices, feature branches, automated code formatting, scrum.
I used to have to argue with my dev team manager about this. He was a big fan of reading about some new-ish concept we weren't using yet and then telling the team he needed to spend a couple of weeks changing everything over to it. I'd be the only one trying to force him to give us a list of "pros" for it over what we currently did, and he wouldn't be able to answer; then he'd go off and half-implement it across the system without telling anyone, degrading our ability to maintain the system (since it was now less consistently written) and wasting his own time.
People need to take best practices with a pinch of salt, and actually think through whether they'll provide any benefit to them in practice for whatever software they're working on.
Completely agree with the article. And as with all such things, best practices have a time and a place.
Here's one great example of what not to do: mindlessly enforcing line-length limits. No, your hand won't get chopped off if you go over 80 chars (unless it's the Linux kernel, but they used to have good reasons for that).
If the first thing a dev does to your Python code is mindlessly chop it at 80 chars, he's just cargo culting.
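To illustrate (with made-up names and data), here is a wrap done purely to satisfy the limit next to a restructuring that actually helps the reader and happens to stay under it anyway:

```python
# Illustration only: Item and Order are invented stand-ins.
from dataclasses import dataclass


@dataclass
class Item:
    price: float
    quantity: int
    status: str


@dataclass
class Order:
    items: list


order = Order(items=[Item(9.99, 2, "shipped"), Item(5.00, 1, "pending")])

# Cargo-cult wrap: the expression is broken wherever column 80 happens to fall.
total = sum(item.price * item.quantity for item in order.items if item.status ==
            "shipped")

# Restructured for the reader; the limit takes care of itself.
shipped = (item for item in order.items if item.status == "shipped")
total = sum(item.price * item.quantity for item in shipped)
```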
Who defines a best practice? Do they have some particular authority?
What empirical evidence backs it up?
What makes it so unequivocally correct in all circumstances?
Have you mathematically proven it to be the "best"?
Is there a history of other practices that preceded it, with an explanation of why it is superior?
The answer is basically no/none/unknown to all of these.
I could often make an argument for smug “Considered Harmful” documents. They should be considered harmful.
I've dropped the "best" for "better" or "good" for a while now. I also don't have a problem going to a colleague with more experience and who I know puts out good stuff, and asking them what they think the "best" solution is. I gain some knowledge from asking them and looking at how they use it, and I don't feel stupid or like I'm doing something wrong by not understanding it inside-out. There's a middle ground that provides productive results. And at the end of of it all, if you didn't know what a plane or an airfield was and you saw flying metal contraptions delivering cargo, building an airfield isn't such a bad first step to understanding what's going on.
This is a problem I've heard others talk about but I think I've been really lucky because across the three places I've worked we've had code standards and best practices as living documents with ongoing discussions for reasoning about them and refining them.
By which I don't mean that there's a constant bickering about how to do things, of course. We have standards and people adhere to them in their daily work week. But these are established not by deferring to some vague authority, but by actual debates with plenty of examples and reflection on how it applies to our own code base.
A bugbear of mine for a long time has been the best practice of always running SQL Server databases in the full recovery model, with highly regular log backups.
It's great for important production services. In practice, though, I run test databases I could rebuild from scripts, archive databases that never see a write, and databases that load and drop automatically from other sources. Every single time someone looks at these, there's someone panicking about best practice. Something about this particular issue just makes critical thought nonexistent.
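For databases like those, the relevant knob is simply the recovery model. Here is a sketch (assuming pyodbc; the server, driver choice, and database names are made up) of flipping throwaway databases to SIMPLE recovery so their logs no longer need backups at all:

```python
# Sketch only: connection string and database names are hypothetical.
import pyodbc

DISPOSABLE_DBS = ["ScratchLoadDb", "NightlyImportStaging"]  # rebuilt from scripts

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};SERVER=myserver;Trusted_Connection=yes;",
    autocommit=True,  # ALTER DATABASE can't run inside a user transaction
)
cursor = conn.cursor()
for db_name in DISPOSABLE_DBS:
    cursor.execute(f"ALTER DATABASE [{db_name}] SET RECOVERY SIMPLE;")
conn.close()
```

The production databases that genuinely need point-in-time restore stay in FULL with regular log backups; the dogma only becomes a problem when it's applied to everything indiscriminately.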
The writer is correct in what (s)he is saying, but these are not "best practices". Best practice is things like not naming a variable "asdfg", not adopting AI just for the sake of the trend.
There's similar opposition to the "best practices" term in a 2008 book, e.g., search inside at https://www.google.com/books/edition/The_New_School_of_Infor...
For example: "they are highly unlikely to match the specifics of any particular environment, and so they are inefficient by their very nature"
The best developers I have ever worked with never said “Best Practice”. The worst/most inexperienced developers I have ever worked with say it all the time. Because they don’t have any clue what they are talking about. If you can’t explain why a so called “Best Practice” is useful in this specific context, and what the pros/cons are, then you should honestly just keep quiet and wait until you learn/know more.
Disappointed. I was hoping for a list of best practices and how their cost can be greater than their value.
Ah - I could've written this article (or at least a winding, ranty version thereof). Very good.
A best practice is often like a musician's rudiment.
It helps to build understanding.
Sometimes action precedes knowledge.
I agree completely with this article (but it probably depends on which area of development you work in).
I'm an android developer and whenever I hear the term best practice it immediately sets off my BS detector.
Best practice is what you won't get fired for doing. Most of us want to be able to support ourselves and our families more than we value exercising our individual engineering judgement.
> – “Do we really need machine learning?” - BEST. PRACTICE. Also let me tell you about blockchain.
This one really made me smile. Nailed it.
Two things:
"Best Practices" is a marketing trick. The fact of the matter is that software engineering (and programming, which I consider to be a technically separate subfield) is still coming out of its alchemy stage. We're someplace between the pre-jouleian thermodynamics and phlogiston phase. Like, it's better than wizards sitting in dark rooms experimenting with things which normal minds were not meant to see. However, we've got a bunch of rules that are either flat out wrong or almost definitely wrong.
So, we can either tell everyone that we're currently riding on a train built by crazy wizards and we're just now figuring out all the stuff we don't know. OR we can hype things up a bit to keep the muggles from freaking out too much.
Personally, I would like to just get by in life by telling the truth. However, many of my more successful peers are able to achieve much more than me by dressing up what they're doing with a good-sounding story. Part of that is "best practices". I don't really like it, but I have a hard time begrudging them, because it looks suspiciously like everyone knows what's going on and is in fact comforted by the language usage.
The other side of this is that the author wants to replace 'best practice' with 'pattern'. Honestly, I don't see what the difference is. I doubt you can actually define that term so that it's any better. And the true hucksters (given 5 minutes) can figure out how to spin 'pattern' to be just as problematic (if not more so) than 'best practice'. "I don't like the words you use. Use the words I like!" is a position that I'm rarely going to get behind. Like, from my point of view, people are just making a bunch of noises with their mouths either way.
The second one is the assertion that if you can't explain it, then you're cargo-culting. I don't like it when people can't explain what they're doing. I've been plagued by it for my whole career. Someone suggests something. It makes zero sense to me. I probe with questions. Eventually I hit the bedrock of their knowledge, and there's an embarrassing number of blanks still lying around. However, they're still able to be productive.
The very idea of "experience" is that you've got some know-how which you're unable to explain. If you could explain it, then it would be "technique". And this is something that permeates our lives. Like, how are you even reading this right now? Can you actually explain how your brain is recognizing letters, words, sentences, ideas? You're definitely not cargo cult-ing as you read this. The whole basis of ML is that we don't actually know how to explain a bunch of stuff to the computer so we just find clever ways of throwing statistics at the problem.
I find the cargo cult assertion wrong. Even though I really don't like it when people are unable to adequately explain what they're doing and/or telling others to do.
Lost me at "To begin with, if these practices are the best, there’s no reason why to search for better practices"
No one is saying best practices are the best, only that they're the best of what we're practicing. Are there better ones? Yeah, sure. Could I make a better solution than JSON? Maybe. But would anyone have a good, or even a mediocre, understanding of it? Probably not. Why? Well, it's not best practice.
Best practice is, like ITIL, just a way of doing things that's fairly proven to work well, and that many people know or understand to some extent. And it constantly changes.
Best practices are snake oil. The people using them are constantly making trade-offs when they don't have to, and they are so ignorant they can't recognise that their code is shit, because every time you say that, they hide behind its popularity or authority. They cannot believe all the people using their "best practice" are bad at coding. It's a closed loop of stupidity.
No one thinks blockchain is a best practice, and anyone who thinks they need one is a fool.