Stop Overengineering

A personal appeal.

  1. Overengineering works against results. Shifting the focus away from results is never good, whether for personal projects, corporate codebases, or anything in between.
  2. Overengineering over-indexes on things you know. Overengineering invents new constraints instead of tackling real ones. The made-up constraints we tell ourselves are usually ones we already know how to solve (known knowns rather than unknown unknowns).
  3. Overengineering is not elegant. Overengineering yields complicated solutions. All other things equal, a machine with fewer moving parts is better than one with many.
  4. Overengineering is fragile (simplicity is antifragile). Generalizing abstractions rarely creates the optionality we convince ourselves it does. Overengineering leads to over-specification, which ironically leads to greater coupling.
  5. Overengineering increases maintenance costs. More engineering means more knowledge that must be transferred to coworkers and future contributors, a larger surface area for bugs, and ongoing upkeep costs.
  6. Overengineering is indirection. All problems in computer science can be solved by another level of indirection, except for the problem of too many layers of indirection.
  7. Overengineering is NPV negative, even for real concerns. What's the net present value of fixing an esoteric edge case? What happens in the failure mode? How often does the event occur? Overengineering is never a reasonable allocation of resources.
  8. Overengineering is a precise bet on the future. The more assumptions you make about the future, the more heavily it should be discounted.
  9. Overengineering misses deadlines. There has never been an overengineered product that was delivered on deadline.
  10. Overengineering does not work towards product-market fit (therefore, it works against it). Overengineering is never customer-centric.
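Point 6 is the one that shows up most directly in code. A minimal, hypothetical Python sketch of what an extra layer of indirection costs (the class and function names here are invented for illustration, not from any real codebase):

```python
# Hypothetical illustration: three layers of indirection
# (interface, concrete strategy, factory) to produce one string.
from abc import ABC, abstractmethod


class GreeterStrategy(ABC):
    @abstractmethod
    def greet(self, name: str) -> str: ...


class DefaultGreeter(GreeterStrategy):
    def greet(self, name: str) -> str:
        return f"Hello, {name}!"


class GreeterFactory:
    def create(self) -> GreeterStrategy:
        return DefaultGreeter()


# The overengineered path: factory -> strategy -> method call.
overengineered = GreeterFactory().create().greet("world")


# The simple version: one function, no moving parts.
def greet(name: str) -> str:
    return f"Hello, {name}!"


simple = greet("world")

# Same result, fewer parts to maintain, test, and explain.
assert overengineered == simple
```

The abstraction only pays for itself if a second `GreeterStrategy` ever exists; until then, every reader pays the cost of the indirection for free.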

Originally posted on https://matt-rickard.com/stop-overengineering

Kevin Minutti

AI/ML Software Engineer

1y

Solid advice; I've experienced a few of these costs myself. Some lessons need to be learned the hard way.

