Avoiding Footguns

Footguns are features or designs that invite misuse, often leading to self-inflicted problems or bugs (“shooting yourself in the foot”). See the list of C functions banned in the git codebase for being footguns. Some more examples:

  • Inconsistent naming
  • Manual cleanup of connections or open files (see the sketch after this list)
  • Race conditions with async code
  • Multiple sources of truth
  • Long argument lists
  • Shadowing variables in deep scopes
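
As a sketch of the manual-cleanup item above (the file name and function names here are made up for illustration), compare a happy-path-only close with a context manager in Python:

    # Footgun: the file is only closed on the happy path.
    def read_config_leaky(path):
        f = open(path)
        data = f.read()
        if not data:
            return None          # early return: f is never closed
        f.close()
        return data

    # Safer: the context manager closes the file on every exit path,
    # including exceptions and early returns.
    def read_config(path):
        with open(path) as f:
            data = f.read()
        return data if data else None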

Avoiding footguns comes with experience — often, a footgun is perfectly legal code that compiles without complaint. Some languages eliminate certain footguns (sometimes by introducing new ones). For example, garbage-collected languages remove one class of memory-management footguns (at the expense of a GC).

Language-level footguns are probably the biggest class: mutable default arguments in Python, useEffect without a dependency array in React, and not closing connections in a defer block in Go or via the Drop trait in Rust.
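
A minimal sketch of the Python case (the function name is made up): the default list is created once, when the function is defined, and then shared across every call.

    # Footgun: the default list is evaluated once and reused across calls.
    def append_tag(tag, tags=[]):
        tags.append(tag)
        return tags

    print(append_tag("a"))   # ['a']
    print(append_tag("b"))   # ['a', 'b'] -- the list carried over from the first call

    # Conventional fix: use None as a sentinel and build a fresh list per call.
    def append_tag_fixed(tag, tags=None):
        if tags is None:
            tags = []
        tags.append(tag)
        return tags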

Linters can sometimes catch footgun constructions and surface them as warnings, but the most effective defense is learning to recognize them.
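Pylint's dangerous-default-value check and flake8-bugbear's B006 rule, for example, both flag the mutable-default-argument pattern sketched above.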


Originally posted on https://matt-rickard.com/avoiding-footguns

Justin Hunter

SaaS Business Advisor and SaaS Angel Investor

1y

Good points. At my last company, after we witnessed users of our tool shooting themselves in the foot by trying to do (in our minds) surprising and counter-intuitive things, we put appropriate safeguards in place. In some cases, “bans” (you can’t do that, because…). In other cases big bold “are you SURE you want to do that? It would normally be a terrible idea to do that unless you are trying to achieve this specific extremely rare edge case goal...” The tool was significantly better for it. This topic reminds me of “poka-yoke” solutions which proactively prevent footgun damage in both manufacturing processes and in customer-usage contexts. https://en.m.wikipedia.org/wiki/Poka-yoke
