Laws, anecdotes, common sense, -isms, idioms, and other shit people say

A catalog of "laws" tech people like to cite. Many are not laws. Some are sharp observations dressed up in lab coats. Some are folklore. A few are actual research.

Often there is no malice from the originators of these laws. The people who cite them, though, tend to use them opportunistically, to make the re-teller's point and justify a position (see Larman's Laws). With these "laws," as William Gibson put it, "the street finds its own uses for things."

In the spirit of these "laws," these have not been thoroughly vetted. However, as bi-directional centaur prose, written with and by AI, they sound good enough to be true, and perfect to cite in transformative thought-leadership keynotes.


§The classics everyone cites

§Conway's Law

"Organizations which design systems... are constrained to produce designs which are copies of the communication structures of these organizations."

Melvin Conway, "How Do Committees Invent?", Datamation, April 1968

The shape of the software ends up matching the communication structure of the people who built it. That is not necessarily the org chart: informal channels (the Slack DM, the lunch table, the cross-team Discord) often route around the formal hierarchy, and the seams in the software follow the seams in those communications, wherever they actually run. Used to argue for "inverse Conway" team structures, where you design the communication boundaries you want to see in the software, and as a diagnostic for why integration is harder than it should be. The 1968 paper was theoretical; the 2008 Microsoft study by Nagappan, Murphy, and Basili found that organizational metrics predicted defect density better than any code metric, which is about as strong as empirical confirmation gets for a "soft" law.

§Dunbar's Number

"There is a cognitive limit to the number of individuals with whom any one person can maintain stable relationships, that this limit is a direct function of relative neocortex size, and that this in turn limits group size."

Robin Dunbar, "Neocortex size as a constraint on group size in primates," Journal of Human Evolution, 1992

Humans top out around 150 stable social relationships, with smaller circles for close friends (~50) and intimates (~15). Used by management writers to set headcount limits, by social-network designers to argue for friend-list ceilings, and by anybody who wants to feel scientific about saying their company is "too big." The "150" is really a confidence interval (100-250) hammered into a round number by consultants; Lindenfors et al. (2021) argue the figure has no statistical basis at all, Dunbar disagrees loudly, and the truth is probably "there's some limit and 150 is in the range."

§Goodhart's Law

"When a measure becomes a target, it ceases to be a good measure."

Marilyn Strathern's pithy paraphrase, 1997, of Charles Goodhart, 1975

Used wherever measurement is doing political work: KPIs, school testing, ML benchmarks, OKRs, AI evals. The moment any of these are tied to compensation, Goodhart wins; people optimize for the metric rather than the thing the metric was meant to track. Originally Charles Goodhart's 1975 academic point about UK monetary policy, condensed by Marilyn Strathern in 1997; closely related to Campbell's Law (1976), which said much the same thing first.

§Brooks's Law

"Adding manpower to a late software project makes it later." Or: nine women cannot have a baby in one month.

Fred Brooks, The Mythical Man-Month, 1975

Used to argue against panic hiring on a slipping project, on the basis that onboarding cost and communication overhead grow faster than the new heads' contribution. The underlying mechanisms (ramp time, communication scaling as n²) are real; the constants change with tooling - async docs, AI coding assistants, smaller deploys - but the shape of the curve is the same as it was in the OS/360 era Brooks was describing.
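
The n² in the overhead argument is shorthand for pairwise communication channels, which a few lines make concrete (a sketch of the arithmetic, not Brooks's own figures):

```python
# Pairwise communication channels in a fully connected team of n people.
# Adding heads grows channels quadratically while output grows, at best, linearly.

def comm_links(n: int) -> int:
    """Number of distinct pairs in a team of n: n choose 2."""
    return n * (n - 1) // 2

for n in (5, 10, 20):
    print(n, comm_links(n))
# A team of 5 has 10 channels; doubling to 10 gives 45; 20 gives 190.
```

Each new hire must be wired into every existing channel they touch, which is why the marginal head on a late project can cost more coordination than they contribute.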

§API or You're Fired (the Bezos Mandate)

"All teams will henceforth expose their data and functionality through service interfaces... There will be no other form of inter-process communication allowed... Anyone who doesn't do this will be fired. Thank you; have a nice day!"

Steve Yegge, "Stevey's Google Platforms Rant" (2011), recounting Jeff Bezos's internal Amazon mandate, ~2002

Every Amazon team had to expose its data and capabilities as a network-callable API, no exceptions; the mandate is what turned Amazon's pre-2002 monolith into the service-mesh architecture that eventually became AWS. Used in any platform-engineering or API-strategy conversation as the canonical example of "expose everything, integrate nothing," and as the founding myth of microservices despite Amazon doing service-oriented architecture before "microservices" was a marketing term. The memo itself has never surfaced; the version everyone quotes is from Steve Yegge's 2011 Google Platforms Rant, which Yegge (an ex-Amazon engineer who had moved to Google) accidentally posted publicly while meaning to post internally - one of those rare cases where the secondhand telling, leaked by mistake, is the canonical source.

§Postel's Law (Robustness Principle)

"TCP implementations should follow a general principle of robustness: be conservative in what you do, be liberal in what you accept from others."

Jon Postel, RFC 761 (Transmission Control Protocol), 1980

Send strict, accept loose - tolerate sloppiness in your inputs, do not produce sloppiness in your outputs. Used as the design philosophy for early internet protocols and as the assumed default in browser HTML parsing; increasingly cited against itself, as security researchers (Sassaman, Patterson, Bratus, 2012) argue that "liberal in what you accept" creates exploit surfaces and protocol fragmentation. Gospel for decades; HTTP/2 and QUIC walked it back deliberately, which suggests the half-life of a "principle" in protocol design is roughly thirty years.
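
A toy illustration of the principle and its critique (made-up example, not from any RFC): accept several spellings of a boolean on the way in, emit exactly one on the way out.

```python
# Liberal reader, conservative writer. Every extra accepted synonym below is
# also extra parser surface, which is the security researchers' objection.

ACCEPTED_TRUE = {"true", "yes", "1", "on"}
ACCEPTED_FALSE = {"false", "no", "0", "off"}

def read_flag(raw: str) -> bool:
    """Liberal in what we accept: tolerate case, whitespace, synonyms."""
    token = raw.strip().lower()
    if token in ACCEPTED_TRUE:
        return True
    if token in ACCEPTED_FALSE:
        return False
    raise ValueError(f"unrecognized flag: {raw!r}")

def write_flag(value: bool) -> str:
    """Conservative in what we send: one canonical spelling, always."""
    return "true" if value else "false"

assert write_flag(read_flag("  YES ")) == "true"
```

The strict-send half is uncontroversial; the liberal-accept half is where HTTP/2 and QUIC walked the principle back.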

§Moore's Law

"The complexity for minimum component costs has increased at a rate of roughly a factor of two per year."

Gordon Moore, "Cramming more components onto integrated circuits," Electronics, 1965

Transistor density doubles every year (the 1965 figure; revised to two years in 1975), which is what made fifty years of cheaper computing possible. Used as the load-bearing assumption behind every roadmap and every "AI compute will get cheaper" argument; declared dead annually since at least Bob Colwell's 2013 Hot Chips keynote and re-declared dead by every chip-industry figure on a stage since. As of 2026, leading-edge density still progresses but cost-per-transistor has stopped falling, which is the part that mattered economically - an empirical observation that has aged into an epitaph.

§Wirth's Law

"Software is getting slower more rapidly than hardware is becoming faster."

Niklaus Wirth, "A Plea for Lean Software," IEEE Computer, 1995

Used to explain why a 2026 IDE on a 2026 laptop feels about the same as a 1996 IDE on a 1996 laptop, despite three orders of magnitude more compute under the hood. Wirth's complaint was specifically about feature bloat and abstraction layers; Electron, web frameworks, and language VMs have made the case for him in ways he could not have imagined; observation, not law, but the correlation has held for thirty years.


§Network and scale laws

§Metcalfe's Law

"The value of a network is proportional to the square of the number of users (n²)."

Robert Metcalfe, ~1980, popularized by George Gilder, 1993

Used in the dot-com era to justify telecom and social-network valuations on the back of an envelope, and still cited approvingly when somebody wants to argue that getting bigger is intrinsically more valuable. The math assumes every connection has equal value, which is plainly false; Briscoe, Odlyzko, and Tilly (2006) argue real social networks scale as n·log(n), and the empirical fit is much better; the n² version is observation and salesmanship dressed as law.
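
The gap between the two models is easy to see numerically (a sketch of the competing formulas, with illustrative user counts):

```python
import math

def metcalfe(n):
    # Value ~ n^2: assumes every pairwise connection is equally valuable.
    return n * n

def odlyzko(n):
    # Value ~ n*log(n): most connections are worth little (Briscoe/Odlyzko/Tilly).
    return n * math.log(n)

# How much more valuable does doubling from 1M to 2M users make the network?
print(metcalfe(2_000_000) / metcalfe(1_000_000))  # exactly 4x under n^2
print(odlyzko(2_000_000) / odlyzko(1_000_000))    # only about 2.1x under n*log(n)
```

The valuation consequences of that difference, compounded over several doublings, are most of what the dot-com arguments were actually about.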

§Reed's Law

"The utility of large networks, particularly social networks, scales exponentially with the size of the network (2^n)."

David Reed, 1999

Used in the early-2000s social-software era to argue that group-forming networks have qualitatively more value than simple person-to-person ones, on the grounds that the count of possible subgroups grows exponentially. The math is correct under absurdly generous assumptions (every subgroup has equal value, every user joins every subgroup); in practice the value distribution is power-law, not exponential. Speculation dressed as law, mostly cited now to explain why a particular social platform was supposed to be a hit.
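
The subgroup count behind the 2^n claim is standard combinatorics (a sketch; the formula excludes the empty set and singletons):

```python
# Reed's count of possible nontrivial subgroups in a network of n people:
# all subsets, minus the n singletons and the empty set.

def subgroups(n: int) -> int:
    return 2**n - n - 1

for n in (10, 20, 30):
    print(n, subgroups(n))
# 10 people already yield 1013 possible groups; 30 yield over a billion.
```

The math is why the claim sounds irresistible; the equal-value-per-subgroup assumption is why it fails.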

§The Eight Fallacies of Distributed Computing

"Essentially everyone, when they first build a distributed application, makes the following eight assumptions. All prove to be false in the long run and all cause big trouble and painful learning experiences."

L. Peter Deutsch (1994, original 7) and James Gosling (1997, added the 8th), at Sun Microsystems

Used to ward off any new engineer (or, increasingly, any LLM agent) who is about to ship a distributed system that quietly assumes the network behaves like a function call. The eight fallacies:

  1. The network is reliable.
  2. Latency is zero.
  3. Bandwidth is infinite.
  4. The network is secure.
  5. Topology doesn't change.
  6. There is one administrator.
  7. Transport cost is zero.
  8. The network is homogeneous.

Real working list, every item earned through actual production failures; every distributed system you have ever debugged was failing at #1 or #2 in some way you had not noticed.
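
A minimal antidote to fallacies #1 and #2 looks like this in practice: every remote call gets a timeout budget, a bounded retry, and backoff. A sketch, where `call_remote` is a made-up stand-in for a real network call:

```python
import time

def with_retries(call, attempts=3, base_delay=0.01):
    """Retry a flaky call with exponential backoff; re-raise if budget is spent."""
    last_error = None
    for attempt in range(attempts):
        try:
            return call()
        except ConnectionError as exc:
            last_error = exc
            time.sleep(base_delay * 2**attempt)  # back off: 10ms, 20ms, 40ms...
    raise last_error

# Simulated flaky endpoint: fails twice, then succeeds.
failures = iter([ConnectionError("flaky"), ConnectionError("flaky")])

def call_remote():
    try:
        raise next(failures)
    except StopIteration:
        return "ok"

print(with_retries(call_remote))  # "ok", on the third attempt
```

Code that assumes the happy path is code that has silently bet against fallacy #1; the retry loop is the bet hedged.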


§Software engineering folklore

§Knuth's Law / Premature Optimization

"We should forget about small efficiencies, say about 97% of the time: premature optimization is the root of all evil. Yet we should not pass up our opportunities in that critical 3%."

Donald Knuth, "Structured Programming with go to Statements," ACM Computing Surveys, 1974

Do not waste effort tuning code paths that do not matter, but do invest in tuning the few that do; the trick is knowing which 3% you are in. Used as a permission slip for refusing performance work and, less often, as a reminder that the 3% case is real and worth profiling for. Aphorism, widely quoted out of context; the truncated misquote ("premature optimization is the root of all evil") has done more damage than the original could have prevented.

§Spolsky's Law of Leaky Abstractions

"All non-trivial abstractions, to some degree, are leaky."

Joel Spolsky, "The Law of Leaky Abstractions," 2002

Used to explain why every "managed" service eventually requires you to know what is underneath it, and why DevOps engineers end up understanding the network stack whether they planned to or not. Observational; every cloud-native engineer learns it the second time their region goes down, which is when the abstractions stop hiding the underlying reality.

§Linus's Law

"Given enough eyeballs, all bugs are shallow."

Eric Raymond, The Cathedral and the Bazaar, 1999, attributed to Linus Torvalds

Used as the theoretical case for open source - more readers means more bug reports - and as a complacent reassurance about the security of widely-used libraries. The law holds for attention-getting bugs but not for buried ones: Heartbleed (2014) sat in OpenSSL for two years with millions of "eyeballs" on the source. Observation, partial, useful as a banner for FOSS but not as a security argument.

§Cunningham's Law

"The best way to get the right answer on the internet is not to ask a question; it's to post the wrong answer."

Attributed to Ward Cunningham by Steven McGeady; Cunningham himself disclaims it

Used to defend any number of bait-and-switch posts on social media, mailing lists, and Stack Overflow, and as the inadvertent business model of every comment section. Aphorism; not actually a law, but anyone who has watched a wrong-but-confident technical post attract corrections faster than the same author's earnest question knows it works in practice.

§Zawinski's Law

"Every program attempts to expand until it can read mail. Those programs which cannot so expand are replaced by ones which can."

Jamie Zawinski, ~1990s

Used to mock feature creep and to predict what every chat app, every CRM, every "platform" eventually becomes. The "mail" was 1990s shorthand for "messaging plus identity plus calendar," which means the law is not really about email per se but about the gravitational pull of unifying communication tools - replace "mail" with "chat" and the law applies to Slack, Notion, and Discord.

§Ninety-Ninety Rule

"The first 90 percent of the code accounts for the first 90 percent of the development time. The remaining 10 percent of the code accounts for the other 90 percent of the development time."

Tom Cargill, Bell Labs, ~1985

Used to explain why software projects always overrun: the messy edges, error handling, polish, deploy, and integration are larger than they look from the green-field beginning. Aphorism, painfully accurate, repeatedly confirmed by survey data on actual project delivery; cousin to Hofstadter's Law and to the planning fallacy.

§Hofstadter's Law

"It always takes longer than you expect, even when you take into account Hofstadter's Law."

Douglas Hofstadter, Gödel, Escher, Bach, 1979

Used in every estimation meeting as a wry shrug about why the schedule will slip even though everyone knows the schedule always slips. Aphorism; the recursive form is the joke and also the lesson, since accounting for the slip prompts more ambitious commitments that themselves slip.

§Gall's Law

"A complex system that works is invariably found to have evolved from a simple system that worked. A complex system designed from scratch never works and cannot be patched up to make it work."

John Gall, Systemantics, 1975

Used as the architectural argument for "start with a monolith," for evolutionary product development, and against any attempt to design a microservices mesh from a whiteboard. Observational; well-supported by decades of failed enterprise rewrites and by the survival pattern of every successful platform that started out as a one-person script.

§Eagleson's Law

"Any code of your own that you haven't looked at for six or more months might as well have been written by someone else."

Folk programming aphorism, frequently attributed to a Peter Eagleson; original source unclear.

Used to defend writing comments and good documentation against the "good code is self-documenting" school, on the empirical grounds that future-you is a stranger. Aphorism, attribution shaky, observably true; a six-month gap is roughly when the mental model you held while writing has decayed past free recall.

§Schneier's Law

"Anyone, from the most clueless amateur to the best cryptographer, can create an algorithm that he himself can't break."

Bruce Schneier, "A Plea for Simplicity," 1999

Used to argue against rolling your own crypto, your own auth, or any other security primitive that has a properly-engineered open-source equivalent. Observational and effectively axiomatic in security circles; the practical corollary is that the security of any system depends on review by people who were not paid to defend it.

§The Bus Factor

"The number of team members who must be hit by a bus before a project is in serious trouble."

Folk programming term, ~1990s

Used in code reviews and risk assessments to identify silent single points of failure - the engineer who is the only one who understands the deploy script, the architect who carries the model in their head. Folk term; the phrasing is grim by design and works as a forcing function because nobody wants to write up a postmortem that reads "Project derailed because Alex went on vacation."
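
A naive bus-factor estimate can be computed from ownership data (hypothetical repo and names, not a real tool's output): find the component with the smallest knowledge pool.

```python
# Map each component to the set of people who actually understand it.
# Hypothetical data for illustration.
ownership = {
    "deploy_script": {"alex"},
    "billing": {"alex", "sam"},
    "frontend": {"sam", "kim", "lee"},
}

def bus_factor(ownership):
    """Smallest knowledge pool across components: lose that many people
    (the wrong ones) and some component has no remaining owner."""
    return min(len(owners) for owners in ownership.values())

print(bus_factor(ownership))  # 1: only alex understands the deploy script
```

Real tools infer the ownership map from commit history, but the risk calculation is the same: the minimum, not the average, is what bites.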

§Larry Wall's Three Virtues of a Programmer

"We will encourage you to develop the three great virtues of a programmer: laziness, impatience, and hubris."

Larry Wall, Programming Perl, 1991

In Wall's gloss: laziness is effort spent to reduce effort, impatience is productive anger at slow code, and hubris is the pride that makes you write code other people will not want to insult. Programmers cite the line to defend writing tooling instead of doing the task by hand, on the theory that a good programmer would rather spend a day on a script than ten minutes on the job repeated for a week; management hears it stripped of its definitions and worries it is a permission slip. From the Camel Book; aphorism, but a load-bearing one for anyone who has met both kinds of programmer.

§The Iron Triangle (Project Management Triangle)

"Quality, schedule, cost - pick two."

Folk project-management aphorism, widely cited from the 1980s onward

A software project has three competing constraints - schedule (when), quality (how good), and cost or scope (how much, how big) - and you can only optimize for two of them; pushing on one corner deforms the other two. Used in every estimation conversation as a way to push back on stakeholders demanding all three, and as a diagnostic for any project silently sacrificing quality to hit schedule and budget. Folk aphorism rather than research; the underlying constraint-tradeoff math is real and shows up in formal project-management literature; the precise three-corner framing is a useful simplification of more complicated models.

§Larman's Laws of Organizational Behavior

"Organizations are implicitly optimized to avoid changing the status-quo middle- and first-level manager and 'specialist' positions and power structures."

Craig Larman, codified through his writing on Large-Scale Scrum, 2010s

Used to explain why every Agile, DevOps, and platform-engineering rollout ends up looking like the org chart that preceded it: the org chart is the immune system, and it has more practice than the consultants. The five laws:

  1. Organizations defend status-quo manager and specialist positions. (Above.)
  2. Change initiatives get reduced to redefining new vocabulary as the old behavior. Agile becomes "we already do stand-ups." DevOps becomes "we already deploy to prod."
  3. Change initiatives get derided as "purist," "theoretical," or "needing pragmatic local customization," which deflects from the weaknesses they were meant to address.
  4. Inspections that reveal serious problems get ignored, discounted, or attributed to some other group.
  5. (In large organizations) Culture follows structure. You cannot exhort your way to a new culture; you change reporting lines, incentives, and team boundaries first.

Practitioner observation rather than peer-reviewed research, sharp enough to read as satire; anyone who has watched an enterprise "adopt" a methodology can list specific instances of #2 and #3 from memory, and #5 is the inverse-Conway argument applied to culture.


§Organizational and economic laws

§Parkinson's Law

"Work expands so as to fill the time available for its completion."

C. Northcote Parkinson, The Economist, November 1955

Used to argue against generous deadlines on the grounds that any allotted time will be consumed regardless of the work's actual size, and as the cynical reading of how teams "always" run right to the deadline. The 1955 essay was satire about the British civil service - now widely cited as gospel by people who have not read it - but the empirical observation has held up well enough that the satire has aged into management orthodoxy.

§Pareto Principle (80/20 Rule)

"Roughly 80% of consequences come from 20% of causes."

Vilfredo Pareto's land-ownership observation, 1896, refined by Joseph Juran for quality control

Used to justify almost any prioritization scheme: the 20% of bugs causing 80% of crashes, the 20% of customers driving 80% of revenue, the 20% of features anyone uses. Real distributions tend to be power-law or log-normal rather than precisely 80/20 - the numbers are rule-of-thumb decoration on a real underlying skew - but the heuristic is operationally useful, which is why it has outlasted a hundred more rigorous frameworks.
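
The "rule-of-thumb decoration" point is easy to demonstrate: a Zipf-like distribution (value of item k proportional to 1/k, a standard stand-in for real skew) has the right shape but does not land on 80/20 exactly.

```python
# 100 items whose values follow 1/k: heavily skewed, but not precisely 80/20.
values = sorted((1 / k for k in range(1, 101)), reverse=True)
total = sum(values)
top_20_share = sum(values[:20]) / total

print(f"top 20% of items carry {top_20_share:.0%} of the total")  # ~69%
```

The skew is real and operationally exploitable; the specific 80 and 20 are branding.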

§Peter Principle

"In a hierarchy every employee tends to rise to his level of incompetence."

Laurence Peter, The Peter Principle: Why Things Always Go Wrong, 1969

People get promoted because they were good at their old job, until they reach a job they are bad at; then they stop getting promoted. The result is that every position tends to be filled by someone barely capable of doing it. Used to mock corporate hierarchies and as an argument against the "best engineer becomes the manager" career model. Originally satirical, but Benson, Li, and Shue (2018) found empirical support: top salespeople get promoted to managers and underperform - the joke turned out to be partially true.

§Dilbert Principle

"Companies tend to systematically promote their least competent employees to management positions to limit the damage they can do."

Scott Adams, "The Dilbert Principle," Wall Street Journal, May 1995

Used as a more cynical relative of the Peter Principle: not "people fail in management" but "management is where you put people you can't trust with real work." Aphorism with a comic-strip lineage, obviously not literal policy anywhere, but a more accurate description than HR would like to admit of certain promotion patterns at certain firms.

§Pournelle's Iron Law of Bureaucracy

"In any bureaucratic organization there will be two kinds of people: those who work to further the actual goals of the organization, and those who work for the organization itself. The Iron Law states that in all cases, the second group will gain and keep control."

Jerry Pournelle, frequently in his writing, ~1990s

Used to explain why universities accumulate administrators rather than professors, why open-source foundations grow paid staff faster than committers, and why any sufficiently large nonprofit eventually reorients around its own perpetuation. Observational and deeply pessimistic; consistent enough with bureaucratic outcomes that it functions as something stronger than aphorism even though it has no peer review behind it.

§Sayre's Law

"Academic politics are so vicious because the stakes are so low."

Wallace Sayre, attributed by Charles Frankel, 1960s

The lower the actual stakes of a dispute, the more energy people will pour into it - bikeshedding's older sibling. Used wherever a small-stakes argument has gone nuclear (committee meetings, internet forums, code-style debates) and as the precise reason your most-discussed ticket will be the one with the lowest impact. Aphorism, usually paraphrased ("the intensity of feeling is inversely proportional to the value of the issue at stake"); cousin to the Law of Triviality.

§Culture Eats Strategy for Breakfast

"Culture eats strategy for breakfast."

Attributed variously to Peter Drucker, W. Edwards Deming, and Mark Fields, then a Ford executive. Probably none of them, definitively.

Used at consulting offsites to explain why a transformation initiative failed - the org chart did not believe in it - and as a counterweight to anyone who thinks a memo can change a company. The earliest documented use is Mark Fields, around 2006; the Drucker Institute has confirmed Drucker never wrote the line and the Deming attribution is similarly thin, leaving an observation that is mostly true delivered through a quotation that is mostly fake.

§Technology Is Easy, People Are Hard

"Containers will not fix your broken culture."

Bridget Kromhout, "Containers Will Not Fix Your Broken Culture (And Other Hard Truths)," ACM Queue, 2018

Understanding, deploying, and using a piece of technology is often the easy part; changing how the organization functions to actually get value from it is the hard part. Used in DevOps, platform-engineering, and "digital transformation" conversations to push back on tool-centric solutionism - a Kubernetes migration cannot fix bad communication, a CI pipeline cannot fix a blame culture, and the AI rollout will not fix any of it either. Real Kromhout, real ACM Queue piece; cousin to Conway's Law, Larman's Laws, and Culture Eats Strategy, all variations on "the people problem is the actual problem."

§It Is Not Necessary to Change. Survival Is Not Mandatory.

"It is not necessary to change. Survival is not mandatory."

W. Edwards Deming, frequently in his lectures, 1980s

Organizations will reliably refuse the changes they need and accept extinction rather than discomfort, and pointing this out at them changes nothing. Used in management circles as bracing Darwinian advice ("evolve or die") to complacent executives, which gets Deming exactly backwards - he meant the futility of the change conversation, not the urgency. Solidly Deming, one of his most quoted lines, and one of his most consistently misunderstood.

§If You Break It, You Own It (the Pottery Barn Rule)

"You are going to be the proud owner of 25 million people. You will own all their hopes, aspirations, and problems. You'll own it all."

Colin Powell to George W. Bush, August 5, 2002, as reported in Bob Woodward's Plan of Attack (2004)

Used to argue that whoever destabilizes a system - a country, a vendor relationship, a legacy codebase - inherits responsibility for whatever comes after, whether they wanted it or not. The "Pottery Barn rule" branding was Tom Friedman's gloss in a 2003 New York Times column, and it stuck despite Pottery Barn having no such policy. Powell later denied using the phrase but confirmed the underlying argument: real Powell, real Friedman branding, fake retail policy.

§Powell's 40/70 Rule

"Use the formula P=40 to 70, in which P stands for the probability of success and the numbers indicate the percentage of information acquired. Once the information is in the 40 to 70 range, go with your gut."

Colin Powell, My American Journey (1995)

Do not decide with less than 40% of the information, do not wait for more than 70%; the last 30% you fetch usually does not change the answer, only the calendar date on which you give it. Used as a corrective to the engineer-default of trying to gather every input before committing, and as cousin to Amazon's "two-way doors," "disagree and commit," and similar bias-toward-action arguments. From Powell's autobiography and his leadership talks; aphorism with operational teeth.

§Bezos's Two-Pizza Rule

"Teams should be small enough that they can be fed with two pizzas."

Jeff Bezos, ~2002, recounted in Brad Stone, The Everything Store, 2013

Used as a heuristic for capping team size at roughly six-to-eight engineers, on the underlying claim that communication overhead grows nonlinearly with team size and is the dominant constraint above that count. The pizza framing is branding and has migrated into Amazon Leadership Principles material; the underlying point is Brooks's Law restated, and the empirical case is the same.

§Law of Triviality / Bikeshedding

"The time spent on any item of the agenda will be in inverse proportion to the sum of money involved."

C. Northcote Parkinson, Parkinson's Law and Other Studies in Administration, 1957

Used to mock disproportionate engineering attention to trivial details - the variable name, the button color, the bikeshed - while substantive decisions sail through unexamined. From the same author as Parkinson's Law; the "bikeshed" framing is from a famous 1999 post by Poul-Henning Kamp on the FreeBSD list, which is where the word entered the engineering lexicon. Benjamin Franklin's "penny-wise and pound-foolish" is the finance-side version (older than Franklin; it traces to Robert Burton's Anatomy of Melancholy, 1621). The theological version is debating how many angels can dance on the head of a pin, a 17th-century strawman accusation against medieval scholasticism that has outlived the actual debate it ridiculed.

§Jevons Paradox

"It is wholly a confusion of ideas to suppose that the economical use of fuel is equivalent to a diminished consumption. The very contrary is the truth."

William Stanley Jevons, The Coal Question, 1865

Making something cheaper increases total demand for it rather than reducing consumption; efficiency creates more demand, not less. Used to explain why cloud got cheaper and we now spend more on cloud, why AI made coding faster and we now ship more code, and why every "productivity gain" turns into more work to do; the load-bearing lens for every "this will free up time" argument that empirically does not. Originally an 1865 observation about British coal consumption, replicated in transport, lighting, and computing; one of the rare 19th-century laws that has gotten more relevant over time, not less - cousin to John Maynard Keynes's 1930 prediction that productivity gains would deliver a 15-hour work week by 2030, which we did not get for exactly Jevons reasons.

§The Power Law of Venture Returns

"Venture returns don't follow a normal distribution overall. Rather, they follow a power law: a small handful of companies radically outperform all others."

Peter Thiel, Zero to One, 2014

In a venture portfolio, most companies fail or break even and the entire fund's return rides on one or two outliers - the inverse of normal-distribution thinking. Used by VCs to justify nine misses for one hit and to defend bets that look reckless on a per-investment basis but rational at the fund level. Empirically robust across decades of fund data; philosophically contagious, in that it has migrated into product strategy and career advice in places where the math does not actually apply.
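
A stylized ten-company fund makes the outlier arithmetic concrete (made-up multiples on invested capital, for illustration only):

```python
# Return multiples for a hypothetical 10-company portfolio:
# six effective losers, three roughly break even, one 40x outlier.
multiples = [0, 0, 0, 0, 0.5, 0.5, 1, 1, 2, 40]

fund_multiple = sum(multiples) / len(multiples)
outlier_share = max(multiples) / sum(multiples)

print(fund_multiple)           # 4.5x fund, despite a 60% loss rate
print(f"{outlier_share:.0%}")  # the single winner is ~89% of all returns
```

Under this distribution, declining the "reckless" bet that became the 40x would have turned a great fund into a mediocre one, which is the whole argument.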

§Goldratt's Theory of Constraints (the Fat Boy Scout)

"An hour lost at a bottleneck is an hour out of the entire system. An hour saved at a non-bottleneck is a mirage."

Eliyahu Goldratt, The Goal, 1984

Until you fix the slowest part of your process, optimizing anywhere else is wasted effort, because the slowest part still gates everything else. Used in manufacturing, software delivery, and DevOps to argue against local optimization in favor of finding and lifting the binding constraint - illustrated in The Goal by a Boy Scout hike where the troop's pace is set by Herbie, the slowest scout with the heaviest pack, and the lesson is to redistribute Herbie's load before doing anything else. Goldratt's Theory of Constraints has solid empirical support in operations research; The Goal is arguably the most popular business novel ever written, partly because the bottleneck argument is genuinely correct and partly because Herbie is hard to forget.
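
The bottleneck argument reduces to one line of math: a serial pipeline's throughput is the minimum of its stage rates. A sketch with hypothetical stage names and items-per-hour rates:

```python
# Throughput of a serial process is set by its slowest stage (Herbie).
stages = {"cut": 120, "weld": 45, "paint": 90, "pack": 200}

def throughput(stages):
    return min(stages.values())

print(throughput(stages))  # 45: the weld station gates everything

# Doubling a non-bottleneck changes nothing; lifting the constraint does.
stages["paint"] = 180
assert throughput(stages) == 45
stages["weld"] = 90
assert throughput(stages) == 90
```

This is also why "an hour saved at a non-bottleneck is a mirage": the paint upgrade above is real spend for zero system-level gain.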

§Sinclair's Law

"It is difficult to get a man to understand something, when his salary depends on his not understanding it."

Upton Sinclair, I, Candidate for Governor: And How I Got Licked, 1935

People will not see what their paycheck depends on them not seeing - whether the issue is climate science, an architectural rewrite that would obviate their job, or a tool category their team's existence is justified by. Used to explain motivated reasoning in any domain where the reasoner has financial skin in the game and to ward off the comfortable assumption that "everyone just wants what is true." Real Sinclair, real source; one of those rare aphorisms that has gotten more applicable with time, especially in tech, where entire job categories have a structural interest in the persistence of the problems they were created to solve.

§Bullshit Jobs

[Image: cartoon of a long meeting table full of bored, glazed-over office workers]

"There are millions of people across the world... whose jobs are entirely pointless... Bullshit jobs are jobs which even the person doing the job can't really justify the existence of."

David Graeber, "On the Phenomenon of Bullshit Jobs," STRIKE!, 2013, expanded as Bullshit Jobs: A Theory, 2018

A substantial fraction of paid work in modern economies is, by the workers' own admission, pointless - jobs that exist for political, hierarchical, or symbolic reasons rather than because anyone actually needs the output. A 2015 YouGov poll inspired by the essay found 37% of UK workers saying their job made no meaningful contribution, and Graeber's taxonomy (flunkies, goons, duct-tapers, box-tickers, taskmasters) maps recognizably onto a lot of corporate org charts. Anthropological argument rather than rigorous social science, with the headline 37% number contested in later replications; the underlying phenomenon is real enough that the term has migrated from leftist anthropology into mainstream HR conversations.

§70% of Transformation Initiatives Fail

"More than half of [the companies in this study] failed in the early stages... and only a few were unqualified successes."

John Kotter, "Leading Change: Why Transformation Efforts Fail," Harvard Business Review, March-April 1995

A confident-sounding statistic - usually quoted as "70%" or "80%" - that says most corporate transformation programs fail; routinely cited at the start of slide decks selling the next transformation program. Used by management consultants and change-management practitioners to justify their fees, and by skeptical engineers to push back on the next reorg. The number is essentially folklore: Kotter's 1995 article, the universally cited headwater, never actually says "70%" - he wrote that "more than half" of his sample of about a hundred companies failed in the early stages, and the crisp percentage is a later invention, propagated through McKinsey and BCG citations of him; Mark Hughes (2011) traced five canonical citations of the 70% figure and found none with valid empirical backing.

§The Minto Pyramid Principle

"Start with the answer first, then group and summarize the supporting arguments."

Barbara Minto, The Pyramid Principle: Logic in Writing and Thinking, 1973

Structure communication with the conclusion at the top, supporting arguments grouped beneath it, and details further down - the inverse of academic writing's "build to the conclusion" approach, designed for executives who will stop reading after the first paragraph. Used as the standard McKinsey memo and presentation format and as the underlying logic of the journalistic inverted pyramid. Real method, widely-taught, especially useful in any environment where the audience is busier than the author.

§The Shit Sandwich

Open with good news, deliver the bad news, close with good news.

A standard structure for delivering criticism or unwelcome information without losing the audience; widely taught in management training, performance reviews, and HR contexts. Used because it works on the listener's attention curve and because most cultures treat undressed bad news as rude. Folk; reliably useful in low-context, formal-feedback settings, less so anywhere people parse only the substance and ignore the framing.

§The Reverse Open-Faced Shit Sandwich

Open with a lot of good news, then quickly deliver the bad news, then stop.

The American operating mode: 55 minutes of positive context wrapped around a 5-minute critique, where from the listener's point of view the 55 minutes is mostly noise and the 5 minutes is the entire point. As described in Erin Meyer's The Culture Map, people unfamiliar with this structure routinely miss the weight of the negative bit, because they do not realize the surrounding praise is meant to be discounted. Classic example: a performance review opens with all your achievements and closes with "oh, and just one last thing - you could do a little better on delegating to your team"; at the next review, the same manager asks "why didn't you focus on delegating to your team?"

§Whichard's EV Theory

"Stock price = story × revenue."

Often used by Brandon Whichard on Software Defined Talk.

A company's market value is roughly the strength of its narrative multiplied by its actual financial performance, and the bigger one is, the smaller the other can be. Used to explain why early-stage AI startups trade at multiples that look insane next to revenue (story is doing all the work) and why utilities trade at single-digit P/E (story is doing none of it). Folk theory rather than CFA-grade modeling, but a useful working frame; cousin to Soros's reflexivity and to the Halo Effect applied to public markets.

§POSIWID (The Purpose Of A System Is What It Does)

"The purpose of a system is what it does. There is, after all, no point in claiming that the purpose of a system is to do what it constantly fails to do."

Stafford Beer, lectures and writing on management cybernetics, 2001-2002

Ignore what the people running the system say the system is for; look at what it actually produces, and that output is the real purpose - intent does not matter, results do. Used in systems thinking, public-policy critique, and incident reviews to cut through stated mission, brand pillars, and good intentions: the prison system's purpose is incarceration regardless of its rehabilitation rhetoric, the on-call rotation's purpose is wearing down senior engineers regardless of its "customer focus" rhetoric. Stafford Beer's contribution to management cybernetics; widely cited in organizational analysis; the kind of aphorism that gets sharper the more politely you have to discuss the system in question.

§IKEA Effect (a.k.a. the Betty Crocker Cake Mix Anecdote)

Vintage Betty Crocker print ad: hand holding two eggs next to a frosted layer cake. Headline: 'Betty Crocker Cake Mixes bring you that Special Homemade Goodness... BECAUSE YOU ADD THE EGGS YOURSELF.'
The actual ad. Mid-1950s.

"Labor leads to love. We provide four studies in which consumers' valuations of self-created products were higher than those of objective observers."

Norton, Mochon, and Ariely, "The IKEA Effect: When Labor Leads to Love," Journal of Consumer Psychology, 2012

Used to explain why customers value products they helped assemble or configure more than equivalent off-the-shelf alternatives, and as the rigorous version of the Betty Crocker cake-mix story (in which General Mills supposedly forced housewives to add an egg so they would feel they were "baking" - a tidy narrative attributed to motivational researcher Ernest Dichter and almost entirely folklore). Real research with replicated results; the cake-mix anecdote is the same idea decades earlier with much shakier evidence behind it.


§Cognitive bias / human factors that get cited as laws

§Dunning-Kruger Effect

"People who are unskilled in a particular domain often hold inflated views of their performance in that domain."

Justin Kruger and David Dunning, "Unskilled and Unaware of It," Journal of Personality and Social Psychology, 1999

People who are bad at something do not realize they are bad, and people who are good at something often underestimate themselves; the basis for every "the smartest people are the most humble" LinkedIn post. Disputed: Nuhfer et al. (2017) and others argue the canonical chart is largely a statistical artifact of regression to the mean, and the popular folk-version (a precise V-shape of confidence vs. ability) is more robust as a meme than as research. Real research at the source, much weaker than the popular version implies.
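A toy simulation (my own illustration, not from either paper) makes the Nuhfer-style critique concrete: even if nobody has any metacognitive deficit - everyone's self-estimate is just noise around their true skill - binning people by their measured score mechanically produces the canonical chart, because extreme scores are partly extreme luck.

```python
import random

random.seed(0)

# Toy model: self-estimates and test scores are equally noisy readings of the
# same underlying skill. No "unskilled and unaware" effect is built in.
N = 10_000
people = []
for _ in range(N):
    skill = random.gauss(50, 10)
    test_score = skill + random.gauss(0, 15)      # noisy measurement
    self_estimate = skill + random.gauss(0, 15)   # independent, equally noisy
    people.append((test_score, self_estimate))

# Bin by measured performance quartile, exactly as the canonical chart does.
people.sort()
q = N // 4
for i, name in enumerate(["bottom", "second", "third", "top"]):
    chunk = people[i * q:(i + 1) * q]
    mean_score = sum(s for s, _ in chunk) / q
    mean_est = sum(e for _, e in chunk) / q
    print(f"{name:6s} quartile: score {mean_score:5.1f}, self-estimate {mean_est:5.1f}")
# The bottom quartile's self-estimates land above its scores and the top's
# below, purely from regression to the mean.
```

This does not prove the original effect is fake - only that the famous chart shape is also what pure noise produces, which is exactly why the popular V-shaped meme overstates the research.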

§Baader-Meinhof Phenomenon (the Frequency Illusion)

"A cognitive bias in which, after noticing something for the first time, people tend to notice it more often."

Named in 1994 by a St. Paul Pioneer Press commenter who happened to read about the German Baader-Meinhof Group twice in 24 hours; described academically as the "frequency illusion" by Stanford linguist Arnold Zwicky in 2005

Once something rare enters your awareness - a word, a concept, a model of car - you start seeing it everywhere; the canonical example is noticing a Volkswagen Bug and then seeing them on every block. Used to explain why the technology you just learned about feels suddenly ubiquitous and as a corrective to anyone confusing "I noticed it more" with "it happened more." Real cognitive psychology with a goofy name (the Baader-Meinhof Group has nothing to do with the phenomenon - the name is an accident); two mechanisms are at work, selective attention and confirmation bias.

§The Doorway Effect

"We forget what we walked into the room for."

Gabriel Radvansky, Sabine Krawietz, and Andrea Tamplin, "Walking through doorways causes forgetting," Quarterly Journal of Experimental Psychology, 2011

Walking through a doorway disrupts working memory in a way that the same number of steps in an open space does not - the brain treats a threshold as a "boundary" and dumps the active context. Real research, replicated; observed in physical doorways, virtual-reality doorways, and to a lesser extent in any task switch that involves entering a new spatial context. Used to explain the universal "I came in here for something" experience and as a reminder that working memory is fragile around boundaries.

§Imposter's Syndrome

"Despite outstanding academic and professional accomplishments, women who experience the imposter phenomenon persist in believing that they are really not bright and have fooled anyone who thinks otherwise."

Pauline Clance and Suzanne Imes, "The Imposter Phenomenon in High Achieving Women," Psychotherapy: Theory, Research & Practice, 1978

High achievers feel they have fooled everyone and that any moment now they will be unmasked as frauds, regardless of how competent they actually are. Used to name a specific kind of professional anxiety - common in tech, especially among engineers who change jobs faster than their skills can solidify - and as a generous reframing of "this person is hard on themselves" into "this is a known pattern with a name." Originally Clance and Imes's 1978 study of 150 high-achieving women, since extended far beyond the original sample; Clance herself prefers "imposter phenomenon" over "syndrome" because it is not a clinical diagnosis - useful as language, iffier as research the wider it gets applied.

§Hick's Law

"Decision time T = b · log₂(n + 1), where n is the number of equally probable choices."

W.E. Hick (1952), Ray Hyman (1953)

How long it takes a person to make a choice grows with the log of the number of options, not linearly; doubling the menu adds a fixed amount of time rather than doubling decision time. Used in interface design to argue against bloated menus and in product design to limit options at the moment of choice. Solid empirical research, replicated for decades; the practical implications are limited because people are usually choosing from familiar, not equiprobable, options. Related: The Paradox of Choice, or, the cereal aisle.
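The logarithmic growth is easy to see directly from the formula; a minimal sketch, with b = 0.2 as an illustrative constant rather than an empirically fitted one:

```python
import math

def hick_decision_time(n_choices: int, b: float = 0.2) -> float:
    """Hick-Hyman law: T = b * log2(n + 1).

    b is a per-person, per-task constant (seconds per bit); 0.2 here is
    illustrative only. The "+ 1" accounts for the option of not responding.
    """
    return b * math.log2(n_choices + 1)

# Doubling the menu adds a roughly fixed increment rather than doubling time:
for n in (4, 8, 16, 32):
    print(f"{n:2d} options -> {hick_decision_time(n):.3f} s")
```

Going from 16 to 32 options costs about as much extra time as going from 4 to 8 did - which is why pruning a menu from 32 items to 16 buys less than intuition suggests, and why the law argues hardest against the first few unnecessary options.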

§Pilgrim's Law

"A lot of effort went into making this effortless."

Mark Pilgrim, footer of his various websites and his book Dive Into Python

What may seem simple actually took a lot of work; the polish is the labor. Used as a quiet brag about good UX, good documentation, and any product where the seams have been hidden well; the antithesis of "this looks easy, why is it taking so long?" Related: Tesler's Law, "Every application has an inherent amount of irreducible complexity. The only question is who has to deal with it."

§System 1 and System 2

"System 1 operates automatically and quickly, with little or no effort and no sense of voluntary control. System 2 allocates attention to the effortful mental activities that demand it, including complex computations."

Daniel Kahneman, Thinking, Fast and Slow, 2011

Humans have a fast, intuitive, lazy, almost-always-running pattern-matcher and a slow, deliberate, expensive, easily-fatigued reasoner. Used to explain why smart people make dumb decisions when tired or rushed, and to argue for designing around System 1's limits with checklists, code reviews, and second-pair-of-eyes conventions rather than exhorting people to "think harder." Solid research with a popular metaphor that has gotten heavier than the science, especially in management-book treatments that treat the two systems as anatomy rather than convenient labels.

§What You See Is All There Is (WYSIATI)

"The measure of success for System 1 is the coherence of the story it manages to create. The amount and quality of the data on which the story is based are largely irrelevant... I will refer to it as WYSIATI, which stands for what you see is all there is."

Daniel Kahneman, Thinking, Fast and Slow, 2011

We judge a story by how coherent it is rather than by how much information underwrites it, so a tidy thin-evidence conclusion feels as confident as a messy well-evidenced one. Useful corrective on any postmortem, retro, or estimate, because the room is already overcommitted to the story it can already tell and is mostly blind to the data not present. Replicated, foundational to behavioral economics, and the reason every estimation meeting is also an exercise in noticing what is missing.

§Donald Rumsfeld's Theory of Knowing

Donald Rumsfeld at a Pentagon news briefing, February 12, 2002.
Donald Rumsfeld at the Pentagon, February 12, 2002.

"There are known knowns - there are things we know we know. We also know there are known unknowns; that is to say we know there are some things we do not know. But there are also unknown unknowns - the ones we don't know we don't know."

Donald Rumsfeld, US Department of Defense news briefing, February 12, 2002

The line is dense but plainly means: there are things you know, things you know you do not know, and things it has not occurred to you that you do not know. Used by project managers and risk analysts to insist on a budget line for the third category - "things we have not thought of yet" - rather than letting unknown-unknowns silently eat the schedule. Rumsfeld won the Foot in Mouth award for the delivery, but the taxonomy itself is older than 2002, derived from the 1950s Johari Window; Slavoj Žižek later added the unknown knowns (things we know but refuse to admit).

§Anchoring

"If you are asked whether Gandhi was more or less than 144 years old when he died, you will end up with a much higher estimate of his age at death than you would if the anchoring question referred to death at 35."

Daniel Kahneman, Thinking, Fast and Slow, 2011

The first number you see drags every subsequent estimate toward it, even when the first number is obviously irrelevant. Used as the cornerstone of negotiation tactics - never name your price first, never let an engineer name a deadline last - and as a warning that "feels low" usually means "lower than the previous number you saw five minutes ago." Replicated robustly since Tversky and Kahneman demonstrated it with a rigged wheel of fortune and questions about UN membership; anchors survive even when subjects are warned about them, which is the disturbing part.


§Availability Heuristic

"We defined the availability heuristic as the process of judging frequency by 'the ease with which instances come to mind.'"

Daniel Kahneman, Thinking, Fast and Slow, 2011

We estimate how common something is by how easily examples come to mind, so vivid and recent events feel more probable than dull and distant ones, regardless of base rates. Used to explain why news coverage warps risk assessment (sharks, plane crashes, last week's outage) and why "common bugs" usually means "last week's incidents" in any team's collective memory. Replicated, foundational; the practical takeaway is that anyone who cites "what comes to mind" as evidence is mostly citing themselves.

§Loss Aversion / Prospect Theory

"The loss aversion ratio has been estimated in several experiments and is usually in the range of 1.5 to 2.5."

Daniel Kahneman, Thinking, Fast and Slow, 2011, summarizing Kahneman & Tversky, "Prospect Theory: An Analysis of Decision under Risk," Econometrica, 1979

Losses hurt about twice as much as equivalent gains feel good, so people will spend disproportionate effort to avoid a loss they would not have spent equivalent effort to capture as a gain. Used to explain why teams cling to bad architectures, sunk-cost projects, and features that ten loud users will scream about while a thousand silent users would happily migrate. The finding won Kahneman the 2002 Nobel (Tversky died in 1996, ineligible) and survived the replication crisis that took out adjacent psychology findings (priming, ego depletion); foundational, not folklore.
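The asymmetry has a standard functional form. A minimal sketch of the prospect-theory value function, using the median parameter estimates from Tversky and Kahneman's 1992 follow-up (alpha = 0.88, lambda = 2.25 - a loss-aversion ratio inside the 1.5-2.5 range quoted above):

```python
def prospect_value(x: float, alpha: float = 0.88, lam: float = 2.25) -> float:
    """Prospect-theory value function: concave for gains, steeper and
    convex for losses. lam ("lambda") is the loss-aversion ratio.
    """
    if x >= 0:
        return x ** alpha
    return -lam * (-x) ** alpha

# A $100 loss hurts more than twice as much as a $100 gain feels good:
gain, loss = prospect_value(100), prospect_value(-100)
print(gain, loss)  # |loss| / gain = 2.25, by construction
```

The shape, not the exact parameters, is the point: the curve is steepest right around zero, which is why a small loss reliably outscreams a small gain in any feature-removal conversation.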

§The Halo Effect

"The estimates were apparently affected by a marked tendency to think of the person in general as rather good or rather inferior, and to color the judgments of the qualities by this general feeling."

Edward Thorndike, "A constant error in psychological ratings," Journal of Applied Psychology, 1920

A single positive impression bleeds into every other judgment about the same thing, so we rate good-looking people as smarter and successful companies as well-managed without further evidence. Used to puncture business books that explain a company's success by listing its current habits, since the same habits at firms that went bankrupt almost never make the dataset; Phil Rosenzweig made this case in The Halo Effect (2007) with In Search of Excellence, Built to Last, and Good to Great as the case studies, and Rick Chapman's In Search of Stupidity (2003) is the tech-industry counter-canon. Real 1920s psych at the source, devastating critique at the business-book level.


§Internet-era aphorisms

§As Said by Franklin... or Was It Twain?

Benjamin Franklin, 1767 portrait by David Martin.
Benjamin Franklin, 1767 (David Martin portrait).

"As said by Franklin... or was it Twain?"

Folk attribution; the joke is older than either of them.

Some quotes and sayings are misattributed, and they tend to attach to a small set of famously witty people - Benjamin Franklin, Mark Twain, Winston Churchill, Yogi Berra, Albert Einstein, Oscar Wilde - regardless of where the line actually came from. Used to flag any quote that "sounds like Twain" with appropriate skepticism, and as the meta-rule explaining why so many entries on this page have shaky attributions in the first place. Famous quippers attract attribution; the attribution drifts toward the brand, not the originator, and once a line is on a Hallmark card under "Mark Twain" it is never coming back.

§Streisand Effect

"Attempts to remove a piece of information from public view, especially online, often have the unintended consequence of publicizing the information more widely."

Mike Masnick, Techdirt, 2005, after Barbra Streisand's 2003 lawsuit

Used to mock anyone who tries to suppress a story (a takedown notice, a defamation lawsuit, an attempt to scrub a leak) on the grounds that the suppression itself is the most reliable amplification mechanism available. Named for Streisand's $50M suit over an aerial photograph of her Malibu house, which had been viewed six times before the lawsuit and 420,000 times in the month after; observation, well-documented, reliable enough that any communications professional should treat it as a planning constraint.

§Godwin's Law

"As an online discussion grows longer, the probability of a comparison involving Nazis or Hitler approaches 1."

Mike Godwin, ~1990

Used on Usenet, mailing lists, and now social media as a tongue-in-cheek shorthand for "this thread has run its course." Observation, originally a meme-engineering experiment by Godwin to discourage gratuitous Hitler comparisons; the law has held up despite Godwin himself noting the obvious exception for arguments where the Nazi comparison is actually warranted.

§Poe's Law

"Without a clear indicator of the author's intent, parodies of extreme views are indistinguishable from sincere expressions of those views."

Nathan Poe, on a Christianforums.com message board, 2005

Used wherever satire and sincere belief converge - flat-earth posts, parody startups, far-right TikToks - on the basis that without an explicit wink, sincere and parody readings are statistically indistinguishable. Observation; named after a 2005 forum post about creationism that has since become broadly applicable, and as social media has flattened context, the law has gotten stronger rather than weaker.

§Betteridge's Law of Headlines

"Any headline that ends in a question mark can be answered by the word no."

Ian Betteridge, "TechCrunch: Irresponsible journalism," February 2009

Used by skeptical readers as a quick triage tool for content marketing and lazy journalism: if the editor would commit to "yes," they would have written it as a statement. Aphorism, useful, statistically defensible because question-mark headlines are a polite way of overstating a thinly-supported claim while preserving deniability; cousin to clickbait and to all the structural reasons online publishing is what it is.

§Sturgeon's Law

"Ninety percent of everything is crap."

Theodore Sturgeon, in defense of science fiction, 1957

Used to deflect any "this medium is bad" critique - true of every medium, every era - and to set the right expectation for sampling: if you read ten random things in a field, nine will be unmemorable. Aphorism, well-aged, originally a defense of science fiction to readers who had only encountered the bottom decile; the 90% figure is rhetorical but the order of magnitude is correct.

§Hanlon's Razor

"Never attribute to malice that which is adequately explained by stupidity."

Robert J. Hanlon, Murphy's Law Book Two, 1980; older variants exist

Used as the default first hypothesis when something has gone wrong - bad faith is rare, sloppy execution is common - and as a corrective to anyone who reaches for conspiracy when incompetence will do. Aphorism with older variants attributed to Heinlein, Goethe, and Napoleon; the modern phrasing is Hanlon's; reliable in most ordinary settings but breaks down in adversarial ones, where assuming stupidity gets you compromised.

§Occam's Razor

"Entities should not be multiplied beyond necessity."

William of Ockham, 14th century

Contrary to popular use: this is a principle of theoretical economy - prefer the explanation with fewer entities or assumptions when explanations are otherwise equivalent - rather than a directive that "the simplest answer is correct" in everyday life. Used in science as a guide for choosing among competing models, in engineering as an argument against unnecessary abstraction, and in everyday rhetoric as a license for laziness. Philosophical principle, not a law of nature, and routinely misapplied as a synonym for "I don't want to think about this carefully."

§Chesterton's Fence

"Don't ever take a fence down until you know the reason it was put up."

G.K. Chesterton, The Thing, 1929

Used in code reviews and refactoring conversations to argue against ripping out apparently-pointless code until you understand why it was added; the same logic applies to org policies, team rituals, and any old guardrail. Aphorism, durable, widely-cited because the failure mode (deleting load-bearing code that turns out to be there for a reason) is so common that the principle pays for itself the first time you remember it.

§Segal's Law

"A man with a watch knows what time it is. A man with two watches is never sure."

Folk attribution; surfaces ~1930

Used to mock the false comfort of a second source - more data does not necessarily mean better data, especially when the sources disagree - and as a warning about averaging conflicting estimates rather than picking the better one. Folk aphorism, attribution shaky, and the underlying point about measurement disagreement is real enough that the statistics literature has formal versions of the same joke.

§Better to Ask Forgiveness Than Permission

"It's easier to ask forgiveness than it is to get permission."

Grace Hopper, Navy Microcomputer User's Group Conference, 1986; phrasing also attributed to several others

If you are confident an action is the right one, do it and apologize later rather than wading through a permission process designed to default to "no" by attrition. Used in startups, internal-tooling projects, and any large org with slow review processes; cousin to Powell's 40/70 rule and to "it's only stupid until it works." Most reliably attributed to Grace Hopper in a 1986 talk, though the underlying spirit is older folk wisdom; reliable in low-stakes, two-way-door settings, dangerous in high-stakes one-way-door ones.

§The Street Finds Its Own Uses for Things

"The street finds its own uses for things - uses the manufacturers never imagined."

William Gibson, "Burning Chrome," Omni, July 1982

People will use a technology in ways the designers never imagined, and the unintended uses will outnumber the intended ones; the user, not the engineer, decides what a tool is actually for. Used to defend any creative misuse of a product (Twitter as journalism, Slack as a CRM, ChatGPT as a therapist), to predict that any feature you ship will be used in three ways you did not plan for, and as the most generous reading of Larman's Law applied to consumer products. Real Gibson, real source; cousin to POSIWID; the most-quoted line from Gibson's pre-Neuromancer short fiction.

§Physician, Heal Thyself

"Physician, heal thyself."

Luke 4:23 (King James Version), drawing on an older Greek proverb

Anyone offering criticism or advice should first apply it to their own situation before dispensing it to others. Used as the standard comeback to a critic who is plainly committing the same sin they are calling out, and as the standard reply to anyone who recommends a discipline they themselves do not practice. Old enough to predate the Gospels (the Greek "Iatre, therapeuson seauton" was already proverbial when Luke quoted it), which means it has been working for at least two millennia and is unlikely to stop now.

§Amara's Law (a.k.a. the "1-Year / 10-Year" Quote)

"We tend to overestimate the effect of a technology in the short run and underestimate the effect in the long run."

Roy Amara, Institute for the Future, ~1973

New technology is a flop in year one and dominant in year ten; nothing changes in the next quarter and everything changes in the next decade. Used to dismiss short-term hype cycles while preserving the case for long-term compounding effects: VR was nothing for fifteen years and then quietly became headset-shaped, AI replaces nothing this quarter and reshapes the economy by 2035, and so on. Often misattributed to Bill Gates ("most people overestimate what they can achieve in a year and underestimate what they can achieve in ten years"), which is the same idea in different words; Amara's version (1973) predates the Gates version by a couple of decades and is probably the original.

§On Bullshit

"It is impossible for someone to lie unless he thinks he knows the truth. Producing bullshit requires no such conviction."

Harry Frankfurt, On Bullshit, Raritan Quarterly Review, 1986; expanded as a book in 2005

Bullshit is speech intended to persuade without regard for whether it is true - distinct from lying, which requires the speaker to know the truth and contradict it; the bullshitter is indifferent to truth, the liar is opposed to it. Used in media criticism and increasingly in AI-output discussions to name a particular failure mode that ordinary words like "false" or "wrong" do not capture; large-language-model output is often pure Frankfurtian bullshit, plausible-sounding speech with no underlying commitment to accuracy. Real philosophy paper from a Princeton professor; surprise bestseller in book form; the framework has aged extremely well, especially since 2022. Related: "if it's bullshit work, have the bullshit artist do it."

§Your Scientists Were So Preoccupied With Whether They Could

"Your scientists were so preoccupied with whether or not they could, they didn't stop to think if they should."

Dr. Ian Malcolm, in Michael Crichton's Jurassic Park (1990) and the 1993 film

Used to mock any technology that shipped without anyone asking what would happen if it worked - cloning dinosaurs, deepfakes, social-media feed algorithms, generative AI agents, the entire genre. The line has aged well enough that it now functions as a generational warning to a cohort of tech founders who saw the movie as kids. Real Crichton, real Spielberg; one of those rare cases where the popular mockery doubles as a perfectly good piece of ethics.

§Everybody Needs Money

"Everybody needs money. That's why they call it money."

Mickey Bergman (Danny DeVito), in David Mamet's Heist, 2001

§It's Only Stupid Until It Works

"It's only stupid until it works."

Used to defend an idea that conventional wisdom rejects, on the grounds that the dismissal itself is often why the idea is undervalued - if it sounded smart, somebody with capital would already be doing it. Also invoked after the fact, once the Halo Effect has rebranded a lucky success as having been obviously smart all along.


§Inside Jokes, etc.

Some writers and podcasters maintain a private verbal currency: phrases used so often they function as in-jokes for the regular reader. The phrases are not laws, but they are dense little objects the audience learns to read.

§Tylerisms (Tyler Cowen)

Recurring phrases on Marginal Revolution and on Cowen's Conversations with Tyler podcast.

§Merlinisms (Merlin Mann)

Most of these live in Mann's wisdom file on GitHub, with the older ones from the 43 Folders era.


§Anecdotes

These are not laws. They are stories tech people tell to make a point, and the stories outlast the points they were originally meant to illustrate.

§The Drunk Under the Lamppost (the Streetlight Effect)

Cartoon of a drunk man on his hands and knees beside a streetlight, an empty bottle behind him.

A drunk is searching for his keys under a streetlight. A passerby asks if he is sure he dropped them there. "No," says the drunk, "I dropped them in the park, but the light is better here."

Used to mock methodology that searches where the data is convenient rather than where the answer might be: macro KPIs because they are easy to instrument, A/B tests because they are quick to run, AI benchmarks because they are public; in tech, every "data-driven" decision that mistakes "data we have" for "data we need." Older than its modern name and sometimes attributed to Mulla Nasruddin of Sufi tradition; behavioral economists call the formal version the streetlight effect; a parable that turned out to have a real cognitive-science name attached.

§The Milkshake Story (Jobs to Be Done)

A fast-food chain wanted to sell more milkshakes. Recipe and price tweaks moved nothing. A consultant hung around the restaurant and noticed people were buying milkshakes early in the morning, alone, and taking them back to their cars. The job they were hiring the milkshake to do was not "have a tasty drink" but "make a long, boring commute survivable" - the milkshake was filling, lasted thirty minutes, could be drunk one-handed, and did not produce crumbs.

Used to argue that customers buy progress on a problem, not features, and that the way to improve a product is to study the situation in which it is used rather than the product itself. From Bob Moesta and Clayton Christensen's work at McDonald's in the early 2000s and the canonical illustration of Jobs to Be Done; a real consulting engagement, almost certainly tidier in the retelling than in the conference room, useful framework regardless.

§Rats and Cocaine

Cartoon of a pack of wide-eyed rats running away from an untouched bowl labelled FOOD.

Lab rats given access to a lever that delivers cocaine will press it in preference to food and water until they collapse from starvation, dehydration, or exhaustion.

An organism will choose immediate, intense pleasure over long-term survival if the pleasure circuit is hooked up directly enough; observed in Olds and Milner's 1954 brain-stimulation studies and in the cocaine self-administration experiments through the 1960s and 70s. Used in addiction medicine, behavioral economics, and increasingly in screen-time and social-media discussions as the canonical illustration of "this product is designed to exploit a circuit your ancestors did not need to defend against." Real research, replicated; the popular version oversimplifies - Bruce Alexander's Rat Park experiments (1978) showed rats in a richer environment chose morphine far less, complicating the "drugs are irresistible" narrative; cousin to Harry Harlow's wire-monkey vs. cloth-monkey attachment experiments, which made an analogous point about emotional needs the lab-rat framing misses.

§The Three Envelopes (in the President's Desk)

A new CEO arrives at her office. Her predecessor has left three sealed envelopes in the desk drawer, marked "Open in case of emergency: 1, 2, 3." Six months in, the first crisis hits; she opens envelope #1: "Blame the previous administration." It works. A year later, the second crisis hits; envelope #2 reads: "Reorganize." It works. Two years in, the third crisis hits; she opens envelope #3: "Prepare three envelopes."

Used to mock the limited toolkit of executive crisis-response: blame-the-predecessor, then reorganize, then exit. Folk joke with multiple variants - sometimes attributed to the Soviet politburo (Khrushchev to Brezhnev), sometimes to Eisenhower, sometimes just "an old story" - all of which is to say nobody knows where it came from, and the joke survives on its own merit.

§The Challenger O-ring (Tufte's Case Study)

The night before Challenger launched, Morton Thiokol engineers warned NASA that the cold-weather forecast risked O-ring failure. They sent thirteen charts. None of them plotted O-ring damage against launch temperature; the relationship - clear in the data, fatal at 31°F - was invisible in the presentation. Challenger broke apart 73 seconds into flight on January 28, 1986.

Used to argue that bad information design is not a cosmetic issue: people died because a chart was hard to read. Edward Tufte's reconstruction in Visual Explanations (1997) shows the correctly-drawn version - one scatterplot of damage against temperature - and the relationship is unmissable. Real disaster, real Tufte critique; cousin in spirit to the Columbia disaster and to Diane Vaughan's normalization of deviance; working consensus that the information-design lesson is correct even if it is not the only lesson.


§Dad Wisdom

Common sense, or just dumb shit dads say that kids never seem to listen to. Called dad wisdom because mothers more often resist the urge to cover their frustration with condescending quips.

§Look Somewhere You Haven't Looked Yet

"If you can't find something, look somewhere you haven't looked yet. And if you still can't find it, look everywhere you already looked, again."

The place you have already searched is, by definition, not where the thing is, so the place you have not searched is the only place it could still be; if exhaustive search fails, search again because you missed it the first time. Used by every father pulled into a "Dad, where are my keys / shoes / homework / phone?" search, delivered with the thousand-yard stare of generic frustration. Folk wisdom, bulletproof, impossible to argue with - which is the entire problem when you are eleven. Related: "it's always the last place you look."

§Do As I Say, Not As I Do

"Do as I say, not as I do."

Attested in John Selden's Table Talk, mid-1600s; almost certainly older

Used as the universal cover when an authority figure asks someone else to behave in a way the authority figure obviously does not - smoking, drinking, swearing, working too late, eating their vegetables. Honest about its own hypocrisy while preserving authority, which is more than most institutional disclaimers manage.


§Scheduling

§If You're Early, You're Never Late

"If you're early, you're never late."

By definition, if you arrive before the scheduled time, you will also be there at the scheduled time; the framing is bulletproof and was clearly invented by someone who valued punctuality more than the comfort of the people they were managing. Used as the morning lecture from a parent who has been awake since 5 a.m., and as a leadership-conference one-liner about respect for other people's time. The standard rebuttal, equally Dad-coded: "It's not that I was late - it's that this meeting was scheduled too early."

🍕 Get more pizza.