The Security Policy That Made Passwords Less Secure
In the 2000s, every IT department had the same rule: change your password every 30 to 90 days, and mix letters, numbers, and symbols.
The goal was stronger passwords, but what we got was the opposite.
Users did what users always do under pressure. They picked Password1, then Password2, then Password3. They wrote them on sticky notes. They reused the same one across systems. Human memory has a limit. The policy didn't care.
In 2017, NIST's Digital Identity Guidelines (SP 800-63B) dropped the recommendation, advising against forced periodic rotation. The policy had made things worse.
This is the Law of Unintended Consequences:
"Whenever you change a complex system, expect surprise."
Robert Merton popularized the term in sociology. Engineers run into it weekly. Every change to a complex system can produce three kinds of result:
1. Unexpected benefits. The fix accidentally improves something else.
2. Unexpected drawbacks. The fix introduces a problem somewhere unrelated.
3. Perverse results. The fix makes the original problem worse. That's the password story, and the most dangerous of the three.
A few engineering versions:
You enable verbose logging to debug a production issue. The logs fill the disk and the service crashes. The fix made the system less stable than before.
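One way to hedge against that particular failure is to bound the logs before you turn up the verbosity. Here is a minimal Python sketch using the standard library's RotatingFileHandler; the path and logger name are made up for illustration:

```python
import logging
from logging.handlers import RotatingFileHandler

# Cap total disk usage at roughly maxBytes * (backupCount + 1),
# so a chatty DEBUG logger can't fill the volume during an incident.
handler = RotatingFileHandler(
    "/var/log/myservice/debug.log",  # hypothetical path
    maxBytes=100 * 1024 * 1024,      # 100 MB per file
    backupCount=5,                   # keep at most 5 rotated files
)
handler.setLevel(logging.DEBUG)
handler.setFormatter(
    logging.Formatter("%(asctime)s %(levelname)s %(name)s %(message)s")
)

logger = logging.getLogger("myservice")  # hypothetical logger name
logger.setLevel(logging.DEBUG)
logger.addHandler(handler)
```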
You finally fix a bug that's been in production for two years. A downstream service was quietly depending on the buggy behavior, and now it breaks. Hyrum's Law colliding with this one.
You want to raise code quality, so you require two approvals on every PR. Engineers respond by slicing work into trivial PRs that reviewers rubber-stamp. Reviews get shallower, not deeper. Quality drops.
You add an alert for every error in production. Within a month, there are too many alerts and the team stops looking. The next real outage runs for an hour before anyone notices.
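The usual counter-move is to alert on rates, not on individual errors. A hypothetical sketch of the idea; the threshold and window are invented numbers you would tune against your own traffic:

```python
import time
from collections import defaultdict

class ThresholdAlerter:
    """Page only when an error class exceeds a threshold within a
    time window, instead of paging once per error."""

    def __init__(self, threshold=50, window_seconds=300):
        self.threshold = threshold
        self.window = window_seconds
        self.events = defaultdict(list)  # error_class -> timestamps

    def record(self, error_class, now=None):
        now = time.time() if now is None else now
        self.events[error_class].append(now)
        # Keep only events still inside the window.
        cutoff = now - self.window
        self.events[error_class] = [
            t for t in self.events[error_class] if t >= cutoff
        ]
        if len(self.events[error_class]) >= self.threshold:
            # Reset so a sustained error storm pages once per window,
            # not once per event.
            self.events[error_class].clear()
            return True  # caller pages on-call
        return False
```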
Same pattern every time. Your change interacts with parts of the system you didn't model: the humans, the dependent services, the operational reality. Your fix lands in a system more complex than your mental model of it.
The discipline isn't predicting every consequence. That's impossible. It's expecting some, shipping in a way that lets you see them fast, and treating second-order effects as part of the design.
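"Shipping in a way that lets you see them fast" usually means staged rollouts: expose the change to a small, stable slice of traffic and watch the metrics before widening. A minimal sketch of deterministic percentage bucketing, with the flag name and percentage chosen only for illustration:

```python
import hashlib

ROLLOUT_PERCENT = 5  # start small; widen only after canary metrics hold

def use_new_behavior(user_id: str) -> bool:
    # Hash the flag name plus the user id so each user lands in a
    # stable bucket across requests and restarts. A fixed canary
    # population makes before/after metric comparisons meaningful.
    digest = hashlib.sha256(f"new-retry-logic:{user_id}".encode()).hexdigest()
    return int(digest, 16) % 100 < ROLLOUT_PERCENT
```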