What the Railroad Panic of 1873 and the 1979 Toyota Corolla Can Tell Us About OpenAI vs DeepSeek
Bloomberg reports that Amazon is in talks to invest up to $50B in OpenAI, as part of a funding round that could reach $100B, alongside Nvidia and SoftBank. The stated goal is more compute, more scale, and tighter strategic alignment.
The open question is what phase we are actually in. Is this a pre-1873 railroad moment, where scale results in overbuild and panic, or a 1979 Corolla moment, where Japanese efficiency quietly prevails over Detroit?
China’s DeepSeek points to the latter. Built under constraint, it delivers a large share of economically useful cognitive function at what appears to be a fraction of frontier cost. The exact ratio is uncertain, but even a modest efficiency advantage implies sharply diminishing returns to scale.
In neurobiology, energy-constrained systems prune and optimize to thrive; they do not simply expand.
OpenAI is betting this is a railroad moment. China is following the neurobiology of constraint. That divergence should be deeply troubling to anyone who studies how efficient cognitive systems actually evolve.