
“People have a very high degree of trust in ChatGPT, which is interesting, because AI hallucinates. It should be the tech that you don't trust that much.” - Sam Altman. Many haven't realized it yet, but it's 2004 all over again:

Some may have forgotten, but Business Insider once reported the following exchange between Zuckerberg and a friend from around 2004, referring to the early, Harvard-only version of Facebook:

"Zuckerberg: Yeah so if you ever need info about anyone at Harvard, just ask. I have over 4,000 emails, pictures, addresses, SNS

[Redacted Friend's Name]: What? How'd you manage that one?

Zuckerberg: People just submitted it. I don't know why. They 'trust me.' Dumb f---s."


It's 2025, 20+ years later, and instead of implementing guardrails, protective mechanisms, and transparency features to support users, tech CEOs laugh and brag about how users trust their tech products "too much."

Setting AI anthropomorphism aside (which is a major aggravating factor here), one of the things that leads users to trust too much is the absence of adequate design, privacy, and transparency features to help them navigate new technologies.

It happened in 2004 with social media, and it's happening again in 2025 with AI chatbots.

