Think Better

Over the years, many of us have become accustomed to letting computers do our thinking for us. “That’s what the computer says” is a refrain in many bad customer service interactions. “That’s what the data says” is a variation—“the data” doesn’t say much if you don’t know how it was collected and how the data analysis was performed. “That’s what GPS says”—well, GPS is usually right, but I have seen GPS systems tell me to go the wrong way down a one-way street. And I’ve heard (from a friend who fixes boats) about boat owners who ran aground because that’s what their GPS told them to do.

In many ways, we’ve come to think of computers and computing systems as oracles. That’s an even greater temptation now that we have generative AI: ask a question and you’ll get an answer. Maybe it will be a good answer. Maybe it will be a hallucination. Who knows? Whether you get facts or hallucinations, the AI’s response will certainly be confident and authoritative. It’s very good at that.



It’s time that we stopped listening to oracles—human or otherwise—and started thinking for ourselves. I’m not an AI skeptic; generative AI is great at helping to generate ideas, summarizing, finding new information, and a lot more. I am concerned about what happens when humans delegate thinking to something else, whether or not that something is a machine. If you use generative AI to help you think, so much the better; but if you’re just repeating what the AI told you, you’re probably losing your ability to think independently. Like your muscles, your brain degrades when it isn’t used. We’ve heard that “People won’t lose their jobs to AI, but people who don’t use AI will lose their jobs to people who do.” Fair enough—but there’s a deeper point. People who just repeat what generative AI tells them, without understanding the answer, without thinking through the answer and making it their own, aren’t doing anything an AI can’t do. They are replaceable. They will lose their jobs to someone who can bring insights that go beyond what an AI can do.

It’s easy to succumb to “AI is smarter than me” or “this is AGI” thinking. Maybe it is, but I still think that AI is best at showing us what intelligence is not. Intelligence isn’t the ability to win Go games, even against champions. (In fact, humans have discovered strategies that let even amateur players defeat top Go-playing programs.) It’s not the ability to create new artworks—we always need new art, but we don’t need more Van Goghs, Mondrians, or even computer-generated Rutkowskis. (What AI means for Rutkowski’s business model is an interesting legal question, but Van Gogh certainly isn’t feeling any pressure.) It took Rutkowski to decide what it meant to create his artwork, just as it took Van Gogh and Mondrian to create theirs. AI’s ability to imitate their styles is technically interesting, but it really doesn’t say anything about creativity. AI’s ability to create new kinds of artwork under the direction of a human artist is worth exploring, but let’s be clear: that’s human initiative and creativity.

Humans are much better than AI at understanding very large contexts—contexts that dwarf a million tokens, contexts that include information that we have no way to describe digitally. Humans are better than AI at creating new directions, synthesizing new kinds of information, and building something new. More than anything else, Ezra Pound’s dictum “Make it New” is the theme of 20th- and 21st-century culture. It’s one thing to ask AI for startup ideas, but I don’t think AI would have ever created the Web or, for that matter, social media (which really began with USENET newsgroups). AI would have trouble creating anything new because AI can’t want anything—new or old. To borrow Henry Ford’s alleged words, it would be great at designing faster horses, if asked. Perhaps a bioengineer could ask an AI to decode horse DNA and come up with some improvements. But I don’t think an AI could ever design an automobile without having seen one first—or without having a human say “Put a steam engine on a tricycle.”

There’s another important piece to this problem. At DEF CON 2024, Moxie Marlinspike argued that the “magic” of software development has been lost because new developers are stuffed into “black box abstraction layers.” It’s hard to be innovative when all you know is React. Or Spring. Or another massive, overbuilt framework. Creativity comes from the bottom up, starting with the basics: the underlying machine and network. Nobody learns assembler anymore, and maybe that’s a good thing—but does it limit creativity? Not because there’s some extremely clever sequence of assembly language that will unlock a new set of capabilities, but because it’s hard to unlock new capabilities when you’re locked into a set of abstractions. Similarly, I’ve seen arguments that no one needs to learn algorithms. After all, who will ever need to implement sort()? The problem is that sort() is a great exercise in problem solving, particularly if you force yourself past simple bubble sort to quicksort, merge sort, and beyond. The point isn’t learning how to sort; it’s learning how to solve problems. Viewed from this angle, generative AI is just another abstraction layer, another layer that puts distance between the programmer, the machines they program, and the problems they solve. Abstractions are valuable, but what’s more valuable is the ability to solve problems that aren’t covered by the current set of abstractions.
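To make that exercise concrete, here’s a minimal quicksort sketch in Python. It’s my own illustration of the kind of from-scratch implementation the exercise calls for, not code from any talk or framework mentioned above:

```python
# A from-scratch quicksort, written as a problem-solving exercise rather
# than as a replacement for Python's built-in sorted().
def quicksort(items):
    # Base case: a list of zero or one elements is already sorted.
    if len(items) <= 1:
        return items
    # Pick the middle element as the pivot; this avoids quadratic behavior
    # on input that is already sorted.
    pivot = items[len(items) // 2]
    # Partition into elements smaller than, equal to, and larger than the pivot.
    smaller = [x for x in items if x < pivot]
    equal = [x for x in items if x == pivot]
    larger = [x for x in items if x > pivot]
    # Recursively sort the partitions and stitch them back together.
    return quicksort(smaller) + equal + quicksort(larger)

print(quicksort([5, 2, 9, 1, 5, 6]))  # [1, 2, 5, 5, 6, 9]
```

The value isn’t the function itself; it’s working through the partition-and-recurse reasoning, asking why the pivot choice matters, and perhaps rewriting it to sort in place. That’s exactly the kind of thinking that disappears when sort() is just something an abstraction (or an AI) hands you.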

Which brings me back to the title. AI is good—very good—at what it does. And it does a lot of things well. But we humans can’t forget that it’s our role to think. It’s our role to want, to synthesize, to come up with new ideas. It’s up to us to learn, to become fluent in the technologies we’re working with—and we can’t delegate that fluency to generative AI if we want to generate new ideas. Perhaps AI can help us make those new ideas into realities—but not if we take shortcuts.

We need to think better. If AI pushes us to do that, we’ll be in good shape.
