The tech conference circuit ran on a familiar beat a few years ago. The same speakers traveled from stage to stage, one week in San Francisco and the next in London, delivering polished forecasts about the future of work, artificial intelligence, and the next digital revolution. They had clean slides. They had names for their frameworks. Their assurance was often comforting.
Browse LinkedIn on a Tuesday morning and you'll find a flood of assured threads about AI strategy, workforce transformation, and the future of innovation. It's striking how similar they sound. The same tone. The same structure. The same tidy four-box diagrams explaining complex change. It's difficult to ignore the possibility that many were written, at least in part, by a chatbot.
| Category | Details |
|---|---|
| Topic | Impact of Generative AI on Thought Leadership |
| Key Concept | “Faux-Expert” crisis created by AI-generated expertise |
| Key Term | Thought Doership (builders vs commentators) |
| Influential Voice | John Winsor |
| Institution Mentioned | Harvard Business School Digital Data Design Institute |
| Core Issue | AI allows anyone to generate expert-sounding insights |
| Cultural Impact | Declining trust in tech gurus and online experts |
| Technology Involved | Generative AI / Large Language Models |
| Industry Affected | Consulting, tech commentary, corporate leadership |
| Reference | https://hbr.org |
There is a growing perception that generative AI has subtly undermined the tech guru’s entire business model.
For years, the economics of “thought leadership” depended on scarcity. A consultant or writer built credibility through experience, research, and storytelling. Their insights lived in white papers, books, and keynote addresses. Because that insight seemed hard to come by, businesses paid well for it. That scarcity has evaporated.
These days, anyone with a good prompt can create a compelling framework about AI transformation in seconds. Ask a language model to summarize three strategy books and propose a leadership model, and it will produce something polished enough for a conference stage. It is hard to overstate how quickly the barrier around packaged expertise has crumbled. Some observers have dubbed this the “faux-expert crisis.”
Generative AI makes it remarkably easy to sound authoritative. A few prompts. A carousel post. Perhaps a podcast interview. Suddenly someone becomes an “AI transformation advisor.” The journey from casual observer to public expert has shrunk from years to months, sometimes weeks.
And the audience, overwhelmed by the sheer volume of insight, frequently cannot tell genuine expertise from the generated kind.
In private discussions, executives quietly acknowledge this. They have hired speakers, purchased consulting frameworks, and run leadership workshops. Yet many organizations remain stuck. The insight sounds convincing in the moment; turning it into actual operational change is far harder.
Boardrooms are beginning to suspect that a lack of ideas isn’t the issue. It’s a shortage of people who have actually built anything.
Speaking with operators—engineers experimenting with AI tools, product managers grappling with disorganized data pipelines, and startup founders realizing that real systems seldom behave like conference slides—makes the difference clear. Their tales are not as neat.
A pilot project that fails in the third week. A chatbot rollout that confuses customers. A promising AI integration that works brilliantly in a demo but breaks under real-world traffic. Those details rarely appear in thought-leadership posts, yet they are the reality of technological change. Some writers have coined a term for the shift: “thought doership.”
The difference seems subtle but significant. A thought leader describes potential futures. A thought-doer tries to build them, sometimes failing, sometimes succeeding, and always learning in public. Watching the tech sector right now, it seems the builders are gaining credibility.
Consider how businesses that are actually implementing AI operate. They don't rely on grand strategic manifestos. Instead, small teams run experiments. They test automation tools. They embed models into workflows. They break things, fix them, and repeat.
Ironically, generative AI itself might accelerate this cultural change. When anyone can produce compelling commentary, commentary itself becomes less valuable. Experience, the kind of scar tissue that comes only from building systems that fail in unexpected ways, remains valuable.
This moment might be similar to previous technological cycles. Digital strategists filled conference halls with predictions about online communities and e-commerce during the early internet boom. Many of those predictions sounded impressive. The infrastructure that made the internet useful was actually created by a small number of businesses.
It’s possible that the same dynamic is happening again. AI hasn’t, of course, replaced expertise. It has, if anything, raised the standard. First-hand knowledge, interdisciplinary thinking, and a willingness to acknowledge uncertainty are now more difficult to fake. Perhaps the most important part is the last one.
The most prominent voices in the AI debate frequently sound incredibly optimistic about what lies ahead over the next ten years. However, those who are closest to the technology tend to sound more wary. They are aware of the unpredictable nature of big systems. They are aware of how easily presumptions can be disproven.
There is a subtle irony lurking beneath the surface of the current wave of AI commentary. It’s possible that the same tools that made it simpler to come up with expert-sounding ideas are also revealing how flimsy some of those concepts were all along. And the entire culture of expertise might change as a result of that insight.
The days of the tech guru are far from over. But the rules have changed. Presentation alone is no longer sufficient for authority. In a world where a chatbot can generate sophisticated insights on demand, the true signal comes from something messier.
Work shipped. Systems built. Experiments run. Put another way, the people who wrestle with technology every day may have a greater say in the future than those who merely describe it.