My Personal Tech Radar: Lessons from 2025
Every December, I review my technology tracking from the year. What did I correctly identify as important? What did I miss? What surprised me?
This exercise keeps me honest about my predictive accuracy (worse than I’d like) and helps me improve my approach. Here’s my 2025 review.
What I Got Right
AI agents were real but constrained. At the start of the year, I was skeptical of autonomous agent hype but bullish on constrained agents for specific workflows. That proved accurate. The successful deployments I’ve seen this year are all narrow and supervised. The autonomous agent dreams remain dreams.
Regulation accelerated. I expected AI regulation to move faster than the tech optimists assumed. The EU AI Act implementation, US executive orders, and the general direction of policy globally all confirmed this. Organizations that built compliance capabilities early are glad they did.
Foundation model commoditization. I wrote early in the year that basic LLM capabilities would commoditize rapidly. That’s happened - the gap between OpenAI and competitors narrowed, open source models improved dramatically, and “AI-powered” stopped being a differentiator.
Climate tech maturation. I expected climate tech to continue maturing from hype toward practical deployment. That’s played out - less speculative enthusiasm, more actual projects getting built.
What I Got Wrong
I underestimated robotics progress. I was too pessimistic about physical AI / robotics. The demonstrations from Boston Dynamics, Figure, and others this year were more impressive than I expected. I still think timelines to commercial deployment are long, but the technical progress was faster than I anticipated.
I overestimated Web3 recovery. I thought Web3 applications would show more real traction by now. Some specific use cases (stablecoins, tokenization of real-world assets) have progressed, but the broader ecosystem recovery has been slower than I expected.
I missed the speed of multimodal adoption. I knew multimodal AI was improving, but I didn’t anticipate how quickly it would move into production enterprise applications. The document understanding and visual inspection use cases scaled faster than I expected.
What Surprised Me
Nuclear energy momentum. The tech sector’s embrace of nuclear power surprised me. I knew there was interest, but Microsoft’s Three Mile Island deal and the broader momentum around nuclear for data centers moved faster than I expected.
Private space ecosystem depth. The commercial space sector is more developed than I realized. Beyond the SpaceX headlines, there’s now a real ecosystem of companies building viable businesses on space infrastructure.
Enterprise AI caution. Despite the hype, large enterprises moved more cautiously on AI deployment than I expected. The contrast between breathless press coverage and the measured pace of actual adoption was striking.
Meta-Lessons About Prediction
Beyond specific technologies, some patterns emerged:
Hype and reality eventually converge, but timelines are hard. Most technologies I track do eventually deliver value. But predicting when is much harder than predicting what. I consistently underestimate how long deployment takes.
Incumbents matter more than I give them credit for. I have a bias toward assuming new entrants will disrupt. In practice, incumbents who get serious about emerging tech often catch up faster than I expect.
Regulatory and social factors are as important as technical factors. Pure technology prediction misses crucial dynamics. The AI story of 2025 was as much about governance, policy, and trust as about model capabilities.
Contrarian views are valuable but overrated. I try to think independently and challenge consensus. But sometimes the consensus is right. Being contrarian for its own sake isn’t smart.
Updating My Approach
Based on this year’s lessons, I’m adjusting how I track emerging tech:
More attention to deployment indicators. I spent too much time tracking technical capabilities and not enough tracking actual deployment. For 2026, I’m focusing more on who’s buying, who’s deploying, and what the results are (see the sketch at the end of this list).
Longer timeline assumptions. I’m extending my default deployment timeline estimates. Technologies that seem ready for production usually need another 2-3 years of maturation.
Better incumbent tracking. I’m adding more systematic tracking of what established companies are doing with emerging technologies, not just startups.
Integration of non-technical factors. More explicit consideration of regulatory, social, and market adoption factors alongside technical capabilities.
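To make that concrete, here’s a minimal sketch of the kind of record I’m describing. It’s illustrative only: Python, with field names (deployment_signals, incumbent_activity, non_technical_factors, next_review) that I’ve made up for this post rather than taken from any standard radar format.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class RadarEntry:
    """One entry on a personal tech radar (illustrative structure, not a standard)."""
    technology: str
    technical_readiness: str  # e.g. "research", "pilot", "production"
    deployment_signals: list[str] = field(default_factory=list)     # who's buying, who's deploying, results
    incumbent_activity: list[str] = field(default_factory=list)     # what established players are doing
    non_technical_factors: list[str] = field(default_factory=list)  # regulation, trust, market adoption
    next_review: date = date(2026, 6, 30)  # scheduled re-check instead of ad hoc updates

# Hypothetical entry, roughly mirroring this year's notes
radar_2026 = [
    RadarEntry(
        technology="Constrained AI agents",
        technical_readiness="pilot",
        deployment_signals=["narrow, supervised workflow deployments"],
        incumbent_activity=["enterprise software vendors bundling agent features"],
        non_technical_factors=["EU AI Act implementation", "enterprise trust and compliance"],
    ),
]
```

The point isn’t the code; it’s that deployment signals, incumbent activity, and non-technical factors get their own slots next to technical readiness instead of being afterthoughts.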
Implications for Others
If you’re tracking emerging technologies professionally:
Build in feedback loops. Systematically review your past predictions. This is uncomfortable but essential for improvement; one simple way to score yourself is sketched after this list.
Distinguish signal from noise. Most tech news is noise. The skill is identifying the few things that actually matter.
Maintain appropriate uncertainty. Strong opinions, weakly held. Be willing to update when evidence changes.
Focus on implications, not just trends. Understanding a technology trend is less valuable than understanding what it means for specific decisions you need to make.
Stay patient. The most valuable technology insights are usually about things that won’t fully play out for years. Patience in a world of hype cycles is a competitive advantage.
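On the feedback-loop point above: the lowest-effort version I know is to log each prediction with an explicit probability and score the log at year-end. Here is a minimal sketch, assuming Python and using the Brier score as the calibration measure; the example entries are hypothetical paraphrases of this year’s calls, not my actual log.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Prediction:
    claim: str                      # what I said would happen
    confidence: float               # probability I assigned, 0.0-1.0
    outcome: Optional[bool] = None  # filled in at the year-end review

def brier_score(log: list[Prediction]) -> float:
    """Mean squared gap between stated confidence and what happened.
    0.0 is perfect; always guessing 50/50 scores 0.25; lower is better."""
    resolved = [p for p in log if p.outcome is not None]
    return sum((p.confidence - float(p.outcome)) ** 2 for p in resolved) / len(resolved)

# Hypothetical entries
log = [
    Prediction("Constrained, supervised AI agents see real deployments", 0.8, outcome=True),
    Prediction("Broad Web3 application ecosystem recovers", 0.6, outcome=False),
    Prediction("Robotics progress stays slow", 0.7, outcome=False),
]
print(f"Brier score: {brier_score(log):.2f}")
```

Writing that number down each December is what turns the “worse than I’d like” admission at the top of this post into something measurable rather than a vibe.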
Looking Ahead
My radar for 2026 is taking shape. AI agent deployment, climate tech scaling, privacy technology, and enterprise AI maturation are all areas I’ll be watching closely.
But I hold these views loosely. The thing about emerging technology is that it emerges in unexpected ways. The best approach is structured attention combined with willingness to be surprised.
Another year of tracking begins.