TG-AI-F: I chased AI news this week from the San Francisco Bay to the Arizona desert
Meanwhile, a torrent of news from OpenAI and the rest of the AI landscape stalked me right back
Happy Friday! I criss-crossed the country this week, heading to Fortune’s Brainstorm AI conference in San Francisco and then on a reporting trip to Phoenix. I’m back on the East Coast with a TG-AI-F report!
First of all, note to self: I really need to get out to San Francisco more in 2026. This trip was far too brief, just two days at Fortune’s Brainstorm AI conference at the St. Regis. May I say that this hotel has the most comfortable bed I have ever slept on in my life?
Of course, it’s always a thrill to see my Fortune colleagues in person, particularly the folks on our small but mighty tech team: AI editor Jeremy Kahn, AI reporter Bea Nolan, Term Sheet writer Allie Garfinkle, tech editor Alexis Oreskovic, and our now-independent retail tech writer Jason Del Rey.
My biggest takeaways:
The conversation around enterprise AI adoption – and what it really takes – deepened at the event this year. In my mainstage session with PayPal global head of AI Prakhar Mehrotra and Marc Hamilton, VP of solutions architecture and engineering at Nvidia, both discussed how increasingly capable open-source AI models let enterprise companies keep control of their data and fine-tune for specific use cases. But both agreed that the future of enterprise AI will be hybrid, with enterprises typically using both open models and proprietary model APIs.
There was plenty of time for philosophizing, as well: at one dinner, I chatted with delegates from The Clorox Company, Workday and other companies about everything from what jobs were future-proof (I suggested dog walkers were safe from AI) to what AI would really mean for the future of today’s children (the bottom line: they still need to learn to think for themselves!).
My favorite panel was one I moderated with a half-dozen leaders and stakeholders in the world of AI data centers, including Andy Hock from Cerebras, Matt Field from Crusoe, and former OpenAI infrastructure policy leader Lane Dilg. We dug into how the line between power infrastructure and data centers is blurring, with billions in capital and gigawatts of power at play. My biggest takeaway was that the AI data center issue is local, local, local. Every community and local government will be dealing with its own specific compromises around land, energy, and water, and what works for one area might not work for another.
Waymo really needs to make sure its vehicles don’t reek of weed, as ours did on the way back from dinner on Monday night! Still, I love you Waymo.
On Tuesday evening, it was time to fly to Phoenix, where I did some reporting for a fascinating profile story that is running next week (stay tuned, it’s spicy!). Desert and data centers were on the agenda, and let’s just say I never imagined that I would fly somewhere just to go to a zoning meeting.
My biggest takeaways:
I haven’t been to Phoenix in 20 years. It’s unrecognizable. It’s booming in such a big way that in the middle of what was recently completely barren desert in the West Valley, there is now a Trader Joe’s.
Yay for Phoenix’s light rail!
I vote yes to flying to the Southwest in mid-December. I am definitely going to regret my next trip, which might be to Wisconsin in January.
Meanwhile, OpenAI and other big AI news followed me all week long. The deal with Disney. The new model, 5.2. McDonald’s pulled that creepy AI Christmas ad. Trump’s AI executive order. And on and on.
How was your week in AI? Thoughts on all of the above? Do tell!