Reporting notes from the AI trenches: Feb 12-Feb 16
I am buried under an avalanche of AI news. Please send help.
It’s Monday morning, February 12. Over coffee and oatmeal, I’m mulling over my AI Beat column for VentureBeat this week.
The Super Bowl was last night and mainstream media is all about Travis and Taylor and Chiefs and 49ers and Usher and Millennial nostalgia.
Tech Twitter (ahem, X), however, is still stuck on Sam Altman and $7 trillion and AI chips and Altman’s rash of cryptic X posts over the weekend. Seems like he is either incredibly anxious or seriously mellow and, either way, had a hard-partying Saturday/Sunday.
I’m fascinated by the geopolitics of all the AI chips news, which feels like a crazy trip around the world — from Taiwan and China to the US and the UAE. That’s what I decide to focus on in my column, Sam Altman and the $7T geopolitics of AI chips.
After that, the AI news world seems slow — maybe a Super Bowl hangover? I relish a few hours to noodle over things and muddle through my email.
As I climb into bed that night, I suddenly realize my screw-up: I forgot to write up my story about Cohere for AI’s fascinating new open source LLM Aya, which covers 101 languages. I always love speaking with Sara Hooker, who helms Cohere for AI and was formerly at Google Brain. Sigh. Yawn. I want to have the story up tomorrow morning, so I quickly organize my notes and forlornly set my alarm for 5:30 am to make sure I can get the piece done by 9.
***
It’s Tuesday morning, February 13. Coffee in my massive yellow mug from Buc-ee’s. I wish I had time to do a better job on my Cohere for AI piece, but at least it’s up: Cohere for AI launches open source LLM for 101 languages. I’d love to do a larger story sometime about why massively multilingual LLMs are important, but that will have to wait.
I run out for my weekly tennis lesson, and as I work on my groundstrokes, volleys, and overheads, I can’t help but think of that interview I did last week with the Mozilla Foundation researcher behind this paper about the history of the Common Crawl dataset and its impact on generative AI. I’m not really sure how to fit it into my coverage — it’s not much of a standalone piece — but it feeds my geeky curiosity about AI training data, like that article I wrote about The Pile a few weeks back. Hmm.
I’m back from tennis and spend the rest of the day doing research and setting up calls for tomorrow — there’s someone from LinkedIn; an AI media startup; the attorney who can discuss AI and IP; the woman who asked me to speak at a VC event.
I struggle to put the rest of the day’s AI news in context for myself. OpenAI chair Bret Taylor says his startup Sierra doesn’t compete with OpenAI because it isn’t building AGI; it’s “bringing a product to enterprises”? Um, hello: ChatGPT Enterprise? OpenAI has a whole GTM enterprise sales team.
Meanwhile, WTF? Andrej Karpathy is leaving OpenAI — again?
***
It’s Wednesday morning, February 14. Happy Valentine’s Day, Sydney. I can’t believe it’s a year since the New York Times’ Kevin Roose told the world that Microsoft’s chatbot tried to break up his marriage.
But I think his follow-up today is really weird. He sounds so pleased that he made such a big splash skewering Sydney, but now he thinks today’s chatbots are too boring? Yeesh — I’m waiting for a chatbot to tell Roose that it’s breaking up with him because he’s never satisfied and no chatbot will ever be good enough for him.
I spend the day on calls, one after the other. I know I shouldn’t do this, but I usually have at least one day a week where I’m just going from one Zoom to the next. It’s so draining, but sometimes it’s best to just keep my energy up and get as many done as I can.
The AI news, meanwhile, seems to be simmering but not boiling. I can relax a bit. I didn’t publish anything today. Maybe OpenAI has really ‘jumped the shark.’ Maybe there will be several days in a row where all of us can really think about what’s going on and plan rather than reacting to a tsunami of news.
Time to finish Spelling Bee.
***
It’s Thursday morning, February 15. 😳😳😳😳 Google released Gemini 1.5, a next-gen model with a million-token context window? I read my colleague Michael Nunez’s piece on the announcement, which he published at 7 am ET, meaning he was up till all hours SF time? Yikes.
Meanwhile, the GPU news of the morning is overwhelming — Lambda, a GPU cloud company powered by Nvidia GPUs, raised a fresh $320 million at a $1.5 billion valuation. I put that together with the previous day’s reporting by The Information about Together AI’s new unicorn status. And it’s only a few months since CoreWeave got that crazy $7 billion valuation. Oh, and now Forbes is reporting that investors Nat Friedman and Daniel Gross built a 4,000-chip supercomputer for their startups to use? And Friedman posted on X about a new Craigslist for GPU clusters?
I immediately decide to put all of this news into context with a piece called Feeding the hunger for Nvidia GPU access is big business. I think of an image: Pac-Man eating GPUs instead of dots. I can put that together quickly in Canva. Let’s go.
VentureBeat’s editorial Slack channel cruises along for a few hours — hey, the AI news pace is really picking up, huh? We share news about LangChain’s new fundraise and the announcement of its first paid product, LangSmith. Then there’s Meta’s announcement of V-JEPA, a method for teaching machines to understand and model the physical world by watching videos.
I am starting to feel buried.
And then: OpenAI announces Sora, its new text-to-video model.
By the end of the day, I have completely had it.
It is an AI news avalanche. I am lying beneath the heavy snow with just one shoe sticking out. Send help!
6:30 pm ET: A nice, full-bodied glass of white wine. Early to bed. I don’t finish Spelling Bee.
***
It’s Friday morning, February 16. Coffee. Oatmeal. A fried egg, over medium. Low energy. Should shower.
Thoughts: Consequences of OpenAI’s Sora? Environmental impact, workplace displacement? What does it mean for other startups? How about the lack of transparency about the data used to train Sora? I interviewed Aditya Ramesh, the DALL-E co-inventor who is now behind Sora, last year — could I get an on-the-record chat with him again? Hmm.
Gotta start working on those stories for next week — don’t wait until the last minute like you did with Cohere for AI. Should I think about my AI Beat column for Monday? Eh…seems like there’s always something that happens over the weekend to make me change my mind.
Still, I’m thinking…thinking…thinking…
I feel behind. I’m running to catch up. Do other reporters feel this way? Maybe I’ll ask Pi, the friendly AI chatbot that’s always encouraging.
Thanks, Pi. Have a good weekend. Cheers to a full week in the AI trenches.