Discussion about this post

Sam:

Outstanding piece. It captures a critical tension that will only intensify: escalating demand for AI compute vs. limited land, water, and energy, plus local community impact. But stories like this also highlight an opportunity: we don't have to solve every data center problem. A complementary pathway is emerging through personal, offline AI computing. One example is Tiiny AI, which recently demonstrated a 120B-parameter LLM running entirely locally on a pocket-sized module (no cloud, no GPU), even on a 14-year-old PC, drawing only 65W. On-device capability at this level can already cover more than 85% of everyday AI inference needs, leaving data centers to handle the remaining 15%. https://www.facebook.com/reel/4384805538423159

Rich Miller:

A zoning meeting? Sounds like Chandler ...

Phoenix is a fascinating market. Great example of how submarkets evolve.

