Hello Reader,
Today, I want to talk about the simplicity and complexity of data centres, and the importance of going back to first principles in the age of AI.
The more you learn, the less you realise you know
When I first started covering data centres a decade ago, things were much simpler. I wrote two newsletters a month consisting of three stories and a feature where I typically interviewed experts in the field. I even got to visit a new data centre every year or so. I did it for over five years as one gig among several.
Then generative AI happened and data centre builds soared around the world. Suddenly, everybody wanted to know more about data centres. With that small advantage of a few years' experience behind me, I started writing regularly on LinkedIn. And this is probably how most of you found me.
But here's the thing. The more I've learned about data centres, the more I realise I don't know. Take water. I've previously written about why data centres need water and my visit to the first data centre-operated water reclamation plant in Southeast Asia.
Even then, I would never have imagined how complicated water management is. Data centres do extensive pre-treatment and chemical calibration on the water they use. Without it, things can go wrong incredibly quickly. Check out why poorly managed water can literally destroy data centres from the inside.
When it comes to data centres, the issue often isn't a lack of solutions, but getting the word out. This is exactly why I'm serious when I say that most of what I know comes from the experts I speak to. They're the ones deep in the trenches, and I would love to make sure their knowledge reaches a wider audience.
There's no shortage of topics I want to dig into, though there are only so many hours in the day. If you're working on something interesting in this space and think it deserves more attention, I'm always keen to hear from you.
Back to first principles
That same instinct to question what I think I know showed up somewhere unexpected this week.
I started on my fourth app over the weekend, with the idea of creating an AI writing assistant that taps on Claude Opus 4.6. Everything came together quickly enough, but the user interface just felt off. After an hour of trial and error, it turned out I was heavily influenced by the claude.ai chat interface and the custom commercial font it uses. It made me realise just how much impact something as seemingly simple as a typeface can have.
This made me reflect on how I've been learning various fundamentals over the last two weeks: building highly scalable web services, smartphone-responsive design, typography, and good UI patterns. On one hand, I'm digging deep into niche areas I never thought twice about. On the other, the desire to create a great app experience meant I practically had to study the first principles of good design.
I realised I had also practised this earlier in the week when I wrote a couple of editorial pieces. I had interviewed experts, and the conversations were recorded and transcribed. The old workflow was to work through the transcription for hours, tease out key talking points, and weave them into a story.
This time, I worked from a completely different angle, leaving aside my usual process. I asked myself: what makes a great story that I would enjoy reading? With this approach, two separate chat sessions on Claude, and a firm hand in framing the narrative, I put together two pieces I was pleased about.
I'll be honest. I felt spooked when I read them again. They had all the hallmarks of what I consider a great story: pacing, insights, and tight coupling of quotes from different experts. Except that I produced them in a third of the usual time. First-principles thinking isn't new. But AI is forcing us to apply it to skills we thought we'd already mastered.
How are you going back to first principles?
As usual, you can reply to this email to reach me.
Regards,
Paul Mah