Jordi Villar

2026W13

A few interesting articles I read over the past few days

This is the output of an automated process. Every Sunday, a script retrieves articles I've saved and read, uses AI to expand my quick notes into something more coherent, then publishes them. This post is one of those articles.

  • The Singularity Will Not Be Streamed — What makes this story land is the contrast. The Singularity doesn’t get stopped by a hero or a kill switch — it gets killed by a toothpick, a neighborhood meeting about an electric car charger, and a phone thrown at the wrong moment. Meanwhile, Pau is worrying about his kid’s cavities and whether his startup pitch makes sense. The most consequential event in human history happens on a Mac mini in a Spanish apartment, and nobody notices. Funny, but also a real point about how the things that matter most might happen where nobody is looking.
  • Hold on to Your Hardware — The numbers here are what got me. OpenAI’s Stargate project alone needs roughly 40% of global DRAM output. Western Digital’s entire 2026 HDD production is sold out to enterprise — consumers are now 5% of their revenue. The Raspberry Pi 5 jumped 70% in price in three months. This isn’t a temporary supply crunch, it’s manufacturers deciding that individual users aren’t their customer anymore. I wrote recently about how nobody talks about the hardware side of AI, and this is exactly the piece I wish I’d had. The speculative endpoint — sealed thin clients, everything rented, nothing repairable — feels dystopian but not unreasonable given where the incentives point.
  • Every query gets a receipt — The elegance here is zero extra round-trips. The performance data rides in the OK packet that MySQL already sends after every query. The trick of hijacking session state tracking — designed for mundane things like charset changes — to carry per-query CPU time and lock metrics is clever without being hacky. What really sells it is the context argument: the server knows the cost, the client knows the why (which tenant, which endpoint), and only by keeping them together can you do meaningful attribution. For anyone building multi-tenant systems, this is the kind of primitive you wish you had from day one.
  • the broken economics of databases — The core insight that clicked for me: reducing operational complexity cuts prices 3x more than reducing hardware costs. So vendors have no incentive to make their databases easier to run. Nobody’s deliberately sabotaging their product — complexity just accumulates naturally and happens to be the best moat they have. The MongoDB Atlas pricing breakdown is telling: $1,460/month for something running on $360 of AWS infrastructure, meaning they’re valuing operational difficulty at $1,100/month. The hopeful note about object storage enabling simpler economics is interesting, but I’m not sure it changes the incentive structure for incumbents who are already locked into the complexity game.
  • Every minute you aren’t running 69 agents, you are falling behind — The title is bait (intentionally), but the argument underneath is solid. Most AI panic on social media is people performing urgency, not responding to reality. The framing that AI is “just search and optimization” scaling up is reductive but useful as a corrective to the breathless hype cycle. The part that stuck with me is the distinction between AI displacing rent-seekers versus replacing value creators. If your work is mostly intermediation, you should be worried. If it’s building something real, the tools just got better.
  • Facebook is absolutely cooked — The number that tells the whole story: out of eleven feed posts, only one came from a page the author actually followed. The rest was algorithmically pushed AI-generated content — thirst traps, sentimental videos, engagement bait with alien text and mangled logos. This is what happens when the optimization target is engagement and you remove the constraint of showing people what they asked to see. The postscript is revealing too — dormant accounts get this treatment because there’s no friend activity to surface, so the algorithm defaults to whatever maximizes clicks. Facebook didn’t die. It was replaced by its own recommendation engine.
  • [News Unpack] Why Is Nvidia Making CPUs? — The interesting question isn’t whether Nvidia can build a competitive CPU — the ARM-based Grace chip already proves they can. It’s why they want to. The answer is platform lock-in: if your GPU, CPU, and interconnect are all Nvidia, switching any single piece means switching everything. Meta deploying these at data center scale validates the strategy. It’s the same playbook Apple used with its own silicon — own the full stack, optimize the seams between components, make the integrated experience impossible to replicate piecemeal.
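The attribution idea in “Every query gets a receipt” is easy to sketch. Here is a minimal Python simulation of the pattern, not real MySQL client code: all class and field names are made up for illustration, and a real implementation would parse the session-state-change payload of MySQL’s actual OK packet instead of the fake `execute` below.

```python
from collections import defaultdict
from dataclasses import dataclass, field

@dataclass
class OkPacket:
    """Mimics an OK-style response whose session-tracking area
    piggybacks per-query cost metrics (hypothetical field names)."""
    affected_rows: int
    session_state: dict = field(default_factory=dict)

def execute(sql: str) -> OkPacket:
    # Fake server for the sketch: pretends every query costs a fixed amount.
    return OkPacket(affected_rows=1, session_state={"cpu_us": 420, "lock_us": 15})

def run_attributed(sql: str, tenant: str, endpoint: str, ledger: dict) -> OkPacket:
    """The server knows what the query cost; only the client knows why it ran
    (which tenant, which endpoint). Joining the two at the call site is what
    makes per-tenant attribution possible with zero extra round-trips."""
    ok = execute(sql)
    ledger[(tenant, endpoint)]["cpu_us"] += ok.session_state.get("cpu_us", 0)
    ledger[(tenant, endpoint)]["lock_us"] += ok.session_state.get("lock_us", 0)
    return ok

ledger = defaultdict(lambda: {"cpu_us": 0, "lock_us": 0})
run_attributed("SELECT 1", tenant="acme", endpoint="/invoices", ledger=ledger)
run_attributed("SELECT 2", tenant="acme", endpoint="/invoices", ledger=ledger)
print(ledger[("acme", "/invoices")])  # {'cpu_us': 840, 'lock_us': 30}
```

The point of the sketch is the shape, not the numbers: because the cost rides in the response the client already receives, attribution needs no separate telemetry query.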

Newsletter

Subscribe to stay posted about future articles.