Read Before Monday #50
Welcome to March! This week I cover everything from hand-drawn blueprints to AI-designed chips; the way we create, compute, and communicate is shifting faster than we can comprehend. Let's start with CAD, which went from an elite engineering tool to a 3D printer staple. Then a math duo spent two decades cracking a famous group theory problem while governments still can't crack the problem of digital sovereignty, relying on U.S. cloud infrastructure like it's a rental they'll never own. Meanwhile, AI is designing wireless chips we don't even fully understand, and artists are exposing just how much data we unknowingly leave behind. The world isn't just evolving; it's rewriting the rules in real time. The real question: are we keeping up, or just watching the machines take over?
___
This article traces the evolution of Computer-Aided Design (CAD) from its inception in the 1950s to its current state. It highlights key milestones, including early developments at MIT, the creation of commercial systems by companies like Applicon and Computervision in the late 1960s, and the rise of software like AutoCAD in the 1980s.
___
After two decades of dedication, mathematicians Britta Späth and Marc Cabanes have proven the McKay conjecture, a pivotal problem in group theory that relates the representation theory of a finite group to that of one of its small "local" subgroups.
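For the curious, the conjecture (as it's usually stated) says that you can count a group's irreducible characters of degree coprime to a prime $p$ by looking only at the normaliser of a Sylow $p$-subgroup:

```latex
% McKay conjecture: for a finite group G, a prime p,
% and a Sylow p-subgroup P of G,
%   |Irr_{p'}(G)| = |Irr_{p'}(N_G(P))|,
% where Irr_{p'}(H) is the set of irreducible complex
% characters of H whose degree is not divisible by p.
\[
  \bigl|\mathrm{Irr}_{p'}(G)\bigr| \;=\; \bigl|\mathrm{Irr}_{p'}\!\left(N_G(P)\right)\bigr|
\]
```

The surprise is that a tiny piece of the group ($N_G(P)$ is usually far smaller than $G$) already "knows" this global count.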
___
Bert Hubert's article argues against European governments and societies relying on U.S.-based cloud services, citing risks of legal overreach, data privacy violations, and potential political instability in the U.S. He emphasises the urgency for Europe to develop its own digital infrastructure to maintain sovereignty and security.
___
Researchers at Princeton Engineering and the Indian Institute of Technology have utilised deep-learning models to design millimetre-wave (mm-wave) wireless chips. These AI-generated designs, created within hours, exhibit unconventional structures that outperform traditional human-crafted counterparts. The AI approaches chip design holistically, treating the chip as a single entity rather than a collection of individual components.
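To make the "holistic" idea concrete, here's a minimal sketch of inverse design: instead of assembling known components, you treat the whole layout as one optimisable object and let an algorithm flip pixels until a simulated response improves. Everything here is an assumption for illustration — `toy_response` is a made-up stand-in, not a real electromagnetic simulator, and the actual research uses trained deep-learning models rather than the random hill-climbing shown below.

```python
import numpy as np

rng = np.random.default_rng(0)

def toy_response(mask):
    # Hypothetical stand-in for an EM simulator: scores how close the
    # structure's (fake) frequency response is to a target curve.
    freqs = np.linspace(0.0, 1.0, 64)
    response = mask.mean() * np.cos(2 * np.pi * freqs * mask.sum() / mask.size)
    target = np.cos(np.pi * freqs)
    return -np.mean((response - target) ** 2)  # higher is better

def inverse_design(shape=(16, 16), iters=500):
    # Treat the entire layout as a single binary pixel grid:
    # flip one pixel at a time, keep changes that improve the score.
    mask = rng.integers(0, 2, size=shape).astype(float)
    best = toy_response(mask)
    for _ in range(iters):
        i, j = rng.integers(0, shape[0]), rng.integers(0, shape[1])
        mask[i, j] = 1.0 - mask[i, j]
        score = toy_response(mask)
        if score >= best:
            best = score
        else:
            mask[i, j] = 1.0 - mask[i, j]  # revert the flip
    return mask, best

mask, score = inverse_design()
```

The resulting masks tend to look like noise to a human, which is exactly the point: the optimiser never sees "an antenna" or "a filter", only one big shape and one score.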
My take: "Ah, you think Moore's Law is your ally? You merely adopted the transistor count. AI was born in it, molded by it." - Okay, I butchered the Bane quote again, but stick with me. We're at the point where AI is not just optimising designs, it's creating things that human engineers don't even understand. This is next-level weird. This is both exciting and terrifying. On one hand, AI is showing us how inefficient our own designs have been. On the other, we have no idea why its solutions work so well. Then there's the practical side. What if these chips are impossible to manufacture at scale? What if a bug in the AI's design logic creates a flaw no human can diagnose? And let's not even start on security risks: how do you secure something you don't fully understand? Speaking of machines, here they are talking to each other in Gibberlink.
___
The New Yorker article "The Artist Exposing the Data We Leave Online" explores the work of artists who highlight the vast amounts of personal information individuals unknowingly share in the digital realm. These artists use various mediums to visualise and critique the pervasive data collection practices of modern society.
___
This Week in GenAI
Here's this week's live, with Marco Silva, who joined me to talk about Alexa, GenAI turning Nazi after malicious code, and the open-source news marking the end of DeepSeek week. Plus there's some fun stuff around Pokémon.
In other news:
PS: If you got this far in the edition, then congrats, and join me in celebrating one year of #ReadBeforeMonday.
Here's the first edition: https://www.dhirubhai.net/pulse/read-before-monday-vitor-domingos-mgoaf/