Back to the Future: The History of AI and Computational Design
Ester Gerston and Gloria Gordon work on the ENIAC computer. (U.S. Army/ARL Technical Library Archives)


Digital technologies have become integral to our everyday lives; they influence almost everything we do. After several decades of computer-driven technical innovation, digital design theory and digitally intelligent design have become critical chapters in the history of contemporary design.

Back to the Future: 1940s

In the modern sense, the first computers were developed during World War II. The famous ENIAC started operation in 1946: it weighed twenty-six tons and occupied an area of 127 square meters at the University of Pennsylvania in Philadelphia. Computers got smaller and cheaper, though not necessarily more powerful, after the introduction of transistors during the 1950s, and mainframes affordable for medium-sized companies and professional offices became available in the late fifties. The mass-market breakthrough, however, came only with the IBM System/360, launched with great publicity in 1964. Even late in the decade, its most advanced models had the equivalent of around one five-hundredth of the RAM we find in most smartphones today.

Design, a profession that relies on complex problem-solving and visual communication, was largely excluded from the early stages of the digital revolution. Computers were too expensive and too limited for designers to use effectively in their daily work: once converted into digital data, images and drawings required storage and processing power that was simply not available at the time. The pioneers of computer-aided design (CAD) were mechanical engineers who developed software for controlling milling machines at MIT in 1959. One of the most influential innovations in this field was Sketchpad, created by Ivan Sutherland at MIT in 1963, a program that allowed users to draw and manipulate simple shapes directly on a screen with a light pen, demonstrating the potential of interactive graphics for design. Some design avant-gardes embraced the ideas of computation and cybernetics as sources of inspiration and experimentation, but they lacked both access to the machines and the skills to use them.

Artificial intelligence (AI) is a term that was coined in 1956 at a landmark summer seminar at Dartmouth College. One of the pioneers and leaders of this field was Marvin Minsky, who wrote an influential paper in 1960 called Steps Toward Artificial Intelligence. In this paper, he described AI as a "general problem-solving machine" and proposed methods for training AI that resemble what we now call machine learning (ML), but with much less computational power. However, Minsky later became a fierce critic of the branch of AI called "connectionist" or "neural," inspired by the human brain's structure and function. He especially disliked the Perceptron, an electronic device created by Frank Rosenblatt that could learn from its inputs and outputs (a code sketch of its learning rule appears at the end of this section), and Minsky spent much of the rest of his career trying to discredit this approach to AI.

One of the influential figures in the history of AI and architecture was Nicholas Negroponte, who was still a young student at MIT when he developed a computer program intended to replace architects. Unlike Minsky, who shifted from a connectionist to a symbolic approach to AI, Negroponte started directly with the symbolic approach, relying on a pre-defined set of rules derived from architectural knowledge. His program was designed to interact with end users and offer them multiple choices that would guide the design process from beginning to end. This type of AI is also known as "expert," "knowledge-based," or "rule-based," and it dominated the field of AI for a long time, until the recent resurgence of connectionist or neural methods based on machine learning (ML). However, Negroponte's program was unsuccessful, and he documented its failures in his famous book The Architecture Machine (1970).
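As an aside for readers curious about what "learning from inputs and outputs" meant in practice, here is a minimal, illustrative sketch of the perceptron learning rule in Python. It is a modern reconstruction for demonstration only, not Rosenblatt's original hardware; the task (the logical AND function), the data, and all parameters are invented for the example.

```python
# Minimal sketch of the perceptron learning rule (illustrative only).
# Weights are nudged whenever the prediction disagrees with the label,
# which is all that "learning from inputs and outputs" means here.

def predict(weights, bias, x):
    """Fire (1) if the weighted sum of the inputs crosses the threshold."""
    s = sum(w * xi for w, xi in zip(weights, x))
    return 1 if s + bias > 0 else 0

def train(samples, labels, lr=1, epochs=20):
    """Learn weights from example input/output pairs."""
    weights = [0] * len(samples[0])
    bias = 0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            error = y - predict(weights, bias, x)  # -1, 0, or +1
            weights = [w + lr * error * xi for w, xi in zip(weights, x)]
            bias += lr * error
    return weights, bias

# Learn the logical AND function from its four input/output examples.
samples = [(0, 0), (0, 1), (1, 0), (1, 1)]
labels = [0, 0, 0, 1]
w, b = train(samples, labels)
print([predict(w, b, x) for x in samples])  # -> [0, 0, 0, 1]
```

The rule is never told what AND means; it infers workable weights from the examples alone. Minsky's famous objection was that a single such unit cannot learn functions that are not linearly separable, such as XOR, a limitation that contributed to the long eclipse of neural methods.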
Negroponte's 1970 failure marked the end of an era for cybernetics and AI in architecture. In the 1960s, many cybernetics and artificial intelligence researchers had been overconfident about the potential of their fields, and they faced a reality check in the early 1970s. The scientific community became skeptical of their claims, and funding and support (especially from the military) dried up. This was the start of the "AI winter," a period of stagnation and decline in AI research. The political and economic crises of the 1970s also shifted the public attitude toward technology from optimism to pessimism. While designers ignored these tools, other industries, such as aviation and automotive, adopted computer-aided design and manufacturing. Architects would only realize what they had missed much later.

Back to the Future: 1980s

In the mid-20th century, many experts predicted that the future of computing would depend on large and powerful mainframes. They were wrong. The real revolution came from small, cheap, and accessible personal computers (PCs) that anyone could use. The first PCs were very limited in their capabilities, but they opened up new possibilities for computing and creativity. The IBM PC, with Microsoft's MS-DOS operating system, was launched in 1981, followed by the Macintosh, with its innovative graphical user interface, in 1984. These machines enabled the development of affordable and user-friendly design software, such as AutoCAD, released by Autodesk in 1982. By the end of the 1980s, PCs and workstations could handle complex and realistic graphics, transforming the design field.

In the early 1990s, architects and designers discovered that PCs could be powerful drawing tools despite their lack of intelligence or creativity. They did not attempt to use PCs to find design solutions, as the early artificial intelligence researchers had hoped, and failed, to do in the 1960s; that idea had been abandoned in the 1970s once its limitations and challenges became clear. Instead, they used computers to create drawings, without reference to cybernetics or computer science, applying their own expertise, discipline, and design theories to computer-aided drawing. This was the turning point when computers started to have a tangible impact on architecture: as drawing machines, not problem solvers.

Besides using CAD as a practical and efficient tool to draft, store, access, and edit blueprints in digital format, some designers also realized that computers could help them create new kinds of drawings: drawings that would be hard or impossible to produce by hand, as they involved complex and unconventional forms and structures. This idea was inspired by the Paperless Studio, an innovative and interdisciplinary group of designers, theorists, artists, and technologists that gathered at Columbia University's Graduate School of Architecture, Planning, and Preservation in the early 1990s under the leadership of Dean Bernard Tschumi. Many in this circle drew on the philosophy of Gilles Deleuze, and in particular on his notion of the Fold, as a theoretical reference for a new architecture of smooth, continuous forms.

Deleuze's influence was not the only factor that shaped the digital avant-garde. Another critical factor was the emergence of a new type of CAD software that made it easy to manipulate a particular kind of continuous curve, the "spline." Splines had been used for centuries to smooth the hulls of boats, and later by car and aircraft makers to optimize aerodynamics. In the early 1990s, new and affordable CAD software put splines in the hands of all designers, and they became a common feature of digital design. Splines are continuous and differentiable functions (the code sketch at the end of this section shows one such curve), and by a fortunate or unfortunate coincidence, they were seen as the perfect digital representation of the Deleuzian Fold. Gilles Deleuze, who died in 1995, was probably unaware of or indifferent to this connection; he had no particular interest in car design, streamlining, or aerodynamics.

In 1996, Greg Lynn coined the term "blob" to describe the new style of digital curviness. Later, Patrik Schumacher introduced the term "parametricism." From the late 1990s until today, digital streamlining has been the hallmark of the first digital turn in architecture: the image of a new architecture that would have been impossible, or at least difficult, to design and build without digital tools.
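For the technically curious, the following sketch shows one classic spline construction in code: a cubic Bézier segment evaluated with de Casteljau's algorithm, which is nothing more than repeated linear interpolation between control points. The control points here are arbitrary values chosen for illustration.

```python
# Illustrative sketch: evaluating a cubic Bezier spline segment with
# de Casteljau's algorithm -- repeated linear interpolation between
# control points.

def lerp(p, q, t):
    """Linear interpolation between 2D points p and q."""
    return (p[0] + t * (q[0] - p[0]), p[1] + t * (q[1] - p[1]))

def bezier_point(p0, p1, p2, p3, t):
    """Point at parameter t (0..1) on the cubic curve defined by p0..p3."""
    a, b, c = lerp(p0, p1, t), lerp(p1, p2, t), lerp(p2, p3, t)
    d, e = lerp(a, b, t), lerp(b, c, t)
    return lerp(d, e, t)

# Sample the curve at 21 evenly spaced parameter values.
controls = [(0, 0), (1, 2), (3, 2), (4, 0)]
curve = [bezier_point(*controls, t / 20) for t in range(21)]
print(curve[10])  # midpoint of the curve, t = 0.5 -> (2.0, 1.5)
```

Dragging a control point reshapes the whole curve smoothly and predictably, which is precisely the quality that made splines so attractive to designers once 1990s CAD software exposed them interactively.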
Beyond the popularity of digital streamlining in the 1990s and after, there is a more profound and more significant aspect of the digital mode of production, independent of form and style. The use of scripted code for digital design and fabrication allows the creation of "families" of different objects that share the same mathematical logic: changing the values of some parameters or coefficients in the code generates variations of the object (see the code sketch at the end of this section). Deleuze and Cache called this new generic technical object an "objectile" in 1988, defining it as a set or range of different appearances sharing a common code or structure. This is still the most relevant description of the new technical object of the digital age. Greg Lynn and others came up with similar definitions of continuous parametric variation in design and fabrication in the early 1990s.

This is the mathematical and technical foundation of digital mass customization through non-standard digital design and fabrication: each product can be different, yet produced in a digital workflow (CAD-CAM) without extra cost. This radical idea goes against the modern industrial principles of standardization and mass production, and at the same time it realizes the postmodern desire for diversity and variation, thanks to digital technologies. Digital mass customization can offer variation for the sake of variation at no cost, an idea Gilles Deleuze, who was anti-modern and who developed the concept of the objectile with Bernard Cache, might well have liked. The idea of digital mass customization was developed, theorized, explored, and experimented with by architects and designers before anyone else, and it is among the most critical and revolutionary ideas architects and designers ever created. Standardization, which lowers costs in the industrial mode of production, only limits choices in a digital one.
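As promised above, here is a minimal sketch of the parametric "family" idea. The generator is hypothetical, invented for this illustration: it produces the 2D profile of a turned column as (height, radius) pairs, and every distinct parameter set yields a different member of the same family, all sharing one mathematical logic.

```python
# A minimal sketch of the "objectile" idea: one script, many objects.
# Every distinct parameter set is a different member of the same family.

import math

def column_profile(height, base_radius, taper, waves, amplitude, steps=50):
    """Profile of a turned column as (height, radius) pairs."""
    profile = []
    for i in range(steps + 1):
        z = height * i / steps
        r = base_radius * (1 - taper * z / height)                # overall taper
        r += amplitude * math.sin(waves * math.pi * z / height)   # fluting
        profile.append((round(z, 3), round(r, 3)))
    return profile

# Three "non-standard" variations generated from the same code.
family = [
    column_profile(3.0, 0.4, 0.2, 6, 0.02),
    column_profile(3.0, 0.5, 0.1, 8, 0.03),
    column_profile(2.5, 0.4, 0.3, 4, 0.05),
]
print(len(family), "variants,", len(family[0]), "points each")
```

In a CAD-CAM workflow, each variant could in principle be sent straight to fabrication; since nothing is retooled between variants, variation adds no marginal cost, which is the economic point of non-standard production.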

I hope you found this newsletter valuable and informative; please subscribe now, share it on your social media platforms, and tag me as Iman Sheikhansari. I would love to hear your feedback and comments!




