It’s here! The 21st-century digital renaissance has just churned out its latest debutante, and its swanky, sensational entrance has sent the world into an awed hysteria. Now sashaying effortlessly into the discipline of architecture, glittering with the promise of being immaculate, revolutionary, and invincible: ChatGPT. OpenAI’s latest chatbot has been met with a frenzy that feels all too familiar, a déjà vu of sorts. The reason is this: every time any technological innovation so much as peeks over the horizon of architecture, it is immediately shoved under a blinding spotlight and touted as the “next big thing.” Even before it has been understood, absorbed, or ratified, the idea has already garnered a horde of those who vouch for it, and an even bigger horde of those who don’t. Today, as everyone buckles up to be swept into the deluge of a new breakthrough, we turn an introspective gaze, unpacking where technology has led us, and what more lies in store.
The tendency of architectural practice to glamorize moments of technological glory may have been founded in the early successes of CAD and BIM at a time when manual drafting was the norm. Curiously enough, not every such moment has translated into the disruption it was meant to be. We remember, only too well, that in our earlier days in the profession, 3D printing was the ultimate promised land, leaving people on tenterhooks with its promise of rapid production, cost efficiency, and waste minimization. Yet this lofty vision was never fully realized, and the technology still struggles to achieve large-scale adoption today.
Despite such underwhelming experiences, architects continue to be fascinated by, and even fearful of, the direction in which we are veering. Indeed, in the 21st century, for every starry-eyed architect fawning over all things science, there is an equally petrified counterpart cowering under the leviathan that is emerging technology. Yet before we throw in the towel on architecture, let us first ask this: What does it take for an innovation to succeed?
Invariably, like any other mass-produced commodity, the innovation must be accessible and adaptable, supported by a proportionate level of skill, hardware, and other resources. Architectural technology, particularly, must have the added aspect of utility, or the knowledge of where it fits in the machinery of the design process. If it becomes an integral, indispensable cog in the machine, it will inevitably be sustained, as AutoCAD, BIM, and generative design have been. But if it cannot prove itself absolutely critical to the process and remains optional or supplementary—such as 3D printing, VR, and AR—it is less likely to flourish.
Discerning the scope of each technological innovation rests with its user. Each new tool operates within a specific domain, delineated by its functions, to produce a specific result. For instance, form exploration with advanced modeling tools would not look the same as form exploration with more elementary tools. It is ultimately the architect’s decision which route to take toward a desired outcome. Likewise, amplifying or diminishing the role of a given step in the workflow may also define a technology’s scope. For instance, at Arup’s London office, VR/AR and related systems play a significant role in enriching the design process.
In one way, it does seem like we have cause for euphoria. We have come a long way from pen-and-paper architecture, and the picturesque, Matrix-like future of AI beckons at our threshold. Most AI tools (DALL-E, Stable Diffusion, Disco Diffusion, Midjourney, and even ChatGPT) operate by drawing on data from open-access cyberspace to generate responses to text prompts provided by the user; in other words, they give tangible shape to an idea in minutes. This reveals exciting new avenues in architecture, as designers are already discovering. Italian architect Arturo Tedeschi recently used ChatGPT to write a script for Grasshopper 3D, combining the strengths of both text-based AI and advanced modeling technology. While this was indeed remarkable, using AI to engineer a building from scratch is still a distant dream, albeit an entirely plausible one.

Meanwhile, we can still harness image-based AI’s abilities in conceptualization, treating it, as London-based architect Arthur Mamou-Mani puts it, as a “more involved moodboard.” In theory, asking the software the right questions could help explore an idea. Yet this is highly problematic because of the way AI works. With its hyper-dependence on recycling data in cyberspace, AI is vulnerable to blindly imitating architectural styles, putting the design language of the future at risk of further objectification. The result is a consumerist copy-paste architecture produced to pander to one’s instant-gratification sensibilities.

An even more poignant concern is that in the absence of any regulatory frameworks, architectural firms may be susceptible to data breaches. However, if regulation is indeed imposed on this ecosystem, it will likely limit the resource bank of AI software, rendering it useless.
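To make the prompt-to-script workflow concrete, here is a hypothetical sketch of the kind of parametric logic an architect might ask ChatGPT to generate for a Grasshopper scripting component. All names and parameters are illustrative assumptions, not Tedeschi’s actual script; it is written in plain Python (a real Grasshopper component would use Rhino-specific libraries such as rhinoscriptsyntax, omitted here) and computes a sine-modulated point grid of the sort that might drive a canopy surface.

```python
import math

def wave_grid(nx, ny, spacing=1.0, amplitude=2.0, wavelength=8.0):
    """Generate (x, y, z) points for a sine-modulated grid.

    A plain-Python stand-in for the per-point logic a Grasshopper
    scripting component would evaluate before the points are fed
    into a surface-fitting component.
    """
    points = []
    for i in range(nx):
        for j in range(ny):
            x = i * spacing
            y = j * spacing
            # Height oscillates with distance from the origin,
            # giving the flat grid a rippled, wave-like relief.
            z = amplitude * math.sin(2 * math.pi * math.hypot(x, y) / wavelength)
            points.append((x, y, z))
    return points

pts = wave_grid(10, 10)
print(len(pts))  # a 10 x 10 grid yields 100 points
```

The point of the exercise is less the geometry itself than the division of labor: the text-based AI drafts the generative logic from a plain-language brief, while the modeling environment turns that logic into form.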
A more resourceful approach in practice today is to employ image-based AI in the post-production phase, to augment the rendering process, or for supplementary design needs such as marketing and online content creation.
The domain of text- and image-based AI is potentially more pervasive than that of other technologies, as it can be employed at almost any stage of the design process. Yet it is necessary to acknowledge that the bulk of the design process is allocated to tasks (such as coordinating between services) decidedly less glamorous than the sparkly clickbait visuals that AI produces. In a way, architects also work from prompts that the client provides directly, through a brief, and indirectly, through their personal subjectivities. This layer is then superimposed with the architect’s own sensibilities, individual style, and interpretations. Embedding this knowledge within artificial systems of intelligence may yet take some time, but meanwhile, let us bask in the knowledge unfolding before us, taking it with a grain of salt and not overestimating its capacity.
Feature image courtesy of the authors.