Racing Architecture and Software (from 2019)
“Could race be not simply an object of representation and portrayal, of knowledge or truth, but also a technique that one uses, even as one is used by it—a carefully crafted, historically inflected system of tools, mediation, or enframing that builds history and identity?”
-Wendy Hui Kyong Chun, Race and/as Technology; or, How to Do Things to Race[1]
Architects talk a lot about technology. Not as much about race. I suppose it’s difficult to discuss race in a discipline historically and contemporaneously dominated by western, Caucasian thoughts and figures. Our canon of significant architecture appears heavily white and male, and so are, to a certain extent, the environments in which design occurs.[2] We tend to discuss postmodern imagery instead of those who live in postmodern buildings, or the digital drawing as a whole over the people we draw in them.[3] As a result, inequity in architecture is quite easy to pinpoint, but difficult to address. There are, however, numerous initiatives seeking to address this oversight. In 2017, for instance, Curbed published a slew of articles and interviews on the subject of race and architecture;[4] NCARB has made a commitment to diversity and frequently publishes statistics on underrepresented minorities in architecture;[5] and calls to decolonize architectural history and theory in schools, together with new policies actively demanding diversity in the workplace, are tackling the subject of uneven representation in both academic and professional environments.[6] But statistics and graphs shed little light on the complex relationship between race and architecture. Race is not simply a descriptor or a categorization; it is a complex function that is affected and mediated by culture, science, and technology. It is a malleable, sometimes visible, and other times invisible—yet always present—concept.
While it is clearly evident in the contexts of social inequity, cultural identity, whiteness, blackness, art, and politics, race can also be explored through technology. And as work and labor become increasingly virtual, race is becoming ever more present in the instruments we use in architecture. Sometimes it is clearly identifiable, as in the case of errors in image datasets for machine learning; other times it is more subtle, such as the white, gloved hand we take for granted on the screen. These issues have percolated to the surface in disciplines tangential to architecture, but have also been slowly trickling into our own discourse. It is therefore the right time to discuss our instruments—specifically software—and who they are made for, what they represent, and what they really allow us to do.
Four Points on Software and Race
Software, somewhat of an overlooked topic, has evolved into the ultimate mediator. Not only does it help produce most architectural media, it provides us with a specific worldview based on the logics of computation and virtuality. It is because of software that some emphasize process over product, that objects are described in terms of operations, and that dialogues on drawing’s digitization persevere. In short: software is a major lens through which most of us see and design the world.
But software is designed by humans for other humans. How then can we be sure that these instruments are bias-free and neutral? We can’t. What we can do is scrutinize this medium in the same way that other architects have done in the past. As Ellie Abrons has recently observed, “[i]f architects have long indulged in the scrutiny of our media in order to better understand ourselves as architects, (think Robin Evans or Stan Allen), the examination of software now implicates our very own subjectivities.”[7] Race, therefore, is an inevitable component of this conflation of media and subjectivity. Moreover, architecture is not only a field becoming ever-more driven by technology, but as Abrons suggests it is also a protagonist in the production of digital culture and subjects. Excluding race from that discourse is ignoring the realities of how technology produces culture and vice-versa. Thinking about race and software allows us to examine, for instance, the biases in our own information systems or the flaws in our methodologies. It may reveal blind spots in the very software or datasets we use to further research on cities and environments, or unearth unforeseen subjectivities forged solely within digital culture.[8]
Though it might be initially uncomfortable to discuss race as a component of technology and architecture, we must persevere if we are to develop intersectional modes of operating. Recognizing that this discomfort is itself part of the problem is a good start. Racism was not solved by anti-discrimination laws, nor is it some marginalized idea held by a few. Racism endures, as Toni Morrison puts it, because race “has a utility far beyond economy, beyond the sequestering of classes from one another, and has assumed a metaphorical life so completely embedded in daily discourse that it is perhaps more necessary and more on display than ever before.”[9] It is everywhere, but actively suppressed.
What follows is an outline of four talking points on race and software. It is intended to tease the subject out of already familiar discourses on architectural technology. Most sources come from new media and race scholars and are intended to supplement existing modes of examining cultural products and design methods.
1. Race as a Technology
Understanding race, as Wendy Chun has extensively written, not only as an object of representation but as a technique allows us to examine how we shape ourselves, others, and our surroundings. There are technologies that deal extensively with race, such as mapping, segregation, eugenics, and advertising, to name a few. But these technologies for describing and manipulating race largely obfuscate its role in the mediation of our identities and environments. They view race as either a biological or a cultural categorization. Instead, regarding race as a tool can shed light on the way it is evolving with design technologies and how it complicates design in general. As Chun argues, combining “race and technology displaces claims of race as either purely biological or purely cultural because technological mediation, which has been used to define humankind as such (“man” as a “tool-using” animal), is always already a mix of science, art, and culture.”[10]
Race as a tool for mapping difference is perhaps most evident in U.S. history. Often used negatively for profiling or making predictions, this mechanism manifested itself spatially in mapping technologies such as segregation and political gerrymandering, tools that in many ways persist today. Chun reminds us that “segregation, importantly, did not only map space but was also a reaction to the transgression of space brought about by modern technologies.”[11] Prominent historical figures have also been found guilty of perpetuating such mechanisms. Frederick Law Olmsted Jr., urban planner and son of the notable landscape architect, once argued that for any housing developments to succeed, “racial divisions...have to be taken into account,” advocating for policies that avoid “the mingling of people who are not yet ready to mingle.”[12] Evidently, even those in charge of designing new solutions and innovations can succumb to the fear of change, which in turn reinforces the status quo and can lead to devices like geographic redlining.
Seen in this light, the technologies that architects use—software, hardware, and information systems—can be said to incorporate race in various ways. Psychrometric charts, for instance, chronicle average comfort levels of a specific population sample in order to determine the range of acceptable indoor temperatures. Less clear, however, is what populations are being taken into account in these samples. In mid-century offices the standard comfort level was derived from the average inhabitant: a suited white male. It wasn’t until the 1990s that researchers challenged the “assumption of universal applicability, arguing that it ignores important contextual differences that can attenuate responses to a given set of thermal conditions,” such as “how individuals and cultures vary in their perceived need for and expectations of air conditioning.”[13] Today this type of racial data is being digitized into datasets for simulation software. Tools such as Siemens Tecnomatix Jack, an interface for simulating human ergonomics in factory and warehouse settings, are driven by databases representing a sampling of human subjects, largely from military- and government-conducted studies.[14] Jack is a technology not only for the representation of specific subjects, but in fact becomes a tool for predicting what those subjects—represented virtually as two white figures, Jack and Jill—are capable of doing and how they would supposedly inhabit a space. This genre of software exhibits eerie echoes of eugenics in its ability to “analyze human performance”[15] and its emphasis on streamlining labor systems.
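The sampling problem described above can be made concrete with a short sketch. The figures below are entirely invented for illustration: they show how a comfort range derived from a pooled “average” can fail to describe either of the subgroups it was sampled from.

```python
# A minimal, hypothetical sketch of how averaging over a population sample
# can hide subgroup differences in thermal comfort. All temperatures are
# invented for illustration; they are not real survey data.
from statistics import mean

# Hypothetical preferred indoor temperatures (deg C) from two samples.
samples = {
    "group_a": [21.0, 21.5, 22.0, 21.8, 22.2],
    "group_b": [24.5, 25.0, 24.8, 25.2, 24.6],
}

# The pooled mean is what a single "universal" comfort standard would use.
pooled = [t for temps in samples.values() for t in temps]
print(f"pooled mean: {mean(pooled):.1f} C")  # ~23.3 C, matching neither group

# Disaggregating reveals that each group's actual preference sits well
# away from the pooled figure.
for name, temps in samples.items():
    print(f"{name} mean: {mean(temps):.1f} C")
```

The design choice at stake is not the arithmetic but the sample: whoever decides which groups are measured, and whether results are disaggregated, decides who the “average inhabitant” is.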
Regarding race as a mechanism that operates across disciplines and scales implies that it is a powerful, mutable concept, not something to ignore. Colorblind attitudes (like abstracting all humans into white stand-ins) can do as much damage as racist ones. Thus, “the formulation of race as technology also opens up the possibility that...the best way to fight racism might not be to deny the existence of race but to make race do different things.”[16] It can be a valuable instrument to reflect not only on how and for whom we design, but also why we design.
2. Skepticism of Defaults
Software can be understood as a raced technology simply because it assumes certain attributes of its users. “The interface and Internet,” writes Michele White, “are raced as white by the prevalence of white hand-pointers, references to hands on web sites, and the tendency to depict white users.”[17] Who is Adobe Photoshop designed for? What do the default settings in AutoCAD presume about its users?
Default settings are designed for a specific purpose. They create the context in which work is done. But software has grown so complex that many users resign themselves to the default values set by those who designed the application. Default fonts, start screens, and tool layouts may appear benign, yet as Michele White notes, they also “provide spectators with constant messages about what individuals who use the Internet and computer look like.”[18] This is most evident in representations of white hand pointers and default neutral color palettes, but can also be seen in default content that is loaded or packaged out-of-the-box (OOTB) with our applications.
As previously noted, it is not uncommon for software to use virtual white figures as stand-ins for a variety of anthropometric information. Tecnomatix Jack collapses its data into two figures, Jack and Jill. But it is not the only instance of software defaulting to a white figure for reference. Perhaps the most famous default humanoid is Sketchup’s Bryce. Up until 2009, Bryce was “the default guy, the guy who automatically pops up into an empty space, ready to act as a human scale model to whatever it is you’re building...He’s kind of [sic] the guy you’d see anywhere, really—especially in the bowels of Silicon Valley.”[19] As harmless as the intention might have been—Bryce was based on a real person, as a joke—having the most popular 3D-modeling program default to a faceless white man carries much significance.[20] It illustrates the lack of diversity in computer science environments and reminds us that virtual space is a very real site for dealing with race. In 2009, Bryce was replaced with Sang, another team member.[21] Since then, the catalog of Sketchup scale figures has grown, but remains a representative sampling of Silicon Valley demographics.
We might say the same of architectural renderings. Most of us are quite familiar with the photomontaged perspective full of white European scale figures populating a proposed building.[22] Of course the story is not that this building is intended only for those people, but a problem arises with the assumption that white figures stand for all people. Dora Epstein Jones has recently commented on several ongoing trends of populating architectural drawings. Focusing primarily on the work of MOS and their “populated plans,” Jones points out that designers should “consider the ethical implications of peopling,” and asks, “Should we be blind to the whiteness of most of the little people?”[23] We shouldn’t. Precisely because what’s being communicated in these drawings is a future space for specific inhabitants, race and its depiction in drawing software can contribute to the overall concept.[24] The default figures included in software such as Sketchup or Autodesk Revit can in some cases enhance a project’s documentation, but they could also be read, as Amelyn Ng notes, as “bundles of demographic preferences that are on the one hand ethnically ‘diversifying,’ and on the other, increasingly gendered, racially typed, and ‘realistic.’ Racialized render realism turned deployable action-figure.”[25] In other words, even when done in the name of diversity, the scale figure can run the risk of being a stereotype or caricature.
3. The Myth of Neutrality
System defaults are designed to be a neutral starting point in most applications. But they also provide the context in which work will be produced. More importantly, work in any software application is the product of an imaginary relationship between the will of its user and its developers. And more often than not, whiteness in the forms of tool palettes, icons, canvas, and mouse pointer are presented to users as blank elements, unbiased and neutral. Whiteness as neutrality, however, is a common myth that pervades art, technology, and image-making. Writing about this misconception, Brian O’Doherty states: “An image comes to mind of a white, ideal space that, more than any single picture, may be the archetypal image of twentieth-century art...Unshadowed, white, clean, artificial—the space is devoted to the technology of esthetics.”[26] For O’Doherty, the white-walled galleries synonymous with modern art were a paradigm of de-contextualization that paralleled modern painting’s abstraction. Thus, white walls, while apparently neutral, were in fact full of meaning.
Just as white walls are not as neutral as they seem, white computer screens perpetuate a similar myth of neutrality. American Artist, an anonymous artist “whose work extends dialectics formalized in Black radicalism and organized labor into a context of networked virtual life,”[27] suggests that the replacement of black computer screens with white ones in the 1970s was a powerful maneuver. In an essay titled “Black Gooey Universe,” Artist states, “the transition of the computer interface from a black screen, to the white screen of the ’70s, is an apt metaphor for the theft and erasure of blackness, as well as a literal instance of a white ideological mechanism.”[28] By framing the history of Silicon Valley’s lack of diversity against its technological advancements, Artist produces a reading of software that reveals its blind spots. As the shift from command line terminals to Graphical User Interfaces—from black screen to white screen—reduced the amount of knowledge required to operate a computer, operating systems significantly obfuscated their underlying mechanisms. The white screen became associated with “user-friendliness,” while the black screen was seen as difficult and cumbersome, metaphors too easily associated with race.
American Artist and Michele White therefore agree that metaphors in the context of software are powerful mechanisms. Not only can icons be read as metaphors for specific functions, but images such as the white hand cursor and language such as “dark skin mode” create clear allusions to race. Design software packages like Adobe Creative Cloud and Unity provide users with options to change their interface color, and while the former labels this preference as “Appearance,” the latter uses the term “Skin.” Autodesk, Microsoft, and Apple have also been progressively darkening their apps and operating systems in response to optometrists’ warnings on the long-term effects of screen exposure. Yet the language being used to describe this transition to dark virtual spaces is far from neutral and fraught with racial associations—much like the controversial terms “master” and “slave,” which have been embedded in the language of computation for decades.
4. Science Bias
In the 20 years since the spread of personal computing systems, architects have flocked to each new design tool like kids in a toy store. Today we can identify Artificial Intelligence as one such toy. But we must be wary of these systems, as they have proven to be flawed, particularly in the way they address race. Many recent news articles cite examples of AI failing to interpret images containing non-white people.[29] In an MIT Media Lab study, researchers found that machine learning algorithms can discriminate based on classes like race and gender.[30] Rates of error in software developed by IBM, Microsoft, and Amazon “were no more than 1% for lighter-skinned men whilst for darker-skinned women, the errors soared to 35%,” and more strikingly, “when tasked to classify the faces of Oprah Winfrey, Michelle Obama, and Serena Williams, the systems failed.”[31]
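The disparity the MIT study surfaced only becomes visible when accuracy is measured per subgroup rather than in aggregate, a practice sometimes called disaggregated evaluation. The sketch below uses entirely synthetic labels and outcomes, invented for illustration, to show the shape of such an audit.

```python
# A minimal sketch of disaggregated evaluation: computing a classifier's
# error rate per demographic subgroup instead of one aggregate score.
# The records below are synthetic, for illustration only.
from collections import defaultdict

records = [
    # (subgroup, prediction_was_correct)
    ("lighter_male", True), ("lighter_male", True),
    ("lighter_male", True), ("lighter_male", True),
    ("darker_female", True), ("darker_female", False),
    ("darker_female", False), ("darker_female", True),
]

totals, errors = defaultdict(int), defaultdict(int)
for group, correct in records:
    totals[group] += 1
    if not correct:
        errors[group] += 1

# Aggregate accuracy can look acceptable while one subgroup fares badly.
for group in totals:
    print(f"{group}: error rate {errors[group] / totals[group]:.0%}")
```

Here the aggregate error rate is 25%, but disaggregating shows 0% for one group and 50% for the other, which is the pattern of disparity the cited study reported at scale.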
Of course, architects rarely use facial recognition datasets. Machine Learning (ML) and AI in architecture are typically deployed by experimental practices for (1) their visual effects, as evident in the work of Casey Rehm, and (2) for optimization or generative design, as in some recent projects by Certain Measures. Still, facial recognition application programming interfaces (APIs) are becoming easier to use, and thus more accessible to designers. The collective iheartblob, for example, have recently exhibited a series of architectonic face filters similar to those found in Snapchat that produce unsettling or glitched effects. The result is an Instagram feed full of distorted selfies featuring the protagonists of the group and their friends. For iheartblob, the architecture of the face is a new site for experimentation, yet, seeing as many facial recognition apps are tainted with racially biased data, their limits must be acknowledged.
We must also remember that the industries most excited about machine learning and artificial intelligence are those in mass surveillance, advertising, and data gathering/profiling. Architects should tread lightly as they immerse themselves in these murky waters. As artist Trevor Paglen and researcher Kate Crawford have recently put it, “datasets shape the epistemic boundaries governing how AI systems operate, and thus are an essential part of understanding socially significant questions about AI...But when we look at the training images widely used in computer-vision systems, we find a bedrock composed of shaky and skewed assumptions.”[32] Images are sometimes labeled with “racist, misogynistic, cruel, and simply absurd categorizations.”[33] These taxonomies, revealed in their project, ImageNet Roulette, are funny at times, but also illustrate how problematic AI training model taxonomies can be.
It’s not always clear why we appropriate certain software. At times it might be convenience or curiosity, other times it might be industry standards or the result of branding. Regardless, these systems have evolved into a rich ecology that dictates the way we design and work. We are as much users as we are human beings. With that comes a variety of subjectivities not produced by us, but by others for us and executed in code. Including race in discourses related to media and architecture is crucial as we evolve into increasingly digital subjects and beings alongside software and its continual updates.
[1] Wendy Hui Kyong Chun, “Race and/as Technology; or, How to Do Things to Race” in Camera Obscura 70 (May 2009): 7-35. https://doi.org/10.1215/02705346-2008-013
[2] 2017 NCARB data shows, for example, that “only 2 percent of licensed architects in the U.S. are African-American.” Accessed September 20, 2019. https://www.ncarb.org/nbtn2017/demographics
[3] Though recently, Dora Epstein Jones has touched on the subject of scale figure diversity.
[4] https://www.curbed.com/2017/2/22/14677844/architecture-diversity-inclusion-race
[5] “NCARB’s Commitment to Diversity” July 10, 2018. Accessed September 20, 2019. https://www.ncarb.org/press/ncarb-commitment-to-diversity
[6] At the 2019 Chicago Architecture Biennial, the architectural history collective Aggregate have organized a project that brings in historians and scholars to discuss the legacy of colonization in architectural history education. See “Indigenous Knowledge and the Decolonization of Architectural Pasts and Futures” Aggregate.com Accessed September 20, 2019. http://we-aggregate.org/project/indigenous-knowledge-and-the-decolonization-of-architectural-pasts-and-futures
[7] Ellie Abrons, “Foreword” in Digital Fabrications: Designer Stories for a Software-Based Planet (Los Angeles: Applied Research and Design, 2019).
[8] Memes, for example, have recently been adopted by some in the architecture community. See Ryan Scavnicky’s ongoing work such as, “Mutant Authorship. Agency, Capitalism and Memes” Archinect.com Accessed September 20, 2019. https://archinect.com/features/article/150085454/mutant-authorship-agency-capitalism-and-memes
[9] Toni Morrison, Playing in the Dark: Whiteness and the Literary Imagination (Cambridge: Harvard University Press, 1992) 9.
[10] Chun, “Race and/as Technology,” 8.
[11] Ibid., 18.
[12] Richard Rothstein, The Color of Law: A Forgotten History of How Our Government Segregated America (New York: Liveright Publishing, 2018), 51.
[13] Richard de Dear, Gail Brager, and Donna Cooper, “Developing an Adaptive Model of Thermal Comfort and Preference” (Berkeley: Center for Environmental Design Research, University of California, 1997).
[14] Tecnomatix Jack uses the following databases for anthropometric information: ANSUR, Asian_Indian, Canadian Land Forces, Chinese, German, NHANES, and North American Auto Workers.
[15] Tecnomatix brochure.
[16] Chun, “Race and/as Technology,” 28.
[17] Michele White, “The Hand Blocks the Screen: A Consideration of the Ways the Interface Is Raced,” last modified August 18, 2009. HASTAC.org. Accessed September 20, 2019. https://www.hastac.org/electronic-techtonics/michele-white-hand-blocks-screen-consideration-ways-interface-raced
[18] Ibid.
[19] Stephanie Syjuco, “Default Men and 3-D Diversity: Bryce vs. Sang.” Accessed September 20, 2019. https://openspace.sfmoma.org/2009/12/default-men-and-3-d-diversity-bryce-vs-sang/
[20] The Sketchup Team, “Aint Our First Rodeo,” Accessed September 20, 2019. https://blog.sketchup.com/sketchupdate/aint-our-first-rodeo
[21] Syjuco, “Default Men”
[22] It is no myth that the same cutout figures are often recycled and used in architecture schools and offices around the world. There are, luckily, newer websites which collect and share images of more diverse figures. Such as “Just Not the Same” Accessed September 20, 2019. http://justnotthesame.us/terms-of-use/
[23] Dora Epstein Jones, “Little People Everywhere: The Populated Plan,” in Log 45 (New York: Anyone Corporation, 2019), 67.
[24] This notion has been explored by MOS Architects in a recent catalog. See Hilary Sample and Michael Meredith. A Situation Constructed from Loose and Overlapping Social and Architectural Aggregates. (Art Architecture Design Research/Spurbuch, 2016.)
[25] Amelyn Ng, “OOTB,” e-flux Architecture. Accessed September 20, 2019. https://www.e-flux.com/architecture/positions/280207/ootb/
[26] Brian O’Doherty, Inside the White Cube: The Ideology of the Gallery Space (San Francisco: The Lapis Press, 1986) 15.
[27] American Artist. Accessed September 20, 2019. https://americanartist.us/contact
[28] American Artist, “Black Gooey Universe” in Unbag no. 2. Accessed September 20, 2019. https://unbag.net
[29] Maurizio Santamicone, “Is Artificial Intelligence Racist?” Accessed September 20, 2019. https://towardsdatascience.com/https-medium-com-mauriziosantamicone-is-artificial-intelligence-racist-66ea8f67c7de
[30] Joy Buolamwini, “Gender Shades.” Accessed September 20, 2019. https://www.media.mit.edu/projects/gender-shades/press-kit/
[31] Ibid.
[32] Kate Crawford and Trevor Paglen, “Excavating AI: The Politics of Training Sets for Machine Learning” Accessed on September 20, 2019. https://www.excavating.ai/
[33] Trevor Paglen Studio, “ImageNet Roulette” Accessed September 20, 2019. https://imagenet-roulette.paglen.com/