Every Tuesday, Megan Lubaszka logs in to her computer at Gensler’s Los Angeles office. She signs in to a GoTo Meeting, dons her headset, and enters a virtual world. She waits for the other avatars to start popping up around her. When they do, she sees their names appear in bubbles over their heads. Lubaszka—or rather her avatar—waves. Once they’ve all assembled, the project’s lead designers will guide the group on a tour of this soon-to-be-actual world. It’s a fairly straightforward design critique—except it takes place in the cloud, “within” the building that’s under inspection.
“VR Jams” are now a sacred weekly ritual at 28 Gensler offices around the world. “There’s a reason why we do them every week, on our own time,” Lubaszka says. “It’s such a beautiful experience to take a design team into their project and hear them reacting, like, ‘Oh, that proportion really worked out here!’ It’s just this experience of total wonder.”
Across the U.S., architecture firms are beginning to incorporate virtual reality (VR) into their practices at a fast clip. But, for the VR uninitiated, wrapping one’s head around the technology and what differentiates it from the other technologies that accompany it—like AR (augmented reality) or MR (mixed reality)—can feel overwhelming. Even experts will, confusingly, use these terms interchangeably. Why? Because of the nature of the technology. All VR/AR starts in the same place: a 3D model. VR and AR are both ways of experiencing that model—outside of a screen.
When you use a headset such as the Oculus Rift or the HTC Vive to roam around and interact with a modeled environment, as if you’re inside a video game, that’s VR. Here’s how Recode writer Eric Johnson explains it: “When VR users look around—or, in more advanced headsets, walk around—their view of that world adjusts the same way it would if they were looking or moving in real reality.” VR is what you may be most familiar with; cheap headsets have become more readily available, and the film industry has made some VR movies already.
Augmented reality is in a more experimental phase. Most folks became conscious of AR thanks to a little app called Pokémon Go, which lets smartphone users “capture” Pokémon that appear (through the lenses of their phone cameras) to inhabit the real world around them. In contrast to VR, which creates a virtual world, AR superimposes digital objects and imagery onto our physical world, through a smartphone, tablet, or special AR viewer. Both of these technologies exist on a spectrum: Improvements made to virtual reality will almost certainly affect AR, and vice versa.
When Metropolis covered VR in our December 2016 “Year in Review,” we suggested VR had reached a watershed moment among architects and designers. VR’s advantages as a communication tool are obvious. It gives clients an immediate understanding of a space. No abstraction is required. Dimensions become clear. Design intent becomes manifest. While most VR versions of architectural models remain rather rudimentary, it’s only a matter of time before they become unnervingly realistic. Chicago-based visualization firm Sonny+Ash, for example, is already adapting gaming technology to create VR visualizations in which every detail, from the texture of upholstery to the reflection of light upon a surface, can be experienced virtually.
However, a few firms are now exploring the latent capacity of VR to disrupt the design process itself. The challenge is in great part a technological one. Firms seriously interested in investigating VR have established partnerships with VR software providers (like IrisVR) or even hired software developers themselves. To achieve its weekly VR design critiques, for example, Gensler courted a relationship with the video game technologist Henry Yu. A father tired of the destructive violence of the gaming industry, Yu decided he wanted to help build things in the real world instead. His company, Kalloc Studios, developed Fuzor, software that acts like a plug-in for Revit or Rhino, turning 3D models into roamable virtual environments where designers, like gamers, can meet and work together. By building this kind of expertise, some firms believe they will revolutionize not only the way designs are conceptualized but also the way our world is physically built and experienced.
“A lot of our competitors, they’re basically taking an architectural space and making it as close to reality as possible, making a photo-real representation of what you can actually experience once the building is finished.” Brian Hopkins, Ennead Architects’ director of applied computing, is explaining why, just a few months ago, his firm decided to take the leap into VR.
“We thought we would take another approach: What are all the things that we can’t see when we’re in reality? How can we start to poke the boundary between the real and unreal—the real being the sensorial, or the world of the sensory, and the unreal being that which we can’t see?”
Hopkins is describing the emerging field of immersive analytics, a multi-disciplinary initiative that seeks to take data visualization to the next level. For architecture, immersive analytics seems a natural partner: After all, every model of every building already holds within it many complex data sets waiting to be displayed.
For example, for a planetarium in Shanghai, Hopkins’s team, using the Ladybug and IrisVR plug-ins for Grasshopper 3D, was able to overlay environmental data about the building’s reception of daylight on the design itself, in order to experience that data in VR. Suddenly, the designers could feel everything they knew about the data set—how much light would be concentrated where, where sunbeams would hit in the morning and in the afternoon. In Hopkins’s words, “It makes the quantifiable qualitative. We start to describe light in ways that are actually closer to metaphor than data analytics. We’re humanizing the data.”
For its next challenge, Ennead wanted to up the ante. What would it take to visualize, not environmental data, but human behavior within VR? Certain plug-ins already existed for just this purpose—Interior Architects, for example, uses InsiteVR to record and analyze user behavior, like where they (virtually) go and gaze. However, the team at Ennead wanted a custom solution—and a truly immersive, interactive experience.
Using the Unreal gaming engine, the architects and developers crafted a high-fidelity virtual simulation of an existing Ennead project: the Beus Center for Law and Society at Arizona State University (ASU). The school had already worked with the architects and Unified Field to develop a tracking app that can locate students with similar interests or professors during their office hours and facilitate communication. For their VR simulation, the designers at Ennead leveraged this data, using it to power avatars that roam through the virtual simulation.
So, why bother programming artificially intelligent avatars to wander through a virtual simulation of a building that already exists? Hopkins and his team consider the ASU project a kind of test run. When they can program the avatars to simulate circulation, they’ll then be able to analyze design layouts, circulation paths, and points/durations of interactions between different user types. What’s more, when the architects can use the data pulled directly from the building’s users and tie individuals to avatars, they’ll have a real-time simulation of the building in use, which will give them a whole new way to analyze the successes—and failures—of their designs.
It’s easy to imagine a future in which immersive analytics, while incredibly useful for architecture, becomes absolutely crucial for urban planning, a discipline replete with complicated data sets. And it’s also in urban planning where the potential of augmented reality becomes more readily apparent.
For the firm AECOM, which often takes on large-scale transportation and infrastructure projects that span miles, AR has already become a vital visualization tool. By loading maps and satellite imagery into its plans, AECOM’s designers can use AR to take a God’s-eye view of a site, complete with accurate topography and dimensions. Through custom coding, done in-house, they then load in various layers of information, including different iterations or stages of the project, or even environmental or social conditions that exist outside the project itself. These layers are then superimposed on a physical model.
Morgan Garrard—who works at AECOM and holds the impressive titles of designer/immersive technology specialist and MR/AR/VR content developer—explains the process: “We can toggle between the regular city view, sort of a satellite view of the buildings, and then toggle to, say, environmental justice areas. So we’ll be able to see, in various gradients of red, where the areas of environmental justice are that need to be addressed….If you’re talking about graphs and charts and numbers, there’s only so much you can take in at once and make sense of. If you can turn all of that information into visual data, and have it there for somebody to interact with, that’s a big deal.”
But what would be an even bigger deal, at least in practical terms? If that information were to be mapped, not just onto a model but onto a real-life construction site.
AR’s fabrication and construction applications have been considered a type of holy grail ever since the technology’s inception. SHoP Architects has perhaps the best sense of what an AR-enriched future could look like. The firm broke new ground with the design and fabrication of the Barclays Center in Brooklyn, where 12,000 facade panels, each distinct from the others, were cut directly from a high-resolution 3D model created on software more commonly used in the aerospace industry.
Although the firm is not yet reliably using AR on construction sites, Barclays has laid the foundation for this eventuality. In the words of John Cerone, SHoP’s associate principal and director of virtual design and construction, building Barclays was about “figuring it out, road-mapping best practices in real time.” Cerone sees a clear through-line from that project to a current one for the Botswana Innovation Hub. As with Barclays, every component has been digitally modeled and cut by machines. On the construction site, there are no drawings, only IKEA-like assembly diagrams. To Cerone, it seems “inevitable” that in the firm’s next project, VR/AR will replace the diagrams and become integral to the construction process. “It’s too efficient for it not to.”
Architects using VR will be able to intuitively understand the dimensions of the construction pieces—how big they are, if one person can lift them, etc. And, since the machines use those same models to cut the pieces, they will know exactly what is being sent to the construction site. With AR viewers in their helmets, perhaps, or in their glasses (start-up Stimuli offers just such an option), construction workers will have those same digital models juxtaposed with their physical surroundings. Cerone notes: “Then we’re both occupying the same digital/real space. It is virtual, because you’re digitizing reality, but it’s representative of what’s really there.”
Indeed, the merger of the physical and virtual worlds is perhaps the most exciting—and revolutionary—possibility for VR/AR technologies in the architecture profession. One day, architects could design entirely in a VR environment; those files could be directly used in digital fabrication; contractors could see the digital and physical side by side to guide the construction; AR avatars could provide insights to make buildings more comfortable for their occupants; and facility managers could cross-reference real data from buildings to keep energy costs down. When that happens, architecture will never be the same again.