The Optim Interview

Alex Coulombe of Agile Lens Talks Unreal Engine Workflows

January 22, 2020

At Theia Interactive, our studio staff create visualizations for a high-profile list of enterprise clients, while our programmers create tools to streamline the process of creating those visualizations. Like many of our peers in the industry, we were losing time and money as our 3D artists spent hours on complicated, tedious Unreal Engine workflows: dragging and dropping materials onto meshes, decimating some meshes while deleting others, and setting up LODs one by one.
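
For a sense of what that per-mesh LOD setup looks like when scripted by hand, here is a minimal sketch using Unreal Engine's Python editor scripting (via the Editor Scripting Utilities plugin). The content path and reduction percentages are placeholders, and it illustrates the kind of manual automation Optim is meant to replace, not anything about Optim's own implementation.

```python
import unreal

# Hypothetical content folder: point this at your imported Datasmith meshes.
CONTENT_PATH = "/Game/ImportedScene"

# Three LOD levels: full detail, then 50% and 25% of the triangles (placeholder values).
reduction_options = unreal.EditorScriptingMeshReductionOptions()
reduction_options.reduction_settings = [
    unreal.EditorScriptingMeshReductionSettings(percent_triangles=1.0, screen_size=1.0),
    unreal.EditorScriptingMeshReductionSettings(percent_triangles=0.5, screen_size=0.5),
    unreal.EditorScriptingMeshReductionSettings(percent_triangles=0.25, screen_size=0.25),
]

# Walk every static mesh under the folder, build its LOD chain, and save it.
for asset_path in unreal.EditorAssetLibrary.list_assets(CONTENT_PATH, recursive=True):
    asset = unreal.EditorAssetLibrary.load_asset(asset_path)
    if isinstance(asset, unreal.StaticMesh):
        lod_count = unreal.EditorStaticMeshLibrary.set_lods(asset, reduction_options)
        unreal.EditorAssetLibrary.save_loaded_asset(asset)
        unreal.log("{}: {} LODs generated".format(asset.get_name(), lod_count))
```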

Our Datasmith add-on, Optim, was originally born out of a desire to speed up and smooth out the Revit to Unreal Engine workflow for one of our most challenging and complex projects. When it came time to test Optim, we reached out to visualization experts across a wide range of industries, including AEC, product development, and visualization studios like our own.

We’re big fans of the groundbreaking visualization work created by NYC-based Agile Lens, so we were excited when Agile Lens Co-Founder and Creative Director Alex Coulombe signed on as Optim’s first alpha user.

Unreal Engine Workflows Solved

Theia Interactive’s CTO, Stephen Philips, interviewed Coulombe ahead of Optim’s November release at Autodesk University 2019.

Theia Interactive: Alex, you were the first person to gain access to our alpha version of Optim. You have a reputation in the immersive realities community for jumping right into the latest tech and showing everyone how it’s done. Tell us about Agile Lens and some of the latest projects you’ve been working on.

Alex Coulombe: It was an honor to jump on so early. I knew from your first description what a game-changer Optim would be for our workflow. Agile Lens builds VR and AR experiences for both architecture and live events. Unfortunately, our most exciting projects are under NDA, but a few I can mention include a giant VR stadium for Intel, the Rice University Music and Performing Arts Center, The Shed in Hudson Yards, Kenneth Branagh’s Macbeth, a Magic Leap ghost speed-dating experience called Ghosted AR, and Loveseat, a live VR performance at last year’s Venice Film Festival. We’ve been fortunate to gain early access to a lot of XR hardware and software, and we’ve been given the freedom to explore its practical utility. A lot of our projects start with the prompt “I don’t know if this is possible, but…” and we thrive in pushing the boundaries of what’s possible.

TI: Unreal Engine seems to be a fairly significant piece of your business, but it’s certainly not the only tool in your toolbox. What types of projects do you find yourself using Unreal for instead of some other VR design review platform?

AC: We like to talk about “active design” or “designing from the inside out,” which is to say we believe in the power of working through a project while inside virtual reality and understanding the full implications of design decisions on the human-scale experience. There are some great turn-key, one-click solutions for VR out there, but we specialize in tailored solutions; sometimes we’ll set up the ability to sketch inside our VR experience, move the locations of key design elements, or actively change lighting conditions. For example, maybe a client is considering options for a curtain wall’s louver system and wants to adjust parameters while also changing the time of day to see how that affects both cost and thermal load. That’s where Agile Lens comes in.

TI: How much time would you say you typically spend doing repetitive optimizations to your Unreal Engine projects? Time spent deleting useless meshes, merging separate meshes, creating LODs, or organizing the scene, for example.

AC: I wish we tracked that time precisely, but the short answer is “significant.” Our first major Unreal Engine project was in early 2016 and I remember exporting our client’s Revit file as an FBX, dragging it into Unreal Engine, then waiting over an hour for everything to import. At that point we really saw what we were in for – missing materials, flipped normals, awful UVs that wouldn’t bake, and a model of such size that we couldn’t get 30 fps in the editor viewport, let alone 90 fps in high-resolution VR. We were not prepared for that workload and had severely under-scoped our fee.

TI: What’s your least favorite part of that process?

AC: Oh, you know what’s great? When you spend a day cranking through a mindless task, then the next day something has changed and you have to do it all over again! Genuinely, I’ve always loved refining workflows and feel great joy when something that used to take an hour now only takes a few minutes. Back when I worked at architecture firms, I was always looking for opportunities to use AutoLISP scripts in AutoCAD or MaxScript in 3ds Max or Grasshopper in Rhino or Dynamo in Revit — it was so satisfying to feel a release valve on the technical requirements of the work I was doing so I could spend more time thinking creatively and, you know, designing.

TI: Has Optim helped you save any time on the Unreal projects you’ve been working on?

AC: Yes. Moving from the standard Unreal Engine workflow to Datasmith was something like a 10x increase in speed, and now with Optim, we’re looking at roughly a bazillion-x. We’ve been able to take highly complex models, particularly ones coming from the AEC world, and get them into our projects with enough optimization already in place that we can immediately hop into the VR Editor and start to evaluate the state of a design, and get right to improving it.

After we started using Optim, our workflow immediately became half as technical and twice as creative.

TI: What Optim features are most helpful to you in your workflow?

AC: I love the way Optim handles lights, which can often come in from other software at weird intensities. The ability to create rules is really excellent, and we love the way the new auto-rules feature sets up some of our most common needs. And frankly, there’s just something very satisfying about how data-rich the initial import window is — even if we aren’t going to change many settings, we love seeing how many materials, objects, and lights are about to come in, how many duplicates there are, what the polygon counts look like, and so on. It’s also great at tipping us off if there’s something wrong with our initial import. “Wait, 3400 materials? I thought this was a white model!”

Like many of our original alpha and subsequent beta testers, Alex has provided us with invaluable feedback. His expertise in all things visualization, and his enthusiasm for the work he does, have made our job easier!