Episode 340 — Zoic’s Executive Creative Director / Co-Founder – Andrew Orloff



Zoic Studios is a VFX company of accomplished artists and producers with a deep understanding of story, process and relationships. With offices in Los Angeles, New York and Vancouver, the company has prided itself on partnership, integrity and enthusiasm as its key founding principles. Their projects are born in think tanks, where every idea has value and potential. Dynamic ideas are fleshed out, and a plan for crafting a visual representation of the idea is set early in the creative process.

Since 2002, Zoic’s team has built a proven track record of success. They are masters in their fields of 3D, compositing and production management. Their reputations, in turn, attract additional talent and foster a strong work ethic.

They have built a robust digital infrastructure, flexible and strong enough to support the staff. It is a “liquid” pipeline, designed by their team for rapid expansion and contraction. This custom pipeline is resolution-independent and adaptable to project needs. Leading visual technology firms provide additional support and insight into the latest methods.

Zoic has worked on a variety of high-end productions in both film and television: The Many Saints of Newark, Judas and the Black Messiah, Mother!, Game of Thrones, The Walking Dead, Legends of Tomorrow, Breaking Bad, Homeland, See, The Underground Railroad, The Boys, Fargo, and many more.

In this Podcast, Andrew Orloff, Executive Creative Director / Co-Founder of Zoic Studios, talks about how the industry has changed with the introduction of streaming content and virtual production, and how Zoic has incorporated Unreal Engine into its pipeline – as well as giving tips to up-and-coming artists and VFX supervisors alike!

Zoic Studios: https://www.zoicstudios.com

Zoic Studios on Twitter: @ZoicStudios

Zoic Studios on IG: @zoicstudios

Zoic Studios on LinkedIn: https://www.linkedin.com/company/zoic-studios

Andrew Orloff on IMDb: https://www.imdb.com/name/nm1364187/

Andrew Orloff on LinkedIn: https://www.linkedin.com/in/andreworloff



[02:35] Andrew Orloff Introduces Himself and His Company

[19:35] How TV Budgets Have Changed

[24:58] The Future of Virtual Production

[37:00] Innovation of Unreal Engine

[45:15] An Outlook on Innovation and Disruption

[51:33] How to Choose Your Software as a Company

[59:42] Importance of Accessibility

[1:00:22] When and How to Transition from Being a Generalist



Welcome to Episode 340! This is Allan McKay.

I’m sitting down with Andrew Orloff, the Executive Creative Director / Co-Founder of Zoic. We get into a lot of great topics, including virtual production and Unreal Engine. 

Please take a few moments to share this Episode with others. That would mean the world to me!

Let’s dive in!



[00:58] Have you ever sent in your reel and wondered why you didn’t get the callback or what the reason was you didn’t get the job? Over the past 20 years of working for studios like ILM, Blur Studio, Ubisoft, I’ve built hundreds of teams and hired hundreds of artists — and reviewed thousands of reels! That’s why I decided to write The Ultimate Demo Reel Guide from the perspective of someone who actually does the hiring. You can get this book for free right now at www.allanmckay.com/myreel!

[1:06:04] One of the biggest problems we face as artists is figuring out how much we’re worth. I’ve put together a website. Check it out: www.VFXRates.com! This is a chance for you to put in your level of experience, your discipline, your location — and it will give you an accurate idea what you and everyone else in your discipline should be charging. Check it out: www.VFXRates.com!



[02:35] Allan: Again. Andrew, thanks for taking the time to chat. Do you want to quickly introduce yourself?

Andrew: Yeah. My name is Andrew Orloff. I’m one of the Co-Founders of Zoic Studios, a visual effects company focused primarily on broadcast visual effects. We’re coming into our 19th year. I’m officially the Creative Director and President of the BC office, which is our main production hub. We also have offices in Los Angeles and New York. And most recently, I’ve also taken on the title and responsibilities of Head of our Realtime Group. I [head] up the creative and technical leadership for the entirety of our realtime efforts, mostly centered around integrating Unreal Engine into the visual effects process.

[03:20] Allan: I love that! You’re also the only visual effects studio I know of that has valet parking, which I’ve always thought was really great!

Andrew: It’s different when you come to Vancouver.

[03:37] Allan: I came from the commercial world where you’ve got clients coming in and paying $1,000 an hour to sit in a Flame suite. And the reason many clients come back is because of the experience and the relationships they build. Do you want to give a bit of background about how you got started? I’m always fascinated with everyone’s story because you never know when someone’s going to say, “Well, I was a lawyer, and I decided I like blowing stuff up.” How did you come across visual effects in general?

Andrew: Yeah. My journey is a little bit more linear. From a very early age, I was obsessed with science fiction, horror and fantasy movies, comic books and books. That was my go-to, and also the filmmaking process. I had some exposure to it as a career path: My grandfather was a screenwriter and my uncle was a creative director at an ad agency. So I had a good amount of exposure to the filmmaking process and had the wonderful privilege of sitting in on editing sessions. And I kind of knew all about it. I was so obsessed with those genres at the time – and that was really the golden age. (I was born in 1970, so I’m the 7-year-old kid [who sat] there watching the Star Wars crawl for the first time.)

[05:51] Allan: I was literally going to say, you would have done this.

Andrew: Letting it wash over [me]. And then all the other subsequent movies. And I became obsessed not only with the imagery but with how the imagery was created. I’ve always been a creative person and an artistic person; I made my own artwork and wrote my own things. I was also really fascinated with computers and how they worked. From the Apple and Apple IIe days, I was pushing to find ways to express myself artistically on the computer, voraciously looking everywhere, all the way to Fangoria magazine.

And then in the 80s, when CG first started to come into the feature film world – with Tron and The Last Starfighter – there was something about the design of that, the aesthetic of that, the possibility of that, that was fascinating to me. And I just wanted to find out how they did it. That pushed me on a journey of trying to use the computing power that I had (mostly Apple IIe’s at our public school – a San Francisco public school computer lab). Aside from some suspect floppy disk copying, one of our great [pastimes] was doing artwork on the computer. And we started digitizing things by hand. I’d take comic book frames, redraw them by hand and put them onto a KoalaPad, which was a very rudimentary tablet of the time, and digitize them by hand. I would make models of the space shuttle, then go with graph paper and measure them, draw out every cross section on the graph paper, and write down all the coordinates. Then I’d type it into a graphics program that would allow me to rotate the wireframes – building wireframes by hand.

At that point, there wasn’t really a job there. People would say, “There’s art. I can understand that. There’s computers. I can understand that. But art on a computer? Why would you do that?” It was in such infancy! But I was pretty obsessed, and I kept at it. And I decided the best way to express myself was to go to UCLA and try for the UCLA film program. So I did that, starting as a Fine Arts major. The way it worked at the time, you had to apply separately into the film program, and I was fortunate enough to be accepted. That’s really where I got a very special opportunity to work across departments. I worked a lot in the animation department (hand-drawn animation), and I worked a lot with practical filmmaking. Unlike USC at the time – which had a more practical, studio-like model of getting groups together, pitching as director, writer and producer teams, then hiring your crew from the student body with the faculty acting as a proxy for the studio – the UCLA method was: Everybody made a movie, and everybody rotated crew positions on everybody else’s movie. So that was really enlightening.

The equipment at the time was very antiquated. I mean, we were animating on the animation stands that they used on Snow White. They were using Moviolas and magnetic tape, so it was all pretty primitive. But what I still appreciate to this day is that hands-on, almost sculptural feeling of filmmaking; the physicality of it was really great. And that goes for everything from splicing film together by hand with pieces of tape to moving lights around. I think it served me well as a visual effects artist to have that physical model of a film set. People throw around the words “photoreal” and “organic”. What they mean is they want it to look like a real movie. We’re not doing cinéma vérité documentary visual effects. We’re doing visual effects that have the same qualities and the same language. Citizen Kane isn’t a documentary; Jurassic Park isn’t a documentary. There are artistic decisions being made around practical conventions that have been going on for 100 years.

So that hands-on feeling I got through that film education was great. And I was still really interested in computers, doing computer work on the side. I bought a used Macintosh SE from my roommate who wasn’t using his, and I used it to work my way through college, doing graphic design mostly. So I always kept my hand in that realm. When I graduated, I did my normal rounds as a new film student: PA-ing, prop assistant jobs, commercials, working on set again. But during that time, I really wanted to get into computers. I graduated in ‘92, and in ‘93 Jurassic Park came out and I was like, “This is what it’s going to be!” I was continuing to make money in between film gigs doing graphic design. I was at a CD-ROM place doing cover art, graphic design, font work and marketing stuff. And the interface designer there was working on an SGI workstation that had a copy of PowerAnimator on it. I would look over his shoulder, and I knew from my magazine research that that’s what they used on Jurassic Park. I had to learn it! I’d never seen one at college. There were no training programs available. My training program was a manual, which was super thick.

[12:41] Allan: Remember, many forests got killed in the days of Softimage.

Andrew: And there was also no Internet. Some board chatter, which I’ve always kind of been into, but not a lot of community for sharing knowledge. So I asked, “Could I stay after my shift to go through the manual and learn?” I did that for a while until [they asked me] to help out. Not very long after that, I was doing it full-time. My very first job in film, television and visual effects – and my entire career, which is 26 years – was in broadcast visual effects. I was asked by Saban Entertainment to help put a team together for their shows, mostly Power Rangers and the other Saban satellites. A lot of folks from there ended up all over the place: producers, Flame artists in very high positions all around the industry. We all kind of scattered to the winds. And that was our crucible. That’s where we really learned our craft. It was very fast and furious. It was very generalized from the CG perspective: We were doing everything from tracking the cameras and making the models, to doing the rigging, to rendering shots. And then I freelanced for a good 7-8 years. I freelanced with Loni [Peristere] and Chris [Jones], the Co-Founders of Zoic. We ended up together at Radium, working on Buffy, Angel, Smallville, Firefly. Then as that contract came to an end, we decided to form our own company, focused on the artist experience and on delivering high-end creative partnership in broadcast.

We’re heavily invested in the Berlanti-verse, the Arrowverse shows – Stargirl, The Flash, Legends of Tomorrow – and also working a lot with Mike Flanagan on Bly Manor. And we do a bunch of different types of work. We do a good amount of work where we are the supervisor, the sole vendor and the sole creative solution, from character design all the way to final delivery. We also work on a sequence level with visual effects supervisors – and there’s a lot of community there at this point – doing sequence work on some shows that are multi-vendor shows.

And one of the things that gets asked about Zoic Studios in particular is: What’s your specialty? Especially when we started, there was a feeling that you had to specialize. Are you a matte painting shop? Are you a dynamics shop? Are you a character shop? People look at our body of work and they’re like, “Well, there’s a vampire and there’s a spaceship, and then there’s a castle. What’s the common thread here?” The common thread is that we get asked to do work that has what I call a tonal specificity: something that the creators need, a specific vision that’s unique to that show. And that really exploded with streaming, because every streaming show needs its own visual and tonal differentiation. When you look at Midnight Mass and at other things we have worked on, like Squid Game, everything has this very specific tonal vibe. And the visual effects carry a lot of weight in those types of shows. We’ve always been about building boutique teams. “Commercial boutique” was the hot term at the time for visual effects, and there’s a lot to be learned from that. There’s a creative continuity, a creative nexus, in having a core team. We’ve always tried to marry that with technology and smart scheduling – utilizing the power of the art and the technology while keeping that nexus of a core team, episode after episode, season after season – so that we can deliver these really specific images to our clients.

That’s what we do. With our pipelines, we do literally everything: our own previs, character design, visual effects design, all the way down to being very involved in the editorial process. It’s just been a great experience!

For about 12 years, broadcast visual effects weren’t getting a lot of respect. It was kind of the red-headed stepchild of the visual effects industry for a while there. I think that started to turn around, especially with streaming, probably about 8 years ago. People were like, “Whoa! The quality here is really exceptional, and the storytelling is amazing!” And now all the broadcast shows, all the streaming shows, have huge visual effects footprints and budgets. The industry is just on fire, and I think that’s going to continue because people love it. People love to be in their house, and the material is great. Yeah, it’s really engaging.

[19:35] Allan: No, I think it’s made such a massive shift. It’s exciting how some episodics now have $1 – $2 million budgets. That was unheard of [before]! I love the fact that things have gone in that direction! But on top of that, you actually have the budget to be able to do things right.

Andrew: And it’s interesting, because after working a career in broadcast, what streaming is doing is way closer to traditional broadcast than it is to feature film. If there’s a spectrum of budget and schedule – the two main differentiators historically between broadcast visual effects and feature film visual effects, with features having long timelines and big budgets, and TV having the opposite: short time, small budget – then with streaming, it’s hours and hours and hours of material. A season of 1-hour drama that has 10 episodes – that’s 3 feature films. That’s 2 Dunes in the amount of time that it took to make half a Dune!

[21:39] Allan: That’s the challenge!

Andrew: One of the things that Zoic has paid forward is a creatively focused, versatile, nimble pipeline that allows us to deliver quickly and iterate quickly. And realtime is a huge part of that. That’s where our realtime effort is really focused, because what I see from other companies is that their challenge right now – moving from the feature pipeline – is to be that nimble, to be that interactive, to be that iterative in the time frame. They haven’t been invested, as we have for two decades, in a rapid-turnaround, highly flexible pipeline. They’ve been invested in high quality, high output, long-term and very linear task stepping. When you get into more complex things like character animation, those steps are like, “You’ve got to approve all the animation on an unfurred character. And once the fur goes in, if you have a single note, you can’t go back.” And we’ve never been able to do that.

We’ve never had the luxury of doing that. We’ve had to fire on all cylinders simultaneously to make our schedules, and be able to do crazy things like talk to people at the VES and other filmmakers. Back to the practical part of my experience in film school: that’s not the way that filmmaking really works. And it creates a separation between visual effects and filmmaking, where it’s a different process that people hate. You rarely hear creators say, “You know what I really love doing? Working with visual effects and visual effects vendors!” What you hear is, “We’re only going to do this when we have to.” We’ve always been looking to change that. And I think realtime has the potential to really shift that paradigm and bring the spirit of collaboration, the language of practical filmmaking, back into the visual effects process. That’s the thing that I’m seeing with LED walls. That’s awesome. Love LED walls! It’s so cool. But everybody’s focused on the fact they don’t have to pay for all these visual effects and make a gazillion notes on key edges. I’d have it the other way around.

[24:40] Allan: Now you either create a separate department for pre-production which ties into post, or you’re basically involving visual effects way earlier.

Andrew: It moves it into a different space than it has been traditionally.

[24:58] Allan: That’s right! I’m curious to talk about realtime for you. When did you first start to drink the Kool-Aid? In terms of both previs and postvis, all the way through to compositing and finishing shots – how are you utilizing it in your production?

Andrew: I spent a good amount of time over the last two years going, “Fad! This is hype. It’s not really feasible. It’s never going to hit the quality level.” Then I started to look into it and saw what people were doing with it. And of course, everybody’s eyebrows went up when The Mandalorian came out. That’s when I realized maybe I should take a more serious look at this. We talked a little about my background: I have that thing where if it’s free and it’s new and it’s interesting, I’ll just try it on my own time, just to see what it’s all about. And the proof for me was kind of immediate. Wow, this is potentially paradigm-shifting – if it’s done right; there are a lot of questions that haven’t been answered. I’ve been operating as President of the BC office, so I’m heavily involved in hiring and in purchasing technology, and in looking at where things are headed as far as what’s sustainable, from an economic perspective and an environmental perspective: all this machinery that we need to constantly buy, and what it’s for. We’ve been digging deep and analyzing: What is it that makes a visual effects company work? What makes it profitable? How do you decide what hardware you need to buy? And the one big thing is iteration. It was very clear from just playing around with Unreal that there was a lot of potential there for disruption, because it almost brings the visual effects process back into integration with traditional filmmaking techniques. So I’d say for me, the proof was in the hands and the eyes – in actually working with the software.

And then an opportunity came up through one of our sales VPs who had a relationship with Epic, asking if we wanted to apply for the MegaGrants program to explore what it would take to integrate our VFX pipeline with Unreal. That sounded interesting to me. We had just done a very large, complicated, all-CG character sequence for Stargirl. What if we just did it again in Unreal and compared the two, process- and quality-wise?

[28:36] Allan: Because you would never usually get to do something like that. Because again, time is money and people are way more money.

Andrew: And it was during the pandemic. We were looking for something to do during a production lull. We had the artists. So that was a silver lining to this very horrible situation that we’re still dealing with today. But Epic thought it was a great idea. [They gave] us the funding and the support to really do it. And through that process, we learned a lot. We’ve still got a long way to go, but I learned a lot about what Unreal can do for visual effects. And by the end of that project, I was just 100% in!

I mean, my personal feeling is: if we look 5 years into the future, this will be the future of visual effects in a lot of ways. It’s not slowing down; it’s accelerating, whereas the linear pipeline is getting more complicated, more cumbersome and harder for visual effects facilities to integrate. If I were starting Zoic today, I’d focus 100% on realtime. I couldn’t see starting again and trying to do it the way that we did it. We really struggled to make licensing deals for Maya, get enough After Effects licenses at the time, get enough machines to do the rendering and machines for tape layoffs. And that was a real financial and business struggle. The young people are coming in with these skills because the software is free, there’s a lot of information out there, and they’re getting a lot of feedback online. Anybody who works with me knows that if I find something I feel passionate about, I’m going to jump in all the way. We went from having some people in our office interested in Unreal to a 30-person department that contains animators, technical artists, asset people, level designers, previs and editorial artists, art directors and creative directors. So it’s been pretty rapid over the last 16-18 months. And we’ve broken it down into 3 main parts that I think are important to the visual effects process:

  • The first is interactive visualization, which includes traditional previs. But we do our traditional previs to scale and start integrating things like LiDAR of locations and sets. That goes into tech vis, where we can use it as a tool on set. We’re using Unreal exclusively for that previs and tech vis – it’s just an amazing interactive communication tool.
  • Then we push that forward into look development. Our first look development stages and our lighting development stages for assets happen in Unreal because it’s so interactive.
  • And then we’ve also developed a lot of ingestion and plate technology that brings our shot plates into the 3D scene once they’re shot, then conforms the previs and puts it into the plates to match what editorial has cut in for their temps – giving a very clear roadmap to what the final visual effects are going to be.

And at that point, we know exactly what’s going to be rendered. We’ve got timing, lighting direction and blocking animation all included in there. And along the way, it’s all based on interactive sessions with the creative stakeholders: me or another supervisor or another artist in the Unreal editor, listening to what they’re saying. And that’s where I feel it goes beyond previs – because it’s a simulation: the lenses are simulated, the lights are simulated, the light profiles are simulated. When you’re done, everything can be translated directly to the set. We’ve looked at previs and tech vis as kind of fancy storyboards for a long time. This feels different, because it is previsualization, but it’s also a lot more, especially when you’re integrating with motion capture: a lot more blocking information, a lot more lighting information, a lot more information to give to special effects, a lot more information to give to stunts. You can measure where the green screen needs to be. And if you bring that level onto a set in even the most rudimentary way, just on a laptop, you can throw measurements down on it and tell them, “Okay, put this here.” That’s been just a fantastic, really interactive way of pushing the creative process forward!

Those shots stay in the cuts for a very long time, and it gets rid of an issue that comes with inserting visual effects: somebody sees a placeholder, gets freaked out because they don’t know what it is, and then they’re rushing the pipeline to get temps in there. And you’re going so fast, slapping so many bandaids on it in Nuke or whatever – using whack elements from reels, or going outside to shoot something on an iPhone just to throw it in there – to give them an idea of what the visual effects design is.

And then, when you finally get a chance to breathe – that’s how we’ve traditionally done temps, and there’s nothing there to pay forward. This pays forward directly, and we’ve written tools to take advantage of that. If we’re going into our traditional linear pipeline, we’re exporting everything – light positions, light colors, everything – back into Maya or Houdini, wherever it needs to go. The dynamics people have size, intensity and speed from the Niagara temps that we put in there, and that’s a big thing. Like, how many wedges have we all looked at of dynamics just spinning, going, “Maybe a little slower, maybe a little faster. Okay, I’ll do a new wedge. Let’s look at it tomorrow”? That’s very different from adjusting that Niagara parameter with the director: faster, faster, faster, or too fast, back. That’s it. That’s exactly what I’m talking about. Exactly how fast I want it to be.

I find that when they have that agency, they tend not to go back and forth. The old way, you approve the sphere, then you show them the dynamics and they say, “Well, it doesn’t look the way that I expected.” That’s what gets creators frustrated with the process and why they tend not to like it. This is different. I call it “genie out of the bottle.” Try to put the toothpaste back in the tube.

[37:00] Allan: Kind of like what we were talking about earlier with Flame, even the Quantel systems. The whole point is that you go in there, sit on a nice couch, kick back, and direct someone in terms of what you want. And it’s very different from desktop compositing, where it’s not happening in realtime. Typically, this was such a disconnected thing: either the studio would have an internal team, or you’d outsource to a dedicated previs company who’d do all the animatics. But a lot of the time, it’s about visually just getting the idea across. Whatever tricks they want to do, this forces you to actually solve those problems the first time around. And it means that everything just flows throughout the whole process.

Andrew: That’s changing, too. In some cases, like I said, we do the whole thing: we’re the supervisor and the sole vendor. And sometimes we’re a second vendor, and that’s starting to happen with realtime in this interactive visualization space, because we’ll get something from The Third Floor that’s been done in Unreal – they’ve done a lot of layout and a lot of blocking – and it needs to go to the next step. Then we’re getting it for maybe even Final Pixel, or we’re just doing the temps. They’ll give us their Unreal file, and it feels like a true passing of the baton. We can actually pick up where they left off and pay it forward. I’m looking at three pillars, and that’s the first one: the interactive visualization portion.

The second pillar is the virtual art department. The TV shows and streaming shows that we’re on all have LED walls now, and it’s a very different process. It’s asset development, it’s optimization, it’s blueprinting – being able to build a visual effects machine. We approach it that way, to give the customer Blueprint functionality they can really use to quickly and interactively access what they need. Whether it’s sky changes or weather changes, or being able to move and edit the scenery around, every different application of this has different parameters that need to be identified and put into the deliverable. So it’s a completely different way of bidding. It’s more of a mix between software development and visual effects delivery than a traditional visual effects delivery.

And a lot of the creativity that we brought to our asset development – character design and environment design – is really on display here. It’s allowing us to enter areas that we traditionally wouldn’t get into, like AR applications, which is kind of a new way of storytelling. It’s a new way to deliver and get visual effects out there, which is really exciting. And then finally, there is what we’re calling Final Pixel: using Unreal Engine as the sole and final render engine for composited visual effects, instead of a traditional linear CPU renderer like V-Ray or Arnold.

Now every project that comes into Zoic Studios goes through the determination process of whether it’s going to go through the linear rendering pipeline or the Unreal / Final Pixel pipeline. It feels like every week we’re converting more and more. There’s a bunch of super double top secret stuff that we’re doing right now, but in the very near future, you’re going to see a lot of Final Pixel rendering from Zoic Studios – which means taking that interactive visualization component and that virtual art department component, marrying them together, and rendering. We’ve been working directly with Epic to render images on the LED wall. Not everybody, not every show, can afford to build a Mandalorian-sized volume.

But there’s a lot of that going on too, which is realtime compositing using Unreal Engine and camera tracking. Every show needs rendered images to some degree, and Final Pixel rendering is where I feel the growth is going to be – the exponential growth. We’re seeing a lot of efficiencies, not just in the render times, but in having everything in one place, condensing and unifying the pipeline in a way where people work across multiple task areas. We have two types of people coming into the realtime department: new people who have never worked another way before, who are doing really well and teaching us what it’s like to grow up working in realtime; and people coming from other departments in visual effects who see this as a new way to creatively express themselves in their work. At the same time, we have generalized task areas. People tend to be more technical, or more asset-oriented, or more material-oriented. But the material people still do a lot of level design and layout and lighting. So everybody’s participating and collaborating in a way where the task lines are a lot more blurred. Honestly, there’s a lot of controversy about that in the industry, but a lot of artists are super psyched about it and love it.

[45:15] Allan: The problem with technology is that by deciding to do what we do, we have to embrace the fact that we need to adapt. You can’t just say, “Alright, I’m going to learn 3D,” and then be surprised: “Wait, we’re not using Indigo anymore? I’m not using NURBS for modeling anymore?” The whole point is for people to innovate and look at what’s coming down the pipe. And for me, I think there are people who get very comfortable being the one-trick pony. Everyone’s scared of change, but at the same time, it’s a chance for you to really look at the upcoming trends and be the one to jump on those before anyone else. And if anything, look at ways that can make your life easier, and don’t be scared of how it’s going to disrupt the way you’ve been doing things in the past.

Andrew: Yes, I wouldn’t be surprised if this becomes an industry standard shortly, though I don’t have a crystal ball. The pure economics of it from a visual effects standpoint, combined with the creative possibilities, make it something people will be looking at very seriously over the next few years. Every single facility that I’ve talked to [has told me,] “We’re starting up a realtime department.” Nobody is saying, “No, this is not happening. We’re not doing anything in this area.” Everybody’s doing it! Everybody’s got an LED wall test, everybody’s got their virtual art department. One of the benefits of working in the broadcast and streaming space is that we run through projects a lot more quickly, so we’ve already had the opportunity. We’ve already broadcast Final Pixel. We did several Final Pixel composited scenes for Superman & Lois, the new Berlanti Superman show, last year. And we’re continuing to do that and gaining a really solid reputation for being able to deliver Final Pixel.

Also, we know there’s a certain attitude of “I’m not going to try to render volumetric interactive clouds in Unreal just yet. I’m not going to try to do water sims in Unreal just yet.” There are a lot of other things, but the digital double technology in Unreal is quickly gaining in quality. And I think when [Unreal Engine 5] comes out, it’s going to accelerate even more. And 5 isn’t the end! My question is: What does 7 look like? And when is it coming out? Five years? Six years? What I’m saying is: Just look at video games! When the first PlayStation came out, we were like, “Wow!” We couldn’t even imagine what PlayStation 3 and 4 and 5 were going to be like.

[48:19] Allan: Actually, the first time I ever heard of Method back in 2001 was when they did a PlayStation 2 commercial, and it was for PlayStation 5. And I thought it was really genius! It was essentially saying, “This is what’s down the line. Here’s our roadmap. But right now, check out PlayStation 2.” Just like with the iPhone or VR. The first time I ever put on some goggles, I was excited about what would happen 3-5 years from now. It’s more about what is possible down the line.

Andrew: Yeah. And we’re looking forward to that too! I don’t know what this metaverse spatialization of the internet is.

[49:04] Allan: The big buzzwords.

Andrew: Yeah, those things could have a dramatic effect on what visual effects will be called in the future. There’s not really a lot of room for the traditional linear visual effects film pipeline in that space. It doesn’t work for our AR clients, for example. It’s all going to be Unreal. We just helped a company called Famous Frames with the Carolina Panthers AR project that got some nice buzz, with the Panther jumping around the stadium and grabbing a flag. And it was a very cool thing! Their mascot was interacting with the environment in a realtime tracked way. And we helped them with asset development, some lighting stuff and animation. That was a deliverable for us. They did all the conceptualizing; that was their baby. And we were able to deliver material to them in a creative relationship. In some ways, it was very close to what we do with our broadcast clients, but also very different. There was no other way to deliver that experience.

And one of the things that’s also transformational – and I don’t want to sound ungrateful for everything we’ve gotten from our linear vendors, because they’ve been amazing! When I first started in the industry, it was $100K – $200K SGI workstations and $75K Power Animator licenses with $35K add-ons. Stuff like that was not achievable. We couldn’t have built Zoic in that environment. It was the PC revolution, and the foresight of these software companies to get their prices in line and the hardware companies to give us the hardware we needed, that let us succeed. I will be eternally grateful for that. I don’t want to say that that’s bad, because they continue to be great partners for us. But the level of interaction that we get from Epic, and the support and interaction we get on the software and development side, is unparalleled. It is miles different from any other software company I’ve dealt with in the time I’ve been doing this.

[51:33] Allan: I think that’s a really valid point that most people don’t consider a lot of the time. For me, it’s about the relationship with the companies who are developing the software. You’ve got to look at how many people they have and what their reputation is so far. Because if you’re investing time and money into a software package, it’s about knowing that they’re going to be around 5 years from now, and really understanding their roadmap. Is that something you think a lot about? Obviously, Epic is a good example where they’re financially stable. They’ve been around for 25 years, at least in terms of being a game developer. They’ve got the foresight to swallow up the right people in the industry to ensure that they’re evolving rather than just following the trends and the tech. Then there are other companies who might be doing some cool stuff, but at the same time, they’re a shakier investment.

Andrew: There are two things I wanted to say there. The first thing is about Epic itself. One of the other things that we did during the pandemic was to fully convert our CG compositing pipeline across the board – across all of our software packages – to be fully ACES compliant. We decided that for quality reasons, we were going to do this. And as we were doing this, we were working with Unreal. How do they work with ACES? And whom do they have on their staff? The gentleman who invented ACES! That doesn’t surprise me at all! They were able to set up a personal meeting with us and our technical people to understand the process. That was such an important thing for us. There’s a lot of competition for us to keep our employees, and that type of engagement – where they’re that generous with access to their experts across the board – is inspiring.
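To give non-color-science readers a sense of what an ACES-style display transform does, here is a minimal sketch. This is not Zoic’s pipeline and not the full ACES specification – it’s Krzysztof Narkowicz’s widely used approximation of the ACES filmic tone curve, just to illustrate how scene-linear values get rolled off into a displayable range:

```python
# Minimal sketch of an ACES-style filmic tone curve.
# This is Krzysztof Narkowicz's well-known approximation of the ACES
# "look" (RRT + ODT), NOT the full ACES spec or any studio's pipeline.

def aces_film(x: float) -> float:
    """Map a scene-linear value to a display value in [0, 1]."""
    a, b, c, d, e = 2.51, 0.03, 2.43, 0.59, 0.14
    y = (x * (a * x + b)) / (x * (c * x + d) + e)
    return max(0.0, min(1.0, y))  # clamp to the displayable range

# Bright scene-linear values roll off smoothly instead of clipping hard:
for stops in (0.0, 0.18, 1.0, 4.0, 16.0):
    print(f"{stops:6.2f} -> {aces_film(stops):.3f}")
```

The point of a standardized transform like this is that every package in the pipeline – renderer, compositor, realtime engine – agrees on how linear light becomes a picture, which is exactly why a studio-wide ACES conversion matters.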

[54:21] Allan: With Epic, even 10 years ago, I remember a lot of my friends at ILM left and went over to Epic. I’ve been impressed with whom they decide to absorb. My friends at Quixel are a great example. If you look at YouTube, whenever Epic acquires a company, there’s always going to be a lot of negativity around it. But with Epic, those companies are in good hands. They know they’re going to be utilized the right way.

Andrew: You’re absolutely right! Epic is using their economic power and their vision in the marketplace. I’ve been really honored and privileged to work with them. This wouldn’t have happened for us otherwise! So I have to thank them for it. But the other thing they do really well is foster a real sense of community. For me, visual effects has always been at least 75-80% about the community of artists and that sharing of creative and technical information. Even before the internet, for all the folks that grew up in that era, we had a small community sharing tips and techniques amongst each other. I think community is important, and some of the larger software companies pay lip service to it. But look at how active the Blender community is and how excited they are. Epic has captured that lightning as well with their community spotlights, and they spend their resources communicating with the user base and the visual effects / game community all the time. There’s a conversation, a community among the developers. And I think that causes users to gravitate towards it. One of the other great pleasures of Zoic is that we’re able to integrate a lot of people who are new to the industry into our pipeline pretty quickly. And that pays dividends. 

And one of the most awesome experiences I’ve had is working with people who came to Zoic as an intern or [a junior artist], who have been here for 8-12 years and are now VFX supervisors in their own right. That’s an amazing experience that not everybody gets to have. I’m super grateful for that! 

One of the things that impacts diversity in our industry is access points, and the fact that Unreal is free helps a great deal. And I think that that’s important to the survival of our industry to bring people in from everywhere and to give people real opportunity. We still need to figure out some very difficult problems, economic problems – about how to get hardware to people that need it, especially with public schools. Like I said, as a California kid in a San Francisco City public school, we had six Apple IIe’s just sitting there. And I sat on those things all the time, day after day, hour after hour. And I don’t think that that happens anymore.

[59:43] Allan: Things like Google Colab are giving people access in a way that’s great. But you’re right! Access is always going to be one of those things that makes it more difficult, when you could have so much potential to do so many things. But if all you have is a pen and paper because of lack of access, it’s hard to really innovate or get on that bandwagon.

Andrew: Yeah. So there’s some work to do. There’s definitely education, there’s hardware access. But the fact that the software is available for free, and the community is so strong, and there’s so much information out there about how to use it, is definitely a step in the right direction. I appreciate that.

[1:00:22] Allan: When it comes to specializing, how impactful do you think it is for people to really learn the full spectrum before they decide to get boxed into a certain thing? My perspective on people’s careers has always been: Start as a generalist and niche down at some point. Then, as you move into being a CG supervisor or a VFX supervisor, you need to broaden out again. What’s your take on how people should start out?

Andrew: I agree with you 1,000%, and I think that when we’re working with new people, there are a lot of parts of the practical film process that are missing, that we try to educate people about. What’s depth of field? Why would you choose this lens over that lens? Lighting direction and falloff, and how that affects scale, and the issues around painting with light. Those things are essential. I’ve worked with people who struggle with this because they may not have gone to a film school, or they came into VFX from something else. They picked up some buzzwords along the way, but they don’t quite understand how it all fits together. And here’s what I think is a great exercise for any visual effects artist: Take the movies that have visual effects, grab a frame, look at it and describe what makes it exciting for you. Why do you think it’s good? And then try to figure out what they did. What conscious decisions did the artists who worked on this make to give you that feeling? And I think it has to be that general, because people can get wrapped up in technology, wrapped up in technique. That is the downside of all the YouTubers: “A YouTube video said that that’s the cool way to do it.”
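For the depth-of-field question Andrew raises, the standard thin-lens formulas make the lens-choice trade-offs concrete. This sketch is my own illustration (the function name and example numbers are hypothetical, not anything from the podcast): it computes the near and far limits of acceptable focus from focal length, f-stop, focus distance, and circle of confusion:

```python
# Illustrative depth-of-field calculator using the standard
# hyperfocal-distance formulas. All numbers are example values.

def depth_of_field(focal_mm: float, f_stop: float,
                   focus_mm: float, coc_mm: float = 0.03):
    """Return (near_limit_mm, far_limit_mm) of acceptable focus.

    coc_mm is the circle of confusion; 0.03 mm is a commonly
    quoted full-frame value.
    """
    # Hyperfocal distance: focus here and everything from roughly
    # half this distance to infinity is acceptably sharp.
    h = focal_mm ** 2 / (f_stop * coc_mm) + focal_mm
    near = focus_mm * (h - focal_mm) / (h + focus_mm - 2 * focal_mm)
    if focus_mm >= h:
        return near, float("inf")  # far limit extends to infinity
    far = focus_mm * (h - focal_mm) / (h - focus_mm)
    return near, far

# A 50mm lens at f/2.8 focused at 3m holds well under a meter of
# focus; stopping down to f/11 widens the in-focus zone dramatically.
for f_stop in (2.8, 11.0):
    near, far = depth_of_field(50.0, f_stop, 3000.0)
    print(f"f/{f_stop}: {near / 1000:.2f}m to {far / 1000:.2f}m")
```

Playing with the inputs shows exactly the intuition Andrew wants artists to have: longer lenses and wider apertures shrink the in-focus band, which is why lens choice is a creative decision and not just a framing one.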

There are a lot of people who can push buttons a lot faster than I can and work a lot more efficiently than I can, but who don’t have that conception of the overarching filmmaking techniques. If you’re a CG supervisor, you need to know what the compositors are trying to do. If you’re a compositor, you need to know what the lighting and CG artists do, so you can ask for what you need when you need a revision. So everybody needs that type of information, especially if you’re going to become a supervisor. And as far as the technical stuff goes, yeah, it’s probably a good idea to have a general working knowledge of even the departments you’re not in.

And if you’re going to be a supervisor, you have to know a lot. We’re pretty good at Zoic about being even-handed with our supervisors. But when they make that transition, that’s the first thing that comes up: How much do I need to know about this other department I’ve been interfacing with? And my thing is: You don’t need to know every button in Houdini to be a visual effects supervisor.

[01:04:36] Allan: You’re 1,000% correct! The more you learn about where things are coming from, the more you can speak to those people about what you might need, and not only communicate with them but also think ahead about how you could make their life easier. And as you move up, you’re going to need to learn how to work on set in terms of relationships as well. It’s always about the 360 of what’s immediately around you. Yeah, just to wrap things up: Thanks so much! It’s been really awesome to chat with you about all things visual effects.

Andrew: Yeah, I could talk about this stuff forever. I hope you can tell I’m super passionate. I’m grateful that I’ve had the opportunity to do what has been a lifelong passion, and talking to people about it is always fun.


I hope you enjoyed this Episode. I want to thank Andrew for taking the time to chat. Please take a few moments to share this Episode with others.

I’m going to do a few solo Episodes coming up. Until then –

Rock on! 


Click here to listen on iTunes!

Get on the VIP insiders list!

Download The Productive Artist e-book.

Allan McKay’s Facebook Fanpage.

Allan McKay’s YouTube Channel.

Allan McKay’s Instagram.

