Episode 293 — Unreal Fellowship — Brian Pohl

The Unreal Fellowship is a 30-day intensive blended learning experience designed to help experienced industry professionals in film, animation, and VFX learn Unreal Engine, develop a strong command of state-of-the-art virtual production tools, and foster the next generation of teams in the emerging field of real-time production. 

During the program, Fellows learn Unreal Engine fundamentals, model ingestion, animation and mocap integration, lookdev, lighting setups, and cinematic storytelling.

Brian Pohl is a Technical Program Manager at Epic Games. He helps artists and clients get the training they need to integrate the Unreal Engine into their creative pipelines. Pohl also acts as the academic Dean for the Unreal Engine Fellowship Program.

In this podcast, Allan McKay interviews Brian about virtual production and the tools Unreal Engine provides, as well as the Unreal Fellowship, a training program that teaches artists the fundamentals.



[04:40] Brian Pohl Introduces Himself and Talks About His Background

[12:07] Brian’s Transition to Epic

[14:00] The Definition of Virtual Production

[20:30] Current Challenges of Virtual Production

[26:31] The Role of Artists Within Virtual Production 

[42:30] Brian Discusses Unreal Fellowship

[45:19] Qualifying Prerequisites for Unreal Fellows

[52:08] Structure of the Fellowship 

[1:02:51] On Metahumans and Other New Technologies

[1:15:30] Configurations for Unreal Engine

[1:23:27] Additional Resources and Information About the Fellowship



Hi, everyone! 

This is Allan McKay. Welcome to Episode 293! I’m sitting down with Brian Pohl, the Technical Program Manager at Epic Games as well as the academic Dean for the Unreal Engine Fellowship Program. We talk about everything to do with virtual production, the Unreal Fellowship and where we are going with production. We get into the definition of virtual production, the role of artists in it and where the technology is going. In 2020 alone, Unreal Engine was used on Call of the Wild, Midnight Sky, Jingle Jangle, Over the Moon, Star Trek: Discovery, His Dark Materials and Westworld. 

In addition, I have a brand new free training out right now on building a personal brand and how to command the fees you want and position yourself as an expert. Check it out at www.Branding10X.com. I will be putting out some really cool stuff in April and May.

Please take a moment to share this Episode with others, as well as any information about virtual production and the Unreal Fellowship.

Let’s dive in! 



[01:11]  Have you ever sent in your reel and wondered why you didn’t get the callback or what the reason was you didn’t get the job? Over the past 20 years of working for studios like ILM, Blur Studio, Ubisoft, I’ve built hundreds of teams and hired hundreds of artists — and reviewed thousands of reels! That’s why I decided to write The Ultimate Demo Reel Guide from the perspective of someone who actually does the hiring. You can get this book for free right now at www.allanmckay.com/myreel!

[1:28:19] One of the biggest problems we face as artists is figuring out how much we’re worth. I’ve put together a website. Check it out: www.VFXRates.com! This is a chance for you to put in your level of experience, your discipline, your location — and it will give you an accurate idea what you and everyone else in your discipline should be charging. Check it out: www.VFXRates.com!



[04:40] Allan: Brian, thank you so much for taking the time to chat! Do you want to quickly introduce yourself?

Brian: Sure! My name is Brian Pohl. I’m a Technical Program Manager at Epic Games. I’m currently the Academic Dean for the Unreal Fellowship, a new program we deployed to train artists. I was a Technical Account Manager at Epic before moving to the training department. I acted as a client advocate, helping clients integrate Unreal into their pipelines. There were a couple of us. It was a good gig; it gave me a lot of opportunities to learn the technology and meet with clients. Before that, I worked as a Previs Supervisor for 15 years. I’ve worked at other houses: ILM, Sony Imageworks, Method Studios.

Brian Pohl, Technical Program Manager at Epic Games and Academic Dean at Unreal Fellowship

[06:20] Allan: Your background is more in previs. It’s a role in which you get to work more closely with directors. Before working with Method on Independence Day, was it clear that would be an area to migrate to?

Brian: Previs is about being on the bleeding edge. On Attack of the Clones, we were using off-the-shelf software. We tried very hard to make things in real time, but there isn’t much you can accomplish in OpenGL and Maya. It was pretty much gray boxes and gray shaded characters. We did get to the point of starting to do basic animation. Phantom Menace did its previs with Electric Image. As for Attack of the Clones, we did it in Maya. There were a lot of limitations. Everyone in previs always strived to be as fast as they could go. The best we could do was playblasting out of Maya, which wasn’t horrible. You put it into editorial and it was okay. As far as being on the cutting edge, we always looked for technology to try and speed things up and get as close to real time as we could. There was some early technology that attempted to do that. When I owned my own company, we tried to use a technology called MachStudio Pro. It was a real-time, game-engine-like solution that used GPUs. I also remember in the early days trying to use CryEngine and Cinebox. It looked fantastic but was a complete pain to get things into. It was much more complicated to get your models into the engine. Everybody waited for things to get better. We tried to use those two on Independence Day but it was still too clunky. We could get work done faster by playblasting in Maya.

[12:07] Allan: That’s cool! With you going to work for Epic, how did that come about?

Brian: Well, I had an in at Epic. Marc Petit contacted me to ask if I’d be interested in working for the company. I had worked at Autodesk prior to that. We worked on some cloud-based asset management software. I stayed there for about 3 years. Afterwards, I went to work in visual effects, at Method and a couple of previs houses. Marc asked what my availability was. I got the gig, [it was] contract based at first. 

[14:00] Allan: Virtual production has become a big buzzword. It’s such an exciting time. How would you explain what virtual production is?

Brian: I consider virtual production a spectrum of technologies. It’s not just one thing. If you tried to put it into one concept, it’s where the practical meets the virtual. You have the ability to work with practical equipment that influences things in the virtual world: that could be tracking your camera with some sort of optical tracking system and relaying that information into something like a game engine. Virtual production is the ability to merge both practical tech and virtual tech. 

[15:45] Allan: When did it become a thing? I was talking to the guys working with LED technology and they’ve been working with it for a long time. The Mandalorian was a big catalyst, but when did you double down on it?

Brian: Right. If you ask anybody out there, there has been an entire history of virtual production. It wasn’t until recently that we started to get display technology with a high enough pixel density for practical use. The Mandalorian was the first time I saw that tech put into practice. I remember seeing that set and no one told me what it was. I was walking into the soundstage and it seemed like a huge set. But then I walked in and saw that it wasn’t a set at all. There were some practical pieces, of course, but the background was being rendered in Unreal. It was a complete shocker to see! For me, there was a lot of preliminary setup for all of this: getting Unreal Engine to operate at a photorealistic capacity so that you could effectively place Unreal renderings in the background and shoot them with a camera, without seeing any kind of moiré pattern. As far as how far back, when did Epic start coming up with this? That would be a great question for Kim Libreri, our CTO. I’m sure he was working on this way in advance. I came on in 2017. I started hearing rumors about it. Unreal was just starting to integrate Sequencer into its toolset. You can’t really do things like The Mandalorian without it. It was around 2016-2017 that the word started getting out among us, employees. It blew me away!

[20:30] Allan: It’s such a game changer! It’s opening so many doors, especially during COVID. What have been some of the bigger challenges when tackling virtual production?

Brian: The biggest challenge right now is talent. Personnel! There aren’t many people who are used to working within a game engine, especially those who are used to working in VFX and animation. When I started learning Unreal Engine, it was a paradigm shift in thinking. I wasn’t used to the concept of having an entire pipeline in a box. I was so used to linear production methods and workflows. When I got into Unreal, it was tough to understand the concept of levels, sublevels and subscenes — all these ways that Unreal was saving data! I don’t even question that now. Why wouldn’t you work like that? So talent is a huge thing to overcome, and that’s where the Fellowship came in. We’re trying to train more and more of the workforce. The next thing is having to sustain a dual pipeline. With Unreal, at least in the early days, there has been a lot of evolution. You’ve got VFX Sups and Directors breathing down your neck on unproven technology. So you have the pressures of normal production, plus the pressures of having this new way of doing things and people not used to doing things this way. You have to bring on experts who are strong in game engine tech, but who then often don’t understand film production pipelines. “Why do you need to take so long to render? Why do you need to have your quality levels set so high?” There is a whole conversation that needs to be had, and it’s a slow process for some people.

[24:52] Allan: There has always been a sense of a disconnect between games and film. In games, you have to come up with everything you need in film, but do it within the limitations of the engine.

Brian: Sure! In the early days of Unreal, the ingestion process was a lot faster than I’ve seen before but it was still pretty rudimentary in 2015. You could import an FBX and it was pretty solid, but there were some things to consider. There were no easy ways to bring in model data and preserve the look and feel. The shader systems were different. It was laborious. That’s where Epic went into overdrive to make it easier. 

[26:31] Allan: One big question I’ve had for a while: Where do artists fit in? There are jobs being created right now. You don’t have previs right now, but previs artists need to be at production level and at final production. You’re doing final production level work in conception stages. How does that work in terms of the workflow?

Brian: Right. First of all, you’re right. In a traditional pipeline, you’re using classic previs. No matter what you made, it would never be taken into the final product. The quality levels just weren’t high enough, given the speed with which things had to be built. You didn’t want your characters to be too heavy. You’d have to do your animation, playblast it, take a look at it in real time, make your changes, playblast it again. There are these big loops. With previs, we’d do stepped keyframing. If you had time, you’d put in some animated movements. Most of the time, the previs coming out of that whole process was used for reference only. They’d produce an entire sequence out of that. Ultimately, it was that sequence that would be sent out for bidding to VFX houses. They would use those previs sequences to construct something better. It was an informative process and it helped the Director see what the film would look like.

In come game engines. They are an interesting development. You get the means to remove a lot of the limiters. You could achieve great imagery and clarity, and playback can be in real time. You could iterate on the fly, which made the decision making process a lot faster. It was a blessing and a curse. The better you make things look for a Director, the more they want to pick at it. Now, Directors are becoming more savvy. When game engines came onto the scene, we saw a whole shifting of the production workflow. Things you did in post, you were starting to do in production. Work began in pre-production. But what’s nice, in the end, is that the post-production process can be used for what it’s intended for: to make the image look more beautiful. There was a lot of [talk of], “Can we create assets in pre-production and carry them through the entire pipeline?” And the answer is yes! You can build things at a quality level that can be considered final. The only specific limitation, at least in the early days, was that the animation rigs were a little too simple. You had to have characters with less sophisticated rigs. There are better ways now to get more sophisticated rigs. And with MetaHuman, which just came out a few days ago, we can now effectively rig in a complex manner inside the engine. 

[36:21] Allan: All the digital humans look really amazing! Do you see there being dedicated departments like, “virtual production scene assembler”?

Brian: Definitely! The whole process of getting data into the engine is pretty much the same. It hasn’t changed. People will be working with DCC tools. That’s still largely the same. There are, however, some new geometry-based tools. We’re even getting the ability to do traditional-style modeling. Do I think we’ll be doing a ton of modeling in the engine? No. I think DCC applications have a big lead on modeling tools. We just need to be able to manipulate them. I see the advantage of it.

[38:33] Allan: I mostly mean when working on set, and you need operators as well. Rather than doing it in previs and borrowing the budget from pre-production. Can you pull a VFX artist off the street and have them operate within a game engine? 

Brian: Well, let’s just look at Quixel. Epic acquired Quixel (www.allanmckay.com/271) and all of a sudden we have a library of photorealistic, scanned geometry of environments. It looks great! The Quixel team is still scanning the world. Unreal Engine users get the benefit of that. Production designers just need to search for what they’re looking for and get something together. I think that’s a huge time saver, especially in pre-production. You want to be able to get something quickly, up and functional in 15 minutes for the Director to look at. That doesn’t mean we’re trying to put modelers out of work. This is a tool for rapid modeling. We still need modelers to build custom things, and now they don’t need to focus on small or routine everyday things. 

[42:30] Allan: I’d love to talk about the Fellowship. It’s so exciting that you have it, especially given the timing. You’re creating opportunities for everyone. Can you first talk about what this Fellowship is?

Brian: Of course! [When COVID-19 hit, suddenly people were] out of work. Epic put down the decree that this is the time for people to learn Unreal Engine. In combination with an act of charity, Epic could help. We created a program in which we pay people to learn Unreal. It’s an opportunity for us. Every graduate is an ambassador for us. Individuals get paid to learn. It softens the blow. It caught on pretty well. We’ve done 3 cohorts now. Our fourth is starting March 1st, 2021. Is this something that will go on forever? Unlikely. But for the immediate time, we’ve budgeted for it. It’s a scholarship program, basically.

[45:19] Allan: What kind of artists typically do you seek out? I imagine there is a huge demand. Who ideally takes advantage of this sort of thing?

Brian: Right. We did set up criteria for how we wanted to choose people. We gave preference to those who were unemployed. Now we are starting to consider people’s work conditions. 

  • We’d like for them to have 5 years of production experience. 
  • We’d prefer that you have 3-5 film credits on your IMDb page or you have a couple of episodics. 
  • If you’re a game artist, we’d like for you to have 2-3 AAA titles under your belt. 

We were not taking complete novices; a lot of the people coming in, we wanted to be professionals who wanted to get retrained on real-time technology. Out of our cohorts, 65% came from visual effects backgrounds. We also opened it up to people who were stage operators and film production folks. We had a couple of producers, as well as grips and gaffers. When you’re working in virtual production on an LED wall, we wanted to get them trained as well. We did not, however, accept folks outside of media and entertainment. We decided to wait. It’s our intention to hold training for those other markets as well. 

[48:35] Allan: There are only so many virtual production stages around the world. What type of roles are people able to walk out with? 

Brian: We kind of see it in phases. What the current cohorts are going through is phase one: it focuses on fundamentals, buttonology, linear content creation, animation creation and general operation of the engine. But it’s all angled at virtual production. We have a couple of guest speakers come in to speak about virtual production and what can be possible with this technology. If you’re in lockdown, it’s hard to prescribe what people need to get the technical hands-on experience. But if you have a workstation with a current graphics card, you can do this program. We didn’t want there to be a barrier. We’ve actually also provided people with virtual machines in the cloud. We’d have them download the Teradici software. As long as they had a good connection, they’d get decent performance. We had people take advantage of that, and some would even request it in addition to their own machine.

[51:32] Allan: I’d want to use your machine any day!

Brian: I had a couple of people who had a MacBook Pro with a larger screen monitor and they would run Unreal in the cloud.

[52:08] Allan: How is the course structured? How many hours of training and how are they used? 

Brian: We ask that everyone is able to commit to 6 weeks of training. Our first program was a full 6 weeks, without any prep for it. We just threw them into the deep end. We had 15 people and a higher bar (we wanted 10 years of experience). What we moved to in cohort 2 was 4 weeks, with 8-hour days of learning. We eventually settled on 5 weeks, with the first week being prep and some preliminary courses on our online portal. We get them into Slack and they meet their mentors. Weeks 1-5 is where they hit it.

  • A typical day would start out with a coffee hour in the morning and we’d have people taking this from all other timezones. We’d want to use that hour to gather. 
  • Then after that, they move into 2 hours of instruction or class. The way we’re scheduling this cohort is that they have a 2-hour block from 8:00 to 10:00 a.m. PST. 
  • Then, they’ll go into a one-hour instructor’s lab where they can ask questions. 
  • Some days, we have classes in the afternoon. 
  • We also made sure to have weeklies and dailies. It’s a smaller group dynamic that mentors can spread out. 
  • We have guest speakers in the afternoon. 
  • We end the day with a happy hour.
  • On Fridays, we do weeklies. Each mentor recommends 2-3 people from their teams to present. That was inspirational. 
  • We also set up times for our technical artists to hold labs, twice a week for 3 hours. Fellows can ask anyone questions on how to do things, based on the instruction they had or for their own projects. That’s the secret sauce. You’re going through this program with a mentor holding your hand. It’s the combination of their skill sets and the community. 

[59:10] Allan: With the March intake, how many people do you typically accept? What’s the headcount?

Brian: Each cohort is a hundred people. If you start doing more than that, it gets too chaotic. This term, we’re doing a bit different ratio: 20 fellows per 1 mentor. But that mentor will have 2 technical artists working with them and a subject matter expert from Epic. And those experts will rotate every week between the teams. That’s what you can’t get enough of. 

[1:00:55] Allan: If you think of Sony or ILM, you would have a mentor in the beginning. I think in a lot of ways, you’re taking it to a whole other level! You can’t fail with all these resources!

Brian: I think this coming term is going to be really interesting. Now, they’ll have access to MetaHumans. I was amazed at what they did with Echo, the female character we had in past cohorts. She’s more stylized, or CG-ish. MetaHumans are closer to photorealism. 

[1:02:51] Allan: I was just chatting with Doug Roble at Digital Domain (www.allanmckay.com/290), who is setting up their digital humans, about his predictions on where technology is going. I was joking that the mirrors on our desks will be replaced with computer cams.

Brian: Oh, yeah! By using an iPhone to do facial capture, you can put that onto a character. We can go down the rabbit hole of keyframing versus mocap. It’s still a performance! I think some animators will adopt the technology, while others will do keyframing. 

[1:04:32] Allan: It’s basically looking at creature animations versus more stylized animation. There will always be extremes. In terms of partners, can you list some of them?

Brian: This is just a sampling of projects that used Unreal. These are the projects that have been using it since 2016. I can give you an insight. Some of the 2020 projects we can speak about are: Call of the Wild, Midnight Sky, Jingle Jangle, Over the Moon, Star Trek Discovery, His Dark Materials, Westworld. These are just from 2020. They’re using Unreal Engine in different capacities. There is a project called Earth to Ned. Ned was an animatronic alien character. It was a talk show host scenario. They used a virtual character in the background and they were driving it live while filming. All the previs houses are using Unreal right now for an array of different projects. Some are doing in-camera visual effects. Some are using their virtual art departments. There is quite an array!

[1:08:02] Allan: For you, what technology are you really excited about?

Brian: I was definitely keyed up about the MetaHumans. With Cubic Motion and 3Lateral as part of our family, there is a lot of technology there that can create near-photorealistic humans. There are some examples that are better than others, especially with overcoming the uncanny valley. All of this will continue to improve, and the use of virtual humans will continue to grow. It’s a web- and cloud-based creation tool, and you can create whatever you want. That’s a huge one for me! I’m thinking back to my previs days. Now, we can dial in a human in a couple of hours, inside your project. There will be some time for the cloud to process your request. It’s not instant yet, but your feedback will be. As far as other technologies, Control Rig is another big one. It’s going to change how we use game engines. It’s a whole infrastructure of technology that gives you a chance to create a rig for a character that can be influenced by the subsystems of the engine. You can have a character that physics are acting on, but you can still keyframe. You can get a result that’s less time consuming and laborious. You can produce animation that will be much more striking. It will be impressive. 

[1:12:50] Allan: In terms of how open ended Unreal is, looking for complex solvers (like cloth or hair), do you see there being custom solutions?

Brian: I would say that’s a possibility. Simulation isn’t my area of expertise. There is Nanite, and there is Lumen. These are two technologies that will be in Unreal Engine 5. That will change a lot. Production designers will go crazy. And then of course, we’re still in the process of integrating Chaos. You’re going to have interesting ways of using Chaos to drive cloth simulations. I would say those have to be the biggest tech improvements on the horizon. 

[1:15:30] Allan: What would be the ideal specs to start creating environments in Unreal? In terms of RAM, CPU, etc?

Brian: Sure! A classic configuration for a lot of our artists is typically 96 GB of RAM, SSD drives, and a 2080 Ti is probably the base now as far as GPUs. If you’re going to be doing professional work (to integrate and display), you’re going to have to pick up an RTX Quadro card and the sync card that goes with it. You need the ability to synchronize your displays. An i7 or an i9 processor is a great choice. A lot of people are going with Threadrippers these days, and for good reason. There are some other pieces of technology you’ll have to consider for mocap. Get yourself an iPhone for facial capture. The V-Cams have improved as well; you can customize them now. You’ll want to pick up some VIVE controllers. If you’re going to do mocap, you can get yourself an Xsens suit. 

[1:19:17] Allan: It’s pretty amazing, all the technology we have these days! Do you think that the iPhone 12 (with LiDAR support) is the future of virtual cameras?

Brian: I wouldn’t doubt it! I’m sure that all of our developers are looking into it. I could theorize what might happen but right now, most of the focus has been with facial capture tech.

[1:20:24] Allan: I wish I’d known Apple was going to drop something like that. 

Brian: The first point of application would be using the LiDAR on your phone for a quick scan on set. You can export that directly into the engine for previs purposes. I know there are a lot of workflows that are using LiDAR devices. It’s already being done. Set preservation is a big thing. You capture the set, and if you need to make changes, you’ve captured it. You have a scan of it. You can rebuild it if you need to. 

[1:23:27] Allan: This has been amazing! Where can people go to find out about Unreal Fellowship? The main thing is how can someone apply for the Fellowship?

Brian: Well, the March class is full, but if you go to www.UnrealEngine.com and go down to Education and Online Learning, you can get started there. There is a page for the Fellowship as well: www.UnrealEngine.com/en-us/fellowship. You can get a basic understanding of what the Fellowship is. You can fill out your name and email address to express your interest. I believe there is another application process coming up soon. The initial submission process drew over 6,000 applications. We’ll reopen the process to get a fresh intake of applicants.

[1:26:00] Allan: Do you have any indication when the next one would be opening?

Brian: The fifth cohort — and these aren’t locked-in dates — is looking to start around May 10th, 2021. The summer fellowship would be in late July and the next one would be in October. When we open the application process, I’m not exactly sure. 

[1:27:33] Allan: Thank you again, Brian, for taking the time! This has been amazing and the Fellowship is a brilliant idea. This will be the future. Going to the source of it all is amazing.

Brian: Thank you, I appreciate that! 


I hope you enjoyed this Episode. I want to thank Brian for taking the time to chat. If you have any questions, please email me and I will pass them on: [email protected].

I will be back next week to talk about the importance of taking risks. Until next week —

Rock on!


Click here to listen on iTunes!

Get on the VIP insiders list!

Download The Productive Artist e-book.

Allan McKay’s Facebook Fanpage.

Allan McKay’s YouTube Channel.

Allan McKay’s Instagram.
