Episode 351 – CBS VFX – Ava DuVernay’s ONE PERFECT SHOT

by Allan McKay  - May 17, 2022

[smart_track_player url="https://traffic.libsyn.com/forcedn/allanmckay/EP351_CBS_VFX.mp3" title="351 – CBS VFX – Ava DuVernay's ONE PERFECT SHOT" social="true" social_twitter="true" social_facebook="true" social_gplus="true"]

 


 

When Oscar®-nominated, Emmy® award-winning filmmaker Ava DuVernay teams with some of cinema’s most noted directors in the new HBO Max original docuseries One Perfect Shot, movie lovers will see dazzling, unprecedented views inside memorable shots from film history. The series provides a never-before-possible perspective on filmmaking, allowing directors to step into their favorite shots and take viewers on amazing 360° views of their visions. DuVernay and her production company, ARRAY Filmworks, selected CBS VFX (www.CBS-VFX.com) to produce these visually breathtaking moments.

Created for television by DuVernay and inspired by a popular Twitter account of the same name, each episode of One Perfect Shot arms one acclaimed director with an arsenal of visual tools to pull back the curtain on their most iconic shots. Filmmakers share their obstacles, challenges, lessons and triumphs as they detail the road to their one perfect shot. The series features prolific filmmakers Patty Jenkins, Aaron Sorkin, Kasi Lemmons, Jon M. Chu, Malcolm D. Lee and Michael Mann, as well as blockbuster films including Wonder Woman, The Trial of the Chicago 7, Harriet, Crazy Rich Asians, Girls Trip and Heat.

Making the incredible images in One Perfect Shot required CBS VFX to use more than 25 years of visual effects innovation. Much of the work integrates the visual effects company’s proprietary Parallax virtual production system, which brings three-dimensional technology and an intricate level of detail to virtual locations. Combining Parallax with new tools in the Unreal Engine from Epic Games – including real-time rendering and ray tracing – led to remarkable production flexibility.

The digital production techniques from CBS VFX incorporated into One Perfect Shot represent an evolution of the work the visual-effects studio has done to replicate real-world locations in virtual sets and improve virtual production – which has been particularly important during the challenges of the pandemic.

CBS VFX has developed groundbreaking technology and created visual effects for multiple series across many networks, including Apple TV+’s upcoming Roar, The Offer for Paramount+, This Is Us on NBC, Netflix’s Dead to Me, and Big Shot on Disney+. CBS VFX maintains a dedicated virtual production soundstage, and its team is available to television production professionals for live demonstrations of virtual production, digital set extension and realtime pre-viz technologies.

In this Episode, Allan McKay interviews Craig Weiss, Executive Creative Director at CBS VFX, and Jim Berndt, Head of Virtual Production at CBS VFX, about their work on One Perfect Shot, created for HBO Max by Ava DuVernay; the creative process and the challenges of short deadlines; the promise of virtual production and realtime; the future of VR; and their work on some technologically innovative projects.

CBS VFX: https://cbs-vfx.com

Craig Weiss’s LinkedIn: https://www.linkedin.com/in/craig-weiss-8763ab50

Jim Berndt’s LinkedIn: https://www.linkedin.com/in/jim-berndt-432b00

 

HIGHLIGHTS:

[03:25] Craig Weiss and Jim Berndt Introduce Themselves and CBS VFX

[05:48] The Story Behind One Perfect Shot

[12:45] Behind the Scenes of Creating the Show

[19:58] The Promise of Virtual Production

[24:45] The Six Films and Directors Featured in One Perfect Shot

[26:16] The Creative Process and the Challenges of Short Deadlines

[34:29] Craig Discusses the Development of Parallax

[41:48] VR Experience of Stranger Things for Netflix

[50:25] The Future of VR

 

EPISODE 351 – CBS VFX – AVA DuVERNAY’S ONE PERFECT SHOT

Hi, everyone! 

This is Allan McKay. Welcome to Episode 351! I’m speaking to Craig Weiss, Executive Creative Director at CBS VFX, and Jim Berndt, Head of Virtual Production at CBS VFX, about their work on One Perfect Shot, created for HBO Max by Ava DuVernay. We talk about the creative process and challenges of short deadlines, the promise of virtual production and realtime, the future of VR and so much more! 

I’m super excited for this one! We get into so many cool things, including their Emmy-nominated VR Experience of Stranger Things for Netflix.

Let’s dive in! 

 

FIRST THINGS FIRST: 

[01:22]  Have you ever sent in your reel and wondered why you didn’t get the callback or what the reason was you didn’t get the job? Over the past 20 years of working for studios like ILM, Blur Studio, Ubisoft, I’ve built hundreds of teams and hired hundreds of artists — and reviewed thousands of reels! That’s why I decided to write The Ultimate Demo Reel Guide from the perspective of someone who actually does the hiring. You can get this book for free right now at www.allanmckay.com/myreel!

[1:01:37] One of the biggest problems we face as artists is figuring out how much we’re worth. I’ve put together a website. Check it out: www.VFXRates.com! This is a chance for you to put in your level of experience, your discipline, your location — and it will give you an accurate idea what you and everyone else in your discipline should be charging. Check it out: www.VFXRates.com!


 

INTERVIEW WITH CBS VFX

[03:25] Allan: Thanks both of you for coming on the Podcast! Do you want to quickly introduce yourselves?

Craig: Thank you for having me, Allan! My name is Craig Weiss. I’m the Executive Creative Director and VFX Supervisor at CBS VFX, which is a division of Paramount.

Jim: I’m Jim Berndt. I lead the Virtual Production Department at CBS VFX.

[03:49] Allan: Can you talk about CBS VFX, both some of the history and the stuff you’ve done in the past?

Craig: Yes, so I’ve had this group for many moons, going on about 25-26 years. I was 7 years old when I started it. (I’m just kidding!) Our core business has been visual effects in broadcast, before streaming. We’ve been able to ride that streaming wave, which has been a great thing for our industry as we see this uptick in VFX. Our core business is visual effects for streaming. We’re working on some interesting projects coming up. One is the story of The Godfather for Paramount+ called The Offer. We have a project on Apple TV+ called Roar. We do other shows like This Is Us or Hacks. We’ve done a wide gamut of shows. We got into virtual production probably 7-8 years ago, pre-LED walls. That’s how we segued to this project.

Jim: It’s been a great adventure so far! I started as an FX animator. I remember seeing some of your tutorials and thinking, “Maybe one of these days, I’ll move to the Max world.” We’re pretty much a Maya, Houdini shop. The nice thing about working for this department is that there are always opportunities to grow and expand. From FX animation, I moved into motion capture, to VR to virtual production. 

[05:48] Allan: That’s great! I’d love to talk about One Perfect Shot. How did that project land at CBS VFX?

Craig: It was really interesting! I got a call because we had done a bit of work in virtual production. The producer got my name, and initially it was more of an exploratory, consulting session of looking at technologies and how we were going to be able to do this. They started to explain the project to me and I said, “I’d love to throw my hat into the ring.” When they explained the project, it was about using our technology to immerse the director in their one perfect shot. The goal from the beginning was to create something in realtime. It wasn’t the traditional pipeline because it’s a documentary. The way it was shot didn’t allow for that linear pipeline. It was a more realtime experience for the director to be immersed in the environment and for us to be able to see that environment. We needed to block it out with the director, [to know] when they’re behind objects. So we had a lot of technology that allowed us to place the director in a 3D space and accurately move in front of and behind different pieces. They showed us a few frames from different movies, like The Dark Knight (with Heath Ledger standing on the corner), and said, “What would you do with this?” 

We went on a holiday break and decided to just go for it, to see what the pitfalls would be. We went and built that scene in Unreal. They brought in Stephen Rosenbaum as a consultant. I got on a call and pitched that idea, showed how the camera moves around. That was the thing that opened everyone’s eyes: “Wow! There is a lot of potential here.” It was more than they anticipated. The original pitch was to use Weather Channel technology to have a director walk through. We could do better than that! Once we got onboarded, it was about figuring out what directors they could get. That was an interesting challenge. It all came down to schedules and booking because they were trying to shoot it within a block of time. You have to have a miracle to line up those schedules. It was a revolving door of films. It was a bit of a wild ride! Once we got going, it was a really great experience!

[09:27] Allan: Jim, do you have anything to add?

Jim: We had a space stage and we invited all the creators to come by to see what the show would look like. It helped convince people. They were talking about placing a director inside the City of Ahmedabad. It was in VR so they had the ability to walk around the objects. Demonstrating that sold them on how the technology works. 

[10:23] Allan: I love the idea behind it! Such a unique concept! What was the inspiration behind One Perfect Shot?

Craig: It’s based on a Twitter account. 

Jim: You could share the best directors and best movies. It was the perfect show. It really boiled down to scheduling. There were many directors and films that were talked about. For us, it was an opportunity to get involved with some of the best movies and talented filmmakers. The list was very long. And they had to make sure that all the rights for the movies were approved. There were so many episodes we could’ve done! The few that we did were mind opening. You find tidbits about what the director was dealing with on the day of the shoot; they share the inspiration and the technical difficulties. It was the perfect show for us to be involved with! 

Craig: It’s really for the cinephiles and hardcore cinema fans that would go in there and post what they considered the one perfect frame from a movie. That is really what the inspiration was. And Ava DuVernay, who is a filmmaker and writer, adapted this for HBO Max. And that’s how it came about.

[12:45] Allan: Which visual effects innovations were used on the project?

Craig: For us, it’s really about the ability to create in realtime instead of a traditional linear pipeline. All of this was about realtime rendering and the ability to re-render that. We had a 2-week turnaround. We couldn’t treat this as a traditional VFX show. There wasn’t enough time. We had the time upfront to build the asset. But once the asset was ready, it was a really fast turnaround. A lot of what we built for this had a dual purpose. We had that asset running on set and then we had to ray trace that asset in Unreal for the final product. With some environments, we were able to take advantage of some of the Unreal stock assets (some of the environments of grass and trees). A lot of these had to be matched. So they were built in Maya and brought into Unreal. Unreal 5 has the promise of unlimited polys, which is amazing! But this was running in 4.27, so we had to be really careful about our scenes and how big they were. Could they run in realtime? Could they ray trace? A lot of work went into that development. 

Jim: Because of the scheduling issues that we had with the show, we didn’t have a lot of advance notice on the environments that we’d be given. If someone said, “We want to have Pan’s Labyrinth,” we had to actually have all this lead time to create the environment with our 3D team. This was one of the reasons that early on, we decided that LED walls wouldn’t be the perfect solution for this, because of the lead time. What you shoot is what you get. They wanted the director to sign their name and walk around the environment, pick up objects. We tried to build something that could have a dual purpose, but we knew that we had to push the edges to the max and expect no hiccups. What we ended up having to do is create an environment that would be 85-90% of what it would look like. We decided to render the entire thing using Unreal Engine just because of the fast turnaround. We also had to have time for revisions. It was a pretty intense technical challenge in the time that was given to us. In the end, they were happy to cut with all the rough material. It was a really good looking temp comp. Everybody was pretty impressed whenever they saw [the preview of the environment] in video village.

[16:49] Allan: That’s so cool! Has there ever been any hesitation toward going with Unreal? Was there ever that moment?

Craig: We’ve been doing this for a while with realtime on-set compositing, with having the background and going into post (with Arnold or V-Ray) to fine-tune and render with depth of field. But for this, we knew we didn’t have the time. There was a little trepidation. For us, as VFX artists, you always want to push it to the limit. What we decided, and what worked for this project, is that we weren’t trying to fool the viewer and say, “You’re back in this movie.” The goal was to reach a stylization that would work and allow the director to feel immersed in it. It was about finding that balance of what looked really good and told the story, and captured that moment of that frame in the film. Which helped us get to the place where we could be successful and not fearful that we couldn’t get it running in realtime.

Jim: We’ve done other projects in Unreal. We looked at other options and custom solutions out there used mostly for sporting events. We had to settle on one. There was enough support that we settled on Unreal Engine. But it [was] an extensive setup to make sure it fulfilled everything that we wanted it to do, from mapping every lens to doing a tech rehearsal. You only have 6-8 hours with the director. Even though it’s a live event, what you capture there is what you’ll use on the show. So Unreal was a pretty stable solution for us. 

[19:58] Allan: It has definitely changed things a lot. Even from a more of a budget breakdown, where what’s now in pre-production carries over to post-. It’s all merging.

Jim: It needs to be the right project. There are so many different considerations. A lot of the stuff we did in the past had to do with creating locations in LiDAR so that they could be shot at any time inside a stage. It is pretty amazing once you see what you can get done. Depending on the project, there is a solution. For this one, it would’ve been interesting to have an LED wall, but would that open different issues for the turnaround that we had?

Craig: We did explore it but so much of this is having the director walking head to toe in the environment, which means that the ground would have to be dressed right. A lot of these shots are in a wide. [There has to be an] ability for a director to walk in front and behind. With an LED wall, there are challenges with that. And the turnaround time! LED is amazing but there is still some value in green screen for sure! It hasn’t gone away completely. What I found exciting, in using the Engine for so much, is that you can see realtime effects on the horizon. Once you use it a lot, it’s hard to go back. Some of the narrative stories also had supplemental stuff that we would shoot for the sequence. So the director would be talking and then they’d cut away. We brought in the showrunner and the editor and did a quick session on stage with a virtual camera. They went through the story beats. Whether we were doing it handheld or keyframed, we were able to do that in realtime with creatives. The promise of virtual production, and what we did here, is that you build a bridge between post-production and production. It used to be that you’d shoot, talk, go away, come back and edit it together. Now you’re inviting them into the process: They get to see the stuff and participate, without waiting to get a render back. That part of it is such an exciting part of visual effects: to not be the guys in the dark room at the end — but to be on set participating in the creation of that imagery. 

[23:44] Allan: You’re absolutely right: Virtual production is great when you have the prep time. But if it is run-and-gun, it’s not ideal. 

Jim: It’s always tempting to push the limits, to get the best looking environment you can. With creating the characters to be used in each of these episodes, we had to think about using MetaHumans, or doing something fancy. In No Man’s Land, the explosion is a Houdini asset brought into Unreal. You have these layers of complexity. You have to choose your battles.

[24:45] Allan: Craig, which films and filmmakers were featured in One Perfect Shot?

Craig: It was 6 films and 6 directors:

  • Patty Jenkins for Wonder Woman;
  • Kasi Lemmons who directed Harriet (which was not a big VFX film but a beautiful story of Harriet Tubman);
  • Jon M. Chu who directed Crazy Rich Asians;
  • Michael Mann with Heat (which was a fan favorite for everyone who worked on it). He came on set so prepared with his notes! When we were working on that, we tried to use his information as much as we could. Michael created the battle plan and it was a blueprint of where every car was going to be. It was a hidden treasure!
  • Aaron Sorkin for The Trial of the Chicago 7;
  • Malcolm D. Lee for Girls Trip.

 

[26:16] Allan: Were there any challenges that stood out?

Craig: Typically when we’re on a green screen stage, we’re on a level floor. We had a 15 x 15 area for the director to walk around. Some of the challenges were that some of these environments weren’t on even ground. So if it were Harriet, she might have been on a hill. We had to think that through in terms of how the director would walk the environment. And we came up with a compromise. In VFX, you have to match the lighting of the environment to the foreground of your green screen. But this is a documentary. It was a happy compromise. We didn’t want dark circles around the actors’ eyes. And you’d think you’d have 6 weeks of pre-production. Here, we had 1-2 days before with the director and the DP, and we had to go through the story beats without the guest director. Then the director would show up and we may get 3 hours. Then they’d step into our world and we had to nail everything! It felt like a live broadcast because we weren’t going to get the director back.

Jim: We had to light the environments before, to make sure there were no hiccups. We had technical rehearsals, which were a godsend. When the director came, everything had to work. Otherwise, we’d have to deal with that horrible saying of “fix it in post”. It has been a really good experience in the sense that we were able to use most of the tracking data to our advantage. And because of that, we were able to have a fast turnaround and be able to revise shots. The formula was the correct one for this specific project. The data would be captured on set and processed in Unreal within a couple of days after. We did a lot of shots.

[31:15] Allan: Could you walk us through the process, from the moment the project came in?

Craig: There were options. Once they figured out the director and the film, there were so many frames to choose from. They didn’t get that shot until pretty late in the game. How complex was that frame? For No Man’s Land, we had to build a trench with sandbags, soldiers, mud, and reflective water. For Chicago 7, we had to rebuild the park. Heat was a good example. Those are two really well known blocks of LA. We had to match every building. Once we knew the frame, we did a complete breakdown. Some of these films were made 25 years ago. You had to do your own research. We were fortunate to have Michael send us some stuff. But other times, we were looking on Google. We did a detailed dive into the elements.

Once we knew what the environment was, we had to settle with the showrunner on where the director would be. Where is that sweet spot? It was a matter of doing the research, building the environment and figuring out where the director is going to be.

Jim: You’re as good as your teammates. We had a dedicated team that was trying to beat the deadline. We’d have to start playing with level one versions of each environment just so that we could see something. It was crunch time for these guys!

[36:29] Allan: Can you talk a little bit about your proprietary virtual production system, Parallax?

Craig: A lot of that was built from an early desire. We started in a more traditional 2D, and then 3D and matte painting were introduced. We’d model the geometry and provide it to a matte painter. They’d light and paint over it. That was the evolution of environmental matte painting within broadcast. About 9-10 years ago, I was the guy on set telling the director, “You can pan or tilt, but don’t dolly it.” These were budgeted shots. That part was frustrating because you’d have to explain it. A lot of what we built in Parallax was from the desire to allow that director the freedom. Early on, in the beginning of Mari and texture projection, it was like hammer and chisel. We started in the beginning of trying to figure it out. We realized we got 85% there. Then other tools came up and we started to push it. We worked with some third-party vendors that have developed tracking systems and lens distortion tools to help develop the pipeline. We wrote a lot of stuff that glued everything together. We’d have a version that would go into Unreal Engine and run in realtime on set. Sometimes you’re doing 100 shots. That was the genesis. Now with Unreal 5, we may not go into post at all and finish everything in camera. That’s the LED wall. It’s coming and it’s exciting!

[38:43] Allan: Jim, I know you started working with Unreal previs. How has that disrupted the process?

Jim: I think that the main advantage is that you can explore with creatives right there. It depends on the project. We have a mocap stage and we were merging the environment with actors. It is the pipeline, how everything goes together. What makes a difference for us? We play with some great technology. It was almost photorealistic. Then you’d get all the data. In Unreal, with the lens distortion information, everything looks almost perfect. I think that what a lot of the creatives like is being able to point the camera at the green screen, and they can see where the tower would be, [for example]. You have a switch on the monitor. It helps them come up with these cool shots. That’s where you get the most satisfaction, because it’s helpful on set. 

[41:48] Allan: You’ve worked on the experience for Stranger Things. Can you talk about collaborating with Netflix?

Craig: This was always part of the path of what we were doing in LiDAR. Netflix came by and saw some of our technology and they were interested in creating the experience. That was really fantastic for us! It was less technical and more of a storytelling process. That was the early days of VR. We knew we had to go to the set. The takeaway was the storytelling process. You can stand someone in an empty room, but how are you going to direct them via VR where to go? Version 120 of that…

Jim: 183!

Craig: 183! That was an exciting time! VR was struggling because of revenue models. Then COVID hit. I’m seeing it will make a resurgence. Unreal will elevate the experience for the user. Fasten your seatbelts! We all know the promise is there. It’s just a matter of when. We are going to start on that wave again.

Jim: That was a fantastic project! We were able to combine so much different technology. We’d bring study groups of people. We were using our tracking system. We created 4 different versions of the experience. One of the things they wanted to try was cordless VR. Another version had a guy inside the experience being the Demogorgon. They didn’t use that one!

[45:32] Allan: You’ve brought up COVID. What do you think are the long term changes it has introduced to the industry?

Craig: I think on two fronts it’s been positive: When we were building the Parallax pipeline and bringing directors in, it was met with raised eyebrows. We understand the value of being on a real location. But sometimes with schedules, there are certain limitations. Then COVID hit and the phone was ringing off the hook. “Can we come to your stage?” There is a saying, “If you wait long enough, unexpected things happen.” COVID hit and it is disrupting and bringing technologies forward. It has accelerated the LED wall. It made creatives be more open to it. That was a big positive and it’s here to stay. This is the world we live in. You’re always going to choose. If they can go on a real location — go for it!

The other thing that happened during COVID is remote workflows. We got involved on something fun for the Drew Barrymore Show. They built this beautiful set, spent a ton of money, but now she can’t have a live audience. She is in New York, and we bring the audience to LA, put them on green and composite them in realtime. We were the first out of the gate to do that. We had two synchronized remote heads running through fiber so there was no delay. The guest would walk out and then walk onto the stage with her. That was something we developed through COVID. That would’ve never happened. Now we’re going to do that for a film launch. We’re going to use that same technology. That becomes an option for people.

[49:25] Allan: I always think of the Avengers because you had all this A-list talent together. With One Perfect Shot, it’s hard to get access to everyone. But being able to bring the set to them changes the game plan a lot. 

Craig: What’s neat and exciting for us is putting on your VFX cap and using that to solve other problems.

[50:25] Allan: Where do you see VR heading?

Jim: It’s a great tool for creatives to explore the environment before they get on set. It brings up a new set of challenges. On set, you have control over hardware. I see it as a dual thing: One is exploring the environment. You can send the headsets to a director. And then there is the collaboration part of it. You have artists meeting up in VR environments to create one of these perfect shots. You have the throughput in Unreal 5 and you’ll be able to handle heavier sets. And you have the ability to collaborate and revise models, and plan what they will build next. That can be a really useful tool!

Craig: I have two thoughts. In our industry, I’m working with a technologist. He’s built his own software where he’s going into point clouds in VR; and once you do that, you will never go back. VR holds some of that promise. What does that look like? What’s the next version of that? Maybe we’ll use ZBrush and be able to immerse ourselves to do the editing. On a consumer level, the topic is the metaverse. I’m on a panel at our company and we’re seeing how we can expand. We’ve all seen Fortnite and the early beginnings. Now, we’re seeing MetaHumans. I think the advancement of technology is moving so quickly! It’s going to expand the VR universe. It’s not going to be traditional streaming. Be it 5 or 10 years from now, it will involve all the people in the VFX industry. 

[54:43] Allan: For me the first 20 seconds of VR were so exciting. I can’t wait to see where it goes!

Jim: I’ve seen a lot of good uses with it. Friends meeting up and creating their storyboard in a common environment.

Craig: We’ve seen what Nvidia is doing with AI. We can use it to enhance the components of a metaverse environment. 

[56:17] Allan: Nvidia is looking at where we’ll be in 20 years. They’re looking at weather patterns to recreate in their environment. Is there any technology you have your eye on?

Craig: The realtime stuff to me is really seductive. It creates a better experience as a VFX artist. You feel like you’re on set. It will continue to bridge the gap. VFX will be on the forefront of everything because of the economics and time, and the level of realism we can bring. And then, we mentioned what Nvidia is doing with AI and how AI is assisting with VFX. If we could make it about a creative experience, you’d have more fun creating.

Jim: For me, digital doubles and location capturing are two exciting areas. We’ve gone through point cloud. I can see down the line where you’ll be able to capture any environment in the world, as well as the type of materials and textures there. With Unity, digi doubles will get to a point where we’ll just need 30 extras and the rest will be perfect humans. You’ll have talent that’s good enough to interact with. Nothing is impossible! You come from the VFX world, you know what it takes to create an explosion. And now you can just go preset, preset, preset – and you’re done. And it looks fantastic! It’s a storyteller’s dream!

[1:00:46] Allan: These are just tools in the toolkit. Where can people go to find out more about CBS VFX?

Craig: www.CBS-VFX.com. Come check it out! We have a lot of projects on the horizon!

[1:01:26] Allan: This has been so great! I really appreciate it! 

Jim: Great to be here!

Craig: Allan, I’m a great fan! Thank you for having us!

 

Thanks for listening! I want to thank Craig and Jim for coming on the Podcast. It was so much fun to chat with them! 

Next week, I will be speaking to Activision Blizzard King’s Executive Principal Recruiter for Art in Animation and Creative, Robin Alan Linn, to talk about careers and how to break into the industry. 

Please take a moment to share this Episode with others. I’ll be back next week. Until then –

Rock on!

 

Click here to listen on iTunes!

Get on the VIP insiders list!

Download The Productive Artist e-book.

Allan McKay’s Facebook Fanpage.

Allan McKay’s YouTube Channel.

Allan McKay’s Instagram.

 
