Episode 289 – EmberGen

by Allan McKay  - March 9, 2021

Episode audio: https://traffic.libsyn.com/secure/allanmckay/ep289_-_Embergen_PUB_v2.mp3

 


JangaFX was founded in 2016 by Nick Seavert, an entrepreneur well versed in both startups and real-time VFX. JangaFX was initially bootstrapped with credit cards, software sales, and many thousands of man hours rather than outside investment. The company's core mission is to provide real-time VFX tools to artists and designers working in the video game and film industries.

JangaFX's first tool, VectorayGen, was released in 2017. Its second, EmberGen, is a standalone real-time volumetric fluid simulation tool built for generating flipbooks, image sequences, and VDB files.

In this Podcast, Allan McKay interviews Nick Seavert about starting JangaFX and the challenges he experienced, the many applications of EmberGen, and the company's collaboration with Otoy.

JangaFX Website: https://jangafx.com

JangaFX on YouTube: https://www.youtube.com/channel/UCtRjzr2QlxJcoQQ2c74dJpw

EmberGen Website: https://jangafx.com/software/embergen

JangaFX and Otoy Collaborate on EmberGen FX: http://www.cgchannel.com/2020/04/otoy-and-jangafx-team-up-to-create-embergen-fx/

 

HIGHLIGHTS:

[04:17] Nick Seavert Introduces Himself

[04:51] The History of JangaFX

[07:46] Nick Shares JangaFX's Early Challenges

[09:12] The Diversity of EmberGen

[10:53] JangaFX’s Collaboration with Otoy

[12:50] Nick Discusses Some of the Features of EmberGen 

[17:39] The Programming Philosophy of EmberGen

[22:13] Nick Shares Some Future Developments for EmberGen

 

EPISODE 289 — EMBERGEN

Hello, everyone! This is Allan McKay. 

Welcome to Episode 289! I'm sitting down with Nick Seavert, the CEO and Founder of JangaFX, which makes EmberGen. I'm excited to talk about this powerful fluids tool that works in real time. I think this will be a really exciting Episode!

Please take a moment to support this Podcast by sharing it with others.

Let’s dive in!

 

FIRST THINGS FIRST:

[01:16] Have you ever sent in your reel and wondered why you didn't get a callback, or what the reason was that you didn't get the job? Over the past 20 years of working for studios like ILM, Blur Studio, and Ubisoft, I've built hundreds of teams and hired hundreds of artists — and reviewed thousands of reels! That's why I decided to write The Ultimate Demo Reel Guide from the perspective of someone who actually does the hiring. You can get this book for free right now at www.allanmckay.com/myreel!

[26:13] One of the biggest problems we face as artists is figuring out how much we're worth. I've put together a website. Check it out: www.VFXRates.com! This is a chance for you to put in your level of experience, your discipline, your location — and it will give you an accurate idea of what you and everyone else in your discipline should be charging. Check it out: www.VFXRates.com!

 

EMBERGEN

[04:17] Allan: Just to get started, do you want to quickly introduce yourself?

Nick: My name is Nick Seavert. I've been doing VFX for the past 12 years. I started in games, modding Half-Life 2. My main thing is video games in general. Along the way, I found the tools weren't to my liking, but at the time I didn't know what I was doing. I got involved in a couple of startups with a mentor I met on the internet (sounds shady). I did that for 4 years and failed at a couple of businesses. I was getting back into VFX as a hobby but found the tools were still not good or fast enough. I figured if my particle system in Unreal Engine runs in realtime, then my tools could run in realtime. So I jumped in, learned everything I needed to learn, and started JangaFX. For the past 4 years, we've been building our team.

[04:50] Allan: When you started Janga, were you clear about what you wanted to do? Or did it have to be something related to games?

Nick: For sure, it was about games at the time. We're now pivoting more toward film stuff as well, but back then, we were all about games. The workflow for games with EmberGen is impeccable and so fast, which is why so many studios adopted us. But we're really moving toward film stuff now. I think I definitely had a roadmap. I just didn't know how to get there.

[05:40] Allan: When did you figure it was time to pivot and go all in? Knowing that GPUs have stepped up, do you think it was all about the right timing? I was at ILM when they started developing Plume. Do you think it was the right time, technology-wise?

Nick: Sure! There was a program I saw in 2009, a 3D fire simulator. It proved to me what I wanted to do, but I didn't have the team-building skills at the time. I think 2014-15 was the ripest time to get started. That's when GPUs became powerful enough. But now, they're so much faster. At this point, with EmberGen, it's a memory bandwidth problem. If they want to give us 20 TB/s in the next GPU — we need that! This is the perfect time. It might become the Microsoft story: right time, right place. Everything lined up! I think we could've started a year earlier, but if we had, we wouldn't be where we are today.

[07:46] Allan: I want to jump in and talk about some of the features. How did you know that you were onto something that was going to work? And what were the big challenges in the beginning?

Nick: When it came to the big challenges for us (our team is almost all PhD-level), I don't think the computational physics was the hard part. The hard part was making it a tool that actually rocks, where users say, "Holy shit! This is awesome!" When we first launched it, users would tell us that as soon as you started it, there was an explosion. It's already simulating. It's already doing cool stuff. That's what we wanted our users to see. All those reactions were the first time I knew it was going to work.

[09:12] Allan: Have there been any cases where people have used it in surprising ways?

Nick: For sure! I just saw one this weekend. A guy posted this morphed face. He used EmberGen to generate the VDB files of the face, then took them into something like TyFlow to mesh the VDB, plus some kind of distortion effect. I didn't even know you could use EmberGen that way! It was such an off-the-cuff use. And I've seen people do some weird simulations with it. I thought it was for fire and explosions, and they're using it for sandstorms or torrential rain, or whatever. There are so many extra things you can do. And you talked about Krakatoa earlier. This week, we're releasing an all-new GPU particle rendering system. You can render millions upon millions of particles inside of EmberGen. I can't wait to see what people are going to create with that.

[10:53] Allan: I can't wait! This was developed for games, but you guys collaborated with Otoy to create EmberGen FX. What has that been like? Did that come out of necessity?

Nick: Yeah, so one of the big things is the film workflow: I get the gist of it, and that's what our customers want, but I'm not really the guy who knows this stuff. I know games. So we were looking for a partner that could help us reach more of that market. We're doing a lot for ourselves already. However, we wanted a company like that to help us get to the next level and give customers what they want. With some pictures we have coming out at the end of the year, our standalone is completely viable. But for a more integrated workflow, Otoy is going to be the way to go. So partnering with them was such a good idea. They've helped us immensely! We're trying to build a deeper connection and see how closely we can work together. There will be a lot of cool things that come out of that, I think!

[12:50] Allan: I'm excited! For me, I love games, but film is where most of my business is. What are the highest-res sims you've put out so far? What are some of the ways you're able to push it?

Nick: Hundreds of millions of voxels. I don't think we can do a billion yet, but 400 million voxels is possible if you're really pushing it. We don't necessarily recommend it, but you can do it if you want to. We have a really cool feature where you can simulate in low res and then up-res it with the click of a button. We're working on lots of optimization. We aren't going to release any new features after the GPU particles; we're just in optimization mode. By the end of this year, the software will be able to do some high-res stuff. For a lot of things, especially when you go through up-res-ing, we have some volume processing features we've added: you can sharpen volumes and add motion blur, and it actually exports with your VDBs. That's the cool thing about all of our stuff: whenever you're doing it, it actually affects the volume. That's a huge, huge thing for film people. You don't have to apply wavelet turbulence to it, but you probably could. It is so great!
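To put those voxel counts in perspective, here is some back-of-the-envelope arithmetic on why, as Nick said earlier, this becomes a memory bandwidth problem. The channel count, pass count, and step rate below are illustrative assumptions, not EmberGen's actual internals:

    # Rough VRAM and bandwidth cost of a dense sim grid.
    # All constants are illustrative assumptions, not EmberGen internals.
    voxels = 400_000_000         # ~400M voxels, the high end mentioned above
    channels = 6                 # e.g. velocity xyz + density + temperature + fuel (assumed)
    bytes_per_value = 4          # 32-bit float per channel (assumed)

    grid_bytes = voxels * channels * bytes_per_value
    print(f"One copy of the grid: {grid_bytes / 1e9:.1f} GB")  # ~9.6 GB

    # A solver reads and writes each field several times per step; assume
    # 10 full passes over the grid per step, at 30 steps per second.
    passes_per_step, steps_per_second = 10, 30
    bandwidth = grid_bytes * passes_per_step * steps_per_second
    print(f"Required bandwidth: {bandwidth / 1e12:.1f} TB/s")  # ~2.9 TB/s

For comparison, a GTX 1060 offers roughly 0.19 TB/s and a 2080 Ti roughly 0.62 TB/s, which is why the wish for far more bandwidth in future GPUs is not an exaggeration.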

[15:17] Allan: I'm curious: if you were to build EmberGen from the ground up, specifically for sims, what would be the ideal? Where would you focus the most? Obviously, video cards would be one. Also, would you tie into multiple cards or go all in on one?

Nick: All in on one! Maybe like an RTX Titan or RTX 2080 Ti. Some of our programmers use 2080 Tis, and then we have a programmer on a GTX 1060. Why a 1060? Because that's actually our target. We work a lot with the consumer market, and not everyone has access to an RTX Titan. If we can make it run really well on a 1060, it will run phenomenally on a 2080. We set a low baseline for GPUs, and it helps us innovate. We do test it on bigger cards, however, and we're blown away by the performance.

[16:36] Allan: That’s cool! And right now, you have a standalone solution. You aren’t looking to bridge it with others, correct?

Nick: With us and our team, we will always be standalone. We won't turn it into a plugin for someone else's package. Maybe that sounds naive, but we want to be as big as Houdini. We want to do destruction, liquids, and cloth. We want to have a standalone tool that's easy to plug into a pipeline. That's the overall 5-10 year goal. A quick segue back: we do plan on having multiple-GPU support. It probably won't come until next year.

[17:39] Allan: Going down that path, do you see multiple machines sharing out a sim (rather than multiple cards) if you were to go into cloud simulation?

Nick: Maybe, maybe not. We do plan to have a batch simulation processing solution for larger studios. They could say, "We like this particular preset you have. Alter these parameters, give us a hundred different variations, and output them as MP4s, or something." In that case, we might distribute to multiple machines. But part of our philosophy is that if it doesn't run on one GPU, we aren't doing well enough. That's why EmberGen is so fast: because of this programming philosophy. It's called the Handmade Network, if you're interested in that — look it up. Most programmers are lazy. They don't care about performance; they just want it to work. In a lot of cases, like a tool or a web app, that works because you don't care how long it takes. But there are certain tools where you need the performance and you need to be precise with what you're trying to write in your code. That's what the Handmade Network relies on. If it doesn't run on one GPU, we aren't satisfied! We want to go bigger and better all the time. We're always coming up with optimizations. We actually give a fuck about this stuff and we show it.
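As a thought experiment, the batch workflow Nick describes might look something like the sketch below. None of these parameter names or functions are real EmberGen APIs; they only illustrate the "take a preset, jitter the parameters, queue a hundred jobs" shape:

    # Hypothetical sketch of preset-variation batch processing.
    # Parameter names and job dispatch are invented for illustration.
    import random

    preset = {"buoyancy": 1.0, "turbulence": 0.5, "fuel_amount": 2.0}

    def make_variations(base, n=100, jitter=0.3):
        """Yield n copies of a preset with each parameter jittered +/-30%."""
        for _ in range(n):
            yield {k: v * random.uniform(1 - jitter, 1 + jitter)
                   for k, v in base.items()}

    # Each variation becomes one render job, which a farm could dispatch
    # to a machine and export as an MP4, flipbook, or VDB sequence.
    for i, params in enumerate(make_variations(preset)):
        print(f"job {i:03d}: {params}")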

[19:51] Allan: I think that's smart, even optimizing for a 1060. It means that you'll have gasoline to throw on the fire. I was talking with Jordan Mechner, who built Prince of Persia (www.allanmckay.com/244), as well as Louis Castle (www.allanmckay.com/249). The programmers from the 80s had to be really careful about memory and everything they put in. For me, man hours are the most expensive part of any production, so when someone can write a tool that speeds things up, that's huge. Could you talk about applying for a MegaGrant from Epic?

Nick: Yeah, so it’s our second time applying. The first time, we got denied. We applied again with more video footage. I really hoped it’d get accepted. With money raising, we don’t raise it for equity. We don’t want to give it away. But we do want to expand our team. 

[22:13] Allan: Well, you're just down the street from them. When I was working with Activision on Call of Duty, we had a rule that at 6:00 p.m., when the sun would start to go down, we had to close our blinds. We'd have people looking in, trying to see what we were doing. Same thing with adding a feature people don't want. It's kind of crazy! I was chatting with a game studio in Finland, and we were talking about adding VDBs. I know you can't put high-volume sims in a game. But for something like volumetric fog, how likely do you think that would be? Rather than simming in realtime and putting it on cards, having volumetrics inside of a game?

Nick: We’re working on this kind of stuff pretty soon. Our prelim investigation shows that VDB is not the way to go in games. We’re going to write our own volume format that is highly optimized that game studios can use. We want to have it that you can use EmberGen and see it in realtime in your game engine. So we’re working on our volumetric stuff. That’s the next step for games! Horizon has the Forbidden West (that’s the latest game they have going out). I saw this swirling hurricane in the trailer. It’s probably some procedural noise, my guess is. But it’s really good! I think the rendering is there. But there needs to be a really lightweight format that you can stream off your hard drive, for it to work for games. I think if you’re working Houdini, you have big caches and you know it stutters. We just can’t have that in games. So having a lightweight 3D or 4D texture format is what we’re aiming for. We have the research on our hands, we just need to implement that. And of course, this is with baked stuff, right? With actual dynamic sims, we need lots more memory bandwidth. It’s not enough right now.

[25:47] Allan: I’m excited about the static sims that are pre-baked. That’s so great! Where can people go to find out more about EmberGen?

Nick: JangaFX.com. You can search for EmberGen.

[26:08] Allan: This has been so great, man! Thanks for taking the time to chat!

Nick: Absolutely! Thank you very much for having me!

 

I hope you enjoyed this Episode. I want to thank Nick for taking the time to talk. This was awesome! I'm excited for everyone to dig into this tool. I will have another Podcast with Nick (www.allanmckay.com/302) in which we dig into how he went from being an artist to starting his own company.

Please take a few minutes to share this Podcast with others.

Next week, I will be back speaking to Doug Roble, who is one of the original employees at Digital Domain. We talk about the work they're doing with digital humans: they're no longer rigging but filming the actors and using AI to animate them. That was a really fun Episode! Doug has done so many interesting things, including a TED Talk about his computer-generated version of himself.

I will be back next Episode. Until then —

Rock on!

 

Click here to listen on iTunes!

Get on the VIP insiders list!

Download The Productive Artist e-book.

Allan McKay’s Facebook Fanpage.

Allan McKay’s YouTube Channel.

Allan McKay’s Instagram.

 
