
AggroCast Resurrected — GDC 2023 [Episode Four]

The AggroCast is out with a new episode as the team dives into GDC 2023 and the new and emerging technologies it had to offer the gaming world.

The AggroCast crew had the opportunity to take in what this year's GDC had to offer out of San Francisco. Raymond sits down with David to go over the highlights of GDC this year: some of it mo-cap, some of it physics engines, and some of it how AI could be changing the game. Pun intended. All of it is something we will soon see not only in our video games but in many other forms of media. Hear just what is coming around the corner.

You can also listen to the AggroCast podcast on your favorite streaming services. That means you can listen and subscribe on iTunes if that is how you want to go. If that is not your jam, you can also hit it up on Spotify. You can truly help us all out by giving the show a rating and review on all of those platforms and on our YouTube channel as well. If you have any questions, comments, or suggestions for the AggroCast, feel free to shoot over an email to us at podcast@aggrogamer.com.


TRANSCRIPTION

It’s alive…

It’s alive…

It’s alive…

It’s alive…

It’s alive…!

AggroCast RESURRECTED!!!

Raymond Bruels: Welcome to the AggroCast Resurrected. This is Ray Bruels with David Hades Becker. Today we're talking about GDC and some of the announcements that came out of that convention.

David Hades Becker: More along the tech line of announcements, because I know there were a couple of games dropped out there since it's a gaming convention, and the big one was Counter-Strike 2. Nobody expected that to come out, but it wasn't Portal 3, so.

RB: Alright, so for a plebeian like me, what is GDC?

DHB: GDC is the Game Developers Conference. It's a place for all the developers to get together and talk about new stuff they've worked on, plugins they've used for engines, or even just new engines they've built. In the past, things like the Unreal 3 engine debuted there, and some of the new technology used in Unreal 3, 4, and 5 has been shown there before even going out to the mass public.

RB: And that was your first convention with Aggrogamer, right?

DHB: Correct, yeah, it was way back in 2008, my very first convention going out there with Andy.

RB: 15 years?

DHB: Yeah, it's been a journey, and I guarantee it's a whole different ballpark now. Funny enough, back then the big thing was online streaming games; OnLive was the big thing being shown off. It took so long, and then Stadia did the same thing and crashed and burned just the same way. Sometimes it's interesting to watch technology die like that.

RB: Well, yeah, especially when you get the long view…

DHB: Yeah!

RB: As you're able to look back, you see how far things have come, or what everybody thought was going to be the big thing that crashed and burned, and now it's something completely different.

DHB: Yeah, because back then, obviously, it was streaming video games as a platform, and that's been resurrected so many times. Then, I think it was a few years later, even though VR has been around for a lot longer than that, that they first started talking about having the Oculus Quest and all that stuff, before, you know, Meta bought them out and everything. And people then were like, oh, VR is not going to be a thing, it died before, yet here we are. VR and AR are the next big thing people are pushing towards.

RB: All right. So what big news came out of this year’s GDC?

DHB: Well, I know there's a lot that came out. Most of the stuff I paid attention to was on the Unreal side, since I'm an Unreal developer, so I usually pay a lot more attention to that. I know Unity had a couple of things. There was the one announced with Ubisoft, Ghostwriter, which uses AI to dynamically generate NPC text or even dialogue. I think the idea people have put out there is it going into an Assassin's Creed. Obviously, if your NPCs have the same bit of dialogue over and over, even though they try to make it feel like a fully fleshed-out world, it doesn't. Because how many times can you hear the town crier go, "Hey, come over here because somebody did something," over and over? With this, they can have it generate live text based on what you're doing, and if somebody else says something, another NPC can react to it. So it's using AI to generate the world and actually make it more immersive.

RB: That sounds like there’s a lot of potential there.

DHB: It does, but there's also a lot of back end people don't want to recognize, because obviously the AI running on your local system, like your PC or even your console, isn't going to be as strong. So currently, I think they have to call out to an API to generate that, which means having your games always be online to get that full immersive experience. Unless you want them to record all this stuff ahead of time and just have it generate thousands of lines of spoken dialogue without having to hire an actor to come in.
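
To make the "call out to an API" idea concrete, here is a rough sketch of how a game's back end might request a generated NPC line from a hosted text-generation service. The endpoint URL, request fields, and response shape are all hypothetical, not any real Ghostwriter or Unity/Ubisoft API; they only illustrate the online round trip David describes.

```python
import requests

# Hypothetical dialogue-generation service; the URL and payload fields are
# illustrative only, not a real Ghostwriter or Unity/Ubisoft endpoint.
DIALOGUE_API = "https://example-dialogue-service.test/v1/generate"

def generate_npc_line(npc_role: str, world_event: str, heard_line: str = "") -> str:
    """Ask the remote service for one context-aware NPC line."""
    payload = {
        "role": npc_role,           # e.g. "town crier", "pie cook"
        "event": world_event,       # what just happened near the NPC
        "reacting_to": heard_line,  # optional line another NPC just said
        "max_words": 25,
    }
    # The game has to be online for this request to work, which is the
    # trade-off discussed above.
    response = requests.post(DIALOGUE_API, json=payload, timeout=5)
    response.raise_for_status()
    return response.json()["line"]

if __name__ == "__main__":
    print(generate_npc_line("town crier", "a theft in the market square"))
```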

RB: Well, AI writers have become such a buzzword.

DHB: Yes.

RB: We went from the Metaverse and VR, AR, XR to now everything being ChatGPT and all of these AI writers.

DHB: Yeah, you can have a whole relationship that way. I watched South Park.

RB: Yeah, that episode was amazing. We sat down and watched it the other day.

DHB: Oh, you’re such a great texter. I love you. Copy paste. Good.

RB: No effort whatsoever. Exactly. But what about what you told me the other day?

DHB: Where's my story? You didn't write me a story. ChatGPT, please write a story. And that's kind of what we're getting at now, but on more of a macro scale instead of using it for an individual conversation. Now you're having it build out all the stuff that would normally take a developer or even voice actors hours and hours of time to do, and you can have this AI just generate it real fast. Even if you have the actors come in and do the voices, writing all that dialogue and text, and then making it dynamic, adds that extra level of immersion to a video game, but also years of extra work for a writer to get it all out there, let alone record it.

RB: Well, I’ve got friends that are writers for video games. That’s kind of a frightening concept that they could be replaced in some instances.

DHB: In some instances, but from what I've understood, it's not specifically to remove them completely. It's to do the mundane stuff. Do you really need to know what the random pie cook over here is going to say? No, you don't, because you're writing the grand story to tell this epic thing. But obviously the player is going to go over to the shop, and you don't want the shopkeeper saying the same thing over and over. You want them to feel like a real being. That's where the AI comes in, so the writer can take the time to focus on the bigger portion of the story and not write down 50 lines of dialogue for the same person to repeat over and over. So in effect it feels like a double-edged sword, where, yeah, technically it is removing a job, but at the same time it's giving the actual human something more to focus on and drive towards.

RB: That's always the challenge with technology: it's supposed to make life easier.

DHB: Exactly. Work smarter, not harder. And there's always going to be that complaint that you're getting rid of jobs, the whole "They took our jobs!" thing. But at the same time, you're making life easier so that the human component can do their job better.

RB: Did they talk about when we’re going to see Ghostwriter in effect?

DHB: I'm not sure if they did, but if I had to guess, it's going to be in one of the next Assassin's Creed games, just because I know Ubisoft was pushing it quite a bit. I don't want to say it will be the very next one, but it could be, because they've announced like four or five over the next decade, so we could probably see it in one of them. We could even be surprised and see it this upcoming November, when the next game is supposed to come out. So we'll see.

RB: So what else did you see?

DHB: Well, moving on, the other stuff I really enjoyed was the Unreal stuff, because that's where I got my degree. I've worked in Unreal since, you know, Unreal 3, and the one that was called Rocket Ship at one point because that was its working title. They pumped out a bunch of new updates that, on the face of it, some people kind of shrugged off. The one everybody glommed onto was the Fortnite editor, so Epic no longer has to build your Fortnite maps. You can let the community do it, along with other aspects of the game. They've pumped that out there so that now Epic Games isn't going to be the sole builder of it all. They're allowing the community to do it and build their own little off-branches from it.

RB: So Fortnite is such an interesting game because of how it iterates.

DHB: Yeah, and it's just another iteration. It's going to keep it going in perpetuity, effectively, because now you don't have to worry about paying individual developers. You just pump the tools out there and let the community do it.

RB: Yeah, they create their own DLCs.

DHB: Exactly, kind of like what Roblox has done and some of the other games like that, where it's like, here are the tools, you're paying us for the tools. But in this instance, I don't believe Epic is charging people to use the tools. They're just giving them out there, and they're getting paid through all the other ways they make money off of Fortnite, with the microtransactions and everything. The dirty side of it all; not loot boxes, but loot boxes.

RB: Well, maybe they've even got swag and merchandise as well.

DHB: Oh yeah. And then from there, there was also the new physics engine they showed, which I believe just released in early demo form for anybody playing with the Unreal editor. It's a real dynamic physics engine, and they showed it off using a car. Normally, when you have a car driving through bumpy terrain, animators have to make all of that movement; it's like, oh, it's going over a tree trunk, make the car bounce here. With the new physics engine, that just happens in real time. If there's a piece of geometry in the world and the car drives over it, as long as the developer set it up correctly, it's going to react how a real car would. I don't remember exactly, but they used some new electric vehicle or whatever to show it off, and they drove over a bunch of mud, through some rivers and trees and branches and all that stuff. You could see the individual shocks moving and even the antenna on the car wobbling and flailing around as they drove through the terrain.

That led them into the other part that was really interesting: procedural terrain. Which isn't a new thing, but it's something new for developers on the micro as well as the macro scale. A developer can build an area, and then, on the fly, whoever's doing the level design goes, well, we need to move this here and there. The developer no longer has to go, oh crap, I've got to scrap this and rebuild. They can just pick it up and move it, and the engine itself will manipulate everything around it. One of the things they showed is they came to a big valley, and the idea was, if the developer suddenly wants the player to have to drive left around this area, they just drop in this big mountain, and it fills the area not only with rocks and debris but foliage and random other tidbits, all of which the developer can still adjust if they don't like where things landed. It was almost like a button click: boom, there it was. Then, furthering that, if they don't want that and instead want a split road, they move the mountainside over, one of the branches pops up, a walkway gets built dynamically, and you watch as the river forms around this new mountain and all that stuff. So it's a new way for artists to keep development moving on the fly, instead of spending months going, okay, we have this concept art, we need to make it look this way, only for another piece of concept art to come in, completely change that, and scrap months of work. Now it's like, oh, you scrapped 30 seconds of my work. And by micro to macro, I mean you have the game area where the player is, and then obviously you have the environment everybody looks out to. That can also get procedurally generated. So even though you'll never go to those mountains, you'll see them out in the distance, and if they want to shift or change all that, the engine itself will do it based on how the artist drags, drops, and manipulates things.
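
As a rough illustration of the idea behind that kind of on-the-fly terrain edit, the toy sketch below stamps a "mountain" into a simple heightmap and blends the surrounding cells so the change does not leave a hard seam. This is only a conceptual analogy in Python with NumPy; Unreal's procedural tools work very differently and at a far larger scale.

```python
import numpy as np

def stamp_mountain(heightmap: np.ndarray, cx: int, cy: int,
                   radius: int, peak: float) -> np.ndarray:
    """Raise a smooth bump into the heightmap, centered at (cx, cy).

    The falloff is a cosine curve so the new mountain blends into the
    existing terrain instead of leaving a hard cliff at its edge.
    """
    result = heightmap.copy()
    ys, xs = np.ogrid[:heightmap.shape[0], :heightmap.shape[1]]
    dist = np.sqrt((xs - cx) ** 2 + (ys - cy) ** 2)
    inside = dist < radius
    falloff = 0.5 * (1 + np.cos(np.pi * dist / radius))  # 1 at center, 0 at edge
    result[inside] += peak * falloff[inside]
    return result

if __name__ == "__main__":
    terrain = np.zeros((64, 64))  # a flat valley
    terrain = stamp_mountain(terrain, cx=32, cy=32, radius=20, peak=50.0)
    print("max height after edit:", terrain.max())
```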

RB: So really interesting tools.

DHB: Yeah, and those are the tools, but if you have a crafty programmer in there, that's stuff that can be used live, in play, as well, not just during the development process. So, say for instance, with the Nanite system, it still uses pixels and polygons, but if you have a meteor come down and crush stuff, you can have a crater form in real time in the engine as a scripted event in the game, and the world will change and shift around it. And I think at one point they mentioned even little creatures like bugs, just random things you normally wouldn't even interact with but that make the world feel lived in, would also react. So obviously, if you have a huge fire, insects aren't gonna go over there, but you see them swarming away. It's little things like that that most people don't think about, and they're giving those tools to developers. It's live as of right now if you wanna use the preview build, and I think they plan on going live with it in a week or so, maybe by the end of April, as a production-quality version.

RB: And so those are tools for the setting.

DHB: Yeah.

RB: Did they do anything for actors? 'Cause I know mo-cap is huge.

DHB: Yeah. So it was GDC where they blew everybody's mind back with Hellblade and some of the real-time motion capture they were able to do. Ninja Theory, who did all that stuff, have been pioneering it and giving the tools out to the community to use. This time they went out there with the exact same actress who's played Senua in their games. They did a presentation where they had her record 30 seconds of dialogue, recorded her face from a couple of different angles using a cell phone, and handed that off to an artist. Before the presentation was done, he had built a muscle structure of her face in real time, built a 3D model of her facial structure to put on top of it and replay the animation, and then swapped that out with other characters and other faces as needed. When you look at it that way, it's like, oh yeah, it's a AAA developer doing these things. But they're giving that out to indie developers, where anybody could take their cell phone, have their motion capture actor, capture it with the phone, get rudimentary results, and then take the time to refine it all. So you don't have to spend thousands and thousands of dollars on these giant mo-cap studios; you can do it with a cell phone or basic camera work. Obviously, you get better quality with the bigger rigs. They put out a video where the actress does everything live, with the dots on her face and the cameras right in front of her, all live in game, in engine, and you could see the difference: that setup captures the little wrinkles and all that stuff in her face, whereas the cell phone version is a little more cartoony. But for 30 seconds to a minute of work, it's still rather impressive.

RB: Well, and I wonder, since I'm not a huge developer, what is that going to take in terms of resources to generate that stuff and do it? I know it was done on a cell phone. Yeah, just thinking about that.

DHB: Well, as a developer, and with some of the stuff I do online, I don't run a high-end gaming rig. I have a decent gaming computer, but that's what I do my development on, and it works perfectly fine. It's a little bit of a resource hog, so I can't play a game while I'm developing, but I mean, if I'm developing, why am I playing a game?

RB: Multitasking!

DHB: Exactly, multitasking. But it's not out of the realm of an average consumer. I think I only paid like two or three thousand for my gaming PC, and it runs this stuff just as well as I saw in the tech demos they were running.

RB: At what point are we going to see all of this come together, especially with XR?

DHB: It's got to be soon, because I think they mentioned the MetaHuman Animator, which is what this is, is only a couple months out from being released to the public. Then it's just going to take any artist or any other developer using it in a creative way for XR, or even building it in to do generated TikTok avatars.

RB: Yeah, well Snapchat does a bunch of that too, but like when we were at CES, Canon could do the mapping of the body and face to do calls through virtual reality. If we have these resources available in terms of the physics engines and the MetaHuman Animator, holy cow!

DHB: Yeah, and mind you, we're just talking about games here. This goes into every other aspect, because, if most people don't know, shows like The Mandalorian and other big movies use what's called The Volume, which is a huge area running an Unreal instance on a bunch of screens around the actors. So they kind of have their own little personal holodeck, and they have the 3D assets there that they can react to as well. They use it mostly to get the lighting, but also to have the dynamic backgrounds. So it builds into that, and then, on top of that, like you're saying with the filters, you could slap that onto other people in movies or animation or whatever. If you wanted a cartoon that looked perfectly animated, or realistic, just slap one of those avatars on somebody. With basic motion capture and the high-end cameras, you don't even need somebody sitting there doing the dots on the face; it just records live and then replays it.

RB: And then we have the possibility of deep fakes. James Earl Jones had signed over his voice. So you can do all kinds of stuff.

DHB: Yeah, I wouldn't be surprised if in the next few years we have an entire... I mean, we have them now, but an entire CG movie where you even lose that uncanny valley. Because you have the real voice somebody's signed over, you have an AI rebuilding it and all that stuff, and you have a real actor behind it all. And I fully recommend watching these tech demos, because even though the eyes were a little bit off, since they were using a cartoonish filter, they were still able to get some of the basic, tiny details that most people don't even realize they catch. That's what makes the eyes look off to most people when they're watching CG characters.

RB: And we’ve seen that with like the Luke Skywalker on The Mandalorian.

DHB: Yeah, the deep fakes they’ve done there.

RB: Yeah, and how it just keeps getting better and better and better.

DHB: And this isn't even a deep fake, because a deep fake is using the actual person's likeness. With this, you could do a 3D scan, if you do a deep enough 3D scan, and then rig that up so you have the muscle structure and everything underneath it reacting and moving the... it sounds horrible, but the flesh, as it should.

RB: It's actually kind of scary when you consider how all of these things could come together. You know, you've got the AI on top of that. And if…

DHB: Okay, you can have an entire scripted scene happen with a realistic-looking person that was written by an AI, and we have all the tools pretty much at our disposal right now. But at the end of the day, it's all to make the human element have an easier job doing it all. I mean, I could see where somebody could get crafty and creative and make it all AI-generated. But at the same time, these are all tools specifically to make your motion capture people have an easier job, your animators have an easier job, your writers have an easier job, so they can focus on the bigger, grander-scale things and not sweat the little ones.

RB: Yeah, anything else? Because that sounds pretty mind-blowing as it is.

DHB: Yeah, that’s where I’m kind of at right now. My mind’s been blown with just that kind of stuff. There was tons of stuff that came out of GDC, but those are like the bigger points that stuck with me.

RB: Have you considered going back to GDC since it’s been so many years since you’ve…

DHB: I've given it consideration, but just like everything, the cost of it keeps going up and up, and sometimes it's hard for a small developer to justify. I think it's $4,000 for the weekend, just for the ticket. And then it takes place in San Francisco, and we all know how expensive it is out there. So it's a pricey thing to throw down for. Even when I went last time, I think it was $2,000 for the ticket, but we went as press and I got in for free. I still had to pay all the other costs of going to San Francisco.

RB: Yeah, San Francisco is not cheap, but holy cow. Yeah, so it looks like there’s a lot of possibility on the horizon just because of these tools. And I look forward to seeing what the developers do with them.

DHB: Yeah, exactly. And that's all it's going to take: the developers are going to use it, and that's going to push it further. It feels like we're in exponential growth on all of these types of things. Because, like I said, 15 years ago we were looking at basic VR-type stuff that would be laughed at right now, like, how is that a thing? And here we are going, oh, we can pretty much do a deep fake with your cell phone.

RB: So there we have it. GDC, some of the highlights that came out of this year. David, thank you for providing your insights.

DHB: Not a problem. Thank you.

RB: And we’ll see you all next week. Well, you get to hear us next week.

Thank you for listening to the AggroCast Resurrected, the official podcast of AggroGamer. For more gaming news and reviews, please check us out at aggrogamer.com. If you enjoyed today's content, follow and subscribe to get notifications of our next episodes.

RB: Downstairs is junk.

DHB: Exactly. It burns so good.
