July 23, 2024




Is This What You Expect From Next Gen Gameplay?

I’ve been meaning to write about this for a while, but I haven’t had enough time to give it the coverage I think it deserves. Here goes…

We all have expectations as to what the next gen consoles will give us: better graphics, more storage space, wireless joypads and so on. But what about what really matters: the gameplay?

Well LucasArts & NaturalMotion have something in store for us all!

Indiana Jones is set to showcase a fantastic-looking technology from NaturalMotion that aims to replicate “real-world physics”:

Utilizing a run-time animation technology called euphoria, the game is poised to push the limits of what was previously possible. The company that developed the technology, NaturalMotion, has been working for years to hone euphoria to accurately replicate real-world physics such as strength, weight, and momentum with in-game character models. Whereas before, all animations had to be pre-programmed, euphoria allows for reactions and behaviors to occur in real-time — allowing models to react realistically to whatever situation might arise.

To be clear, euphoria is not an AI program, it’s an animation technology that allows for physically accurate behaviors. The AI is programmed by the developer, in this case LucasArts, into the euphoria models to control them. What we saw at LucasArts was the marriage of euphoria physics and LucasArts A.I. scripts.
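That split between a physics-driven animation layer and developer-written AI scripts is the key idea. As a minimal sketch (with hypothetical names and logic of my own, not euphoria’s actual API), the AI decides *what* the character should do, and the animation layer works out *how* the body does it each frame:

```python
# Hypothetical illustration of the AI/animation split described above:
# a developer-authored script picks a high-level behavior, and a run-time
# animation layer (stand-in for something like euphoria) turns it into
# body-level actions instead of playing a pre-baked clip.

from dataclasses import dataclass

@dataclass
class CharacterState:
    falling: bool
    nearest_handhold_dist: float  # metres to the closest graspable object
    obstacle_incoming: bool       # e.g. a fire hydrant thrown at the model

def choose_behavior(state: CharacterState) -> str:
    """Developer-authored AI script: decides WHAT to do."""
    if state.obstacle_incoming:
        return "shield"                      # raise arms to absorb the hit
    if state.falling and state.nearest_handhold_dist < 1.0:
        return "grab"                        # reach for the nearby beam
    if state.falling:
        return "brace"                       # prepare to break the fall
    return "stand"

def drive_body(behavior: str) -> dict:
    """Stand-in for the run-time animation layer: decides HOW the body
    moves.  A real system would solve for joint forces every frame; here
    we just return symbolic muscle targets to show the interface."""
    targets = {
        "shield": {"arms": "raise", "torso": "twist_away"},
        "grab":   {"arms": "extend_toward_handhold", "hands": "close"},
        "brace":  {"arms": "extend_down", "legs": "tuck"},
        "stand":  {"legs": "balance"},
    }
    return targets[behavior]

state = CharacterState(falling=True, nearest_handhold_dist=0.6,
                       obstacle_incoming=False)
print(drive_body(choose_behavior(state)))
```

The point of the separation is the one the article makes: LucasArts can rewrite `choose_behavior` freely, and the same physical layer produces believable motion for whatever the script asks for.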

Before I go on I recommend you check out the two videos from IGN:

IGN – Euphoria In Action



Imagine the possibilities! Madden with players realistically crumpling all over the place from the impacts, trying to block or breaking their fall, with each impact having a different reaction.

Imagine how euphoria could be utilised in games like Grand Theft Auto with each character reacting differently to what you throw at them. Different responses from being hit by a car, motorbike, explosions etc. Surrounded by police? Why not knock one officer into a couple of his buddies to create a window of escape.

Then there are the wrestling games like the Smackdown series: realistic motions when you grapple & throw your opponent, with an unlimited number of techniques at your disposal instead of just the handful you currently have. And what if, when you get thrown, you could adjust how you land to minimize the damage you receive, making it easier to get up & continue the fight instead of lying there motionless for a couple of seconds…

It sounds good on paper, and LucasArts was eager to show off the technology in action. We were shown two interactive technical demos highlighting the system, with Chris Williams, Project Lead at LucasArts, and Peter Hirschmann, Vice President of Product Development, on hand. The first glimpse we got, the Tower Demo, showed off AI reactions to being dropped through a series of planks and beams. The second, which we’ll call the “Fight Demo,” featured Indiana Jones beating up an enemy A.I. to showcase varying reactions to getting punched and tossed around. Both showed potential for a new kind of gaming experience.

“For us, this is bleeding edge stuff,” said Hirschmann. “It’s what next-gen is about. It’s not creating a linear path. It’s creating intelligent characters, really interesting setups and environments, and putting the player in there and letting them see what the hell happens.”

The Tower demo showed off what LucasArts was talking about. It gave the player the ability to toss in AI-controlled character models at will, letting them fend for themselves as they tumbled through a Price is Right “Plinko-style” structure. While it initially seemed like a rag-doll physics demonstration, it soon became clear that the AI models were actively trying to stop their falls. Reacting to their surroundings without the restrictions of pre-programmed animations, the models reached out to whatever beams happened to be closest. Since they were falling at relatively significant speeds, they often missed, but would try again when they passed by the next plank or beam.

Eventually, as they slowed themselves with attempts to grab on, they’d be successful. However, the exercise didn’t end there. Some prevailed in pulling themselves up to a standing position, but others lost their grip and continued to fall. The demo achieved an even greater level of complexity when several models were thrown into each other. A hanging model hit by another wouldn’t just hang there stupidly while the other ricocheted off. Instead, it would actually reach out its hand to grab the other. In the same way, the tossed model would reach out its hand to try and grab onto the hanging one’s leg. As further proof of the advanced AI at work, the hanging model would even reach down and try to pull the other up to safety.

To add yet more depth, fire hydrants and other obstacles were tossed at the models as they frantically tried to hang on. As expected, they could do little to stop from getting pummeled. Surprisingly, though, they didn’t give up efforts to take the least amount of damage possible. Though some models were knocked cold after getting smacked, others fumbled around to break their fall. To stop themselves from tumbling, the models would actually stick their hands out in a last-ditch cushioning attempt. Depending on the layout of the environment, they would react accordingly. For instance, when they got slammed into a wall, their bodies reacted with a convincing thud. Part of the body impacted first, and the rest of the limbs went flailing around, impacting surfaces like they actually existed.

“You don’t have to animate 5,000 different permutations,” said Hirschmann. “You author one behavior and then [the character model] stumbles back, he’ll try to grab, if there’s nothing there he’ll fall over. With this kind of AI and these kind of behaviors it creates these situations where no two players’ experiences are ever the same.”
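Hirschmann’s point, that one authored behavior replaces thousands of hand-animated permutations, can be sketched in a few lines. This is a toy illustration with names I’ve made up, not anything from the actual game: a single “stumble back” behavior branches on the environment each time it runs, so the distinct outcomes multiply without any extra authoring.

```python
# Toy illustration of "author one behavior": the single routine below is
# written once, but the sequence of actions it produces depends on what
# the character happens to be near, so no two runs need look the same.

def stumble_back(env: dict) -> list[str]:
    """One authored behavior; the outcome emerges at run time."""
    actions = ["stumble_back"]
    if env.get("handhold_behind"):
        actions.append("grab_handhold")
        if env.get("grip_strength", 1.0) < 0.5:   # weak grip: grab fails
            actions.append("lose_grip_and_fall")
    elif env.get("wall_behind"):
        actions.append("slam_into_wall")
    else:
        actions.append("fall_over")               # "if there's nothing there"
    return actions

# Three environments, three outcomes, one authored behavior:
print(stumble_back({"handhold_behind": True}))
print(stumble_back({"wall_behind": True}))
print(stumble_back({}))
```

In a pre-baked animation pipeline, each of those outcomes (and every variation of them) would be a separate hand-made clip; here they all fall out of one rule plus the state of the world.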

Now this is what I was expecting from next gen games when the PS3 was announced. Gameplay taken to a new level that wasn’t possible before. Completely different reactions depending on the situations you, your allies & enemies find themselves in.

It will be a while before we are able to experience this new technology first hand but when we finally do, I think it could have a massive impact on how we play games:

Unfortunately we could not acquire media from any of the demos, so we’ve posted video from NaturalMotion’s site that should give you a sense of what we’re talking about. Note that this is not media from Indiana Jones, but actually from NaturalMotion’s non-run-time product called endorphin. In it you’ll be able to see how its technology manipulates a character model’s physical reactions. Ideally, as this technology eventually spreads and development teams get more comfortable using it, we’ll start to see strikingly more realistic A.I. behavior and reactions. While Indiana Jones won’t release until next year, it’s good to know new technologies are being pursued that could provide gamers with exciting, innovative experiences. We’re hoping euphoria turns out to be one of them.

You can read the full article on IGN by clicking the link below:

IGN – LucasArts & NaturalMotion’s Next Gen Plans