Self Organization - Evolving Computer Programs

.Pixote.

Antediluvian as Feck
Modder
I watched this BBC documentary recently, and this part on self organization with computer programs blew me away. It's amazing, and at the same time frightening. I've since learned that this type of self-animation (Euphoria) is being designed for the next generation of computer games. We wouldn't want defense computers using these sorts of algorithms, would we - WarGames (1983).

Code:
Euphoria represents a step change towards creating truly believable characters, worlds and games. Instead of playing back canned animation, Euphoria uses the CPU to generate motion on the fly – by simulating the character’s motor nervous system, body and muscles.


[youtube]http://www.youtube.com/watch?v=lSG--GY2p2o[/youtube][youtube]http://www.youtube.com/watch?v=Qi5adyccoKI[/youtube]
 
The day a computer asks me if it has a soul, I'll shit bricks and move to a lonely island in Polynesia. Until then, I think it's safe to stick around them.
 
Evolution simulation is a very simple algorithm. And, AFAIK, computers are all about algorithms. It is possible to write an algorithm for AI in games, because it's predictable. But I don't suppose that computers will ever be able to gain consciousness. I mean, I don't suppose that making a mathematical model of consciousness, thought, creativity, conscience and such is possible.
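To illustrate the "very simple algorithm" point: a basic evolution simulation is just selection, crossover, and mutation in a loop. Here's a toy sketch (all parameter values and the "one-max" objective are illustrative, not taken from the documentary):

```python
import random

def evolve(fitness, genome_len=16, pop_size=30, generations=60):
    """Minimal genetic algorithm: selection, crossover, mutation."""
    pop = [[random.randint(0, 1) for _ in range(genome_len)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)   # fittest genomes first
        survivors = pop[:pop_size // 2]       # truncation selection
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = random.sample(survivors, 2)
            cut = random.randrange(1, genome_len)  # one-point crossover
            child = a[:cut] + b[cut:]
            i = random.randrange(genome_len)       # point mutation
            child[i] ^= 1
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)

# Toy objective: maximize the number of 1-bits ("one-max").
best = evolve(fitness=sum)
```

Twenty-odd lines, no magic - which is exactly why "it evolved on its own" sounds scarier than the code looks.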
 
Cool videos, Pixote. :ok: I wonder how hardware-intensive it is to create these complex animations/collisions in real time.

Next question is, of course, how long before we can see this in games? :)

The idea of being able to create a convincing character without having to use motion capturing or spend weeks carefully building the animations sounds very enticing. I guess time will tell just how much of an impact (if any) it will have on the gaming industry.
 
Euphoria is just a commercial implementation of physical animation techniques that have been around forever. Like most physical full-body animation techniques, it uses PD servos to generate torque in joints and synthesize motion, while a learning controller is employed to dynamically adjust motion parameters, yielding a motion that satisfies input constraints.
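For anyone unfamiliar with the term: a PD servo just computes joint torque from the angle error (proportional term) minus a velocity damping term (derivative term). A minimal single-joint sketch, with made-up gains and unit inertia - not NaturalMotion's actual code:

```python
def pd_torque(theta, omega, theta_target, kp=40.0, kd=8.0):
    """PD servo: torque proportional to angle error, damped by velocity."""
    return kp * (theta_target - theta) - kd * omega

# Drive one joint (unit inertia) toward a target pose.
theta, omega, dt = 0.0, 0.0, 0.01
target = 1.0  # target joint angle in radians
for _ in range(1000):
    torque = pd_torque(theta, omega, target)
    omega += torque * dt   # semi-implicit Euler, inertia = 1
    theta += omega * dt
```

The hard part isn't this loop - it's choosing kp and kd for dozens of coupled joints so the character doesn't oscillate or collapse, which is where the learning controller comes in.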

Here's the kicker - the controller needs to be trained offline. Moreover, controller tuning is a notoriously fiddly process, and even the slightest errors in parameters will result in a character that trips up and falls over every two seconds. That drunken walking minigame in GTA IV? That's the real Euphoria at work. If I recall correctly, in order to implement Euphoria in your game, you actually had to fly in NaturalMotion consultants and have them prepare your controllers for you. As far as I know, that may still be the case. Not a fact they are likely to bring up in their promotional videos, that's for sure.

The issue of tuning physical motion controllers still hasn't been adequately addressed in computer animation research, let alone commercial products like Euphoria. That's why every game (including GTA IV and Force Unleashed) still uses good old-fashioned motion capture with blending to achieve 90% of in-game motion, while Euphoria is used to synthesize response motion, i.e. motion that results from dynamically-acting external forces. For instance, when a character in GTA IV is hit by a bullet, that's when Euphoria kicks in, as it attempts to balance the character and drive it towards the nearest known mocap pose. The animation system then blends into the corresponding mocap motion and it's business as usual. Of course, there is no way to predict how long it will take for Euphoria to do its thing, hence the situations where the character comically staggers for ten seconds while his adversaries continue to drill him full of holes.
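The hand-off at the end - "drive it towards the nearest known mocap pose, then blend" - is conceptually simple. A toy sketch of the idea, with poses as flat vectors of joint angles and a plain linear crossfade (function names and the tiny pose library are my own invention, not Euphoria's API):

```python
def nearest_pose(sim_pose, mocap_poses):
    """Pick the mocap pose closest to the simulated ragdoll pose."""
    return min(mocap_poses,
               key=lambda p: sum((a - b) ** 2 for a, b in zip(sim_pose, p)))

def blend(sim_pose, mocap_pose, t):
    """Linear crossfade: t=0 is pure physics, t=1 is pure mocap."""
    return [(1 - t) * a + t * b for a, b in zip(sim_pose, mocap_pose)]

# Toy example: 3-joint pose vectors (joint angles in radians).
library = [[0.0, 0.5, 1.0], [0.2, 0.1, 0.0]]
ragdoll = [0.1, 0.4, 0.9]           # where the physics left the character
target = nearest_pose(ragdoll, library)
halfway = blend(ragdoll, target, 0.5)
```

The unpredictable part is how long the physics takes to get the ragdoll anywhere near a library pose in the first place - hence the ten-second stagger.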
 