Work in Progress: Automatic Lip Sync

This is a quick test of the first hackish pass at doing automatic lip sync based on phoneme recognition from PocketSphinx.

I had reluctantly set the idea aside due to poor results reported by UE4 forum user ShaneC, who wrote a nice wrapper for PocketSphinx as a plugin. After some fiddling around and much searching of Sphinx usergroups, I was able to determine that passing in a blank option for the phoneme recognition file makes it work pretty reliably for some reason.
Playback of the visemes is a bit choppy right now, mostly because it hasn’t had a polish pass yet and because I’m not yet passing through the information about how long each phoneme lasts.
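
For reference, here is a minimal sketch of the setup, assuming the 5prealpha-era PocketSphinx C API and that the option in question is -allphone; the model path and function names are illustrative, not the plugin's actual code. The segment iterator is also where per-phoneme durations would come from.

```cpp
#include <pocketsphinx.h>

// Sketch: phoneme ("allphone") decoding with a blank phoneme-LM option,
// i.e. the "blank phoneme recognition file" workaround described above.
void DecodePhonemes(const int16* samples, size_t num_samples)
{
    cmd_ln_t* config = cmd_ln_init(NULL, ps_args(), TRUE,
        "-hmm", "model/en-us",  // acoustic model directory (illustrative path)
        "-allphone", "",        // blank phoneme recognition file
        NULL);
    ps_decoder_t* ps = ps_init(config);
    if (!ps)
        return;

    ps_start_utt(ps);
    ps_process_raw(ps, samples, num_samples, FALSE, FALSE);
    ps_end_utt(ps);

    // Each segment is one recognized phoneme; the frame indices are where
    // per-phoneme durations come from (10 ms per frame at the default -frate of 100).
    for (ps_seg_t* seg = ps_seg_iter(ps); seg != NULL; seg = ps_seg_next(seg))
    {
        int start_frame = 0, end_frame = 0;
        ps_seg_frames(seg, &start_frame, &end_frame);
        const char* phoneme = ps_seg_word(seg);
        // ...map phoneme -> viseme here, holding it for roughly
        // (end_frame - start_frame + 1) * 10 ms.
        (void)phoneme;
    }

    ps_free(ps);
    cmd_ln_free_r(config);
}
```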

Needless to say this is completely pre-alpha and doesn’t represent the final result, just an extremely early first test.

Left 4 Dead 2 Style Wound System Complete (Mostly)

Great news!

The Left 4 Dead 2 style wound and dismemberment system mentioned previously, which will be powering the gruesome world of Dungeon Survival, has now been completed (save for a couple of odds and ends, and art assets).

Previous screenshots were taken using a hacky method of transforming the hit location into mesh-relative coordinates, whereas the proper way of doing it, and really the only way that truly works, is to use the “pre-skin” vertex positions.

For non-developers this essentially means getting at the points that make up the character mesh before they’re transformed by the animation rig.

The effect of this is that the opacity-mask sphere which creates the “hole” in the character mesh so that the wound geometry shows through will follow along with any animation properly.
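
To make the distinction concrete, here is the difference written out as plain C++ with UE4 math types; the function and parameter names are mine, not the actual material code.

```cpp
#include "CoreMinimal.h"

// The old, hacky approach: transform the world-space hit into mesh-relative
// coordinates once. This only lines up while the mesh sits in its reference
// pose; as soon as the rig animates the vertices, the stored point drifts.
FVector WorldHitToMeshSpace(const FTransform& MeshToWorld, const FVector& WorldHit)
{
    return MeshToWorld.InverseTransformPosition(WorldHit);
}

// The per-vertex test the opacity-mask sphere performs. Evaluated against the
// pre-skin vertex position, each vertex always compares against the same
// wound center, so the hole follows the animation correctly.
float WoundOpacityMask(const FVector& PreSkinVertexPos, const FVector& WoundCenter, float WoundRadius)
{
    // 0 = masked away (the wound geometry shows through), 1 = fully opaque.
    return FVector::Dist(PreSkinVertexPos, WoundCenter) <= WoundRadius ? 0.f : 1.f;
}
```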

This meant diving into the guts of Unreal Engine 4’s material system, right down into the proto-shader files that underlie familiar visual node-based materials. Luckily this ended up being quite painless!

Remaining work to be done on the wound system itself is to use a blood mask to dirty-up the edges of the character mesh where it’s hidden by the wound-ellipse.

Other than that, all that remains is to create wound meshes for the characters and hook up and tune the system.

As an aside, I decided to work on this today because work on motion controller IK has stopped for the moment while the new full-body IK with limits plugin is debugged a bit by its author for my use case.

I can’t wait to show off some of the gameplay with the new full-body IK and motion controls, as it’s something you don’t often see outside of triple-A games that use middleware like Havok Behavior or Autodesk HumanIK, and it’s perfectly suited to VR avatareneering.

Progress Update Since Last Time

Dungeon Survival has progressed a lot since the last update. No screenshots currently, but I would like to describe some of the features I’ve completed.

  • Procedural dungeons!
  • Zombies with animation (attacks, lying, sitting, hit reactions, etc.) and AI, with swarm avoidance using Detour and Recast; see the sketch just after this list. Zombies wander, detect and chase players, and attack when close enough.
  • Upgraded to UE 4.7.6.
  • Added several traps, as well as ragdoll assets for the player and zombies, and code to ragdoll on death.
  • Early melee animations.
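
For anyone curious about the zombie avoidance mentioned above: in UE4 the usual recipe is to substitute the Detour crowd path-following component in the AI controller's constructor. A minimal sketch, with a controller class name of my own invention:

```cpp
#include "AIController.h"
#include "Navigation/CrowdFollowingComponent.h"
#include "ZombieAIController.h"  // hypothetical header, for this sketch only

// Swapping UCrowdFollowingComponent in for the default path-following
// component routes movement through Detour's crowd simulation, which gives
// the zombies local avoidance of one another.
AZombieAIController::AZombieAIController(const FObjectInitializer& ObjectInitializer)
    : Super(ObjectInitializer.SetDefaultSubobjectClass<UCrowdFollowingComponent>(TEXT("PathFollowingComponent")))
{
}
```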

Motion controls for the arms/hands are working great with the Rift, and I’m eager to put in Vive/SteamVR support. The game is naturally suited to moving around a room, and I have some ideas for handling long distance locomotion.

Being able to bend over to reach items on the ground is already amazing, but I’m quite limited by the Razer Hydras I currently support.

More work is going into levels and traps, as well as some additional enemies to add to the rats and zombies. Eventually a character creator is going in.

I’ve also found, and plan to implement, physics rope bridges and a system for detecting player breathing, which will translate into character breath in game.

Despite infrequent posts here, development continues to progress at a good clip. I’m currently waiting on certain art assets to become available before the next level design phase takes place.

Quick and Dirty Razer Hydra Support in UE4 Blueprint

Edit: To use the calibration feature, stand in a T-pose (arms extended to either side at shoulder level) and press the calibration button (Start by default).

It doesn’t matter what pose your character is in during calibration as long as they are generally standing.

Many thanks to Getnamo for his Razer Hydra (info and download) plugin and excellent help figuring out how to set this up (more detail about calculations). I was definitely overcomplicating things.

Copy the Hydra plugin into your project directory, enable it in the Plugins menu, and restart the editor.

Make a new BP derived from HydraPlayerController. I called mine DHydraPlayerController, D being for Dungeon (Survival).

Go to World Settings and set the GameMode Override to use the HydraPlayerController BP you just made.

GameMode Override settings for Hydra player controller.

Event graph for Hydra player controller.

This is in the Hydra player controller. Notice the variables there and the Calibrate Hydra function (I bind it to Start, which is easy to find on the controller). The base offset is just a static 40 units in Z; it doesn’t change. I could use a float, but I plan on doing a more thorough calibration step in the future to get more exact mappings, so I’m leaving it as a vector for now. The midpoint is calculated during calibration.
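
In C++ terms, the variables amount to something like the following hedged sketch; the class and property names are mine, layered on the plugin's HydraPlayerController base class.

```cpp
UCLASS()
class ADHydraPlayerController : public AHydraPlayerController
{
    GENERATED_BODY()

public:
    // Static offset, 40 units in Z; never changes. Kept as a vector rather
    // than a float to leave room for a more exact calibration pass later.
    UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "Hydra")
    FVector BaseOffset = FVector(0.f, 0.f, 40.f);

    // Midpoint between the two controllers, computed during calibration.
    UPROPERTY(BlueprintReadWrite, Category = "Hydra")
    FVector HydraMidpoint = FVector::ZeroVector;
};
```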

Event graph for Hydra player controller.

Next comes the Calibrate function, also in the controller:

Calibrate function in Hydra player controller.
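
Since the graph itself is hard to show in text, here is my best plain-C++ rendering of what Calibrate boils down to, assuming the midpoint is simply the average of the two raw hand positions sampled during the T-pose; the accessor names are hypothetical stand-ins for however the plugin exposes the raw data.

```cpp
// Hedged reconstruction of the Blueprint Calibrate function.
void ADHydraPlayerController::CalibrateHydra()
{
    // Raw Hydra positions sampled while the player holds a T-pose.
    // These accessors are placeholders, not the plugin's actual API.
    const FVector RawLeft  = GetRawHandPosition(EHydraHand::Left);
    const FVector RawRight = GetRawHandPosition(EHydraHand::Right);

    // The midpoint between the hands; per-frame hand targets are measured
    // relative to this point from now on.
    HydraMidpoint = (RawLeft + RawRight) * 0.5f;
}
```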

Next we’re going into the character’s animation Blueprint.

Open your character’s animation Blueprint.

First, add these variables:

Character anim Blueprint variables.

Then create this (looks messy but is simple). Pull off the character node to get the controller.

Set positions in the animation Blueprint event graph. Pull from the character node to get the Controller.
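
As a rough C++ equivalent (reusing the hypothetical controller sketched earlier): each update, pull the calibrated positions from the controller into the anim Blueprint variables.

```cpp
// LeftHandTarget / RightHandTarget stand in for the variables added above.
void UDCharacterAnimInstance::NativeUpdateAnimation(float DeltaSeconds)
{
    Super::NativeUpdateAnimation(DeltaSeconds);

    APawn* Pawn = TryGetPawnOwner();
    if (Pawn == nullptr)
    {
        return;
    }

    if (ADHydraPlayerController* PC = Cast<ADHydraPlayerController>(Pawn->GetController()))
    {
        // (raw - midpoint) + base offset, mirroring the Blueprint graph.
        LeftHandTarget  = (PC->GetRawHandPosition(EHydraHand::Left)  - PC->HydraMidpoint) + PC->BaseOffset;
        RightHandTarget = (PC->GetRawHandPosition(EHydraHand::Right) - PC->HydraMidpoint) + PC->BaseOffset;
    }
}
```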

The last step is to set up something like this in the AnimGraph. Ignore the wiring that goes off screen; it’s just doing a static rotation to fix the default hands for my particular character.

Set up the hand IK using the animation Blueprint variables from before.

I’m using component space for the IK target positions.
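
If a target starts out in world space, converting it for those nodes is a single transform through the owning mesh component; a small sketch (the helper name is mine):

```cpp
// World-space point -> the skeletal mesh's component space, which is the
// space the IK target positions above are expressed in.
FVector UDCharacterAnimInstance::WorldToComponentSpace(const FVector& WorldTarget)
{
    const USkeletalMeshComponent* Mesh = GetSkelMeshComponent();
    return Mesh ? Mesh->GetComponentToWorld().InverseTransformPosition(WorldTarget)
                : WorldTarget;
}
```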

I’ll be porting this in a more polished state into the VR Game Templates by Mitch (of Team Metatron).