Unofficial OVRLipSync Plugin for UE4

After messing with PocketSphinx’s phoneme recognition and not getting the results I wanted, I looked back into porting Oculus’ LipSync plugin for Unity over to UE4, since the Unity plugin is just a wrapper for a DLL and some examples of how to use it.
I’m happy to report that I’ve got a basic version of the OVRLipSync plugin working in UE4, and it’s ready for people to use. I’ve gone ahead and made an example project to show how to use the UE4 version of the plugin (quite straightforward, see example images below). The project has an example mesh to see it in action, and should work out of the box:

Example Project + Plugin Repo:
https://github.com/ChairGraveyard/ovrlipsync-example

Plugin Only Repo:
https://github.com/ChairGraveyard/ovrlipsync-ue4

For those that just want a quick rundown on how to use it without downloading the example project, here are some screenshots of the VisemeGenerationActor derived blueprint class.

VisemeGenerationActor event graph setup:

SetMorphTargets function:

MorphIdxToNames array:

As long as the mesh has the appropriate morphs as listed above, it will work decently well. I’m sure there are things I’m doing wrong, and bug reports are welcome!
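For anyone who prefers C++ over Blueprint, here's a rough equivalent of what the SetMorphTargets function above does, assuming the viseme weights arrive as a float array in the same order as the MorphIdxToNames entries. This is just a sketch, not the plugin's actual code:

#include "Components/SkeletalMeshComponent.h"

// Rough C++ equivalent of the SetMorphTargets Blueprint function shown above.
// Assumes VisemeWeights (0..1) arrives in the same order as MorphIdxToNames;
// adjust the names to match the morphs on your own mesh.
void ApplyVisemesToMesh(USkeletalMeshComponent* Mesh,
                        const TArray<float>& VisemeWeights,
                        const TArray<FName>& MorphIdxToNames)
{
    if (!Mesh)
    {
        return;
    }

    const int32 Count = FMath::Min(VisemeWeights.Num(), MorphIdxToNames.Num());
    for (int32 Idx = 0; Idx < Count; ++Idx)
    {
        // Drive the morph target (blend shape) on the skeletal mesh by name.
        Mesh->SetMorphTarget(MorphIdxToNames[Idx], VisemeWeights[Idx]);
    }
}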

Floating Hands: Good Riddance

I never liked the idea of floating hands, and now that I’m no longer trying to get a dev kit from HTC/Valve (they expressed interest in mini-games with low replay value and minimal content, the opposite of what Dungeon Survival is aiming for), I’m doing away with them entirely in favor of the true-first-person presentation I always intended to provide, which would have ended up as a mere option if floating hands had stayed.

With an inverse kinematics solution that’s more robust than the generic CCD/two-bone IK provided out of the box by most engines, and which also features joint constraints, the “elbow problem” that prompted Oculus and Valve to so vehemently recommend against true-first-person avatar bodies simply doesn’t exist, so the argument for floating hands crumbles.

For fans of floating hands over a tracked, full-body avatar (why!?), they may come back as an option post-release.

Work-in-Progress Automatic Lip Sync

This is a quick test of the first hackish pass at doing automatic lip sync based on phoneme recognition from PocketSphinx.

I had reluctantly set the idea aside due to poor results reported by UE4 forum user ShaneC, who wrote a nice wrapper for PocketSphinx as a plugin. After some fiddling around and much searching of usergroups about Sphinx, I was able to determine that passing in a blank option for the phoneme recognition file makes it work pretty reliably for some reason.
Playback of the visemes is a bit choppy right now for a number of reasons, mostly because it hasn’t had a polish pass to make it look nice, and because I’m not yet passing through the information about how long each phoneme lasts.
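For anyone curious what a smoothing pass might look like, one simple option (not what this early test does) is to interpolate the currently applied viseme weights toward the most recently recognized viseme every tick instead of snapping to it. The interpolation speed below is just a made-up starting value:

#include "CoreMinimal.h"

// One possible smoothing pass, purely illustrative: ease the applied weights
// toward the latest recognized viseme rather than snapping. InterpSpeed is a
// made-up tuning value, not something taken from the actual implementation.
void SmoothVisemes(TArray<float>& CurrentWeights,
                   const TArray<float>& TargetWeights,
                   float DeltaTime,
                   float InterpSpeed = 12.f)
{
    const int32 Count = FMath::Min(CurrentWeights.Num(), TargetWeights.Num());
    for (int32 Idx = 0; Idx < Count; ++Idx)
    {
        // FInterpTo gives a framerate-independent ease toward the target weight.
        CurrentWeights[Idx] = FMath::FInterpTo(CurrentWeights[Idx],
                                               TargetWeights[Idx],
                                               DeltaTime, InterpSpeed);
    }
}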

Needless to say this is completely pre-alpha and doesn’t represent the final result, just an extremely early first test.

Demonstrating Dynamic Pickup Objects in Dungeon Survival (Quick Video)

Hey guys, this is a quick video I did to demonstrate how the physics objects and picking them up will work in Dungeon Survival, which also plays into combat.

I’ve been working with physics engines for about 7 years now, including implementing them from scratch in commercial game engines and building complex physics features. One thing I, like every other VR enthusiast, have realized is that without the haptic feedback of a real sword you need to do something special to get a feeling of weight, and specifically something different from the kinematic attachment you normally see, where the weapon is essentially bolted onto your hand and has effectively infinite mass, able to move anything regardless of how heavy it is.

To that end I wanted to test a method of letting the objects you pick up, including weapons, remain fully dynamic (i.e. they can still be moved by other physics objects in the simulation, and they can no longer shove aside arbitrarily heavy objects).
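For the curious, here’s a minimal sketch of what a “dynamic” pickup can look like in UE4 C++. This isn’t necessarily how the system in the video is implemented; it just shows the general idea of constraining the item to the hand through the physics solver (so it keeps its own mass and collisions) rather than welding it on kinematically:

#include "Components/PrimitiveComponent.h"
#include "PhysicsEngine/PhysicsConstraintComponent.h"

// Minimal sketch of a "dynamic" pickup: the held item keeps simulating physics
// and is held in place by a constraint, so heavy obstacles can still block it
// and light objects can still be knocked around by it.
UPhysicsConstraintComponent* GrabItem(UPrimitiveComponent* HandComponent,
                                      UPrimitiveComponent* Item)
{
    if (!HandComponent || !Item)
    {
        return nullptr;
    }

    // Keep the item fully dynamic; it retains its own mass and collisions.
    Item->SetSimulatePhysics(true);

    UPhysicsConstraintComponent* Grip =
        NewObject<UPhysicsConstraintComponent>(HandComponent->GetOwner());
    Grip->RegisterComponent();
    Grip->AttachToComponent(HandComponent,
                            FAttachmentTransformRules::SnapToTargetNotIncludingScale);
    Grip->SetWorldLocation(Item->GetComponentLocation());

    // Lock rotation so the item follows the hand's orientation through the solver.
    Grip->SetAngularSwing1Limit(ACM_Locked, 0.f);
    Grip->SetAngularSwing2Limit(ACM_Locked, 0.f);
    Grip->SetAngularTwistLimit(ACM_Locked, 0.f);

    // Tie the item to the hand via the physics solver instead of kinematically.
    Grip->SetConstrainedComponents(HandComponent, NAME_None, Item, NAME_None);
    return Grip;
}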

Without further ado, here’s the video:

 

YouTube version: https://www.youtube.com/watch?v=mYw-e0qTJkA

Please note this is pre-alpha footage, and does not represent the final state of anything featured!

Notice that when I hit small objects with low mass, like the mug and basket, they are easily moved with the sword, which is of a similar mass.

The table, by contrast, blocks the sword and moves it as I swing (as hard as I reasonably can).

In Dungeon Survival, you’ll need to be aware of your surroundings, as your weapon may be caught in obstructions, blocked/moved away by an opponent, or knocked out of your hand outright!

Belated Screenshot Saturday – Dungeon Survival

Screenshot: interior cabin in the overworld.

About Dungeon Survival

Features Overview

  • Made from the ground up for VR, with unique features to support motion controls
  • Next-gen VR features today: Voice-input commands, NPC facial morphs and more.
  • Co-op multiplayer (at least 2-4 players, maybe more unofficially supported)
  • Streaming, no-load-screen transitions between levels of the dungeon
  • True First Person (you can see the character’s body, including animations)
  • Deep Item System – Fire burns things, water douses them, etc. – generalized gameplay that layers up to produce complex interactions
  • Real-time Physics Interactions and Combat – Items are physics based as in games like Skyrim, combat is based on physical collisions – if you see a hit – you hit!
  • Environment Interaction Over Hack ‘n’ Slash – Use your wits to overcome the dungeon’s deadly foes and traps! Ingenuity is rewarded over blunt force.
  • A Customizable Experience – Want hardcore roguelike style gameplay with permadeath? Or would you rather have a more Skyrim-style experience? You choose!
  • Modding Support – Via both configuration file editing (recombine base gameplay of items/monsters with new looks and combinations of behaviors) and advanced modding through the UE4 engine. Modding makes for a replayable experience you can keep coming back to for years to come.


About The (VR) Dungeon Survival Project

The (VR) Dungeon Survival Project is a game built out of my desire to make something that captures the magic of deep item and environment interaction gameplay of roguelikes, on top of a foundation of survival and real time gameplay.

In addition, the game is my experiment into VR and supporting novel input devices for more interactive, fine-grained gameplay. What this means is that ultimately Dungeon Survival will support setups that allow the player to play with natural input, such as grasping and manipulating objects with the HTC Vive controllers.

Mod support is also very important to me, as I fully understand that mods are the key to longevity of games of this sort. Most things will be able to be tweaked or added to without any tools other than a text editor, but advanced functionality will require downloading a mod pack with source files for the game, and getting access to Unreal Engine 4.

Left 4 Dead 2 Style Wound System Complete (Mostly)

Great news!

The Left 4 Dead 2 style wound and dismemberment system mentioned previously, which will be powering the gruesome world of Dungeon Survival, has now been completed (save for a couple of odds and ends, and art assets).

Previous screenshots were taken using a hacky method of transforming the hit location into mesh-relative coordinates, whereas the proper way of doing it, and really the only way that truly works, is to use the “pre-skin” vertex positions.

For non-developers this essentially means getting at the points that make up the character mesh before they’re transformed by the animation rig.

The effect of this is that the opacity-mask sphere which creates the “hole” in the character mesh so that the wound geometry shows through will follow along with any animation properly.

This meant diving into the guts of Unreal Engine 4’s material system, right down into the proto-shader files that underlie familiar visual node-based materials. Luckily this ended up being quite painless!

Remaining work to be done on the wound system itself is to use a blood mask to dirty-up the edges of the character mesh where it’s hidden by the wound-ellipse.

Other than that, all that remains is to create wound meshes for the characters and hook up and tune the system.

As an aside, I decided to work on this today because work on motion controller IK has stopped for the moment while the new full-body IK with limits plugin is debugged a bit by its author for my use case.

I can’t wait to show off some of the gameplay with the new full-body IK and motion controls. It’s something you don’t often see outside of triple-A games that use middleware like Havok Behavior or Autodesk HumanIK, and it’s perfectly suited to VR avatar embodiment.

Brief Dungeon Survival Update

I’ve gotten motion controls, via the hardware-agnostic motion controller interface in UE4, working with the Skyrim-style inventory I’ve had set up for a while now.

It’s amazing fun tossing items around and catching them in the air, and the IK for the hands is great 😀

I’d really love to get my hands on an HTC Vive DK2 in order to get all this integrated on more final hardware, but even the Razer Hydra is really awesome.

(Note, for coherency’s sake, that these controls already worked in a previous build of Dungeon Survival that didn’t have the inventory system, some of the character work, enemies, etc. set up; this update is mainly about now being able to support the Vive, Oculus Touch, and PSVR Move controllers.)

Plugin Spotlight: Sound Visualization in Packaged Builds

I’m sure many UE4 developers are aware of the neat Sound Visualization plugin that comes with the editor, which allows you to get frequency and spectrum information from a SoundWave.

Those who have tried to use the plugin for anything in their own projects will also know that the stock version does not support running outside the UE4 editor, which makes it rather useless for serious projects.

If you were to fix the in-editor-only problem, you’d quickly run into the issue that decompressing a SoundWave to use in the Sound Visualization plugin causes the entire sound to be decompressed – this can be bad with large sound files which decompress to multiple gigabytes of WAV sound data.

To that end, a while ago I built an asynchronous, multi-threaded sound decompression class that lets the user pass in a portion of the sound to be decompressed, and returns only the decompressed audio for that portion.
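The decompression class itself isn’t reproduced here, but for anyone wanting to roll their own, this is roughly the FAsyncTask pattern such a class can be built on. DecompressSoundRange() is a hypothetical stand-in for the actual per-chunk decompression work, not a real engine function:

#include "Async/AsyncWork.h"
#include "Sound/SoundWave.h"

// Sketch of the async-task pattern only; the real per-chunk decompression is
// represented by the hypothetical DecompressSoundRange() call below.
class FDecompressSoundChunkTask : public FNonAbandonableTask
{
    friend class FAsyncTask<FDecompressSoundChunkTask>;

    USoundWave* SoundWave;
    float StartTime;      // seconds into the sound
    float Duration;       // length of the chunk to decompress
    TArray<uint8> OutPCM; // decompressed result for just this chunk

    FDecompressSoundChunkTask(USoundWave* InWave, float InStart, float InDuration)
        : SoundWave(InWave), StartTime(InStart), Duration(InDuration) {}

    void DoWork()
    {
        // Hypothetical: decompress only [StartTime, StartTime + Duration]
        // of the SoundWave into OutPCM instead of the whole file.
        // DecompressSoundRange(SoundWave, StartTime, Duration, OutPCM);
    }

    FORCEINLINE TStatId GetStatId() const
    {
        RETURN_QUICK_DECLARE_CYCLE_STAT(FDecompressSoundChunkTask,
                                        STATGROUP_ThreadPoolAsyncTasks);
    }
};

// Usage sketch: start the work on the thread pool and poll for completion.
// auto* Task = new FAsyncTask<FDecompressSoundChunkTask>(Wave, 1.0f, 0.25f);
// Task->StartBackgroundTask();
// ... later: if (Task->IsDone()) { /* feed the chunk's PCM to the visualizer */ }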

Not having done anything with it for a while, I sent it over to Unreal Forums user eXi, who has packaged it all up with the Sound Visualization functions into its own plugin.

Check it out: https://forums.unrealengine.com/showthread.php?94974-eXi-s-Sound-Visualization-Plugin-%28works-in-cooked-builds%29&p=441220

Storyteller – Fireside Tales Beta 2 Released

 

The new version of Storyteller brings it up to Unreal Engine 4.10 and the 0.8 Oculus Runtime, greatly overhauls the environment and object graphics, adds an SSAO option, new animated books, and new artwork and maps. The camera also now works with mouse and keyboard when not in VR mode.

In addition to many small tweaks and updates, the selection of built-in audiobooks has been improved.

And finally, audiobook files are now in a more convenient location (Content\Audiobooks) rather than sitting next to the game exe.

 
Get it here:
https://storyteller-vr.com/storyteller-fireside-tales/

Update: Left4Dead2 Style Wounds for Unreal Engine 4

Quick update to my L4D2 style wound implementation for Unreal Engine 4 – I’ve managed to get the wound hit point to follow the animation properly for characters. This was the main thing holding me up from continuing work on this system (namely, extending it to use a capsule instead of a sphere).

An example download is available here: https://forums.unrealengine.com/showthread.php?90474-WIP-Dynamic-Left-4-Dead-2-Style-Wounds-Dismemberment

For those that want to incorporate the new changes I’ve made, here are two screenshots showing the main pieces of code you need to implement (and the modified code for determining the sphere location).

First is in the character blueprint – this is showing the new way to transform the socket location into skeletal-mesh-relative-coordinate-space (that’s a mouthful):
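If you’d rather do this step in C++ than in Blueprint, here is a rough equivalent of what the screenshot shows. The “WoundCenter” parameter name is just an example here, so match it to whatever your material actually expects:

#include "Components/SkeletalMeshComponent.h"
#include "Materials/MaterialInstanceDynamic.h"

// Rough C++ equivalent of the Blueprint above: convert the hit/socket world
// location into the skeletal mesh component's local space and pass it to the
// masked wound material. "WoundCenter" is an assumed parameter name.
void SetWoundLocation(USkeletalMeshComponent* Mesh,
                      UMaterialInstanceDynamic* WoundMID,
                      const FVector& HitWorldLocation)
{
    if (!Mesh || !WoundMID)
    {
        return;
    }

    // World space -> component (mesh-relative) space, so the sphere mask can be
    // compared against the pre-skinned vertex positions in the material.
    const FVector LocalHit =
        Mesh->GetComponentTransform().InverseTransformPosition(HitWorldLocation);

    WoundMID->SetVectorParameterValue(TEXT("WoundCenter"), FLinearColor(LocalHit));
}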

Next up is the material itself. First set your material’s Blend Mode to Masked, then replicate the part that plugs into Opacity Mask:

 

 

New Version (0.8 Oculus Runtime!) of Storyteller Coming Soon

A new update is coming soon to Storyteller, mainly to bring it up to Unreal Engine 4.10 and the 0.8 Oculus Runtime, as Storyteller has been stuck on the old 0.5 runtime and hasn’t been compatible with current Oculus software for some time now.

In addition to that there are some changes to the level, and some performance improvements:

  • Converted all cave rocks to static mesh instances, which almost doubled FPS.
  • Updated all level objects with better versions with higher quality meshes and textures from new art packs I’ve purchased or found.
  • Updated all books, maps and scrolls with much higher quality art.
  • Added an animated book for the storyteller.
  • Added SSAO (toggle-able in game) and a mild bloom to make the fire look nicer.
  • Changed the audiobook loading to look within a nicer directory – Content/Audiobooks, rather than being stuffed in with the exe.
  • Removed problematic bundled audiobooks (e.g. ones that had more than one voice, or sound effects, etc.) and replaced one of them.

I will be updating the actual narrator with a higher quality character mesh, adding multiple narrators to choose from, and adding more levels, but that’s coming a bit later on, as my main focus is still on Dungeon Survival.


Dungeon Survival on Unreal Engine 4.10/Oculus 0.8 Runtime

Bone-Break Dismemberment Plugin for UE4

Unreal Engine forum user Jeff Lamarche recently posted a plugin that adds support for UDK-style “bone-break” dismemberment (read more here) to Unreal Engine 4.10 Blueprints.

While this method does require some art support and preparation, it’s a tried-and-true way to add dismemberment to enemies, and is still frequently used in modern games. One such example is the recent Warhammer: End Times – Vermintide.

Bone-Break Plugin for UE 4.10

Grass Bending Test

Note: The following video is pre-alpha test footage, and in no way represents the quality of the final game.

It’s hard to see due to the video framerate and compression, but there are grass-affecting volumes parented to the player’s feet.

Credit for the effect goes to Daniel Wenograd.

To add this to your own Unreal Engine 4 project, check out Daniel Wenograd’s post on the UE4 forums.

Storyteller – Fireside Tales featured in Vice article about Tripping in the Rift

Connington’s first VR trip was closer to what you’d typically do after taking shrooms. After donning the Rift headset, he was sitting in a cave around a crackling campfire listening to an old man reading George Martin’s A Song of Ice and Fire (the series that became Game of Thrones). The simulation, Storyteller – Fireside Tales, is an immersive audio book that makes it feel like the story’s being told by someone sitting next to you.

“The echoing of the old man’s voice through the cave, the dripping of the water behind me, and the warmth of the fire in front were so intense and real that I felt like I could reach out and touch them,” Connington says.

“I had left my student life behind and become part of the ASOIAF world, feeling like Hodor could walk in at any moment,” he says. “I started to dream I was Bran, stuck in a cave somewhere North of the Wall.”


Image: Oculus Rift simulation Storyteller – Fireside Tales

Read more: http://motherboard.vice.com/read/real-drugs-virtual-reality-meet-the-psychonauts-tripping-in-the-rift

Thanks to Motherboard/Vice and Jon Connington for the shoutout!

Exponential Squared Fog for Unreal Engine 4

While upgrading versions on DotCam and TK-Master’s Ocean plugin, I realized that although the new version has some underwater camera effects, it doesn’t have any fog effect. Having previously run into UE4’s limitation of a single Exponential Height Fog actor per level, I knew I would need to buckle down and write a fog shader myself. It’s not hard, of course, but there are a couple of tricks that should save you time and hair pulling.

First, let’s see how the current underwater shader effect in the example level looks:

Not bad – the animated 4-Way Chaos setup combined with the wavy normal map does an admirable job of simulating the effect of looking through clear water.

But what if your ocean has a high amount of plankton, or other debris?

Something like the above is a bit more suited.

To replicate this, create a new material and set its Material Domain to Post Process.

Then recreate this node layout:

The code for the custom node (ExponentialDepthFog in the image) is very simple. It takes a pixelDist vector (whose length is the pixel’s distance, derived from scene depth in the node graph), a fogDensity scalar, and the fogColor and sceneColor inputs:


 

// Get the distance from scene depth
float dist = length(pixelDist);
float fogFactor = 0;

// Compute our fog lerp factor
fogFactor = exp( -pow(dist * fogDensity, 2));
fogFactor = saturate(1 - fogFactor);

// Lerp between fog and scene color.
return lerp(sceneColor, fogColor, fogFactor );

Once you’ve created the material, add it to your underwater PostProcessVolume under the blendables section.
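If you’d rather not assign the material in the editor, you can also push it into the volume’s blendables at runtime, which is handy if the fog should only kick in while the camera is actually underwater. Here’s a minimal sketch, assuming the material exposes FogColor and FogDensity parameters (rename them to match your own material):

#include "Engine/PostProcessVolume.h"
#include "Materials/MaterialInstanceDynamic.h"

// Minimal runtime alternative to assigning the blendable in the editor.
// "FogColor" and "FogDensity" are assumed parameter names.
void AddUnderwaterFog(APostProcessVolume* UnderwaterVolume,
                      UMaterialInterface* FogMaterial,
                      const FLinearColor& FogColor,
                      float FogDensity)
{
    if (!UnderwaterVolume || !FogMaterial)
    {
        return;
    }

    UMaterialInstanceDynamic* FogMID =
        UMaterialInstanceDynamic::Create(FogMaterial, UnderwaterVolume);
    FogMID->SetVectorParameterValue(TEXT("FogColor"), FogColor);
    FogMID->SetScalarParameterValue(TEXT("FogDensity"), FogDensity);

    // Weight of 1.0 means the effect is fully blended in while inside the volume.
    UnderwaterVolume->Settings.AddBlendable(FogMID, 1.0f);
}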

Unreal Engine 4 Plugins Shoutout!

Big news and updates for the game will be coming soon, but for now I’m going to do a shoutout to some of my favorite Unreal Engine 4 plugins and their authors!

Speech Recognition

First up is a brand new plugin from forum user ShaneC – Speech Recognition

Driven by PocketSphinx (the portable edition of CMU’s Sphinx speech recognition library), this plugin lets you take microphone input and a word list, and detect which words the user has said. It adds several Blueprint nodes, and ShaneC was kind enough to demonstrate their usage.



Dungeon Survival will be using this plugin for multiple systems, which I’ll be going over in a future update. For now, it’s a secret 😉

Ocean Simulation

Next up we’ve got another free community plugin, DotCam and TK-Master’s Ocean plugin. This is one of my favorites, having always been interested in ocean simulation and rendering.


The latest version features screen space reflections, and an infinite-system component for unbounded oceans. It’s already in use in Dungeon Survival and the upcoming new version of Storyteller.

Dungeon Generation

Dungeon Architect by Ali Akbar, the procedural dungeon creation tool you’ve always wanted.

For quite some time, Dungeon Survival was planned to feature only hand-crafted dungeon levels. This was before the scope of the game had increased to include an overland component, and towns and villagers to visit and interact with.

After some feedback from Reddit’s Roguelike subreddit, I decided that procedural generation of at least dungeon layouts was something I wanted to put in. I spent some time investigating and implementing methods of my own, but quickly abandoned them once forum user Ali Akbar showed off his amazing Dungeon Architect plugin.

That’s all for now, but I have big updates and news about Dungeon Survival on the way!

Motored Ragdolls

By default, ragdolls in most games will bonelessly flop to the ground. Newer games, and especially AAA developers, have been adding motored joints to alleviate this, or blending from an animation into the ragdoll (or back out of one) using motors.

With some simple motor forces it’s possible to get a stiffer ragdoll that resists being tossed around or pushed over, and that naturally falls into a flatter pose which is easier to recover from with stand-up animations.
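For anyone wanting to experiment with this in UE4, here’s a minimal sketch of that kind of motor setup. The spring, damping, and blend-weight numbers are made-up starting points rather than tuned values from the game, and how closely the drives track the animated pose depends on your physics asset settings:

#include "Components/SkeletalMeshComponent.h"

// Minimal sketch of a motored ragdoll: enable angular position drives on every
// body so the ragdoll resists flopping, then blend physics with animation.
// All numbers below are untuned, illustrative starting values.
void EnableMotoredRagdoll(USkeletalMeshComponent* Mesh)
{
    if (!Mesh)
    {
        return;
    }

    // Simulate every body in the physics asset.
    Mesh->SetAllBodiesSimulatePhysics(true);

    // Drive each joint's swing and twist toward its target orientation...
    Mesh->SetAllMotorsAngularPositionDrive(/*bEnableSwingDrive=*/true,
                                           /*bEnableTwistDrive=*/true);

    // ...with stiffness and damping that resist being tossed around or pushed over.
    Mesh->SetAllMotorsAngularDriveParams(/*InSpring=*/1000.f,
                                         /*InDamping=*/100.f,
                                         /*InForceLimit=*/0.f);

    // Blend physics with animation so the pose stays readable and is easier
    // to recover from with stand-up animations.
    Mesh->SetAllBodiesPhysicsBlendWeight(0.75f);
}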

Fans of melee combat – Reverence is a must see!

Reverence launched its Kickstarter campaign today, asking for a modest $5k in light of the features already implemented.

Obviously being a huge melee combat fan myself, I am going to back this for sure. While the graphics may be rough around the edges, the gameplay features more than make up for that lack, and that can still result in a beautifully executed game, as in the case of a favorite of mine, Mount & Blade.

Some of the highlights in Reverence include:

* Full melee combat control!
* Dynamic real time slicing – cut through trees etc.
* Physics based parrying
* Stance system

Fans of melee combat should definitely check this out and give it a go!

Dungeon Survival and Linux

Recently we had the news that Oculus will, at least initially, only support Windows. Understandable, if not ideal.

Today, Valve announced the same for the Vive, which again, isn’t great. Though at least Valve have indicated that they will eventually support it, while Oculus’ statements have been less clear on that point.

In any case, I’ve always wanted to bring Dungeon Survival and especially its VR support to Linux, and even Mac, despite my not regularly using the former, and never using the latter. It’s been hard sticking with that, but I will continue to do so. Unfortunately, for the moment this means that Dungeon Survival won’t support VR on Linux for the Rift or Vive, though hopefully it will support OSVR on Linux. Completely weird.

Mac support is and always has been more nebulous just due to the fact that Mac OpenGL drivers are…not so lovely.

I would like to ultimately have full support on all three major PC platforms, and will try to maintain that goal, even if it means support is delayed for a while.

Making (optional) Hardcore/Perma-Death Modes Fun

“Let me sing you the song of my people.” – Death

Hardcore or permanent-death (perma-death) modes are an interesting dilemma. On one hand, they absolutely do amp up the tension of most games in a very noticeable way: many players become hyper-vigilant, or come up with extremely creative solutions the designer never conceived of in order to overcome challenges. On the other hand, perma-death is incredibly frustrating.

Although perma-death is almost universally considered a frustrating mechanic, it clearly exists (or existed) in games, and at least a small group of players actively wants it. For me, that immediately raises the question of why, beyond the obvious frustration issue, perma-death died out.

I think the answer comes down to a dichotomy of how interesting in gameplay terms the death is compared to how frustrating it is to start over. If the former can clearly and cleanly overcome the latter, perma-death becomes a compelling mechanic (for some people, at least). One example of this is seen in old school dungeon crawler and roguelike games that had such modes: leaving behind your character and all his stuff and allowing the player to find it with a new character.

This is but one method, but it’s a core one to the concept of perma-death. It creates a sense of permanence of the world. It’s not a world of heroes that revolves around a single earth-shattering protagonist, it’s a brutal, grey and drudgery filled death trap, where the slightest misstep can spell your doom. It also means that the world can go on without your character. The story doesn’t end because they died on dungeon level 25, the game doesn’t force you to reload a save or checkpoint. That character is simply dead, lying on dungeon level 25, their bones being picked clean, loot forgotten….

Until their outraged sibling/wife/offspring/relation comes hunting for the family heirlooms and a big helping of revenge.

So if you’re considering adding perma-death to your game as an optional mode, strive to make it more interesting than frustrating, and use it to make the world more alive.

Book Review: Designing Virtual Worlds

The quote by Richard Bartle from this article is great. An example can be seen in the choices made by World of Warcraft’s designers (as well as those who chose to blindly copy WoW itself) when they decided to clone EverQuest without understanding the decisions of EverQuest’s designers, or why Verant created MUD-like combat in the first place. Verant’s goal, after all, was to create (one of) the first graphical MUDs, and they did that admirably.

Back then, most people would have scoffed if you told them that graphical MMOs would spend the next 15 years simply copying forward the MUD-style serial hotbar combat without any real thought to the reasons why such combat existed.

The stagnation of combat in modern MMOs for the past decade can at least partially be attributed to this lack of insight, or lack of desire on the part of the designers to even attempt such introspection.

And for those who might exclaim, “But wait! Hotbar graphical MUD-style combat has been copied so often only due to insurmountable technical limitations of bandwidth, latency and processing power,” I point to Planetside, released in 2003, more than a year before WoW even launched, which featured real-time FPS-style combat at MMO scale. MUD-style hotbar combat hasn’t been necessary for at least that long.

Metaversing

It has been over a year since my last review of a vintage virtual reality book. I’ve recently come across a good one that I’d like to share.

In 1978, Richard Bartle co-authored MUD, the very first virtual world. In 2003, he shared his twenty-five years of virtual world and MMORPG experience in the book Designing Virtual Worlds. Here are some excerpts from the preface:

Too much virtual world design is derivative. Designers take one or more existing systems as foundations on which to build, sparing little thought as to why these earlier worlds were constructed the way they were.

Are designers even aware that there are decisions they can unmake? Although a good deal of design is evolutionary, that does not mean designers can’t be revolutionary, too.

The key is in recognizing the fact that what seems eminently logical to you from your usual perspective might turn…

View original post 753 more words

FATED and the Quest for Optimized VR

FATED BLOG

Mik

Hi everyone!

With Oculus’s recent announcement regarding requirements and specs for the consumer version of their HMD (https://www.oculus.com/blog/powering-the-rift/), I figured it was the perfect time to write that performance bit I teased about the last time around. Let’s see how we’re dealing with optimization on FATED! First, some math to have a clear vision of what we’re trying to achieve.

Know Your Numbers!

FATED is pretty much fillrate bound (http://en.wikipedia.org/wiki/Fillrate), and it’s safe to assume that most early VR games will also be. This is why the following info is important.

A current generation game will generally push about 124 million pixels per second when running at 60 fps in 1080p. FATED is currently running on a GTX 970 (the recommended card for the consumer version of the Oculus) at ~305 million pixels per second.

1920X1080 upscaled by 140% = 2688X1512 * 75(Hz) = ~305…

View original post 912 more words