GIANT SQUID

@giantsquidology-blog / giantsquidology-blog.tumblr.com

Development Blog of Giant Squid

Camera Control in ABZÛ

Last time we wrote about our Fluid Controls, which touched on our camera. Today we’ll expand on that and explore how the camera works in ABZÛ.

Our design for the fluid camera started with these goals:

  1. Don’t roll with respect to the horizon, even as the diver rotates freely.
  2. Mirror the fluidity of the diver’s movement while still following predictively rather than lagging behind.
  3. “Just work” from any direction, with a wide gamut of unusual scene-collision arrangements, without needing lots of designer annotation.
  4. Seamlessly move in and out of in-game cutscenes.

Overview

We expanded on the basic ideas that John Nesky developed for Journey, while also responding to wrinkles introduced by freeform swimming:

Orbit Camera

The primary module which runs the interactive camera during gameplay. We use a three-step pipeline: Detectors, DOF Solvers, and Constraints.

Detectors gather data from the scene, sanitize it, and extract the camera-specific inputs: player input, diver kinematic-prediction & acrobatics (as described in the last article), 2D forward direction, the follow direction and strength, gameplay boundary conditions, collision neighborhoods & distance-fields, designer hints, and special events (surface-breaching, flipping, boost-chaining, going-over-ledges, riding creatures, etc).

The 2D Forward Detector generally picks the direction that the diver is pointing. However, when the diver is pitched steeply up or down, this direction becomes ambiguous, so the detector uses her belly or back direction instead.
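As a rough illustration, the fallback logic might look like the following (a hypothetical Python sketch, not the shipping code; the vector convention and threshold are assumptions, using Z as up to match the convention described below):

```python
import math

def forward_2d(forward, belly, pitch_threshold=0.9):
    """Pick a 2D heading from the diver's orientation (hypothetical sketch).

    forward, belly: unit 3-vectors (x, y, z), Z up.
    When the diver is pitched steeply, her forward vector is nearly
    vertical and its horizontal part is ambiguous, so fall back to the
    belly/back direction, which stays roughly horizontal in that pose.
    """
    source = belly if abs(forward[2]) > pitch_threshold else forward
    x, y = source[0], source[1]
    length = math.hypot(x, y)
    if length < 1e-6:
        return (1.0, 0.0)  # degenerate fallback heading
    return (x / length, y / length)
```

The threshold controls how steep the pitch must be before the detector switches sources; a real implementation would also hysterese the switch to avoid popping.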

DOF Solvers compute the main Degrees of Freedom:

  1. Tracking Position - the world-space location we’re looking at, usually the diver’s collar-bone.
  2. Pitch - looking up and down, deadzoned around a slightly-down pitch.
  3. Yaw - looking left and right, pulled along like a leash.
  4. Distance - the pull-back from the tracking position, annotated by level designers.
  5. Framing - where the tracking position is placed on-screen, typically positioning the diver according to the rule of thirds.

The DOF results are converted to a world-space POV (“point of view”) representing the actual location and rotation of the camera. The rotation, represented as a quaternion, is computed from Euler angles, and the location is the sum of the tracking position and a rotated local vector which combines the distance pull-back and the framing offset (X = forward and Z = up):

POV.Rotation = Quat.Euler(0, DOF.Pitch, DOF.Yaw)

// convert the screen-space framing into a
// world-space “parallax” offset using the camera’s
// field of view and the screen’s aspect ratio
TanFOV = Math.Tan(0.5 * DegreesToRadians(FieldOfView))
ScreenToWorld = DOF.Distance * Vec(TanFOV, TanFOV / AspectRatio)
Parallax = ScreenToWorld * DOF.Framing

// Pullback in the local forward/backward direction (X),
// and parallax in the side-to-side directions (YZ)
LocalOffset = Vec(-DOF.Distance, Parallax.X, Parallax.Y)
POV.Location = DOF.Tracking + (POV.Rotation * LocalOffset)
Screenshot of our in-game Camera DOF Visualizer. The yellow bit is the framing parallax (here placing the diver in the bottom third of the screen).

Constraints take the DOF results and nudge and/or clamp them to safe ranges to account for: line-of-sight occlusion, water-surface breaching, smoothing (using critically-damped springs, to avoid acute speed-hitches), and “custom camera” matching (discussed later).

An example LOS (”line-of-sight”) constraint. First the distance is clamped so that it doesn’t go inside solid collision, and then the pitch & yaw are nudged to try and restore the original orbit distance (biased towards the follow direction).
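The distance-clamping half of that constraint can be sketched like so (hypothetical Python; the raycast that produces hit_distance is assumed to run elsewhere, from the tracking position toward the camera):

```python
def constrain_los_distance(distance, hit_distance, margin=0.25):
    """Sketch of the distance half of a line-of-sight constraint.

    distance: the desired orbit pull-back distance.
    hit_distance: distance to the first collision hit along the
        tracking-position-to-camera ray, or None if the ray is clear.
    If collision blocks the view, clamp the camera just in front of
    the hit so it never goes inside solid geometry.
    """
    if hit_distance is None:
        return distance
    return min(distance, max(hit_distance - margin, 0.0))
```

The pitch/yaw "nudge to restore distance" step described above would then run after this clamp, searching toward the follow direction for an orbit angle with more clearance.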

The orbit camera behaves a little differently at different times. For instance, when the diver flips, we don’t want to swing the camera around with her. Therefore, each step is post-hooked by an override. These are ordinary game objects which implement an abstract interface with various optional methods to override default orbit behavior. For the programming-curious, it looks something like this (though it’s a bit more complicated in production):

interface IOrbitCameraDelegate {
  OverrideCamTracking(Camera* Cam, vec3* InOutLocation)
  OverrideCamFraming(Camera* Cam, vec2* InOutLocation)
  OverrideCamPitch(Camera* Cam, float* InOutPitch)
  OverrideCamYaw(Camera* Cam, float* InOutYaw)
  OverrideCamDist(Camera* Cam, float* InOutDist)
}

Delegates default to the diver, but can also be set explicitly in scripting for special moments. Structuring these overrides to use ref-arguments instead of return values was helpful to perform blending or hysteresis in the delegate itself.
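To illustrate why the ref-argument shape matters, here is a hypothetical delegate sketched in Python (a one-element list stands in for a C++ ref-argument; the names are illustrative, not Giant Squid’s actual code). Because the delegate reads the incoming value rather than returning a fresh one, it can blend toward its own target instead of stomping the solver’s result:

```python
class FlipDelegate:
    """Hypothetical override: while the diver flips, ease the camera
    yaw toward a held value instead of chasing her rotation."""

    def __init__(self, held_yaw, blend_rate=0.25):
        self.held_yaw = held_yaw      # yaw captured when the flip began
        self.blend_rate = blend_rate  # fraction blended per update

    def override_cam_yaw(self, cam, in_out_yaw):
        # Read the orbit solver's yaw, then nudge it toward the held
        # yaw -- the blending/hysteresis lives inside the delegate.
        in_out_yaw[0] += self.blend_rate * (self.held_yaw - in_out_yaw[0])
```

Each call moves the yaw a fraction of the way toward the held value, so the hand-off in and out of the override stays smooth.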

Custom Camera

We use custom cameras for scene bookending, cutscenes, authored-animations, and other special-cases where we need total control, without any side-effects, smoothing, or constraints. The POV is supplied by a second optional delegate:

interface ICustomCameraDelegate {
  CamBlend(Camera* Cam, float* OutTime, EasingType* OutEasing)
  CamPOV(Camera* Cam, vec3* OutLoc, quat* OutRot)
}

The advantage of an abstract interface is that anything can be a custom camera. It helped us consolidate our scripting to actors, without having webs of tightly-coupled components.

Orbit <-> Custom Blender


This module transitions fluidly between the orbit and custom cameras. It has three states:

Pure Orbit: there’s no custom camera, so we just pass through the orbit result (99% of the time).

Pure Custom: like pure orbit, we just pass through the custom camera; however, we also update the orbit constraints to match the custom camera’s rotation, so that when we return it won’t swing wildly and induce simulation sickness.

Blending: when a new custom camera is set or unset, we bookmark the current blended POV (because we might be transitioning from another custom camera, not just the orbit) and then blend in or out of the custom camera. There are lots of tricky bits here that are necessary to keep the camera fluid:

  1. Extrapolate the blend-from location & rotation using the initial blended POV speed so there are no speed hitches during the blend.
  2. Apply easing to the interpolation so it’s not an unnatural linear movement (we use smoothstep by default, but this can be overridden by the custom camera delegate).
  3. Don’t interpolate along a straight line - in general we compute cubic hermite splines whose tangents are scaled by the amount of rotation so we don’t feel like we’re “cutting across corners.”
  4. Make sure the rotation axis is consistent. In general, we rotate along the smallest arc using slerp, however, e.g., if we started rotating clockwise, then we make sure to keep rotating that way even if the smallest arc changes mid-transition. In 3D we detect this by ensuring that the dot-product of two consecutive rotation axes is positive.
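For reference, the shortest-arc slerp and the default smoothstep easing mentioned above look roughly like this (a generic Python sketch, not the production code; the production version adds the consecutive-axis dot-product check described in point 4 on top of this):

```python
import math

def slerp(q0, q1, t):
    """Spherical interpolation between unit quaternions (w, x, y, z),
    taking the shortest arc by default."""
    dot = sum(a * b for a, b in zip(q0, q1))
    if dot < 0.0:
        # Negate one endpoint so we rotate along the smaller arc.
        q1 = tuple(-c for c in q1)
        dot = -dot
    dot = min(dot, 1.0)
    theta = math.acos(dot)
    if theta < 1e-6:
        return q0  # endpoints are (nearly) identical
    s0 = math.sin((1.0 - t) * theta) / math.sin(theta)
    s1 = math.sin(t * theta) / math.sin(theta)
    return tuple(s0 * a + s1 * b for a, b in zip(q0, q1))

def smoothstep(t):
    """Default blend easing: 3t^2 - 2t^3, with zero speed at both ends."""
    return t * t * (3.0 - 2.0 * t)
```

Because smoothstep has zero derivative at t = 0 and t = 1, the blend eases in and out rather than starting and stopping abruptly.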

Shake

Shake is applied after all the other processes as a “post effect” so that we avoid feedback between the shaking parameters and the baseline POV. We support two kinds of shakes: a simple-shake which is easy to script, and a custom shake which takes a curve asset for syncing up with animations.

Conclusion

I hope you enjoyed our whistle-stop tour. It all seems pretty straightforward in hindsight, but we also experimented with many more false-starts and nice-in-theory-bad-in-practice prototypes along the way. As with the diver movement, each module in the final build had about a bazillion tuning parameters that were constantly mixed and monitored throughout the project.

As always, if you have any questions or would like to follow up for more detail, you can ping me @xewlupus -- we look forward to feedback on how our devblogging efforts can better serve fellow developers :)

Max Kaufmann Gameplay Engineer


Fluid Motion in ABZÛ

My name is Max Kaufmann, Gameplay Engineer at Giant Squid (@xewlupus).  In designing the aquatic feel for ABZÛ, we endeavored to allow players to swim gesturally through full 3D environments.

Fluid control presents a challenge because smoothed movement introduces input lag. It’s frustrating when steering is not responsive: when players tilt the stick they expect the diver to move right away, and when they let go they expect her to stop. However, that kind of instant response does not feel fluid. We’re trying to simulate swimming without “feeling swimmy.”

Additionally, as designers, we risk acclimating to slippery controls rather than fixing them. Several times our personal baselines diverged from the experience of new playtesters. We learned to rely on rules, not just our intuition, to rate the success of a particular game-feel tuning.

Our first rule: our controls succeed when the player intuits that the diver is going to drift a little bit after they release the thumbstick.

We began with the camera. We designed the camera yaw to be predictive. Like a cyclist looking into a turn before entering it, when the player steers we rotate the camera to extrapolate where the diver will be facing after drifting, rather than tracking her current heading.

When a player releases the stick, the camera is already aligned correctly and the drift-direction is intuitive. Additive animations also turn her head and curve her body towards the direction of her drift to reinforce the hint. 
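One simple way to form such a predictive yaw target (a hypothetical sketch, not the actual implementation; the decay model and constant are assumptions): if the diver's turn rate decays roughly exponentially after stick release, the total extra rotation she will drift through is the integral of that decay, which collapses to rate times the decay time constant.

```python
def predicted_yaw(yaw, yaw_rate, damping_tau=0.4):
    """Hypothetical predictive camera yaw target.

    If the turn rate after stick release decays as
    rate(t) = yaw_rate * exp(-t / damping_tau), the remaining drift
    rotation is the integral of that curve: yaw_rate * damping_tau.
    Aim the camera at the post-drift heading instead of the current one.
    """
    return yaw + yaw_rate * damping_tau
```

The camera then eases toward this target rather than snapping to it, so prediction reads as anticipation rather than jitter.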

On the downside, when the diver performs gestural moves like loops and flips, extrapolating the camera yaw to match her predicted direction can swing the camera wildly and induce simulation sickness. Through playtesting, we identified these moves and added special detectors which hold the camera steady, dollying out to frame the acrobatics. This maintains the tranquil, underwater atmosphere.


Loops introduce another control issue: input reversing.

We apply steering and pitching in her local coordinate space and integrate with quaternions instead of Euler angles. This allows the diver’s rotations to be freeform, rather than pitch-locked as in flight sims and other underwater games.

This is problematic when the diver pitches upside down. In this belly-up orientation, steering reverses what players expect, and “left” becomes “right.” Through playtesting, we identified our next rule: ignore reversed input.

When players rotate into this belly-up state, instead of steering we apply a twisting force to unroll her body, and we resume player steering once she finishes the transition from belly-up to belly-down.
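The gate might be sketched like this (a hypothetical Python illustration, not the shipping code; the gain and return shape are assumptions), using the diver's up axis against world up to detect the belly-up state:

```python
def steering_response(stick_x, diver_up, unroll_gain=2.0):
    """Hypothetical sketch: suppress reversed steering while belly-up.

    stick_x: horizontal stick input in [-1, 1].
    diver_up: diver's local up axis as a unit 3-vector, world Z up.
    Returns (yaw_input, roll_torque).
    """
    upright = diver_up[2]  # cosine of roll relative to world up
    if upright < 0.0:
        # Belly-up: "left" reads as "right", so ignore steering and
        # instead drive her roll back toward belly-down.
        return 0.0, unroll_gain * (1.0 - upright)
    return stick_x, 0.0
```

In production this boundary would be softened (a deadband and blending rather than a hard sign test) so the hand-back of control doesn't pop.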


The twisting force is one of many corrective forces we apply to softly nudge the diver. We also slow her down near sea creatures, steer her away from geometry, and level the diver out or fix up nearly-vertical headings when the player wants to ascend or descend.

We tried many ways to calculate these forces. For most smooth values, we use critically-damped springs, which advance a speed/value pair to a target set-point in a fixed amount of time without any discontinuities. They have a nice ease-in/ease-out character that keeps the response fluid.

The implementation of Critical Damping can be found in Game Programming Gems vol. 4.
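A Python transcription of that well-known routine looks roughly like this (the polynomial exp approximation is the one from the published version; treat this as a reference sketch rather than Giant Squid’s exact code):

```python
def smooth_cd(current, target, velocity, dt, smooth_time):
    """One step of critically damped smoothing toward a target.

    Returns (new_value, new_velocity). smooth_time controls roughly
    how long the value takes to settle on the target; the spring
    never overshoots or oscillates, giving ease-in/ease-out motion.
    """
    omega = 2.0 / smooth_time
    x = omega * dt
    # Polynomial approximation of exp(-x), stable for typical frame dts.
    exp_term = 1.0 / (1.0 + x + 0.48 * x * x + 0.235 * x * x * x)
    change = current - target
    temp = (velocity + omega * change) * dt
    new_velocity = (velocity - omega * temp) * exp_term
    new_value = target + (change + temp) * exp_term
    return new_value, new_velocity
```

Called once per frame with the stored velocity, this converges on the target without the speed discontinuities a simple lerp-toward introduces.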

However, critical damping doesn’t give us enough control over how we ease in the twisting force. We instead had the best results with a PID Controller. These are further generalized springs that incorporate set-point motion and lag compensation. 

The constants stand for “Proportional, Integral, Derivative”

PID Controllers are frequently used in racing games to design the handling for different types of cars.

 A negative derivative term, for instance, can make it feel "slippery", and a high integral term can make it "push harder" over time. These gave us all the knobs we needed to "mix" her force, like a sound engineer mixing audio channels. When her correction forces harmonize, players hardly notice the adjustment at all.
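As a concrete reference, a textbook PID step looks like this (a generic sketch; the gains and class shape are illustrative, not Giant Squid’s actual tuning code):

```python
class PID:
    """Minimal textbook PID controller.

    kp, ki, kd are the proportional, integral, and derivative gains:
    kp responds to the current error, ki pushes harder the longer the
    error persists, and kd reacts to how fast the error is changing.
    """

    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None

    def step(self, error, dt):
        self.integral += error * dt
        if self.prev_error is None:
            derivative = 0.0  # no history yet on the first step
        else:
            derivative = (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative
```

Feeding it the angle between the diver's current and desired roll each frame yields a torque whose character you can "mix" by adjusting the three gains, as described above.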

Developing original controls is challenging, but the result differentiates our game. We’re confident in the harmony between our fluid game feel and the meditative themes of ABZÛ’s narrative world.


Modelling Sealife - Eagle Ray

Hello, I'm Bryce, the environment artist at Giant Squid. I wanted to share a little bit about our art style and our process for creating the fascinating undersea life in ABZÛ. Let’s start with an eagle ray I did early in the project.

Our goal is to capture the essence of the creature and simplify the real-life animal into a stylized version of itself. The first step is to analyze a lot of real-life reference:

We try to pick out shapes that best represent the likeness of the creature and do several quick drawings to determine its look and shape (pencil sketches here from our creative director Matt). If you look at a real eagle ray, you will notice there are a lot more curves and forms than in our stylized version.

Next we make an untextured gray model. Looking at the wireframe here, you can see how we use hard edges on our models to give them a unique style and accentuate forms along the creature.

Then we create the texture, but we have to do a lot of simplification to make sure it stays within the style and can be recognized from far away. If we were to add all the real life detail to the texture the creatures would be visually noisy and hard for the player to identify.

Another trick we use when texturing is to actually cut into the geometry for certain designs or details. In some cases this is better than painting in the details to keep them crisp and clear and to avoid blurriness. On the eagle ray model you can see how I cut in the shape of the mouth and gills to get nice crisp hard lines.

You’ll be able to find the final eagle ray cruising around in the game, but if you can’t wait you can check out the 3d model right here on sketchfab.

I hope you will enjoy discovering all the wonderful wildlife in ABZÛ!


While working on ABZÛ, our creative director Matt Nava has made lots of beautiful sketches to communicate ideas and direction for the game. We were looking through some of the concept art he’s created so far and found a few to share in our first-ever art post.

These are some early pieces that help set the tone and mood of the game. A lot changes during development, but you can think of these as a hint of what’s to come!


Here are our first two official ABZÛ wallpapers. They are available for desktops, tablets and phones in these formats:

If your resolution isn't listed you can just take the closest larger image and it should scale down just fine. You can also download the entire set here.


AUDIO NARRATIVES

Hi, I’m Steve, sound designer at Giant Squid. In this post, I’ll describe a process that we use to establish the audio direction for new areas in ABZÛ.

An audio narrative is a short sound design test piece based on a written description of a moment in the game. It functions as concept art in audio form, setting the mood and inspiring gameplay and art direction. It gives us an early glimpse of a more complete audio atmosphere, and serves as a great reference point as we iterate further.

This is a narrative that was created to help define the open ocean areas in ABZÛ. Giant Squid’s creative director Matt Nava wrote the story prompt below.

The player dives into the water. It is open and clear, with a sandy floor. There is a gently rolling tide. She swims quickly for a distance through this barren area, occasionally jumping out of the water like a dolphin and splashing back in. Before long she discovers a reef. She must dive further down to reach it. It is an environment rich with life. Schools of thousands of small fish dart by, surrounding the diver. Some are scared as she swims close. A very large goliath grouper fish floats idly near the coral. When a smaller yellowtail snapper fish comes close, the grouper eats it in one lightning quick bite. The player lands on the seafloor and walks through tall sea grass. The grass gives way to a stony floor where the diver finds the entrance to a deep cave. As she enters, the temperature drops rapidly and all light fades away. The cave is lit only by tiny bioluminescent creatures hovering in the still icy water. There is something magical about its quiet atmosphere.

After designing the sound for it, I presented it to the team. We all read and listened at the same time. It’s fun to match the events to the sounds as they play.

One of my main design goals was to make the soundscape feel inviting, so that a player would feel safe to explore. At the same time, I wanted to make it feel foreign, because being underwater is such a drastic change from our normal environment. One way I tried to make the player feel safe in this alternate space was by using processed versions of commonly heard everyday sounds. For example, to make the environment feel rich with life, I used a lot of typical animal calls like loons, donkeys, and cranes. However, I pitch-shifted, equalized, filtered, and ran the sounds through vocoders to make them mesh with the underwater environment.

Creating the underwater soundscape was one of the biggest challenges, but easily one of the most fun. The large grouper fish was mainly pig sounds. His swimming is a gurgling fish tank layered over the background water. When he eats the yellowtail, you’ll hear an apple chomp and me gulping water as loudly as possible. The player’s footsteps were made by pressing my hands into a mix of snow and sand.

Recording apple chomping foley with Pete Angstadt, a gameplay engineer at Giant Squid

The audio narrative allows for a lot of easy experimentation with things we might change later. One thing I tried was placing the listener sonically within a helmet. To do this, I placed a microphone inside my motorcycle helmet and played a series of frequency sweeps from my speakers. As the microphone recorded the sweeps, I captured what they sounded like from within the helmet, as well as how long the sound reverberated. This generated an impulse response, which I could feed into the sound effects to get the desired helmet effect.

Impulse response results and the helmet recording setup
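Applying a measured impulse response to a dry sound is, at its core, a convolution. A minimal sketch of that operation (direct-form, for illustration only; a real audio pipeline would use FFT-based convolution for speed):

```python
def convolve(signal, impulse_response):
    """Apply an impulse response to a dry signal by direct convolution,
    the basic operation behind convolution reverb.

    Each input sample triggers a scaled copy of the impulse response;
    summing all those copies yields the reverberated output.
    """
    n, m = len(signal), len(impulse_response)
    out = [0.0] * (n + m - 1)
    for i, s in enumerate(signal):
        for j, h in enumerate(impulse_response):
            out[i + j] += s * h
    return out
```

Convolving a sound with the helmet's impulse response makes it sound as if it were played back inside the helmet, which is exactly what the frequency-sweep measurement captures.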

It turned out that placing the listener inside a helmet acted as a barrier between the player and the game world. ABZÛ is not a scuba simulator. Instead, it aims to capture an intimate experience with the world and life around you as you play. Due to this mismatch with the game’s direction, we abandoned the helmet idea. The beauty of the audio narrative as a part of preproduction is that we were able to test the idea within the context of a fuller soundscape without spending too much time on it.

I hope you enjoyed this explanation of how we use audio narratives as a part of our process. Much of what is heard here has been expanded upon for the actual game. Working on ABZÛ has so far been absolutely humbling and an experience unlike any other. I can’t wait for you to hear the whole thing.

-Steve Green, Sound Designer


INTRODUCTION

Giant Squid was founded about three years ago with the intention of creating a small game studio driven by artistic and innovative design. Ever since, we have been hard at work hiring a team and creating our first game, ABZÛ.

Before starting Giant Squid, I had the honor of working on two games with the talented team at thatgamecompany. I was the art director on Flower and Journey during my time there. I learned a lot about the technical details of how games are made, but more importantly how games have the power to create meaning in people's lives. With Giant Squid and with ABZÛ, I wanted to take what I had learned further.

ABZÛ is an underwater exploration game without much in the way of precedent. All of our energy has been devoted to solving tricky new problems that the design has presented, in an effort to make an experience that will have a moving effect on our players. We have had to figure out how to simulate thousands of dynamic fish, design a unique character control scheme and camera system, and take an unconventional approach to rendering underwater environments, among many other things. The Giant Squid team has come up with some very innovative solutions to these challenges, and we decided that it would be great to share some of these development stories with the community. Members of our team will periodically write about working at Giant Squid and creating ABZÛ on this development blog. I hope you enjoy it.

-Matt Nava, Creative Director
