
The Basics

High-level usage and scripting references for GS_Play feature sets. Drop-in functionality for intermediate developers and designers.

GS_Play is a production-ready set of modular features, each of which can be toggled on individually so you enable only what your project needs. The “Basics” section covers what each feature set does, how to work with it from the editor and scripts, and what events and nodes are available — without requiring deep architectural knowledge.

If you are new to GS_Play, start with Get Started first. If you need architecture-level detail, component internals, or extension guides, use the Framework API section instead.


How This Section Is Organized

Each gem section follows this structure:

Gem overview page — What the gem does, the problems it solves, and a summary of its feature sets with links to feature sub-pages.

Feature sub-pages — One page per feature set. Each covers:

  • What it does and what components are involved.
  • Editor setup for drop-in use.
  • Relevant ScriptCanvas nodes and EBus events.
  • A quick reference table.
  • Links to the Framework API for deeper reference.

Pages in this section deliberately omit architecture internals, extension code, and low-level component details. If you need those, follow the Framework API links at the bottom of each page.


Sections

1 - GS_Core

The foundation gem for the GS_Play framework — game lifecycle, save system, stage management, input, actions, and utility libraries.

GS_Core is the required foundation for every GS_Play-enabled project. All other GS gems depend on it and build on its systemic features. It provides the game startup sequence, persistence, level loading, input handling, a triggerable action system, and a shared utility library.

If you have not set up GS_Core yet, start with the Simple Project Setup guide before reading further.

For architecture details, component properties, and extending the system in C++, see the GS_Core API.


Quick Navigation

| I want to… | Feature | API |
| --- | --- | --- |
| Start a new game, continue from a save, load a specific file, or return to the title screen | GS_Managers | API |
| Save and load game data, or track persistent flags and counters across sessions | GS_Save | API |
| Move between levels, or configure per-level spawn points and navigation settings | GS_StageManager | API |
| Read player input, disable input during menus, or swap control schemes at runtime | GS_Options | API |
| Use easing curves, detect physics zones, smooth values, pick randomly, or work with splines | Utilities | API |
| Trigger a reusable behavior on an entity from a script, physics zone, or another action | Systems: GS_Actions | API |
| Animate a transform, color, or value smoothly over time | Systems: GS_Motion | API |

Installation

GS_Core is a required gem. It will be added to your project when you enable any other GS_Play gem.

For a complete guided setup, follow the Simple Project Setup guide or video tutorial.

Follow these steps in particular:

  1. Configure Project
  2. Prepare Managers
  3. Prepare Startup

 

Quick Installation Summary

Once the gem is registered to your project:

  1. Create a Game Manager prefab and place it in every level.
  2. Create prefabs of any managers you wish to use in your project.
  3. Add all the manager prefabs to your Game Manager’s Managers list.
  4. Implement a way to trigger “Begin Game”:
    • Create a UI that fires New Game or Load Game.
    • Create a script that fires New Game or Load Game on OnStartupComplete.
    • Toggle “Debug Mode” on, which skips the begin-game process.

GS_Managers

Controls the game startup lifecycle — spawning and initializing all manager systems in a guaranteed order, then providing top-level game navigation: New Game, Continue, Load Game, Return to Title, and Quit. The starting point for any game-wide behavior.

GS_Managers API


GS_Save

Handles all save and load operations, including entity state persistence across level loads and simple key-value record tracking for global flags and counters.

GS_Save API


GS_StageManager

Manages level loading and navigation. Place named Exit Points in your levels to control where the player arrives, and use Stage Data components to configure per-level settings like NavMesh references and spawn configuration.

GS_StageManager API


GS_Options

Manages player input through swappable Input Profile assets and Input Reader components, with group-level enable/disable for suppressing input during menus, cutscenes, or transitions.

GS_Options API


Systems

Core framework systems used across multiple gems: the GS_Actions triggerable behavior system and the GS_Motion track-based animation engine.

Systems API


Utilities

A collection of shared tools: easing curves (40+ types), spring dampers for smooth value following, weighted random selection, color and float gradients, spline helpers, and Physics Trigger Volume components.

Utilities API


See Also

For the full API, component properties, and C++ extension guide:

For step-by-step project setup:


Get GS_Core

GS_Core — Explore this gem on the product page and add it to your project.

1.1 - Managers System

How to work with the GS_Play manager system — startup events, game navigation, and standby mode from scripts.

The Managers system is how GS_Play starts your game. The Game Manager spawns all other managers in a guaranteed order, coordinates their initialization stages, and then broadcasts events that signal when each stage is complete and when the game is fully ready to run.

This lets you verify that startup happens as expected, create your own managers of any type, and toggle full-game standby.

For architecture details, component properties, and extending the system in C++, see the GS_Managers API.

Game Manager component in the O3DE Inspector

 



Startup Sequence

Game Manager Startup Pattern Graph

Breakdown

When the project starts, the Game Manager runs three stages before the game is considered ready:

| Stage | Broadcast Event | What It Means |
| --- | --- | --- |
| 1 — Initialize | (internal) | Each manager is spawned. They activate, then report ready. |
| 2 — Setup | OnSetupManagers | Setup stage. Now safe to query other managers. |
| 3 — Complete | OnStartupComplete | Last stage. Everything is ready. Do any last-minute work; now safe to begin gameplay. |

For most scripts, you only need OnStartupComplete. Wait for this event before doing anything that depends on the managers being fully set up.
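The three-stage broadcast is essentially an observer pattern. The following Python sketch illustrates it — `MiniGameManager`, `connect`, and the snake_case event names are hypothetical stand-ins for this doc, not the GS_Play API:

```python
# Illustrative sketch of the three-stage startup broadcast.
# All names here are hypothetical stand-ins, not the GS_Play API.
class MiniGameManager:
    def __init__(self):
        self.listeners = []
        self.started = False

    def connect(self, listener):
        self.listeners.append(listener)

    def run_startup(self, managers):
        for m in managers:                        # 1 — Initialize: each manager activates, reports ready
            m["ready"] = True
        self._broadcast("on_setup_managers")      # 2 — Setup: safe to query other managers
        self.started = True
        self._broadcast("on_startup_complete")    # 3 — Complete: safe to begin gameplay

    def _broadcast(self, event):
        for listener in self.listeners:
            getattr(listener, event, lambda: None)()

class GameplayScript:
    """Defers manager-dependent work until the startup-complete event fires."""
    def __init__(self):
        self.ready = False

    def on_startup_complete(self):
        self.ready = True

gm = MiniGameManager()
script = GameplayScript()
gm.connect(script)
gm.run_startup([{"name": "SaveManager"}, {"name": "StageManager"}])
```

In a real project, the `GameplayScript` role is played by any ScriptCanvas graph connected to `GameManagerNotificationBus`.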

E Indicates extensible classes and methods.

Patterns - Complete list of system patterns used in GS_Play.


Responding to Startup

ScriptCanvas

Connect to GameManagerNotificationBus and handle OnStartupComplete to know when the game is fully ready:

To check at any point whether the game has already finished starting, use the IsStarted request:


Game Navigation

The Game Manager owns the top-level game flow. Call these from title screens, pause menus, and end-game sequences. They coordinate the Save Manager and Stage Manager automatically.

| ScriptCanvas Node | What It Does |
| --- | --- |
| TriggerNewGame | Starts a new game with the default save name. |
| TriggerNewGameWithName(saveName) | Starts a new game and writes to a named save file. |
| TriggerContinueGame | Loads the most recent save and continues from it. |
| TriggerLoadGame(saveName) | Loads a specific save file by name. |
| TriggerReturnToTitle | Returns to the title stage, tearing down the current session. |
| TriggerSaveAndExitGame | Saves the current state and exits the application. |
| TriggerExitGame | Exits the application without saving. |

Standby Mode

Standby is a global pause. The Game Manager enters standby automatically during level transitions and other blocking operations. It broadcasts OnEnterStandby to halt all gameplay systems, and OnExitStandby when the operation completes.

Listen to these in any script that drives continuous logic — timers, ticks, or animation sequences:

| Event | What to Do |
| --- | --- |
| OnEnterStandby | Pause timers, halt ticks, stop animations. |
| OnExitStandby | Resume timers, re-enable ticks. |

Both events are on GameManagerNotificationBus.


Debug Mode

When Debug Mode is enabled on the Game Manager component in the editor, the game starts in the current level instead of navigating to your title stage. This allows rapid iteration on any level without going through the full boot flow.

Debug Mode only changes startup navigation. All manager initialization and event broadcasting proceed normally.


Quick Reference

| Need | Bus | Method / Event |
| --- | --- | --- |
| Know when startup is complete | GameManagerNotificationBus | OnStartupComplete |
| Check if game has started | GameManagerRequestBus | IsStarted |
| Start a new game | GameManagerRequestBus | NewGame / TriggerNewGame (SC) |
| Continue from last save | GameManagerRequestBus | ContinueGame / TriggerContinueGame (SC) |
| Load a specific save | GameManagerRequestBus | LoadGame / TriggerLoadGame (SC) |
| Return to title | GameManagerRequestBus | ReturnToTitle / TriggerReturnToTitle (SC) |
| Pause all systems | GameManagerRequestBus | EnterStandby |
| Resume all systems | GameManagerRequestBus | ExitStandby |
| Know when standby changes | GameManagerNotificationBus | OnEnterStandby / OnExitStandby |

Glossary

| Term | Meaning |
| --- | --- |
| Standby | Global pause broadcast to all managers and their subsystems |
| Startup Sequence | The three-stage lifecycle (Initialize → SetupManagers → StartupComplete) before gameplay is ready |
| Manager | A component that extends GS_ManagerComponent and registers with the Game Manager |
| Debug Mode | Starts the game in the current editor level instead of navigating to the title stage |

For full definitions, see the Glossary.


See Also

For the full API, component properties, and C++ extension guide:

For step-by-step project setup:



1.2 - Save System

How to work with the GS_Play save system — saving game state, loading saves, and tracking progression with the Record Keeper.

The Save system handles all persistence in a GS_Play project. The Save Manager coordinates file operations, Savers serialize per-entity state, and the Record Keeper tracks flat progression data. Together they give you a complete save/load pipeline that works out of the box and extends cleanly for custom data.

For architecture details, component properties, and extending the system in C++, see the Framework API reference.

Save Manager component in the O3DE Inspector

 



How Saving Works

Save System Pattern Graph

Breakdown

When a save is triggered, the Save Manager broadcasts OnSaveAll to every Saver component in the scene. Each Saver serializes its entity’s relevant state into the save file. When loading, the Save Manager broadcasts OnLoadAll, and each Saver restores its entity from the save data.

The Save Manager also maintains a list of all save files with metadata (timestamps, names), so you can present a save/load UI to the player.

| Operation | What Happens |
| --- | --- |
| New save | Creates a new save file, broadcasts OnSaveAll to all Savers. |
| Load save | Reads the save file, broadcasts OnLoadAll to all Savers. |
| Save data | Writes current game state to the active save file. |
| Load data | Reads data from the active save file into memory. |
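The fan-out described above can be sketched in a few lines. This is an illustrative Python sketch of the Save Manager / Saver relationship — `MiniSaveManager`, `TransformSaver`, and their methods are hypothetical names, not the actual buses:

```python
# Illustrative sketch of the Saver fan-out. Class and method names are
# hypothetical stand-ins for the Save Manager / Saver relationship.
class MiniSaveManager:
    def __init__(self):
        self.savers = []
        self.save_file = {}

    def register(self, saver):
        self.savers.append(saver)

    def save_data(self):
        # Equivalent of broadcasting OnSaveAll: every Saver serializes itself.
        for saver in self.savers:
            self.save_file[saver.key] = saver.serialize()

    def load_data(self):
        # Equivalent of broadcasting OnLoadAll: every Saver restores itself.
        for saver in self.savers:
            saver.restore(self.save_file.get(saver.key))

class TransformSaver:
    """Persists one entity's position, in the spirit of BasicEntitySaver."""
    def __init__(self, key, position):
        self.key = key
        self.position = position

    def serialize(self):
        return list(self.position)

    def restore(self, data):
        if data is not None:
            self.position = tuple(data)

manager = MiniSaveManager()
player = TransformSaver("player", (1.0, 2.0, 3.0))
manager.register(player)
manager.save_data()
player.position = (0.0, 0.0, 0.0)   # the entity moves after the save
manager.load_data()                  # position is restored from the save data
```

The key point is that the manager never knows what each Saver stores; each Saver owns its own serialization.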

E Indicates extensible classes and methods.

Patterns - Complete list of system patterns used in GS_Play.


Responding to Save Events

ScriptCanvas

Connect to SaveManagerNotificationBus to know when save or load operations occur:


Triggering Saves and Loads

These methods are available on SaveManagerRequestBus:

| ScriptCanvas Node | What It Does |
| --- | --- |
| NewGameSave | Creates a fresh save file for a new game. |
| LoadGame(saveName) | Loads a specific save file by name. |
| SaveData | Writes current state to the active save file. |
| LoadData | Reads the active save file into memory. |
| GetOrderedSaveList | Returns all save files sorted by most recent. |
| ConvertEpochToReadable(epoch) | Converts a save file timestamp to a human-readable string. |

Record Keeper

The Record Keeper is a lightweight key-value store for tracking game-wide progression — quest flags, counters, unlock states, completion markers. It lives on the Save Manager prefab and is automatically persisted with the save system.

Unlike Savers (which are per-entity), the Record Keeper is a global singleton. Any script or component can read and write records by name.

| ScriptCanvas Node | What It Does |
| --- | --- |
| HasRecord(name) | Returns whether a record with the given name exists. |
| SetRecord(name, value) | Creates or updates a record. Value is a float. |
| GetRecord(name) | Returns the current value of a record. |
| DeleteRecord(name) | Removes a record. |

Responding to Record Changes

Listen on RecordKeeperNotificationBus for the RecordChanged event. This fires whenever any record is created, updated, or deleted — useful for UI that displays progression state.
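The Record Keeper pattern — a flat float store that notifies listeners on every change — can be sketched as follows. `MiniRecordKeeper` and its snake_case methods are hypothetical stand-ins, not the actual bus API:

```python
# Illustrative sketch of the Record Keeper: a flat float store with
# change notifications. Names are hypothetical, not the actual bus API.
class MiniRecordKeeper:
    def __init__(self):
        self.records = {}
        self.listeners = []          # callbacks invoked with the record name

    def has_record(self, name):
        return name in self.records

    def set_record(self, name, value):
        self.records[name] = float(value)   # record values are floats
        self._record_changed(name)

    def get_record(self, name):
        return self.records[name]

    def delete_record(self, name):
        self.records.pop(name, None)
        self._record_changed(name)

    def _record_changed(self, name):
        for callback in self.listeners:
            callback(name)

keeper = MiniRecordKeeper()
changes = []
keeper.listeners.append(changes.append)   # e.g. a progression UI refreshing itself
keeper.set_record("quest_stage", 2)
```

A UI script subscribed to record changes only needs the name of the record that changed; it can then re-query the value it cares about.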


Built-In Savers

Two Savers ship with GS_Core for the most common use cases:

| Saver | What It Saves |
| --- | --- |
| BasicEntitySaver | Entity transform (position, rotation, scale). |
| BasicPhysicsEntitySaver | Entity transform plus rigidbody velocity and angular velocity. |

Add these components to any entity that needs to persist its position across save/load cycles. They handle serialization and restoration automatically.


Quick Reference

| Need | Bus | Method / Event |
| --- | --- | --- |
| Trigger a save | SaveManagerRequestBus | SaveData |
| Trigger a load | SaveManagerRequestBus | LoadGame(saveName) |
| Create a new save | SaveManagerRequestBus | NewGameSave |
| List all saves | SaveManagerRequestBus | GetOrderedSaveList |
| Know when saving | SaveManagerNotificationBus | OnSaveAll |
| Know when loading | SaveManagerNotificationBus | OnLoadAll |
| Check a progress flag | RecordKeeperRequestBus | HasRecord(name) / GetRecord(name) |
| Set a progress flag | RecordKeeperRequestBus | SetRecord(name, value) |
| Know when a record changes | RecordKeeperNotificationBus | RecordChanged |

Glossary

| Term | Meaning |
| --- | --- |
| Saver | A component that serializes one entity’s state into the save file |
| Record Keeper | A global key-value store for tracking progression flags and counters |
| Save File | A serialized snapshot of all Saver data plus Record Keeper state |

For full definitions, see the Glossary.


See Also

For the full API, component properties, and C++ extension guide:

For step-by-step project setup:



1.3 - Stage Management

How to work with the GS_Play stage management system — level loading, stage transitions, exit points, and stage data.

The Stage Manager handles all level-to-level navigation in a GS_Play project. It owns the master list of stages, processes transition requests, and coordinates with the Game Manager’s standby mode to ensure clean load/unload cycles. Stage Data components in each level control how that level initializes.

For architecture details, component properties, and extension patterns, see the Framework API reference.

Stage Manager component in the O3DE Inspector

 



How Stage Transitions Work

Stage Change Pattern Graph

Breakdown

When you request a stage change, the system follows this sequence:

| Step | What Happens |
| --- | --- |
| 1 — Standby | The Game Manager enters standby, pausing all gameplay systems. |
| 2 — Unload | The current stage’s entities are torn down. |
| 3 — Spawn | The target stage’s prefab is instantiated. |
| 4 — Set Up | The Stage Data component in the new level runs its layered startup sequence. |
| 5 — Complete | The Stage Manager broadcasts LoadStageComplete. Standby exits. |

The Stage Data startup is layered — SetUpStage, ActivateByPriority, then Complete — so heavy levels can initialize incrementally without causing frame-time spikes.
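The idea behind priority-ordered, incremental activation can be sketched like this. The entity dicts and `activate_by_priority` generator are hypothetical illustrations, not the Stage Data implementation:

```python
# Illustrative sketch of priority-ordered activation spread across frames.
# The entity dicts and function name are hypothetical, not the real API.
def activate_by_priority(entities, per_frame):
    """Activates entities lowest-priority-first, a few per frame."""
    ordered = sorted(entities, key=lambda e: e["priority"])
    for i in range(0, len(ordered), per_frame):
        batch = ordered[i:i + per_frame]
        for entity in batch:
            entity["active"] = True
        yield [entity["name"] for entity in batch]   # one frame's activations

level = [
    {"name": "Terrain", "priority": 0, "active": False},
    {"name": "Props", "priority": 2, "active": False},
    {"name": "NPCs", "priority": 1, "active": False},
]
frames = list(activate_by_priority(level, per_frame=2))
```

Spreading activation over several frames is what keeps heavy levels from causing a single large frame-time spike.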

E Indicates extensible classes and methods.

Patterns - Complete list of system patterns used in GS_Play.


Stage Data

Stage Data component in the O3DE Inspector

Each level should have a Stage Data component on its root entity. Through a child “Level” entity that starts inactive, the Stage Data system controls the initialization sequence for that level. Stage Data also holds level-specific configuration and scripts.

| Event | What It Means |
| --- | --- |
| OnBeginSetUpStage | The level is starting its setup. Initialize per-level systems. |
| ActivateByPriority | Activate heavy entities in priority order (lazy loading). |
| OnLoadStageComplete | The level is fully loaded and ready. |
| OnTearDownStage | The level is being unloaded. Clean up per-level state. |

Listen to these on StageDataNotificationBus in any script that needs to react to level lifecycle.


Triggering Stage Changes

ScriptCanvas

The exitPointName parameter is optional. If provided, the system will position the player at the named exit point in the target level.


Exit Points

Stage Exit Point component in the O3DE Inspector

Exit Points are named position markers placed in a level. They define where entities spawn when arriving from another stage. A door in Level A can specify that when transitioning to Level B, the player should appear at Exit Point “DoorB_Entry”.

Exit Points are registered and unregistered with the Stage Manager automatically when they activate and deactivate.

| ScriptCanvas Node | What It Does |
| --- | --- |
| ChangeStageRequest(stageName, exitPoint) | Transitions to a stage and positions at the named exit point. |
| RegisterExitPoint(name, entity) | Manually registers an exit point (usually automatic). |
| UnregisterExitPoint(name) | Manually unregisters an exit point. |
| GetExitPoint(name) | Returns the entity ID of a registered exit point. |
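At its core, the exit-point mechanism is a name-to-entity registry. This Python sketch is illustrative only — `MiniStageManager` and its methods are hypothetical stand-ins for the bus calls above:

```python
# Illustrative sketch of the exit-point registry (hypothetical names).
class MiniStageManager:
    def __init__(self):
        self.exit_points = {}

    def register_exit_point(self, name, entity_id):
        self.exit_points[name] = entity_id      # normally automatic on activate

    def unregister_exit_point(self, name):
        self.exit_points.pop(name, None)        # normally automatic on deactivate

    def get_exit_point(self, name):
        return self.exit_points.get(name)

manager = MiniStageManager()
manager.register_exit_point("DoorB_Entry", 42)     # 42 stands in for an EntityId
spawn_at = manager.get_exit_point("DoorB_Entry")   # where the player will arrive
manager.unregister_exit_point("DoorB_Entry")
missing = manager.get_exit_point("DoorB_Entry")    # lookups after unregister return nothing
```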


Responding to Stage Events

ScriptCanvas


Entity Level Configuration

Stage Data entity level setup in the O3DE Editor

The Stage Data entity must live outside of the level’s Game Manager prefab, and it is left active.

Inside, it has a secondary “wrapper” level entity that is set to “Start Inactive” by default. This lets Stage Data control exactly when the level begins loading.


Quick Reference

| Need | Bus | Method / Event |
| --- | --- | --- |
| Change to a different level | StageManagerRequestBus | ChangeStageRequest(stageName, exitPoint) |
| Load the default stage | StageManagerRequestBus | LoadDefaultStage |
| Know when a load starts | StageManagerNotificationBus | BeginLoadStage |
| Track loading progress | StageManagerNotificationBus | StageLoadProgress |
| Know when a load finishes | StageManagerNotificationBus | LoadStageComplete |
| React to level setup | StageDataNotificationBus | OnBeginSetUpStage |
| React to level teardown | StageDataNotificationBus | OnTearDownStage |
| Find an exit point | StageManagerRequestBus | GetExitPoint(name) |

Glossary

| Term | Meaning |
| --- | --- |
| Stage | A spawnable prefab representing a game level or screen |
| Exit Point | A named position marker in a level that defines where entities arrive from another stage |
| Stage Data | A per-level component that controls level initialization and holds level-specific settings |
| Default Stage | The first stage loaded on application start (typically the title screen) |

For full definitions, see the Glossary.


See Also

For the full API, component properties, and C++ extension guide:

For related systems:



1.4 - Options & Input

How to work with the GS_Play options system — input profiles, input groups, and runtime binding management.

The Options system manages player-facing configuration and input handling. Its primary feature is the Input Profile system, which provides group-based input binding management that can be toggled at runtime without code changes.

For architecture details, component properties, and extension patterns, see the Framework API reference.

Options Manager component in the O3DE Inspector

 



Options Manager

The Options Manager is a singleton that holds the active Input Profile and makes it available to all Input Reader components. It responds to the Game Manager lifecycle automatically.

| ScriptCanvas Node | What It Does |
| --- | --- |
| GetActiveInputProfile | Returns the currently active Input Profile asset. |

Input Profiles

Input Profile asset in the O3DE Asset Editor

An Input Profile is a data asset created in the O3DE Asset Editor. It contains named input groups, each holding a set of event mappings. Each event mapping binds a gameplay event name to one or more raw input bindings (key presses, axis movements, button presses) with configurable deadzones.

The key advantage over raw O3DE input bindings is the group system. Groups can be enabled and disabled independently at runtime — a pause menu can suppress gameplay input by disabling the “Gameplay” group, without tearing down and rebuilding bindings.


Enabling and Disabling Input Groups

ScriptCanvas

Enabling, Disabling, and Checking State of Input Groups in Script Canvas


How Input Flows

| Stage | What Happens |
| --- | --- |
| 1 — Raw Input | O3DE’s input system captures key/axis/button events. |
| 2 — Input Reader | The GS_InputReaderComponent on the entity matches raw input against the active Input Profile’s event mappings. |
| 3 — Event Mapping | Matched input fires a named gameplay event (e.g., “Jump”, “MoveForward”). |
| 4 — Consumer | Other components on the entity (controllers, reactors) handle the gameplay event. |
Input Readers filter by group — if a group is disabled, none of its event mappings fire.
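The group-filtering behavior can be sketched as follows. This is an illustrative Python model — `MiniInputReader`, the profile dict shape, and the group and event names are all hypothetical:

```python
# Illustrative sketch: event mappings fire only when their group is enabled.
# Class, dict shape, and names are hypothetical, not the real components.
class MiniInputReader:
    def __init__(self, profile):
        self.profile = profile          # {group: {raw_input: gameplay_event}}
        self.disabled = set()

    def disable_group(self, group):
        self.disabled.add(group)

    def enable_group(self, group):
        self.disabled.discard(group)

    def is_group_disabled(self, group):
        return group in self.disabled

    def handle(self, raw_input):
        fired = []
        for group, mappings in self.profile.items():
            if group not in self.disabled and raw_input in mappings:
                fired.append(mappings[raw_input])
        return fired

reader = MiniInputReader({"Gameplay": {"space": "Jump"}, "Menu": {"esc": "Pause"}})
before = reader.handle("space")
reader.disable_group("Gameplay")      # e.g. while a pause menu is open
after = reader.handle("space")        # the mapping is suppressed, not destroyed
```

Note that disabling a group suppresses its mappings without tearing them down — re-enabling is instant.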


Quick Reference

| Need | Bus | Method / Event |
| --- | --- | --- |
| Disable an input group | InputReaderRequestBus | DisableInputGroup(groupName) |
| Enable an input group | InputReaderRequestBus | EnableInputGroup(groupName) |
| Check if group is disabled | InputReaderRequestBus | IsGroupDisabled(groupName) |
| Get the active profile | OptionsManagerRequestBus | GetActiveInputProfile |

Glossary

| Term | Meaning |
| --- | --- |
| Input Profile | A data asset containing named input groups with event mappings |
| Input Group | A named collection of event mappings that can be enabled or disabled at runtime |
| Event Mapping | A binding from a gameplay event name to one or more raw input sources |
| Input Reader | A component that matches raw input against the active Input Profile |

For full definitions, see the Glossary.


See Also

For the full API, component properties, and C++ extension guide:



1.5 - Systems

Core framework systems — the GS_Actions triggerable behavior system and the GS_Motion track-based animation engine.

GS_Core provides a growing number of standalone systems used across multiple gems: the GS_Actions triggerable behavior system and the GS_Motion track-based animation engine, which powers UI Animation and Juice Feedback playback.

For architecture details, component properties, and C++ extension guides, see the Framework API: Systems.

 



GS_Motion

Provides tween-style Motion Track components for animating transforms, colors, and values over time. Multiple tracks on the same entity run in parallel; chains are configured by setting an On Complete motion name.

GS_Motion API


See Also

For the full API, component properties, and C++ extension guides:

For related systems:



1.5.1 - Actions System

How to work with the GS_Play action system — triggerable, composable behaviors that fire from scripts, triggers, or code.

The Actions system provides a universal pattern for attaching discrete, reusable behaviors to entities and triggering them from any source — ScriptCanvas, World Triggers, UI buttons, or C++ code. Actions are data-driven components that fire on named channels, enabling composition without custom scripting.

For architecture details, component properties, and creating custom actions in C++, see the Framework API reference.

 



How Actions Work

An Action is a component you attach to an entity. Each Action has a channel name. When something calls DoAction(channelName) on that entity’s bus, every Action component whose channel matches the name will execute.

This decoupling is the core value — the system that fires DoAction does not need to know what kind of action is attached. You can change, add, or remove action components on an entity without modifying any calling code.

| Concept | What It Means |
| --- | --- |
| Channel | A named string. Actions on the same channel fire together. |
| Composition | Multiple actions on the same channel execute in parallel — stack components to compose behaviors. |
| Chaining | An action can fire a different channel on completion, enabling lightweight sequences. |
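Channel-matched dispatch and composition can be sketched in a few lines. This illustrative Python model uses hypothetical names (`MiniActionEntity`, `add_action`, `do_action`) in place of the real buses and components:

```python
# Illustrative sketch of channel-matched action dispatch and composition.
# Names are hypothetical stand-ins for the Action components and buses.
class MiniActionEntity:
    def __init__(self):
        self.actions = []               # list of (channel, callable)

    def add_action(self, channel, fn):
        self.actions.append((channel, fn))

    def do_action(self, channel):
        # Every action whose channel matches executes; the caller never
        # needs to know what kind of actions are attached.
        for ch, fn in self.actions:
            if ch == channel:
                fn()

log = []
door = MiniActionEntity()
door.add_action("OpenDoor", lambda: log.append("toggle_entity"))
door.add_action("OpenDoor", lambda: log.append("play_sound"))   # composition: same channel
door.add_action("CloseDoor", lambda: log.append("close"))
door.do_action("OpenDoor")
```

Swapping what happens on "OpenDoor" means changing the attached actions, not the code that fires the channel.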

Triggering Actions

ScriptCanvas

[ActionRequestBus → DoAction(channelName)]
    └─► All Action components on this entity with matching channel execute

To know when an action completes:

[ActionNotificationBus → OnActionComplete]
    └─► Action has finished executing

Built-In Actions

GS_Core ships with these ready-to-use actions:

| Action | What It Does |
| --- | --- |
| PrintLog | Logs a configurable message to the console. Useful for debugging trigger chains. |
| ToggleMouseCursor | Shows or hides the system mouse cursor. |

Additional actions are available in other gems (e.g., World Trigger actions in GS_Interaction, dialogue effects in GS_Cinematics).


Common Patterns

World Trigger → Action

A World Trigger detects a collision or interaction event and fires DoAction on its entity. Action components on the same entity respond — one might play a sound, another might set a record, another might toggle an entity.

UI Button → Action

A UI button press fires DoAction with a channel name. Actions handle the response — navigate to a different UI page, start a new game, or toggle the pause menu.

Chaining Actions

Set an action’s “Chain Channel” property to fire a different channel when it completes:

Channel "OpenDoor" → [ToggleEntity action] → chains to "PlayDoorSound" → [AudioEvent action]

Quick Reference

| Need | Bus | Method / Event |
| --- | --- | --- |
| Fire an action | ActionRequestBus | DoAction(channelName) |
| Know when an action completes | ActionNotificationBus | OnActionComplete |

Glossary

| Term | Meaning |
| --- | --- |
| Action | A component that executes a discrete behavior when triggered on a named channel |
| Channel | A named string identifier that groups actions — all actions on the same channel fire together |
| Chaining | Configuring an action to fire a different channel on completion, creating lightweight sequences |

For full definitions, see the Glossary.


See Also

For the full API, component properties, and C++ extension guide:

For related systems:



1.5.2 - Motion System

How to work with GS_Motion — the track-based animation and tween system that powers UI transitions, feedback effects, and custom animations.

GS_Motion is a track-based animation engine built into GS_Core. It drives timed property changes — position, rotation, scale, color, opacity — through authored data assets rather than hand-coded interpolation scripts. Domain gems extend GS_Motion with their own track types: GS_UI adds 8 LyShine-specific tracks for UI animation, and GS_Juice adds transform and material tracks for game feel feedback.

For architecture details, the domain extension pattern, and all track types, see the Framework API reference.

 



Key Concepts

| Concept | What It Is |
| --- | --- |
| Track | A single animated property — what changes, how long, and which easing curve. |
| Motion | A collection of tracks that play together. Tracks can start at different times within the motion. |
| Motion Asset | A data asset authored in the O3DE Asset Editor containing the tracks and their configuration. |
| Proxy | An optional entity redirect — lets a track target a child entity instead of the motion’s owner. |
| Composite | The runtime instance created from an asset. Each entity gets its own deep copy. |

How It Works

  1. Author a motion asset in the Asset Editor. Each domain has its own asset type (.uiam for UI, .feedbackmotion for Juice).
  2. Assign the asset to a component or embed it in a serialized field (e.g., a page’s show/hide transitions).
  3. Play the motion from ScriptCanvas or C++. The system initializes a runtime composite, resolves proxies, and ticks all tracks.
  4. Each track receives an eased progress value (0 → 1) every frame and applies its property change to the target entity.
  5. When all tracks complete, the motion fires its OnComplete callback.
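Steps 3 and 4 above can be sketched as a simple tick loop that feeds each track an eased 0→1 progress value. This is illustrative only — `play_motion`, the track callables, and the fixed timestep are hypothetical, not the engine's tick model:

```python
# Illustrative sketch: each track receives an eased 0->1 progress per frame.
# Function names and the fixed timestep are hypothetical, not the real API.
def ease_out_quad(t):
    return 1 - (1 - t) ** 2

def play_motion(tracks, duration, dt):
    """Ticks all tracks to completion, applying eased progress each frame."""
    t = 0.0
    while t < duration:
        t = min(t + dt, duration)
        progress = t / duration          # raw 0..1 progress
        for track in tracks:
            track(ease_out_quad(progress))   # track applies its property change

values = []
alpha_track = lambda p: values.append(round(p, 3))  # e.g. would drive a UI alpha
play_motion([alpha_track], duration=1.0, dt=0.25)
```

Because the easing is applied per-track, two tracks in the same motion can share a timeline but follow different curves.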

Easing Curves

Every track can use any of the 40+ easing curves from the GS_Core curves library. Curves are configured per-track in the asset editor.

Available families: Linear, Quad, Cubic, Sine, Expo, Circ, Back, Elastic, Bounce — each with In, Out, and InOut variants.
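The In/Out/InOut variants of each family are mechanical transforms of one base curve. This sketch shows the standard construction (the function names here are illustrative, not the library's identifiers):

```python
# Illustrative sketch: Out and InOut variants derived from a base "In" curve.
# Function names are hypothetical; the construction is the standard one.
def ease_in_cubic(t):
    return t ** 3

def out_from_in(ease_in):
    # Out is the In curve mirrored through the endpoint.
    return lambda t: 1 - ease_in(1 - t)

def inout_from_in(ease_in):
    # InOut runs In over the first half, Out over the second half.
    def inout(t):
        if t < 0.5:
            return ease_in(2 * t) / 2
        return 1 - ease_in(2 * (1 - t)) / 2
    return inout

ease_out_cubic = out_from_in(ease_in_cubic)
ease_inout_cubic = inout_from_in(ease_in_cubic)
```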


Proxy Targeting

UiAnimationMotion with proxy bindings configured in the Inspector

When a motion asset has tracks with identifiers (named labels), those tracks appear in the proxy list on the component. Proxies let you redirect a track to a different entity in the hierarchy — for example, a page show animation might animate the background separately from the content panel.

Each proxy entry maps a track label to a target entity. If no proxy is set, the track targets the motion’s owner entity.


Domain Extensions

GS_Motion is not used directly — it provides the base system that domain gems extend with concrete track types.

UI Animation (GS_UI)

Eight tracks for LyShine UI elements (position, scale, rotation, alpha, color, text). Asset extension: .uiam. Used for page transitions, button hover/select effects, and standalone UI animation.

UI Animation API


Feedback Motions (GS_Juice)

Two tracks for game feel effects — transform (position, scale, rotation) and material (opacity, emissive, color tint). Asset extension: .feedbackmotion. Used for screen shake, hit flash, and visual feedback.

Feedback Motions API


Quick Reference

| Need | Where |
| --- | --- |
| Animate UI elements | Use .uiam assets with UiAnimationMotionComponent or page transitions |
| Create feedback effects | Use .feedbackmotion assets with FeedbackEmitter component |
| Change easing curve | Edit the curve type on individual tracks in the asset editor |
| Redirect a track to another entity | Configure proxy entries on the component |
| Loop an animation | Enable loop on the motion asset |

Glossary

| Term | Meaning |
| --- | --- |
| Track | A single animated property within a motion — defines what changes, duration, and easing |
| Motion | A collection of tracks that play together as a single animation |
| Motion Asset | A data asset authored in the Asset Editor containing track configurations |
| Proxy | An entity redirect that lets a track target a child entity instead of the motion’s owner |
| Composite | The runtime instance created from a motion asset — each entity gets its own deep copy |
| Domain Extension | A gem-specific set of track types that extends GS_Motion for a particular use case |

For full definitions, see the Glossary.


See Also

For the full API, component properties, and C++ extension guide:

For related systems:



1.6 - Utilities

General-purpose components and math helpers — physics triggers, easing curves, spring dampers, gradients, splines, and more.

GS_Core includes a library of general-purpose components and math helpers available to every system in the framework. These utilities handle common game development tasks — physics overlap detection, value interpolation, animation curves, gradient sampling, and spatial math — so you can focus on gameplay logic rather than reimplementing fundamental patterns.

For full API details and code examples, see the Framework API reference.

 



Physics Trigger Volume

A reusable physics overlap detector. Handles trigger enter, stay, and exit events with filtering and callback support. Used internally by Pulsors, Targeting Fields, and World Triggers — and available for your own components.

Physics Trigger Volume API


Easing Curves

40+ easing functions organized into families. Every GS_Motion track, spring, and interpolation system in the framework can reference these curves by enum.

| Family | Variants |
| --- | --- |
| Linear | Linear |
| Quad | In, Out, InOut |
| Cubic | In, Out, InOut |
| Sine | In, Out, InOut |
| Expo | In, Out, InOut |
| Circ | In, Out, InOut |
| Back | In, Out, InOut |
| Elastic | In, Out, InOut |
| Bounce | In, Out, InOut |

Select a curve type via the CurveType enum in component properties, asset fields, or C++ code.

Curves API


Spring Dampers

15+ spring functions for physically-grounded value interpolation. Springs produce natural-feeling motion that reacts to velocity and acceleration, making them ideal for camera smoothing, UI follow, and any value that should “settle” rather than snap.

Function — Use Case
Simple Spring — Basic spring with damping
Acceleration Spring — Spring with acceleration bias
Double Spring — Two-stage spring for overshoot effects
Timed Spring — Spring that reaches its target in a fixed time
Velocity Spring — Spring driven by velocity
Quaternion Spring — Spring for rotation values
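All of these share the same core idea: a damped update step in which the value accelerates toward the target and is slowed by its own velocity. A minimal sketch of a simple spring (parameter names are illustrative, not the gem's API):

```cpp
#include <cmath>

// Illustrative "simple spring" update step: stiffness pulls the value toward
// the target, damping resists velocity, so the value settles rather than snaps.
void SpringStep(float& value, float& velocity, float target,
                float stiffness, float damping, float dt)
{
    const float accel = stiffness * (target - value) - damping * velocity;
    velocity += accel * dt;    // semi-implicit Euler: update velocity first...
    value    += velocity * dt; // ...then integrate position with the new velocity
}
```

Called once per tick, this converges smoothly on the target; the other spring variants in the table layer extra behavior (timing, overshoot, acceleration bias) on top of this pattern.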

Springs API


Gradients

Multi-stop gradient types for sampling values over a range. Used extensively by GS_Motion tracks and GS_Juice feedback tracks to define animation curves.

Type — What It Samples
ColorGradient — RGBA color with positioned markers
FloatGradient — Single float value
Vector2Gradient — 2D vector

Gradients are editable in the Inspector with visual marker placement.

Gradients API


Entity Helpers

Utility functions for finding entities in the scene by name.

Function — What It Does
GetEntityByName(name) — Returns the entity with the given name.
GetEntityIdByName(name) — Returns the EntityId of the named entity.

Entity Helper API


Weighted Random

Template-based weighted random selection. Given a collection of items with weights, returns a randomly selected item biased by weight. Useful for loot tables, dialogue variation, and procedural placement.
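The underlying selection logic can be sketched as follows (index-based and non-templated for brevity; the gem's version is template-based):

```cpp
#include <random>
#include <vector>

// Illustrative weight-biased selection: each item's chance is its weight
// divided by the total weight. Returns the index of the selected item.
size_t PickWeighted(const std::vector<float>& weights, std::mt19937& rng)
{
    float total = 0.0f;
    for (float w : weights) total += w;
    std::uniform_real_distribution<float> dist(0.0f, total);
    float roll = dist(rng);
    for (size_t i = 0; i < weights.size(); ++i)
    {
        if (roll < weights[i]) return i;
        roll -= weights[i];
    }
    return weights.size() - 1; // Floating-point edge case: fall back to the last item.
}
```

With weights {1, 3}, the second item is selected roughly three times as often as the first.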

GS_Random API


Angle Helpers

Functions for angle and orientation math, plus 22 preset section configurations for direction classification.

Function — What It Does
YawFromDir(direction) — Extracts yaw angle from a direction vector.
QuatFromYaw(yaw) — Creates a quaternion from a yaw angle.
PickByAngle(angle, sections) — Maps an angle to a section index using a preset configuration.

Section presets range from 2-section (left/right) to 16-section (compass-style), plus diagonal and cardinal configurations. Useful for animation direction selection and 2D-style facing.
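A minimal sketch of what this kind of angle-to-section classification does for a uniform N-way split (illustrative only; the real presets also cover diagonal and cardinal layouts):

```cpp
#include <cmath>

// Illustrative angle-to-section mapping for a uniform N-way split
// (e.g. N = 4 for front/right/back/left). Section 0 is centered on 0 degrees
// and sections proceed in order of increasing angle.
int SectionFromAngle(float angleDegrees, int sectionCount)
{
    const float sectionSize = 360.0f / sectionCount;
    // Shift by half a section so each section is centered on its direction,
    // and wrap negative angles into [0, 360).
    const float shifted = std::fmod(angleDegrees + sectionSize * 0.5f + 360.0f, 360.0f);
    return static_cast<int>(shifted / sectionSize);
}
```

With four sections, an angle of 10 degrees classifies as "front" (section 0) while 100 degrees classifies as section 1.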

Angle Helper API


Spline Helpers

Utility functions for working with O3DE spline components.

Function — What It Does
FindClosestWorldPoint(spline, point) — Returns the closest point on the spline in world space.
FindClosestLocalPoint(spline, point) — Returns the closest point in local space.
FindClosestFraction(spline, point) — Returns the 0–1 fraction along the spline.
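Under the hood, closest-point queries reduce to projecting the query point onto each segment and clamping. A minimal sketch of that per-segment step (Vec3 is a stand-in for the engine's vector type; the real helpers work on O3DE spline components):

```cpp
#include <algorithm>

struct Vec3 { float x, y, z; };

// Illustrative closest-point math: project point p onto segment ab and clamp
// the parameter to [0, 1]. A spline helper repeats this per segment (or per
// curve subdivision) and keeps the nearest result.
float ClosestFractionOnSegment(const Vec3& a, const Vec3& b, const Vec3& p)
{
    const Vec3 ab{b.x - a.x, b.y - a.y, b.z - a.z};
    const Vec3 ap{p.x - a.x, p.y - a.y, p.z - a.z};
    const float lenSq = ab.x * ab.x + ab.y * ab.y + ab.z * ab.z;
    if (lenSq <= 0.0f) return 0.0f; // Degenerate (zero-length) segment.
    const float t = (ap.x * ab.x + ap.y * ab.y + ap.z * ab.z) / lenSq;
    return std::clamp(t, 0.0f, 1.0f);
}
```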

Spline Helper API


Serialization Helpers

Utility functions for common O3DE serialization patterns. Simplifies working with SerializeContext and EditContext in component reflection.

Serialization Helper API


Common Enums

Shared enumeration types used across the framework.

Common Enums API


Glossary

Term — Meaning
Easing Curve — A function that maps linear progress (0→1) to a shaped output for smooth animation
Spring Damper — A physically-modeled interpolation function that settles naturally toward a target
Gradient — A multi-stop sampler that returns interpolated values (color, float, vector) over a 0→1 range
Physics Trigger Volume — A reusable overlap detector that fires enter, stay, and exit callbacks

For full definitions, see the Glossary.


See Also

For the full API, component properties, and code examples:

For related systems:


Get GS_Core

GS_Core — Explore this gem on the product page and add it to your project.

2 - GS_AI

Foundation scaffold for AI-driven entity behavior in the GS_Play framework.

GS_AI provides the structural foundation for AI-driven entity behavior. It defines the base architecture that AI controller implementations build upon, integrating with the Unit system’s controller pattern to drive NPC entities through behavior logic rather than player input.

For architecture details and extension patterns, see the GS_AI API.


How It Works

GS_AI establishes the conventions and base classes for AI in the GS_Play framework. AI controllers extend the Unit Controller pattern — the same mechanism that player controllers use to drive units. Switching an entity between player control and AI control is a standard possession swap with no special handling required.

AI implementations listen to the same game lifecycle events as every other GS_Play system. AI controllers respond to standby mode, stage transitions, and manager lifecycle broadcasts automatically through the manager pattern.


Integration Points

System — How AI Connects
Unit Controllers — AI controllers extend the unit controller pattern to drive NPC movement and actions.
GS_Managers — AI systems respond to startup, standby, and shutdown lifecycle events.
Interaction — AI entities can serve as targeting-field targets, activate world triggers, and emit/receive pulsors.
Cinematics — The Cinematic Controller (in GS_Complete) demonstrates AI-to-cinematic handoff.

See Also

For the full API and extension patterns:

For related systems:


Get GS_AI

GS_AI — Explore this gem on the product page and add it to your project.

3 - GS_Audio

Audio management, event-based sound, music scoring, mixing, and Klatt voice synthesis for the GS_Play framework.

GS_Audio is the audio management gem for GS_Play. It provides event-based sound playback, a multi-layer music scoring system, mixing buses with effects chains, and a built-in Klatt formant voice synthesizer with 3D spatial audio. All audio features integrate with the GS_Play manager lifecycle and respond to standby mode automatically.

For architecture details, component properties, and extension patterns, see the GS_Audio API.


Quick Navigation

I want to… — Feature — API
Manage the audio engine, load event libraries, or control master volume — Audio Manager — API
Play sounds with pooling, 3D spatialization, and concurrency control — Audio Events — API
Configure mixing buses with filters, EQ, and environmental influence effects — Mixing & Effects — API
Layer music tracks dynamically based on gameplay state — Score Arrangement — API
Generate text-to-speech with configurable voice parameters and 3D spatial audio — Klatt Voice — API

Installation

GS_Audio requires GS_Core and the MiniAudio gem. Add both to your project’s gem dependencies.

For a complete guided setup, follow the Simple Project Setup guide.

 

Quick Installation Summary

  1. Enable the GS_Audio gem in your project configuration.
  2. Create an Audio Manager prefab and add it to the Game Manager’s Managers list.
  3. Create Audio Event Library assets for your sound effects.
  4. Place GS_AudioEventComponent on entities that need to play sounds.

Audio Manager

The Audio Manager is the singleton controller for the entire audio system. It initializes the audio engine, manages mixing buses, loads audio event libraries, and coordinates score playback. Like all GS_Play managers, it extends the Manager base class and plugs into the Game Manager’s startup sequence automatically.

Audio Manager API


Audio Events

Audio Events are the primary way to play sounds. Each event defines a pool of audio clips with selection rules (random or sequential), 2D or 3D playback mode, concurrent instance limiting, and repeat-hold behavior. Events are grouped into Audio Event Library assets that the Audio Manager loads at startup.

Audio Events API


Mixing & Effects

The mixing system provides named audio buses with configurable effects chains. Each bus can have filters applied — low pass, high pass, band pass, notch, peaking EQ, shelving, delay, and reverb. Audio Bus Influence Effects allow environmental zones to dynamically modify bus effects based on the listener’s position.

Mixing & Effects API


Score Arrangement

Score Arrangement Tracks are multi-layer music assets. Each arrangement defines a time signature, BPM, fade behavior, and a set of Score Layers — individual audio tracks that can be enabled or disabled independently. This allows dynamic music that adds or removes instrument layers based on gameplay state.

Score Arrangement API


Klatt Voice Synthesis

GS_Play includes a built-in text-to-speech system based on Klatt formant synthesis. The Klatt Voice component converts text to speech in real time with configurable voice parameters — frequency, speed, waveform, formants, and pitch variance. The system supports 3D spatial audio and inline KTT tags for expressive delivery.

Klatt Voice API


See Also

For the full API, component properties, and C++ extension guide:

For step-by-step project setup:


Get GS_Audio

GS_Audio — Explore this gem on the product page and add it to your project.

3.1 - Audio Manager

How to work with the GS_Play Audio Manager — engine initialization, bus routing, and master control.

The Audio Manager is the singleton controller for the entire audio system. It initializes the MiniAudio engine, manages mixing buses, loads audio event libraries, and coordinates score track playback. Like all GS_Play managers, it extends the Manager base class and responds to the Game Manager lifecycle automatically.

For component properties and API details, see the Framework API reference.

Audio Manager component in the O3DE Inspector

 



What It Manages

Responsibility — What It Does
Engine lifecycle — Initializes and shuts down the MiniAudio audio engine.
Mixing buses — Creates and routes named audio buses with effects chains.
Event libraries — Loads Audio Event Library assets at startup.
Score playback — Manages Score Arrangement Track assets for dynamic music.
Master volume — Controls global volume levels per bus.

Quick Reference

Need — Bus — Method
Audio management requests — AudioManagerRequestBus — PlayAudioEvent, StopAudioEvent, SetMixerVolume

Glossary

Term — Meaning
Audio Event Library — A data asset containing named audio event definitions loaded at startup
Mixing Bus — A named audio channel with an effects chain for routing and processing sounds
Score Arrangement — A multi-layer music asset managed by the Audio Manager for dynamic playback

For full definitions, see the Glossary.


See Also

For the full API, component properties, and C++ extension guide:

For related systems:


Get GS_Audio

GS_Audio — Explore this gem on the product page and add it to your project.

3.2 - Audio Events

How to work with GS_Play audio events — data-driven sound playback with pooling, spatialization, and concurrency control.

Audio Events are the primary way to play sounds in GS_Play. Each event defines a pool of audio clips with selection rules, playback mode (2D or 3D), concurrent instance limiting, and repeat-hold behavior. Events are grouped into Audio Event Library assets that the Audio Manager loads at startup.

For asset structure and component properties, see the Framework API reference.

Audio Event Library asset in the O3DE Asset Editor

 



How It Works

  1. Create an Audio Event Library asset in the O3DE Asset Editor.
  2. Define audio events — each with a name, clip pool, and playback settings.
  3. Load the library by assigning it to the Audio Manager.
  4. Play events by name from ScriptCanvas or C++.

Event Configuration

Setting — What It Controls
Pool Selection — Random picks a random clip from the pool; Increment plays clips sequentially.
2D / 3D — 2D plays without spatialization; 3D positions the sound in the world.
Concurrent Limit — Maximum instances of this event that can play simultaneously.
Repeat Hold — Minimum time between repeated plays of the same event.
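The Concurrent Limit and Repeat Hold settings act as two independent gates on playback. A minimal sketch of that check (names and structure are illustrative, not the gem's API):

```cpp
// Illustrative playback gating: an event may start only if it is under its
// concurrent-instance limit AND its repeat-hold window has elapsed.
struct AudioEventState
{
    int   activeInstances = 0;
    float lastPlayTime    = -1.0e9f; // "never played"
};

bool CanPlay(const AudioEventState& state, int concurrentLimit,
             float repeatHold, float now)
{
    if (state.activeInstances >= concurrentLimit) return false; // too many instances
    if (now - state.lastPlayTime < repeatHold)    return false; // played too recently
    return true;
}
```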

GS_AudioEventComponent

Place a GS_AudioEventComponent on any entity that needs to play sounds. The component references events by name from loaded libraries and handles 3D positioning automatically based on the entity’s transform.


Quick Reference

Need — Bus — Method
Play a sound event — AudioManagerRequestBus — PlayAudioEvent(eventName)
Stop a sound event — AudioManagerRequestBus — StopAudioEvent(eventName)

Glossary

Term — Meaning
Audio Event — A named, data-driven sound definition with clip pool, playback mode, and concurrency settings
Audio Event Library — A data asset grouping multiple audio events for batch loading
Pool Selection — The strategy for choosing clips: Random or Increment
Concurrent Limit — Maximum simultaneous instances of the same event

For full definitions, see the Glossary.


See Also

For the full API, component properties, and C++ extension guide:

For related systems:


Get GS_Audio

GS_Audio — Explore this gem on the product page and add it to your project.

3.3 - Mixing & Effects

How to work with GS_Play audio mixing — named buses, effects chains, and environmental audio influence.

The mixing system provides named audio buses with configurable effects chains. Each bus can have multiple audio filters applied for real-time audio processing. Audio Bus Influence Effects allow environmental zones to dynamically modify bus parameters based on the listener’s position.

For component properties and filter types, see the Framework API reference.

 



Available Filters

Filter — What It Does
Low Pass — Removes high frequencies. Simulates muffling (underwater, behind walls).
High Pass — Removes low frequencies. Simulates thin/tinny audio (radio, phone).
Band Pass — Passes a frequency band. Isolates specific ranges.
Notch — Removes a narrow frequency band.
Peaking EQ — Boosts or cuts a frequency band.
Low Shelf — Boosts or cuts frequencies below a threshold.
High Shelf — Boosts or cuts frequencies above a threshold.
Delay — Adds an echo/delay effect.
Reverb — Adds room/space reverb.

Environmental Audio Influence

Audio Bus Influence Effects allow spatial zones to modify bus effects dynamically. When the listener enters an influence zone (like a cave or tunnel), the zone’s effects are applied to the specified bus with priority-based stacking. Multiple overlapping zones resolve by priority.


Quick Reference

Need — Bus — Method
Control mixer settings — GS_MixingRequestBus — Mixer control methods
Set master volume — AudioManagerRequestBus — SetMixerVolume

Glossary

Term — Meaning
Mixing Bus — A named audio channel that routes sound through an effects chain
Audio Filter — A real-time audio processing effect applied to a mixing bus
Audio Bus Influence Effect — A spatial zone that modifies bus effects based on listener position

For full definitions, see the Glossary.


See Also

For the full API, component properties, and C++ extension guide:

For related systems:


Get GS_Audio

GS_Audio — Explore this gem on the product page and add it to your project.

3.4 - Score Arrangement

How to work with GS_Play score arrangements — multi-layer dynamic music with configurable time signatures and layer control.

Score Arrangement Tracks are multi-layer music assets for dynamic, adaptive game music. Each arrangement defines a time signature, tempo, fade behavior, and a set of independently controllable Score Layers. This enables music that responds to gameplay — adding percussion during combat, muting melody during dialogue, or transitioning between intensity levels.

For asset structure and playback API, see the Framework API reference.

Score Arrangement asset in the O3DE Asset Editor

 



How It Works

A Score Arrangement Track is a data asset containing:

Field — What It Controls
Time Signature — Musical timing (4/4, 3/4, 6/8, etc.).
BPM — Tempo in beats per minute.
Fade Control — How layers fade in and out.
Score Layers — Individual audio tracks that play simultaneously.

Each Score Layer is an independent audio stream within the arrangement. Layers can be enabled or disabled at runtime, creating dynamic music that evolves based on game state.
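The timing math implied by BPM and the time signature is straightforward. A sketch that treats the signature's numerator as beats per bar (illustrative helpers, not the gem's API):

```cpp
// Illustrative arrangement timing: BPM gives the beat length, and the time
// signature's numerator gives beats per bar. At 120 BPM in 4/4, a beat lasts
// half a second and a bar lasts two seconds.
float SecondsPerBeat(float bpm)                 { return 60.0f / bpm; }
float SecondsPerBar(float bpm, int beatsPerBar) { return SecondsPerBeat(bpm) * beatsPerBar; }
```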


Supported Time Signatures

4/4, 4/2, 12/8, 2/2, 2/4, 6/8, 3/4, 3/2, 9/8


Glossary

Term — Meaning
Score Arrangement Track — A multi-layer music data asset with time signature, tempo, and controllable layers
Score Layer — An individual audio stream within an arrangement that can be enabled or disabled at runtime
BPM — Beats per minute; the tempo of the arrangement

For full definitions, see the Glossary.


See Also

For the full API, component properties, and C++ extension guide:

For related systems:


Get GS_Audio

GS_Audio — Explore this gem on the product page and add it to your project.

3.5 - Klatt Voice Synthesis

How to work with GS_Play Klatt voice synthesis — text-to-speech with 3D spatial audio, voice profiles, and inline parameter control.

GS_Play includes a built-in text-to-speech system based on Klatt formant synthesis. The KlattVoiceComponent converts text to speech in real time with configurable voice parameters. Voices are positioned in 3D space and attenuate with distance, making synthesized speech feel like it comes from the character speaking.

For component properties, voice parameter details, and the phoneme mapping system, see the Framework API reference.

Klatt Voice Profile asset in the O3DE Asset Editor

 



How It Works

  1. Configure a voice using a KlattVoiceProfile — set frequency, speed, waveform, formants, and pitch variance.
  2. Assign a KlattPhonemeMap — maps text characters to ARPABET phonemes for pronunciation.
  3. Speak text from ScriptCanvas or C++ — the system converts text to phonemes and synthesizes audio in real time.
  4. Position in 3D — the voice component uses KlattSpatialConfig for 3D audio positioning relative to the entity.

Voice Configuration

Parameter — What It Controls
Frequency — Base voice pitch.
Speed — Speech rate.
Waveform — Voice quality: Saw, Triangle, Sin, Square, Pulse, Noise, Warble.
Formants — Vocal tract resonance characteristics.
Pitch Variance — Random pitch variation for natural-sounding speech.
Declination — Pitch drop over the course of a sentence.

KTT Tags

KTT (Klatt Text Tags) allow inline parameter changes within speech text for expressive delivery:

"Hello <speed=0.5>world</speed>, how are <pitch=1.2>you</pitch>?"

The KlattCommandParser processes these tags during speech synthesis, enabling mid-sentence changes to speed, pitch, and other voice parameters.
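A minimal sketch of the tag-splitting step for a single `<name=value>` tag (illustrative only; the real KlattCommandParser also handles closing tags, nesting, and reset behavior):

```cpp
#include <string>

// Illustrative parse of one inline tag of the form "<name=value>".
// Returns false if the input is not a well-formed tag.
bool ParseTag(const std::string& tag, std::string& name, float& value)
{
    if (tag.size() < 5 || tag.front() != '<' || tag.back() != '>') return false;
    const size_t eq = tag.find('=');
    if (eq == std::string::npos) return false;
    name  = tag.substr(1, eq - 1);                           // e.g. "speed"
    value = std::stof(tag.substr(eq + 1, tag.size() - eq - 2)); // e.g. 0.5
    return true;
}
```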

For the complete tag reference — all attributes, value ranges, and reset behavior — see the Framework API: KTT Voice Tags.


Phoneme Maps

Two base phoneme maps are available:

Map — Description
SoLoud_Default — Simple default mapping.
CMU_Full — Full CMU pronunciation dictionary mapping.

Custom phoneme overrides allow project-specific word pronunciations (character names, fantasy terms) without modifying the base map.


3D Spatial Audio

The KlattSpatialConfig controls how synthesized speech is positioned in 3D:

  • Voices attenuate with distance from the listener.
  • The KlattVoiceSystemComponent tracks the listener position and updates all active voices.
  • Multiple characters can speak simultaneously with correct spatial positioning.

Quick Reference

Need — Bus — Method
Control a voice — KlattVoiceRequestBus — Voice synthesis methods (entity-addressed)
System-level voice control — KlattVoiceSystemRequestBus — Listener tracking, engine management

Glossary

Term — Meaning
Klatt Synthesis — A formant-based speech synthesis method that generates voice from frequency parameters
KTT Tags — Inline text tags that modify voice parameters mid-sentence during synthesis
Phoneme Map — A mapping from text characters to ARPABET phonemes for pronunciation
KlattSpatialConfig — Configuration for 3D audio positioning and distance attenuation of synthesized speech

For full definitions, see the Glossary.


See Also

For the full API, component properties, and C++ extension guide:

For related systems:


Get GS_Audio

GS_Audio — Explore this gem on the product page and add it to your project.

4 - GS_Cinematics

Node-graph dialogue sequences, cinematic stage management, polymorphic performances, and world-space UI with typewriter and audio babble.

GS_Cinematics is the complete dialogue and cinematic control system for GS_Play. It provides a node-graph authoring tool for branching dialogue sequences, a runtime sequencer with conditions and side effects, a UI layer for text display and player choice, and a Cinematics Manager for scene staging. Custom conditions, effects, and performance types are discovered automatically through O3DE serialization, so project-specific dialogue behaviors can be added without modifying the gem.

For architecture details, component properties, and extending the system in C++, see the GS_Cinematics API.


Quick Navigation

I want to… — Feature — API
Coordinate cinematic sequences and manage stage markers for actor positioning — Cinematics Manager — API
Author branching dialogue with conditions, effects, and performances in a node graph — Dialogue System — API
Display dialogue text, player choices, and speech babble on screen or in world space — Dialogue UI — API
Move actors to stage markers during dialogue with navigation or teleport — Performances — API

Installation

GS_Cinematics requires GS_Core, LyShine, and RecastNavigation. The node editor tools additionally require GraphCanvas and GraphModel as editor-only dependencies.

For a full guided walkthrough, follow the Simple Project Setup guide.

 

Quick Installation Summary

  1. Enable the GS_Cinematics gem in your project configuration.
  2. Create Cinematics Manager and Dialogue Manager prefabs, add to the Game Manager’s Managers list.
  3. Create .dialoguedb assets with the Dialogue Editor.
  4. Place DialogueSequencer and DialogueUI components in your level.
  5. Bake NavMesh in levels where PathTo performances will be used.

Cinematics Manager

The Cinematics Manager coordinates the begin and end of cinematic sequences, broadcasting enter/exit events so other systems know when to yield control. It maintains a registry of stage marker entities placed in each level — named anchor points that performers and cameras look up at runtime to determine positioning during a sequence.

Cinematics Manager API


Dialogue System

The Dialogue System is the authoring and runtime core. Dialogue content is stored in .dialoguedb assets containing actors, portraits, and sequences. Each sequence is a graph of polymorphic nodes — text, selection, random branch, effects, and performances. A visual node editor lets writers author graphs directly in the O3DE Editor. At runtime, the sequencer walks the graph, evaluates conditions, executes effects, and emits events for the UI layer.

Dialogue System API


Dialogue UI

The Dialogue UI feature set puts dialogue text and player choices on screen. DialogueUI handles screen-space text display with a Typewriter for character-by-character reveal. DialogueUISelection renders player choices as selectable buttons. World-space variants place speech bubbles above actors. Babble plays audio tied to the active speaker. The DialogueUIBridge routes sequencer events to the correct UI implementation and routes player selection back.

Dialogue UI API


Performances

Performances are polymorphic actions that move or reposition actors during dialogue. MoveTo translates actors to named stage markers, PathTo navigates via NavMesh, and Reposition teleports instantly. All run asynchronously — the sequencer waits for completion before advancing. Custom performance types can be created by extending the performance base class.

Performances API


See Also

For the full API, component properties, and C++ extension guide:

For step-by-step project setup:


Get GS_Cinematics

GS_Cinematics — Explore this gem on the product page and add it to your project.

4.1 - Cinematics Manager

How to coordinate cinematic sequences in GS_Play — beginning and ending cutscenes, registering stage markers, and reacting to cinematic events from scripts.

The Cinematics Manager is the GS_Core-integrated manager component for the GS_Cinematics system. It signals when a cinematic sequence begins and ends, broadcasts events so other systems — UI, movement, input — know when to yield control to a cutscene, and maintains a registry of named CinematicStageMarkerComponent entities placed in each level.

For architecture details, component properties, and extending the system in C++, see the Framework API reference.

Cinematics Manager component in the O3DE Inspector

 



Stage Markers

Cinematic Stage Marker component in the O3DE Inspector

Stage markers are named anchor entities placed in each level that serve as spatial reference points for cinematic sequences. Performers and camera systems look up markers by name at runtime to determine where actors should stand, face, or move to during a cutscene.

This design decouples authored dialogue data from level-specific layout. The same sequence asset plays in any level as long as that level provides CinematicStageMarkerComponent entities with the expected names.

Component — Purpose
CinematicStageMarkerComponent — Marks a named position in the level for cinematic staging.

To register a marker, add CinematicStageMarkerComponent to an entity in the level and give it a name. The Cinematics Manager automatically discovers and registers all markers in the level on startup via RegisterStageMarker. At runtime, any system can retrieve a marker entity by name with GetStageMarker.


Cinematic Events

When a cinematic begins and ends, the Cinematics Manager broadcasts events on CinematicsManagerNotificationBus. Listen to these in any system that needs to yield or reclaim control during a cutscene — player input, HUD, AI, camera.

Event — When It Fires — What to Do
EnterCinematic — A cinematic sequence has started. — Disable player input, hide the HUD, suspend AI.
ExitCinematic — The cinematic sequence has ended. — Re-enable input, restore HUD, resume AI.

Starting and Ending Cinematics

Call BeginCinematic to signal that a cinematic is starting and EndCinematic when it completes. These calls broadcast EnterCinematic and ExitCinematic respectively. They do not drive animation or sequence playback directly — that is the role of DialogueSequencerComponent. The Cinematics Manager handles the global state change so all listening systems respond in one coordinated broadcast.

Bus — Method — What It Does
CinematicsManagerRequestBus — BeginCinematic — Broadcasts EnterCinematic to all listeners.
CinematicsManagerRequestBus — EndCinematic — Broadcasts ExitCinematic to all listeners.
CinematicsManagerRequestBus — RegisterStageMarker — Adds a marker to the registry by name.
CinematicsManagerRequestBus — GetStageMarker — Returns the entity for a marker by name.
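The coordination pattern itself can be sketched generically. The snippet below models the broadcast idea only; it is not O3DE's actual EBus machinery, and the handler shape is invented for illustration:

```cpp
#include <functional>
#include <vector>

// Illustrative one-to-many broadcast: BeginCinematic/EndCinematic notify every
// registered listener in a single pass, so input, HUD, and AI react together.
struct CinematicListeners
{
    std::vector<std::function<void(bool)>> handlers; // argument: true = entering

    void BeginCinematic() { for (auto& h : handlers) h(true); }
    void EndCinematic()   { for (auto& h : handlers) h(false); }
};
```

In the real system each interested component connects a handler to CinematicsManagerNotificationBus instead of registering a callback directly.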

ScriptCanvas Usage

Reacting to Cinematic State

To pause gameplay systems when a cinematic starts and resume them when it ends, connect to CinematicsManagerNotificationBus:

Triggering a Cinematic

To start a cinematic from a trigger or cutscene entity, call BeginCinematic, drive the sequence through the Dialogue Sequencer, then call EndCinematic on completion:

Looking Up a Stage Marker


Quick Reference

Need — Bus — Method / Event
Start a cinematic — CinematicsManagerRequestBus — BeginCinematic
End a cinematic — CinematicsManagerRequestBus — EndCinematic
Register a stage marker — CinematicsManagerRequestBus — RegisterStageMarker
Retrieve a stage marker entity — CinematicsManagerRequestBus — GetStageMarker
Know when a cinematic starts — CinematicsManagerNotificationBus — EnterCinematic
Know when a cinematic ends — CinematicsManagerNotificationBus — ExitCinematic

Glossary

Term — Meaning
Stage Marker — A named anchor entity in a level used as a spatial reference for cinematic positioning
Cinematic — A global state where the Cinematics Manager has signaled that a cutscene is in progress

For full definitions, see the Glossary.


See Also

For the full API, component properties, and C++ extension guide:

For related systems:


Get GS_Cinematics

GS_Cinematics — Explore this gem on the product page and add it to your project.

4.2 - Dialogue System

How to author and play back branching dialogue in GS_Play — dialogue databases, node types, conditions, effects, and the runtime sequencer.

The Dialogue System is the authoring and runtime core of GS_Cinematics. Dialogue content is stored in .dialoguedb assets (DialogueDatabase), which contain named actors and a collection of DialogueSequence records. Each sequence is a graph of polymorphic nodes. At runtime, GS_DialogueManagerComponent manages the database and maps performer names to entities in the level, while DialogueSequencerComponent drives playback — walking the graph, evaluating conditions on branches, executing effects, and emitting events that UI and other systems consume.

For architecture details, component properties, and extending the system in C++, see the Framework API reference.

Dialogue Manager component in the O3DE Inspector

 



Dialogue Database

Dialogue Authoring Pattern Graph

Breakdown

A dialogue sequence is authored in the node editor, stored in a .dialoguedb asset, and driven at runtime by the Dialogue Manager and Sequencer:

Layer — What It Means
DialogueDatabase — Stores named actors and sequences. Loaded at runtime by the Dialogue Manager.
DialogueSequence — A directed node graph. The Sequencer walks from startNodeId through Text, Selection, Effects, and Performance nodes.
Conditions — Polymorphic evaluators on branches. Failed conditions skip that branch automatically.
Effects — Polymorphic actions at EffectsNodeData nodes: set records, toggle entities.
Performers — Named actor anchors in the level. Resolved from database actor names via DialoguePerformerMarkerComponent.

Conditions, effects, and performances are discovered automatically at startup — custom types from any gem appear in the editor without modifying GS_Cinematics.

“E” indicates extensible classes and methods.

Patterns - Complete list of system patterns used in GS_Play.

DialogueDatabase Asset

Dialogue Database asset in the O3DE Editor

The DialogueDatabase is a .dialoguedb asset authored in the O3DE node editor. It is the single source of truth for all dialogue content in a project section.

Asset Contents — Purpose
Actors — Named character definitions with portrait and metadata.
Sequences — Named DialogueSequence records, each a graph of nodes.

Load a database at runtime by calling ChangeDialogueDatabase on DialogueManagerRequestBus. The manager resolves performer names from the database against DialoguePerformerMarkerComponent entities placed in the current level.


Node Types

Each sequence is a directed graph of DialogueNodeData nodes. The sequencer walks the graph starting from startNodeId and advances through the nodes in order.

Node Type — What It Does
TextNodeData — Displays a speaker line. Supports LocalizedStringId for localization.
SelectionNodeData — Presents the player with a list of choices. Branches based on selection.
RandomNodeData — Selects a random outgoing branch.
EffectsNodeData — Executes one or more DialogueEffect objects without advancing to a text node.
PerformanceNodeData — Triggers a DialoguePerformance action and waits for OnPerformanceComplete before continuing.

Conditions

Conditions are polymorphic objects attached to sequence branches. The sequencer evaluates all conditions on a branch before following it. Branches whose conditions fail are skipped.

Condition Type — What It Evaluates
Boolean_DialogueCondition — A base boolean comparison.
Record_DialogueCondition — Checks a game record via the RecordKeeper system. Extends Boolean_DialogueCondition.
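Branch selection can be sketched as "follow the first branch whose conditions all pass". Conditions are modeled as plain callables here for brevity; the gem's real conditions are polymorphic objects:

```cpp
#include <functional>
#include <vector>

using Condition = std::function<bool()>;

struct Branch
{
    int targetNodeId;                  // node the sequencer advances to
    std::vector<Condition> conditions; // ALL must pass for the branch to be taken
};

// Illustrative branch evaluation: returns the target of the first branch whose
// conditions all pass, or -1 when every branch is blocked.
int PickBranch(const std::vector<Branch>& branches)
{
    for (const Branch& b : branches)
    {
        bool allPass = true;
        for (const Condition& c : b.conditions)
        {
            if (!c()) { allPass = false; break; }
        }
        if (allPass) return b.targetNodeId;
    }
    return -1;
}
```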

Effects

Effects are polymorphic objects executed when the sequencer reaches an EffectsNodeData node. Effects can also be reversed.

Effect Type — What It Does
SetRecords_DialogueEffect — Sets one or more game records via the RecordKeeper system.
ToggleEntitiesActive_DialogueEffect — Toggles one or more entities active or inactive.

Performances

Performances are polymorphic objects executed when the sequencer reaches a PerformanceNodeData node. A performance can be blocking — the sequencer pauses and waits for OnPerformanceComplete before advancing — or non-blocking, where dialogue continues immediately after the performance fires.

Like conditions and effects, performance types are discovered automatically at startup via the type registry. Custom performance types from any gem appear in the editor without modifying GS_Cinematics.

Performance Type — What It Does
MoveTo_DialoguePerformance — Smoothly moves a named performer to a CinematicStageMarkerComponent position. Fires MoveTo_PerformanceNotificationBus; a companion component on the performer entity responds, moves it, then signals completion.
PathTo_DialoguePerformance — Navigates a named performer to a marker using the scene navmesh. Uses RecastNavigation pathfinding through the world geometry rather than direct interpolation.
RepositionPerformer_DialoguePerformance — Instantly teleports a performer to a marker. Non-blocking; dialogue advances without waiting.

Runtime Playback

GS_DialogueManagerComponent

The Dialogue Manager is the top-level manager for all dialogue. It holds the active DialogueDatabase, maps performer names to level entities via DialoguePerformerMarkerComponent, and is the entry point for starting sequences by name.

| Bus | Method | What It Does |
| --- | --- | --- |
| DialogueManagerRequestBus | StartDialogueSequenceByName | Starts a named sequence from the active database. |
| DialogueManagerRequestBus | ChangeDialogueDatabase | Loads a different DialogueDatabase asset. |
| DialogueManagerRequestBus | RegisterPerformerMarker | Registers a performer entity by name for the current level. |
| DialogueManagerRequestBus | GetPerformer | Returns the entity for a named performer. |

DialogueSequencerComponent

The Dialogue Sequencer drives sequence playback. It walks the node graph, evaluates conditions, executes effects, triggers performances, and emits notifications when text lines begin and when the sequence completes. It is typically placed on a dedicated sequencer entity in the level alongside DialogueUIBridgeComponent.

| Bus | Method / Event | Purpose |
| --- | --- | --- |
| DialogueSequencerRequestBus | StartDialogueBySequence | Begins playback of a given sequence object. |
| DialogueSequencerNotificationBus | OnDialogueTextBegin | Fires when a TextNodeData begins — carries speaker and text data. |
| DialogueSequencerNotificationBus | OnDialogueSequenceComplete | Fires when the sequence reaches its end node. |

Localization

Text nodes store speaker lines as LocalizedStringId references rather than raw strings. A LocalizedStringId holds a key and a default fallback text. At runtime, the sequencer calls Resolve() on each LocalizedStringId, which looks up the key in the active LocalizedStringTable and returns the localized string for the current locale. If no table is loaded or the key is not found, the default text is returned.

To use localization, load a LocalizedStringTable asset through your project’s initialization flow before any dialogue plays.
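The resolve-with-fallback behaviour described above can be sketched as a small self-contained model. The struct and member names here are illustrative stand-ins, not the GS_Play API — they only demonstrate the key-lookup-then-default contract.

```cpp
#include <string>
#include <unordered_map>

// Hypothetical model of a string table: key -> localized text.
struct LocalizedStringTable
{
    std::unordered_map<std::string, std::string> m_entries;
};

// Hypothetical model of a LocalizedStringId: a key plus default fallback text.
struct LocalizedStringId
{
    std::string m_key;
    std::string m_defaultText;

    // Look up the key in the active table; fall back to the default text
    // when no table is loaded or the key is not found.
    std::string Resolve(const LocalizedStringTable* activeTable) const
    {
        if (activeTable)
        {
            auto it = activeTable->m_entries.find(m_key);
            if (it != activeTable->m_entries.end())
            {
                return it->second;
            }
        }
        return m_defaultText;
    }
};
```

Because the default text always survives, dialogue authored before a string table exists still displays correctly at runtime.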


ScriptCanvas Usage

Starting a Dialogue Sequence

Reacting to Sequence States

Listen on DialogueSequencerNotificationBus to receive speaker and text data as each line begins:


Quick Reference

| Need | Bus | Method / Event |
| --- | --- | --- |
| Start a sequence by name | DialogueManagerRequestBus | StartDialogueSequenceByName |
| Load a different database | DialogueManagerRequestBus | ChangeDialogueDatabase |
| Register a performer in the level | DialogueManagerRequestBus | RegisterPerformerMarker |
| Get a performer entity | DialogueManagerRequestBus | GetPerformer |
| Start a sequence object directly | DialogueSequencerRequestBus | StartDialogueBySequence |
| React to a text line starting | DialogueSequencerNotificationBus | OnDialogueTextBegin |
| React to sequence completion | DialogueSequencerNotificationBus | OnDialogueSequenceComplete |

Glossary

| Term | Meaning |
| --- | --- |
| DialogueDatabase | A .dialoguedb asset containing actors and dialogue sequences |
| DialogueSequence | A graph of nodes that defines a single dialogue conversation |
| DialogueNodeData | A polymorphic node in the sequence graph (Text, Selection, Random, Effects, Performance) |
| DialogueCondition | A polymorphic evaluator attached to branches that gates progression |
| DialogueEffect | A polymorphic action executed at EffectsNodeData nodes (e.g., setting records) |
| DialoguePerformance | A polymorphic action executed at PerformanceNodeData nodes — moves, paths, or repositions performers. Can be blocking or non-blocking |
| Performer | A named actor entity in the level mapped from the database via DialoguePerformerMarkerComponent |

For full definitions, see the Glossary.


See Also

For the full API, component properties, and C++ extension guide:

For related systems:


Get GS_Cinematics

GS_Cinematics — Explore this gem on the product page and add it to your project.

4.2.1 - Dialogue UI

How to display dialogue text, player choices, and typewriter effects in GS_Play — screen-space and world-space UI components and the bridge that connects them to the sequencer.

The Dialogue UI layer is the display side of the GS_Cinematics system. It receives events from DialogueSequencerComponent through DialogueUIBridgeComponent and routes them to the correct UI implementation — screen-space for HUD-style dialogue or world-space for speech bubbles above actors. Player choices are handled by selection components, and the TypewriterComponent reveals text character-by-character. BabbleComponent optionally plays per-character audio babble to give speakers a voice.

For architecture details, component properties, and extending the system in C++, see the Framework API reference.

DialogueUIComponent canvas in the O3DE UI Editor

 

Contents


Component Overview

| Component | Space | Purpose |
| --- | --- | --- |
| DialogueUIComponent | Screen | Displays the current speaker line on the HUD. |
| WorldDialogueUIComponent | World | Displays speech bubbles positioned above actors in 3D space. |
| DialogueUISelectionComponent | Screen | Renders player choice options on the HUD. |
| WorldDialogueUISelectionComponent | World | Renders player choice options in 3D world space. |
| DialogueUIBridgeComponent | | Routes sequencer events to UI and player input back to sequencer. |
| TypewriterComponent | | Reveals text one character at a time with configurable timing. |
| BabbleComponent | | Plays per-character audio babble for the active speaker. |

DialogueUIBridgeComponent

The Bridge component is the central connector between the sequencer and the UI. Place it on the same entity as DialogueSequencerComponent. It listens for OnDialogueTextBegin and OnDialogueSequenceComplete from the sequencer and forwards those events to whatever UI components are registered with it. It also receives player selection events from the selection UI and forwards them back to the sequencer.

This design keeps the sequencer and the display fully decoupled — swapping in a new UI implementation only requires registering it with the Bridge.

| Bus | Method | What It Does |
| --- | --- | --- |
| DialogueUIBridgeRequestBus | RunDialogue | Sends a text line to the registered dialogue UI. |
| DialogueUIBridgeRequestBus | RunSelection | Sends selection options to the registered selection UI. |
| DialogueUIBridgeRequestBus | RegisterDialogueUI | Registers a UI entity as the active dialogue display target. |
| DialogueUIBridgeRequestBus | CloseDialogue | Tells the registered UI to close. |
| DialogueUIBridgeNotificationBus | OnDialogueComplete | Fires when the dialogue UI reports it has finished displaying. |
| DialogueUIBridgeNotificationBus | OnSelectionComplete | Fires when the player makes a selection. |

Dialogue Display Components

Screen-Space Text

DialogueUIComponent handles HUD-style dialogue display. Add it to a UI canvas entity. It exposes the active TypewriterComponent so other systems can check typewriter state or force completion.

World-Space Speech Bubbles

WorldDialogueUIComponent extends DialogueUIComponent for world-space placement. It positions the dialogue panel above the speaking actor’s entity in 3D space. Use this for over-the-shoulder dialogue or conversations where the camera stays in the world rather than cutting to a UI overlay.


Selection Components

Screen-Space Choices

DialogueUISelectionComponent renders the player’s available choices as a list on the HUD. Each choice is backed by a DialogueSelectButtonComponent entity that is configured with option text and a selection index. When the player activates a button, it fires back through the Bridge to the sequencer.

World-Space Choices

WorldDialogueUISelectionComponent extends DialogueUISelectionComponent for world-space display. Choices appear positioned in 3D space rather than as a HUD overlay, useful for games with diegetic UI.


TypewriterComponent

TypewriterComponent in the O3DE Inspector

The TypewriterComponent reveals a string one character at a time. It fires OnTypeFired for each character revealed and OnTypewriterComplete when the full string is displayed. Use ForceComplete to instantly reveal the remaining text — typically wired to a player skip input — and ClearTypewriter to reset the display to empty.

| Bus | Method / Event | What It Does |
| --- | --- | --- |
| TypewriterRequestBus | StartTypewriter(text) | Begins revealing the given string character by character. |
| TypewriterRequestBus | ForceComplete | Instantly reveals all remaining characters. |
| TypewriterRequestBus | ClearTypewriter | Clears the display and resets state. |
| TypewriterNotificationBus | OnTypeFired | Fires each time a character is revealed. |
| TypewriterNotificationBus | OnTypewriterComplete | Fires when the full string has been revealed. |

BabbleComponent

BabbleComponent pairs with TypewriterComponent to play short audio sounds for each character revealed, giving the impression of a speaker’s voice. Each actor has a SpeakerBabbleEvents record that maps them to a specific babble sound profile. The component returns the correct BabbleToneEvent for the current speaker via GetBabbleEvent, which is called by the typewriter on each OnTypeFired.

| Bus | Method | What It Does |
| --- | --- | --- |
| BabbleRequestBus | GetBabbleEvent | Returns the babble audio event for the active speaker. |

ScriptCanvas Usage

Forcing Typewriter Completion on Skip Input

Wire a player input action to ForceComplete so pressing a button instantly reveals the current line:

Reacting to Typewriter Events


Quick Reference

| Need | Bus | Method / Event |
| --- | --- | --- |
| Route sequencer events to UI | DialogueUIBridgeRequestBus | RunDialogue / RunSelection |
| Register a UI entity with the bridge | DialogueUIBridgeRequestBus | RegisterDialogueUI |
| Close the dialogue UI | DialogueUIBridgeRequestBus | CloseDialogue |
| React when dialogue UI finishes | DialogueUIBridgeNotificationBus | OnDialogueComplete |
| React when player makes a choice | DialogueUIBridgeNotificationBus | OnSelectionComplete |
| Start typewriter reveal | TypewriterRequestBus | StartTypewriter |
| Skip / instantly reveal text | TypewriterRequestBus | ForceComplete |
| Clear the typewriter display | TypewriterRequestBus | ClearTypewriter |
| React to each character reveal | TypewriterNotificationBus | OnTypeFired |
| React to full text revealed | TypewriterNotificationBus | OnTypewriterComplete |
| Get babble event for speaker | BabbleRequestBus | GetBabbleEvent |

Glossary

| Term | Meaning |
| --- | --- |
| DialogueUIBridge | The connector component that routes sequencer events to UI and player input back to the sequencer |
| Typewriter | A text reveal component that displays characters one at a time with configurable timing |
| Babble | Per-character audio playback that simulates a speaker’s voice during typewriter text reveal |

For full definitions, see the Glossary.


See Also

For the full API, component properties, and C++ extension guide:

For related systems:


Get GS_Cinematics

GS_Cinematics — Explore this gem on the product page and add it to your project.

4.2.2 - Performances

How to move and reposition actors during dialogue in GS_Play — the polymorphic performance system, built-in movement types, and async completion.

Performances are polymorphic actions that move or reposition actors during dialogue sequences. The sequencer triggers a performance when it reaches a PerformanceNodeData node and waits for OnPerformanceComplete before advancing. This async model lets multi-step actor choreography complete fully before dialogue continues. Three built-in performance types cover direct movement, navmesh navigation, and instant teleportation. Custom types can be created by extending the base class.

For architecture details, component properties, and extending the system in C++, see the Framework API reference.

Dialogue Performer Marker component in the O3DE Inspector Performance Node in the Dialogue Editor

 

Contents


How Performances Work

When the sequencer encounters a PerformanceNodeData node, it:

  1. Resolves the named performer entity via DialogueManagerRequestBus → GetPerformer.
  2. Instantiates the performance object specified on the node.
  3. Calls DoPerformance() — the public entry point.
  4. Waits. The sequencer does not advance until the performance broadcasts OnPerformanceComplete.
  5. Resumes from the next node once completion is received.

The performance itself handles the movement logic, monitors completion, and signals back through its notification bus. This means a single PerformanceNodeData can represent any action that takes time — walking to a spot, running a path, playing an animation, triggering a VFX sequence — as long as it broadcasts completion when done.


Built-in Performance Types

| Performance Type | Movement Method | When to Use |
| --- | --- | --- |
| MoveTo_DialoguePerformance | Direct translation to marker(s) | Open areas, stylized movement, non-physical traversal. |
| PathTo_DialoguePerformance | RecastNavigation pathfinding | Realistic navigation around obstacles and geometry. |
| RepositionPerformer_DialoguePerformance | Instant teleport to marker | Off-screen repositioning between scenes, no visible movement needed. |

All three extend DialoguePerformance, the abstract base class. The base class provides DoPerformance(), ExecutePerformance(), and FinishPerformance() hooks, plus TickBus integration for per-frame movement updates.


MoveTo_DialoguePerformance

MoveTo_DialoguePerformance translates an actor toward one or more named stage markers using direct movement — no obstacle avoidance, no navmesh. It broadcasts movement requests over MoveTo_PerformanceNotificationBus. Listening systems (typically the performer’s movement component) receive those requests and execute the actual translation. When the actor reaches the final marker, the performance calls FinishPerformance() and broadcasts completion.

Use this for stylized games where physical path correctness is less important than snappy, predictable actor placement, or for any case where the path between actor and marker is guaranteed to be clear.

| Bus | Purpose |
| --- | --- |
| MoveTo_PerformanceRequestBus | Query state of the performance. |
| MoveTo_PerformanceNotificationBus | Receives move-to-marker requests from the performance. |

PathTo_DialoguePerformance

PathTo_DialoguePerformance navigates an actor to one or more named stage markers using the RecastNavigation navmesh. It requests a path from the navmesh, walks the actor along that path, and completes when the actor arrives at the final marker. Use this in games with detailed geometry where actors must walk around walls, furniture, and obstacles rather than moving in a straight line.

RecastNavigation must be enabled in the project and a navmesh must be baked in any level where PathTo performances are used.

| Bus | Purpose |
| --- | --- |
| PathTo_PerformanceRequestBus | Query state of the performance. |
| PathTo_PerformanceNotificationBus | Receives path-to-marker requests from the performance. |

RepositionPerformer_DialoguePerformance

RepositionPerformer_DialoguePerformance teleports an actor instantly to a named stage marker. It does not animate, translate, or navigate — it sets position directly. Useful for placing actors at the start of a new scene, recovering from a previous sequence that ended far from the required starting point, or repositioning actors that are off-camera and do not need visible movement.

| Bus | Purpose |
| --- | --- |
| RepositionPerformer_PerformanceNotificationBus | Receives teleport requests from the performance. |

Async Completion

All performances signal completion asynchronously. The sequencer does not poll or time out — it simply waits for the performance to call FinishPerformance(), which broadcasts OnPerformanceComplete over the appropriate notification bus. This means:

  • A performance can take any amount of time.
  • A performance can be driven by external events — animation callbacks, physics arrival, etc.
  • Multiple performances can run in parallel if the node graph is structured to fork, because each notifies independently.

When writing custom performance types, always call FinishPerformance() when the action is done. Forgetting to do so will stall the sequencer indefinitely.
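The wait-for-completion contract can be modelled as a callback handed to the performance when it starts. This is a minimal sketch of the pattern, not the GS_Cinematics base class — the class names mirror the docs but the signatures are assumptions.

```cpp
#include <functional>
#include <utility>

// Model of the async contract: the sequencer supplies a completion callback
// and does not advance until the performance invokes it.
class DialoguePerformanceModel
{
public:
    virtual ~DialoguePerformanceModel() = default;

    void DoPerformance(std::function<void()> onComplete)
    {
        m_onComplete = std::move(onComplete);
        ExecutePerformance();
    }

protected:
    virtual void ExecutePerformance() = 0;

    // Every concrete performance must call this exactly once when its
    // action is done -- forgetting to do so stalls the sequencer.
    void FinishPerformance()
    {
        if (m_onComplete)
        {
            m_onComplete();
        }
    }

private:
    std::function<void()> m_onComplete;
};

// A reposition-style performance: nothing to wait for, so it completes
// immediately inside ExecutePerformance().
class InstantPerformanceModel : public DialoguePerformanceModel
{
protected:
    void ExecutePerformance() override
    {
        FinishPerformance();
    }
};
```

A movement-style performance would instead store the callback and call FinishPerformance() later, from a tick update or an arrival event.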


Extending with Custom Performances

Custom performance types are discovered automatically through O3DE serialization at startup. Extend DialoguePerformance, reflect it, and your custom type appears in the node editor’s performance picker and can be placed on any PerformanceNodeData node.

See the Framework API reference for the full base class interface and extension guide.


ScriptCanvas Usage

Performances are authored in the dialogue node graph and executed automatically by the sequencer, so most projects do not need to drive them from ScriptCanvas directly. The common scripting patterns live on the movement side — receiving move or path requests from a performance and executing them on the performer entity.

Receiving a MoveTo Request

[MoveTo_PerformanceNotificationBus → StartMoveToMarker(markerEntity)]
    └─► [Move performer toward markerEntity position]
    └─► [When arrived → MoveTo_PerformanceRequestBus → ReportArrival]

Receiving a Reposition Request

[RepositionPerformer_PerformanceNotificationBus → RepositionToMarker(markerEntity)]
    └─► [Set performer transform to markerEntity transform]

Quick Reference

| Need | Bus | Method / Event |
| --- | --- | --- |
| Receive a MoveTo move request | MoveTo_PerformanceNotificationBus | StartMoveToMarker |
| Receive a PathTo navigation request | PathTo_PerformanceNotificationBus | StartPathToMarker |
| Receive a reposition teleport request | RepositionPerformer_PerformanceNotificationBus | RepositionToMarker |
| Query active MoveTo state | MoveTo_PerformanceRequestBus | (see Framework API) |
| Query active PathTo state | PathTo_PerformanceRequestBus | (see Framework API) |

Glossary

| Term | Meaning |
| --- | --- |
| Performance | A polymorphic action that moves or repositions an actor during a dialogue sequence |
| PerformanceNodeData | A dialogue graph node that triggers a performance and waits for completion |
| Async Completion | The pattern where the sequencer waits for OnPerformanceComplete before advancing |

For full definitions, see the Glossary.


See Also

For the full API, component properties, and C++ extension guide:

For related systems:


Get GS_Cinematics

GS_Cinematics — Explore this gem on the product page and add it to your project.

5 - GS_Environment

World time and environmental systems for GS_Play — time-of-day progression, day/night cycle management, and data-driven sky configuration.

GS_Environment manages the living world — the passage of time, the shift between day and night, and the visual appearance of the sky. It provides a singleton time authority that other systems subscribe to for world tick events and phase-change notifications, decoupling every time-dependent system from managing its own clock. Sky presentation is data-driven through configuration assets.

For architecture details, component properties, and extending the system in C++, see the GS_Environment API.


Quick Navigation

| I want to… | Feature | API |
| --- | --- | --- |
| Control world time, time passage speed, or respond to day/night changes | Time Manager | API |
| Define how the sky looks at different times of day with data assets | Sky Configuration | API |

Installation

GS_Environment requires GS_Core and the Atom renderer. Add both to your project.

For a full guided walkthrough, follow the Simple Project Setup guide.

 

Quick Installation Summary

  1. Enable the GS_Environment gem in your project configuration.
  2. Create a Time Manager prefab and add it to the Game Manager’s Managers list.
  3. Create SkyColourConfiguration assets for your sky appearance.

Time Manager

The Time Manager is the singleton authority for world time. It advances time each tick according to a configurable speed, exposes query and control methods, and broadcasts WorldTick and DayNightChanged events so any system can react to time progression without polling.

Time Manager API


Sky Configuration

The Sky Configuration system defines how the sky looks at different times of day through data assets. SkyColourConfiguration assets hold colour values for dawn, midday, dusk, and night — swappable per-region, per-weather, or per-level without modifying entity hierarchies.

Sky Configuration API


See Also

For the full API, component properties, and C++ extension guide:

For step-by-step project setup:


Get GS_Environment

GS_Environment — Explore this gem on the product page and add it to your project.

5.1 - Time Manager

How to work with the GS_Play Time Manager — world time, day/night cycles, and time passage control.

The Time Manager is the singleton controller for world time in your project. It drives the time-of-day value, controls how fast time passes, determines day/night state, and broadcasts timing events that other systems (lighting, AI schedules, environmental effects) can react to.

For component properties and API details, see the Framework API reference.

Time Manager component in the O3DE Inspector

 

Contents


How It Works

The Time Manager maintains a continuous time-of-day value. Each frame, it advances the time based on the configured passage speed and broadcasts a WorldTick event. When the time crosses the day/night threshold, it broadcasts DayNightChanged.

You set the main camera reference so the Time Manager can position sky effects relative to the player’s view.
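The per-frame update can be sketched as a small model: advance the time-of-day value by the passage speed and report when the day/night threshold is crossed. The thresholds, the 24-hour wrap, and all names here are assumptions for illustration, not the component's actual properties.

```cpp
#include <cmath>

// Illustrative model of the Time Manager's tick loop.
struct TimeManagerModel
{
    float m_timeOfDay = 8.0f;     // hours in [0, 24)
    float m_passageSpeed = 1.0f;  // multiplier on time advancement
    float m_dayStart = 6.0f;      // assumed day/night thresholds
    float m_dayEnd = 18.0f;

    bool IsDay() const
    {
        return m_timeOfDay >= m_dayStart && m_timeOfDay < m_dayEnd;
    }

    // Advance time by deltaHours * speed; returns true when the day/night
    // phase changed this tick (when the real component would broadcast
    // DayNightChanged alongside WorldTick).
    bool Tick(float deltaHours)
    {
        const bool wasDay = IsDay();
        m_timeOfDay = std::fmod(m_timeOfDay + deltaHours * m_passageSpeed, 24.0f);
        return IsDay() != wasDay;
    }
};
```

Subscribers never poll: they receive WorldTick every frame and DayNightChanged only on a phase transition.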


Controlling Time

| ScriptCanvas Node | What It Does |
| --- | --- |
| SetTimeOfDay(time) | Sets the current time of day directly. |
| GetTimeOfDay | Returns the current time of day. |
| SetTimePassageSpeed(speed) | Controls how fast time advances (multiplier). |
| GetWorldTime | Returns the total elapsed world time. |
| IsDay | Returns whether it is currently daytime. |
| SetMainCam(entity) | Sets the camera entity for sky positioning. |

ScriptCanvas

Time Manager entity setup in the O3DE Editor


Responding to Time Events

Time Manager entity setup in the O3DE Editor


Time Manager Entity Configuration

Time Manager entity setup in the O3DE Editor

To drive the environment effects and weather systems, the Time Manager needs several entities and components available to it.

| Entity | What It Does |
| --- | --- |
| SkyPivot | Holds all the other celestial entities. Its pivot is the pivot of the planet. |
| Sun / Moon | Two directional lights pointed in opposite directions. One is disabled at a time, and their colours and intensities are handled by the Time Manager sky configuration profile. |
| Moon Body | A graphic placed in the sky relative to the moon directional light. Purely cosmetic. |
| Planet Axis | An entity that carries celestial bodies at an offset; they spin with the rest of the sky. Purely cosmetic. |
| Stars | The stars component. If it sits inside the pivot, the stars follow the rotation of the sky. |

Quick Reference

| Need | Bus | Method / Event |
| --- | --- | --- |
| Set time of day | TimeManagerRequestBus | SetTimeOfDay(time) |
| Get current time | TimeManagerRequestBus | GetTimeOfDay |
| Check if daytime | TimeManagerRequestBus | IsDay |
| Change time speed | TimeManagerRequestBus | SetTimePassageSpeed(speed) |
| React to time ticks | TimeManagerNotificationBus | WorldTick |
| React to day/night change | TimeManagerNotificationBus | DayNightChanged |

Glossary

| Term | Meaning |
| --- | --- |
| World Time | The continuously advancing time-of-day value maintained by the Time Manager |
| Time Passage Speed | A multiplier controlling how fast world time advances each frame |
| Day/Night Phase | The current phase (day or night) determined by the time-of-day threshold |

For full definitions, see the Glossary.


See Also

For the full API, component properties, and C++ extension guide:

For related systems:


Get GS_Environment

GS_Environment — Explore this gem on the product page and add it to your project.

5.2 - Sky Configuration

How to work with GS_Play sky configuration — data-driven sky color settings for time-of-day transitions.

Sky Colour Configuration assets define the visual appearance of the sky at different times of day. These are data assets created in the O3DE Asset Editor that the Time Manager references to drive sky color transitions as time progresses.

For asset structure and property details, see the Framework API reference.

Sky Colour Configuration asset in the O3DE Asset Editor

 

Contents


How It Works

A Sky Colour Configuration asset defines color values for key times of day (dawn, midday, dusk, night). The Time Manager samples the configuration based on the current time of day and interpolates between the defined colors to produce smooth transitions.

Different environments (desert, forest, underwater) can use different configuration assets. Swapping the active configuration changes the sky appearance without modifying entity hierarchies.


Glossary

| Term | Meaning |
| --- | --- |
| SkyColourConfiguration | A data asset defining sky color values for dawn, midday, dusk, and night |

For full definitions, see the Glossary.


See Also

For the full API, component properties, and C++ extension guide:

For related systems:


Get GS_Environment

GS_Environment — Explore this gem on the product page and add it to your project.

6 - GS_Interaction

Physics-based pulse events, proximity targeting with cursors, and data-driven world trigger zones.

GS_Interaction provides three independent but composable systems for making the world respond to entities. Pulsors broadcast typed events via physics volumes, the Targeting system finds and locks onto the best interactable in proximity with a cursor overlay, and World Triggers fire configurable responses from zones and conditions without requiring script code.

For architecture details, component properties, and extending the system in C++, see the GS_Interaction API.


Quick Navigation

| I want to… | Feature | API |
| --- | --- | --- |
| Broadcast typed physics events from trigger volumes to receiving entities | Pulsors | API |
| Find and lock onto the best interactable entity in proximity with a cursor overlay | Targeting | API |
| Fire configurable responses from zones and conditions without scripting | World Triggers | API |

Installation

GS_Interaction requires GS_Core, LmbrCentral, LyShine, and CommonFeaturesAtom. Add all required gems to your project before placing Interaction components in a level.

For a full guided walkthrough, follow the Simple Project Setup guide.

 

Quick Installation Summary

  1. Enable the GS_Interaction gem in your project configuration.
  2. Configure physics collision layers for trigger volumes.
  3. Place Pulsor, Targeting, or World Trigger components on entities as needed.

Pulsors

The Pulsor system is a physics-driven event broadcast layer. A PulsorComponent on any entity with a trigger collider emits a typed pulse event when another entity enters or exits that volume. PulseReactorComponents on the receiving entity listen for those events and respond. Pulse types are polymorphic and registered at runtime, so project-specific types can be added without modifying GS_Interaction.

Pulsors API


Targeting

The Targeting system handles “which interactable entity is the player looking at or closest to right now?” A TargetingHandler on the player maintains a live registry of nearby targets, evaluates candidates on demand, and exposes the current target via bus events. Cursor components render a tracking reticle on screen that updates based on the selected target’s properties.

Targeting API


World Triggers

World Triggers are a data-driven system for firing game responses when conditions are met, with no script boilerplate required. TriggerSensor components define the condition side (interact, collider overlap, record check), and WorldTrigger components define the response (set record, toggle entity, change stage, print log). The combination lets most interactive world objects be authored entirely in the editor.

World Triggers API


See Also

For the full API, component properties, and C++ extension guide:

For step-by-step project setup:


Get GS_Interaction

GS_Interaction — Explore this gem on the product page and add it to your project.

6.1 - Pulsors

How to work with the GS_Interaction Pulsor system — emitting typed physics-based pulse events and reacting to them from scripts.

The Pulsor system is a physics-driven event broadcast layer. A PulsorComponent lives on any entity with a trigger collider and emits a typed pulse event when another entity enters or exits that volume. PulseReactorComponent instances on the receiving entity listen for those events and respond. Because pulse types are polymorphic and registered at runtime, you can define project-specific pulse types without modifying the Pulsor or Reactor components themselves.

For architecture details, component properties, and extending the system in C++, see the Framework API reference.

Pulsor component in the O3DE Inspector

 

Contents


How Pulses Work

Pulse Pattern Graph

Breakdown

Pulse Reactor component in the O3DE Inspector

When a physics trigger fires, the PulsorComponent iterates all PulseReactorComponent instances on the entering entity and calls ReceivePulses() on each one that returns true from IsReactor() for the emitted pulse type. Each reactor independently decides whether it handles a given pulse, so multiple reactors on one entity can each respond to a different subset of pulse types without interfering with each other.

| Step | What Happens |
| --- | --- |
| 1 — Collider overlap | Physics system detects entity entering the Pulsor’s trigger volume. |
| 2 — Pulse emit | PulsorComponent reads its configured PulseType and prepares the event. |
| 3 — Reactor query | Each PulseReactorComponent on the entering entity is called with IsReactor() for that type. |
| 4 — Reaction | Reactors that return true have ReceivePulses() called and execute their response. |

E Indicates extensible classes and methods.

Patterns - Complete list of system patterns used in GS_Play.
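The dispatch steps above can be sketched as a small self-contained model: the pulsor iterates every reactor on the entering entity, asks each whether it handles the emitted pulse type, and calls ReceivePulses() on those that do. Type names mirror the docs, but the signatures and the type-matching mechanism here are assumptions, not the GS_Interaction implementation.

```cpp
#include <vector>

// Pulse types carry no payload -- the type itself is the signal.
struct PulseType { virtual ~PulseType() = default; };
struct DebugPulse : PulseType {};
struct DestructPulse : PulseType {};

// Model of a reactor: decides whether it handles a pulse, then reacts.
struct Reactor
{
    virtual ~Reactor() = default;
    virtual bool IsReactor(const PulseType& pulse) const = 0;
    virtual void ReceivePulses() = 0;
};

// Model of Destructable_Reactor: reacts only to Destruct pulses.
struct DestructableReactor : Reactor
{
    bool m_destroyed = false;

    bool IsReactor(const PulseType& pulse) const override
    {
        return dynamic_cast<const DestructPulse*>(&pulse) != nullptr;
    }

    void ReceivePulses() override { m_destroyed = true; }
};

// What the pulsor does when its trigger volume fires: query every reactor
// on the entering entity and invoke the ones that accept this pulse type.
void EmitPulse(const PulseType& pulse, std::vector<Reactor*>& reactorsOnEntity)
{
    for (Reactor* reactor : reactorsOnEntity)
    {
        if (reactor->IsReactor(pulse))
        {
            reactor->ReceivePulses();
        }
    }
}
```

Because each reactor filters independently, several reactors on one entity can each respond to a different subset of pulse types without interfering.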


Pulse Types

A pulse type is a class derived from the abstract PulseType base. It carries no payload — the type itself is the signal. Reactors match on type identity, so adding a new type automatically makes it distinct from all existing ones.

Built-In Pulse Types

| Type | Purpose |
| --- | --- |
| Debug_Pulse | Test and development use. Logs or prints when received. |
| Destruct_Pulse | Signals that the receiving entity should be destroyed or deactivated. |

Reactor Types

A reactor type is a class derived from the abstract ReactorType base. It defines what the receiving entity does when a matching pulse arrives. One entity can carry multiple reactors — one for each pulse type it needs to handle.

Built-In Reactor Types

| Type | Paired With | Behavior |
| --- | --- | --- |
| Debug_Reactor | Debug_Pulse | Outputs a debug message when it receives a matching pulse. |
| Destructable_Reactor | Destruct_Pulse | Handles entity destruction or deactivation on pulse receipt. |

Extending with Custom Pulse Types

Custom pulse and reactor types are discovered automatically through O3DE serialization at startup. Any team or plugin can add new interaction semantics — damage types, pickup types, status effects — by extending the base class and reflecting it, without modifying GS_Interaction itself. Custom types appear automatically as options in the editor’s component property dropdowns and are available to all Pulsors and Reactors in the project.

For the full extension guide and C++ interface, see Framework API: Pulsors and Framework API: Pulses.


Components

| Component | Role | Key Bus |
| --- | --- | --- |
| PulsorComponent | Emits a typed pulse when its trigger volume fires. | |
| PulseReactorComponent | Receives pulses and executes a reaction if the type matches. | PulseReactorRequestBus (ById) |

ScriptCanvas Usage

PulseReactorNotificationBus exposes the reactor’s state for script-driven queries. A common pattern is to wait for a Reactor to receive a pulse, then run your own processing on the incoming pulse data.


Quick Reference

| Need | Bus | Method |
| --- | --- | --- |
| Check if entity handles a pulse type | PulseReactorRequestBus (ById) | IsReactor() |
| Manually trigger pulse processing | PulseReactorRequestBus (ById) | ReceivePulses() |

Glossary

| Term | Meaning |
| --- | --- |
| Pulsor | A component that emits a typed pulse event when its trigger volume fires |
| Pulse Reactor | A component that receives pulses and executes a response if the type matches |
| Pulse Type | A polymorphic class that identifies the kind of pulse being emitted |

For full definitions, see the Glossary.


See Also

For the full API, component properties, and C++ extension guide:

For related systems:


Get GS_Interaction

GS_Interaction — Explore this gem on the product page and add it to your project.

6.2 - Targeting

How to work with the GS_Interaction Targeting system — selecting interactable entities, managing cursor display, and handling interaction input from scripts.

The Targeting system answers the question “which interactable entity should the player act on right now?” A GS_TargetingHandlerComponent on the player maintains a live registry of nearby targets. Proximity detection volumes populate that registry automatically as the player moves. When the handler evaluates candidates, it selects the best target and broadcasts the result so the cursor layer and any downstream logic stay in sync.

For architecture details, component properties, and extending the system in C++, see the Framework API reference.

Targeting Handler component in the O3DE Inspector

 

Contents


How Targeting Works

The targeting pipeline runs across four layers — proximity detection, target registration, target selection, and cursor display. Each layer is a separate component so they can be mixed and matched for different entity configurations.

| Layer | Component | What It Does |
|---|---|---|
| Proximity detection | GS_TargetingInteractionFieldComponent | Physics trigger volume on targetable entities. Registers and unregisters with the handler as the player enters and exits range. |
| Target data | GS_TargetComponent / GS_InteractTargetComponent | Defines the target’s visual properties — size, spatial offset, colour, and sprite. |
| Target selection | GS_TargetingHandlerComponent | Maintains the candidate list, evaluates it, and exposes the selected target via GetInteractTarget. |
| Cursor display | GS_CursorComponent + GS_CursorCanvasComponent | Reads the selected target’s visual properties and renders a cursor on screen or in world space. |

Target Components

GS_TargetComponent is the base marker that makes an entity targetable. It carries four properties the cursor layer reads to display correctly:

| Property | Purpose |
|---|---|
| Size | How large the targeting cursor appears over this entity. |
| Offset | Spatial offset from the entity’s origin, used to position the cursor at a sensible point (e.g. centre of a character’s torso). |
| Colour | Tint applied to the cursor sprite when this target is selected. |
| Sprite | The cursor image to display for this target. |

GS_InteractTargetComponent extends GS_TargetComponent with specialised interaction semantics. Use it for any entity that can be interacted with via the InteractInputReaderComponent.


Targeting Handler

The GS_TargetingHandlerComponent is the central coordinator. It tracks all currently-registered candidates and determines which one is the best target at any given moment.

Notifications

Subscribe to GS_TargetingHandlerNotificationBus to react to targeting state changes:

| Notification | When It Fires |
|---|---|
| OnUpdateInteractTarget | The selected target has changed. Payload is the new target entity id (or invalid if none). |
| OnEnterStandby | The targeting handler has suspended evaluation (e.g. during a cutscene or level transition). |
| OnExitStandby | The targeting handler has resumed evaluation. |

ScriptCanvas — Listening for Target Changes

ScriptCanvas — Querying the Current Target


Interaction Fields

GS_TargetingInteractionFieldComponent is a physics trigger volume placed on targetable entities. When the player’s collider overlaps the field, it calls RegisterTarget on the handler. When the player leaves, it calls UnregisterTarget. The handler never has to poll — the field layer pushes updates automatically.

The field’s volume shape (sphere, box, capsule) determines the interaction range independently of the entity’s visual or physics collision bounds, so you can have a large detection range with a small visible object.
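The register/select flow can be sketched as a small candidate registry: fields push entities in and out, and the handler picks the best one on demand. Everything below is an assumption for illustration — the real handler’s evaluation rules are configurable and richer than a nearest-distance pick, and 0 merely stands in for an invalid entity id:

```cpp
#include <cassert>
#include <cmath>
#include <cstdint>
#include <map>

using EntityId = std::uint64_t;

class TargetingHandler {
public:
    // Called by an interaction field when the player enters its volume.
    void RegisterTarget(EntityId id, float distance)   { m_candidates[id] = distance; }
    // Called by the field when the player leaves.
    void UnregisterTarget(EntityId id)                 { m_candidates.erase(id); }

    // Returns the closest registered candidate, or 0 (invalid) if none.
    EntityId GetInteractTarget() const {
        EntityId best = 0;
        float bestDist = INFINITY;
        for (const auto& [id, dist] : m_candidates) {
            if (dist < bestDist) { best = id; bestDist = dist; }
        }
        return best;
    }

private:
    std::map<EntityId, float> m_candidates;
};
```

Because the fields push updates, the handler only iterates entities already known to be in range — it never scans the whole scene.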


Cursor System

Interact Cursor component in the O3DE Inspector

The cursor layer visualizes the current target selection on screen. It is composed of two components that work together:

| Component | Bus | Role |
|---|---|---|
| GS_CursorComponent | GS_CursorRequestBus (Single) | Global cursor coordinator. Manages canvas registration, visibility, offset, and position. |
| GS_CursorCanvasComponent | GS_CursorCanvasRequestBus (ById) | Per-canvas cursor. Renders the sprite and responds to hide/show instructions. |

ScriptCanvas — Cursor Control

[GS_CursorRequestBus → SetCursorVisuals]
    └─► Updates the sprite and colour from the selected target's properties.

[GS_CursorRequestBus → SetCursorPosition]
    └─► Moves the cursor to track the target's world position.

[GS_CursorRequestBus → HideCursor]
    └─► Hides the cursor when no target is selected or targeting is suspended.

UI Cursor Canvas

GS_CursorCanvasComponent UI canvas configuration in the O3DE UI Editor

A LyShine UI canvas that pairs with the in-world GS_CursorComponent.

The cursor canvas applies the selected target’s visuals to the UI image on behalf of the targeting system.


Interaction Input

InteractInputReaderComponent extends GS_Core::GS_InputReaderComponent to map raw button input to interaction events.

Place it on the same entity as the GS_TargetingHandlerComponent. When the player presses the configured interact button, the reader fires an event the handler can act on to confirm the current selection or trigger the target’s interact action.

This keeps input handling decoupled from targeting logic — you can swap input mappings or replace the reader component without touching the handler.


Quick Reference

| Need | Bus | Method / Event |
|---|---|---|
| Get the current target | GS_TargetingHandlerRequestBus (ById) | GetInteractTarget |
| Manually register a target | GS_TargetingHandlerRequestBus (ById) | RegisterTarget |
| Manually unregister a target | GS_TargetingHandlerRequestBus (ById) | UnregisterTarget |
| Know when the target changes | GS_TargetingHandlerNotificationBus (ById) | OnUpdateInteractTarget |
| Know when targeting suspends | GS_TargetingHandlerNotificationBus (ById) | OnEnterStandby |
| Know when targeting resumes | GS_TargetingHandlerNotificationBus (ById) | OnExitStandby |
| Hide the cursor | GS_CursorRequestBus | HideCursor |
| Update cursor position | GS_CursorRequestBus | SetCursorPosition |
| Update cursor appearance | GS_CursorRequestBus | SetCursorVisuals |

Glossary

| Term | Meaning |
|---|---|
| Targeting Handler | The coordinator component on the player that maintains the candidate list and selects the best target |
| Interaction Field | A physics trigger volume on a targetable entity that registers/unregisters with the handler automatically |
| Target Component | A marker component that makes an entity targetable with visual properties for the cursor |
| Cursor | The visual overlay that tracks the selected target on screen or in world space |

For full definitions, see the Glossary.


See Also

For the full API, component properties, and C++ extension guide:

For related systems:


Get GS_Interaction

GS_Interaction — Explore this gem on the product page and add it to your project.

6.3 - World Triggers

How to work with the GS_Interaction World Trigger system — configuring sensor types and trigger types to create event-driven world responses without scripting.

World Triggers are a data-driven system for firing game responses when conditions are met. A TriggerSensorComponent defines the condition side — physics overlap, player interact, record check. A WorldTriggerComponent defines the response side — changing stage, toggling an entity, writing a record, logging a message. Each component holds an array of polymorphic type objects, so any combination of conditions and responses can be composed on a single entity without scripting.

For architecture details, component properties, and extending the system in C++, see the Framework API reference.

 

Contents


How World Triggers Work

World Trigger Pattern Graph

Breakdown

Each trigger is split into two container components. Logic lives in type objects held by each component:

| Part | Role |
|---|---|
| TriggerSensorComponent | Holds andConditions and orConditions arrays of TriggerSensorType objects. When conditions pass, fires WorldTriggerRequestBus::Trigger on the same entity. |
| WorldTriggerComponent | Holds a triggerTypes array of WorldTriggerType objects. On Trigger(), calls Execute() on every type. On Reset(), calls OnReset(). |

You can add any mix of sensor types and trigger types — for example, a door that requires both a physics overlap AND a record flag (andConditions) before it opens (TriggerType_ToggleEntities), then sets a quest record (TriggerType_SetRecord).
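The split between the two condition arrays can be modeled as two short loops. This is an illustrative sketch under stated assumptions — the function name, the Condition alias, and the treatment of an empty orConditions list as passing are all my stand-ins, not the GS_Play evaluation code, whose sensor types are event-driven rather than polled:

```cpp
#include <cassert>
#include <functional>
#include <vector>

// Each sensor type is reduced to a boolean-returning callable for the sketch.
using Condition = std::function<bool()>;

// Passes when every andCondition holds AND at least one orCondition holds.
// An empty orConditions list is assumed to pass vacuously.
bool SensorPasses(const std::vector<Condition>& andConditions,
                  const std::vector<Condition>& orConditions) {
    for (const auto& c : andConditions) {
        if (!c()) { return false; }     // any failing AND blocks the trigger
    }
    if (orConditions.empty()) { return true; }
    for (const auto& c : orConditions) {
        if (c()) { return true; }       // one passing OR is sufficient
    }
    return false;
}
```

The door example above maps onto this directly: the physics overlap and the record flag both live in andConditions, so the response fires only when both hold.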

E Indicates extensible classes and methods.

Patterns - Complete list of system patterns used in GS_Play.


Sensor Types

Trigger Sensor component in the O3DE Inspector

Add sensor types to the andConditions or orConditions arrays on a TriggerSensorComponent. The four built-in types cover the most common condition patterns:

| Sensor Type | Condition | Notes |
|---|---|---|
| SensorType_Interact | Any unit with targeting interacts with this entity. | Requires GS_TargetComponent or GS_InteractTargetComponent on the entity. |
| SensorType_PlayerInteract | The player-controlled unit specifically interacts with this entity. | Extends SensorType_Interact; filters to entities with the "Player" tag. |
| SensorType_Collider | A physics collider enters the entity’s trigger volume. | Automatically activates a PhysicsTriggerComponent — no manual setup needed. |
| SensorType_Record | A game record reaches a configured value. | Connects to RecordKeeper automatically; fires without polling. |

Trigger Types

World Trigger component in the O3DE Inspector

Add trigger types to the triggerTypes array on a WorldTriggerComponent. The four built-in types handle the most common response patterns:

| Trigger Type | Response | Notes |
|---|---|---|
| TriggerType_PrintLog | Prints a message to the development log. | Development and debugging use. |
| TriggerType_SetRecord | Writes a value to the Record Keeper by name. | Useful for setting quest flags, unlock states, or counters. |
| TriggerType_ToggleEntities | Enables or disables referenced entities. Seeds initial state at startup via startActive. | Works with any entity in the level — doors, pickups, blockers. |
| TriggerType_ChangeStage | Transitions the game to a different stage (level). | Queued on next tick; coordinates with Stage Manager automatically. |

Reset and Re-Arming

Calling Reset() on WorldTriggerRequestBus invokes OnReset() on every trigger type in the component, letting each type reverse its effect or prepare for re-activation. For example, TriggerType_ToggleEntities inverts entity states on both Execute and OnReset, enabling toggle behavior. Re-arming supports patterns such as:

  • Switches that cycle on repeat interaction.
  • Area triggers that fire each time a player re-enters.
  • Multi-step sequences where earlier steps re-arm later ones.
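The reversible Execute/OnReset contract can be sketched in a few lines. The struct below is an illustrative stand-in for a TriggerType_ToggleEntities-style type — a single boolean stands in for the referenced entities, and the names are assumptions rather than engine code:

```cpp
#include <cassert>

struct ToggleEntitiesType {
    bool entityActive = false;              // stands in for the referenced entities

    void Execute() { entityActive = !entityActive; }  // runs on Trigger()
    void OnReset() { entityActive = !entityActive; }  // runs on Reset(); reverses Execute
};
```

Because both calls invert state, a Trigger/Reset/Trigger cycle produces the open → closed → open behaviour a repeatable switch needs.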

ScriptCanvas Usage

WorldTriggerRequestBus exposes both core methods directly to scripts:

A common script pattern is to use a SensorType_Record condition to gate a sequence, then script the reset from a separate event so the trigger only re-arms after additional conditions are met:

You can also bypass the sensor component entirely and call Trigger() directly from any script when you need to fire a world trigger response programmatically — for example, from a dialogue completion callback or an animation event.


Editor Setup Pattern

Most interactive world objects follow this assembly in the editor:

  1. Create an entity for the interactive object (door, switch, collectible, zone border).
  2. Add a TriggerSensorComponent to the entity.
  3. Add the appropriate sensor type(s) to andConditions or orConditions (e.g. SensorType_PlayerInteract for interact, SensorType_Collider for proximity).
  4. Add a WorldTriggerComponent to the same entity.
  5. Add the appropriate trigger type(s) to triggerTypes (e.g. TriggerType_ToggleEntities to open a door, TriggerType_SetRecord to record the event).
  6. Configure each type’s properties in the editor — no scripting needed for these patterns.

For interact-based sensors, also add GS_TargetComponent or GS_InteractTargetComponent so the Targeting system can select this entity.


Quick Reference

| Need | Bus | Method |
|---|---|---|
| Fire a trigger response from script | WorldTriggerRequestBus (ById) | Trigger() |
| Re-arm a trigger after it has fired | WorldTriggerRequestBus (ById) | Reset() |
| Fire sensor evaluation from event-driven type | TriggerSensorRequestBus (ById) | DoAction() / DoResetAction() |

 

| Condition Type | Sensor Type |
|---|---|
| Any unit interacts | SensorType_Interact |
| Player only interacts | SensorType_PlayerInteract |
| Physics overlap | SensorType_Collider |
| Record/flag state | SensorType_Record |

 

| Response Type | Trigger Type |
|---|---|
| Log a debug message | TriggerType_PrintLog |
| Write a game record | TriggerType_SetRecord |
| Toggle an entity on/off | TriggerType_ToggleEntities |
| Change stage/level | TriggerType_ChangeStage |

Glossary

| Term | Meaning |
|---|---|
| TriggerSensorComponent | Container component that owns sensor type objects and evaluates them on each event |
| WorldTriggerComponent | Container component that owns trigger type objects and calls Execute on each Trigger() |
| TriggerSensorType | A type object that defines one condition — extend this to create custom sensor logic |
| WorldTriggerType | A type object that defines one response — extend this to create custom trigger logic |
| andConditions | Array on TriggerSensorComponent where all types must pass for the trigger to fire |
| orConditions | Array on TriggerSensorComponent where any one type passing is sufficient |
| Re-Arming | Calling Reset() on a WorldTriggerComponent so its types can fire again on the next Trigger() |

For full definitions, see the Glossary.


See Also

For the full API, component properties, and C++ extension guide:

For related systems:


Get GS_Interaction

GS_Interaction — Explore this gem on the product page and add it to your project.

7 - GS_Juice

Game feel and feedback motion system — screen shake, bounce, flash, and material effects driven by the GS_Motion engine.

GS_Juice is the game feel and feedback system. It provides motion-based visual feedback effects — screen shake, bounce, flash, pulse, material glow — authored as data assets and played on entities at runtime. GS_Juice extends the GS_Motion animation system with feedback-specific track types, so all timing, easing, and proxy targeting features are available for feedback effects.

For architecture details, track types, and the domain extension pattern, see the GS_Juice API.


Quick Navigation

| I want to… | Feature | API |
|---|---|---|
| Play screen shake, bounce, flash, or material glow effects on entities | Feedback System | API |

Installation

GS_Juice requires GS_Core. Add both gems to your project.

For a full guided walkthrough, follow the Simple Project Setup guide.

 

Quick Installation Summary

  1. Enable the GS_Juice gem in your project configuration.
  2. Create .feedbackmotion assets in the O3DE Asset Editor.
  3. Place FeedbackEmitter components on entities that need to play effects.

Feedback System

The feedback system is the core of GS_Juice. Feedback Motion assets (.feedbackmotion) define one or more animation tracks that play together as a feedback effect. The Feedback Emitter component plays a feedback motion on itself or on a target entity. Effects are additive — they modify properties relative to the entity’s current state, so multiple effects stack without conflict.

Two track types are included:

  • Transform Track — Animates position offset, scale, and rotation using gradients. Ideal for screen shake, bounce, squash-and-stretch, and recoil.
  • Material Track — Animates material properties (opacity, emissive intensity, color tint) using gradients. Ideal for hit flash, damage glow, and fade effects.

Feedback System API


See Also

For the full API, component properties, and C++ extension guide:

For related systems:


Get GS_Juice

GS_Juice — Explore this gem on the product page and add it to your project.

7.1 - Feedback System

How to work with GS_Play feedback motions — transform and material animation effects for game feel.

The Feedback System is the core of GS_Juice. Feedback Motion assets define animation tracks that play together as a visual effect — screen shake, bounce, hit flash, glow pulse. The Feedback Emitter component plays these effects on entities, with support for proxy targeting and stacking.

For track type details, asset structure, and component properties, see the Framework API reference.

FeedbackEmitter component in the O3DE Inspector

 

Contents


Track Types

| Track | What It Animates | Fields |
|---|---|---|
| FeedbackTransformTrack | Position offset, scale, rotation | Vector2Gradient (XY), FloatGradient (scale), FloatGradient (rotation) |
| FeedbackMaterialTrack | Opacity, emissive intensity, color tint | FloatGradient (opacity), FloatGradient (emissive), ColorGradient (tint) |

Both track types use Gradients from GS_Core, giving you full control over the animation shape at every point in the effect’s duration.
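A gradient-driven track samples a curve at a normalized time to get the value it writes each frame. The sketch below models a float gradient as piecewise-linear keys; the actual GS_Core Gradient types are richer (easing, colour and vector variants), so the name and the linear-interpolation shape here are assumptions for illustration:

```cpp
#include <cassert>
#include <cmath>
#include <utility>
#include <vector>

// Illustrative linear float gradient: (time in [0,1], value) keys, sorted by time.
struct FloatGradient {
    std::vector<std::pair<float, float>> keys;

    float Sample(float t) const {
        if (keys.empty()) { return 0.0f; }
        if (t <= keys.front().first) { return keys.front().second; }
        if (t >= keys.back().first)  { return keys.back().second; }
        for (size_t i = 1; i < keys.size(); ++i) {
            if (t <= keys[i].first) {
                auto [t0, v0] = keys[i - 1];
                auto [t1, v1] = keys[i];
                float u = (t - t0) / (t1 - t0);   // position between the two keys
                return v0 + u * (v1 - v0);        // linear blend
            }
        }
        return keys.back().second;
    }
};
```

A rise-and-fall shape like {0→0, 0.5→1, 1→0} is the typical bounce or flash envelope: the effect peaks mid-duration and returns to rest, which is what keeps additive effects from leaving a permanent offset.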


Feedback Emitter

The FeedbackEmitter component lives on any entity and plays a Feedback Motion.

| Property | What It Does |
|---|---|
| FeedbackMotion | The motion asset and proxy configuration. |
| playOnActivate | If true, plays the motion automatically when the entity activates. |

| Method | What It Does |
|---|---|
| Play() | Plays the feedback motion on the owning entity. |
| PlayOnTarget(entityId) | Plays on a different target entity. |
| Stop() | Stops the currently playing motion. |

Authoring Feedback Motions

  1. Create a .feedbackmotion asset in the O3DE Asset Editor.
  2. Add transform and/or material tracks.
  3. Configure each track’s gradients — these define the animation curve shape.
  4. Set timing (start time, duration) and easing for each track.
  5. Optionally set track identifiers for proxy targeting.
  6. Assign the asset to a FeedbackEmitter component.

Stacking Effects

Feedback effects are additive — each track applies its property change relative to the entity’s current state. Multiple effects on the same entity stack naturally without conflict. A bounce effect and a flash effect can run simultaneously on the same entity.
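Additive stacking means each active effect contributes a delta relative to the base state rather than overwriting it, so the contributions simply sum. A minimal sketch, with Vec3 and ApplyEffects as hypothetical stand-ins for the engine's transform math:

```cpp
#include <cassert>
#include <vector>

struct Vec3 { float x = 0, y = 0, z = 0; };

// Sum every active effect's offset onto the entity's base position.
Vec3 ApplyEffects(Vec3 base, const std::vector<Vec3>& activeOffsets) {
    for (const auto& o : activeOffsets) {
        base.x += o.x;
        base.y += o.y;
        base.z += o.z;
    }
    return base;
}
```

Because each effect's envelope returns to zero when it ends, the sum naturally decays back to the base state — no effect needs to know about the others.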


Proxy Targeting

UiAnimationMotion with proxy bindings configured in the Inspector

When a motion asset has tracks with identifiers (named labels), those tracks appear in the proxy list on the component. Proxies let you redirect a track to a different entity in the hierarchy — for example, a page show animation might animate the background separately from the content panel.

Each proxy entry maps a track label to a target entity. If no proxy is set, the track targets the motion’s owner entity.
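The resolution rule — mapped label plays on its proxy entity, everything else falls back to the owner — fits in one function. The names below are illustrative, not the GS_Juice API:

```cpp
#include <cassert>
#include <cstdint>
#include <map>
#include <string>

using EntityId = std::uint64_t;

// A labeled track plays on its mapped entity; unmapped tracks target the owner.
EntityId ResolveTrackTarget(const std::string& trackLabel,
                            const std::map<std::string, EntityId>& proxies,
                            EntityId owner) {
    auto it = proxies.find(trackLabel);
    return it != proxies.end() ? it->second : owner;
}
```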


Glossary

| Term | Meaning |
|---|---|
| Feedback Motion | A .feedbackmotion data asset containing animation tracks that play as a visual effect |
| Feedback Emitter | A component that plays a feedback motion on its entity or a target entity |
| FeedbackTransformTrack | A track that animates position offset, scale, and rotation via gradients |
| FeedbackMaterialTrack | A track that animates opacity, emissive intensity, and color tint via gradients |

For full definitions, see the Glossary.


See Also

For the full API, component properties, and C++ extension guide:

For related systems:


Get GS_Juice

GS_Juice — Explore this gem on the product page and add it to your project.

8 - GS_Item

Inventory and equipment systems for GS_Play — container-based item storage, slot management, stacking, and visual equipment integration with GS_Performer.

GS_Item provides the foundational inventory and equipment infrastructure for GS_Play games. It handles how items are stored, organized, and associated with characters — from simple containers that hold collectables to slot-based equipment systems that drive visual changes through GS_Performer.

For architecture details, component properties, and extending the system in C++, see the GS_Item API.


Quick Navigation

| I want to… | Feature | API |
|---|---|---|
| Store, stack, and manage items in container-based inventories | Inventory | API |
| Equip items into typed character slots with visual integration | Equipment | API |

Installation

GS_Item requires GS_Core. GS_Performer is an optional dependency for visual equipment integration.

For a full guided walkthrough, follow the Simple Project Setup guide.

 

Quick Installation Summary

  1. Enable the GS_Item gem in your project configuration.
  2. Define item data assets for your game’s items.
  3. Place Inventory components on entities that need to store items.
  4. Place Equipment components on characters that need to equip items.

Inventory

The Inventory system is a container-based model for item storage. Each inventory is a collection of item slots where each slot holds an item reference and a stack count. The system handles adding, removing, querying, and capacity management. Any entity can host an inventory — a player backpack, a chest, a vendor, or a loot bag.

Inventory API


Equipment

The Equipment system is the slot-based layer that governs which items a character has equipped. Each slot has a defined type and enforces compatibility rules. Equipment integrates with GS_Performer’s Skin Slot system — equipping an item automatically updates the matching visual asset, connecting inventory state to character presentation.

Equipment API


See Also

For the full API, component properties, and C++ extension guide:

For related systems:


Get GS_Item

GS_Item — Explore this gem on the product page and add it to your project.

8.1 - Inventory

How to work with GS_Play inventory — container-based item storage with slots and stacking.

The Inventory system provides container-based item storage for entities. Inventory components define slot-based containers that hold item data, support stacking, and enable item transfer between entities (player to chest, vendor to player).

For component properties and the inventory data model, see the Framework API reference.

 

Contents


How Inventory Works

| Concept | What It Means |
|---|---|
| Container | A component on an entity that holds item slots. Players, chests, vendors, and loot drops are all containers. |
| Slot | An individual position within a container. Each slot holds one item type. |
| Stacking | Multiple units of the same item can occupy a single slot up to a maximum stack size. |
| Transfer | Items move between containers via slot references — no intermediate state required. |
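The stacking rule can be sketched as a single pass over the container's slots: top up compatible stacks, start new ones in empty slots, and report whatever did not fit. The function and field names are assumptions for illustration — not the GS_Item API:

```cpp
#include <algorithm>
#include <cassert>
#include <string>
#include <vector>

struct Slot { std::string item; int count = 0; };

// Returns the number of units that did NOT fit (0 if everything fit).
int AddItem(std::vector<Slot>& slots, const std::string& item,
            int amount, int maxStack) {
    for (auto& slot : slots) {
        if (amount <= 0) { break; }
        // A slot is usable if it already holds this item or is empty.
        bool stackable = (slot.item == item || slot.count == 0);
        if (stackable && slot.count < maxStack) {
            int moved = std::min(amount, maxStack - slot.count);
            slot.item = item;
            slot.count += moved;
            amount -= moved;
        }
    }
    return amount;   // overflow for the caller to handle (drop, reject, etc.)
}
```

Returning the overflow instead of asserting lets the caller decide the policy — refuse the pickup, spawn a loot bag, or route the remainder to another container.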

Glossary

| Term | Meaning |
|---|---|
| Container | A component on an entity that holds item slots |
| Slot | An individual position within a container that holds one item type |
| Stacking | Multiple units of the same item occupying a single slot up to a maximum count |

For full definitions, see the Glossary.


See Also

For the full API, component properties, and C++ extension guide:

For related systems:


Get GS_Item

GS_Item — Explore this gem on the product page and add it to your project.

8.2 - Equipment

How to work with GS_Play equipment — typed equipment slots with visual integration via GS_Performer skin slots.

The Equipment system provides typed equipment slots for characters. Equipment slots enforce type constraints (helmet slot accepts helmets, weapon slot accepts weapons) and integrate directly with the Skin Slot system in GS_Performer to update visual appearance when items are equipped or unequipped.

For component properties and the equipment data model, see the Framework API reference.

 

Contents


How Equipment Works

| Phase | What Happens |
|---|---|
| Equip | An item is placed into a typed slot. Equip notifications fire. |
| Visual Update | If linked to a GS_Performer skin slot, the character mesh updates automatically. |
| Stat Application | Equipment stats are applied to the entity (via RPStats integration, when available). |
| Unequip | The item is removed from the slot. Visual and stat changes revert. |

Integration with Skin Slots

Equipment slots can be linked to Performer Skin Slots. When an item is equipped, the corresponding skin slot’s actor mesh and materials are automatically swapped to match the item. No manual wiring is required — the connection is data-driven through slot name matching.
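The type constraint and the visual hook can be sketched together: a slot rejects incompatible items and notifies a listener on every change. The callback here stands in for the GS_Performer skin-slot integration; all names are illustrative assumptions, not the GS_Item API:

```cpp
#include <cassert>
#include <functional>
#include <string>

struct EquipmentSlot {
    std::string acceptedType;                               // e.g. "helmet"
    std::string equippedItem;                               // empty when nothing equipped
    std::function<void(const std::string&)> onSkinChanged;  // stand-in for the skin-slot link

    bool Equip(const std::string& item, const std::string& itemType) {
        if (itemType != acceptedType) { return false; }     // type constraint enforced
        equippedItem = item;
        if (onSkinChanged) { onSkinChanged(item); }         // visual update
        return true;
    }

    void Unequip() {
        equippedItem.clear();
        if (onSkinChanged) { onSkinChanged(""); }           // revert visuals
    }
};
```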


Glossary

| Term | Meaning |
|---|---|
| Equipment Slot | A typed slot on a character that accepts items matching its type constraint |
| Skin Slot Integration | Automatic visual update when equipping or unequipping items via GS_Performer |

For full definitions, see the Glossary.


See Also

For the full API, component properties, and C++ extension guide:

For related systems:


Get GS_Item

GS_Item — Explore this gem on the product page and add it to your project.

9 - GS_Performer

Character rendering and animation for GS_Play — modular skin slots, billboard 2.5D rendering, and velocity-driven locomotion parameters.

GS_Performer is the character presentation layer for GS_Play. It handles the visual side of any entity that needs to look like a character — whether that is a fully-rigged 3D mesh swapping equipment, a sprite-based billboard character, or an animated unit whose blend parameters track real movement velocity. GS_Performer sits above GS_Unit and GS_Core, consuming movement and game state events to keep visual output in sync with gameplay.

For architecture details, component properties, and extending the system in C++, see the GS_Performer API.


Quick Navigation

| I want to… | Feature | API |
|---|---|---|
| Manage registered performers or query them by name | Performer Manager | API |
| Swap equipment visuals at runtime with modular slot-based assets | Skin Slots | API |
| Render billboard-style 2.5D characters that face the camera correctly | Paper Performer | API |
| Drive animation blend parameters from entity velocity automatically | Locomotion | API |

Installation

GS_Performer requires GS_Core. EMotionFX and GS_Unit are optional dependencies for animation and velocity-driven locomotion.

For a full guided walkthrough, follow the Simple Project Setup guide.

 

Quick Installation Summary

  1. Enable the GS_Performer gem in your project configuration.
  2. Create a Performer Manager prefab and add it to the Game Manager’s Managers list.
  3. Place PerformerSkinSlotComponent and SkinSlotHandlerComponent on character entities.
  4. For 2.5D characters, add a PaperFacingHandlerComponent.

Performer Manager

The Performer Manager is the singleton coordinator for all registered performers in the active scene. It follows the GS_Core manager pattern — spawned by the Game Manager, participates in standby, and is accessible globally via its request bus. Systems that need to enumerate or query active performers route through this component.

Performer Manager API


Skin Slots

The Skin Slot system is the modular equipment and appearance layer for characters. Each PerformerSkinSlotComponent represents an equipment slot holding an actor asset and material overrides. The SkinSlotHandlerComponent manages the full collection for one character, providing batch operations and appearance-changed notifications. This makes character visual configuration composable rather than monolithic.

Skin Slots API


Paper Performer

The Paper Performer system provides billboard-style 2.5D character rendering. The PaperFacingHandlerComponent rotates the entity’s mesh to face the active camera each frame, accounting for movement direction to prevent visual artifacts from naive always-face-camera implementations. Custom facing logic can be substituted without modifying the core component.

Paper Performer API


Locomotion

The Locomotion feature set connects an entity’s physical movement to its animation layer. The VelocityLocomotionHookComponent reads velocity from the physics system each tick and drives animation blend parameters — speed and direction — that EMotionFX or Simple Motion components consume. Animations stay tightly synchronized with gameplay movement without manual state machine scripts.

Locomotion API


See Also

For the full API, component properties, and C++ extension guide:

For step-by-step project setup:


Get GS_Performer

GS_Performer — Explore this gem on the product page and add it to your project.

9.1 - Performer Manager

How to work with the GS_Play Performer Manager — performer registration and lifecycle.

The Performer Manager is the singleton controller for the performer system. It tracks all active performers in the scene, handles registration and lookup, and responds to the Game Manager’s lifecycle events automatically.

For component properties and API details, see the Framework API reference.

PerformerManager component in the O3DE Inspector

 

Contents


How It Works

Performers register with the Performer Manager when they activate. The manager provides lookup by name so that other systems — dialogue, cinematics, AI — can find performers without maintaining their own references.

The Performer Manager responds to standby and shutdown broadcasts from the Game Manager, coordinating performer state across level transitions.
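The register/lookup-by-name pattern reduces to a name-keyed registry. The real manager is an EBus singleton spawned by the Game Manager, so the class and method names below are illustrative stand-ins, and 0 represents an invalid entity id:

```cpp
#include <cassert>
#include <cstdint>
#include <map>
#include <string>

using EntityId = std::uint64_t;

class PerformerRegistry {
public:
    // Performers register on activation and unregister on deactivation.
    void Register(const std::string& name, EntityId id) { m_performers[name] = id; }
    void Unregister(const std::string& name)            { m_performers.erase(name); }

    // Returns 0 (invalid) when no performer has that name.
    EntityId Find(const std::string& name) const {
        auto it = m_performers.find(name);
        return it != m_performers.end() ? it->second : 0;
    }

private:
    std::map<std::string, EntityId> m_performers;
};
```

This is why dialogue and cinematic systems can reference performers by name in data without holding entity references that go stale across level transitions.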


Quick Reference

| Need | Bus | Method |
|---|---|---|
| Query performer management | PerformerManagerRequestBus | Manager-level queries |

Glossary

| Term | Meaning |
|---|---|
| Performer | An entity that represents a character in the world with visual, animation, and interaction capabilities |
| Performer Manager | The singleton that tracks all active performers and provides lookup by name |

For full definitions, see the Glossary.


See Also

For the full API, component properties, and C++ extension guide:

For related systems:


Get GS_Performer

GS_Performer — Explore this gem on the product page and add it to your project.

9.2 - Performers

The character rendering entity types in GS_Performer — billboard 2.5D paper performers and fully-rigged 3D avatar performers.

Performers are the character entity types in GS_Performer. Each type defines how a character’s visual layer is structured and rendered — whether a flat sprite quad that rotates to face the camera, or a fully-rigged 3D mesh driven by EMotionFX. Choose the performer type that fits your character pipeline, then add Performer Features to layer in additional capabilities.

For component details and C++ extension, see the Framework API: Performers.


Paper Performer

The Paper Performer is a billboard-based character type for 2.5D and top-down games. The PaperFacingHandlerComponent orients the entity’s sprite quad toward the active camera each frame, accounting for movement direction to prevent visual artifacts. It is the simplest performer type — lightweight, with no animation graph required.

Paper Performer API


Avatar Performer

The Avatar Performer is the full 3D character pipeline. It drives a rigged EMotionFX actor and integrates with Skin Slots and the Velocity Locomotion Hook for a complete character presentation layer. Use this for characters with full 3D rigs, animation blend trees, and runtime equipment swapping.

Avatar Performer API


See Also

For the full API, component properties, and C++ extension guide:

For related systems:


Get GS_Performer

GS_Performer — Explore this gem on the product page and add it to your project.

9.2.1 - Avatar Performer

How to work with GS_Play avatar performers — 3D pipeline, animation, equipment.

Get GS_Performer

GS_Performer — Explore this gem on the product page and add it to your project.

9.2.2 - Paper Performer

How to work with GS_Play paper performers — billboard-style 2.5D character rendering with camera-aware facing.

The Paper Performer system provides billboard-style character rendering for 2.5D games. The Paper Facing Handler rotates sprite-based characters to always face the active camera while maintaining proper orientation relative to movement direction. This prevents the visual artifacts that occur when a flat sprite is viewed from the side.

For component properties and the facing algorithm, see the Framework API reference.

 

Contents


How It Works

The PaperFacingHandlerComponent is placed on any entity that uses sprite-based or flat-mesh character rendering. Each frame, it calculates the optimal facing direction based on:

  1. The active camera’s position (via CamCore notifications).
  2. The entity’s movement direction.
  3. Configurable facing rules.

The result is a character that always presents its front face to the camera without snapping or popping during movement direction changes.
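The camera-facing term of that calculation can be sketched as a single yaw from the entity to the camera on the ground plane. The shipped handler also blends in movement direction and configurable rules, and the axis convention here (yaw about the up axis, zero when the camera is straight ahead on +Y) is an assumption for illustration:

```cpp
#include <cassert>
#include <cmath>

// Yaw (radians, about the up axis) that turns the sprite quad's front
// toward the camera, ignoring height difference.
float FacingYaw(float entityX, float entityY, float camX, float camY) {
    return std::atan2(camX - entityX, camY - entityY);
}
```

Running this every frame keeps the quad's front toward the camera; the handler's extra movement-direction logic is what smooths out the snapping this raw form would show when the camera crosses behind the character.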


Camera-Aware Variant

The CamCorePaperFacingHandlerComponent (in GS_Complete) extends the base paper facing with direct camera core integration. It listens to CamCoreNotificationBus for camera position updates, ensuring the facing calculation uses the exact camera state rather than a cached position.


Entity Configuration

Performer Entity Configuration in the O3DE Editor

Note that this entity has been placed inside a GS_Unit prefab.


Glossary

| Term | Meaning |
|---|---|
| Paper Performer | A billboard-style character that always faces the camera for 2.5D rendering |
| Paper Facing Handler | The component that calculates optimal facing direction each frame |

For full definitions, see the Glossary.


See Also

For the full API, component properties, and C++ extension guide:

For related systems:


Get GS_Performer

GS_Performer — Explore this gem on the product page and add it to your project.

9.3 - Performer Features

Universal capabilities that extend any performer type — modular equipment via skin slots, and velocity-driven animation via locomotion.

Performer Features are components that layer onto any performer entity, regardless of type. They are not tied to how a character is rendered — they extend what any performer can do. Skin Slots give any character modular equipment slots. Locomotion connects physics movement to the animation graph automatically.

For component details and C++ extension, see the Framework API: Performer Features.


Skin Slots

The Skin Slot system provides modular equipment and appearance management. Each PerformerSkinSlotComponent represents one equipment position on the character — helm, chest, weapon — holding a swappable actor mesh and material set. The SkinSlotHandlerComponent coordinates all slots on a character and supports preset profiles for NPC archetypes or full equipment sets.

Skin Slots API


Locomotion

The Velocity Locomotion Hook connects entity movement to animation automatically. It reads the entity’s physics velocity each frame and writes blend parameters — speed and direction — directly into the EMotionFX animation graph, driving walk, run, idle, and fall transitions without manual state machine scripting.

Locomotion API


See Also

For the full API, component properties, and C++ extension guide:

For related systems:


Get GS_Performer

GS_Performer — Explore this gem on the product page and add it to your project.

9.3.1 - Skin Slots

How to work with GS_Play skin slots — modular character appearance with swappable actor meshes and materials.

The Skin Slot system provides modular character appearance. Each slot represents an equipment position on a character (head, body, arms, weapon) and holds an actor mesh with its materials. The Skin Slot Handler manages the full collection of slots for a character, enabling runtime equipment swapping through data changes rather than entity restructuring.

For component properties and data structures, see the Framework API reference.

Skin Slot Configuration Profile in the O3DE Asset Editor

 

Contents


How It Works

Each character entity has one SkinSlotHandlerComponent and multiple PerformerSkinSlotComponent children — one per equipment position.

| Component | What It Does |
| --- | --- |
| SkinSlotHandler | Manages the full set of skin slots for a character. Provides batch operations. |
| PerformerSkinSlot | Individual slot. Holds a SkinSlotData (actor asset + material assets). |

When you equip a new item, you update the slot’s SkinSlotData — the system swaps the actor mesh and materials automatically.


Skin Slot Data

| Field | What It Is |
| --- | --- |
| Actor Asset | The 3D mesh for this equipment slot. |
| Material Assets | Materials applied to the mesh. |
| Slot Name | Named identifier for the slot (e.g., “Helmet”, “Chest”, “MainHand”). |

Skin Slot Configuration Profiles are preset assets that define complete character appearances — useful for NPC archetypes or equipment sets.
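To make the data-driven equip flow concrete, here is a minimal Python sketch of the idea (class and method names are illustrative, not the gem's API — equipping is just replacing the slot's SkinSlotData record):

```python
from dataclasses import dataclass

@dataclass
class SkinSlotData:
    # Mirrors the fields described above (hypothetical sketch types).
    slot_name: str
    actor_asset: str
    material_assets: list

class SkinSlotHandler:
    """Minimal sketch of the handler's equip flow."""

    def __init__(self):
        self.slots = {}

    def register(self, data: SkinSlotData):
        self.slots[data.slot_name] = data

    def equip(self, slot_name: str, new_data: SkinSlotData):
        # Equipping is a data change: replace the slot's SkinSlotData.
        # The engine side would swap the actor mesh and materials in response.
        self.slots[slot_name] = new_data
```

A Skin Slot Configuration Profile would then be little more than a batch of `equip` calls applied in one operation.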


Quick Reference

| Need | Bus | Method |
| --- | --- | --- |
| Access a skin slot | PerformerSkinSlotRequestBus | Per-slot queries and updates |
| Manage all slots | SkinSlotHandlerRequestBus | Batch operations on all slots |

Glossary

| Term | Meaning |
| --- | --- |
| Skin Slot | An equipment position on a character that holds an actor mesh and materials |
| SkinSlotHandler | The manager component that coordinates all skin slots on a single character |
| SkinSlotData | The data structure containing actor asset, materials, and slot name |
| Skin Slot Configuration Profile | A preset asset defining a complete character appearance for batch application |

For full definitions, see the Glossary.


See Also

For the full API, component properties, and C++ extension guide:

For related systems:


Get GS_Performer

GS_Performer — Explore this gem on the product page and add it to your project.

9.3.2 - Locomotion

How to work with GS_Play velocity locomotion — automatic animation parameter driving from entity velocity.

The Velocity Locomotion Hook automatically drives animation blend parameters from an entity’s velocity. Instead of manually scripting animation state transitions, you attach this component and it reads the entity’s physics velocity each frame, writing the appropriate blend values to the animation system.

For component properties, see the Framework API reference.

Velocity Locomotion Hook component in the O3DE Inspector

 

Contents


How It Works

The VelocityLocomotionHookComponent runs on the tick bus. Each frame it:

  1. Reads the entity’s current velocity from the physics system.
  2. Extracts speed, direction, and vertical velocity.
  3. Writes these as blend parameters to the entity’s EMotionFX animation graph.

This creates a seamless connection between movement and animation — walk, run, idle, and fall transitions happen automatically based on physics state.
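The per-frame extraction step can be sketched in a few lines (a simplification under our own naming — the real `VelocityLocomotionHookComponent` writes these values into the EMotionFX animation graph each tick):

```python
import math

def locomotion_blend_params(velocity):
    """Derive blend parameters from an (x, y, z) physics velocity.

    Illustrative sketch: horizontal speed drives the walk/run blend,
    direction comes from the horizontal heading, and the vertical
    component drives fall transitions.
    """
    vx, vy, vz = velocity
    speed = math.hypot(vx, vy)
    direction = math.atan2(vy, vx) if speed > 1e-6 else 0.0
    return {"Speed": speed, "Direction": direction, "VerticalVelocity": vz}
```

Because the mapping is a pure function of the current velocity, the hook needs no internal state machine of its own.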


When to Use

| Scenario | Result |
| --- | --- |
| Character walking | Speed parameter drives walk/run blend |
| Character idle | Zero velocity triggers idle state |
| Character falling | Vertical velocity drives fall animation |
| Character stopping | Deceleration naturally transitions to idle |

The component is stateless and lightweight — it reads velocity and writes parameters each frame with no internal state machine.


Glossary

| Term | Meaning |
| --- | --- |
| Velocity Locomotion Hook | A component that reads entity velocity and writes animation blend parameters each frame |
| Blend Parameter | A named value sent to the animation graph that controls state transitions and blending |

For full definitions, see the Glossary.


See Also

For the full API, component properties, and C++ extension guide:

For related systems:


Get GS_Performer

GS_Performer — Explore this gem on the product page and add it to your project.

10 - GS_PhantomCam

Priority-based virtual camera management with blend profiles, multiple behavior types, and spatial influence fields.

GS_PhantomCam is the camera management system for GS_Play. It uses virtual cameras — Phantom Cameras — that compete for control through a priority system. Whichever holds the highest priority drives the real camera’s position, rotation, and FOV. Transitions are governed by Blend Profile assets, and Influence Fields let spatial zones adjust camera behavior dynamically.

For architecture details, component properties, and extending the system in C++, see the GS_PhantomCam API.


Quick Navigation

| I want to… | Feature | API |
| --- | --- | --- |
| Manage the camera system lifecycle and know which camera is active | Cam Manager | API |
| Place virtual cameras with follow targets, look-at, and priority-based switching | Phantom Cameras | API |
| Control how the real camera reads from the active phantom camera each frame | Cam Core | API |
| Define smooth transitions between cameras with custom easing | Blend Profiles | API |
| Create spatial zones that modify camera priority dynamically | Influence Fields | API |

Architecture

Camera Blend Pattern Graph

Breakdown

Blend Profiles are referenced from within a Phantom Camera’s blend settings. When the Cam Manager determines a transition is needed, it reads the outgoing camera’s blend settings to find the profile to use.

| Scenario | Which Profile Is Used |
| --- | --- |
| Camera A becomes inactive, Camera B was waiting | Camera A’s blend profile governs the outgoing transition. |
| Camera B is activated and is now highest priority | Camera B’s blend profile governs the incoming transition. |
| No blend profile is assigned | Transition is instantaneous — no interpolation. |

Because profiles are assets, the same profile can be shared across many Phantom Cameras. A single edit to the asset changes the feel of every camera that references it.

E Indicates extensible classes and methods.

Patterns - Complete list of system patterns used in GS_Play.


Installation

GS_PhantomCam requires GS_Core only. Add both gems to your project before placing PhantomCam components in a level.

For a full guided walkthrough, follow the Simple Project Setup guide.

 

Quick Installation Summary

  1. Enable the GS_PhantomCam gem in your project configuration.
  2. Create a Cam Manager prefab and add it to the Game Manager’s Managers list.
  3. Place a Cam Core entity with GS_CamCoreComponent in every level.
  4. Place Phantom Camera entities with desired behavior components.
  5. Create Blend Profile assets for camera transitions.

Cam Manager

The Cam Manager is the singleton camera system manager, integrated with GS_Core’s manager lifecycle. It handles startup and shutdown, responds to enable/disable events so the camera can be suspended during cinematics, and broadcasts whenever the dominant camera changes so downstream systems stay in sync.

Cam Manager API


Phantom Cameras

Phantom Cameras are virtual camera definitions placed as components on ordinary entities. Each holds field of view, clip planes, optional follow and look-at targets, positional and rotational offsets, and a priority value. The highest-priority camera drives the real camera. Specialized behavior components extend the vocabulary: first-person, orbit, static orbit, clamped look, and spline track.

Phantom Cameras API


Cam Core

Cam Core is the rendering bridge between the Phantom Camera system and the actual O3DE camera entity. It reads the current dominant camera’s data each frame and writes position, rotation, and FOV to the real camera. It also broadcasts per-frame camera position updates for shadow casters, audio listeners, and LOD controllers.

Cam Core API


Blend Profiles

Blend Profiles are data assets that define how the camera transitions when control shifts between Phantom Cameras. Each profile specifies blend duration, position easing curve, and rotation easing curve — independently, so position can ease out while rotation snaps. Profiles are shared by name across cameras for visual consistency.

Blend Profiles API


Influence Fields

Influence Fields are spatial or global modifiers that dynamically shift camera priority. GlobalCameraInfluence applies to cam core unconditionally, while CameraInfluenceField defines a physics trigger volume that modifies the priority when the player is inside it. Multiple fields with the same camera target stack influence.

Influence Fields API


See Also

For the full API, component properties, and C++ extension guide:

For step-by-step project setup:


Get GS_PhantomCam

GS_PhantomCam — Explore this gem on the product page and add it to your project.

10.1 - Cam Manager

How to work with the GS_PhantomCam manager — enabling and disabling the camera system, and responding to active camera changes.

The Cam Manager is the camera system’s singleton manager. It integrates with GS_Core’s standard manager lifecycle, handles the camera system’s startup and shutdown through the two-stage initialization sequence, and maintains awareness of which Phantom Camera is currently dominant. Any time the active camera changes, the Cam Manager broadcasts a notification so dependent systems can react without polling.

For architecture details, component properties, and extending the system in C++, see the Framework API reference.

Cam Manager component in the O3DE Inspector

 

Contents


Manager Lifecycle

The Cam Manager is a GS_ManagerComponent and participates in the standard GS_Core startup sequence. It activates alongside all other managers during the two-stage initialization, making it safe to call camera system methods any time after OnStartupComplete.

| Stage | What Happens |
| --- | --- |
| Initialize | Cam Manager activates and registers with the camera pipeline. |
| Startup (OnSetupManagers) | Manager is initialized. Safe to query camera state. |
| Complete (OnStartupComplete) | All managers ready. Safe to enable or disable the camera system. |

Enabling and Disabling the Camera System

The camera system can be suspended and resumed independently of other gameplay systems. This is used during UI-only states, loading screens, or when a cinematic takes direct control of the camera.

| Notification | What It Does |
| --- | --- |
| EnableCameraSystem | Activates the camera system. Phantom Cameras resume competing for priority. |
| DisableCameraSystem | Suspends the camera system. The real camera stops receiving updates from Phantom Cameras. |

Both events are on CamManagerNotificationBus. Broadcast them to switch the camera system state from any script or component.


Tracking the Active Camera

The Cam Manager broadcasts SettingNewCam on CamManagerNotificationBus whenever a different Phantom Camera becomes dominant. This fires any time a camera with higher priority activates, a dominant camera deactivates, or a blend resolves to a new target.

Use this event in any system that needs to know which camera is currently active — aiming reticles, world-space UI elements, minimaps, or audio listener positioning.

ScriptCanvas


Suspending the Camera System

ScriptCanvas

Broadcast DisableCameraSystem to suspend camera updates, for example when a cutscene takes direct camera control:

Because DisableCameraSystem and EnableCameraSystem are notifications rather than requests, they are fire-and-forget. Any script can broadcast them.


Changing the Camera Target

ScriptCanvas

The follow target, designated by the CamManager, can be set with Get/Set Target:


Cam Manager Entity Configuration

Cam Manager entity setup in the O3DE Editor

The Cam Manager MUST carry the MainCamera with it. This enables proper control of the camera and ensures a Cam Core is present no matter which stage you load into first.

Additionally, you can add any phantom cameras that should persist between stage changes. These are usually the “rig” cameras that constantly track the player.

Using a PossessedUnit event on a player controller enables you to immediately set the Cam Manager target to your spawned player unit.


Quick Reference

| Need | Bus | Method / Event |
| --- | --- | --- |
| Enable the camera system | CamManagerNotificationBus | EnableCameraSystem |
| Disable the camera system | CamManagerNotificationBus | DisableCameraSystem |
| Know when the active camera changes | CamManagerNotificationBus | SettingNewCam(newCamEntityId) |
| Access camera manager methods | CamManagerRequestBus | (see Framework API) |

Glossary

| Term | Meaning |
| --- | --- |
| Cam Manager | The singleton manager for the PhantomCam system that tracks active cameras and broadcasts changes |
| Dominant Camera | The Phantom Camera with the highest priority that currently drives the real camera |

For full definitions, see the Glossary.


See Also

For the full API, component properties, and C++ extension guide:

For related systems:


Get GS_PhantomCam

GS_PhantomCam — Explore this gem on the product page and add it to your project.

10.2 - Phantom Cameras

How to configure and use GS_PhantomCameraComponent — priority, targets, data fields, and the available camera behavior types.

Phantom Cameras are virtual camera definitions placed in the level as components on ordinary entities. They do not render anything. Instead, each Phantom Camera holds a configuration record that describes how the real camera should behave when that Phantom Camera is dominant. The Cam Manager determines dominance by comparing priority values — whichever active Phantom Camera holds the highest priority drives the real camera at any given moment.

For architecture details, component properties, and extending the system in C++, see the Framework API reference.

Phantom Camera component in the O3DE Inspector

 

Contents


How Priority Works

Every GS_PhantomCameraComponent has a priority value. The Cam Manager continuously tracks all active Phantom Cameras and routes camera control to the one with the highest priority. When a Phantom Camera is disabled or deactivated, the next-highest priority camera takes over.

| Condition | Result |
| --- | --- |
| Camera A priority 10, Camera B priority 5 — both active | Camera A is dominant. |
| Camera A deactivates | Camera B becomes dominant immediately (or blends if a Blend Profile is set). |
| Camera A and Camera B both priority 10 | The most recently activated one is dominant. |
| No Phantom Cameras are active | The real camera holds its last known position. |
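The dominance rules above can be sketched in Python (an illustrative model of the rules, not the Cam Manager's actual implementation — the field names here are our own):

```python
def dominant_camera(active_cams):
    """Pick the dominant camera from a list of camera records.

    Rules sketched: highest priority wins; on a priority tie, the most
    recently activated camera is dominant; with no active cameras there
    is no dominant camera (the real camera holds its last position).
    """
    if not active_cams:
        return None
    # Sorting key (priority, activated_at): later activation breaks ties.
    return max(active_cams, key=lambda cam: (cam["priority"], cam["activated_at"]))
```

Because resolution is a pure comparison over active cameras, enabling or disabling a camera entity is all it takes to change which camera drives the view.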

PhantomCamData Fields

Each Phantom Camera stores a PhantomCamData record that defines the full camera configuration for when that camera is dominant.

| Field | Type | Purpose |
| --- | --- | --- |
| FOV | float | Field of view in degrees. Applied to the real camera’s FOV when dominant. |
| NearClip | float | Near clip plane distance. |
| FarClip | float | Far clip plane distance. |
| FollowTarget | EntityId | Entity the camera’s position tracks. Camera moves with this entity. |
| LookAtTarget | EntityId | Entity the camera’s rotation aims at. Overrides authored rotation. |
| PositionOffset | Vector3 | Positional offset applied relative to the follow target. |
| RotationOffset | Quaternion | Rotational offset applied after look-at or authored rotation. |

FollowTarget and LookAtTarget are both optional. A Phantom Camera with neither set holds the position and rotation of its entity in the level.


Camera Behavior Types

Beyond the base GS_PhantomCameraComponent, several behavior components can be added to a Phantom Camera entity to specialize how it moves and aims:

| Component | What It Does |
| --- | --- |
| ClampedLook_PhantomCamComponent | Adds yaw and pitch angle limits to look-at rotation. Prevents the camera from swinging past defined bounds. |
| StaticOrbitPhantomCamComponent | Locks the camera to a fixed orbit radius around its target entity. The orbit position does not follow input — it is a fixed authored angle. |
| Track_PhantomCamComponent | Moves the camera along a spline path. Useful for cinematic dolly shots or scripted flythrough sequences. |
| AlwaysFaceCameraComponent | Makes the entity that holds this component always rotate to face the currently dominant Phantom Camera. Used for billboards and 2D sprites in 3D space. |

Each behavior component is additive — the base Phantom Camera still drives priority and targeting while the behavior component modifies position or rotation evaluation.


Activating and Deactivating Cameras

Phantom Cameras participate in priority competition only while their entity is active. Enable and disable the entity to add or remove a camera from competition. The transition between dominant cameras is governed by the Blend Profile referenced in the outgoing camera’s blend settings.

ScriptCanvas

To raise a camera’s priority and make it dominant:

Because priority is numeric, you can also activate multiple cameras simultaneously and let priority values determine dominance without manually disabling others.


Requesting Camera Data

Use PhantomCameraRequestBus (addressed by the Phantom Camera entity’s ID) to read or write its configuration at runtime:

ScriptCanvas


Quick Reference

| Need | Bus | Method / Event |
| --- | --- | --- |
| Read a camera’s full configuration | PhantomCameraRequestBus(id) | GetPhantomCamData |
| Change a camera’s follow target | PhantomCameraRequestBus(id) | SetFollowTarget(entityId) |
| Change a camera’s look-at target | PhantomCameraRequestBus(id) | SetLookAtTarget(entityId) |
| Change a camera’s FOV | PhantomCameraRequestBus(id) | SetFOV(value) |
| Make a camera dominant | (entity activation) | Enable the Phantom Camera entity |
| Remove a camera from competition | (entity activation) | Disable the Phantom Camera entity |
| Know when the dominant camera changes | CamManagerNotificationBus | SettingNewCam(newCamEntityId) |

Glossary

| Term | Meaning |
| --- | --- |
| Phantom Camera | A virtual camera definition that describes how the real camera should behave when dominant |
| PhantomCamData | The configuration record holding FOV, clip planes, follow/look-at targets, and offsets |
| Priority | A numeric value that determines which Phantom Camera drives the real camera |
| Follow Target | An entity whose position the camera tracks |
| Look-At Target | An entity the camera’s rotation aims toward |

For full definitions, see the Glossary.


See Also

For the full API, component properties, and C++ extension guide:

For related systems:


Get GS_PhantomCam

GS_PhantomCam — Explore this gem on the product page and add it to your project.

10.3 - Cam Core

How to work with GS_CamCoreComponent — the rendering bridge that reads the dominant Phantom Camera each frame and drives the real camera entity.

Cam Core is the rendering bridge between the Phantom Camera system and the actual O3DE camera entity. GS_CamCoreComponent runs on tick, reads the dominant Phantom Camera’s configuration each frame, and writes position, rotation, and field of view directly to the camera entity that the renderer uses. Phantom Cameras define intent — Cam Core executes it.

For architecture details, component properties, and extending the system in C++, see the Framework API reference.

Cam Core component in the O3DE Inspector

 

Contents


What Cam Core Does Each Frame

On every tick, Cam Core follows this sequence:

| Step | What Happens |
| --- | --- |
| 1 — Query active camera | Reads the current dominant Phantom Camera from the Cam Manager. |
| 2 — Read PhantomCamData | Fetches the active camera’s position, rotation, FOV, and clip planes. |
| 3 — Apply blend | If a blend is in progress, interpolates between the outgoing and incoming camera values. |
| 4 — Write to real camera | Pushes the resolved transform and FOV to the real O3DE camera entity. |
| 5 — Broadcast position | Fires UpdateCameraPosition on CamCoreNotificationBus so dependent systems receive the final camera location. |

Because Cam Core owns the final write to the camera entity, it is also the correct insertion point for last-mile adjustments such as screen shake offsets, post-processing overrides, or recoil displacement — applied after blend resolution but before the frame renders.


Responding to Camera Position Updates

CamCoreNotificationBus broadcasts UpdateCameraPosition every frame with the resolved camera transform. Connect to this bus in any system that needs per-frame camera location data.

| Use Case | Why UpdateCameraPosition |
| --- | --- |
| Audio listener positioning | Follow the camera without polling the camera entity. |
| LOD or culling controllers | Receive camera position reliably after blend is applied. |
| Shadow caster updates | React to camera movement each frame. |
| World-space UI anchoring | Know exactly where the camera landed after Cam Core processed its frame. |

ScriptCanvas


Cam Core Setup

Cam Core requires exactly one GS_CamCoreComponent in the level, and a separate entity with an O3DE Camera component that Cam Core is configured to drive. The Camera entity is what the renderer uses — Cam Core holds a reference to it and writes to it each frame.

| Entity | Required Components |
| --- | --- |
| Cam Core entity | GS_CamCoreComponent |
| Camera entity | O3DE CameraComponent (standard) |

Assign the Camera entity reference in the Cam Core component’s properties in the editor. Both entities should be present in every level that uses the PhantomCam system.


Querying Cam Core State

Use CamCoreRequestBus to read the current camera state or set the camera entity reference at runtime:

ScriptCanvas

Cam Core state in Script Canvas


Quick Reference

| Need | Bus | Method / Event |
| --- | --- | --- |
| Know the camera position each frame | CamCoreNotificationBus | UpdateCameraPosition(transform) |
| Get the real camera entity ID | CamCoreRequestBus | GetCameraEntity |
| Get the current resolved FOV | CamCoreRequestBus | GetCurrentFOV |
| Know when the active camera changes | CamManagerNotificationBus | SettingNewCam(newCamEntityId) |

Glossary

| Term | Meaning |
| --- | --- |
| Cam Core | The rendering bridge that reads the dominant Phantom Camera each frame and writes to the real O3DE camera |
| Blend Resolution | The process of interpolating position, rotation, and FOV between outgoing and incoming cameras |

For full definitions, see the Glossary.


See Also

For the full API, component properties, and C++ extension guide:

For related systems:


Get GS_PhantomCam

GS_PhantomCam — Explore this gem on the product page and add it to your project.

10.4 - Blend Profiles

How to create and use GS_PhantomCamBlendProfile assets to control transition timing, easing, and interpolation between Phantom Cameras.

Blend Profiles are data assets that define how the camera transitions when control shifts from one Phantom Camera to another. Without a Blend Profile, transitions are instantaneous. A well-authored Blend Profile is often the single largest contributor to a camera system feeling polished rather than mechanical.

For architecture details, asset structure, and extending the system in C++, see the Framework API reference.

Cam Blend Profile asset in the O3DE Asset Editor

 

Contents


What a Blend Profile Contains

Each GS_PhantomCamBlendProfile asset defines the shape of a single camera transition:

| Field | Purpose |
| --- | --- |
| Duration | How long the blend takes, in seconds. |
| Position Easing | The easing curve applied to position interpolation. Controls how the camera accelerates and decelerates as it moves toward the new position. |
| Rotation Easing | The easing curve applied to rotation interpolation. Can be set independently from position easing. |
| FOV Easing | The easing curve applied to field-of-view interpolation. |

Position, rotation, and FOV each have independent easing settings. This makes it straightforward to compose transitions that feel distinct in each axis — for example, position eases out slowly while rotation snaps quickly, or FOV leads the transition while position lags.


How Blend Profiles Are Referenced


Creating a Blend Profile Asset

Blend Profile assets are created from the Asset Browser in the O3DE Editor:

  1. Right-click in the Asset Browser where you want to store the profile.
  2. Select Create Asset → GS_PhantomCamBlendProfile.
  3. Name the asset descriptively — for example, CombatCameraBlend or CinematicSoftBlend.
  4. Open the asset to edit duration and easing curves.
  5. Assign the asset to one or more Phantom Camera components in the level.

Easing Curve Reference

Easing curves control the acceleration profile of an interpolation. Common choices and their camera feel:

| Curve | Feel |
| --- | --- |
| Linear | Mechanical, even movement. Rarely used for cameras. |
| EaseIn | Starts slow, accelerates. Camera hesitates before committing. |
| EaseOut | Starts fast, decelerates. Camera arrives gently. |
| EaseInOut | Slow start and end, fast through the middle. Most natural for most transitions. |
| EaseOutBack | Slight overshoot before settling. Adds energy to arriving at a new view. |
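To show what an eased blend does to a single camera channel, here is a minimal Python sketch (the function names and the smoothstep-style curve are ours — the actual curve shapes come from the Blend Profile asset):

```python
def ease_in_out(t):
    # Smoothstep-style EaseInOut: slow start and end, fast middle.
    return t * t * (3.0 - 2.0 * t)

def blend(start, end, elapsed, duration, ease=ease_in_out):
    """Interpolate one camera channel (a position axis, FOV, ...) over a blend.

    Illustrative only: the profile stores duration and a per-channel
    easing choice; Cam Core performs this interpolation each frame.
    """
    t = min(max(elapsed / duration, 0.0), 1.0)
    return start + (end - start) * ease(t)
```

Because position, rotation, and FOV each get their own easing curve, the same `blend` step runs per channel with a different `ease` argument.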

ScriptCanvas — Triggering a Blend

Blends trigger automatically when the dominant Phantom Camera changes. To trigger a blend from script, simply change which camera is active or dominant:

[Custom Event → EnterCombat]
    └─► [EntityId → Set Active (combat camera entity)]
            └─► Cam Manager detects priority change
            └─► Blend Profile on combat camera governs the transition
            └─► Cam Core interpolates position, rotation, FOV over blend duration

There is no separate “start blend” call. The blend begins the moment the Cam Manager determines a new dominant camera and ends when Cam Core finishes interpolating.


Quick Reference

| Need | How |
| --- | --- |
| Create a blend profile | Asset Browser → Create Asset → GS_PhantomCamBlendProfile |
| Assign a profile to a camera | Set the Blend Profile reference in the Phantom Camera component properties |
| Make a transition instant | Leave the Blend Profile reference empty on the Phantom Camera |
| Share one profile across many cameras | Assign the same asset reference to multiple Phantom Camera components |
| Change blend timing without changing cameras | Edit the Blend Profile asset — all cameras referencing it update automatically |

Glossary

| Term | Meaning |
| --- | --- |
| Blend Profile | A data asset defining duration and per-axis easing curves for camera transitions |
| Position Easing | The easing curve controlling how the camera accelerates/decelerates in position |
| Rotation Easing | The easing curve controlling rotation interpolation independently from position |

For full definitions, see the Glossary.


See Also

For the full API, component properties, and C++ extension guide:

For related systems:


Get GS_PhantomCam

GS_PhantomCam — Explore this gem on the product page and add it to your project.

10.5 - Influence Fields

How to use GlobalCameraInfluenceComponent and CameraInfluenceFieldComponent to apply dynamic camera modifications globally or within spatial zones.

Influence Fields are modifiers that change camera priority dynamically beyond what any individual Phantom Camera’s base priority specifies. Two types are available: global components that apply unconditionally, and spatial field components that apply only when the player enters a defined volume.

For architecture details, component properties, and extending the system in C++, see the Framework API reference.

Camera Influence Field component in the O3DE Inspector

 

Contents


Types of Influence Fields

| Component | Scope | When It Applies |
| --- | --- | --- |
| GlobalCameraInfluenceComponent | Global | Always active. Applies priority effects to the system unconditionally. Usually placed on the SceneData entity. |
| CameraInfluenceFieldComponent | Spatial | Active when the player is inside the associated physics trigger volume. |

Both types use CamInfluenceData records to describe what they modify. Multiple fields can be active at the same time — their effects are resolved by priority stacking.


Priority Stacking

When multiple influence fields are active at the same time, their priority changes add to the target camera’s priority. This lets progressively more granular spaces take over the camera: entering a volume adds a new influence and shifts the camera in, and leaving it removes that influence and shifts control back to the surrounding, more general space.
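The stacking rule reduces to a sum, as this small Python sketch illustrates (function and field names are ours, not the gem's API):

```python
def effective_priority(base_priority, influence_deltas):
    """Resolve a camera's effective priority under active influence fields.

    Priority stacking sketch: each active field contributes an additive
    delta on top of the camera's base priority. The camera with the
    highest effective priority becomes dominant.
    """
    return base_priority + sum(influence_deltas)
```

For example, a camera with base priority 10 inside two overlapping fields contributing +5 and -2 resolves to an effective priority of 13.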


Quick Reference

| Need | Component | How |
| --- | --- | --- |
| Apply a camera priority always | GlobalCameraInfluenceComponent | Add to any entity, preferably the StageData entity |
| Apply a priority in a zone | CameraInfluenceFieldComponent | Add with a PhysX trigger collider; player entering activates it |

Glossary

| Term | Meaning |
| --- | --- |
| Influence Field | A modifier that adjusts camera behavior dynamically on top of the dominant Phantom Camera |
| Priority Stacking | Taking a camera’s base priority and additively applying influence field changes on top to determine the dominant Phantom Camera |
| Global Influence | An unconditional priority modifier that applies regardless of spatial position |
| Spatial Influence | A volume-based modifier that activates when the player enters its physics trigger |

For full definitions, see the Glossary.


See Also

For the full API, component properties, and C++ extension guide:

For related systems:


Get GS_PhantomCam

GS_PhantomCam — Explore this gem on the product page and add it to your project.

11 - GS_Stats

RPG statistics and status effect systems for GS_Play — timed and permanent modifiers that alter entity stats and drive reactive gameplay behavior.

GS_Stats provides the statistical and modifier infrastructure for RPG-style character systems. It handles the application, tracking, and expiry of status effects — the layer between raw gameplay events and the numerical state of a character.

For architecture details, component properties, and extending the system in C++, see the GS_Stats API.


Quick Navigation

| I want to… | Feature | API |
| --- | --- | --- |
| Apply timed or permanent modifiers to entity stats with stacking and expiry | Status Effects | API |

Installation

GS_Stats requires GS_Core. Add both gems to your project.

For a full guided walkthrough, follow the Simple Project Setup guide.

 

Quick Installation Summary

  1. Enable the GS_Stats gem in your project configuration.
  2. Define status effect data assets for your game’s modifiers.
  3. Place status effect components on entities that need to receive modifiers.

Status Effects

Status Effects are the primary mechanism for applying timed or permanent modifiers to entities. Each effect defines what it changes, how long it lasts, whether it stacks, and what happens on expiry. The system handles the full lifecycle: application, duration tracking, stack management, and cleanup. Effects are data-driven — designers author new types in the Asset Editor without touching code.

Status Effects API


See Also

For the full API, component properties, and C++ extension guide:


Get GS_Stats

GS_Stats — Explore this gem on the product page and add it to your project.

11.1 - Status Effects

How to work with GS_Play status effects — timed modifiers that alter entity stats or behavior.

Status Effects are timed or permanent modifiers that alter entity stats or behavior. They support stacking, expiration, and side-effect triggering — enabling standard RPG mechanics like poison damage over time, temporary buffs, and debuff stacking.

For component properties and the status effect data model, see the Framework API reference.

 

Contents


How Status Effects Work

| Phase | What Happens |
| --- | --- |
| Application | A status effect is applied to an entity with a duration and configuration. |
| Active | The effect modifies the entity’s stats or triggers periodic side effects. |
| Stacking | If the same effect is applied again, stacking rules determine whether it refreshes, stacks count, or is rejected. |
| Expiration | When the duration expires, the effect is removed and any cleanup side effects fire. |
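The lifecycle above can be sketched as a small Python class (an illustrative model under our own names and one possible stacking rule, not the gem's data model):

```python
class StatusEffect:
    """Minimal lifecycle sketch: apply, stack/refresh, tick, expire."""

    def __init__(self, name, duration, max_stacks=1):
        self.name = name
        self.duration = duration
        self.remaining = duration   # time left before expiry
        self.max_stacks = max_stacks
        self.stacks = 1

    def reapply(self):
        # One possible stacking rule: add a stack while under the cap,
        # and always refresh the remaining duration.
        if self.stacks < self.max_stacks:
            self.stacks += 1
        self.remaining = self.duration

    def tick(self, dt):
        # Returns False once the effect has expired and should be removed.
        self.remaining -= dt
        return self.remaining > 0
```

In the real system these rules are authored in data assets, so a designer chooses refresh, stack-count, or reject behavior per effect without code.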

Glossary

| Term | Meaning |
| --- | --- |
| Status Effect | A timed or permanent modifier that alters entity stats or behavior |
| Stacking | Rules governing how repeated applications of the same effect interact |
| Side Effect | An action triggered when a status effect is applied, ticks, or expires |

For full definitions, see the Glossary.


See Also

For the full API, component properties, and C++ extension guide:


Get GS_Stats

GS_Stats — Explore this gem on the product page and add it to your project.

12 - GS_UI

The complete UI framework for GS_Play — canvas lifecycle management, single-tier page navigation, enhanced buttons, data-driven animation, and input interception.

GS_UI is the complete user interface framework for GS_Play, built on top of O3DE’s LyShine rendering layer. It provides a singleton manager for canvas lifecycle and focus, a single-tier page navigation architecture, motion-based button animations, a LyShine-specific track animation system authored as data assets, and input interception utilities for pause menus and stage transitions.

For architecture details, component properties, and extending the system in C++, see the GS_UI API.


Quick Navigation

| I want to… | Feature | API |
| --- | --- | --- |
| Load and unload UI canvases, manage focus stack, or set startup focus | UI Manager | API |
| Navigate between pages, handle back navigation, or cross canvas boundaries | Page Navigation | API |
| Add button animations or intercept input for UI canvases | UI Interaction | API |
| Animate UI elements with position, scale, rotation, alpha, and color tracks | UI Animation | API |
| Add a load screen or pause menu | Widgets | API |

Installation

GS_UI requires GS_Core and LyShine. Add both gems to your project before placing UI components.

For a full guided walkthrough, follow the Simple Project Setup guide.

 

Quick Installation Summary

  1. Enable the GS_UI gem in your project configuration.
  2. Create a UI Manager prefab and add it to the Game Manager’s Managers list.
  3. Create LyShine UI canvases with GS_UIPageComponent on root elements.
  4. Create .uiam animation assets for page transitions and button effects.

UI Manager

The UI Manager is the singleton that owns canvas lifecycle for the entire game session. It loads and unloads canvases by name, maintains a global focus stack, and handles startup focus deterministically. All canvas operations go through the Manager so that cross-canvas navigation and focus transitions remain coordinated.

UI Manager API


Page Navigation

Page Navigation is the single-tier architecture that structures all navigable screens. Canvases contain root pages that can parent nested child pages recursively. Navigation is driven through the Page component: NavigateTo, NavigateBack, ChangePage, and ChangeUI handle all routing. Pages play UiAnimationMotion assets on show and hide for data-driven transitions.

Page Navigation API


UI Interaction

Enhanced button animations and input interception. GS_ButtonComponent extends LyShine interactables with UiAnimationMotion-based hover and select animations. GS_UIInputInterceptorComponent captures input events while a canvas is focused, preventing them from reaching gameplay. Both systems work together on interactive canvases.

UI Interaction API


UI Animation

UI Animation extends GS_Motion with eight LyShine-specific tracks: position, scale, rotation, element alpha, image alpha, image color, text color, and text size. Tracks are authored together in .uiam assets and played in parallel at runtime. Page components embed motion instances for show/hide transitions, and a standalone component makes animations available on any UI entity.

UI Animation API


Widgets

Standalone UI components for game-event-driven scenarios outside the page navigation model. Load screens display automatically during stage transitions. Pause menus overlay gameplay and suppress gameplay input while active.

Widgets API


See Also

For the full API, component properties, and C++ extension guide:

For step-by-step project setup:


Get GS_UI

GS_UI — Explore this gem on the product page and add it to your project.

12.1 - UI Manager

How to work with the GS_Play UI Manager — canvas lifecycle, focus stack, and cross-canvas navigation.

The UI Manager is the singleton controller for all UI canvases in your project. It loads and unloads canvases, maintains a global focus stack that tracks which canvas the player is currently interacting with, and handles cross-canvas navigation so that opening a pause menu over gameplay UI and returning to it works automatically.

For architecture details, component properties, and extension patterns, see the Framework API reference.

UI Manager component in the O3DE Inspector

 

Contents


How It Works

The UI Manager maintains a global focus stack — a history of which UI canvas has focus. When you open a new canvas (like a pause menu), it pushes onto the stack. When you close it, the previous canvas (gameplay HUD) regains focus automatically.

At startup, the UI Manager loads canvases and waits for root pages to register. One canvas is designated as the startup focus UI — it receives focus automatically when the startup sequence completes.
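The push/pop behaviour described above can be modelled in a few lines of Python. This is an illustrative sketch only; the real manager also enables canvases and routes focus to root pages. The `UIFocusStack` class and its method names are hypothetical mirrors of the FocusUI, NavLastUI, and GetFocusedUI operations.

```python
class UIFocusStack:
    """Minimal model of the UI Manager's global focus stack:
    focusing pushes a canvas name, navigating back pops it."""
    def __init__(self):
        self._stack = []

    def focus_ui(self, name: str):          # mirrors FocusUI(name)
        self._stack.append(name)

    def nav_last_ui(self):                  # mirrors NavLastUI
        if self._stack:
            self._stack.pop()
        return self.get_focused_ui()        # canvas that regains focus

    def get_focused_ui(self):               # mirrors GetFocusedUI
        return self._stack[-1] if self._stack else None
```

Opening a pause menu over the HUD and backing out is then just a push followed by a pop, with the HUD regaining focus automatically.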


Loading and Unloading Canvases

| ScriptCanvas Node | What It Does |
| --- | --- |
| LoadGSUI(name, path) | Loads a UI canvas from an asset path and registers it by name. |
| UnloadGSUI(name) | Unloads a canvas and removes it from the active map. |
| FocusUI(name) | Pushes a canvas onto the global focus stack and focuses its root page. |
| NavLastUI | Pops the current canvas off the focus stack and returns focus to the previous one. |
| ToggleUI(name, on) | Shows or hides a canvas without changing the focus stack. |

ScriptCanvas

To change between canvases (leaving the previous one inactive and enabling the next), toggle off your current canvas, then focus the next. Focusing a canvas automatically enables it for use.

A later Page: NavigateBack hides the new canvas and reactivates the previous one.


Querying UI State

| ScriptCanvas Node | What It Does |
| --- | --- |
| GetFocusedUI | Returns the name of the canvas currently on top of the focus stack. |
| GetUICanvasEntity(name) | Returns the canvas entity for a named UI. |
| GetUIRootPageEntity(name) | Returns the root page entity for a named UI. |

ScriptCanvas


UI Input Profile

UI Input Profile asset in the O3DE Asset Editor


Quick Reference

| Need | Bus | Method |
| --- | --- | --- |
| Load a canvas | UIManagerRequestBus | LoadGSUI(name, path) |
| Unload a canvas | UIManagerRequestBus | UnloadGSUI(name) |
| Focus a canvas | UIManagerRequestBus | FocusUI(name) |
| Return to previous canvas | UIManagerRequestBus | NavLastUI |
| Show/hide a canvas | UIManagerRequestBus | ToggleUI(name, on) |
| Get focused canvas name | UIManagerRequestBus | GetFocusedUI |

Glossary

| Term | Meaning |
| --- | --- |
| UI Canvas | A LyShine canvas loaded and managed by the UI Manager |
| Focus Stack | A global history of focused canvases — opening a new one pushes, closing pops |
| Root Page | A page registered with the UI Manager as a canvas entry point |

For full definitions, see the Glossary.


See Also

For the full API, component properties, and C++ extension guide:

For related systems:


Get GS_UI

GS_UI — Explore this gem on the product page and add it to your project.

12.2 - Page Navigation

How to work with GS_Play page navigation — single-tier page hierarchy, focus management, and transitions.

Page Navigation is the core of the GS_UI system. Pages are the fundamental unit of UI organization — each page represents a screen, panel, or section of your interface. Pages can be root pages (registered with the UI Manager as a canvas entry point) or nested child pages (managed by a parent page). Navigation uses a push/pop focus stack so the system always knows where to return when the player backs out.

For architecture details, the NavigateTo algorithm, and component internals, see the Framework API reference.

Root page entity setup in the O3DE Editor

 

Contents


Page Types

| Type | How It’s Configured | What It Does |
| --- | --- | --- |
| Root Page | m_isRoot = true | Registers with the UI Manager by name. Acts as the canvas entry point. |
| Child Page | m_isRoot = false | Registers with its parent page automatically. Managed by parent’s show/hide logic. |

A canvas typically has one root page at the top, with child pages nested beneath it for sub-screens (settings tabs, inventory categories, confirmation dialogs).

Nested pages entity hierarchy in the O3DE Editor

An example of nesting pages within pages to define the UI display structure and handle focus changes between them. SettingsWindowPage represents the main options window; its child pages are the individual panels displayed within that space, switched by the side menu.

For guides on complex page navigation, see Lesson: Create UI.

 

Required Companion Components

Root page entities require two additional components alongside GS_UIPageComponent:

| Component | Why It’s Needed |
| --- | --- |
| FaderComponent | Drives alpha-based fade transitions for show/hide animations. |
| HierarchicalInteractionToggleComponent | Disables input on the entire page subtree when the page is hidden, preventing clicks from reaching invisible elements. |

Add both to every root page entity in the UI Editor.


These are the primary methods for controlling page flow:

| ScriptCanvas Node | What It Does |
| --- | --- |
| NavigateTo(forced) | Focuses this page. Automatically pushes at the correct ancestor level in the hierarchy. |
| NavigateBack | Walks up the page hierarchy to find the first ancestor with a non-empty focus stack and pops it. If no stack is found, calls NavLastUI on the UI Manager (closes the canvas). |
| ChangePage(target, takeFocus, forced) | Swaps the displayed child page without pushing a new focus stack entry. |
| ToggleShow(on) | Shows or hides this page. |
| ChangeUI(targetUI, takeFocus, hideThis) | Cross-canvas navigation — switches to a different UI canvas. |
| FocusChildPageByName(name, forced) | Focuses a child page by its m_pageName string. |
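The NavigateBack walk can be sketched as follows. This is an illustrative model, not the component's implementation; the `Page` class and `navigate_back` helper are hypothetical Python stand-ins.

```python
class Page:
    """Minimal page node: a name, an optional parent, and the focus
    stack of child pages this page has pushed."""
    def __init__(self, name, parent=None):
        self.name = name
        self.parent = parent
        self.focus_stack = []

def navigate_back(page, nav_last_ui):
    """Walk up from `page` and pop the first non-empty ancestor focus
    stack. Falls back to the UI Manager's NavLastUI (closing the
    canvas) if every stack on the way up is empty."""
    node = page
    while node is not None:
        if node.focus_stack:
            node.focus_stack.pop()
            return node            # ancestor that handled the back step
        node = node.parent
    nav_last_ui()                  # no stack found: leave the canvas
    return None
```

This is why backing out of a deeply nested settings tab returns you one level at a time, and a final back press hands control back to the UI Manager.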

Handling Page Components

Navigating Pages

Changing UIs


Return Policies

Each page has a NavigationReturnPolicy that controls which element gets focus when the page is returned to:

| Policy | Behavior |
| --- | --- |
| RestoreLast | Returns focus to the last interactable the player was on (e.g., resume where you left off). |
| AlwaysDefault | Always returns to the page’s default interactable (e.g., always land on “New Game” button). |

Page Transitions

Pages support three motion slots for animated transitions:

| Slot | When It Plays |
| --- | --- |
| onShow | When the page becomes visible. |
| onShowLoop | Loops continuously while the page is visible. |
| onHide | When the page is hidden. |

Each slot takes a .uiam (UI Animation Motion) asset. See UI Animation for track types.


Quick Reference

| Need | Bus | Method |
| --- | --- | --- |
| Navigate to a page | UIPageRequestBus | NavigateTo(forced) |
| Go back | UIPageRequestBus | NavigateBack |
| Switch child page | UIPageRequestBus | ChangePage(target, takeFocus, forced) |
| Show/hide a page | UIPageRequestBus | ToggleShow(on) |
| Switch canvases | UIPageRequestBus | ChangeUI(targetUI, takeFocus, hideThis) |
| Focus child by name | UIPageRequestBus | FocusChildPageByName(name, forced) |

Glossary

| Term | Meaning |
| --- | --- |
| Page | The fundamental UI unit — a screen, panel, or section of the interface |
| Root Page | A page registered with the UI Manager as a canvas entry point |
| Child Page | A page managed by a parent page, navigated to via ChangePage or FocusChildPageByName |
| NavigationReturnPolicy | Controls which element gets focus when returning to a page (RestoreLast or AlwaysDefault) |

For full definitions, see the Glossary.


See Also

For the full API, component properties, and C++ extension guide:

For related systems:


Get GS_UI

GS_UI — Explore this gem on the product page and add it to your project.

12.3 - UI Interaction

How to work with GS_Play UI interaction — motion-based button animations and input interception for focused canvases.

UI Interaction covers two systems that handle player input at the UI layer: enhanced buttons with motion-based hover and select animations, and the input interceptor that captures input events while a canvas is focused.

For component properties and the full API, see the Framework API reference.

 

Contents


Buttons

GS_ButtonComponent in the O3DE Inspector

GS_ButtonComponent extends LyShine’s interactable system with UiAnimationMotion support. When the player hovers over a button (mouse or gamepad navigation), a configurable animation plays. When the button is selected, a separate animation fires. Both animations use .uiam assets, giving full control over position, scale, rotation, alpha, and color transitions.

Setup

  1. Add a LyShine UiButtonComponent (or other interactable) to your entity.
  2. Add GS_ButtonComponent to the same entity.
  3. Author .uiam assets for hover, unhover, and select states.
  4. Assign the assets to the corresponding fields in the Inspector.

Integration with Page Focus

When page navigation moves focus to a button (via keyboard or gamepad), the button treats focus as a hover event and plays its hover animation. This ensures consistent visual feedback regardless of input method.


Input Interception

UI Input Interceptor component in the O3DE Asset Editor

GS_UIInputInterceptorComponent prevents input from leaking to gameplay systems while a UI canvas is active. It uses a GS_UIInputProfile asset to define which input channels to intercept. Intercepted events are re-broadcast on UIInputNotificationBus so UI-specific logic can respond to them.

When the canvas loses focus, the interceptor deactivates automatically and input flows normally back to gameplay.
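The interception flow can be modelled roughly like this. Illustrative only: the `InputInterceptor` class, its channel names, and the event list standing in for UIInputNotificationBus are all hypothetical.

```python
class InputInterceptor:
    """Sketch of input interception: while the canvas is focused,
    channels listed in the profile are consumed and re-broadcast to UI
    listeners instead of reaching gameplay."""
    def __init__(self, profile_channels):
        self.profile = set(profile_channels)   # from a profile-style asset
        self.focused = False
        self.ui_events = []                    # stand-in for the UI notification bus

    def handle(self, channel):
        if self.focused and channel in self.profile:
            self.ui_events.append(channel)     # UI logic responds here
            return True                        # consumed: gameplay never sees it
        return False                           # passes through to gameplay
```

Note how losing focus restores pass-through behaviour without any explicit teardown, matching the automatic deactivation described above.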

Setup

  1. Add GS_UIInputInterceptorComponent to the root page entity.
  2. Configure a GS_UIInputProfile asset with the channels to intercept.
  3. Connect to UIInputNotificationBus in any component that should respond to intercepted events.

Integration

Buttons and input interception work together on interactive canvases:

  • The input interceptor blocks gameplay input while a menu is open.
  • Buttons provide visual hover/select feedback driven by UIInputNotificationBus events.
  • The page system routes gamepad focus to the correct button automatically.

Glossary

| Term | Meaning |
| --- | --- |
| GS_ButtonComponent | An enhanced button with GS_Motion-based hover and select animations |
| Hover Animation | A .uiam motion that plays when the button receives focus or mouse hover |
| Select Animation | A .uiam motion that plays when the button is activated |
| GS_UIInputInterceptorComponent | Component that captures input events while a canvas is focused |
| GS_UIInputProfile | Data asset defining which input channels to intercept |

For full definitions, see the Glossary.


See Also

For the full API, component properties, and C++ extension guide:

For related systems:


Get GS_UI

GS_UI — Explore this gem on the product page and add it to your project.

12.3.1 - Buttons

Redirected — button documentation is now part of UI Interaction.

GS_ButtonComponent is an enhanced button that adds GS_Motion-based animations for hover and select states. For the full guide, see The Basics: UI Interaction.

For component properties and API details, see the Framework API reference.

 

GS_ButtonComponent in the O3DE Inspector

Contents


How It Works

The GS_ButtonComponent connects to LyShine’s interactable notification system. When the interactable reports a hover event, the button plays its hover animation. When it reports an unhover event, the animation reverses or stops. Select works the same way.

Animations are defined as .uiam assets and assigned in the Inspector. Each button can have independent hover and select motions.


Integration with Page Focus

When page navigation moves focus to a button (via keyboard or gamepad), the button treats focus as a hover event and plays its hover animation. This ensures consistent visual feedback regardless of input method.


Handling Button Events

Button Events nodes in the Script Canvas


Glossary

| Term | Meaning |
| --- | --- |
| GS_ButtonComponent | An enhanced button with GS_Motion-based hover and select animations |
| Hover Animation | A .uiam motion that plays when the button receives focus or mouse hover |
| Select Animation | A .uiam motion that plays when the button is activated |

For full definitions, see the Glossary.


See Also

For the full API, component properties, and C++ extension guide:

For related systems:


Get GS_UI

GS_UI — Explore this gem on the product page and add it to your project.

12.4 - UI Animation

How to work with GS_Play UI animations — motion-based LyShine property animation for pages, buttons, and standalone components.

UI Animation extends the GS_Motion system with 8 LyShine-specific animation tracks. Animations are authored as .uiam (UI Animation Motion) assets in the O3DE Asset Editor and can be used for page transitions, button hover effects, and standalone component animation.

For track type details, the domain extension pattern, and component internals, see the Framework API reference.

UI Animation Motion asset in the O3DE Asset Editor

 

Contents


Available Tracks

Each track animates a single LyShine property over time:

| Track | Target Component | What It Animates |
| --- | --- | --- |
| Position | Any LyShine element | Position offset from anchor |
| Scale | Any LyShine element | Element scale |
| Rotation | Any LyShine element | Element rotation |
| Element Alpha | Any LyShine element | Whole-element transparency |
| Image Alpha | UiImageComponent | Image-specific transparency |
| Image Color | UiImageComponent | Image color tint |
| Text Color | UiTextComponent | Text color |
| Text Size | UiTextComponent | Font size |

Multiple tracks can run simultaneously within a single motion — for example, a page show animation might fade in (Element Alpha) while sliding up (Position) and scaling from 90% to 100% (Scale).
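Conceptually, playing a motion means sampling every track at the same time value. A minimal sketch, assuming simple linear tracks (real .uiam tracks also carry keyframes and easing curves, omitted here; the function and property names are hypothetical):

```python
def lerp(a, b, t):
    """Linear interpolation between a and b at parameter t in [0, 1]."""
    return a + (b - a) * t

def evaluate_motion(tracks, time, duration):
    """Evaluate every track of a motion at `time`, in parallel.
    Each track is (property, start_value, end_value)."""
    t = min(max(time / duration, 0.0), 1.0)
    return {prop: lerp(a, b, t) for prop, a, b in tracks}
```

A page show animation combining fade, slide, and scale is then just three tracks sampled together each frame.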


Where Animations Are Used

| Context | How It’s Assigned |
| --- | --- |
| Page transitions | Assigned to onShow, onShowLoop, and onHide slots on GS_UIPageComponent. |
| Button states | Assigned to hover and select slots on GS_ButtonComponent. |
| Standalone playback | Assigned to UiAnimationMotionComponent on any entity. |

Authoring Animations

  1. Create a new .uiam asset in the O3DE Asset Editor.
  2. Add tracks for the properties you want to animate.
  3. Configure timing, easing curves, and property values for each track.
  4. Optionally set track identifiers for proxy targeting.
  5. Assign the asset to a component slot in the Inspector.

Proxy Targeting

UiAnimationMotion with proxy bindings configured in the Inspector

When tracks have identifiers (named labels), they appear in the motion’s proxy list. Proxies let you redirect a track to a different entity in the UI hierarchy — for example, animating a background panel separately from a content area within the same page transition.


Quick Reference

| Need | How |
| --- | --- |
| Animate a page transition | Assign .uiam assets to onShow/onHide slots on the page |
| Animate a button hover | Assign .uiam asset to the button’s hover slot |
| Play an animation on any entity | Add UiAnimationMotionComponent and assign a .uiam asset |
| Target a child element | Use proxy entries to redirect tracks |
| Loop an animation | Enable loop on the motion asset |

Glossary

| Term | Meaning |
| --- | --- |
| UI Animation Motion (.uiam) | A data asset containing LyShine-specific animation tracks for UI elements |
| UiAnimationMotionComponent | A standalone component for playing UI animations on any entity |
| Proxy Targeting | Redirecting a track to a different entity in the UI hierarchy via named labels |

For full definitions, see the Glossary.


See Also

For the full API, component properties, and C++ extension guide:

For related systems:


Get GS_UI

GS_UI — Explore this gem on the product page and add it to your project.

12.5 - Widgets

How to work with GS_Play UI widgets — load screens, pause menus, and other standalone UI components.

Widgets are standalone UI components that handle specific game scenarios outside the page navigation model. They activate and deactivate in response to game events rather than player navigation — a load screen appears during a stage transition, a pause menu overlays gameplay when the player pauses.

For component properties and the full API, see the Framework API reference.

 

Contents


Load Screen

GS_LoadScreenComponent shows a loading canvas during stage transitions and hides it when loading is complete. Place it on the Game Manager entity and configure a LyShine canvas to use as the loading screen.

GS_LoadScreenComponent in the O3DE Inspector

The component listens for StageManagerNotificationBus events and drives the canvas visibility automatically — no ScriptCanvas or manual triggers required.


Pause Menu

PauseMenuComponent handles the pause overlay. It listens for the configured pause input action and toggles a designated pause canvas on and off, broadcasting events so other systems know to yield control.


Glossary

| Term | Meaning |
| --- | --- |
| Widget | A standalone UI component that activates in response to game events rather than player navigation |
| Load Screen | A canvas displayed during stage transitions, managed by GS_LoadScreenComponent |
| Pause Menu | An overlay canvas displayed when the game is paused, managed by PauseMenuComponent |

For full definitions, see the Glossary.


See Also

For the full API, component properties, and C++ extension guide:

For related systems:


Get GS_UI

GS_UI — Explore this gem on the product page and add it to your project.

13 - GS_Unit

The character and entity control system — unit registration, player and AI controllers, input processing, and modular movement.

GS_Unit is the complete character control system for GS_Play. It handles how entities are registered as controllable units, which controller possesses each unit, how raw player input is captured and converted into intent, and how that intent is translated into physical movement through a modular stack of movers, grounders, and influence fields.

For architecture details, component properties, and extending the system in C++, see the GS_Unit API.


Quick Navigation

| I want to… | Feature | API |
| --- | --- | --- |
| Register units, track active controllers, or spawn units at runtime | Unit Manager | API |
| Possess and release units with player or AI controllers | Controllers | API |
| Capture raw input and convert it into structured movement intent | Input Data | API |
| Move characters with movers, grounders, and movement influence fields | Movement | API |

Installation

GS_Unit requires GS_Core. Add both gems to your project before using any Unit components.

For a full guided walkthrough, follow the Simple Project Setup guide.

 

Quick Installation Summary

  1. Enable the GS_Unit gem in your project configuration.
  2. Create a Unit Manager prefab and add it to the Game Manager’s Managers list.
  3. Configure physics collision layers for character movement.
  4. Place a Player Controller and Unit entity in your level.

Unit Manager

The Unit Manager is the singleton coordinator for everything unit-related. It handles registration of new units as they spawn, maintains an indexed list of all active units queryable by name or ID, and provides the spawn interface for runtime unit creation. It also participates in the standby system for clean level transitions.

Unit Manager API


Controllers

Controllers are the possession layer — they determine which entity a given intelligence (player or AI) currently owns and drives. Player and AI controllers share the same possession interface, so swapping control mid-game or having both active simultaneously requires no special-casing.

Controllers API


Input Data

The Input Data feature set converts hardware signals into structured intent your movement code can consume. The PlayerControllerInputReader reads raw input events, InputData components store the current state, and Input Reactors compute movement vectors from keyboard or joystick input with deadzones and scaling.

Input Data API


Units

Units are the controllable entities themselves. A unit translates controller intent into physical movement through a modular stack of movers, grounders, and movement influence fields.

Units API


See Also

For the full API, component properties, and C++ extension guide:

For step-by-step project setup:


Get GS_Unit

GS_Unit — Explore this gem on the product page and add it to your project.

13.1 - Unit Manager

How to work with the GS_Unit manager — spawning units, registering player controllers, and coordinating standby across all controlled characters.

The Unit Manager is the global coordinator for all units in a GS_Play project. It maintains a registry of every active unit entity, provides the interface for spawning new units at runtime, tracks which player controllers are currently active, and participates in the GS_Core standby system so that all controlled characters pause cleanly during level transitions.

For architecture details, component properties, and extending the system in C++, see the Framework API reference.

Unit Manager component in the O3DE Inspector

Unit Manager entity setup in the O3DE Editor

 

Contents


What the Unit Manager Does

The Unit Manager runs as a singleton that all other unit systems report to. When a unit entity activates, it registers itself so the manager can track it by name and ID. When gameplay code needs a new unit spawned, it asks the manager — not the spawning system directly — so the manager can assign a unique name and fire the appropriate notification.

| Responsibility | Description |
| --- | --- |
| Unit registry | Every active unit registers and deregisters automatically. |
| Spawning | Issues RequestSpawnNewUnit and fires ReturnNewUnit when the entity is ready. |
| Player controller tracking | Stores which controllers are registered as player-driven so multi-actor setups stay consistent. |
| Standby coordination | Broadcasts EnterStandby and ExitStandby to pause and resume all units during transitions. |
| Unit validation | CheckIsUnit lets any system confirm whether an entity is a registered unit. |

Spawning Units

How Spawning Works

Spawning a unit is a two-step async operation. Your script calls RequestSpawnNewUnit, the manager creates the entity using the prefab spawner, and when the unit is fully initialized the manager fires ReturnNewUnit on UnitManagerNotificationBus with the new entity ID and its assigned unique name.

You should always listen for ReturnNewUnit rather than attempting to use the entity immediately after the request, because spawning can take one or more frames.

| Step | What Happens |
| --- | --- |
| 1 — Request | Call RequestSpawnNewUnit with the prefab reference and desired spawn transform. |
| 2 — Wait | The manager spawns the entity and runs its initialization. |
| 3 — Receive | ReturnNewUnit fires on UnitManagerNotificationBus with the entity ID and name. |
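The request/notify handshake can be modelled roughly as below. This is an illustrative sketch, not the gem's implementation; the class and method names are Python stand-ins for the EBus calls described above.

```python
class UnitManager:
    """Sketch of the async spawn handshake: the request returns
    immediately, and listeners are notified once the unit exists."""
    def __init__(self):
        self._pending = []
        self._listeners = []        # stand-in for the notification bus
        self._next_id = 1

    def connect(self, on_return_new_unit):
        self._listeners.append(on_return_new_unit)

    def request_spawn_new_unit(self, prefab, transform):
        self._pending.append((prefab, transform))   # nothing usable yet

    def tick(self):
        """One or more frames later the entity finishes initializing
        and the ReturnNewUnit-style callback fires."""
        for prefab, _transform in self._pending:
            entity_id = self._next_id
            unique_name = f"{prefab}_{entity_id}"
            self._next_id += 1
            for listener in self._listeners:
                listener(entity_id, unique_name)
        self._pending.clear()
```

The key point the model preserves: the entity is not available on the frame you request it, which is why your logic must live in the notification handler.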

ScriptCanvas — Spawning a Unit


Registering Player Controllers

Player controllers must register themselves with the Unit Manager on activation so the game always knows which controllers are active. This is handled automatically by GS_PlayerControllerComponent, but if you are building a custom controller, call RegisterPlayerController in your activation logic.

[On Entity Activated (custom controller)]
    └─► [UnitManagerRequestBus → RegisterPlayerController(selfEntityId)]

This registration matters most for split-screen or multi-player setups where the manager needs to route input correctly across multiple simultaneous controllers.


Checking Unit Identity

Any system can confirm whether an arbitrary entity is a registered unit without holding a direct reference to the Unit Manager:

| ScriptCanvas Node | Returns | Notes |
| --- | --- | --- |
| CheckIsUnit(entityId) | bool | true if the entity is an active registered unit. |

ScriptCanvas

[UnitManagerRequestBus → CheckIsUnit(targetEntityId)]
    └─► bool
            ├─► true  — proceed with unit-specific logic
            └─► false — entity is not a unit, skip

Standby Mode

The Unit Manager participates in GS_Core’s global standby system. During level transitions or other blocking operations, the Game Manager signals standby and the Unit Manager relays that signal to all units. Units should pause movement, suspend input processing, and halt any tick-driven logic while in standby.

Listen on UnitManagerNotificationBus to coordinate your own logic with unit standby:

| Event | When It Fires | What to Do |
| --- | --- | --- |
| EnterStandby | Before a level transition or blocking operation begins. | Pause all movement, halt ticks, stop animations. |
| ExitStandby | After the blocking operation completes. | Resume movement, re-enable ticks. |
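A unit's side of the standby contract can be sketched as a simple gate on its tick logic. Illustrative only; the class and method names are hypothetical:

```python
class StandbyAwareUnit:
    """Sketch of standby handling: tick work is skipped while the unit
    is in standby, then resumes when the exit event arrives."""
    def __init__(self):
        self.in_standby = False
        self.ticks_processed = 0

    def enter_standby(self):
        self.in_standby = True      # pause movement, halt animations

    def exit_standby(self):
        self.in_standby = False     # resume normal processing

    def tick(self):
        if not self.in_standby:
            self.ticks_processed += 1
```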

ScriptCanvas


Quick Reference

| Need | Bus | Method / Event |
| --- | --- | --- |
| Spawn a new unit | UnitManagerRequestBus | RequestSpawnNewUnit(prefab, transform) |
| Know when a unit is ready | UnitManagerNotificationBus | ReturnNewUnit(entityId, uniqueName) |
| Register a player controller | UnitManagerRequestBus | RegisterPlayerController(entityId) |
| Check if entity is a unit | UnitManagerRequestBus | CheckIsUnit(entityId) |
| Know when standby begins | UnitManagerNotificationBus | EnterStandby |
| Know when standby ends | UnitManagerNotificationBus | ExitStandby |

Glossary

| Term | Meaning |
| --- | --- |
| Unit Manager | The singleton manager that tracks all active units, handles spawning, and coordinates standby |
| Unit Registry | The internal list of every active unit entity, indexed by name and ID |
| Standby | A pause state broadcast to all units during level transitions or blocking operations |

For full definitions, see the Glossary.


See Also

For the full API, component properties, and C++ extension guide:

For related systems:


Get GS_Unit

GS_Unit — Explore this gem on the product page and add it to your project.

13.2 - Controllers

How to work with GS_Unit controllers — the possession model, player and AI controller setup, and switching control at runtime.

Controllers are the possession layer of GS_Unit. A controller is the intelligence — human input or AI logic — that owns and drives a unit at any given moment. Because both player and AI controllers share the same possession interface, your unit entities do not need to know what is controlling them, and swapping control at runtime requires no special-casing.

For architecture details, component properties, and extending the system in C++, see the Framework API reference.

Player Controller and Input Reader components in the O3DE Inspector

 

Contents


The Possession Model

Unit Possession Pattern Graph

Breakdown

Every unit has exactly one controller at a time, or no controller at all. Possession is established by calling Possess on the unit and released by calling DePossess. The unit fires UnitPossessed on UnitNotificationBus whenever ownership changes so other systems can react.

| Concept | Description |
| --- | --- |
| Possession | A controller attaches to a unit. The unit accepts input and movement commands from that controller only. |
| DePossession | The controller releases the unit. The unit halts input processing and enters a neutral state. |
| UnitPossessed event | Fires on UnitNotificationBus (addressed by entity ID) whenever a unit’s controller changes. |
| GetController | Returns the entity ID of the current controller, or an invalid ID if none. |
| GetUniqueName | Returns the string name assigned by the Unit Manager when this unit was spawned. |
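A minimal Python model of the possession contract (illustrative only; the callback list stands in for UnitNotificationBus, and `None` plays the role of an invalid entity ID):

```python
class Unit:
    """Sketch of the possession model: one controller at a time, with
    a notification whenever ownership changes."""
    def __init__(self, unique_name):
        self.unique_name = unique_name
        self._controller = None
        self.on_possessed = []      # stand-in for the notification bus

    def possess(self, controller_id):
        self._controller = controller_id
        for cb in self.on_possessed:
            cb(controller_id)       # mirrors the UnitPossessed event

    def depossess(self):
        self._controller = None     # unit goes neutral, stops taking input
        for cb in self.on_possessed:
            cb(None)

    def get_controller(self):
        return self._controller
```

Because player and AI controllers use the same attach/detach calls, the unit never needs to know what kind of intelligence owns it.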

Controller Types

GS_Unit ships three controller components that cover the full range of typical game use:

| Component | Purpose |
| --- | --- |
| GS_UnitControllerComponent | Base class establishing the possession contract. Extend this for custom controllers. |
| GS_PlayerControllerComponent | Human-driven controller. Connects into the Input Data pipeline and routes player intent to the possessed unit. |
| GS_AIControllerComponent | NPC controller. Provides the entry point for AI logic to issue movement commands through the same unit interface as player input. |

Both GS_PlayerControllerComponent and GS_AIControllerComponent extend the base controller, meaning any code that works with the base controller interface works transparently with either type.


Player Controller

GS_PlayerControllerComponent registers itself with the Unit Manager on activation, making it visible to the game’s controller tracking system. It connects to the Input Data pipeline on the possessed unit so that raw input events are routed correctly as soon as possession is established.

Setup

  1. Add GS_PlayerControllerComponent to a controller entity (not the unit entity itself).
  2. Ensure GS_InputDataComponent and GS_PlayerControllerInputReaderComponent are on the unit entity.
  3. Call Possess to attach the controller to the target unit.

ScriptCanvas — Possessing a Unit


AI Controller

GS_AIControllerComponent gives AI logic the same possession interface as the player controller. An AI behavior system possesses a unit, then issues movement commands through UnitRequestBus or directly drives the unit’s Input Data component to simulate input. The unit processes those commands through the same mover stack it would use for player input.

ScriptCanvas — Handing a Unit to AI

// Called from your behavior tree or AI activation event when the NPC starts acting
[AI behavior activates]
    └─► [UnitRequestBus(unitEntityId) → Possess(aiControllerEntityId)]

[UnitNotificationBus(unitEntityId) → UnitPossessed(aiControllerEntityId)]
    └─► [AI logic begins issuing movement / action commands]

Switching Controllers at Runtime

Because possession is a simple attach/detach operation on the unit, switching from player to AI control (or back) is two calls:

[Trigger: cutscene starts, character incapacitated, etc.]
    └─► [UnitRequestBus(unitEntityId) → DePossess]
            └─► [UnitRequestBus(unitEntityId) → Possess(aiControllerEntityId)]

[Trigger: cutscene ends, character recovers, etc.]
    └─► [UnitRequestBus(unitEntityId) → DePossess]
            └─► [UnitRequestBus(unitEntityId) → Possess(playerControllerEntityId)]

The DePossess call ensures the previous controller’s input pipeline is disconnected before the new controller attaches, preventing stale input state from leaking between ownership changes.


Querying Controller State

| ScriptCanvas Node | Returns | Notes |
| --- | --- | --- |
| GetController (on unit) | EntityId | Current controller entity. Invalid ID means no controller. |
| GetUniqueName (on unit) | string | The name assigned to this unit at spawn time. |

ScriptCanvas
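The query nodes above can be wired into a simple ownership check (sketch only; the entity ID is a placeholder):

```
[On Tick / when needed]
    └─► [UnitRequestBus(unitEntityId) → GetController]
            └─► [Is EntityId valid?]
                    ├─ true  ─► unit is currently possessed
                    └─ false ─► no controller attached
```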


Standby and Controllers

When the Unit Manager broadcasts EnterStandby, both player and AI controllers should stop issuing commands. Player controllers automatically halt input reading. AI controllers must be suspended by whatever behavior system drives them. On ExitStandby, normal command flow resumes.

| Event | Controller Action |
| --- | --- |
| UnitEnteringStandby | Halt input reads, suspend AI commands. |
| UnitExitingStandby | Resume input reads, restart AI commands. |

Both events fire on UnitNotificationBus addressed by unit entity ID.
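A controller-side sketch of the standby handling above (the suspend/resume steps stand in for whatever system drives your AI):

```
[UnitNotificationBus(unitEntityId) → UnitEnteringStandby]
    └─► [Halt input reads / suspend AI command issuing]

[UnitNotificationBus(unitEntityId) → UnitExitingStandby]
    └─► [Resume input reads / restart AI commands]
```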


Quick Reference

| Need | Bus | Method / Event |
| --- | --- | --- |
| Attach a controller to a unit | UnitRequestBus (by ID) | Possess(controllerEntityId) |
| Release a controller from a unit | UnitRequestBus (by ID) | DePossess |
| Know when possession changes | UnitNotificationBus (by ID) | UnitPossessed(controllerEntityId) |
| Get the current controller | UnitRequestBus (by ID) | GetController |
| Get a unit’s name | UnitRequestBus (by ID) | GetUniqueName |
| Know when unit enters standby | UnitNotificationBus (by ID) | UnitEnteringStandby |
| Know when unit exits standby | UnitNotificationBus (by ID) | UnitExitingStandby |

Glossary

| Term | Meaning |
| --- | --- |
| Possession | The act of a controller attaching to a unit and becoming its active intelligence |
| DePossession | Releasing a controller from a unit, halting its input processing |
| Player Controller | A human-driven controller that routes input from the Input Data pipeline to a unit |
| AI Controller | An NPC controller that issues movement commands through the same unit interface as player input |

For full definitions, see the Glossary.


See Also

For the full API, component properties, and C++ extension guide:

For related systems:


Get GS_Unit

GS_Unit — Explore this gem on the product page and add it to your project.

13.3 - Input Data

How to work with the GS_Unit input pipeline — reading player input, converting keyboard and joystick signals into movement vectors, and chaining input reactors.

The Input Data system is the pipeline that converts raw hardware signals into structured intent your movement code can consume. It sits between the active input profile and the mover stack, so controllers, movement components, and action systems all read from a single consistent source rather than polling hardware directly. Swapping input profiles at runtime — for remapping, controller switching, or accessibility options — requires no change to any downstream component.

For architecture details, component properties, and extending the system in C++, see the Framework API reference.

Input Reactor component in the O3DE Inspector

 

Contents


Pipeline Overview

Unit Input Handling Pattern Graph

Breakdown

The input pipeline has three stages. Each stage is a separate component on the unit entity, and they run in order every frame:

| Stage | Component | What It Does |
| --- | --- | --- |
| 1 — Read | GS_PlayerControllerInputReaderComponent | Reads raw input events from the active input profile and writes them into GS_InputDataComponent. |
| 2 — Store | GS_InputDataComponent | Holds the current frame’s input state — button presses, axis values — as structured data. |
| 3 — React | Reactor components | Read from GS_InputDataComponent and produce intent: movement vectors, action triggers, etc. |

All reactor components downstream of the store stage read from the same GS_InputDataComponent, so there is no duplicated hardware polling and no risk of two reactors seeing different input states for the same frame.


Input Data Component

GS_InputDataComponent is the shared state store. It does not read hardware — it only holds the values written by the Reader and consumed by Reactors.

Add it to any unit entity that needs to process input. A unit should have exactly one GS_InputDataComponent. All input reactors on the same entity read from it automatically.


Input Reader

GS_PlayerControllerInputReaderComponent connects the unit to the game’s active input profile. It receives input events from the profile and writes the resulting state into the unit’s GS_InputDataComponent each frame.

When the player controller possesses the unit, the reader activates. When the unit is depossessed or control is handed to an AI controller, the reader goes silent and the Input Data component retains its last-written state until overwritten.

Setup

Add GS_PlayerControllerInputReaderComponent to the unit entity alongside GS_InputDataComponent. No additional configuration is required — it picks up the active input profile automatically.

Script Canvas - Enabling and Disabling Input Groups

Input Groups handling nodes in the O3DE Script Canvas
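A typical use of the group nodes shown above is swapping between gameplay and UI input (node labels here are approximate; see the Framework API for the exact signatures):

```
[Trigger: menu opens]
    └─► [Disable the gameplay input group]
            └─► [Enable the UI input group]

[Trigger: menu closes]
    └─► [Disable the UI input group]
            └─► [Enable the gameplay input group]
```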


Input Reactors

Reactors are the components that transform stored input state into useful intent values. Each reactor handles one concern:

| Component | Input Source | Output |
| --- | --- | --- |
| GS_InputReactorComponent | Base class — reads from GS_InputDataComponent. | Override to produce custom intent. |
| GS_InputAxisReactorComponent | Axis values from GS_InputDataComponent. | Processed axis output (scaled, deadzoned). |
| KeyboardMovement_InputReactorComponent | Keyboard directional buttons (WASD / arrow keys). | Normalized 3D movement vector. |
| JoyAxisMovement_AxisReactorComponent | Joystick axis pair from GS_InputDataComponent. | Scaled, deadzoned 3D movement vector. |

Add as many reactors as your unit needs. A typical ground character has both KeyboardMovement_InputReactorComponent and JoyAxisMovement_AxisReactorComponent active so it responds to either keyboard or gamepad input without requiring separate prefabs.


Keyboard Movement Reactor

KeyboardMovement_InputReactorComponent reads the four directional button states from GS_InputDataComponent and produces a normalized movement vector. The vector direction is relative to the unit’s forward axis, so rotating the unit automatically adjusts the movement direction without any additional calculation.

The reactor writes the resulting vector into a movement data field that the active mover reads each frame.

ScriptCanvas — Reading the Movement Vector

You generally do not need to read the movement vector directly — the mover components consume it automatically. However, if you need to inspect or override it:

// Use only if you need to inspect or override the movement vector — movers consume it automatically
[On Tick]
    └─► [GS_InputDataComponent → GetMovementVector]
            └─► Vector3 — current frame's movement intent

Joystick Axis Reactor

JoyAxisMovement_AxisReactorComponent reads a pair of axis values (horizontal and vertical) from GS_InputDataComponent and applies deadzone filtering and magnitude scaling before writing the resulting vector. This prevents stick drift at rest and allows variable-speed movement when the stick is partially deflected.

| Setting | Effect |
| --- | --- |
| Deadzone | Axis values below this threshold are treated as zero. |
| Scale | Multiplier applied after deadzone removal. |
| Axis mapping | Which input profile axis pair to read (left stick, right stick, etc.). |
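The two numeric settings combine into a simple filter. The following Python sketch illustrates the behaviour the table describes (illustrative only; the reactor's exact filtering math is internal to GS_Unit):

```python
import math

def filter_joy_axis(x, y, deadzone, scale):
    """Deadzone then scale, per the Deadzone and Scale settings above.

    Illustrative sketch only; the component's actual math is internal.
    """
    if math.hypot(x, y) < deadzone:
        return (0.0, 0.0)          # stick at rest: suppress drift
    return (x * scale, y * scale)  # partial deflection gives variable speed
```

A stick resting near center produces a zero vector, while a half-deflected stick yields a proportionally scaled movement vector.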

Adding Custom Reactors

To handle input not covered by the built-in reactors — jumping, sprinting, interaction, ability activation — extend GS_InputReactorComponent or GS_InputAxisReactorComponent in C++. Your reactor reads from GS_InputDataComponent the same way the built-in ones do, so it automatically benefits from any input profile that writes to those fields.

For the C++ extension guide, see the Framework API reference.


Quick Reference

| Need | Component | Notes |
| --- | --- | --- |
| Store input state on a unit | GS_InputDataComponent | Required on every unit that processes input. |
| Read from active input profile | GS_PlayerControllerInputReaderComponent | Activates automatically when player controller possesses. |
| Keyboard → movement vector | KeyboardMovement_InputReactorComponent | Reads WASD / arrow keys from input data. |
| Joystick → movement vector | JoyAxisMovement_AxisReactorComponent | Reads axis pair, applies deadzone and scale. |
| Custom button or axis handler | Extend GS_InputReactorComponent (C++) | Reads from GS_InputDataComponent same as built-ins. |
| Inspect movement vector | GS_InputDataComponent → GetMovementVector | Vector3, current frame’s movement intent. |

Glossary

| Term | Meaning |
| --- | --- |
| Input Data Component | The shared state store that holds the current frame’s input values for a unit |
| Input Reader | The component that reads raw input events from the active input profile and writes them into the Input Data Component |
| Input Reactor | A component that reads from the Input Data Component and produces intent values (movement vectors, action triggers) |
| Movement Vector | A normalized 3D direction produced by input reactors and consumed by movers |

For full definitions, see the Glossary.


See Also

For the full API, component properties, and C++ extension guide:

For related systems:



13.4 - Units

What makes an entity a unit — the GS_UnitComponent, entity configuration, collision setup, possession, standby, and movement.

A “unit” in GS_Play is any entity that can be possessed by a controller and driven through gameplay. The GS_UnitComponent is the marker that transforms an ordinary entity into a unit — it provides the possession interface, unique naming, standby awareness, and automatic registration with the Unit Manager.

Units are the characters, vehicles, creatures, or any other controllable actors in your game. They do not contain decision-making logic themselves — that comes from the controller that possesses them. A unit provides the body: movement, collision, and visuals. The controller provides the brain: player input or AI logic.

For architecture details, component properties, and extending the system in C++, see the Framework API reference.

Unit component in the O3DE Inspector

 

Contents


How Units Work

Registration

When a GS_UnitComponent activates, it registers itself with the Unit Manager. The manager tracks all active units and responds to CheckIsUnit queries. When the component deactivates, it unregisters automatically.

Possession

Units are possessed by controllers through the UnitRequestBus:

  1. A controller calls Possess(controllerEntityId) on the unit.
  2. The unit stores the controller reference and broadcasts UnitPossessed on UnitNotificationBus.
  3. The controller can now drive the unit’s subsystems (movement, actions, etc.).
  4. Calling DePossess() clears the controller reference and releases the unit.
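The possession lifecycle above, as a node sketch (entity IDs are placeholders):

```
[Controller logic]
    └─► [UnitRequestBus(unitEntityId) → Possess(controllerEntityId)]

[UnitNotificationBus(unitEntityId) → UnitPossessed(controllerEntityId)]
    └─► [Controller drives movement, actions, etc.]

[Controller logic, on release]
    └─► [UnitRequestBus(unitEntityId) → DePossess]
```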

Standby

Units participate in the standby system for clean level transitions. When the Unit Manager signals standby, the unit broadcasts UnitEnteringStandby on its notification bus. Child components — movers, input reactors, and others — listen for this signal and pause their processing. UnitExitingStandby reverses the process.


Entity Configuration

A minimal unit entity requires:

  1. GS_UnitComponent — Registers the entity as a unit and provides the possession interface.
  2. Movement components — At least a mover and optionally a grounder for ground detection.
  3. PhysX collider — For physics interaction and ground detection raycasts.

A fully featured unit entity typically includes:

| Component | Role |
| --- | --- |
| GS_UnitComponent | Marks the entity as a unit; handles possession and standby. |
| GS_MoverContextComponent | Coordinates movers and grounders; holds the Movement Profile. |
| A mover (e.g. GS_3DSlideMoverComponent) | Defines how the unit moves each frame. |
| A grounder (e.g. GS_PhysicsRayGrounderComponent) | Detects surface contact and reports ground state. |
| PhysX Rigid Body + Collider | Physics presence in the world. |
| Mesh or Actor component | Visual representation. |

 

Entity Arrangement

Unit entity setup in the O3DE Editor



Collision Setup

Units require properly configured PhysX collision layers to interact with the environment and other units. If you have not set up collision layers yet, refer to the Simple Project Setup guide.

Typical collision layer assignments:

| Layer | Purpose |
| --- | --- |
| Unit layer | The unit’s own collider. Collides with environment geometry and other units. |
| Ground detection layer | Used by grounder raycasts. Collides with terrain and walkable surfaces only. |

Movement

The Movement feature set is a composable stack where each component handles one concern of character locomotion. Movers define core behavior (free, strafe, grid, slide), the MoverContext manages movement state and selects the active mover, Grounders report ground state, and Influence components apply external forces like wind or currents.

Movement API


Quick Reference

| Need | Bus | Method |
| --- | --- | --- |
| Assign a controller to this unit | UnitRequestBus (ById) | Possess(controllerEntityId) |
| Release the current controller | UnitRequestBus (ById) | DePossess() |
| Get the possessing controller | UnitRequestBus (ById) | GetController() |
| Get the unit’s unique name | UnitRequestBus (ById) | GetUniqueName() |

 

| Event | Bus | Fired When |
| --- | --- | --- |
| UnitPossessed | UnitNotificationBus | A controller takes possession of this unit. |
| UnitEnteringStandby | UnitNotificationBus | The unit is entering standby (level transition). |
| UnitExitingStandby | UnitNotificationBus | The unit is exiting standby and resuming. |

Glossary

| Term | Meaning |
| --- | --- |
| Unit | Any entity with a GS_UnitComponent — the controllable body in gameplay |
| Controller | The intelligence (player or AI) that possesses and drives a unit |
| Possession | The act of a controller claiming ownership of a unit via Possess() |
| Standby | A paused state units enter during level transitions, coordinated by the Unit Manager |
| Unique Name | An identifier generated at activation, used to look up a unit by name |

For full definitions, see the Glossary.


See Also

For the full API, component properties, and C++ extension guide:

For related systems:



13.4.1 - Mover Context

Movement state coordinator for GS_Unit — manages movement modes, context states, input axis transformation, and movement profile priority.

GS_MoverContextComponent is the coordinator that sits above the movers and owns the movement state for a unit. It does not move the character itself — it decides which mover runs, transforms input into the correct movement vector, and exposes state to every other system that needs to know what the character is doing.

Add exactly one GS_MoverContextComponent to any unit that uses movers.

For architecture details, component properties, and extending the system in C++, see the Framework API: Mover Context.

Mover Context component in the O3DE Inspector

 

Contents


Movement Modes

The Mover Context uses three independent mode tracks — movement, rotation, and grounding — each holding a named string. Only the mover or grounder whose assigned mode name matches the active mode on that track will run. All others stay dormant.

| Track | Controls | Example modes |
| --- | --- | --- |
| Movement | Which mover applies velocity each frame | "Free", "Slide" |
| Rotation | Which component rotates the character | "Free", "Locked" |
| Grounding | Which grounder runs surface detection | "Free", "Disabled" |

The three tracks are independent — changing the movement mode does not reset rotation or grounding.

Reverting Modes

Every mode change stores the previous value. Calling RevertToLastMovementMode, RevertToLastRotationMode, or RevertToLastGroundingMode restores it. This is how the Slide mover hands control back to Free movement after a slope is cleared — without needing to know what mode was active before.
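A sketch of a temporary mode override using the revert call (the mode name is illustrative; modes are just named strings):

```
[Trigger: steep slope detected]
    └─► [MoverContextRequestBus(unitEntityId) → ChangeMovementMode("Slide")]

[Trigger: slope cleared]
    └─► [MoverContextRequestBus(unitEntityId) → RevertToLastMovementMode]
```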


Context States

Context states are named key/value pairs stored on the Mover Context. Any component can write or read them via MoverContextRequestBus. They provide a lightweight shared blackboard for movement systems to coordinate without direct dependencies.

| Built-in State | Value | Meaning |
| --- | --- | --- |
| "grounding" | 0 — Falling | No surface contact. |
| | 1 — Grounded | Stable surface contact. |
| | 2 — Sliding | Surface too steep to stand. Slide mover will activate. |
| "StopMovement" | 1 — Active | Halts all mover velocity processing for this frame. |

The grounder writes "grounding" each frame. The Slide mover writes it back to 0 or 1 when the slope clears. Your own companion components can write custom states in the same way.
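A sketch of writing the built-in "StopMovement" state from a companion script (clearing the flag by writing 0 is an assumption; check the Framework API for the exact release semantics):

```
[Trigger: dialogue starts]
    └─► [MoverContextRequestBus(unitEntityId) → SetMoverState("StopMovement", 1)]

[Trigger: dialogue ends]
    └─► [MoverContextRequestBus(unitEntityId) → SetMoverState("StopMovement", 0)]
```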


Input Axis Transformation

The Mover Context processes raw input through two transformations before handing it to the active mover.

| Step | Method | What It Does |
| --- | --- | --- |
| Raw input | SetMoveInputAxis | Written by the input reactor. Screen-space direction. |
| 1 — Camera-relative | ModifyInputAxis() | Rotates the axis to align with the active camera’s facing. |
| 2 — Ground-projected | GroundInputAxis() | Projects onto the contact surface to prevent skating above or below terrain. |

Movers read the ground-projected axis. Both intermediate values are available via bus if a component needs an earlier stage.
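The ground-projection step corresponds to the standard plane-projection formula v - (v·n)n. A minimal Python sketch of that math follows (illustrative only; the component's actual implementation is internal to GS_Unit):

```python
def ground_input_axis(axis, ground_normal):
    """Project the movement axis onto the contact surface: v - (v . n) * n.

    Illustrative sketch of the ground-projection step; the component's
    actual implementation is internal.
    """
    dot = sum(a * n for a, n in zip(axis, ground_normal))
    # Subtract the component along the surface normal so the result
    # lies in the ground plane (no skating above or below terrain).
    return tuple(a - dot * n for a, n in zip(axis, ground_normal))
```

On flat ground (normal (0, 0, 1)) the axis passes through unchanged; on a slope, the vector tilts to follow the surface.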


Movement Profile Priority

The Mover Context resolves which GS_UnitMovementProfile governs the unit’s locomotion parameters at runtime using a priority stack:

  1. Influence fields — Any MovementInfluenceFieldComponent or GlobalMovementInfluenceComponent that has added a profile to the unit. Highest priority. Multiple influences are stacked additively.
  2. Global profile — The scene-level default from GlobalMovementRequestBus. Applied when no influence overrides are active.
  3. Default profile — The profile assigned directly to the GS_MoverContextComponent. Used as the final fallback.

To change a unit’s movement feel in a specific area, place a MovementInfluenceFieldComponent with a volume shape and assign an alternate profile to it. The unit’s own component property does not need to change.
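A sketch of pushing and removing a profile override at runtime (the profile reference is a placeholder):

```
[Unit enters influence volume]
    └─► [MoverContextRequestBus(unitEntityId) → AddMovementProfile(influencerEntityId, overrideProfile)]

[Unit leaves influence volume]
    └─► [MoverContextRequestBus(unitEntityId) → RemoveMovementProfile(influencerEntityId)]
```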


Script Canvas Examples

Changing movement mode:

Script Canvas nodes for changing movement mode

Reverting to the previous movement mode:

Script Canvas nodes for reverting movement mode

Setting a context state flag:

Script Canvas nodes for setting context state


Quick Reference

| Need | Bus | Method |
| --- | --- | --- |
| Change active movement mode | MoverContextRequestBus | ChangeMovementMode(modeName) |
| Restore previous movement mode | MoverContextRequestBus | RevertToLastMovementMode() |
| Read the ground-projected input | MoverContextRequestBus | GetGroundMoveInputAxis() |
| Write a context state | MoverContextRequestBus | SetMoverState(name, value) |
| Read a context state | MoverContextRequestBus | GetMoverState(name) |
| Push a movement profile override | MoverContextRequestBus | AddMovementProfile(influencer, profile) |
| Remove a profile override | MoverContextRequestBus | RemoveMovementProfile(influencer) |
| Listen for mode changes | MoverContextNotificationBus | MovementModeChanged, GroundingModeChanged |
| Listen for state changes | MoverContextNotificationBus | MoverStateChanged |

Glossary

| Term | Meaning |
| --- | --- |
| Mover Context | The coordinator component that owns movement state, selects the active mover, and transforms input each frame |
| Movement Mode | A named string on a mode track that determines which mover or grounder is active |
| Context State | A named key/value pair on the Mover Context used as a shared blackboard between movement components |
| Input Axis | The movement direction written by input reactors, transformed through camera-relative and ground-projected stages |
| Movement Profile | A data asset storing locomotion parameters (speed, acceleration, jump force) for a unit |

For full definitions, see the Glossary.


See Also

For the full API, component properties, and C++ extension guide:

For related systems:



13.4.2 - Movement

How to work with GS_Unit movement — mover types, mover context, grounders, movement profiles, and influence fields.

The Movement system is a composable stack of components that each handle one concern of character locomotion. Movers define the core behavior, grounders report surface contact, the Mover Context manages which mover is active and what movement state the character is in, influence fields apply external forces, and Movement Profiles store the parameters that govern all of it. You assemble what you need — most characters use two or three components — and replace individual pieces without touching anything else.

For architecture details, component properties, and extending the system in C++, see the Framework API reference.

 

Contents


How it Works

Movers & Grounders Pattern Graph

Breakdown

Movers and Grounders are multiple components on a Unit whose activation changes constantly based on the unit’s state. When a mover is active, it freely processes the unit’s movement according to its functionality. At any moment, through internal or external forces, the Movement, Rotation, or Grounding state can change, which deactivates the old mover and activates the new one to continue controlling the Unit.

Input
    ↓  InputDataNotificationBus::InputStateChanged
Input Reactor Components
    ↓  MoverContextRequestBus::SetMoveInputAxis
GS_MoverContextComponent
    ↓  ModifyInputAxis() → camera-relative
    ↓  GroundInputAxis() → ground-projected
    ↓  MoverContextNotificationBus::MovementModeChanged("Free")
GS_3DFreeMoverComponent  [active when mode == "Free"]
    ↓  AccelerationSpringDamper → rigidBody->SetLinearVelocity
GS_PhysicsRayGrounderComponent  [active when grounding mode == "Free"]
    ↓  MoverContextRequestBus::SetGroundNormal / SetContextState("grounding", ...)
    ↓  MoverContextRequestBus::ChangeMovementMode("Slide")  ← when slope too steep
GS_3DSlideMoverComponent  [active when mode == "Slide"]

The Slide mover activates when the Unit walks onto a slope that is too steep. It takes control of the unit, slides it down the hill, then restores the previous movement behaviour.


Mover Types

Mover component in the O3DE Inspector

Each mover defines one locomotion behavior. A unit has one active mover at a time, selected by the Mover Context. All movers read from the unit’s movement vector (produced by the input pipeline) and translate it into entity movement each frame.

| Component | Best For | How It Moves |
| --- | --- | --- |
| GS_3DFreeMoverComponent | Flying, swimming, zero-gravity | Unconstrained motion in all three axes. No ground contact required. |
| GS_3DSlideMoverComponent | Ground-bound characters on uneven terrain | Slides the character along the contact surface, maintaining ground contact smoothly. |
| GS_PhysicsMoverComponent | Characters needing full collision response | Drives an O3DE physics rigid body. Forces and impulses apply normally. |

Choosing a Mover

  • Use GS_3DFreeMoverComponent for any character that needs to move through air or water without ground contact.
  • Use GS_3DSlideMoverComponent for most ground-bound characters. It handles slopes and steps without requiring physics simulation.
  • Use GS_PhysicsMoverComponent when the character must interact physically with the world — pushing objects, being knocked back by forces, or responding to physics joints.

Mover Context

GS_MoverContextComponent sits above the movers and coordinates everything: it selects which mover runs each frame, owns the movement mode tracks, transforms raw input into camera-relative and ground-projected vectors, and manages the movement profile priority stack.

Add exactly one to any unit that uses movers.

Mover Context →


Grounders

Grounder component in the O3DE Inspector

Grounders detect whether the character has contact with a surface and report that information to the Mover Context. Without a grounder, the Context cannot distinguish ground-bound from airborne states.

| Component | Method | Notes |
| --- | --- | --- |
| GS_GrounderComponent | Base class — extend for custom detection logic. | Not used directly. |
| GS_PhysicsRayGrounderComponent | Fires a downward raycast each frame. | Suitable for most characters. Configurable ray length and collision layer. |

Add one grounder to a unit that uses GS_3DSlideMoverComponent or any mover that needs ground awareness. GS_3DFreeMoverComponent does not require a grounder because it never needs ground contact.


Movement Profiles

Movement Profile asset in the O3DE Asset Editor

GS_UnitMovementProfile is a data asset that stores all locomotion parameters for a unit configuration: walk speed, run speed, acceleration, deceleration, air control, jump force, and any mover-specific values. The Mover Context component holds a reference to a profile asset, so multiple prefabs can share the same profile or each have their own without duplicating inline values.

Creating a Profile

  1. In the O3DE Asset Browser, right-click and select Create Asset → GS_UnitMovementProfile.
  2. Configure the parameters in the asset editor.
  3. Assign the asset to the Movement Profile slot on GS_MoverContextComponent.

Changing a profile asset at runtime allows you to alter all locomotion parameters at once — for example, switching a character between normal and encumbered movement profiles without touching individual component properties.

Reading Movement Profile Data - ScriptCanvas
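A sketch of the read flow (node names here are approximate; the exact getter nodes are listed in the Framework API):

```
[On Graph Start]
    └─► [GS_MoverContextComponent → get active Movement Profile]
            └─► [Read parameters (walk speed, acceleration, ...)]
                    └─► [Drive UI, debug display, or tuning logic]
```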


Influence Fields

Movement Influence component in the O3DE Inspector

Influence components dynamically apply priority values that change a unit’s dominant movement profile. Priorities are additive — a unit accumulates priority from all active sources and evaluates the result to determine the acting movement profile.

| Component | Scope | Typical Use |
| --- | --- | --- |
| GlobalMovementInfluenceComponent | All units globally | Standard movement for the region. |
| MovementInfluenceFieldComponent | Units inside a spatial volume | Open fields, narrow channels, indoor and outdoor transitions. |

GlobalMovementInfluenceComponent is placed once in the level, usually on the Stage Data entity. MovementInfluenceFieldComponent uses a shape component to define its volume — place one wherever you need to change movement priority.


Assembly Guide

A typical ground-bound player character uses this component combination:

| Component | Role |
| --- | --- |
| GS_InputDataComponent | Holds input state — movement vector written here each frame. |
| GS_PlayerControllerInputReaderComponent | Reads input profile and fills GS_InputDataComponent. |
| KeyboardMovement_InputReactorComponent | Converts keyboard input to movement vector. |
| JoyAxisMovement_AxisReactorComponent | Converts joystick input to movement vector. |
| GS_PhysicsRayGrounderComponent | Detects ground contact below the character. |
| GS_MoverContextComponent | Coordinates grounders and movers, holds the Movement Profile. |
| GS_3DSlideMoverComponent | Moves the character along the contact surface. |

A flying or swimming character replaces the grounder and slide mover with GS_3DFreeMoverComponent and removes the grounder entirely.


Quick Reference

| Need | Component | Notes |
| --- | --- | --- |
| Free 3D movement (fly / swim) | GS_3DFreeMoverComponent | No ground contact required. |
| Surface-hugging ground movement | GS_3DSlideMoverComponent | Smooth contact on slopes and steps. |
| Physics-simulated movement | GS_PhysicsMoverComponent | Full rigid body collision response. |
| Coordinate movers and state | GS_MoverContextComponent | Required on any unit with movers. |
| Raycast ground detection | GS_PhysicsRayGrounderComponent | Used with slide and physics movers. |
| Shared locomotion parameters | GS_UnitMovementProfile (asset) | Referenced by GS_MoverContextComponent. |
| Global force on all units | GlobalMovementInfluenceComponent | Standard movement values. |
| Localized force in a volume | MovementInfluenceFieldComponent | Narrow paths, rough terrain, indoor to outdoor transitions. |

Glossary

| Term | Meaning |
| --- | --- |
| Mover | A component that defines one locomotion behavior (free flight, surface sliding, physics-driven) |
| Mover Context | The coordinator that tracks movement state and selects the active mover each frame |
| Grounder | A component that detects surface contact and reports ground state to the Mover Context |
| Movement Profile | A data asset storing all locomotion parameters (speed, acceleration, jump force) for a unit |
| Influence Field | A component that applies additive priority to a unit’s active movement profile |

For full definitions, see the Glossary.


See Also

For the full API, component properties, and C++ extension guide:

For related systems:

