GS_Play Gameplay Framework for O3DE

GS_Play is a modular gameplay framework for O3DE — a full production-ready set of feature gems you can enable individually to build exactly the game you need. Each gem is independently togglable, so you bring in only what’s relevant to your project.

New to GS_Play? Start with Get Started for installation steps, GS_Core basics, and picking your first toolsets.

Intermediate developers and designers can jump straight to The Basics — editor setup, Script Canvas nodes, and EBus events for each feature set, no architecture knowledge required.

Need deep technical detail? The Framework API covers full component references, EBus interfaces, extension patterns, and class internals.

Want to follow along? Learn has video tutorials and step-by-step lessons for targeted topics and full project walkthroughs.

If you find mistakes, gaps, or anything unclear in the documentation, don’t hesitate to reach out.


Contact & Support

  • Live Support: real-time help in the #support channel (Join Discord).
  • Email: gs_play@genomestudios.ca (we typically respond within 1-2 business days).
  • Genome Studios: studio website and other projects at genomestudios.ca

 



Sections

1 - Get Started

Start here — orientation, project setup, and best practices for working with GS_Play.

GS_Play is an intermediate-to-advanced gameplay framework for O3DE. It is a modular, production-ready set of feature gems you enable individually — bring in only what your project needs. Every feature set is built on simple, consistent patterns so you can reason about what the system can do and extend it precisely, without having to understand every internal detail first.

This section covers the essentials: what GS_Play is designed for, how to get a project running, and the conventions that will help you work with the framework effectively.


How This Section Is Organized

Understanding GS_Play — What GS_Play is, what genres and use cases it targets, and how to think about its modularity and pattern-first design. A good first read if you are new to the framework or evaluating whether it suits your project.

Project Quick Start — A step-by-step lesson covering the minimum setup for a working GS_Play project: gem installation, manager configuration, camera setup, and startup sequence. Follow this to get from a blank O3DE project to a running game loop.

Best Practices — Conventions and instincts worth developing early. Covers prefab wrapper structure, how to orient yourself using the Patterns index, and other framework-level habits that prevent common friction points.


Sections

1.1 - Understanding GS_Play

What is GS_Play and what does it do?

1.2 - Project Quick Start

A minimal setup to get started.

1.3 - Best Practices

Conventions and habits that make working with GS_Play faster and less frustrating.

A short list of habits worth building early. None of these are gatekept by the framework, but skipping them tends to create friction that is annoying to untangle later.


Follow the Feature Patterns

Every GS_Play feature set has a defined pattern — a simple, consistent way to think about how its pieces connect. Before diving into a new feature, spend a few minutes with its pattern diagram. Once the pattern clicks, the rest of the documentation becomes much easier to navigate and the editor setup becomes obvious.

The Patterns Index has diagrams and breakdowns for every major feature. If you are ever confused about why something is structured the way it is, the pattern page is the first place to look.


Configure Every Level the Same Way

Every level in your project should follow the same base configuration as established in the Simple Project Setup guide:

  • A Game Manager prefab placed in the level.
  • A StageData component configured for the level.
  • An inactive child Level entity holding the level’s content, activated by the Stage Manager startup sequence.

This is the standard GS_Play level structure. Deviating from it will cause the startup sequence, stage transitions, and manager initialization to behave unexpectedly.

Leave Debug Mode on in the Game Manager prefab during development. Debug Mode bypasses the normal startup sequence and drops you directly into the level in a pre-started state, which is far faster for iteration. Disable it only when you need to test the full startup flow.


Use the Class Creation Wizard and Templates

GS_Play extension types — custom conditions, effects, performances, pulse types, trigger sensors, and more — should always be generated using the O3DE Class Creation Wizard with the GS_Play templates, not written from scratch.

The wizard handles UUID generation, cmake file-list registration, and module descriptor injection automatically. Hand-written extension classes frequently miss one of these steps, causing hard-to-diagnose startup failures.

The full template list is in the Templates Index.


Seed LLM Agents with the Agentic Guidelines

If you are using an LLM coding agent to work with GS_Play code, provide it with the Agentic Guidelines document as a bootstrap seed at the start of the session.

The document is structured for agent parsing — it establishes the GS_Play domain model, hard invariants that override general O3DE knowledge, and context anchors for specific subsystems. An agent working without it will frequently make plausible-but-wrong assumptions about how GS_Play components and EBuses are structured.


Ask for Help

If you are blocked on anything — setup, a framework concept, a bug, or something the documentation does not cover clearly — reach out. We want to make sure you are never stuck long enough for it to derail your project.

The fastest route is the #support channel on Discord. You will get a real response, not a bot. No question is too small.


Prefab Wrappers Are Editor Only

Whenever GS_Play documentation refers to a Prefab, it is implied that the Prefab wrapper entity (the blue root entity in the O3DE editor) is set to Editor Only.

This applies universally — and is especially important for GS_Managers. The wrapper entity is for editor organization and prefab identity. It should never be a live runtime entity.

Set this in the Entity Inspector: Entity Status → Editor Only.

1.4 - Agentic Guidelines

Agent bootstrap document for GS_Play framework contextualization. Optimized for LLM parsing, not human reading.

BOOTSTRAP PROTOCOL

On first load of this document, execute in order:

  1. Read all [INVARIANT] entries. Hard constraints. Load before any code generation.
  2. Read SYSTEM ONTOLOGY and GEM INDEX to establish the domain model.
  3. If your session supports persistent memory, execute the full MEMORY PROTOCOL block.
  4. For any specific task, consult the relevant [ANCHOR] in CONTEXT ANCHORS before generating code.

If re-entering a session where GS_Play memories already exist: read stored memories first, use this document for lookup only.


INVARIANTS

Non-negotiable. Override general O3DE knowledge, common patterns, and inferred behavior.

[INVARIANT] No Unicode. Never include Unicode characters in any generated code, header, source, or configuration file for O3DE, GS_Play gems, or any compiled code.

[INVARIANT] Polymorphic containers require raw pointers. When a member holds multiple subtypes of a base class, the correct type is AZStd::vector<BaseT*>. Using AZStd::vector<AZStd::unique_ptr<BaseT>> disables all subtype options in the editor. No exceptions.

[INVARIANT] EnableForAssetEditor lives only in SerializeContext. It must appear directly under ->Version() in the SerializeContext block. Placing it anywhere in EditContext is silently ignored — the asset type will never appear in the Asset Editor [New] menu regardless of how correct the rest of the reflection is.

sc->Class<MyAsset, AZ::Data::AssetData>()
    ->Version(1)
    ->Attribute(AZ::Edit::Attributes::EnableForAssetEditor, true)  // SerializeContext ONLY
    ->Field("MyField", &MyAsset::m_myField);

[INVARIANT] Use the O3DE Class Creation Wizard for new components. Generate the skeleton from the wizard, then customize with find-and-replace. Do not write component boilerplate from scratch: hand-written boilerplate misses module registration steps and wastes tokens.

[INVARIANT] EBus interface names are exact. Never infer or guess an EBus name. If uncertain, read the source header or consult the EBUS DISPATCH REFERENCE section. Wrong names fail silently at runtime.

[INVARIANT] Namespace all GS_Play code. Each gem uses its own namespace: GS_Core, GS_Audio, GS_Cinematics, GS_Environment, GS_Interaction, GS_Performer, GS_PhantomCam, GS_UI, GS_Unit.

[INVARIANT] Follow O3DE code sensibilities. All other code created within this project is based on the O3DE source code library and follows its established conventions.

[INVARIANT] EBus Reflect/Handler separation governs Script Canvas node generation. In Script Canvas: reflected EBus events become caller nodes (invoke the bus). Handler class declarations become notification nodes (listen for events). Apply the minimum necessary:

  • Respond-only — Inherit from {BusName}::Handler, override virtual methods. Do NOT reflect caller events. No SC caller node is generated or needed.
  • Call-only — Reflect the EBus events in Reflect(). Do NOT inherit Handler in the component. No SC notification node is generated or needed.
  • Both — Inherit Handler (generates notification nodes) AND reflect events (generates caller nodes).

Generating both when only one is needed pollutes the SC node palette and signals incorrect intent to maintainers.


MEMORY PROTOCOL

[MEMORY] If your system supports persistent memory, create or verify these entries at session start:

Memory Key | Source Section
---------- | --------------
gs_play_invariants | INVARIANTS section above
gs_play_gem_index | GEM INDEX table
gs_play_ebus_ref | EBUS DISPATCH REFERENCE section
gs_play_hot_paths | HOT PATHS section

At session end: If you discovered or confirmed any EBus method signatures, component property names, or TypeIds, ask the user: “Should I save what I learned about [topic] to memory?”

Memory freshness rule: If a stored memory conflicts with source code you have read during a session, source code is authoritative. Update the memory.

If no memory system is available: Re-read this document at the start of every new GS_Play session before generating any code.


SYSTEM ONTOLOGY

What this system is: gs_play_gems is a modular O3DE gameplay framework in C++17. It provides game systems as base classes for developer extension. It is a framework, not a library — the expected pattern is: extend base classes, override virtual methods, communicate exclusively via EBus.

Core architectural rules:

  • Each Gem is a self-contained feature module. GS_Complete is the framework’s reference integration layer demonstrating cross-gem patterns — it is not a required home for your project code.
  • Runtime cross-gem communication between GS_Play framework components uses EBus. Your own project code may include and combine gem headers as needed.
  • All game systems are singleton Manager Components responding to GameManagerNotificationBus lifecycle events.
  • Extensible type systems (dialogue effects, pulse types, motion tracks) use polymorphic base classes. Extend the base class and reflect it — types are discovered automatically through O3DE serialization at startup.
  • All user-facing data must be reflected via SerializeContext + EditContext.

CLASS WIZARD PROTOCOL

[INVARIANT] Always use the Class Creation Wizard CLI for new component, asset, and system class generation. The wizard handles file scaffolding, CMake registration, module descriptor insertion, and system component wiring. Manual boilerplate misses registration steps and wastes tokens.

CLI invocation (agents use CLI mode only)

python ClassWizard.py \
  --engine-path  <engine-path> \
  --project-path <project-path> \
  --template     <template_name> \
  --component-name <Name> \
  --namespace      <GemNamespace> \
  --automatic-register \
  [<template-specific flags>]

--automatic-register is required for build integration (CMake file lists, module descriptors, system component entries). Omitting it generates files only.

Available templates — O3DE base

Template | --template value | Suffix | What it creates | Template-specific flags
-------- | ---------------- | ------ | --------------- | -----------------------
Basic Component | default_component | Component | Standard game component | --skip-interface, --include-editor (requires editor module)
Level Component | level_component | Component | Component on the level entity | --skip-interface
System Component | system_component | Component | Global system entity component, auto-activates | --skip-interface
LyShine Component | lyshine_component | Component | UI component, adds Gem::LyShine.API dependency | (none)
Data Asset | data_asset | Asset | Asset class + system component + .setreg config | --file-extension <ext> (required), --asset-group <group>
Attimage | attimage | Attimage | Render pipeline attachment image | (none)

Available templates — GS_Play extensions

GS_Play provides gem-specific templates that extend the above. These handle GS_ base class wiring, EBus interface generation, and framework-specific registration steps automatically. Full reference: Templates List.

Template | Gem | Generates | Use For
-------- | --- | --------- | -------
GS_ManagerComponent | GS_Core | ${Name}ManagerComponent + optional bus | GS_Play manager with startup lifecycle hooks
SaverComponent | GS_Core | ${Name}SaverComponent | Custom save/load handler for GS_Save
GS_InputReaderComponent | GS_Core | ${Name}InputReaderComponent | Controller-side hardware input reader
PhysicsTriggerComponent | GS_Core | ${Name}PhysicsTriggerComponent | PhysX trigger volume with enter/hold/exit callbacks
PulsorPulse | GS_Interaction | ${Name}_Pulse | New pulse type for Pulsor emitter/reactor system
PulsorReactor | GS_Interaction | ${Name}_Reactor | New reactor type responding to a named channel
WorldTrigger | GS_Interaction | ${Name}_WorldTrigger | New world trigger response type
TriggerSensor | GS_Interaction | ${Name}_TriggerSensor | New trigger sensor condition type
UnitController | GS_Unit | ${Name}ControllerComponent | Custom player or AI controller
InputReactor | GS_Unit | ${Name}InputReactorComponent | Unit-side input translation to bus calls
Mover | GS_Unit | ${Name}MoverComponent | Custom locomotion mode
Grounder | GS_Unit | ${Name}GrounderComponent | Custom ground detection
PhantomCamera | GS_PhantomCam | ${Name}PhantomCamComponent | Custom camera behaviour type
UiMotionTrack | GS_UI | ${Name}Track | New LyShine property animation track
FeedbackMotionTrack | GS_Juice | ${Name}Track | New world-space entity property animation track
DialogueCondition | GS_Cinematics | ${Name}_DialogueCondition | Dialogue branch gate — return true to allow
DialogueEffect | GS_Cinematics | ${Name}_DialogueEffect | World event from Effects node; optionally reversible
DialoguePerformance | GS_Cinematics | ${Name}_DialoguePerformance | Async NPC action; sequencer waits for completion

[ANCHOR] Registration requirements per template (some require a manual Reflect() call after generation) → Templates List: Registration Quick Reference

What the wizard generates

The wizard output for --template default_component --component-name PlayerHealth --namespace GS_Core produces:

  • Source/PlayerHealthComponent.h / .cpp — component with Reflect(), Activate(), Deactivate(), AZ_COMPONENT_IMPL
  • Include/GS_Core/PlayerHealthInterface.h — EBus interface header (unless --skip-interface)
  • CMake file list entries, module descriptor registration (with --automatic-register)

Post-wizard agent workflow

After the wizard runs:

  1. Add GS_ base class to the inheritance list alongside AZ::Component.
  2. Add EBus handler inheritance for any buses this component listens to.
  3. Wire BusConnect() / BusDisconnect() in Activate() / Deactivate().
  4. Override virtual methods from the GS_ base class.
  5. Add reflected fields in SerializeContext + EditContext.
  6. Add gem dependency if the component depends on another GS_ gem: add_gem_dependency is automatic for some templates (e.g., LyShine), but cross-GS-gem dependencies must be added manually to CMake.

[ANCHOR] Before customizing wizard output → read the target base class API at /docs/framework/{gem}/.

Discovery commands

--list-templates                   # Print all available templates
--template-help <template_name>    # Print full command shape, all flags, and command list

Use --template-help to discover template-specific flags before invoking.


GEM INDEX

Gem | Namespace | Role | Manager Component | Primary Incoming Bus | Doc Path
--- | --------- | ---- | ----------------- | -------------------- | --------
GS_Core | GS_Core | Foundation — managers, save, stages, input, actions, utilities | GS_GameManagerComponent | GameManagerRequestBus | /docs/framework/core/
GS_Audio | GS_Audio | Audio engine, mixing, events, Klatt voice synthesis | GS_AudioManagerComponent | AudioManagerRequestBus | /docs/framework/audio/
GS_Cinematics | GS_Cinematics | Dialogue system, cinematics control, stage/performer markers | GS_DialogueManagerComponent, GS_CinematicsManagerComponent | DialogueManagerRequestBus | /docs/framework/cinematics/
GS_Environment | GS_Environment | Time of day, sky, world tick | GS_TimeManagerComponent | TimeManagerRequestBus | /docs/framework/environment/
GS_Interaction | GS_Interaction | Pulsors, targeting, world triggers, cursor | (none) | | /docs/framework/interaction/
GS_Performer | GS_Performer | Character skins, paper-facing, locomotion visuals | GS_PerformerManagerComponent | PerformerManagerRequestBus | /docs/framework/performer/
GS_PhantomCam | GS_PhantomCam | Virtual cameras, blending, influence fields | GS_CamManagerComponent | CamManagerRequestBus | /docs/framework/phantomcam/
GS_UI | GS_UI | UI framework — single-tier Manager/Page navigation, GS_Motion animations, input interception, load screen | GS_UIManagerComponent | UIManagerRequestBus | /docs/framework/ui/
GS_Unit | GS_Unit | Unit/character controllers, movers, input, AI | GS_UnitManagerComponent | UnitManagerRequestBus | /docs/framework/unit/
GS_Complete | GS_Complete | Integration layer — cross-gem components, proves patterns | (none) | GS_CompleteRequestBus |

EBUS DISPATCH REFERENCE

Dispatch syntax

// Broadcast (no address, singleton bus)
GS_Core::GameManagerRequestBus::Broadcast(
    &GS_Core::GameManagerRequestBus::Events::NewGame);

// Entity-addressed
GS_Unit::UnitRequestBus::Event(
    entityId, &GS_Unit::UnitRequestBus::Events::Possess, controllerEntityId);

// Broadcast with return value
bool isStarted = false;
GS_Core::GameManagerRequestBus::BroadcastResult(
    isStarted, &GS_Core::GameManagerRequestBus::Events::IsStarted);

Bus naming convention

Suffix | Direction | Handler policy | Typical use
------ | --------- | -------------- | -----------
RequestBus | Caller → Handler | Single (singleton) or ById (entity) | Commands and queries
NotificationBus | Broadcaster → Listeners | Multiple | State change notifications

GS_Core buses

Bus | Dispatch | Key Methods
--- | -------- | -----------
GameManagerRequestBus | Broadcast | IsInDebug, IsStarted, NewGame, LoadGame, ContinueGame, EnterStandby, ExitStandby, ReturnToTitle, ExitGame
GameManagerNotificationBus | Listen | OnSetupManagers, OnStartupComplete, OnShutdownManagers, OnBeginGame, OnEnterStandby, OnExitStandby
SaveManagerRequestBus | Broadcast | NewGameSave, LoadGame, SaveData, LoadData, GetOrderedSaveList
SaveManagerNotificationBus | Listen | OnSaveAll, OnLoadAll
RecordKeeperRequestBus | ById | HasRecord, SetRecord, GetRecord, DeleteRecord
RecordKeeperNotificationBus | ById Listen | RecordChanged
StageManagerRequestBus | Broadcast | ChangeStageRequest, LoadDefaultStage, RegisterExitPoint, GetExitPoint
StageManagerNotificationBus | Listen | BeginLoadStage, LoadStageComplete, StageLoadProgress
OptionsManagerRequestBus | Broadcast | GetActiveInputProfile
ActionRequestBus | ById | DoAction
ActionNotificationBus | ById Listen | OnActionComplete

GS_Cinematics buses

Bus | Dispatch | Key Methods
--- | -------- | -----------
CinematicsManagerRequestBus | Broadcast | BeginCinematic, EndCinematic, RegisterStageMarker, GetStageMarker
DialogueManagerRequestBus | Broadcast | StartDialogueSequenceByName, ChangeDialogueDatabase, RegisterPerformerMarker, GetPerformer
DialogueSequencerRequestBus | Broadcast | StartDialogueBySequence, OnPerformanceComplete
DialogueSequencerNotificationBus | Listen | OnDialogueTextBegin, OnDialogueSequenceComplete
DialogueUIBridgeRequestBus | ById | RunDialogue, RunSelection, RegisterDialogueUI, CloseDialogue
TypewriterRequestBus | ById | StartTypewriter, ForceComplete, ClearTypewriter

GS_Interaction buses

Bus | Dispatch | Key Methods
--- | -------- | -----------
PulseReactorRequestBus | ById | ReceivePulses, IsReactor
GS_TargetingHandlerRequestBus | ById | RegisterTarget, UnregisterTarget, GetInteractTarget
GS_TargetingHandlerNotificationBus | ById Listen | OnUpdateInteractTarget, OnEnterStandby, OnExitStandby
WorldTriggerRequestBus | ById | Trigger, Reset
GS_CursorRequestBus | Broadcast | RegisterCursorCanvas, HideCursor, SetCursorOffset, SetCursorVisuals, SetCursorPosition

GS_PhantomCam buses

Bus | Dispatch | Key Methods
--- | -------- | -----------
CamManagerRequestBus | Broadcast | Camera system lifecycle
CamManagerNotificationBus | Listen | EnableCameraSystem, DisableCameraSystem, SettingNewCam
CamCoreNotificationBus | Listen | UpdateCameraPosition
PhantomCameraRequestBus | ById | Per-camera control

GS_UI buses

Bus | Dispatch | Key Methods
--- | -------- | -----------
UIManagerRequestBus | Broadcast | LoadGSUI, UnloadGSUI, FocusUI, NavLastUI, RegisterPage
UIPageRequestBus | ById | ShowPage, HidePage, FocusPage, ToggleShow
LoadScreenRequestBus | Broadcast | StartLoadScreen, EndLoadScreen
PauseMenuRequestBus | Broadcast | PauseGame, UnPauseGame

Removed / Deprecated (do not use): GS_UIHubComponent, GS_UIHubBus, GS_UIWindowComponent — these were part of a removed three-tier Hub/Window/Page hierarchy. The current architecture is single-tier: Manager → Page.

GS_Unit buses

Bus | Dispatch | Key Methods
--- | -------- | -----------
UnitManagerRequestBus | Broadcast | RequestSpawnNewUnit, RegisterPlayerController, CheckIsUnit
UnitManagerNotificationBus | Listen | ReturnNewUnit, EnterStandby, ExitStandby
UnitRequestBus | ById | Possess, DePossess, GetController, GetUniqueName
UnitNotificationBus | ById Listen | UnitPossessed, UnitEnteringStandby, UnitExitingStandby

GS_Environment buses

Bus | Dispatch | Key Methods
--- | -------- | -----------
TimeManagerRequestBus | Broadcast | SetMainCam, Set/GetTimeOfDay, SetTimePassageSpeed, GetWorldTime, IsDay
TimeManagerNotificationBus | Listen | WorldTick, DayNightChanged

GS_Audio buses

Bus | Dispatch | Key Methods
--- | -------- | -----------
AudioManagerRequestBus | Broadcast | Audio engine control
KlattVoiceRequestBus | ById | Voice synthesis control per-entity
KlattVoiceSystemRequestBus | Broadcast | System-level voice engine

HOT PATHS

Canonical patterns for the most common agent tasks. Follow exactly.


HOT PATH 1 — Create a component extending a GS_ base

(Note: the GS_Play wizard templates automate much of this; prefer a gem-specific template from the CLASS WIZARD PROTOCOL when one exists.)

  1. Identify target gem namespace and base class.
  2. Generate skeleton via O3DE Class Creation Wizard.
  3. Replace generated names. Add GS_ base class alongside AZ::Component in inheritance list.
  4. In Activate(): call BusConnect() for all buses this component handles.
  5. In Deactivate(): call BusDisconnect() for all buses.
  6. In Reflect(): SerializeContext first, EditContext nested inside it. EnableForAssetEditor in SerializeContext only, directly under ->Version().

[ANCHOR] For any specific gem’s base class API → read /docs/framework/{gem}/ before writing the override body.


HOT PATH 2 — Dispatch an EBus command

({BusName} is a placeholder for the exact bus interface name; verify it against the EBUS DISPATCH REFERENCE rather than guessing.)

// Fire and forget, singleton bus
GS_{Gem}::{BusName}::Broadcast(
    &GS_{Gem}::{BusName}::Events::{MethodName},
    arg1, arg2);

// Entity-addressed bus
GS_{Gem}::{BusName}::Event(
    targetEntityId, &GS_{Gem}::{BusName}::Events::{MethodName}, arg1);

// With return value
ReturnType result{};
GS_{Gem}::{BusName}::BroadcastResult(
    result, &GS_{Gem}::{BusName}::Events::{MethodName});

HOT PATH 3 — Listen to an EBus notification

  1. Inherit from {BusName}::Handler (protected inheritance).
  2. In Activate(): {BusName}::Handler::BusConnect();
    • For entity-addressed buses: {BusName}::Handler::BusConnect(GetEntityId());
  3. In Deactivate(): {BusName}::Handler::BusDisconnect();
  4. Override the virtual notification methods.

HOT PATH 4 — Create a new Manager Component

  1. Inherit from GS_Core::GS_ManagerComponent.
  2. Connect to GameManagerNotificationBus in Activate().
  3. Implement lifecycle hooks: OnSetupManagers, OnShutdownManagers, OnEnterStandby, OnExitStandby.
  4. Define {Name}RequestBus (Single/Single policy) for commands.
  5. Define {Name}NotificationBus (Multiple policy) for notifications.
  6. Register descriptor in your gem’s module constructor.

[ANCHOR] Manager lifecycle and initialization order → /docs/framework/core/gs_managers/manager/


HOT PATH 5 — Create a Generic Data Asset

  1. Inherit from AZ::Data::AssetData.
  2. Include GS_Core Utility: GS_AssetReflectionIncludes.h
  3. In Reflect(), SerializeContext block:
    sc->Class<MyAsset, AZ::Data::AssetData>()
        ->Version(1)
        ->Attribute(AZ::Edit::Attributes::EnableForAssetEditor, true)
        ->Field("Field", &MyAsset::m_field);
    
  4. Register asset type in a system component’s Reflect().
  5. EnableForAssetEditor goes only in SerializeContext. Never in EditContext.

HOT PATH 6 — Polymorphic container of subtypes

// CORRECT — all subtypes selectable in editor
AZStd::vector<BaseClass*> m_items;

// WRONG — disables all subtype options in editor
AZStd::vector<AZStd::unique_ptr<BaseClass>> m_items;

Reflect with ->Field("Items", &MyClass::m_items). O3DE handles subtype selection automatically with raw pointers.
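One consequence of the raw-pointer rule worth keeping in mind: in plain C++ terms, nothing frees the elements automatically, so the owning class is responsible for cleanup. An engine-free sketch of the ownership shape (BaseItem, AddOne, AddTwo, and ItemOwner are invented names; the real member would be AZStd::vector<BaseT*> on a reflected class):

```cpp
#include <vector>

struct BaseItem
{
    virtual ~BaseItem() = default;  // virtual dtor: deleting through BaseItem* is safe
    virtual int Apply() const = 0;
};

struct AddOne : BaseItem { int Apply() const override { return 1; } };
struct AddTwo : BaseItem { int Apply() const override { return 2; } };

// Mirrors the reflected member shape: a container of raw base pointers.
struct ItemOwner
{
    std::vector<BaseItem*> m_items;

    ~ItemOwner()
    {
        // Raw pointers carry no ownership; the owner must delete.
        for (BaseItem* item : m_items)
        {
            delete item;
        }
    }

    int Total() const
    {
        int total = 0;
        for (const BaseItem* item : m_items)
        {
            total += item->Apply();
        }
        return total;
    }
};
```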


HOT PATH 7 — Create a custom Trigger Sensor or World Trigger

Two distinct extension points. Choose the correct base:

Custom Trigger Sensor (condition side — detects when to fire):

  1. Inherit from GS_Interaction::TriggerSensorComponent.
  2. Override EvaluateSensor() — call Trigger() on all WorldTriggerRequestBus handlers on the same entity when your condition is met.
  3. Reflect with SerializeContext and EditContext.
  4. Register descriptor in your gem’s module.

Custom World Trigger (response side — what happens when fired):

  1. Inherit from GS_Interaction::WorldTriggerComponent.
  2. Override DoTrigger() — implement the response behavior.
  3. Override DoReset() if the trigger should re-arm.
  4. Reflect with SerializeContext and EditContext.
  5. Register descriptor in your gem’s module.

[ANCHOR] Full WorldTrigger architecture → /docs/framework/interaction/world_triggers/


HOT PATH 8 — Add a custom Dialogue type (effect, condition, performance)

  1. Inherit from the appropriate base: DialogueEffect, DialogueCondition, or DialoguePerformance.
  2. Reflect the class using SerializeContext and EditContext. The system discovers it automatically at startup — no manual registration step.
  3. GS_Complete demonstrates this pattern for cross-gem types (e.g., a dialogue effect controlling a PhantomCam). Use it as a reference, not a required location.

[ANCHOR] Dialogue type system → /docs/framework/cinematics/dialogue_system/


ANTIPATTERN CATALOG

[ANTIPATTERN] EnableForAssetEditor in EditContext. Silent failure — asset never appears in Asset Editor [New] menu. See INVARIANTS.

[ANTIPATTERN] unique_ptr in polymorphic AZStd containers. Disables all subtype selection in the editor. Use BaseT*. See INVARIANTS.

[ANTIPATTERN] Guessing EBus method names. O3DE EBus dispatches fail silently on wrong method names. Always verify against source headers or the EBUS DISPATCH REFERENCE in this document.

[ANTIPATTERN] Dispatching to entity-addressed buses before the target entity is active. Buses are not connected before Activate() or after Deactivate(); events sent while disconnected are silently dropped.

[ANTIPATTERN] Omitting AZ_COMPONENT_IMPL in the .cpp file. Component will not be reflectable. Causes subtle reflection failures that appear unrelated to the missing macro.

[ANTIPATTERN] Using "Editor" service in AppearsInAddComponentMenu. GS_Play components use AZ_CRC_CE("Game"). The "Editor" category makes components invisible in standard game entity contexts.

[ANTIPATTERN] Writing component boilerplate manually. Module registration steps are reliably missed. Always start from the O3DE Class Creation Wizard output.


CONFIDENCE THRESHOLDS

Proceed without reading source or docs:

  • Standard O3DE component lifecycle: Init, Activate, Deactivate, BusConnect, BusDisconnect
  • AZStd container usage (with the raw pointer rule applied for polymorphic types)
  • CMake target aliases: Gem::{Name}.API, Gem::{Name}.Private.Object, Gem::{Name}.Clients
  • SerializeContext + EditContext nesting order in Reflect()
  • AZ_CRC_CE() for compile-time CRC values
  • azrtti_cast<> for safe type casting in reflection code
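For the safe-cast bullet above, dynamic_cast is the plain-C++ analogue of azrtti_cast<>: it returns nullptr on a type mismatch instead of producing an invalid pointer. Base, Derived, and ReadDerivedValue below are invented for illustration:

```cpp
struct Base
{
    virtual ~Base() = default;
};

struct Derived : Base
{
    int m_value = 7;
};

// Safe downcast: nullptr on mismatch, never undefined behavior.
int ReadDerivedValue(Base* base)
{
    if (auto* derived = dynamic_cast<Derived*>(base))
    {
        return derived->m_value;
    }
    return -1;  // base was not actually a Derived
}
```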

Stop and read documentation or ask the user before proceeding:

  • Any EBus method signature you are not 100% certain of
  • Any virtual method signature on a GS_ base class
  • Whether a specific GS_ gem is present in the target project
  • Initialization order requirements for a new Manager component
  • Any TypeId UUID value — never generate a UUID, ask the user or read the TypeIds header

CONTEXT ANCHORS

Conditions that require reading deeper documentation before generating code.

[ANCHOR] Before writing any unit movement or controller code: → Read /docs/framework/unit/ for mover types (3DFree, Strafe, Grid, SideScroller, Slide, Physics), grounder types, and controller hierarchy (Player vs AI). → [ASK] Which mover type does this movement context require?

[ANCHOR] Before configuring any virtual camera: → Read /docs/framework/phantomcam/ for camera types (PhantomCamera, StaticOrbit, Track, ClampedLook) and influence field setup.

[ANCHOR] Before wiring any dialogue sequence: → Read /docs/framework/cinematics/dialogue_system/ for the full chain: Sequencer → UIBridge → DialogueUI component.

[ANCHOR] Before implementing targeting or interaction: → Read /docs/framework/interaction/targeting/ for the TargetingHandler, Target, TargetField setup order and registration timing.

[ANCHOR] Before building any UI screen or menu: → Read /docs/framework/ui/ for the Manager → Page architecture, navigation return policies, companion component pattern, and UIManager registration sequence. The Hub/Window layer has been removed — do not use GS_UIHubComponent or GS_UIWindowComponent.

[ANCHOR] Before integrating save or persistence: → Read /docs/framework/core/gs_save/ for the SaveManager, RecordKeeper, and Saver component chain.

[ANCHOR] Before setting up project physics layers or collision groups: → Read /docs/get_started/configure_project/setup_environment/ for required engine-level physics configuration.

[ANCHOR] Before implementing behavior that combines two or more GS_ gems: → Review GS_Complete — it provides reference implementations of common cross-gem patterns. Reuse or adapt them rather than reimplementing from scratch.


CLARIFICATION TRIGGERS

[ASK] At the start of any GS_Play session, identify the user context:

  • Designer / Scripter — Working in the O3DE editor with components, Script Canvas, or Lua. No C++ generation needed.

    • Guidance: component configuration, property setup, EBus Script Canvas nodes.
    • Documentation: The Basics → /docs/the_basics/
    • Gem-specific: /docs/the_basics/{gem}/ (e.g., /docs/the_basics/unit/ for unit questions)
  • Engineer — Writing C++ to extend the framework with new components, gems, or systems.

    • Guidance: follows HOT PATHS and code patterns in this document.
    • Documentation: Framework API → /docs/framework/
    • Gem-specific: /docs/framework/{gem}/ (e.g., /docs/framework/unit/ for unit questions)

Both sections cover identical gem feature sets. The Basics covers component usage and scripting; Framework API covers C++ interfaces, EBus signatures, and base class architecture. Available gems in both: core, audio, cinematics, environment, interaction, juice, performer, phantomcam, ui, unit.

When answering any gem-specific question, include a link to the appropriate documentation section alongside your response. If the context is not stated, ask before generating any code or providing documentation links.

[ASK] Before creating any new component: Confirm which gem’s namespace and base class this component should use.

[ASK] Before generating any TypeId UUID: Always request the UUID from the user or confirm they want one generated. Never silently generate a UUID.

[ASK] If the task references a system with no memory or documentation coverage: Ask before inferring behavior. GS_Play base class APIs are not reliably guessable from generic O3DE patterns.

[ASK] If a Manager initialization order is being modified: The sequence of OnSetupManagers calls is load-order dependent. Confirm the intended position in the sequence before writing.

[ASK] If asked to implement a new Dialogue type: Extend the appropriate base class (DialogueEffect, DialogueCondition, or DialoguePerformance) and reflect it. Confirm which gem or module owns the new class before writing.


UPCOMING: AGENTIC SITE FEATURES (Post v1)

The GS_Play documentation site will implement formal agent-friendly features after the v1 documentation release. These are not yet live.

Planned additions will follow the emerging agent-friendly documentation specifications.

What this means for agents using this document now:

Until those features are live, this bootstrap document (/docs/get_started/agentic_guidelines/) is the primary machine-readable entry point. Treat it as authoritative. Do not rely on inferred site structure or navigated page discovery — navigate the docs via explicit paths from the GEM INDEX and CONTEXT ANCHORS sections above.

When the agent-friendly site features are released, this section will be updated with the new entry points and structured access patterns.

2 - Index

Fast references to speed up docs navigation — feature lists, patterns, glossary, templates, changelogs, and utilities.

Quick-access references for the GS_Play documentation. Use this section when you know what you are looking for and need a fast lookup — a term definition, the right template name, a feature comparison, or the changelog for a specific gem.


How This Section Is Organized

Change Log — Version history for GS_Play gems and the documentation itself.

Feature List — A complete index of every GS_Play feature, with links to both its Basics usage guide and Framework API reference.

Patterns — Diagrams and explanations of the core structural patterns used across GS_Play feature sets. A useful orientation before diving into a new gem.

Templates List — All ClassWizard extension templates for generating new conditions, effects, performances, trigger types, and other polymorphic extension classes across GS_Play gems.

Powerful Utilities — Cross-cutting utilities available across multiple feature gems.

Glossary — Definitions for GS_Play-specific terminology and commonly used O3DE concepts in the framework context.

Links — External links to community resources, the asset store, and Genome Studios.


Sections

2.1 - Change Logs

Version changelogs for all GS_Play gems and documentation.

Logs


GS_Audio

Latest version: 0.5.0

Summary: First official base release. Audio manager, event playback, mixing buses, multi-layer score arrangement, and Klatt voice synthesis.

GS_Audio Logs


GS_Cinematics

Latest version: 0.5.0

Summary: First official base release. Full dialogue system with node graph sequencer, polymorphic conditions/effects/performances, world and screen-space UI, typewriter, babble, localization, and the in-engine dialogue editor.

GS_Cinematics Logs


GS_Core

Latest version: 0.5.0

Summary: First official base release. Ready for continued development and support. On the way to 1.0!

GS_Core Logs


GS_Environment

Latest version: 0.5.0

Summary: First official base release. Time of day management, time passage speed, day/night cycle notifications, and sky color configuration assets.

GS_Environment Logs


GS_Interaction

Latest version: 0.5.0

Summary: First official base release. Physics pulse emitters and reactors, proximity targeting and cursor, and composable world trigger sensors.

GS_Interaction Logs


GS_Juice

Latest version: 0.5.0

Summary: First official base release (Early Development). Feedback motion system with transform and material tracks for screen shake, bounce, flash, and glow effects.

GS_Juice Logs


GS_Performer

Latest version: 0.5.0

Summary: First official base release (Early Development). Performer manager, skin slot swapping, paper billboard rendering, velocity locomotion, head tracking, and babble.

GS_Performer Logs


GS_PhantomCam

Latest version: 0.5.0

Summary: First official base release. Priority-based camera system with follow/look-at behaviors, blend profiles, and spatial influence fields.

GS_PhantomCam Logs


GS_UI

Latest version: 0.5.0

Summary: First official base release. Single-tier page navigation, GS_Motion UI animation, enhanced buttons, input interception, load screen, and pause menu.

GS_UI Logs


GS_Unit

Latest version: 0.5.0

Summary: First official base release. Mode-driven unit movement system with player/AI controllers, 3-stage input pipeline, multiple mover types, grounders, and movement influence fields.

GS_Unit Logs


Documentation

Latest version: 1.0.0

Summary: First official base release. Ready to be developed, updated, and refined over the continued development of GS_Play!

Docs Logs


2.1.1 - Audio Change Log

GS_Audio version changelog.

Logs


Audio 0.5.0

First official base release of GS_Audio.

Audio Manager

  • GS_AudioManagerComponent — audio engine lifecycle management, event library loading
  • AudioManagerRequestBus / AudioManagerNotificationBus
  • Master volume control and engine-level audio settings

Audio Events

  • AudioEventLibrary asset — pools of named audio events
  • GS_AudioEvent — pool selection, concurrent instance limiting, and 3D spatialization
  • AudioManagerRequestBus playback interface

Mixing & Effects

  • GS_MixingBus — mixing bus component with configurable effects chain
  • Filters, EQ, and environmental influence effects per bus

Score Arrangement

  • ScoreArrangementTrack — multi-layer music system
  • ScoreLayer — individual music layers toggled by gameplay state
  • TimeSignatures enum for bar-aligned transitions

Klatt Voice Synthesis

  • KlattVoiceSystemComponent — shared SoLoud engine, 3D listener management
  • KlattVoiceComponent — per-entity text-to-speech with phoneme mapping and segment queue
  • KlattVoiceRequestBus / KlattVoiceNotificationBus / KlattVoiceSystemRequestBus
  • Full 3D spatial voice audio with configurable voice parameters

2.1.2 - Cinematics Change Log

GS_Cinematics version changelog.

Logs


Cinematics 0.5.0

First official base release of GS_Cinematics.

Cinematics Manager

  • GS_CinematicsManagerComponent — cinematic mode lifecycle (BeginCinematic / EndCinematic)
  • CinematicsManagerRequestBus / CinematicsManagerNotificationBus (EnterCinematic / ExitCinematic)
  • CinematicStageMarkerComponent — named world-space anchor, self-registers on activate

Dialogue Manager

  • GS_DialogueManagerComponent — active .dialoguedb asset ownership and performer registry
  • DialogueManagerRequestBus — StartDialogueSequenceByName, ChangeDialogueDatabase, RegisterPerformerMarker, GetPerformer
  • DialoguePerformerMarkerComponent — named NPC world anchor, PerformerMarkerRequestBus

Dialogue Sequencer

  • DialogueSequencerComponent — node graph traversal, runtime token management
  • DialogueUIBridgeComponent — routes active dialogue to whichever UI is registered, decouples sequencer from presentation
  • DialogueSequencerNotificationBus — OnDialogueTextBegin, OnDialogueSequenceComplete

Data Structure

  • DialogueDatabase (.dialoguedb) — named actors and sequences asset
  • DialogueSequence — directed node graph with startNodeId
  • ActorDefinition — actor name, portrait, and metadata
  • Node types: TextNodeData, SelectionNodeData, RandomNodeData, EffectsNodeData, PerformanceNodeData
  • SelectionOption — per-choice text, conditions, and connections

Conditions

  • DialogueCondition (abstract base) — EvaluateCondition()
  • Boolean_DialogueCondition — base boolean comparison
  • Record_DialogueCondition — checks GS_Save records with comparison operators
  • Polymorphic discovery: extend base class, no manual registration

Effects

  • DialogueEffect (abstract base) — DoEffect() / ReverseEffect()
  • SetRecords_DialogueEffect — sets GS_Save records during dialogue
  • ToggleEntitiesActive_DialogueEffect — activates or deactivates entities in the level
  • Polymorphic discovery: extend base class, no manual registration

Performances

  • DialoguePerformance (abstract base, AZ::TickBus::Handler) — DoPerformance() / ExecutePerformance() / FinishPerformance()
  • MoveTo_DialoguePerformance — interpolated movement to stage marker
  • PathTo_DialoguePerformance — navmesh path navigation to stage marker (requires RecastNavigation)
  • RepositionPerformer_DialoguePerformance — instant teleport to stage marker
  • Polymorphic discovery: extend base class, no manual registration

Dialogue UI

  • DialogueUIComponent — screen-space dialogue text, speaker name, and portrait display
  • WorldDialogueUIComponent — world-space speech bubble display
  • DialogueUISelectionComponent — screen-space player choice menu
  • WorldDialogueUISelectionComponent — world-space selection display
  • TypewriterComponent — character-by-character text reveal, configurable speed, OnTypeFired / OnTypewriterComplete
  • BabbleComponent — procedural audio babble synchronized to typewriter output

Localization

  • LocalizedStringId — key + default fallback text, Resolve() method
  • LocalizedStringTable — runtime key-value string lookup table

Dialogue Editor

  • Node-based in-engine GUI for authoring .dialoguedb assets and sequence graphs

Type Registry

  • DialogueTypeRegistry — factory registration for conditions, effects, and performances
  • DialogueTypeDiscoveryBus — external gems register custom types without modifying GS_Cinematics

2.1.3 - Environment Change Log

GS_Environment version changelog.

Logs


Environment 0.5.0

First official base release of GS_Environment.

Time Manager

  • GS_TimeManagerComponent — world time ownership, time of day control
  • TimeManagerRequestBus — SetTimeOfDay, SetTimePassageSpeed, GetTimeOfDay, GetWorldTime, IsDay
  • TimeManagerNotificationBus — WorldTick (per-frame), DayNightChanged (on threshold cross)
  • GS_EnvironmentSystemComponent — system-level environment registration

Sky Configuration

  • SkyColourConfiguration — data asset defining sky color at different times of day
  • Authored as an asset and assigned to the environment for day/night color blending

2.1.4 - Core Change Log

GS_Core version changelog.

Logs


Core 0.5.0

First official base release of GS_Core. Ready for continued development and support.

2.1.5 - Interaction Change Log

GS_Interaction version changelog.

Logs


Interaction 0.5.0

First official base release of GS_Interaction.

Pulsors

  • PulsorComponent — physics trigger volume that broadcasts typed pulse events
  • PulseReactorComponent — receives pulses and executes reactor types
  • PulseType (abstract base) — extensible pulse payload type
  • Built-in types: Debug_Pulse, Destruct_Pulse
  • PulseTypeRegistry — auto-discovery of all reflected pulse types

Targeting

  • GS_TargetComponent — marks an entity as a targetable object
  • GS_InteractTargetComponent — marks an entity as interactable
  • GS_CursorComponent — world-space cursor and proximity scan
  • GS_TargetingHandlerComponent — finds and locks the best interactable in range
  • GS_TargetingHandlerRequestBus / GS_TargetingHandlerNotificationBus

World Triggers

  • TriggerSensorComponent — base for all condition-side trigger sensors
  • WorldTriggerComponent — base for all response-side world triggers
  • Built-in sensors: InteractTriggerSensorComponent, PlayerInteractTriggerSensorComponent, ColliderTriggerSensorComponent, RecordTriggerSensorComponent
  • WorldTriggerRequestBus — trigger enable/disable interface
  • Sensors renamed from TriggerAction to TriggerSensor (2026-03)
  • 22 components total across all subsystems

2.1.6 - Juice Change Log

GS_Juice version changelog.

Logs


Juice 0.5.0

First official base release of GS_Juice. Early Development — full support planned 2026.

Feedback System

  • FeedbackEmitter — component that plays authored feedback motions on its entity
  • FeedbackMotionAsset — data asset containing one or more feedback tracks
  • FeedbackMotion — runtime instance wrapper for a FeedbackMotionAsset
  • FeedbackMotionTrack — domain base class extending GS_Core’s GS_Motion system

Feedback Tracks

  • FeedbackTransformTrack — drives position, scale, and rotation offsets (screen shake, bounce)
  • FeedbackMaterialTrack — drives opacity, emissive intensity, and color (flash, glow, pulse)

PostProcessing

  • Planned for future release. Not yet implemented.

2.1.7 - Performer Change Log

GS_Performer version changelog.

Logs


Performer 0.5.0

First official base release of GS_Performer. Early Development — full support planned 2026.

Performer Manager

  • GS_PerformerManagerComponent — performer registration and name-based lookup
  • PerformerManagerRequestBus
  • GS_PerformerSystemComponent — system-level lifecycle component

Skin Slots

  • PerformerSkinSlotComponent — manages a named visual slot on a character entity
  • SkinSlotHandlerComponent — handles asset swapping for a given slot
  • PerformerSkinSlotsConfigProfile — asset class defining slot configurations
  • SkinSlotData / SkinSlotNameDataPair — slot data types

Paper Performer

  • PaperFacingHandlerBaseComponent — base for billboard-style rendering handlers
  • PaperFacingHandlerComponent — orients a 2.5D billboard character to face the camera correctly

Locomotion

  • VelocityLocomotionHookComponent — reads entity velocity and drives animation blend tree parameters automatically

Head Tracking

  • Procedural head bone orientation toward a world-space look-at target

Babble

  • BabbleComponent — generates procedural vocalisation tones synchronized to GS_Cinematics typewriter output
  • BabbleToneEvent / SpeakerBabbleEvents — tone data structures per speaker identity

2.1.8 - PhantomCam Change Log

GS_PhantomCam version changelog.

Logs


PhantomCam 0.5.0

First official base release of GS_PhantomCam.

Cam Manager

  • GS_CamManagerComponent — camera system lifecycle, active camera tracking
  • CamManagerRequestBus / CamManagerNotificationBus
  • GS_PhantomCamSystemComponent — system-level registration

Phantom Cameras

  • GS_PhantomCameraComponent — virtual camera with follow target, look-at, FOV, and priority
  • PhantomCameraRequestBus
  • PhantomCamData — camera configuration data structure
  • Priority-based switching: highest priority active phantom camera drives the real camera

Cam Core

  • GS_CamCoreComponent — reads from the active phantom camera each frame and applies to the engine camera
  • CamCoreRequestBus / CamCoreNotificationBus

Camera Behaviors

  • ClampedLook_PhantomCamComponent — look-at with clamped angle limits
  • StaticOrbitPhantomCamComponent — fixed-distance orbit around a target
  • Track_PhantomCamComponent — follows a spline or path track
  • AlwaysFaceCameraComponent — keeps an entity billboard-facing the active camera

Blend Profiles

  • GS_PhantomCamBlendProfile — asset defining camera transition easing and duration
  • Smooth interpolation between camera positions on priority switch

Influence Fields

  • GlobalCameraInfluenceComponent — global persistent camera modifier
  • CameraInfluenceFieldComponent — spatial zone that modifies camera behavior on entry
  • CamInfluenceData — influence configuration data (offset, FOV shift, tilt)

2.1.9 - UI Change Log

GS_UI version changelog.

Logs


UI 0.5.0

First official base release of GS_UI.

UI Manager

  • GS_UIManagerComponent — canvas lifecycle management, focus stack, startup focus assignment
  • UIManagerRequestBus
  • Single-tier page navigation: Manager → Page (no hub layer)
  • GS_UIPageComponent — root canvas registration, nested page management
  • NavigationReturnPolicy — RestoreLast or AlwaysDefault back navigation behavior
  • FocusEntry — focus state data for page transitions
  • Cross-canvas boundary navigation support
  • Companion Component Pattern: domain-specific page logic lives on companion components that react to page bus events

UI Interaction

  • GS_ButtonComponent — enhanced button with hover and select UiAnimationMotion support
  • GS_UIInputInterceptorComponent — input capture and blocking for UI canvases
  • GS_UIInputProfile — input mapping configuration for UI contexts

UI Animation

  • UiMotionTrack (domain base) — extends GS_Core’s GS_Motion system
  • 8 concrete tracks: UiPositionTrack, UiScaleTrack, UiRotationTrack, UiElementAlphaTrack, UiImageAlphaTrack, UiImageColorTrack, UiTextColorTrack, UiTextSizeTrack
  • UiAnimationMotionAsset — authored animation asset
  • UiAnimationMotion — runtime instance wrapper

Load Screen

  • GS_LoadScreenComponent — manages load screen display during level transitions

Pause Menu

  • PauseMenuComponent — pause state management and pause menu page coordination

Removed

  • GS_UIWindowComponent — removed entirely
  • GS_UIHubComponent / GS_UIHubBus — deprecated legacy layer

2.1.10 - Unit Change Log

GS_Unit version changelog.

Logs


Unit 0.5.0

First official base release of GS_Unit.

Unit Manager

  • GS_UnitManagerComponent — unit registration and lifecycle tracking
  • UnitManagerRequestBus

Unit Component

  • GS_UnitComponent — unit identity and state component
  • UnitRequestBus

Controllers

  • GS_UnitControllerComponent — abstract controller base, possess/release pattern
  • GS_PlayerControllerComponent — human player possession
  • GS_AIControllerComponent — AI-driven possession
  • GS_PlayerControllerInputReaderComponent — reads hardware input and routes to the possessed unit

Input Pipeline

  • 3-stage architecture: controller entity reads hardware → routes to InputDataComponent on the unit → reactor components act on the structured state
  • InputDataRequestBus / InputDataNotificationBus
  • GS_InputReactorComponent — reacts to binary input states (button/action)
  • GS_InputAxisReactorComponent — reacts to axis input values (sticks, triggers)

MoverContext

  • GS_MoverContextComponent — transforms raw input into movement intent, manages active mode and profiles
  • MoverContextRequestBus / MoverContextNotificationBus
  • Mode-driven movement: one named mode active at a time; only the mover and grounder matching that mode run

Movers

  • GS_MoverComponent — mode-aware mover base class
  • GS_PhysicsMoverComponent — physics-driven movement via PhysX
  • GS_3DFreeMoverComponent — unconstrained 3D free movement
  • GS_3DSlideMoverComponent — slide-and-collide surface movement

Grounders

  • GS_PhysicsRayGrounderComponent — raycast-based grounding for the “Free” movement mode

Movement Influence

  • MovementInfluenceFieldComponent — spatial zone that modifies unit movement within its bounds
  • GlobalMovementRequestBus — global movement modifier interface

Profiles

  • GS_UnitMovementProfile — per-mode speed, acceleration, and movement parameter configuration

2.1.11 - Docs Change Log

GS_Play documentation version changelog.

Logs


Docs 1.0.0

  • Welcome page finalized.

  • Get Started and Index sections finalized.

  • No missing pages. Final organization.

  • Patterns Complete.

  • Feature List, Glossary fully updated.

  • Get Started T1 page filled.

  • Index T1 page filled.

  • Final Logs format and arrangement.

  • Print page + print section links.

  • Homepage complete.

  • About Us complete.

  • Product pages complete.


Docs 0.9.0

Full first implementation of direct documentation formatting and content. With this release we have our first fully working catalogue of featuresets, functionality, and API.

Feature Roots are called “Tier 2”. They are overviews with quick indexes for their subsystems. Content Pages are called “Tier 3”. They carry the body and substance of any given feature or subfeature. Nested pages below these tiers combine T2 and T3, depending on how much overview is needed to segue into the child pages.

Established base organization of information.

  • Get Started is very thin and for user onboarding.
  • Index replaces Get Started for regular users, allowing easy navigation across categorical necessities.
  • Basics is conceptual and script driven knowledge for the framework.
  • Framework API is the engineering and veteran user knowledgebase.
  • Learn is for YouTube video funnelling, or text-driven guide access.

Cross-linking is prioritized to move users to the knowledge they seek as quickly as possible.

Agentic Guidelines is an LLM-based seed document for how to use GS_Play and navigate the documentation for information access. The website has been optimized behind the scenes for LLM scraping.

Basics

Basics Implemented.

  • Implemented Core
  • Implemented Audio
  • Implemented Cinematics
  • Implemented Environment
  • Implemented Interaction
  • Implemented Juice
  • Implemented Performer
  • Implemented PhantomCam
  • Implemented UI
  • Implemented Unit

Framework API

Framework API Implemented.

  • Implemented Core
  • Implemented Audio
  • Implemented Cinematics
  • Implemented Environment
  • Implemented Interaction
  • Implemented Juice
  • Implemented Performer
  • Implemented PhantomCam
  • Implemented UI
  • Implemented Unit

2.2 - Feature List

Quick index of all GS_Play Features.

Contents


GS_Core

| Feature | API | Does |
| --- | --- | --- |
| GS_Managers | API | Start a new game, continue from a save, load a specific file, or return to the title screen |
| GS_Save | API | Save and load game data, or track persistent flags and counters across sessions |
| GS_StageManager | API | Move between levels, or configure per-level spawn points and navigation settings |
| GS_Options | API | Read player input, disable input during menus, or swap control schemes at runtime |
| Utilities | API | Use easing curves, detect physics zones, smooth values, pick randomly, or work with splines |
| GS_Actions | API | Trigger a reusable behavior on an entity from a script, physics zone, or another action |
| GS_Motion | API | Animate a transform, color, or value smoothly over time |

GS_Audio

| Feature | API | Does |
| --- | --- | --- |
| Audio Manager | API | Manage the audio engine, load event libraries, or control master volume |
| Audio Events | API | Play sounds with pooling, 3D spatialization, and concurrency control |
| Mixing & Effects | API | Configure mixing buses with filters, EQ, and environmental influence effects |
| Score Arrangement | API | Layer music tracks dynamically based on gameplay state |
| Klatt Voice | API | Generate text-to-speech with configurable voice parameters and 3D spatial audio |

GS_Cinematics

| Feature | API | Does |
| --- | --- | --- |
| Cinematics Manager | API | Coordinate cinematic sequences and manage stage markers for actor positioning |
| Dialogue System | API | Author branching dialogue with conditions, effects, and performances |
| Dialogue UI | API | Display dialogue text, player choices, and speech babble on screen or in world space |
| Performances | API | Move actors to stage markers during dialogue with navigation or teleport |
| Timeline Expansion | API | Extend the sequence timeline with custom track and key types |

GS_Environment

| Feature | API | Does |
| --- | --- | --- |
| Time Manager | API | Control world time, time passage speed, or respond to day/night changes |
| Sky Configuration | API | Define how the sky looks at different times of day with data assets |

GS_Interaction

| Feature | API | Does |
| --- | --- | --- |
| Pulsors | API | Broadcast typed physics events from trigger volumes to receiving entities |
| Targeting | API | Find and lock onto the best interactable entity in proximity with a cursor overlay |
| World Triggers | API | Fire configurable responses from zones and conditions without scripting |

GS_Juice

| Feature | API | Does |
| --- | --- | --- |
| Feedback System | API | Play screen shake, bounce, flash, or material glow effects on entities |

GS_Performer

| Feature | API | Does |
| --- | --- | --- |
| Performer Manager | API | Manage registered performers or query them by name |
| Skin Slots | API | Swap equipment visuals at runtime with modular slot-based assets |
| Paper Performer | API | Render billboard-style 2.5D characters that face the camera correctly |
| Locomotion | API | Drive animation blend parameters from entity velocity automatically |
| Head Tracking | API | Procedurally orient a character’s head bone toward a world-space look-at target |
| Babble | API | Generate procedural vocalisation tones synchronized to dialogue typewriter output |

GS_PhantomCam

| Feature | API | Does |
| --- | --- | --- |
| Cam Manager | API | Manage the camera system lifecycle and know which camera is active |
| Phantom Cameras | API | Place virtual cameras with follow targets, look-at, and priority-based switching |
| Cam Core | API | Control how the real camera reads from the active phantom camera each frame |
| Blend Profiles | API | Define smooth transitions between cameras with custom easing |
| Influence Fields | API | Create spatial zones that modify camera behavior dynamically |

GS_UI

| Feature | API | Does |
| --- | --- | --- |
| UI Manager | API | Load and unload UI canvases, manage focus stack, or set startup focus |
| Page Navigation | API | Navigate between pages, handle back navigation, or cross canvas boundaries |
| UI Interaction | API | Button animations and input interception for UI canvases |
| UI Animation | API | Animate UI elements with position, scale, rotation, alpha, and color tracks |
| Widgets | API | Reusable UI building blocks for counters, sliders, and interactive controls |

GS_Unit

| Feature | API | Does |
| --- | --- | --- |
| Unit Manager | API | Register units, track active controllers, or spawn units at runtime |
| Controllers | API | Possess and release units with player or AI controllers |
| Input Data | API | Capture raw input and convert it into structured movement intent |
| Movement | API | Move characters with movers, grounders, and movement influence fields |

2.3 - Patterns

Patterns associated with the GS_Play featuresets.

Outline

The strength of the GS_Play framework is in its straightforward, intuitive patterns.

By creating simple ways to look at certain featuresets, you can reliably expand your gameplay knowing you’re always going to properly connect with the underlying system.

Using our extensive list of templates, you can even more rapidly develop the features that make your project unique.

Use this quick list to jump to the features you want to look at.

You can expect a diagram outlining the relationships between feature elements, and then explanations on how the pattern plays out.

Have fun!

 

Legend

Colour Legend for pattern diagrams

 

Contents


Managers & Startup - GS_Core

Game Manager Startup Pattern Graph

Breakdown

When the project starts, the Game Manager runs three stages before the game is considered ready:

| Stage | Broadcast Event | What It Means |
| --- | --- | --- |
| 1 — Initialize | (internal) | Each manager is spawned. They activate, then report ready. |
| 2 — Setup | OnSetupManagers | Setup stage. Now safe to query other managers. |
| 3 — Complete | OnStartupComplete | Last stage. Everything is ready. Do any last-minute things. Now safe to begin gameplay. |

For most scripts, you only need OnStartupComplete. Wait for this event before doing anything that depends on the managers being completely set up.
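The staging contract above can be sketched in miniature. MockManager and RunStartup are illustrative stand-ins, not GS_Core types; the point is only the ordering guarantee — OnSetupManagers always precedes OnStartupComplete, so code waiting on the final event can rely on every manager being fully set up.

```cpp
#include <cassert>
#include <string>
#include <vector>

struct MockManager {
    bool activated = false;
    bool setUp = false;
};

// Runs the three stages in order and returns the broadcast events fired.
std::vector<std::string> RunStartup(std::vector<MockManager>& managers) {
    std::vector<std::string> events;
    for (auto& m : managers) m.activated = true;   // 1 — Initialize (internal)
    events.push_back("OnSetupManagers");           // 2 — Setup
    for (auto& m : managers) m.setUp = true;       //     safe to query peers here
    events.push_back("OnStartupComplete");         // 3 — Complete
    return events;
}
```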

Back to top…


Save - GS_Core

Save System Pattern Graph

Breakdown

When a save or load is triggered, the Save Manager broadcasts to every Saver in the scene. Each Saver independently handles its own entity’s state. The Record Keeper persists flat progression data alongside the save file.

| Part | Broadcast Event | What It Means |
| --- | --- | --- |
| Save Manager | OnSaveAll | Broadcasts to all Savers on save. Each serializes its entity state. |
| Save Manager | OnLoadAll | Broadcasts to all Savers on load. Each restores its entity state. |
| Record Keeper | RecordChanged | Fires when any progression flag is created, updated, or deleted. |

Savers are per-entity components — add them to anything that needs to persist across sessions. The Record Keeper is a global singleton for flags and counters.
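The broadcast shape can be sketched as follows. MockSaver, BroadcastSave, and BroadcastLoad are hypothetical stand-ins, not the GS_Save API — they illustrate that the Save Manager never touches entity state directly; each Saver serializes and restores only its own entity into a shared save file.

```cpp
#include <cassert>
#include <map>
#include <string>
#include <vector>

using SaveFile = std::map<std::string, int>;

struct MockSaver {
    std::string key;
    int state = 0;
    // Each Saver writes only its own entity's state into the file.
    void OnSaveAll(SaveFile& file) const { file[key] = state; }
    // ...and restores only its own entry on load.
    void OnLoadAll(const SaveFile& file) {
        auto it = file.find(key);
        if (it != file.end()) state = it->second;
    }
};

SaveFile BroadcastSave(const std::vector<MockSaver>& savers) {
    SaveFile file;
    for (const auto& s : savers) s.OnSaveAll(file);   // OnSaveAll broadcast
    return file;
}

void BroadcastLoad(std::vector<MockSaver>& savers, const SaveFile& file) {
    for (auto& s : savers) s.OnLoadAll(file);         // OnLoadAll broadcast
}
```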

Back to top…


Stage Change - GS_Core

Stage Change Pattern Graph

Breakdown

When a stage change is requested, the system follows a five-step sequence before gameplay resumes in the new level:

| Step | Broadcast Event | What It Means |
| --- | --- | --- |
| 1 — Standby | OnEnterStandby | Game Manager enters standby, halting all gameplay systems. |
| 2 — Unload | (internal) | The current stage’s entities are torn down. |
| 3 — Spawn | (internal) | The target stage prefab is instantiated. |
| 4 — Set Up | OnBeginSetUpStage | Stage Data runs its layered startup: SetUpStage → ActivateByPriority → Complete. |
| 5 — Complete | LoadStageComplete | Stage Manager broadcasts completion. Standby exits. Gameplay resumes. |

The Stage Data startup is layered — OnBeginSetUpStage, ActivateByPriority, then OnLoadStageComplete — so heavy levels can initialize incrementally without causing frame-time spikes.
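From a listener's perspective, the sequence reduces to the event order below (a sketch — engine internals for steps 2 and 3 are elided, and StageChangeEvents is an illustrative helper, not a framework call). Scripts that must wait for the new level should key off the final broadcast.

```cpp
#include <cassert>
#include <string>
#include <vector>

// The broadcasts a script observes during one stage change, in order.
std::vector<std::string> StageChangeEvents() {
    return {
        "OnEnterStandby",      // 1 — gameplay halts
        "OnBeginSetUpStage",   // 4 — layered stage startup begins
        "LoadStageComplete",   // 5 — standby exits, gameplay resumes
    };                         // steps 2 (Unload) and 3 (Spawn) are internal
}
```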

Back to top…


Dialogue Authoring - GS_Cinematics

Dialogue Authoring Pattern Graph

Breakdown

A dialogue sequence is authored in the node editor, stored in a .dialoguedb asset, and driven at runtime by the Dialogue Manager and Sequencer:

| Layer | What It Means |
| --- | --- |
| DialogueDatabase | Stores named actors and sequences. Loaded at runtime by the Dialogue Manager. |
| DialogueSequence | A directed node graph. The Sequencer walks from startNodeId through Text, Selection, Effects, and Performance nodes. |
| Conditions | Polymorphic evaluators on branches. Failed conditions skip that branch automatically. |
| Effects | Polymorphic actions at EffectsNodeData nodes — set records, toggle entities. |
| Performers | Named actor anchors in the level. Resolved from database actor names via DialoguePerformerMarkerComponent. |

Conditions, effects, and performances are discovered automatically at startup — custom types from any gem appear in the editor without modifying GS_Cinematics.
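Condition-gated branching can be sketched with a simplified node model (Option, Passes, and NextNode are hypothetical stand-ins, not the GS_Cinematics classes): each outgoing option carries condition callbacks, and the sequencer follows the first option whose conditions all pass.

```cpp
#include <cassert>
#include <functional>
#include <string>
#include <vector>

struct Option {
    std::string targetNodeId;
    std::vector<std::function<bool()>> conditions;  // DialogueCondition-style

    bool Passes() const {
        for (const auto& c : conditions)
            if (!c()) return false;   // one failed condition skips the branch
        return true;
    }
};

// Returns the first viable branch, or "" if every branch is gated off.
std::string NextNode(const std::vector<Option>& options) {
    for (const auto& o : options)
        if (o.Passes()) return o.targetNodeId;
    return "";
}
```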

Back to top…


Pulse - GS_Interaction

Pulse Pattern Graph

Breakdown

When an entity enters a Pulsor’s trigger volume, the Pulsor emits its configured pulse type to all Reactors on the entering entity:

| Step | What It Means |
| --- | --- |
| 1 — Collider overlap | Physics detects an entity entering the Pulsor’s trigger volume. |
| 2 — Pulse emit | PulsorComponent reads its configured PulseType and prepares the event. |
| 3 — Reactor query | Each PulseReactorComponent on the entering entity is checked with IsReactor(). |
| 4 — Reaction | Reactors returning true have ReceivePulses() called and execute their response. |

Pulse types are polymorphic — new types are discovered automatically at startup. Any gem can define custom interaction semantics without modifying GS_Interaction.
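The emit/query/react handshake looks roughly like this sketch. MockReactor and EmitPulse are stand-ins; a string tag stands in for the real polymorphic PulseType hierarchy.

```cpp
#include <cassert>
#include <string>
#include <vector>

struct MockReactor {
    std::string reactsTo;
    int pulsesReceived = 0;

    // 3 — reactor query: does this reactor care about this pulse type?
    bool IsReactor(const std::string& pulseType) const {
        return pulseType == reactsTo;
    }
    // 4 — reaction: only queried-true reactors get the pulse.
    void ReceivePulses(const std::string& /*pulseType*/) { ++pulsesReceived; }
};

// 2 — the Pulsor emits its configured type to every reactor on the entity.
void EmitPulse(const std::string& pulseType, std::vector<MockReactor>& reactors) {
    for (auto& r : reactors)
        if (r.IsReactor(pulseType)) r.ReceivePulses(pulseType);
}
```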

Back to top…


World Triggers - GS_Interaction

World Trigger Pattern Graph

Breakdown

World Triggers split conditions from responses. Any number of World Triggers can stack on a single entity — each fires its own response independently when the condition is met.

| Part | What It Means |
| --- | --- |
| Trigger Sensor | Monitors for a condition. When satisfied, calls Trigger() on all WorldTriggerComponent instances on the same entity. |
| World Trigger | Executes a response when Trigger() is called. Can be reset with Reset() to re-arm for the next activation. |

No scripting is required for standard patterns. Compose Trigger Sensors and World Triggers in the editor to cover the majority of interactive world objects without writing any code.
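The sensor/trigger split can be sketched with stand-in classes (MockWorldTrigger and SensorSatisfied are illustrative, not the GS_Interaction types): one sensor firing reaches every trigger stacked on the entity, and Reset() re-arms a trigger for the next activation.

```cpp
#include <cassert>
#include <vector>

struct MockWorldTrigger {
    bool fired = false;
    int activations = 0;

    void Trigger() {
        if (fired) return;        // one-shot until re-armed
        fired = true;
        ++activations;            // the configured response executes here
    }
    void Reset() { fired = false; }
};

// Condition side: when satisfied, fire every trigger on the same entity.
void SensorSatisfied(std::vector<MockWorldTrigger>& triggersOnEntity) {
    for (auto& t : triggersOnEntity) t.Trigger();
}
```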

Back to top…


Camera Blend - GS_PhantomCam

Camera Blend Pattern Graph

Breakdown

When the dominant Phantom Camera changes, the Cam Manager queries the Blend Profile to determine transition timing and easing:

| Step | What It Means |
| --- | --- |
| 1 — Priority change | A Phantom Camera gains highest priority or is activated. |
| 2 — Profile query | Cam Manager calls GetBestBlend(fromCam, toCam) on the assigned GS_PhantomCamBlendProfile asset. |
| 3 — Entry match | Entries are checked by specificity: exact match → any-to-specific → specific-to-any → default fallback. |
| 4 — Interpolation | Cam Core blends position, rotation, and FOV over the matched entry’s BlendTime using the configured EasingType. |

Because Blend Profiles are shared assets, editing one profile updates every camera that references it.
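The entry-matching order from step 3 can be sketched as a scoring function. This is an illustrative reimplementation, not the real GS_PhantomCamBlendProfile code: an exact pair beats any-to-specific, which beats specific-to-any, which beats the default fallback.

```cpp
#include <cassert>
#include <string>
#include <vector>

struct BlendEntry {
    std::string from;   // "" means "any camera"
    std::string to;     // "" means "any camera"
    float blendTime;
};

// Returns the most specific matching entry's blend time, or the fallback.
float GetBestBlend(const std::vector<BlendEntry>& entries,
                   const std::string& fromCam, const std::string& toCam,
                   float defaultTime = 0.5f) {
    int bestScore = -1;
    float best = defaultTime;
    for (const auto& e : entries) {
        bool fromOk = e.from.empty() || e.from == fromCam;
        bool toOk   = e.to.empty()   || e.to == toCam;
        if (!fromOk || !toOk) continue;
        // exact = 3, any-to-specific = 2, specific-to-any = 1, any-to-any = 0
        int score = (!e.from.empty() && !e.to.empty()) ? 3
                  : (!e.to.empty())                    ? 2
                  : (!e.from.empty())                  ? 1 : 0;
        if (score > bestScore) { bestScore = score; best = e.blendTime; }
    }
    return best;
}
```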

Back to top…


Unit Possession - GS_Unit

Unit Possession Pattern Graph

Breakdown

Every unit has exactly one controller at a time, or no controller at all. Possession is established by calling Possess on the unit and released by calling DePossess. The unit fires UnitPossessed on UnitNotificationBus whenever ownership changes so other systems can react.

| Concept | Description |
| --- | --- |
| Possession | A controller attaches to a unit. The unit accepts input and movement commands from that controller only. |
| DePossession | The controller releases the unit. The unit halts input processing and enters a neutral state. |
| UnitPossessed event | Fires on UnitNotificationBus (addressed by entity ID) whenever a unit’s controller changes. |
| GetController | Returns the entity ID of the current controller, or an invalid ID if none. |
| GetUniqueName | Returns the string name assigned by the Unit Manager when this unit was spawned. |
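The single-controller invariant can be sketched with a stand-in type (MockUnit is hypothetical, not GS_UnitComponent): Possess replaces any previous controller, DePossess clears it, and each change would fire UnitPossessed on the unit's notification bus.

```cpp
#include <cassert>
#include <vector>

using EntityId = int;
constexpr EntityId InvalidId = -1;

struct MockUnit {
    EntityId controller = InvalidId;
    std::vector<EntityId> possessionEvents;   // stand-in for UnitPossessed

    void Possess(EntityId newController) {
        controller = newController;           // at most one controller at a time
        possessionEvents.push_back(newController);
    }
    void DePossess() {
        controller = InvalidId;               // neutral state; input halts
        possessionEvents.push_back(InvalidId);
    }
    EntityId GetController() const { return controller; }
};
```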

Back to top…


Unit Input Handling - GS_Unit

Unit Input Handling Pattern Graph

Breakdown

The input pipeline has three stages. Each stage is a separate component on the unit entity, and they run in order every frame:

| Stage | Component | What It Does |
| --- | --- | --- |
| 1 — Read | GS_PlayerControllerInputReaderComponent | Reads raw input events from the active input profile and writes them into GS_InputDataComponent. |
| 2 — Store | GS_InputDataComponent | Holds the current frame’s input state — button presses, axis values — as structured data. |
| 3 — React | Reactor components | Read from GS_InputDataComponent and produce intent: movement vectors, action triggers, etc. |

All reactor components downstream of the store stage read from the same GS_InputDataComponent, so there is no duplicated hardware polling and no risk of two reactors seeing different input states for the same frame.
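The read → store → react contract can be sketched like this. The types and names are stand-ins for the real components; the point is that the snapshot is written once per frame and every reactor reads that same data:

```cpp
#include <functional>
#include <vector>

// Hypothetical stand-in for GS_InputDataComponent: one snapshot per frame.
struct InputData {
    float moveX = 0.0f, moveY = 0.0f;
    bool jumpPressed = false;
};

struct InputPipeline {
    InputData store;                                      // stage 2: the snapshot
    std::vector<std::function<void(const InputData&)>> reactors;

    void Tick(const InputData& hardware) {
        store = hardware;                // stage 1: single read/write per frame
        for (auto& reactor : reactors)   // stage 3: all reactors see identical state
            reactor(store);
    }
};
```

Because the hardware is polled exactly once, two reactors can never disagree about what the player pressed this frame.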

Back to top…


Movers & Grounders - GS_Unit

Movers & Grounders Pattern Graph

Breakdown

Movers and Grounders are multiple components on a Unit whose activation changes constantly based on the unit’s state. When a mover is active, it freely processes the unit’s movement according to its functionality. At any moment, through internal or external forces, the Movement, Rotation, or Grounding state can change, which deactivates the old movers and activates new ones to continue controlling the Unit.

```
Input
    ↓  InputDataNotificationBus::InputStateChanged
Input Reactor Components
    ↓  MoverContextRequestBus::SetMoveInputAxis
GS_MoverContextComponent
    ↓  ModifyInputAxis() → camera-relative
    ↓  GroundInputAxis() → ground-projected
    ↓  MoverContextNotificationBus::MovementModeChanged("Free")
GS_3DFreeMoverComponent  [active when mode == "Free"]
    ↓  AccelerationSpringDamper → rigidBody->SetLinearVelocity
GS_PhysicsRayGrounderComponent  [active when grounding mode == "Free"]
    ↓  MoverContextRequestBus::SetGroundNormal / SetContextState("grounding", ...)
    ↓  MoverContextRequestBus::ChangeMovementMode("Slide")  ← when slope too steep
GS_3DSlideMoverComponent  [active when mode == "Slide"]
```

The Slide mover activates when the Unit is walking on a slope that is too steep. It takes control of the unit, slides it down the hill, then restores the previous movement behaviour.
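That hand-off can be sketched in plain C++. The names here (ChangeMovementMode, OnGroundSlope) are illustrative stand-ins, not the real GS_MoverContextComponent interface:

```cpp
#include <string>

// Toy mode switcher: only one mover is "active" at a time, identified by name.
struct MoverContext {
    std::string activeMode = "Free";
    std::string previousMode = "Free";

    void ChangeMovementMode(std::string mode) {  // by value: safe vs. self-assign
        if (mode == activeMode) return;
        previousMode = activeMode;  // old mover deactivates...
        activeMode = mode;          // ...new mover takes over
    }

    // Grounder-style check: force Slide on steep ground, restore the
    // previous mode once the slope is walkable again.
    void OnGroundSlope(float slopeDegrees, float maxWalkable) {
        if (slopeDegrees > maxWalkable && activeMode != "Slide")
            ChangeMovementMode("Slide");
        else if (slopeDegrees <= maxWalkable && activeMode == "Slide")
            ChangeMovementMode(previousMode);
    }
};
```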

Back to top…

2.4 - Templates List

All ClassWizard templates for GS_Play gems — one-stop reference for generating extension classes across every gem.

All GS_Play extension types are generated through the ClassWizard CLI. The wizard handles UUID generation, cmake file-list registration, and module descriptor injection automatically. Never create these files from scratch.

```shell
python ClassWizard.py \
    --template <TemplateName> \
    --gem <GemPath> \
    --name <SymbolName> \
    [--input-var key=value ...]
```

 

Contents


GS_Core

Full details: GS_Core Templates

| Template | Generates | Use For |
| --- | --- | --- |
| GS_ManagerComponent | ${Name}ManagerComponent.h/.cpp + optional ${Name}Bus.h | Manager-pattern components with EBus interface |
| SaverComponent | ${Name}SaverComponent.h/.cpp | Custom save/load handlers via the GS_Save system |
| GS_InputReaderComponent | ${Name}InputReaderComponent.h/.cpp | Controller-side hardware input readers |
| PhysicsTriggerComponent | ${Name}PhysicsTriggerComponent.h/.cpp | PhysX trigger volumes with enter/hold/exit callbacks |

GS_Interaction

Full details: GS_Interaction Templates

| Template | Generates | Use For |
| --- | --- | --- |
| PulsorPulse | ${Name}_Pulse.h/.cpp | New pulse type for the Pulsor emitter/reactor system |
| PulsorReactor | ${Name}_Reactor.h/.cpp | New reactor type that responds to a named channel |
| WorldTrigger | ${Name}_WorldTrigger.h/.cpp | New world trigger that fires a named event |
| TriggerSensor | ${Name}_TriggerSensor.h/.cpp | New trigger sensor that listens and responds to events |

GS_Unit

Full details: GS_Unit Templates

| Template | Generates | Use For |
| --- | --- | --- |
| UnitController | ${Name}ControllerComponent.h/.cpp | Custom controller for player or AI possession logic |
| InputReactor | ${Name}InputReactorComponent.h/.cpp | Unit-side input translation from named event to bus call |
| Mover | ${Name}PhysicsMoverComponent.h/.cpp or ${Name}MoverComponent.h/.cpp | Custom locomotion mode (walk, swim, climb, etc.) |
| Grounder | ${Name}PhysicsRayGrounderComponent.h/.cpp or ${Name}GrounderComponent.h/.cpp | Custom ground state detection (ray, capsule, custom) |

GS_UI

Full details: GS_UI Templates

| Template | Generates | Use For |
| --- | --- | --- |
| UiMotionTrack | ${Name}Track.h/.cpp | New LyShine property animation track for .uiam assets |

GS_Juice

Full details: GS_Juice Templates

| Template | Generates | Use For |
| --- | --- | --- |
| FeedbackMotionTrack | ${Name}Track.h/.cpp | New world-space entity property animation track for .feedbackmotion assets |

GS_PhantomCam

Full details: GS_PhantomCam Templates

| Template | Generates | Use For |
| --- | --- | --- |
| PhantomCamera | ${Name}PhantomCamComponent.h/.cpp | Custom camera behaviour (follow, look-at, tick overrides) |

GS_Cinematics

Full details: GS_Cinematics Templates

| Template | Generates | Use For |
| --- | --- | --- |
| DialogueCondition | ${Name}_DialogueCondition.h/.cpp | Gate dialogue node connections — return true to allow |
| DialogueEffect | ${Name}_DialogueEffect.h/.cpp | Fire world events from Effects nodes; optionally reversible |
| DialoguePerformance | ${Name}_DialoguePerformance.h/.cpp | Drive world-space NPC actions; sequencer waits for completion |

Registration Quick Reference

| Template | Auto-registered by CLI | Manual step required |
| --- | --- | --- |
| Manager Component | Yes (cmake + module) | None |
| Saver Component | Yes (cmake + module) | None |
| InputReader Component | Yes (cmake + module) | None |
| Physics Trigger Component | Yes (cmake + module) | None |
| Pulsor Pulse | Yes (cmake) | None |
| Pulsor Reactor | Yes (cmake) | None |
| World Trigger | Yes (cmake) | None |
| Trigger Sensor | Yes (cmake) | None |
| Unit Controller | Yes (cmake + module) | None |
| Input Reactor | Yes (cmake + module) | None |
| Mover Component | Yes (cmake + module) | Set mode name strings in Activate() |
| Grounder Component | Yes (cmake + module) | Set mode name string in Activate() |
| Phantom Camera | Yes (cmake + module) | None |
| UiMotionTrack | Yes (cmake) | Add Reflect(context) to UIDataAssetsSystemComponent |
| FeedbackMotionTrack | Yes (cmake) | Add Reflect(context) to GS_JuiceDataAssetSystemComponent |
| DialogueCondition | Yes (cmake) | Add Reflect(context) to DialogueSequencerComponent |
| DialogueEffect | Yes (cmake) | Add Reflect(context) to DialogueSequencerComponent |
| DialoguePerformance | Yes (cmake) | Add Reflect(context) to DialogueSequencerComponent |

2.5 - Powerful Utilities

Powerful Utilities associated with the GS_Play featuresets.

Powerful Utilities Outline

Powerful Utilities

2.6 - Glossary

Descriptions and resources around common terminology used in the documentation.

GS_Play Definitions

 

Action

An Action is a reusable, entity-scoped behavior triggered by name. Actions are authored as ActionComponent instances and fired via ActionRequestBus. Any trigger source — a physics zone, dialogue effect, another action, or scripted logic — can invoke an action by name without needing a direct reference to the target component.

 

Babble

Babble is procedural audio vocalisation synchronized to the Typewriter text display system in GS_Cinematics. The BabbleComponent generates tones keyed to a speaker’s identity and tone configuration, producing low-fidelity speech sounds as each character is revealed. See Babble API.

 

Controller

See Unit Controller.

 

Coyote Time

Coyote Time is a grace period during which a grounded jump remains valid for a brief window after the unit walks off a ledge. It prevents the frustrating experience of pressing jump a frame too late when stepping off an edge. Configured per mover component as a float duration in seconds.
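A minimal sketch of the check, assuming a float duration property on the mover (the names below are illustrative, not the real component fields):

```cpp
// Tracks how long the unit has been off the ground; a jump stays valid
// while grounded or within the grace window after leaving a ledge.
struct CoyoteTimer {
    float coyoteDuration = 0.15f;   // grace window, in seconds
    float timeSinceGrounded = 0.0f;
    bool grounded = false;

    void Tick(float dt, bool isGroundedNow) {
        grounded = isGroundedNow;
        timeSinceGrounded = grounded ? 0.0f : timeSinceGrounded + dt;
    }

    bool CanJump() const {
        return grounded || timeSinceGrounded <= coyoteDuration;
    }
};
```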

 

Debug Mode

Debug Mode is a setting on the Game Manager component that makes the game start in the current editor level instead of navigating to the title stage. All manager initialization and event broadcasting proceed normally — only the startup navigation changes. Use it for rapid iteration on any level without going through the full boot flow.

 

Extensible

Extensible means able to be inherited from to augment and expand on the core functionality. Extensible classes and components, identified by the extensible tag in the documentation, are those available on the public Gem API build target and usable across gem and project boundaries. Extensible classes usually have ClassWizard templates to rapidly generate child classes.

 

Feedback Sequence

A Feedback Sequence is a data asset (.feedbackmotion) that drives world-space entity property animations — position, rotation, scale, material parameters — over time. Sequences are played on entities via GS_FeedbackRequestBus. Track types within a sequence are polymorphic and extensible via the FeedbackMotionTrack ClassWizard template. See Feedback System API.

 

Grounder

A Grounder is a component that detects whether a unit is standing on a surface and reports the ground normal, surface material, and grounded state. Grounders feed data to movers and to coyote time logic. The built-in GS_PhysicsRayGrounderComponent uses a downward raycast. Custom grounders can be created with the Grounder ClassWizard template. See Movement API.

 

Input Data

Input Data is a structured snapshot of a player’s current input state: movement vectors, button states, and action flags. It is held by GS_InputDataComponent on the controller entity and updated each frame by an InputReaderComponent. Input Reactor components read from this snapshot to translate intent into unit commands. See Input Data API.

 

Input Profile

An Input Profile is a named mapping between hardware input events and GS_InputData fields. Profiles are defined in the GS_Options system and can be swapped at runtime to support multiple control schemes (gamepad, keyboard/mouse, accessibility layouts). See GS_Options API.

 

Input Reactor

An Input Reactor is a component on the unit entity that reads GS_InputData from the controller and translates it into movement vectors or action bus calls. Input Reactors are the bridge between the controller’s intent and the unit’s subsystems. Custom reactors are created with the InputReactor ClassWizard template. See Input Data API.

 

Input Reader

An Input Reader is a component on the controller entity that polls hardware input events and writes them into GS_InputDataComponent. It is the hardware-facing end of the input pipeline. The built-in GS_PlayerControllerInputReaderComponent handles standard gamepad and keyboard events. Custom readers are created with the GS_InputReaderComponent ClassWizard template. See Input Data API.

 

Manager

A Manager is a component that extends GS_ManagerComponent and registers with the Game Manager at startup. Managers are spawned by the Game Manager, initialized in a guaranteed two-stage sequence, and made globally accessible via EBus. Each GS_Play gem that provides a system component (Save, Stage, UI, Audio, etc.) includes a manager.

 

Modular

Modular describes a system or feature that is self-contained with clear boundaries between its internal structure and external integrations. Base functionality has no outside dependencies beyond GS_Core, ensuring that when you interface with it, no additional gems are required for it to work. GS_Core itself is the sole shared dependency across all GS_Play gems. Some modular features optionally integrate with O3DE subsystems (for example, GS_UI depends on LyShine).

 

Motion Composite

A Motion Composite is a named grouping of motion layers within GS_Motion that play and blend together. Composites allow multiple concurrent animations (e.g., a camera shake layered over a position tween) to be authored, started, and stopped as a unit.

 

Motion Proxy

A Motion Proxy is a lightweight target object used by GS_Motion to animate values that are not directly addressable on a component. Proxies act as intermediate containers; the animation writes into them, and a binding reads the proxy value each frame and applies it to the real target.

 

Motion Track

A Motion Track is a single animatable property lane within a Feedback Sequence or UI Animation asset. Each track drives one property (e.g., position X, material tint alpha) over time using keyframe curves. Custom track types are registered via Reflect(context) and authored with a ClassWizard template (FeedbackMotionTrack or UiMotionTrack). See Feedback API and UI Animation API.

 

Mover

A Mover is a component that implements one locomotion mode for a unit: walking, swimming, climbing, floating, and so on. Each mover processes movement input and outputs a velocity or transform delta. The active mover is selected by name at runtime. Multiple movers can coexist on one entity; only one is active at a time. Custom movers are created with the Mover ClassWizard template. See Movement API.

 

Movement Mode

A Movement Mode is the named identifier of a mover component. Units switch modes by calling SetMovementMode with the mode string defined in the mover’s Activate(). Common examples: "Walk", "Swim", "Fly".

 

MoverContext

GS_MoverContextComponent transforms raw input intent into mode-aware movement commands and manages which movement mode is active at any given moment. It holds movement profiles and routes input to the correct mover. Only one mover runs at a time — mode switches activate a different mover, not a blend of multiple. See Movement API.

 

Paper Performer

A Paper Performer is a billboard-based character rendering approach for 2.5D and top-down games. The PaperFacingHandlerComponent keeps a sprite quad oriented toward the active camera or a configurable facing target each tick. Custom facing strategies can be created by extending PaperFacingHandlerBaseComponent. See Paper Performer API.

 

Performer

A Performer is a character entity registered with the GS_PerformerManagerComponent. Performers are addressable by a unique name string and support skin slot swapping, locomotion hook binding, and optional head tracking. The Performer Manager is the lookup point for retrieving performer entities by name at runtime. See Performer API.

 

Phantom Camera

A Phantom Camera is a virtual camera component (GS_PhantomCameraComponent) that holds a complete camera state: FOV, clip planes, follow target, look-at target, offsets, and priority. Phantom cameras do not render — they are candidate states. The Cam Core blends the real camera toward the highest-priority phantom. Custom behavior is added by extending the component with the PhantomCamera ClassWizard template. See Phantom Cameras API.

 

Possession

Possession is the relationship between a Unit Controller and a Unit. A controller possesses a unit by calling Possess, gaining authority over its movement and actions. A unit can only be possessed by one controller at a time. Possession is released by calling DePossess. See Unit Controllers API.

 

Pulse / Pulsor

A Pulse is a typed data payload broadcast from an emitter trigger volume into the Pulsor system. A Pulsor Emitter fires pulses when triggered; Pulsor Reactors on other entities receive pulses on a matching named channel. New pulse types are created with the PulsorPulse ClassWizard template. See Pulsors API.

 

Reactor

A Reactor is a component that receives Pulses on a named channel from the Pulsor system and executes a response. Reactors subscribe to a specific pulse type and channel name. Multiple reactors can listen on the same channel simultaneously. New reactor types are created with the PulsorReactor ClassWizard template. See Pulsors API.

 

Record / RecordKeeper

A Record is a named integer value stored persistently in the GS_Save system as a flat key-value entry. Records are used for progression flags and counters — quest state, NPC disposition, collected items, and similar global data. The RecordKeeperRequestBus provides HasRecord, SetRecord, GetRecord, and DeleteRecord. Records persist alongside the save file and survive level transitions. See GS_Save API.
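The four bus calls map naturally onto a flat key-value map. This sketch models the behaviour only (including an assumed default of 0 for missing keys); the real handler persists records alongside the save file:

```cpp
#include <string>
#include <unordered_map>

// Toy record store mirroring HasRecord / SetRecord / GetRecord / DeleteRecord.
class ToyRecordKeeper {
public:
    bool HasRecord(const std::string& key) const {
        return m_records.count(key) != 0;
    }
    void SetRecord(const std::string& key, int value) {
        m_records[key] = value;
    }
    int GetRecord(const std::string& key) const {
        auto it = m_records.find(key);
        return it != m_records.end() ? it->second : 0;  // assumed default
    }
    void DeleteRecord(const std::string& key) {
        m_records.erase(key);
    }
private:
    std::unordered_map<std::string, int> m_records;  // flat key-value entries
};
```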

 

Skin Slot

A Skin Slot is one addressable mesh layer in a modular character’s appearance composition. Each slot holds an actor asset and a list of material overrides. PerformerSkinSlotComponent manages one slot; SkinSlotHandlerComponent manages the full set for a character and applies PerformerSkinSlotsConfigProfile presets. See Performer API.

 

Stage Marker

A Stage Marker is a named world-space anchor entity used by GS_Cinematics to position NPCs during dialogue sequences. A Performance navigates or teleports the assigned NPC to the marker’s transform. Markers are registered by name with the Cinematics Manager. See Cinematics Manager API.

 

Standby

Standby is a global pause broadcast to all managers and their subsystems. The Game Manager enters standby automatically during level transitions and other blocking operations, broadcasting OnEnterStandby to halt gameplay systems and OnExitStandby when the operation completes.

 

Startup Sequence

The Startup Sequence is the three-stage lifecycle the Game Manager runs before gameplay is ready: Stage 1 (Initialize) — each manager activates internally; Stage 2 (SetupManagers) — all managers connect to each other; Stage 3 (StartupComplete) — the game is fully ready. Listen to OnStartupComplete on GameManagerNotificationBus to know when it is safe to begin gameplay.

 

Timeline Expansion

Timeline Expansion refers to the extensibility layer of GS_Cinematics that allows new track and key types to be added to the dialogue sequence timeline. Custom tracks register via Reflect(context) and integrate with the sequencer’s playback engine. See Timeline Expansion API.

 

Trigger Channel

A Trigger Channel is the named string key used to match World Triggers to Trigger Sensors. A trigger fires events on its channel name; any sensor listening to the same channel name receives the event. Channels are free-form strings authored in the component properties.

 

Trigger Sensor

A Trigger Sensor is a component that listens on a named Trigger Channel and executes a response when an event fires on that channel. New sensor types are created with the TriggerSensor ClassWizard template. See World Triggers API.

 

Unit

A Unit is any entity possessable by a controller. The GS_UnitComponent marker transforms an ordinary entity into a unit, registering it with the Unit Manager and exposing the possession interface. The unit provides the body (movement, collision, visuals); the controller provides the brain (input or AI). See Units API.

 

Unit Controller

A Unit Controller is a component that possesses a unit entity and drives its behavior. The base GS_UnitControllerComponent handles possession, placement helpers, and unique naming. GS_PlayerControllerComponent adds player input integration; GS_AIControllerComponent adds AI hooks. Custom controllers are created with the UnitController ClassWizard template. See Unit Controllers API.

 

World Trigger

A World Trigger is a component that fires a named event on a Trigger Channel when its activation condition is met — volume entry, exit, proximity, or custom logic. New trigger types are created with the WorldTrigger ClassWizard template. See World Triggers API.


O3DE Source Definitions

 

AZ::Component

AZ::Component is the base class for all O3DE components. All GS_Play components ultimately inherit from it, either directly or through a GS_Play base class. It provides the Activate(), Deactivate(), GetProvidedServices(), and Reflect() interface that the O3DE component system calls at runtime.

 

AZ::Data::AssetData

AZ::Data::AssetData is the base class for O3DE asset types. Custom data assets in GS_Play (such as FeedbackMotionAsset, PerformerSkinSlotsConfigProfile, save records) derive from this class and are serialized by the Asset Processor.

 

AZ::EntityId

AZ::EntityId is the unique identifier for an entity in O3DE. It is the primary address type for ById EBus calls. An invalid (null) entity ID returns false from IsValid(). Most GS_Play bus calls take an AZ::EntityId as their address.

 

Class Wizard

The Class creation utility built natively into the O3DE codebase. Use the Class Wizard to create many common classes needed for O3DE project development. GS_Play extends the Class Wizard with gem-specific templates for generating extension classes. See ClassWizard reference.

 

Component

A Component in O3DE is a discrete unit of behavior or data that lives on an Entity. Components are authored in C++ as classes inheriting from AZ::Component, reflected into the engine’s serialization and editing systems, and added to entities in the Editor. GS_Play provides pre-built component types for all its subsystems.

 

EBus

An EBus (Event Bus) is O3DE’s primary inter-component communication mechanism. It is a typed, address-routed publish/subscribe bus. “Request” buses carry method calls to a handler; “Notification” buses broadcast events to listeners. Buses can be addressed by entity ID (ById), by a custom address, or broadcast to all handlers. All GS_Play public APIs are exposed as EBus interfaces.
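Address routing is the part newcomers trip over most, so here is a toy model of it. Real EBuses are typed C++ templates with much more machinery; only the routing idea carries over:

```cpp
#include <cstdint>
#include <functional>
#include <map>
#include <vector>

// Toy bus: handlers connect at an address (e.g. an entity ID). Event()
// reaches only handlers at that address; Broadcast() reaches every handler.
struct ToyNotificationBus {
    using Handler = std::function<void(int payload)>;
    std::map<uint64_t, std::vector<Handler>> handlers;

    void Connect(uint64_t address, Handler h) {
        handlers[address].push_back(std::move(h));
    }
    void Event(uint64_t address, int payload) {   // ById-style delivery
        for (auto& h : handlers[address]) h(payload);
    }
    void Broadcast(int payload) {                 // delivery to all handlers
        for (auto& entry : handlers)
            for (auto& h : entry.second) h(payload);
    }
};
```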

 

EditContext

AZ::EditContext is the O3DE reflection context that controls how a type appears in the Editor Inspector. EditContext entries define display names, category, tooltip, slider limits, and which fields are visible. It is accessed from inside Reflect(context) after the SerializeContext block. See O3DE Reflection Rules in project instructions for placement constraints.

 

Entity

An Entity in O3DE is a container of components. Entities themselves have no behavior — all behavior comes from the components attached to them. Entities are identified by an AZ::EntityId and can be loaded from prefabs, spawned at runtime via EntitySpawnTicket, or created procedurally.

 

EntitySpawnTicket

An AZ::EntitySpawnTicket is the runtime handle for a spawned entity or prefab instance in O3DE. It keeps the spawned entity alive as long as the ticket is held. Releasing the ticket destroys the spawned entities. GS_Play’s Unit Manager and Stage Manager use spawn tickets to manage spawned unit and manager entities.

 

Gem

A Gem is O3DE’s modular project extension unit — equivalent to a plugin or package. Each GS_Play subsystem is packaged as a gem (GS_Core, GS_Unit, GS_UI, etc.). Gems declare their public API via build targets, and projects opt in by listing gems in their project configuration.

 

Module Descriptor

A Module Descriptor is the O3DE gem entry point — a class derived from AZ::Module that registers component type descriptors with the engine on startup. Every GS_Play gem has a module descriptor, and the ClassWizard templates automatically add new component descriptors to it when generating extension classes.

 

O3DE

Open 3D Engine — the open-source 3D engine on which GS_Play is built. Maintained by the Linux Foundation. O3DE provides the entity-component framework, asset pipeline, physics integration (PhysX), UI system (LyShine), animation (EMotionFX), and scripting (Script Canvas, Lua) that GS_Play builds upon.

 

Prefab

A Prefab in O3DE is a reusable, nestable entity hierarchy saved as a .prefab file. Prefabs are the O3DE equivalent of Unity prefabs or Unreal Blueprints at the entity composition level. GS_Play’s Game Manager, unit templates, and manager entities are typically authored and spawned as prefabs.

 

ReflectContext

AZ::ReflectContext is the base class for all O3DE reflection contexts. The Reflect(AZ::ReflectContext* context) static method on each component is called by the engine during startup and accepts a ReflectContext that may be cast to SerializeContext, EditContext, BehaviorContext, or others. Each context controls a different aspect of how the type is surfaced to the engine.

 

SerializeContext

AZ::SerializeContext is the O3DE reflection context that controls how a type is serialized — to .prefab files, save data, and asset files. It defines which fields are persisted, the type version, and the class hierarchy. It is the first context accessed in a Reflect() implementation. Placing ->Attribute(AZ::Edit::Attributes::EnableForAssetEditor, true) in SerializeContext (not EditContext) is required for asset types to appear in the Asset Editor.

 

Templates

Templates are pre-designed packages that allow the generative creation of gems, components, classes, and file types by copying and modifying template files to meet systemic needs. In GS_Play, ClassWizard templates handle UUID generation, cmake file-list registration, and module descriptor injection automatically. See the Templates List for all available GS_Play templates.

 

TypeId / UUID

A TypeId (also called UUID in O3DE context) is the globally unique identifier assigned to every reflected C++ type. It is a 128-bit value expressed as a hex string in braces, e.g. {A1B2C3D4-...}. The ClassWizard generates a new random UUID for each class it creates. UUIDs must be unique across all gems in a project; never duplicate them. They appear in AZ_COMPONENT_IMPL and in SerializeContext ->Class<>() declarations.

2.7 - Links

Useful links.

Genome Studios

GS_HUB discord link

GS_Play Asset Store Page

gs_play branding guidelines

gs_play site terms of service

gs_play EULA

gs_play socials

genome socials

o3de site

o3de documentation

o3de branding guidelines

3 - The Basics

High-level usage and scripting references for GS_Play feature sets. Drop-in functionality for intermediate developers and designers.

GS_Play is a full production set of modular features, each individually togglable so you enable only what your project needs. The “Basics” section covers what each feature set does, how to work with it from the editor and scripts, and what events and nodes are available — without requiring deep architectural knowledge.

If you are new to GS_Play, start with Get Started first. If you need architecture-level detail, component internals, or extension guides, use the Framework API section instead.


How This Section Is Organized

Each gem section follows this structure:

Gem overview page — What the gem does, the problems it solves, and a summary of its feature sets with links to feature sub-pages.

Feature sub-pages — One page per feature set. Each covers:

  • What it does and what components are involved.
  • Editor setup for drop-in use.
  • Relevant ScriptCanvas nodes and EBus events.
  • A quick reference table.
  • Links to the Framework API for deeper reference.

Pages in this section deliberately omit architecture internals, extension code, and low-level component details. If you need those, follow the Framework API links at the bottom of each page.


Sections

3.1 - GS_Core

The foundation gem for the GS_Play framework — game lifecycle, save system, stage management, input, actions, and utility libraries.

GS_Core is the required foundation for every GS_Play-enabled project. All other GS gems depend on it to drive their complex behaviour and make use of its systemic features. It provides the game startup sequence, persistence, level loading, input handling, a triggerable action system, and a shared utility library.

If you have not set up GS_Core yet, start with the Simple Project Setup guide before reading further.

For architecture details, component properties, and extending the system in C++, see the GS_Core API.


Quick Navigation

| I want to… | Feature | API |
| --- | --- | --- |
| Start a new game, continue from a save, load a specific file, or return to the title screen | GS_Managers | API |
| Save and load game data, or track persistent flags and counters across sessions | GS_Save | API |
| Move between levels, or configure per-level spawn points and navigation settings | GS_StageManager | API |
| Read player input, disable input during menus, or swap control schemes at runtime | GS_Options | API |
| Use easing curves, detect physics zones, smooth values, pick randomly, or work with splines | Utilities | API |
| Trigger a reusable behavior on an entity from a script, physics zone, or another action | Systems: GS_Actions | API |
| Animate a transform, color, or value smoothly over time | Systems: GS_Motion | API |

Installation

GS_Core is a required gem. It will be added to your project when you enable any other GS_Play gem.

For a complete guided setup, follow the Simple Project Setup guide or video tutorial.

Follow these steps in particular:

  1. Configure Project
  2. Prepare Managers
  3. Prepare Startup

 

Quick Installation Summary

Once the gem is registered to your project:

  1. Create a Game Manager prefab and place it in every level.
  2. Create prefabs of any managers you wish to use in your project.
  3. Add all the manager prefabs to your Game Manager’s Managers list.
  4. Implement a way to activate “Begin Game”:
    • Create a UI to fire New Game or Load Game.
    • Create a script to fire New Game or Load Game on OnStartupComplete.
    • Toggle “Debug Mode” on to skip the begin-game process.

GS_Managers

Controls the game startup lifecycle — spawning and initializing all manager systems in a guaranteed order, then providing top-level game navigation: New Game, Continue, Load Game, Return to Title, and Quit. The starting point for any game-wide behavior.

GS_Managers API


GS_Save

Handles all save and load operations, including entity state persistence across level loads and simple key-value record tracking for global flags and counters.

GS_Save API


GS_StageManager

Manages level loading and navigation. Place named Exit Points in your levels to control where the player arrives, and use Stage Data components to configure per-level settings like NavMesh references and spawn configuration.

GS_StageManager API


GS_Options

Manages player input through swappable Input Profile assets and Input Reader components, with group-level enable/disable for suppressing input during menus, cutscenes, or transitions.

GS_Options API


Systems

Core framework systems used across multiple gems: the GS_Actions triggerable behavior system and the GS_Motion track-based animation engine.

Systems API


Utilities

A collection of shared tools: easing curves (40+ types), spring dampers for smooth value following, weighted random selection, color and float gradients, spline helpers, and Physics Trigger Volume components.

Utilities API
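As an illustration of the spring-damper idea in the list above, here is the classic critically damped smoothing step in plain C++. This is a generic sketch of the technique, not GS_Core’s actual utility API:

```cpp
// One step of critically damped smoothing toward a target value.
// 'velocity' persists between calls; 'smoothTime' controls responsiveness.
float SpringDamp(float current, float target, float& velocity,
                 float smoothTime, float dt)
{
    float omega = 2.0f / smoothTime;
    float x = omega * dt;
    // Polynomial approximation of exp(-x), stable for game-sized timesteps.
    float decay = 1.0f / (1.0f + x + 0.48f * x * x + 0.235f * x * x * x);
    float change = current - target;
    float temp = (velocity + omega * change) * dt;
    velocity = (velocity - omega * temp) * decay;
    return target + (change + temp) * decay;
}
```

Called once per frame, the value glides toward the target without overshoot, which is exactly the behaviour you want for camera follow and UI value smoothing.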


See Also

For the full API, component properties, and C++ extension guide:

For step-by-step project setup:


Get GS_Core

GS_Core — Explore this gem on the product page and add it to your project.

3.1.1 - Managers System

How to work with the GS_Play manager system — startup events, game navigation, and standby mode from scripts.

The Managers system is how GS_Play starts your game. The Game Manager spawns all other managers in a guaranteed order, coordinates their initialization stages, and then broadcasts events that signal when each stage is complete and when the game is fully ready to run.

This lets you ensure startup happens as expected, create your own managers of any type, and toggle full-game standby.

For architecture details, component properties, and extending the system in C++, see the GS_Managers API.

Game Manager component in the O3DE Inspector

 

Contents


Startup Sequence

Game Manager Startup Pattern Graph

Breakdown

When the project starts, the Game Manager runs three stages before the game is considered ready:

| Stage | Broadcast Event | What It Means |
| --- | --- | --- |
| 1 — Initialize | (internal) | Each manager is spawned. They activate, then report ready. |
| 2 — Setup | OnSetupManagers | Setup stage. Now safe to query other managers. |
| 3 — Complete | OnStartupComplete | Last stage. Everything is ready. Do any last-minute things. Now safe to begin gameplay. |

For most scripts, you only need OnStartupComplete. Wait for this event before doing anything that depends on the managers being completely set up.
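The guarantee that matters to callers, namely that every manager finishes one stage before any manager begins the next, can be sketched in a few lines. The types below are illustrative, not the real Game Manager:

```cpp
#include <functional>
#include <vector>

// Toy manager with the two initialization stages described above.
struct ToyManager {
    bool initialized = false, setUp = false;
    void Initialize() { initialized = true; }       // stage 1: internal activation
    void SetupManagers() { setUp = true; }          // stage 2: safe to query peers
};

struct ToyGameManager {
    std::vector<ToyManager*> managers;
    std::vector<std::function<void()>> onStartupComplete;
    bool started = false;

    void RunStartup() {
        for (auto* m : managers) m->Initialize();     // all of stage 1 first
        for (auto* m : managers) m->SetupManagers();  // then all of stage 2
        started = true;                               // stage 3: ready
        for (auto& cb : onStartupComplete) cb();      // notify listeners last
    }
};
```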

E Indicates extensible classes and methods.

Patterns - Complete list of system patterns used in GS_Play.


Responding to Startup

ScriptCanvas

Connect to GameManagerNotificationBus and handle OnStartupComplete to know when the game is fully ready:

To check at any point whether the game has already finished starting, use the IsStarted request:


Game Navigation

The Game Manager owns the top-level game flow. Call these from title screens, pause menus, and end-game sequences. They coordinate the Save Manager and Stage Manager automatically.

ScriptCanvas Node | What It Does
TriggerNewGame | Starts a new game with the default save name.
TriggerNewGameWithName(saveName) | Starts a new game and writes to a named save file.
TriggerContinueGame | Loads the most recent save and continues from it.
TriggerLoadGame(saveName) | Loads a specific save file by name.
TriggerReturnToTitle | Returns to the title stage, tearing down the current session.
TriggerSaveAndExitGame | Saves the current state and exits the application.
TriggerExitGame | Exits the application without saving.

Standby Mode

Standby is a global pause. The Game Manager enters standby automatically during level transitions and other blocking operations. It broadcasts OnEnterStandby to halt all gameplay systems, and OnExitStandby when the operation completes.

Listen to these in any script that drives continuous logic — timers, ticks, or animation sequences:

Event | What to Do
OnEnterStandby | Pause timers, halt ticks, stop animations.
OnExitStandby | Resume timers, re-enable ticks.

Both events are on GameManagerNotificationBus.
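In C++, the guard pattern looks roughly like the sketch below. The dispatch here is a plain stand-in, not the real EBus API — real code would derive from the GameManagerNotificationBus handler and connect in Activate() — but the event names match the table above:

```cpp
// Stand-in for a GameManagerNotificationBus handler. OnEnterStandby and
// OnExitStandby match the documented events; everything else is illustrative.
struct StandbyAwareTicker {
    bool paused = false;

    void OnEnterStandby() { paused = true;  }   // pause timers, halt ticks
    void OnExitStandby()  { paused = false; }   // resume timers, re-enable ticks

    // Continuous logic checks the standby flag every frame.
    float Tick(float elapsed, float dt) {
        return paused ? elapsed : elapsed + dt;
    }
};
```

The point of the sketch: standby is not a separate code path, just a flag every continuously-ticking script respects.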


Debug Mode

When Debug Mode is enabled on the Game Manager component in the editor, the game starts in the current level instead of navigating to your title stage. This allows rapid iteration on any level without going through the full boot flow.

Debug Mode only changes startup navigation. All manager initialization and event broadcasting proceed normally.


Quick Reference

Need | Bus | Method / Event
Know when startup is complete | GameManagerNotificationBus | OnStartupComplete
Check if game has started | GameManagerRequestBus | IsStarted
Start a new game | GameManagerRequestBus | NewGame / TriggerNewGame (SC)
Continue from last save | GameManagerRequestBus | ContinueGame / TriggerContinueGame (SC)
Load a specific save | GameManagerRequestBus | LoadGame / TriggerLoadGame (SC)
Return to title | GameManagerRequestBus | ReturnToTitle / TriggerReturnToTitle (SC)
Pause all systems | GameManagerRequestBus | EnterStandby
Resume all systems | GameManagerRequestBus | ExitStandby
Know when standby changes | GameManagerNotificationBus | OnEnterStandby / OnExitStandby

Glossary

Term | Meaning
Standby | Global pause broadcast to all managers and their subsystems
Startup Sequence | The three-stage lifecycle (Initialize → SetupManagers → StartupComplete) before gameplay is ready
Manager | A component that extends GS_ManagerComponent and registers with the Game Manager
Debug Mode | Starts the game in the current editor level instead of navigating to the title stage

For full definitions, see the Glossary.


See Also

For the full API, component properties, and C++ extension guide:

For step-by-step project setup:


Get GS_Core

GS_Core — Explore this gem on the product page and add it to your project.

3.1.2 - Save System

How to work with the GS_Play save system — saving game state, loading saves, and tracking progression with the Record Keeper.

The Save system handles all persistence in a GS_Play project. The Save Manager coordinates file operations, Savers serialize per-entity state, and the Record Keeper tracks flat progression data. Together they give you a complete save/load pipeline that works out of the box and extends cleanly for custom data.

For architecture details, component properties, and extending the system in C++, see the Framework API reference.

Save Manager component in the O3DE Inspector

 

Contents


How Saving Works

Save System Pattern Graph

Breakdown

When a save is triggered, the Save Manager broadcasts OnSaveAll to every Saver component in the scene. Each Saver serializes its entity’s relevant state into the save file. When loading, the Save Manager broadcasts OnLoadAll, and each Saver restores its entity from the save data.

The Save Manager also maintains a list of all save files with metadata (timestamps, names), so you can present a save/load UI to the player.

Operation | What Happens
New save | Creates a new save file, broadcasts OnSaveAll to all Savers.
Load save | Reads save file, broadcasts OnLoadAll to all Savers.
Save data | Writes current game state to the active save file.
Load data | Reads data from the active save file into memory.
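The broadcast-to-Savers flow can be sketched in C++ as below. The real Save Manager broadcasts over an EBus and serializes to disk; here a plain handler list and an in-memory key/value map (SaveData, TransformSaver, SaveManagerSketch are all hypothetical names) illustrate the round trip:

```cpp
#include <map>
#include <string>
#include <vector>

// Hypothetical flat save payload; the real format is an asset on disk.
using SaveData = std::map<std::string, float>;

// The Saver contract: serialize on OnSaveAll, restore on OnLoadAll.
struct Saver {
    virtual ~Saver() = default;
    virtual void OnSaveAll(SaveData& out) = 0;
    virtual void OnLoadAll(const SaveData& in) = 0;
};

// Example per-entity Saver persisting one value.
struct TransformSaver : Saver {
    float x = 0.f;
    void OnSaveAll(SaveData& out) override { out["player.x"] = x; }
    void OnLoadAll(const SaveData& in) override {
        if (auto it = in.find("player.x"); it != in.end()) x = it->second;
    }
};

// Stand-in for the Save Manager: broadcasts to every registered Saver.
struct SaveManagerSketch {
    std::vector<Saver*> savers;  // every Saver in the scene
    SaveData file;               // the active save file
    void SaveAll() { for (auto* s : savers) s->OnSaveAll(file); }
    void LoadAll() { for (auto* s : savers) s->OnLoadAll(file); }
};
```

Because each Saver owns its own serialization, the manager never needs to know what state any entity keeps.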

E Indicates extensible classes and methods.

Patterns - Complete list of system patterns used in GS_Play.


Responding to Save Events

ScriptCanvas

Connect to SaveManagerNotificationBus to know when save or load operations occur:


Triggering Saves and Loads

These methods are available on SaveManagerRequestBus:

ScriptCanvas Node | What It Does
NewGameSave | Creates a fresh save file for a new game.
LoadGame(saveName) | Loads a specific save file by name.
SaveData | Writes current state to the active save file.
LoadData | Reads the active save file into memory.
GetOrderedSaveList | Returns all save files sorted by most recent.
ConvertEpochToReadable(epoch) | Converts a save file timestamp to a human-readable string.

Record Keeper

The Record Keeper is a lightweight key-value store for tracking game-wide progression — quest flags, counters, unlock states, completion markers. It lives on the Save Manager prefab and is automatically persisted with the save system.

Unlike Savers (which are per-entity), the Record Keeper is a global singleton. Any script or component can read and write records by name.

ScriptCanvas Node | What It Does
HasRecord(name) | Returns whether a record with the given name exists.
SetRecord(name, value) | Creates or updates a record. Value is a float.
GetRecord(name) | Returns the current value of a record.
DeleteRecord(name) | Removes a record.
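A minimal sketch of that API surface, assuming a flat string-to-float store (the onChanged callback stands in for the RecordChanged notification; the default value returned for a missing record is an assumption, not documented behavior):

```cpp
#include <functional>
#include <string>
#include <unordered_map>

// Hedged sketch of the Record Keeper: HasRecord / SetRecord / GetRecord /
// DeleteRecord mirror the documented nodes; internals are illustrative.
struct RecordKeeperSketch {
    std::unordered_map<std::string, float> records;
    std::function<void(const std::string&)> onChanged; // stands in for RecordChanged

    bool HasRecord(const std::string& name) const { return records.count(name) > 0; }

    float GetRecord(const std::string& name) const {
        auto it = records.find(name);
        return it != records.end() ? it->second : 0.f; // assumed default
    }

    void SetRecord(const std::string& name, float value) {
        records[name] = value;                     // create or update
        if (onChanged) onChanged(name);
    }

    void DeleteRecord(const std::string& name) {
        if (records.erase(name) && onChanged) onChanged(name);
    }
};
```

Progression UI would subscribe to the change notification rather than polling records every frame.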

Responding to Record Changes

Listen on RecordKeeperNotificationBus for the RecordChanged event. This fires whenever any record is created, updated, or deleted — useful for UI that displays progression state.


Built-In Savers

Two Savers ship with GS_Core for the most common use cases:

Saver | What It Saves
BasicEntitySaver | Entity transform (position, rotation, scale).
BasicPhysicsEntitySaver | Entity transform plus rigidbody velocity and angular velocity.

Add these components to any entity that needs to persist its position across save/load cycles. They handle serialization and restoration automatically.


Quick Reference

Need | Bus | Method / Event
Trigger a save | SaveManagerRequestBus | SaveData
Trigger a load | SaveManagerRequestBus | LoadGame(saveName)
Create a new save | SaveManagerRequestBus | NewGameSave
List all saves | SaveManagerRequestBus | GetOrderedSaveList
Know when saving | SaveManagerNotificationBus | OnSaveAll
Know when loading | SaveManagerNotificationBus | OnLoadAll
Check a progress flag | RecordKeeperRequestBus | HasRecord(name) / GetRecord(name)
Set a progress flag | RecordKeeperRequestBus | SetRecord(name, value)
Know when a record changes | RecordKeeperNotificationBus | RecordChanged

Glossary

Term | Meaning
Saver | A component that serializes one entity’s state into the save file
Record Keeper | A global key-value store for tracking progression flags and counters
Save File | A serialized snapshot of all Saver data plus Record Keeper state

For full definitions, see the Glossary.


See Also

For the full API, component properties, and C++ extension guide:

For step-by-step project setup:


Get GS_Core

GS_Core — Explore this gem on the product page and add it to your project.

3.1.3 - Stage Management

How to work with the GS_Play stage management system — level loading, stage transitions, exit points, and stage data.

The Stage Manager handles all level-to-level navigation in a GS_Play project. It owns the master list of stages, processes transition requests, and coordinates with the Game Manager’s standby mode to ensure clean load/unload cycles. Stage Data components in each level control how that level initializes.

For architecture details, component properties, and extension patterns, see the Framework API reference.

Stage Manager component in the O3DE Inspector

 

Contents


How Stage Transitions Work

Stage Change Pattern Graph

Breakdown

When you request a stage change, the system follows this sequence:

Step | What Happens
1 — Standby | The Game Manager enters standby, pausing all gameplay systems.
2 — Unload | The current stage’s entities are torn down.
3 — Spawn | The target stage’s prefab is instantiated.
4 — Set Up | The Stage Data component in the new level runs its layered startup sequence.
5 — Complete | The Stage Manager broadcasts LoadStageComplete. Standby exits.

The Stage Data startup is layered — SetUpStage, ActivateByPriority, then Complete — so heavy levels can initialize incrementally without causing frame-time spikes.

E Indicates extensible classes and methods.

Patterns - Complete list of system patterns used in GS_Play.


Stage Data

Stage Data component in the O3DE Inspector

Each level should have a Stage Data component on its root entity. Through a child “Level” entity that starts inactive, the Stage Data system controls the initialization sequence for the level. Stage Data also holds level-specific configuration and scripts.

Event | What It Means
OnBeginSetUpStage | The level is starting its setup. Initialize per-level systems.
ActivateByPriority | Activate heavy entities in priority order (lazy loading).
OnLoadStageComplete | The level is fully loaded and ready.
OnTearDownStage | The level is being unloaded. Clean up per-level state.

Listen to these on StageDataNotificationBus in any script that needs to react to level lifecycle.


Triggering Stage Changes

ScriptCanvas

The exitPointName parameter is optional. If provided, the system will position the player at the named exit point in the target level.


Exit Points

Stage Exit Point component in the O3DE Inspector

Exit Points are named position markers placed in a level. They define where entities spawn when arriving from another stage. A door in Level A can specify that when transitioning to Level B, the player should appear at Exit Point “DoorB_Entry”.

Exit Points are registered and unregistered with the Stage Manager automatically when they activate and deactivate.

ScriptCanvas Node | What It Does
ChangeStageRequest(stageName, exitPoint) | Transitions to a stage and positions at the named exit point.
RegisterExitPoint(name, entity) | Manually registers an exit point (usually automatic).
UnregisterExitPoint(name) | Manually unregisters an exit point.
GetExitPoint(name) | Returns the entity ID of a registered exit point.
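Conceptually, the Stage Manager keeps a name-to-entity registry behind those nodes. A minimal sketch (EntityId here is a numeric stand-in for AZ::EntityId, and the "0 means invalid" convention is an assumption):

```cpp
#include <string>
#include <unordered_map>

using EntityId = unsigned long long; // stand-in for AZ::EntityId

// Sketch of the exit-point registry. In practice, exit point components
// register/unregister themselves automatically on activate/deactivate.
struct ExitPointRegistry {
    std::unordered_map<std::string, EntityId> points;

    void RegisterExitPoint(const std::string& name, EntityId id) { points[name] = id; }
    void UnregisterExitPoint(const std::string& name) { points.erase(name); }

    EntityId GetExitPoint(const std::string& name) const {
        auto it = points.find(name);
        return it != points.end() ? it->second : 0; // 0 = not found (assumed)
    }
};
```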


Responding to Stage Events

ScriptCanvas


Entity Level Configuration

Stage Data entity level setup in the O3DE Editor

The Stage Data entity must live outside of the level’s Game Manager prefab. It is left active.

Inside, it has a secondary level “wrapper” entity that you set to Start Inactive by default. This lets the Stage Data control exactly when the level begins loading.


Quick Reference

Need | Bus | Method / Event
Change to a different level | StageManagerRequestBus | ChangeStageRequest(stageName, exitPoint)
Load the default stage | StageManagerRequestBus | LoadDefaultStage
Know when a load starts | StageManagerNotificationBus | BeginLoadStage
Track loading progress | StageManagerNotificationBus | StageLoadProgress
Know when a load finishes | StageManagerNotificationBus | LoadStageComplete
React to level setup | StageDataNotificationBus | OnBeginSetUpStage
React to level teardown | StageDataNotificationBus | OnTearDownStage
Find an exit point | StageManagerRequestBus | GetExitPoint(name)

Glossary

Term | Meaning
Stage | A spawnable prefab representing a game level or screen
Exit Point | A named position marker in a level that defines where entities arrive from another stage
Stage Data | A per-level component that controls level initialization and holds level-specific settings
Default Stage | The first stage loaded on application start (typically the title screen)

For full definitions, see the Glossary.


See Also

For the full API, component properties, and C++ extension guide:

For related systems:


Get GS_Core

GS_Core — Explore this gem on the product page and add it to your project.

3.1.4 - Options & Input

How to work with the GS_Play options system — input profiles, input groups, and runtime binding management.

The Options system manages player-facing configuration and input handling. Its primary feature is the Input Profile system, which provides group-based input binding management that can be toggled at runtime without code changes.

For architecture details, component properties, and extension patterns, see the Framework API reference.

Options Manager component in the O3DE Inspector

 

Contents


Options Manager

The Options Manager is a singleton that holds the active Input Profile and makes it available to all Input Reader components. It responds to the Game Manager lifecycle automatically.

ScriptCanvas Node | What It Does
GetActiveInputProfile | Returns the currently active Input Profile asset.

Input Profiles

Input Profile asset in the O3DE Asset Editor

An Input Profile is a data asset created in the O3DE Asset Editor. It contains named input groups, each holding a set of event mappings. Each event mapping binds a gameplay event name to one or more raw input bindings (key presses, axis movements, button presses) with configurable deadzones.

The key advantage over raw O3DE input bindings is the group system. Groups can be enabled and disabled independently at runtime — a pause menu can suppress gameplay input by disabling the “Gameplay” group, without tearing down and rebuilding bindings.


Enabling and Disabling Input Groups

ScriptCanvas

Enabling, Disabling, and Checking State of Input Groups in Script Canvas


How Input Flows

Stage | What Happens
1 — Raw Input | O3DE’s input system captures key/axis/button events.
2 — Input Reader | The GS_InputReaderComponent on the entity matches raw input against the active Input Profile’s event mappings.
3 — Event Mapping | Matched input fires a named gameplay event (e.g., “Jump”, “MoveForward”).
4 — Consumer | Other components on the entity (controllers, reactors) handle the gameplay event.

Input Readers filter by group — if a group is disabled, none of its event mappings fire.
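The group filter can be sketched as follows. Mapping fields, the disabled-group set, and the Process function are all illustrative stand-ins; the real Input Reader matches against the Input Profile asset rather than a plain vector:

```cpp
#include <set>
#include <string>
#include <vector>

// Illustrative event mapping: group name, gameplay event, raw input id.
struct EventMapping {
    std::string group;
    std::string gameplayEvent;
    std::string rawInput;
};

// Sketch of group-filtered dispatch mirroring DisableInputGroup /
// EnableInputGroup / IsGroupDisabled from the quick reference below.
struct InputReaderSketch {
    std::vector<EventMapping> mappings;
    std::set<std::string> disabledGroups;

    void DisableInputGroup(const std::string& g) { disabledGroups.insert(g); }
    void EnableInputGroup(const std::string& g)  { disabledGroups.erase(g); }
    bool IsGroupDisabled(const std::string& g) const { return disabledGroups.count(g) > 0; }

    // Returns the gameplay events fired for one raw input; mappings in a
    // disabled group never fire.
    std::vector<std::string> Process(const std::string& rawInput) const {
        std::vector<std::string> fired;
        for (const auto& m : mappings)
            if (m.rawInput == rawInput && !IsGroupDisabled(m.group))
                fired.push_back(m.gameplayEvent);
        return fired;
    }
};
```

Because bindings stay resident, disabling a group is a cheap set insertion, not a teardown/rebuild of the input stack.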


Quick Reference

Need | Bus | Method / Event
Disable an input group | InputReaderRequestBus | DisableInputGroup(groupName)
Enable an input group | InputReaderRequestBus | EnableInputGroup(groupName)
Check if group is disabled | InputReaderRequestBus | IsGroupDisabled(groupName)
Get the active profile | OptionsManagerRequestBus | GetActiveInputProfile

Glossary

Term | Meaning
Input Profile | A data asset containing named input groups with event mappings
Input Group | A named collection of event mappings that can be enabled or disabled at runtime
Event Mapping | A binding from a gameplay event name to one or more raw input sources
Input Reader | A component that matches raw input against the active Input Profile

For full definitions, see the Glossary.


See Also

For the full API, component properties, and C++ extension guide:


Get GS_Core

GS_Core — Explore this gem on the product page and add it to your project.

3.1.5 - Systems

Core framework systems — the GS_Actions triggerable behavior system and the GS_Motion track-based animation engine.

GS_Core provides a growing number of standalone systems that are used across multiple gems: the GS_Actions triggerable behavior system and the GS_Motion track-based animation engine, which powers UI Animation and Juice Feedback playback.

For architecture details, component properties, and C++ extension guides, see the Framework API: Systems.

 

Contents


GS_Motion

Provides tween-style Motion Track components for animating transforms, colors, and values over time. Multiple tracks on the same entity run in parallel; chains are configured by setting an On Complete motion name.

GS_Motion API


See Also

For the full API, component properties, and C++ extension guides:

For related systems:


Get GS_Core

GS_Core — Explore this gem on the product page and add it to your project.

3.1.5.1 - Actions System

How to work with the GS_Play action system — triggerable, composable behaviors that fire from scripts, triggers, or code.

The Actions system provides a universal pattern for attaching discrete, reusable behaviors to entities and triggering them from any source — ScriptCanvas, World Triggers, UI buttons, or C++ code. Actions are data-driven components that fire on named channels, enabling composition without custom scripting.

For architecture details, component properties, and creating custom actions in C++, see the Framework API reference.

 

Contents


How Actions Work

An Action is a component you attach to an entity. Each Action has a channel name. When something calls DoAction(channelName) on that entity’s bus, every Action component whose channel matches the name will execute.

This decoupling is the core value — the system that fires DoAction does not need to know what kind of action is attached. You can change, add, or remove action components on an entity without modifying any calling code.

Concept | What It Means
Channel | A named string. Actions on the same channel fire together.
Composition | Multiple actions on the same channel execute in parallel — stack components to compose behaviors.
Chaining | An action can fire a different channel on completion, enabling lightweight sequences.
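The channel-matching and chaining ideas can be sketched together. ActionSketch and ActionHost are hypothetical names standing in for the component/EBus machinery; only the DoAction name and the channel/chain semantics come from the docs:

```cpp
#include <functional>
#include <string>
#include <vector>

// One "action component": a channel, an optional chain channel fired on
// completion, and the discrete behavior itself.
struct ActionSketch {
    std::string channel;
    std::string chainChannel;        // empty = no chain
    std::function<void()> behavior;
};

// Stand-in for an entity's action components plus DoAction dispatch.
struct ActionHost {
    std::vector<ActionSketch> actions;

    void DoAction(const std::string& channelName) {
        for (const auto& a : actions) {
            if (a.channel != channelName) continue;
            if (a.behavior) a.behavior();
            if (!a.chainChannel.empty()) DoAction(a.chainChannel); // chain on completion
        }
    }
};
```

Note the caller only knows a channel name; swapping which actions respond requires no change to the caller.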

Triggering Actions

ScriptCanvas

[ActionRequestBus → DoAction(channelName)]
    └─► All Action components on this entity with matching channel execute

To know when an action completes:

[ActionNotificationBus → OnActionComplete]
    └─► Action has finished executing

Built-In Actions

GS_Core ships with these ready-to-use actions:

Action | What It Does
PrintLog | Logs a configurable message to the console. Useful for debugging trigger chains.
ToggleMouseCursor | Shows or hides the system mouse cursor.

Additional actions are available in other gems (e.g., World Trigger actions in GS_Interaction, dialogue effects in GS_Cinematics).


Common Patterns

World Trigger → Action

A World Trigger detects a collision or interaction event and fires DoAction on its entity. Action components on the same entity respond — one might play a sound, another might set a record, another might toggle an entity.

UI Button → Action

A UI button press fires DoAction with a channel name. Actions handle the response — navigate to a different UI page, start a new game, or toggle the pause menu.

Chaining Actions

Set an action’s “Chain Channel” property to fire a different channel when it completes:

Channel "OpenDoor" → [ToggleEntity action] → chains to "PlayDoorSound" → [AudioEvent action]

Quick Reference

Need | Bus | Method / Event
Fire an action | ActionRequestBus | DoAction(channelName)
Know when an action completes | ActionNotificationBus | OnActionComplete

Glossary

Term | Meaning
Action | A component that executes a discrete behavior when triggered on a named channel
Channel | A named string identifier that groups actions — all actions on the same channel fire together
Chaining | Configuring an action to fire a different channel on completion, creating lightweight sequences

For full definitions, see the Glossary.


See Also

For the full API, component properties, and C++ extension guide:

For related systems:


Get GS_Core

GS_Core — Explore this gem on the product page and add it to your project.

3.1.5.2 - Motion System

How to work with GS_Motion — the track-based animation and tween system that powers UI transitions, feedback effects, and custom animations.

GS_Motion is a track-based animation engine built into GS_Core. It drives timed property changes — position, rotation, scale, color, opacity — through authored data assets rather than hand-coded interpolation scripts. Domain gems extend GS_Motion with their own track types: GS_UI adds 8 LyShine-specific tracks for UI animation, and GS_Juice adds transform and material tracks for game feel feedback.

For architecture details, the domain extension pattern, and all track types, see the Framework API reference.

 

Contents


Key Concepts

Concept | What It Is
Track | A single animated property — what changes, how long, and which easing curve.
Motion | A collection of tracks that play together. Tracks can start at different times within the motion.
Motion Asset | A data asset authored in the O3DE Asset Editor containing the tracks and their configuration.
Proxy | An optional entity redirect — lets a track target a child entity instead of the motion’s owner.
Composite | The runtime instance created from an asset. Each entity gets its own deep copy.

How It Works

  1. Author a motion asset in the Asset Editor. Each domain has its own asset type (.uiam for UI, .feedbackmotion for Juice).
  2. Assign the asset to a component or embed it in a serialized field (e.g., a page’s show/hide transitions).
  3. Play the motion from ScriptCanvas or C++. The system initializes a runtime composite, resolves proxies, and ticks all tracks.
  4. Each track receives an eased progress value (0 → 1) every frame and applies its property change to the target entity.
  5. When all tracks complete, the motion fires its OnComplete callback.
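Steps 4 and 5 can be sketched as a per-frame tick. Track fields, the apply callback, and TickMotion are illustrative stand-ins, assuming a linear default ease; the real track types own their targets and property writes:

```cpp
#include <algorithm>
#include <functional>
#include <vector>

// One track: maps motion time to an eased 0→1 progress and applies it.
struct TrackSketch {
    float startTime = 0.f;
    float duration = 1.f;
    std::function<float(float)> ease = [](float t) { return t; }; // linear default
    std::function<void(float)> apply;  // writes the property on the target

    // Returns true once this track has finished.
    bool Tick(float motionTime) {
        float t = std::clamp((motionTime - startTime) / duration, 0.f, 1.f);
        if (apply) apply(ease(t));
        return t >= 1.f;
    }
};

// A motion is complete (OnComplete would fire) once every track is done.
bool TickMotion(std::vector<TrackSketch>& tracks, float motionTime) {
    bool allDone = true;
    for (auto& tr : tracks) allDone = tr.Tick(motionTime) && allDone;
    return allDone;
}
```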

Easing Curves

Every track can use any of the 40+ easing curves from the GS_Core curves library. Curves are configured per-track in the asset editor.

Available families: Linear, Quad, Cubic, Sine, Expo, Circ, Back, Elastic, Bounce — each with In, Out, and InOut variants.


Proxy Targeting

UiAnimationMotion with proxy bindings configured in the Inspector

When a motion asset has tracks with identifiers (named labels), those tracks appear in the proxy list on the component. Proxies let you redirect a track to a different entity in the hierarchy — for example, a page show animation might animate the background separately from the content panel.

Each proxy entry maps a track label to a target entity. If no proxy is set, the track targets the motion’s owner entity.


Domain Extensions

GS_Motion is not used directly — it provides the base system that domain gems extend with concrete track types.

UI Animation (GS_UI)

Eight tracks for LyShine UI elements (position, scale, rotation, alpha, color, text). Asset extension: .uiam. Used for page transitions, button hover/select effects, and standalone UI animation.

UI Animation API


Feedback Motions (GS_Juice)

Two tracks for game feel effects — transform (position, scale, rotation) and material (opacity, emissive, color tint). Asset extension: .feedbackmotion. Used for screen shake, hit flash, and visual feedback.

Feedback Motions API


Quick Reference

Need | Where
Animate UI elements | Use .uiam assets with UiAnimationMotionComponent or page transitions
Create feedback effects | Use .feedbackmotion assets with FeedbackEmitter component
Change easing curve | Edit the curve type on individual tracks in the asset editor
Redirect a track to another entity | Configure proxy entries on the component
Loop an animation | Enable loop on the motion asset

Glossary

Term | Meaning
Track | A single animated property within a motion — defines what changes, duration, and easing
Motion | A collection of tracks that play together as a single animation
Motion Asset | A data asset authored in the Asset Editor containing track configurations
Proxy | An entity redirect that lets a track target a child entity instead of the motion’s owner
Composite | The runtime instance created from a motion asset — each entity gets its own deep copy
Domain Extension | A gem-specific set of track types that extends GS_Motion for a particular use case

For full definitions, see the Glossary.


See Also

For the full API, component properties, and C++ extension guide:

For related systems:


Get GS_Core

GS_Core — Explore this gem on the product page and add it to your project.

3.1.6 - Utilities

General-purpose components and math helpers — physics triggers, easing curves, spring dampers, gradients, splines, and more.

GS_Core includes a library of general-purpose components and math helpers available to every system in the framework. These utilities handle common game development tasks — physics overlap detection, value interpolation, animation curves, gradient sampling, and spatial math — so you can focus on gameplay logic rather than reimplementing fundamental patterns.

For full API details and code examples, see the Framework API reference.

 

Contents


Physics Trigger Volume

A reusable physics overlap detector. Handles trigger enter, stay, and exit events with filtering and callback support. Used internally by Pulsors, Targeting Fields, and World Triggers — and available for your own components.

Physics Trigger Volume API


Easing Curves

40+ easing functions organized into families. Every GS_Motion track, spring, and interpolation system in the framework can reference these curves by enum.

Family | Variants
Linear | Linear
Quad | In, Out, InOut
Cubic | In, Out, InOut
Sine | In, Out, InOut
Expo | In, Out, InOut
Circ | In, Out, InOut
Back | In, Out, InOut
Elastic | In, Out, InOut
Bounce | In, Out, InOut

Select a curve type via the CurveType enum in component properties, asset fields, or C++ code.
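For reference, the Quad family uses the standard easing formulas below. These are the conventional shapes (Penner-style equations), not necessarily the framework's exact source, so treat them as a sketch of what each variant does to a 0→1 progress value:

```cpp
// Conventional easing shapes: each maps linear progress t in [0,1] to a
// shaped output in [0,1].
float EaseLinear(float t)  { return t; }
float EaseQuadIn(float t)  { return t * t; }                      // slow start
float EaseQuadOut(float t) { return 1.f - (1.f - t) * (1.f - t); } // slow end
float EaseQuadInOut(float t) {                                     // slow both ends
    return t < 0.5f ? 2.f * t * t
                    : 1.f - 2.f * (1.f - t) * (1.f - t);
}
```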

Curves API


Spring Dampers

15+ spring functions for physically-grounded value interpolation. Springs produce natural-feeling motion that reacts to velocity and acceleration, making them ideal for camera smoothing, UI follow, and any value that should “settle” rather than snap.

Function | Use Case
Simple Spring | Basic spring with damping
Acceleration Spring | Spring with acceleration bias
Double Spring | Two-stage spring for overshoot effects
Timed Spring | Spring that reaches target in a fixed time
Velocity Spring | Spring driven by velocity
Quaternion Spring | Spring for rotation values

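A "Simple Spring" step in the general shape these functions take (parameter names and the semi-implicit Euler integration are assumptions; the framework's signatures may differ):

```cpp
// One integration step of a damped spring pulling `value` toward `target`.
// Semi-implicit Euler: update velocity first, then position, for stability.
void SpringStep(float& value, float& velocity, float target,
                float stiffness, float damping, float dt)
{
    float accel = stiffness * (target - value) - damping * velocity;
    velocity += accel * dt;
    value    += velocity * dt;
}
```

Unlike an easing curve, the spring carries velocity between frames, so a moving target produces natural lag and settle rather than a snap.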
Springs API


Gradients

Multi-stop gradient types for sampling values over a range. Used extensively by GS_Motion tracks and GS_Juice feedback tracks to define animation curves.

Type | What It Samples
ColorGradient | RGBA color with positioned markers
FloatGradient | Single float value
Vector2Gradient | 2D vector

Gradients are editable in the Inspector with visual marker placement.

Gradients API


Entity Helpers

Utility functions for finding entities in the scene by name.

Function | What It Does
GetEntityByName(name) | Returns the entity with the given name.
GetEntityIdByName(name) | Returns the EntityId of the named entity.

Entity Helper API


Weighted Random

Template-based weighted random selection. Given a collection of items with weights, returns a randomly selected item biased by weight. Useful for loot tables, dialogue variation, and procedural placement.
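The core of weighted selection is mapping a roll in [0, totalWeight) onto cumulative weights. A minimal index-based sketch (the real version is template-based over item types and draws its own roll):

```cpp
#include <cstddef>
#include <vector>

// Given parallel weights, map a roll in [0, sum of weights) to an index.
// Larger weights cover a larger slice of the range.
std::size_t PickWeighted(const std::vector<float>& weights, float roll)
{
    float cumulative = 0.f;
    for (std::size_t i = 0; i < weights.size(); ++i) {
        cumulative += weights[i];
        if (roll < cumulative) return i;
    }
    return weights.empty() ? 0 : weights.size() - 1; // clamp to last item
}
```

For a loot table with weights {1, 3, 6}, items land roughly 10% / 30% / 60% of the time when the roll is uniform over [0, 10).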

GS_Random API


Angle Helpers

Functions for angle and orientation math, plus 22 preset section configurations for direction classification.

Function | What It Does
YawFromDir(direction) | Extracts yaw angle from a direction vector.
QuatFromYaw(yaw) | Creates a quaternion from a yaw angle.
PickByAngle(angle, sections) | Maps an angle to a section index using a preset configuration.

Section presets range from 2-section (left/right) to 16-section (compass-style), plus diagonal and cardinal configurations. Useful for animation direction selection and 2D-style facing.
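The direction-to-section idea can be sketched as below. The conventions here — yaw measured from +Y, sections of equal size with section 0 centered on yaw 0 — are assumptions; the framework's presets may orient and subdivide differently:

```cpp
#include <cmath>

// Yaw in radians from a 2D direction; 0 = facing +Y (an assumed convention).
float YawFromDir(float x, float y) { return std::atan2(x, y); }

// Classify an angle into one of `sections` equal slices, with section 0
// centered on yaw 0 (boundaries fall half a slice to either side).
int PickByAngle(float angle, int sections)
{
    const float twoPi = 6.28318530718f;
    float slice = twoPi / sections;
    float shifted = std::fmod(angle + slice * 0.5f + twoPi, twoPi);
    return static_cast<int>(shifted / slice) % sections;
}
```

With 4 sections this yields front/right/back/left facing from a movement direction, which is the typical use for 2D-style animation selection.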

Angle Helper API


Spline Helpers

Utility functions for working with O3DE spline components.

Function | What It Does
FindClosestWorldPoint(spline, point) | Returns the closest point on the spline in world space.
FindClosestLocalPoint(spline, point) | Returns the closest point in local space.
FindClosestFraction(spline, point) | Returns the 0–1 fraction along the spline.

Spline Helper API


Serialization Helpers

Utility functions for common O3DE serialization patterns. Simplifies working with SerializeContext and EditContext in component reflection.

Serialization Helper API


Common Enums

Shared enumeration types used across the framework.

Common Enums API


Glossary

Term | Meaning
Easing Curve | A function that maps linear progress (0→1) to a shaped output for smooth animation
Spring Damper | A physically-modeled interpolation function that settles naturally toward a target
Gradient | A multi-stop sampler that returns interpolated values (color, float, vector) over a 0→1 range
Physics Trigger Volume | A reusable overlap detector that fires enter, stay, and exit callbacks

For full definitions, see the Glossary.


See Also

For the full API, component properties, and code examples:

For related systems:


Get GS_Core

GS_Core — Explore this gem on the product page and add it to your project.

3.2 - GS_AI

Foundation scaffold for AI-driven entity behavior in the GS_Play framework.

GS_AI provides the structural foundation for AI-driven entity behavior. It defines the base architecture that AI controller implementations build upon, integrating with the Unit system’s controller pattern to drive NPC entities through behavior logic rather than player input.

For architecture details and extension patterns, see the GS_AI API.


How It Works

GS_AI establishes the conventions and base classes for AI in the GS_Play framework. AI controllers extend the Unit Controller pattern — the same mechanism that player controllers use to drive units. Switching an entity between player control and AI control is a standard possession swap with no special handling required.

AI implementations listen to the same game lifecycle events as every other GS_Play system. AI controllers respond to standby mode, stage transitions, and manager lifecycle broadcasts automatically through the manager pattern.


Integration Points

System | How AI Connects
Unit Controllers | AI controllers extend the unit controller pattern to drive NPC movement and actions.
GS_Managers | AI systems respond to startup, standby, and shutdown lifecycle events.
Interaction | AI entities can be targeting-system targets, fire World Triggers, and emit/receive pulsors.
Cinematics | The Cinematic Controller (in GS_Complete) demonstrates AI-to-cinematic handoff.

See Also

For the full API and extension patterns:

For related systems:


Get GS_AI

GS_AI — Explore this gem on the product page and add it to your project.

3.3 - GS_Audio

Audio management, event-based sound, music scoring, mixing, and Klatt voice synthesis for the GS_Play framework.

GS_Audio is the audio management gem for GS_Play. It provides event-based sound playback, a multi-layer music scoring system, mixing buses with effects chains, and a built-in Klatt formant voice synthesizer with 3D spatial audio. All audio features integrate with the GS_Play manager lifecycle and respond to standby mode automatically.

For architecture details, component properties, and extension patterns, see the GS_Audio API.


Quick Navigation

I want to… | Feature | API
Manage the audio engine, load event libraries, or control master volume | Audio Manager | API
Play sounds with pooling, 3D spatialization, and concurrency control | Audio Events | API
Configure mixing buses with filters, EQ, and environmental influence effects | Mixing & Effects | API
Layer music tracks dynamically based on gameplay state | Score Arrangement | API
Generate text-to-speech with configurable voice parameters and 3D spatial audio | Klatt Voice | API

Installation

GS_Audio requires GS_Core and the MiniAudio gem. Add both to your project’s gem dependencies.

For a complete guided setup, follow the Simple Project Setup guide.

 

Quick Installation Summary

  1. Enable the GS_Audio gem in your project configuration.
  2. Create an Audio Manager prefab and add it to the Game Manager’s Managers list.
  3. Create Audio Event Library assets for your sound effects.
  4. Place GS_AudioEventComponent on entities that need to play sounds.

Audio Manager

The Audio Manager is the singleton controller for the entire audio system. It initializes the audio engine, manages mixing buses, loads audio event libraries, and coordinates score playback. Like all GS_Play managers, it extends the Manager base class and plugs into the Game Manager’s startup sequence automatically.

Audio Manager API


Audio Events

Audio Events are the primary way to play sounds. Each event defines a pool of audio clips with selection rules (random or sequential), 2D or 3D playback mode, concurrent instance limiting, and repeat-hold behavior. Events are grouped into Audio Event Library assets that the Audio Manager loads at startup.

Audio Events API


Mixing & Effects

The mixing system provides named audio buses with configurable effects chains. Each bus can have filters applied — low pass, high pass, band pass, notch, peaking EQ, shelving, delay, and reverb. Audio Bus Influence Effects allow environmental zones to dynamically modify bus effects based on the listener’s position.

Mixing & Effects API


Score Arrangement

Score Arrangement Tracks are multi-layer music assets. Each arrangement defines a time signature, BPM, fade behavior, and a set of Score Layers — individual audio tracks that can be enabled or disabled independently. This allows dynamic music that adds or removes instrument layers based on gameplay state.

Score Arrangement API


Klatt Voice Synthesis

GS_Play includes a built-in text-to-speech system based on Klatt formant synthesis. The Klatt Voice component converts text to speech in real time with configurable voice parameters — frequency, speed, waveform, formants, and pitch variance. The system supports 3D spatial audio and inline KTT tags for expressive delivery.

Klatt Voice API


See Also

For the full API, component properties, and C++ extension guide:

For step-by-step project setup:


Get GS_Audio

GS_Audio — Explore this gem on the product page and add it to your project.

3.3.1 - Audio Manager

How to work with the GS_Play Audio Manager — engine initialization, bus routing, and master control.

The Audio Manager is the singleton controller for the entire audio system. It initializes the MiniAudio engine, manages mixing buses, loads audio event libraries, and coordinates score track playback. Like all GS_Play managers, it extends the Manager base class and responds to the Game Manager lifecycle automatically.

For component properties and API details, see the Framework API reference.

Audio Manager component in the O3DE Inspector

 

Contents


What It Manages

Responsibility | What It Does
Engine lifecycle | Initializes and shuts down the MiniAudio audio engine.
Mixing buses | Creates and routes named audio buses with effects chains.
Event libraries | Loads Audio Event Library assets at startup.
Score playback | Manages Score Arrangement Track assets for dynamic music.
Master volume | Controls global volume levels per bus.

Quick Reference

Need | Bus | Method
Audio management requests | AudioManagerRequestBus | PlayAudioEvent, StopAudioEvent, SetMixerVolume

Glossary

Term | Meaning
Audio Event Library | A data asset containing named audio event definitions loaded at startup
Mixing Bus | A named audio channel with an effects chain for routing and processing sounds
Score Arrangement | A multi-layer music asset managed by the Audio Manager for dynamic playback

For full definitions, see the Glossary.


See Also

For the full API, component properties, and C++ extension guide:

For related systems:


Get GS_Audio

GS_Audio — Explore this gem on the product page and add it to your project.

3.3.2 - Audio Events

How to work with GS_Play audio events — data-driven sound playback with pooling, spatialization, and concurrency control.

Audio Events are the primary way to play sounds in GS_Play. Each event defines a pool of audio clips with selection rules, playback mode (2D or 3D), concurrent instance limiting, and repeat-hold behavior. Events are grouped into Audio Event Library assets that the Audio Manager loads at startup.

For asset structure and component properties, see the Framework API reference.

Audio Event Library asset in the O3DE Asset Editor

 

Contents


How It Works

  1. Create an Audio Event Library asset in the O3DE Asset Editor.
  2. Define audio events — each with a name, clip pool, and playback settings.
  3. Load the library by assigning it to the Audio Manager.
  4. Play events by name from ScriptCanvas or C++.

Event Configuration

Setting | What It Controls
Pool Selection | Random — picks a random clip from the pool. Increment — plays clips sequentially.
2D / 3D | 2D plays without spatialization. 3D positions the sound in the world.
Concurrent Limit | Maximum instances of this event that can play simultaneously.
Repeat Hold | Minimum time between repeated plays of the same event.
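The Concurrent Limit and Repeat Hold settings above gate whether a play request actually starts an instance. The following is a minimal behavioral sketch of that gating, not the gem's implementation — the class and method names here are invented for illustration:

```python
class AudioEventGate:
    """Illustrative gate combining a concurrent-instance limit with a
    repeat-hold window, as described for GS_Play audio events.
    (Not the gem's API; names are invented for this sketch.)"""

    def __init__(self, concurrent_limit, repeat_hold):
        self.concurrent_limit = concurrent_limit
        self.repeat_hold = repeat_hold   # minimum seconds between plays
        self.active = 0                  # instances currently playing
        self.last_play_time = None

    def try_play(self, now):
        # Reject if too many instances of this event are already playing.
        if self.active >= self.concurrent_limit:
            return False
        # Reject if the event was played again too soon.
        if self.last_play_time is not None and now - self.last_play_time < self.repeat_hold:
            return False
        self.active += 1
        self.last_play_time = now
        return True

    def on_finished(self):
        # Called when an instance stops, freeing a concurrency slot.
        self.active = max(0, self.active - 1)
```

A denied request simply produces no sound; the caller does not need to handle an error.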

GS_AudioEventComponent

Place a GS_AudioEventComponent on any entity that needs to play sounds. The component references events by name from loaded libraries and handles 3D positioning automatically based on the entity’s transform.


Quick Reference

Need | Bus | Method
Play a sound event | AudioManagerRequestBus | PlayAudioEvent(eventName)
Stop a sound event | AudioManagerRequestBus | StopAudioEvent(eventName)

Glossary

Term | Meaning
Audio Event | A named, data-driven sound definition with clip pool, playback mode, and concurrency settings
Audio Event Library | A data asset grouping multiple audio events for batch loading
Pool Selection | The strategy for choosing clips — Random or Increment
Concurrent Limit | Maximum simultaneous instances of the same event

For full definitions, see the Glossary.


See Also

For the full API, component properties, and C++ extension guide:

For related systems:


Get GS_Audio

GS_Audio — Explore this gem on the product page and add it to your project.

3.3.3 - Mixing & Effects

How to work with GS_Play audio mixing — named buses, effects chains, and environmental audio influence.

The mixing system provides named audio buses with configurable effects chains. Each bus can have multiple audio filters applied for real-time audio processing. Audio Bus Influence Effects allow environmental zones to dynamically modify bus parameters based on the listener’s position.

For component properties and filter types, see the Framework API reference.

 

Contents


Available Filters

Filter | What It Does
Low Pass | Removes high frequencies. Simulates muffling (underwater, behind walls).
High Pass | Removes low frequencies. Simulates thin/tinny audio (radio, phone).
Band Pass | Passes a frequency band. Isolates specific ranges.
Notch | Removes a narrow frequency band.
Peaking EQ | Boosts or cuts a frequency band.
Low Shelf | Boosts or cuts frequencies below a threshold.
High Shelf | Boosts or cuts frequencies above a threshold.
Delay | Adds echo/delay effect.
Reverb | Adds room/space reverb.
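To make the low-pass row above concrete: removing high frequencies means smoothing fast changes in the signal. The gem's filters are real DSP implementations; this one-pole smoother is only a sketch of the idea, under the assumption that each output sample moves a fraction `alpha` toward the input:

```python
def low_pass(samples, alpha):
    """One-pole low-pass sketch: each output moves a fraction alpha toward
    the input, smoothing out fast changes -- the muffling effect described
    for the Low Pass filter. Illustrative only; the gem's filter design
    (and its parameters) are not shown here."""
    out, y = [], 0.0
    for x in samples:
        y = y + alpha * (x - y)   # smaller alpha = stronger muffling
        out.append(y)
    return out
```

A step input rises gradually instead of instantly, which is exactly the loss of high-frequency content the table describes.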

Environmental Audio Influence

Audio Bus Influence Effects allow spatial zones to modify bus effects dynamically. When the listener enters an influence zone (like a cave or tunnel), the zone’s effects are applied to the specified bus with priority-based stacking. Multiple overlapping zones resolve by priority.
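The priority-based resolution described above can be sketched as follows. This is a behavioral illustration only — the field names and zone representation are invented, not the GS_Audio data model:

```python
def resolve_bus_effects(base_effects, zones):
    """Pick the effect overrides for a bus from overlapping influence zones:
    of the zones the listener is currently inside, the highest-priority one
    wins. Falls back to the bus's base effects when no zone applies.
    (Illustrative sketch; not the gem's actual resolution code.)"""
    winner = None
    for zone in zones:
        if not zone["listener_inside"]:
            continue
        if winner is None or zone["priority"] > winner["priority"]:
            winner = zone
    return winner["effects"] if winner else base_effects
```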


Quick Reference

Need | Bus | Method
Control mixer settings | GS_MixingRequestBus | Mixer control methods
Set master volume | AudioManagerRequestBus | SetMixerVolume

Glossary

Term | Meaning
Mixing Bus | A named audio channel that routes sound through an effects chain
Audio Filter | A real-time audio processing effect applied to a mixing bus
Audio Bus Influence Effect | A spatial zone that modifies bus effects based on listener position

For full definitions, see the Glossary.


See Also

For the full API, component properties, and C++ extension guide:

For related systems:


Get GS_Audio

GS_Audio — Explore this gem on the product page and add it to your project.

3.3.4 - Score Arrangement

How to work with GS_Play score arrangements — multi-layer dynamic music with configurable time signatures and layer control.

Score Arrangement Tracks are multi-layer music assets for dynamic, adaptive game music. Each arrangement defines a time signature, tempo, fade behavior, and a set of independently controllable Score Layers. This enables music that responds to gameplay — adding percussion during combat, muting melody during dialogue, or transitioning between intensity levels.

For asset structure and playback API, see the Framework API reference.

Score Arrangement asset in the O3DE Asset Editor

 

Contents


How It Works

A Score Arrangement Track is a data asset containing:

Field | What It Controls
Time Signature | Musical timing (4/4, 3/4, 6/8, etc.).
BPM | Tempo in beats per minute.
Fade Control | How layers fade in and out.
Score Layers | Individual audio tracks that play simultaneously.

Each Score Layer is an independent audio stream within the arrangement. Layers can be enabled or disabled at runtime, creating dynamic music that evolves based on game state.


Supported Time Signatures

4/4, 4/2, 12/8, 2/2, 2/4, 6/8, 3/4, 3/2, 9/8
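Tempo and time signature together determine musical timing. As a worked example of the arithmetic (assuming BPM counts quarter notes per minute — the gem's internal timing convention is not stated here):

```python
def bar_duration_seconds(bpm, beats_per_bar, beat_unit):
    """Length of one bar for a given tempo and time signature.
    Assumes BPM counts quarter notes per minute, so a beat unit of 8
    (eighth notes) lasts half as long as a quarter note. Illustrative
    arithmetic only; GS_Audio's internal timing may differ."""
    quarter = 60.0 / bpm                 # seconds per quarter note
    beat = quarter * (4.0 / beat_unit)   # seconds per notated beat
    return beats_per_bar * beat
```

So at 120 BPM, a 4/4 bar lasts two seconds while a 6/8 bar lasts one and a half.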


Glossary

Term | Meaning
Score Arrangement Track | A multi-layer music data asset with time signature, tempo, and controllable layers
Score Layer | An individual audio stream within an arrangement that can be enabled or disabled at runtime
BPM | Beats per minute — the tempo of the arrangement

For full definitions, see the Glossary.


See Also

For the full API, component properties, and C++ extension guide:

For related systems:


Get GS_Audio

GS_Audio — Explore this gem on the product page and add it to your project.

3.3.5 - Klatt Voice Synthesis

How to work with GS_Play Klatt voice synthesis — text-to-speech with 3D spatial audio, voice profiles, and inline parameter control.

GS_Play includes a built-in text-to-speech system based on Klatt formant synthesis. The KlattVoiceComponent converts text to speech in real time with configurable voice parameters. Voices are positioned in 3D space and attenuate with distance, making synthesized speech feel like it comes from the character speaking.

For component properties, voice parameter details, and the phoneme mapping system, see the Framework API reference.

Klatt Voice Profile asset in the O3DE Asset Editor

 

Contents


How It Works

  1. Configure a voice using a KlattVoiceProfile — set frequency, speed, waveform, formants, and pitch variance.
  2. Assign a KlattPhonemeMap — maps text characters to ARPABET phonemes for pronunciation.
  3. Speak text from ScriptCanvas or C++ — the system converts text to phonemes and synthesizes audio in real time.
  4. Position in 3D — the voice component uses KlattSpatialConfig for 3D audio positioning relative to the entity.

Voice Configuration

Parameter | What It Controls
Frequency | Base voice pitch.
Speed | Speech rate.
Waveform | Voice quality — Saw, Triangle, Sin, Square, Pulse, Noise, Warble.
Formants | Vocal tract resonance characteristics.
Pitch Variance | Random pitch variation for natural-sounding speech.
Declination | Pitch drop over the course of a sentence.

KTT Tags

KTT (Klatt Text Tags) allow inline parameter changes within speech text for expressive delivery:

"Hello <speed=0.5>world</speed>, how are <pitch=1.2>you</pitch>?"

The KlattCommandParser processes these tags during speech synthesis, enabling mid-sentence changes to speed, pitch, and other voice parameters.
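The kind of pass a tag parser performs can be sketched as follows: split the text into plain characters plus a list of positioned parameter events, where a closing tag resets its attribute. This is a structural illustration only — the real KTT grammar, attribute set, and reset behavior are defined in the KTT Voice Tags reference:

```python
import re

def strip_ktt_tags(text):
    """Split speech text into (plain_text, events), where each event is
    (character_index, attribute, value_or_None). A closing tag produces a
    None value, meaning 'reset this attribute'. Sketch of a tag-parsing
    pass; not the actual KlattCommandParser grammar."""
    plain, events = [], []
    pos = 0
    for m in re.finditer(r"<(/?)(\w+)(?:=([^>]+))?>", text):
        plain.append(text[pos:m.start()])
        index = sum(len(p) for p in plain)   # position in the plain text
        name, value = m.group(2), m.group(3)
        if m.group(1):                       # closing tag resets the attribute
            events.append((index, name, None))
        else:
            events.append((index, name, float(value)))
        pos = m.end()
    plain.append(text[pos:])
    return "".join(plain), events
```

The synthesizer can then apply each event when the reveal reaches its character index, which is what enables mid-sentence parameter changes.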

For the complete tag reference — all attributes, value ranges, and reset behavior — see the Framework API: KTT Voice Tags.


Phoneme Maps

Two base phoneme maps are available:

Map | Description
SoLoud_Default | Simple default mapping.
CMU_Full | Full CMU pronunciation dictionary mapping.

Custom phoneme overrides allow project-specific word pronunciations (character names, fantasy terms) without modifying the base map.


3D Spatial Audio

The KlattSpatialConfig controls how synthesized speech is positioned in 3D:

  • Voices attenuate with distance from the listener.
  • The KlattVoiceSystemComponent tracks the listener position and updates all active voices.
  • Multiple characters can speak simultaneously with correct spatial positioning.
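Distance attenuation, the first bullet above, is the core of spatial falloff. A common shape (assumed here — the actual KlattSpatialConfig curve is defined by the gem) is full gain inside a minimum distance, silence beyond a maximum, and a linear ramp between:

```python
def attenuate(gain, distance, min_dist, max_dist):
    """Linear distance attenuation sketch: full gain inside min_dist,
    silent beyond max_dist, linear falloff in between. Illustrates the
    kind of shape a spatial config controls; not the gem's actual curve."""
    if distance <= min_dist:
        return gain
    if distance >= max_dist:
        return 0.0
    t = (distance - min_dist) / (max_dist - min_dist)
    return gain * (1.0 - t)
```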

Quick Reference

Need | Bus | Method
Control a voice | KlattVoiceRequestBus | Voice synthesis methods (entity-addressed)
System-level voice control | KlattVoiceSystemRequestBus | Listener tracking, engine management

Glossary

Term | Meaning
Klatt Synthesis | A formant-based speech synthesis method that generates voice from frequency parameters
KTT Tags | Inline text tags that modify voice parameters mid-sentence during synthesis
Phoneme Map | A mapping from text characters to ARPABET phonemes for pronunciation
KlattSpatialConfig | Configuration for 3D audio positioning and distance attenuation of synthesized speech

For full definitions, see the Glossary.


See Also

For the full API, component properties, and C++ extension guide:

For related systems:


Get GS_Audio

GS_Audio — Explore this gem on the product page and add it to your project.

3.4 - GS_Cinematics

Node-graph dialogue sequences, cinematic stage management, polymorphic performances, and world-space UI with typewriter and audio babble.

GS_Cinematics is the complete dialogue and cinematic control system for GS_Play. It provides a node-graph authoring tool for branching dialogue sequences, a runtime sequencer with conditions and side effects, a UI layer for text display and player choice, and a Cinematics Manager for scene staging. Custom conditions, effects, and performance types are discovered automatically through O3DE serialization, so project-specific dialogue behaviors can be added without modifying the gem.

For architecture details, component properties, and extending the system in C++, see the GS_Cinematics API.


Quick Navigation

I want to… | Feature | API
Coordinate cinematic sequences and manage stage markers for actor positioning | Cinematics Manager | API
Author branching dialogue with conditions, effects, and performances in a node graph | Dialogue System | API
Display dialogue text, player choices, and speech babble on screen or in world space | Dialogue UI | API
Move actors to stage markers during dialogue with navigation or teleport | Performances | API

Installation

GS_Cinematics requires GS_Core, LyShine, and RecastNavigation. The node editor tools additionally require GraphCanvas and GraphModel as editor-only dependencies.

For a full guided walkthrough, follow the Simple Project Setup guide.

 

Quick Installation Summary

  1. Enable the GS_Cinematics gem in your project configuration.
  2. Create Cinematics Manager and Dialogue Manager prefabs, add to the Game Manager’s Managers list.
  3. Create .dialoguedb assets with the Dialogue Editor.
  4. Place DialogueSequencer and DialogueUI components in your level.
  5. Bake NavMesh in levels where PathTo performances will be used.

Cinematics Manager

The Cinematics Manager coordinates the begin and end of cinematic sequences, broadcasting enter/exit events so other systems know when to yield control. It maintains a registry of stage marker entities placed in each level — named anchor points that performers and cameras look up at runtime to determine positioning during a sequence.

Cinematics Manager API


Dialogue System

The Dialogue System is the authoring and runtime core. Dialogue content is stored in .dialoguedb assets containing actors, portraits, and sequences. Each sequence is a graph of polymorphic nodes — text, selection, random branch, effects, and performances. A visual node editor lets writers author graphs directly in the O3DE Editor. At runtime, the sequencer walks the graph, evaluates conditions, executes effects, and emits events for the UI layer.

Dialogue System API


Dialogue UI

The Dialogue UI feature set puts dialogue text and player choices on screen. DialogueUI handles screen-space text display with a Typewriter for character-by-character reveal. DialogueUISelection renders player choices as selectable buttons. World-space variants place speech bubbles above actors. Babble plays audio tied to the active speaker. The DialogueUIBridge routes sequencer events to the correct UI implementation and routes player selection back.

Dialogue UI API


Performances

Performances are polymorphic actions that move or reposition actors during dialogue. MoveTo translates actors to named stage markers, PathTo navigates via NavMesh, and Reposition teleports instantly. All run asynchronously — the sequencer waits for completion before advancing. Custom performance types can be created by extending the performance base class.

Performances API


See Also

For the full API, component properties, and C++ extension guide:

For step-by-step project setup:


Get GS_Cinematics

GS_Cinematics — Explore this gem on the product page and add it to your project.

3.4.1 - Cinematics Manager

How to coordinate cinematic sequences in GS_Play — beginning and ending cutscenes, registering stage markers, and reacting to cinematic events from scripts.

The Cinematics Manager is the GS_Core-integrated manager component for the GS_Cinematics system. It signals when a cinematic sequence begins and ends, broadcasts events so other systems — UI, movement, input — know when to yield control to a cutscene, and maintains a registry of named CinematicStageMarkerComponent entities placed in each level.

For architecture details, component properties, and extending the system in C++, see the Framework API reference.

Cinematics Manager component in the O3DE Inspector

 

Contents


Stage Markers

Cinematic Stage Marker component in the O3DE Inspector

Stage markers are named anchor entities placed in each level that serve as spatial reference points for cinematic sequences. Performers and camera systems look up markers by name at runtime to determine where actors should stand, face, or move to during a cutscene.

This design decouples authored dialogue data from level-specific layout. The same sequence asset plays in any level as long as that level provides CinematicStageMarkerComponent entities with the expected names.

Component | Purpose
CinematicStageMarkerComponent | Marks a named position in the level for cinematic staging.

To register a marker, add CinematicStageMarkerComponent to an entity in the level and give it a name. The Cinematics Manager automatically discovers and registers all markers in the level on startup via RegisterStageMarker. At runtime, any system can retrieve a marker entity by name with GetStageMarker.
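The register-then-look-up flow above amounts to a name-to-entity dictionary. A minimal sketch (entity handles are plain strings here; in O3DE they would be entity IDs, and the real registry lives inside the Cinematics Manager):

```python
class StageMarkerRegistry:
    """Name-to-entity registry mirroring the RegisterStageMarker /
    GetStageMarker behavior described above. Behavioral sketch only;
    not the gem's implementation."""

    def __init__(self):
        self._markers = {}

    def register(self, name, entity):
        self._markers[name] = entity

    def get(self, name):
        # Returns None for unknown names so callers can fall back gracefully.
        return self._markers.get(name)
```

Because sequences reference markers only by name, the same sequence asset works in any level whose registry contains the expected names.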


Cinematic Events

When a cinematic begins and ends, the Cinematics Manager broadcasts events on CinematicsManagerNotificationBus. Listen to these in any system that needs to yield or reclaim control during a cutscene — player input, HUD, AI, camera.

Event | When It Fires | What to Do
EnterCinematic | A cinematic sequence has started. | Disable player input, hide the HUD, suspend AI.
ExitCinematic | The cinematic sequence has ended. | Re-enable input, restore HUD, resume AI.

Starting and Ending Cinematics

Call BeginCinematic to signal that a cinematic is starting and EndCinematic when it completes. These calls broadcast EnterCinematic and ExitCinematic respectively. They do not drive animation or sequence playback directly — that is the role of DialogueSequencerComponent. The Cinematics Manager handles the global state change so all listening systems respond in one coordinated broadcast.

BusMethodWhat It Does
CinematicsManagerRequestBusBeginCinematicBroadcasts EnterCinematic to all listeners.
CinematicsManagerRequestBusEndCinematicBroadcasts ExitCinematic to all listeners.
CinematicsManagerRequestBusRegisterStageMarkerAdds a marker to the registry by name.
CinematicsManagerRequestBusGetStageMarkerReturns the entity for a marker by name.
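The "one coordinated broadcast" pattern can be sketched with plain callables standing in for EBus handlers. This illustrates the state-change-plus-notify behavior described above, not the actual C++ bus machinery:

```python
class CinematicsManagerSketch:
    """Minimal enter/exit broadcast in the style of BeginCinematic /
    EndCinematic: every registered listener hears the state change in one
    pass. Listener callables stand in for notification bus handlers."""

    def __init__(self):
        self._listeners = []
        self.in_cinematic = False

    def connect(self, listener):
        self._listeners.append(listener)

    def begin_cinematic(self):
        self.in_cinematic = True
        for fn in self._listeners:
            fn("EnterCinematic")

    def end_cinematic(self):
        self.in_cinematic = False
        for fn in self._listeners:
            fn("ExitCinematic")
```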

ScriptCanvas Usage

Reacting to Cinematic State

To pause gameplay systems when a cinematic starts and resume them when it ends, connect to CinematicsManagerNotificationBus:

Triggering a Cinematic

To start a cinematic from a trigger or cutscene entity, call BeginCinematic, drive the sequence through the Dialogue Sequencer, then call EndCinematic on completion:

Looking Up a Stage Marker


Quick Reference

Need | Bus | Method / Event
Start a cinematic | CinematicsManagerRequestBus | BeginCinematic
End a cinematic | CinematicsManagerRequestBus | EndCinematic
Register a stage marker | CinematicsManagerRequestBus | RegisterStageMarker
Retrieve a stage marker entity | CinematicsManagerRequestBus | GetStageMarker
Know when a cinematic starts | CinematicsManagerNotificationBus | EnterCinematic
Know when a cinematic ends | CinematicsManagerNotificationBus | ExitCinematic

Glossary

Term | Meaning
Stage Marker | A named anchor entity in a level used as a spatial reference for cinematic positioning
Cinematic | A global state where the Cinematics Manager has signaled that a cutscene is in progress

For full definitions, see the Glossary.


See Also

For the full API, component properties, and C++ extension guide:

For related systems:


Get GS_Cinematics

GS_Cinematics — Explore this gem on the product page and add it to your project.

3.4.2 - Dialogue System

How to author and play back branching dialogue in GS_Play — dialogue databases, node types, conditions, effects, and the runtime sequencer.

The Dialogue System is the authoring and runtime core of GS_Cinematics. Dialogue content is stored in .dialoguedb assets (DialogueDatabase), which contain named actors and a collection of DialogueSequence records. Each sequence is a graph of polymorphic nodes. At runtime, GS_DialogueManagerComponent manages the database and maps performer names to entities in the level, while DialogueSequencerComponent drives playback — walking the graph, evaluating conditions on branches, executing effects, and emitting events that UI and other systems consume.

For architecture details, component properties, and extending the system in C++, see the Framework API reference.

Dialogue Manager component in the O3DE Inspector

 

Contents


Dialogue Database

Dialogue Authoring Pattern Graph

Breakdown

A dialogue sequence is authored in the node editor, stored in a .dialoguedb asset, and driven at runtime by the Dialogue Manager and Sequencer:

Layer | What It Means
DialogueDatabase | Stores named actors and sequences. Loaded at runtime by the Dialogue Manager.
DialogueSequence | A directed node graph. The Sequencer walks from startNodeId through Text, Selection, Effects, and Performance nodes.
Conditions | Polymorphic evaluators on branches. Failed conditions skip that branch automatically.
Effects | Polymorphic actions at EffectsNodeData nodes — set records, toggle entities.
Performers | Named actor anchors in the level. Resolved from database actor names via DialoguePerformerMarkerComponent.

Conditions, effects, and performances are discovered automatically at startup — custom types from any gem appear in the editor without modifying GS_Cinematics.

E Indicates extensible classes and methods.

Patterns - Complete list of system patterns used in GS_Play.

DialogueDatabase Asset

Dialogue Database asset in the O3DE Editor

The DialogueDatabase is a .dialoguedb asset authored in the O3DE node editor. It is the single source of truth for all dialogue content in a project section.

Asset Contents | Purpose
Actors | Named character definitions with portrait and metadata.
Sequences | Named DialogueSequence records, each a graph of nodes.

Load a database at runtime by calling ChangeDialogueDatabase on DialogueManagerRequestBus. The manager resolves performer names from the database against DialoguePerformerMarkerComponent entities placed in the current level.


Node Types

Each sequence is a directed graph of DialogueNodeData nodes. The sequencer walks the graph starting from startNodeId and advances through the nodes in order.

Node Type | What It Does
TextNodeData | Displays a speaker line. Supports LocalizedStringId for localization.
SelectionNodeData | Presents the player with a list of choices. Branches based on selection.
RandomNodeData | Selects a random outgoing branch.
EffectsNodeData | Executes one or more DialogueEffect objects without advancing to a text node.
PerformanceNodeData | Triggers a DialoguePerformance action and waits for OnPerformanceComplete before continuing.
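The traversal itself is a simple walk from startNodeId, advancing node by node. A structural sketch — the real DialogueNodeData graph carries branches, conditions, and payloads, none of which are modeled here:

```python
def walk_sequence(nodes, start_node_id):
    """Follow a sequence graph from its start node, collecting visited node
    kinds in order. Each node here is {'kind': ..., 'next': id_or_None};
    only the basic linear traversal is shown, not branching or waiting."""
    visited = []
    node_id = start_node_id
    while node_id is not None:
        node = nodes[node_id]
        visited.append(node["kind"])
        node_id = node["next"]
    return visited
```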

Conditions

Conditions are polymorphic objects attached to sequence branches. The sequencer evaluates all conditions on a branch before following it. Branches whose conditions fail are skipped.

Condition Type | What It Evaluates
Boolean_DialogueCondition | A base boolean comparison.
Record_DialogueCondition | Checks a game record via the RecordKeeper system. Extends Boolean_DialogueCondition.
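The branch-gating rule above — all conditions on a branch must pass, failed branches are skipped — can be sketched like this. Conditions are modeled as (record_key, expected_value) pairs checked against a record store; the real condition types are polymorphic C++ objects:

```python
def pick_branch(branches, records):
    """Return the target of the first branch whose conditions all pass,
    skipping branches with failed conditions -- the gating behavior
    described above. Data shapes here are invented for the sketch."""
    for branch in branches:
        if all(records.get(key) == value for key, value in branch["conditions"]):
            return branch["target"]
    return None
```

A branch with no conditions always passes, which makes it a natural fallthrough default.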

Effects

Effects are polymorphic objects executed when the sequencer reaches an EffectsNodeData node. Effects can also be reversed.

Effect Type | What It Does
SetRecords_DialogueEffect | Sets one or more game records via the RecordKeeper system.
ToggleEntitiesActive_DialogueEffect | Toggles one or more entities active or inactive.

Performances

Performances are polymorphic objects executed when the sequencer reaches a PerformanceNodeData node. A performance can be blocking — the sequencer pauses and waits for OnPerformanceComplete before advancing — or non-blocking, where dialogue continues immediately after the performance fires.

Like conditions and effects, performance types are discovered automatically at startup via the type registry. Custom performance types from any gem appear in the editor without modifying GS_Cinematics.

Performance Type | What It Does
MoveTo_DialoguePerformance | Smoothly moves a named performer to a CinematicStageMarkerComponent position. Fires MoveTo_PerformanceNotificationBus — a companion component on the performer entity responds and moves it, then signals completion.
PathTo_DialoguePerformance | Navigates a named performer to a marker using the scene navmesh. Uses RecastNavigation pathfinding through the world geometry rather than a direct interpolation.
RepositionPerformer_DialoguePerformance | Instantly teleports a performer to a marker. Non-blocking — dialogue advances without waiting.

Runtime Playback

GS_DialogueManagerComponent

The Dialogue Manager is the top-level manager for all dialogue. It holds the active DialogueDatabase, maps performer names to level entities via DialoguePerformerMarkerComponent, and is the entry point for starting sequences by name.

Bus | Method | What It Does
DialogueManagerRequestBus | StartDialogueSequenceByName | Starts a named sequence from the active database.
DialogueManagerRequestBus | ChangeDialogueDatabase | Loads a different DialogueDatabase asset.
DialogueManagerRequestBus | RegisterPerformerMarker | Registers a performer entity by name for the current level.
DialogueManagerRequestBus | GetPerformer | Returns the entity for a named performer.

DialogueSequencerComponent

The Dialogue Sequencer drives sequence playback. It walks the node graph, evaluates conditions, executes effects, triggers performances, and emits notifications when text lines begin and when the sequence completes. It is typically placed on a dedicated sequencer entity in the level alongside DialogueUIBridgeComponent.

Bus | Method / Event | Purpose
DialogueSequencerRequestBus | StartDialogueBySequence | Begins playback of a given sequence object.
DialogueSequencerNotificationBus | OnDialogueTextBegin | Fires when a TextNodeData begins — carries speaker and text data.
DialogueSequencerNotificationBus | OnDialogueSequenceComplete | Fires when the sequence reaches its end node.

Localization

Text nodes store speaker lines as LocalizedStringId references rather than raw strings. A LocalizedStringId holds a key and a default fallback text. At runtime, the sequencer calls Resolve() on each LocalizedStringId, which looks up the key in the active LocalizedStringTable and returns the localized string for the current locale. If no table is loaded or the key is not found, the default text is returned.

To use localization, load a LocalizedStringTable asset through your project’s initialization flow before any dialogue plays.
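The Resolve() behavior described above — table lookup with an authored fallback — can be sketched as follows. The real class lives in C++; field and method names here just mirror the description:

```python
class LocalizedStringIdSketch:
    """Key plus fallback text, resolved against an active string table.
    Illustrates the lookup-with-fallback behavior described above;
    not the gem's actual class."""

    def __init__(self, key, default_text):
        self.key = key
        self.default_text = default_text

    def resolve(self, table):
        # A missing table or missing key falls back to the authored default.
        if table and self.key in table:
            return table[self.key]
        return self.default_text
```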


ScriptCanvas Usage

Starting a Dialogue Sequence

Reacting to Sequence States

Listen on DialogueSequencerNotificationBus to receive speaker and text data as each line begins:


Quick Reference

Need | Bus | Method / Event
Start a sequence by name | DialogueManagerRequestBus | StartDialogueSequenceByName
Load a different database | DialogueManagerRequestBus | ChangeDialogueDatabase
Register a performer in the level | DialogueManagerRequestBus | RegisterPerformerMarker
Get a performer entity | DialogueManagerRequestBus | GetPerformer
Start a sequence object directly | DialogueSequencerRequestBus | StartDialogueBySequence
React to a text line starting | DialogueSequencerNotificationBus | OnDialogueTextBegin
React to sequence completion | DialogueSequencerNotificationBus | OnDialogueSequenceComplete

Glossary

Term | Meaning
DialogueDatabase | A .dialoguedb asset containing actors and dialogue sequences
DialogueSequence | A graph of nodes that defines a single dialogue conversation
DialogueNodeData | A polymorphic node in the sequence graph (Text, Selection, Random, Effects, Performance)
DialogueCondition | A polymorphic evaluator attached to branches that gates progression
DialogueEffect | A polymorphic action executed at EffectsNodeData nodes (e.g., setting records)
DialoguePerformance | A polymorphic action executed at PerformanceNodeData nodes — moves, paths, or repositions performers. Can be blocking or non-blocking
Performer | A named actor entity in the level mapped from the database via DialoguePerformerMarkerComponent

For full definitions, see the Glossary.


See Also

For the full API, component properties, and C++ extension guide:

For related systems:


Get GS_Cinematics

GS_Cinematics — Explore this gem on the product page and add it to your project.

3.4.2.1 - Dialogue UI

How to display dialogue text, player choices, and typewriter effects in GS_Play — screen-space and world-space UI components and the bridge that connects them to the sequencer.

The Dialogue UI layer is the display side of the GS_Cinematics system. It receives events from DialogueSequencerComponent through DialogueUIBridgeComponent and routes them to the correct UI implementation — screen-space for HUD-style dialogue or world-space for speech bubbles above actors. Player choices are handled by selection components, and the TypewriterComponent reveals text character-by-character. BabbleComponent optionally plays per-character audio babble to give speakers a voice.

For architecture details, component properties, and extending the system in C++, see the Framework API reference.

DialogueUIComponent canvas in the O3DE UI Editor

 

Contents


Component Overview

Component | Space | Purpose
DialogueUIComponent | Screen | Displays the current speaker line on the HUD.
WorldDialogueUIComponent | World | Displays speech bubbles positioned above actors in 3D space.
DialogueUISelectionComponent | Screen | Renders player choice options on the HUD.
WorldDialogueUISelectionComponent | World | Renders player choice options in 3D world space.
DialogueUIBridgeComponent | | Routes sequencer events to UI and player input back to sequencer.
TypewriterComponent | | Reveals text one character at a time with configurable timing.
BabbleComponent | | Plays per-character audio babble for the active speaker.

DialogueUIBridgeComponent

The Bridge component is the central connector between the sequencer and the UI. Place it on the same entity as DialogueSequencerComponent. It listens for OnDialogueTextBegin and OnDialogueSequenceComplete from the sequencer and forwards those events to whatever UI components are registered with it. It also receives player selection events from the selection UI and forwards them back to the sequencer.

This design keeps the sequencer and the display fully decoupled — swapping in a new UI implementation only requires registering it with the Bridge.

Bus | Method | What It Does
DialogueUIBridgeRequestBus | RunDialogue | Sends a text line to the registered dialogue UI.
DialogueUIBridgeRequestBus | RunSelection | Sends selection options to the registered selection UI.
DialogueUIBridgeRequestBus | RegisterDialogueUI | Registers a UI entity as the active dialogue display target.
DialogueUIBridgeRequestBus | CloseDialogue | Tells the registered UI to close.
DialogueUIBridgeNotificationBus | OnDialogueComplete | Fires when the dialogue UI reports it has finished displaying.
DialogueUIBridgeNotificationBus | OnSelectionComplete | Fires when the player makes a selection.

Dialogue Display Components

Screen-Space Text

DialogueUIComponent handles HUD-style dialogue display. Add it to a UI canvas entity. It exposes the active TypewriterComponent so other systems can check typewriter state or force completion.

World-Space Speech Bubbles

WorldDialogueUIComponent extends DialogueUIComponent for world-space placement. It positions the dialogue panel above the speaking actor’s entity in 3D space. Use this for over-the-shoulder dialogue or conversations where the camera stays in the world rather than cutting to a UI overlay.


Selection Components

Screen-Space Choices

DialogueUISelectionComponent renders the player’s available choices as a list on the HUD. Each choice is backed by a DialogueSelectButtonComponent entity that is configured with option text and a selection index. When the player activates a button, it fires back through the Bridge to the sequencer.

World-Space Choices

WorldDialogueUISelectionComponent extends DialogueUISelectionComponent for world-space display. Choices appear positioned in 3D space rather than as a HUD overlay, useful for games with diegetic UI.


TypewriterComponent

TypewriterComponent in the O3DE Inspector

The TypewriterComponent reveals a string one character at a time. It fires OnTypeFired for each character revealed and OnTypewriterComplete when the full string is displayed. Use ForceComplete to instantly reveal the remaining text — typically wired to a player skip input — and ClearTypewriter to reset the display to empty.

| Bus | Method / Event | What It Does |
|---|---|---|
| TypewriterRequestBus | StartTypewriter(text) | Begins revealing the given string character by character. |
| TypewriterRequestBus | ForceComplete | Instantly reveals all remaining characters. |
| TypewriterRequestBus | ClearTypewriter | Clears the display and resets state. |
| TypewriterNotificationBus | OnTypeFired | Fires each time a character is revealed. |
| TypewriterNotificationBus | OnTypewriterComplete | Fires when the full string has been revealed. |

BabbleComponent

BabbleComponent pairs with TypewriterComponent to play a short sound for each character revealed, giving the impression of a speaker’s voice. Each actor has a SpeakerBabbleEvents record that maps it to a specific babble sound profile. The component returns the correct BabbleToneEvent for the current speaker via GetBabbleEvent, which the typewriter calls on each OnTypeFired.

| Bus | Method | What It Does |
|---|---|---|
| BabbleRequestBus | GetBabbleEvent | Returns the babble audio event for the active speaker. |

ScriptCanvas Usage

Forcing Typewriter Completion on Skip Input

Wire a player input action to ForceComplete so pressing a button instantly reveals the current line:
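A minimal wiring, assuming an input event named "SkipDialogue" (the action name is project-defined):

[Input Event "SkipDialogue" → Pressed]
    └─► [TypewriterRequestBus → ForceComplete]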

Reacting to Typewriter Events
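The two notifications cover per-character and end-of-line reactions. A sketch (the babble hookup assumes the speaking actor has a BabbleComponent):

[TypewriterNotificationBus → OnTypeFired]
    └─► [BabbleRequestBus → GetBabbleEvent]
    └─► [Play the returned audio event]

[TypewriterNotificationBus → OnTypewriterComplete]
    └─► [Show the "continue" prompt]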


Quick Reference

| Need | Bus | Method / Event |
|---|---|---|
| Route sequencer events to UI | DialogueUIBridgeRequestBus | RunDialogue / RunSelection |
| Register a UI entity with the bridge | DialogueUIBridgeRequestBus | RegisterDialogueUI |
| Close the dialogue UI | DialogueUIBridgeRequestBus | CloseDialogue |
| React when dialogue UI finishes | DialogueUIBridgeNotificationBus | OnDialogueComplete |
| React when player makes a choice | DialogueUIBridgeNotificationBus | OnSelectionComplete |
| Start typewriter reveal | TypewriterRequestBus | StartTypewriter |
| Skip / instantly reveal text | TypewriterRequestBus | ForceComplete |
| Clear the typewriter display | TypewriterRequestBus | ClearTypewriter |
| React to each character reveal | TypewriterNotificationBus | OnTypeFired |
| React to full text revealed | TypewriterNotificationBus | OnTypewriterComplete |
| Get babble event for speaker | BabbleRequestBus | GetBabbleEvent |

Glossary

| Term | Meaning |
|---|---|
| DialogueUIBridge | The connector component that routes sequencer events to UI and player input back to the sequencer |
| Typewriter | A text reveal component that displays characters one at a time with configurable timing |
| Babble | Per-character audio playback that simulates a speaker’s voice during typewriter text reveal |

For full definitions, see the Glossary.


See Also

For the full API, component properties, and C++ extension guide:

For related systems:


Get GS_Cinematics

GS_Cinematics — Explore this gem on the product page and add it to your project.

3.4.2.2 - Performances

How to move and reposition actors during dialogue in GS_Play — the polymorphic performance system, built-in movement types, and async completion.

Performances are polymorphic actions that move or reposition actors during dialogue sequences. The sequencer triggers a performance when it reaches a PerformanceNodeData node and waits for OnPerformanceComplete before advancing. This async model lets multi-step actor choreography complete fully before dialogue continues. Three built-in performance types cover direct movement, navmesh navigation, and instant teleportation. Custom types can be created by extending the base class.

For architecture details, component properties, and extending the system in C++, see the Framework API reference.

Dialogue Performer Marker component in the O3DE Inspector

Performance Node in the Dialogue Editor

 



How Performances Work

When the sequencer encounters a PerformanceNodeData node, it:

  1. Resolves the named performer entity via DialogueManagerRequestBus → GetPerformer.
  2. Instantiates the performance object specified on the node.
  3. Calls DoPerformance() — the public entry point.
  4. Waits. The sequencer does not advance until the performance broadcasts OnPerformanceComplete.
  5. Resumes from the next node once completion is received.

The performance itself handles the movement logic, monitors completion, and signals back through its notification bus. This means a single PerformanceNodeData can represent any action that takes time — walking to a spot, running a path, playing an animation, triggering a VFX sequence — as long as it broadcasts completion when done.
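The steps above can be sketched as a single pass through the sequencer:

[Sequencer reaches PerformanceNodeData]
    └─► [DialogueManagerRequestBus → GetPerformer(name)]
    └─► [Instantiate the performance → DoPerformance()]
    └─► [Wait for OnPerformanceComplete]
    └─► [Resume from the next node]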


Built-in Performance Types

| Performance Type | Movement Method | When to Use |
|---|---|---|
| MoveTo_DialoguePerformance | Direct translation to marker(s) | Open areas, stylized movement, non-physical traversal. |
| PathTo_DialoguePerformance | RecastNavigation pathfinding | Realistic navigation around obstacles and geometry. |
| RepositionPerformer_DialoguePerformance | Instant teleport to marker | Off-screen repositioning between scenes, no visible movement needed. |

All three extend DialoguePerformance, the abstract base class. The base class provides DoPerformance(), ExecutePerformance(), and FinishPerformance() hooks, plus TickBus integration for per-frame movement updates.


MoveTo_DialoguePerformance

MoveTo_DialoguePerformance translates an actor toward one or more named stage markers using direct movement — no obstacle avoidance, no navmesh. It broadcasts movement requests over MoveTo_PerformanceNotificationBus. Listening systems (typically the performer’s movement component) receive those requests and execute the actual translation. When the actor reaches the final marker, the performance calls FinishPerformance() and broadcasts completion.

Use this for stylized games where physical path correctness is less important than snappy, predictable actor placement, or for any case where the path between actor and marker is guaranteed to be clear.

| Bus | Purpose |
|---|---|
| MoveTo_PerformanceRequestBus | Query state of the performance. |
| MoveTo_PerformanceNotificationBus | Receives move-to-marker requests from the performance. |

PathTo_DialoguePerformance

PathTo_DialoguePerformance navigates an actor to one or more named stage markers using the RecastNavigation navmesh. It requests a path from the navmesh, walks the actor along that path, and completes when the actor arrives at the final marker. Use this in games with detailed geometry where actors must walk around walls, furniture, and obstacles rather than moving in a straight line.

RecastNavigation must be enabled in the project and a navmesh must be baked in any level where PathTo performances are used.

| Bus | Purpose |
|---|---|
| PathTo_PerformanceRequestBus | Query state of the performance. |
| PathTo_PerformanceNotificationBus | Receives path-to-marker requests from the performance. |

RepositionPerformer_DialoguePerformance

RepositionPerformer_DialoguePerformance teleports an actor instantly to a named stage marker. It does not animate, translate, or navigate — it sets position directly. Useful for placing actors at the start of a new scene, recovering from a previous sequence that ended far from the required starting point, or repositioning actors that are off-camera and do not need visible movement.

| Bus | Purpose |
|---|---|
| RepositionPerformer_PerformanceNotificationBus | Receives teleport requests from the performance. |

Async Completion

All performances signal completion asynchronously. The sequencer does not poll or time out — it simply waits for the performance to call FinishPerformance(), which broadcasts OnPerformanceComplete over the appropriate notification bus. This means:

  • A performance can take any amount of time.
  • A performance can be driven by external events — animation callbacks, physics arrival, etc.
  • Multiple performances can run in parallel if the node graph is structured to fork, because each notifies independently.

When writing custom performance types, always call FinishPerformance() when the action is done. Forgetting to do so will stall the sequencer indefinitely.
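For a custom type, the contract reduces to: start work in the execute hook, then signal completion exactly once. A sketch using the hook names described above:

DoPerformance()              → public entry point, called by the sequencer
    └─► ExecutePerformance() → your movement or action logic starts here
          (work continues over ticks or external callbacks)
    └─► FinishPerformance()  → call exactly once when the action is done
          └─► broadcasts OnPerformanceComplete → sequencer advances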


Extending with Custom Performances

Custom performance types are discovered automatically through O3DE serialization at startup. Extend DialoguePerformance, reflect it, and your custom type appears in the node editor’s performance picker and can be placed on any PerformanceNodeData node.

See the Framework API reference for the full base class interface and extension guide.


ScriptCanvas Usage

Performances are authored in the dialogue node graph and executed automatically by the sequencer. Most projects do not need to drive performances from ScriptCanvas directly. The common scripting patterns involve the movement side — receiving move or path requests from a performance and executing them on the performer entity.

Receiving a MoveTo Request

[MoveTo_PerformanceNotificationBus → StartMoveToMarker(markerEntity)]
    └─► [Move performer toward markerEntity position]
    └─► [When arrived → MoveTo_PerformanceRequestBus → ReportArrival]

Receiving a Reposition Request

[RepositionPerformer_PerformanceNotificationBus → RepositionToMarker(markerEntity)]
    └─► [Set performer transform to markerEntity transform]

Quick Reference

| Need | Bus | Method / Event |
|---|---|---|
| Receive a MoveTo move request | MoveTo_PerformanceNotificationBus | StartMoveToMarker |
| Receive a PathTo navigation request | PathTo_PerformanceNotificationBus | StartPathToMarker |
| Receive a reposition teleport request | RepositionPerformer_PerformanceNotificationBus | RepositionToMarker |
| Query active MoveTo state | MoveTo_PerformanceRequestBus | (see Framework API) |
| Query active PathTo state | PathTo_PerformanceRequestBus | (see Framework API) |

Glossary

| Term | Meaning |
|---|---|
| Performance | A polymorphic action that moves or repositions an actor during a dialogue sequence |
| PerformanceNodeData | A dialogue graph node that triggers a performance and waits for completion |
| Async Completion | The pattern where the sequencer waits for OnPerformanceComplete before advancing |

For full definitions, see the Glossary.


See Also

For the full API, component properties, and C++ extension guide:

For related systems:


Get GS_Cinematics

GS_Cinematics — Explore this gem on the product page and add it to your project.

3.5 - GS_Environment

World time and environmental systems for GS_Play — time-of-day progression, day/night cycle management, and data-driven sky configuration.

GS_Environment manages the living world — the passage of time, the shift between day and night, and the visual appearance of the sky. It provides a singleton time authority that other systems subscribe to for world tick events and phase-change notifications, decoupling every time-dependent system from managing its own clock. Sky presentation is data-driven through configuration assets.

For architecture details, component properties, and extending the system in C++, see the GS_Environment API.


Quick Navigation

| I want to… | Feature | API |
|---|---|---|
| Control world time, time passage speed, or respond to day/night changes | Time Manager | API |
| Define how the sky looks at different times of day with data assets | Sky Configuration | API |

Installation

GS_Environment requires GS_Core and the Atom renderer. Add both to your project.

For a full guided walkthrough, follow the Simple Project Setup guide.

 

Quick Installation Summary

  1. Enable the GS_Environment gem in your project configuration.
  2. Create a Time Manager prefab and add it to the Game Manager’s Managers list.
  3. Create SkyColourConfiguration assets for your sky appearance.

Time Manager

The Time Manager is the singleton authority for world time. It advances time each tick according to a configurable speed, exposes query and control methods, and broadcasts WorldTick and DayNightChanged events so any system can react to time progression without polling.

Time Manager API


Sky Configuration

The Sky Configuration system defines how the sky looks at different times of day through data assets. SkyColourConfiguration assets hold colour values for dawn, midday, dusk, and night — swappable per-region, per-weather, or per-level without modifying entity hierarchies.

Sky Configuration API


See Also

For the full API, component properties, and C++ extension guide:

For step-by-step project setup:


Get GS_Environment

GS_Environment — Explore this gem on the product page and add it to your project.

3.5.1 - Time Manager

How to work with the GS_Play Time Manager — world time, day/night cycles, and time passage control.

The Time Manager is the singleton controller for world time in your project. It drives the time-of-day value, controls how fast time passes, determines day/night state, and broadcasts timing events that other systems (lighting, AI schedules, environmental effects) can react to.

For component properties and API details, see the Framework API reference.

Time Manager component in the O3DE Inspector

 



How It Works

The Time Manager maintains a continuous time-of-day value. Each frame, it advances the time based on the configured passage speed and broadcasts a WorldTick event. When the time crosses the day/night threshold, it broadcasts DayNightChanged.

You set the main camera reference so the Time Manager can position sky effects relative to the player’s view.
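Per frame, the flow looks like this:

[Tick]
    └─► [Advance time of day by deltaTime × passage speed]
    └─► [TimeManagerNotificationBus → WorldTick]
    └─► [If the day/night threshold was crossed → TimeManagerNotificationBus → DayNightChanged]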


Controlling Time

| ScriptCanvas Node | What It Does |
|---|---|
| SetTimeOfDay(time) | Sets the current time of day directly. |
| GetTimeOfDay | Returns the current time of day. |
| SetTimePassageSpeed(speed) | Controls how fast time advances (multiplier). |
| GetWorldTime | Returns the total elapsed world time. |
| IsDay | Returns whether it is currently daytime. |
| SetMainCam(entity) | Sets the camera entity for sky positioning. |

ScriptCanvas

Time Manager entity setup in the O3DE Editor
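For example, a debug input that jumps to a fixed time and slows time passage (the input name and values are illustrative):

[Input Event "DebugDusk" → Pressed]
    └─► [TimeManagerRequestBus → SetTimeOfDay(duskTime)]
    └─► [TimeManagerRequestBus → SetTimePassageSpeed(0.1)]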


Responding to Time Events

Time Manager entity setup in the O3DE Editor
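A common pattern branches on the new phase (a sketch using the TimeManager buses):

[TimeManagerNotificationBus → DayNightChanged]
    └─► [TimeManagerRequestBus → IsDay]
          ├─ true  → [Enable daytime behavior, e.g. AI schedules]
          └─ false → [Enable nighttime behavior, e.g. lighting changes]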


Time Manager Entity Configuration

Time Manager entity setup in the O3DE Editor

To drive the environment effects and weather systems, the Time Manager needs several entities and components available to it.

| Entity | What It Does |
|---|---|
| SkyPivot | Holds the rest of the celestial entities. Its pivot is the pivot of the planet. |
| Sun / Moon | Two directional lights pointed in opposite directions. One is disabled at a time; their colours and intensities are driven by the Time Manager sky configuration profile. |
| Moon Body | A graphic placed in the sky relative to the moon directional light. Purely cosmetic. |
| Planet Axis | An entity that carries celestial bodies at an offset, spinning with the rest of the sky. Purely cosmetic. |
| Stars | The stars component. If it’s inside the pivot, the stars follow the rotation of the sky. |

Quick Reference

| Need | Bus | Method / Event |
|---|---|---|
| Set time of day | TimeManagerRequestBus | SetTimeOfDay(time) |
| Get current time | TimeManagerRequestBus | GetTimeOfDay |
| Check if daytime | TimeManagerRequestBus | IsDay |
| Change time speed | TimeManagerRequestBus | SetTimePassageSpeed(speed) |
| React to time ticks | TimeManagerNotificationBus | WorldTick |
| React to day/night change | TimeManagerNotificationBus | DayNightChanged |

Glossary

| Term | Meaning |
|---|---|
| World Time | The continuously advancing time-of-day value maintained by the Time Manager |
| Time Passage Speed | A multiplier controlling how fast world time advances each frame |
| Day/Night Phase | The current phase (day or night) determined by the time-of-day threshold |

For full definitions, see the Glossary.


See Also

For the full API, component properties, and C++ extension guide:

For related systems:


Get GS_Environment

GS_Environment — Explore this gem on the product page and add it to your project.

3.5.2 - Sky Configuration

How to work with GS_Play sky configuration — data-driven sky color settings for time-of-day transitions.

Sky Colour Configuration assets define the visual appearance of the sky at different times of day. These are data assets created in the O3DE Asset Editor that the Time Manager references to drive sky color transitions as time progresses.

For asset structure and property details, see the Framework API reference.

Sky Colour Configuration asset in the O3DE Asset Editor

 



How It Works

A Sky Colour Configuration asset defines color values for key times of day (dawn, midday, dusk, night). The Time Manager samples the configuration based on the current time of day and interpolates between the defined colors to produce smooth transitions.

Different environments (desert, forest, underwater) can use different configuration assets. Swapping the active configuration changes the sky appearance without modifying entity hierarchies.
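Conceptually, the Time Manager's sampling is a keyframe interpolation across the day (the exact blend curve is an implementation detail):

skyColour(t) = lerp(colour[k], colour[k+1], (t − time[k]) / (time[k+1] − time[k]))
    where time[k] ≤ t < time[k+1], over the keys {dawn, midday, dusk, night}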


Glossary

| Term | Meaning |
|---|---|
| SkyColourConfiguration | A data asset defining sky color values for dawn, midday, dusk, and night |

For full definitions, see the Glossary.


See Also

For the full API, component properties, and C++ extension guide:

For related systems:


Get GS_Environment

GS_Environment — Explore this gem on the product page and add it to your project.

3.6 - GS_Interaction

Physics-based pulse events, proximity targeting with cursors, and data-driven world trigger zones.

GS_Interaction provides three independent but composable systems for making the world respond to entities. Pulsors broadcast typed events via physics volumes, the Targeting system finds and locks onto the best interactable in proximity with a cursor overlay, and World Triggers fire configurable responses from zones and conditions without requiring script code.

For architecture details, component properties, and extending the system in C++, see the GS_Interaction API.


Quick Navigation

| I want to… | Feature | API |
|---|---|---|
| Broadcast typed physics events from trigger volumes to receiving entities | Pulsors | API |
| Find and lock onto the best interactable entity in proximity with a cursor overlay | Targeting | API |
| Fire configurable responses from zones and conditions without scripting | World Triggers | API |

Installation

GS_Interaction requires GS_Core, LmbrCentral, LyShine, and CommonFeaturesAtom. Add all required gems to your project before placing Interaction components in a level.

For a full guided walkthrough, follow the Simple Project Setup guide.

 

Quick Installation Summary

  1. Enable the GS_Interaction gem in your project configuration.
  2. Configure physics collision layers for trigger volumes.
  3. Place Pulsor, Targeting, or World Trigger components on entities as needed.

Pulsors

The Pulsor system is a physics-driven event broadcast layer. A PulsorComponent on any entity with a trigger collider emits a typed pulse event when another entity enters or exits that volume. PulseReactorComponents on the receiving entity listen for those events and respond. Pulse types are polymorphic and registered at runtime, so project-specific types can be added without modifying GS_Interaction.

Pulsors API


Targeting

The Targeting system handles “which interactable entity is the player looking at or closest to right now?” A TargetingHandler on the player maintains a live registry of nearby targets, evaluates candidates on demand, and exposes the current target via bus events. Cursor components render a tracking reticle on screen that updates based on the selected target’s properties.

Targeting API


World Triggers

World Triggers are a data-driven system for firing game responses when conditions are met, with no script boilerplate required. TriggerSensor components define the condition side (interact, collider overlap, record check), and WorldTrigger components define the response (set record, toggle entity, change stage, print log). The combination lets most interactive world objects be authored entirely in the editor.

World Triggers API


See Also

For the full API, component properties, and C++ extension guide:

For step-by-step project setup:


Get GS_Interaction

GS_Interaction — Explore this gem on the product page and add it to your project.

3.6.1 - Pulsors

How to work with the GS_Interaction Pulsor system — emitting typed physics-based pulse events and reacting to them from scripts.

The Pulsor system is a physics-driven event broadcast layer. A PulsorComponent lives on any entity with a trigger collider and emits a typed pulse event when another entity enters or exits that volume. PulseReactorComponent instances on the receiving entity listen for those events and respond. Because pulse types are polymorphic and registered at runtime, you can define project-specific pulse types without modifying the Pulsor or Reactor components themselves.

For architecture details, component properties, and extending the system in C++, see the Framework API reference.

Pulsor component in the O3DE Inspector

 



How Pulses Work

Pulse Pattern Graph

Breakdown

Pulse Reactor component in the O3DE Inspector

When a physics trigger fires, the PulsorComponent iterates all PulseReactorComponent instances on the entering entity and calls ReceivePulses() on each one that returns true from IsReactor() for the emitted pulse type. Each reactor independently decides whether it handles a given pulse, so multiple reactors on one entity can each respond to a different subset of pulse types without interfering with each other.

| Step | What Happens |
|---|---|
| 1 — Collider overlap | Physics system detects an entity entering the Pulsor’s trigger volume. |
| 2 — Pulse emit | PulsorComponent reads its configured PulseType and prepares the event. |
| 3 — Reactor query | Each PulseReactorComponent on the entering entity is called with IsReactor() for that type. |
| 4 — Reaction | Reactors that return true have ReceivePulses() called and execute their response. |
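The dispatch in steps 3 and 4 amounts to a filtered iteration — a sketch, not the literal implementation:

On trigger enter (otherEntity):
    for each PulseReactorComponent on otherEntity:
        if reactor.IsReactor(pulseType):
            reactor.ReceivePulses(pulseType)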

E Indicates extensible classes and methods.

Patterns - Complete list of system patterns used in GS_Play.


Pulse Types

A pulse type is a class derived from the abstract PulseType base. It carries no payload — the type itself is the signal. Reactors match on type identity, so adding a new type automatically makes it distinct from all existing ones.

Built-In Pulse Types

| Type | Purpose |
|---|---|
| Debug_Pulse | Test and development use. Logs or prints when received. |
| Destruct_Pulse | Signals that the receiving entity should be destroyed or deactivated. |

Reactor Types

A reactor type is a class derived from the abstract ReactorType base. It defines what the receiving entity does when a matching pulse arrives. One entity can carry multiple reactors — one for each pulse type it needs to handle.

Built-In Reactor Types

| Type | Paired With | Behavior |
|---|---|---|
| Debug_Reactor | Debug_Pulse | Outputs a debug message when it receives a matching pulse. |
| Destructable_Reactor | Destruct_Pulse | Handles entity destruction or deactivation on pulse receipt. |

Extending with Custom Pulse Types

Custom pulse and reactor types are discovered automatically through O3DE serialization at startup. Any team or plugin can add new interaction semantics — damage types, pickup types, status effects — by extending the base class and reflecting it, without modifying GS_Interaction itself. Custom types appear automatically as options in the editor’s component property dropdowns and are available to all Pulsors and Reactors in the project.

For the full extension guide and C++ interface, see Framework API: Pulsors and Framework API: Pulses.


Components

| Component | Role | Key Bus |
|---|---|---|
| PulsorComponent | Emits a typed pulse when its trigger volume fires. | — |
| PulseReactorComponent | Receives pulses and executes a reaction if the type matches. | PulseReactorRequestBus (ById) |

ScriptCanvas Usage

PulseReactorNotificationBus exposes the reactor’s state for script-driven queries. A common pattern is to wait for a reactor to receive a pulse, then process the incoming pulse data with your own logic.
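A sketch of that pattern (the exact notification event name is in the Framework API reference):

[PulseReactorNotificationBus (ById) → pulse received]
    └─► [Read the pulse type]
    └─► [Run your own response logic]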


Quick Reference

| Need | Bus | Method |
|---|---|---|
| Check if entity handles a pulse type | PulseReactorRequestBus (ById) | IsReactor() |
| Manually trigger pulse processing | PulseReactorRequestBus (ById) | ReceivePulses() |

Glossary

| Term | Meaning |
|---|---|
| Pulsor | A component that emits a typed pulse event when its trigger volume fires |
| Pulse Reactor | A component that receives pulses and executes a response if the type matches |
| Pulse Type | A polymorphic class that identifies the kind of pulse being emitted |

For full definitions, see the Glossary.


See Also

For the full API, component properties, and C++ extension guide:

For related systems:


Get GS_Interaction

GS_Interaction — Explore this gem on the product page and add it to your project.

3.6.2 - Targeting

How to work with the GS_Interaction Targeting system — selecting interactable entities, managing cursor display, and handling interaction input from scripts.

The Targeting system answers the question “which interactable entity should the player act on right now?” A GS_TargetingHandlerComponent on the player maintains a live registry of nearby targets. Proximity detection volumes populate that registry automatically as the player moves. When the handler evaluates candidates, it selects the best target and broadcasts the result so the cursor layer and any downstream logic stay in sync.

For architecture details, component properties, and extending the system in C++, see the Framework API reference.

Targeting Handler component in the O3DE Inspector

 



How Targeting Works

The targeting pipeline runs across four layers — proximity detection, target registration, target selection, and cursor display. Each layer is a separate component so they can be mixed and matched for different entity configurations.

| Layer | Component | What It Does |
|---|---|---|
| Proximity detection | GS_TargetingInteractionFieldComponent | Physics trigger volume on targetable entities. Registers and unregisters with the handler as the player enters and exits range. |
| Target data | GS_TargetComponent / GS_InteractTargetComponent | Defines the target’s visual properties — size, spatial offset, colour, and sprite. |
| Target selection | GS_TargetingHandlerComponent | Maintains the candidate list, evaluates it, and exposes the selected target via GetInteractTarget. |
| Cursor display | GS_CursorComponent + GS_CursorCanvasComponent | Reads the selected target’s visual properties and renders a cursor on screen or in world space. |

Target Components

GS_TargetComponent is the base marker that makes an entity targetable. It carries four properties the cursor layer reads to display correctly:

| Property | Purpose |
|---|---|
| Size | How large the targeting cursor appears over this entity. |
| Offset | Spatial offset from the entity’s origin, used to position the cursor at a sensible point (e.g. centre of a character’s torso). |
| Colour | Tint applied to the cursor sprite when this target is selected. |
| Sprite | The cursor image to display for this target. |

GS_InteractTargetComponent extends GS_TargetComponent with specialised interaction semantics. Use it for any entity that can be interacted with via the InteractInputReaderComponent.


Targeting Handler

The GS_TargetingHandlerComponent is the central coordinator. It tracks all currently-registered candidates and determines which one is the best target at any given moment.

Notifications

Subscribe to GS_TargetingHandlerNotificationBus to react to targeting state changes:

| Notification | When It Fires |
|---|---|
| OnUpdateInteractTarget | The selected target has changed. Payload is the new target entity id (or invalid if none). |
| OnEnterStandby | The targeting handler has suspended evaluation (e.g. during a cutscene or level transition). |
| OnExitStandby | The targeting handler has resumed evaluation. |

ScriptCanvas — Listening for Target Changes
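A sketch (the cursor calls follow the Cursor System section):

[GS_TargetingHandlerNotificationBus (ById) → OnUpdateInteractTarget(targetId)]
    └─► [targetId valid?]
          ├─ yes → [GS_CursorRequestBus → SetCursorVisuals]
          └─ no  → [GS_CursorRequestBus → HideCursor]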

ScriptCanvas — Querying the Current Target
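A sketch of an on-demand query:

[Interact input pressed]
    └─► [GS_TargetingHandlerRequestBus (ById) → GetInteractTarget]
    └─► [If valid → trigger the target's interact response]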


Interaction Fields

GS_TargetingInteractionFieldComponent is a physics trigger volume placed on targetable entities. When the player’s collider overlaps the field, it calls RegisterTarget on the handler. When the player leaves, it calls UnregisterTarget. The handler never has to poll — the field layer pushes updates automatically.

The field’s volume shape (sphere, box, capsule) determines the interaction range independently of the entity’s visual or physics collision bounds, so you can have a large detection range with a small visible object.


Cursor System

Interact Cursor component in the O3DE Inspector

The cursor layer visualizes the current target selection on screen. It is composed of two components that work together:

| Component | Bus | Role |
|---|---|---|
| GS_CursorComponent | GS_CursorRequestBus (Single) | Global cursor coordinator. Manages canvas registration, visibility, offset, and position. |
| GS_CursorCanvasComponent | GS_CursorCanvasRequestBus (ById) | Per-canvas cursor. Renders the sprite and responds to hide/show instructions. |

ScriptCanvas — Cursor Control

[GS_CursorRequestBus → SetCursorVisuals]
    └─► Updates the sprite and colour from the selected target's properties.

[GS_CursorRequestBus → SetCursorPosition]
    └─► Moves the cursor to track the target's world position.

[GS_CursorRequestBus → HideCursor]
    └─► Hides the cursor when no target is selected or targeting is suspended.

UI Cursor Canvas

GS_CursorCanvasComponent UI canvas configuration in the O3DE UI Editor

A LyShine UI canvas that pairs with the in-world GS_CursorComponent.

The cursor canvas applies visuals to the UI image on behalf of the targeting system.


Interaction Input

InteractInputReaderComponent extends GS_Core::GS_InputReaderComponent to map raw button input to interaction events.

Place it on the same entity as the GS_TargetingHandlerComponent. When the player presses the configured interact button, the reader fires an event the handler can act on to confirm the current selection or trigger the target’s interact action.

This keeps input handling decoupled from targeting logic — you can swap input mappings or replace the reader component without touching the handler.


Quick Reference

| Need | Bus | Method / Event |
|---|---|---|
| Get the current target | GS_TargetingHandlerRequestBus (ById) | GetInteractTarget |
| Manually register a target | GS_TargetingHandlerRequestBus (ById) | RegisterTarget |
| Manually unregister a target | GS_TargetingHandlerRequestBus (ById) | UnregisterTarget |
| Know when the target changes | GS_TargetingHandlerNotificationBus (ById) | OnUpdateInteractTarget |
| Know when targeting suspends | GS_TargetingHandlerNotificationBus (ById) | OnEnterStandby |
| Know when targeting resumes | GS_TargetingHandlerNotificationBus (ById) | OnExitStandby |
| Hide the cursor | GS_CursorRequestBus | HideCursor |
| Update cursor position | GS_CursorRequestBus | SetCursorPosition |
| Update cursor appearance | GS_CursorRequestBus | SetCursorVisuals |

Glossary

| Term | Meaning |
|---|---|
| Targeting Handler | The coordinator component on the player that maintains the candidate list and selects the best target |
| Interaction Field | A physics trigger volume on a targetable entity that registers/unregisters with the handler automatically |
| Target Component | A marker component that makes an entity targetable with visual properties for the cursor |
| Cursor | The visual overlay that tracks the selected target on screen or in world space |

For full definitions, see the Glossary.


See Also

For the full API, component properties, and C++ extension guide:

For related systems:


Get GS_Interaction

GS_Interaction — Explore this gem on the product page and add it to your project.

3.6.3 - World Triggers

How to work with the GS_Interaction World Trigger system — configuring sensor types and trigger types to create event-driven world responses without scripting.

World Triggers are a data-driven system for firing game responses when conditions are met. A TriggerSensorComponent defines the condition side — physics overlap, player interact, record check. A WorldTriggerComponent defines the response side — changing stage, toggling an entity, writing a record, logging a message. Each component holds an array of polymorphic type objects, so any combination of conditions and responses can be composed on a single entity without scripting.

For architecture details, component properties, and extending the system in C++, see the Framework API reference.

 



How World Triggers Work

World Trigger Pattern Graph

Breakdown

Each trigger is split into two container components. Logic lives in type objects held by each component:

| Part | Role |
|---|---|
| TriggerSensorComponent | Holds andConditions and orConditions arrays of TriggerSensorType objects. When conditions pass, fires WorldTriggerRequestBus::Trigger on the same entity. |
| WorldTriggerComponent | Holds a triggerTypes array of WorldTriggerType objects. On Trigger(), calls Execute() on every type. On Reset(), calls OnReset(). |

You can add any mix of sensor types and trigger types — for example, a door that requires both a physics overlap AND a record flag (andConditions) before it opens (TriggerType_ToggleEntities), then sets a quest record (TriggerType_SetRecord).
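To make the composition idea concrete, here is a minimal self-contained sketch of how a sensor component might evaluate its two condition arrays. The combination rule and all names here are illustrative assumptions, not the shipped GS_Interaction code:

```cpp
#include <functional>
#include <vector>

// Hypothetical stand-in for an array of TriggerSensorType condition checks.
using Condition = std::function<bool()>;

// Assumed rule: every andCondition must pass, and if any orConditions exist,
// at least one of them must also pass.
bool ShouldFire(const std::vector<Condition>& andConditions,
                const std::vector<Condition>& orConditions)
{
    for (const auto& condition : andConditions)
    {
        if (!condition())
        {
            return false; // all AND conditions are required
        }
    }
    if (orConditions.empty())
    {
        return true; // nothing more to satisfy
    }
    for (const auto& condition : orConditions)
    {
        if (condition())
        {
            return true; // any single OR condition suffices
        }
    }
    return false;
}
```

When a check like ShouldFire passes, the component fires WorldTriggerRequestBus::Trigger on its own entity.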

E Indicates extensible classes and methods.

Patterns - Complete list of system patterns used in GS_Play.


Sensor Types

Trigger Sensor component in the O3DE Inspector

Add sensor types to the andConditions or orConditions arrays on a TriggerSensorComponent. The four built-in types cover the most common condition patterns:

| Sensor Type | Condition | Notes |
| --- | --- | --- |
| SensorType_Interact | Any unit with targeting interacts with this entity. | Requires GS_TargetComponent or GS_InteractTargetComponent on the entity. |
| SensorType_PlayerInteract | The player-controlled unit specifically interacts with this entity. | Extends SensorType_Interact; filters to entities with the "Player" tag. |
| SensorType_Collider | A physics collider enters the entity’s trigger volume. | Automatically activates a PhysicsTriggerComponent — no manual setup needed. |
| SensorType_Record | A game record reaches a configured value. | Connects to RecordKeeper automatically; fires without polling. |

Trigger Types

World Trigger component in the O3DE Inspector

Add trigger types to the triggerTypes array on a WorldTriggerComponent. The four built-in types handle the most common response patterns:

| Trigger Type | Response | Notes |
| --- | --- | --- |
| TriggerType_PrintLog | Prints a message to the development log. | Development and debugging use. |
| TriggerType_SetRecord | Writes a value to the Record Keeper by name. | Useful for setting quest flags, unlock states, or counters. |
| TriggerType_ToggleEntities | Enables or disables referenced entities. Seeds initial state at startup via startActive. | Works with any entity in the level — doors, pickups, blockers. |
| TriggerType_ChangeStage | Transitions the game to a different stage (level). | Queued on next tick; coordinates with Stage Manager automatically. |

Reset and Re-Arming

Calling Reset() on WorldTriggerRequestBus invokes OnReset() on every trigger type in the component, letting each type reverse its effect or prepare for re-activation. For example, TriggerType_ToggleEntities inverts entity states on both Execute and OnReset, enabling toggle behavior. This supports patterns such as:

  • Switches that cycle on repeat interaction.
  • Area triggers that fire each time a player re-enters.
  • Multi-step sequences where earlier steps re-arm later ones.
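The invert-on-both-calls idea behind TriggerType_ToggleEntities can be sketched in a few lines. The type below is illustrative only, not the actual GS_Interaction class:

```cpp
#include <cstddef>
#include <vector>

// Hypothetical toggle type: Execute() and OnReset() both invert the tracked
// entity states, so Trigger/Reset cycles produce toggle behavior.
struct ToggleEntitiesSketch
{
    std::vector<bool> entityActive; // seeded from startActive at startup

    void Invert()
    {
        for (std::size_t i = 0; i < entityActive.size(); ++i)
        {
            entityActive[i] = !entityActive[i];
        }
    }
    void Execute() { Invert(); } // runs on Trigger()
    void OnReset() { Invert(); } // runs on Reset(), re-arming the toggle
};
```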

ScriptCanvas Usage

WorldTriggerRequestBus exposes both core methods, Trigger() and Reset(), directly to scripts.

A common script pattern is to use a SensorType_Record condition to gate a sequence, then script the reset from a separate event so the trigger only re-arms after additional conditions are met.

You can also bypass the sensor component entirely and call Trigger() directly from any script when you need to fire a world trigger response programmatically — for example, from a dialogue completion callback or an animation event.


Editor Setup Pattern

Most interactive world objects follow this assembly in the editor:

  1. Create an entity for the interactive object (door, switch, collectible, zone border).
  2. Add a TriggerSensorComponent to the entity.
  3. Add the appropriate sensor type(s) to andConditions or orConditions (e.g. SensorType_PlayerInteract for interact, SensorType_Collider for proximity).
  4. Add a WorldTriggerComponent to the same entity.
  5. Add the appropriate trigger type(s) to triggerTypes (e.g. TriggerType_ToggleEntities to open a door, TriggerType_SetRecord to record the event).
  6. Configure each type’s properties in the editor — no scripting needed for these patterns.

For interact-based sensors, also add GS_TargetComponent or GS_InteractTargetComponent so the Targeting system can select this entity.


Quick Reference

| Need | Bus | Method |
| --- | --- | --- |
| Fire a trigger response from script | WorldTriggerRequestBus (ById) | Trigger() |
| Re-arm a trigger after it has fired | WorldTriggerRequestBus (ById) | Reset() |
| Trigger sensor evaluation from an event-driven type | TriggerSensorRequestBus (ById) | DoAction() / DoResetAction() |

 

| Condition Type | Sensor Type |
| --- | --- |
| Any unit interacts | SensorType_Interact |
| Player only interacts | SensorType_PlayerInteract |
| Physics overlap | SensorType_Collider |
| Record/flag state | SensorType_Record |

 

| Response Type | Trigger Type |
| --- | --- |
| Log a debug message | TriggerType_PrintLog |
| Write a game record | TriggerType_SetRecord |
| Toggle an entity on/off | TriggerType_ToggleEntities |
| Change stage/level | TriggerType_ChangeStage |

Glossary

| Term | Meaning |
| --- | --- |
| TriggerSensorComponent | Container component that owns sensor type objects and evaluates them on each event |
| WorldTriggerComponent | Container component that owns trigger type objects and calls Execute on each Trigger() |
| TriggerSensorType | A type object that defines one condition — extend this to create custom sensor logic |
| WorldTriggerType | A type object that defines one response — extend this to create custom trigger logic |
| andConditions | Array on TriggerSensorComponent where all types must pass for the trigger to fire |
| orConditions | Array on TriggerSensorComponent where any one type passing is sufficient |
| Re-Arming | Calling Reset() on a WorldTriggerComponent so its types can fire again on the next Trigger() |

For full definitions, see the Glossary.


See Also

For the full API, component properties, and C++ extension guide:

For related systems:


Get GS_Interaction

GS_Interaction — Explore this gem on the product page and add it to your project.

3.7 - GS_Juice

Game feel and feedback motion system — screen shake, bounce, flash, and material effects driven by the GS_Motion engine.

GS_Juice is the game feel and feedback system. It provides motion-based visual feedback effects — screen shake, bounce, flash, pulse, material glow — authored as data assets and played on entities at runtime. GS_Juice extends the GS_Motion animation system with feedback-specific track types, so all timing, easing, and proxy targeting features are available for feedback effects.

For architecture details, track types, and the domain extension pattern, see the GS_Juice API.


Quick Navigation

| I want to… | Feature | API |
| --- | --- | --- |
| Play screen shake, bounce, flash, or material glow effects on entities | Feedback System | API |

Installation

GS_Juice requires GS_Core. Add both gems to your project.

For a full guided walkthrough, follow the Simple Project Setup guide.

 

Quick Installation Summary

  1. Enable the GS_Juice gem in your project configuration.
  2. Create .feedbackmotion assets in the O3DE Asset Editor.
  3. Place FeedbackEmitter components on entities that need to play effects.

Feedback System

The feedback system is the core of GS_Juice. Feedback Motion assets (.feedbackmotion) define one or more animation tracks that play together as a feedback effect. The Feedback Emitter component plays a feedback motion on itself or on a target entity. Effects are additive — they modify properties relative to the entity’s current state, so multiple effects stack without conflict.

Two track types are included:

  • Transform Track — Animates position offset, scale, and rotation using gradients. Ideal for screen shake, bounce, squash-and-stretch, and recoil.
  • Material Track — Animates material properties (opacity, emissive intensity, color tint) using gradients. Ideal for hit flash, damage glow, and fade effects.

Feedback System API


See Also

For the full API, component properties, and C++ extension guide:

For related systems:


Get GS_Juice

GS_Juice — Explore this gem on the product page and add it to your project.

3.7.1 - Feedback System

How to work with GS_Play feedback motions — transform and material animation effects for game feel.

The Feedback System is the core of GS_Juice. Feedback Motion assets define animation tracks that play together as a visual effect — screen shake, bounce, hit flash, glow pulse. The Feedback Emitter component plays these effects on entities, with support for proxy targeting and stacking.

For track type details, asset structure, and component properties, see the Framework API reference.

FeedbackEmitter component in the O3DE Inspector

 

Contents


Track Types

| Track | What It Animates | Fields |
| --- | --- | --- |
| FeedbackTransformTrack | Position offset, scale, rotation | Vector2Gradient (XY), FloatGradient (scale), FloatGradient (rotation) |
| FeedbackMaterialTrack | Opacity, emissive intensity, color tint | FloatGradient (opacity), FloatGradient (emissive), ColorGradient (tint) |

Both track types use Gradients from GS_Core, giving you full control over the animation shape at every point in the effect’s duration.
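Since both track types are gradient-driven, the core sampling idea can be shown with a small stand-in. The real GS_Core gradient types are richer; the keyframe structure below is an assumption for illustration:

```cpp
#include <cstddef>
#include <vector>

// Illustrative FloatGradient stand-in: sorted keyframes sampled with linear
// interpolation at a normalized time t.
struct Key
{
    float time;  // 0..1 across the effect's duration
    float value; // e.g. emissive intensity for a hit flash
};

float SampleGradient(const std::vector<Key>& keys, float t)
{
    if (keys.empty()) { return 0.0f; }
    if (t <= keys.front().time) { return keys.front().value; }
    if (t >= keys.back().time) { return keys.back().value; }
    for (std::size_t i = 1; i < keys.size(); ++i)
    {
        if (t <= keys[i].time)
        {
            const Key& a = keys[i - 1];
            const Key& b = keys[i];
            float u = (t - a.time) / (b.time - a.time);
            return a.value + u * (b.value - a.value); // lerp between keys
        }
    }
    return keys.back().value;
}
```

A hit flash, for example, might rise to full intensity mid-effect and fall back to zero by the end.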


Feedback Emitter

The FeedbackEmitter component lives on any entity and plays a Feedback Motion.

| Property | What It Does |
| --- | --- |
| FeedbackMotion | The motion asset and proxy configuration. |
| playOnActivate | If true, plays the motion automatically when the entity activates. |

| Method | What It Does |
| --- | --- |
| Play() | Plays the feedback motion on the owning entity. |
| PlayOnTarget(entityId) | Plays on a different target entity. |
| Stop() | Stops the currently playing motion. |

Authoring Feedback Motions

  1. Create a .feedbackmotion asset in the O3DE Asset Editor.
  2. Add transform and/or material tracks.
  3. Configure each track’s gradients — these define the animation curve shape.
  4. Set timing (start time, duration) and easing for each track.
  5. Optionally set track identifiers for proxy targeting.
  6. Assign the asset to a FeedbackEmitter component.

Stacking Effects

Feedback effects are additive — each track applies its property change relative to the entity’s current state. Multiple effects on the same entity stack naturally without conflict. A bounce effect and a flash effect can run simultaneously on the same entity.
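The additive rule can be illustrated with a single scalar channel; the function and names below are hypothetical:

```cpp
#include <vector>

// Illustrative additive stacking: each active effect contributes an offset
// relative to the entity's base state, so simultaneous effects sum rather
// than overwrite one another.
float ResolveScale(float baseScale, const std::vector<float>& activeOffsets)
{
    float result = baseScale;
    for (float offset : activeOffsets)
    {
        result += offset; // e.g. a bounce squash and a flash pulse stacking
    }
    return result;
}
```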


Proxy Targeting

UiAnimationMotion with proxy bindings configured in the Inspector

When a motion asset has tracks with identifiers (named labels), those tracks appear in the proxy list on the component. Proxies let you redirect a track to a different entity in the hierarchy — for example, a page show animation might animate the background separately from the content panel.

Each proxy entry maps a track label to a target entity. If no proxy is set, the track targets the motion’s owner entity.


Glossary

| Term | Meaning |
| --- | --- |
| Feedback Motion | A .feedbackmotion data asset containing animation tracks that play as a visual effect |
| Feedback Emitter | A component that plays a feedback motion on its entity or a target entity |
| FeedbackTransformTrack | A track that animates position offset, scale, and rotation via gradients |
| FeedbackMaterialTrack | A track that animates opacity, emissive intensity, and color tint via gradients |

For full definitions, see the Glossary.


See Also

For the full API, component properties, and C++ extension guide:

For related systems:


Get GS_Juice

GS_Juice — Explore this gem on the product page and add it to your project.

3.8 - GS_Item

Inventory and equipment systems for GS_Play — container-based item storage, slot management, stacking, and visual equipment integration with GS_Performer.

GS_Item provides the foundational inventory and equipment infrastructure for GS_Play games. It handles how items are stored, organized, and associated with characters — from simple containers that hold collectables to slot-based equipment systems that drive visual changes through GS_Performer.

For architecture details, component properties, and extending the system in C++, see the GS_Item API.


Quick Navigation

| I want to… | Feature | API |
| --- | --- | --- |
| Store, stack, and manage items in container-based inventories | Inventory | API |
| Equip items into typed character slots with visual integration | Equipment | API |

Installation

GS_Item requires GS_Core. GS_Performer is an optional dependency for visual equipment integration.

For a full guided walkthrough, follow the Simple Project Setup guide.

 

Quick Installation Summary

  1. Enable the GS_Item gem in your project configuration.
  2. Define item data assets for your game’s items.
  3. Place Inventory components on entities that need to store items.
  4. Place Equipment components on characters that need to equip items.

Inventory

The Inventory system is a container-based model for item storage. Each inventory is a collection of item slots where each slot holds an item reference and a stack count. The system handles adding, removing, querying, and capacity management. Any entity can host an inventory — a player backpack, a chest, a vendor, or a loot bag.

Inventory API


Equipment

The Equipment system is the slot-based layer that governs which items a character has equipped. Each slot has a defined type and enforces compatibility rules. Equipment integrates with GS_Performer’s Skin Slot system — equipping an item automatically updates the matching visual asset, connecting inventory state to character presentation.

Equipment API


See Also

For the full API, component properties, and C++ extension guide:

For related systems:


Get GS_Item

GS_Item — Explore this gem on the product page and add it to your project.

3.8.1 - Inventory

How to work with GS_Play inventory — container-based item storage with slots and stacking.

The Inventory system provides container-based item storage for entities. Inventory components define slot-based containers that hold item data, support stacking, and enable item transfer between entities (player to chest, vendor to player).

For component properties and the inventory data model, see the Framework API reference.

 

Contents


How Inventory Works

| Concept | What It Means |
| --- | --- |
| Container | A component on an entity that holds item slots. Players, chests, vendors, and loot drops are all containers. |
| Slot | An individual position within a container. Each slot holds one item type. |
| Stacking | Multiple units of the same item can occupy a single slot up to a maximum stack size. |
| Transfer | Items move between containers via slot references — no intermediate state required. |
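The stacking behavior can be sketched with a minimal container model. The field names and the fill-existing-stacks-first policy are assumptions for illustration, not the GS_Item data model:

```cpp
#include <algorithm>
#include <vector>

// Illustrative slot: itemId 0 means empty.
struct Slot
{
    int itemId = 0;
    int count = 0;
};

// Adds 'amount' units of an item, topping up existing stacks first and then
// opening empty slots. Returns the overflow that did not fit.
int AddItem(std::vector<Slot>& container, int itemId, int amount, int maxStack)
{
    for (Slot& slot : container)
    {
        if (amount == 0) { break; }
        if (slot.itemId == itemId && slot.count < maxStack)
        {
            int moved = std::min(maxStack - slot.count, amount);
            slot.count += moved;
            amount -= moved;
        }
    }
    for (Slot& slot : container)
    {
        if (amount == 0) { break; }
        if (slot.itemId == 0) // empty slot becomes a new stack
        {
            int moved = std::min(maxStack, amount);
            slot.itemId = itemId;
            slot.count = moved;
            amount -= moved;
        }
    }
    return amount;
}
```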

Glossary

| Term | Meaning |
| --- | --- |
| Container | A component on an entity that holds item slots |
| Slot | An individual position within a container that holds one item type |
| Stacking | Multiple units of the same item occupying a single slot up to a maximum count |

For full definitions, see the Glossary.


See Also

For the full API, component properties, and C++ extension guide:

For related systems:


Get GS_Item

GS_Item — Explore this gem on the product page and add it to your project.

3.8.2 - Equipment

How to work with GS_Play equipment — typed equipment slots with visual integration via GS_Performer skin slots.

The Equipment system provides typed equipment slots for characters. Equipment slots enforce type constraints (helmet slot accepts helmets, weapon slot accepts weapons) and integrate directly with the Skin Slot system in GS_Performer to update visual appearance when items are equipped or unequipped.

For component properties and the equipment data model, see the Framework API reference.

 

Contents


How Equipment Works

| Phase | What Happens |
| --- | --- |
| Equip | An item is placed into a typed slot. Equip notifications fire. |
| Visual Update | If linked to a GS_Performer skin slot, the character mesh updates automatically. |
| Stat Application | Equipment stats are applied to the entity (via RPStats integration, when available). |
| Unequip | The item is removed from the slot. Visual and stat changes revert. |
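The type constraint in the Equip phase reduces to a simple compatibility check. The types and fields below are illustrative, not the GS_Item API:

```cpp
#include <string>

// Hypothetical item and slot types for the compatibility check.
struct Item
{
    std::string name;
    std::string slotType; // e.g. "Helmet", "Weapon"
};

struct EquipmentSlot
{
    std::string slotType;
    const Item* equipped = nullptr;

    // Rejects items whose type does not match the slot's constraint.
    bool Equip(const Item& item)
    {
        if (item.slotType != slotType)
        {
            return false; // a helmet slot accepts helmets only
        }
        equipped = &item; // equip notifications and visual update would follow
        return true;
    }

    void Unequip() { equipped = nullptr; } // visual and stat changes revert
};
```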

Integration with Skin Slots

Equipment slots can be linked to Performer Skin Slots. When an item is equipped, the corresponding skin slot’s actor mesh and materials are automatically swapped to match the item. No manual wiring is required — the connection is data-driven through slot name matching.


Glossary

| Term | Meaning |
| --- | --- |
| Equipment Slot | A typed slot on a character that accepts items matching its type constraint |
| Skin Slot Integration | Automatic visual update when equipping or unequipping items via GS_Performer |

For full definitions, see the Glossary.


See Also

For the full API, component properties, and C++ extension guide:

For related systems:


Get GS_Item

GS_Item — Explore this gem on the product page and add it to your project.

3.9 - GS_Performer

Character rendering and animation for GS_Play — modular skin slots, billboard 2.5D rendering, and velocity-driven locomotion parameters.

GS_Performer is the character presentation layer for GS_Play. It handles the visual side of any entity that needs to look like a character — whether that is a fully-rigged 3D mesh swapping equipment, a sprite-based billboard character, or an animated unit whose blend parameters track real movement velocity. GS_Performer sits above GS_Unit and GS_Core, consuming movement and game state events to keep visual output in sync with gameplay.

For architecture details, component properties, and extending the system in C++, see the GS_Performer API.


Quick Navigation

| I want to… | Feature | API |
| --- | --- | --- |
| Manage registered performers or query them by name | Performer Manager | API |
| Swap equipment visuals at runtime with modular slot-based assets | Skin Slots | API |
| Render billboard-style 2.5D characters that face the camera correctly | Paper Performer | API |
| Drive animation blend parameters from entity velocity automatically | Locomotion | API |

Installation

GS_Performer requires GS_Core. EMotionFX and GS_Unit are optional dependencies for animation and velocity-driven locomotion.

For a full guided walkthrough, follow the Simple Project Setup guide.

 

Quick Installation Summary

  1. Enable the GS_Performer gem in your project configuration.
  2. Create a Performer Manager prefab and add it to the Game Manager’s Managers list.
  3. Place PerformerSkinSlotComponent and SkinSlotHandlerComponent on character entities.
  4. For 2.5D characters, add a PaperFacingHandlerComponent.

Performer Manager

The Performer Manager is the singleton coordinator for all registered performers in the active scene. It follows the GS_Core manager pattern — spawned by the Game Manager, participates in standby, and is accessible globally via its request bus. Systems that need to enumerate or query active performers route through this component.

Performer Manager API


Skin Slots

The Skin Slot system is the modular equipment and appearance layer for characters. Each PerformerSkinSlotComponent represents an equipment slot holding an actor asset and material overrides. The SkinSlotHandlerComponent manages the full collection for one character, providing batch operations and appearance-changed notifications. This makes character visual configuration composable rather than monolithic.

Skin Slots API


Paper Performer

The Paper Performer system provides billboard-style 2.5D character rendering. The PaperFacingHandlerComponent rotates the entity’s mesh to face the active camera each frame, accounting for movement direction to prevent visual artifacts from naive always-face-camera implementations. Custom facing logic can be substituted without modifying the core component.

Paper Performer API


Locomotion

The Locomotion feature set connects an entity’s physical movement to its animation layer. The VelocityLocomotionHookComponent reads velocity from the physics system each tick and drives animation blend parameters — speed and direction — that EMotionFX or Simple Motion components consume. Animations stay tightly synchronized with gameplay movement without manual state machine scripts.

Locomotion API


See Also

For the full API, component properties, and C++ extension guide:

For step-by-step project setup:


Get GS_Performer

GS_Performer — Explore this gem on the product page and add it to your project.

3.9.1 - Performer Manager

How to work with the GS_Play Performer Manager — performer registration and lifecycle.

The Performer Manager is the singleton controller for the performer system. It tracks all active performers in the scene, handles registration and lookup, and responds to the Game Manager’s lifecycle events automatically.

For component properties and API details, see the Framework API reference.

PerformerManager component in the O3DE Inspector

 

Contents


How It Works

Performers register with the Performer Manager when they activate. The manager provides lookup by name so that other systems — dialogue, cinematics, AI — can find performers without maintaining their own references.

The Performer Manager responds to standby and shutdown broadcasts from the Game Manager, coordinating performer state across level transitions.


Quick Reference

| Need | Bus | Method |
| --- | --- | --- |
| Query and manage registered performers | PerformerManagerRequestBus | Manager-level queries |

Glossary

| Term | Meaning |
| --- | --- |
| Performer | An entity that represents a character in the world with visual, animation, and interaction capabilities |
| Performer Manager | The singleton that tracks all active performers and provides lookup by name |

For full definitions, see the Glossary.


See Also

For the full API, component properties, and C++ extension guide:

For related systems:


Get GS_Performer

GS_Performer — Explore this gem on the product page and add it to your project.

3.9.2 - Performers

The character rendering entity types in GS_Performer — billboard 2.5D paper performers and fully-rigged 3D avatar performers.

Performers are the character entity types in GS_Performer. Each type defines how a character’s visual layer is structured and rendered — whether a flat sprite quad that rotates to face the camera, or a fully-rigged 3D mesh driven by EMotionFX. Choose the performer type that fits your character pipeline, then add Performer Features to layer in additional capabilities.

For component details and C++ extension, see the Framework API: Performers.


Paper Performer

The Paper Performer is a billboard-based character type for 2.5D and top-down games. The PaperFacingHandlerComponent orients the entity’s sprite quad toward the active camera each frame, accounting for movement direction to prevent visual artifacts. It is the simplest performer type — lightweight, with no animation graph required.

Paper Performer API


Avatar Performer

The Avatar Performer is the full 3D character pipeline. It drives a rigged EMotionFX actor and integrates with Skin Slots and the Velocity Locomotion Hook for a complete character presentation layer. Use this for characters with full 3D rigs, animation blend trees, and runtime equipment swapping.

Avatar Performer API


See Also

For the full API, component properties, and C++ extension guide:

For related systems:


Get GS_Performer

GS_Performer — Explore this gem on the product page and add it to your project.

3.9.2.1 - Avatar Performer

How to work with GS_Play avatar performers — 3D pipeline, animation, equipment.

Get GS_Performer

GS_Performer — Explore this gem on the product page and add it to your project.

3.9.2.2 - Paper Performer

How to work with GS_Play paper performers — billboard-style 2.5D character rendering with camera-aware facing.

The Paper Performer system provides billboard-style character rendering for 2.5D games. The Paper Facing Handler rotates sprite-based characters to always face the active camera while maintaining proper orientation relative to movement direction. This prevents the visual artifacts that occur when a flat sprite is viewed from the side.

For component properties and the facing algorithm, see the Framework API reference.

 

Contents


How It Works

The PaperFacingHandlerComponent is placed on any entity that uses sprite-based or flat-mesh character rendering. Each frame, it calculates the optimal facing direction based on:

  1. The active camera’s position (via CamCore notifications).
  2. The entity’s movement direction.
  3. Configurable facing rules.

The result is a character that always presents its front face to the camera without snapping or popping during movement direction changes.
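The camera-facing part of that calculation reduces to a yaw on the ground plane. The real component also weighs movement direction and facing rules, so this sketch (with assumed axis conventions) shows only the core idea:

```cpp
#include <cmath>

// Yaw (radians) that turns a sprite quad toward the camera, projected onto
// the XY ground plane. Axis conventions here are an assumption.
float FacingYawRadians(float entityX, float entityY, float cameraX, float cameraY)
{
    float dx = cameraX - entityX;
    float dy = cameraY - entityY;
    return std::atan2(dy, dx); // rotate the quad's front toward this direction
}
```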


Camera-Aware Variant

The CamCorePaperFacingHandlerComponent (in GS_Complete) extends the base paper facing with direct camera core integration. It listens to CamCoreNotificationBus for camera position updates, ensuring the facing calculation uses the exact camera state rather than a cached position.


Entity Configuration

Performer Entity Configuration in the O3DE Editor

Note that this entity is placed inside a GS_Unit prefab.


Glossary

| Term | Meaning |
| --- | --- |
| Paper Performer | A billboard-style character that always faces the camera for 2.5D rendering |
| Paper Facing Handler | The component that calculates optimal facing direction each frame |

For full definitions, see the Glossary.


See Also

For the full API, component properties, and C++ extension guide:

For related systems:


Get GS_Performer

GS_Performer — Explore this gem on the product page and add it to your project.

3.9.3 - Performer Features

Universal capabilities that extend any performer type — modular equipment via skin slots, and velocity-driven animation via locomotion.

Performer Features are components that layer onto any performer entity, regardless of type. They are not tied to how a character is rendered — they extend what any performer can do. Skin Slots give any character modular equipment slots. Locomotion connects physics movement to the animation graph automatically.

For component details and C++ extension, see the Framework API: Performer Features.


Skin Slots

The Skin Slot system provides modular equipment and appearance management. Each PerformerSkinSlotComponent represents one equipment position on the character — helm, chest, weapon — holding a swappable actor mesh and material set. The SkinSlotHandlerComponent coordinates all slots on a character and supports preset profiles for NPC archetypes or full equipment sets.

Skin Slots API


Locomotion

The Velocity Locomotion Hook connects entity movement to animation automatically. It reads the entity’s physics velocity each frame and writes blend parameters — speed and direction — directly into the EMotionFX animation graph, driving walk, run, idle, and fall transitions without manual state machine scripting.

Locomotion API


See Also

For the full API, component properties, and C++ extension guide:

For related systems:


Get GS_Performer

GS_Performer — Explore this gem on the product page and add it to your project.

3.9.3.1 - Skin Slots

How to work with GS_Play skin slots — modular character appearance with swappable actor meshes and materials.

The Skin Slot system provides modular character appearance. Each slot represents an equipment position on a character (head, body, arms, weapon) and holds an actor mesh with its materials. The Skin Slot Handler manages the full collection of slots for a character, enabling runtime equipment swapping through data changes rather than entity restructuring.

For component properties and data structures, see the Framework API reference.

Skin Slot Configuration Profile in the O3DE Asset Editor

 

Contents


How It Works

Each character entity has one SkinSlotHandlerComponent and multiple PerformerSkinSlotComponent children — one per equipment position.

| Component | What It Does |
| --- | --- |
| SkinSlotHandler | Manages the full set of skin slots for a character. Provides batch operations. |
| PerformerSkinSlot | Individual slot. Holds a SkinSlotData (actor asset + material assets). |

When you equip a new item, you update the slot’s SkinSlotData — the system swaps the actor mesh and materials automatically.
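The data-driven swap can be modeled as a named map of slot data. The structure below is illustrative, not the actual GS_Performer types:

```cpp
#include <string>
#include <unordered_map>
#include <utility>

// Hypothetical slot payload: which actor mesh and material the slot shows.
struct SkinSlotData
{
    std::string actorAsset;
    std::string materialAsset;
};

// Hypothetical handler: equipping writes new data into a named slot, and the
// system applies whatever each slot currently holds.
struct SkinSlotHandlerSketch
{
    std::unordered_map<std::string, SkinSlotData> slots;

    void SetSlot(const std::string& slotName, SkinSlotData data)
    {
        slots[slotName] = std::move(data); // mesh/material swap follows
    }

    const SkinSlotData* GetSlot(const std::string& slotName) const
    {
        auto it = slots.find(slotName);
        return it == slots.end() ? nullptr : &it->second;
    }
};
```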


Skin Slot Data

| Field | What It Is |
| --- | --- |
| Actor Asset | The 3D mesh for this equipment slot. |
| Material Assets | Materials applied to the mesh. |
| Slot Name | Named identifier for the slot (e.g., “Helmet”, “Chest”, “MainHand”). |

Skin Slot Configuration Profiles are preset assets that define complete character appearances — useful for NPC archetypes or equipment sets.


Quick Reference

| Need | Bus | Method |
| --- | --- | --- |
| Access a skin slot | PerformerSkinSlotRequestBus | Per-slot queries and updates |
| Manage all slots | SkinSlotHandlerRequestBus | Batch operations on all slots |

Glossary

| Term | Meaning |
| --- | --- |
| Skin Slot | An equipment position on a character that holds an actor mesh and materials |
| SkinSlotHandler | The manager component that coordinates all skin slots on a single character |
| SkinSlotData | The data structure containing actor asset, materials, and slot name |
| Skin Slot Configuration Profile | A preset asset defining a complete character appearance for batch application |

For full definitions, see the Glossary.


See Also

For the full API, component properties, and C++ extension guide:

For related systems:


Get GS_Performer

GS_Performer — Explore this gem on the product page and add it to your project.

3.9.3.2 - Locomotion

How to work with GS_Play velocity locomotion — automatic animation parameter driving from entity velocity.

The Velocity Locomotion Hook automatically drives animation blend parameters from an entity’s velocity. Instead of manually scripting animation state transitions, you attach this component and it reads the entity’s physics velocity each frame, writing the appropriate blend values to the animation system.

For component properties, see the Framework API reference.

Velocity Locomotion Hook component in the O3DE Inspector

 

Contents


How It Works

The VelocityLocomotionHookComponent runs on the tick bus. Each frame it:

  1. Reads the entity’s current velocity from the physics system.
  2. Extracts speed, direction, and vertical velocity.
  3. Writes these as blend parameters to the entity’s EMotionFX animation graph.

This creates a seamless connection between movement and animation — walk, run, idle, and fall transitions happen automatically based on physics state.
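The per-frame decomposition can be sketched as follows; the parameter names are assumptions about what the animation graph consumes:

```cpp
#include <cmath>

// Blend parameters a locomotion hook might write each frame.
struct BlendParams
{
    float speed;            // planar speed drives walk/run blending
    float direction;        // heading (radians) for directional blends
    float verticalVelocity; // drives jump/fall states
};

BlendParams FromVelocity(float vx, float vy, float vz)
{
    BlendParams params;
    params.speed = std::sqrt(vx * vx + vy * vy);
    params.direction = std::atan2(vy, vx);
    params.verticalVelocity = vz;
    return params;
}
```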


When to Use

| Scenario | Result |
| --- | --- |
| Character walking | Speed parameter drives walk/run blend |
| Character idle | Zero velocity triggers idle state |
| Character falling | Vertical velocity drives fall animation |
| Character stopping | Deceleration naturally transitions to idle |

The component is stateless and lightweight — it reads velocity and writes parameters each frame with no internal state machine.


Glossary

| Term | Meaning |
| --- | --- |
| Velocity Locomotion Hook | A component that reads entity velocity and writes animation blend parameters each frame |
| Blend Parameter | A named value sent to the animation graph that controls state transitions and blending |

For full definitions, see the Glossary.


See Also

For the full API, component properties, and C++ extension guide:

For related systems:


Get GS_Performer

GS_Performer — Explore this gem on the product page and add it to your project.

3.10 - GS_PhantomCam

Priority-based virtual camera management with blend profiles, multiple behavior types, and spatial influence fields.

GS_PhantomCam is the camera management system for GS_Play. It uses virtual cameras — Phantom Cameras — that compete for control through a priority system. Whichever holds the highest priority drives the real camera’s position, rotation, and FOV. Transitions are governed by Blend Profile assets, and Influence Fields let spatial zones adjust camera behavior dynamically.

For architecture details, component properties, and extending the system in C++, see the GS_PhantomCam API.


Quick Navigation

| I want to… | Feature | API |
| --- | --- | --- |
| Manage the camera system lifecycle and know which camera is active | Cam Manager | API |
| Place virtual cameras with follow targets, look-at, and priority-based switching | Phantom Cameras | API |
| Control how the real camera reads from the active phantom camera each frame | Cam Core | API |
| Define smooth transitions between cameras with custom easing | Blend Profiles | API |
| Create spatial zones that modify camera priority dynamically | Influence Fields | API |

Architecture

Camera Blend Pattern Graph

Breakdown

Blend Profiles are referenced from within a Phantom Camera’s blend settings. When the Cam Manager determines a transition is needed, it reads the outgoing camera’s blend settings to find the profile to use.

| Scenario | Which Profile Is Used |
| --- | --- |
| Camera A becomes inactive, Camera B was waiting | Camera A’s blend profile governs the outgoing transition. |
| Camera B is activated and is now highest priority | Camera B’s blend profile governs the incoming transition. |
| No blend profile is assigned | Transition is instantaneous — no interpolation. |

Because profiles are assets, the same profile can be shared across many Phantom Cameras. A single edit to the asset changes the feel of every camera that references it.
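The independent-easing idea behind blend profiles can be sketched per channel. The curve names and the exact profile fields below are assumptions:

```cpp
// Two hypothetical easing curves a profile might reference: a smooth ease-out
// for position and an immediate snap for rotation.
float EaseOutQuad(float t) { return 1.0f - (1.0f - t) * (1.0f - t); }
float Snap(float t) { return t > 0.0f ? 1.0f : 0.0f; }

// Blends one scalar channel at normalized blend time t in [0, 1] using the
// profile's curve for that channel.
float BlendChannel(float from, float to, float t, float (*curve)(float))
{
    float u = curve(t);
    return from + u * (to - from);
}
```

With this split, a profile can ease camera position over the full blend duration while rotation snaps on the first frame.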

E Indicates extensible classes and methods.

Patterns - Complete list of system patterns used in GS_Play.


Installation

GS_PhantomCam requires GS_Core only. Add both gems to your project before placing PhantomCam components in a level.

For a full guided walkthrough, follow the Simple Project Setup guide.

 

Quick Installation Summary

  1. Enable the GS_PhantomCam gem in your project configuration.
  2. Create a Cam Manager prefab and add it to the Game Manager’s Managers list.
  3. Place a Cam Core entity with GS_CamCoreComponent in every level.
  4. Place Phantom Camera entities with desired behavior components.
  5. Create Blend Profile assets for camera transitions.

Cam Manager

The Cam Manager is the singleton camera system manager, integrated with GS_Core’s manager lifecycle. It handles startup and shutdown, responds to enable/disable events so the camera can be suspended during cinematics, and broadcasts whenever the dominant camera changes so downstream systems stay in sync.

Cam Manager API


Phantom Cameras

Phantom Cameras are virtual camera definitions placed as components on ordinary entities. Each holds field of view, clip planes, optional follow and look-at targets, positional and rotational offsets, and a priority value. The highest-priority camera drives the real camera. Specialized behavior components extend the vocabulary: first-person, orbit, static orbit, clamped look, and spline track.

Phantom Cameras API


Cam Core

Cam Core is the rendering bridge between the Phantom Camera system and the actual O3DE camera entity. It reads the current dominant camera’s data each frame and writes position, rotation, and FOV to the real camera. It also broadcasts per-frame camera position updates for shadow casters, audio listeners, and LOD controllers.

Cam Core API


Blend Profiles

Blend Profiles are data assets that define how the camera transitions when control shifts between Phantom Cameras. Each profile specifies blend duration, position easing curve, and rotation easing curve — independently, so position can ease out while rotation snaps. Profiles are shared by name across cameras for visual consistency.

Blend Profiles API


Influence Fields

Influence Fields are spatial or global modifiers that dynamically shift camera priority. GlobalCameraInfluence applies to cam core unconditionally, while CameraInfluenceField defines a physics trigger volume that modifies the priority when the player is inside it. Multiple fields with the same camera target stack influence.

Influence Fields API


See Also

For the full API, component properties, and C++ extension guide:

For step-by-step project setup:


Get GS_PhantomCam

GS_PhantomCam — Explore this gem on the product page and add it to your project.

3.10.1 - Cam Manager

How to work with the GS_PhantomCam manager — enabling and disabling the camera system, and responding to active camera changes.

The Cam Manager is the camera system’s singleton manager. It integrates with GS_Core’s standard manager lifecycle, handles the camera system’s startup and shutdown through the two-stage initialization sequence, and maintains awareness of which Phantom Camera is currently dominant. Any time the active camera changes, the Cam Manager broadcasts a notification so dependent systems can react without polling.

For architecture details, component properties, and extending the system in C++, see the Framework API reference.

Cam Manager component in the O3DE Inspector

 

Contents


Manager Lifecycle

The Cam Manager is a GS_ManagerComponent and participates in the standard GS_Core startup sequence. It activates alongside all other managers during the two-stage initialization, making it safe to call camera system methods any time after OnStartupComplete.

StageWhat Happens
InitializeCam Manager activates and registers with the camera pipeline.
Startup (OnSetupManagers)Manager is initialized. Safe to query camera state.
Complete (OnStartupComplete)All managers ready. Safe to enable or disable the camera system.

Enabling and Disabling the Camera System

The camera system can be suspended and resumed independently of other gameplay systems. This is used during UI-only states, loading screens, or when a cinematic takes direct control of the camera.

NotificationWhat It Does
EnableCameraSystemActivates the camera system. Phantom Cameras resume competing for priority.
DisableCameraSystemSuspends the camera system. The real camera stops receiving updates from Phantom Cameras.

Both events are on CamManagerNotificationBus. Broadcast them to switch the camera system state from any script or component.


Tracking the Active Camera

The Cam Manager broadcasts SettingNewCam on CamManagerNotificationBus whenever a different Phantom Camera becomes dominant. This fires any time a camera with higher priority activates, a dominant camera deactivates, or a blend resolves to a new target.

Use this event in any system that needs to know which camera is currently active — aiming reticles, world-space UI elements, minimaps, or audio listener positioning.

ScriptCanvas


Suspending the Camera System

ScriptCanvas

Broadcast DisableCameraSystem to suspend camera updates, for example when a cutscene takes direct camera control:

Because DisableCameraSystem and EnableCameraSystem are notifications rather than requests, they are fire-and-forget. Any script can broadcast them.


Changing the Camera Target

ScriptCanvas

The follow target designated by the Cam Manager can be read and changed with the Get/Set Target nodes:


Cam Manager Entity Configuration

Cam Manager entity setup in the O3DE Editor

The Cam Manager entity MUST carry the MainCamera with it. This keeps the camera under the system’s control and guarantees a Cam Core exists in whichever stage you load into first.

Additionally, you can add any Phantom Cameras that should persist between stage changes. These are usually the “rig” cameras that constantly track the player.

Using a PossessedUnit event on a player controller enables you to immediately set the Cam Manager target to your spawned player unit.


Quick Reference

NeedBusMethod / Event
Enable the camera systemCamManagerNotificationBusEnableCameraSystem
Disable the camera systemCamManagerNotificationBusDisableCameraSystem
Know when the active camera changesCamManagerNotificationBusSettingNewCam(newCamEntityId)
Access camera manager methodsCamManagerRequestBus(see Framework API)

Glossary

TermMeaning
Cam ManagerThe singleton manager for the PhantomCam system that tracks active cameras and broadcasts changes
Dominant CameraThe Phantom Camera with the highest priority that currently drives the real camera

For full definitions, see the Glossary.


See Also

For the full API, component properties, and C++ extension guide:

For related systems:


Get GS_PhantomCam

GS_PhantomCam — Explore this gem on the product page and add it to your project.

3.10.2 - Phantom Cameras

How to configure and use GS_PhantomCameraComponent — priority, targets, data fields, and the available camera behavior types.

Phantom Cameras are virtual camera definitions placed in the level as components on ordinary entities. They do not render anything. Instead, each Phantom Camera holds a configuration record that describes how the real camera should behave when that Phantom Camera is dominant. The Cam Manager determines dominance by comparing priority values — whichever active Phantom Camera holds the highest priority drives the real camera at any given moment.

For architecture details, component properties, and extending the system in C++, see the Framework API reference.

Phantom Camera component in the O3DE Inspector

 

Contents


How Priority Works

Every GS_PhantomCameraComponent has a priority value. The Cam Manager continuously tracks all active Phantom Cameras and routes camera control to the one with the highest priority. When a Phantom Camera is disabled or deactivated, the next-highest priority camera takes over.

ConditionResult
Camera A priority 10, Camera B priority 5 — both activeCamera A is dominant.
Camera A deactivatesCamera B becomes dominant immediately (or blends if a Blend Profile is set).
Camera A and Camera B both priority 10The most recently activated one is dominant.
No Phantom Cameras are activeThe real camera holds its last known position.
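The rules in the table above can be sketched as a small, self-contained model. This is our own illustrative code, not the actual Cam Manager internals: the highest priority wins, ties go to the most recently activated camera, and an empty set leaves no dominant camera.

```cpp
#include <cstdint>
#include <vector>

// Hypothetical model of the Cam Manager's dominance rule (names are ours,
// not the GS_PhantomCam source).
struct CamEntry {
    int id;               // stand-in for the camera's EntityId
    int priority;         // the camera's current priority value
    uint64_t activatedAt; // monotonically increasing activation counter
};

// Returns the id of the dominant camera, or -1 if none are active.
int ResolveDominant(const std::vector<CamEntry>& active) {
    int bestId = -1;
    int bestPriority = 0;
    uint64_t bestActivation = 0;
    for (const CamEntry& cam : active) {
        // Higher priority wins; equal priority goes to the later activation.
        if (bestId == -1 ||
            cam.priority > bestPriority ||
            (cam.priority == bestPriority && cam.activatedAt > bestActivation)) {
            bestId = cam.id;
            bestPriority = cam.priority;
            bestActivation = cam.activatedAt;
        }
    }
    return bestId;
}
```

Note that the "no active cameras" case returns a sentinel here; in the real system the camera simply holds its last known position.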

PhantomCamData Fields

Each Phantom Camera stores a PhantomCamData record that defines the full camera configuration for when that camera is dominant.

FieldTypePurpose
FOVfloatField of view in degrees. Applied to the real camera’s FOV when dominant.
NearClipfloatNear clip plane distance.
FarClipfloatFar clip plane distance.
FollowTargetEntityIdEntity the camera’s position tracks. Camera moves with this entity.
LookAtTargetEntityIdEntity the camera’s rotation aims at. Overrides authored rotation.
PositionOffsetVector3Positional offset applied relative to the follow target.
RotationOffsetQuaternionRotational offset applied after look-at or authored rotation.

FollowTarget and LookAtTarget are both optional. A Phantom Camera with neither set holds the position and rotation of its entity in the level.
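The optional-target behavior can be illustrated with a minimal sketch. The types and helper below are our own assumptions, not the gem's API: with a follow target set, the camera tracks that target plus the positional offset; without one, it holds the authored position of its own entity.

```cpp
#include <optional>

struct Vec3 { float x, y, z; };

// Illustrative resolution of an optional FollowTarget plus PositionOffset.
Vec3 ResolveCamPosition(const Vec3& authoredPos,
                        const std::optional<Vec3>& followTargetPos,
                        const Vec3& positionOffset) {
    if (followTargetPos) {
        // Track the follow target, displaced by the configured offset.
        return { followTargetPos->x + positionOffset.x,
                 followTargetPos->y + positionOffset.y,
                 followTargetPos->z + positionOffset.z };
    }
    return authoredPos; // no follow target: hold the entity's own transform
}
```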


Camera Behavior Types

Beyond the base GS_PhantomCameraComponent, several behavior components can be added to a Phantom Camera entity to specialize how it moves and aims:

ComponentWhat It Does
ClampedLook_PhantomCamComponentAdds yaw and pitch angle limits to look-at rotation. Prevents the camera from swinging past defined bounds.
StaticOrbitPhantomCamComponentLocks the camera to a fixed orbit radius around its target entity. The orbit position does not follow input — it is a fixed authored angle.
Track_PhantomCamComponentMoves the camera along a spline path. Useful for cinematic dolly shots or scripted flythrough sequences.
AlwaysFaceCameraComponentMakes the entity that holds this component always rotate to face the currently dominant Phantom Camera. Used for billboards and 2D sprites in 3D space.

Each behavior component is additive — the base Phantom Camera still drives priority and targeting while the behavior component modifies position or rotation evaluation.


Activating and Deactivating Cameras

Phantom Cameras participate in priority competition only while their entity is active. Enable and disable the entity to add or remove a camera from competition. The transition between dominant cameras is governed by the Blend Profile referenced in the outgoing camera’s blend settings.

ScriptCanvas

To raise a camera’s priority and make it dominant:

Because priority is numeric, you can also activate multiple cameras simultaneously and let priority values determine dominance without manually disabling others.


Requesting Camera Data

Use PhantomCameraRequestBus (addressed by the Phantom Camera entity’s ID) to read or write its configuration at runtime:

ScriptCanvas


Quick Reference

NeedBusMethod / Event
Read a camera’s full configurationPhantomCameraRequestBus(id)GetPhantomCamData
Change a camera’s follow targetPhantomCameraRequestBus(id)SetFollowTarget(entityId)
Change a camera’s look-at targetPhantomCameraRequestBus(id)SetLookAtTarget(entityId)
Change a camera’s FOVPhantomCameraRequestBus(id)SetFOV(value)
Make a camera dominant(entity activation)Enable the Phantom Camera entity
Remove a camera from competition(entity activation)Disable the Phantom Camera entity
Know when the dominant camera changesCamManagerNotificationBusSettingNewCam(newCamEntityId)

Glossary

TermMeaning
Phantom CameraA virtual camera definition that describes how the real camera should behave when dominant
PhantomCamDataThe configuration record holding FOV, clip planes, follow/look-at targets, and offsets
PriorityA numeric value that determines which Phantom Camera drives the real camera
Follow TargetAn entity whose position the camera tracks
Look-At TargetAn entity the camera’s rotation aims toward

For full definitions, see the Glossary.


See Also

For the full API, component properties, and C++ extension guide:

For related systems:


Get GS_PhantomCam

GS_PhantomCam — Explore this gem on the product page and add it to your project.

3.10.3 - Cam Core

How to work with GS_CamCoreComponent — the rendering bridge that reads the dominant Phantom Camera each frame and drives the real camera entity.

Cam Core is the rendering bridge between the Phantom Camera system and the actual O3DE camera entity. GS_CamCoreComponent runs on tick, reads the dominant Phantom Camera’s configuration each frame, and writes position, rotation, and field of view directly to the camera entity that the renderer uses. Phantom Cameras define intent — Cam Core executes it.

For architecture details, component properties, and extending the system in C++, see the Framework API reference.

Cam Core component in the O3DE Inspector

 

Contents


What Cam Core Does Each Frame

On every tick, Cam Core follows this sequence:

StepWhat Happens
1 — Query active cameraReads the current dominant Phantom Camera from the Cam Manager.
2 — Read PhantomCamDataFetches the active camera’s position, rotation, FOV, and clip planes.
3 — Apply blendIf a blend is in progress, interpolates between the outgoing and incoming camera values.
4 — Write to real cameraPushes the resolved transform and FOV to the real O3DE camera entity.
5 — Broadcast positionFires UpdateCameraPosition on CamCoreNotificationBus so dependent systems receive the final camera location.

Because Cam Core owns the final write to the camera entity, it is also the correct insertion point for last-mile adjustments such as screen shake offsets, post-processing overrides, or recoil displacement, applied after blend resolution but before the frame renders.
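The ordering described above can be modeled in a few lines. This is a toy sketch with our own names (not Cam Core's real internals): resolve the blend first, then apply last-mile offsets such as screen shake, then treat the result as the final write.

```cpp
// Minimal per-frame camera state: FOV plus one position axis for brevity.
struct FrameCamState { float fov; float posX; };

float Lerp(float a, float b, float t) { return a + (b - a) * t; }

FrameCamState ResolveFrame(const FrameCamState& outgoing,
                           const FrameCamState& incoming,
                           float blendT,        // blend progress in [0, 1]
                           float shakeOffsetX)  // last-mile adjustment
{
    FrameCamState resolved;
    // Step 3 from the table: interpolate between outgoing and incoming.
    resolved.fov  = Lerp(outgoing.fov,  incoming.fov,  blendT);
    resolved.posX = Lerp(outgoing.posX, incoming.posX, blendT);
    // Post-blend, pre-render: apply effects like screen shake last.
    resolved.posX += shakeOffsetX;
    return resolved; // step 4 would write this to the real camera entity
}
```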


Responding to Camera Position Updates

CamCoreNotificationBus broadcasts UpdateCameraPosition every frame with the resolved camera transform. Connect to this bus in any system that needs per-frame camera location data.

Use CaseWhy UpdateCameraPosition
Audio listener positioningFollow the camera without polling the camera entity.
LOD or culling controllersReceive camera position reliably after blend is applied.
Shadow caster updatesReact to camera movement each frame.
World-space UI anchoringKnow exactly where the camera landed after Cam Core processed its frame.

ScriptCanvas


Cam Core Setup

Each level requires exactly one GS_CamCoreComponent, plus a separate entity with an O3DE Camera component that Cam Core is configured to drive. The Camera entity is what the renderer uses; Cam Core holds a reference to it and writes to it each frame.

EntityRequired Components
Cam Core entityGS_CamCoreComponent
Camera entityO3DE CameraComponent (standard)

Assign the Camera entity reference in the Cam Core component’s properties in the editor. Both entities should be present in every level that uses the PhantomCam system.


Querying Cam Core State

Use CamCoreRequestBus to read the current camera state or set the camera entity reference at runtime:

ScriptCanvas

Cam Core state in Script Canvas


Quick Reference

NeedBusMethod / Event
Know the camera position each frameCamCoreNotificationBusUpdateCameraPosition(transform)
Get the real camera entity IDCamCoreRequestBusGetCameraEntity
Get the current resolved FOVCamCoreRequestBusGetCurrentFOV
Know when the active camera changesCamManagerNotificationBusSettingNewCam(newCamEntityId)

Glossary

TermMeaning
Cam CoreThe rendering bridge that reads the dominant Phantom Camera each frame and writes to the real O3DE camera
Blend ResolutionThe process of interpolating position, rotation, and FOV between outgoing and incoming cameras

For full definitions, see the Glossary.


See Also

For the full API, component properties, and C++ extension guide:

For related systems:


Get GS_PhantomCam

GS_PhantomCam — Explore this gem on the product page and add it to your project.

3.10.4 - Blend Profiles

How to create and use GS_PhantomCamBlendProfile assets to control transition timing, easing, and interpolation between Phantom Cameras.

Blend Profiles are data assets that define how the camera transitions when control shifts from one Phantom Camera to another. Without a Blend Profile, transitions are instantaneous. A well-authored Blend Profile is often the single largest contributor to a camera system feeling polished rather than mechanical.

For architecture details, asset structure, and extending the system in C++, see the Framework API reference.

Cam Blend Profile asset in the O3DE Asset Editor

 

Contents


What a Blend Profile Contains

Each GS_PhantomCamBlendProfile asset defines the shape of a single camera transition:

FieldPurpose
DurationHow long the blend takes, in seconds.
Position EasingThe easing curve applied to position interpolation. Controls how the camera accelerates and decelerates as it moves toward the new position.
Rotation EasingThe easing curve applied to rotation interpolation. Can be set independently from position easing.
FOV EasingThe easing curve applied to field-of-view interpolation.

Position, rotation, and FOV each have independent easing settings. This makes it straightforward to compose transitions that feel distinct in each axis — for example, position eases out slowly while rotation snaps quickly, or FOV leads the transition while position lags.
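Per-channel independence can be shown with a short sketch. The helper names here are ours, not the asset's fields: each channel maps the same raw blend progress through its own curve before interpolation, so position can ease gently while rotation snaps.

```cpp
// Position eases out; rotation jumps only at the very end of the blend.
float EaseOutQuad(float t) { return 1.0f - (1.0f - t) * (1.0f - t); }
float Snap(float t)        { return t >= 1.0f ? 1.0f : 0.0f; }

struct BlendSample { float positionT; float rotationT; };

// One raw progress value in, one eased value per channel out.
BlendSample SampleBlend(float rawT) {
    return { EaseOutQuad(rawT), Snap(rawT) };
}
```

Halfway through such a blend, position is already 75% of the way to its target while rotation has not moved at all; both arrive together when the blend completes.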


How Blend Profiles Are Referenced


Creating a Blend Profile Asset

Blend Profile assets are created from the Asset Browser in the O3DE Editor:

  1. Right-click in the Asset Browser where you want to store the profile.
  2. Select Create Asset → GS_PhantomCamBlendProfile.
  3. Name the asset descriptively — for example, CombatCameraBlend or CinematicSoftBlend.
  4. Open the asset to edit duration and easing curves.
  5. Assign the asset to one or more Phantom Camera components in the level.

Easing Curve Reference

Easing curves control the acceleration profile of an interpolation. Common choices and their camera feel:

CurveFeel
LinearMechanical, even movement. Rarely used for cameras.
EaseInStarts slow, accelerates. Camera hesitates before committing.
EaseOutStarts fast, decelerates. Camera arrives gently.
EaseInOutSlow start and end, fast through the middle. Most natural for most transitions.
EaseOutBackSlight overshoot before settling. Adds energy to arriving at a new view.
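For reference, the curves in the table correspond to standard easing formulas. These are conventional definitions shown as a sketch; the gem's built-in curves may differ in exact constants.

```cpp
float Linear(float t)    { return t; }
float EaseIn(float t)    { return t * t; }
float EaseOut(float t)   { return 1.0f - (1.0f - t) * (1.0f - t); }
float EaseInOut(float t) {
    return t < 0.5f ? 2.0f * t * t
                    : 1.0f - 2.0f * (1.0f - t) * (1.0f - t);
}
// Overshoots slightly past 1.0 before settling back, hence the "energy".
float EaseOutBack(float t) {
    const float c1 = 1.70158f;
    const float c3 = c1 + 1.0f;
    float u = t - 1.0f;
    return 1.0f + c3 * u * u * u + c1 * u * u;
}
```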

ScriptCanvas — Triggering a Blend

Blends trigger automatically when the dominant Phantom Camera changes. To trigger a blend from script, simply change which camera is active or dominant:

[Custom Event → EnterCombat]
    └─► [EntityId → Set Active (combat camera entity)]
            └─► Cam Manager detects priority change
            └─► Blend Profile on combat camera governs the transition
            └─► Cam Core interpolates position, rotation, FOV over blend duration

There is no separate “start blend” call. The blend begins the moment the Cam Manager determines a new dominant camera and ends when Cam Core finishes interpolating.


Quick Reference

NeedHow
Create a blend profileAsset Browser → Create Asset → GS_PhantomCamBlendProfile
Assign a profile to a cameraSet the Blend Profile reference in the Phantom Camera component properties
Make a transition instantLeave the Blend Profile reference empty on the Phantom Camera
Share one profile across many camerasAssign the same asset reference to multiple Phantom Camera components
Change blend timing without changing camerasEdit the Blend Profile asset — all cameras referencing it update automatically

Glossary

TermMeaning
Blend ProfileA data asset defining duration and per-axis easing curves for camera transitions
Position EasingThe easing curve controlling how the camera accelerates/decelerates in position
Rotation EasingThe easing curve controlling rotation interpolation independently from position

For full definitions, see the Glossary.


See Also

For the full API, component properties, and C++ extension guide:

For related systems:


Get GS_PhantomCam

GS_PhantomCam — Explore this gem on the product page and add it to your project.

3.10.5 - Influence Fields

How to use GlobalCameraInfluenceComponent and CameraInfluenceFieldComponent to apply dynamic camera modifications globally or within spatial zones.

Influence Fields are modifiers that change camera priority dynamically beyond what any individual Phantom Camera’s base priority specifies. Two types are available: global components that apply unconditionally, and spatial field components that apply only when the player enters a defined volume.

For architecture details, component properties, and extending the system in C++, see the Framework API reference.

Camera Influence Field component in the O3DE Inspector

 

Contents


Types of Influence Fields

ComponentScopeWhen It Applies
GlobalCameraInfluenceComponentGlobalAlways active. Applies priority effects to the system unconditionally. Usually placed on the StageData entity.
CameraInfluenceFieldComponentSpatialActive when the player is inside the associated physics trigger volume.

Both types use CamInfluenceData records to describe what they modify. Multiple fields can be active at the same time — their effects are resolved by priority stacking.


Priority Stacking

When multiple influence fields are active at the same time, their priority changes add to the target camera’s priority. This lets progressively more specific spaces take over the camera: entering a nested zone adds its influence field, and leaving it removes the field, returning control to the more general space.
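The stacking rule reduces to simple addition. A minimal sketch, with our own function name rather than the gem's API: a camera's effective priority is its base priority plus the sum of every currently active influence targeting it.

```cpp
#include <numeric>
#include <vector>

// Effective priority = base priority + sum of active influence changes.
// Negative influences lower the camera's standing; positive ones raise it.
int EffectivePriority(int basePriority, const std::vector<int>& activeInfluences) {
    return std::accumulate(activeInfluences.begin(), activeInfluences.end(),
                           basePriority);
}
```

A camera with base priority 10 inside two overlapping fields contributing +5 and -3 ends up at effective priority 12; the Cam Manager then compares effective values to pick the dominant camera.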


Quick Reference

NeedComponentHow
Apply a priority modifier alwaysGlobalCameraInfluenceComponentAdd to any entity, preferably the StageData entity
Apply a priority in a zoneCameraInfluenceFieldComponentAdd with a PhysX trigger collider; player entering activates it

Glossary

TermMeaning
Influence FieldA modifier that adjusts camera behavior dynamically on top of the dominant Phantom Camera
Priority StackingAdding each active influence field’s change on top of a camera’s base priority to determine its effective priority and, ultimately, the dominant Phantom Camera
Global InfluenceAn unconditional priority modifier that applies regardless of spatial position
Spatial InfluenceA volume-based modifier that activates when the player enters its physics trigger

For full definitions, see the Glossary.


See Also

For the full API, component properties, and C++ extension guide:

For related systems:


Get GS_PhantomCam

GS_PhantomCam — Explore this gem on the product page and add it to your project.

3.11 - GS_Stats

RPG statistics and status effect systems for GS_Play — timed and permanent modifiers that alter entity stats and drive reactive gameplay behavior.

GS_Stats provides the statistical and modifier infrastructure for RPG-style character systems. It handles the application, tracking, and expiry of status effects — the layer between raw gameplay events and the numerical state of a character.

For architecture details, component properties, and extending the system in C++, see the GS_Stats API.


Quick Navigation

I want to…FeatureAPI
Apply timed or permanent modifiers to entity stats with stacking and expiryStatus EffectsAPI

Installation

GS_Stats requires GS_Core. Add both gems to your project.

For a full guided walkthrough, follow the Simple Project Setup guide.

 

Quick Installation Summary

  1. Enable the GS_Stats gem in your project configuration.
  2. Define status effect data assets for your game’s modifiers.
  3. Place status effect components on entities that need to receive modifiers.

Status Effects

Status Effects are the primary mechanism for applying timed or permanent modifiers to entities. Each effect defines what it changes, how long it lasts, whether it stacks, and what happens on expiry. The system handles the full lifecycle: application, duration tracking, stack management, and cleanup. Effects are data-driven — designers author new types in the Asset Editor without touching code.

Status Effects API


See Also

For the full API, component properties, and C++ extension guide:


Get GS_Stats

GS_Stats — Explore this gem on the product page and add it to your project.

3.11.1 - Status Effects

How to work with GS_Play status effects — timed modifiers that alter entity stats or behavior.

Status Effects are timed or permanent modifiers that alter entity stats or behavior. They support stacking, expiration, and side-effect triggering — enabling standard RPG mechanics like poison damage over time, temporary buffs, and debuff stacking.

For component properties and the status effect data model, see the Framework API reference.

 

Contents


How Status Effects Work

PhaseWhat Happens
ApplicationA status effect is applied to an entity with a duration and configuration.
ActiveThe effect modifies the entity’s stats or triggers periodic side effects.
StackingIf the same effect is applied again, stacking rules determine whether it refreshes, stacks count, or is rejected.
ExpirationWhen the duration expires, the effect is removed and any cleanup side effects fire.
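The stacking phase can be modeled with a small decision function. The enum and struct below are our own illustrative names, not the GS_Stats data model: reapplying an active effect either refreshes its duration, adds a stack, or is rejected outright.

```cpp
// Hypothetical stacking rules for reapplying an already-active effect.
enum class StackRule { Refresh, StackCount, Reject };

struct EffectState { int stacks; float remaining; };

// Returns the updated state after reapplying an effect with `duration`.
EffectState Reapply(EffectState current, StackRule rule, float duration) {
    switch (rule) {
    case StackRule::Refresh:    return { current.stacks, duration };     // reset timer
    case StackRule::StackCount: return { current.stacks + 1, duration }; // add a stack
    case StackRule::Reject:     return current;                          // no change
    }
    return current;
}
```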

Glossary

TermMeaning
Status EffectA timed or permanent modifier that alters entity stats or behavior
StackingRules governing how repeated applications of the same effect interact
Side EffectAn action triggered when a status effect is applied, ticks, or expires

For full definitions, see the Glossary.


See Also

For the full API, component properties, and C++ extension guide:


Get GS_Stats

GS_Stats — Explore this gem on the product page and add it to your project.

3.12 - GS_UI

The complete UI framework for GS_Play — canvas lifecycle management, single-tier page navigation, enhanced buttons, data-driven animation, and input interception.

GS_UI is the complete user interface framework for GS_Play, built on top of O3DE’s LyShine rendering layer. It provides a singleton manager for canvas lifecycle and focus, a single-tier page navigation architecture, motion-based button animations, a LyShine-specific track animation system authored as data assets, and input interception utilities for pause menus and stage transitions.

For architecture details, component properties, and extending the system in C++, see the GS_UI API.


Quick Navigation

I want to…FeatureAPI
Load and unload UI canvases, manage focus stack, or set startup focusUI ManagerAPI
Navigate between pages, handle back navigation, or cross canvas boundariesPage NavigationAPI
Add button animations or intercept input for UI canvasesUI InteractionAPI
Animate UI elements with position, scale, rotation, alpha, and color tracksUI AnimationAPI
Add a load screen or pause menuWidgetsAPI

Installation

GS_UI requires GS_Core and LyShine. Add both gems to your project before placing UI components.

For a full guided walkthrough, follow the Simple Project Setup guide.

 

Quick Installation Summary

  1. Enable the GS_UI gem in your project configuration.
  2. Create a UI Manager prefab and add it to the Game Manager’s Managers list.
  3. Create LyShine UI canvases with GS_UIPageComponent on root elements.
  4. Create .uiam animation assets for page transitions and button effects.

UI Manager

The UI Manager is the singleton that owns canvas lifecycle for the entire game session. It loads and unloads canvases by name, maintains a global focus stack, and handles startup focus deterministically. All canvas operations go through the Manager so that cross-canvas navigation and focus transitions remain coordinated.

UI Manager API


Page Navigation

Page Navigation is the single-tier architecture that structures all navigable screens. Canvases contain root pages that can parent nested child pages recursively. Navigation is driven through the Page component: NavigateTo, NavigateBack, ChangePage, and ChangeUI handle all routing. Pages play UiAnimationMotion assets on show and hide for data-driven transitions.

Page Navigation API


UI Interaction

Enhanced button animations and input interception. GS_ButtonComponent extends LyShine interactables with UiAnimationMotion-based hover and select animations. GS_UIInputInterceptorComponent captures input events while a canvas is focused, preventing them from reaching gameplay. Both systems work together on interactive canvases.

UI Interaction API


UI Animation

UI Animation extends GS_Motion with eight LyShine-specific tracks: position, scale, rotation, element alpha, image alpha, image color, text color, and text size. Tracks are authored together in .uiam assets and played in parallel at runtime. Page components embed motion instances for show/hide transitions, and a standalone component makes animations available on any UI entity.

UI Animation API


Widgets

Standalone UI components for game-event-driven scenarios outside the page navigation model. Load screens display automatically during stage transitions. Pause menus overlay gameplay and suppress gameplay input while active.

Widgets API


See Also

For the full API, component properties, and C++ extension guide:

For step-by-step project setup:


Get GS_UI

GS_UI — Explore this gem on the product page and add it to your project.

3.12.1 - UI Manager

How to work with the GS_Play UI Manager — canvas lifecycle, focus stack, and cross-canvas navigation.

The UI Manager is the singleton controller for all UI canvases in your project. It loads and unloads canvases, maintains a global focus stack that tracks which canvas the player is currently interacting with, and handles cross-canvas navigation so that opening a pause menu over gameplay UI and returning to it works automatically.

For architecture details, component properties, and extension patterns, see the Framework API reference.

UI Manager component in the O3DE Inspector

 

Contents


How It Works

The UI Manager maintains a global focus stack — a history of which UI canvas has focus. When you open a new canvas (like a pause menu), it pushes onto the stack. When you close it, the previous canvas (gameplay HUD) regains focus automatically.
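The push/pop behavior can be sketched as a toy focus stack. This mirrors only the stack bookkeeping described above (the real UI Manager also drives page focus and canvas enabling), and the member names are ours:

```cpp
#include <string>
#include <vector>

struct FocusStack {
    std::vector<std::string> stack;

    // FocusUI pushes a canvas onto the stack, making it the focused one.
    void FocusUI(const std::string& name) { stack.push_back(name); }

    // NavLastUI pops the current canvas and returns the canvas that
    // regains focus, or an empty string if the stack is exhausted.
    std::string NavLastUI() {
        if (!stack.empty()) stack.pop_back();
        return stack.empty() ? std::string{} : stack.back();
    }

    std::string Focused() const {
        return stack.empty() ? std::string{} : stack.back();
    }
};
```

Opening a pause menu over a HUD pushes it; closing the menu pops it and the HUD regains focus automatically.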

At startup, the UI Manager loads canvases and waits for root pages to register. One canvas is designated as the startup focus UI — it receives focus automatically when the startup sequence completes.


Loading and Unloading Canvases

ScriptCanvas NodeWhat It Does
LoadGSUI(name, path)Loads a UI canvas from an asset path and registers it by name.
UnloadGSUI(name)Unloads a canvas and removes it from the active map.
FocusUI(name)Pushes a canvas onto the global focus stack and focuses its root page.
NavLastUIPops the current canvas off the focus stack and returns focus to the previous one.
ToggleUI(name, on)Shows or hides a canvas without changing the focus stack.

ScriptCanvas

To change between canvases, leaving the previous one inactive and enabling the next:

Toggle off your current canvas, then focus the next. Focus automatically handles enabling the canvas for use.

Using Page: NavigateBack later naturally hides the new canvas and reactivates the previous one.


Querying UI State

ScriptCanvas NodeWhat It Does
GetFocusedUIReturns the name of the canvas currently on top of the focus stack.
GetUICanvasEntity(name)Returns the canvas entity for a named UI.
GetUIRootPageEntity(name)Returns the root page entity for a named UI.

ScriptCanvas


UI Input Profile

UI Input Profile asset in the O3DE Asset Editor


Quick Reference

NeedBusMethod
Load a canvasUIManagerRequestBusLoadGSUI(name, path)
Unload a canvasUIManagerRequestBusUnloadGSUI(name)
Focus a canvasUIManagerRequestBusFocusUI(name)
Return to previous canvasUIManagerRequestBusNavLastUI
Show/hide a canvasUIManagerRequestBusToggleUI(name, on)
Get focused canvas nameUIManagerRequestBusGetFocusedUI

Glossary

TermMeaning
UI CanvasA LyShine canvas loaded and managed by the UI Manager
Focus StackA global history of focused canvases — opening a new one pushes, closing pops
Root PageA page registered with the UI Manager as a canvas entry point

For full definitions, see the Glossary.


See Also

For the full API, component properties, and C++ extension guide:

For related systems:


Get GS_UI

GS_UI — Explore this gem on the product page and add it to your project.

3.12.2 - Page Navigation

How to work with GS_Play page navigation — single-tier page hierarchy, focus management, and transitions.

Page Navigation is the core of the GS_UI system. Pages are the fundamental unit of UI organization — each page represents a screen, panel, or section of your interface. Pages can be root pages (registered with the UI Manager as a canvas entry point) or nested child pages (managed by a parent page). Navigation uses a push/pop focus stack so the system always knows where to return when the player backs out.

For architecture details, the NavigateTo algorithm, and component internals, see the Framework API reference.

Root page entity setup in the O3DE Editor

 

Contents


Page Types

| Type | How It’s Configured | What It Does |
|---|---|---|
| Root Page | m_isRoot = true | Registers with the UI Manager by name. Acts as the canvas entry point. |
| Child Page | m_isRoot = false | Registers with its parent page automatically. Managed by parent’s show/hide logic. |

A canvas typically has one root page at the top, with child pages nested beneath it for sub-screens (settings tabs, inventory categories, confirmation dialogs).

Nested pages entity hierarchy in the O3DE Editor

An example of nesting pages within pages to define the UI display structure and handle focus changes between them. SettingsWindowPage represents the main options window; its child pages are the individual panels displayed within that space, swapped as the side menu selection changes.

For guides on complex page navigation, see Lesson: Create UI.

 

Required Companion Components

Root page entities require two additional components alongside GS_UIPageComponent:

| Component | Why It’s Needed |
|---|---|
| FaderComponent | Drives alpha-based fade transitions for show/hide animations. |
| HierarchicalInteractionToggleComponent | Disables input on the entire page subtree when the page is hidden, preventing clicks from reaching invisible elements. |

Add both to every root page entity in the UI Editor.


These are the primary methods for controlling page flow:

| ScriptCanvas Node | What It Does |
|---|---|
| NavigateTo(forced) | Focuses this page. Automatically pushes at the correct ancestor level in the hierarchy. |
| NavigateBack | Walks up the page hierarchy to find the first ancestor with a non-empty focus stack and pops it. If no stack is found, calls NavLastUI on the UI Manager (closes the canvas). |
| ChangePage(target, takeFocus, forced) | Swaps the displayed child page without pushing a new focus stack entry. |
| ToggleShow(on) | Shows or hides this page. |
| ChangeUI(targetUI, takeFocus, hideThis) | Cross-canvas navigation — switches to a different UI canvas. |
| FocusChildPageByName(name, forced) | Focuses a child page by its m_pageName string. |

Handling Page Components

Navigating Pages

Changing UIs
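The page flow methods can be sketched as follows — the page entity variables and trigger events are hypothetical, and the bus is assumed to be addressed by the page's entity ID:

[Settings button pressed]
    └─► [UIPageRequestBus(settingsPageId) → NavigateTo(false)]

[Side menu tab changed]
    └─► [UIPageRequestBus(settingsPageId) → FocusChildPageByName("AudioTab", false)]

[Back pressed]
    └─► [UIPageRequestBus(settingsPageId) → NavigateBack]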


Return Policies

Each page has a NavigationReturnPolicy that controls which element gets focus when the page is returned to:

| Policy | Behavior |
|---|---|
| RestoreLast | Returns focus to the last interactable the player was on (e.g., resume where you left off). |
| AlwaysDefault | Always returns to the page’s default interactable (e.g., always land on “New Game” button). |

Page Transitions

Pages support three motion slots for animated transitions:

| Slot | When It Plays |
|---|---|
| onShow | When the page becomes visible. |
| onShowLoop | Loops continuously while the page is visible. |
| onHide | When the page is hidden. |

Each slot takes a .uiam (UI Animation Motion) asset. See UI Animation for track types.


Quick Reference

| Need | Bus | Method |
|---|---|---|
| Navigate to a page | UIPageRequestBus | NavigateTo(forced) |
| Go back | UIPageRequestBus | NavigateBack |
| Switch child page | UIPageRequestBus | ChangePage(target, takeFocus, forced) |
| Show/hide a page | UIPageRequestBus | ToggleShow(on) |
| Switch canvases | UIPageRequestBus | ChangeUI(targetUI, takeFocus, hideThis) |
| Focus child by name | UIPageRequestBus | FocusChildPageByName(name, forced) |

Glossary

| Term | Meaning |
|---|---|
| Page | The fundamental UI unit — a screen, panel, or section of the interface |
| Root Page | A page registered with the UI Manager as a canvas entry point |
| Child Page | A page managed by a parent page, navigated to via ChangePage or FocusChildPageByName |
| NavigationReturnPolicy | Controls which element gets focus when returning to a page (RestoreLast or AlwaysDefault) |

For full definitions, see the Glossary.


See Also

For the full API, component properties, and C++ extension guide:

For related systems:


Get GS_UI

GS_UI — Explore this gem on the product page and add it to your project.

3.12.3 - UI Interaction

How to work with GS_Play UI interaction — motion-based button animations and input interception for focused canvases.

UI Interaction covers two systems that handle player input at the UI layer: enhanced buttons with motion-based hover and select animations, and the input interceptor that captures input events while a canvas is focused.

For component properties and the full API, see the Framework API reference.

 

Contents


Buttons

GS_ButtonComponent in the O3DE Inspector

GS_ButtonComponent extends LyShine’s interactable system with UiAnimationMotion support. When the player hovers over a button (mouse or gamepad navigation), a configurable animation plays. When the button is selected, a separate animation fires. Both animations use .uiam assets, giving full control over position, scale, rotation, alpha, and color transitions.

Setup

  1. Add a LyShine UiButtonComponent (or other interactable) to your entity.
  2. Add GS_ButtonComponent to the same entity.
  3. Author .uiam assets for hover, unhover, and select states.
  4. Assign the assets to the corresponding fields in the Inspector.

Integration with Page Focus

When page navigation moves focus to a button (via keyboard or gamepad), the button treats focus as a hover event and plays its hover animation. This ensures consistent visual feedback regardless of input method.


Input Interception

UI Input Interceptor component in the O3DE Asset Editor

GS_UIInputInterceptorComponent prevents input from leaking to gameplay systems while a UI canvas is active. It uses a GS_UIInputProfile asset to define which input channels to intercept. Intercepted events are re-broadcast on UIInputNotificationBus so UI-specific logic can respond to them.

When the canvas loses focus, the interceptor deactivates automatically and input flows normally back to gameplay.

Setup

  1. Add GS_UIInputInterceptorComponent to the root page entity.
  2. Configure a GS_UIInputProfile asset with the channels to intercept.
  3. Connect to UIInputNotificationBus in any component that should respond to intercepted events.
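A sketch of the setup in use — the event name shown on the notification bus is a placeholder, not an actual signature; check the Framework API reference for the exact events the bus broadcasts:

[UIInputNotificationBus → (intercepted input event)]
    └─► [UI-specific logic — e.g., close the menu when a "Cancel" channel fires]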

Integration

Buttons and input interception work together on interactive canvases:

  • The input interceptor blocks gameplay input while a menu is open.
  • Buttons provide visual hover/select feedback driven by UIInputNotificationBus events.
  • The page system routes gamepad focus to the correct button automatically.

Glossary

| Term | Meaning |
|---|---|
| GS_ButtonComponent | An enhanced button with GS_Motion-based hover and select animations |
| Hover Animation | A .uiam motion that plays when the button receives focus or mouse hover |
| Select Animation | A .uiam motion that plays when the button is activated |
| GS_UIInputInterceptorComponent | Component that captures input events while a canvas is focused |
| GS_UIInputProfile | Data asset defining which input channels to intercept |

For full definitions, see the Glossary.


See Also

For the full API, component properties, and C++ extension guide:

For related systems:


Get GS_UI

GS_UI — Explore this gem on the product page and add it to your project.

3.12.3.1 - Buttons

Redirected — button documentation is now part of UI Interaction.

GS_ButtonComponent is an enhanced button that adds GS_Motion-based animations for hover and select states. For the full guide, see The Basics: UI Interaction.

For component properties and API details, see the Framework API reference.

 

GS_ButtonComponent in the O3DE Inspector

Contents


How It Works

The GS_ButtonComponent connects to LyShine’s interactable notification system. When the interactable reports a hover event, the button plays its hover animation. When it reports an unhover event, the animation reverses or stops. Select works the same way.

Animations are defined as .uiam assets and assigned in the Inspector. Each button can have independent hover and select motions.


Integration with Page Focus

When page navigation moves focus to a button (via keyboard or gamepad), the button treats focus as a hover event and plays its hover animation. This ensures consistent visual feedback regardless of input method.


Handling Button Events

Button Events nodes in the Script Canvas


Glossary

| Term | Meaning |
|---|---|
| GS_ButtonComponent | An enhanced button with GS_Motion-based hover and select animations |
| Hover Animation | A .uiam motion that plays when the button receives focus or mouse hover |
| Select Animation | A .uiam motion that plays when the button is activated |

For full definitions, see the Glossary.


See Also

For the full API, component properties, and C++ extension guide:

For related systems:


Get GS_UI

GS_UI — Explore this gem on the product page and add it to your project.

3.12.4 - UI Animation

How to work with GS_Play UI animations — motion-based LyShine property animation for pages, buttons, and standalone components.

UI Animation extends the GS_Motion system with 8 LyShine-specific animation tracks. Animations are authored as .uiam (UI Animation Motion) assets in the O3DE Asset Editor and can be used for page transitions, button hover effects, and standalone component animation.

For track type details, the domain extension pattern, and component internals, see the Framework API reference.

UI Animation Motion asset in the O3DE Asset Editor

 

Contents


Available Tracks

Each track animates a single LyShine property over time:

| Track | Target Component | What It Animates |
|---|---|---|
| Position | Any LyShine element | Position offset from anchor |
| Scale | Any LyShine element | Element scale |
| Rotation | Any LyShine element | Element rotation |
| Element Alpha | Any LyShine element | Whole-element transparency |
| Image Alpha | UiImageComponent | Image-specific transparency |
| Image Color | UiImageComponent | Image color tint |
| Text Color | UiTextComponent | Text color |
| Text Size | UiTextComponent | Font size |

Multiple tracks can run simultaneously within a single motion — for example, a page show animation might fade in (Element Alpha) while sliding up (Position) and scaling from 90% to 100% (Scale).


Where Animations Are Used

| Context | How It’s Assigned |
|---|---|
| Page transitions | Assigned to onShow, onShowLoop, and onHide slots on GS_UIPageComponent. |
| Button states | Assigned to hover and select slots on GS_ButtonComponent. |
| Standalone playback | Assigned to UiAnimationMotionComponent on any entity. |

Authoring Animations

  1. Create a new .uiam asset in the O3DE Asset Editor.
  2. Add tracks for the properties you want to animate.
  3. Configure timing, easing curves, and property values for each track.
  4. Optionally set track identifiers for proxy targeting.
  5. Assign the asset to a component slot in the Inspector.

Proxy Targeting

UiAnimationMotion with proxy bindings configured in the Inspector

When tracks have identifiers (named labels), they appear in the motion’s proxy list. Proxies let you redirect a track to a different entity in the UI hierarchy — for example, animating a background panel separately from a content area within the same page transition.


Quick Reference

| Need | How |
|---|---|
| Animate a page transition | Assign .uiam assets to onShow/onHide slots on the page |
| Animate a button hover | Assign .uiam asset to the button’s hover slot |
| Play an animation on any entity | Add UiAnimationMotionComponent and assign a .uiam asset |
| Target a child element | Use proxy entries to redirect tracks |
| Loop an animation | Enable loop on the motion asset |

Glossary

| Term | Meaning |
|---|---|
| UI Animation Motion (.uiam) | A data asset containing LyShine-specific animation tracks for UI elements |
| UiAnimationMotionComponent | A standalone component for playing UI animations on any entity |
| Proxy Targeting | Redirecting a track to a different entity in the UI hierarchy via named labels |

For full definitions, see the Glossary.


See Also

For the full API, component properties, and C++ extension guide:

For related systems:


Get GS_UI

GS_UI — Explore this gem on the product page and add it to your project.

3.12.5 - Widgets

How to work with GS_Play UI widgets — load screens, pause menus, and other standalone UI components.

Widgets are standalone UI components that handle specific game scenarios outside the page navigation model. They activate and deactivate in response to game events rather than player navigation — a load screen appears during a stage transition, a pause menu overlays gameplay when the player pauses.

For component properties and the full API, see the Framework API reference.

 

Contents


Load Screen

GS_LoadScreenComponent shows a loading canvas during stage transitions and hides it when loading is complete. Place it on the Game Manager entity and configure a LyShine canvas to use as the loading screen.

GS_LoadScreenComponent in the O3DE Inspector

The component listens for StageManagerNotificationBus events and drives the canvas visibility automatically — no ScriptCanvas or manual triggers required.


Pause Menu

PauseMenuComponent handles the pause overlay. It listens for the configured pause input action and toggles a designated pause canvas on and off, broadcasting events so other systems know to yield control.


Glossary

| Term | Meaning |
|---|---|
| Widget | A standalone UI component that activates in response to game events rather than player navigation |
| Load Screen | A canvas displayed during stage transitions, managed by GS_LoadScreenComponent |
| Pause Menu | An overlay canvas displayed when the game is paused, managed by PauseMenuComponent |

For full definitions, see the Glossary.


See Also

For the full API, component properties, and C++ extension guide:

For related systems:


Get GS_UI

GS_UI — Explore this gem on the product page and add it to your project.

3.13 - GS_Unit

The character and entity control system — unit registration, player and AI controllers, input processing, and modular movement.

GS_Unit is the complete character control system for GS_Play. It handles how entities are registered as controllable units, which controller possesses each unit, how raw player input is captured and converted into intent, and how that intent is translated into physical movement through a modular stack of movers, grounders, and influence fields.

For architecture details, component properties, and extending the system in C++, see the GS_Unit API.


Quick Navigation

| I want to… | Feature | API |
|---|---|---|
| Register units, track active controllers, or spawn units at runtime | Unit Manager | API |
| Possess and release units with player or AI controllers | Controllers | API |
| Capture raw input and convert it into structured movement intent | Input Data | API |
| Move characters with movers, grounders, and movement influence fields | Movement | API |

Installation

GS_Unit requires GS_Core. Add both gems to your project before using any Unit components.

For a full guided walkthrough, follow the Simple Project Setup guide.

 

Quick Installation Summary

  1. Enable the GS_Unit gem in your project configuration.
  2. Create a Unit Manager prefab and add it to the Game Manager’s Managers list.
  3. Configure physics collision layers for character movement.
  4. Place a Player Controller and Unit entity in your level.

Unit Manager

The Unit Manager is the singleton coordinator for everything unit-related. It handles registration of new units as they spawn, maintains an indexed list of all active units queryable by name or ID, and provides the spawn interface for runtime unit creation. It also participates in the standby system for clean level transitions.

Unit Manager API


Controllers

Controllers are the possession layer — they determine which entity a given intelligence (player or AI) currently owns and drives. Player and AI controllers share the same possession interface, so swapping control mid-game or having both active simultaneously requires no special-casing.

Controllers API


Input Data

The Input Data feature set converts hardware signals into structured intent your movement code can consume. The PlayerControllerInputReader reads raw input events, InputData components store the current state, and Input Reactors compute movement vectors from keyboard or joystick input with deadzones and scaling.

Input Data API


Units

Units are the controllable entities themselves. The Units overview covers how entities become units and leads into Movement — the modular stack of movers, grounders, and influence fields.

Units API


See Also

For the full API, component properties, and C++ extension guide:

For step-by-step project setup:


Get GS_Unit

GS_Unit — Explore this gem on the product page and add it to your project.

3.13.1 - Unit Manager

How to work with the GS_Unit manager — spawning units, registering player controllers, and coordinating standby across all controlled characters.

The Unit Manager is the global coordinator for all units in a GS_Play project. It maintains a registry of every active unit entity, provides the interface for spawning new units at runtime, tracks which player controllers are currently active, and participates in the GS_Core standby system so that all controlled characters pause cleanly during level transitions.

For architecture details, component properties, and extending the system in C++, see the Framework API reference.

Unit Manager component in the O3DE Inspector Unit Manager entity setup in the O3DE Editor

 

Contents


What the Unit Manager Does

The Unit Manager runs as a singleton that all other unit systems report to. When a unit entity activates, it registers itself so the manager can track it by name and ID. When gameplay code needs a new unit spawned, it asks the manager — not the spawning system directly — so the manager can assign a unique name and fire the appropriate notification.

| Responsibility | Description |
|---|---|
| Unit registry | Every active unit registers and deregisters automatically. |
| Spawning | Handles RequestSpawnNewUnit and fires ReturnNewUnit when the entity is ready. |
| Player controller tracking | Stores which controllers are registered as player-driven so multi-actor setups stay consistent. |
| Standby coordination | Broadcasts EnterStandby and ExitStandby to pause and resume all units during transitions. |
| Unit validation | CheckIsUnit lets any system confirm whether an entity is a registered unit. |

Spawning Units

How Spawning Works

Spawning a unit is a two-step async operation. Your script calls RequestSpawnNewUnit, the manager creates the entity using the prefab spawner, and when the unit is fully initialized the manager fires ReturnNewUnit on UnitManagerNotificationBus with the new entity ID and its assigned unique name.

You should always listen for ReturnNewUnit rather than attempting to use the entity immediately after the request, because spawning can take one or more frames.

| Step | What Happens |
|---|---|
| 1 — Request | Call RequestSpawnNewUnit with the prefab reference and desired spawn transform. |
| 2 — Wait | The manager spawns the entity and runs its initialization. |
| 3 — Receive | ReturnNewUnit fires on UnitManagerNotificationBus with the entity ID and name. |

ScriptCanvas — Spawning a Unit
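A sketch of the two-step flow described above; the prefab and transform variables are placeholders for your own references:

[Spawn trigger]
    └─► [UnitManagerRequestBus → RequestSpawnNewUnit(unitPrefab, spawnTransform)]

[UnitManagerNotificationBus → ReturnNewUnit(entityId, uniqueName)]
    └─► [store entityId / possess the new unit / continue setup]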


Registering Player Controllers

Player controllers must register themselves with the Unit Manager on activation so the game always knows which controllers are active. This is handled automatically by GS_PlayerControllerComponent, but if you are building a custom controller, call RegisterPlayerController in your activation logic.

[On Entity Activated (custom controller)]
    └─► [UnitManagerRequestBus → RegisterPlayerController(selfEntityId)]

This registration matters most for split-screen or multi-player setups where the manager needs to route input correctly across multiple simultaneous controllers.


Checking Unit Identity

Any system can confirm whether an arbitrary entity is a registered unit without holding a direct reference to the Unit Manager:

| ScriptCanvas Node | Returns | Notes |
|---|---|---|
| CheckIsUnit(entityId) | bool | true if the entity is an active registered unit. |

ScriptCanvas

[UnitManagerRequestBus → CheckIsUnit(targetEntityId)]
    └─► bool
            ├─► true  — proceed with unit-specific logic
            └─► false — entity is not a unit, skip

Standby Mode

The Unit Manager participates in GS_Core’s global standby system. During level transitions or other blocking operations, the Game Manager signals standby and the Unit Manager relays that signal to all units. Units should pause movement, suspend input processing, and halt any tick-driven logic while in standby.

Listen on UnitManagerNotificationBus to coordinate your own logic with unit standby:

| Event | When It Fires | What to Do |
|---|---|---|
| EnterStandby | Before a level transition or blocking operation begins. | Pause all movement, halt ticks, stop animations. |
| ExitStandby | After the blocking operation completes. | Resume movement, re-enable ticks. |

ScriptCanvas
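A sketch of a standby listener — the reaction nodes are placeholders for your own logic:

[UnitManagerNotificationBus → EnterStandby]
    └─► [pause movement, halt tick-driven logic, stop animations]

[UnitManagerNotificationBus → ExitStandby]
    └─► [resume movement, re-enable tick-driven logic]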


Quick Reference

| Need | Bus | Method / Event |
|---|---|---|
| Spawn a new unit | UnitManagerRequestBus | RequestSpawnNewUnit(prefab, transform) |
| Know when a unit is ready | UnitManagerNotificationBus | ReturnNewUnit(entityId, uniqueName) |
| Register a player controller | UnitManagerRequestBus | RegisterPlayerController(entityId) |
| Check if entity is a unit | UnitManagerRequestBus | CheckIsUnit(entityId) |
| Know when standby begins | UnitManagerNotificationBus | EnterStandby |
| Know when standby ends | UnitManagerNotificationBus | ExitStandby |

Glossary

| Term | Meaning |
|---|---|
| Unit Manager | The singleton manager that tracks all active units, handles spawning, and coordinates standby |
| Unit Registry | The internal list of every active unit entity, indexed by name and ID |
| Standby | A pause state broadcast to all units during level transitions or blocking operations |

For full definitions, see the Glossary.


See Also

For the full API, component properties, and C++ extension guide:

For related systems:


Get GS_Unit

GS_Unit — Explore this gem on the product page and add it to your project.

3.13.2 - Controllers

How to work with GS_Unit controllers — the possession model, player and AI controller setup, and switching control at runtime.

Controllers are the possession layer of GS_Unit. A controller is the intelligence — human input or AI logic — that owns and drives a unit at any given moment. Because both player and AI controllers share the same possession interface, your unit entities do not need to know what is controlling them, and swapping control at runtime requires no special-casing.

For architecture details, component properties, and extending the system in C++, see the Framework API reference.

Player Controller and Input Reader components in the O3DE Inspector

 

Contents


The Possession Model

Unit Possession Pattern Graph

Breakdown

Every unit has exactly one controller at a time, or no controller at all. Possession is established by calling Possess on the unit and released by calling DePossess. The unit fires UnitPossessed on UnitNotificationBus whenever ownership changes so other systems can react.

| Concept | Description |
|---|---|
| Possession | A controller attaches to a unit. The unit accepts input and movement commands from that controller only. |
| DePossession | The controller releases the unit. The unit halts input processing and enters a neutral state. |
| UnitPossessed event | Fires on UnitNotificationBus (addressed by entity ID) whenever a unit’s controller changes. |
| GetController | Returns the entity ID of the current controller, or an invalid ID if none. |
| GetUniqueName | Returns the string name assigned by the Unit Manager when this unit was spawned. |

Controller Types

GS_Unit ships three controller components that cover the full range of typical game use:

| Component | Purpose |
|---|---|
| GS_UnitControllerComponent | Base class establishing the possession contract. Extend this for custom controllers. |
| GS_PlayerControllerComponent | Human-driven controller. Connects into the Input Data pipeline and routes player intent to the possessed unit. |
| GS_AIControllerComponent | NPC controller. Provides the entry point for AI logic to issue movement commands through the same unit interface as player input. |
Both GS_PlayerControllerComponent and GS_AIControllerComponent extend the base controller, meaning any code that works with the base controller interface works transparently with either type.


Player Controller

GS_PlayerControllerComponent registers itself with the Unit Manager on activation, making it visible to the game’s controller tracking system. It connects to the Input Data pipeline on the possessed unit so that raw input events are routed correctly as soon as possession is established.

Setup

  1. Add GS_PlayerControllerComponent to a controller entity (not the unit entity itself).
  2. Ensure GS_InputDataComponent and GS_PlayerControllerInputReaderComponent are on the unit entity.
  3. Call Possess to attach the controller to the target unit.

ScriptCanvas — Possessing a Unit
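A sketch of player possession, mirroring the AI example below — the entity variables are placeholders:

[Game start / player ready]
    └─► [UnitRequestBus(unitEntityId) → Possess(playerControllerEntityId)]

[UnitNotificationBus(unitEntityId) → UnitPossessed(playerControllerEntityId)]
    └─► [unit now accepts input from the player controller]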


AI Controller

GS_AIControllerComponent gives AI logic the same possession interface as the player controller. An AI behavior system possesses a unit, then issues movement commands through UnitRequestBus or directly drives the unit’s Input Data component to simulate input. The unit processes those commands through the same mover stack it would use for player input.

ScriptCanvas — Handing a Unit to AI

// Called from your behavior tree or AI activation event when the NPC starts acting
[AI behavior activates]
    └─► [UnitRequestBus(unitEntityId) → Possess(aiControllerEntityId)]

[UnitNotificationBus(unitEntityId) → UnitPossessed(aiControllerEntityId)]
    └─► [AI logic begins issuing movement / action commands]

Switching Controllers at Runtime

Because possession is a simple attach/detach operation on the unit, switching from player to AI control (or back) is two calls:

[Trigger: cutscene starts, character incapacitated, etc.]
    └─► [UnitRequestBus(unitEntityId) → DePossess]
            └─► [UnitRequestBus(unitEntityId) → Possess(aiControllerEntityId)]

[Trigger: cutscene ends, character recovers, etc.]
    └─► [UnitRequestBus(unitEntityId) → DePossess]
            └─► [UnitRequestBus(unitEntityId) → Possess(playerControllerEntityId)]

The DePossess call ensures the previous controller’s input pipeline is disconnected before the new controller attaches, preventing stale input state from leaking between ownership changes.


Querying Controller State

| ScriptCanvas Node | Returns | Notes |
|---|---|---|
| GetController (on unit) | EntityId | Current controller entity. Invalid ID means no controller. |
| GetUniqueName (on unit) | string | The name assigned to this unit at spawn time. |

ScriptCanvas
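A sketch of querying the current controller and branching on the result:

[UnitRequestBus(unitEntityId) → GetController]
    └─► EntityId
            ├─► valid   — a controller is attached
            └─► invalid — the unit is currently unpossessed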


Standby and Controllers

When the Unit Manager broadcasts EnterStandby, both player and AI controllers should stop issuing commands. Player controllers automatically halt input reading. AI controllers must be suspended by whatever behavior system drives them. On ExitStandby, normal command flow resumes.

| Event | Controller Action |
|---|---|
| UnitEnteringStandby | Halt input reads, suspend AI commands. |
| UnitExitingStandby | Resume input reads, restart AI commands. |

Both events fire on UnitNotificationBus addressed by unit entity ID.


Quick Reference

| Need | Bus | Method / Event |
|---|---|---|
| Attach a controller to a unit | UnitRequestBus (by ID) | Possess(controllerEntityId) |
| Release a controller from a unit | UnitRequestBus (by ID) | DePossess |
| Know when possession changes | UnitNotificationBus (by ID) | UnitPossessed(controllerEntityId) |
| Get the current controller | UnitRequestBus (by ID) | GetController |
| Get a unit’s name | UnitRequestBus (by ID) | GetUniqueName |
| Know when unit enters standby | UnitNotificationBus (by ID) | UnitEnteringStandby |
| Know when unit exits standby | UnitNotificationBus (by ID) | UnitExitingStandby |

Glossary

| Term | Meaning |
|---|---|
| Possession | The act of a controller attaching to a unit and becoming its active intelligence |
| DePossession | Releasing a controller from a unit, halting its input processing |
| Player Controller | A human-driven controller that routes input from the Input Data pipeline to a unit |
| AI Controller | An NPC controller that issues movement commands through the same unit interface as player input |

For full definitions, see the Glossary.


See Also

For the full API, component properties, and C++ extension guide:

For related systems:


Get GS_Unit

GS_Unit — Explore this gem on the product page and add it to your project.

3.13.3 - Input Data

How to work with the GS_Unit input pipeline — reading player input, converting keyboard and joystick signals into movement vectors, and chaining input reactors.

The Input Data system is the pipeline that converts raw hardware signals into structured intent your movement code can consume. It sits between the active input profile and the mover stack, so controllers, movement components, and action systems all read from a single consistent source rather than polling hardware directly. Swapping input profiles at runtime — for remapping, controller switching, or accessibility options — requires no change to any downstream component.

For architecture details, component properties, and extending the system in C++, see the Framework API reference.

Input Reactor component in the O3DE Inspector

 

Contents


Pipeline Overview

Unit Input Handling Pattern Graph

Breakdown

The input pipeline has three stages. Each stage is a separate component on the unit entity, and they run in order every frame:

| Stage | Component | What It Does |
|---|---|---|
| 1 — Read | GS_PlayerControllerInputReaderComponent | Reads raw input events from the active input profile and writes them into GS_InputDataComponent. |
| 2 — Store | GS_InputDataComponent | Holds the current frame’s input state — button presses, axis values — as structured data. |
| 3 — React | Reactor components | Read from GS_InputDataComponent and produce intent: movement vectors, action triggers, etc. |
All reactor components downstream of the store stage read from the same GS_InputDataComponent, so there is no duplicated hardware polling and no risk of two reactors seeing different input states for the same frame.
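The three stages can be sketched as a per-frame flow (the two reactors shown are the built-in movement reactors; yours may differ):

[Active input profile — raw events]
    └─► [GS_PlayerControllerInputReaderComponent]          — 1. Read
            └─► [GS_InputDataComponent]                    — 2. Store
                    ├─► [KeyboardMovement_InputReactorComponent]  — 3. React
                    └─► [JoyAxisMovement_AxisReactorComponent]    — 3. React
                            └─► movement vectors / action intent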


Input Data Component

GS_InputDataComponent is the shared state store. It does not read hardware — it only holds the values written by the Reader and consumed by Reactors.

Add it to any unit entity that needs to process input. A unit should have exactly one GS_InputDataComponent. All input reactors on the same entity read from it automatically.


Input Reader

GS_PlayerControllerInputReaderComponent connects the unit to the game’s active input profile. It receives input events from the profile and writes the resulting state into the unit’s GS_InputDataComponent each frame.

When the player controller possesses the unit, the reader activates. When the unit is depossessed or control is handed to an AI controller, the reader goes silent and the Input Data component retains its last-written state until overwritten.

Setup

Add GS_PlayerControllerInputReaderComponent to the unit entity alongside GS_InputDataComponent. No additional configuration is required — it picks up the active input profile automatically.

Script Canvas - Enabling and Disabling Input Groups

Input Groups handling nodes in the O3DE Script Canvas


Input Reactors

Reactors are the components that transform stored input state into useful intent values. Each reactor handles one concern:

| Component | Input Source | Output |
|---|---|---|
| GS_InputReactorComponent | Base class — reads from GS_InputDataComponent. | Override to produce custom intent. |
| GS_InputAxisReactorComponent | Axis values from GS_InputDataComponent. | Processed axis output (scaled, deadzoned). |
| KeyboardMovement_InputReactorComponent | Keyboard directional buttons (WASD / arrow keys). | Normalized 3D movement vector. |
| JoyAxisMovement_AxisReactorComponent | Joystick axis pair from GS_InputDataComponent. | Scaled, deadzoned 3D movement vector. |

Add as many reactors as your unit needs. A typical ground character has both KeyboardMovement_InputReactorComponent and JoyAxisMovement_AxisReactorComponent active so it responds to either keyboard or gamepad input without requiring separate prefabs.


Keyboard Movement Reactor

KeyboardMovement_InputReactorComponent reads the four directional button states from GS_InputDataComponent and produces a normalized movement vector. The vector direction is relative to the unit’s forward axis, so rotating the unit automatically adjusts the movement direction without any additional calculation.

The reactor writes the resulting vector into a movement data field that the active mover reads each frame.

ScriptCanvas — Reading the Movement Vector

You generally do not need to read the movement vector directly — the mover components consume it automatically. However, if you need to inspect or override it:

// Use only if you need to inspect or override the movement vector — movers consume it automatically
[On Tick]
    └─► [GS_InputDataComponent → GetMovementVector]
            └─► Vector3 — current frame's movement intent

Joystick Axis Reactor

JoyAxisMovement_AxisReactorComponent reads a pair of axis values (horizontal and vertical) from GS_InputDataComponent and applies deadzone filtering and magnitude scaling before writing the resulting vector. This prevents stick drift at rest and allows variable-speed movement when the stick is partially deflected.

| Setting | Effect |
|---|---|
| Deadzone | Axis values below this threshold are treated as zero. |
| Scale | Multiplier applied after deadzone removal. |
| Axis mapping | Which input profile axis pair to read (left stick, right stick, etc.). |
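A common way to implement the deadzone-and-scale filtering described above is sketched below. This is an illustrative standalone function, not the actual JoyAxisMovement_AxisReactorComponent implementation; the remapping step is one typical choice that avoids a sudden jump at the deadzone threshold.

```cpp
#include <cmath>
#include <algorithm>

// Illustrative sketch of deadzone + scale axis filtering. Values inside
// the deadzone collapse to zero; the remaining range is remapped onto
// [0, 1] so output still reaches full deflection, then multiplied by
// the scale. The input sign is preserved.
float FilterAxis(float raw, float deadzone, float scale)
{
    const float magnitude = std::fabs(raw);
    if (magnitude < deadzone)
    {
        return 0.0f;
    }
    // Remap [deadzone, 1] onto [0, 1] to avoid a jump at the threshold.
    const float remapped = (magnitude - deadzone) / (1.0f - deadzone);
    return std::copysign(std::min(remapped, 1.0f) * scale, raw);
}
```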

Adding Custom Reactors

To handle input not covered by the built-in reactors — jumping, sprinting, interaction, ability activation — extend GS_InputReactorComponent or GS_InputAxisReactorComponent in C++. Your reactor reads from GS_InputDataComponent the same way the built-in ones do, so it automatically benefits from any input profile that writes to those fields.

For the C++ extension guide, see the Framework API reference.


Quick Reference

| Need | Component | Notes |
|---|---|---|
| Store input state on a unit | GS_InputDataComponent | Required on every unit that processes input. |
| Read from active input profile | GS_PlayerControllerInputReaderComponent | Activates automatically when player controller possesses. |
| Keyboard → movement vector | KeyboardMovement_InputReactorComponent | Reads WASD / arrow keys from input data. |
| Joystick → movement vector | JoyAxisMovement_AxisReactorComponent | Reads axis pair, applies deadzone and scale. |
| Custom button or axis handler | Extend GS_InputReactorComponent (C++) | Reads from GS_InputDataComponent same as built-ins. |
| Inspect movement vector | GS_InputDataComponent → GetMovementVector | Vector3, current frame’s movement intent. |

Glossary

| Term | Meaning |
|---|---|
| Input Data Component | The shared state store that holds the current frame’s input values for a unit |
| Input Reader | The component that reads raw input events from the active input profile and writes them into the Input Data Component |
| Input Reactor | A component that reads from the Input Data Component and produces intent values (movement vectors, action triggers) |
| Movement Vector | A normalized 3D direction produced by input reactors and consumed by movers |

For full definitions, see the Glossary.


See Also

For the full API, component properties, and C++ extension guide:

For related systems:


Get GS_Unit

GS_Unit — Explore this gem on the product page and add it to your project.

3.13.4 - Units

What makes an entity a unit — the GS_UnitComponent, entity configuration, collision setup, possession, standby, and movement.

A “unit” in GS_Play is any entity that can be possessed by a controller and driven through gameplay. The GS_UnitComponent is the marker that transforms an ordinary entity into a unit — it provides the possession interface, unique naming, standby awareness, and automatic registration with the Unit Manager.

Units are the characters, vehicles, creatures, or any other controllable actors in your game. They do not contain decision-making logic themselves — that comes from the controller that possesses them. A unit provides the body: movement, collision, and visuals. The controller provides the brain: player input or AI logic.

For architecture details, component properties, and extending the system in C++, see the Framework API reference.

Unit component in the O3DE Inspector

 

Contents


How Units Work

Registration

When a GS_UnitComponent activates, it registers itself with the Unit Manager. The manager tracks all active units and responds to CheckIsUnit queries. When the component deactivates, it unregisters automatically.

Possession

Units are possessed by controllers through the UnitRequestBus:

  1. A controller calls Possess(controllerEntityId) on the unit.
  2. The unit stores the controller reference and broadcasts UnitPossessed on UnitNotificationBus.
  3. The controller can now drive the unit’s subsystems (movement, actions, etc.).
  4. Calling DePossess() clears the controller reference and releases the unit.
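The handshake above can be sketched as a minimal standalone structure. The types and callbacks below are illustrative stand-ins for O3DE entity IDs and UnitNotificationBus, not the real GS_Play classes; the point is the pattern of storing the controller reference and broadcasting on possession.

```cpp
#include <cstdint>
#include <functional>

// Illustrative stand-ins for O3DE's entity ID and notification bus.
using EntityId = std::uint64_t;
constexpr EntityId InvalidEntityId = 0;

struct Unit
{
    EntityId controller = InvalidEntityId;
    std::function<void(EntityId)> onPossessed;  // stands in for UnitPossessed
    std::function<void()> onDePossessed;

    // A controller claims the unit; rejected if already possessed.
    bool Possess(EntityId controllerId)
    {
        if (controller != InvalidEntityId)
        {
            return false;
        }
        controller = controllerId;
        if (onPossessed) onPossessed(controllerId);
        return true;
    }

    // Clears the controller reference and releases the unit.
    void DePossess()
    {
        controller = InvalidEntityId;
        if (onDePossessed) onDePossessed();
    }
};
```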

Standby

Units participate in the standby system for clean level transitions. When the Unit Manager signals standby, the unit broadcasts UnitEnteringStandby on its notification bus. Child components — movers, input reactors, and others — listen for this signal and pause their processing. UnitExitingStandby reverses the process.


Entity Configuration

A minimal unit entity requires:

  1. GS_UnitComponent — Registers the entity as a unit and provides the possession interface.
  2. Movement components — At least a mover and optionally a grounder for ground detection.
  3. PhysX collider — For physics interaction and ground detection raycasts.

A fully featured unit entity typically includes:

| Component | Role |
|---|---|
| GS_UnitComponent | Marks the entity as a unit; handles possession and standby. |
| GS_MoverContextComponent | Coordinates movers and grounders; holds the Movement Profile. |
| A mover (e.g. GS_3DSlideMoverComponent) | Defines how the unit moves each frame. |
| A grounder (e.g. GS_PhysicsRayGrounderComponent) | Detects surface contact and reports ground state. |
| PhysX Rigid Body + Collider | Physics presence in the world. |
| Mesh or Actor component | Visual representation. |

 

Entity Arrangement

Unit entity setup in the O3DE Editor

Entity arrangement


Collision Setup

Units require properly configured PhysX collision layers to interact with the environment and other units. If you have not set up collision layers yet, refer to the Simple Project Setup guide.

Typical collision layer assignments:

| Layer | Purpose |
|---|---|
| Unit layer | The unit’s own collider. Collides with environment geometry and other units. |
| Ground detection layer | Used by grounder raycasts. Collides with terrain and walkable surfaces only. |

Movement

The Movement feature set is a composable stack where each component handles one concern of character locomotion. Movers define core behavior (free, strafe, grid, slide), the MoverContext manages movement state and selects the active mover, Grounders report ground state, and Influence components apply external forces like wind or currents.

Movement API


Quick Reference

| Need | Bus | Method |
|---|---|---|
| Assign a controller to this unit | UnitRequestBus (ById) | Possess(controllerEntityId) |
| Release the current controller | UnitRequestBus (ById) | DePossess() |
| Get the possessing controller | UnitRequestBus (ById) | GetController() |
| Get the unit’s unique name | UnitRequestBus (ById) | GetUniqueName() |

 

| Event | Bus | Fired When |
|---|---|---|
| UnitPossessed | UnitNotificationBus | A controller takes possession of this unit. |
| UnitEnteringStandby | UnitNotificationBus | The unit is entering standby (level transition). |
| UnitExitingStandby | UnitNotificationBus | The unit is exiting standby and resuming. |

Glossary

| Term | Meaning |
|---|---|
| Unit | Any entity with a GS_UnitComponent — the controllable body in gameplay |
| Controller | The intelligence (player or AI) that possesses and drives a unit |
| Possession | The act of a controller claiming ownership of a unit via Possess() |
| Standby | A paused state units enter during level transitions, coordinated by the Unit Manager |
| Unique Name | An identifier generated at activation used to look up a unit by name |

For full definitions, see the Glossary.


See Also

For the full API, component properties, and C++ extension guide:

For related systems:


Get GS_Unit

GS_Unit — Explore this gem on the product page and add it to your project.

3.13.4.1 - Mover Context

Movement state coordinator for GS_Unit — manages movement modes, context states, input axis transformation, and movement profile priority.

GS_MoverContextComponent is the coordinator that sits above the movers and owns the movement state for a unit. It does not move the character itself — it decides which mover runs, transforms input into the correct movement vector, and exposes state to every other system that needs to know what the character is doing.

Add exactly one GS_MoverContextComponent to any unit that uses movers.

For architecture details, component properties, and extending the system in C++, see the Framework API: Mover Context.

Mover Context component in the O3DE Inspector

 

Contents


Movement Modes

The Mover Context uses three independent mode tracks — movement, rotation, and grounding — each holding a named string. Only the mover or grounder whose assigned mode name matches the active mode on that track will run. All others stay dormant.

| Track | Controls | Example Modes |
|---|---|---|
| Movement | Which mover applies velocity each frame | "Free", "Slide" |
| Rotation | Which component rotates the character | "Free", "Locked" |
| Grounding | Which grounder runs surface detection | "Free", "Disabled" |

The three tracks are independent — changing the movement mode does not reset rotation or grounding.

Reverting Modes

Every mode change stores the previous value. Calling RevertToLastMovementMode, RevertToLastRotationMode, or RevertToLastGroundingMode restores it. This is how the Slide mover hands control back to Free movement after a slope is cleared — without needing to know what mode was active before.
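The single-level revert described above can be sketched as a tiny standalone structure. The names below are illustrative, not the GS_Play API; each change remembers the previous value so a revert restores it without the caller knowing what it was.

```cpp
#include <string>
#include <utility>

// Illustrative sketch of one mode track with single-level revert.
struct ModeTrack
{
    std::string current = "Free";
    std::string previous = "Free";

    // Changing a mode stores the old value for a later revert.
    void Change(std::string mode)
    {
        previous = std::move(current);
        current = std::move(mode);
    }

    // Restores the previous mode (and remembers the one being left).
    void RevertToLast() { std::swap(current, previous); }
};
```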


Context States

Context states are named key/value pairs stored on the Mover Context. Any component can write or read them via MoverContextRequestBus. They provide a lightweight shared blackboard for movement systems to coordinate without direct dependencies.

| Built-in State | Values | Meaning |
|---|---|---|
| "grounding" | 0 — Falling | No surface contact. |
| | 1 — Grounded | Stable surface contact. |
| | 2 — Sliding | Surface too steep to stand. Slide mover will activate. |
| "StopMovement" | 1 — Active | Halts all mover velocity processing for this frame. |

The grounder writes "grounding" each frame. The Slide mover writes it back to 0 or 1 when the slope clears. Your own companion components can write custom states in the same way.
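As a standalone sketch of the blackboard pattern, the structure below stores named integer states and fires a change callback, mirroring the roles of SetMoverState / GetMoverState and MoverStateChanged. The types and callback are illustrative stand-ins, not the real bus interfaces.

```cpp
#include <string>
#include <unordered_map>
#include <functional>

// Illustrative sketch of the context-state blackboard: named integer
// states plus a change notification hook.
struct ContextStates
{
    std::unordered_map<std::string, int> states;
    std::function<void(const std::string&, int)> onChanged;

    void Set(const std::string& name, int value)
    {
        states[name] = value;
        if (onChanged) onChanged(name, value);  // stands in for MoverStateChanged
    }

    int Get(const std::string& name) const
    {
        auto it = states.find(name);
        return it != states.end() ? it->second : 0;  // unset states read as 0
    }
};
```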


Input Axis Transformation

The Mover Context processes raw input through two transformations before handing it to the active mover.

| Step | Method | What It Does |
|---|---|---|
| Raw input | SetMoveInputAxis | Written by the input reactor. Screen-space direction. |
| 1 — Camera-relative | ModifyInputAxis() | Rotates the axis to align with the active camera’s facing. |
| 2 — Ground-projected | GroundInputAxis() | Projects onto the contact surface to prevent skating above or below terrain. |

Movers read the ground-projected axis. Both intermediate values are available via bus if a component needs an earlier stage.
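The ground-projection stage is standard vector math: subtracting the component of the axis along the ground normal leaves a vector tangent to the contact plane, so the character neither digs into nor lifts off a slope. The sketch below is illustrative math, not the GS_Play source.

```cpp
struct Vec3 { float x, y, z; };

// Illustrative sketch of stage 2 above: project a movement axis onto the
// plane defined by a unit-length ground normal. The result is tangent to
// the surface (its dot product with the normal is zero).
Vec3 ProjectOntoPlane(Vec3 axis, Vec3 normal)
{
    const float dot = axis.x * normal.x + axis.y * normal.y + axis.z * normal.z;
    return {axis.x - dot * normal.x,
            axis.y - dot * normal.y,
            axis.z - dot * normal.z};
}
```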


Movement Profile Priority

The Mover Context resolves which GS_UnitMovementProfile governs the unit’s locomotion parameters at runtime using a priority stack:

  1. Influence fields — Any MovementInfluenceFieldComponent or GlobalMovementInfluenceComponent that has added a profile to the unit. Highest priority. Multiple influences are stacked additively.
  2. Global profile — The scene-level default from GlobalMovementRequestBus. Applied when no influence overrides are active.
  3. Default profile — The profile assigned directly to the GS_MoverContextComponent. Used as the final fallback.

To change a unit’s movement feel in a specific area, place a MovementInfluenceFieldComponent with a volume shape and assign an alternate profile to it. The unit’s own component property does not need to change.
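The resolution order above can be sketched as a small standalone resolver. The types and names below are illustrative, not the actual GS_Play API (which accumulates additive priority values internally); the sketch keeps the essential rule that influence overrides beat the global profile, which beats the component's default.

```cpp
#include <string>
#include <vector>
#include <optional>

// Illustrative influence entry: a profile plus its accumulated priority.
struct Influence
{
    std::string profile;
    int priority;
};

// Illustrative sketch of the three-level profile resolution.
struct ProfileResolver
{
    std::vector<Influence> influences;         // pushed by influence fields
    std::optional<std::string> globalProfile;  // scene-level default
    std::string defaultProfile;                // assigned on the Mover Context

    std::string Resolve() const
    {
        if (!influences.empty())
        {
            // Highest accumulated priority wins among active influences.
            const Influence* best = &influences.front();
            for (const auto& inf : influences)
            {
                if (inf.priority > best->priority) best = &inf;
            }
            return best->profile;
        }
        if (globalProfile) return *globalProfile;
        return defaultProfile;
    }
};
```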


Script Canvas Examples

Changing movement mode:

Script Canvas nodes for changing movement mode

Reverting to the previous movement mode:

Script Canvas nodes for reverting movement mode

Setting a context state flag:

Script Canvas nodes for setting context state


Quick Reference

| Need | Bus | Method |
|---|---|---|
| Change active movement mode | MoverContextRequestBus | ChangeMovementMode(modeName) |
| Restore previous movement mode | MoverContextRequestBus | RevertToLastMovementMode() |
| Read the ground-projected input | MoverContextRequestBus | GetGroundMoveInputAxis() |
| Write a context state | MoverContextRequestBus | SetMoverState(name, value) |
| Read a context state | MoverContextRequestBus | GetMoverState(name) |
| Push a movement profile override | MoverContextRequestBus | AddMovementProfile(influencer, profile) |
| Remove a profile override | MoverContextRequestBus | RemoveMovementProfile(influencer) |
| Listen for mode changes | MoverContextNotificationBus | MovementModeChanged, GroundingModeChanged |
| Listen for state changes | MoverContextNotificationBus | MoverStateChanged |

Glossary

| Term | Meaning |
|---|---|
| Mover Context | The coordinator component that owns movement state, selects the active mover, and transforms input each frame |
| Movement Mode | A named string on a mode track that determines which mover or grounder is active |
| Context State | A named key/value pair on the Mover Context used as a shared blackboard between movement components |
| Input Axis | The movement direction written by input reactors, transformed through camera-relative and ground-projected stages |
| Movement Profile | A data asset storing locomotion parameters (speed, acceleration, jump force) for a unit |

For full definitions, see the Glossary.


See Also

For the full API, component properties, and C++ extension guide:

For related systems:


Get GS_Unit

GS_Unit — Explore this gem on the product page and add it to your project.

3.13.4.2 - Movement

How to work with GS_Unit movement — mover types, mover context, grounders, movement profiles, and influence fields.

The Movement system is a composable stack of components that each handle one concern of character locomotion. Movers define the core behavior, grounders report surface contact, the Mover Context manages which mover is active and what movement state the character is in, influence fields apply external forces, and Movement Profiles store the parameters that govern all of it. You assemble what you need — most characters use two or three components — and replace individual pieces without touching anything else.

For architecture details, component properties, and extending the system in C++, see the Framework API reference.

 

Contents


How it Works

Movers & Grounders Pattern Graph

Breakdown

Movers and grounders coexist as multiple components on a unit, swapping activation as the unit's state changes. When a mover is active, it freely processes the unit's movement according to its behavior. At any moment, through internal or external forces, the Movement, Rotation, or Grounding mode can change, which deactivates the old mover and activates the new one to continue controlling the unit.

Input
    ↓  InputDataNotificationBus::InputStateChanged
Input Reactor Components
    ↓  MoverContextRequestBus::SetMoveInputAxis
GS_MoverContextComponent
    ↓  ModifyInputAxis() → camera-relative
    ↓  GroundInputAxis() → ground-projected
    ↓  MoverContextNotificationBus::MovementModeChanged("Free")
GS_3DFreeMoverComponent  [active when mode == "Free"]
    ↓  AccelerationSpringDamper → rigidBody->SetLinearVelocity
GS_PhysicsRayGrounderComponent  [active when grounding mode == "Free"]
    ↓  MoverContextRequestBus::SetGroundNormal / SetContextState("grounding", ...)
    ↓  MoverContextRequestBus::ChangeMovementMode("Slide")  ← when slope too steep
GS_3DSlideMoverComponent  [active when mode == "Slide"]

The Slide mover activates when the unit is standing on a slope that is too steep. It takes control of the unit, slides it down the hill, then restores the previous movement behaviour.


Mover Types

Mover component in the O3DE Inspector

Each mover defines one locomotion behavior. A unit has one active mover at a time, selected by the Mover Context. All movers read from the unit’s movement vector (produced by the input pipeline) and translate it into entity movement each frame.

| Component | Best For | How It Moves |
|---|---|---|
| GS_3DFreeMoverComponent | Flying, swimming, zero-gravity | Unconstrained motion in all three axes. No ground contact required. |
| GS_3DSlideMoverComponent | Ground-bound characters on uneven terrain | Slides the character along the contact surface, maintaining ground contact smoothly. |
| GS_PhysicsMoverComponent | Characters needing full collision response | Drives an O3DE physics rigid body. Forces and impulses apply normally. |

Choosing a Mover

  • Use GS_3DFreeMoverComponent for any character that needs to move through air or water without ground contact.
  • Use GS_3DSlideMoverComponent for most ground-bound characters. It handles slopes and steps without requiring physics simulation.
  • Use GS_PhysicsMoverComponent when the character must interact physically with the world — pushing objects, being knocked back by forces, or responding to physics joints.

Mover Context

GS_MoverContextComponent sits above the movers and coordinates everything: it selects which mover runs each frame, owns the movement mode tracks, transforms raw input into camera-relative and ground-projected vectors, and manages the movement profile priority stack.

Add exactly one to any unit that uses movers.

Mover Context →


Grounders

Grounder component in the O3DE Inspector

Grounders detect whether the character has contact with a surface and report that information to the Mover Context. Without a grounder, the Context cannot distinguish ground-bound from airborne states.

| Component | Method | Notes |
|---|---|---|
| GS_GrounderComponent | Base class — extend for custom detection logic. | Not used directly. |
| GS_PhysicsRayGrounderComponent | Fires a downward raycast each frame. | Suitable for most characters. Configurable ray length and collision layer. |

Add one grounder to a unit that uses GS_3DSlideMoverComponent or any mover that needs ground awareness. GS_3DFreeMoverComponent does not require a grounder because it never needs ground contact.
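The decision a ray grounder feeds into the Mover Context can be sketched as a slope classification: compare the angle between the hit surface's normal and world-up against a maximum walkable slope. The constants and names below are illustrative, not the GS_PhysicsRayGrounderComponent internals.

```cpp
#include <cmath>

// Illustrative state values matching the "grounding" context state above.
constexpr int kFalling = 0;
constexpr int kGrounded = 1;
constexpr int kSliding = 2;

// Classify a raycast result. normalZ is the Z component of the unit-length
// surface normal, i.e. the cosine of the slope angle against world-up.
int ClassifyGround(bool rayHit, float normalZ, float maxSlopeDegrees)
{
    if (!rayHit)
    {
        return kFalling;  // nothing below the character
    }
    const float slopeDegrees = std::acos(normalZ) * 180.0f / 3.14159265f;
    return slopeDegrees <= maxSlopeDegrees ? kGrounded : kSliding;
}
```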


Movement Profiles

Movement Profile asset in the O3DE Asset Editor

GS_UnitMovementProfile is a data asset that stores all locomotion parameters for a unit configuration: walk speed, run speed, acceleration, deceleration, air control, jump force, and any mover-specific values. The Mover Context component holds a reference to a profile asset, so multiple prefabs can share the same profile or each have their own without duplicating inline values.

Creating a Profile

  1. In the O3DE Asset Browser, right-click and select Create Asset → GS_UnitMovementProfile.
  2. Configure the parameters in the asset editor.
  3. Assign the asset to the Movement Profile slot on GS_MoverContextComponent.

Changing a profile asset at runtime allows you to alter all locomotion parameters at once — for example, switching a character between normal and encumbered movement profiles without touching individual component properties.

Reading Movement Profile Data - ScriptCanvas


Influence Fields

Movement Influence component in the O3DE Inspector

Influence components dynamically apply priority values that change a unit's dominant movement profile. Priority values are additive: a unit accumulates priority from all active sources and evaluates the total to determine the acting movement profile.

| Component | Scope | Typical Use |
|---|---|---|
| GlobalMovementInfluenceComponent | All units globally | Standard movement for the region. |
| MovementInfluenceFieldComponent | Units inside a spatial volume | Open fields, narrow channels, indoor-to-outdoor transitions. |

GlobalMovementInfluenceComponent is placed once in the level, usually on the Stage Data entity. MovementInfluenceFieldComponent uses a shape component to define its volume — place one wherever you need to change movement priority.


Assembly Guide

A typical ground-bound player character uses this component combination:

| Component | Role |
|---|---|
| GS_InputDataComponent | Holds input state — movement vector written here each frame. |
| GS_PlayerControllerInputReaderComponent | Reads input profile and fills GS_InputDataComponent. |
| KeyboardMovement_InputReactorComponent | Converts keyboard input to movement vector. |
| JoyAxisMovement_AxisReactorComponent | Converts joystick input to movement vector. |
| GS_PhysicsRayGrounderComponent | Detects ground contact below the character. |
| GS_MoverContextComponent | Coordinates grounders and movers, holds the Movement Profile. |
| GS_3DSlideMoverComponent | Moves the character along the contact surface. |

A flying or swimming character replaces the grounder and slide mover with GS_3DFreeMoverComponent and removes the grounder entirely.


Quick Reference

| Need | Component | Notes |
|---|---|---|
| Free 3D movement (fly / swim) | GS_3DFreeMoverComponent | No ground contact required. |
| Surface-hugging ground movement | GS_3DSlideMoverComponent | Smooth contact on slopes and steps. |
| Physics-simulated movement | GS_PhysicsMoverComponent | Full rigid body collision response. |
| Coordinate movers and state | GS_MoverContextComponent | Required on any unit with movers. |
| Raycast ground detection | GS_PhysicsRayGrounderComponent | Used with slide and physics movers. |
| Shared locomotion parameters | GS_UnitMovementProfile (asset) | Referenced by GS_MoverContextComponent. |
| Global force on all units | GlobalMovementInfluenceComponent | Standard movement values. |
| Localized force in a volume | MovementInfluenceFieldComponent | Narrow paths, rough terrain, indoor-to-outdoor transitions. |

Glossary

| Term | Meaning |
|---|---|
| Mover | A component that defines one locomotion behavior (free flight, surface sliding, physics-driven) |
| Mover Context | The coordinator that tracks movement state and selects the active mover each frame |
| Grounder | A component that detects surface contact and reports ground state to the Mover Context |
| Movement Profile | A data asset storing all locomotion parameters (speed, acceleration, jump force) for a unit |
| Influence Field | A component that applies additive priority to a unit’s active movement profile |

For full definitions, see the Glossary.


See Also

For the full API, component properties, and C++ extension guide:

For related systems:


Get GS_Unit

GS_Unit — Explore this gem on the product page and add it to your project.

4 - Framework API

Component references, EBus interfaces, and extension guides for the full GS_Play framework.

This section contains the full technical reference for every GS_Play gem — component properties, EBus request and notification interfaces, extension patterns, internal architecture, and code examples.

If you are looking for a scripting-level introduction or editor setup guides, see The Basics section first.


How This Section Is Organized

Each gem section follows a layered structure:

Gem overview page — Technical summary of the gem’s architecture, a table of feature sets with component listings, and installation notes.

Feature set pages — One page per sub-system (e.g., GS_Save, GS_StageManager). Each covers:

  • Architectural overview and core design pattern.
  • Component and class index with purpose descriptions.
  • Extension and customization guidance.
  • Links to The Basics for editor-level usage.

Class and component pages — One page per major class or component. Each covers:

  • Inspector properties and configuration.
  • EBus request and notification interfaces with method signatures.
  • Virtual methods for subclassing.
  • Code examples for common use cases.

Pages in this section assume familiarity with O3DE concepts (components, EBuses, SerializeContext). If you need a gentler introduction, start with the corresponding page in The Basics — every Framework API page links back to its Basics counterpart.


Sections

4.1 - GS_Core

The foundation gem for the GS_Play framework — game lifecycle, save system, stage management, input, actions, motion animation system, and utility libraries.

GS_Core is the required foundation for every GS_Play project. All other GS gems depend on it. It provides game lifecycle management (startup, shutdown, standby), persistence (save/load), level loading, input handling, triggerable actions, and a shared utility library.

For usage guides and setup examples, see The Basics: GS_Core.

 

Contents


GS_Managers

The lifecycle and startup management system. The Game Manager is a singleton placed in every level — it spawns all registered manager prefabs, coordinates their two-stage initialization, and provides global game navigation and standby mode.

All built-in GS_Play managers (Save, Stage, Options, UI, Unit, Camera, Audio) and any custom managers you create extend the Manager base class and plug into this system automatically.

| Component | Purpose |
|---|---|
| Game Manager | Top-level lifecycle controller. Spawns managers, handles navigation, standby, and debug mode. |
| Manager (Base) | Base class for all managers. Handles two-stage initialization automatically. |

GS_Managers API


GS_Save

The persistence system. The Save Manager orchestrates save and load operations across the project. Savers are per-entity components that serialize component state. The Record Keeper stores flat key-value progression records independently of the Saver system.

| Component | Purpose |
|---|---|
| Save Manager | Coordinates all save and load operations. |
| Savers | Per-entity components that serialize transform and physics state. |
| Record Keeper | Key-value progression store (flags, counters, unlock states). |

GS_Save API


GS_StageManager

The level loading and navigation system. The Stage Manager owns the ordered stage list for the project and handles all level transitions. Stage Data components are placed in each level as anchors and spin-up controllers for that level’s systems.

| Component | Purpose |
|---|---|
| Stage Manager | Manages stage list and handles transition requests. |
| Stage Data | Per-level anchor, spawn point, and activation controller. |

GS_StageManager API


GS_Options

Options and input profile management. The Options Manager persists player preferences. Input Profiles are data assets that hold input bindings, swappable at runtime.

| Component | Purpose |
|---|---|
| Options Manager | Persists and retrieves player option data. |
| Input Profiles | Data asset for input binding mappings. |

GS_Options API


Systems

Core framework systems used across multiple gems: the GS_Motion track-based animation engine and the GS_Actions triggerable behavior system.

| System | Purpose |
|---|---|
| GS_Motion | Track-based animation and tween engine — abstract base classes extended by domain gems. |
| GS_Actions | Triggerable, composable, data-driven behaviors attachable to any entity. |

Systems API


Utilities

A general-purpose library of components and math helpers.

| Area | Contents |
|---|---|
| Physics | Physics Trigger Volume |
| Easing Curves | 40+ curve types (Linear, Quad, Cubic, Sine, Expo, Circ, Back, Elastic, Bounce) |
| Spring Dampers | 15+ spring functions including Simple, Acceleration, Double, Timed, Quaternion |
| Gradients | Color, float, and Vector2 gradients |
| Entity Helpers | Entity lookup by name |
| Random | Weighted random selection |
| Splines | Closest point, fraction, and local/world conversion |
| Angle Helpers | Yaw, quaternion from direction, section-by-angle mapping |
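To give a sense of what the easing-curve family looks like, here is a standard cubic ease-out, which starts fast and decelerates toward the end. This is an illustrative implementation of the well-known curve shape, not the GS_Play utility source.

```cpp
// Illustrative cubic ease-out: maps t in [0, 1] to a curve that starts
// fast and decelerates smoothly toward 1.
float EaseOutCubic(float t)
{
    const float u = 1.0f - t;
    return 1.0f - u * u * u;
}
```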

Utilities API


Installation

GS_Core is required by all GS_Play projects. Add it to your project before any other GS gem.

  1. Follow the Simple Project Setup guide for full walkthrough.
  2. Configure collision layers and physics per the Setup Environment guide.
  3. Create a Game Manager prefab and place it in every level.
  4. Add the managers you need to the Game Manager’s Startup Managers list.

See Also

For conceptual overviews and usage guides:

For related resources:


Get GS_Core

GS_Core — Explore this gem on the product page and add it to your project.

4.1.1 - GS_Managers

The game lifecycle management system — startup sequencing, systemic navigation, and the extensible manager pattern.

The Managers system is the backbone of every GS_Play project. It provides a controlled startup sequence that initializes game systems in the correct order, global access to those systems via EBus, and systemic navigation (New Game, Load Game, Quit) from a single point of control.

Every GS_Play gem that has a manager component (Save, Stage, UI, Unit, Camera, Audio, etc.) plugs into this system automatically.

For usage guides and setup examples, see The Basics: GS_Core.

 

Contents


Architecture

Game Manager Startup Pattern Graph

Breakdown

When the project starts, the Game Manager runs three stages before the game is considered ready:

| Stage | Broadcast Event | What It Means |
|---|---|---|
| 1 — Initialize | (internal) | Each manager is spawned. They activate, then report ready. |
| 2 — Setup | OnSetupManagers | Setup stage. Now safe to query other managers. |
| 3 — Complete | OnStartupComplete | Last stage. Everything is ready. Do any last-minute work; now safe to begin gameplay. |

For most scripts, you only need OnStartupComplete. Wait for this event before doing anything that depends on the managers being completely set up.

E Indicates extensible classes and methods.

Patterns - Complete list of system patterns used in GS_Play.


Components

| Component | Purpose | Reference |
|---|---|---|
| GS_GameManagerComponent | Top-level lifecycle controller. Spawns managers, handles New Game / Load / Quit / Standby. | Game Manager |
| GS_ManagerComponent | Base class for all game system managers. Handles the two-stage init pattern automatically. | Manager |

Quick Start

For step-by-step setup, see Game Manager — Setup and Manager — Setup.

For creating custom managers, see Manager — Extending the Manager Class.


See Also

For conceptual overviews and usage guides:

For component references:

For related resources:


Get GS_Core

GS_Core — Explore this gem on the product page and add it to your project.

4.1.1.1 - Game Manager

The top-level game lifecycle controller — startup sequencing, systemic navigation, standby mode, and debug support.

The Game Manager is the root component of every GS_Play project. It controls the game lifecycle from startup through shutdown: spawning and initializing all Manager components in the correct order, providing systemic navigation (New Game, Load Game, Return to Title, Quit), and coordinating standby mode for pausing gameplay during level transitions or other blocking operations.

Every level in your project should contain the same Game Manager prefab. In debug mode it stays in the current level on start; in release mode it navigates to your title stage.

For usage guides and setup examples, see The Basics: GS_Core.

Game Manager component in the O3DE Inspector

The Game Manager component in the Entity Inspector, with added manager prefabs in the Startup Managers list.

Contents


Design Intent

Extend the Game Manager when you need project-wide changes to startup ordering, custom post-startup logic (analytics sessions, online service initialization), or additional global lifecycle events. For most projects, the built-in component is sufficient as-is. Do not move gameplay logic here — the Game Manager orchestrates systems, it does not run game rules.


Dependencies & Interactions

| Coordinates With | Role |
|---|---|
| GS_SaveManager | Owns all save/load file operations triggered by NewGame, ContinueGame, LoadGame, SaveAndExitGame |
| GS_StageManager | Owns level navigation during NewGame, LoadGame, and ReturnToTitle |
| GS_ManagerComponent | Base class all spawned managers extend |
| All other GS gem managers | Any manager in the Startup Managers list registers automatically |

Setup

Game Manager Entity

The Game Manager wrapper entity set as Editor-Only inside Prefab Edit Mode.

  1. Create an entity and attach the GS_GameManagerComponent (or your extended version).
  2. Turn the entity into a prefab.
  3. Enter prefab edit mode. Set the wrapper entity (the parent of your Game Manager entity) to Editor Only. Save the prefab.
  4. Create Manager prefabs for the systems you need (Save, Stage, Options, or custom managers).
  5. Add each Manager .spawnable to the Game Manager’s Startup Managers list in the Entity Inspector.
  6. Push your overrides into the prefab to propagate across all levels.

Place your Game Manager prefab at the top of every level in the Entity Outliner.

The premade manager prefabs for each GS_Play gem are located in that gem’s Assets/Prefabs directory.


Core Concepts

Startup Sequence

When the project starts, the Game Manager executes a three-stage startup:

  1. Spawn Managers — The Game Manager instantiates every prefab in its Startup Managers list. Each manager component runs its Activate() and reports back via OnRegisterManagerInit().
  2. Startup Managers — Once all managers report initialized, the Game Manager broadcasts OnSetupManagers(). Managers now connect to each other and report back via OnRegisterManagerStartup().
  3. Startup Complete — Once all managers report started up, the Game Manager broadcasts OnStartupComplete(). The game is fully initialized and ready to run.

At each stage, the Game Manager counts reports and only proceeds when every spawned manager has reported. This guarantees safe cross-referencing between managers in Stage 2.
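The report-counting gate can be sketched as a standalone structure: a stage only advances once the last outstanding manager reports in. This is illustrative, not the actual GS_GameManagerComponent internals.

```cpp
#include <cstddef>

// Illustrative sketch of the per-stage report counter: the stage is
// complete only when every spawned manager has reported.
struct StartupGate
{
    std::size_t expected = 0;  // number of spawned managers
    std::size_t reported = 0;

    // Called once per manager report; returns true when the stage
    // has received every expected report and may advance.
    bool Report()
    {
        ++reported;
        return reported >= expected;
    }
};
```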


Systemic Navigation

The Game Manager provides the high-level game flow methods that drive your project:

  • NewGame / NewGame(saveName) — Starts a new game, optionally with a named save file.
  • ContinueGame — Loads the most recent save and continues.
  • LoadGame(saveName) — Loads a specific save file.
  • ReturnToTitle — Returns to the title stage, tearing down the current game session.
  • SaveAndExitGame / ExitGame — Saves and/or exits the application.

These methods coordinate with the Save Manager and Stage Manager automatically.
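From C++, these flow methods are invoked over the GameManagerRequestBus documented in the API Reference; it is a singleton bus, so Broadcast is used. A minimal sketch (the save name "Slot1" is just an example value):

```cpp
#include <GS_Core/GS_CoreBus.h>

// Start a new game under a named save file from any gameplay code.
GS_Core::GameManagerRequestBus::Broadcast(
    &GS_Core::GameManagerRequestBus::Events::NewGame,
    AZStd::string("Slot1"));

// Later, tear the session down and return to the title stage.
GS_Core::GameManagerRequestBus::Broadcast(
    &GS_Core::GameManagerRequestBus::Events::ReturnToTitle);
```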


Standby Mode

Standby is a global pause mechanism. When the Game Manager enters standby, it broadcasts OnEnterStandby() to all managers, which propagate the signal to their subsystems. This halts timers, physics, gameplay ticks, and other simulation while a blocking operation (like a stage change) completes. Calling ExitStandby reverses the process.
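In C++, entering and leaving standby is a matching pair of broadcasts on the request bus. A minimal sketch:

```cpp
#include <GS_Core/GS_CoreBus.h>

// Pause the whole game while a blocking operation runs.
GS_Core::GameManagerRequestBus::Broadcast(
    &GS_Core::GameManagerRequestBus::Events::EnterStandby);

// ... perform the blocking work (e.g. a stage change) ...

// Resume gameplay once the operation completes.
GS_Core::GameManagerRequestBus::Broadcast(
    &GS_Core::GameManagerRequestBus::Events::ExitStandby);
```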


Debug Mode

When Debug Mode is enabled in the Inspector, the Game Manager skips navigating to the title stage and remains in the current level. This allows rapid iteration — you can start the game from any level without going through the full title-to-gameplay flow. If save data and unit management are enabled, it loads default save data but overrides position data with the current level’s default spawn point.


Inspector Properties

| Property | Type | Description |
|---|---|---|
| Project Prefix | String | Sets the project prefix for generated files (e.g. save files). Example: "GS" produces GS_SaveGame.json. |
| Startup Managers | List of Manager Prefabs | The manager prefab spawnables to instantiate on game start. Order does not matter — the two-stage init handles dependencies. (C++: AZStd::vector<SpawnableAssetRef>) |
| Debug Mode | Bool | When enabled, the Game Manager stays in the current level instead of navigating to the title stage. |

API Reference

Request Bus: GameManagerRequestBus

Commands sent to the Game Manager. Singleton bus — single address, single handler.

| Method | Parameters | Returns | Description |
|---|---|---|---|
| IsInDebug | | bool | Returns whether debug mode is active. |
| IsStarted | | bool | Returns whether the startup sequence has completed. |
| GetProjectPrefix | | AZStd::string | Returns the configured project prefix string. |
| EnterStandby | | void | Broadcasts standby to all managers, pausing gameplay. |
| ExitStandby | | void | Broadcasts standby exit, resuming gameplay. |
| NewGame | | void | Starts a new game with the default save name. |
| NewGame | const AZStd::string& saveName | void | Starts a new game with a specified save file name. |
| ContinueGame | | void | Loads the most recent save file and continues. |
| LoadGame | const AZStd::string& saveName | void | Loads a specific save file by name. |
| ReturnToTitle | | void | Returns to the title stage, tearing down the current session. |
| SaveAndExitGame | | void | Saves the current game state and exits the application. |
| ExitGame | | void | Exits the application without saving. |

Notification Bus: GameManagerNotificationBus

Events broadcast by the Game Manager. Multiple handler bus — any number of components can subscribe.

| Event | Description |
|---|---|
| OnSetupManagers | Fired when all managers have reported initialized. Managers should now connect to each other. |
| OnStartupComplete | Fired when the full startup sequence is complete. Safe to begin gameplay. |
| OnShutdownManagers | Fired when the game is shutting down. Managers should clean up. |
| OnBeginGame | Fired when a new game or loaded game begins. |
| OnEnterStandby | Fired when entering standby mode. Pause your systems. |
| OnExitStandby | Fired when exiting standby mode. Resume your systems. |

ScriptCanvas Nodes

These methods and events are available as ScriptCanvas nodes.

Request Nodes

| Node | Description |
|---|---|
| TriggerNewGame | Starts a new game with the default save name. |
| TriggerNewGameWithName(saveName) | Starts a new game with a named save file. |
| TriggerContinueGame | Continues from the most recent save. |
| TriggerLoadGame(saveName) | Loads a specific save file by name. |
| TriggerReturnToTitle | Returns to the title stage. |
| TriggerSaveAndExitGame | Saves and exits. |
| TriggerExitGame | Exits without saving. |

Notification Nodes

Listen for these events on GameManagerNotificationBus.

| Node | When It Fires |
|---|---|
| OnStartupComplete | The game is fully initialized and ready to run. |
| OnBeginGame | A new or loaded game session begins. |
| OnEnterStandby | The game has entered standby (paused). |
| OnExitStandby | The game has exited standby (resumed). |

Virtual Methods

Override these when extending the Game Manager. Always call the base implementation.

| Method | Returns | Description |
|---|---|---|
| InitializeManagers() | void | Spawns the Startup Managers list. Override to add custom spawn logic. |
| ProcessFallbackSpawn() | void | Handles fallback when a manager fails to spawn. |
| StartupManagers() | void | Broadcasts OnSetupManagers. Override to inject logic between init and startup stages. |
| CompleteStartup() | void | Broadcasts OnStartupComplete. Override to add post-startup logic. |
| BeginGame() | void | Called when transitioning into active gameplay. |

SC ↔ C++ Quick Reference

| Action | ScriptCanvas Node | C++ Method | Notes |
|---|---|---|---|
| Start a new game | TriggerNewGame | NewGame() | Default save name |
| Start with named save | TriggerNewGameWithName(name) | NewGame(saveName) | Named save file |
| Continue from last save | TriggerContinueGame | ContinueGame() | Loads most recent save |
| Load specific save | TriggerLoadGame(name) | LoadGame(saveName) | Load by file name |
| Return to title | TriggerReturnToTitle | ReturnToTitle() | Tears down current session |
| Save and exit | TriggerSaveAndExitGame | SaveAndExitGame() | |
| Exit without saving | TriggerExitGame | ExitGame() | |
| Check if started | | IsStarted() | Returns bool |

Usage Examples

ScriptCanvas: Title Screen Workflow

Calling New Game or Load Game proceeds to start gameplay and its systems. The Begin Game event fires to start all systems across the game.

Debug Mode automatically begins the game.


ScriptCanvas: Standby Logic


C++: Reacting to Events

A gameplay component that listens to the full startup and standby lifecycle:

#include <GS_Core/GS_CoreBus.h>

class MyGameplayComponent
    : public AZ::Component
    , protected GS_Core::GameManagerNotificationBus::Handler
{
protected:
    void Activate() override
    {
        GS_Core::GameManagerNotificationBus::Handler::BusConnect();
    }

    void Deactivate() override
    {
        GS_Core::GameManagerNotificationBus::Handler::BusDisconnect();
    }

    void OnStartupComplete() override
    {
        // Safe to access all managers and begin gameplay logic
    }

    void OnEnterStandby() override
    {
        // Pause your gameplay systems
    }

    void OnExitStandby() override
    {
        // Resume your gameplay systems
    }
};

C++: Custom Manager Integration

A manager that receives the startup handshake from the Game Manager. Add this class’s prefab to the Game Manager’s Startup Managers list:

#include <GS_Core/GS_CoreBus.h>
#include <Source/Managers/GS_ManagerComponent.h>

class MySystemManager
    : public GS_Core::GS_ManagerComponent
    , protected GS_Core::GameManagerNotificationBus::Handler
{
public:
    AZ_COMPONENT_DECL(MySystemManager);
    static void Reflect(AZ::ReflectContext* context);

protected:
    void Activate() override
    {
        GS_Core::GS_ManagerComponent::Activate();                        // Required: registers with Game Manager
        GS_Core::GameManagerNotificationBus::Handler::BusConnect();
    }

    void Deactivate() override
    {
        GS_Core::GameManagerNotificationBus::Handler::BusDisconnect();
        GS_Core::GS_ManagerComponent::Deactivate();                      // Required: deregisters from Game Manager
    }

    // Called during Stage 2 — safe to reference other managers here
    void OnSetupManagers() override
    {
        // Connect to other managers, set up cross-system references
    }

    // Called when the full startup sequence completes — safe to begin gameplay
    void OnStartupComplete() override
    {
        // Initialize systems that depend on all managers being ready
    }

    void OnEnterStandby() override { /* pause your systems */ }
    void OnExitStandby()  override { /* resume your systems */ }
};

Extending the Game Manager

Extend the Game Manager to add custom startup logic, additional lifecycle events, or project-specific behavior. Extension is done in C++.

Header (.h)

#pragma once
#include <GS_Core/GS_CoreBus.h>
#include <Source/Managers/GS_GameManagerComponent.h>

namespace MyProject
{
    class MyGameManager : public GS_Core::GS_GameManagerComponent
    {
    public:
        AZ_COMPONENT_DECL(MyGameManager);

        static void Reflect(AZ::ReflectContext* context);

    protected:
        void InitializeManagers() override;
        void StartupManagers() override;
        void CompleteStartup() override;
        void BeginGame() override;
    };
}

Implementation (.cpp)

#include "MyGameManager.h"
#include <AzCore/Serialization/SerializeContext.h>

namespace MyProject
{
    AZ_COMPONENT_IMPL(MyGameManager, "MyGameManager", "{YOUR-UUID-HERE}");

    void MyGameManager::Reflect(AZ::ReflectContext* context)
    {
        if (auto serializeContext = azrtti_cast<AZ::SerializeContext*>(context))
        {
            serializeContext->Class<MyGameManager, GS_Core::GS_GameManagerComponent>()
                ->Version(0);

            if (AZ::EditContext* editContext = serializeContext->GetEditContext())
            {
                editContext->Class<MyGameManager>("My Game Manager", "Custom game manager for MyProject")
                    ->ClassElement(AZ::Edit::ClassElements::EditorData, "")
                        ->Attribute(AZ::Edit::Attributes::Category, "MyProject")
                        ->Attribute(AZ::Edit::Attributes::AppearsInAddComponentMenu, AZ_CRC_CE("Game"));
            }
        }
    }

    void MyGameManager::InitializeManagers()
    {
        // Call base to spawn the standard manager list
        GS_GameManagerComponent::InitializeManagers();
        // Custom initialization logic here
    }

    void MyGameManager::StartupManagers()
    {
        // Call base to handle standard startup
        GS_GameManagerComponent::StartupManagers();
    }

    void MyGameManager::CompleteStartup()
    {
        // Call base to broadcast OnStartupComplete
        GS_GameManagerComponent::CompleteStartup();
        // Post-startup logic here (e.g. connect to analytics, initialize online services)
    }

    void MyGameManager::BeginGame()
    {
        GS_GameManagerComponent::BeginGame();
        // Custom game start logic
    }
}

See Also

For scripting patterns and SC-first usage:

For related components:


Get GS_Core

GS_Core — Explore this gem on the product page and add it to your project.

4.1.1.2 - Manager

The base class for all game system managers — automatic two-stage initialization and lifecycle integration with the Game Manager.

For usage guides and setup examples, see The Basics: GS_Core.

A Manager prefab with the wrapper entity set as Editor-Only inside Prefab Edit Mode.

GS_GameManager
  └── (spawns) GS_ManagerComponent ◄ you are here — and all other managers extend this

The GS_ManagerComponent is the base class that all game system managers inherit from. It handles the two-stage initialization pattern with the Game Manager automatically, so you can focus on your system’s logic without worrying about startup ordering.

Every built-in GS_Play manager (Save, Stage, Options, UI, Unit, Camera, Audio, Performer) extends this class. When you need a custom game system manager, you extend it too.

 

Contents


How It Works

When the Game Manager spawns its list of manager prefabs, each Manager component goes through this lifecycle:

  1. Activate — The component initializes itself (connects to buses, sets up internal state). When done, broadcasts OnRegisterManagerInit() to tell the Game Manager it is ready.

  2. OnStartupManagers — Called by the Game Manager after all managers have reported initialized. Connect to other managers and set up cross-references here. When done, broadcasts OnRegisterManagerStartup().

  3. OnStartupComplete — The Game Manager broadcasts this after all managers report started up. The game is fully ready.

  4. OnEnterStandby / OnExitStandby — Called when the Game Manager enters or exits standby mode. Pause and resume your subsystem here.

  5. OnShutdownManagers — Called when the game is shutting down. Clean up resources and disconnect buses.

The base class handles steps 1–2 automatically. Override them to add your logic, then call the base implementation to complete the handshake.


Setup

The Manager wrapper entity set as Editor-Only inside Prefab Edit Mode.

  1. Create an entity. Attach your Manager component (built-in or custom) to it.
  2. Configure any properties needed in the Entity Inspector.
  3. Turn the entity into a prefab.
  4. Enter prefab edit mode. Set the wrapper entity (parent of your Manager entity) to Editor Only. Save.
  5. Delete the Manager entity from the level (it will be spawned by the Game Manager at runtime).
  6. In the Game Manager prefab, add your Manager .spawnable to the Startup Managers list.

The premade manager prefabs for each GS_Play gem are located in that gem’s Assets/Prefabs directory.


Inspector Properties

The base GS_ManagerComponent exposes no inspector properties of its own. Properties are defined by each inheriting manager component (e.g., the Save Manager exposes a Save System Version Number, the Stage Manager exposes a Stages list).


API Reference

Lifecycle Events: GameManagerNotificationBus

The Manager base class automatically subscribes to these events from the Game Manager:

| Event | When It Fires | What You Do |
|---|---|---|
| OnStartupManagers | All managers are initialized | Connect to other managers, set up cross-references, then call base to report. |
| OnStartupComplete | All managers are started up | Begin runtime operation. |
| OnEnterStandby | Game is entering standby | Pause your subsystem (stop ticks, halt processing). |
| OnExitStandby | Game is exiting standby | Resume your subsystem. |
| OnShutdownManagers | Game is shutting down | Clean up resources, disconnect buses. |

Registration Methods

Called internally by the Manager base class to report back to the Game Manager. Do not call these manually — call the base class method instead.

| Method | Description |
|---|---|
| OnRegisterManagerInit() | Called at the end of Activate(). Reports to the Game Manager that this manager has initialized. |
| OnRegisterManagerStartup() | Called at the end of OnStartupManagers(). Reports to the Game Manager that this manager has started up. |

Usage Examples

C++ — Checking if the Game Manager Is Ready

#include <GS_Core/GS_CoreBus.h>

bool isStarted = false;
GS_Core::GameManagerRequestBus::BroadcastResult(
    isStarted,
    &GS_Core::GameManagerRequestBus::Events::IsStarted
);

if (isStarted)
{
    // Safe to access all managers
}

Extending the Manager Class

Create a custom manager whenever you need a singleton game system that initializes with the rest of the framework. Extension is done in C++.

Header (.h)

#pragma once
#include <Source/Managers/GS_ManagerComponent.h>

// Define your EBus interface for other systems to call your manager
class MyManagerRequests
{
public:
    AZ_RTTI(MyManagerRequests, "{YOUR-UUID-HERE}");
    virtual ~MyManagerRequests() = default;

    virtual int GetSomeValue() = 0;
    virtual void DoSomething() = 0;
};

class MyManagerBusTraits : public AZ::EBusTraits
{
public:
    static constexpr AZ::EBusHandlerPolicy HandlerPolicy = AZ::EBusHandlerPolicy::Single;
    static constexpr AZ::EBusAddressPolicy AddressPolicy = AZ::EBusAddressPolicy::Single;
};

using MyManagerRequestBus = AZ::EBus<MyManagerRequests, MyManagerBusTraits>;

namespace MyProject
{
    class MyManagerComponent
        : public GS_Core::GS_ManagerComponent
        , protected MyManagerRequestBus::Handler
    {
    public:
        AZ_COMPONENT_DECL(MyManagerComponent);

        static void Reflect(AZ::ReflectContext* context);

    protected:
        // Manager lifecycle
        void Activate() override;
        void Deactivate() override;
        void OnStartupManagers() override;
        void OnEnterStandby() override;
        void OnExitStandby() override;

        // Your bus implementation
        int GetSomeValue() override;
        void DoSomething() override;

    private:
        int m_someValue = 0;
    };
}

Implementation (.cpp)

#include "MyManagerComponent.h"
#include <AzCore/Serialization/SerializeContext.h>

namespace MyProject
{
    AZ_COMPONENT_IMPL(MyManagerComponent, "MyManagerComponent", "{YOUR-UUID-HERE}");

    void MyManagerComponent::Reflect(AZ::ReflectContext* context)
    {
        if (auto serializeContext = azrtti_cast<AZ::SerializeContext*>(context))
        {
            serializeContext->Class<MyManagerComponent, GS_Core::GS_ManagerComponent>()
                ->Version(0)
                ->Field("SomeValue", &MyManagerComponent::m_someValue);

            if (AZ::EditContext* editContext = serializeContext->GetEditContext())
            {
                editContext->Class<MyManagerComponent>("My Manager", "A custom game system manager")
                    ->ClassElement(AZ::Edit::ClassElements::EditorData, "")
                        ->Attribute(AZ::Edit::Attributes::Category, "MyProject")
                        ->Attribute(AZ::Edit::Attributes::AppearsInAddComponentMenu, AZ_CRC_CE("Game"))
                    ->DataElement(AZ::Edit::UIHandlers::Default,
                        &MyManagerComponent::m_someValue, "Some Value", "Description of this property");
            }
        }
    }

    void MyManagerComponent::Activate()
    {
        MyManagerRequestBus::Handler::BusConnect();

        // IMPORTANT: Call base Activate last — it reports OnRegisterManagerInit()
        GS_ManagerComponent::Activate();
    }

    void MyManagerComponent::Deactivate()
    {
        MyManagerRequestBus::Handler::BusDisconnect();
        GS_ManagerComponent::Deactivate();
    }

    void MyManagerComponent::OnStartupManagers()
    {
        // Access other managers here — they are all initialized by now

        // IMPORTANT: Call base last — it reports OnRegisterManagerStartup()
        GS_ManagerComponent::OnStartupManagers();
    }

    void MyManagerComponent::OnEnterStandby()
    {
        // Pause your subsystem
    }

    void MyManagerComponent::OnExitStandby()
    {
        // Resume your subsystem
    }

    int MyManagerComponent::GetSomeValue()
    {
        return m_someValue;
    }

    void MyManagerComponent::DoSomething()
    {
        // Your game system logic
    }
}

Module Registration

m_descriptors.insert(m_descriptors.end(), {
    MyProject::MyManagerComponent::CreateDescriptor(),
});

Then create a prefab for your manager and add it to the Game Manager’s Startup Managers list.


See Also

For conceptual overviews and usage guides:

For component references:

For related resources:


Get GS_Core

GS_Core — Explore this gem on the product page and add it to your project.

4.1.2 - GS_Save

The persistence system — save files, load data, and track progression with managers, savers, and record keepers.

The Save system is the universal means of storing, loading, and handling game data. It provides a centralized Save Manager for file operations, entity-level Saver components for automatic data capture, and a Record Keeper for lightweight progression tracking — all backed by JSON-formatted save files that work across PC, console, and mobile platforms.

Every component that needs to persist data plugs into this system through one of two patterns: extend the Saver base class for complex per-entity data, or use the Record Keeper for simple key-value records.

For usage guides and setup examples, see The Basics: GS_Core.

 

Contents


Architecture

Save System Pattern Graph

Breakdown

When a save or load is triggered, the Save Manager broadcasts to every Saver in the scene. Each Saver independently handles its own entity’s state. The Record Keeper persists flat progression data alongside the save file.

| Part | Broadcast Event | What It Means |
|---|---|---|
| Save Manager | OnSaveAll | Broadcasts to all Savers on save. Each serializes its entity state. |
| Save Manager | OnLoadAll | Broadcasts to all Savers on load. Each restores its entity state. |
| Record Keeper | RecordChanged | Fires when any progression flag is created, updated, or deleted. |

Savers are per-entity components — extend GS_SaverComponent to persist any custom data. The Record Keeper is a global singleton that lives on the Save Manager prefab.

E Indicates extensible classes and methods.

Patterns - Complete list of system patterns used in GS_Play.


Components

| Component | Purpose | Documentation |
|---|---|---|
| GS_SaveManagerComponent | Central save/load controller. Manages save files, triggers global save/load events, provides data access to all savers. | Save Manager |
| GS_SaverComponent | Base class for entity-level save handlers. Override BuildSaveData() and ProcessLoad() to persist any component data. | Savers |
| RecordKeeperComponent | Lightweight key-value store for progression tracking (string name → integer value). Extends GS_SaverComponent. | Record Keeper |
| BasicEntitySaverComponent | Pre-built saver that persists an entity’s Transform (position, rotation). | List of Savers |
| BasicPhysicsEntitySaverComponent | Pre-built saver that persists an entity’s Transform and Rigidbody state (velocity). | List of Savers |

Quick Start

  1. Create a Save Manager prefab with the GS_SaveManagerComponent.
  2. Optionally add a Record Keeper to the same prefab for progression tracking.
  3. Add the Save Manager .spawnable to the Game Manager’s Startup Managers list.
  4. Attach Saver components to any entities that need to persist data.
  5. Call NewGame / LoadGame through the Game Manager — the Save Manager handles the rest.

See Also

For conceptual overviews and usage guides:

For component references:

For related resources:


Get GS_Core

GS_Core — Explore this gem on the product page and add it to your project.

4.1.2.1 - Save Manager

The central save/load controller — manages save files, triggers global persistence events, and provides data access for all savers.

The Save Manager is the central controller of the save system. It manages save file creation, triggers global save/load events that all Saver components respond to, and provides data access methods for storing and retrieving serialized game state.

Save files use JSON formatting for easy human interpretation, and the system uses the O3DE SaveData gem to write to the target platform’s default user data directory — PC, console, or mobile with no additional configuration.

For usage guides and setup examples, see The Basics: GS_Core.

Save Manager component in the O3DE Inspector

 

Contents


How It Works

Save File Structure

The Save Manager maintains a CoreSaveData file that acts as an index of all save files for the project. This file is identified by the combination of the Game Manager’s Project Prefix and the Save Manager’s Save System Version Number. Changing either value starts a fresh save data pass.

Each individual save file stores the full game state as a JSON document, including timestamped metadata for ordering.

Initialization

On startup, the Save Manager checks for existing save data using the Project Prefix + Version combination. It sets its internal IsContinuing flag to indicate whether previous save data is available to load. Other systems can query this to determine whether to show “Continue” or “Load Game” options.

New Game Flow

When the Game Manager calls NewGame:

  1. The Save Manager creates a new save file — either defaultSaveData (no name given) or a custom name via NewGame(saveName).
  2. The CoreSaveData index is updated with the new file entry.

Save Flow

  1. Game systems call SaveData(uniqueName, data) on the SaveManagerRequestBus to store their data into the live save document.
  2. When a full save is triggered (via method call, OnSaveAll broadcast, or save-on-exit logic), the Save Manager serializes the complete document to disk.

Load Flow

  1. The Save Manager reads the target save file from disk into memory.
  2. It broadcasts OnLoadAll via the SaveManagerNotificationBus.
  3. Each Saver component responds by calling LoadData(uniqueName, outData) to retrieve its portion of the save data.

Setup

Image showing the Manager wrapper entity set as Editor-Only inside Prefab Edit Mode.

  1. Create an entity. Attach the GS_SaveManagerComponent to it.
  2. Set the Save System Version Number (increment this when your save format changes to avoid loading incompatible data).
  3. Optionally add a RecordKeeperComponent to the same entity for progression tracking.
  4. Turn the entity into a prefab.
  5. Enter prefab edit mode. Set the wrapper entity (parent) to Editor Only. Save.
  6. Delete the Save Manager entity from the level.
  7. In the Game Manager prefab, add the Save Manager .spawnable to the Startup Managers list.

Inspector Properties

| Property | Type | Default | Description |
|---|---|---|---|
| Save System Version Number | int | 0 | Version stamp for save file compatibility. Increment when your save data format changes — the system will treat saves from a different version as a fresh start. |
| Full Save On Destroy | bool | true | When enabled, the Save Manager performs a full save when the component is destroyed (e.g., on level exit or game shutdown). |

API Reference

Request Bus: SaveManagerRequestBus

The primary interface for all save/load operations. Singleton bus — call via Broadcast.

| Method | Parameters | Returns | Description |
|---|---|---|---|
| NewGameSave | const AZStd::string& uniqueName | void | Creates a new save file with the given name. Pass empty string for default name. |
| LoadGame | const AZStd::string& uniqueName | void | Loads the specified save file into memory and triggers data restoration. |
| SaveData | const AZStd::string& uniqueName, const rapidjson::Value& data | void | Stores a named data block into the live save document. Called by Saver components during save operations. |
| LoadData | const AZStd::string& uniqueName, rapidjson::Value& outData | bool | Retrieves a named data block from the loaded save document. Returns true if the data was found. |
| GetOrderedSaveList | | AZStd::vector<AZStd::pair<AZStd::string, AZ::u64>> | Returns all save files ordered by timestamp (newest first). Each entry is a name + epoch timestamp pair. |
| ConvertEpochToReadable | AZ::u64 epochSeconds | AZStd::string | Converts an epoch timestamp to a human-readable date string. |
| GetEpochTimeNow | | AZ::u64 | Returns the current time as an epoch timestamp. |
| GetAllocator | | rapidjson::Document::AllocatorType* | Returns the JSON allocator for constructing save data values. |
| HasData | const AZStd::string& uniqueName | bool | Checks whether the specified data block exists in the current save. |
| IsContinuing | | bool | Returns true if previous save data was found on startup. |
| RegisterSaving | | void | Registers that a save operation is in progress (used internally by the save counting system). |

Notification Bus: SaveManagerNotificationBus

Broadcast to all Saver components. Connect to this bus to participate in global save/load events.

| Event | Description |
|---|---|
| OnSaveAll | Broadcast when a full save is triggered. All savers should gather and submit their data. |
| OnLoadAll | Broadcast when a save file has been loaded into memory. All savers should retrieve and restore their data. |

Local / Virtual Methods

These methods are available when extending the Save Manager. Override them to customize save file handling.

| Method | Description |
|---|---|
| SaveToFile(fileName, docFile) | Serializes a JSON document to disk using the O3DE SaveData gem. |
| LoadFromFile(fileName, docFile) | Deserializes a save file from disk into a JSON document. |
| FullSave() | Triggers a complete save of all game data to disk. |
| UpdateCoreData(saveName) | Updates the CoreSaveData index with the current save file entry. |
| FileExists(dataBufferName, localUserId) | Static utility — checks if a save file exists on disk. |
| GetSaveFilePath(dataBufferName, localUserId) | Static utility — returns the platform-appropriate file path for a save file. |

Usage Examples

Saving Data from a Component

#include <GS_Core/GS_CoreBus.h>

// Store your component's data into the live save document
rapidjson::Document::AllocatorType* allocator = nullptr;
GS_Core::SaveManagerRequestBus::BroadcastResult(
    allocator,
    &GS_Core::SaveManagerRequestBus::Events::GetAllocator
);

if (allocator)
{
    rapidjson::Value myData(rapidjson::kObjectType);
    myData.AddMember("health", m_health, *allocator);
    myData.AddMember("level", m_level, *allocator);

    GS_Core::SaveManagerRequestBus::Broadcast(
        &GS_Core::SaveManagerRequestBus::Events::SaveData,
        "MyComponent_PlayerStats",
        myData
    );
}

Loading Data into a Component

#include <GS_Core/GS_CoreBus.h>

rapidjson::Value outData;
bool found = false;
GS_Core::SaveManagerRequestBus::BroadcastResult(
    found,
    &GS_Core::SaveManagerRequestBus::Events::LoadData,
    "MyComponent_PlayerStats",
    outData
);

if (found)
{
    if (outData.HasMember("health")) m_health = outData["health"].GetInt();
    if (outData.HasMember("level"))  m_level = outData["level"].GetInt();
}

Checking if a Save Exists

#include <GS_Core/GS_CoreBus.h>

bool hasSave = false;
GS_Core::SaveManagerRequestBus::BroadcastResult(
    hasSave,
    &GS_Core::SaveManagerRequestBus::Events::IsContinuing
);

if (hasSave)
{
    // Show "Continue" / "Load Game" in the main menu
}

Getting the Save File List

#include <GS_Core/GS_CoreBus.h>

AZStd::vector<AZStd::pair<AZStd::string, AZ::u64>> saves;
GS_Core::SaveManagerRequestBus::BroadcastResult(
    saves,
    &GS_Core::SaveManagerRequestBus::Events::GetOrderedSaveList
);

for (const auto& [name, epoch] : saves)
{
    AZStd::string readable;
    GS_Core::SaveManagerRequestBus::BroadcastResult(
        readable,
        &GS_Core::SaveManagerRequestBus::Events::ConvertEpochToReadable,
        epoch
    );
    AZ_TracePrintf("Save", "Save: %s — %s", name.c_str(), readable.c_str());
}

Extending the Save Manager

Extend the Save Manager when you need custom save file formats, encryption, cloud save integration, or platform-specific serialization.

Header (.h)

#pragma once
#include <Source/SaveSystem/GS_SaveManagerComponent.h>

namespace MyProject
{
    class MySaveManagerComponent
        : public GS_Core::GS_SaveManagerComponent
    {
    public:
        AZ_COMPONENT_DECL(MySaveManagerComponent);

        static void Reflect(AZ::ReflectContext* context);

    protected:
        // Override save/load to add custom behavior (e.g., encryption, compression)
        void SaveToFile(AZStd::string fileName, rapidjson::Document& docFile) override;
        void LoadFromFile(AZStd::string fileName, rapidjson::Document& docFile) override;

        // Override to customize the full save sequence
        void FullSave() override;
    };
}

Implementation (.cpp)

#include "MySaveManagerComponent.h"
#include <AzCore/Serialization/SerializeContext.h>

namespace MyProject
{
    AZ_COMPONENT_IMPL(MySaveManagerComponent, "MySaveManagerComponent", "{YOUR-UUID-HERE}",
        GS_Core::GS_SaveManagerComponent);

    void MySaveManagerComponent::Reflect(AZ::ReflectContext* context)
    {
        if (auto serializeContext = azrtti_cast<AZ::SerializeContext*>(context))
        {
            serializeContext->Class<MySaveManagerComponent, GS_Core::GS_SaveManagerComponent>()
                ->Version(0);

            if (AZ::EditContext* editContext = serializeContext->GetEditContext())
            {
                editContext->Class<MySaveManagerComponent>(
                    "My Save Manager", "Custom save manager with encryption support")
                    ->ClassElement(AZ::Edit::ClassElements::EditorData, "")
                        ->Attribute(AZ::Edit::Attributes::Category, "MyProject")
                        ->Attribute(AZ::Edit::Attributes::AppearsInAddComponentMenu, AZ_CRC_CE("Game"));
            }
        }
    }

    void MySaveManagerComponent::SaveToFile(AZStd::string fileName, rapidjson::Document& docFile)
    {
        // Example: encrypt the JSON before writing
        // ... your encryption logic ...

        // Call base to perform the actual file write
        GS_SaveManagerComponent::SaveToFile(fileName, docFile);
    }

    void MySaveManagerComponent::LoadFromFile(AZStd::string fileName, rapidjson::Document& docFile)
    {
        // Call base to perform the actual file read
        GS_SaveManagerComponent::LoadFromFile(fileName, docFile);

        // Example: decrypt the JSON after reading
        // ... your decryption logic ...
    }

    void MySaveManagerComponent::FullSave()
    {
        // Example: add a timestamp or checksum before saving
        // ... your custom logic ...

        GS_SaveManagerComponent::FullSave();
    }
}

Module Registration

m_descriptors.insert(m_descriptors.end(), {
    MyProject::MySaveManagerComponent::CreateDescriptor(),
});

Then create a prefab for your custom Save Manager and add it to the Game Manager’s Startup Managers list (replacing the default Save Manager).


See Also

For component references:

For related resources:


Get GS_Core

GS_Core — Explore this gem on the product page and add it to your project.

4.1.2.2 - Savers

The base class for entity-level save handlers — override BuildSaveData() and ProcessLoad() to persist any component data automatically.

For usage guides and setup examples, see The Basics: GS_Core.

Image showing the Basic Physics Saver component, an example of a Saver-inherited class, as seen in the Entity Inspector.

The GS_SaverComponent is the base class for all entity-level save handlers. By extending it, you can create companion components that save and load data alongside your entities, or inherit directly into gameplay components for built-in persistence (as done by GS_Inventory).

Savers automatically participate in the global save/load cycle — when the Save Manager broadcasts OnSaveAll or OnLoadAll, every active Saver component responds by gathering or restoring its data.



How It Works

Save Cycle

When a save event fires (global OnSaveAll, local trigger, or saveOnDestroy):

  1. BeginSave() — Primes the save data container.
  2. BuildSaveData() — Your override. Gather your component’s data and write it into localData using the Save Manager’s JSON allocator.
  3. CompleteSave() — Submits the data to the Save Manager via SaveData(uniqueName, localData).

Load Cycle

When a load event fires (global OnLoadAll or loadOnActivate):

  1. LoadLocalData() — Retrieves this Saver’s data block from the Save Manager via LoadData(uniqueName, outData).
  2. ProcessLoad() — Your override. Read the retrieved data and restore your component’s state.
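Both cycles follow a template-method pattern: the base class fixes the call order, and your overrides supply the data-specific steps. The following is a hypothetical, engine-free sketch of the save cycle in plain C++ (no AZ or rapidjson types) — it only illustrates the ordering, not the real implementation:

```cpp
#include <string>
#include <vector>

// Hypothetical sketch of the Saver save cycle. The real base class calls
// the same three steps in the same order; normally only BuildSaveData()
// is overridden.
class SaverSketch
{
public:
    std::vector<std::string> callLog; // records the call order for illustration

    void RunSaveCycle()
    {
        BeginSave();     // 1. prime the save data container
        BuildSaveData(); // 2. subclass gathers its data
        CompleteSave();  // 3. submit the data to the save manager
    }

protected:
    virtual void BeginSave()     { callLog.push_back("BeginSave"); }
    virtual void BuildSaveData() { callLog.push_back("BuildSaveData"); }
    virtual void CompleteSave()  { callLog.push_back("CompleteSave"); }
};
```

A subclass that overrides only BuildSaveData() still gets the full BeginSave → BuildSaveData → CompleteSave sequence for free.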

Automatic Identity

Each Saver generates a unique save key from the entity name and GetSubComponentName(). This ensures multiple Savers on different entities (or multiple Saver types on the same entity) don’t collide.
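As an illustration only — the exact format produced by SetUniqueName() is internal to the component — combining the two names might look like this:

```cpp
#include <string>

// Hypothetical sketch of a collision-free save key: entity name plus
// sub-component name. The real SetUniqueName() may use a different
// separator or format.
std::string MakeSaveKey(const std::string& entityName,
                        const std::string& subComponentName)
{
    return entityName + "." + subComponentName; // e.g. "Door_01.Saver"
}
```

Because both parts contribute to the key, two Savers only collide if they share both an entity name and a sub-component name.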


Setup

  1. Attach a Saver component (built-in or custom) to any entity that needs to persist data.
  2. Configure the Saver properties:
    • Load On Activate — restore data automatically when the entity spawns.
    • Save On Destroy — save data automatically when the entity is destroyed.
  3. Ensure the Save Manager is set up and running (it handles the file I/O).

Inspector Properties

| Property | Type | Default | Description |
|---|---|---|---|
| Load On Activate | bool | true | Automatically loads and restores this Saver’s data when the component activates. Useful for entities that spawn mid-game and need to resume their saved state. |
| Save On Destroy | bool | true | Automatically saves this Saver’s data when the component is destroyed. Ensures data is captured even if a global save hasn’t been triggered. |

API Reference

Global Event Handlers

These are called automatically when the Save Manager broadcasts global save/load events. Override to customize behavior.

| Method | Description |
|---|---|
| OnSaveAll() | Called when the Save Manager broadcasts a global save. Default implementation calls the full save cycle (BeginSave → BuildSaveData → CompleteSave). |
| OnLoadAll() | Called when the Save Manager broadcasts a global load. Default implementation calls the full load cycle (LoadLocalData → ProcessLoad). |

Save Methods

Override these to control how your component’s data is saved.

| Method | Description |
|---|---|
| BeginSave() | Prepares the save data container. Called before BuildSaveData(). |
| BuildSaveData() | Your primary override. Write your component’s data into localData using the JSON allocator. |
| CompleteSave() | Submits localData to the Save Manager. Called after BuildSaveData(). |

Load Methods

Override these to control how your component’s data is restored.

| Method | Description |
|---|---|
| LoadLocalData() | Retrieves this Saver’s data block from the Save Manager into localData. |
| ProcessLoad() | Your primary override. Read localData and restore your component’s state. |

Utility Methods

| Method | Returns | Description |
|---|---|---|
| SetUniqueName() | void | Generates the unique save key from the entity name and sub-component name. Override for custom key generation. |
| GetSubComponentName() | AZStd::string | Returns "Saver" by default. Override to distinguish multiple Saver types on the same entity. |

Components

Pre-built Saver components included in GS_Core:

| Component | Saves | Documentation |
|---|---|---|
| BasicEntitySaverComponent | Entity Transform (position, rotation) | List of Savers |
| BasicPhysicsEntitySaverComponent | Entity Transform + Rigidbody (position, rotation, linear velocity, angular velocity) | List of Savers |
| RecordKeeperComponent | Key-value records (string → integer) | Record Keeper |

Usage Examples

Responding to Global Save/Load Events

If your component doesn’t extend GS_SaverComponent but still needs to react to save/load events, connect to the SaveManagerNotificationBus:

#include <GS_Core/GS_CoreBus.h>

class MyComponent
    : public AZ::Component
    , protected GS_Core::SaveManagerNotificationBus::Handler
{
protected:
    void Activate() override
    {
        GS_Core::SaveManagerNotificationBus::Handler::BusConnect();
    }

    void Deactivate() override
    {
        GS_Core::SaveManagerNotificationBus::Handler::BusDisconnect();
    }

    void OnSaveAll() override
    {
        // Gather and submit data to the Save Manager
    }

    void OnLoadAll() override
    {
        // Retrieve and restore data from the Save Manager
    }
};

Extending the Saver Class

Use the SaverComponent ClassWizard template to generate a new saver with boilerplate already in place — see GS_Core Templates.

Create a custom Saver whenever you need to persist component-specific data that the built-in savers don’t cover.

Header (.h)

#pragma once
#include <Source/SaveSystem/GS_SaverComponent.h>

namespace MyProject
{
    class MyEntitySaverComponent
        : public GS_Core::GS_SaverComponent
    {
    public:
        AZ_COMPONENT_DECL(MyEntitySaverComponent);

        static void Reflect(AZ::ReflectContext* context);

    protected:
        // Saver overrides
        void BuildSaveData() override;
        void ProcessLoad() override;
        AZStd::string GetSubComponentName() const override { return "MyEntitySaver"; }
    };
}

Implementation (.cpp)

#include "MyEntitySaverComponent.h"
#include <AzCore/Serialization/SerializeContext.h>
#include <GS_Core/GS_CoreBus.h>

namespace MyProject
{
    AZ_COMPONENT_IMPL(MyEntitySaverComponent, "MyEntitySaverComponent", "{YOUR-UUID-HERE}",
        GS_Core::GS_SaverComponent);

    void MyEntitySaverComponent::Reflect(AZ::ReflectContext* context)
    {
        if (auto serializeContext = azrtti_cast<AZ::SerializeContext*>(context))
        {
            serializeContext->Class<MyEntitySaverComponent, GS_Core::GS_SaverComponent>()
                ->Version(0);

            if (AZ::EditContext* editContext = serializeContext->GetEditContext())
            {
                editContext->Class<MyEntitySaverComponent>(
                    "My Entity Saver", "Saves custom entity data")
                    ->ClassElement(AZ::Edit::ClassElements::EditorData, "")
                        ->Attribute(AZ::Edit::Attributes::Category, "MyProject")
                        ->Attribute(AZ::Edit::Attributes::AppearsInAddComponentMenu, AZ_CRC_CE("Game"));
            }
        }
    }

    void MyEntitySaverComponent::BuildSaveData()
    {
        // Get the JSON allocator from the Save Manager
        rapidjson::Document::AllocatorType* allocator = nullptr;
        GS_Core::SaveManagerRequestBus::BroadcastResult(
            allocator,
            &GS_Core::SaveManagerRequestBus::Events::GetAllocator
        );

        if (!allocator) return;

        // Write your data into localData (inherited member)
        localData.SetObject();
        localData.AddMember("myValue", 42, *allocator);
        localData.AddMember("myFlag", true, *allocator);

        // Example: save a string
        rapidjson::Value nameVal;
        nameVal.SetString("hello", *allocator);
        localData.AddMember("myString", nameVal, *allocator);
    }

    void MyEntitySaverComponent::ProcessLoad()
    {
        // Read your data from localData (populated by LoadLocalData)
        if (localData.HasMember("myValue"))
        {
            int myValue = localData["myValue"].GetInt();
            // ... restore state ...
        }

        if (localData.HasMember("myFlag"))
        {
            bool myFlag = localData["myFlag"].GetBool();
            // ... restore state ...
        }
    }
}

Module Registration

m_descriptors.insert(m_descriptors.end(), {
    MyProject::MyEntitySaverComponent::CreateDescriptor(),
});

Attach your custom Saver to any entity that needs persistence. The save/load cycle handles the rest automatically.


See Also

For component references:

For related resources:


Get GS_Core

GS_Core — Explore this gem on the product page and add it to your project.

4.1.2.2.1 - List of Savers

Pre-built saver components included in GS_Core — ready-to-use entity persistence without writing code.

For usage guides and setup examples, see The Basics: GS_Core.



GS_Core includes pre-built Saver components that handle common persistence scenarios out of the box. Attach one to any entity that needs to remember its state between save/load cycles — no custom code required.

All built-in Savers inherit from GS_SaverComponent and participate in the global save/load cycle automatically.


Basic Entity Saver

A companion component that saves and loads an entity’s Transform data: position and rotation.

When to Use

Use the Basic Entity Saver for any entity that can be moved or rotated during gameplay and needs to retain its position across saves — collectibles, furniture, doors, NPCs with fixed patrol points, etc.

Inspector Properties

| Property | Type | Default | Description |
|---|---|---|---|
| Load On Activate | bool | true | Inherited from GS_SaverComponent. Automatically restores the saved Transform on activation. |
| Save On Destroy | bool | true | Inherited from GS_SaverComponent. Automatically saves the current Transform on destruction. |

What It Saves

| Data | Type | Description |
|---|---|---|
| Position | AZ::Vector3 | World-space position of the entity. |
| Rotation | AZ::Quaternion | World-space rotation of the entity. |

Setup

  1. Attach the BasicEntitySaverComponent to any entity that needs Transform persistence.
  2. Ensure the Save Manager is running.
  3. That’s it — the component saves and loads automatically.

Basic Physics Entity Saver

Image showing the Basic Physics Saver component, as seen in the Entity Inspector.

A companion component that saves and loads an entity’s Transform and Rigidbody physics state.

When to Use

Use the Basic Physics Entity Saver for physics-driven entities that need to retain both their position and their motion state across saves — throwable objects, rolling boulders, physics puzzles, ragdolls, etc.

Inspector Properties

| Property | Type | Default | Description |
|---|---|---|---|
| Load On Activate | bool | true | Inherited from GS_SaverComponent. Automatically restores saved state on activation. |
| Save On Destroy | bool | true | Inherited from GS_SaverComponent. Automatically saves current state on destruction. |

What It Saves

| Data | Type | Description |
|---|---|---|
| Position | AZ::Vector3 | World-space position of the entity. |
| Rotation | AZ::Quaternion | World-space rotation of the entity. |
| Linear Velocity | AZ::Vector3 | Current linear velocity of the rigidbody. |
| Angular Velocity | AZ::Vector3 | Current angular velocity of the rigidbody. |

Setup

  1. Attach the BasicPhysicsEntitySaverComponent to any entity with a Rigidbody component.
  2. Ensure the Save Manager is running.
  3. The component saves and loads automatically.

Creating Your Own Saver

Need to persist data that the built-in Savers don’t cover? See the Extending the Saver Class guide for a complete walkthrough with header, implementation, and module registration.

See Also

For component references:


Get GS_Core

GS_Core — Explore this gem on the product page and add it to your project.

4.1.2.3 - Record Keeper

Lightweight key-value progression tracking — store and retrieve named integer records without writing a custom saver.

For usage guides and setup examples, see The Basics: GS_Core.

Image showing the Record Keeper component with its unique variables and inherited Saver properties, as seen in the Entity Inspector.

The Record Keeper is a companion component that provides a simple key-value store for tracking progression, switch states, quest stages, or any other data that doesn’t require a complex Saver implementation. Each record is a SaveRecord — a name/value pair of recordName (string) and recordProgress (integer).

Because it extends GS_SaverComponent, the Record Keeper saves and loads automatically with the rest of the save system. No custom serialization code needed.



How It Works

  1. The Record Keeper lives on your Save Manager prefab entity (recommended) or any entity with save system access.
  2. Game systems call SetRecord, GetRecord, HasRecord, and DeleteRecord via the RecordKeeperRequestBus.
  3. On each call to SetRecord, the Record Keeper broadcasts RecordChanged on the RecordKeeperNotificationBus so listeners can react.
  4. When a global save event fires (OnSaveAll), the Record Keeper serializes all its records into the save file automatically.
  5. On load, it deserializes its records and makes them available immediately.

SaveRecord Data Structure

struct SaveRecord
{
    AZStd::string recordName;    // Unique key (e.g., "quest_village_rescue", "switch_bridge_01")
    AZ::s32       recordProgress; // Integer value (progression stage, state, count, etc.)
};

TypeId: {F6F4F258-819A-468A-B015-CAF51D8289BF}
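The documented request semantics (create-or-update, and 0 for a missing record) can be mirrored with a plain std::unordered_map. The class below is a hypothetical standalone sketch, not the component itself — the real component exposes these operations via the RecordKeeperRequestBus and broadcasts RecordChanged from SetRecord:

```cpp
#include <string>
#include <unordered_map>

// Hypothetical sketch of the Record Keeper's key-value semantics.
class RecordStoreSketch
{
public:
    bool HasRecord(const std::string& name) const
    {
        return m_records.count(name) != 0;
    }

    void SetRecord(const std::string& name, int value)
    {
        m_records[name] = value; // creates the record if missing, updates otherwise
    }

    int GetRecord(const std::string& name) const
    {
        auto it = m_records.find(name);
        return it != m_records.end() ? it->second : 0; // 0 for missing records
    }

    void DeleteRecord(const std::string& name)
    {
        m_records.erase(name);
    }

private:
    std::unordered_map<std::string, int> m_records;
};
```

Note that GetRecord returning 0 for a missing record makes a never-set record indistinguishable from one explicitly set to 0 — use HasRecord when that distinction matters.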


Setup

  1. Open your Save Manager prefab in prefab edit mode.
  2. Add the RecordKeeperComponent to the Save Manager entity.
  3. Set the Record Keeper Name — this drives unique save/load identification and allows multiple Record Keepers if needed.
  4. Enable the Saver booleans as needed:
    • Load On Activate — automatically loads records when the component activates (recommended: on).
    • Save On Destroy — automatically saves records when the component is destroyed (recommended: on).

Inspector Properties

| Property | Type | Default | Description |
|---|---|---|---|
| Record Keeper Name | AZStd::string | "" | Unique identifier for this Record Keeper. Drives the save/load key. Required if using multiple Record Keepers. |
| Load On Activate | bool | true | Inherited from GS_SaverComponent. Automatically loads records from save data on activation. |
| Save On Destroy | bool | true | Inherited from GS_SaverComponent. Automatically saves records to the save system on destruction. |

API Reference

Request Bus: RecordKeeperRequestBus

The primary interface for reading and writing records.

| Method | Parameters | Returns | Description |
|---|---|---|---|
| HasRecord | const AZStd::string& recordName | bool | Returns true if a record with the given name exists. |
| SetRecord | const AZStd::string& recordName, AZ::s32 recordProgress | void | Creates or updates a record. Broadcasts RecordChanged on success. |
| GetRecord | const AZStd::string& recordName | AZ::s32 | Returns the value of the named record. Returns 0 if the record does not exist. |
| DeleteRecord | const AZStd::string& recordName | void | Removes the named record from the store. |

Notification Bus: RecordKeeperNotificationBus

Connect to this bus to react when records change.

| Event | Parameters | Description |
|---|---|---|
| RecordChanged | const AZStd::string& recordName, AZ::s32 recordValue | Broadcast whenever SetRecord is called. Use this to update UI, trigger gameplay events, or log progression. |

Usage Examples

Setting a Progression Record

#include <GS_Core/GS_CoreBus.h>

// Mark quest stage 2 as complete
GS_Core::RecordKeeperRequestBus::Broadcast(
    &GS_Core::RecordKeeperRequestBus::Events::SetRecord,
    "quest_village_rescue",
    2
);

Reading a Record

#include <GS_Core/GS_CoreBus.h>

AZ::s32 questStage = 0;
GS_Core::RecordKeeperRequestBus::BroadcastResult(
    questStage,
    &GS_Core::RecordKeeperRequestBus::Events::GetRecord,
    "quest_village_rescue"
);

if (questStage >= 2)
{
    // The player has completed stage 2 — unlock the bridge
}

Checking if a Record Exists

#include <GS_Core/GS_CoreBus.h>

bool exists = false;
GS_Core::RecordKeeperRequestBus::BroadcastResult(
    exists,
    &GS_Core::RecordKeeperRequestBus::Events::HasRecord,
    "switch_bridge_01"
);

if (!exists)
{
    // First time encountering this switch — initialize it
    GS_Core::RecordKeeperRequestBus::Broadcast(
        &GS_Core::RecordKeeperRequestBus::Events::SetRecord,
        "switch_bridge_01",
        0
    );
}

Listening for Record Changes

#include <GS_Core/GS_CoreBus.h>

// In your component header:
class MyQuestTrackerComponent
    : public AZ::Component
    , protected GS_Core::RecordKeeperNotificationBus::Handler
{
protected:
    void Activate() override
    {
        GS_Core::RecordKeeperNotificationBus::Handler::BusConnect();
    }

    void Deactivate() override
    {
        GS_Core::RecordKeeperNotificationBus::Handler::BusDisconnect();
    }

    // RecordKeeperNotificationBus
    void RecordChanged(const AZStd::string& recordName, AZ::s32 recordValue) override
    {
        if (recordName == "quest_village_rescue" && recordValue >= 3)
        {
            // Quest complete — trigger reward
        }
    }
};

Extending the Record Keeper

For complex data needs beyond simple key-value records, create custom Saver components.


See Also

For component references:


Get GS_Core

GS_Core — Explore this gem on the product page and add it to your project.

4.1.3 - GS_StageManager

The level navigation system — handles loading, unloading, and staged startup of game levels with exit point traversal.

The Stage Manager system handles the process of changing levels, loading and unloading their assets, and starting up levels in the correct sequence. It supports incremental loading so that complex levels with generative components can spin up reliably without massive frame-time spikes.

Each level contains a Stage Data component that acts as the level’s anchor — connecting the Stage Manager to level-specific references, startup sequences, and navigation data. Exit Points mark spawn positions for player placement when transitioning between stages.

For usage guides and setup examples, see The Basics: GS_Core.



Architecture

Stage Change Pattern Graph

Breakdown

When a stage change is requested, the system follows a five-step sequence before gameplay resumes in the new level:

| Step | Broadcast Event | What It Means |
|---|---|---|
| 1 — Standby | OnEnterStandby | Game Manager enters standby, halting all gameplay systems. |
| 2 — Unload | (internal) | The current stage’s entities are torn down. |
| 3 — Spawn | (internal) | The target stage prefab is instantiated. |
| 4 — Set Up | OnBeginSetUpStage | Stage Data runs its layered startup: SetUpStage → ActivateByPriority → Complete. |
| 5 — Complete | LoadStageComplete | Stage Manager broadcasts completion. Standby exits. Gameplay resumes. |

The Stage Data startup is layered so heavy levels can initialize incrementally without causing frame-time spikes.

“E” indicates extensible classes and methods.

Patterns - Complete list of system patterns used in GS_Play.


Components

| Component | Purpose | Documentation |
|---|---|---|
| GS_StageManagerComponent | Singleton level controller. Manages the Stages list, handles change requests, coordinates loading/unloading. | Stage Manager |
| GS_StageDataComponent | Per-level anchor. Holds stage name, NavMesh reference, and runs the level’s startup sequence. | Stage Data |
| StageExitPointComponent | Marks spawn/exit positions within a level. Registered by name for cross-stage entity placement. | Stage Manager: Exit Points |
| StageLazyLoaderComponent | Priority-based entity activation during level load. Spreads heavy initialization across frames. | Stage Data: Priority Loading |

Quick Start

  1. Create a Stage Manager prefab with the GS_StageManagerComponent.
  2. Add stage entries to the Stages list (name + spawnable prefab for each level).
  3. Set the Default Stage to your starting level.
  4. Add the Stage Manager .spawnable to the Game Manager’s Startup Managers list.
  5. In each level prefab, add a Stage Data component as the level’s anchor.
  6. Optionally place Exit Points in each level for spawn positioning.

See Also

For conceptual overviews and usage guides:

For component references:

For related resources:


Get GS_Core

GS_Core — Explore this gem on the product page and add it to your project.

4.1.3.1 - Stage Manager

The singleton level controller — manages a list of stages, handles level loading/unloading, and coordinates staged startup with the Stage Data component.

The Stage Manager is the singleton that handles all the logistics of loading and unloading game levels. It maintains a list of stages (name-to-spawnable mappings) and processes change requests by unloading the current stage and spawning the target. Once loaded, it connects to the level’s Stage Data component to run the staged startup sequence.

The system operates on a “one in, one out” basis — loading one level automatically unloads the previous one.

For usage guides and setup examples, see The Basics: GS_Core.

Stage Manager component in the O3DE Inspector



How It Works

Startup

At game startup, the Stage Manager checks the Game Manager’s Debug Mode flag:

  • Debug Mode OFF — The Stage Manager loads the Default Stage automatically. This is the normal game flow.
  • Debug Mode ON — The Stage Manager stays connected to whatever level is already loaded in the editor. This lets developers iterate within a stage without waiting for a full level load cycle.
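The startup branch reduces to a single decision, sketched here as a hypothetical helper (names are illustrative; the real check reads the Game Manager’s Debug Mode flag):

```cpp
#include <string>

// Hypothetical sketch of the Stage Manager's startup decision.
std::string StartupAction(bool debugModeEnabled)
{
    // Debug Mode: stay connected to whatever level the editor already has
    // loaded. Normal flow: load the configured Default Stage.
    return debugModeEnabled ? "ConnectToLoadedLevel" : "LoadDefaultStage";
}
```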

Change Stage Flow

When ChangeStageRequest(stageName, exitName) is called:

  1. Unload — The current stage is destroyed (if one is loaded).
  2. Spawn — The Stage Manager looks up stageName in the Stages list and spawns the matching prefab.
  3. Connect — The Stage Manager queries the newly spawned level for its Stage Data component.
  4. Startup — The Stage Data component runs the level’s staged activation sequence (priority layers, NavMesh generation, etc.).
  5. Notify — The Stage Manager broadcasts LoadStageComplete when the level is fully ready.

Exit Points

Stage Exit Point component in the O3DE Inspector

Exit Points are simple components placed within a level that mark positions for entity placement after a stage loads. They support cross-stage traversal — when changing stages, you specify an exit point name, and entities can be repositioned to that point.

  • Exit Points register themselves with the Stage Manager by name via RegisterExitPoint.
  • One Exit Point can be flagged as Default — it serves as the fallback if no specific exit point is requested.
  • Call GetExitPoint(exitName) to retrieve the entity at a named position.
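The registry semantics described above — named registration, a default fallback, and an invalid result for unknown names — can be sketched with plain C++ containers. This is a hypothetical standalone class; the real API stores AZ::EntityId values and lives on the StageManagerRequestBus:

```cpp
#include <string>
#include <unordered_map>

// Hypothetical sketch of the exit-point registry with default fallback.
// Entity ids are plain ints here; 0 stands in for an invalid id
// (AZ::EntityId() in the real API).
class ExitPointRegistrySketch
{
public:
    void RegisterExitPoint(const std::string& name, int entityId, bool isDefault = false)
    {
        m_points[name] = entityId;
        if (isDefault) { m_defaultName = name; }
    }

    void UnregisterExitPoint(const std::string& name)
    {
        m_points.erase(name);
    }

    // An empty name falls back to the default exit point; unknown names
    // return the invalid id.
    int GetExitPoint(const std::string& name = "") const
    {
        const std::string& key = name.empty() ? m_defaultName : name;
        auto it = m_points.find(key);
        return it != m_points.end() ? it->second : 0;
    }

private:
    std::unordered_map<std::string, int> m_points;
    std::string m_defaultName;
};
```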

Setup

Image showing the Manager wrapper entity set as Editor-Only inside Prefab Edit Mode.

  1. Create an entity. Attach the GS_StageManagerComponent to it.
  2. Configure the Stages list — add an entry for each level (stage name + spawnable prefab reference).
  3. Set the Default Stage to your starting level. This name must also exist in the Stages list.
  4. Turn the entity into a prefab.
  5. Enter prefab edit mode. Set the wrapper entity (parent) to Editor Only. Save.
  6. Delete the Stage Manager entity from the level.
  7. In the Game Manager prefab, add the Stage Manager .spawnable to the Startup Managers list.

Inspector Properties

| Property | Type | Default | Description |
|---|---|---|---|
| Stages | AZStd::vector<StageEntry> | [] | List of stage entries. Each entry maps a stage name to a spawnable prefab asset. |
| Default Stage | AZStd::string | "" | The stage to load automatically on game startup (when not in Debug Mode). Must match a name in the Stages list. |

API Reference

Request Bus: StageManagerRequestBus

The primary interface for level navigation. Singleton bus — call via Broadcast.

| Method | Parameters | Returns | Description |
|---|---|---|---|
| ChangeStageRequest | AZStd::string stageName, AZStd::string exitName | void | Unloads the current stage and loads the named stage. The exitName specifies which Exit Point to use for entity placement. Pass empty string for default. |
| LoadDefaultStage | | void | Loads the Default Stage. Typically called internally during game startup. |
| RegisterExitPoint | AZStd::string exitName, AZ::EntityId exitEntity | void | Registers a named exit point entity. Called by StageExitPointComponents on activation. |
| UnregisterExitPoint | AZStd::string exitName | void | Removes a named exit point. Called by StageExitPointComponents on deactivation. |
| GetExitPoint | AZStd::string exitName = "" | AZ::EntityId | Returns the entity ID of the named exit point. Pass empty string to get the default exit point. |

Notification Bus: StageManagerNotificationBus

Connect to this bus to react to level loading events.

| Event | Parameters | Description |
|---|---|---|
| BeginLoadStage | | Broadcast when a stage change has started. Use this to show loading screens, disable input, etc. |
| LoadStageComplete | | Broadcast when the new stage is fully loaded and its startup sequence is complete. Safe to begin gameplay. |
| StageLoadProgress | float progress | Broadcast during the loading process with a 0.0–1.0 progress value. Use for loading bar UI. |

Local / Virtual Methods

These methods are available when extending the Stage Manager.

| Method | Description |
|---|---|
| ChangeStage(stageName, exitName) | Internal stage change logic. Override to add custom behavior around stage transitions. |
| LoadStage() | Spawns the target stage prefab. Override for custom spawning logic. |
| ContinueLoadStage() | Called after the stage prefab has spawned. Queries for Stage Data and starts the level’s activation sequence. |
| UpdateStageData() | Connects to the Stage Data component in the loaded level. |
| GetStageByName(stageName) | Returns the spawnable asset reference for a named stage from the Stages list. |

Usage Examples

Changing Stages

#include <GS_Core/GS_CoreBus.h>

// Load "ForestVillage" and place the player at the "main_entrance" exit point
GS_Core::StageManagerRequestBus::Broadcast(
    &GS_Core::StageManagerRequestBus::Events::ChangeStageRequest,
    AZStd::string("ForestVillage"),
    AZStd::string("main_entrance")
);

Getting an Exit Point Entity

#include <GS_Core/GS_CoreBus.h>

AZ::EntityId exitEntity;
GS_Core::StageManagerRequestBus::BroadcastResult(
    exitEntity,
    &GS_Core::StageManagerRequestBus::Events::GetExitPoint,
    AZStd::string("cave_exit")
);

if (exitEntity.IsValid())
{
    // Reposition the player entity to the exit point's transform
}

Listening for Stage Load Events

#include <GS_Core/GS_CoreBus.h>

class MyLoadScreenComponent
    : public AZ::Component
    , protected GS_Core::StageManagerNotificationBus::Handler
{
protected:
    void Activate() override
    {
        GS_Core::StageManagerNotificationBus::Handler::BusConnect();
    }

    void Deactivate() override
    {
        GS_Core::StageManagerNotificationBus::Handler::BusDisconnect();
    }

    void BeginLoadStage() override
    {
        // Show loading screen, disable player input
    }

    void LoadStageComplete() override
    {
        // Hide loading screen, enable player input
    }

    void StageLoadProgress(float progress) override
    {
        // Update loading bar: progress is 0.0 to 1.0
    }
};

Script Canvas

Changing stages:

Getting an exit point:


Extending the Stage Manager

Extend the Stage Manager when you need custom stage transition logic, procedural level generation, or multi-stage loading.

Header (.h)

#pragma once
#include <Source/StageManager/GS_StageManagerComponent.h>

namespace MyProject
{
    class MyStageManagerComponent
        : public GS_Core::GS_StageManagerComponent
    {
    public:
        AZ_COMPONENT_DECL(MyStageManagerComponent);

        static void Reflect(AZ::ReflectContext* context);

    protected:
        // Override stage change behavior
        void ChangeStage(AZStd::string stageName, AZStd::string exitName) override;
        void ContinueLoadStage() override;
    };
}

Implementation (.cpp)

#include "MyStageManagerComponent.h"
#include <AzCore/Serialization/SerializeContext.h>

namespace MyProject
{
    AZ_COMPONENT_IMPL(MyStageManagerComponent, "MyStageManagerComponent", "{YOUR-UUID-HERE}",
        GS_Core::GS_StageManagerComponent);

    void MyStageManagerComponent::Reflect(AZ::ReflectContext* context)
    {
        if (auto serializeContext = azrtti_cast<AZ::SerializeContext*>(context))
        {
            serializeContext->Class<MyStageManagerComponent, GS_Core::GS_StageManagerComponent>()
                ->Version(0);

            if (AZ::EditContext* editContext = serializeContext->GetEditContext())
            {
                editContext->Class<MyStageManagerComponent>(
                    "My Stage Manager", "Custom stage manager with transition effects")
                    ->ClassElement(AZ::Edit::ClassElements::EditorData, "")
                        ->Attribute(AZ::Edit::Attributes::Category, "MyProject")
                        ->Attribute(AZ::Edit::Attributes::AppearsInAddComponentMenu, AZ_CRC_CE("Game"));
            }
        }
    }

    void MyStageManagerComponent::ChangeStage(AZStd::string stageName, AZStd::string exitName)
    {
        // Example: trigger a fade-out transition before changing stages
        // ... your transition logic ...

        // Call base to perform the actual stage change
        GS_StageManagerComponent::ChangeStage(stageName, exitName);
    }

    void MyStageManagerComponent::ContinueLoadStage()
    {
        // Call base to connect to Stage Data and start level activation
        GS_StageManagerComponent::ContinueLoadStage();

        // Example: trigger a fade-in transition after loading
        // ... your transition logic ...
    }
}

Module Registration

m_descriptors.insert(m_descriptors.end(), {
    MyProject::MyStageManagerComponent::CreateDescriptor(),
});

Then create a prefab for your custom Stage Manager and add it to the Game Manager’s Startup Managers list (replacing the default).


See Also

For component references:

For related resources:


Get GS_Core

GS_Core — Explore this gem on the product page and add it to your project.

4.1.3.2 - Stage Data

The per-level anchor component — holds stage configuration, NavMesh references, and runs the level’s staged activation sequence.

The Stage Data component is the anchor that connects the Stage Manager to a loaded level. It holds level-specific configuration — stage name, NavMesh reference, priority layers — and runs the staged activation sequence that brings the level online incrementally rather than all at once.

Every level that loads through the Stage Manager should have exactly one Stage Data component at the root of its prefab hierarchy.

For usage guides and setup examples, see The Basics: GS_Core.

Stage Data component in the O3DE Inspector



How It Works

Connection

When the Stage Manager spawns a stage prefab, it queries the level for the Stage Data component via StageDataRequestBus. The Stage Data returns its reference, and the Stage Manager hands off the startup process.


Staged Activation

Rather than activating every entity in the level at once (which would cause a massive frame spike), the Stage Data component supports priority-based activation:

  1. SetUpStage() — Initial stage configuration. Sets up references, connects internal systems.
  2. ActivateByPriority — Entities tagged with priority layers activate in sequence, spreading heavy initialization across multiple frames. This allows NavMesh generation, procedural content, and complex entity hierarchies to spin up without blocking the main thread.
  3. LoadStageComplete — The Stage Manager broadcasts this once all priority layers have finished and the level is fully ready.
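The layering idea can be sketched with standard containers — group entities by priority, then hand out one layer per frame. Function and entity names below are hypothetical; the real system broadcasts ActivateByPriority on the StageDataNotificationBus:

```cpp
#include <map>
#include <string>
#include <vector>

// Hypothetical sketch of priority-based staged activation: each priority
// layer becomes one "frame" of activation work, so heavy startup is spread
// out instead of happening all at once.
std::vector<std::vector<std::string>> BuildActivationFrames(
    const std::map<int, std::vector<std::string>>& entitiesByPriority)
{
    std::vector<std::vector<std::string>> frames;
    // std::map iterates in ascending key order, so lower-priority layers
    // activate first.
    for (const auto& [priority, entities] : entitiesByPriority)
    {
        frames.push_back(entities); // one layer per frame
    }
    return frames;
}
```

In the real system each layer would be dispatched on a successive tick, with LoadStageComplete broadcast only after the final layer finishes.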

 

The Stage Data component holds a reference to the level’s Recast NavMesh entity. This allows the Stage Manager to trigger NavMesh generation at the appropriate point during the startup sequence — after the level geometry is loaded but before AI systems begin pathfinding.


Setup

  1. In your level prefab, create an entity at the root level.
  2. Attach the GS_StageDataComponent to it.
  3. Set the Stage Name to match the name used in the Stage Manager’s Stages list.
  4. Assign the NavMesh entity reference if your level uses Recast Navigation.
  5. Configure priority layers for incremental entity activation if needed.

Inspector Properties

| Property | Type | Default | Description |
| --- | --- | --- | --- |
| Stage Name | AZStd::string | "" | The name of this stage. Must match the corresponding entry in the Stage Manager’s Stages list. |
| NavMesh | AZ::EntityId | Invalid | Reference to the entity with the Recast Navigation component. Used to trigger NavMesh generation during staged startup. |

API Reference

Request Bus: StageDataRequestBus

Used by the Stage Manager to connect to and control the level’s startup.

| Method | Parameters | Returns | Description |
| --- | --- | --- | --- |
| GetStageData | None | StageData* | Returns a reference to this Stage Data component. Used by the Stage Manager during initial connection. |
| BeginSetUpStage | None | void | Starts the stage’s setup sequence. Called by the Stage Manager after connecting. |
| GetStageName | None | AZStd::string | Returns the configured stage name. |
| GetStageNavMesh | None | AZ::EntityId | Returns the NavMesh entity reference for this stage. |

Notification Bus: StageDataNotificationBus

Connect to this bus for level-specific lifecycle events.

| Event | Parameters | Description |
| --- | --- | --- |
| OnLoadStageComplete | None | Broadcast when this stage’s activation sequence has finished. |
| OnBeginSetUpStage | None | Broadcast when the stage setup begins. |
| OnTearDownStage | None | Broadcast when the stage is being unloaded. Use for level-specific cleanup. |
| ActivateByPriority | int priority | Broadcast for each priority layer during staged activation. Entities at this priority level should activate. |

Local / Virtual Methods

These methods are available when extending the Stage Data component.

| Method | Returns | Description |
| --- | --- | --- |
| SetUpStage() | bool | Runs the stage’s initial setup. Override to add custom startup logic. Returns true when setup is complete. |
| BeginLoadStage() | void | Called when the stage begins loading. Override for custom load-start behavior. |
| LoadStageComplete() | void | Called when loading is complete. Override for custom post-load behavior. |

Usage Examples

Listening for Stage Events in a Level Component

#include <GS_Core/GS_CoreBus.h>

class MyLevelController
    : public AZ::Component
    , protected GS_Core::StageDataNotificationBus::Handler
{
protected:
    void Activate() override
    {
        GS_Core::StageDataNotificationBus::Handler::BusConnect();
    }

    void Deactivate() override
    {
        GS_Core::StageDataNotificationBus::Handler::BusDisconnect();
    }

    void OnBeginSetUpStage() override
    {
        // Level setup is starting — initialize level-specific systems
    }

    void OnLoadStageComplete() override
    {
        // Level is fully ready — spawn enemies, start ambient audio, etc.
    }

    void OnTearDownStage() override
    {
        // Level is being unloaded — clean up level-specific resources
    }
};

Priority-Based Activation

#include <GS_Core/GS_CoreBus.h>

class MyPriorityActivator
    : public AZ::Component
    , protected GS_Core::StageDataNotificationBus::Handler
{
    int m_myPriority = 2; // Activate during priority layer 2

protected:
    void Activate() override
    {
        GS_Core::StageDataNotificationBus::Handler::BusConnect();
    }

    void Deactivate() override
    {
        GS_Core::StageDataNotificationBus::Handler::BusDisconnect();
    }

    void ActivateByPriority(int priority) override
    {
        if (priority == m_myPriority)
        {
            // This is our turn — initialize heavy resources
            // (procedural generation, AI setup, physics volumes, etc.)
        }
    }
};
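Querying Stage Data

The request bus methods can also be called directly from C++. The sketch below assumes the bus is broadcast-addressed (there is exactly one Stage Data per loaded level); if your version addresses the bus by entity, use Event(stageEntityId, ...) instead:

```cpp
#include <GS_Core/GS_CoreBus.h>

// Query the active stage's name and NavMesh entity.
AZStd::string stageName;
GS_Core::StageDataRequestBus::BroadcastResult(
    stageName,
    &GS_Core::StageDataRequestBus::Events::GetStageName);

AZ::EntityId navMeshEntity;
GS_Core::StageDataRequestBus::BroadcastResult(
    navMeshEntity,
    &GS_Core::StageDataRequestBus::Events::GetStageNavMesh);

if (navMeshEntity.IsValid())
{
    // Safe to reference this stage's NavMesh for AI queries.
}
```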

Script Canvas

Getting stage name and NavMesh reference:


Extending the Stage Data Component

Extend Stage Data when you need custom level startup sequences, procedural generation hooks, or level-specific configuration.

Header (.h)

#pragma once
#include <Source/StageManager/GS_StageDataComponent.h>

namespace MyProject
{
    class MyStageDataComponent
        : public GS_Core::GS_StageDataComponent
    {
    public:
        AZ_COMPONENT_DECL(MyStageDataComponent);

        static void Reflect(AZ::ReflectContext* context);

    protected:
        // Override to add custom stage setup
        bool SetUpStage() override;
        void LoadStageComplete() override;

    private:
        bool m_enableProceduralContent = false;
    };
}

Implementation (.cpp)

#include "MyStageDataComponent.h"
#include <AzCore/Serialization/SerializeContext.h>

namespace MyProject
{
    AZ_COMPONENT_IMPL(MyStageDataComponent, "MyStageDataComponent", "{YOUR-UUID-HERE}",
        GS_Core::GS_StageDataComponent);

    void MyStageDataComponent::Reflect(AZ::ReflectContext* context)
    {
        if (auto serializeContext = azrtti_cast<AZ::SerializeContext*>(context))
        {
            serializeContext->Class<MyStageDataComponent, GS_Core::GS_StageDataComponent>()
                ->Version(0)
                ->Field("EnableProceduralContent", &MyStageDataComponent::m_enableProceduralContent);

            if (AZ::EditContext* editContext = serializeContext->GetEditContext())
            {
                editContext->Class<MyStageDataComponent>(
                    "My Stage Data", "Custom stage data with procedural generation support")
                    ->ClassElement(AZ::Edit::ClassElements::EditorData, "")
                        ->Attribute(AZ::Edit::Attributes::Category, "MyProject")
                        ->Attribute(AZ::Edit::Attributes::AppearsInAddComponentMenu, AZ_CRC_CE("Game"))
                    ->DataElement(AZ::Edit::UIHandlers::Default,
                        &MyStageDataComponent::m_enableProceduralContent,
                        "Enable Procedural Content",
                        "Generate procedural content during stage setup");
            }
        }
    }

    bool MyStageDataComponent::SetUpStage()
    {
        if (m_enableProceduralContent)
        {
            // Run procedural generation for this level
            // ... your generation logic ...
        }

        // Call base to continue the standard setup sequence
        return GS_StageDataComponent::SetUpStage();
    }

    void MyStageDataComponent::LoadStageComplete()
    {
        // Custom post-load behavior
        // ... your logic ...

        GS_StageDataComponent::LoadStageComplete();
    }
}

Module Registration

m_descriptors.insert(m_descriptors.end(), {
    MyProject::MyStageDataComponent::CreateDescriptor(),
});

Use your custom Stage Data component in level prefabs instead of the default GS_StageDataComponent.


See Also

For component references:

For related resources:



4.1.4 - GS_Options

The configuration system — input profiles, input readers, and runtime settings management for gameplay and accessibility.

The Options system is the central hub for systemic, behind-the-scenes configuration. At its simplest, it lets you expose user-facing options such as volume, graphics settings, and subtitles. Beyond that, it manages development-side state — language settings, the current input device type, and other runtime-aware data — and provides that data to hook components throughout the game.

Currently, the Options system’s primary feature is the Input Profile system, which replaces O3DE’s raw input binding files with a more flexible, group-based approach that supports runtime rebinding and per-group enable/disable toggling.

For usage guides and setup examples, see The Basics: GS_Core.

 



Components

| Component | Purpose | Documentation |
| --- | --- | --- |
| GS_OptionsManagerComponent | Singleton manager. Holds the active Input Profile and provides it to Input Readers. | Options Manager |
| GS_InputProfile | Data asset defining input groups, event mappings, and key bindings. Created in the Asset Editor. | Input Profiles |
| GS_InputReaderComponent | Reads input through the active profile and fires gameplay events. Can be extended for specialized input handling. | Input Options |

Quick Start

  1. Create an Options Manager prefab with the GS_OptionsManagerComponent.
  2. Create a GS_InputProfile data asset in the Asset Editor.
  3. Add input groups and event mappings to the profile.
  4. Assign the Input Profile to the Options Manager.
  5. Add the Options Manager .spawnable to the Game Manager’s Startup Managers list.
  6. Attach GS_InputReaderComponents to entities that need to process input.

See Also

For conceptual overviews and usage guides:

For component references:

For related resources:



4.1.4.1 - Options Manager

The singleton options controller — holds the active Input Profile and provides runtime configuration data to game systems.

For usage guides and setup examples, see The Basics: GS_Core.

Options Manager component in the O3DE Inspector

The Options Manager is the central anchor of the Options system. It holds the active Input Profile data asset and provides it to Input Readers and other game systems that need runtime configuration data.

As a Manager, it is spawned by the Game Manager and participates in the standard two-stage initialization lifecycle.

 



How It Works

  1. The Options Manager initializes during the Game Manager’s startup sequence.
  2. It loads the configured Input Profile data asset.
  3. Input Readers across the game query the Options Manager for the active profile via OptionsManagerRequestBus.
  4. The Options Manager responds to standby events — pausing and resuming input processing when the game enters or exits standby mode.

Setup

Manager wrapper entity set to Editor Only inside Prefab Edit Mode

  1. Create an entity. Attach the GS_OptionsManagerComponent to it.
  2. Assign your Input Profile data asset.
  3. Turn the entity into a prefab.
  4. Enter prefab edit mode. Set the wrapper entity (parent) to Editor Only. Save.
  5. Delete the Options Manager entity from the level.
  6. In the Game Manager prefab, add the Options Manager .spawnable to the Startup Managers list.

Inspector Properties

| Property | Type | Default | Description |
| --- | --- | --- | --- |
| Input Profile | AZ::Data::Asset<GS_InputProfile> | None | The active Input Profile data asset. Input Readers across the game will use this profile for event-to-binding mappings. |

API Reference

Request Bus: OptionsManagerRequestBus

Singleton bus — call via Broadcast.

| Method | Parameters | Returns | Description |
| --- | --- | --- | --- |
| GetActiveInputProfile | None | AZ::Data::Asset<GS_InputProfile> | Returns the currently active Input Profile data asset. Called by Input Readers on activation. |

Lifecycle Events (from GameManagerNotificationBus)

| Event | Description |
| --- | --- |
| OnStartupComplete | The Options Manager is fully ready. Input Readers can now query for the active profile. |
| OnEnterStandby | Pause input processing. |
| OnExitStandby | Resume input processing. |

Usage Examples

Getting the Active Input Profile

#include <GS_Core/GS_CoreBus.h>

AZ::Data::Asset<GS_Core::GS_InputProfile> profile;
GS_Core::OptionsManagerRequestBus::BroadcastResult(
    profile,
    &GS_Core::OptionsManagerRequestBus::Events::GetActiveInputProfile
);

if (profile.IsReady())
{
    // Use the profile to look up bindings, check group states, etc.
}

Extending the Options Manager

Extend the Options Manager when you need additional runtime settings (graphics quality, audio levels, accessibility), custom options persistence, or platform-specific configuration.

Header (.h)

#pragma once
#include <Source/OptionsSystem/GS_OptionsManagerComponent.h>

namespace MyProject
{
    class MyOptionsManagerComponent
        : public GS_Core::GS_OptionsManagerComponent
    {
    public:
        AZ_COMPONENT_DECL(MyOptionsManagerComponent);

        static void Reflect(AZ::ReflectContext* context);

    protected:
        void OnStartupComplete() override;

    private:
        float m_masterVolume = 1.0f;
        int m_graphicsQuality = 2; // 0=Low, 1=Med, 2=High
    };
}

Implementation (.cpp)

#include "MyOptionsManagerComponent.h"
#include <AzCore/Serialization/SerializeContext.h>

namespace MyProject
{
    AZ_COMPONENT_IMPL(MyOptionsManagerComponent, "MyOptionsManagerComponent", "{YOUR-UUID-HERE}",
        GS_Core::GS_OptionsManagerComponent);

    void MyOptionsManagerComponent::Reflect(AZ::ReflectContext* context)
    {
        if (auto serializeContext = azrtti_cast<AZ::SerializeContext*>(context))
        {
            serializeContext->Class<MyOptionsManagerComponent, GS_Core::GS_OptionsManagerComponent>()
                ->Version(0)
                ->Field("MasterVolume", &MyOptionsManagerComponent::m_masterVolume)
                ->Field("GraphicsQuality", &MyOptionsManagerComponent::m_graphicsQuality);

            if (AZ::EditContext* editContext = serializeContext->GetEditContext())
            {
                editContext->Class<MyOptionsManagerComponent>(
                    "My Options Manager", "Extended options with audio and graphics settings")
                    ->ClassElement(AZ::Edit::ClassElements::EditorData, "")
                        ->Attribute(AZ::Edit::Attributes::Category, "MyProject")
                        ->Attribute(AZ::Edit::Attributes::AppearsInAddComponentMenu, AZ_CRC_CE("Game"))
                    ->DataElement(AZ::Edit::UIHandlers::Slider,
                        &MyOptionsManagerComponent::m_masterVolume, "Master Volume", "Global audio volume")
                        ->Attribute(AZ::Edit::Attributes::Min, 0.0f)
                        ->Attribute(AZ::Edit::Attributes::Max, 1.0f)
                    ->DataElement(AZ::Edit::UIHandlers::ComboBox,
                        &MyOptionsManagerComponent::m_graphicsQuality, "Graphics Quality", "Rendering quality preset");
            }
        }
    }

    void MyOptionsManagerComponent::OnStartupComplete()
    {
        // Apply saved options on startup
        // ... load from save system and apply settings ...

        GS_OptionsManagerComponent::OnStartupComplete();
    }
}

Module Registration

m_descriptors.insert(m_descriptors.end(), {
    MyProject::MyOptionsManagerComponent::CreateDescriptor(),
});

See Also

For component references:



4.1.4.2 - GS_InputOptions

The input subsystem — Input Profiles for grouped event-to-binding mappings and Input Readers for processing input in gameplay.

Input Options is the input subsystem within the Options system. It provides two core pieces:

  • Input Profiles — Data assets that map input bindings to named events, organized into toggleable groups. These replace O3DE’s raw input binding files with a more flexible system that supports runtime rebinding and per-group enable/disable toggling.
  • Input Readers — Components that read input through the active profile and fire gameplay events. They can be extended for specialized input handling (player controllers, UI navigation, etc.).

For usage guides and setup examples, see The Basics: GS_Core.

 



Architecture

GS_OptionsManagerComponent
  └── Active GS_InputProfile (data asset)
        ├── InputBindingGroup: "Movement"
        │     ├── EventInputMapping: "MoveForward" → [keyboard_key_alphanumeric_W]
        │     ├── EventInputMapping: "MoveBack"    → [keyboard_key_alphanumeric_S]
        │     └── EventInputMapping: "Sprint"      → [keyboard_key_modifier_shift_l]
        ├── InputBindingGroup: "Combat"
        │     ├── EventInputMapping: "Attack"   → [mouse_button_left]
        │     └── EventInputMapping: "Block"    → [mouse_button_right]
        └── InputBindingGroup: "UI"
              ├── EventInputMapping: "Confirm"  → [keyboard_key_alphanumeric_E]
              └── EventInputMapping: "Cancel"   → [keyboard_key_navigation_escape]

GS_InputReaderComponent (on player entity, UI entity, etc.)
  ├── Queries OptionsManager for active profile
  ├── Listens for raw O3DE input events
  ├── Matches bindings → fires named events
  └── EnableInputGroup / DisableInputGroup for runtime control

Input Profiles

Data assets that map key bindings to named events, organized into toggleable groups. They replace O3DE’s raw input binding files with a more flexible system that supports runtime rebinding and per-group enable/disable toggling.

| Asset / Type | Purpose |
| --- | --- |
| GS_InputProfile | Data asset defining input groups, event mappings, and key bindings. |
| InputBindingGroup | Named group of event mappings — can be toggled at runtime. |
| EventInputMapping | Maps a named event to one or more raw O3DE input bindings. |

Input Profiles API


Input Reader

Components that read input through the active profile and fire named gameplay events. Supports runtime group toggling to enable or disable input categories on the fly, and a claim flag to absorb matched events from other readers.

| Component | Purpose |
| --- | --- |
| GS_InputReaderComponent | Per-entity input processor. Queries the active profile, matches raw input, and fires named gameplay events. |

Input Reader API


Setup

  1. Create a GS_InputProfile data asset in the Asset Editor.
  2. Add input groups and event mappings to the profile.
  3. Assign the profile to the Options Manager.
  4. Attach a GS_InputReaderComponent to any entity that needs to process input.
  5. The Input Reader automatically queries the Options Manager for the active profile on activation.

See Also

For component references:

For related resources:



4.1.4.2.1 - Input Profiles

Data assets for input binding configuration — map key bindings to named events, organized into toggleable groups for advanced runtime input control.

Input Profiles are data assets that map key bindings to named events, organized into groups that can be enabled and disabled at runtime. They replace O3DE’s raw input binding files with a more flexible system that supports runtime rebinding, per-group toggling, and options menu customization.

For usage guides and setup examples, see The Basics: GS_Core.

Input Profile asset in the O3DE Asset Editor

 



How It Works

An Input Profile contains a list of InputBindingGroups. Each group contains EventInputMappings that define the relationship between a named event, its key bindings, and processing options.

Data Structure

GS_InputProfile
  └── InputBindingGroups[]
        ├── groupName: "Movement"
        └── eventMappings[]
              ├── EventInputMapping
              │     ├── eventName: "MoveForward"
              │     ├── inputBindings: ["keyboard_key_alphanumeric_W"]
              │     ├── deadzone: 0.0
              │     └── processUpdate: true
              └── EventInputMapping
                    ├── eventName: "Jump"
                    ├── inputBindings: ["keyboard_key_alphanumeric_Space"]
                    ├── deadzone: 0.0
                    └── processUpdate: false

EventInputMapping

| Field | Type | Description |
| --- | --- | --- |
| eventName | AZStd::string | The name used to fire this event in gameplay through Input Readers. This is the identifier your gameplay code listens for. |
| inputBindings | AZStd::vector<AZStd::string> | One or more raw O3DE input binding names (e.g., keyboard_key_alphanumeric_W, mouse_button_left, gamepad_button_a). Multiple bindings allow the same event to fire from different input sources. |
| deadzone | float | Dead zone threshold for analog inputs (joysticks, triggers). Values below this threshold are treated as zero. Set to 0.0 for digital inputs (keys, buttons). |
| processUpdate | bool | When true, the event fires continuously while the input is held. When false, the event fires only on initial press and release. Use true for movement and camera input; false for discrete actions like jumping or interacting. |

TypeId: {61421EC2-7B99-4EF2-9C56-2A7F41ED3474}

Reflection: GS_InputProfile extends AZ::Data::AssetData. Its Reflect() function requires GS_AssetReflectionIncludes.h — see Serialization Helpers.

InputBindingGroup

| Field | Type | Description |
| --- | --- | --- |
| groupName | AZStd::string | Name of the group (e.g., “Movement”, “Combat”, “UI”). Used by Input Readers to enable/disable groups at runtime. |
| eventMappings | AZStd::vector<EventInputMapping> | The event mappings belonging to this group. |

TypeId: {37E880D1-9AB4-4E10-9C3C-020B5C32F75B}


Creating and Using an Input Profile

For initial startup instructions refer to the Core Set Up Guide.

  1. In the Asset Editor, open the New menu and select GS_InputProfile. This creates a blank Input Profile.
  2. Add Input Groups — Create groups that cluster related input events (e.g., “Movement”, “Combat”, “UI”). Groups can be toggled on/off at runtime by Input Readers.
  3. Add Event Mappings — Within each group, add EventInputMappings. Set the Event Name to the identifier your gameplay code will listen for.
  4. Set Bindings — Add the raw O3DE input bindings that should trigger each event. These are standard O3DE raw mapping names.
  5. Configure Deadzone — For gamepad joystick or trigger inputs, set an appropriate deadzone value (e.g., 0.15). Leave at 0.0 for keyboard and mouse inputs.
  6. Set Process Update — Enable for continuous input (movement, camera look). Disable for discrete actions (jump, interact, attack).
  7. Assign to Options Manager — Add the profile to the Options Manager’s Input Profile property.

API Reference

These methods are available on the GS_InputProfile data asset class for runtime binding queries and modifications.

| Method | Parameters | Returns | Description |
| --- | --- | --- | --- |
| GetMappingFromBinding | const AZStd::string& binding, const AZStd::string& groupName = "" | const EventInputMapping* | Looks up the event mapping associated with a raw binding. Optionally restrict the search to a specific group. Returns nullptr if not found. |
| GetGroupNameFromBinding | const AZStd::string& binding | const AZStd::string* | Returns the group name that contains the given binding. Returns nullptr if not found. |
| HasBinding | const AZStd::string& binding | bool | Returns true if the binding exists anywhere in the profile. |
| ReplaceEventInputBinding | const AZStd::string& eventName, const AZStd::string& newBinding | bool | Replaces the existing binding for the named event with a new one. For runtime rebinding in options menus. Returns true on success. |
| AddEventInputBinding | const AZStd::string& eventName, const AZStd::string& newBinding | bool | Adds an additional binding to the named event. Returns true on success. |
| RemoveEventBinding | const AZStd::string& eventName, const AZStd::string& bindingToRemove | bool | Removes a specific binding from the named event. Returns true on success. |

Usage Examples

Runtime Rebinding (Options Menu)

#include <GS_Core/GS_CoreBus.h>

// Get the active input profile
AZ::Data::Asset<GS_Core::GS_InputProfile> profile;
GS_Core::OptionsManagerRequestBus::BroadcastResult(
    profile,
    &GS_Core::OptionsManagerRequestBus::Events::GetActiveInputProfile
);

if (profile.IsReady())
{
    // Rebind the "Jump" event from Space to E
    profile.Get()->ReplaceEventInputBinding(
        "Jump",
        "keyboard_key_alphanumeric_E"
    );

    // Add an alternative binding (gamepad A button)
    profile.Get()->AddEventInputBinding(
        "Jump",
        "gamepad_button_a"
    );
}

Checking if a Binding Exists

if (profile.IsReady() && profile.Get()->HasBinding("keyboard_key_alphanumeric_W"))
{
    // This binding is mapped to an event in the profile
}
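Resolving a Raw Binding to Its Mapping

The lookup methods can also resolve a raw binding back to its event mapping and owning group, which is useful when displaying current bindings in an options menu. A sketch assuming the profile was fetched as in the rebinding example above (the GS_Core namespace qualification on EventInputMapping is an assumption):

```cpp
if (profile.IsReady())
{
    // Which event is W bound to? Searches all groups by default.
    const GS_Core::EventInputMapping* mapping =
        profile.Get()->GetMappingFromBinding("keyboard_key_alphanumeric_W");

    // Which group owns that binding (e.g., "Movement")?
    const AZStd::string* groupName =
        profile.Get()->GetGroupNameFromBinding("keyboard_key_alphanumeric_W");

    if (mapping && groupName)
    {
        // e.g., show the bound event and its group in the options UI
    }
}
```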

See Also

For component references:

For related resources:



4.1.4.2.2 - Input Reader

Component that reads input through the active profile and fires named gameplay events, with runtime group toggling and input claiming.

The Input Reader component sits on any entity that needs to process input. It queries the Options Manager for the active Input Profile, then listens for raw O3DE input events and matches them against the profile’s bindings to fire named gameplay events.

For usage guides and setup examples, see The Basics: GS_Core.

Input Reader component in the O3DE Inspector

 



How It Works

On activation the Input Reader queries the Options Manager for the active Input Profile and subscribes to raw O3DE input events. Each frame it matches incoming raw input against the profile bindings. When a match is found it fires the associated named event into the gameplay event system. Input groups can be enabled or disabled at runtime to control which events are active at any moment.


Key Features

  • Group toggling — Enable or disable entire input groups at runtime. For example, disable “Combat” inputs during a dialogue sequence, or disable “Movement” inputs during a cutscene.
  • Claim input — The ClaimAllInput flag causes the reader to absorb matched input events, preventing them from reaching other readers. Useful for layered input (e.g., UI absorbs input before gameplay).
  • Extensible — Extend the Input Reader for specialized input handling. GS_Play uses this internally for player controller input and UI navigation input.

API Reference

GS_InputReaderComponent

| Field | Value |
| --- | --- |
| Header | GS_Core/GS_CoreBus.h |

Request Bus: InputReaderRequestBus

Entity-addressed bus — call via Event(entityId, ...).

| Method | Parameters | Returns | Description |
| --- | --- | --- | --- |
| EnableInputGroup | const AZStd::string& groupName | void | Enables a named input group for this reader. Events in this group will fire on matched input. |
| DisableInputGroup | const AZStd::string& groupName | void | Disables a named input group. Events in this group will be ignored until re-enabled. |
| IsGroupDisabled | const AZStd::string& groupName | bool | Returns true if the named group is currently disabled. |

Usage Examples

Toggling Input Groups

#include <GS_Core/GS_CoreBus.h>

// Disable combat inputs during dialogue
GS_Core::InputReaderRequestBus::Event(
    playerEntityId,
    &GS_Core::InputReaderRequestBus::Events::DisableInputGroup,
    AZStd::string("Combat")
);

// Re-enable when dialogue ends
GS_Core::InputReaderRequestBus::Event(
    playerEntityId,
    &GS_Core::InputReaderRequestBus::Events::EnableInputGroup,
    AZStd::string("Combat")
);
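Querying Group State

IsGroupDisabled can be used to check a group’s current state before toggling it. A short sketch; playerEntityId is assumed to hold the entity that carries the Input Reader:

```cpp
// Only re-enable "Combat" if it was actually disabled.
bool combatDisabled = false;
GS_Core::InputReaderRequestBus::EventResult(
    combatDisabled,
    playerEntityId,
    &GS_Core::InputReaderRequestBus::Events::IsGroupDisabled,
    AZStd::string("Combat"));

if (combatDisabled)
{
    GS_Core::InputReaderRequestBus::Event(
        playerEntityId,
        &GS_Core::InputReaderRequestBus::Events::EnableInputGroup,
        AZStd::string("Combat"));
}
```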

Script Canvas

Input group handling nodes in O3DE Script Canvas


See Also

For component references:

For related resources:



4.1.5 - Systems

Core framework systems — the GS_Motion track-based animation engine and the GS_Actions triggerable behavior system.

For usage guides and setup examples, see The Basics: GS_Core.

 



GS_Motion

The track-based animation and tween engine. GS_Motion defines abstract base classes for tracks, assets, composites, and proxies. Domain gems (GS_UI, GS_Juice) extend it with concrete track types and custom asset formats. The system handles playback timing, per-track easing, proxy-based entity targeting, and deep-copy runtime instancing.

| Class | Purpose |
| --- | --- |
| GS_MotionTrack | Abstract base class for all animation tracks. |
| GS_Motion | Playback engine — ticks tracks and manages lifecycle. |
| GS_MotionComposite | Runtime deep-copy instance with proxy overrides. |
| GS_MotionAsset | Abstract data asset base for motion definitions. |
| GS_MotionProxy | Serialized struct for track-to-entity redirection. |

GS_Motion API


See Also

For conceptual overviews and usage guides:

For related resources:



4.1.5.1 - GS_Motion

Track-based animation and tween system — abstract base classes for motions, tracks, composites, assets, and proxies with a domain extension pattern.

GS_Motion is the framework’s central track-based animation system. It defines abstract base classes that domain gems extend with concrete track types — GS_UI extends it with 8 UI animation tracks (.uiam assets), and GS_Juice extends it with 2 feedback tracks (.feedbackmotion assets). The system handles playback timing, per-track easing, proxy-based entity targeting, and deep-copy runtime instancing from data assets.

For usage guides and setup examples, see The Basics: GS_Core.

 



Architecture

GS_MotionTrack (abstract base)
    ├── UiMotionTrack (GS_UI domain base)
    │       ├── UiPositionTrack
    │       ├── UiScaleTrack
    │       ├── UiRotationTrack
    │       ├── UiElementAlphaTrack
    │       ├── UiImageAlphaTrack
    │       ├── UiImageColorTrack
    │       ├── UiTextColorTrack
    │       └── UiTextSizeTrack
    └── FeedbackMotionTrack (GS_Juice domain base)
            ├── FeedbackTransformTrack
            └── FeedbackMaterialTrack

GS_MotionAsset (abstract base)
    ├── UiAnimationMotionAsset (.uiam)
    └── FeedbackMotionAsset (.feedbackmotion)

GS_MotionComposite (runtime instance)
    └── Created by asset's CreateRuntimeComposite()

Core Classes

| Class | Description | Page |
| --- | --- | --- |
| GS_MotionTrack | Abstract base for all animation tracks — fields, lifecycle, virtual methods, extension guide. | MotionTrack |
| GS_Motion | Playback engine — ticks tracks, computes per-track progress windows, applies easing, manages lifecycle. | Motion Engine |
| GS_MotionComposite | Runtime deep-copy instance with proxy entity overrides — created from GS_MotionAsset. | MotionComposite |
| GS_MotionAsset | Abstract base for motion data assets — holds track definitions and creates runtime composites. | MotionAsset |
| GS_MotionProxy | Serialized struct for track-to-entity redirection — allows designers to target named tracks at different entities. | MotionProxy |

Domain Extensions

GS_Motion is designed to be extended by domain gems. Each domain creates its own track hierarchy, asset type, and file extension.

| Domain | Gem | Base Track | Concrete Tracks | Asset Extension |
| --- | --- | --- | --- | --- |
| UI Animation | GS_UI | UiMotionTrack | Position, Scale, Rotation, ElementAlpha, ImageAlpha, ImageColor, TextColor, TextSize (8) | .uiam |
| Feedback | GS_Juice | FeedbackMotionTrack | FeedbackTransformTrack, FeedbackMaterialTrack (2) | .feedbackmotion |

Extension Pattern

To create a new motion domain:

  1. Create a domain base track: class MyTrack : public GS_Core::GS_MotionTrack
  2. Create concrete tracks extending the domain base — each overrides Update(float easedProgress) and GetTypeName()
  3. Create a domain asset: class MyAsset : public GS_Core::GS_MotionAsset with vector<MyTrack*> m_tracks
  4. Create an instance wrapper struct (not a component) — holds Asset<MyAsset> + vector<GS_MotionProxy>, manages Initialize / Unload / Play / Stop
  5. Embed the wrapper — components embed the instance wrapper struct as a serialized field

Critical: All track vectors must use raw pointers: vector<MyTrack*>. Never use unique_ptr — O3DE SerializeContext requires raw pointers for polymorphic enumeration in the asset editor.


See Also

For conceptual overviews and usage guides:

For class references:

For domain extensions:



4.1.5.1.1 - GS_MotionTrack

Abstract base class for all animation tracks — fields, lifecycle, and virtual methods for domain extension.

GS_MotionTrack is the abstract base class for all animation tracks in the GS_Motion system. Each track animates one aspect of an entity over a time window within a motion. Domain gems extend this with concrete track types — GS_UI provides 8 LyShine tracks, GS_Juice provides 2 feedback tracks.

For usage guides and setup examples, see The Basics: GS_Core.


Fields

| Field | Type | Description |
| --- | --- | --- |
| m_id | AZ::Uuid | Unique track identifier (auto-generated). |
| m_identifier | AZStd::string | Proxy label — if set, the track appears in the proxy list for entity override targeting. |
| curveType | CurveType | Easing curve applied to track progress (from the Curves utility library). |
| startTime | float | Time offset before the track begins playing (seconds). |
| duration | float | Track playback duration (seconds). |
| startVarianceMin | float | Minimum random variance added to start time. |
| startVarianceMax | float | Maximum random variance added to start time. |

Lifecycle

Each track goes through a fixed lifecycle managed by the parent GS_Motion:

| Phase | Method | Description |
| --- | --- | --- |
| Initialize | Init(AZ::EntityId owner) | Stores the owner entity. Called once when the motion is initialized. |
| Start | Start() | Called when the track’s time window begins. |
| Update | Update(float easedProgress) | Called each frame with eased progress (0 → 1). Override this in concrete tracks. |
| End | End() | Called when the track’s time window completes. |
| Unload | Unload() | Cleanup. Called when the motion is unloaded. |

Virtual Methods

These methods must be overridden in concrete track implementations:

| Method | Returns | Description |
|---|---|---|
| Update(float easedProgress) | void | Apply the animation at the given eased progress value (0 → 1). |
| GetTypeName() | AZStd::string | Return the track’s display name for proxy labels in the editor. |

Extension Guide

To create a new domain of animation tracks:

  1. Create a domain base track: class MyTrack : public GS_Core::GS_MotionTrack — this serves as the common base for all tracks in your domain.
  2. Create concrete tracks extending the domain base — each overrides Update(float easedProgress) to animate a specific property.
  3. Reflect the track class using O3DE’s SerializeContext and EditContext. The system discovers the new type automatically.

Critical: Track vectors must use raw pointers (vector<MyTrack*>). Never use unique_ptr — O3DE SerializeContext requires raw pointers for polymorphic enumeration in the asset editor.
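The pattern reads more concretely in code. The sketch below models steps 1 and 2 in plain, self-contained C++: the O3DE reflection macros and AZ types are omitted, and MotionTrackBase, MyTrack, and FadeTrack are hypothetical stand-ins for the real GS_Core classes.

```cpp
#include <string>
#include <vector>

// Simplified stand-in for GS_Core::GS_MotionTrack (AZ types and
// reflection macros omitted for brevity).
class MotionTrackBase
{
public:
    virtual ~MotionTrackBase() = default;
    virtual void Update(float easedProgress) = 0; // eased progress, 0 -> 1
    virtual std::string GetTypeName() const = 0;
};

// Step 1: a domain base shared by all tracks in this domain.
class MyTrack : public MotionTrackBase {};

// Step 2: a concrete track that animates one property.
class FadeTrack : public MyTrack
{
public:
    void Update(float easedProgress) override
    {
        m_alpha = 1.0f - easedProgress; // fade out over the track's window
    }
    std::string GetTypeName() const override { return "Fade"; }

    float m_alpha = 1.0f;
};

// Per the note above: polymorphic track lists use raw pointers,
// never unique_ptr, so the editor can enumerate them.
std::vector<MyTrack*> g_tracks;
```

In the real gem, step 3 (SerializeContext/EditContext reflection) replaces the plain C++ declarations shown here.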



4.1.5.1.2 - GS_Motion

The playback engine — ticks through tracks, computes per-track progress windows, and manages motion lifecycle.

GS_Motion is the playback engine that drives animation tracks. It ticks through all tracks each frame, computes per-track progress windows based on start time and duration, applies easing curves, and calls Update(easedProgress) on each active track. It handles motion lifecycle from initialization through completion.

For usage guides and setup examples, see The Basics: GS_Core.


API Reference

| Method | Parameters | Returns | Description |
|---|---|---|---|
| Play | (none) | void | Begin playback from the start. |
| PlayWithCallback | AZStd::function<void()> cb | void | Play and invoke callback on completion. |
| Stop | (none) | void | Stop playback immediately. |
| Initialize | AZ::EntityId owner | void | Initialize all tracks with the owner entity. |
| Unload | (none) | void | Unload all tracks and clean up. |

Fields

| Field | Type | Description |
|---|---|---|
| motionName | AZStd::string | Display name for the motion. |
| tracks | vector<GS_MotionTrack*> | The tracks in this motion. |
| loop | bool | Whether playback loops. |
| onComplete | function<void()> | Completion callback (set via PlayWithCallback). |

How It Works

Each frame during playback:

  1. The motion calculates the global elapsed time.
  2. For each track, it computes the track-local progress based on startTime and duration.
  3. If the track is within its active window, the progress is eased using the track’s curveType.
  4. Update(easedProgress) is called on the track with the eased value (0 → 1).
  5. When all tracks complete, OnMotionComplete() is called.
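The windowing in steps 1 through 4 can be sketched as a pair of pure functions (plain C++ with hypothetical names; the real system dispatches the easing through Curves::EvaluateCurve and a serialized CurveType):

```cpp
// Map the motion's global elapsed time into one track's local
// progress, clamped to that track's startTime/duration window.
float TrackProgress(float elapsed, float startTime, float duration)
{
    if (elapsed <= startTime) { return 0.0f; }            // window not started
    if (elapsed >= startTime + duration) { return 1.0f; } // window finished
    return (elapsed - startTime) / duration;              // 0 -> 1 inside
}

// Stand-in for one curve that Curves::EvaluateCurve could dispatch to.
float EaseInQuadratic(float t) { return t * t; }

// The value handed to a track's Update() each frame.
float EasedProgress(float elapsed, float startTime, float duration)
{
    return EaseInQuadratic(TrackProgress(elapsed, startTime, duration));
}
```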

Virtual Methods

| Method | Description |
|---|---|
| OnMotionComplete | Called when playback finishes. Override in subclasses for custom teardown. |


4.1.5.1.3 - GS_MotionComposite

Runtime deep-copy motion instance with proxy entity overrides — created from GS_MotionAsset for per-entity playback.

GS_MotionComposite is the runtime instance of a motion, created from a GS_MotionAsset. It deep-copies all tracks so each entity gets independent playback state, and applies proxy overrides to redirect specific tracks to different entities in the hierarchy.

For usage guides and setup examples, see The Basics: GS_Core.


GS_MotionComposite extends GS_Motion. When an asset’s CreateRuntimeComposite() is called, all tracks are SC-cloned (serialization-context cloned) into a new composite instance. This ensures each entity playing the same motion has its own independent track state — no shared mutation.


API Reference

| Method | Parameters | Returns | Description |
|---|---|---|---|
| ApplyProxies | AZ::EntityId owner, vector<GS_MotionProxy> proxies | void | Matches proxy trackIds to tracks and overrides the target entity for each matched track. |

How It Works

  1. Creation: GS_MotionAsset::CreateRuntimeComposite() deep-copies all asset tracks into a new composite.
  2. Proxy Application: ApplyProxies() walks the proxy list, matching each proxy’s m_trackId to a track’s m_id. Matched tracks redirect their owner entity to the proxy’s m_proxyEntity.
  3. Playback: The composite plays exactly like a regular GS_Motion — it ticks tracks, applies easing, and calls Update().
  4. Cleanup: On Unload(), the composite cleans up all deep-copied tracks.
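The matching in step 2 amounts to a nested scan, sketched here in simplified form (strings stand in for AZ::Uuid, ints for AZ::EntityId; all names are hypothetical):

```cpp
#include <string>
#include <vector>

struct Track { std::string id; int targetEntity = 0; };
struct Proxy { std::string trackId; int proxyEntity = 0; };

// Simplified model of GS_MotionComposite::ApplyProxies().
void ApplyProxies(std::vector<Track>& tracks, int owner,
                  const std::vector<Proxy>& proxies)
{
    for (Track& track : tracks)
    {
        track.targetEntity = owner; // default: animate the motion's owner
        for (const Proxy& proxy : proxies)
        {
            if (proxy.trackId == track.id) // matched: redirect this track
            {
                track.targetEntity = proxy.proxyEntity;
            }
        }
    }
}
```

Unmatched tracks keep the owner entity, which is the same behavior described for tracks without a proxy override.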


4.1.5.1.4 - GS_MotionAsset

Abstract base for motion data assets — holds track definitions and creates runtime composites.

GS_MotionAsset is the abstract base class for motion data assets. Domain gems extend this with their own asset type and file extension — GS_UI creates UiAnimationMotionAsset (.uiam), GS_Juice creates FeedbackMotionAsset (.feedbackmotion). Assets are created and edited in the O3DE Asset Editor.

Extends AZ::Data::AssetData. All subclasses require GS_AssetReflectionIncludes.h in their header — see Serialization Helpers.

For usage guides and setup examples, see The Basics: GS_Core.


API Reference

| Method | Returns | Description |
|---|---|---|
| GetTrackInfos | vector<GS_TrackInfo> | Returns track UUID + label pairs for proxy list synchronization in the editor. |
| CreateRuntimeComposite | GS_MotionComposite* | Factory — deep-copies all asset tracks into a new runtime GS_MotionComposite instance. |

GS_TrackInfo

Lightweight struct used for proxy synchronization between asset tracks and instance proxy lists.

| Field | Type | Description |
|---|---|---|
| id | AZ::Uuid | Track identifier (matches the track’s m_id). |
| label | AZStd::string | Track display name (from GetTypeName()). |

Domain Extensions

| Domain | Gem | Asset Class | Extension | Tracks |
|---|---|---|---|---|
| UI Animation | GS_UI | UiAnimationMotionAsset | .uiam | 8 LyShine-specific tracks |
| Feedback | GS_Juice | FeedbackMotionAsset | .feedbackmotion | 2 feedback tracks |

Extension Guide

To create a new domain asset type:

  1. Create class MyAsset : public GS_Core::GS_MotionAsset with vector<MyDomainTrack*> m_tracks. Include GS_AssetReflectionIncludes.h in your asset’s header — see Serialization Helpers.
  2. Implement GetTrackInfos() — iterate tracks and return UUID + label pairs.
  3. Implement CreateRuntimeComposite() — deep-copy tracks into a new GS_MotionComposite.
  4. Register the asset type in your gem’s DataAssetsSystemComponent.
  5. Add a .setreg entry for the asset processor to recognize your file extension.

Critical: Track vectors must use raw pointers (vector<MyTrack*>). O3DE SerializeContext requires raw pointers for polymorphic enumeration.
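The deep-copy guarantee in step 3 can be sketched in plain C++ (hypothetical names; in GS_Core the clone goes through the SerializeContext rather than a hand-written Clone() method):

```cpp
#include <string>
#include <vector>

struct Track
{
    std::string typeName;
    float value = 0.0f;
    Track* Clone() const { return new Track(*this); } // independent copy
};

struct Asset     { std::vector<Track*> tracks; }; // raw pointers, per the note
struct Composite { std::vector<Track*> tracks; };

// Simplified model of CreateRuntimeComposite(): every asset track is
// cloned, so each entity playing the motion mutates its own copies.
Composite* CreateRuntimeComposite(const Asset& asset)
{
    auto* composite = new Composite;
    for (const Track* track : asset.tracks)
    {
        composite->tracks.push_back(track->Clone());
    }
    return composite;
}
```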



4.1.5.1.5 - GS_MotionProxy

Serialized struct for track-to-entity redirection — allows designers to target named tracks at different entities.

GS_MotionProxy is a serialized struct that allows designers to redirect a named track to a different entity in the hierarchy. This enables a single motion asset to animate multiple entities — for example, a UI animation that moves one element while fading another.

For usage guides and setup examples, see The Basics: GS_Core.


Fields

| Field | Type | Description |
|---|---|---|
| m_trackId | AZ::Uuid | References the track’s m_id. Matched during ApplyProxies(). |
| m_label | AZStd::string | Read-only display label (synced from track info at edit time via GetTrackInfos()). |
| m_proxyEntity | AZ::EntityId | The entity this track should target instead of the motion’s owner entity. |

How It Works

  1. Edit time: The proxy list syncs from the asset’s track info. Each track with an m_identifier (non-empty label) appears as a proxy slot in the inspector.
  2. Designer assignment: The designer drags an entity reference into the proxy’s m_proxyEntity field.
  3. Runtime: When GS_MotionComposite::ApplyProxies() runs, it matches each proxy’s m_trackId to a track’s m_id and overrides that track’s target entity.
  4. Playback: The track now animates the proxy entity instead of the motion’s owner.

Usage

Proxies are embedded in instance wrapper structs (e.g., UiAnimationMotion, FeedbackMotion) alongside the asset reference. Components serialize the proxy list, and the wrapper handles synchronization with the asset.

Only tracks with a non-empty m_identifier appear in the proxy list. Tracks without an identifier always animate the owner entity.



4.1.5.2 - GS_Actions

A utility system for triggerable single-purpose actions — fire discrete behaviors from any system without recoding logic for each component.

The Actions system provides a simple, universal pattern for triggering discrete behaviors. Instead of recoding common functionality (toggle cursor, play sound, print log) into every component that needs it, you attach Action components to an entity and fire them by channel name.

Actions support chaining — when one action completes, it can fire another action on a different channel, enabling lightweight sequences without custom code.

For usage guides and setup examples, see The Basics: GS_Core.

Architecture

Entity with Actions
  ├── ToggleMouseCursor_GSAction (channel: "enter_menu")
  ├── PlaySound_GSAction           (channel: "enter_menu")
  └── PrintLog_GSAction            (channel: "debug")

Any system calls:
  ActionRequestBus::Event(entityId, DoAction, "enter_menu")
    → Both ToggleMouseCursor and PlaySound fire (matching channel)
    → PrintLog does NOT fire (different channel)

Components

| Component | Purpose | Documentation |
|---|---|---|
| GS_ActionComponent | Base class for all actions. Handles channel matching, completion broadcasting, and action chaining. | Action |
| ToggleMouseCursor_GSActionComponent | Toggles the mouse cursor on or off. | Core Actions |
| PrintLog_GSActionComponent | Prints a message to the console log. | Core Actions |

Quick Start

  1. Attach one or more Action components to an entity.
  2. Set the Channel on each action to control when it fires.
  3. From any system, call DoAction(channelName) on the entity’s ActionRequestBus to trigger matching actions.
  4. Optionally enable Broadcast On Complete and set an On Complete Channel Chain for action sequencing.


4.1.5.2.1 - Action

The base class for triggerable actions — channel-based firing, completion broadcasting, and action chaining for lightweight behavior sequencing.

GS_ActionComponent is the base class for all triggerable actions. Actions are single-purpose behaviors that fire when their channel matches an incoming DoAction call. They support completion broadcasting and optional chaining to sequence multiple actions together.

Every GS_Play action (Toggle Mouse Cursor, Print Log, and actions from other gems) extends this class. When you need a custom action, you extend it too.

For usage guides and setup examples, see The Basics: GS_Core.

 

How It Works

Firing an Action

  1. A system calls DoAction(targetChannel) on an entity’s ActionRequestBus.
  2. Every Action component on that entity evaluates whether its channel matches the targetChannel.
  3. Matching actions call ProcessAction() to execute their behavior.
  4. When done, CompleteAction() handles completion:
    • If Broadcast On Complete is enabled, it fires OnActionComplete on the ActionNotificationBus.
    • If On Complete Channel Chain is set, it fires a new DoAction with that channel name, triggering the next action in the chain.

Chaining Actions

By setting the On Complete Channel Chain to a different channel name, an action can trigger the next step in a sequence when it completes. This enables lightweight behavior chains without custom sequencing code.

Action A (channel: "step1", onCompleteChain: "step2")
  → fires → Action B (channel: "step2", onCompleteChain: "step3")
    → fires → Action C (channel: "step3")

Important: Every custom action must call CompleteAction() when its work is done. If you forget, the action will never complete and chains will break.
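The chain dispatch can be modeled in a few lines of plain C++ (hypothetical names; the real system routes DoAction through the ActionRequestBus and each component's CompleteAction()):

```cpp
#include <string>
#include <vector>

struct Action
{
    std::string channel;
    std::string onCompleteChain; // empty = no chain
    int fireCount = 0;
};

// Simplified model: fire every action whose channel matches, then let
// completion trigger the next channel in the chain.
void DoAction(std::vector<Action>& actions, const std::string& targetChannel)
{
    for (Action& action : actions)
    {
        if (action.channel != targetChannel) { continue; } // channel mismatch
        ++action.fireCount;                                // ProcessAction()
        if (!action.onCompleteChain.empty())               // CompleteAction()
        {
            DoAction(actions, action.onCompleteChain);     // next in chain
        }
    }
}
```

Note that a forgotten CompleteAction() in the real system corresponds here to skipping the chained DoAction call: the sequence simply stops.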


Inspector Properties

| Property | Type | Default | Description |
|---|---|---|---|
| Channel | AZStd::string | "" | The channel this action responds to. Only DoAction calls with a matching channel will trigger this action. |
| Broadcast On Complete | bool | false | When enabled, broadcasts OnActionComplete on the ActionNotificationBus when this action finishes. |
| On Complete Channel Chain | AZStd::string | "" | When non-empty, fires a new DoAction with this channel name after completing. Enables action chaining. |

API Reference

Request Bus: ActionRequestBus

Entity-addressed bus — call via Event(entityId, ...).

| Method | Parameters | Returns | Description |
|---|---|---|---|
| DoAction | AZStd::string targetChannel | void | Triggers all actions on the target entity whose channel matches targetChannel. |

Notification Bus: ActionNotificationBus

Entity-addressed bus — connect via BusConnect(entityId).

| Event | Parameters | Description |
|---|---|---|
| OnActionComplete | (none) | Broadcast when an action with Broadcast On Complete enabled finishes its work. |

Local / Virtual Methods

Override these when creating custom actions.

| Method | Description |
|---|---|
| DoAction(targetChannel) | Evaluates whether this action should fire. Default compares channel to targetChannel. Override for custom evaluation logic. |
| ProcessAction() | Your primary override. Executes the action’s behavior. Called when channel evaluation passes. |
| CompleteAction() | Handles completion. Default checks broadcastOnComplete, fires OnActionComplete, then checks onCompleteChannelChain for chaining. Override to add custom completion logic, but always call base or handle chaining manually. |

Usage Examples

Firing Actions on an Entity

#include <GS_Core/GS_CoreBus.h>

// Fire all actions on this entity that match the "open_door" channel
GS_Core::ActionRequestBus::Event(
    targetEntityId,
    &GS_Core::ActionRequestBus::Events::DoAction,
    AZStd::string("open_door")
);

Listening for Action Completion

#include <GS_Core/GS_CoreBus.h>

class MyActionListener
    : public AZ::Component
    , protected GS_Core::ActionNotificationBus::Handler
{
protected:
    void Activate() override
    {
        // Listen for action completions on a specific entity
        GS_Core::ActionNotificationBus::Handler::BusConnect(m_targetEntityId);
    }

    void Deactivate() override
    {
        GS_Core::ActionNotificationBus::Handler::BusDisconnect();
    }

    void OnActionComplete() override
    {
        // An action on the target entity has completed
        // React accordingly (advance dialogue, open next door, etc.)
    }

private:
    AZ::EntityId m_targetEntityId;
};

Extending the Action Class

Create a custom action whenever you need a reusable, triggerable behavior that can be attached to any entity.

Header (.h)

#pragma once
#include <Source/ActionSystem/GS_ActionComponent.h>

namespace MyProject
{
    class PlayEffect_GSActionComponent
        : public GS_Core::GS_ActionComponent
    {
    public:
        AZ_COMPONENT_DECL(PlayEffect_GSActionComponent);

        static void Reflect(AZ::ReflectContext* context);

    protected:
        // Action overrides
        void ProcessAction() override;
        void CompleteAction() override;

    private:
        AZStd::string m_effectName;
        float m_duration = 1.0f;
    };
}

Implementation (.cpp)

#include "PlayEffect_GSActionComponent.h"
#include <AzCore/Serialization/SerializeContext.h>

namespace MyProject
{
    AZ_COMPONENT_IMPL(PlayEffect_GSActionComponent, "PlayEffect_GSActionComponent", "{YOUR-UUID-HERE}",
        GS_Core::GS_ActionComponent);

    void PlayEffect_GSActionComponent::Reflect(AZ::ReflectContext* context)
    {
        if (auto serializeContext = azrtti_cast<AZ::SerializeContext*>(context))
        {
            serializeContext->Class<PlayEffect_GSActionComponent, GS_Core::GS_ActionComponent>()
                ->Version(0)
                ->Field("EffectName", &PlayEffect_GSActionComponent::m_effectName)
                ->Field("Duration", &PlayEffect_GSActionComponent::m_duration);

            if (AZ::EditContext* editContext = serializeContext->GetEditContext())
            {
                // Pattern: Incoming Channel → Feature Properties → Outgoing Channel
                editContext->Class<PlayEffect_GSActionComponent>(
                    "Play Effect Action", "Triggers a named particle effect")
                    ->ClassElement(AZ::Edit::ClassElements::EditorData, "")
                        ->Attribute(AZ::Edit::Attributes::Category, "MyProject/Actions")
                        ->Attribute(AZ::Edit::Attributes::AppearsInAddComponentMenu, AZ_CRC_CE("Game"))
                    ->DataElement(AZ::Edit::UIHandlers::Default,
                        &PlayEffect_GSActionComponent::m_effectName, "Effect Name",
                        "Name of the particle effect to trigger")
                    ->DataElement(AZ::Edit::UIHandlers::Default,
                        &PlayEffect_GSActionComponent::m_duration, "Duration",
                        "How long the effect plays (seconds)");
            }
        }
    }

    void PlayEffect_GSActionComponent::ProcessAction()
    {
        // Your action logic — trigger the particle effect
        AZ_TracePrintf("Action", "Playing effect: %s for %.1f seconds",
            m_effectName.c_str(), m_duration);

        // ... spawn particle effect, start timer, etc. ...

        // IMPORTANT: Call CompleteAction when done
        // If the action is instant, call it here.
        // If it takes time (animation, timer), call it when the effect finishes.
        CompleteAction();
    }

    void PlayEffect_GSActionComponent::CompleteAction()
    {
        // Custom completion logic (if any)
        // ...

        // IMPORTANT: Call base to handle broadcast and chaining
        GS_ActionComponent::CompleteAction();
    }
}

Module Registration

m_descriptors.insert(m_descriptors.end(), {
    MyProject::PlayEffect_GSActionComponent::CreateDescriptor(),
});

Reflection Pattern

When reflecting custom action properties in the Edit Context, follow this organization pattern so all actions have a consistent inspector layout:

  1. Incoming Channel — The channel property (inherited, reflected by base class)
  2. Feature Properties — Your action-specific properties (effect name, duration, etc.)
  3. Outgoing Channel — The broadcastOnComplete and onCompleteChannelChain properties (inherited, reflected by base class)


4.1.5.2.1.1 - Core Actions

Pre-built actions included in GS_Core — ready-to-use triggerable behaviors for common game functionality.

GS_Core includes pre-built Action components for common behaviors. Attach them to any entity, set a channel, and fire them from any system without writing code.

All built-in actions inherit from GS_ActionComponent and follow the same channel-matching, completion, and chaining patterns.

For usage guides and setup examples, see The Basics: GS_Core.


Toggle Mouse Cursor

Toggles the operating system mouse cursor on or off.

When to Use

Use this action when transitioning between gameplay (cursor hidden) and menus (cursor visible). Common triggers include opening an inventory, entering a pause menu, or starting a dialogue sequence.

Inspector Properties

| Property | Type | Default | Description |
|---|---|---|---|
| Channel | AZStd::string | "" | Inherited. The channel this action responds to. |
| Broadcast On Complete | bool | false | Inherited. Broadcasts OnActionComplete when the toggle finishes. |
| On Complete Channel Chain | AZStd::string | "" | Inherited. Fires a follow-up DoAction on completion for chaining. |

Usage Example

#include <GS_Core/GS_CoreBus.h>

// Toggle the cursor when opening the pause menu
GS_Core::ActionRequestBus::Event(
    uiEntityId,
    &GS_Core::ActionRequestBus::Events::DoAction,
    AZStd::string("toggle_cursor")
);

Print Log

Prints a configurable message to the console log. Useful for debugging action chains, verifying event flow, or logging gameplay milestones.

When to Use

Use this action during development to verify that action channels fire correctly, test chaining sequences, or log gameplay events without writing custom components.

Inspector Properties

| Property | Type | Default | Description |
|---|---|---|---|
| Channel | AZStd::string | "" | Inherited. The channel this action responds to. |
| Message | AZStd::string | "" | The message to print to the console log. |
| Broadcast On Complete | bool | false | Inherited. Broadcasts OnActionComplete when the print finishes. |
| On Complete Channel Chain | AZStd::string | "" | Inherited. Fires a follow-up DoAction on completion for chaining. |

Usage Example

#include <GS_Core/GS_CoreBus.h>

// Trigger a debug log message
GS_Core::ActionRequestBus::Event(
    debugEntityId,
    &GS_Core::ActionRequestBus::Events::DoAction,
    AZStd::string("debug")
);
// Console output: whatever message was configured on the PrintLog action

Creating Your Own Action

Need custom behavior? See the Extending the Action Class guide for a complete walkthrough with header, implementation, Reflect pattern, and module registration.


4.1.6 - Utilities

A collection of utility libraries — easing curves, spring dampers, physics trigger volumes, gradients, and entity helpers.

GS_Core includes a rich set of utility libraries for common game development patterns. These are header-only (or lightweight) utilities that any component or system can use without additional setup.

For usage guides and setup examples, see The Basics: GS_Core.

 

Physics Trigger Volume

PhysicsTriggeringVolume is a non-component base class that manages the full lifecycle of a physics trigger or collision volume — entity tracking, enter/exit/hold callbacks, and collision contact events. Inherit from it to create interactive volumes without boilerplate physics code. PhysicsTriggerComponent is the concrete O3DE component that wraps this base and adds game-lifecycle awareness (standby handling).

Physics Trigger Volume API


Easing Curves

The Curves namespace provides 40+ easing functions and a CurveType enum for data-driven curve selection. All functions take a normalized t (0 → 1) input. The dispatch function EvaluateCurve(CurveType, t) routes to the correct function at runtime. Curve families: Linear, Quadratic, Cubic, Quartic, Quintic, Sine, Exponential, Circular, Back, Elastic, and Bounce.

Easing Curves API


Spring Dampers

The Springs namespace provides 6 spring-damper types for physics-based animation and smoothing — each available in float, Vector2, and Vector3 overloads (plus a Quaternion variant). All springs use a halflife parameter (seconds to reach 50% of goal) rather than raw stiffness/damping constants.

Spring Dampers API


Gradients

Multi-stop gradient types (FloatGradient, Vector2Gradient, ColorGradient) for sampling values over a normalized [0,1] range. Used throughout the motion system and feedback effects for curve-based value animation. ColorGradient exposes independent EvaluateColor(t) and EvaluateAlpha(t) channels.

Gradients API


Entity Utilities

The EntityUtility namespace provides helper functions for entity lookup by name — returning either an AZ::Entity* or AZ::EntityId.

Entity Utilities API


Weighted Random

RandomUtils::GetRandomWeighted<T> selects a key from an AZStd::unordered_map<T, float> with probability proportional to each entry’s float weight. Requires a caller-provided AZ::SimpleLcgRandom instance.
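As a simplified model of the selection logic (PickWeighted and its explicit u parameter are hypothetical; the real GetRandomWeighted draws its random value from the caller's AZ::SimpleLcgRandom):

```cpp
#include <map>
#include <string>

// Walk the cumulative weights and return the key whose bucket contains
// u * totalWeight, for u in [0,1). Higher weights get wider buckets,
// so selection probability is proportional to each entry's weight.
template <typename T>
T PickWeighted(const std::map<T, float>& weights, float u)
{
    float total = 0.0f;
    for (const auto& [key, weight] : weights) { total += weight; }

    float threshold = u * total;
    float cumulative = 0.0f;
    for (const auto& [key, weight] : weights)
    {
        cumulative += weight;
        if (threshold < cumulative) { return key; }
    }
    return weights.rbegin()->first; // guard against rounding near u = 1
}
```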

Weighted Random API


Angle Helpers

The Orientation namespace maps angles to discrete sector indices for directional gameplay — facing directions, animation sectors, compass queries. Includes the SectionConfig enum (x4 through x24 presets), RotationDirection, hysteresis support, and yaw/quaternion helpers.

Angle Helpers API


Spline Utilities

The SplineUtility namespace provides helper functions for O3DE spline queries — closest world point, closest local point, and normalized fraction along a spline — taking an entity ID and world/local position.

Spline Utilities API


Serialization Helpers

Three helpers that reduce reflection boilerplate: GS_ReflectionIncludes.h (single-include for all reflection headers), GS_AssetReflectionIncludes.h (extends it with asset serialization), and a generic asset handler template for custom asset types.

Serialization Helpers API


Common Enums

Shared enum types used across the framework: CurveType (maps to all easing functions for serialized curve selection) and BooleanConditions (condition evaluation for dialogue and record-keeping systems).

Common Enums API



4.1.6.1 - Common Enums

Shared enumeration types used across the GS_Play framework — CurveType for easing curve selection and BooleanConditions for condition evaluation.

Common Enums provides shared enumeration types registered with O3DE’s SerializeContext. They can be used in component properties, asset fields, and ScriptCanvas nodes across any gem in the framework. The two primary enums are CurveType (curve selection for motion and gradient systems) and BooleanConditions (condition evaluation for dialogue and record-keeping systems).

Reflect functions: GS_Core::ReflectCommonEnums(context), GS_Core::ReflectCurveType(context)

For usage guides and setup examples, see The Basics: GS_Core.

 

BooleanConditions

Used by condition-evaluation logic in the dialogue and record-keeping systems to compare numeric or string values.

Used by: DialogueCondition, Record_DialogueCondition, RecordKeeperComponent

| Value | Description |
|---|---|
| Equals | Exact equality check |
| NotEquals | Inequality check |
| GreaterThan | Strict greater-than |
| GreaterOrEquals | Greater-than or equal |
| LessThan | Strict less-than |
| LessOrEquals | Less-than or equal |

CurveType

Maps to all easing curve functions in the Curves utility. Used by GS_Motion tracks, blend profiles, gradient markers, and any system that needs designer-selectable easing.

Used by: GS_MotionTrack, UiMotionTrack, FeedbackMotionTrack, gradient markers, GS_PhantomCamBlendProfile

| Family | Values |
|---|---|
| Linear | Linear |
| Quadratic | EaseInQuadratic, EaseOutQuadratic, EaseInOutQuadratic |
| Cubic | EaseInCubic, EaseOutCubic, EaseInOutCubic |
| Quartic | EaseInQuartic, EaseOutQuartic, EaseInOutQuartic |
| Quintic | EaseInQuintic, EaseOutQuintic, EaseInOutQuintic |
| Sine | EaseInSine, EaseOutSine, EaseInOutSine |
| Exponential | EaseInExpo, EaseOutExpo, EaseInOutExpo |
| Circular | EaseInCirc, EaseOutCirc, EaseInOutCirc |
| Back | EaseInBack, EaseOutBack, EaseInOutBack |
| Elastic | EaseInElastic, EaseOutElastic, EaseInOutElastic |
| Bounce | EaseInBounce, EaseOutBounce, EaseInOutBounce |

Evaluating a CurveType in C++

#include <GS_Core/Utility/Math/CurvesUtility.h>

// Dispatch to the correct curve function via enum
float result = GS_Core::Curves::EvaluateCurve(GS_Core::CurveType::EaseInOutCubic, t);

Using Enums in Components

Use EnumAttribute on a DataElement with UIHandlers::ComboBox to expose enum selection as an Inspector dropdown.

#include <GS_Core/Utility/Math/CurvesUtility.h>
#include <GS_Core/Utility/CommonEnums.h>

// In your component's Reflect() method:
editContext->Class<MyComponent>("My Component", "Description")
    ->DataElement(AZ::Edit::UIHandlers::ComboBox,
        &MyComponent::m_curveType, "Curve Type",
        "The easing curve applied to this animation.")
        ->EnumAttribute(GS_Core::CurveType::Linear,             "Linear")
        ->EnumAttribute(GS_Core::CurveType::EaseInQuadratic,    "Ease In Quadratic")
        ->EnumAttribute(GS_Core::CurveType::EaseOutQuadratic,   "Ease Out Quadratic")
        ->EnumAttribute(GS_Core::CurveType::EaseInOutQuadratic, "Ease InOut Quadratic")
        ->EnumAttribute(GS_Core::CurveType::EaseInCubic,        "Ease In Cubic")
        ->EnumAttribute(GS_Core::CurveType::EaseOutCubic,       "Ease Out Cubic")
        ->EnumAttribute(GS_Core::CurveType::EaseInOutCubic,     "Ease InOut Cubic")
        // ... continue for all desired variants
    ;

This creates an Inspector dropdown where designers select a curve by name without touching code.



4.1.6.2 - Serialization Helpers

Three reflection helpers — single-include headers for common and asset serialization, and a generic asset handler template for custom asset types.

Three helpers that eliminate serialization boilerplate: a single-include header for common reflection, an extension for asset fields, and a ready-made O3DE asset handler template for any custom AssetData subclass.

For usage guides and setup examples, see The Basics: GS_Core.

 

GS_ReflectionIncludes.h

A single-include header that brings in all O3DE reflection headers needed for any class with an inline Reflect() method.

Includes:

  • AzCore/RTTI/RTTI.h
  • AzCore/Memory/SystemAllocator.h
  • AzCore/Serialization/SerializeContext.h
  • AzCore/Serialization/EditContext.h
  • AzCore/std/string/string.h

Use this in any header declaring a reflected struct, class, or enum. Does not handle AZ::Data::Asset<T> fields.

#include <GS_Core/Utility/Reflection/GS_ReflectionIncludes.h>

namespace MyProject
{
    struct MyData
    {
        AZ_TYPE_INFO(MyData, "{YOUR-UUID-HERE}");
        static void Reflect(AZ::ReflectContext* context);

        float m_value = 0.0f;
    };
}

GS_AssetReflectionIncludes.h

Extends GS_ReflectionIncludes.h with asset serialization headers.

Adds:

  • AzCore/Asset/AssetCommon.h
  • AzCore/Asset/AssetSerializer.h

Use this in any header declaring a class with AZ::Data::Asset<T> fields. Ensures SerializeGenericTypeInfo<Asset<T>> is visible and prevents silent failures in Unity builds.

#include <GS_Core/Utility/Reflection/GS_AssetReflectionIncludes.h>

namespace MyProject
{
    struct MyComponent : public AZ::Component
    {
        AZ_COMPONENT_DECL(MyComponent);
        static void Reflect(AZ::ReflectContext* context);

        AZ::Data::Asset<MyAssetType> m_asset;
    };
}


4.1.6.3 - Curves

40+ easing curve functions for smooth animation and interpolation — organized by family with a CurveType enum for data-driven selection.

The Curves namespace (GS_Core::Curves) provides 40+ easing functions for smooth animation and interpolation. All functions take a normalized t value in [0,1] and return a remapped [0,1] value. The dispatch function EvaluateCurve routes to the correct function by CurveType enum value, making curve selection fully data-driven from the Inspector or asset editor.

Used by every GS_Motion track, all gradient evaluate calls, and the blend profile system.

For usage guides and setup examples, see The Basics: GS_Core.

 

Curve Families

| Family | Functions | Character |
|---|---|---|
| Linear | Linear | Constant speed |
| Quadratic | EaseInQuadratic, EaseOutQuadratic, EaseInOutQuadratic | Gentle acceleration |
| Cubic | EaseInCubic, EaseOutCubic, EaseInOutCubic | Moderate acceleration |
| Quartic | EaseInQuartic, EaseOutQuartic, EaseInOutQuartic | Strong acceleration |
| Quintic | EaseInQuintic, EaseOutQuintic, EaseInOutQuintic | Very strong acceleration |
| Sine | EaseInSine, EaseOutSine, EaseInOutSine | Gentle, natural feel |
| Exponential | EaseInExpo, EaseOutExpo, EaseInOutExpo | Dramatic speed change |
| Circular | EaseInCirc, EaseOutCirc, EaseInOutCirc | Quarter-circle shape |
| Back | EaseInBack, EaseOutBack, EaseInOutBack | Overshoot and return |
| Elastic | EaseInElastic, EaseOutElastic, EaseInOutElastic | Spring-like oscillation |
| Bounce | EaseInBounce, EaseOutBounce, EaseInOutBounce | Bouncing ball effect |

Variant naming: EaseIn = slow start, EaseOut = slow end, EaseInOut = slow start and end.


API Reference

Dispatch Function

float EvaluateCurve(CurveType curveType, float t);

Dispatches to the correct curve function based on the CurveType enum value. This is the primary call site for all motion and gradient systems. t must be in [0,1].

Individual Functions

One free function per CurveType value, all sharing the signature float <Name>(float t). Examples:

| Function | Description |
|---|---|
| GS_Core::Curves::Linear(t) | Linear — no easing |
| GS_Core::Curves::EaseInQuadratic(t) | Quadratic ease-in |
| GS_Core::Curves::EaseOutBounce(t) | Bounce ease-out |
| GS_Core::Curves::EaseInOutBack(t) | Back ease-in-out (overshoot both ends) |

All functions follow the same naming pattern as the CurveType enum values.


Usage Examples

Dispatch via enum (data-driven)

#include <GS_Core/Utility/Math/CurvesUtility.h>

// Evaluate at normalized progress t (0 → 1)
float eased = GS_Core::Curves::EvaluateCurve(GS_Core::CurveType::EaseInOutCubic, t);

// Interpolate between two values
float result = start + (end - start) * GS_Core::Curves::EvaluateCurve(curveType, t);

Direct function call

// Call the function directly for a known curve
float eased = GS_Core::Curves::EaseOutBack(t);

See Also

For related resources:


Get GS_Core

GS_Core — Explore this gem on the product page and add it to your project.

4.1.6.4 - Springs

Spring-damper functions for physically-grounded value interpolation — smooth following, overshoot, and settling for floats, vectors, and quaternions.

The Springs namespace (GS_Core::Springs) provides six spring-damper types for physics-based animation and smoothing. Springs produce natural-feeling motion that reacts to velocity — ideal for camera follow, UI motion, and any value that should settle rather than snap.

All springs use a halflife parameter: the number of seconds to cover 50% of the remaining distance to the goal. This is more intuitive than raw stiffness/damping constants. Each spring type is available in float, AZ::Vector2, and AZ::Vector3 overloads where applicable.

Used by: gs_phantomcam (camera smoothing), gs_unit (character movement), gs_juice (feedback motions)

For usage guides and setup examples, see The Basics: GS_Core.

 

Contents


Spring Types

| Type | Character | Best For |
|---|---|---|
| SimpleSpringDamperExact | Fast start, ease-out arrival | Simple follow, snap behaviors |
| AccelerationSpringDamper | Tracks target velocity, smoother on input changes | Character movement, camera motion |
| DoubleSpringDamper | S-curve — slow start AND slow arrival | UI transitions, polished cuts |
| TimedSpringDamper | Reaches target by a specified time | Choreographed, loosely timed movements |
| VelocitySpringDamper | Predictive — leads the target’s direction | Camera following a moving target |
| QuaternionSpringDamper | Rotational spring (angular velocity) | Smooth orientation changes |

SimpleSpringDamperExact

A critically-damped spring that moves toward a target position. Fast start, ease-out arrival. Computed exactly (no approximation). The most common spring for follow and snap behaviors.

| Parameter | Type | Description |
|---|---|---|
| position | T (in/out) | Current position, updated in-place each frame |
| velocity | T (in/out) | Current velocity — must be cached between frames |
| targetPosition | T | Goal position to move toward |
| halflife | float | Seconds to cover 50% of remaining distance |
| deltaTime | float | Frame delta time |
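The behavior this spring describes can be sketched in Python using the well-known exact closed-form update for a critically-damped spring (after Daniel Holden's springs write-up). The halflife-to-damping conversion and the exponential form here are assumptions about how such a spring is typically implemented, not GS_Core's actual code:

```python
import math

def simple_spring_damper_exact(position, velocity, target, halflife, dt):
    # Convert halflife into a critical damping coefficient (assumed convention)
    damping = (4.0 * math.log(2.0)) / (halflife + 1e-8)
    y = damping / 2.0
    j0 = position - target
    j1 = velocity + j0 * y
    eydt = math.exp(-y * dt)
    # Exact closed-form step: no overshoot, ease-out arrival at the target
    new_position = eydt * (j0 + j1 * dt) + target
    new_velocity = eydt * (velocity - j1 * y * dt)
    return new_position, new_velocity
```

Called once per frame with the velocity cached between calls, the value approaches the target roughly halving the remaining distance every `halflife` seconds, settling without oscillation.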

AccelerationSpringDamper

Tracks a target velocity rather than a position. Adds an acceleration memory term for smoother response to sudden direction changes (e.g. thumbstick flicks). Suited for character movement and camera motion where input can change abruptly.

| Parameter | Type | Description |
|---|---|---|
| position | T (in/out) | Accumulated position |
| velocity | T (in/out) | Current velocity — must be cached between frames |
| acceleration | T (in/out) | Acceleration memory — must be cached between frames |
| targetVelocity | T | The desired velocity to spring toward |
| halflife | float | Settling time |
| deltaTime | float | Frame delta time |

DoubleSpringDamper

Two springs chained together, producing an S-curve (ease-in AND ease-out). Slower to start and slower to arrive than the simple spring. Gives a more polished feel for UI transitions or cinematic camera cuts.

| Parameter | Type | Description |
|---|---|---|
| position | T (in/out) | Current position |
| velocity | T (in/out) | Current velocity — must be cached between frames |
| previousPosition | T (in/out) | Internal state — must be cached between frames |
| previousVelocity | T (in/out) | Internal state — must be cached between frames |
| targetPosition | T | Goal position |
| halflife | float | Settling time |
| deltaTime | float | Frame delta time |

TimedSpringDamper

Attempts to reach the target by a specific time. Adjusts halflife internally so the spring arrives near the target time rather than asymptotically. Useful for choreographed movements with loose time targets.

| Parameter | Type | Description |
|---|---|---|
| position | T (in/out) | Current position |
| velocity | T (in/out) | Current velocity — must be cached between frames |
| previousTarget | T (in/out) | Last known target — must be cached between frames |
| targetPosition | T | Destination |
| targetTime | float | Time (in seconds) at which the spring should arrive |
| halflife | float | Base settling time |
| deltaTime | float | Frame delta time |

VelocitySpringDamper

Tracks a moving target by incorporating the target’s own velocity for predictive lead. The follower anticipates the direction of movement, reducing lag on fast-moving targets. Suited for camera following a character at speed.

| Parameter | Type | Description |
|---|---|---|
| position | T (in/out) | Follower position |
| velocity | T (in/out) | Follower velocity — must be cached between frames |
| previousPosition | T (in/out) | Internal state — must be cached between frames |
| targetPosition | T | Current target position |
| targetVelocity | T | Target’s velocity (used for predictive lead) |
| halflife | float | Settling time |
| deltaTime | float | Frame delta time |

QuaternionSpringDamper

Rotation spring that operates on angular velocity. Includes a flip option to reverse the rotation direction.

Note: The rotation output jumps to the goal each frame. Extract angularVelocity and integrate externally if you need a continuously smooth rotation output.

| Parameter | Type | Description |
|---|---|---|
| rotation | AZ::Quaternion (in/out) | Quaternion moved toward targetRotation |
| angularVelocity | AZ::Vector3 (in/out) | Angular velocity — must be cached between frames |
| targetRotation | AZ::Quaternion | Goal orientation |
| halflife | float | Settling time |
| deltaTime | float | Frame delta time |
| flip | bool | Reverses rotation direction (default: false) |

Usage Example

#include <GS_Core/Utility/Math/SpringsUtility.h>

// Member fields — must persist between frames
AZ::Vector3 m_followVelocity = AZ::Vector3::CreateZero();

// In your tick function:
AZ::Vector3 currentPos = GetEntityTranslation();
AZ::Vector3 targetPos  = GetTargetTranslation();

GS_Core::Springs::SimpleSpringDamperExact(
    currentPos,       // in/out: current position (updated in place)
    m_followVelocity, // in/out: velocity (cached between frames)
    targetPos,        // target position
    0.1f,             // halflife: 0.1 seconds to cover half the distance
    deltaTime
);

SetEntityTranslation(currentPos);

See Also

For related resources:


Get GS_Core

GS_Core — Explore this gem on the product page and add it to your project.

4.1.6.5 - Gradients

Multi-stop gradient types for sampling color, float, and vector values over a normalized range — used by motion tracks and feedback effects.

The Gradients utility provides three parallel gradient types for interpolating values over a normalized [0,1] range using a sorted list of marker points. All types are fully reflected (SerializeContext + EditContext) and editable in the Inspector with visual marker placement. Gradients are lazily sorted before evaluation.

Used by: FeedbackMaterialTrack (color/float animation), UiImageColorTrack, procedural visual effects, any system needing editable color or value ramps.

For usage guides and setup examples, see The Basics: GS_Core.

Gradient Slider in the O3DE Inspector

 

Contents


Gradient Types

| Type | Value Type | Description |
|---|---|---|
| FloatGradient | float | Single float value ramp |
| Vector2Gradient | AZ::Vector2 | 2D vector ramp |
| ColorGradient | AZ::Color | RGBA color ramp with separate color and alpha channels |

Marker Structs

Each gradient type has a corresponding marker struct that defines a single stop on the gradient.

FloatGradientMarker

| Field | Type | Description |
|---|---|---|
| markerValue | float | The float value at this stop |
| markerPosition | float | Position in [0, 1] along the gradient |

Vector2GradientMarker

| Field | Type | Description |
|---|---|---|
| markerValue | AZ::Vector2 | The 2D vector value at this stop |
| markerPosition | float | Position in [0, 1] along the gradient |

ColorGradientMarker

| Field | Type | Description |
|---|---|---|
| markerColor | AZ::Color | The color (RGB + A) at this stop |
| markerPosition | float | Position in [0, 1] along the gradient |

Gradient Classes

All three gradient types share the same structure and interface:

| Field / Method | Description |
|---|---|
| slider | AZStd::vector<Marker> — the sorted list of gradient stops (field name for Float/Vector2) |
| sorted | Internal dirty flag; gradient is lazily sorted before Evaluate is called |
| SortGradient() | Sorts markers by position. Called automatically before evaluation when markers change. |
| Evaluate(float t) | Returns the interpolated value at normalized position t in [0, 1] |
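The evaluate behavior described above — sorted markers, interpolation between the two surrounding stops, clamping outside the range — can be sketched in Python. This is a hypothetical stand-in for the gem's implementation, not its actual code:

```python
def evaluate_gradient(markers, t):
    """markers: list of (position, value) stops; t: normalized sample point."""
    # Sort by position, mirroring the gradient's lazy dirty-flag sort
    stops = sorted(markers, key=lambda m: m[0])
    # Clamp: outside the first/last marker, hold the end value
    if t <= stops[0][0]:
        return stops[0][1]
    if t >= stops[-1][0]:
        return stops[-1][1]
    # Linearly interpolate between the two stops that bracket t
    for (p0, v0), (p1, v1) in zip(stops, stops[1:]):
        if p0 <= t <= p1:
            f = (t - p0) / (p1 - p0)
            return v0 + (v1 - v0) * f
```

The same scan works for float, vector, or color values as long as the value type supports `+` and `*`; ColorGradient simply runs it twice, once per marker list.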

ColorGradient

ColorGradient maintains two separate marker lists for independent RGB and alpha control:

| Field | Description |
|---|---|
| colorSlider | AZStd::vector<ColorGradientMarker> — RGB stops |
| alphaSlider | AZStd::vector<ColorGradientMarker> — alpha stops |

ColorGradient Channels

ColorGradient exposes three evaluate methods for flexible sampling:

| Method | Returns | Description |
|---|---|---|
| Evaluate(float t) | AZ::Color | Full RGBA color — samples both colorSlider and alphaSlider |
| EvaluateColor(float t) | AZ::Color | RGB only — alpha is always 1.0 |
| EvaluateAlpha(float t) | float | Alpha channel only |

Usage Example

#include <GS_Core/Utility/Gradients/FloatGradientUtility.h>
#include <GS_Core/Utility/Gradients/ColorGradientUtility.h>

// Sample a float gradient at normalized progress (0 → 1)
float value = myFloatGradient.Evaluate(normalizedProgress);

// Sample full RGBA color
AZ::Color color = myColorGradient.Evaluate(normalizedProgress);

// Sample RGB and alpha independently
AZ::Color rgb   = myColorGradient.EvaluateColor(normalizedProgress);
float alpha     = myColorGradient.EvaluateAlpha(normalizedProgress);

See Also

For related resources:


Get GS_Core

GS_Core — Explore this gem on the product page and add it to your project.

4.1.6.6 - Entity Helpers

Utility functions for finding entities in the scene by name — runtime entity lookup without maintaining manual references.

The EntityUtility namespace provides helper functions for finding entities in the active scene by name. Useful for runtime entity lookup without maintaining manual entity references.

For usage guides and setup examples, see The Basics: GS_Core.


API Reference

| Function | Returns | Description |
|---|---|---|
| GetEntityByName(name) | AZ::Entity* | Finds an active entity with the given name. Returns nullptr if not found. |
| GetEntityIdByName(name) | AZ::EntityId | Finds an active entity’s ID by name. Returns an invalid EntityId if not found. |

Usage Example

#include <GS_Core/Utilities/EntityUtility.h>

// Find an entity by name
AZ::EntityId playerId = GS_Core::EntityUtility::GetEntityIdByName("Player");
if (playerId.IsValid())
{
    // Use the entity
}

See Also

For related resources:


Get GS_Core

GS_Core — Explore this gem on the product page and add it to your project.

4.1.6.7 - Angle Helpers

Angle and orientation math — sector mapping, yaw extraction, quaternion conversion, and hysteresis for directional classification.

The Orientation namespace (GS_Core::Orientation) provides angle-to-sector mapping for directional gameplay — animation direction selection, facing classification, and compass-style sector queries. It splits a full circle into N equal angular sectors and determines which sector a given angle falls into, with hysteresis to prevent rapid sector-switching at boundaries.

Used by: Paper-facing systems (gs_performer), directional input reactors (gs_unit), targeting systems (gs_interaction)

For usage guides and setup examples, see The Basics: GS_Core.

 

Contents


SectionConfig Enum

A reflected, editor-friendly enum selecting common sector counts and alignment modes. Two alignment modes exist per count:

  • cardinal / sideAligned — sector boundaries fall on the cardinal directions; sectors straddle diagonals
  • quarters / forwardAligned — sector boundaries fall on diagonals; sectors straddle cardinals
| Family | Values |
|---|---|
| x4 | x4_cardinal, x4_quarters |
| x6 | x6_sideAligned, x6_forwardAligned |
| x8 | x8_cardinal, x8_quarters |
| x10 | x10_sideAligned, x10_forwardAligned |
| x12 | x12_sideAligned, x12_forwardAligned |
| x14 | x14_sideAligned, x14_forwardAligned |
| x16 | x16_sideAligned, x16_forwardAligned |
| x18 | x18_sideAligned, x18_forwardAligned |
| x20 | x20_sideAligned, x20_forwardAligned |
| x22 | x22_sideAligned, x22_forwardAligned |
| x24 | x24_sideAligned, x24_forwardAligned |

Register with GS_Core::Orientation::ReflectOrientationEnums(context) to expose these in the Inspector.


RotationDirection Enum

Controls the winding convention used when computing sector angles.

| Value | Numeric | Description |
|---|---|---|
| CCW | 1 | Counter-clockwise winding |
| CW | -1 | Clockwise winding |

Pick Struct

Returned by all PickByAngle overloads. Contains full sector geometry for the selected sector.

| Field | Type | Description |
|---|---|---|
| index | int | Which sector was selected [0, N) |
| count | int | Total number of sectors |
| angle | float | The input angle as provided |
| width | float | Angular width of each sector (2π / N) |
| center | float | Center angle of the selected sector |
| start | float | Start angle of the selected sector |
| end | float | End angle of the selected sector |

Use GS_Core::Orientation::Changed(pick, prevIndex) to detect when the sector index changes between frames.
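The core of the sector mapping — divide the circle into N equal wedges, then refuse to switch wedges until the angle clears the boundary by the hysteresis margin — can be sketched in Python. Parameter names and the exact boundary rule here are assumptions for illustration, not GS_Core's implementation:

```python
import math

def pick_by_angle(angle, count, prev_index=-1, hysteresis_deg=0.0):
    # Divide the full circle into `count` equal sectors
    two_pi = 2.0 * math.pi
    width = two_pi / count
    a = angle % two_pi
    index = int(a // width) % count
    if prev_index >= 0 and index != prev_index:
        # Signed, wrapped distance from the previous sector's center
        center_prev = (prev_index + 0.5) * width
        diff = (a - center_prev + math.pi) % two_pi - math.pi
        # Stay in the previous sector until we clear its half-width
        # plus the hysteresis band — prevents flicker at boundaries
        if abs(diff) < width / 2.0 + math.radians(hysteresis_deg):
            index = prev_index
    return index
```

With 8 sectors and 5 degrees of hysteresis, an angle hovering right at a 45-degree boundary keeps returning the sector it was already in instead of alternating every frame.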


API Reference

Sector Mapping

| Function | Description |
|---|---|
| PickByAngle(angle, count, halfAligned, prevIndex, hysteresisDeg, startAngle, dir) | Primary overload — full parameter control |
| PickByAngle(angle, SectionConfig, prevIndex, hysteresisDeg, startAngle, dir) | Convenience overload using SectionConfig enum |
| PickByAngle(angle, count, offsetRad, prevIndex, hysteresisDeg, startAngle, dir) | Low-level overload with explicit alignment offset |
| ConfigToParams(cfg) | Maps a SectionConfig value to its (count, halfAligned) pair |
| Changed(pick, prevIndex) | Returns true if the sector index changed from prevIndex |

Angle Math

| Function | Description |
|---|---|
| WrapToTwoPi(x) | Wraps any angle to [0, 2π) |
| WrapToPi(x) | Wraps any angle to (-π, π] |
| AlignmentOffsetRad(N, halfAligned) | Returns the alignment shift in radians for a given count and mode |

Yaw and Quaternion

| Function | Description |
|---|---|
| YawFromDir(dir, rotDir) | Flat yaw from a direction vector (Z-up world) |
| YawFromDir(dir, upAxis, forwardHint, rotDir) | General yaw with custom up and forward axes |
| FlatSignedYaw_ToCam(camFwd, rootFwd, up, dir) | Signed yaw from camera-forward to entity-forward, projected flat |
| QuatFromYaw(yawRad, upAxis) | Builds a rotation quaternion from a yaw angle and up axis |

Reflection

| Function | Description |
|---|---|
| ReflectOrientationEnums(context) | Registers SectionConfig and RotationDirection with SerializeContext |

Usage Example

#include <GS_Core/Utility/Math/SectionByAngle.h>

// Member field — track previous sector to enable hysteresis
int m_prevSectorIndex = -1;

// In your tick / update function:
AZ::Vector3 moveDir = GetMovementDirection();

// Get the yaw from the movement direction (Z-up world)
float yaw = GS_Core::Orientation::YawFromDir(moveDir, GS_Core::Orientation::RotationDirection::CCW);

// Map to an 8-way sector with 5-degree hysteresis
GS_Core::Orientation::Pick pick = GS_Core::Orientation::PickByAngle(
    yaw,
    GS_Core::Orientation::SectionConfig::x8_cardinal,
    m_prevSectorIndex,  // previous index for hysteresis
    5.0f                // hysteresis in degrees
);

if (GS_Core::Orientation::Changed(pick, m_prevSectorIndex))
{
    m_prevSectorIndex = pick.index;
    // React to direction change: play animation, update facing, etc.
}

See Also

For related resources:


Get GS_Core

GS_Core — Explore this gem on the product page and add it to your project.

4.1.6.8 - Spline Helpers

Utility functions for O3DE spline queries — closest point, fraction, and local/world space conversion by entity ID.

The SplineUtility namespace (GS_Core::SplineUtility) provides free functions for querying the closest point on an O3DE spline component attached to an entity. All functions take an entity ID and a position in world or local space.

Used by: PathTo_DialoguePerformance (gs_cinematics), any system that guides an entity along a spline path.

For usage guides and setup examples, see The Basics: GS_Core.

 

Contents


API Reference

| Function | Parameters | Description |
|---|---|---|
| FindClosestWorldPoint | AZ::EntityId entityId, AZ::Vector3 worldPos | Returns the closest world-space point on the spline attached to entityId |
| FindClosestLocalPoint | AZ::EntityId entityId, AZ::Vector3 localPos | Returns the closest point in local spline space |
| FindClosestFraction | AZ::EntityId entityId, AZ::Vector3 worldPos | Returns the normalized [0, 1] fraction along the spline of the closest point to worldPos |

Usage Example

#include <GS_Core/Utility/SplineUtility.h>

// Find how far along a path the player is (0 = start, 1 = end)
AZ::Vector3 playerPos = GetPlayerWorldPosition();
float fraction = GS_Core::SplineUtility::FindClosestFraction(splineEntityId, playerPos);

// Get the actual closest world position on the path
AZ::Vector3 closestPoint = GS_Core::SplineUtility::FindClosestWorldPoint(splineEntityId, playerPos);

See Also

For related resources:


Get GS_Core

GS_Core — Explore this gem on the product page and add it to your project.

4.1.6.9 - Physics Trigger Volume

Base class and component for physics trigger and collision handling — inherit to create interactive volumes with enter, exit, and hold callbacks.

The physics trigger system consists of two classes: PhysicsTriggeringVolume (the non-component base class with all trigger logic) and PhysicsTriggerComponent (the concrete O3DE component that wraps it with game-lifecycle awareness). Inherit from either to create interactive volumes — damage zones, pickup areas, dialogue triggers, environmental hazards — without writing boilerplate physics code.

The base class handles entity tracking (one enter/exit per entity), supports both trigger overlaps and collision contacts, and provides optional hold/persist callbacks for continuous processing.

For usage guides and setup examples, see The Basics: GS_Core.

 

Contents


PhysicsTriggeringVolume

File: Physics/PhysicsTriggeringVolume.h
Namespace: GS_Core
Base classes: Physics::RigidBodyNotificationBus::Handler, DebugLoggingHelper

A non-component base class that manages the full lifecycle of a physics trigger or collision volume. Add it alongside your own AZ::Component via multiple inheritance.

Configuration Fields

| Field | Type | Default | Description |
|---|---|---|---|
| EnableCollisionAsTrigger | bool | false | Treat collision begin/persist/end events as trigger-style callbacks |
| EnableTriggerHoldUpdate | bool | false | Enable per-physics-tick TriggerHold callback while entities are inside |

Lifecycle Methods

| Method | Description |
|---|---|
| ConnectTriggering(AZ::EntityId entityId) | Subscribes to physics events on the given entity. Call from your component’s Activate(). |
| DisconnectTriggering() | Unsubscribes and clears all handlers. Call from your component’s Deactivate(). |
| OnPhysicsEnabled(AZ::EntityId entityId) | Called when the physics body becomes active. Internally calls InitPhysicsTriggerHandler. |
| OnPhysicsDisabled(AZ::EntityId entityId) | Called when the physics body is destroyed. Internally calls DisconnectTriggering. |

Virtual Callbacks

Override these in your subclass to react to trigger and collision events.

| Method | Parameters | Returns | Description |
|---|---|---|---|
| TriggerEnter | AZ::EntityId entity | bool | Called when an entity enters the volume. Return false to reject the entity. |
| TriggerHold | float fixedDeltaTime | void | Called each physics tick while entities are inside. Requires EnableTriggerHoldUpdate = true. |
| CollisionHold | AZ::EntityId entity | void | Called per-entity each physics tick for collision events. Requires EnableCollisionAsTrigger = true. |
| TriggerExit | AZ::EntityId entity | bool | Called when an entity exits the volume. Return false to reject. |

Internal State

| Field | Description |
|---|---|
| m_entities | AZStd::unordered_set<AZ::EntityId> — entities currently inside the volume |
| m_triggerEntity | The entity whose physics body is being monitored |

PhysicsTriggerComponent

File: Physics/PhysicsTriggerComponent.h
Namespace: GS_Core
Base classes: PhysicsTriggeringVolume, GameManagerNotificationBus::Handler

A concrete O3DE component that wraps PhysicsTriggeringVolume and adds game-lifecycle awareness. Automatically disables the trigger during game standby and re-enables it on exit.

Used by: WorldTriggerComponent (gs_interaction), ColliderTriggerSensorComponent — these subclass PhysicsTriggerComponent to implement game-specific trigger behaviors.

Data Fields

| Field | Type | Default | Description |
|---|---|---|---|
| isActive | bool | false | Whether the trigger is currently armed |
| triggerEntity | AZ::EntityId | invalid | The entity providing the physics body to monitor |

Virtual Methods

Override these to implement game-specific trigger behavior.

| Method | Description |
|---|---|
| ActivatePhysicsTrigger(AZ::EntityId entity) | Called when triggered. Subclasses perform the response here. |
| DeactivatePhysicsTrigger() | Called on exit. Subclasses clean up here. |

Standby Handling

| Method | Description |
|---|---|
| OnEnterStandby() | Disables the trigger when the game enters standby |
| OnExitStandby() | Re-enables the trigger when the game exits standby |

Extending Physics Trigger Volume

Use the PhysicsTriggerComponent ClassWizard template to generate a new trigger component with boilerplate already in place — see GS_Core Templates.

Inherit directly from PhysicsTriggeringVolume alongside AZ::Component for a lightweight custom trigger. Use PhysicsTriggerComponent as your base if you need the built-in standby handling.

Header (.h)

#pragma once
#include <AzCore/Component/Component.h>
#include <GS_Core/Utility/Physics/PhysicsTriggeringVolume.h>

namespace MyProject
{
    class DamageZoneComponent
        : public AZ::Component
        , public virtual GS_Core::PhysicsTriggeringVolume
    {
    public:
        AZ_COMPONENT_DECL(DamageZoneComponent);

        static void Reflect(AZ::ReflectContext* context);

    protected:
        void Activate() override;
        void Deactivate() override;

        // Trigger overrides
        bool TriggerEnter(AZ::EntityId entity) override;
        void TriggerHold(float fixedDeltaTime) override;
        bool TriggerExit(AZ::EntityId entity) override;

    private:
        float m_damagePerSecond = 10.0f;
    };
}

Implementation (.cpp)

#include "DamageZoneComponent.h"
#include <AzCore/Serialization/SerializeContext.h>

namespace MyProject
{
    AZ_COMPONENT_IMPL(DamageZoneComponent, "DamageZoneComponent", "{YOUR-UUID-HERE}");

    void DamageZoneComponent::Reflect(AZ::ReflectContext* context)
    {
        if (auto sc = azrtti_cast<AZ::SerializeContext*>(context))
        {
            sc->Class<DamageZoneComponent, AZ::Component, GS_Core::PhysicsTriggeringVolume>()
                ->Version(0)
                ->Field("DamagePerSecond", &DamageZoneComponent::m_damagePerSecond);

            if (AZ::EditContext* ec = sc->GetEditContext())
            {
                ec->Class<DamageZoneComponent>("Damage Zone", "Deals damage inside the trigger volume")
                    ->ClassElement(AZ::Edit::ClassElements::EditorData, "")
                        ->Attribute(AZ::Edit::Attributes::Category, "MyProject")
                        ->Attribute(AZ::Edit::Attributes::AppearsInAddComponentMenu, AZ_CRC_CE("Game"))
                    ->DataElement(AZ::Edit::UIHandlers::Default,
                        &DamageZoneComponent::m_damagePerSecond,
                        "Damage Per Second", "Damage dealt per second while inside");
            }
        }
    }

    void DamageZoneComponent::Activate()
    {
        EnableTriggerHoldUpdate = true;  // Enable hold callbacks for continuous damage
        GS_Core::PhysicsTriggeringVolume::ConnectTriggering(GetEntityId());
    }

    void DamageZoneComponent::Deactivate()
    {
        GS_Core::PhysicsTriggeringVolume::DisconnectTriggering();
    }

    bool DamageZoneComponent::TriggerEnter(AZ::EntityId entity)
    {
        AZ_TracePrintf("DamageZone", "Entity entered damage zone");
        return true; // Accept the entity
    }

    void DamageZoneComponent::TriggerHold(float fixedDeltaTime)
    {
        // Apply damage to all entities currently inside
        for (const AZ::EntityId& entity : m_entities)
        {
            // Apply m_damagePerSecond * fixedDeltaTime damage to entity...
        }
    }

    bool DamageZoneComponent::TriggerExit(AZ::EntityId entity)
    {
        AZ_TracePrintf("DamageZone", "Entity exited damage zone");
        return true;
    }
}

Setup

  1. Create an entity with a PhysX Collider component set as a trigger.
  2. Add your custom trigger component (e.g., DamageZoneComponent).
  3. Configure the collider shape and component properties.
  4. Entities with PhysX rigid bodies that enter the collider will trigger your callbacks.

See Also

For conceptual overviews and usage guides:

For related resources:


Get GS_Core

GS_Core — Explore this gem on the product page and add it to your project.

4.1.6.10 - Weighted Random

Weighted random selection — pick a key from a weighted map with probability proportional to each entry’s float weight.

RandomUtils (GS_Core::RandomUtils) provides a single templated static method for weighted random selection from a map. The caller provides their own AZ::SimpleLcgRandom instance for deterministic control over the random sequence.

Used by: RandomNodeData (dialogue random node), KlattVoiceComponent (phoneme selection), any system needing weighted random draws.

For usage guides and setup examples, see The Basics: GS_Core.

 

Contents


API Reference

template<typename T>
static T GetRandomWeighted(
    AZ::SimpleLcgRandom* rand,
    const AZStd::unordered_map<T, float>& weightedTable
);

Selects one key from weightedTable with probability proportional to its float weight value. Higher weights mean higher probability of selection.

| Parameter | Description |
|---|---|
| rand | Caller-owned AZ::SimpleLcgRandom instance. The caller controls seeding and lifetime. |
| weightedTable | Map of candidates to weights. Keys are the items to select from; values are their relative weights. |

Returns: The selected key of type T.

Edge cases:

  • Falls back to the last entry on floating-point precision edge cases.
  • Asserts if weightedTable is empty.
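The selection strategy described here — probability proportional to weight, with a last-entry fallback for floating-point edge cases — reduces to a cumulative-weight scan. A Python sketch of the idea (illustrative only, using Python's standard RNG rather than AZ::SimpleLcgRandom):

```python
import random

def get_random_weighted(rng, weighted_table):
    # Draw a point in [0, total) and scan cumulative weights until it is covered
    total = sum(weighted_table.values())
    point = rng.random() * total
    cumulative = 0.0
    for key, weight in weighted_table.items():
        cumulative += weight
        if point < cumulative:
            return key
    # Floating-point precision fallback: return the last entry
    return key
```

An entry with weight 50 is drawn ten times as often as one with weight 5, and weights need not sum to any particular value — only their ratios matter.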

Usage Example

#include <GS_Core/Utility/Random/RandomUtils.h>
#include <AzCore/Math/Random.h>

// Member field — keep the random instance alive across calls
AZ::SimpleLcgRandom m_random;

// Build a weighted table: key = item, value = relative weight
AZStd::unordered_map<AZStd::string, float> lootTable;
lootTable["Common Sword"]    = 50.0f;
lootTable["Rare Shield"]     = 30.0f;
lootTable["Epic Helmet"]     = 15.0f;
lootTable["Legendary Ring"]  =  5.0f;

// Select an item — "Common Sword" is 10x more likely than "Legendary Ring"
AZStd::string selected = GS_Core::RandomUtils::GetRandomWeighted(&m_random, lootTable);

See Also

For related resources:


Get GS_Core

GS_Core — Explore this gem on the product page and add it to your project.

4.1.7 - Templates

ClassWizard templates for GS_Core — manager components, save system savers, input readers, and physics trigger volumes.

All GS_Core extension types are generated through the ClassWizard CLI. The wizard handles UUID generation, cmake file-list registration, and module descriptor injection automatically. Never create these files from scratch.

For usage guides and setup examples, see The Basics: GS_Core.

python ClassWizard.py \
    --template <TemplateName> \
    --gem <GemPath> \
    --name <SymbolName> \
    [--input-var key=value ...]

<GemPath> is the full path to your gem root. <SymbolName> becomes ${Name} — the prefix on all generated filenames and class names.

 

Contents


Manager Component

Template: GS_ManagerComponent

Creates a standard game component with an optional EBus interface header. This is the baseline component pattern for all GS_Play manager-style objects — systems that own state and expose it to other components via a request bus.

Generated files:

  • Source/${Name}ManagerComponent.h/.cpp
  • Include/${GemName}/${Name}Bus.h (optional — the EBus interface)

CLI:

python ClassWizard.py --template GS_ManagerComponent --gem <GemPath> --name <Name>

# Skip the EBus header if no bus is needed:
python ClassWizard.py --template GS_ManagerComponent --gem <GemPath> --name <Name> \
    --input-var skip_interface=true

Input vars:

| Var | Type | Default | Description |
|---|---|---|---|
| skip_interface | toggle | false | Omit the ${Name}Bus.h EBus interface header |

Post-generation: None — cmake registration and module descriptor are fully automatic.

See also: GS_Managers — the built-in manager system this component integrates with.


Saver Component

Template: SaverComponent

Creates a component that participates in the save system. Handles serializing and restoring a specific block of game state when the save system broadcasts its save/load events.

Generated files:

  • Source/${Name}SaverComponent.h/.cpp

CLI:

python ClassWizard.py --template SaverComponent --gem <GemPath> --name <Name>

Post-generation: Implement BuildSaveData() and ProcessLoad() bodies with save record reads/writes via GS_Core::SaveSystemRequestBus. Set GetSubComponentName() to a unique string so save keys do not collide with other savers.

See also: GS_Save / Savers — full extension guide with header and implementation examples.


InputReader Component

Template: GS_InputReaderComponent

Sits on the Controller Entity. Reads raw hardware input events (keyboard, gamepad) and translates them into named input data events broadcast via InputDataNotificationBus. Downstream InputReactor components on the Unit Entity subscribe to those named events.

Generated files:

  • Source/${Name}InputReaderComponent.h/.cpp

CLI:

python ClassWizard.py --template GS_InputReaderComponent --gem <GemPath> --name <Name>

Post-generation: Bind specific hardware input channels in Activate() and implement event handlers that call InputDataNotificationBus::Broadcast(...). Pair with a corresponding InputReactor on the Unit side.

See also: GS_Unit / Input Data — the full input pipeline overview.


Physics Trigger Component

Template: PhysicsTriggerComponent

Creates a component that wraps a PhysX trigger volume. Responds to TriggerEnter / TriggerExit events to fire game logic when entities enter or leave a physics shape.

Generated files:

  • Source/${Name}PhysicsTriggerComponent.h/.cpp

CLI:

python ClassWizard.py --template PhysicsTriggerComponent --gem <GemPath> --name <Name>

Post-generation: Implement TriggerEnter / TriggerExit / TriggerHold bodies. Requires a PhysX Shape component on the same entity with Trigger mode enabled. Stack multiple trigger components on one entity for compound logic.

See also: Physics Trigger Volume — full extension guide with header and implementation examples.


See Also

For the full API, component properties, and C++ extension guide:

For all ClassWizard templates across GS_Play gems:


Get GS_Core

GS_Core — Explore this gem on the product page and add it to your project.

4.1.8 - 3rd Party Implementations

For usage guides and setup examples, see The Basics: GS_Core.


Get GS_Core

GS_Core — Explore this gem on the product page and add it to your project.

4.2 - GS_AI

Foundation scaffold for AI controller integration in GS_Play projects. Provides the system component and extension point for custom AI controller gems.

GS_AI is the AI foundation gem for GS_Play. It installs the system-level scaffold that custom AI controller gems and in-project AI logic hook into. The gem itself ships a single system component — it does not provide ready-made AI behaviours. Instead it defines the integration pattern: your AI controllers extend the provided base classes and register against the system component at activation. The gem depends on GS_Core.

For usage guides and setup examples, see The Basics: GS_AI.

 

Contents


AI System

The AI system component is the runtime anchor for the gem. It initializes the AI subsystem, manages the controller registry, and provides the bus interface that AI controllers and external systems use to query active agents. All custom AI controllers in a project register through this component.

Component | Purpose
GS_AISystemComponent | Runtime system component. Initializes the AI subsystem and hosts the controller registry. Extension point for custom AI controller gems and in-project AI logic.

Extension pattern:

Custom AI controllers are O3DE components that extend the GS_AI base controller interface and activate/deactivate against GS_AISystemComponent. No gem source modification is required — new controllers self-register on Activate() via the system bus and deregister on Deactivate().
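The register-on-Activate / deregister-on-Deactivate flow can be modeled without the engine as a simple name-keyed registry. The class and method names below are illustrative stand-ins, not the actual GS_AI bus API.

```cpp
#include <cstddef>
#include <string>
#include <unordered_map>

// Minimal model of the GS_AI self-registration pattern. ControllerRegistry
// stands in for the registry hosted by GS_AISystemComponent; a controller
// would call Register from Activate() and Deregister from Deactivate().
class ControllerRegistry
{
public:
    void Register(const std::string& name, void* controller)
    {
        m_controllers[name] = controller; // controller announces itself
    }

    void Deregister(const std::string& name)
    {
        m_controllers.erase(name); // controller removes itself on shutdown
    }

    bool IsRegistered(const std::string& name) const
    {
        return m_controllers.count(name) != 0;
    }

    std::size_t ActiveCount() const { return m_controllers.size(); }

private:
    std::unordered_map<std::string, void*> m_controllers;
};
```

The key property of this pattern is that the system component never needs to know controller types in advance — any number of controller gems can plug in without modifying gem source.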

AI System API


Installation

GS_AI requires GS_Core and is a prerequisite for any project that uses custom AI controller gems.

  1. Enable GS_AI and GS_Core in your O3DE project’s gem list.
  2. The GS_AISystemComponent activates automatically as a system component — no manual placement in the level is required.
  3. Implement your AI controller as an O3DE component that connects to GS_AIRequestBus on Activate() and disconnects on Deactivate().
  4. Add your custom AI controller component to agent entities in the level.

See Also

For conceptual overviews and usage guides:

For related resources:


Get GS_AI

GS_AI — Explore this gem on the product page and add it to your project.

4.2.1 - 3rd Party Implementations

For usage guides and setup examples, see The Basics: GS_AI.


Get GS_AI

GS_AI — Explore this gem on the product page and add it to your project.

4.3 - GS_Audio

Audio management, event-based sound playback, multi-layer music scoring, mixing buses with effects, and Klatt formant voice synthesis with 3D spatial audio.

GS_Audio provides a complete audio solution for GS_Play projects. It includes an event-based sound playback system, multi-layer music scoring, named mixing buses with configurable effects chains, and a built-in Klatt formant voice synthesizer with 3D spatial audio. All features integrate with the GS_Play manager lifecycle and respond to standby mode automatically.

For usage guides and setup examples, see The Basics: GS_Audio.

 

Contents


Audio Management

The Audio Manager singleton initializes the MiniAudio engine, manages mixing buses, loads audio event libraries, and coordinates score track playback. It extends GS_ManagerComponent and participates in the standard two-stage initialization.

Component | Purpose
Audio Manager | Master audio controller – engine lifecycle, bus routing, event library loading, score management.

Audio Manager API


Audio Events

Event-based sound playback. Events define clip pools with selection rules, spatialization mode, concurrent limits, and repeat-hold behavior. Events are grouped into library assets.

Component / Asset | Purpose
GS_AudioEvent | Single event definition – clip pool, selection type, 2D/3D mode.
AudioEventLibrary | Asset containing a collection of audio events.
GS_AudioEventComponent | Per-entity audio event playback with 3D positioning.

Audio Events API


Mixing & Effects

Named audio buses with configurable effects chains and environmental influence. Includes 9 built-in audio filter types.

Component / Type | Purpose
GS_MixingBus | Custom MiniAudio node for mixing and effects processing.
BusEffectsPair | Maps a bus name to an effects chain.
AudioBusInfluenceEffects | Environmental effects with priority stacking.

Mixing & Effects API


Score Arrangement

Multi-layer music scoring with configurable time signatures, tempo, and layer control.

Asset | Purpose
ScoreArrangementTrack | Multi-layer music asset – time signature, BPM, fade, layers.
ScoreLayer | Individual track within a score arrangement.

Score Arrangement API


Klatt Voice Synthesis

Built-in text-to-speech using Klatt formant synthesis with 3D spatial audio.

Component | Purpose
KlattVoiceSystemComponent | Shared SoLoud engine management, 3D listener tracking.
KlattVoiceComponent | Per-entity voice with spatial audio, phoneme mapping, segment queue.

Klatt Voice API


Dependencies

  • GS_Core (required)
  • MiniAudio (third-party audio library)
  • SoLoud (embedded, for voice synthesis)

Installation

  1. Enable the GS_Audio gem in your project configuration.
  2. Ensure GS_Core and MiniAudio are also enabled.
  3. Create an Audio Manager prefab and add it to the Game Manager’s Startup Managers list.
  4. Create Audio Event Library assets for your sound effects.
  5. Add GS_AudioEventComponent to entities that need to play sounds.

See Also

For conceptual overviews and usage guides:

For related resources:


Get GS_Audio

GS_Audio — Explore this gem on the product page and add it to your project.

4.3.1 - Audio Manager

Master audio controller – engine initialization, mixing bus routing, event library loading, and score playback coordination.

The Audio Manager is the master audio controller for every GS_Play project. It extends GS_ManagerComponent and participates in the standard two-stage initialization managed by the Game Manager. On startup it initializes the MiniAudio engine, creates the named mixing bus graph, loads audio event libraries, and coordinates score track playback.

Like all GS_Play managers, the Audio Manager responds to standby mode automatically – muting or pausing audio output when the game enters a blocking operation such as a stage change.

For usage guides and setup examples, see The Basics: GS_Audio.

Audio Manager component in the O3DE Inspector

 

Contents


How It Works

Engine Lifecycle

When the Audio Manager activates, it initializes a MiniAudio engine instance and builds the mixing bus graph from its configured bus list. During Stage 2 startup it loads all referenced Audio Event Library assets so that events can be triggered immediately once startup completes.

Mixing Bus Routing

All audio output flows through named mixing buses. Each bus is a GS_MixingBus node in the MiniAudio graph with its own volume level and optional effects chain. The Audio Manager owns the top-level routing and exposes volume control per bus through the request bus.

Score Playback

The Audio Manager coordinates playback of ScoreArrangementTrack assets – multi-layer musical scores with configurable tempo, time signature, and layer selection. Score tracks are loaded and managed through the request bus.


Inspector Properties

Property | Type | Description
Mixing Buses | AZStd::vector<BusEffectsPair> | Named mixing buses with optional effects chains. Each entry maps a bus name to an effects configuration.
Startup Libraries | AZStd::vector<AZ::Data::Asset<AudioEventLibrary>> | Audio event library assets to load during startup. Events in these libraries are available immediately after initialization.
Default Master Volume | float | Initial master volume level (0.0 to 1.0).

API Reference

GS_AudioManagerComponent

Field | Value
TypeId | {F28721FD-B9FD-4C04-8CD1-6344BD8A3B78}
Extends | GS_Core::GS_ManagerComponent
Header | GS_Audio/GS_AudioManagerBus.h

Request Bus: AudioManagerRequestBus

Commands sent to the Audio Manager. Singleton bus – Single address, single handler.

Method | Parameters | Returns | Description
PlayAudioEvent | const AZStd::string& eventName | void | Plays the named audio event from the loaded event libraries.
PlayAudioEvent | const AZStd::string& eventName, const AZ::EntityId& entityId | void | Plays the named audio event positioned at the specified entity for 3D spatialization.
StopAudioEvent | const AZStd::string& eventName | void | Stops playback of the named audio event.
StopAllAudioEvents | — | void | Stops all currently playing audio events.
SetMixerVolume | const AZStd::string& busName, float volume | void | Sets the volume of a named mixing bus (0.0 to 1.0).
GetMixerVolume | const AZStd::string& busName | float | Returns the current volume of a named mixing bus.
SetMasterVolume | float volume | void | Sets the master output volume (0.0 to 1.0).
GetMasterVolume | — | float | Returns the current master output volume.
LoadEventLibrary | const AZ::Data::Asset<AudioEventLibrary>& library | void | Loads an audio event library at runtime, making its events available for playback.
UnloadEventLibrary | const AZ::Data::Asset<AudioEventLibrary>& library | void | Unloads a previously loaded audio event library.
PlayScoreTrack | const AZ::Data::Asset<ScoreArrangementTrack>& track | void | Begins playback of a score arrangement track.
StopScoreTrack | — | void | Stops the currently playing score arrangement track.

Notification Bus: AudioManagerNotificationBus

Events broadcast by the Audio Manager. Multiple handler bus – any number of components can subscribe.

Event | Parameters | Description
OnAudioEventStarted | const AZStd::string& eventName | Fired when an audio event begins playback.
OnAudioEventStopped | const AZStd::string& eventName | Fired when an audio event stops playback.
OnScoreTrackStarted | — | Fired when a score arrangement track begins playback.
OnScoreTrackStopped | — | Fired when a score arrangement track stops playback.
OnMixerVolumeChanged | const AZStd::string& busName, float volume | Fired when a mixing bus volume changes.

See Also

For conceptual overviews and usage guides:

For component references:

For related resources:


Get GS_Audio

GS_Audio — Explore this gem on the product page and add it to your project.

4.3.2 - Audio Events

Event-based sound playback – audio event definitions, clip pool selection, spatialization, and event library assets.

Audio Events are the primary mechanism for playing sounds in GS_Play. A GS_AudioEvent defines a single sound event with a pool of audio clips, selection rules (random or sequential), 2D/3D spatialization mode, concurrent playback limits, and repeat-hold behavior. Events are grouped into AudioEventLibrary assets that the Audio Manager loads at startup or on demand.

When an event is triggered, the system selects a clip from the pool according to the configured selection type, checks concurrent limits, and routes the output through the appropriate mixing bus.
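The trigger flow described above — clip selection, concurrent limit, repeat-hold gating — can be sketched as a small engine-free model. Field names mirror the GS_AudioEvent properties; the logic is illustrative, not the shipped implementation.

```cpp
#include <cstddef>
#include <cstdlib>

// Engine-free sketch of GS_AudioEvent trigger rules.
enum class PoolSelectionType { Random, Increment };

struct AudioEventModel
{
    std::size_t clipCount = 0;
    PoolSelectionType selection = PoolSelectionType::Increment;
    int maxConcurrent = 0;       // 0 means unlimited
    float repeatHoldTime = 0.0f; // minimum seconds between triggers

    std::size_t nextIndex = 0;   // Increment cursor
    int playing = 0;
    float lastTriggerTime = -1.0e9f;

    // Returns the clip index to play, or -1 if the trigger is rejected.
    int Trigger(float now)
    {
        if (repeatHoldTime > 0.0f && now - lastTriggerTime < repeatHoldTime)
        {
            return -1; // retriggered inside the repeat-hold window
        }
        if (maxConcurrent > 0 && playing >= maxConcurrent)
        {
            return -1; // no free concurrent slot
        }
        lastTriggerTime = now;
        ++playing;
        if (selection == PoolSelectionType::Random)
        {
            return static_cast<int>(std::rand() % clipCount);
        }
        int index = static_cast<int>(nextIndex);
        nextIndex = (nextIndex + 1) % clipCount; // wrap at end of pool
        return index;
    }
};
```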

For usage guides and setup examples, see The Basics: GS_Audio.

Audio Event Library asset in the O3DE Asset Editor

 

Contents


Data Model

GS_AudioEvent

A single audio event definition containing all playback configuration.

Field | Value
TypeId | {2A6E337B-2B9A-4CB2-8760-BF3A12C50CA0}

Field | Type | Description
Event Name | AZStd::string | Unique identifier for this event within its library. Used by PlayAudioEvent calls.
Audio Clips | AZStd::vector<AudioClipAsset> | Pool of audio clip assets available for this event.
Pool Selection Type | PoolSelectionType | How clips are chosen from the pool: Random or Increment (sequential).
Is 3D | bool | When true, audio is spatialized in 3D space relative to the emitting entity. When false, audio plays as 2D (non-positional).
Max Concurrent | int | Maximum number of simultaneous instances of this event. Additional triggers are ignored until a slot opens. 0 means unlimited.
Repeat Hold Time | float | Minimum time in seconds before this event can be retriggered. Prevents rapid-fire repetition of the same sound.
Mixing Bus | AZStd::string | Name of the mixing bus to route this event’s output through.
Volume | float | Base volume level for this event (0.0 to 1.0).
Pitch Variance | float | Random pitch variation range applied each time the event plays. 0.0 means no variation.

AudioEventLibrary

An asset containing a collection of GS_AudioEvent definitions. Libraries are loaded by the Audio Manager at startup or at runtime via LoadEventLibrary.

Field | Value
TypeId | {04218A1E-4399-4A7F-9649-ED468B5EF76B}
Extends | AZ::Data::AssetData
Reflection | Requires GS_AssetReflectionIncludes.h — see Serialization Helpers

Field | Type | Description
Events | AZStd::vector<GS_AudioEvent> | The collection of audio events defined in this library.

PoolSelectionType (Enum)

Determines how audio clips are selected from an event’s clip pool.

Field | Value
TypeId | {AF10C5C8-E54E-41DA-917A-6DF12CA89CE3}

Value | Description
Random | A random clip is chosen from the pool each time the event plays.
Increment | Clips are played sequentially, advancing to the next clip in the pool on each trigger. Wraps around at the end.

GS_AudioEventComponent

Per-entity component that provides audio event playback with optional 3D positioning. Attach this component to any entity that needs to emit sounds.

Field | Type | Description
Audio Events | AZStd::vector<AZStd::string> | List of event names this component can play. Events must exist in a loaded library.
Auto Play | bool | When true, the first event in the list plays automatically on activation.

See Also

For conceptual overviews and usage guides:

For component references:

  • Audio Manager – Engine initialization and event library loading

For related resources:


Get GS_Audio

GS_Audio — Explore this gem on the product page and add it to your project.

4.3.3 - Mixing & Effects

Named audio mixing buses with configurable effects chains, environmental influence, and 9 built-in audio filter types.

GS_Audio provides a named mixing bus system built on custom MiniAudio nodes. Each GS_MixingBus is a node in the audio graph with its own volume level and an optional chain of audio filters. Buses are configured in the Audio Manager Inspector and can be controlled at runtime through the mixing request bus.

The effects system includes 9 built-in filter types covering frequency shaping, equalization, delay, and reverb. Environmental influence effects allow game world volumes (rooms, weather zones) to push effects onto buses with priority-based stacking.

For usage guides and setup examples, see The Basics: GS_Audio.

Contents


GS_MixingBus

Custom MiniAudio node for mixing and effects processing.

Field | Value
TypeId | {26E5BA8D-33E0-42E4-BBC0-6A3B2C46F52E}

API Reference

Request Bus: GS_MixingRequestBus

Mixer control commands. Singleton bus – Single address, single handler.

Method | Parameters | Returns | Description
SetBusVolume | const AZStd::string& busName, float volume | void | Sets the volume of a named mixing bus (0.0 to 1.0).
GetBusVolume | const AZStd::string& busName | float | Returns the current volume of a named mixing bus.
MuteBus | const AZStd::string& busName, bool mute | void | Mutes or unmutes a named mixing bus.
IsBusMuted | const AZStd::string& busName | bool | Returns whether a named mixing bus is currently muted.
ApplyBusEffects | const AZStd::string& busName, const AudioBusEffects& effects | void | Applies an effects chain to a named mixing bus, replacing any existing effects.
ClearBusEffects | const AZStd::string& busName | void | Removes all effects from a named mixing bus.
PushInfluenceEffects | const AZStd::string& busName, const AudioBusInfluenceEffects& effects | void | Pushes environmental influence effects onto a bus with priority stacking.
PopInfluenceEffects | const AZStd::string& busName, int priority | void | Removes influence effects at the specified priority level from a bus.

Audio Filters

All 9 built-in filter types. Each filter is configured as part of an effects chain applied to a mixing bus.

Filter | Type | Description
GS_LowPassFilter | Frequency cutoff | Attenuates frequencies above the cutoff point. Used for muffling, distance simulation, and underwater effects.
GS_HighPassFilter | Frequency cutoff | Attenuates frequencies below the cutoff point. Used for thinning audio, radio/telephone effects.
GS_BandPassFilter | Band isolation | Passes only frequencies within a specified band, attenuating everything outside. Combines low-pass and high-pass behavior.
GS_NotchFilter | Band removal | Attenuates frequencies within a narrow band while passing everything outside. The inverse of band-pass.
GS_PeakingEQFilter | Band boost/cut | Boosts or cuts frequencies around a center frequency with configurable bandwidth. Used for tonal shaping.
GS_LowShelfFilter | Low frequency shelf | Boosts or cuts all frequencies below a threshold by a fixed amount. Used for bass adjustment.
GS_HighShelfFilter | High frequency shelf | Boosts or cuts all frequencies above a threshold by a fixed amount. Used for treble adjustment.
GS_DelayFilter | Echo/delay | Produces delayed repetitions of the input signal. Configurable delay time and feedback amount.
GS_ReverbFilter | Room reverb | Simulates room acoustics by adding dense reflections. Configurable room size and damping.

Data Structures

BusEffectsPair

Maps a bus name to an effects chain configuration. Used in the Audio Manager’s Inspector to define per-bus effects at design time.

Field | Value
TypeId | {AD9E26C9-C172-42BF-B38C-BB06FC704E36}

Field | Type | Description
Bus Name | AZStd::string | The name of the mixing bus this effects chain applies to.
Effects | AudioBusEffects | The effects chain configuration for this bus.

AudioBusEffects

A collection of audio filter configurations that form an effects chain on a mixing bus.

Field | Value
TypeId | {15EC6932-1F88-4EC0-9683-6D80AE982820}

Field | Type | Description
Filters | AZStd::vector<AudioFilter> | Ordered list of audio filters applied in sequence.

AudioBusInfluenceEffects

Environmental effects with priority-based stacking. Game world volumes (rooms, weather zones, underwater areas) push influence effects onto mixing buses. Higher priority influences override lower ones.

Field | Value
TypeId | {75D039EC-7EE2-4988-A2ED-86689449B575}

Field | Type | Description
Priority | int | Stacking priority. Higher values override lower values when multiple influences target the same bus.
Effects | AudioBusEffects | The effects chain to apply as an environmental influence.
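The priority-stacking rule — the highest-priority influence on a bus wins, and lower ones resurface when it is popped — can be sketched engine-free. The class below is an illustrative model, not the gem's implementation; a string stands in for an AudioBusEffects chain.

```cpp
#include <algorithm>
#include <string>
#include <vector>

// Sketch of AudioBusInfluenceEffects priority stacking for one bus.
struct InfluenceModel
{
    int priority;
    std::string effectsName; // stands in for an AudioBusEffects chain
};

class InfluenceStack
{
public:
    void Push(const InfluenceModel& influence) { m_stack.push_back(influence); }

    // Mirrors PopInfluenceEffects: removes influences at a given priority.
    void Pop(int priority)
    {
        m_stack.erase(
            std::remove_if(m_stack.begin(), m_stack.end(),
                [priority](const InfluenceModel& i) { return i.priority == priority; }),
            m_stack.end());
    }

    // Active effects = entry with the highest priority, empty if none.
    std::string Active() const
    {
        auto it = std::max_element(m_stack.begin(), m_stack.end(),
            [](const InfluenceModel& a, const InfluenceModel& b)
            { return a.priority < b.priority; });
        return it == m_stack.end() ? std::string{} : it->effectsName;
    }

private:
    std::vector<InfluenceModel> m_stack;
};
```

Because lower-priority influences stay on the stack, walking out of a high-priority underwater volume naturally restores the surrounding room's reverb.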

See Also

For conceptual overviews and usage guides:

For component references:

  • Audio Manager – Master controller that owns the mixing bus graph

For related resources:

  • Audio Events – Events route their output through mixing buses

Get GS_Audio

GS_Audio — Explore this gem on the product page and add it to your project.

4.3.4 - Score Arrangement

Multi-layer musical score system for dynamic music – tempo, time signatures, fade control, and layer selection.

The Score Arrangement system provides multi-layer dynamic music for GS_Play projects. A ScoreArrangementTrack asset defines a musical score with configurable tempo, time signature, fade behavior, and multiple layers that can be enabled or disabled at runtime. This allows game music to adapt to gameplay state – adding or removing instrumental layers, changing intensity, or crossfading between sections.

Score tracks are loaded and controlled through the Audio Manager request bus.

For usage guides and setup examples, see The Basics: GS_Audio.

Score Arrangement asset in the O3DE Asset Editor

 

Contents


Data Model

ScoreArrangementTrack

A multi-layer musical score asset. Each track defines the musical structure and contains one or more layers that play simultaneously.

Field | Value
TypeId | {DBB48082-1834-4DFF-BAD2-6EA8D83F1AD0}
Extends | AZ::Data::AssetData
Reflection | Requires GS_AssetReflectionIncludes.h — see Serialization Helpers

Field | Type | Description
Track Name | AZStd::string | Identifier for this score track.
Time Signature | TimeSignatures | The time signature for this score (e.g. 4/4, 3/4, 6/8).
BPM | float | Tempo in beats per minute.
Fade In Time | float | Duration in seconds for the score to fade in when playback begins.
Fade Out Time | float | Duration in seconds for the score to fade out when playback stops.
Loop | bool | Whether the score loops back to the beginning when it reaches the end.
Layers | AZStd::vector<ScoreLayer> | The musical layers that compose this score.
Active Layers | AZStd::vector<int> | Indices of layers that are active (audible) at the start of playback.

ScoreLayer

A single musical layer within a score arrangement. Each layer represents one track of audio (e.g. drums, bass, melody) that can be independently enabled or disabled.

Field | Value
TypeId | {C8B2669A-FAEA-4910-9218-6FE50D2E588E}

Field | Type | Description
Layer Name | AZStd::string | Identifier for this layer within the score.
Audio Asset | AZ::Data::Asset<AudioClipAsset> | The audio clip for this layer.
Volume | float | Base volume level for this layer (0.0 to 1.0).
Fade Time | float | Duration in seconds for this layer to fade in or out when toggled.
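One way to picture the per-layer fade: when a layer is toggled, its gain ramps between silence and its base volume over Fade Time. The linear ramp below is an assumption for illustration — the gem may use a different curve.

```cpp
// Sketch of per-layer gain during a toggle fade (linear ramp assumed).
// timeSinceToggle: seconds since the layer was enabled/disabled.
float LayerGain(float timeSinceToggle, float fadeTime, bool enabling, float baseVolume)
{
    if (fadeTime <= 0.0f)
    {
        return enabling ? baseVolume : 0.0f; // instant switch, no fade
    }
    float t = timeSinceToggle / fadeTime;
    if (t > 1.0f) t = 1.0f; // fade complete
    if (t < 0.0f) t = 0.0f;
    float ramp = enabling ? t : (1.0f - t); // rise on enable, fall on disable
    return baseVolume * ramp;
}
```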

TimeSignatures (Enum)

Supported musical time signatures for score arrangement tracks.

Field | Value
TypeId | {6D6B5657-746C-4FCA-A0AC-671C0F064570}

Value | Beats per Measure | Beat Unit | Description
FourFour | 4 | Quarter note | 4/4 – Common time. The most widely used time signature.
FourTwo | 4 | Half note | 4/2 – Four half-note beats per measure.
TwelveEight | 12 | Eighth note | 12/8 – Compound quadruple meter. Four groups of three eighth notes.
TwoTwo | 2 | Half note | 2/2 – Cut time (alla breve). Two half-note beats per measure.
TwoFour | 2 | Quarter note | 2/4 – Two quarter-note beats per measure. March time.
SixEight | 6 | Eighth note | 6/8 – Compound duple meter. Two groups of three eighth notes.
ThreeFour | 3 | Quarter note | 3/4 – Waltz time. Three quarter-note beats per measure.
ThreeTwo | 3 | Half note | 3/2 – Three half-note beats per measure.
NineEight | 9 | Eighth note | 9/8 – Compound triple meter. Three groups of three eighth notes.

See Also

For conceptual overviews and usage guides:

For component references:


Get GS_Audio

GS_Audio — Explore this gem on the product page and add it to your project.

4.3.5 - Klatt Voice Synthesis

Custom text-to-speech via Klatt formant synthesis with 3D spatial audio, phoneme mapping, and voice profiling.

The Klatt Voice Synthesis system provides custom text-to-speech for GS_Play projects using Klatt formant synthesis with full 3D spatial audio. It uses SoLoud internally for speech generation and MiniAudio for spatial positioning.

The system has two layers:

  • KlattVoiceSystemComponent – A singleton that manages the shared SoLoud engine instance and tracks the 3D audio listener position.
  • KlattVoiceComponent – A per-entity component that generates speech, queues segments, applies voice profiles, and emits spatialized audio from the entity’s position.

Voice characteristics are defined through KlattVoiceProfile assets containing frequency, speed, waveform, formant, and phoneme mapping configuration. Phoneme maps convert input text to ARPABET phonemes for the Klatt synthesizer, with support for custom pronunciation overrides.

For usage guides and setup examples, see The Basics: GS_Audio.

Klatt Voice Profile asset in the O3DE Asset Editor

 

Contents


Components

KlattVoiceSystemComponent

Singleton component that manages the shared SoLoud engine and 3D listener tracking.

Field | Value
TypeId | {F4A5D6E7-8B9C-4D5E-A1F2-3B4C5D6E7F8A}
Extends | AZ::Component, AZ::TickBus::Handler
Bus | KlattVoiceSystemRequestBus (Single/Single)

KlattVoiceComponent

Per-entity voice component with spatial audio, phoneme mapping, and segment queue.

Field | Value
TypeId | {4A8B9C7D-6E5F-4D3C-2B1A-0F9E8D7C6B5A}
Extends | AZ::Component, AZ::TickBus::Handler
Request Bus | KlattVoiceRequestBus (Single/ById, entity-addressed)
Notification Bus | KlattVoiceNotificationBus (Multiple/Multiple)

API Reference

Request Bus: KlattVoiceSystemRequestBus

System-level voice management. Singleton bus – Single address, single handler.

Method | Parameters | Returns | Description
GetSoLoudEngine | — | SoLoud::Soloud* | Returns a pointer to the shared SoLoud engine instance.
SetListenerPosition | const AZ::Vector3& position | void | Updates the 3D audio listener position for spatial voice playback.
SetListenerOrientation | const AZ::Vector3& forward, const AZ::Vector3& up | void | Updates the 3D audio listener orientation.
GetListenerPosition | — | AZ::Vector3 | Returns the current listener position.
IsEngineReady | — | bool | Returns whether the SoLoud engine has been initialized and is ready.

Request Bus: KlattVoiceRequestBus

Per-entity voice synthesis controls. Entity-addressed bus – Single handler per entity ID.

Method | Parameters | Returns | Description
Speak | const AZStd::string& text | void | Converts text to speech and plays it. Uses the component’s configured voice profile.
SpeakWithParams | const AZStd::string& text, const KlattVoiceParams& params | void | Converts text to speech using the specified voice parameters instead of the profile defaults.
StopSpeaking | — | void | Immediately stops any speech in progress and clears the segment queue.
IsSpeaking | — | bool | Returns whether this entity’s voice is currently producing speech.
QueueSegment | const AZStd::string& text | void | Adds a speech segment to the queue. Queued segments play in order after the current segment finishes.
ClearQueue | — | void | Clears all queued speech segments without stopping current playback.
SetVoiceProfile | const AZ::Data::Asset<KlattVoiceProfile>& profile | void | Changes the voice profile used by this component.
GetVoiceProfile | — | AZ::Data::Asset<KlattVoiceProfile> | Returns the currently assigned voice profile asset.
SetSpatialConfig | const KlattSpatialConfig& config | void | Updates the 3D spatial audio configuration for this voice.
GetSpatialConfig | — | KlattSpatialConfig | Returns the current spatial audio configuration.
SetVolume | float volume | void | Sets the output volume for this voice (0.0 to 1.0).
GetVolume | — | float | Returns the current output volume.

Notification Bus: KlattVoiceNotificationBus

Events broadcast by voice components. Multiple handler bus – any number of components can subscribe.

Event | Parameters | Description
OnSpeechStarted | const AZ::EntityId& entityId | Fired when an entity begins speaking.
OnSpeechFinished | const AZ::EntityId& entityId | Fired when an entity finishes speaking (including all queued segments).
OnSegmentStarted | const AZ::EntityId& entityId, int segmentIndex | Fired when a new speech segment begins playing.
OnSegmentFinished | const AZ::EntityId& entityId, int segmentIndex | Fired when a speech segment finishes playing.

Data Types

KlattVoiceParams

Core voice synthesis parameters controlling the Klatt formant synthesizer output.

Field | Value
TypeId | {8A9C7F3B-4E2D-4C1A-9B5E-6D8F9A2C1B4E}

Field | Type | Description
Base Frequency | float | Fundamental frequency (F0) in Hz. Controls the base pitch of the voice.
Speed | float | Speech rate multiplier. 1.0 is normal speed.
Declination | float | Pitch declination rate. Controls how pitch drops over the course of an utterance.
Waveform | KlattWaveform | Glottal waveform type used by the synthesizer.
Formant Shift | float | Shifts all formant frequencies up or down. Positive values raise pitch character, negative values lower it.
Pitch Variance | float | Amount of random pitch variation applied during speech for natural-sounding intonation.

KlattVoiceProfile

A voice profile asset combining synthesis parameters with a phoneme mapping.

Field | Value
TypeId | {2CEB777E-DAA7-40B1-BFF4-0F772ADE86CF}
Reflection | Requires GS_AssetReflectionIncludes.h — see Serialization Helpers

Field | Type | Description
Voice Params | KlattVoiceParams | The synthesis parameters for this voice profile.
Phoneme Map | AZ::Data::Asset<KlattPhonemeMap> | The phoneme mapping asset used for text-to-phoneme conversion.

KlattVoicePreset

A preset configuration for quick voice setup.

Field | Value
TypeId | {2B8D9E4F-7C6A-4D3B-8E9F-1A2B3C4D5E6F}

Field | Type | Description
Preset Name | AZStd::string | Display name for this preset.
Profile | KlattVoiceProfile | The voice profile configuration stored in this preset.

KlattSpatialConfig

3D spatial audio configuration for voice positioning.

Field | Value
TypeId | {7C9F8E2D-3A4B-5F6C-1E0D-9A8B7C6D5E4F}

Field | Type | Description
Enable 3D | bool | Whether this voice uses 3D spatialization. When false, audio plays as 2D.
Min Distance | float | Distance at which attenuation begins. Below this distance the voice plays at full volume.
Max Distance | float | Distance at which the voice reaches minimum volume.
Attenuation Model | int | The distance attenuation curve type (linear, inverse, exponential).
Doppler Factor | float | Intensity of the Doppler effect applied to this voice. 0.0 disables Doppler.
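For the linear model, attenuation between Min Distance and Max Distance can be sketched as a ramp in gain. Note an assumption: this sketch fades all the way to silence at Max Distance, whereas the component describes reaching a "minimum volume" there.

```cpp
// Sketch of linear distance attenuation using Min/Max Distance.
// Returns a gain multiplier in [0, 1]; fades to silence at maxDistance
// (the real component may clamp to a configured minimum instead).
float LinearAttenuation(float distance, float minDistance, float maxDistance)
{
    if (distance <= minDistance) return 1.0f; // full volume inside min distance
    if (distance >= maxDistance) return 0.0f; // silent beyond max distance
    return 1.0f - (distance - minDistance) / (maxDistance - minDistance);
}
```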

KlattPhonemeMap

Phoneme mapping asset for text-to-ARPABET conversion with custom overrides.

Field | Value
TypeId | {F3E9D7C1-2A4B-5E8F-9C3D-6A1B4E7F2D5C}
Reflection | Requires GS_AssetReflectionIncludes.h — see Serialization Helpers

Field | Type | Description
Base Map | BasePhonemeMap | The base phoneme dictionary to use as the foundation for conversion.
Overrides | AZStd::vector<PhonemeOverride> | Custom pronunciation overrides for specific words or patterns.

PhonemeOverride

A custom pronunciation rule that overrides the base phoneme map for a specific word or pattern.

Field | Value
TypeId | {A2B5C8D1-4E7F-3A9C-6B2D-1F5E8A3C7D9B}

Field | Type | Description
Word | AZStd::string | The word or pattern to match.
Phonemes | AZStd::string | The ARPABET phoneme sequence to use for this word.
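The override-first lookup can be sketched engine-free: a word is resolved through the override list before falling back to the base map. The class name is illustrative and the base map here is a stub dictionary, not the CMU data.

```cpp
#include <string>
#include <unordered_map>

// Sketch of KlattPhonemeMap resolution order: overrides win over the base map.
class PhonemeLookupModel
{
public:
    void AddOverride(const std::string& word, const std::string& phonemes)
    {
        m_overrides[word] = phonemes;
    }

    void AddBaseEntry(const std::string& word, const std::string& phonemes)
    {
        m_base[word] = phonemes;
    }

    std::string Resolve(const std::string& word) const
    {
        auto it = m_overrides.find(word);
        if (it != m_overrides.end())
        {
            return it->second; // custom pronunciation wins
        }
        auto baseIt = m_base.find(word);
        return baseIt != m_base.end() ? baseIt->second : std::string{};
    }

private:
    std::unordered_map<std::string, std::string> m_overrides;
    std::unordered_map<std::string, std::string> m_base;
};
```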

Enumerations

KlattWaveform

Glottal waveform types available for the Klatt synthesizer.

Field | Value
TypeId | {8ED1DABE-3347-44A5-B43A-C171D36AE780}

Value | Description
Saw | Sawtooth waveform. Bright, buzzy character.
Triangle | Triangle waveform. Softer than sawtooth, slightly hollow.
Sin | Sine waveform. Pure tone, smooth and clean.
Square | Square waveform. Hollow, reed-like character.
Pulse | Pulse waveform. Variable duty cycle for varied timbres.
Noise | Noise waveform. Breathy, whisper-like quality.
Warble | Warble waveform. Modulated tone with vibrato-like character.

BasePhonemeMap

Available base phoneme dictionaries for text-to-ARPABET conversion.

Field | Value
TypeId | {D8F2A3C5-1B4E-7A9F-6D2C-5E8A1B3F4C7D}

Value | Description
SoLoud_Default | The default phoneme mapping built into SoLoud. Covers standard English pronunciation.
CMU_Full | The full CMU Pronouncing Dictionary. Comprehensive English phoneme coverage with over 130,000 entries.

KTT Voice Tags

KTT (Klatt Text Tags) are inline commands embedded in strings passed to KlattVoiceComponent::SpeakText. They are parsed by KlattCommandParser::Parse and stripped from the spoken text before synthesis begins — they are never heard.

Format: <ktt attr1=value1 attr2=value2>

Multiple attributes can be combined in a single tag. Attribute names are case-insensitive. String values may optionally be wrapped in quotes. An empty value (e.g. speed=) resets that parameter to the voice profile default.
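The attribute rules above — case-insensitive names, optional quotes, empty values meaning "reset" — can be sketched as a minimal parser for one tag body. This is an illustrative stand-in, not the real KlattCommandParser.

```cpp
#include <cctype>
#include <cstddef>
#include <map>
#include <string>

// Parses one KTT tag body, e.g. "Speed=2.0 waveform=\"saw\" decl=",
// into attribute/value pairs. Names are lowercased, surrounding quotes
// are stripped, and empty values are kept (they mean "reset to default").
std::map<std::string, std::string> ParseKttAttributes(const std::string& body)
{
    std::map<std::string, std::string> attrs;
    std::size_t i = 0;
    const std::size_t n = body.size();
    while (i < n)
    {
        // Skip whitespace between attributes.
        while (i < n && std::isspace(static_cast<unsigned char>(body[i]))) ++i;
        if (i >= n) break;

        // Attribute name, lowercased (names are case-insensitive).
        std::string name;
        while (i < n && body[i] != '=' &&
               !std::isspace(static_cast<unsigned char>(body[i])))
        {
            name += static_cast<char>(std::tolower(static_cast<unsigned char>(body[i])));
            ++i;
        }

        std::string value;
        if (i < n && body[i] == '=')
        {
            ++i; // skip '='
            bool quoted = (i < n && body[i] == '"');
            if (quoted) ++i;
            while (i < n && (quoted ? body[i] != '"'
                                    : !std::isspace(static_cast<unsigned char>(body[i]))))
            {
                value += body[i];
                ++i;
            }
            if (quoted && i < n) ++i; // skip closing quote
        }
        if (!name.empty())
        {
            attrs[name] = value; // empty value = reset to profile default
        }
    }
    return attrs;
}
```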


speed=X

Override the speech speed multiplier from this point forward.

Range | 0.1 – 5.0
Default reset | speed= (restores profile default)
1.0 | Normal speed

Example: Normal speech <ktt speed=2.0> fast bit <ktt speed=> back to default.

decl=X / declination=X

Pitch declination — how much pitch falls over the course of the utterance. Both decl and declination are accepted.

Range | 0.0 – 1.0
0.0 | Steady pitch (no fall)
0.8 | Strong downward drift

Example: Rising <ktt decl=0.0> steady <ktt decl=0.8> falling voice.

waveform="TYPE"

Change the glottal waveform used by the synthesizer, setting the overall character of the voice.

Value | Character
saw | Default, neutral voice
triangle | Softer, smoother
sin / sine | Pure tone, robotic
square | Harsh, mechanical
pulse | Raspy, textured
noise | Whispered, breathy
warble | Wobbly, character voice

Example: <ktt waveform="noise"> whispered section <ktt waveform="saw"> normal voice.

vowel=X

First formant (F1) frequency multiplier. Shifts the quality of synthesized vowel sounds.

1.0 | Normal
> 1.0 | More open vowel quality
< 1.0 | More closed vowel quality

Example: <ktt vowel=1.4> different vowel color here.

accent=X

Second formant (F2) frequency multiplier. Shifts accent or dialect coloration.

1.0 | Normal
< 1.0 | Shifted accent coloring

Example: <ktt accent=0.8> shifted accent here.

pitch=X

F0 pitch variance amount. Controls how much pitch varies during synthesis.

1.0 | Normal variance
> 1.0 | More expressive intonation
< 1.0 | Flatter, more monotone

Example: <ktt pitch=2.0> very expressive speech <ktt pitch=0.1> flat monotone.

pause=X

Insert a pause of X seconds at this position in the voice playback. Value is required — there is no default.

Example: Hello.<ktt pause=0.8> How are you?

Combined Example

Dialogue string using typewriter text commands and KTT voice tags together:

[b]Warning:[/b] [color=#FF0000]do not[/color] proceed.[pause=1]
<ktt waveform="square" pitch=1.8>This is a mechanical override.<ktt pause=0.5><ktt waveform="saw" pitch=1.0>
[speed=3]Resuming normal protocol.[/speed]

See Also

For conceptual overviews and usage guides:

For component references:

  • Audio Manager – Manager lifecycle that the voice system participates in

Get GS_Audio

GS_Audio — Explore this gem on the product page and add it to your project.

4.3.6 - Third Party Implementations

Integration guides for third-party audio systems with GS_Audio.

This section will contain integration guides for connecting third-party audio middleware and tools with the GS_Audio system.

For usage guides and setup examples, see The Basics: GS_Audio.


Get GS_Audio

GS_Audio — Explore this gem on the product page and add it to your project.

4.4 - GS_Cinematics

Dialogue graphs, cinematic staging, UI display, localization, and extensible condition/effect/performance systems for authored narrative experiences.

GS_Cinematics is the narrative and staging gem for GS_Play. It provides a node-based dialogue graph system backed by .dialoguedb assets, extensible polymorphic conditions, effects, and performances, world-space and screen-space dialogue UI with typewriter effects, audio babble, cinematic stage marker management, and a full localization pipeline. The gem depends on GS_Core, LyShine, and RecastNavigation.

For usage guides and setup examples, see The Basics: GS_Cinematics.

 

Contents


Cinematics Manager

The top-level cinematic lifecycle controller. The Cinematics Manager is a GS_Play manager that enters and exits cinematic mode for the current level. Stage Marker components are placed in the scene as named anchor points that performances and sequences can reference at runtime.

  • GS_CinematicsManagerComponent – Begins and ends cinematic mode. Registers and retrieves stage markers. Broadcasts EnterCinematic / ExitCinematic notifications.
  • CinematicStageMarkerComponent – Named world-space anchor placed in the level. Retrieved by the manager and referenced by sequences and performances.

Dialogue System API


Dialogue System

The core playback engine. The Dialogue Manager owns the active .dialoguedb asset and performer registry. The Dialogue Sequencer executes sequences node-by-node, issuing performances, conditions, and effects as it traverses the graph. The system is fully extensible — custom conditions, effects, and performances are discovered automatically through O3DE serialization at startup.

  • GS_DialogueManagerComponent – GS_Play manager. Loads and swaps dialogue databases. Registers performer markers by name.
  • DialogueSequencerComponent – Executes a DialogueSequence node graph. Manages runtime tokens and signals sequence completion.
  • DialogueDatabase (.dialoguedb) – Persistent asset. Contains actor definitions, sequences, and all node data.
  • Node Types – TextNodeData, SelectionNodeData, RandomNodeData, EffectsNodeData, PerformanceNodeData.
  • Conditions – Polymorphic DialogueCondition hierarchy. Built-in: Boolean_DialogueCondition, Record_DialogueCondition.
  • Effects – Polymorphic DialogueEffect hierarchy. Built-in: SetRecords_DialogueEffect, ToggleEntitiesActive_DialogueEffect.
  • LocalizedStringId – Reference to a localizable string with a key and default fallback text. Resolved at runtime.
  • LocalizedStringTable – Runtime string lookup table loaded alongside the active database.

Dialogue System API


Dialogue UI

Screen-space and world-space presentation layer. The UI Bridge routes active dialogue to whichever UI component is registered — swapping from screen-space to world-space at runtime requires no sequencer changes. The Typewriter and Babble components provide character-by-character text reveal and procedural audio babble respectively.

  • DialogueUIComponent – Screen-space dialogue text display. Shows speaker name, portrait, and text lines.
  • WorldDialogueUIComponent – World-space dialogue display (speech bubbles). Extends DialogueUIComponent.
  • DialogueUISelectionComponent – Screen-space player choice display. Builds selection buttons from SelectionOption data.
  • WorldDialogueUISelectionComponent – World-space selection display. Extends DialogueUISelectionComponent.
  • DialogueUIBridgeComponent – Routes active dialogue to whichever UI is registered. Decouples sequencer from presentation.
  • TypewriterComponent – Character-by-character text reveal with configurable speed. Fires OnTypeFired and OnTypewriterComplete notifications.
  • BabbleComponent – Procedural audio babble synchronized to typewriter output. Driven by BabbleToneEvent / SpeakerBabbleEvents data.

Dialogue UI API


Performances

Polymorphic, async performance types executed from PerformanceNodeData within a sequence. Each performance drives a performer entity in the world and signals completion back to the sequencer. External gems register additional performance types automatically by extending the base class.

  • MoveTo_PerformanceComponent – Moves a performer entity to a named stage marker over time.
  • PathTo_PerformanceComponent – Navigates a performer to a marker along a RecastNavigation navmesh path.

Dialogue System API


Installation

GS_Cinematics requires GS_Core, LyShine, and RecastNavigation to be enabled in your project first.

  1. Enable GS_Cinematics, GS_Core, LyShine, and RecastNavigation in your O3DE project’s gem list.
  2. Add a GS_CinematicsManagerComponent to your Game Manager prefab and register it in the Startup Managers list.
  3. Create a .dialoguedb asset using the Dialogue Editor and assign it to the Dialogue Manager component.
  4. Place CinematicStageMarkerComponent entities in your level and register them by name.
  5. Add a DialogueSequencerComponent to the entity that will drive dialogue playback.
  6. Add a DialogueUIBridgeComponent and connect it to your chosen DialogueUIComponent variant.

See Also

For conceptual overviews and usage guides:

For sub-system references:

For related resources:


Get GS_Cinematics

GS_Cinematics — Explore this gem on the product page and add it to your project.

4.4.1 - Cinematics Manager

Full API reference for GS_CinematicsManagerComponent — cinematic lifecycle control, stage marker registration, and the CinematicStageMarkerComponent.

GS_CinematicsManagerComponent is the top-level cinematic lifecycle controller for GS_Cinematics. It extends GS_Core::GS_ManagerComponent and participates in the standard Game Manager startup sequence.

For usage guides and setup examples, see The Basics: Cinematics Manager.

Cinematics Manager component in the O3DE Inspector

 

Contents


GS_CinematicsManagerComponent

Singleton GS_Play manager that controls cinematic mode for the current level. Registered with the Game Manager startup sequence via GS_Core::GS_ManagerComponent.

  • Extends: GS_Core::GS_ManagerComponent
  • Bus: CinematicsManagerRequestBus (single address, single handler)
  • Notifications: CinematicsManagerNotificationBus

Request Bus: CinematicsManagerRequestBus

Singleton bus — Single address, single handler.

  • BeginCinematic() → void – Enters cinematic mode for the current level. Broadcasts EnterCinematic to all listeners.
  • EndCinematic() → void – Exits cinematic mode. Broadcasts ExitCinematic to all listeners.
  • RegisterStageMarker(const AZStd::string& name, AZ::EntityId entity) → void – Registers a CinematicStageMarkerComponent entity under the given name.
  • GetStageMarker(const AZStd::string& name) → AZ::EntityId – Returns the entity registered under the given stage marker name.

Notification Bus: CinematicsManagerNotificationBus

Multiple handler bus — any number of components can subscribe.

  • EnterCinematic – Fired when cinematic mode begins. Listeners should suppress gameplay systems.
  • ExitCinematic – Fired when cinematic mode ends. Listeners should resume gameplay systems.

CinematicStageMarkerComponent

A simple world-space anchor component placed on entities in the level. Stage markers are registered by name with the Cinematics Manager during activation. Performances and sequences reference markers by name to position performers in the world.

Cinematic Stage Marker component in the O3DE Inspector

Attach a CinematicStageMarkerComponent to any entity, give it a unique name, and it self-registers with the CinematicsManagerRequestBus on Activate().


Script Canvas Examples

Entering and exiting cinematic mode:

Getting a stage marker entity by name:


See Also

For usage guides:

For related references:


Get GS_Cinematics

GS_Cinematics — Explore this gem on the product page and add it to your project.

4.4.2 - Dialogue System

Architecture overview of the GS_Cinematics dialogue system — sequencing, data structures, UI display, actors, and extensible conditions, effects, and performances.

The Dialogue System is the runtime engine inside GS_Cinematics that drives all authored narrative experiences. It connects a persistent .dialoguedb asset to a node-graph sequencer, world-space and screen-space UI display, and an extensible set of polymorphic conditions, effects, and performances. Every piece is a standalone component communicating through EBus interfaces, so you can replace or extend any layer without modifying the gem itself.

The system is managed by the GS_DialogueManagerComponent, which participates in the standard Game Manager startup sequence.

For usage guides and setup examples, see The Basics: GS_Cinematics.

 

Contents


Architecture

Dialogue Authoring Pattern Graph

Breakdown

A dialogue sequence is authored in the node editor, stored in a .dialoguedb asset, and driven at runtime by the Dialogue Manager and Sequencer:

  • DialogueDatabase – Stores named actors and sequences. Loaded at runtime by the Dialogue Manager.
  • DialogueSequence – A directed node graph. The Sequencer walks from startNodeId through Text, Selection, Effects, and Performance nodes.
  • Conditions – Polymorphic evaluators on branches. Failed conditions skip that branch automatically.
  • Effects – Polymorphic actions at EffectsNodeData nodes: set records, toggle entities.
  • Performers – Named actor anchors in the level. Resolved from database actor names via DialoguePerformerMarkerComponent.

Conditions, effects, and performances are discovered automatically at startup — custom types from any gem appear in the editor without modifying GS_Cinematics.

E Indicates extensible classes and methods.

Patterns - Complete list of system patterns used in GS_Play.


Dialogue Manager

GS_DialogueManagerComponent owns the active dialogue database and performer registry. Extends GS_Core::GS_ManagerComponent.

Dialogue Manager API


Dialogue Sequencer

DialogueSequencerComponent traverses a node graph at runtime. The DialogueUIBridgeComponent routes dialogue and selection events to whichever UI is registered.

Dialogue Sequencer API


Data Structure

The .dialoguedb asset format, DialogueSequence, ActorDefinition, all node types, the polymorphic condition hierarchy, and localization types.

Data Structure API


Effects

Polymorphic DialogueEffect types executed synchronously from EffectsNodeData nodes. Built-in: SetRecords_DialogueEffect, ToggleEntitiesActive_DialogueEffect. Fully extensible.

Effects API


Performances

Polymorphic DialoguePerformance types executed asynchronously from PerformanceNodeData nodes. Built-in: MoveTo, PathTo, RepositionPerformer. Fully extensible.

Performances API


Dialogue UI

Screen-space and world-space dialogue display, selection menus, the typewriter text-reveal system, and the babble audio system.

Dialogue UI API


Actors

DialoguePerformerMarkerComponent places named performer anchors in the world, plus the ActorDefinition data model.

Dialogue Actors API


Editor

The in-engine GUI for authoring dialogue databases, sequences, and node graphs.

Dialogue Editor


See Also

For conceptual overviews and usage guides:

For related references:


Get GS_Cinematics

GS_Cinematics — Explore this gem on the product page and add it to your project.

4.4.2.1 - Dialogue Manager

Full API reference for GS_DialogueManagerComponent — dialogue database management, performer marker registration, and the DialoguePerformerMarkerComponent.

GS_DialogueManagerComponent owns the active dialogue database and performer registry. It extends GS_Core::GS_ManagerComponent and participates in the standard Game Manager startup sequence.

For usage guides and setup examples, see The Basics: GS_Cinematics.

Dialogue Manager component in the O3DE Inspector

 

Contents


GS_DialogueManagerComponent

Singleton GS_Play manager that owns the active .dialoguedb asset and the performer registry. Registered with the Game Manager startup sequence via GS_Core::GS_ManagerComponent.

  • Extends: GS_Core::GS_ManagerComponent
  • Bus: DialogueManagerRequestBus (single address, single handler)
  • Notifications: DialogueManagerNotificationBus (inherited from ManagerBaseOutgoingEvents)

Request Bus: DialogueManagerRequestBus

Singleton bus — Single address, single handler.

  • StartDialogueSequenceByName(const AZStd::string& sequenceName) → void – Looks up a sequence by name in the active database and starts playback through the sequencer.
  • ChangeDialogueDatabase(AZ::Data::Asset<DialogueDatabase> database) → void – Swaps the active .dialoguedb asset at runtime.
  • RegisterPerformerMarker(const AZStd::string& name, AZ::EntityId entity) → void – Registers a DialoguePerformerMarkerComponent entity under the given performer name.
  • GetPerformer(const AZStd::string& name) → AZ::EntityId – Returns the entity registered under the given performer name.

Notification Bus: DialogueManagerNotificationBus

Dialogue Manager notifications are inherited from ManagerBaseOutgoingEvents. See Manager for the full notification interface.


DialoguePerformerMarkerComponent

A world-space anchor component placed on NPC entities in the level. Performer markers register by name with the Dialogue Manager during activation. The dialogue system resolves actor names from the database to these world-space entities.

Request Bus: PerformerMarkerRequestBus

  • GetPerformerName() → AZStd::string – Returns the registered name for this performer.
  • GetPosEntity() → AZ::EntityId – Returns the entity used as the position reference for this performer.
  • GetPosPoint() → AZ::Vector3 – Returns the world-space position of the performer anchor.
  • ShouldTrackTarget() → bool – Returns whether this performer should track a target entity.

Script Canvas Examples

Starting a dialogue sequence by name:


See Also

For usage guides:

For component references:

For related resources:


Get GS_Cinematics

GS_Cinematics — Explore this gem on the product page and add it to your project.

4.4.2.2 - Dialogue Data Structure

Full reference for the DialogueDatabase asset, dialogue sequences, node types, conditions, effects, localization types, and the extension guide for custom polymorphic types.

All dialogue content lives in a .dialoguedb asset file. This page documents the complete data model: the database container, sequences, actor definitions, every node type, the polymorphic condition and effect hierarchies, and the localization pipeline. It also provides an extension guide for registering custom conditions and effects from external gems.

For usage guides and setup examples, see The Basics: GS_Cinematics.

Dialogue Database asset in the O3DE Editor Dialogue Selection Node in the Dialogue Editor Performance Node in the Dialogue Editor Dialogue Effects Node with Condition in the Dialogue Editor

 

Contents


DialogueDatabase

The top-level asset that contains all data for a set of dialogue encounters.

  • Asset Extension: .dialoguedb – Custom O3DE asset type.
  • TypeId: {4C5D6E7F-8A9B-0C1D-2E3F-4A5B6C7D8E9F} – Registered asset type identifier.
  • Actors: AZStd::unordered_map<AZStd::string, ActorDefinition> – Named actor definitions. Keyed by actor name.
  • Sequences: AZStd::unordered_map<AZStd::string, DialogueSequence> – Named dialogue sequences. Keyed by sequence name.
  • Metadata: Various – Database-level settings (default delays, typewriter speed, UI profiles).

The Dialogue Manager loads one database at a time. You can swap databases at runtime via DialogueManagerRequestBus::ChangeDialogueDatabase.


DialogueSequence

A single dialogue encounter within the database. Contains a linear or branching graph of nodes.

  • TypeId: {2B3C4D5E-6F7A-8B9C-0D1E-2F3A4B5C6D7E}
  • metadata: Various – Sequence name, description, and editor settings.
  • nodes: AZStd::vector<DialogueNodeData*> – Polymorphic vector of node data. Ownership is held by the sequence.
  • startNodeId: AZStd::string – The ID of the first node the sequencer processes when this sequence begins.

ActorDefinition

Defines a named character that appears in dialogue.

  • TypeId: {1A2B3C4D-5E6F-7A8B-9C0D-1E2F3A4B5C6D}
  • Actor Name: AZStd::string – Internal name used for lookups and performer matching.
  • Portrait: Asset reference – Default portrait image for this actor.
  • Metadata: Various – Display name, emotion categories, pose sets, profile image sets.

The actor name in the database is matched against the performer name registered by a DialoguePerformerMarkerComponent in the level. This links authored data to a world-space entity.


Node Types

All nodes inherit from the abstract DialogueNodeData base class. The sequencer uses the concrete type to determine processing behavior.

DialogueNodeData (Abstract Base)

  • TypeId: {D1E2F3A4-B5C6-7D8E-9F0A-1B2C3D4E5F6A}
  • nodeId: AZStd::string – Unique identifier within the sequence.
  • connections: AZStd::vector<NodeConnection> – Outgoing connections, each with a target node ID, optional conditions, and optional weight.

Concrete Node Types

  • TextNodeData {A2B3C4D5-E6F7-8901-2345-6789ABCDEF01} – Displays dialogue text. Contains speaker name, localized text, portrait override, close-text flag, and continue settings (display-and-delay, wait-for-input, after-seconds).
  • SelectionNodeData {C4D5E6F7-A890-1234-5678-9ABCDEF01234} – Presents player choices. Contains a vector of SelectionOption objects. Each option has its own text, conditions, and outgoing connections.
  • RandomNodeData {D5E6F7A8-B901-2345-6789-ABCDEF012345} – Picks a random outgoing connection. Supports pure random and weighted modes. Does not pause for input.
  • EffectsNodeData {E6F7A8B9-C012-3456-789A-BCDEF0123456} – Executes a list of DialogueEffect instances. Processing continues immediately after all effects fire.
  • PerformanceNodeData {F7A8B9C0-D123-4567-89AB-CDEF01234567} – Starts a DialoguePerformance on a named performer. The sequencer waits for the performance to complete before advancing.
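RandomNodeData's weighted mode can be sketched in isolation. The function below is an illustrative model only (its name, signature, and clamping behaviour are assumptions, not the gem's internals): a roll in [0, total weight) selects the connection whose cumulative weight range contains it.

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// Illustrative model of a weighted connection pick (not GS_Cinematics internals):
// given per-connection weights and a roll in [0, total), return the index of the
// connection whose cumulative weight range contains the roll.
std::size_t PickWeightedConnection(const std::vector<double>& weights, double roll)
{
    double cumulative = 0.0;
    for (std::size_t i = 0; i < weights.size(); ++i)
    {
        cumulative += weights[i];
        if (roll < cumulative)
        {
            return i;
        }
    }
    // Clamp to the last connection on floating-point rounding at the top edge.
    return weights.empty() ? 0 : weights.size() - 1;
}
```

In a real sequencer the roll would come from a random source scaled to the total weight; it is passed in here so the behaviour is deterministic and easy to check.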

SelectionOption

A single choice within a SelectionNodeData.

  • TypeId: {B3C4D5E6-F7A8-9012-3456-789ABCDEF012}
  • text: LocalizedStringId – The display text for this option.
  • conditions: AZStd::vector<DialogueCondition*> – Conditions that must pass for this option to appear.
  • connections: AZStd::vector<NodeConnection> – Outgoing connections followed when this option is selected.

Conditions

Conditions are polymorphic evaluators attached to node connections and selection options. They are registered automatically by extending the base class.

DialogueCondition (Abstract Base)

  • TypeId: {E44DAD9C-F42B-458B-A44E-1F15971B8F4C}
  • Method: EvaluateCondition() → bool – Called by the sequencer to determine whether a connection or option is valid.

Built-in Conditions

  • Boolean_DialogueCondition {34040E77-A58F-48CD-AC36-B3FE0017AA1F} – Base boolean comparison. Evaluates a simple true/false state.
  • Record_DialogueCondition {E9C5C06D-B2C7-42EF-98D0-2290DE23635D} – Checks game records via the GS_Save RecordKeeper system. Supports greater-than, less-than, equal, and not-equal operators against integer record values.
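The operator set supported by Record_DialogueCondition can be modelled as a small comparison step. The operator spellings and the fail-closed default below are assumptions for illustration; the real condition likely uses an enum, not strings:

```cpp
#include <cassert>
#include <string>

// Illustrative model of Record_DialogueCondition's comparison against an
// integer record value (operator names are stand-ins for the real enum).
bool EvaluateRecordComparison(const std::string& op, int recordValue, int target)
{
    if (op == "greater-than") { return recordValue > target; }
    if (op == "less-than")    { return recordValue < target; }
    if (op == "equal")        { return recordValue == target; }
    if (op == "not-equal")    { return recordValue != target; }
    return false; // unknown operator: fail closed so the branch is skipped
}
```

Failing closed on an unknown operator mirrors the documented branch behaviour, where a failed condition simply skips that branch.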

Effects

Effects are polymorphic actions triggered by EffectsNodeData nodes. Each effect can also be reversed. They are registered automatically by extending the base class.

DialogueEffect (Abstract Base)

  • TypeId: {71E2E05E-A7A3-4EB3-8954-2F75B26D5E3A}
  • Methods: DoEffect(), ReverseEffect() – DoEffect() executes the effect; ReverseEffect() undoes it (used for rollback or undo scenarios).

Built-in Effects

  • SetRecords_DialogueEffect {FFCACF63-0625-4EE5-A74B-86C809B7BF80} – Sets one or more game records in the GS_Save RecordKeeper system. Commonly used to flag quest progress, NPC disposition, or world state changes.
  • ToggleEntitiesActive_DialogueEffect {F97E1B87-EE7A-4D26-814C-D2F9FA810B28} – Activates or deactivates entities in the level. Used to show/hide objects, enable/disable triggers, or change the world during dialogue.

Localization

All player-visible text in the dialogue system passes through the localization layer.

LocalizedStringId

A reference to a localizable string.

  • TypeId: {7D8E9F0A-1B2C-3D4E-5F6A-7B8C9D0E1F2A}
  • key: AZStd::string – Lookup key into the active LocalizedStringTable.
  • defaultText: AZStd::string – Fallback text returned when no localized entry exists for the key.
  • Resolve() → AZStd::string – Resolves the string: looks up key in the active table, falls back to defaultText.

LocalizedStringTable

Runtime lookup table loaded alongside the active dialogue database.

  • TypeId: {3D4E5F6A-7B8C-9D0E-1F2A-3B4C5D6E7F8A}
  • Entries: AZStd::unordered_map<AZStd::string, AZStd::string> – Key-value pairs mapping string keys to localized text.
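The Resolve() fallback behaviour described above can be modelled with standard containers (std types stand in for the documented AZStd ones; this is a behavioural sketch, not the gem's actual implementation):

```cpp
#include <cassert>
#include <string>
#include <unordered_map>

// Behavioural model of LocalizedStringId::Resolve(): look the key up in the
// active table, fall back to the default text when no entry exists.
std::string ResolveLocalizedString(
    const std::unordered_map<std::string, std::string>& table,
    const std::string& key,
    const std::string& defaultText)
{
    auto it = table.find(key);
    return it != table.end() ? it->second : defaultText;
}
```

This is why authored defaultText always produces readable output even before a localization pass exists for a language.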

Complete Type Reference

  • DialogueDatabase {4C5D6E7F-8A9B-0C1D-2E3F-4A5B6C7D8E9F} – Asset
  • DialogueSequence {2B3C4D5E-6F7A-8B9C-0D1E-2F3A4B5C6D7E} – Data
  • ActorDefinition {1A2B3C4D-5E6F-7A8B-9C0D-1E2F3A4B5C6D} – Data
  • DialogueNodeData {D1E2F3A4-B5C6-7D8E-9F0A-1B2C3D4E5F6A} – Node (Abstract)
  • TextNodeData {A2B3C4D5-E6F7-8901-2345-6789ABCDEF01} – Node
  • SelectionNodeData {C4D5E6F7-A890-1234-5678-9ABCDEF01234} – Node
  • RandomNodeData {D5E6F7A8-B901-2345-6789-ABCDEF012345} – Node
  • EffectsNodeData {E6F7A8B9-C012-3456-789A-BCDEF0123456} – Node
  • PerformanceNodeData {F7A8B9C0-D123-4567-89AB-CDEF01234567} – Node
  • SelectionOption {B3C4D5E6-F7A8-9012-3456-789ABCDEF012} – Data
  • DialogueCondition {E44DAD9C-F42B-458B-A44E-1F15971B8F4C} – Condition (Abstract)
  • Boolean_DialogueCondition {34040E77-A58F-48CD-AC36-B3FE0017AA1F} – Condition
  • Record_DialogueCondition {E9C5C06D-B2C7-42EF-98D0-2290DE23635D} – Condition
  • DialogueEffect {71E2E05E-A7A3-4EB3-8954-2F75B26D5E3A} – Effect (Abstract)
  • SetRecords_DialogueEffect {FFCACF63-0625-4EE5-A74B-86C809B7BF80} – Effect
  • ToggleEntitiesActive_DialogueEffect {F97E1B87-EE7A-4D26-814C-D2F9FA810B28} – Effect
  • DialoguePerformance {BCCF5C52-42C3-49D4-8202-958120EA8743} – Performance (Abstract)
  • MoveTo_DialoguePerformance {6033A69D-F46F-40DF-A4A0-A64C4E28D6D5} – Performance
  • PathTo_DialoguePerformance {C0DF4B0E-924D-4C38-BD26-5A286161D95C} – Performance
  • RepositionPerformer_DialoguePerformance {DE1E0930-F0A4-4F60-A741-4FB530610AEE} – Performance
  • LocalizedStringId {7D8E9F0A-1B2C-3D4E-5F6A-7B8C9D0E1F2A} – Localization
  • LocalizedStringTable {3D4E5F6A-7B8C-9D0E-1F2A-3B4C5D6E7F8A} – Localization

Extension Guide

Use the ClassWizard templates to generate new dialogue extension classes — see GS_Cinematics Templates:

  • DialogueCondition — generates a condition stub (Basic or Boolean variant). Requires a Reflect(context) call in DialogueSequencerComponent.cpp.
  • DialogueEffect — generates an effect stub with DoEffect() / ReverseEffect(). Same manual registration step.
  • DialoguePerformance — generates a performance stub with the full ExecutePerformance() / FinishPerformance() lifecycle. Same manual registration step.

All three types are polymorphic and discovered automatically.

External gems can add custom conditions, effects, and performances without modifying GS_Cinematics. The process is the same for all three categories.

Adding a Custom Condition

1. Define the class

#pragma once
#include <GS_Cinematics/Dialogue/Conditions/DialogueCondition.h>

namespace MyGem
{
    class HasItem_DialogueCondition : public GS_Cinematics::DialogueCondition
    {
    public:
        AZ_RTTI(HasItem_DialogueCondition, "{YOUR-UUID-HERE}", GS_Cinematics::DialogueCondition);
        AZ_CLASS_ALLOCATOR(HasItem_DialogueCondition, AZ::SystemAllocator);

        static void Reflect(AZ::ReflectContext* context);

        bool EvaluateCondition() override;

    private:
        AZStd::string m_itemName;
    };
}

2. Implement and reflect

#include "HasItem_DialogueCondition.h"
#include <AzCore/Serialization/SerializeContext.h>

namespace MyGem
{
    void HasItem_DialogueCondition::Reflect(AZ::ReflectContext* context)
    {
        if (auto sc = azrtti_cast<AZ::SerializeContext*>(context))
        {
            sc->Class<HasItem_DialogueCondition, GS_Cinematics::DialogueCondition>()
                ->Version(1)
                ->Field("ItemName", &HasItem_DialogueCondition::m_itemName);

            if (auto ec = sc->GetEditContext())
            {
                ec->Class<HasItem_DialogueCondition>("Has Item", "Checks if the player has a specific item.")
                    ->ClassElement(AZ::Edit::ClassElements::EditorData, "")
                        ->Attribute(AZ::Edit::Attributes::Category, "MyGem/Conditions")
                    ->DataElement(AZ::Edit::UIHandlers::Default, &HasItem_DialogueCondition::m_itemName, "Item Name", "");
            }
        }
    }

    bool HasItem_DialogueCondition::EvaluateCondition()
    {
        // Query your inventory system
        bool hasItem = false;
        // MyGem::InventoryRequestBus::BroadcastResult(hasItem, ...);
        return hasItem;
    }
}

The same pattern applies to custom effects (inherit DialogueEffect, implement DoEffect() / ReverseEffect()) and custom performances (inherit DialoguePerformance, implement DoPerformance() / ExecutePerformance() / FinishPerformance()).
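The DoEffect()/ReverseEffect() contract can be illustrated with a ToggleEntitiesActive-style example over plain data. The types below are stand-ins for illustration only; the real effect drives entity activation through O3DE rather than a map:

```cpp
#include <cassert>
#include <string>
#include <unordered_map>

// Illustrative model of the DoEffect()/ReverseEffect() contract: the reverse
// method must exactly undo the forward method so the sequencer can roll back.
struct ToggleEntitiesEffectModel
{
    std::string entityName; // stand-in for the real effect's entity reference

    void DoEffect(std::unordered_map<std::string, bool>& states) const
    {
        states[entityName] = !states[entityName]; // flip the active state
    }

    void ReverseEffect(std::unordered_map<std::string, bool>& states) const
    {
        states[entityName] = !states[entityName]; // flipping again undoes it
    }
};
```

For a toggle, reverse happens to equal the forward operation; effects that set absolute values would instead need to capture the prior state in DoEffect() so ReverseEffect() can restore it.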


See Also

For conceptual overviews and usage guides:

For component references:

For related resources:


Get GS_Cinematics

GS_Cinematics — Explore this gem on the product page and add it to your project.

4.4.2.3 - Editor

Information on the in-engine GUI.

For usage guides and setup examples, see The Basics: GS_Cinematics.

Performer

Performer Targeting

Sequences

Localization

Nodes

Start / End

End

Restart

New Sequence

Specific Node

Dialogue

Text

Animated/Stylized Text

Performance

Posing

Pathing

Portrait

Trigger Timeline

Sound

2D

3D

Option Menu + Choices

Actions + Set Record

Conditions

How to use conditions to determine path.

Not always related to player choice.

Timeline Integration

Start Sequence

Continue Dialogue

UI

O3DE Editor Menu

Editor GUI

Adding an icon to the editor is a complex problem, outlined by Nick.

Sequence Window

The Inspector Window can have any number of sub-tabs, which might be valuable for adding unique settings or multiple Conditions, for example. Essentially, this would componentize the different node types.

System Window

Actors Window

Ingame UI


Get GS_Cinematics

GS_Cinematics — Explore this gem on the product page and add it to your project.

4.4.2.4 - Dialogue Sequencer

Full API reference for the DialogueSequencerComponent and DialogueUIBridgeComponent — runtime dialogue graph traversal, node processing, and UI routing.

The Dialogue Sequencer is the runtime engine that executes dialogue sequences node-by-node. It receives a DialogueSequence from the Dialogue Manager, traverses the node graph, evaluates conditions, fires effects and performances, and routes text and selection events to the UI layer through the DialogueUIBridgeComponent.

The two components on this page work together:

  • DialogueSequencerComponent – Traverses the node graph. Processes each DialogueNodeData in order, manages runtime tokens, and signals sequence completion.
  • DialogueUIBridgeComponent – Decouples the sequencer from presentation. Routes dialogue text and selection events to whichever DialogueUIComponent or DialogueUISelectionComponent is currently registered.

For usage guides and setup examples, see The Basics: GS_Cinematics.

 

Contents


DialogueSequencerComponent

The sequencer is the core playback controller. When a sequence starts, the sequencer reads the startNodeId, resolves the first node, and begins processing. Each node type has its own processing logic:

  • TextNodeData – Sends text, speaker, and portrait data to the UI bridge for display.
  • SelectionNodeData – Sends options to the UI bridge. Waits for a player selection before continuing.
  • RandomNodeData – Picks a random outgoing connection (pure random or weighted) and continues.
  • EffectsNodeData – Executes all attached DialogueEffect instances, then continues.
  • PerformanceNodeData – Starts a DialoguePerformance. Waits for the performance to signal completion before continuing.

At each node, outgoing connections are evaluated. Conditions on connections are tested in priority order; the first connection whose conditions all pass is followed. When no outgoing connections remain, the sequence ends.
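The "first connection whose conditions all pass" rule can be modelled with simplified stand-ins. The struct and function below are illustrative only, not the real NodeConnection or DialogueCondition types:

```cpp
#include <cassert>
#include <functional>
#include <string>
#include <vector>

// Simplified stand-in for a NodeConnection: a target node ID plus a list of
// predicates, each modelling a DialogueCondition::EvaluateCondition() call.
struct Connection
{
    std::string targetNodeId;
    std::vector<std::function<bool()>> conditions;
};

// Returns the target of the first connection whose conditions all pass,
// or an empty string when no connection is valid (the sequence ends).
std::string FollowFirstValidConnection(const std::vector<Connection>& connections)
{
    for (const auto& connection : connections)
    {
        bool allPass = true;
        for (const auto& condition : connection.conditions)
        {
            if (!condition())
            {
                allPass = false;
                break;
            }
        }
        if (allPass)
        {
            return connection.targetNodeId;
        }
    }
    return {};
}
```

A connection with no conditions always passes, which is why an unconditional fallback connection is typically authored last in priority order.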


Request Bus: DialogueSequencerRequestBus

Singleton bus – Single address, single handler.

  • StartDialogueBySequence(const DialogueSequence& sequence) → void – Begins playback of the given sequence from its startNodeId.
  • OnPerformanceComplete() → void – Called by a running DialoguePerformance when it finishes. The sequencer advances to the next node.

Notification Bus: DialogueSequencerNotificationBus

Multiple handler bus – any number of components can subscribe.

  • OnDialogueTextBegin(const DialogueTextPayload& payload) – Fired when the sequencer begins processing a text node. Contains speaker name, text, portrait, and display settings.
  • OnDialogueSequenceComplete – Fired when the sequencer reaches the end of a sequence (no further outgoing connections).

Runtime Token Management

The sequencer maintains a set of runtime tokens that track state within a single sequence execution. Tokens are used internally to prevent infinite loops, track visited nodes, and manage branching state. They are cleared when a sequence completes or is interrupted.
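One plausible use of those tokens, a visited-node guard, can be sketched with standard containers. This models the documented behaviour (loop prevention, cleared on completion or interruption); the actual token structure is internal and may differ:

```cpp
#include <cassert>
#include <string>
#include <unordered_set>

// Illustrative model of a visited-node guard built on runtime tokens.
class VisitedNodeGuard
{
public:
    // Returns true the first time a node ID is seen in this execution,
    // false on revisits, letting the caller break a potential cycle.
    bool MarkVisited(const std::string& nodeId)
    {
        return m_visited.insert(nodeId).second;
    }

    // Mirrors the documented behaviour: tokens are cleared when a
    // sequence completes or is interrupted.
    void Clear()
    {
        m_visited.clear();
    }

private:
    std::unordered_set<std::string> m_visited;
};
```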


DialogueUIBridgeComponent

The UI Bridge sits between the sequencer and the presentation layer. It holds a reference to the currently registered DialogueUIComponent and DialogueUISelectionComponent. When the sequencer needs to display text or a selection menu, it calls the bridge, which forwards the request to the active UI. This allows you to swap between screen-space, world-space, or custom UI implementations at runtime without changing the sequencer or the dialogue data.


Request Bus: DialogueUIBridgeRequestBus

Singleton bus.

  • RunDialogue(const DialogueTextPayload& payload) → void – Forwards a dialogue text event to the registered DialogueUIComponent.
  • RunSelection(const DialogueSelectionPayload& payload) → void – Forwards a selection event to the registered DialogueUISelectionComponent.
  • RegisterDialogueUI(AZ::EntityId uiEntity) → void – Registers the entity whose DialogueUIComponent and/or DialogueUISelectionComponent will receive events.
  • CloseDialogue() → void – Tells the registered UI to close and clean up.

Notification Bus: DialogueUIBridgeNotificationBus

Multiple handler bus.

  • OnDialogueComplete – Fired when the registered UI finishes displaying a dialogue line (after the typewriter completes or the player advances). The sequencer listens for this to advance to the next node.
  • OnSelectionComplete(int selectedIndex) – Fired when the player selects an option. The sequencer uses the index to follow the corresponding connection.

Execution Flow

A typical dialogue playback follows this path:

  1. Start – DialogueManagerRequestBus::StartDialogueSequenceByName("MySequence") looks up the sequence in the active database and calls DialogueSequencerRequestBus::StartDialogueBySequence(sequence).
  2. Node Processing – The sequencer reads the start node and begins processing. For each node:
    • Text – The sequencer calls DialogueUIBridgeRequestBus::RunDialogue(payload). The bridge forwards to the registered UI. The UI displays text (via TypewriterComponent), then fires OnDialogueComplete. The sequencer advances.
    • Selection – The sequencer calls DialogueUIBridgeRequestBus::RunSelection(payload). The bridge forwards to the selection UI. The player picks an option, the UI fires OnSelectionComplete(index). The sequencer follows the indexed connection.
    • Effects – The sequencer executes each DialogueEffect on the node, then advances immediately.
    • Performance – The sequencer starts the DialoguePerformance. When the performance calls OnPerformanceComplete, the sequencer advances.
    • Random – The sequencer picks a connection and advances immediately.
  3. End – When no outgoing connections remain, the sequencer fires OnDialogueSequenceComplete and calls DialogueUIBridgeRequestBus::CloseDialogue().

C++ Usage

Starting a Sequence

#include <GS_Cinematics/GS_CinematicsBus.h>

// Start by name through the Dialogue Manager
GS_Cinematics::DialogueManagerRequestBus::Broadcast(
    &GS_Cinematics::DialogueManagerRequestBus::Events::StartDialogueSequenceByName,
    AZStd::string("PerryConfrontation")
);

Listening for Sequence Completion

#include <GS_Cinematics/GS_CinematicsBus.h>

class MyListener
    : public AZ::Component
    , protected GS_Cinematics::DialogueSequencerNotificationBus::Handler
{
protected:
    void Activate() override
    {
        GS_Cinematics::DialogueSequencerNotificationBus::Handler::BusConnect();
    }

    void Deactivate() override
    {
        GS_Cinematics::DialogueSequencerNotificationBus::Handler::BusDisconnect();
    }

    void OnDialogueSequenceComplete() override
    {
        // Sequence finished — resume gameplay, unlock camera, etc.
    }
};

See Also

For conceptual overviews and usage guides:

For component references:

For related resources:


Get GS_Cinematics

GS_Cinematics — Explore this gem on the product page and add it to your project.

4.4.2.5 - Dialogue UI

Full API reference for all dialogue UI components — screen-space and world-space dialogue display, selection menus, typewriter text reveal, and babble audio.

The Dialogue UI layer handles all player-facing presentation of dialogue content. It is completely decoupled from the sequencer through the DialogueUIBridgeComponent – the sequencer never talks to a UI component directly. This page documents the seven components that make up the built-in UI system.

Component | Purpose
DialogueUIComponent | Screen-space dialogue text display.
WorldDialogueUIComponent | World-space speech bubble display.
DialogueUISelectionComponent | Screen-space player choice display.
WorldDialogueUISelectionComponent | World-space player choice display.
DialogueSelectButtonComponent | Individual selection button within a choice menu.
TypewriterComponent | Character-by-character text reveal.
BabbleComponent | Procedural audio babble synchronized to text reveal.

For usage guides and setup examples, see The Basics: GS_Cinematics.

 

Contents


DialogueUIComponent

The standard screen-space dialogue display. Receives dialogue payloads from the UI Bridge and renders speaker name, portrait, and text lines using LyShine UI canvases.

Request Bus: DialogueUIRequestBus

Method | Parameters | Returns | Description
DoDialogue | const DialogueTextPayload& payload | void | Processes a full dialogue payload: sets speaker name, portrait, and begins text display through the typewriter.
ShowDialogue | (none) | void | Makes the dialogue UI canvas visible.
HideDialogue | (none) | void | Hides the dialogue UI canvas without destroying it.
CloseUI | (none) | void | Fully closes and cleans up the dialogue UI.
GetTypewriter | (none) | AZ::EntityId | Returns the entity that holds the TypewriterComponent used by this UI.

DialogueUIComponent canvas in the O3DE UI Editor


WorldDialogueUIComponent

Extends DialogueUIComponent for world-space rendering. Displays speech bubbles attached to performer entities in 3D space. Shares the same DialogueUIRequestBus interface as the base class. The world-space variant tracks the performer entity’s position and orients the UI element toward the camera.

Use WorldDialogueUIComponent when you want dialogue to appear as in-world speech bubbles rather than as a screen overlay.

WorldDialogueUIComponent canvas in the O3DE UI Editor


DialogueUISelectionComponent

Screen-space player choice display. Builds a list of selectable options from a DialogueSelectionPayload, spawns DialogueSelectButtonComponent instances for each valid option, and waits for the player to pick one.

Request Bus: DialogueUISelectionRequestBus

Method | Parameters | Returns | Description
DoSelection | const DialogueSelectionPayload& payload | void | Builds and displays the selection menu from the provided options.
OnSelection | int selectedIndex | void | Called when a button is pressed. Processes the selection and notifies the bridge.
ShowSelection | (none) | void | Makes the selection UI canvas visible.
ClearSelection | (none) | void | Removes all spawned option buttons and resets the selection state.
CloseUI | (none) | void | Fully closes and cleans up the selection UI.

DialogueUISelectionComponent canvas in the O3DE UI Editor


WorldDialogueUISelectionComponent

Extends DialogueUISelectionComponent for world-space rendering. Displays selection options as in-world UI elements attached to performer entities. Shares the same DialogueUISelectionRequestBus interface as the base class.

WorldDialogueUISelectionComponent canvas in the O3DE UI Editor


DialogueSelectButtonComponent

An individual button within a selection menu. One instance is spawned per valid option. The button displays option text and forwards press events to the parent selection component.

Event Bus: DialogueUISelectionEventBus

Method | Parameters | Returns | Description
SetupOption | const SelectionOption& option, int index | void | Configures the button with display text and its index in the selection list.
GetInteractableEntity | (none) | AZ::EntityId | Returns the LyShine interactable entity used for input detection.

DialogueSelectButtonComponent in the O3DE Inspector


TypewriterComponent

Character-by-character text reveal system. The typewriter receives a full text string and reveals it one character at a time at a configurable speed. It fires events on each character reveal and when the full text has been displayed. The typewriter is used internally by DialogueUIComponent but can also be used independently for any text reveal effect.

For detailed usage and standalone setup, see the Typewriter sub-page.

Request Bus: TypewriterRequestBus

Method | Parameters | Returns | Description
StartTypewriter | const AZStd::string& text, float speed | void | Begins revealing the given text at the specified characters-per-second speed.
ForceComplete | (none) | void | Immediately reveals all remaining text. Fires OnTypewriterComplete.
ClearTypewriter | (none) | void | Clears all displayed text and resets the typewriter state.

Notification Bus: TypewriterNotificationBus

Multiple handler bus.

Event | Parameters | Description
OnTypeFired | char character | Fired each time a character is revealed. Used by the BabbleComponent to synchronize audio.
OnTypewriterComplete | (none) | Fired when all characters have been revealed, either naturally or via ForceComplete.

BabbleComponent

Procedural audio babble that plays in sync with the typewriter. Each character reveal from OnTypeFired can trigger a short audio event, creating a character-by-character speech sound. Different speakers can have different babble tones.

Request Bus: BabbleRequestBus

Method | Parameters | Returns | Description
GetBabbleEvent | const AZStd::string& speakerName | BabbleToneEvent | Returns the babble audio event associated with the given speaker.

Data Types

Type | Description
BabbleToneEvent | A single audio event reference (FMOD or Wwise event) that represents one character’s babble sound.
SpeakerBabbleEvents | A mapping of speaker names to BabbleToneEvent instances. Stored on the BabbleComponent and queried at runtime.

Setup Guide

Screen-Space Dialogue

  1. Create a UI canvas entity with your dialogue layout (speaker name text, portrait image, dialogue text area).
  2. Attach a DialogueUIComponent to the entity.
  3. Attach a TypewriterComponent to the text entity (or the same entity).
  4. Optionally attach a BabbleComponent and configure speaker babble events.
  5. Attach a DialogueUISelectionComponent to a separate entity (or the same canvas) for player choices.
  6. Register the UI entity with DialogueUIBridgeRequestBus::RegisterDialogueUI(uiEntityId).

World-Space Dialogue

  1. Follow the same steps as screen-space, but use WorldDialogueUIComponent and WorldDialogueUISelectionComponent instead.
  2. The world-space components will track the performer entity’s position automatically.
  3. Register the UI entity with the bridge the same way.

Swapping between screen-space and world-space at runtime requires only changing which UI entity is registered with the bridge. No sequencer or dialogue data changes are needed.
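The bridge's single registration slot is what makes the swap trivial. The following is a hypothetical standalone model of that idea, not the gem's actual implementation: a `std::string` stands in for `AZ::EntityId`, and `UiBridgeModel` is an invented name used only for illustration.

```cpp
#include <cassert>
#include <string>

// Hypothetical model of the UI Bridge's single registration slot.
// Swapping screen-space and world-space presentation at runtime amounts
// to registering a different UI entity; dialogue data is untouched.
struct UiBridgeModel
{
    std::string m_registeredUi; // stands in for the registered AZ::EntityId

    void RegisterDialogueUI(const std::string& uiEntityId)
    {
        m_registeredUi = uiEntityId; // subsequent RunDialogue calls forward here
    }
};
```

Under this model, switching modes is a single re-registration call, which mirrors the statement above that no sequencer or dialogue data changes are needed.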


See Also

For conceptual overviews and usage guides:

For component references:

For related resources:



4.4.2.5.1 - Typewriter

Full API reference for the TypewriterComponent — character-by-character text reveal with configurable speed, force-complete, and notification events.

The TypewriterComponent provides character-by-character text reveal for dialogue display. It receives a string and a speed value, then reveals one character at a time on tick. It fires a notification on every character reveal and again when the full text is complete. The component is used internally by the DialogueUIComponent but can also be attached to any text entity independently for non-dialogue text effects.

For usage guides and setup examples, see The Basics: GS_Cinematics.

TypewriterComponent in the O3DE Inspector

 

Contents


Request Bus: TypewriterRequestBus

Entity-addressed bus.

Method | Parameters | Returns | Description
StartTypewriter | const AZStd::string& text, float speed | void | Begins revealing the given text. speed is in characters per second. The component connects to TickBus and reveals characters on each tick.
ForceComplete | (none) | void | Immediately reveals all remaining text. Fires OnTypewriterComplete. Useful when the player presses a button to skip the reveal.
ClearTypewriter | (none) | void | Clears all displayed text and resets the internal state. Disconnects from TickBus.

Notification Bus: TypewriterNotificationBus

Multiple handler bus – any number of listeners can connect.

Event | Parameters | Description
OnTypeFired | char character | Fired each time a single character is revealed. The BabbleComponent listens to this event to trigger per-character audio. Custom effects (particles, screen shake, emphasis) can also listen here.
OnTypewriterComplete | (none) | Fired when all characters have been revealed, either through normal tick-based reveal or through ForceComplete. The dialogue UI listens for this to know when the line is fully displayed.

How It Works

  1. A caller (typically DialogueUIComponent) calls StartTypewriter(text, speed).
  2. The component runs ParseFormattedText on the input string, stripping inline [command] tags and building an internal command schedule — pauses, speed changes, and style runs are all resolved before the first tick.
  3. The component stores the parsed text, resets its character index to zero, and connects to TickBus.
  4. On each tick, the component calculates how many characters should be visible based on elapsed time and the current speed. For each newly revealed character it fires OnTypeFired(character). Scheduled commands (pauses, speed changes, style tags) execute when their character position is reached.
  5. The component updates the associated LyShine text element — style tags ([color], [b], etc.) are output as LyShine <font> / <b> / <i> markup inside the visible string.
  6. When the last character is revealed, the component fires OnTypewriterComplete and disconnects from TickBus.
  7. If ForceComplete is called mid-reveal, all remaining characters are revealed at once, OnTypewriterComplete fires, and the tick handler disconnects.
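Step 4 reduces to a small time calculation. The sketch below models it outside the engine; `VisibleCharCount` is an invented helper, not a method of the component, and the clamp-and-instant-reveal behavior is an assumption drawn from the speed=0 rule described in the command reference.

```cpp
#include <algorithm>
#include <cassert>
#include <cstddef>

// Standalone model of the tick-time math: how many characters should be
// visible after `elapsedSeconds` at `charsPerSecond`. A non-positive speed
// reveals everything, mirroring the documented speed=0 behavior.
std::size_t VisibleCharCount(double elapsedSeconds, double charsPerSecond, std::size_t textLength)
{
    if (charsPerSecond <= 0.0)
    {
        return textLength; // instant reveal
    }
    const auto revealed = static_cast<std::size_t>(elapsedSeconds * charsPerSecond);
    return std::min(revealed, textLength); // clamp once the text is fully shown
}
```

On each tick, the component fires OnTypeFired once for every index between the previously visible count and this new count.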

Text Command Reference

Inline commands are embedded directly in dialogue text strings and parsed by ParseFormattedText before reveal begins. Commands do not appear in the displayed text.

Format: [command=value] to open, [/command] to close where applicable.


[pause=X]

Pause the reveal for X seconds before continuing. If no value is given, defaults to 1.0 second. No closing tag.

Hello[pause=1.5] world.

[speed=X] / [/speed]

Change the typing speed to X characters per second from this point forward. [/speed] resets to the component’s configured default speed. Setting speed=0 causes all following text to appear instantly until the next speed tag.

Normal text [speed=5]fast text[/speed] normal again.

[color=VALUE] / [/color]

Change text colour. VALUE is any valid LyShine <font color> value — hex (#FF0000) or a named colour. Outputs <font color="VALUE">…</font> to the text element. Styles are tracked on a stack so nested formatting closes correctly.

Regular [color=#FF4444]danger text[/color] back to normal.

[size=VALUE] / [/size]

Change font size. VALUE is a numeric point size (e.g. 32). Outputs <font size="VALUE">…</font> to the text element.

Normal [size=48]big text[/size] normal.

[bold] or [b] / [/bold] or [/b]

Apply bold formatting. Both bold and b are valid keywords. Outputs <b>…</b>.

This is [b]bold[/b] text.

[italic] or [i] / [/italic] or [/i]

Apply italic formatting. Both italic and i are valid keywords. Outputs <i>…</i>.

This is [i]italic[/i] text.

[n] or [new]

Insert a newline character. No closing tag. Both n and new are valid.

First line[n]Second line.
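To make the "commands do not appear in the displayed text" rule concrete, here is an illustrative sketch of just the stripping half of the parse. `StripInlineCommands` is an invented name; the real ParseFormattedText additionally records each command's character position (pauses, speed changes) and re-emits style tags as LyShine markup, which this model omits.

```cpp
#include <cassert>
#include <string>

// Sketch of one piece of inline-command parsing: remove [command] and
// [command=value] tags so only visible characters remain. Illustrative
// only; the real parser also builds a command schedule.
std::string StripInlineCommands(const std::string& input)
{
    std::string visible;
    bool insideTag = false;
    for (char c : input)
    {
        if (c == '[')      { insideTag = true; }
        else if (c == ']') { insideTag = false; }
        else if (!insideTag) { visible += c; }
    }
    return visible;
}
```

The visible string is what the character index from the reveal loop runs over, which is why a `[pause]` tag never counts as a revealed character.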

Standalone Usage

The typewriter is not limited to dialogue. You can attach it to any entity with a LyShine text element for effects like:

  • Tutorial text that types out instructions
  • Item descriptions that reveal on hover
  • Narrative text during loading screens
  • Any text that benefits from a gradual reveal

Setup

  1. Create an entity with a LyShine text component.
  2. Attach a TypewriterComponent to the same entity.
  3. Call TypewriterRequestBus::Event(entityId, &TypewriterRequestBus::Events::StartTypewriter, text, speed) from C++ or ScriptCanvas.
  4. Optionally listen to TypewriterNotificationBus for per-character or completion events.

C++ Usage

#include <GS_Cinematics/GS_CinematicsBus.h>

// Start typewriter on a specific entity
AZ::EntityId textEntity = /* your text entity */;
GS_Cinematics::TypewriterRequestBus::Event(
    textEntity,
    &GS_Cinematics::TypewriterRequestBus::Events::StartTypewriter,
    AZStd::string("Hello, traveler. Welcome to the village."),
    12.0f  // 12 characters per second
);

// Force-complete if the player presses a button
GS_Cinematics::TypewriterRequestBus::Event(
    textEntity,
    &GS_Cinematics::TypewriterRequestBus::Events::ForceComplete
);

Script Canvas

Typewriter requests and notification events:


See Also

For conceptual overviews and usage guides:

For component references:



4.4.2.6 - Dialogue Actors

Full API reference for DialoguePerformerMarkerComponent, ActorDefinition data model, and the performer registration and lookup system.

Dialogue Actors connect authored character data in the .dialoguedb asset to world-space entities in the level. The system has two halves:

  • ActorDefinition – The data side. Defined in the DialogueDatabase, it holds the actor’s name, portrait, and metadata.
  • DialoguePerformerMarkerComponent – The world side. Attached to an entity in the level, it registers a named performer with the Dialogue Manager so the sequencer can find it at runtime.

When a dialogue sequence references an actor by name, the sequencer looks up the matching performer entity through DialogueManagerRequestBus::GetPerformer(name). This links the authored text, portraits, and performance instructions to a specific entity in the world.

For usage guides and setup examples, see The Basics: GS_Cinematics.


DialoguePerformerMarkerComponent

Dialogue Performer Marker component in the O3DE Inspector

A component placed on any entity that represents a dialogue performer in the level. On Activate(), the marker registers itself with the Dialogue Manager by name. Multiple performers with the same name can exist in different levels, but only one should be active at a time within a single level.

Request Bus: PerformerMarkerRequestBus

Entity-addressed bus – each marker entity responds on its own address.

Method | Parameters | Returns | Description
GetPerformerName | (none) | AZStd::string | Returns the performer’s registered name. This name must match the actor name in the dialogue database.
GetPosEntity | (none) | AZ::EntityId | Returns the entity to use as the performer’s position source. Typically the marker entity itself, but can point to a child entity for offset control.
GetPosPoint | (none) | AZ::Vector3 | Returns the performer’s current world-space position.
ShouldTrackTarget | (none) | bool | Returns whether the performer entity should continuously track and face its dialogue target during a sequence.

ActorDefinition Data Model

The ActorDefinition is the authored data for a single character, stored inside the DialogueDatabase.

TypeId: {1A2B3C4D-5E6F-7A8B-9C0D-1E2F3A4B5C6D}

Property | Type | Description
Actor Name | AZStd::string | Internal lookup name. Must match the performer marker name in the level.
Display Name | AZStd::string | The name shown to the player in dialogue UI. Can differ from the internal name.
Portrait | Asset reference | Default portrait image. Overridden per-node in TextNodeData when needed.
Emotion Categories | AZStd::vector | Groupings of emotional states (e.g. happy, angry, sad). Used to organize profile images and poses.
Profile Image Sets | AZStd::vector | Sets of portrait images keyed by emotion or attitude. The sequencer can switch profiles during dialogue.
Pose Sets | AZStd::vector | Sets of body poses and idle animations keyed by attitude. Referenced by performance nodes.

Performer Registration and Lookup

Registration Flow

  1. A DialoguePerformerMarkerComponent activates on an entity in the level.
  2. During Activate(), the component calls DialogueManagerRequestBus::RegisterPerformerMarker(name, entityId).
  3. The Dialogue Manager stores the mapping from name to entity ID.
  4. When the component deactivates, it unregisters itself.

Lookup Flow

  1. The sequencer processes a node that references an actor name (e.g. a TextNodeData with speaker “perry” or a PerformanceNodeData targeting “perry”).
  2. The sequencer calls DialogueManagerRequestBus::GetPerformer("perry").
  3. The Dialogue Manager returns the registered entity ID.
  4. The sequencer (or performance) can now query the performer’s position, facing, and tracking state through PerformerMarkerRequestBus.

Multiple Performers

The actorTargetId field on node data allows a sequence to target a specific instance of a performer when multiple entities share the same actor name. This supports scenarios like having the same NPC appear in multiple locations across different levels while keeping the same actor definition in the database.
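The registration and lookup flows above boil down to a name-to-entity map owned by the Dialogue Manager. The following standalone model illustrates that contract; `PerformerRegistryModel` is an invented class, `std::string` ids stand in for `AZ::EntityId`, and the real calls go through DialogueManagerRequestBus.

```cpp
#include <cassert>
#include <map>
#include <string>

// Standalone model of performer registration and lookup. Illustrative only.
class PerformerRegistryModel
{
    std::map<std::string, std::string> m_performers; // actor name -> entity id
public:
    void RegisterPerformerMarker(const std::string& name, const std::string& entityId)
    {
        m_performers[name] = entityId; // done in the marker's Activate()
    }
    void UnregisterPerformerMarker(const std::string& name)
    {
        m_performers.erase(name); // done in Deactivate()
    }
    std::string GetPerformer(const std::string& name) const
    {
        auto it = m_performers.find(name);
        return it != m_performers.end() ? it->second : std::string{}; // empty if unknown
    }
};
```

Because the map is keyed by name, only one active marker per name can resolve at a time in a level, which matches the constraint stated for DialoguePerformerMarkerComponent.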


Setup

  1. In your .dialoguedb asset, define an actor with a unique name (e.g. “perry”) using the Dialogue Editor.
  2. In your level, create an entity and attach a DialoguePerformerMarkerComponent.
  3. Set the marker’s performer name to match the actor name in the database (e.g. “perry”).
  4. Position the entity where the performer should appear in the world.
  5. Optionally, add child entities for position offsets or attach visual representation (mesh, animation) to the same entity.

The marker entity will self-register with the Dialogue Manager on activation. The sequencer can now find this performer when processing sequences that reference the matching actor name.


Script Canvas Examples

Getting a performer’s name and world position:


See Also



4.4.2.7 - Effects

Full API reference for the DialogueEffect base class and built-in effect types — SetRecords and ToggleEntitiesActive.

Effects are synchronous actions executed from EffectsNodeData nodes within a dialogue sequence. When the Dialogue Sequencer encounters an Effects node, it calls DoEffect() on each DialogueEffect in the node’s list in order, then immediately advances to the next node.

All effects are polymorphic: any reflected subclass of the base class is discovered at startup and automatically appears in the dialogue editor’s effects list.

For usage guides and setup examples, see The Basics: GS_Cinematics.

 

Contents


DialogueEffect (Abstract Base)

The base class for all dialogue effects. Effects execute synchronously and can be reversed for rollback scenarios.

TypeId: {71E2E05E-A7A3-4EB3-8954-2F75B26D5E3A}

Virtual Methods

Method | Description
DoEffect() | Executes the effect. Called by the sequencer when the EffectsNodeData is processed.
ReverseEffect() | Undoes the effect. Used for rollback or undo scenarios.

Lifecycle

  1. The sequencer encounters an EffectsNodeData node.
  2. The sequencer calls DoEffect() on each effect in the node’s list in order.
  3. All effects complete synchronously within the same frame.
  4. The sequencer immediately advances to the next node.
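The ordering guarantee in the lifecycle above can be modeled in a few lines. This is an illustrative sketch, not the gem's API: `RunEffectsNode` is an invented helper, and `std::function` stands in for the `DialogueEffect` virtual call.

```cpp
#include <cassert>
#include <functional>
#include <string>
#include <vector>

// Model of an Effects node: every effect's DoEffect() runs synchronously,
// in list order, before the sequencer advances. Illustrative only.
void RunEffectsNode(const std::vector<std::function<void()>>& effects)
{
    for (const auto& doEffect : effects)
    {
        doEffect(); // all complete within the same frame
    }
    // ...the sequencer advances to the next node here
}
```

Because execution is synchronous and ordered, later effects in the same node can safely depend on state set by earlier ones.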

SetRecords_DialogueEffect

Sets one or more game records in the GS_Save RecordKeeper system. Commonly used to flag quest progress, NPC disposition, or world state changes triggered by dialogue.

TypeId: {FFCACF63-0625-4EE5-A74B-86C809B7BF80}

Property | Type | Description
Records | AZStd::vector<RecordEntry> | List of record name/value pairs to set on execution.

ToggleEntitiesActive_DialogueEffect

Activates or deactivates a list of entities in the level. Used to show or hide objects, enable or disable triggers, or change world state during dialogue.

TypeId: {F97E1B87-EE7A-4D26-814C-D2F9FA810B28}

Property | Type | Description
Entities | AZStd::vector<AZ::EntityId> | The entities to activate or deactivate.
Active | bool | Target active state: true to activate, false to deactivate.

Complete Effect Type Reference

Type | TypeId | Description
DialogueEffect | {71E2E05E-A7A3-4EB3-8954-2F75B26D5E3A} | Abstract base
SetRecords_DialogueEffect | {FFCACF63-0625-4EE5-A74B-86C809B7BF80} | Sets game records
ToggleEntitiesActive_DialogueEffect | {F97E1B87-EE7A-4D26-814C-D2F9FA810B28} | Activates/deactivates entities

Creating a Custom Effect

To add a custom effect type from an external gem:

1. Define the class

#pragma once
#include <GS_Cinematics/Dialogue/Effects/DialogueEffect.h>

namespace MyGem
{
    class PlaySound_DialogueEffect : public GS_Cinematics::DialogueEffect
    {
    public:
        AZ_RTTI(PlaySound_DialogueEffect, "{YOUR-UUID-HERE}", GS_Cinematics::DialogueEffect);
        AZ_CLASS_ALLOCATOR(PlaySound_DialogueEffect, AZ::SystemAllocator);

        static void Reflect(AZ::ReflectContext* context);

        void DoEffect() override;
        void ReverseEffect() override;

    private:
        AZStd::string m_soundEvent;
    };
}

2. Implement and reflect

#include "PlaySound_DialogueEffect.h"
#include <AzCore/Serialization/SerializeContext.h>

namespace MyGem
{
    void PlaySound_DialogueEffect::Reflect(AZ::ReflectContext* context)
    {
        if (auto sc = azrtti_cast<AZ::SerializeContext*>(context))
        {
            sc->Class<PlaySound_DialogueEffect, GS_Cinematics::DialogueEffect>()
                ->Version(1)
                ->Field("SoundEvent", &PlaySound_DialogueEffect::m_soundEvent);

            if (auto ec = sc->GetEditContext())
            {
                ec->Class<PlaySound_DialogueEffect>("Play Sound", "Fires a sound event during dialogue.")
                    ->ClassElement(AZ::Edit::ClassElements::EditorData, "")
                        ->Attribute(AZ::Edit::Attributes::Category, "MyGem/Effects")
                    ->DataElement(AZ::Edit::UIHandlers::Default, &PlaySound_DialogueEffect::m_soundEvent, "Sound Event", "");
            }
        }
    }

    void PlaySound_DialogueEffect::DoEffect()
    {
        // Fire the sound event
    }

    void PlaySound_DialogueEffect::ReverseEffect()
    {
        // Stop the sound if needed
    }
}

Custom effects are discovered automatically at startup through O3DE serialization — no manual registration step is required.


See Also

For conceptual overviews and usage guides:

For component references:

For related resources:



4.4.2.8 - Performances

Full API reference for the DialoguePerformance base class and built-in performance types — MoveTo, PathTo, and RepositionPerformer.

Performances are asynchronous actions executed from PerformanceNodeData nodes within a dialogue sequence. When the Dialogue Sequencer encounters a performance node, it instantiates the appropriate DialoguePerformance subclass, starts it, and waits for it to signal completion before advancing to the next node.

All performances are polymorphic: any reflected subclass of the base class is discovered automatically and appears in the performances list when adding performances to a node in the dialogue editor.

For usage guides and setup examples, see The Basics: GS_Cinematics.

 

Contents


DialoguePerformance (Abstract Base)

The base class for all dialogue performances. Extends AZ::TickBus::Handler so that performances can drive time-based behavior across multiple frames.

TypeId: {BCCF5C52-42C3-49D4-8202-958120EA8743}

Property | Type | Description
Tick Handler | AZ::TickBus::Handler | Performances connect to TickBus during execution for frame-by-frame updates.

Virtual Methods

Method | Description
DoPerformance() | Entry point called by the sequencer. Sets up the performance and calls ExecutePerformance().
ExecutePerformance() | Core execution logic. Override this in subclasses to implement the actual behavior (movement, animation, etc.).
FinishPerformance() | Called when the performance is complete. Disconnects from TickBus and signals the sequencer via DialogueSequencerRequestBus::OnPerformanceComplete().

Lifecycle

  1. The sequencer encounters a PerformanceNodeData and instantiates the corresponding DialoguePerformance subclass.
  2. The sequencer calls DoPerformance().
  3. DoPerformance() connects to TickBus and calls ExecutePerformance().
  4. The performance runs across multiple ticks until its completion condition is met.
  5. The performance calls FinishPerformance(), which disconnects from TickBus and notifies the sequencer.
  6. The sequencer advances to the next node.

MoveTo_DialoguePerformance

Moves a performer entity to a named stage marker position over time using direct interpolation (no navmesh). Suitable for short-distance movements within a scene where pathfinding is unnecessary.

TypeId: {6033A69D-F46F-40DF-A4A0-A64C4E28D6D5}

Property | Type | Description
Target | Stage marker name | The destination position, resolved through CinematicsManagerRequestBus::GetStageMarker().
Speed / Duration | float | Controls how quickly the performer reaches the target.

Request Bus: MoveTo_PerformanceRequestBus

Method | Parameters | Returns | Description
StartMoveTo | AZ::EntityId performer, AZ::Vector3 target, float speed | void | Begins moving the performer entity toward the target position.

Notification Bus: MoveTo_PerformanceNotificationBus

Event | Description
OnMoveToComplete | Fired when the performer reaches the target position. The performance calls FinishPerformance() in response.

PathTo_DialoguePerformance

Navigates a performer entity to a named stage marker along a RecastNavigation navmesh path. Suitable for longer-distance movements where the performer must navigate around obstacles. Requires RecastNavigation to be enabled in the project.

TypeId: {C0DF4B0E-924D-4C38-BD26-5A286161D95C}

Property | Type | Description
Target | Stage marker name | The destination position, resolved through CinematicsManagerRequestBus::GetStageMarker().
Navigation | RecastNavigation | Uses the navmesh to compute a valid path from the performer’s current position to the target.

Request Bus: PathTo_PerformanceRequestBus

Method | Parameters | Returns | Description
StartPathTo | AZ::EntityId performer, AZ::Vector3 target | void | Computes a navmesh path and begins navigating the performer toward the target.

Notification Bus: PathTo_PerformanceNotificationBus

Event | Description
OnPathToComplete | Fired when the performer reaches the end of the computed path. The performance calls FinishPerformance() in response.

RepositionPerformer_DialoguePerformance

Instantly teleports a performer entity to a named stage marker position. No interpolation, no pathfinding – the entity is moved in a single frame. Useful for off-screen repositioning between scenes or for snapping a performer to a mark before a sequence begins.

TypeId: {DE1E0930-F0A4-4F60-A741-4FB530610AEE}

Property | Type | Description
Target | Stage marker name | The destination position, resolved through CinematicsManagerRequestBus::GetStageMarker().

Notification Bus: RepositionPerformer_PerformanceNotificationBus

Event | Description
OnRepositionComplete | Fired immediately after the performer is teleported. The performance calls FinishPerformance() in the same frame.

Complete Performance Type Reference

Type | TypeId | Movement | Navigation
DialoguePerformance | {BCCF5C52-42C3-49D4-8202-958120EA8743} | Abstract base | (none)
MoveTo_DialoguePerformance | {6033A69D-F46F-40DF-A4A0-A64C4E28D6D5} | Interpolated | None
PathTo_DialoguePerformance | {C0DF4B0E-924D-4C38-BD26-5A286161D95C} | Navmesh path | RecastNavigation
RepositionPerformer_DialoguePerformance | {DE1E0930-F0A4-4F60-A741-4FB530610AEE} | Instant teleport | None

Creating a Custom Performance

To add a custom performance type from an external gem:

1. Define the class

#pragma once
#include <GS_Cinematics/Dialogue/Performances/DialoguePerformance.h>

namespace MyGem
{
    class PlayAnimation_DialoguePerformance : public GS_Cinematics::DialoguePerformance
    {
    public:
        AZ_RTTI(PlayAnimation_DialoguePerformance, "{YOUR-UUID-HERE}", GS_Cinematics::DialoguePerformance);
        AZ_CLASS_ALLOCATOR(PlayAnimation_DialoguePerformance, AZ::SystemAllocator);

        static void Reflect(AZ::ReflectContext* context);

    protected:
        void ExecutePerformance() override;
        void OnTick(float deltaTime, AZ::ScriptTimePoint time) override;

    private:
        AZStd::string m_animationName;
        bool m_animationComplete = false;
    };
}

2. Implement and register

void PlayAnimation_DialoguePerformance::ExecutePerformance()
{
    // Start the animation on the performer entity
    // Listen for animation completion
}

void PlayAnimation_DialoguePerformance::OnTick(float deltaTime, AZ::ScriptTimePoint time)
{
    if (m_animationComplete)
    {
        FinishPerformance();
    }
}

See Also

For conceptual overviews and usage guides:

For component references:

For related resources:



4.4.3 - Timeline Expansion

For usage guides and setup examples, see The Basics: GS_Cinematics.



4.4.4 - Templates

ClassWizard templates for GS_Cinematics — dialogue conditions, effects, and performance objects for the dialogue sequencer.

All GS_Cinematics extension types are generated through the ClassWizard CLI. The wizard handles UUID generation and cmake file-list registration automatically.

For usage guides and setup examples, see The Basics: GS_Cinematics.

python ClassWizard.py \
    --template <TemplateName> \
    --gem <GemPath> \
    --name <SymbolName> \
    [--input-var key=value ...]

 

Contents


Dialogue Condition

Template: DialogueCondition

Creates a condition object that gates a dialogue node connection at runtime. Assigned to a node’s conditions list in the DialogueDatabase asset. EvaluateCondition() is called before the connected node is offered to the player — return true to allow it, false to hide it.

Two types available via --input-var:

Option | Description
Basic Condition (default) | Free-form stub — implement EvaluateCondition() however the design requires
Boolean Condition | Pre-wired to read a save record key and compare it with GS_Core::BooleanConditions

Generated files (Basic):

  • Source/${Name}_DialogueCondition.h/.cpp

Generated files (Boolean):

  • Source/${Name}_BooleanDialogueCondition.h/.cpp

CLI:

# Basic Condition:
python ClassWizard.py --template DialogueCondition --gem <GemPath> --name <Name> \
    --input-var condition_type="Basic Condition"

# Boolean Condition:
python ClassWizard.py --template DialogueCondition --gem <GemPath> --name <Name> \
    --input-var condition_type="Boolean Condition"

Post-generation — manual registration required:

In DialogueSequencerComponent.cpp, add:

#include <path/to/${Name}_DialogueCondition.h>
// inside Reflect(context):
${Name}_DialogueCondition::Reflect(context);

Extensibility: Fully polymorphic. The conditions list on each DialogueNodeData holds DialogueCondition* raw pointers — O3DE’s property editor type picker discovers all registered subtypes via EnumerateDerived. Add as many condition types as needed; they appear in the picker automatically once reflected.

See also: Dialogue Data Structure — full extension walkthrough with header and implementation examples.


Dialogue Effect

Template: DialogueEffect

Creates an effect object that fires world events when the sequencer reaches an Effects node. If the node’s temporary flag is set, ReverseEffect() is called at sequence end to undo the change. Used for enabling/disabling lights, playing sounds, showing UI, or changing world state during dialogue.

Generated files:

  • Source/${Name}_DialogueEffect.h/.cpp

CLI:

python ClassWizard.py --template DialogueEffect --gem <GemPath> --name <Name>

Post-generation — manual registration required:

In DialogueSequencerComponent.cpp, add:

#include <path/to/${Name}_DialogueEffect.h>
// inside Reflect(context):
${Name}_DialogueEffect::Reflect(context);

Key overrides:

Method | When called
DoEffect() | When the Effects node is reached in the sequence
ReverseEffect() | At sequence end, only if the node’s temporary flag is true

Extensibility: Same polymorphic pattern as DialogueCondition. The effects list holds DialogueEffect* pointers. All reflected subtypes appear in the editor type picker automatically.
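The temporary-flag semantics can be sketched with minimal stand-ins (DoEffect/ReverseEffect mirror the documented overrides; the light flag and the EndSequence helper are hypothetical):

```cpp
// Stand-in for the DialogueEffect contract described above.
struct DialogueEffect
{
    bool temporary = false;   // if true, the effect is undone at sequence end
    virtual ~DialogueEffect() = default;
    virtual void DoEffect() = 0;
    virtual void ReverseEffect() = 0;
};

// Hypothetical effect: enable a light while the dialogue plays.
struct ToggleLightEffect : DialogueEffect
{
    bool* lightEnabled = nullptr;

    void DoEffect() override { *lightEnabled = true; }
    void ReverseEffect() override { *lightEnabled = false; }
};

// Sequencer-side rule: ReverseEffect runs at sequence end only when the
// node's temporary flag is set.
inline void EndSequence(DialogueEffect& effect)
{
    if (effect.temporary)
    {
        effect.ReverseEffect();
    }
}
```

With temporary set, the world change lasts only for the duration of the sequence; without it, the change persists afterwards.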

See also: Dialogue Data Structure — full extension walkthrough.


Dialogue Performance

Template: DialoguePerformance

Creates a performance object that drives world-space NPC actions from a Performance node in the dialogue sequence. The sequencer waits at the node until all performances signal completion (unless waitToContinue is false). Used for NPC movement, animation, repositioning, or any async world action that must complete before the dialogue continues.

Generated files:

  • Source/${Name}_DialoguePerformance.h/.cpp

CLI:

python ClassWizard.py --template DialoguePerformance --gem <GemPath> --name <Name>

Post-generation — manual registration required:

In DialogueSequencerComponent.cpp, add:

#include <path/to/${Name}_DialoguePerformance.h>
// inside Reflect(context):
${Name}_DialoguePerformance::Reflect(context);

Lifecycle:

Method | Role
ExecutePerformance() | Entry point — start the world action here
FinishPerformance() | Call when the action is done (may be async); clean up state, then call base
PerformanceComplete() | Called by base after FinishPerformance() — signals the sequencer to advance

Inherited fields: delayStartTime (seconds before ExecutePerformance is called), delayEndTime (seconds before PerformanceComplete fires after finish).

For instant actions, call FinishPerformance() at the end of ExecutePerformance(). For async actions, store a callback or subscribe to a bus and call FinishPerformance() from the completion handler.
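A minimal sketch of that lifecycle, using stand-in classes (the real base also handles the delay fields and the sequencer hookup):

```cpp
// Stand-in for the DialoguePerformance lifecycle described above.
struct DialoguePerformance
{
    bool complete = false;   // stands in for the signal sent to the sequencer
    virtual ~DialoguePerformance() = default;

    virtual void ExecutePerformance() = 0;

    // Subclasses call this when their world action is done; the base
    // then signals the sequencer via PerformanceComplete().
    virtual void FinishPerformance() { PerformanceComplete(); }

protected:
    virtual void PerformanceComplete() { complete = true; }
};

// Instant action: finish inside ExecutePerformance() itself. An async
// action would instead call FinishPerformance() from a completion
// handler later.
struct SnapTurnPerformance : DialoguePerformance
{
    void ExecutePerformance() override
    {
        // ... instantly reposition or rotate the NPC here ...
        FinishPerformance();
    }
};
```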

Extensibility: Same polymorphic pattern as the other dialogue types. The performances list on a Performance node holds DialoguePerformance* pointers. All reflected subtypes appear in the editor picker automatically.

See also: Dialogue Data Structure — full extension walkthrough.


See Also

For the full API, component properties, and C++ extension guide:

For all ClassWizard templates across GS_Play gems:


Get GS_Cinematics

GS_Cinematics — Explore this gem on the product page and add it to your project.

4.4.5 - 3rd Party Implementations

For usage guides and setup examples, see The Basics: GS_Cinematics.



4.5 - GS_Environment

Time of day progression, day/night cycle management, and sky colour configuration for GS_Play projects.

GS_Environment is the time and world atmosphere gem for GS_Play. It drives a configurable time-of-day clock, broadcasts day/night transition events, exposes per-tick world time notifications for dependent systems, and provides a sky colour configuration asset for Atom renderer integration. The gem depends on GS_Core.

For usage guides and setup examples, see The Basics: GS_Environment.

 

Contents


Time Manager

The Time Manager singleton owns the world clock. It advances time at a configurable speed, determines whether the current time of day is day or night, and registers the active main camera for sky calculations. Listeners subscribe via TimeManagerNotificationBus to react to time changes or the day/night boundary crossing.

Component / Asset | Purpose
GS_TimeManagerComponent | GS_Play manager. Owns the world clock. Controls time passage speed and exposes day/night state queries.
SkyColourConfiguration | Asset class for sky colour settings used by Atom renderer integration.

Time Manager API


Environment System

The system component manages gem-level initialization and global environment state. It runs on AZ::TickBus to advance the world clock each frame and coordinates environment requests across the level.

Component | Purpose
GS_EnvironmentSystemComponent | Runtime system component. Advances the world clock on tick. Handles GS_EnvironmentRequestBus queries.

Environment System API


Dependencies

  • GS_Core (required)

Installation

  1. Enable GS_Environment and GS_Core in your O3DE project’s gem list.
  2. Add GS_TimeManagerComponent to your Game Manager prefab and include it in the Startup Managers list.
  3. Configure the day window (start and end time values) on the Time Manager component.
  4. Set time passage speed to 0 if you want a static time of day, or to a positive value for a live clock.
  5. Call SetMainCam via TimeManagerRequestBus once your active camera entity is known (typically from a stage startup sequence).
  6. Subscribe to TimeManagerNotificationBus::WorldTick or DayNightChanged on any entities that respond to time changes.

See Also

For conceptual overviews and usage guides:

For related resources:


Get GS_Environment

GS_Environment — Explore this gem on the product page and add it to your project.

4.5.1 - Time Manager

World clock management, time-of-day control, day/night cycle detection, and per-tick time notifications.

The Time Manager is the singleton world clock for every GS_Play project. It extends GS_ManagerComponent and participates in the standard two-stage initialization managed by the Game Manager. On activation it begins advancing a time-of-day value each frame, determines whether the current time falls within the configured day window, and registers the active main camera for sky positioning calculations.

Listeners subscribe to TimeManagerNotificationBus to react to per-frame ticks or the day/night boundary crossing.

For usage guides and setup examples, see The Basics: Time Manager.

Time Manager component in the O3DE Inspector

 

Contents


How It Works

Time Progression

Each frame the Time Manager advances the current time-of-day value by deltaTime × timePassageSpeed. A speed of 0 freezes the clock at the configured initial time; positive values run the clock forward in real time.

Day/Night State

The Time Manager compares the current time against a configured day window (start and end values). When the time crosses either threshold it fires DayNightChanged once. IsDay() returns the cached state and is safe to call every tick.
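The progression and boundary rules above can be condensed into a self-contained clock sketch (field names are illustrative, not the component's real members, and the wrap-around at 1.0 is an assumption about how the normalized clock rolls over):

```cpp
// Sketch of the documented clock rules: time advances by
// deltaTime * timePassageSpeed each frame, and the day/night state
// flips once when the time crosses the configured day window.
struct WorldClock
{
    float timeOfDay = 0.0f;        // normalized 0.0–1.0
    float timePassageSpeed = 1.0f; // 0 freezes the clock
    float dayStart = 0.25f;
    float dayEnd = 0.75f;
    bool isDay = false;

    // Returns true when the day/night state changed this tick — the
    // point where the real manager would fire DayNightChanged.
    bool Tick(float deltaTime)
    {
        timeOfDay += deltaTime * timePassageSpeed;
        if (timeOfDay >= 1.0f) { timeOfDay -= 1.0f; } // assumed wrap to next day

        const bool nowDay = timeOfDay >= dayStart && timeOfDay < dayEnd;
        const bool changed = nowDay != isDay;
        isDay = nowDay;
        return changed;
    }
};
```

Because the change flag is derived from cached state, the boundary event fires exactly once per crossing, matching the documented DayNightChanged behaviour.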

Sky Integration

Call SetMainCam with the active camera entity after stage startup. The Time Manager uses this reference to pass sky positioning data to the SkyColourConfiguration asset used by the Atom renderer.


Inspector Properties

Property | Type | Description
Day Start | float | Normalized time value (0.0–1.0) at which day begins.
Day End | float | Normalized time value (0.0–1.0) at which night begins.
Initial Time | float | Starting time-of-day value when the component activates.
Time Passage Speed | float | Multiplier controlling how fast time advances each frame. Set to 0 for a static clock.
Sky Colour Config | AZ::Data::Asset<SkyColourConfiguration> | Sky colour configuration asset used by the Atom renderer integration.

API Reference

GS_TimeManagerComponent

Field | Value
Extends | GS_Core::GS_ManagerComponent
Header | GS_Environment/GS_TimeManagerBus.h

Request Bus: TimeManagerRequestBus

Commands sent to the Time Manager. Singleton bus – Single address, single handler.

Method | Parameters | Returns | Description
SetTimeOfDay | float time | void | Sets the current time of day (0.0–1.0 normalized range).
GetTimeOfDay | (none) | float | Returns the current time of day value.
GetWorldTime | (none) | float | Returns the total elapsed world time since startup.
SetTimePassageSpeed | float speed | void | Sets the multiplier controlling how fast time advances each frame.
IsDay | (none) | bool | Returns whether the current time falls within the configured day window.
SetMainCam | AZ::EntityId entityId | void | Registers the active main camera entity for sky positioning calculations.

Notification Bus: TimeManagerNotificationBus

Events broadcast by the Time Manager. Multiple handler bus – any number of components can subscribe.

Event | Parameters | Description
WorldTick | float deltaTime | Fired every frame while the Time Manager is active. Use for time-dependent per-frame updates.
DayNightChanged | bool isDay | Fired once when the time-of-day crosses the day/night boundary in either direction.

SkyColourConfiguration

Asset class for sky colour settings. Referenced by GS_TimeManagerComponent and consumed by the Atom renderer integration to drive sun, moon, and ambient colour values across the day/night cycle.


See Also

For conceptual overviews and usage guides:

For component references:



4.5.2 - Environment System

Runtime system component that drives the world clock tick and handles GS_EnvironmentRequestBus queries.

The Environment System Component is the gem-level runtime system for GS_Environment. It extends AZ::Component and implements AZ::TickBus::Handler to advance the world clock each frame. It handles GS_EnvironmentRequestBus queries and coordinates global environment state across the level.

This component activates automatically as part of the GS_Environment gem – it does not need to be added manually to any entity.

For usage guides and setup examples, see The Basics: GS_Environment.

Sky Colour Configuration asset in the O3DE Asset Editor

 

Contents


How It Works

Tick Integration

The Environment System Component connects to AZ::TickBus on activation. Each tick it drives the Time Manager’s clock advancement and ensures all environment-dependent systems receive their frame update in the correct order.

Environment Requests

GS_EnvironmentRequestBus provides a global access point for querying environment state. The system component is the sole handler; any gem or component can broadcast requests to read current environment conditions.


API Reference

GS_EnvironmentSystemComponent

Field | Value
TypeId | {57B91AE7-B0EC-467E-A359-150B5FB993F9}
Extends | AZ::Component, AZ::TickBus::Handler
Header | GS_Environment/GS_EnvironmentBus.h

Request Bus: GS_EnvironmentRequestBus

Commands sent to the Environment System. Singleton bus – Single address, single handler.

Field | Value
Interface TypeId | {39599FDC-B6DC-4143-A474-9B525599C919}

The base interface declares no methods of its own; it is extended by project-level environment systems as needed.

See Also

For conceptual overviews and usage guides:

For component references:

  • Time Manager – World clock, day/night cycle, and per-tick notifications


4.5.3 - 3rd Party Implementations

For usage guides and setup examples, see The Basics: GS_Environment.



4.6 - GS_Interaction

Entity interaction systems — physics-based pulse events, targeting and cursor management, and extensible world trigger actions.

GS_Interaction is the gem that connects entities to each other and to the player. It provides three distinct systems: a physics-based pulse emitter/reactor pair for broadcasting typed events through trigger volumes, a targeting subsystem that tracks and selects interact candidates relative to a cursor, and a world trigger framework that composes trigger conditions with discrete world-state actions. All three systems are designed to be extended — custom pulse types, reactor types, trigger actions, and world trigger behaviors can be registered without modifying the gem.

For usage guides and setup examples, see The Basics: GS_Interaction.

 

Contents


Pulsors

A physics-based event broadcast system. A PulsorComponent emits a typed pulse event when its physics trigger volume fires. PulseReactorComponents on nearby entities receive the pulse and execute their registered reactor logic. Pulse and reactor types are extensible, so any gem can contribute new pulse/reactor pairs.

Component | Purpose
PulsorComponent | Emits a typed pulse when a physics trigger volume fires.
PulseReactorComponent | Receives typed pulses and routes them to registered reactor logic.

Pulsors API


Targeting

A targeting and cursor management system. The Targeting Handler selects the best available interact target from registered candidates detected through proximity trigger volumes. The cursor components manage visual cursor representation on a LyShine canvas. An input reader component bridges raw player input into interaction events.

Component | Purpose
GS_TargetingHandlerComponent | Core targeting logic — selects the closest/best registered interact target.
GS_TargetComponent | Base target component with visual properties (size, offset, color, sprite).
GS_CursorComponent | Manages the active cursor: registration, visibility, position, and visuals.
GS_CursorCanvasComponent | Per-canvas cursor sprite component, registered with GS_CursorComponent.
GS_TargetingInteractionFieldComponent | Physics trigger volume used for proximity-based target detection.
InteractInputReaderComponent | Maps raw player input to interaction events.

Targeting API


World Triggers

A composable trigger-action framework for world-state changes. Trigger Sensor components define the condition that fires the trigger (player interact, physics collision, save record state). World Trigger components define the effect (log a message, set a record, toggle entity activation, change stage). Any number of trigger actions can drive any number of world triggers on the same entity.

Component | Purpose
TriggerSensorComponent | Base class for all trigger condition components.
InteractTriggerSensorComponent | Fires on interact input.
PlayerInteractTriggerSensorComponent | Fires on interact input from the player only.
ColliderTriggerSensorComponent | Fires on physics collision with the trigger volume.
RecordTriggerSensorComponent | Fires when a save record matches a configured condition.
WorldTriggerComponent | Base class for all world trigger effect components.
PrintLog_WTComponent | Logs a message when triggered.
SetRecord_WorldTriggerComponent | Sets a key-value save record when triggered.
ToggleEntityActivate_WorldTriggerComponent | Toggles entity activation state when triggered.
ChangeStage_WorldTriggerComponent | Requests a stage/level transition when triggered.

World Triggers API


Installation

GS_Interaction requires GS_Core, LmbrCentral, LyShine, and CommonFeaturesAtom to be active in your project.

  1. Enable GS_Interaction in Project Manager or project.json.
  2. Add a GS_TargetingHandlerComponent to the player entity and configure its proximity trigger.
  3. Add GS_CursorComponent and GS_CursorCanvasComponent to the cursor entity and link them together.
  4. Place TriggerSensorComponent and WorldTriggerComponent variants on any entity that should react to the world.
  5. Refer to the Interaction Set Up Guide for a full walkthrough.

See Also

For conceptual overviews and usage guides:

For related resources:


Get GS_Interaction

GS_Interaction — Explore this gem on the product page and add it to your project.

4.6.1 - Pulsors

Physics-based pulse emitter and reactor system — extensible typed interactions via polymorphic PulseType and ReactorType classes.

The Pulsor system is a physics-driven interaction layer. A PulsorComponent emits typed pulses when its physics trigger fires. PulseReactorComponents on other entities receive and process those pulses based on their type. The system is fully extensible — new pulse and reactor types are discovered automatically through O3DE serialization, so new interaction types can be added from any gem without modifying GS_Interaction.

For usage guides and setup examples, see The Basics: GS_Interaction.

 

Contents


Architecture

Pulse Pattern Graph

Breakdown

When an entity enters a Pulsor’s trigger volume, the Pulsor emits its configured pulse type to all Reactors on the entering entity:

Step | What It Means
1 — Collider overlap | Physics detects an entity entering the Pulsor’s trigger volume.
2 — Pulse emit | PulsorComponent reads its configured PulseType and prepares the event.
3 — Reactor query | Each PulseReactorComponent on the entering entity is checked with IsReactor().
4 — Reaction | Reactors returning true have ReceivePulses() called and execute their response.

Pulse types are polymorphic — new types are discovered automatically at startup via EnumerateDerived. Any gem can define custom interaction semantics without modifying GS_Interaction.
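The dispatch rule behind those steps can be sketched in isolation. The class shapes below are illustrative stand-ins (the real types carry O3DE RTTI, reflection, and entity context), but the channel-matching logic is the one described above:

```cpp
#include <string>
#include <vector>

// Stand-in base classes: a pulse declares the channel it is sent on,
// a reactor declares the channel it listens on.
struct PulseType
{
    virtual ~PulseType() = default;
    virtual std::string GetChannel() const = 0;
};

struct ReactorType
{
    virtual ~ReactorType() = default;
    virtual std::string GetChannel() const = 0;
    virtual void React(const PulseType& pulse) = 0;
};

// Deliver every emitted pulse to every reactor whose channel matches.
// Returns the number of reactions triggered.
inline int DeliverPulses(const std::vector<PulseType*>& pulses,
                         const std::vector<ReactorType*>& reactors)
{
    int reactions = 0;
    for (const PulseType* pulse : pulses)
    {
        for (ReactorType* reactor : reactors)
        {
            if (reactor->GetChannel() == pulse->GetChannel())
            {
                reactor->React(*pulse);
                ++reactions;
            }
        }
    }
    return reactions;
}

// Debug-style pair on a shared channel, mirroring Debug_Pulse/Debug_Reactor.
struct DebugPulse : PulseType
{
    std::string GetChannel() const override { return "debug"; }
};

struct DebugReactor : ReactorType
{
    int received = 0;
    std::string GetChannel() const override { return "debug"; }
    void React(const PulseType&) override { ++received; }
};
```

Because matching is a plain string comparison, a pulse and reactor pair only ever connect when their channel strings agree exactly.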

E Indicates extensible classes and methods.

Patterns - Complete list of system patterns used in GS_Play.


Components

PulsorComponent

Pulsor component in the O3DE Inspector

Emits typed pulses when its physics trigger fires. Extends AZ::Component and PhysicsTriggeringVolume.

Property | Type | Description
Pulse Types | vector<PulseType*> | The pulse types this pulsor emits on trigger.

The pulsor fires all configured pulse types simultaneously when an entity enters its trigger volume.


PulseReactorComponent

Pulse Reactor component in the O3DE Inspector

Receives and processes typed pulses from pulsors.

Bus: PulseReactorRequestBus (ById, Multiple)

Method | Parameters | Returns | Description
ReceivePulses | (none) | void | Process incoming pulse events.
IsReactor | (none) | bool | Returns whether this component is an active reactor.

Pulse Types

Pulse types define what kind of interaction a pulsor emits. All types extend the abstract PulseType base class.

Type | TypeId | Description
PulseType (base) | {8A1B2C3D-4E5F-6A7B-8C9D-0E1F2A3B4C5D} | Abstract base class for all pulse types.
Debug_Pulse | {123D83FD-027C-4DA4-B44B-3E0520420E44} | Test and debug pulse for development.
Destruct_Pulse | {98EC44DA-C838-4A44-A37A-FA1A502A506B} | Destruction pulse — triggers destructible reactions.

Pulse Types API


Reactor Types

Reactor types define how an entity responds to incoming pulses. All types extend the abstract ReactorType base class.

Type | TypeId | Description
ReactorType (base) | {9B2C3D4E-5F6A-7B8C-9D0E-1F2A3B4C5D6E} | Abstract base class for all reactor types.
Debug_Reactor | {38CC0EA0-0975-497A-B2E5-299F5B4222F7} | Test and debug reactor for development.
Destructable_Reactor | {47C9B959-2A9F-4E06-8187-E32DDA3449EC} | Handles destruction responses to Destruct_Pulse.

Reactor Types API


Extension Guide

Use the ClassWizard templates to generate new pulse and reactor classes with boilerplate already in place — see GS_Interaction Templates:

  • PulsorPulse — generates a new PulseType subclass with a named channel and payload stub. Supply type_display_name and type_category input vars to control how it appears in the editor dropdown.
  • PulsorReactor — generates a new ReactorType subclass. Supply the required pulse_channel var — the channel string is baked into the header at generation time.

To create a custom pulse or reactor type manually:

  1. Create a class extending PulseType (or ReactorType) with a unique RTTI TypeId.
  2. Reflect the class using O3DE’s SerializeContext and EditContext. The system discovers the new type automatically.

Channels are string-matched at runtime — keep the channel string consistent between the Pulse and the Reactor that receives it.


See Also

For related resources:



4.6.1.1 - Pulse Types

Built-in pulse types for the Pulsor interaction system.

Pulse types define what kind of interaction a PulsorComponent emits. Each pulse type is a data class extending the abstract PulseType base. The PulsorComponent holds an array of pulse types and emits all of them simultaneously when its physics trigger fires.

For usage guides and setup examples, see The Basics: GS_Interaction.


Component

PulsorComponent

Pulsor component in the O3DE Inspector

Emits configured pulse types when its physics trigger fires. Extends AZ::Component and PhysicsTriggeringVolume.

Field | Type | Description
pulseTypes | vector<PulseType*> | Pulse types emitted to all PulseReactorComponent instances on any entering entity.

All pulse types in the array fire simultaneously on a single trigger event. Configure multiple pulse types to target reactors on different channels in one emission.


Base Class

PulseType

Abstract base class for all pulse types. Not a component — discovered automatically at startup via EnumerateDerived and reflected into the editor dropdown.

Field / Virtual | Type | Description
TypeId | {8A1B2C3D-4E5F-6A7B-8C9D-0E1F2A3B4C5D} | RTTI identifier. Each subclass must define its own unique TypeId.
GetChannel() | AZStd::string | Returns the channel string this pulse is sent on. Matched against reactor channel strings at runtime.

Subclass PulseType to define custom pulse data (damage values, force vectors, status effect references) and a unique channel name.


Built-in Types

Debug_Pulse

Test and debug pulse for verifying pulsor setups during development.

Field | Value
TypeId | {123D83FD-027C-4DA4-B44B-3E0520420E44}

Use Debug_Reactor on the receiving entity to verify the pulse is arriving correctly.


Destruct_Pulse

Destruction pulse — triggers destructible reactions on receiving entities.

Field | Value
TypeId | {98EC44DA-C838-4A44-A37A-FA1A502A506B}

Pair with Destructable_Reactor on entities that should respond to destruction events.


Creating Custom Pulse Types

Use the ClassWizard PulsorPulse template to scaffold a new PulseType subclass with boilerplate already in place — see GS_Interaction Templates. Supply type_display_name and type_category input vars to control how the type appears in the editor dropdown.

To create a custom pulse type manually:

  1. Create a class extending PulseType with a unique RTTI TypeId.
  2. Override GetChannel() to return a unique channel name string.
  3. Add any data fields your pulse carries (damage values, force vectors, status effect references).
  4. Reflect the class using O3DE’s SerializeContext and EditContext. The system discovers the type automatically via EnumerateDerived — no registration step required.
  5. Configure PulsorComponent instances to emit your custom pulse type.

Keep the channel string consistent between your pulse type and the reactor types that should receive it.
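Under those steps, a custom pulse subclass is mostly a channel name plus payload fields. The sketch below uses an illustrative stand-in for the base class and omits the reflection from step 4; the DamagePulse name and its field are hypothetical:

```cpp
#include <string>

// Illustrative stand-in for the PulseType base described above.
struct PulseType
{
    virtual ~PulseType() = default;
    virtual std::string GetChannel() const = 0;
};

// Hypothetical custom pulse: a unique channel name (step 2) plus the
// payload the pulse carries to its reactors (step 3).
struct DamagePulse : PulseType
{
    float damage = 10.0f;   // payload read by the matching reactor
    std::string GetChannel() const override { return "damage"; }
};
```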


See Also



4.6.1.2 - Reactor Types

Built-in reactor types for the Pulsor interaction system.

Reactor types define how a PulseReactorComponent responds to incoming pulses. Each reactor type is a data class extending the abstract ReactorType base. The PulseReactorComponent holds an array of reactor types and processes all of them when a matching pulse arrives.

For usage guides and setup examples, see The Basics: GS_Interaction.


Component

PulseReactorComponent

Pulse Reactor component in the O3DE Inspector

Owns and processes reactor types when pulses arrive from PulsorComponent instances. Extends AZ::Component.

Bus: PulseReactorRequestBus (ById, Multiple)

Field | Type | Description
reactorTypes | vector<ReactorType*> | Reactor types evaluated and executed when a pulse is received.

Method | Returns | Description
IsReactor | bool | Returns true if this component has active reactor types. The PulsorComponent checks this before calling ReceivePulses.
ReceivePulses | void | Processes incoming pulse events. Iterates all reactor types and calls React() on types whose channel matches the incoming pulse.

The PulsorComponent queries IsReactor() first — only components that return true have ReceivePulses() called. This allows entities with no reactor types to be skipped without iterating all types.


Base Class

ReactorType

Abstract base class for all reactor types. Not a component — discovered automatically at startup via EnumerateDerived and reflected into the editor dropdown.

Field / Virtual | Type | Description
TypeId | {9B2C3D4E-5F6A-7B8C-9D0E-1F2A3B4C5D6E} | RTTI identifier. Each subclass must define its own unique TypeId.
GetChannel() | AZStd::string | Returns the channel string this reactor listens on. Matched against incoming pulse channel strings at runtime.
React(pulse, sourceEntity) | void | Override to implement the reaction behavior when a matching pulse is received.

Subclass ReactorType to define custom reaction behavior for a specific channel. The channel string must match the emitting PulseType’s channel exactly.


Built-in Types

Debug_Reactor

Test and debug reactor for verifying reactor setups during development. Logs a message when it receives a Debug_Pulse.

Field | Value
TypeId | {38CC0EA0-0975-497A-B2E5-299F5B4222F7}

Destructable_Reactor

Handles destruction responses — processes Destruct_Pulse events on the receiving entity.

Field | Value
TypeId | {47C9B959-2A9F-4E06-8187-E32DDA3449EC}

Add this to any entity that should respond to destruction pulses — props, breakables, or enemy characters.


Creating Custom Reactor Types

Use the ClassWizard PulsorReactor template to scaffold a new ReactorType subclass with boilerplate already in place — see GS_Interaction Templates. Supply the pulse_channel input var — the channel string is baked into the generated header at generation time.

To create a custom reactor type manually:

  1. Create a class extending ReactorType with a unique RTTI TypeId.
  2. Override GetChannel() to return the channel name this reactor listens on.
  3. Override React(pulse, sourceEntity) to implement the reaction logic.
  4. Reflect the class using O3DE’s SerializeContext and EditContext. The system discovers the type automatically via EnumerateDerived — no registration step required.
  5. Add the custom reactor type to PulseReactorComponent instances on entities that should respond.

The channel string in GetChannel() must exactly match the GetChannel() return value of the PulseType you want to receive.
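A reactor subclass pairs a matching channel with a React override. The sketch below uses illustrative stand-ins for both base classes, omits reflection and the sourceEntity parameter, and the DamagePulse/DamageReactor names are hypothetical:

```cpp
#include <string>

// Illustrative stand-ins for the PulseType/ReactorType bases.
struct PulseType
{
    virtual ~PulseType() = default;
    virtual std::string GetChannel() const = 0;
};

struct DamagePulse : PulseType
{
    float damage = 10.0f;
    std::string GetChannel() const override { return "damage"; }
};

struct ReactorType
{
    virtual ~ReactorType() = default;
    virtual std::string GetChannel() const = 0;
    virtual void React(const PulseType& pulse) = 0;
};

// Listens on the same "damage" channel as DamagePulse (steps 1–2) and
// applies the payload in React (step 3).
struct DamageReactor : ReactorType
{
    float health = 100.0f;
    std::string GetChannel() const override { return "damage"; }
    void React(const PulseType& pulse) override
    {
        // Channel matching guarantees the concrete type in this sketch;
        // a real implementation would use O3DE RTTI casts instead.
        health -= static_cast<const DamagePulse&>(pulse).damage;
    }
};
```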


See Also



4.6.2 - Targeting

Target detection, selection, and cursor management — scanning, filtering, and selecting entities based on proximity and sensory fields.

The targeting system provides spatial awareness and entity selection for units and other entities. A TargetingHandler component serves as the central processor — it receives target registrations from sensory fields, maintains categorized target lists, and selects the best interact target based on proximity and type filtering. The system supports a cursor overlay for visual feedback and an input reader for triggering interactions.

For usage guides and setup examples, see The Basics: GS_Interaction.

Targeting Handler component in the O3DE Inspector Targeting Interaction Field component in the O3DE Inspector Interact Cursor component in the O3DE Inspector Interact Target component in the O3DE Inspector

 

Contents


Components

Component | Purpose
GS_TargetingHandlerComponent | Central targeting processor. Receives target registrations, selects best interact target.
GS_TargetingInteractionFieldComponent | Physics trigger volume for proximity-based target detection.
GS_TargetComponent | Base target marker. Makes an entity detectable by the targeting system.
GS_InteractTargetComponent | Specialized interact target extending GS_TargetComponent.
GS_CursorComponent | Cursor display and positioning for targeting feedback.
GS_CursorCanvasComponent | UI canvas layer for cursor sprite rendering.
InteractInputReaderComponent | Input reader that bridges interact input to trigger actions on the current target.

EBus Summary

Request Buses

Bus | Address | Handler | Key Methods
GS_TargetingHandlerRequestBus | ById | Multiple | RegisterTarget, UnregisterTarget, RegisterInteractionRangeTarget, UnregisterInteractionRangeTarget, GetInteractTarget
GS_TargetRequestBus | ById | Multiple | GetTargetSize, GetTargetOffset, GetTargetColour, GetTargetSprite
GS_CursorRequestBus | Single | Multiple | RegisterCursorCanvas, HideCursor, SetCursorOffset, SetCursorVisuals, SetCursorPosition
GS_CursorCanvasRequestBus | ById | Multiple | HideSprite, SetCursorSprite

Notification Buses

Bus | Address | Handler | Key Events
GS_TargetingHandlerNotificationBus | ById | Multiple | OnUpdateInteractTarget, OnEnterStandby, OnExitStandby

See Also

For component references:

For related resources:



4.6.2.1 - Targeting Handler

Central targeting processor — receives target registrations from sensory fields, selects best interact target, and manages cursor feedback.

The GS_TargetingHandlerComponent is the central processor for the targeting system. It receives target registrations from sensory fields, maintains categorized lists of detected targets, and selects the best interact target based on proximity and type. It also manages cursor feedback and interaction range tracking.

For usage guides and setup examples, see The Basics: GS_Interaction.


The Targeting Handler sits on the same entity as the unit it serves (typically the player or an AI character). Sensory fields register and unregister targets as they enter and exit detection volumes. The handler evaluates all registered targets each tick to determine the closest interactable, which becomes the active interact target.

 

Contents


How It Works

Target Selection

Each tick, the handler runs ProcessClosestInteractable to evaluate all registered targets. It filters by type, checks interaction range, and selects the closest valid target. The result is broadcast via OnUpdateInteractTarget.
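The selection rule reduces to a nearest-candidate scan. This sketch uses a hypothetical Target struct and plain positions in place of AZ::EntityId and transform queries, and omits the type filtering:

```cpp
#include <limits>
#include <vector>

// Illustrative target record: an id plus a world position.
struct Target
{
    int id;
    float x, y, z;
};

// Sketch of ProcessClosestInteractable: return the id of the registered
// target closest to the handler position, or -1 when none are valid
// (the real handler would report an invalid EntityId instead).
inline int ProcessClosestInteractable(float hx, float hy, float hz,
                                      const std::vector<Target>& targets)
{
    int best = -1;
    float bestDistSq = std::numeric_limits<float>::max();
    for (const Target& t : targets)
    {
        const float dx = t.x - hx, dy = t.y - hy, dz = t.z - hz;
        const float distSq = dx * dx + dy * dy + dz * dz;
        if (distSq < bestDistSq)   // squared distance avoids the sqrt
        {
            bestDistSq = distSq;
            best = t.id;
        }
    }
    return best;
}
```

Comparing squared distances is the usual trick here: the ordering is the same as for true distances, so no square root is needed per candidate.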

Interaction Range

A separate interaction range field provides a close-proximity subset of targets. Targets within this range are prioritized for interact selection. The range is driven by a dedicated GS_TargetingInteractionFieldComponent trigger volume.

Cursor Tracking

The handler updates the cursor position to follow the active interact target. Cursor visuals change based on target type (interact, item, unit, etc.).


API Reference

GS_TargetingHandlerRequestBus

Bus policy: ById, Multiple

Method | Parameters | Returns | Description
RegisterTarget | AZ::EntityId entity | bool | Registers a target entity detected by a sensory field.
UnregisterTarget | AZ::EntityId entity | bool | Removes a target entity from the detection list.
RegisterInteractionRangeTarget | AZ::EntityId entity | bool | Registers a target within the close interaction range.
UnregisterInteractionRangeTarget | AZ::EntityId entity | bool | Removes a target from the interaction range list.
GetInteractTarget | (none) | AZ::EntityId | Returns the current best interact target.

GS_TargetingHandlerNotificationBus

Bus policy: ById, Multiple

Event | Parameters | Description
OnUpdateInteractTarget | AZ::EntityId target | Fired when the active interact target changes.
OnEnterStandby | (none) | Fired when the system enters standby mode.
OnExitStandby | (none) | Fired when the system exits standby mode.

Virtual Methods

Method | Parameters | Returns | Description
CheckForTick | (none) | void | Called each tick to evaluate whether target processing should run.
ProcessClosestInteractable | (none) | void | Determines the closest interactable target and sets it as the active interact target.

GS_CursorComponent

Manages cursor display and positioning for targeting feedback.

GS_CursorRequestBus

Bus policy: Single, Multiple

Method | Parameters | Returns | Description
RegisterCursorCanvas | AZ::EntityId canvas | void | Registers the UI canvas for cursor rendering.
HideCursor | bool hide | void | Shows or hides the cursor.
SetCursorOffset | AZ::Vector2 offset | void | Sets the screen-space offset for cursor positioning.
SetCursorVisuals | visual data | void | Configures cursor appearance.
SetCursorPosition | AZ::Vector2 pos | void | Sets the cursor screen position directly.

GS_CursorCanvasComponent

UI canvas layer that renders the cursor sprite.

GS_CursorCanvasRequestBus

Bus policy: ById, Multiple

Method | Parameters | Returns | Description
HideSprite | bool hide | void | Shows or hides the cursor sprite element.
SetCursorSprite | sprite data | void | Sets the cursor sprite image.

Extension Guide

The Targeting Handler is designed for extension via companion components. Add custom targeting logic by creating components that listen to GS_TargetingHandlerNotificationBus on the same entity. Override ProcessClosestInteractable for custom target selection algorithms.


Script Canvas Examples

Getting the current interact target:

Reacting to interact target changes:


See Also

For conceptual overviews and usage guides:

For component references:


Get GS_Interaction

GS_Interaction — Explore this gem on the product page and add it to your project.

4.6.2.2 - Senses & Targeting Fields

Physics-based detection volumes for the targeting system — interaction fields and sensory detection.

Targeting fields are physics-based detection volumes that feed target registrations into the Targeting Handler. The GS_TargetingInteractionFieldComponent provides proximity detection for interact-range targets. Fields use their own collision layers and should be placed on child entities rather than the main unit capsule.

For usage guides and setup examples, see The Basics: GS_Interaction.


GS_TargetingInteractionFieldComponent

A physics trigger volume that detects targets entering and exiting the interaction range. Extends AZ::Component and PhysicsTriggeringVolume. Registers detected entities with the parent Targeting Handler.

API

| Method | Parameters | Returns | Description |
|---|---|---|---|
| TriggerEnter | AZ::EntityId entity | bool | Called when an entity enters the field. Registers the entity as an interaction-range target. |
| TriggerExit | AZ::EntityId entity | bool | Called when an entity exits the field. Unregisters the entity from the interaction-range list. |

Setup

  1. Create a child entity under the unit entity that owns the Targeting Handler.
  2. Add a PhysX collider configured as a trigger shape.
  3. Add the GS_TargetingInteractionFieldComponent.
  4. Set the collider to use a dedicated interaction collision layer — do not share the unit’s physics layer.
  5. Point the field to the parent Targeting Handler entity.

Extension Guide

Custom sensory fields can be created by extending the base field pattern:

  1. Create a component that extends AZ::Component and PhysicsTriggeringVolume.
  2. On trigger enter/exit, call RegisterTarget / UnregisterTarget on the Targeting Handler via GS_TargetingHandlerRequestBus.
  3. Add custom filtering logic (line of sight, tag filtering, team affiliation) in your trigger callbacks.
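The trigger callbacks in steps 2 and 3 might be sketched as follows. The RegisterTarget/UnregisterTarget calls and bus name come from the steps above; the class name, the m_targetingHandlerEntity field, and the PassesTeamFilter helper are hypothetical illustrations.

```cpp
// Hypothetical custom sensory field: forwards registrations to the
// Targeting Handler via GS_TargetingHandlerRequestBus (addressed by id).
bool MySensoryField::TriggerEnter(AZ::EntityId entity)
{
    // Custom filtering, e.g. team affiliation or tag checks.
    if (!PassesTeamFilter(entity)) // hypothetical helper
    {
        return false;
    }

    bool registered = false;
    GS_TargetingHandlerRequestBus::EventResult(
        registered, m_targetingHandlerEntity,
        &GS_TargetingHandlerRequestBus::Events::RegisterTarget, entity);
    return registered;
}

bool MySensoryField::TriggerExit(AZ::EntityId entity)
{
    bool removed = false;
    GS_TargetingHandlerRequestBus::EventResult(
        removed, m_targetingHandlerEntity,
        &GS_TargetingHandlerRequestBus::Events::UnregisterTarget, entity);
    return removed;
}
```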

See Also


Get GS_Interaction

GS_Interaction — Explore this gem on the product page and add it to your project.

4.6.2.3 - Targets

Target component types — base target markers and specialized interact targets that make entities detectable by the targeting system.

Target components mark entities as detectable by the targeting system. The base GS_TargetComponent provides core target data (size, offset, visual properties). Specialized variants like GS_InteractTargetComponent add interaction-specific behavior.

For usage guides and setup examples, see The Basics: GS_Interaction.


GS_TargetComponent

Base target component. Makes an entity visible to the targeting system with configurable visual properties for cursor feedback.

GS_TargetRequestBus

Bus policy: ById, Multiple

| Method | Parameters | Returns | Description |
|---|---|---|---|
| GetTargetSize | — | float | Returns the target’s visual size for cursor scaling. |
| GetTargetOffset | — | AZ::Vector3 | Returns the world-space offset for cursor positioning above the target. |
| GetTargetColour | — | AZ::Color | Returns the target’s highlight colour. |
| GetTargetSprite | — | sprite ref | Returns the target’s cursor sprite override, if any. |

GS_InteractTargetComponent

Extends GS_TargetComponent with interact-specific properties. Entities with this component can be selected as interact targets by the Targeting Handler and triggered via the Interact Input system.


Cross-Gem Target Types

Additional target types are conditionally compiled when GS_Interaction is available alongside other gems:

| Type | Source Gem | Condition | Description |
|---|---|---|---|
| ItemTarget | GS_Item | #IF GS_INTERACTION | Makes item entities targetable for pickup and inspection. |
| UnitTarget | GS_Unit | #IF GS_INTERACTION | Makes unit entities targetable for interaction and AI awareness. |

Extension Guide

Create custom target types by extending GS_TargetComponent:

  1. Create a new component class extending GS_TargetComponent.
  2. Override the target data methods (GetTargetSize, GetTargetOffset, etc.) to return your custom values.
  3. Add any additional data fields specific to your target type.
  4. The targeting system will automatically detect and categorize your custom targets when they enter sensory fields.
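A custom target type following these steps might look like the sketch below. The class, UUID, and values are invented for illustration; the overridden methods are the target data methods from the GS_TargetRequestBus table above, and their exact signatures in the base class are an assumption.

```cpp
// Hypothetical custom target type extending GS_TargetComponent.
class TreasureTarget : public GS_TargetComponent
{
public:
    AZ_COMPONENT(TreasureTarget, "{00000000-0000-0000-0000-000000000002}"); // placeholder UUID

    // Larger cursor for treasure targets.
    float GetTargetSize() override { return 2.0f; }

    // Raise the cursor above the chest lid.
    AZ::Vector3 GetTargetOffset() override { return AZ::Vector3(0.0f, 0.0f, 1.5f); }

    // Gold highlight colour.
    AZ::Color GetTargetColour() override { return AZ::Color(1.0f, 0.85f, 0.1f, 1.0f); }
};
```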

See Also


Get GS_Interaction

GS_Interaction — Explore this gem on the product page and add it to your project.

4.6.2.4 - Interact Input Reader

For usage guides and setup examples, see The Basics: GS_Interaction.

Image showing the InteractInputReader component, as seen in the Entity Inspector.

Interact Input Reader Overview

Extends Input Reader.

This component is a standalone input-detection utility that enables Interact Input firing outside of the GS_Unit Gem.

It takes the Event Name of your chosen interaction input, as defined in your Input Profile, and ignores all other input.

Functionality

The Interact Input Reader must be placed on the same entity as the Targeting Handler.

When the configured Input Event fires, the Interact Input Reader requests the current InteractTarget from the Targeting Handler. It then sends a DoInteractAction event, passing its own EntityId as the caller, to the InteractTriggerSensor component that should be present on the Interact Target.

The World Trigger and Trigger Sensor system takes care of the rest.


Setting Up Your Interact Input Reader

Ensure that the component is placed on the same entity as the Targeting Handler.

Image showing the Interact Input Reader Component alongside the Targeting Handler Component.

Fill the Interact Event Name with the Event Name for your chosen interaction input, as defined in your Input Profile.

As long as the Input Profile set in your GS_OptionsManager defines that event, the rest works as-is.


API

// OptionsManagerNotificationBus
void HandleStartup() override;

// Local Methods (Parent Input Reader Component Class)
void HandleFireInput(AZStd::string eventName, float value) override;

Extending Interact Input Reader


Get GS_Interaction

GS_Interaction — Explore this gem on the product page and add it to your project.

4.6.3 - World Triggers

Trigger sensor and world trigger system — event-driven world state changes via composable polymorphic type objects on two container components.

The World Trigger system pairs a TriggerSensorComponent (condition side) with a WorldTriggerComponent (response side) to create event-driven world state changes. Each component owns an array of polymorphic type objects — sensor types define what conditions to evaluate, trigger types define what happens when the condition passes. Any combination of types can be composed on a single entity without scripting.

For usage guides and setup examples, see The Basics: GS_Interaction.

 

Contents


Architecture

World Trigger Pattern Graph

Breakdown

World Triggers split conditions from responses using a polymorphic type/component pattern. The two container components sit on the entity; logic lives in the type objects they own.

| Part | What It Does |
|---|---|
| TriggerSensorComponent | Owns arrays of TriggerSensorType objects (andConditions, orConditions). Evaluates all conditions and fires WorldTriggerRequestBus::Trigger on success. Automatically activates a PhysicsTriggerComponent if any sensor type requires it. |
| WorldTriggerComponent | Owns an array of WorldTriggerType objects (triggerTypes). On Trigger(), calls Execute() on every type. On Reset(), calls OnReset(). On Activate(), calls OnComponentActivate() for startup initialization. |

No scripting is required for standard patterns. Compose sensor types and trigger types in the editor to cover the majority of interactive world objects.

An “E” marker indicates extensible classes and methods.

Patterns - Complete list of system patterns used in GS_Play.


Trigger Sensors

The sensory side. TriggerSensorComponent holds arrays of TriggerSensorType objects that define evaluation logic — collisions, interactions, record conditions. Supports AND conditions (all must pass) and OR conditions (any must pass).

Trigger Sensors API


World Triggers

The response side. WorldTriggerComponent holds an array of WorldTriggerType objects that execute when triggered — toggling entities, setting records, changing stages, logging messages.

World Triggers API


Extension Guide

Use the ClassWizard templates to generate new sensor and trigger type classes — see GS_Interaction Templates:

  • TriggerSensorType — generates a new TriggerSensorType subclass with EvaluateAction(), EvaluateResetAction(), and lifecycle stubs.
  • WorldTriggerType — generates a new WorldTriggerType subclass with Execute(), OnReset(), and OnComponentActivate() stubs.

Custom Sensor Types

  1. Create a class extending TriggerSensorType.
  2. Override EvaluateAction() to define your condition logic.
  3. Override EvaluateResetAction() if your type supports reset evaluation.
  4. Override Activate(entityId) / Deactivate() for lifecycle setup.
  5. If your type needs a physics trigger volume, return true from NeedsPhysicsTrigger().
  6. For event-driven types, listen on a bus and call TriggerSensorRequestBus::DoAction() or DoResetAction() when the event fires.
  7. Reflect the class manually in GS_InteractionSystemComponent::Reflect().

Custom Trigger Types

  1. Create a class extending WorldTriggerType.
  2. Override Execute(entityId) — implement the world state change here.
  3. Override OnReset(entityId) to reverse or re-arm the effect.
  4. Override OnComponentActivate(entityId) for any initial state that must be set at startup.
  5. Reflect the class manually in GS_InteractionSystemComponent::Reflect().

See Also

For component references:

For related resources:


Get GS_Interaction

GS_Interaction — Explore this gem on the product page and add it to your project.

4.6.3.1 - Trigger Sensors

Trigger sensor component and sensor types — condition evaluation objects for physics, interact, and record-based world trigger activation.

TriggerSensorComponent is the condition container of the World Trigger system. It owns arrays of TriggerSensorType objects that define evaluation logic — collisions, interactions, record checks. When conditions pass, it fires WorldTriggerRequestBus::Trigger on the same entity.

For usage guides and setup examples, see The Basics: GS_Interaction.


Container Component

TriggerSensorComponent

Trigger Sensor component in the O3DE Inspector

Owns and evaluates all sensor type objects. Not subclassed — extended by adding TriggerSensorType objects to its arrays.

Bus: TriggerSensorRequestBus (ById, Single)

| Field | Type | Description |
|---|---|---|
| andConditions | vector<TriggerSensorType*> | All types must pass for the trigger to fire. |
| orConditions | vector<TriggerSensorType*> | Any type passing is sufficient to fire. |

| Method | Returns | Description |
|---|---|---|
| DoAction | void | Evaluates all conditions. On pass, fires WorldTriggerRequestBus::Trigger and TriggerSensorNotificationBus::OnTriggered. |
| DoResetAction | void | Evaluates reset conditions and fires reset if they pass. |

On Activate, automatically activates a PhysicsTriggerComponent on the entity if any sensor type returns true from NeedsPhysicsTrigger().


Base Type Class

TriggerSensorType

Abstract base for all sensor evaluation logic. Not a component — reflected manually in GS_InteractionSystemComponent::Reflect(). Subclass this to add new condition types.

| Virtual | Returns | Description |
|---|---|---|
| Activate(entityId) | void | Called when the owning component activates. Connect to buses here. |
| Deactivate() | void | Called when the owning component deactivates. Disconnect from buses here. |
| EvaluateAction() | bool | Returns true if the condition is currently satisfied. |
| EvaluateResetAction() | bool | Returns true if the reset condition is satisfied. Returns true by default. |
| NeedsPhysicsTrigger() | bool | Return true to signal that a PhysicsTriggerComponent should be activated. Returns false by default. |
| OnTriggerEnter(entity) | void | Called by the physics trigger when an entity enters the volume. |
| OnTriggerExit(entity) | void | Called by the physics trigger when an entity exits the volume. |

Stores m_ownerEntityId for calling back to TriggerSensorRequestBus::DoAction() or DoResetAction() from event-driven types.


Built-in Sensor Types

SensorType_Collider

Physics overlap sensor. Returns NeedsPhysicsTrigger() = true, causing the owning component to activate a PhysicsTriggerComponent.

  • OnTriggerEnter stores the entering entity.
  • OnTriggerExit clears the stored entity.
  • EvaluateAction() returns true while a valid entity is present.
  • EvaluateResetAction() returns true when no entity is present.

SensorType_Record

Fires when a named save record changes to a configured value. Extends RecordKeeperNotificationBus — connects on Activate and fires DoAction automatically when the matching record changes.

  • EvaluateAction() calls GetRecord and compares against recordValue.

→ Record Sensor Type Reference

| Field | Description |
|---|---|
| recordName | Name of the record to watch. |
| recordValue | Value the record must reach to fire. |

SensorType_Interact

Fires when any unit interacts with this entity via the targeting system. Extends InteractTriggerSensorRequestBus.

  • On interact: stores lastCaller and calls DoAction.
  • EvaluateAction() returns true while lastCaller is a valid entity.

SensorType_PlayerInteract

Player-only variant of SensorType_Interact. Extends SensorType_Interact and adds a tag check — lastCaller must have the "Player" tag for the condition to pass.


Extension Guide

To create a custom sensor type:

  1. Create a class extending TriggerSensorType.
  2. Override EvaluateAction() to define when your condition is satisfied.
  3. Override Activate(entityId) / Deactivate() to connect and disconnect from any buses.
  4. For event-driven evaluation, listen on a bus and call TriggerSensorRequestBus::DoAction(m_ownerEntityId) when the event fires — the component will run the full evaluation pipeline.
  5. Return true from NeedsPhysicsTrigger() if your type needs a physics volume.
  6. Reflect the class in GS_InteractionSystemComponent::Reflect() — types are not components and are not registered via CreateDescriptor().
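Steps 1 through 4 might be sketched as the event-driven sensor type below. It is an invented example, not a built-in type: the delay condition, field names, and TickBus usage are illustrative. The m_ownerEntityId callback pattern and the DoAction call are taken from the reference above.

```cpp
// Hypothetical timer sensor type: condition passes after a configurable delay.
class SensorType_Delay
    : public TriggerSensorType
    , protected AZ::TickBus::Handler
{
public:
    void Activate(AZ::EntityId entityId) override
    {
        m_ownerEntityId = entityId; // stored by the base for DoAction callbacks
        m_elapsed = 0.0f;
        AZ::TickBus::Handler::BusConnect();
    }

    void Deactivate() override
    {
        AZ::TickBus::Handler::BusDisconnect();
    }

    bool EvaluateAction() override
    {
        return m_elapsed >= m_delaySeconds;
    }

protected:
    void OnTick(float deltaTime, AZ::ScriptTimePoint /*time*/) override
    {
        m_elapsed += deltaTime;
        if (m_elapsed >= m_delaySeconds)
        {
            AZ::TickBus::Handler::BusDisconnect();
            // Ask the owning component to run the full evaluation pipeline.
            TriggerSensorRequestBus::Event(
                m_ownerEntityId, &TriggerSensorRequestBus::Events::DoAction);
        }
    }

private:
    float m_delaySeconds = 3.0f; // would be a reflected field
    float m_elapsed = 0.0f;
};
```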

See Also


Get GS_Interaction

GS_Interaction — Explore this gem on the product page and add it to your project.

4.6.3.1.1 - SensorType_Record

Record-based sensor type — fires when a named RecordKeeper entry reaches a configured value.

For usage guides and setup examples, see The Basics: GS_Interaction.


Overview

SensorType_Record is a TriggerSensorType that fires when a named save record is changed to a configured value. Add it to the andConditions or orConditions array on a TriggerSensorComponent.

Inherits from RecordKeeperNotificationBus — connects automatically on Activate and listens for record changes without polling. When a matching record change arrives, it calls TriggerSensorRequestBus::DoAction on the owning component, which runs the full condition evaluation pipeline.


Fields

| Field | Type | Description |
|---|---|---|
| recordName | string | Name of the record to watch in the RecordKeeper. |
| recordValue | int | Value the record must reach for the condition to pass. |

Evaluation

EvaluateAction() calls RecordKeeperRequestBus::GetRecord(recordName) and compares the result to recordValue. Returns true if they match.

The type self-triggers on record change events — you do not need to poll or script a check manually.


Setup

  1. Add a TriggerSensorComponent to your entity.
  2. Add a SensorType_Record entry to the andConditions or orConditions array.
  3. Set recordName to the record you want to watch.
  4. Set recordValue to the value that should fire the trigger.
  5. Add a WorldTriggerComponent with the desired WorldTriggerType objects to the same entity.

API

From TriggerSensorType:

//! Returns true if the watched record currently matches recordValue.
virtual bool EvaluateAction();

//! Connects to RecordKeeperNotificationBus on component activate.
virtual void Activate(AZ::EntityId entityId);

//! Disconnects from RecordKeeperNotificationBus on component deactivate.
virtual void Deactivate();

From RecordKeeperNotificationBus:

//! Called when any record changes. Fires DoAction if recordName and recordValue match.
virtual void RecordChanged(const AZStd::string& name, int value);
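Based on the behavior described above, the RecordChanged handler might be implemented along these lines. This is a sketch inferred from the documented semantics, not the shipped source.

```cpp
// Sketch: forward matching record changes to the owning component's
// evaluation pipeline via TriggerSensorRequestBus.
void SensorType_Record::RecordChanged(const AZStd::string& name, int value)
{
    if (name == recordName && value == recordValue)
    {
        // The component then runs the full AND/OR condition evaluation.
        TriggerSensorRequestBus::Event(
            m_ownerEntityId, &TriggerSensorRequestBus::Events::DoAction);
    }
}
```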

Get GS_Interaction

GS_Interaction — Explore this gem on the product page and add it to your project.

4.6.3.2 - World Triggers

World trigger component and trigger types — response execution objects that change world state when a trigger sensor fires.

WorldTriggerComponent is the response container of the World Trigger system. It owns an array of WorldTriggerType objects that execute when triggered — toggling entities, setting records, changing stages, logging messages.

For usage guides and setup examples, see The Basics: GS_Interaction.


Container Component

WorldTriggerComponent

World Trigger component in the O3DE Inspector

Owns and drives all trigger type objects. Not subclassed — extended by adding WorldTriggerType objects to triggerTypes.

Bus: WorldTriggerRequestBus (ById)

| Field | Type | Description |
|---|---|---|
| triggerTypes | vector<WorldTriggerType*> | All types are executed on each trigger or reset event. |

| Method | Description |
|---|---|
| Trigger() | Calls Execute(entityId) on every type in triggerTypes. |
| Reset() | Calls OnReset(entityId) on every type in triggerTypes. |

On Activate(), calls OnComponentActivate(entityId) on every type — use this for initial state seeding (e.g. entity visibility setup before any trigger fires).


Base Type Class

WorldTriggerType

Abstract base for all trigger response logic. Not a component — reflected manually in GS_InteractionSystemComponent::Reflect(). Subclass this to add new response types.

| Virtual | Parameters | Description |
|---|---|---|
| OnComponentActivate(entityId) | AZ::EntityId | Called at component activation for startup initialization. |
| Execute(entityId) | AZ::EntityId | Called on Trigger(). Implement the world state change here. |
| OnReset(entityId) | AZ::EntityId | Called on Reset(). Reverse or re-arm the effect here. |

Built-in Trigger Types

| Type Class | Response |
|---|---|
| TriggerType_PrintLog | Prints logMessage to the development log on Execute. |
| TriggerType_SetRecord | Sets recordName to recordProgress via the RecordKeeper on Execute. |
| TriggerType_ToggleEntities | Toggles toggleEntities[] active/inactive. OnComponentActivate seeds initial state from startActive. Execute and OnReset invert the state each call. |
| TriggerType_ChangeStage | Queues StageManagerRequestBus::ChangeStageRequest(targetStageName, targetExitPoint) on the next tick via Execute. |

Extension Guide

To create a custom trigger type:

  1. Create a class extending WorldTriggerType.
  2. Override Execute(entityId) — implement your world state change here.
  3. Override OnReset(entityId) for reversible or re-armable effects.
  4. Override OnComponentActivate(entityId) if your type needs to set up initial state at startup (e.g. TriggerType_ToggleEntities uses this to seed entity visibility).
  5. Reflect the class in GS_InteractionSystemComponent::Reflect() — types are not components and are not registered via CreateDescriptor().
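A custom trigger type following these steps might look like the sketch below. The sound-playing type is invented for illustration (the actual audio call is elided), and field names are hypothetical; only the three virtual overrides come from the reference above.

```cpp
// Hypothetical trigger type that fires a one-shot effect on Execute.
class TriggerType_PlaySound : public WorldTriggerType
{
public:
    void OnComponentActivate(AZ::EntityId /*entityId*/) override
    {
        m_played = false; // seed initial state at startup
    }

    void Execute(AZ::EntityId entityId) override
    {
        // World state change goes here, e.g. posting m_audioEventName
        // on an audio bus addressed at entityId (call elided).
        m_played = true;
    }

    void OnReset(AZ::EntityId /*entityId*/) override
    {
        m_played = false; // re-arm so the effect can fire again
    }

private:
    AZStd::string m_audioEventName; // would be a reflected field
    bool m_played = false;
};
```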

See Also


Get GS_Interaction

GS_Interaction — Explore this gem on the product page and add it to your project.

4.6.4 - Templates

ClassWizard templates for GS_Interaction — Pulsor pulses, reactors, world triggers, and trigger sensors.

All GS_Interaction extension types are generated through the ClassWizard CLI. The wizard handles UUID generation, cmake file-list registration, and reflection automatically.

For usage guides and setup examples, see The Basics: GS_Interaction.

python ClassWizard.py \
    --template <TemplateName> \
    --gem <GemPath> \
    --name <SymbolName> \
    [--input-var key=value ...]

 

Contents


Pulsor Pulse

Template: PulsorPulse

Creates a Pulse — the sender side of the Pulsor system. A Pulse fires on a named channel and carries a typed payload. Any PulsorReactor on the same entity listening to the same channel will receive it.

Generated files:

  • Source/${Name}_Pulse.h/.cpp

CLI:

python ClassWizard.py --template PulsorPulse --gem <GemPath> --name <Name> \
    --input-var type_display_name="My Pulse" \
    --input-var type_category="Input"

Input vars:

| Var | Type | Default | Description |
|---|---|---|---|
| type_display_name | text | ${Name} | Label shown in the Pulse type dropdown in the editor |
| type_category | text | (empty) | Grouping category in the dropdown (e.g. Input, Physics, Timer) |

Post-generation: Implement Fire() with the data payload your channel requires. Match the channel string to the Reactor that will receive it. Each Pulse/Reactor pair is one interaction channel — channels are string-matched.

See also: Pulsors — full pulsor architecture, built-in pulse types, and extension guide.


Pulsor Reactor

Template: PulsorReactor

Creates a Reactor — the receiver side of the Pulsor system. Listens on a specific named channel and executes a response when a matching Pulse fires on the same entity.

Generated files:

  • Source/${Name}_Reactor.h/.cpp

CLI:

python ClassWizard.py --template PulsorReactor --gem <GemPath> --name <Name> \
    --input-var pulse_channel=MyChannel \
    --input-var type_display_name="My Reactor" \
    --input-var type_category="Input"

Input vars:

| Var | Type | Required | Description |
|---|---|---|---|
| pulse_channel | text | yes | The channel name this reactor subscribes to — must match the Pulse’s channel |
| type_display_name | text | no | Label shown in the Reactor type dropdown |
| type_category | text | no | Grouping category in the dropdown |

Important: pulse_channel is required. The channel string is baked into the header at generation time. If you need to change it later, edit the m_channel field directly in the generated header.

Post-generation: Implement OnPulse(...) with the response logic. Multiple Reactor instances of different types can coexist on one entity, each listening to a different channel.

See also: Pulsors — full pulsor architecture, built-in reactor types, and extension guide.


World Trigger

Template: WorldTrigger

Creates a WorldTrigger component — a logical trigger in the world that fires a named event when activated. Used to start dialogue sequences, cutscenes, or other scripted events from gameplay code or Script Canvas.

Generated files:

  • Source/${Name}_WorldTrigger.h/.cpp

CLI:

python ClassWizard.py --template WorldTrigger --gem <GemPath> --name <Name>

Post-generation: Wire the trigger activation to whatever condition suits the design (physics overlap, input event, distance check, etc.). Override Trigger() — call the parent first to check channels and conditions. Override Reset() for reversible triggers.

See also: World Triggers — full world trigger architecture and extension guide.


Trigger Sensor

Template: TriggerSensor

Creates a TriggerSensor — the listening side of a trigger pair. Registers to receive named trigger events from a WorldTrigger and executes a response action.

Generated files:

  • Source/${Name}_TriggerSensor.h/.cpp

CLI:

python ClassWizard.py --template TriggerSensor --gem <GemPath> --name <Name>

Post-generation: Subscribe to the matching trigger event name in Activate(). Override EvaluateAction() to define your trigger condition. Override DoAction() — call the parent first, then add custom behavior if it succeeds. Stack sensors to react to the same trigger in different ways.

See also: Trigger Sensors — full trigger sensor reference.


See Also

For the full API, component properties, and C++ extension guide:

For all ClassWizard templates across GS_Play gems:


Get GS_Interaction

GS_Interaction — Explore this gem on the product page and add it to your project.

4.6.5 - 3rd Party Implementations

For usage guides and setup examples, see The Basics: GS_Interaction.


Get GS_Interaction

GS_Interaction — Explore this gem on the product page and add it to your project.

4.7 - GS_Juice

Game feel and feedback motion system — GS_Motion extension with transform and material tracks for visual feedback effects.

GS_Juice provides motion-based visual feedback effects for game feel. It extends the GS_Motion system with feedback-specific track types — transform animation (position, scale, rotation) and material property animation (opacity, emissive, color). Effects are authored as .feedbackmotion data assets and played by FeedbackEmitter components on entities.

For usage guides and setup examples, see The Basics: GS_Juice.

 

Contents


Feedback System

The feedback system follows the standard GS_Motion domain extension pattern. FeedbackMotionTrack is the domain base class, with two concrete track types. FeedbackMotionAsset holds the track definitions. FeedbackMotion is the instance wrapper struct that manages the runtime lifecycle. FeedbackEmitter is the component that plays effects.

| Component / Type | Purpose |
|---|---|
| FeedbackEmitter | Component — plays a feedback motion on its entity or a target entity. |
| FeedbackMotion | Instance wrapper struct — asset reference, proxies, runtime composite. |
| FeedbackMotionAsset | Data asset (.feedbackmotion) — holds vector<FeedbackMotionTrack*>. |
| FeedbackMotionTrack | Domain base track — extends GS_Core::GS_MotionTrack. |
| FeedbackTransformTrack | Concrete track — position (Vector2Gradient), scale (FloatGradient), rotation (FloatGradient). |
| FeedbackMaterialTrack | Concrete track — opacity (FloatGradient), emissive (FloatGradient), color (ColorGradient). |

Feedback API


PostProcessing (Planned)

Planned for post-processing feedback effects (screen distortion, color grading, vignette). Not yet implemented.


System Components

| Component | Purpose |
|---|---|
| GS_JuiceSystemComponent | Runtime system component for GS_Juice. |

Dependencies

  • GS_Core (required — provides GS_Motion base system)

Installation

  1. Enable the GS_Juice gem in your project configuration.
  2. Ensure GS_Core is also enabled.
  3. Create .feedbackmotion assets in the O3DE Asset Editor.
  4. Add FeedbackEmitter components to entities that need feedback effects.

See Also

For conceptual overviews and usage guides:

For sub-system references:

For related resources:


Get GS_Juice

GS_Juice — Explore this gem on the product page and add it to your project.

4.7.1 - Feedback System

Feedback motion tracks, FeedbackEmitter component, and FeedbackMotionAsset reference.

The Feedback System provides the concrete implementation of GS_Juice’s game feel effects. It includes the FeedbackEmitter component for playback, FeedbackMotionAsset for data storage, and two concrete track types for transform and material animation.

For usage guides and setup examples, see The Basics: GS_Juice.

FeedbackEmitter component in the O3DE Inspector

 

Contents


FeedbackEmitter

The FeedbackEmitter component plays a Feedback Motion on its entity or on a specified target entity.

Inspector Properties

| Property | Type | Description |
|---|---|---|
| FeedbackMotion | FeedbackMotion | The motion instance — holds the asset reference, proxy list, and runtime composite. |
| playOnActivate | bool | When true, the motion plays automatically when the entity activates. |

API Reference

| Method | Parameters | Returns | Description |
|---|---|---|---|
| Play | — | void | Plays the feedback motion on the owning entity. |
| PlayOnTarget | AZ::EntityId target | void | Plays the feedback motion on a different entity. |
| Stop | — | void | Stops the currently playing motion. |
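Called from C++, these methods might be invoked as in the sketch below. The bus name FeedbackEmitterRequestBus is an assumption for illustration; only the method names and parameters come from the table above.

```cpp
// Hypothetical usage: play a feedback effect on a hit entity, then stop it.
// FeedbackEmitterRequestBus is an assumed bus name, not a confirmed API.
void PlayHitFlash(AZ::EntityId emitterEntity, AZ::EntityId victim)
{
    FeedbackEmitterRequestBus::Event(
        emitterEntity, &FeedbackEmitterRequestBus::Events::PlayOnTarget, victim);
}

void StopHitFlash(AZ::EntityId emitterEntity)
{
    FeedbackEmitterRequestBus::Event(
        emitterEntity, &FeedbackEmitterRequestBus::Events::Stop);
}
```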

FeedbackMotionAsset

The data asset that defines a feedback effect. Created and edited in the O3DE Asset Editor with the .feedbackmotion extension. Extends GS_Core::GS_MotionAsset, which derives from AZ::Data::AssetData. Requires GS_AssetReflectionIncludes.h when reflecting — see Serialization Helpers.

Properties

| Property | Type | Description |
|---|---|---|
| m_tracks | vector<FeedbackMotionTrack*> | The tracks in this motion. |
| motionName | AZStd::string | Display name for the motion. |
| loop | bool | Whether the motion loops. |

Methods

| Method | Returns | Description |
|---|---|---|
| GetTrackInfos | vector<GS_TrackInfo> | Returns track UUID + label pairs for proxy sync. |
| CreateRuntimeComposite | GS_MotionComposite* | Creates a deep-copy runtime instance of all tracks. |

To understand how to author Feedback Motions, refer to Feedback: Authoring Feedback Motions


FeedbackTransformTrack

Animates entity transform properties. All fields use gradient types for full curve control.

| Field | Type | What It Animates |
|---|---|---|
| Position | Vector2Gradient | XY offset from origin (additive). |
| Scale | FloatGradient | Uniform scale factor over time. |
| Rotation | FloatGradient | Rotation angle over time. |
The track captures the entity’s origin transform on Init() and applies offsets relative to it. Effects are additive — multiple transform tracks on the same entity stack correctly.


FeedbackMaterialTrack

Animates entity material properties via the render component.

| Field | Type | What It Animates |
|---|---|---|
| Opacity | FloatGradient | Material opacity. |
| Emissive | FloatGradient | Emissive intensity. |
| Color | ColorGradient | Color tint. |

Adding Custom Tracks

Use the FeedbackMotionTrack ClassWizard template to generate a new world-space track — see GS_Juice Templates. The only manual step after generation is adding a Reflect(context) call in {$Gem}DataAssetSystemComponent.cpp. Once reflected, the new track type is discovered automatically and appears in the asset editor type picker.

Override Init(ownerEntityId) to cache entity context, and Update(easedProgress) to drive the target property via its bus.

void ${Custom}Track::Init(AZ::EntityId ownerEntity)
{
    // Always call the base first; it sets m_owner.
    GS_Juice::FeedbackMotionTrack::Init(ownerEntity);

    // Cache the entity's origin value so the track can apply additive offsets.
    // Example:
    //   AZ::TransformBus::EventResult(m_originPosition, ownerEntity, &AZ::TransformInterface::GetWorldTranslation);
    //   AZ::TransformBus::EventResult(m_originRotation, ownerEntity, &AZ::TransformInterface::GetWorldRotationQuaternion);
}

void ${Custom}Track::Update(float easedProgress)
{
    // Evaluate the gradient at the current eased progress and apply to the entity.
    // easedProgress is in [0, 1] and already passed through the track's CurveType.
    // Example:
    //   const AZ::Vector3 offset = valueGradient.Evaluate(easedProgress);
    //   AZ::TransformBus::Event(m_owner, &AZ::TransformInterface::SetWorldTranslation, m_originPosition + offset);
}

See Also

For conceptual overviews and usage guides:

For component references:

For related resources:


Get GS_Juice

GS_Juice — Explore this gem on the product page and add it to your project.

4.7.2 - Third Party Implementations

Integration guides for third-party feedback systems with GS_Juice.

This section will contain integration guides for connecting third-party game feel and feedback tools with the GS_Juice system.

For usage guides and setup examples, see The Basics: GS_Juice.


Get GS_Juice

GS_Juice — Explore this gem on the product page and add it to your project.

4.7.3 - Templates

ClassWizard templates for GS_Juice — custom feedback motion tracks for world-space game-feel effects.

All GS_Juice extension types are generated through the ClassWizard CLI. The wizard handles UUID generation and cmake file-list registration automatically.

For usage guides and setup examples, see The Basics: GS_Juice.

python ClassWizard.py \
    --template <TemplateName> \
    --gem <GemPath> \
    --name <SymbolName> \
    [--input-var key=value ...]

 

Contents


Feedback Motion Track

Template: FeedbackMotionTrack

Creates a custom animation track for GS_Juice feedback sequences. A FeedbackMotionTrack child animates a world-space entity property (transform, material, audio, particle, etc.) via O3DE buses over a normalised [0,1] eased progress value. Tracks are referenced from GS_FeedbackSequence data assets.

Generated files:

  • Include/${GemName}/GS_Feedback/MotionTracks/${Name}Track.h
  • Source/GS_Feedback/MotionTracks/${Name}Track.cpp

CLI:

python ClassWizard.py --template FeedbackMotionTrack --gem <GemPath> --name <Name>

Post-generation — manual registration required:

In GS_JuiceDataAssetSystemComponent.cpp, add:

#include <path/to/${Name}Track.h>
// inside Reflect(context):
${Name}Track::Reflect(context);

Extensibility: Same polymorphic pattern as UiMotionTrack — any number of track types, discovered automatically via EnumerateDerived. World-space tracks differ from UI tracks in that they act on entity buses (TransformBus, material buses, etc.) rather than LyShine element interfaces. Init(ownerEntityId) caches the entity context; Update(easedProgress) drives the property.

See also: Feedback — the full feedback system architecture, built-in track types, and domain extension pattern.


See Also

For the full API, component properties, and C++ extension guide:

For all ClassWizard templates across GS_Play gems:


Get GS_Juice

GS_Juice — Explore this gem on the product page and add it to your project.

4.8 - GS_Item

Data-driven inventory containers and equipment slot management for collectible, usable, and equippable items in GS_Play projects.

GS_Item provides the inventory and equipment systems for GS_Play. It gives entities inventory containers that hold stacks of item data, and an equipment system that maps items to named slots on a character entity. Items are defined as data assets, making the full catalog editable without code changes. GS_Item is under active development — the API surface is evolving and additional item behaviours will be added in future releases.

For usage guides and setup examples, see The Basics: GS_Item.

 

Contents


Inventory

The inventory system manages ordered collections of item stacks on an entity. Stacks track item type, quantity, and metadata. The inventory component exposes add, remove, query, and transfer operations and broadcasts notifications when contents change.

| Area | Contents |
|---|---|
| Inventory Component | Per-entity container that holds item stacks and manages quantity limits. |
| Item Data Assets | Data-driven item definitions: display name, icon, stack limit, categories, and custom properties. |
| Stack Operations | Add, remove, transfer, split, and query by item type or category. |
| Notifications | Change events broadcast when items are added, removed, or quantities updated. |
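The stacking behaviour described above (fill existing stacks up to the item's stack limit, then open new ones) can be sketched in plain C++. This is a toy model for illustration, not the GS_Item API; the type and member names are assumptions, and in the real system the stack limit comes from the item data asset.

```cpp
#include <cassert>
#include <string>
#include <vector>

struct ItemStack
{
    std::string itemId;
    int quantity = 0;
};

struct Inventory
{
    int stackLimit;               // would come from the item data asset
    std::vector<ItemStack> stacks;

    void Add(const std::string& itemId, int amount)
    {
        // Top up existing partial stacks of the same type first...
        for (ItemStack& s : stacks)
        {
            if (s.itemId == itemId && s.quantity < stackLimit)
            {
                int room = stackLimit - s.quantity;
                int moved = amount < room ? amount : room;
                s.quantity += moved;
                amount -= moved;
            }
        }
        // ...then open new stacks for whatever remains.
        while (amount > 0)
        {
            int moved = amount < stackLimit ? amount : stackLimit;
            stacks.push_back({itemId, moved});
            amount -= moved;
        }
    }

    int Count(const std::string& itemId) const
    {
        int total = 0;
        for (const ItemStack& s : stacks)
            if (s.itemId == itemId) total += s.quantity;
        return total;
    }
};
```

A real inventory component would also broadcast a change notification after each mutation, as the Notifications row above describes.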

Inventory API


Equipment

The equipment system maps item assets to named slots on a character or entity. Equipping an item occupies its designated slot and can trigger stat modifications via GS_RPStats integration. Unequipping returns the item to inventory or discards it based on configured policy.

| Area | Contents |
|---|---|
| Equipment Component | Per-entity slot manager. Maps slot names to currently equipped item assets. |
| Slot Definitions | Named slot configuration (e.g. weapon_main, armour_chest, accessory_1) defined per entity type. |
| Equip / Unequip Flow | Validates slot compatibility, optionally modifies stats via GS_RPStats, and notifies listeners. |
| Notifications | Change events broadcast when slots are filled, cleared, or swapped. |

Equipment API


Installation

GS_Item is a standalone gem. GS_RPStats integration is optional but recommended for stat-modifying equipment.

  1. Enable GS_Item and GS_Core in your O3DE project’s gem list.
  2. Create item data assets in your project’s asset directory and populate them in the Editor.
  3. Add the inventory component to any entity (player, chest, NPC) that should hold items.
  4. Add the equipment component to character entities that support equippable slots.
  5. Configure slot names on the equipment component to match your game’s equipment schema.
  6. Optionally enable GS_RPStats to wire equipment stat modifiers into the character stat pipeline.

Note: GS_Item is under active development. Check the changelog for the latest additions before starting integration work.


See Also

For conceptual overviews and usage guides:

For sub-system references:

For related resources:


Get GS_Item

GS_Item — Explore this gem on the product page and add it to your project.

4.8.1 - 3rd Party Implementations

For usage guides and setup examples, see The Basics: GS_Item.


Get GS_Item

GS_Item — Explore this gem on the product page and add it to your project.

4.8.2 - Equipment

For usage guides and setup examples, see The Basics: GS_Item.


Get GS_Item

GS_Item — Explore this gem on the product page and add it to your project.

4.8.3 - Inventory

For usage guides and setup examples, see The Basics: GS_Item.


Get GS_Item

GS_Item — Explore this gem on the product page and add it to your project.

4.9 - GS_Performer

Modular character rendering via skin slots, paper billboard performers, velocity-driven locomotion, head tracking, and audio babble integration.

GS_Performer is the character rendering and presentation gem for GS_Play. It provides a slot-based skinning system for composing modular character appearances at runtime, a paper (billboard) performer system for 2.5D and top-down projects, velocity-driven locomotion parameter hooks, head tracking, and audio babble. The gem depends on GS_Core and integrates optionally with EMotionFX and GS_Unit.

For usage guides and setup examples, see The Basics: GS_Performer.

 

Contents


Manager

The Performer Manager is a GS_Play manager that owns the global performer registry and coordinates skin slot configuration across the level.

| Component | Purpose |
|---|---|
| GS_PerformerManagerComponent | GS_Play manager. Global performer registry. Coordinates skin slot profiles and performer lookups. |

Performer API


Skin Slots

A slot-based system for composing character appearance from swappable actor meshes and material sets. Each slot holds an actor asset and material list. The handler component manages the full set of slots for a character and applies PerformerSkinSlotsConfigProfile presets.

| Component / Asset | Purpose |
|---|---|
| PerformerSkinSlotComponent | Individual skin slot. Holds one actor mesh and its material overrides. Addressed by slot ID. |
| SkinSlotHandlerComponent | Manages the full collection of skin slots for a character entity. Applies profile presets. |
| SkinSlotData | Data structure holding actor asset and material asset list for a single slot. |
| PerformerSkinSlotsConfigProfile | Asset class defining a named preset of skin slot assignments. Loaded and applied at runtime. |

Skin Slot API


Paper Performer

Billboard-based character rendering for 2.5D and top-down games. The facing handler keeps sprite quads oriented toward the camera or a configurable facing target. A camera-aware variant (in GS_Complete) extends this with PhantomCam integration.

| Component | Purpose |
|---|---|
| PaperFacingHandlerBaseComponent | Abstract base for billboard facing logic. Extend to implement custom facing strategies. |
| PaperFacingHandlerComponent | Concrete paper-facing implementation. Orients the entity quad toward the active camera each tick. |

Paper Performer API


Locomotion

Velocity-driven animation parameter hooks. The locomotion component samples entity velocity each tick and pushes values into the animation graph, driving blend trees without manual parameter management.

| Component | Purpose |
|---|---|
| VelocityLocomotionHookComponent | Reads entity velocity and writes locomotion parameters to the EMotionFX animation graph each tick. |
| PrefabAnimAssetsReloaderComponent | Hot-reloads prefab animation assets during development without requiring a full level restart. |

Performer API


Head Tracking

Procedural head look-at targeting for performer entities. Drives bone orientation toward a world-space target using configurable angle limits and spring damping.

Head Tracking API


Babble

Audio babble synchronized to dialogue typewriter output. The Babble component generates procedural vocalisation tones keyed to speaker identity and tone configuration, complementing the Typewriter system in GS_Cinematics.

Babble API


Installation

GS_Performer requires GS_Core. EMotionFX and GS_Unit are optional integrations.

  1. Enable GS_Performer and GS_Core in your O3DE project’s gem list.
  2. Add GS_PerformerManagerComponent to your Game Manager prefab and include it in the Startup Managers list.
  3. For skin slots, add SkinSlotHandlerComponent to the character root entity, then add one PerformerSkinSlotComponent per modular slot.
  4. For paper performers, add PaperFacingHandlerComponent to sprite entities that should billboard toward the camera.
  5. For locomotion, add VelocityLocomotionHookComponent to entities with an EMotionFX actor and configure the parameter name bindings.

See Also

For conceptual overviews and usage guides:

For sub-system references:

For related resources:


Get GS_Performer

GS_Performer — Explore this gem on the product page and add it to your project.

4.9.1 - Performer Manager & Skin Slots

Performer manager lifecycle, skin slot system for modular character appearance, and config profile assets.

The Performer Manager and Skin Slot system provide modular character appearance management at runtime. The manager handles global performer registration and lookup. Skin slots define individual mesh/material combinations that can be swapped independently, enabling runtime costume and equipment changes.

For usage guides and setup examples, see The Basics: GS_Performer.

 

Contents


GS_PerformerManagerComponent

Singleton manager extending GS_ManagerComponent. Handles global performer registration, coordinates skin slot configuration across the level, and provides performer lookup.

Bus: PerformerManagerRequestBus (Single, Single)


Skin Slot System

Skin Slot Configuration Profile in the O3DE Asset Editor

PerformerSkinSlotComponent

Individual skin slot component. Each slot holds one actor mesh asset and its material overrides. Addressed by slot ID.

Bus: PerformerSkinSlotRequestBus (ById, Single)

| Property | Type | Description |
|---|---|---|
| Slot Name | AZStd::string | Identifier for this skin slot. |
| Skin Data | SkinSlotData | Actor asset and material list for this slot. |

SkinSlotHandlerComponent

Manages the complete collection of skin slots for a character entity. Applies PerformerSkinSlotsConfigProfile presets to swap entire outfits at once.

Bus: SkinSlotHandlerRequestBus (ById, Single)


Data Types

| Type | Description |
|---|---|
| SkinSlotData | Actor asset reference + material asset list for a single slot. |
| SkinSlotNameDataPair | Slot name string + SkinSlotData pair for serialization. |
| PerformerSkinSlotsConfigProfile | Asset class defining a named preset of skin slot assignments. Loaded and applied at runtime for outfit/costume changes. |

VelocityLocomotionHookComponent

Velocity Locomotion Hook component in the O3DE Inspector

Reads entity velocity each tick and writes locomotion parameters (speed, direction) to the EMotionFX animation graph. Drives blend trees automatically without manual parameter management.

| Property | Type | Description |
|---|---|---|
| Speed Parameter | AZStd::string | The EMotionFX parameter name to write speed values into. |
| Direction Parameter | AZStd::string | The EMotionFX parameter name to write direction values into. |

PrefabAnimAssetsReloaderComponent

Hot-reloads prefab animation assets during development without requiring a full level restart.


Setup

  1. Add GS_PerformerManagerComponent to the Game Manager prefab and include it in the Startup Managers list.
  2. Add SkinSlotHandlerComponent to the character root entity.
  3. Add one PerformerSkinSlotComponent per modular slot (head, body, arms, legs, etc.) to child entities.
  4. Create PerformerSkinSlotsConfigProfile assets in the Asset Editor for each outfit configuration.
  5. For locomotion, add VelocityLocomotionHookComponent to entities with an EMotionFX actor.

Extension Guide

The Skin Slot system uses the companion component pattern. Custom slot behavior (equipment integration, visual effects on swap) should be added as companion components that listen to SkinSlotHandler bus events on the same entity.


See Also

For related component references:

For conceptual overviews and usage guides:


Get GS_Performer

GS_Performer — Explore this gem on the product page and add it to your project.

4.9.2 - Performers

Billboard and 3D character entity types — PaperFacingHandlerComponent for 2.5D rendering and the Avatar pipeline for rigged EMotionFX characters.

The Performers sub-system defines the character entity types in GS_Performer. Each type encapsulates a complete rendering strategy — from the lightweight billboard paper performer to the full 3D avatar pipeline integrating EMotionFX, skin slots, and locomotion.

The performer type determines the fundamental visual structure of the character entity. Performer Features then layer onto it, independent of type.

For usage guides and setup examples, see The Basics: Performers.

 

Contents


Core Pattern

Each performer type is a self-contained entity configuration. The Paper Performer requires only sprite geometry and a facing handler component. The Avatar Performer drives an EMotionFX actor and composes with skin slots and the locomotion hook. Both types participate in Performer Manager registration and support the full Performer Features set.


Paper Performer

Billboard-based character rendering for 2.5D and top-down games. The facing handler orients the sprite quad toward the active camera or a configurable target each tick. The base component allows custom facing strategies through extension.

| Component | Purpose | Reference |
|---|---|---|
| PaperFacingHandlerBaseComponent | Abstract base for billboard facing logic. Extend to implement custom facing strategies. | Paper Performer |
| PaperFacingHandlerComponent | Concrete implementation. Orients the entity quad toward the active camera each tick. | Paper Performer |

Paper Performer API


Avatar Performer

The full 3D character pipeline. Drives a rigged EMotionFX actor with support for skin slot equipment, velocity locomotion parameter binding, and animation asset hot-reloading during development.

| Component | Purpose | Reference |
|---|---|---|
| VelocityLocomotionHookComponent | Reads entity velocity each tick and writes speed and direction blend parameters to the EMotionFX animation graph. | Avatar Performer |
| PrefabAnimAssetsReloaderComponent | Hot-reloads prefab animation assets without a level restart. Development utility. | Avatar Performer |

Avatar Performer API


Extension Guide

Custom paper facing strategies are supported by extending PaperFacingHandlerBaseComponent. Override the tick method to compute any desired facing rotation and apply it to the entity transform each frame. See Paper Performer API for the full extension walkthrough.


See Also

For conceptual overviews and usage guides:

For related component references:


Get GS_Performer

GS_Performer — Explore this gem on the product page and add it to your project.

4.9.2.1 - Avatar Performer

How to work with GS_Play avatar performers.

Get GS_Performer

GS_Performer — Explore this gem on the product page and add it to your project.

4.9.2.2 - Paper Performer

Billboard-based character rendering — paper facing handlers that orient sprite quads toward the camera.

The Paper Performer system provides billboard-based character rendering for 2.5D and top-down games. Facing handler components keep sprite quads oriented toward the camera or a configurable facing target each tick.

For usage guides and setup examples, see The Basics: GS_Performer.

 

Contents


PaperFacingHandlerBaseComponent

Abstract base component for billboard facing logic. Extend this to implement custom facing strategies (face camera, face movement direction, face target entity).


PaperFacingHandlerComponent

Concrete paper-facing implementation. Orients the entity’s quad toward the active camera each tick. Suitable for standard 2.5D billboard characters.

| Property | Type | Description |
|---|---|---|
| Face Mode | enum | How the billboard faces: toward camera, along movement direction, or toward a target. |

Camera-Aware Variant

A camera-aware paper facing handler is provided by GS_Complete as a cross-gem component (PhantomCam + Performer):

| Component | Gems | Description |
|---|---|---|
| CamCorePaperFacingHandlerComponent | Performer + PhantomCam | Uses CamCore notifications to adjust paper performer facing direction relative to the active phantom camera. |

See Utility: Angles Helper for details.


Extension Guide

Create custom facing strategies by extending PaperFacingHandlerBaseComponent:

  1. Create a class extending PaperFacingHandlerBaseComponent.
  2. Override the tick/update method to compute your desired facing rotation.
  3. Apply the rotation to the entity transform each frame.
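The steps above can be sketched with a yaw-only "face the camera" strategy. The base-class name comes from the docs, but the math helper below is plain C++ for illustration, not the AZ math library, and the flat-ground yaw convention is an assumption.

```cpp
#include <cassert>
#include <cmath>

// Step 2 of the extension guide: compute the facing rotation. Here the
// quad's forward axis is pointed along the entity-to-camera direction,
// projected onto the ground plane (a yaw-only billboard).
float ComputeBillboardYawDeg(float entityX, float entityY,
                             float camX, float camY)
{
    float dx = camX - entityX;
    float dy = camY - entityY;
    return std::atan2(dy, dx) * 180.0f / 3.14159265f;
}
```

In an actual subclass, the tick override would feed this angle into the entity transform each frame (step 3); other strategies simply substitute a different target point, such as the movement direction or a target entity.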

See Also

For related component references:

For conceptual overviews and usage guides:

For related resources:


Get GS_Performer

GS_Performer — Explore this gem on the product page and add it to your project.

4.9.3 - Performer Features

Universal capabilities for any performer entity — procedural head tracking, typewriter-synchronized babble, and mesh swapping.

Performer Features are components that apply to any performer entity, independent of performer type. They represent capabilities that layer onto the character rather than define it — procedural bone targeting via head tracking, typewriter-synchronized vocalization via babble, and runtime mesh swapping.

The performer type (Paper or Avatar) defines the character’s rendering pipeline. Performer Features extend what that character can do within it.

For usage guides and setup examples, see The Basics: Performer Features.

 

Contents


Head Tracking

Procedural look-at targeting for performer bones. Drives bone orientation toward a world-space target each tick using configurable angle limits and spring damping, enabling characters to naturally track targets, speakers, or points of interest.

| Component | Purpose | Reference |
|---|---|---|
| Head Tracking Component | Reads a target position each tick and computes bone rotation within clamped horizontal and vertical limits, smoothed by spring damping. | Head Tracking |

Head Tracking API


Babble

Procedural vocalization tones synchronized to Typewriter text output. Fires audio tone events on each character reveal, keyed to speaker identity, creating the characteristic character-voice effect.

| Component | Purpose | Reference |
|---|---|---|
| BabbleComponent | Generates babble tones on OnTypeFired events. Returns BabbleToneEvent for audio playback. Speaker mapped via SpeakerBabbleEvents. | Babble |

Babble API


Mesh Swap

Runtime mesh and material swapping for performer entities. Provides lightweight visual variation without the full slot-based equipment system.

Mesh Swap API


See Also

For conceptual overviews and usage guides:

For related component references:


Get GS_Performer

GS_Performer — Explore this gem on the product page and add it to your project.

4.9.3.1 - Locomotion

Velocity-driven animation hooks and prefab animation asset reloading for performer entities.

The Locomotion system bridges entity movement to animation. The VelocityLocomotionHookComponent samples entity velocity each tick and pushes values into the EMotionFX animation graph, driving blend trees without manual parameter management. The PrefabAnimAssetsReloaderComponent supports hot-reloading animation assets during development.

For usage guides and setup examples, see The Basics: GS_Performer.

 

Contents


VelocityLocomotionHookComponent

Reads the entity’s current velocity each tick and writes locomotion parameters (speed, direction) to the EMotionFX animation graph. This drives blend trees automatically — characters walk, run, and idle based on their actual movement speed.

| Property | Type | Description |
|---|---|---|
| Speed Parameter | AZStd::string | The EMotionFX parameter name to write speed values into. |
| Direction Parameter | AZStd::string | The EMotionFX parameter name to write direction values into. |
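What the hook computes each tick can be sketched in plain C++: planar speed from the velocity magnitude, and a signed direction angle relative to the character's facing. This is an illustrative model only; the exact angle convention and parameter semantics are assumptions, not the component's documented behaviour.

```cpp
#include <cassert>
#include <cmath>

struct LocomotionParams
{
    float speed;        // metres per second, XY plane
    float directionDeg; // signed angle between facing and velocity
};

LocomotionParams ComputeLocomotion(float velX, float velY, float facingDeg)
{
    LocomotionParams out{};
    out.speed = std::sqrt(velX * velX + velY * velY);
    if (out.speed > 1e-4f)   // direction is undefined when stationary
    {
        float moveDeg = std::atan2(velY, velX) * 180.0f / 3.14159265f;
        float delta = moveDeg - facingDeg;
        while (delta > 180.0f) delta -= 360.0f;   // wrap into [-180, 180]
        while (delta < -180.0f) delta += 360.0f;
        out.directionDeg = delta;
    }
    return out;
}
```

Values like these would then be written into the blend tree under the configured Speed and Direction parameter names each tick.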

PrefabAnimAssetsReloaderComponent

Hot-reloads prefab animation assets during development without requiring a full level restart. Monitors animation asset files for changes and reapplies them to the entity’s EMotionFX actor.


Setup

  1. Add VelocityLocomotionHookComponent to an entity with an EMotionFX actor and a movement system (GS_Unit mover or physics rigidbody).
  2. Configure the speed and direction parameter names to match your EMotionFX blend tree parameters.
  3. Add PrefabAnimAssetsReloaderComponent during development for faster animation iteration.

See Also

For related component references:

For conceptual overviews and usage guides:


Get GS_Performer

GS_Performer — Explore this gem on the product page and add it to your project.

4.9.3.2 - Head Tracking

Procedural head look-at targeting — drives bone orientation toward a world-space target with configurable angle limits and damping.

The Head Tracking system provides procedural head look-at targeting for performer entities. It drives bone orientation toward a world-space target using configurable angle limits and spring damping, enabling characters to naturally track targets, speakers, or points of interest.

For usage guides and setup examples, see The Basics: GS_Performer.

 

Contents


How It Works

The head tracking component reads the target position each tick and computes the desired bone rotation to face it. The rotation is clamped to configurable angle limits (horizontal and vertical) and smoothed via spring damping to avoid snapping.
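The clamp-then-damp step can be illustrated in one dimension. This is a minimal sketch, assuming an exponential smoothing model for the spring damping; the real component works on full bone rotations, and the smoothing constant here is an arbitrary stand-in.

```cpp
#include <cassert>
#include <cmath>

// One tick of clamped, damped tracking on a single yaw angle.
float TrackStep(float currentDeg, float targetDeg,
                float limitDeg, float damping, float dt)
{
    // Clamp the desired angle into the configured limit...
    float clamped = targetDeg;
    if (clamped > limitDeg) clamped = limitDeg;
    if (clamped < -limitDeg) clamped = -limitDeg;
    // ...then move a damped, framerate-independent fraction of the way
    // toward it, which avoids snapping when the target jumps.
    float t = 1.0f - std::exp(-damping * dt);
    return currentDeg + (clamped - currentDeg) * t;
}
```

Run per tick, this converges smoothly onto the clamped target instead of snapping to it, which is the behaviour the angle limits and spring damping settings control.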


Setup

  1. Add the head tracking component to an entity with an EMotionFX actor.
  2. Configure the target bone (typically the head or neck bone).
  3. Set angle limits for horizontal and vertical rotation.
  4. Configure spring damping parameters for smooth tracking.
  5. Set the look-at target via code or companion component.

See Also

For related component references:

For conceptual overviews and usage guides:


Get GS_Performer

GS_Performer — Explore this gem on the product page and add it to your project.

4.9.3.3 - Babble

Audio babble synchronized to dialogue typewriter output — procedural vocalization tones keyed to speaker identity.

The Babble system generates procedural vocalization tones synchronized to the dialogue Typewriter output. Each speaker can have unique babble tone events that fire with each character reveal, creating the characteristic “character voice” effect used in games like Animal Crossing or Undertale.

For usage guides and setup examples, see The Basics: GS_Performer.

Babble component in the O3DE Inspector

 

Contents


BabbleComponent

Bus: BabbleRequestBus (ById)

| Method | Parameters | Returns | Description |
|---|---|---|---|
| GetBabbleEvent | (none) | BabbleToneEvent | Returns the current babble tone event for audio playback. |

Data Types

| Type | Description |
|---|---|
| BabbleToneEvent | Audio event configuration for a single babble tone. |
| SpeakerBabbleEvents | Collection of babble events mapped to a specific speaker/actor. |

How It Works

  1. The TypewriterComponent (GS_Cinematics) reveals text character by character.
  2. On each OnTypeFired notification, the BabbleComponent triggers its configured babble tone.
  3. The tone varies based on the speaker’s SpeakerBabbleEvents configuration.
  4. The result is a procedural “voice” that matches the text reveal rhythm.
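The flow above can be modelled as a tiny per-speaker tone picker. This is a toy sketch, not the GS_Performer API: the names and the cycling tone selection are assumptions made for illustration.

```cpp
#include <cassert>
#include <map>
#include <string>
#include <vector>

// Each revealed character fires the next tone from the speaker's
// configured tone list, cycling so long lines keep babbling.
struct BabbleModel
{
    std::map<std::string, std::vector<std::string>> speakerTones;
    std::size_t cursor = 0;

    std::string OnTypeFired(const std::string& speaker)
    {
        const std::vector<std::string>& tones = speakerTones.at(speaker);
        return tones[cursor++ % tones.size()];
    }
};
```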

See Also

For related component references:

For conceptual overviews and usage guides:


Get GS_Performer

GS_Performer — Explore this gem on the product page and add it to your project.

4.9.4 - 3rd Party Implementations

For usage guides and setup examples, see The Basics: GS_Performer.


Get GS_Performer

GS_Performer — Explore this gem on the product page and add it to your project.

4.10 - GS_PhantomCam

Priority-based virtual camera management — phantom cameras, blend profiles, and spatial camera influence fields.

Rather than moving a single camera directly, GS_PhantomCam lets you place lightweight virtual cameras — “phantoms” — throughout the scene. Each phantom holds a complete camera state and a priority value. The Cam Core component drives the real camera to match whichever phantom currently has the highest effective priority. Transitions are animated by Blend Profiles that specify duration and easing. Influence Fields modify camera selection spatially or globally without changing base priorities.

For usage guides and setup examples, see The Basics: GS_PhantomCam.

 

Contents


Architecture

Camera Blend Pattern Graph

Breakdown

When the dominant Phantom Camera changes, the Cam Manager queries the Blend Profile to determine transition timing and easing:

| Step | What It Means |
|---|---|
| 1 — Priority change | A Phantom Camera gains highest priority or is activated. |
| 2 — Profile query | Cam Manager calls GetBestBlend(fromCam, toCam) on the assigned GS_PhantomCamBlendProfile asset. |
| 3 — Entry match | Entries are checked by specificity: exact match → any-to-specific → specific-to-any → default fallback. |
| 4 — Interpolation | Cam Core blends position, rotation, and FOV over the matched entry’s BlendTime using the configured EasingType. |

E Indicates extensible classes and methods.

Patterns - Complete list of system patterns used in GS_Play.


Cam Manager

The Cam Manager is the singleton controller for the entire camera system. It extends GS_ManagerComponent and owns three responsibilities:

  • Registration — Every phantom camera calls RegisterPhantomCam on activate and UnRegisterPhantomCam on deactivate. The Cam Manager holds the authoritative list.
  • Priority evaluation — When any camera’s priority changes, EvaluatePriority() re-sorts the list and determines the dominant camera. A dominance change fires SettingNewCam, which the Cam Core listens for to begin blending.
  • Influence routing — Priority influences (from Influence Fields or gameplay code) are added and removed through AddCameraInfluence / RemoveCameraInfluence by camera name. The Cam Manager sums all active influences with each camera’s base priority during evaluation.

The global camera target (the entity that follow and look-at cameras track) is also held here, set via SetTarget.
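The evaluation rule described above (effective priority = base priority plus the sum of all active influences, highest wins) can be modelled in a few lines. This is an illustrative sketch, not the actual GS_CamManagerComponent; the container choices and `CamManagerModel` name are assumptions.

```cpp
#include <cassert>
#include <map>
#include <string>

struct CamManagerModel
{
    std::map<std::string, float> basePriority;        // per registered camera
    std::multimap<std::string, float> influences;     // influences stack additively

    void AddCameraInfluence(const std::string& cam, float amount)
    {
        influences.insert({cam, amount});
    }

    // Returns the name of the dominant camera.
    std::string EvaluatePriority() const
    {
        std::string dominant;
        float best = -1e9f;
        for (const auto& [cam, base] : basePriority)
        {
            float effective = base;
            auto range = influences.equal_range(cam);
            for (auto it = range.first; it != range.second; ++it)
                effective += it->second;   // sum every active influence
            if (effective > best) { best = effective; dominant = cam; }
        }
        return dominant;
    }
};
```

In the real system a change in the returned dominant camera is what fires SettingNewCam toward the Cam Core.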

| Component | Purpose |
|---|---|
| GS_CamManagerComponent | Singleton manager. Registration, priority evaluation, influence routing, and target assignment. |

Cam Manager API


Cam Core

The Cam Core is the per-frame driver that makes the real O3DE camera match the dominant phantom. It lives on the main camera entity, which must be a child entity of the Cam Manager entity inside the manager’s prefab so it spawns and despawns with the camera system.

Every frame, when locked to a phantom, the Cam Core parents the main camera to that phantom entity and reads its position, rotation, and FOV directly. During a blend transition:

  1. The Cam Core receives SettingNewCam from the Cam Manager.
  2. It queries the assigned Blend Profile for the best matching entry between the outgoing and incoming cameras.
  3. The blend profile returns a duration and an easing curve (see Curves Utility). If no profile entry matches, the Cam Core falls back to its own default blend time and easing configured directly on the component.
  4. Over the blend duration, position, rotation, and FOV are interpolated from the outgoing state to the incoming phantom’s state.
  5. On completion, the Cam Core parents the main camera to the new dominant phantom, locking them together until the next transition.
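Step 4 can be sketched for a single channel. This is an illustrative model, not the Cam Core implementation: the smoothstep function stands in for whatever EasingType the profile entry specifies, and FOV is used here simply because it is a scalar.

```cpp
#include <cassert>
#include <cmath>

float EaseInOut(float t)
{
    return t * t * (3.0f - 2.0f * t);   // smoothstep, a stand-in easing curve
}

// Interpolate one camera channel from the outgoing state to the incoming
// phantom's state over the matched entry's BlendTime.
float BlendFov(float fromFov, float toFov, float elapsed, float blendTime)
{
    float t = blendTime > 0.0f ? elapsed / blendTime : 1.0f;
    if (t > 1.0f) t = 1.0f;             // clamp: the blend is finished
    return fromFov + (toFov - fromFov) * EaseInOut(t);
}
```

Position and rotation follow the same pattern (with quaternion interpolation for rotation), after which the camera is parented to the new phantom.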

| Component | Purpose |
|---|---|
| GS_CamCoreComponent | Core camera driver. Applies phantom camera state to the real camera each tick. |

Cam Core API


Phantom Cameras

Phantom cameras are entity components that hold a complete candidate camera state — FOV, near/far clip planes, follow target, look-at target, position offset, rotation offset, and base priority. They do not render anything. They register with the Cam Manager on activation and are evaluated by priority each time the dominant camera changes.

Priority Control

The camera with the highest effective priority (base + all active influences) becomes dominant. You can control which camera wins through:

  • Raise priority — Set one camera’s priority above all others.
  • Disable/Enable — DisableCamera drops effective priority to 0; EnableCamera restores it.
  • Direct change — SetCameraPriority or ChangeCameraPriority on the Cam Manager bus.
  • Influence — Add temporary priority boosts through Influence Fields without touching base priorities.

Follow and Look-At

Each phantom supports independent follow and look-at tracking, each with position offsets and optional damping. Both modes expose overridable virtual methods so custom camera types can inject their own logic. Processing is split into transform-based (kinematic) and physics-based paths per mode.

Specialized Camera Types

Companion components add distinct motion behaviors on top of the base phantom:

| Component | Behavior |
|---|---|
| GS_PhantomCameraComponent | Base virtual camera. Priority, targets, offsets, and blend settings. |
| ClampedLook_PhantomCamComponent | Clamped rotation look — limits the camera’s look angle within defined min/max bounds. |
| StaticOrbitPhantomCamComponent | Fixed orbital position around the follow target. |
| Track_PhantomCamComponent | Follows a spline or path while optionally tracking a look-at target. |
| AlwaysFaceCameraComponent | Billboard utility — keeps an entity facing the active camera. Not a camera type itself. |

Phantom Cameras API


Blend Profiles

Blend Profiles are data assets (.blendprofile) that define how the Cam Core transitions between phantom cameras. Each profile contains a list of blend entries, where each entry defines a From camera, a To camera, a blend duration, and an easing curve (see Curves Utility). This allows every camera-to-camera transition in your project to have unique timing and feel.

Entry Resolution Order

  1. Exact match — From name and To name both match.
  2. Any-to-specific — From is blank/“any”, To matches the incoming camera.
  3. Specific-to-any — From matches the outgoing camera, To is blank/“any”.
  4. Default fallback — The Cam Core’s own default blend time and easing (set on the component).
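The resolution order above can be sketched directly. This is an illustrative model of `GetBestBlend`, assuming blank names mean "any"; the struct layout is not the real GS_PhantomCamBlendProfile type.

```cpp
#include <cassert>
#include <string>
#include <vector>

struct BlendEntry
{
    std::string from;   // "" matches any outgoing camera
    std::string to;     // "" matches any incoming camera
    float blendTime;
};

// Returns the best entry, or nullptr to signal the Cam Core's own default.
const BlendEntry* GetBestBlend(const std::vector<BlendEntry>& entries,
                               const std::string& fromCam,
                               const std::string& toCam)
{
    const BlendEntry* anyToSpecific = nullptr;
    const BlendEntry* specificToAny = nullptr;
    for (const BlendEntry& e : entries)
    {
        if (e.from == fromCam && e.to == toCam) return &e;          // 1. exact
        if (e.from.empty() && e.to == toCam) anyToSpecific = &e;    // 2.
        if (e.from == fromCam && e.to.empty()) specificToAny = &e;  // 3.
    }
    if (anyToSpecific) return anyToSpecific;
    if (specificToAny) return specificToAny;
    return nullptr;                                                 // 4. default
}
```

The practical effect is that you only author the transitions that need distinct feel; everything else falls through to broader entries or the component default.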

Blend Entry Fields

Each entry specifies:

| Field | Description |
|---|---|
| FromCamera | Outgoing phantom camera entity name. Blank/“any” matches all. |
| ToCamera | Incoming phantom camera entity name. Blank/“any” matches all. |
| BlendTime | Transition duration in seconds. |
| EasingType | Interpolation curve applied during the blend. See Curves Utility for the full list of available easing types. |

Camera names correspond to entity names of the phantom camera entities in the scene.

| Asset | Purpose |
|---|---|
| GS_PhantomCamBlendProfile | Blend settings asset — per-camera-pair transition duration and easing. |

Blend Profiles API


Camera Influence Fields

Influence components modify the effective priority of phantom cameras without touching their base priority values. They work by calling AddCameraInfluence / RemoveCameraInfluence on the Cam Manager bus, identified by camera name. Multiple influences on the same camera stack additively.

GlobalCameraInfluenceComponent

Applies a constant priority modifier for its entire active lifetime. Useful for gameplay states that should always favor a particular camera — boosting a cutscene camera’s priority during a scripted sequence, for example.

Placement: Place GlobalCameraInfluenceComponent on the StageData entity. It activates and deactivates automatically with the stage, keeping camera influence scoped to the level that defines it.

CameraInfluenceFieldComponent

Applies a priority modifier only when the camera subject enters a defined spatial volume. Requires a PhysX Collider (set as trigger) on the same entity to define the volume. On entry, the influence is added; on exit, it is removed. See Physics Trigger Volume Utility for setup details.

Useful for level design — switching to an overhead camera when the player enters a room, or boosting a scenic camera in a vista area.

| Component | Purpose |
|---|---|
| GlobalCameraInfluenceComponent | Global, constant priority modifier for a named camera. Place on the StageData entity. |
| CameraInfluenceFieldComponent | Spatial priority modifier. Activates when the camera subject enters the trigger volume. Requires a PhysX trigger collider. |

Camera Influence Fields API


Installation

GS_PhantomCam requires only GS_Core.

  1. Enable GS_PhantomCam in Project Manager or project.json.
  2. Add GS_CamManagerComponent to a dedicated entity and register it in the Game Manager Startup Managers list. Save this entity as a prefab.
  3. Add GS_CamCoreComponent to the main O3DE camera entity. Make this camera entity a child of the Cam Manager entity, inside the Cam Manager’s prefab. This ensures the camera spawns and despawns cleanly with the camera system.
  4. Assign a Blend Profile asset to the Cam Core’s Blend Profile slot. Configure default blend time and easing on the component for fallback transitions.
  5. Place phantom camera entities in your level with GS_PhantomCameraComponent. Assign follow and look-at targets as needed.
  6. For spatial influence zones, add CameraInfluenceFieldComponent alongside a PhysX Collider (trigger) to define the volume.
  7. For global per-stage influence, add GlobalCameraInfluenceComponent to the StageData entity.

For a full guided walkthrough, see the PhantomCam Set Up Guide.


See Also

For conceptual overviews and usage guides:

For related resources:


Get GS_PhantomCam

GS_PhantomCam — Explore this gem on the product page and add it to your project.

4.10.1 - Cam Manager

Camera system lifecycle controller — registration, priority evaluation, target assignment, and influence management for all phantom cameras.

The Cam Manager is the singleton controller for the GS_PhantomCam camera system. It extends GS_ManagerComponent and manages the full lifecycle of virtual cameras: registering and unregistering phantom cameras, evaluating which camera has the highest effective priority, assigning the global camera target, and routing camera influence modifiers.

When a phantom camera activates, it registers itself with the Cam Manager. When priorities change — through direct calls, enable/disable toggling, or influence fields — the Cam Manager re-evaluates which phantom camera should be dominant and notifies the Cam Core to begin blending toward it.

For usage guides and setup examples, see The Basics: GS_PhantomCam.

Cam Manager component in the O3DE Inspector

 


How It Works

Camera Registration

Every Phantom Camera registers with the Cam Manager on activation via RegisterPhantomCam and unregisters on deactivation via UnRegisterPhantomCam. The Cam Manager maintains the authoritative list of all active phantom cameras in the scene.

Priority Evaluation

When any camera’s priority changes, the Cam Manager calls EvaluatePriority() to determine the new dominant camera. The camera with the highest effective priority (base priority plus any active influences) becomes the dominant camera. A change in dominance triggers the SettingNewCam notification, which the Cam Core uses to begin its blend transition.
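
The dominance rule described above can be sketched as plain logic. This is an illustrative stand-in, not the actual Cam Manager internals; the CamEntry type and FindDominant name are hypothetical.

```cpp
#include <cassert>
#include <string>
#include <vector>

// Hypothetical stand-in for a registered camera: effective priority is the
// base priority plus the sum of any active influences.
struct CamEntry
{
    std::string name;
    int basePriority = 0;
    float influenceSum = 0.0f;
};

// Returns the entry with the highest effective priority, or nullptr if the
// list is empty. Mirrors the dominance rule described above.
const CamEntry* FindDominant(const std::vector<CamEntry>& cams)
{
    const CamEntry* best = nullptr;
    float bestPriority = 0.0f;
    for (const CamEntry& cam : cams)
    {
        float effective = static_cast<float>(cam.basePriority) + cam.influenceSum;
        if (!best || effective > bestPriority)
        {
            best = &cam;
            bestPriority = effective;
        }
    }
    return best;
}
```

Note that a camera with a modest base priority can still win selection when influence fields boost it above its neighbors.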

Camera Influences

Camera influences are named priority modifiers that shift a camera’s effective priority up or down. They are added and removed through AddCameraInfluence and RemoveCameraInfluence, identified by camera name. This mechanism allows Camera Influence Fields and gameplay systems to affect camera selection without directly modifying base priority values.
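
The add/remove/stack behavior can be modeled with a small ledger keyed by camera name. This is an illustrative sketch only; the InfluenceLedger type is hypothetical and not part of the GS_PhantomCam API.

```cpp
#include <cassert>
#include <map>
#include <string>
#include <vector>

// Hypothetical ledger: influences are stored per camera name and stack
// additively on top of the base priority.
struct InfluenceLedger
{
    std::map<std::string, std::vector<float>> m_influences;

    void Add(const std::string& camName, float influence)
    {
        m_influences[camName].push_back(influence);
    }

    void Remove(const std::string& camName, float influence)
    {
        auto it = m_influences.find(camName);
        if (it == m_influences.end()) { return; }
        auto& list = it->second;
        for (auto v = list.begin(); v != list.end(); ++v)
        {
            if (*v == influence) { list.erase(v); break; } // Remove one matching entry.
        }
    }

    float EffectivePriority(const std::string& camName, float basePriority) const
    {
        float total = basePriority;
        auto it = m_influences.find(camName);
        if (it != m_influences.end())
        {
            for (float v : it->second) { total += v; }
        }
        return total;
    }
};
```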

Target Assignment

The Cam Manager holds the global camera target — the entity that follow and look-at cameras will track. Setting the target here propagates it to the active phantom camera system.


Setup

  1. Add GS_CamManagerComponent to your Game Manager entity (or a dedicated camera manager entity).
  2. Register the manager prefab in the Game Manager Startup Managers list.
  3. Add a Cam Core component to your main camera entity as a child of the Cam Manager entity.
  4. Place Phantom Cameras in your scene with appropriate priorities.

For a full walkthrough, see the PhantomCam Set Up Guide.


API Reference

Request Bus: CamManagerRequestBus

Commands sent to the Cam Manager. Global bus — single address, single handler.

| Method | Parameters | Returns | Description |
| --- | --- | --- | --- |
| RegisterPhantomCam | AZ::EntityId cam | void | Registers a phantom camera with the manager. Called automatically on phantom camera activation. |
| UnRegisterPhantomCam | AZ::EntityId cam | void | Unregisters a phantom camera from the manager. Called automatically on phantom camera deactivation. |
| ChangeCameraPriority | AZ::EntityId cam, AZ::u32 priority | void | Sets a new base priority for the specified camera and triggers priority re-evaluation. |
| AddCameraInfluence | AZStd::string camName, float influence | void | Adds a priority influence modifier to the named camera. Positive values increase effective priority. |
| RemoveCameraInfluence | AZStd::string camName, float influence | void | Removes a priority influence modifier from the named camera. |
| SetTarget | AZ::EntityId targetEntity | void | Sets the global camera follow/look-at target entity. |
| GetTarget | (none) | AZ::EntityId | Returns the current global camera target entity. |

Notification Bus: CamManagerNotificationBus

Events broadcast by the Cam Manager. Multiple handler bus — any number of components can subscribe.

| Event | Description |
| --- | --- |
| EnableCameraSystem | Fired when the camera system is fully initialized and active. |
| DisableCameraSystem | Fired when the camera system is shutting down. |
| SettingNewCam | Fired when a new phantom camera becomes dominant after priority evaluation. The Cam Core listens for this to begin blending. |

Virtual Methods

Override these when extending the Cam Manager. Always call the base implementation.

| Method | Parameters | Returns | Description |
| --- | --- | --- | --- |
| EvaluatePriority | (none) | void | Sorts all registered phantom cameras by effective priority and determines the dominant camera. Override to add custom priority logic (e.g. distance weighting, gameplay state filters). |

Extending the Cam Manager

Extend the Cam Manager to add custom priority logic, additional registration behavior, or project-specific camera selection rules. Extension is done in C++.

Header (.h)

#pragma once
#include <GS_PhantomCam/GS_PhantomCamBus.h>
#include <Source/CamManager/GS_CamManagerComponent.h>

namespace MyProject
{
    class MyCamManager : public GS_PhantomCam::GS_CamManagerComponent
    {
    public:
        AZ_COMPONENT_DECL(MyCamManager);

        static void Reflect(AZ::ReflectContext* context);

    protected:
        void EvaluatePriority() override;
    };
}

Implementation (.cpp)

#include "MyCamManager.h"
#include <AzCore/Serialization/SerializeContext.h>

namespace MyProject
{
    AZ_COMPONENT_IMPL(MyCamManager, "MyCamManager", "{YOUR-UUID-HERE}");

    void MyCamManager::Reflect(AZ::ReflectContext* context)
    {
        if (auto serializeContext = azrtti_cast<AZ::SerializeContext*>(context))
        {
            serializeContext->Class<MyCamManager, GS_PhantomCam::GS_CamManagerComponent>()
                ->Version(0);

            if (AZ::EditContext* editContext = serializeContext->GetEditContext())
            {
                editContext->Class<MyCamManager>("My Cam Manager", "Custom camera manager")
                    ->ClassElement(AZ::Edit::ClassElements::EditorData, "")
                        ->Attribute(AZ::Edit::Attributes::Category, "MyProject")
                        ->Attribute(AZ::Edit::Attributes::AppearsInAddComponentMenu, AZ_CRC_CE("Game"));
            }
        }
    }

    void MyCamManager::EvaluatePriority()
    {
        // Call base to run standard priority sorting
        GS_CamManagerComponent::EvaluatePriority();

        // Custom priority logic here — e.g. filter cameras by gameplay state,
        // apply distance-based weighting, or enforce camera lock rules
    }
}

Script Canvas Examples

Enabling and disabling the camera system:

Setting the global camera target:

Reacting to a new dominant camera:


See Also

For related PhantomCam components:

For foundational systems:

For conceptual overviews and usage guides:


Get GS_PhantomCam

GS_PhantomCam — Explore this gem on the product page and add it to your project.

4.10.2 - Cam Core

The runtime camera driver — applies phantom camera position, rotation, and FOV to the real camera with blend interpolation.

The Cam Core is the runtime driver of the GS_PhantomCam system. It lives on the main camera entity and is responsible for a single job: making the real O3DE camera match the dominant Phantom Camera. Every frame, the Cam Core reads the target phantom camera’s position, rotation, and field of view, then applies those values to the actual camera entity — either instantly (when locked) or through a timed blend transition.

When the Cam Manager determines a new dominant phantom camera, it fires a notification that the Cam Core receives. The Cam Core then looks up the best matching blend in the active Blend Profile and begins interpolating toward the new phantom camera over the specified duration and easing curve (see Curves Utility). Once the blend completes, the main camera locks to the phantom camera as a child entity, matching its transforms exactly until the next transition.

For usage guides and setup examples, see The Basics: GS_PhantomCam.

Cam Core component in the O3DE Inspector

 


How It Works

Blend Transitions

When a camera transition is triggered:

  1. The Cam Core receives the SettingNewCam notification from the Cam Manager.
  2. It queries the assigned Blend Profile for the best matching blend between the outgoing and incoming camera (by name).
  3. The blend profile returns the duration and easing curve. If no specific blend is found, the Cam Core falls back to its default blend time and easing.
  4. Over the blend duration, the Cam Core interpolates position, rotation, and FOV from the outgoing state to the incoming phantom camera state.
  5. On blend completion, the Cam Core parents the main camera to the phantom camera entity, locking them together.
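
The interpolation in step 4 can be sketched for a single property. This is a minimal illustration assuming linear easing; the real Cam Core applies the easing curves from the Curves Utility and also interpolates position and rotation, and the BlendState and StepBlendFov names are hypothetical.

```cpp
#include <cassert>

// Hypothetical per-transition state: time elapsed versus total blend duration.
struct BlendState
{
    float elapsed = 0.0f;
    float duration = 1.0f;
};

// Advances the blend by deltaTime and returns the interpolated FOV for this
// frame. A full implementation would do the same for position (lerp) and
// rotation (slerp), and apply an easing curve to t.
float StepBlendFov(BlendState& blend, float fromFov, float toFov, float deltaTime)
{
    blend.elapsed += deltaTime;
    float t = blend.duration > 0.0f ? blend.elapsed / blend.duration : 1.0f;
    if (t > 1.0f) { t = 1.0f; } // Blend complete: lock to the incoming camera state.
    return fromFov + (toFov - fromFov) * t;
}
```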

Blend Profile Resolution

The Cam Core holds a reference to a GS_PhantomCamBlendProfile asset. When a transition occurs, it calls GetBestBlend(fromCam, toCam) on the profile to find the most specific matching blend. The resolution order is:

  1. Exact match — From camera name to To camera name.
  2. Any-to-specific — “Any” to the To camera name.
  3. Specific-to-any — From camera name to “Any”.
  4. Default fallback — The Cam Core’s own default blend time and easing values.

See Blend Profiles for full details on the resolution hierarchy.

Camera Locking

Once a blend completes, the main camera becomes a child of the dominant phantom camera entity. All position, rotation, and property updates from the phantom camera are reflected immediately on the real camera. This lock persists until the next transition begins.


Setup

  1. Add GS_CamCoreComponent to your main camera entity. There should be only one Cam Core in the scene.
  2. Make the main camera entity a child of the Cam Manager entity to ensure it spawns and despawns with the camera system.
  3. Create a Blend Profile data asset in the Asset Editor and assign it to the Cam Core’s Blend Profile inspector slot.
  4. Configure the default blend time and easing on the Cam Core component for cases where no blend profile entry matches. See Curves Utility for available easing types.

API Reference

Request Bus: CamCoreRequestBus

Commands sent to the Cam Core. Global bus — single address, single handler.

| Method | Parameters | Returns | Description |
| --- | --- | --- | --- |
| SetPhantomCam | AZ::EntityId targetCam | void | Sets the phantom camera that the Cam Core should blend toward or lock to. Typically called by the Cam Manager after priority evaluation. |
| GetCamCore | (none) | AZ::EntityId | Returns the entity ID of the Cam Core (the main camera entity). |

Notification Bus: CamCoreNotificationBus

Events broadcast by the Cam Core. Multiple handler bus — any number of components can subscribe.

| Event | Description |
| --- | --- |
| UpdateCameraPosition | Fired each frame during an active blend, after the Cam Core has updated the camera’s position, rotation, and FOV. Subscribe to this to react to camera movement in real time. |

Usage Examples

C++ – Querying the Main Camera Entity

#include <GS_PhantomCam/GS_PhantomCamBus.h>

AZ::EntityId mainCameraId;
GS_PhantomCam::CamCoreRequestBus::BroadcastResult(
    mainCameraId,
    &GS_PhantomCam::CamCoreRequestBus::Events::GetCamCore
);

C++ – Listening for Camera Updates

#include <GS_PhantomCam/GS_PhantomCamBus.h>

class MyCameraListener
    : public AZ::Component
    , protected GS_PhantomCam::CamCoreNotificationBus::Handler
{
protected:
    void Activate() override
    {
        GS_PhantomCam::CamCoreNotificationBus::Handler::BusConnect();
    }

    void Deactivate() override
    {
        GS_PhantomCam::CamCoreNotificationBus::Handler::BusDisconnect();
    }

    void UpdateCameraPosition() override
    {
        // React to camera position/rotation changes during blends
    }
};

Script Canvas

Reacting to camera position updates during blends:


Extending the Cam Core

The Cam Core can be extended in C++ to customize blend behavior, add post-processing logic, or integrate with external camera systems.

Header (.h)

#pragma once
#include <GS_PhantomCam/GS_PhantomCamBus.h>
#include <Source/CamCore/GS_CamCoreComponent.h>

namespace MyProject
{
    class MyCamCore : public GS_PhantomCam::GS_CamCoreComponent
    {
    public:
        AZ_COMPONENT_DECL(MyCamCore);

        static void Reflect(AZ::ReflectContext* context);
    };
}

Implementation (.cpp)

#include "MyCamCore.h"
#include <AzCore/Serialization/SerializeContext.h>

namespace MyProject
{
    AZ_COMPONENT_IMPL(MyCamCore, "MyCamCore", "{YOUR-UUID-HERE}");

    void MyCamCore::Reflect(AZ::ReflectContext* context)
    {
        if (auto serializeContext = azrtti_cast<AZ::SerializeContext*>(context))
        {
            serializeContext->Class<MyCamCore, GS_PhantomCam::GS_CamCoreComponent>()
                ->Version(0);

            if (AZ::EditContext* editContext = serializeContext->GetEditContext())
            {
                editContext->Class<MyCamCore>("My Cam Core", "Custom camera core driver")
                    ->ClassElement(AZ::Edit::ClassElements::EditorData, "")
                        ->Attribute(AZ::Edit::Attributes::Category, "MyProject")
                        ->Attribute(AZ::Edit::Attributes::AppearsInAddComponentMenu, AZ_CRC_CE("Game"));
            }
        }
    }
}

See Also

For related PhantomCam components:

For conceptual overviews and usage guides:


Get GS_PhantomCam

GS_PhantomCam — Explore this gem on the product page and add it to your project.

4.10.3 - Phantom Cameras

Virtual camera components with priority-based selection, follow/look-at targets, offsets, and specialized camera behavior types.

Phantom Cameras are the virtual cameras that power the GS_PhantomCam system. Each phantom camera is an entity component that holds a complete camera state: field of view, clip planes, a follow target, a look-at target, position and rotation offsets, and a priority value. Phantom cameras do not render anything themselves — they represent candidate camera configurations that the Cam Core can blend toward when a phantom becomes dominant.

Any number of phantom cameras can exist in a scene simultaneously. They register with the Cam Manager on activation and are sorted by effective priority. The camera with the highest priority becomes the dominant camera. The Cam Core then blends the real camera to match the dominant phantom’s state using the active Blend Profile.

Specialized phantom camera types (companion components) extend the base behavior with distinct motion patterns: clamped rotation, static orbiting, spline tracking, and billboard facing.

For usage guides and setup examples, see The Basics: GS_PhantomCam.

The Phantom Camera component in the Entity Inspector.

 


How It Works

Priority-Based Selection

Every phantom camera has a base priority (an integer). The camera with the highest effective priority (base plus any active influences) becomes the dominant camera. You can control which camera is active through several strategies:

  • Raise priority — Set the desired camera’s priority above all others.
  • Lower neighbors — Reduce competing cameras’ priorities below the desired camera.
  • Disable/Enable — Call DisableCamera on the current dominant camera (drops it to priority 0) or EnableCamera on a previously disabled camera to restore its priority.
  • Change priority directly — Call SetCameraPriority or use ChangeCameraPriority on the Cam Manager bus.

Follow and Look-At

Phantom cameras support two independent tracking modes:

  • Follow — The camera follows a target entity’s position, applying follow offsets and optional damping. Processing is split into transform-based follow (kinematic) and physics-based follow (respects colliders).
  • Look-At — The camera rotates to face a target entity or focus group, applying look-at offsets and optional damping. Processing is split into transform-based and physics-based look-at.

Both modes run through overridable virtual methods, allowing custom camera types to inject their own follow and look-at logic.

Camera Data

Each phantom camera stores its configuration in a PhantomCamData structure. This data is read by the Cam Core during blending and when locked to the phantom camera.


Setup

  1. Create an entity and add GS_PhantomCameraComponent (or a specialized type like StaticOrbitPhantomCamComponent).
  2. Set the Priority value. Higher values take precedence.
  3. Assign a Follow Target entity for position tracking (optional).
  4. Assign a Look-At Target entity for rotation tracking (optional).
  5. Configure FOV, clip planes, and offsets as needed.
  6. Place the entity in your scene. It registers with the Cam Manager automatically on activation.

For a full walkthrough, see the PhantomCam Set Up Guide.


PhantomCamData Structure

The PhantomCamData structure holds the complete configuration for a phantom camera.

| Field | Type | Description |
| --- | --- | --- |
| FOV | float | Field of view in degrees. |
| NearClip | float | Near clipping plane distance. |
| FarClip | float | Far clipping plane distance. |
| FollowTarget | AZ::EntityId | The entity this camera follows for position tracking. |
| LookAtTarget | AZ::EntityId | The entity this camera faces for rotation tracking. |
| FollowOffset | AZ::Vector3 | Position offset applied relative to the follow target. |
| LookAtOffset | AZ::Vector3 | Position offset applied to the look-at target point. |

API Reference

Request Bus: PhantomCameraRequestBus

Commands sent to a specific phantom camera. ById bus — addressed by entity ID, single handler per address.

| Method | Parameters | Returns | Description |
| --- | --- | --- | --- |
| EnableCamera | (none) | void | Enables the phantom camera, restoring its priority and registering it for evaluation. |
| DisableCamera | (none) | void | Disables the phantom camera, dropping its effective priority to 0. |
| SetCameraPriority | AZ::s32 newPriority | void | Sets the base priority of this phantom camera and triggers re-evaluation. |
| GetCameraPriority | (none) | AZ::s32 | Returns the current base priority of this phantom camera. |
| SetCameraTarget | AZ::EntityId targetEntity | void | Sets the follow target entity for this phantom camera. |
| SetTargetFocusGroup | AZ::EntityId targetFocusGroup | void | Sets the focus group entity for multi-target look-at behavior. |
| GetCameraData | (none) | const PhantomCamData* | Returns a pointer to the camera’s full configuration data. |

Virtual Methods

Override these when extending the Phantom Camera. These methods form the per-frame camera behavior pipeline.

| Method | Parameters | Returns | Description |
| --- | --- | --- | --- |
| StartupCheck | (none) | void | Called on activation. Validates initial state and registers with the Cam Manager. |
| EvaluateCamTick | (none) | void | Called each tick. Runs the follow and look-at processing pipeline for this camera. |

Follow pipeline:

| Method | Parameters | Returns | Description |
| --- | --- | --- | --- |
| ProcessTransformFollow | AZ::Vector3& desiredPos, float deltaTime | void | Computes the desired follow position using transform-based (kinematic) movement. |
| ProcessPhysicsFollow | float deltaTime | void | Computes follow position using physics-based movement (respects colliders). |
| ProcessFollowOffset | AZ::Vector3& destFollowPos, AZ::Transform destWorldTMFollow | void | Applies the follow offset to the computed follow position. |

Look-At pipeline:

| Method | Parameters | Returns | Description |
| --- | --- | --- | --- |
| ProcessTransformLookAt | AZ::Quaternion& desiredRot, AZ::Transform curWorldTM, float deltaTime | void | Computes the desired look-at rotation using transform-based interpolation. |
| ProcessPhysicsLookAt | float deltaTime | void | Computes look-at rotation using physics-based constraints. |
| ProcessLookAtOffset | AZ::Vector3& destLookPos, AZ::Transform curWorldTM, AZ::Transform destWorldTMLook | void | Applies the look-at offset to the computed look-at position. |

Camera Behavior Types

Specialized companion components extend the base phantom camera with distinct motion behaviors. Add these alongside GS_PhantomCameraComponent on the same entity.

ClampedLook_PhantomCamComponent

A phantom camera with angle-clamped look rotation. The camera tracks its look-at target but clamps the rotation within defined minimum and maximum angles. Useful for fixed-angle security cameras, limited-rotation turret views, or any situation where the camera should not rotate beyond defined bounds.

StaticOrbitPhantomCamComponent

A phantom camera that maintains a fixed orbital position around its target. The camera stays at a set distance and angle from the follow target without player input. Useful for fixed-perspective gameplay cameras, environmental showcase views, or pre-positioned cinematic angles.

See Static Orbit Cam for full details.

Track_PhantomCamComponent

A phantom camera that follows a spline or path. The camera moves along a predefined trajectory, optionally tracking a look-at target while moving. Useful for cinematic fly-throughs, guided tours, and scripted camera movements.

AlwaysFaceCameraComponent

A billboard helper component. Keeps the attached entity always facing the active camera. This is not a phantom camera type itself but a utility for objects that should always face the viewer (UI elements in world space, sprite-based effects, etc.).


Usage Examples

Switching Cameras by Priority

To make a specific phantom camera dominant, raise its priority above all others:

#include <GS_PhantomCam/GS_PhantomCamBus.h>

// Set camera to high priority to make it dominant
GS_PhantomCam::PhantomCameraRequestBus::Event(
    myCameraEntityId,
    &GS_PhantomCam::PhantomCameraRequestBus::Events::SetCameraPriority,
    100
);

Disabling the Current Camera

Disable the current dominant camera to let the next-highest priority camera take over:

GS_PhantomCam::PhantomCameraRequestBus::Event(
    currentCameraEntityId,
    &GS_PhantomCam::PhantomCameraRequestBus::Events::DisableCamera
);

Script Canvas

Enabling and disabling a phantom camera:

Getting a phantom camera’s data:


Extending Phantom Cameras

Use the PhantomCamera ClassWizard template to generate a new camera component with boilerplate already in place — see GS_PhantomCam Templates. CMake and module registration are fully automatic.

Create custom phantom camera types by extending GS_PhantomCameraComponent. Override the virtual methods to implement custom follow, look-at, or per-tick behavior.

Header (.h)

#pragma once
#include <GS_PhantomCam/GS_PhantomCamBus.h>
#include <Source/PhantomCamera/GS_PhantomCameraComponent.h>

namespace MyProject
{
    class MyCustomCam : public GS_PhantomCam::GS_PhantomCameraComponent
    {
    public:
        AZ_COMPONENT_DECL(MyCustomCam);

        static void Reflect(AZ::ReflectContext* context);

    protected:
        void EvaluateCamTick() override;
        void ProcessTransformFollow(AZ::Vector3& desiredPos, float deltaTime) override;
        void ProcessTransformLookAt(AZ::Quaternion& desiredRot, AZ::Transform curWorldTM, float deltaTime) override;
    };
}

Implementation (.cpp)

#include "MyCustomCam.h"
#include <AzCore/Serialization/SerializeContext.h>

namespace MyProject
{
    AZ_COMPONENT_IMPL(MyCustomCam, "MyCustomCam", "{YOUR-UUID-HERE}");

    void MyCustomCam::Reflect(AZ::ReflectContext* context)
    {
        if (auto serializeContext = azrtti_cast<AZ::SerializeContext*>(context))
        {
            serializeContext->Class<MyCustomCam, GS_PhantomCam::GS_PhantomCameraComponent>()
                ->Version(0);

            if (AZ::EditContext* editContext = serializeContext->GetEditContext())
            {
                editContext->Class<MyCustomCam>("My Custom Camera", "Custom phantom camera behavior")
                    ->ClassElement(AZ::Edit::ClassElements::EditorData, "")
                        ->Attribute(AZ::Edit::Attributes::Category, "MyProject")
                        ->Attribute(AZ::Edit::Attributes::AppearsInAddComponentMenu, AZ_CRC_CE("Game"));
            }
        }
    }

    void MyCustomCam::EvaluateCamTick()
    {
        // Call base to run standard follow/look-at pipeline
        GS_PhantomCameraComponent::EvaluateCamTick();

        // Add custom per-tick logic (screen shake, zoom pulses, etc.)
    }

    void MyCustomCam::ProcessTransformFollow(AZ::Vector3& desiredPos, float deltaTime)
    {
        // Call base for standard follow behavior
        GS_PhantomCameraComponent::ProcessTransformFollow(desiredPos, deltaTime);

        // Modify desiredPos for custom follow behavior
    }

    void MyCustomCam::ProcessTransformLookAt(
        AZ::Quaternion& desiredRot, AZ::Transform curWorldTM, float deltaTime)
    {
        // Call base for standard look-at behavior
        GS_PhantomCameraComponent::ProcessTransformLookAt(desiredRot, curWorldTM, deltaTime);

        // Modify desiredRot for custom look-at behavior
    }
}

See Also

For related PhantomCam components:

For conceptual overviews and usage guides:


Get GS_PhantomCam

GS_PhantomCam — Explore this gem on the product page and add it to your project.

4.10.3.1 - Orbit Cam

A player-controlled orbit phantom camera type — planned for a future release.

The Orbit Cam is a planned phantom camera type that will provide player-controlled orbital camera behavior: the camera orbits around a follow target based on player input, with configurable distance, pitch limits, and rotation speed.

This camera type is not yet available. When implemented, it will appear as a companion component alongside GS_PhantomCameraComponent on the camera entity.

For usage guides and setup examples, see The Basics: GS_PhantomCam.


See Also


Get GS_PhantomCam

GS_PhantomCam — Explore this gem on the product page and add it to your project.

4.10.3.2 - Static Orbit Cam

A phantom camera that maintains a fixed orbital position around its target — no player input, fixed distance and angle.

The Static Orbit Cam is a specialized Phantom Camera type that maintains a fixed orbital position around its follow target. Unlike an input-driven orbit camera, the Static Orbit Cam holds a constant distance, pitch, and yaw relative to the target — the player does not control the orbit angle. The camera automatically rotates to keep the target centered while maintaining its fixed offset.

This camera type is useful for fixed-perspective gameplay (top-down, isometric, side-scrolling), environmental showcase views, and pre-positioned cinematic angles where the camera should orbit a subject without player control.

For usage guides and setup examples, see The Basics: GS_PhantomCam.


How It Works

The StaticOrbitPhantomCamComponent extends GS_PhantomCameraComponent and overrides the follow pipeline to maintain a fixed orbital offset. Each tick:

  1. The component reads the follow target’s world position.
  2. It applies the configured orbit distance, pitch, and yaw to compute the camera’s world position relative to the target.
  3. The camera rotates to face the target.
  4. Standard Phantom Camera look-at processing applies on top if a separate look-at target is configured.

Because the orbit parameters are fixed, the camera smoothly tracks the target’s movement while preserving the same viewing angle at all times.
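
The position math in step 2 can be illustrated with straightforward trigonometry. This sketch assumes Z-up coordinates (O3DE's convention), angles in degrees, and one particular sign convention; the component's actual formula may differ, and the Vec3 and OrbitOffset names are illustrative only.

```cpp
#include <cassert>
#include <cmath>

// Minimal vector stand-in for illustration (the engine would use AZ::Vector3).
struct Vec3 { float x, y, z; };

// Computes the camera's offset from the target for a fixed orbit.
// Positive pitch raises the camera above the target (looking down);
// yaw 0 places the camera behind the target along one fixed axis.
Vec3 OrbitOffset(float distance, float pitchDeg, float yawDeg)
{
    const float kDegToRad = 3.14159265358979f / 180.0f;
    float pitch = pitchDeg * kDegToRad;
    float yaw = yawDeg * kDegToRad;
    float horizontal = distance * std::cos(pitch); // Distance projected onto the ground plane.
    return Vec3{
        horizontal * std::sin(yaw),
        -horizontal * std::cos(yaw),
        distance * std::sin(pitch)
    };
}
```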


Setup

  1. Create an entity and add GS_PhantomCameraComponent.
  2. Add StaticOrbitPhantomCamComponent to the same entity.
  3. Set the Follow Target to the entity the camera should orbit around.
  4. Configure the orbit Distance, Pitch, and Yaw on the Static Orbit component.
  5. Set the Priority on the Phantom Camera component.
  6. The camera registers with the Cam Manager automatically on activation.

Inspector Properties

| Property | Type | Description |
| --- | --- | --- |
| OrbitDistance | float | The fixed distance from the camera to the follow target. |
| OrbitPitch | float | The vertical angle of the orbit in degrees. 0 is level; positive values look down. |
| OrbitYaw | float | The horizontal angle of the orbit in degrees. 0 faces the target’s forward direction. |

See Also


Get GS_PhantomCam

GS_PhantomCam — Explore this gem on the product page and add it to your project.

4.10.3.3 - First Person Cam

A first-person phantom camera type — planned for a future release.

The First Person Cam is a planned phantom camera type that will provide first-person camera behavior: the camera is positioned at the follow target’s head position and rotates based on player input, with configurable pitch and yaw limits.

This camera type is not yet available. When implemented, it will appear as a companion component alongside GS_PhantomCameraComponent on the camera entity.

For usage guides and setup examples, see The Basics: GS_PhantomCam.


See Also


Get GS_PhantomCam

GS_PhantomCam — Explore this gem on the product page and add it to your project.

4.10.4 - Blend Profiles

Data assets defining camera transition behavior — blend duration, easing curves, and per-camera-pair transition overrides.

Blend Profiles are data assets that control how the Cam Core transitions between Phantom Cameras. Each profile contains a list of blend entries, where each entry defines a From camera, a To camera, a blend duration, and an easing curve (see Curves Utility). This allows every camera-to-camera transition in your project to have unique timing and feel.

The GS_PhantomCamBlendProfile asset is created in the Asset Editor and assigned to the Cam Core component. When a camera transition occurs, the Cam Core queries the profile for the best matching blend entry.

For usage guides and setup examples, see The Basics: GS_PhantomCam.

Cam Blend Profile asset in the O3DE Asset Editor

 


How It Works

Blend Entries

Each entry in a Blend Profile defines a single transition rule:

  • From Camera — The name of the outgoing camera (the entity name of the phantom camera being left).
  • To Camera — The name of the incoming camera (the entity name of the phantom camera being transitioned to).
  • Blend Time — The duration of the transition in seconds.
  • Easing Type — The interpolation curve applied during the blend. See Curves Utility for the full list of available easing types.

Camera names correspond to the entity names of your phantom camera entities in the scene.

Best Target Blend

When the Cam Core needs to transition, it calls GetBestBlend(fromCam, toCam) on the assigned Blend Profile. The system evaluates blend entries in order of specificity:

  1. Exact match — An entry with both the From and To camera names matching exactly.
  2. Any-to-specific — An entry with From set to blank or “any” and To matching the incoming camera name.
  3. Specific-to-any — An entry with From matching the outgoing camera name and To set to blank or “any”.
  4. Default fallback — If no entry matches, the Cam Core uses its own default blend time and easing values configured on the component.

This layered resolution allows you to define broad defaults (e.g. “any” to “any” at 1.0 seconds) while overriding specific transitions (e.g. “MenuCam” to “GameplayCam” at 2.5 seconds with ease-in-out).
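
The resolution hierarchy can be sketched as follows. PhantomBlend's real definition lives in GS_PhantomCam, so the BlendEntry type here is a minimal stand-in that only models the matching order, treating blank or “any” as a wildcard.

```cpp
#include <cassert>
#include <string>
#include <vector>

// Minimal stand-in for a blend entry (the real asset also stores an easing type).
struct BlendEntry
{
    std::string from;
    std::string to;
    float blendTime = 0.0f;
};

static bool IsAny(const std::string& name)
{
    return name.empty() || name == "any";
}

// Returns the most specific matching entry, or nullptr so the caller can
// fall back to the Cam Core's default blend time and easing.
const BlendEntry* GetBestBlend(
    const std::vector<BlendEntry>& blends,
    const std::string& fromCam, const std::string& toCam)
{
    const BlendEntry* anyToSpecific = nullptr;
    const BlendEntry* specificToAny = nullptr;
    for (const BlendEntry& entry : blends)
    {
        if (entry.from == fromCam && entry.to == toCam)
        {
            return &entry; // 1. Exact match wins immediately.
        }
        if (IsAny(entry.from) && entry.to == toCam && !anyToSpecific)
        {
            anyToSpecific = &entry; // 2. Any-to-specific.
        }
        if (entry.from == fromCam && IsAny(entry.to) && !specificToAny)
        {
            specificToAny = &entry; // 3. Specific-to-any.
        }
    }
    if (anyToSpecific) { return anyToSpecific; }
    if (specificToAny) { return specificToAny; }
    return nullptr; // 4. Default fallback handled by the caller.
}
```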


Data Model

GS_PhantomCamBlendProfile

The top-level asset class. Extends AZ::Data::AssetData. Requires GS_AssetReflectionIncludes.h when reflecting — see Serialization Helpers.

| Field | Type | Description |
| --- | --- | --- |
| Blends | AZStd::vector&lt;PhantomBlend&gt; | The list of blend entries defining camera transitions. |

PhantomBlend

A single blend entry within the profile.

| Field | Type | Description |
| --- | --- | --- |
| FromCamera | AZStd::string | The entity name of the outgoing phantom camera. Blank or “any” matches all outgoing cameras. |
| ToCamera | AZStd::string | The entity name of the incoming phantom camera. Blank or “any” matches all incoming cameras. |
| BlendTime | float | Duration of the blend transition in seconds. |
| EasingType | EasingCurve | The interpolation curve for the blend. See Curves Utility. |

API Reference

GS_PhantomCamBlendProfile Methods

| Method | Parameters | Returns | Description |
| --- | --- | --- | --- |
| GetBestBlend | AZStd::string fromCam, AZStd::string toCam | const PhantomBlend* | Returns the best matching blend entry for the given camera pair, or nullptr if no match is found. Resolution follows the specificity hierarchy. |

Creating a Blend Profile

  1. Open the Asset Editor in O3DE.
  2. Select New and choose GS_BlendProfile from the asset type list.
  3. Add blend entries using the + button.
  4. For each entry:
    • Set the From Camera name (or leave blank for “any”).
    • Set the To Camera name (or leave blank for “any”).
    • Set the Blend Time in seconds.
    • Choose an Easing Type from the dropdown.
  5. Save the asset.
  6. Assign the asset to the Cam Core component’s Blend Profile inspector slot.

For a full walkthrough, see the PhantomCam Set Up Guide.


See Also

For related PhantomCam components:

For related resources:

For conceptual overviews and usage guides:


Get GS_PhantomCam

GS_PhantomCam — Explore this gem on the product page and add it to your project.

4.10.5 - Camera Influence Fields

Global and spatial camera influence components — priority modifiers that affect phantom camera selection without changing base priorities.

Camera Influence Fields modify the effective priority of Phantom Cameras without changing their base priority values. They call AddCameraInfluence / RemoveCameraInfluence on the Cam Manager bus, identified by camera name. Multiple influences on the same camera stack additively.

There are two influence component types:

  • GlobalCameraInfluenceComponent — Applies a priority influence globally for its entire active lifetime. Place this on the StageData entity so it activates and deactivates automatically with the stage.
  • CameraInfluenceFieldComponent — Applies a priority influence only when the player enters a defined spatial volume. Requires a PhysX Collider set as a trigger on the same entity. See Physics Trigger Volume Utility.

For usage guides and setup examples, see The Basics: GS_PhantomCam.

 

Contents


How It Works

Global Influence

The GlobalCameraInfluenceComponent applies a constant priority modifier to a named phantom camera. On Activate(), it calls AddCameraInfluence on the Cam Manager bus. On Deactivate(), it calls RemoveCameraInfluence.

Placement: Add this component to the StageData entity. Because StageData activates at stage load and deactivates at stage unload, the camera influence is automatically scoped to the stage that defines it — no manual enable/disable management needed.

Spatial Influence

The CameraInfluenceFieldComponent uses a PhysX Collider (trigger mode) to detect when the player enters or exits a defined region. On entry, it adds the influence; on exit, it removes it. See Physics Trigger Volume Utility for collider setup.

This is useful for level design — switching to an overhead camera when the player enters a specific room, or boosting a scenic camera in a vista area.

Influence Stacking

Multiple influences can be active on the same camera simultaneously. The Cam Manager sums all active influences with the base priority to compute the effective priority used during EvaluatePriority().
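Because stacking is additive, the effective priority reduces to a simple sum. A standalone sketch of the rule (illustrative types, not the Cam Manager's internals):

```cpp
#include <numeric>
#include <vector>

// Effective priority = base priority + sum of all active influences.
// Illustrates the stacking rule the Cam Manager applies during
// EvaluatePriority(); this is not the engine code itself.
float EffectivePriority(float basePriority, const std::vector<float>& influences)
{
    return std::accumulate(influences.begin(), influences.end(), basePriority);
}
```

For example, a camera with base priority 100 under a +50 global influence and a -20 field influence resolves to an effective priority of 130.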


CamInfluenceData Structure

Both component types use the CamInfluenceData structure to define their effect.

| Field | Type | Description |
| --- | --- | --- |
| CameraName | AZStd::string | Entity name of the phantom camera to influence. Must match exactly. |
| Influence | float | Priority modifier. Positive values increase effective priority; negative values decrease it. |

API Reference

Request Bus: GlobalCameraInfluenceRequestBus

Commands for the global camera influence system. Single address, single handler.

| Method | Parameters | Returns | Description |
| --- | --- | --- | --- |
| AddCameraInfluence | AZStd::string camName, float influence | void | Adds a priority influence to the named camera. Delegates to the Cam Manager. |
| RemoveCameraInfluence | AZStd::string camName, float influence | void | Removes a priority influence from the named camera. Delegates to the Cam Manager. |

Component Reference

GlobalCameraInfluenceComponent

Applies a camera priority influence globally for its entire active lifetime.

| Property | Type | Description |
| --- | --- | --- |
| CamInfluenceData | CamInfluenceData | The camera name and influence value to apply. |

Behavior: On Activate(), adds the influence. On Deactivate(), removes it. Constant while active.

Placement: Add to the StageData entity to scope the influence to the stage lifecycle.


CameraInfluenceFieldComponent

Applies a camera priority influence when the player enters a spatial trigger volume.

| Property | Type | Description |
| --- | --- | --- |
| CamInfluenceData | CamInfluenceData | The camera name and influence value to apply when triggered. |

Behavior: Requires a PhysX Collider (trigger) on the same entity. Entry adds the influence; exit removes it.

Setup:

  1. Add CameraInfluenceFieldComponent to an entity.
  2. Add a PhysX Collider (set as trigger) to the same entity. See Physics Trigger Volume Utility.
  3. Set the Camera Name to the target phantom camera’s entity name.
  4. Set the Influence value (positive to boost, negative to reduce priority).

Usage Examples

C++ – Adding a Global Influence

#include <GS_PhantomCam/GS_PhantomCamBus.h>

// Boost "CinematicCam" priority by 50 during a cutscene
GS_PhantomCam::CamManagerRequestBus::Broadcast(
    &GS_PhantomCam::CamManagerRequestBus::Events::AddCameraInfluence,
    AZStd::string("CinematicCam"),
    50.0f
);

// Remove the boost when the cutscene ends
GS_PhantomCam::CamManagerRequestBus::Broadcast(
    &GS_PhantomCam::CamManagerRequestBus::Events::RemoveCameraInfluence,
    AZStd::string("CinematicCam"),
    50.0f
);

See Also

For related PhantomCam components:

For related utilities:

For conceptual overviews and usage guides:


Get GS_PhantomCam

GS_PhantomCam — Explore this gem on the product page and add it to your project.

4.10.6 - Templates

ClassWizard templates for GS_PhantomCam — custom phantom camera behaviour components.

All GS_PhantomCam extension types are generated through the ClassWizard CLI. The wizard handles UUID generation, cmake file-list registration, and module descriptor injection automatically.

For usage guides and setup examples, see The Basics: GS_PhantomCam.

python ClassWizard.py \
    --template <TemplateName> \
    --gem <GemPath> \
    --name <SymbolName> \
    [--input-var key=value ...]

 

Contents


Phantom Camera Component

Template: PhantomCamera

Creates a custom camera behaviour component, a child of GS_PhantomCameraComponent. Multiple camera components can coexist on a Camera Entity; the GS_CamManagerComponent activates them by priority. Override the follow, look-at, and tick virtuals to define custom camera positioning and aiming logic.

Generated files:

  • Source/${Name}PhantomCamComponent.h/.cpp

CLI:

python ClassWizard.py --template PhantomCamera --gem <GemPath> --name <Name>

Post-generation: None — cmake and module registration are fully automatic. Override the following virtual methods:

| Method | Purpose |
| --- | --- |
| ProcessPhysicsFollow() | Drive camera position each physics tick using VelocitySpringDamper |
| ProcessPhysicsLookAt() | Drive camera rotation using QuaternionSpringDamper |
| EvaluateCamTick(dt) | Per-frame camera logic (blend weights, FOV, offsets) |
| ProcessTransformFollow() | Position follow when physics is not available |
| ProcessTransformLookAt() | Rotation follow when physics is not available |

Extensibility: One component per camera mode (e.g. ThirdPerson, Aim, Dialogue, Cinematic). Components declare incompatibility with GS_PhantomCameraComponentService so only one camera behaviour is active at a time on an entity. Swap active cameras by toggling component activation, or let the CamManager handle priority.

See also: Phantom Cameras — full extension guide with complete header and implementation examples.


See Also

For the full API, component properties, and C++ extension guide:

For all ClassWizard templates across GS_Play gems:


Get GS_PhantomCam

GS_PhantomCam — Explore this gem on the product page and add it to your project.

4.10.7 - 3rd Party Implementations

For usage guides and setup examples, see The Basics: GS_PhantomCam.


Get GS_PhantomCam

GS_PhantomCam — Explore this gem on the product page and add it to your project.

4.11 - GS_Stats

RPG stat definitions, stat containers, and a polymorphic status effect system for character and entity attribute management.

GS_Stats provides the attribute and status effect layer for GS_Play projects. It gives entities a stat container that holds named, typed values (health, stamina, speed, and any custom stat you define), and a status effect system that applies timed or conditional modifications to those stats. GS_Stats is under active development — the API surface is evolving and some features listed here may expand in future releases.

For usage guides and setup examples, see The Basics: GS_Stats.

 

Contents


Status Effects

The status effect system applies timed or trigger-driven modifications to entity stats. Effects are polymorphic data assets that define what stat they target, how they modify it, and when they expire. Multiple effects stack on the same entity and resolve in a defined order.

| Area | Contents |
| --- | --- |
| Effect Assets | Data-driven effect definitions with target stat, modification type, magnitude, and duration. |
| Effect Container | Runtime component that holds the active effect stack for an entity and processes tick-based expiry. |
| Modification Types | Additive, multiplicative, and override modification strategies. |
| Conditions | Conditional triggers for applying or removing effects based on stat thresholds or game records. |
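To illustrate how the three modification types might combine, here is a standalone sketch. The resolution order is an assumption for illustration (additive first, then multiplicative, with an override winning outright); the actual order is defined by the Status Effects API, so verify it there before relying on this model.

```cpp
#include <optional>
#include <vector>

// Hypothetical modifier entry (not a GS_Stats type) -- one per active effect.
enum class ModType { Additive, Multiplicative, Override };
struct Mod { ModType type; float value; };

// Assumed order: sum additive terms, multiply multiplicative factors,
// and let an override (if present) replace the result entirely.
float ResolveStat(float base, const std::vector<Mod>& mods)
{
    float add = 0.0f, mul = 1.0f;
    std::optional<float> overrideVal;
    for (const Mod& m : mods) {
        switch (m.type) {
            case ModType::Additive:       add += m.value; break;
            case ModType::Multiplicative: mul *= m.value; break;
            case ModType::Override:       overrideVal = m.value; break;
        }
    }
    return overrideVal.value_or((base + add) * mul);
}
```

Under these assumptions, a base of 100 with a +20 additive effect and a 1.5x multiplicative effect resolves to 180, while any override effect short-circuits the stack.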

Status Effects API


Installation

GS_Stats is a standalone gem with no external GS gem dependencies beyond GS_Core.

  1. Enable GS_Stats and GS_Core in your O3DE project’s gem list.
  2. Add the stat container component to any entity that requires tracked attributes.
  3. Define stat definitions in your project’s stat registry (data asset).
  4. Add status effect asset files to your project and reference them from abilities, items, or world triggers.

Note: GS_Stats is under active development. Check the changelog for the latest additions before starting integration work.


See Also

For conceptual overviews and usage guides:

For sub-system references:

For related resources:


Get GS_Stats

GS_Stats — Explore this gem on the product page and add it to your project.

4.11.1 - 3rd Party Implementations

For usage guides and setup examples, see The Basics: GS_Stats.


Get GS_Stats

GS_Stats — Explore this gem on the product page and add it to your project.

4.11.2 - Status Effect

The polymorphic status effect system for GS_Stats: data-driven, stackable, timed and conditional stat modifications.

For usage guides and setup examples, see The Basics: GS_Stats.


Get GS_Stats

GS_Stats — Explore this gem on the product page and add it to your project.

4.12 - GS_Complete

Cross-gem integration layer — reference components that combine multiple GS_Play gems for camera-aware performers, cinematic controllers, and dialogue UI selection.

GS_Complete is the cross-gem integration layer. It depends on every other GS gem and provides reference components that combine functionality from two or more gems — camera-aware paper facing, cinematic-driven unit controllers, dialogue camera effects, and dialogue UI selection buttons. GS_Complete exists as a pattern reference: it proves that gem combination works cleanly via companion components and bus events, without modifying any gem’s source code.

 

Contents


Cinematics + PhantomCam

Dialogue effects that control camera behavior during conversations. These extend DialogueEffect and are discovered automatically through O3DE serialization, demonstrating how external gems add new dialogue effects without modifying GS_Cinematics.

| Component | Purpose |
| --- | --- |
| SetCamPriority_DialogueEffectComponent | Dialogue effect that changes a phantom camera’s priority during a dialogue sequence. |
| TogglePhantomCam_DialogueEffectComponent | Dialogue effect that enables or disables a phantom camera during a dialogue sequence. |

Cinematics + UI

Connects LyShine button interactions to the dialogue selection system, enabling player choice UI in dialogue sequences.

| Component | Purpose |
| --- | --- |
| GS_UIDialogueSelectButtonComponent | Button component for dialogue choice selection — bridges LyShine button events to dialogue selection bus events. |

GS_UIDialogueSelectButtonComponent in the O3DE Inspector


Performer + PhantomCam

Camera-aware extensions for the Performer system, enabling 2.5D paper characters to face the active camera correctly.

| Component | Purpose |
| --- | --- |
| CamCorePaperFacingHandlerComponent | Camera-aware paper facing for 2.5D performers — uses CamCore notifications to adjust paper performer facing direction relative to the active camera. |

Unit + Cinematics

Unit controller extensions driven by the cinematic system rather than player input.

| Component | Purpose |
| --- | --- |
| CinematicControllerComponent | Unit controller driven by cinematic sequences — extends GS_UnitControllerComponent to be driven by cinematic playback. |

Cinematic Controller component in the O3DE Inspector


Architecture

GS_Complete follows the companion component pattern — each cross-gem component lives on the same entity as the components it integrates, communicating via their existing EBus interfaces. No gem source code is modified.

GS_Complete (integration layer)
    ├── Cinematics × PhantomCam
    │       ├── SetCamPriority_DialogueEffectComponent
    │       └── TogglePhantomCam_DialogueEffectComponent
    ├── Cinematics × UI
    │       └── GS_UIDialogueSelectButtonComponent
    ├── Performer × PhantomCam
    │       └── CamCorePaperFacingHandlerComponent
    └── Unit × Cinematics
            └── CinematicControllerComponent

System Components

| Component | Purpose |
| --- | --- |
| GS_CompleteSystemComponent | Runtime system component for GS_Complete. |

Dependencies

  • All GS gems (required — GS_Core, GS_Audio, GS_Cinematics, GS_Environment, GS_Interaction, GS_Juice, GS_Performer, GS_PhantomCam, GS_UI, GS_Unit, GS_AI, GS_Platform)
  • LyShine (required — for dialogue UI selection)
  • RecastNavigation (required — via GS_Cinematics)

Installation

  1. Enable the GS_Complete gem in your project configuration.
  2. Ensure all other GS gems are also enabled — GS_Complete depends on the full suite.
  3. Add cross-gem components to entities alongside the gem components they integrate.

See Also

For conceptual overviews and usage guides:

For related API references:

4.12.1 - Third Party Implementations

Integration guides for third-party cross-gem components with GS_Complete.

This section will contain integration guides for connecting third-party cross-gem components with the GS_Complete integration layer.

4.13 - GS_UI

The complete UI framework — single-tier page navigation, motion-based animations, enhanced buttons, input interception, load screens, and pause menus.

GS_UI is the gem that builds the game’s interface layer on top of O3DE’s LyShine system. It replaces the old multi-tier Hub/Window/Page hierarchy with a clean single-tier model: the UI Manager owns canvases, and each canvas root is a Page that can parent any depth of nested child Pages. Navigation is fully recursive — any page can push, pop, or swap focus within its subtree. Animations are authored as .uiam data assets using eight LyShine-specific motion tracks (position, scale, rotation, alpha, color, text). Buttons, input interception, load screens, and pause menus round out the feature set.

For usage guides and setup examples, see The Basics: GS_UI.

 

Contents


UI Manager

The singleton that owns all loaded canvases. The UI Manager loads and unloads canvases by name, maintains a global focus stack across canvases, and drives the startup focus sequence deterministically.

| Component | Purpose |
| --- | --- |
| GS_UIManagerComponent | Singleton manager. Loads canvases, maintains global focus stack, drives startup focus. |

UI Manager API


Pages

The core navigation system. A single GS_UIPageComponent replaces the old Hub, Window, and Page roles. When m_isRoot is true, the page registers itself with the UI Manager as a root canvas entry point. Pages can parent other pages to form any navigation depth. Focus is managed through push/pop stacks at each level.

| Component | Purpose |
| --- | --- |
| GS_UIPageComponent | Core navigation component. Handles root canvas registration, child page management, focus push/pop, and show/hide transitions. |

Pages API


UI Interaction

The button and input interception layer. GS_ButtonComponent plays motion-based animations on hover and select. GS_UIInputInterceptorComponent captures input events while a canvas is focused, preventing them from reaching gameplay systems.

| Component | Purpose |
| --- | --- |
| GS_ButtonComponent | Enhanced button. Plays UiAnimationMotion assets on hover, unhover, and select events. |
| GS_UIInputInterceptorComponent | Intercepts configured input events and re-broadcasts on UIInputNotificationBus. |

UI Interaction API


UI Animation

A GS_Motion extension with eight LyShine-specific animation tracks. Animations are authored as .uiam assets in the editor and referenced by page transition fields or played directly by the standalone motion component.

| Type | Purpose |
| --- | --- |
| UiAnimationMotionComponent | Standalone component for playing a UiAnimationMotion asset on any entity. |
| UiAnimationMotionAsset | Data asset (.uiam) that holds a list of UiMotionTracks and playback settings. |
| UiPositionTrack | Animates LyShine element position offset. |
| UiScaleTrack | Animates LyShine element scale. |
| UiRotationTrack | Animates LyShine element rotation. |
| UiElementAlphaTrack | Animates element-level alpha. |
| UiImageAlphaTrack | Animates UiImageComponent image alpha. |
| UiImageColorTrack | Animates UiImageComponent color tint. |
| UiTextColorTrack | Animates UiTextComponent text color. |
| UiTextSizeTrack | Animates UiTextComponent font size. |

UI Animation API


Widgets

Standalone UI components for game-event-driven scenarios outside the page navigation model — load screens during transitions, pause menus overlaying gameplay.

| Component | Purpose |
| --- | --- |
| GS_LoadScreenComponent | Manages loading screen display during stage transitions. |
| PauseMenuComponent | Manages pause state and pause menu overlay. |

Widgets API


Installation

GS_UI requires GS_Core, LyShine, and LmbrCentral.

  1. Enable GS_UI in Project Manager or project.json.
  2. Add GS_UIManagerComponent to the Game Manager entity and register it in the Startup Managers list.
  3. Create a LyShine canvas and add GS_UIPageComponent (with m_isRoot = true) to the root element.
  4. Set m_uiName on the root page to match the name you will use when calling LoadGSUI.
  5. Nest child pages under the root by adding GS_UIPageComponent to child entities and connecting them through m_defaultChildPage.
  6. Refer to the UI Set Up Guide for a full walkthrough.

See Also

For conceptual overviews and usage guides:

For related resources:


Get GS_UI

GS_UI — Explore this gem on the product page and add it to your project.

4.13.1 - UI Manager

The singleton manager that owns all loaded UI canvases, maintains a global focus stack, and drives the startup focus sequence.

For usage guides and setup examples, see The Basics: GS_UI.

UI Manager component in the O3DE Inspector

The UI Manager is the singleton that controls the lifecycle of every UI canvas in your project. It loads and unloads LyShine canvases by name, maintains a global focus stack that tracks which canvas is currently active, and drives the startup focus sequence so that the correct UI appears automatically when the game finishes initializing.

GS_UIManagerComponent extends GS_ManagerComponent, which means it participates in the Game Manager’s staged startup. It is spawned by the Game Manager alongside other managers and receives OnStartupComplete to begin loading canvases and focusing the startup UI.

 

Contents


How It Works

Canvas Lifecycle

Each UI canvas is identified by a unique name string. You call LoadGSUI(name, path) to load a canvas from a .uicanvas asset path and associate it with the given name. The manager stores this in its m_activeUIs map. When you no longer need the canvas, call UnloadGSUI(name) to destroy it and remove it from the map.

Once a canvas is loaded, its root page component (a GS_UIPageComponent with m_isRoot = true) calls RegisterRootPage(name, entity) to register itself as the entry point for that canvas. The manager returns true if registration succeeds, false if the name is already registered or does not match a loaded canvas.


Global Focus Stack

The UI Manager maintains m_globalFocusStack, which tracks the order of focused canvases. When you call FocusUI(name), the named canvas is pushed onto the focus stack and becomes the active UI. NavLastUI() pops the current canvas and returns focus to the previous one. This allows layered UI flows such as opening a settings menu on top of a pause menu and navigating back through them in order.

ToggleUI(name, on) shows or hides a canvas by name. When toggling on, it calls FocusUI internally; when toggling off, it removes the canvas from the focus stack.

GetFocusedUI() returns the name of the currently focused canvas (the top of the stack).
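The stack semantics above can be modeled in a few lines. This is a standalone sketch with std types standing in for the manager's internals, not the actual GS_UIManagerComponent code:

```cpp
#include <string>
#include <vector>

// Minimal model of the global focus stack behavior described above.
struct FocusStackModel {
    std::vector<std::string> stack;

    // FocusUI: the named canvas is pushed and becomes the active UI.
    void FocusUI(const std::string& name) { stack.push_back(name); }

    // NavLastUI: pop the current canvas; focus returns to the previous one.
    void NavLastUI() { if (!stack.empty()) stack.pop_back(); }

    // GetFocusedUI: top of the stack, or empty when nothing is focused.
    std::string GetFocusedUI() const { return stack.empty() ? "" : stack.back(); }
};
```

Opening a settings menu over a pause menu is then FocusUI("PauseMenu") followed by FocusUI("Settings"); one NavLastUI() call returns focus to the pause menu.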


Startup Sequence

The UI Manager’s startup follows this deterministic sequence:

  1. OnStartupComplete — The Game Manager broadcasts startup complete. The UI Manager loads all configured canvases via LoadGSUI.
  2. Root Page Registration — Each canvas’s root page component calls RegisterRootPage(name, entity) during its own activation.
  3. Startup Focus — The UI Manager checks each registered root page name against m_startupFocusUI. When it finds a match, it calls FocusUI(name) to focus that canvas as the initial UI.

This sequence ensures that canvases are loaded before pages register, and the correct startup UI receives focus without race conditions.


Data Types

UICanvasWindowPair

A pairing of a loaded LyShine canvas entity with its associated root page entity. Used internally by the UI Manager to track loaded canvases.

UINamePathPair

A pairing of a UI name string with a .uicanvas asset path. Used to configure which canvases the UI Manager loads on startup.


Inspector Properties

| Property | Type | Description |
| --- | --- | --- |
| Startup Focus UI | AZStd::string | The name of the UI canvas to focus automatically after startup completes. Must match a registered root page name. |
| UI Name Path Pairs | AZStd::vector<UINamePathPair> | The list of canvas name/path pairs to load on startup. Each entry maps a unique name to a .uicanvas asset path. |

API Reference

Request Bus: UIManagerRequestBus

Commands sent to the UI Manager. Singleton bus — single address, single handler.

| Method | Parameters | Returns | Description |
| --- | --- | --- | --- |
| LoadGSUI | const AZStd::string& name, const AZStd::string& path | void | Loads a LyShine canvas from the given asset path and registers it under the given name. |
| UnloadGSUI | const AZStd::string& name | void | Destroys the canvas registered under the given name and removes it from the active UI map. |
| RegisterRootPage | const AZStd::string& name, AZ::EntityId entity | bool | Registers a root page entity for the named canvas. Returns true on success, false if the name is unrecognized or already registered. |
| UnregisterRootPage | const AZStd::string& name | void | Removes the root page registration for the named canvas. |
| FocusUI | const AZStd::string& name | void | Pushes the named canvas onto the global focus stack and activates it. |
| NavLastUI | (none) | void | Pops the current canvas from the global focus stack and returns focus to the previous canvas. |
| ToggleUI | const AZStd::string& name, bool on | void | Shows or hides the named canvas. Toggling on focuses it; toggling off removes it from the focus stack. |
| GetFocusedUI | (none) | AZStd::string | Returns the name of the currently focused UI canvas (top of the global focus stack). |
| GetUICanvasEntity | const AZStd::string& name | AZ::EntityId | Returns the LyShine canvas entity for the named UI. |
| GetUIRootPageEntity | const AZStd::string& name | AZ::EntityId | Returns the root page entity registered for the named UI. |

Usage Examples

C++ – Loading and Focusing a UI Canvas

#include <GS_UI/GS_UIBus.h>

// Load a canvas
GS_UI::UIManagerRequestBus::Broadcast(
    &GS_UI::UIManagerRequestBus::Events::LoadGSUI,
    AZStd::string("TitleScreen"),
    AZStd::string("ui/titlescreen.uicanvas")
);

// Focus it
GS_UI::UIManagerRequestBus::Broadcast(
    &GS_UI::UIManagerRequestBus::Events::FocusUI,
    AZStd::string("TitleScreen")
);

C++ – Navigating Back

// Return to the previous UI in the focus stack
GS_UI::UIManagerRequestBus::Broadcast(
    &GS_UI::UIManagerRequestBus::Events::NavLastUI
);

C++ – Toggling a UI Canvas

// Show the pause menu
GS_UI::UIManagerRequestBus::Broadcast(
    &GS_UI::UIManagerRequestBus::Events::ToggleUI,
    AZStd::string("PauseMenu"),
    true
);

See Also

For conceptual overviews and usage guides:

For component references:


Get GS_UI

GS_UI — Explore this gem on the product page and add it to your project.

4.13.2 - Page Navigation

The core single-tier navigation component – root canvas registration, nested child page management, focus push/pop stacks, and show/hide transitions.

For usage guides and setup examples, see The Basics: GS_UI.

The GS_UIPageComponent in the UI Editor, configured as a root page with a default child page assigned.

GS_UIPageComponent is the core navigation component of the GS_UI system. It replaces the legacy multi-tier Hub/Window/Page hierarchy with a single, unified component that handles all levels of UI navigation. A single GS_UIPageComponent can serve as a root canvas entry point, a mid-level container, or a leaf page – its role is determined entirely by configuration.

When m_isRoot is set to true, the page registers itself with the UI Manager as the root entry point for its canvas. Non-root pages are nested as children of other pages and managed through their parent’s child page system. This creates a single-tier model where any depth of navigation is achieved through recursive nesting of the same component.

Focus management uses push/pop stacks at each level. When a child page is focused, it is pushed onto its parent’s focus stack. Navigating back pops the stack and restores the previous child. The NavigationReturnPolicy enum controls whether returning to a page restores the last focused child or always resets to the default.

Required Companion Components

Root pages depend on two O3DE components placed on the same entity:

| Component | Purpose |
| --- | --- |
| FaderComponent | Drives alpha-based fade transitions during show/hide animations. Required for UiAnimationMotion tracks that animate element alpha. |
| HierarchicalInteractionToggleComponent | Enables or disables the entire interactable subtree when the page shows or hides. Prevents input from reaching hidden page elements. |

These are standard LyShine/LmbrCentral components — add them alongside GS_UIPageComponent on every root page entity.

 

Root Page component showing its dependency components in the O3DE Inspector

Contents


How It Works

Root Pages

A root page (m_isRoot = true) is the entry point for a UI canvas. During activation, it calls RegisterRootPage(m_uiName, entityId) on the UIManagerRequestBus to register itself. The m_uiName field must match the name used when the canvas was loaded via LoadGSUI.

When m_startEnabled is true, the root page shows itself immediately upon registration. Otherwise, it remains hidden until the UI Manager calls FocusUI for its canvas name.


Child Page Management

When m_manageChildPages is true, the page acts as a container for child pages. Child pages register themselves with their parent during activation via RegisterChildPage(entity). The parent tracks all registered children and manages which one is currently visible and focused.

Each page maintains its own focus stack. When a child is focused, it is pushed onto the parent’s stack via PushFocus. PopFocus restores the previous child. The m_returnPolicy field controls behavior when navigating back to this page:

  • RestoreLast – Restores the last child that was focused before navigating away.
  • AlwaysDefault – Always resets to m_defaultChildPage, ignoring the focus history.

NavigateTo(forced) navigates the UI to show the calling page, regardless of where focus currently sits in the page tree. The algorithm works by building a path from the calling page up to the root, then cascading focus changes downward:

  1. Build path – Walk up from the calling page to the root, collecting each ancestor into a path array.
  2. Find divergence point – Starting from the highest ancestor, find the first level where the parent’s currently focused child does not match the path node. This is the “push point.”
  3. Change pages – From the push point downward, call ChangePage at each level to switch the visible child to the path node.
  4. Push focus – At the push point, call PushFocus to record the new child on the parent’s focus stack.
  5. Cascade – Continue down the path to the target page, focusing each level.

This algorithm ensures that navigating to a deeply nested page correctly updates every intermediate level.


NavigateBack() walks up from the calling page to find the first ancestor with a non-empty focus stack, then pops it:

  1. Walk up – Starting from the calling page, check each ancestor’s focus stack.
  2. Pop focus – At the first ancestor with a non-empty stack, call PopFocus to restore the previous child.
  3. Fallback to UI Manager – If no ancestor has a non-empty stack (the user has navigated all the way back to the root), call NavLastUI() on the UIManagerRequestBus to return to the previous canvas in the global focus stack.
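The walk-up in steps 1 through 3 reduces to finding the nearest ancestor with focus history. A standalone model, again with hypothetical types; in the real component the kNone result corresponds to falling back to NavLastUI on the UI Manager:

```cpp
#include <unordered_map>
#include <vector>

using PageId = int;
constexpr PageId kNone = -1;

// Minimal page node: a parent link and a stack of previously focused children.
struct Page {
    PageId parent = kNone;
    std::vector<PageId> focusStack;
};

// Walk up from `start` (inclusive); pop the first non-empty focus stack
// found. Returns the page whose stack was popped, or kNone when every
// ancestor's stack is empty (caller falls back to the UI Manager).
PageId NavigateBack(std::unordered_map<PageId, Page>& pages, PageId start)
{
    for (PageId p = start; p != kNone; p = pages.at(p).parent) {
        if (!pages.at(p).focusStack.empty()) {
            pages.at(p).focusStack.pop_back();
            return p;
        }
    }
    return kNone;
}
```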

Show/Hide Transitions

Each page can have three UiAnimationMotion assets assigned:

  • m_onShow – Played when the page becomes visible.
  • m_onShowLoop – Played in a loop after the show animation completes.
  • m_onHide – Played when the page is hidden.

These motion assets drive LyShine property animations (position, scale, rotation, alpha, color) to create smooth transitions between pages.


Data Types

NavigationReturnPolicy

Controls how a page restores child focus when navigated back to.

| Value | Description |
| --- | --- |
| RestoreLast | Restores the last focused child page from the focus stack. |
| AlwaysDefault | Always resets to m_defaultChildPage, clearing focus history. |

FocusEntry

A single entry in a page’s focus stack.

| Field | Type | Description |
| --- | --- | --- |
| focusedChild | AZ::EntityId | The entity ID of the child page that was focused. |

Inspector Properties

| Property | Type | Description |
| --- | --- | --- |
| Is Root | bool | When true, this page registers itself with the UI Manager as a root canvas entry point. |
| UI Name | AZStd::string | (Root only) The name this root page registers under. Must match the name used in LoadGSUI. |
| Start Enabled | bool | (Root only) When true, the root page shows itself immediately upon registration. |
| Page Name | AZStd::string | A human-readable name for this page, used for identification and debugging. |
| Default Child Page | AZ::EntityId | The child page entity to show by default when this page is focused. |
| Return Policy | NavigationReturnPolicy | Controls whether returning to this page restores the last child or always resets to the default. |
| Manage Child Pages | bool | When true, this page acts as a container that manages nested child pages. |
| Default Interactable | AZ::EntityId | The LyShine interactable element to focus by default when this page is shown (e.g. the first button). |
| On Show | UiAnimationMotion | Animation played when this page becomes visible. |
| On Show Loop | UiAnimationMotion | Looping animation played after the show animation completes. |
| On Hide | UiAnimationMotion | Animation played when this page is hidden. |

GS_UIPageComponent in the O3DE Inspector


API Reference

Request Bus: UIPageRequestBus

Commands sent to a specific page by entity ID. Addressed bus – addressed by EntityId.

User-Facing Methods

These methods are intended to be called from gameplay code, ScriptCanvas, or other UI components to drive navigation.

| Method | Parameters | Returns | Description |
| --- | --- | --- | --- |
| NavigateTo | bool forced | void | Navigates the entire page tree to show this page. Builds a path to root and cascades focus changes downward. When forced is true, bypasses transition guards. |
| NavigateBack | (none) | void | Walks up from this page to find the first ancestor with focus history and pops it. Falls back to NavLastUI if no ancestor has history. |
| ChangePage | AZ::EntityId target, bool takeFocus, bool forced | void | Switches this page’s visible child to the target entity. When takeFocus is true, the target also receives interactable focus. |
| ToggleShow | bool on | void | Shows or hides this page. Plays the appropriate show/hide animation. |
| ChangeUI | const AZStd::string& targetUI, bool takeFocus, bool hideThis | void | Switches to a different UI canvas by name. Optionally hides this page and gives focus to the target canvas. |
| FocusChildPageByName | const AZStd::string& name, bool forced | void | Finds a child page by its m_pageName and focuses it. |

Internal Methods

These methods are used internally by the navigation system. They can be called from C++ when building custom navigation behavior, but are not typically called from ScriptCanvas.

| Method | Parameters | Returns | Description |
|---|---|---|---|
| FocusPage | bool forced | void | Activates this page, shows it, and focuses its default interactable. Called as part of the NavigateTo cascade. |
| PushFocus | AZ::EntityId child, bool forced | void | Pushes a child page onto this page’s focus stack and focuses it. |
| PopFocus | (none) | void | Pops the top entry from this page’s focus stack and restores the previous child. Respects m_returnPolicy. |
| ClearFocusHistory | (none) | void | Clears this page’s entire focus stack. |
| RegisterChildPage | AZ::EntityId entity | void | Registers a child page entity with this parent. Called by child pages during their activation. |
| NotifyInteractableFocused | AZ::EntityId entity | void | Notifies this page that a LyShine interactable has received focus. Used for tracking the current interactable selection. |
| GetResolvedInteractable | (none) | AZ::EntityId | Returns the currently resolved default interactable for this page, accounting for child page focus state. |
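The PushFocus / PopFocus semantics described above can be sketched in plain C++. This is a minimal, self-contained illustration of the stack behavior only (the `PageFocusStack` class and the `EntityId` stand-in are hypothetical; the real component also plays animations and honors m_returnPolicy):

```cpp
#include <vector>

// Hypothetical stand-in for AZ::EntityId, just for illustration.
using EntityId = int;

// Minimal sketch of the per-page focus history that PushFocus / PopFocus
// operate on. Only the stack itself is modeled here.
class PageFocusStack
{
public:
    void PushFocus(EntityId child)
    {
        m_stack.push_back(child); // the new child becomes the focused one
    }

    // Pops the top entry and returns the child that regains focus,
    // or an invalid id (-1) when no previous entry remains.
    EntityId PopFocus()
    {
        if (m_stack.empty())
        {
            return -1; // the real system falls back to NavLastUI here
        }
        m_stack.pop_back();
        return m_stack.empty() ? -1 : m_stack.back();
    }

    void ClearFocusHistory() { m_stack.clear(); }
    bool HasHistory() const { return !m_stack.empty(); }

private:
    std::vector<EntityId> m_stack;
};
```

The key property this models is that NavigateBack restores the *previously* focused child, not merely hides the current one.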

Usage Examples

C++ – Navigating to a Specific Page

#include <GS_UI/GS_UIBus.h>

// Navigate to a specific page entity, forcing the transition
GS_UI::UIPageRequestBus::Event(
    settingsPageEntityId,
    &GS_UI::UIPageRequestBus::Events::NavigateTo,
    true  // forced
);

C++ – Navigating Back

// Navigate back from the current page
GS_UI::UIPageRequestBus::Event(
    currentPageEntityId,
    &GS_UI::UIPageRequestBus::Events::NavigateBack
);

C++ – Switching Between UI Canvases

// From the title screen, switch to the gameplay HUD
GS_UI::UIPageRequestBus::Event(
    titlePageEntityId,
    &GS_UI::UIPageRequestBus::Events::ChangeUI,
    AZStd::string("GameplayHUD"),
    true,  // takeFocus
    true   // hideThis
);

Companion Component Pattern

GS_UIPageComponent handles navigation structure, but it does not contain game-specific logic. The intended pattern is to place a companion component on the same entity as the page component. The companion component listens for page show/hide events and drives game-specific behavior (populating lists, starting timers, sending analytics events).

This separation keeps the navigation system generic and reusable while allowing each page to have unique behavior through its companion.
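A companion might look like the following sketch. The handler interface and hook names (`OnPageShown` / `OnPageHidden`) are illustrative stand-ins, not the actual GS_UI notification bus; wire the equivalents to whichever page show/hide events your project exposes:

```cpp
// Minimal sketch of the companion pattern: the page component stays
// generic, and the companion reacts to show/hide events.
// This interface is a stand-in for the real page notification bus.
struct PageShowHideHandler
{
    virtual ~PageShowHideHandler() = default;
    virtual void OnPageShown() {}
    virtual void OnPageHidden() {}
};

class SettingsPageCompanion : public PageShowHideHandler
{
public:
    void OnPageShown() override
    {
        // Game-specific work: populate lists, start timers,
        // send analytics events, etc.
        m_visible = true;
        ++m_timesShown;
    }

    void OnPageHidden() override
    {
        // Tear down anything the page created while visible.
        m_visible = false;
    }

    bool IsVisible() const { return m_visible; }
    int TimesShown() const { return m_timesShown; }

private:
    bool m_visible = false;
    int m_timesShown = 0;
};
```

Because the companion holds all game-specific state, the same page component can be reused across projects unchanged.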


See Also

For component references:

  • UI Manager – The singleton that owns canvases and drives the global focus stack
  • UI Animation – Motion assets used for page show/hide transitions

For conceptual overviews and usage guides:


Get GS_UI

GS_UI — Explore this gem on the product page and add it to your project.

4.13.3 - UI Animation

GS_Motion extension with eight LyShine-specific animation tracks, data-driven .uiam assets, and a standalone playback component.

UI Animation is a domain extension of the GS_Motion system, purpose-built for animating LyShine UI elements. It provides eight concrete track types that target LyShine component properties (position, scale, rotation, alpha, color, text), a data asset format (.uiam) for authoring animations, and a standalone component for playing them on any entity.

The architecture follows the GS_Motion domain extension pattern: UiMotionTrack extends GS_MotionTrack as the domain base, each concrete track type targets a specific LyShine property, and UiAnimationMotionAsset extends GS_MotionAsset to bundle tracks into a single asset file.

For usage guides and setup examples, see The Basics: GS_UI.

 

Contents


Domain Extension Pattern

To add a custom track type, use the UiMotionTrack ClassWizard template — see GS_UI Templates. The template generates the header and source, and the only manual step is adding a Reflect(context) call in UIDataAssetsSystemComponent.cpp. Once reflected, the new track type appears automatically in the asset editor type picker via EnumerateDerived.

GS_Motion is designed to be extended per domain. GS_UI provides the UI domain extension:

| GS_Motion Base | GS_UI Extension |
|---|---|
| GS_MotionTrack | UiMotionTrack – Domain base for all UI tracks |
| GS_MotionAsset | UiAnimationMotionAsset – Asset container for UI tracks |
| GS_MotionComposite | Used internally by UiAnimationMotion for multi-track playback |
| GS_MotionProxy | Used internally by UiAnimationMotion for asset instance binding |

See GS_Motion for the full base system reference.


Track Types

UiMotionTrack

The domain base class that all UI-specific tracks extend. Inherits from GS_Core::GS_MotionTrack and provides LyShine entity targeting common to all UI tracks.

Concrete Tracks

Each track type animates a specific LyShine component property. All tracks are authored inside a .uiam asset.

| Track Type | Target Component | Property Animated | Value Type |
|---|---|---|---|
| UiPositionTrack | UiTransform2dComponent | Position offset | AZ::Vector2 |
| UiScaleTrack | UiTransform2dComponent | Scale | AZ::Vector2 |
| UiRotationTrack | UiTransform2dComponent | Rotation | float |
| UiElementAlphaTrack | UiElementComponent | Element-level alpha | float |
| UiImageAlphaTrack | UiImageComponent | Image alpha | float |
| UiImageColorTrack | UiImageComponent | Color tint | AZ::Color |
| UiTextColorTrack | UiTextComponent | Text color | AZ::Color |
| UiTextSizeTrack | UiTextComponent | Font size | float |

UiAnimationMotionAsset

The data asset that holds a complete UI animation. Extends GS_Core::GS_MotionAsset, which itself derives from AZ::Data::AssetData. Requires GS_AssetReflectionIncludes.h when reflecting — see Serialization Helpers.

| Property | Type | Description |
|---|---|---|
| Extension | .uiam | |
| Motion Name | AZStd::string | A human-readable name for this animation. |
| Loop | bool | Whether this animation loops continuously after completing. |
| Tracks | AZStd::vector<UiMotionTrack*> | The list of animation tracks that make up this motion. Each track targets a specific LyShine property on a specific entity. |

To learn how to author UI animation motions, refer to UI Animation: Authoring UIAnimation Motions


UiAnimationMotion

A serializable wrapper struct that bridges a .uiam asset to runtime playback. This is not a component – it is a data struct used as a field type in other components (such as GS_UIPageComponent’s m_onShow, m_onHide, and GS_ButtonComponent’s hover/select fields).

| Field | Type | Description |
|---|---|---|
| Asset | AZ::Data::Asset<UiAnimationMotionAsset> | Reference to the .uiam asset to play. |
| Motion Proxies | AZStd::vector<GS_MotionProxy> | Instance-specific bindings that map track targets to actual entities at runtime. |
| Motion Composite | AZStd::unique_ptr<GS_MotionComposite> | The runtime composite that coordinates multi-track playback. Created automatically from the asset data. |

UiAnimationMotion struct with proxy bindings configured


UiAnimationMotionComponent

A standalone component that plays a UiAnimationMotion on any entity. Use this when you need to trigger a UI animation outside of the page navigation or button systems – for example, an ambient animation on a background element, or a custom transition triggered from ScriptCanvas.

The component exposes the UiAnimationMotion struct in the Inspector, allowing you to assign a .uiam asset and configure it directly on any entity.

UiAnimationMotionComponent in the O3DE Inspector


Adding Custom Tracks

Use the UiMotionTrack ClassWizard template to generate a new UI motion track — see GS_UI Templates. The only manual step after generation is adding a Reflect(context) call in UIDataAssetsSystemComponent.cpp. Once reflected, the new track type is discovered automatically and appears in the asset editor type picker.

Override Init(ownerEntityId) to cache entity context, and Update(easedProgress) to drive the target property via its bus.

void ${Custom}Track::Init(AZ::EntityId ownerEntity)
{
    // Always call the base first; it sets m_owner.
    GS_UI::UiMotionTrack::Init(ownerEntity);

    // Cache the element's origin value so the track can apply additive offsets.
    // Example:
    //   UiTransformBus::EventResult(m_originPosition, ownerEntity, &UiTransformBus::Events::GetLocalPosition);
}

void ${Custom}Track::Update(float easedProgress)
{
    // Evaluate the gradient at the current eased progress and apply to the element.
    // easedProgress is in [0, 1] and already passed through the track's CurveType.
    // Example:
    //   const AZ::Vector2 offset = valueGradient.Evaluate(easedProgress);
    //   UiTransformBus::Event(m_owner, &UiTransformBus::Events::SetLocalPosition, m_originPosition + offset);
}

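As a concrete illustration of the Init / Update contract above, the sketch below models an additive position track end to end: the origin is cached in Init, and Update evaluates a gradient at the eased progress and applies it on top of the origin. It is self-contained and does not touch the real LyShine buses; `SketchPositionTrack`, `Vec2`, and the linear gradient are all illustrative assumptions:

```cpp
#include <algorithm>

// Minimal 2D vector stand-in for AZ::Vector2.
struct Vec2 { float x = 0.0f, y = 0.0f; };

// Sketch of an additive position track. The real track applies the
// result via UiTransformBus; here the "element" is just a Vec2.
class SketchPositionTrack
{
public:
    void Init(Vec2 origin)
    {
        m_origin = origin; // cached so offsets are additive, as above
    }

    // easedProgress is in [0, 1], already passed through the CurveType.
    Vec2 Update(float easedProgress) const
    {
        const float t = std::clamp(easedProgress, 0.0f, 1.0f);
        // A linear ramp from (0,0) to m_targetOffset stands in for
        // the track's value gradient.
        return { m_origin.x + m_targetOffset.x * t,
                 m_origin.y + m_targetOffset.y * t };
    }

    Vec2 m_targetOffset { 0.0f, -20.0f }; // e.g. slide 20 px upward
private:
    Vec2 m_origin;
};
```

Because the offset is additive against the cached origin, the same asset can animate elements placed anywhere on the canvas.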
Usage Examples

Typical Animation Setup

A page fade-in animation might use two tracks in a single .uiam asset:

  • A UiElementAlphaTrack that fades from 0 to 1 over 0.3 seconds.
  • A UiPositionTrack that slides the element 20 pixels upward over the same duration.

Assign this asset to a page’s On Show field and the page will automatically play the animation whenever it becomes visible.


Button Hover Animation

A button scale-bounce on hover:

  • A UiScaleTrack that scales from 1.0 to 1.1 and back to 1.0 over 0.15 seconds.

Assign this to the button’s Hover Motion field. Assign the reverse (scale from current back to 1.0) to Unhover Motion.


See Also

For component references:

  • Page Navigation – Pages use UiAnimationMotion for show/hide transitions
  • Buttons – Buttons use UiAnimationMotion for hover/select animations

For conceptual overviews and usage guides:

For related resources:

  • GS_Motion – The base motion system that UI Animation extends

Get GS_UI

GS_UI — Explore this gem on the product page and add it to your project.

4.13.4 - UI Interaction

Button animations and input interception — motion-driven hover/select states and input channel management for focused UI canvases.

UI Interaction covers the two systems that handle player input at the UI layer: the enhanced button component that plays motion-based animations on hover and select, and the input interceptor that captures and routes input events while a canvas is focused.

For usage guides and setup examples, see The Basics: GS_UI.

 

Contents


Button

An enhanced button component that extends LyShine interactable behavior with motion-driven hover and select state animations. Attach alongside a UiButtonComponent and assign .uiam assets for hover, unhover, and select states.

| Component | Purpose |
|---|---|
| GS_ButtonComponent | Enhanced button. Plays UiAnimationMotion assets on hover, unhover, and select events. |

Button API


UI Input

An input interceptor component that captures and redirects input events for UI-specific handling, preventing input from propagating to gameplay systems while a UI canvas is focused.

| Component / Type | Purpose |
|---|---|
| GS_UIInputInterceptorComponent | Intercepts configured input events and re-broadcasts them on UIInputNotificationBus. |
| GS_UIInputProfile | Data asset defining which input channels to intercept and how to route them. |

UI Input API


See Also

For component references:

  • Page Navigation – Pages use the same UiAnimationMotion system for show/hide transitions
  • UI Animation – The motion asset system used by buttons and pages
  • UI Manager – Controls which canvas is focused, activating the input interceptor

For conceptual overviews and usage guides:


Get GS_UI

GS_UI — Explore this gem on the product page and add it to your project.

4.13.4.1 - Buttons

Enhanced button component with motion-driven hover and select animations, built on top of LyShine interactable notifications.

GS_ButtonComponent is an enhanced button that extends O3DE’s LyShine interactable system with UiAnimationMotion support. Instead of relying on LyShine’s built-in state sprites or color transitions, GS_ButtonComponent plays data-driven motion assets for hover, unhover, and select states, giving you full control over animated button feedback using the same animation system that drives page transitions.

The component attaches to any entity that already has a LyShine interactable (e.g. UiButtonComponent). It listens for LyShine interactable notifications and triggers the appropriate motion asset when the interactable state changes.

For usage guides and setup examples, see The Basics: GS_UI.

GS_ButtonComponent in the O3DE Inspector

 

Contents


How It Works

LyShine Integration

GS_ButtonComponent connects to O3DE’s UiInteractableNotificationBus on the same entity. When LyShine fires hover, unhover, or press events, the component intercepts them and plays the corresponding UiAnimationMotion asset. This means you author your button animations as .uiam assets in the same workflow as page transitions and other UI animations.

The component does not replace LyShine’s interactable – it works alongside it. The LyShine UiButtonComponent still handles click detection, navigation, and accessibility. GS_ButtonComponent adds the visual animation layer on top.


Motion States

Each button can have up to three motion assets assigned:

  • Hover Motion – Played when the interactable receives hover focus (mouse enter or gamepad navigation highlight).
  • Unhover Motion – Played when the interactable loses hover focus.
  • Select Motion – Played when the interactable is pressed/selected.

Each of these is a UiAnimationMotion struct that references a .uiam asset. The motion can animate any combination of the eight UI track types (position, scale, rotation, alpha, color, text size) to create rich button feedback.


Inspector Properties

| Property | Type | Description |
|---|---|---|
| Hover Motion | UiAnimationMotion | The animation played when this button receives hover focus. |
| Unhover Motion | UiAnimationMotion | The animation played when this button loses hover focus. |
| Select Motion | UiAnimationMotion | The animation played when this button is pressed. |

Usage

Setup

  1. Add a LyShine UiButtonComponent (or other interactable) to your entity.
  2. Add GS_ButtonComponent to the same entity.
  3. Author .uiam assets for the hover, unhover, and select states using the UiAnimationMotion system.
  4. Assign the .uiam assets to the corresponding fields in the Inspector.

The button will now play the assigned animations automatically when the user interacts with it. No ScriptCanvas or C++ code is required for the animation behavior.

A GS_ButtonComponent configured with a UiAnimationMotion proxy for animation target remapping


See Also

For component references:

  • UI Animation – The motion asset system that drives button animations
  • Page Navigation – Pages use the same UiAnimationMotion system for show/hide transitions
  • UI Input – Input interception used alongside buttons in interactive canvases

For conceptual overviews and usage guides:


Get GS_UI

GS_UI — Explore this gem on the product page and add it to your project.

4.13.4.2 - UI Input

Input interception for UI canvases – captures and redirects input events to prevent gameplay propagation while a UI is focused.

GS_UIInputInterceptorComponent intercepts input events when a UI canvas is active, preventing them from propagating to gameplay systems. This solves the common problem of button presses or navigation inputs leaking through to the game world while a menu is open.

The component uses a GS_UIInputProfile to define which input events to intercept and how to route them. When the associated UI canvas is focused, the interceptor captures configured input events and broadcasts them on the UIInputNotificationBus instead of allowing them to reach gameplay input handlers.

For usage guides and setup examples, see The Basics: GS_UI.

UI Input Profile asset in the O3DE Asset Editor

 

Contents


How It Works

Input Interception

When a UI canvas receives focus through the UI Manager, the input interceptor activates. It listens for input events that match its configured GS_UIInputProfile and consumes them before they reach the gameplay input system. The intercepted events are then re-broadcast on the UIInputNotificationBus so that UI-specific logic can respond to them.

When the canvas loses focus, the interceptor deactivates and input flows normally to gameplay systems.


GS_UIInputProfile

The input profile defines which input channels to intercept and how to map them for UI use. This allows different UI canvases to intercept different sets of inputs – a pause menu might intercept all gameplay inputs, while a HUD overlay might only intercept specific menu navigation inputs.
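Conceptually, the interception step is a filter over incoming input channels: events named in the profile are consumed and re-broadcast to the UI layer, everything else falls through to gameplay. The sketch below shows that decision with plain strings; the profile format, channel names, and log vector are illustrative stand-ins, not the actual GS_UI types:

```cpp
#include <set>
#include <string>
#include <vector>

// Illustrative stand-in for GS_UIInputProfile: just a set of
// channel names to intercept.
struct SketchInputProfile
{
    std::set<std::string> interceptedChannels;
};

// Returns true when the event should be consumed by the UI layer.
// uiBusLog stands in for broadcasting on UIInputNotificationBus.
bool InterceptInput(const SketchInputProfile& profile,
                    bool canvasFocused,
                    const std::string& channel,
                    std::vector<std::string>& uiBusLog)
{
    if (!canvasFocused)
    {
        return false; // interceptor inactive: input flows to gameplay
    }
    if (profile.interceptedChannels.count(channel) == 0)
    {
        return false; // not in the profile: let gameplay handle it
    }
    uiBusLog.push_back(channel); // re-broadcast to the UI layer
    return true;
}
```

This is why a pause menu and a HUD overlay can behave differently: each canvas carries its own profile, so the same interceptor logic consumes a different set of channels per canvas.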


API Reference

Notification Bus: UIInputNotificationBus

Events broadcast when the interceptor captures input. Multiple handler bus – any number of components can subscribe to receive intercepted input notifications.

Components that need to respond to UI-specific input (such as custom navigation logic or input-driven UI animations) connect to this bus to receive intercepted events while a canvas is focused.


Inspector Properties

| Property | Type | Description |
|---|---|---|
| Input Profile | GS_UIInputProfile | Defines which input events to intercept and how to route them for UI handling. |

Usage

Setup

  1. Add GS_UIInputInterceptorComponent to the same entity as your root GS_UIPageComponent.
  2. Configure the GS_UIInputProfile to specify which input channels should be intercepted when this canvas is focused.
  3. Any component that needs to respond to intercepted input connects to UIInputNotificationBus.

See Also

For component references:

  • UI Manager – Controls which canvas is focused and therefore which interceptor is active
  • Page Navigation – The page system that drives canvas focus changes
  • Button – Button animations used alongside input interception

For conceptual overviews and usage guides:


Get GS_UI

GS_UI — Explore this gem on the product page and add it to your project.

4.13.5 - Widgets

Standalone UI widget components — load screens, pause menus, and other self-contained UI elements that operate outside the page navigation model.

Widgets are standalone UI components that handle specific UI scenarios outside the page navigation hierarchy. Unlike pages, which are navigated to and from, widgets activate and deactivate in response to game events — a load screen appears during a stage transition, a pause menu overlays gameplay when the player pauses.

For usage guides and setup examples, see The Basics: GS_UI.

 

Contents


Load Screen

GS_LoadScreenComponent manages loading screen display during stage transitions. It listens for stage load lifecycle events and shows or hides a designated LyShine canvas accordingly.

GS_LoadScreenComponent in the O3DE Inspector

| Component | Purpose |
|---|---|
| GS_LoadScreenComponent | Displays a loading canvas during stage transitions. Hides automatically when loading is complete. |

Pause Menu

PauseMenuComponent manages the pause overlay. It responds to pause input, activates the pause canvas, and suppresses gameplay systems while the menu is open.

| Component | Purpose |
|---|---|
| PauseMenuComponent | Toggles pause state and the pause menu canvas overlay. |

See Also

For component references:

  • UI Manager – Loads and manages the canvases that widgets display on
  • Page Navigation – The navigation system for menus within widget canvases

For conceptual overviews and usage guides:


Get GS_UI

GS_UI — Explore this gem on the product page and add it to your project.

4.13.6 - Templates

ClassWizard templates for GS_UI — custom UI animation motion tracks.

All GS_UI extension types are generated through the ClassWizard CLI. The wizard handles UUID generation and cmake file-list registration automatically.

For usage guides and setup examples, see The Basics: GS_UI.

python ClassWizard.py \
    --template <TemplateName> \
    --gem <GemPath> \
    --name <SymbolName> \
    [--input-var key=value ...]

 

Contents


Ui Motion Track

Template: UiMotionTrack

Creates a custom animation track for the GS_UI motion system. A UiMotionTrack child animates one LyShine UI element property (position, scale, alpha, color, rotation, etc.) over a normalised [0,1] eased progress value. Tracks are added to UiAnimationMotionAsset assets in the editor and played by UiAnimationMotionComponent.

Generated files:

  • Include/${GemName}/UIMotions/MotionTracks/${Name}Track.h
  • Source/UIMotions/MotionTracks/${Name}Track.cpp

CLI:

python ClassWizard.py --template UiMotionTrack --gem <GemPath> --name <Name>

Post-generation — manual registration required:

In UIDataAssetsSystemComponent.cpp, add:

#include <path/to/${Name}Track.h>
// inside Reflect(context):
${Name}Track::Reflect(context);

Extensibility: One track class per animatable property. Any number of track types can be registered. Instances are stored as polymorphic pointers in UiAnimationMotionAsset assets — the property editor’s type picker discovers them automatically via O3DE’s serialization reflection (EnumerateDerived).

See also: UI Animation — the full track architecture, domain extension pattern, and built-in track type reference.


See Also

For the full API, component properties, and C++ extension guide:

For all ClassWizard templates across GS_Play gems:


Get GS_UI

GS_UI — Explore this gem on the product page and add it to your project.

4.13.7 - 3rd Party Implementations

For usage guides and setup examples, see The Basics: GS_UI.


Get GS_UI

GS_UI — Explore this gem on the product page and add it to your project.

4.14 - GS_Unit

Character and entity control — unit lifecycle, player and AI controllers, input processing, movement, and grounding.

GS_Unit is the gem that gives entities agency. It provides a unit lifecycle system for spawning and possessing controllable entities, a controller layer that separates player and AI concerns, an input pipeline that converts raw device input into movement vectors, a family of mover components covering free 3D, surface-sliding, and physics-driven movement, and grounding components for stable surface contact. The gem depends only on GS_Core and introduces one data asset, GS_UnitMovementProfile, for authoring movement parameters outside of code.

For usage guides and setup examples, see The Basics: GS_Unit.

 

Contents


Unit Manager

The lifecycle system for spawnable, possessable units. The Unit Manager is a singleton that handles unit spawn requests and tracks the active player controller. Individual Unit components identify an entity as a game unit and manage possession state.

| Component | Purpose |
|---|---|
| GS_UnitManagerComponent | Singleton manager. Handles unit spawn requests and player controller registration. |
| GS_UnitComponent | Marks an entity as a unit. Manages possession, controller access, and standby state. |

Unit Manager API


Controllers

The controller layer separates how a unit is driven from what the unit is. A base controller handles possession handshake. The player controller adds input-driven behavior. The AI controller provides a hook for script or code-driven NPC logic.

| Component | Purpose |
|---|---|
| GS_UnitControllerComponent | Base controller. Manages unit possession and depossession. |
| GS_PlayerControllerComponent | Player-specific controller. Routes player input to the possessed unit. |
| GS_AIControllerComponent | AI controller. Provides the hook point for NPC logic. |

Unit Controllers API


Input Data

The input pipeline converts raw device events into a normalised input state that movement and interaction systems can read. Input Reactor components handle discrete button events. Axis Reactor components handle analogue axes. Concrete implementations cover keyboard WASD and joystick movement.

| Component | Purpose |
|---|---|
| GS_InputDataComponent | Holds the current normalised input state for a unit. |
| GS_InputReactorComponent | Base component for reacting to discrete input events. |
| GS_InputAxisReactorComponent | Base component for reacting to analogue axis input. |
| GS_PlayerControllerInputReaderComponent | Reads player device input and writes it into GS_InputDataComponent. |
| KeyboardMovement_InputReactorComponent | Converts WASD keyboard input into a movement vector. |
| JoyAxisMovement_AxisReactorComponent | Converts joystick axis input into a movement vector. |
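The conversion a reactor such as KeyboardMovement_InputReactorComponent performs can be sketched as building a raw vector from key states and normalising it so diagonals are not faster than cardinal movement. This is a generic sketch of the idea, not the component's actual code:

```cpp
#include <cmath>

struct Vec2 { float x = 0.0f, y = 0.0f; };

// Sketch: combine WASD key states into a movement vector, then
// normalise so that pressing two keys (a diagonal) still yields
// a vector of length 1 rather than sqrt(2).
Vec2 WasdToMovement(bool w, bool a, bool s, bool d)
{
    Vec2 v;
    v.y = (w ? 1.0f : 0.0f) - (s ? 1.0f : 0.0f); // forward/back
    v.x = (d ? 1.0f : 0.0f) - (a ? 1.0f : 0.0f); // right/left
    const float len = std::sqrt(v.x * v.x + v.y * v.y);
    if (len > 0.0f)
    {
        v.x /= len;
        v.y /= len;
    }
    return v;
}
```

The resulting normalised vector is the form that GS_InputDataComponent holds and movers consume.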

Input Data API


Movement

The mover stack translates input vectors and influences into entity motion. The Mover Context component tracks current movement state. Concrete movers cover three locomotion modes. Grounder components maintain stable surface contact and feed grounded state back to the mover. Influence components apply persistent or volume-scoped forces on top of any mover.

| Component | Purpose |
|---|---|
| GS_MoverComponent | Base movement component. All concrete movers extend this. |
| GS_MoverContextComponent | Tracks contextual movement state shared across the mover stack. |
| GS_3DFreeMoverComponent | Free 3D movement — suitable for flying or swimming characters. |
| GS_3DSlideMoverComponent | Surface-sliding 3D movement — suitable for ground characters. |
| GS_PhysicsMoverComponent | Physics-driven movement via rigid body integration. |
| GS_GrounderComponent | Base ground detection component. |
| GS_PhysicsRayGrounderComponent | Raycast-based ground detection. |
| GlobalMovementInfluenceComponent | Applies a global movement modifier (e.g., wind, current) to all units. |
| MovementInfluenceFieldComponent | Applies a movement modifier inside a spatial volume. |
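The way influences layer on top of a mover's base motion can be pictured as simple vector accumulation: the mover derives a base velocity from the input vector, then each active influence (global or field-scoped) is added before the entity moves. This sketch is conceptual and does not mirror the mover components' internals:

```cpp
#include <vector>

struct Vec3 { float x = 0.0f, y = 0.0f, z = 0.0f; };

// Sketch: input direction scaled by move speed gives the base
// velocity; influences (wind, currents) accumulate on top.
Vec3 ResolveVelocity(Vec3 inputDir, float moveSpeed,
                     const std::vector<Vec3>& influences)
{
    Vec3 v { inputDir.x * moveSpeed,
             inputDir.y * moveSpeed,
             inputDir.z * moveSpeed };
    for (const Vec3& inf : influences)
    {
        v.x += inf.x;
        v.y += inf.y;
        v.z += inf.z;
    }
    return v;
}
```

Because influences add on top of any mover, the same wind field affects a flying unit (free mover) and a ground unit (slide mover) without either mover knowing about it.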

Movement API


Installation

GS_Unit requires only GS_Core. It is one of the lightest gems in the framework to add.

  1. Enable GS_Unit in Project Manager or project.json.
  2. Add GS_UnitManagerComponent to the Game Manager entity and register it in the Startup Managers list.
  3. Build your unit prefab: add GS_UnitComponent, the appropriate Controller, GS_InputDataComponent, and a Mover for the locomotion type you need.
  4. Add GS_PlayerControllerInputReaderComponent and the relevant input reactor components to the player unit prefab.
  5. Refer to the Unit Set Up Guide for a full walkthrough.

See Also

For conceptual overviews and usage guides:

For related resources:


Get GS_Unit

GS_Unit — Explore this gem on the product page and add it to your project.

4.14.1 - Unit Manager

The central manager for unit lifecycle — spawning, registration, standby coordination, and unit tracking across the game session.

The Unit Manager is the GS_Unit gem’s central lifecycle controller. It extends GS_ManagerComponent and coordinates all unit-related operations: spawning new units from prefabs, registering player controllers, tracking which entities qualify as units, and propagating standby signals to the unit subsystem.

Like all GS_Play managers, the Unit Manager is spawned by the Game Manager during the startup sequence and participates in the two-stage initialization pattern. It connects to the GameManagerNotificationBus for lifecycle events and standby coordination.

For usage guides and setup examples, see The Basics: GS_Unit.

Unit Manager component in the O3DE Inspector

 

Contents


How It Works

Unit Spawning

When a system requests a new unit via RequestSpawnNewUnit, the Unit Manager instantiates the provided unit prefab under the specified parent entity. Once the unit entity activates and its GS_UnitComponent registers, the manager broadcasts ReturnNewUnit back to the caller with the new unit’s entity ID.

Player Controller Registration

Player controllers register themselves with the Unit Manager via RegisterPlayerController. This allows the manager to track which controllers are active and route unit possession accordingly.

Unit Validation

Any system can call CheckIsUnit with an entity ID to verify whether that entity is a registered unit. This is useful for targeting systems, interaction checks, and other gameplay logic that needs to distinguish units from ordinary entities.

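Internally, a check like CheckIsUnit amounts to membership in a registry that is populated as units register and cleared as they go away. A minimal sketch of that pattern, with `SketchUnitRegistry` and the integer `EntityId` as illustrative stand-ins:

```cpp
#include <set>

using EntityId = int; // stand-in for AZ::EntityId

// Sketch of the registration/lookup pattern behind CheckIsUnit.
class SketchUnitRegistry
{
public:
    void RegisterUnit(EntityId id)   { m_units.insert(id); }
    void UnregisterUnit(EntityId id) { m_units.erase(id); }

    // Constant-time-ish membership test used by targeting and
    // interaction systems to distinguish units from plain entities.
    bool CheckIsUnit(EntityId id) const
    {
        return m_units.count(id) != 0;
    }

private:
    std::set<EntityId> m_units;
};
```

Keeping the check in the manager means callers never need to inspect an entity's components to know whether it is a unit.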
Standby Propagation

When the Game Manager enters or exits standby, the Unit Manager receives the signal and broadcasts EnterStandby / ExitStandby on the UnitManagerNotificationBus. All unit subsystems (controllers, movers, input readers) should listen for these notifications and pause or resume accordingly.


API Reference

GS_UnitManagerComponent

Extends GS_Core::GS_ManagerComponent. Handles unit spawning, controller registration, and standby propagation.

Data Fields

| Field | Type | Description |
|---|---|---|
| debugExitPoint | AZ::EntityId | Entity used as the spawn/exit point when PlaceUnitAtExit is called. Configured in the Inspector. |
| m_unitSpawnTickets | AZStd::vector<AzFramework::EntitySpawnTicket> | Spawn tickets for all active unit spawn operations. Keeps spawns alive until the unit entity registers. |
| playerList | AZStd::vector<AZ::EntityId> | Registered player controller entity IDs. Populated via RegisterPlayerController. |

Request Bus: UnitManagerRequestBus

Commands sent to the Unit Manager. Global bus — multiple handlers.

| Method | Parameters | Returns | Description |
|---|---|---|---|
| RequestSpawnNewUnit | AZ::EntityId callingEntityId, SpawnableScriptAssetRef unitPrefab, AZ::EntityId spawnParentEntityId | void | Requests the manager to spawn a new unit from the given prefab under the specified parent entity. The caller receives the result via ReturnNewUnit. |
| RegisterPlayerController | AZ::EntityId controllerId | void | Registers a player controller entity with the manager for tracking and routing. |
| CheckIsUnit | AZ::EntityId unitId | bool | Returns whether the given entity ID is a registered unit. |

Notification Bus: UnitManagerNotificationBus

Events broadcast by the Unit Manager. Multiple handler bus — any number of components can subscribe.

| Event | Parameters | Description |
|---|---|---|
| HandleStartup | (none) | Fired during the manager startup sequence. Override in subclasses to hook into the two-stage initialization before OnStartupComplete. |
| ReturnNewUnit | AZ::EntityId caller, AZ::EntityId newUnit | Fired when a requested unit has been spawned and registered. The caller ID matches the original requester. |
| EnterStandby | (none) | Fired when the unit subsystem enters standby. Pause unit-related logic. |
| ExitStandby | (none) | Fired when the unit subsystem exits standby. Resume unit-related logic. |

GS_UnitComponent

The GS_UnitComponent is the component that makes an entity a “unit.” It handles possession by controllers, tracks its owning controller, and provides identity via a unique name. For full unit entity setup details, see the Units page.

Request Bus: UnitRequestBus

Commands sent to a specific unit. ById bus — addressed by entity ID, multiple handlers.

| Method | Parameters | Returns | Description |
|---|---|---|---|
| Possess | AZ::EntityId possessingController | void | Assigns a controller to this unit. The unit stores the controller reference and notifies listeners. |
| DePossess | (none) | void | Removes the current controller from this unit. |
| GetController | (none) | AZ::EntityId | Returns the entity ID of the currently possessing controller. |
| GetUniqueName | (none) | AZStd::string | Returns the unique name assigned to this unit. |

Notification Bus: UnitNotificationBus

Events broadcast by a unit. ById bus — addressed by the unit’s entity ID.

| Event | Parameters | Description |
|---|---|---|
| UnitPossessed | AZ::EntityId controller | Fired when a controller possesses this unit. |
| UnitEnteringStandby | (none) | Fired when this unit enters standby. |
| UnitExitingStandby | (none) | Fired when this unit exits standby. |

Usage Examples

C++ – Spawning a Unit

#include <GS_Unit/GS_UnitBus.h>

// Request a new unit spawn
GS_Unit::UnitManagerRequestBus::Broadcast(
    &GS_Unit::UnitManagerRequestBus::Events::RequestSpawnNewUnit,
    GetEntityId(),       // caller
    myUnitPrefab,        // SpawnableScriptAssetRef
    spawnParentEntity    // parent entity
);

C++ – Listening for Unit Spawn Results

#include <GS_Unit/GS_UnitBus.h>

class MySpawner
    : public AZ::Component
    , protected GS_Unit::UnitManagerNotificationBus::Handler
{
protected:
    void Activate() override
    {
        GS_Unit::UnitManagerNotificationBus::Handler::BusConnect();
    }

    void Deactivate() override
    {
        GS_Unit::UnitManagerNotificationBus::Handler::BusDisconnect();
    }

    void ReturnNewUnit(AZ::EntityId caller, AZ::EntityId newUnit) override
    {
        if (caller == GetEntityId())
        {
            // This is the unit we requested — possess it, configure it, etc.
        }
    }
};

Script Canvas

Spawning a unit and receiving the result:

Getting the controller on a unit:

Possessing and de-possessing a unit:

Reacting to unit standby events:


Extension Guide

Extend the Unit Manager to add custom spawn logic, additional tracking, or project-specific unit lifecycle behavior. The Unit Manager extends GS_ManagerComponent, so follow the standard Manager extension pattern.

Header (.h)

#pragma once
#include <GS_Unit/GS_UnitBus.h>
#include <Source/Managers/GS_UnitManagerComponent.h>

namespace MyProject
{
    class MyUnitManager : public GS_Unit::GS_UnitManagerComponent
    {
    public:
        AZ_COMPONENT_DECL(MyUnitManager);

        static void Reflect(AZ::ReflectContext* context);

    protected:
        // Override manager lifecycle hooks
        void OnStartupComplete() override;
        void OnEnterStandby() override;
        void OnExitStandby() override;
    };
}

Implementation (.cpp)

#include "MyUnitManager.h"
#include <AzCore/Serialization/SerializeContext.h>

namespace MyProject
{
    AZ_COMPONENT_IMPL(MyUnitManager, "MyUnitManager", "{YOUR-UUID-HERE}");

    void MyUnitManager::Reflect(AZ::ReflectContext* context)
    {
        if (auto serializeContext = azrtti_cast<AZ::SerializeContext*>(context))
        {
            serializeContext->Class<MyUnitManager, GS_Unit::GS_UnitManagerComponent>()
                ->Version(0);

            if (AZ::EditContext* editContext = serializeContext->GetEditContext())
            {
                editContext->Class<MyUnitManager>("My Unit Manager", "Custom unit manager")
                    ->ClassElement(AZ::Edit::ClassElements::EditorData, "")
                        ->Attribute(AZ::Edit::Attributes::Category, "MyProject")
                        ->Attribute(AZ::Edit::Attributes::AppearsInAddComponentMenu, AZ_CRC_CE("Game"));
            }
        }
    }

    void MyUnitManager::OnStartupComplete()
    {
        GS_UnitManagerComponent::OnStartupComplete();
        // Custom post-startup logic
    }

    void MyUnitManager::OnEnterStandby()
    {
        GS_UnitManagerComponent::OnEnterStandby();
        // Custom standby logic
    }

    void MyUnitManager::OnExitStandby()
    {
        GS_UnitManagerComponent::OnExitStandby();
        // Custom resume logic
    }
}

See Also

For component references:

  • Units – Unit entity setup and configuration
  • Unit Controllers – Controller components that possess units
  • Input Data – Input state and reaction components

For related resources:


Get GS_Unit

GS_Unit — Explore this gem on the product page and add it to your project.

4.14.2 - Unit Controllers

The controller-unit possession model — base controller, player controller, and AI controller components for driving unit behavior.

Unit Controllers are the decision-making layer in the GS_Unit architecture. A controller “possesses” a unit entity, giving it authority over that unit’s actions. This possession model cleanly separates the what (the unit with its movement, abilities, and stats) from the who (the controller providing intent — player input or AI logic).

The system provides three controller components:

  • GS_UnitControllerComponent – The base controller with possession logic, placement helpers, and the virtual interface for extension.
  • GS_PlayerControllerComponent – A player-specific controller that integrates with the Input Data subsystem to translate player input into unit commands.
  • GS_AIControllerComponent – An AI controller for NPC units, providing hooks for behavior trees or custom AI logic.

For usage guides and setup examples, see The Basics: GS_Unit.

Player Controller and Input Reader components in the O3DE Inspector

 



How It Works

Possession Model

Unit Possession Pattern Graph

Breakdown

Every unit has exactly one controller at a time, or no controller at all. Possession is established by calling Possess on the unit and released by calling DePossess. The unit fires UnitPossessed on UnitNotificationBus whenever ownership changes so other systems can react.

  • Possession – A controller attaches to a unit. The unit accepts input and movement commands from that controller only.
  • DePossession – The controller releases the unit. The unit halts input processing and enters a neutral state.
  • UnitPossessed event – Fires on UnitNotificationBus (addressed by entity ID) whenever a unit’s controller changes.
  • GetController – Returns the entity ID of the current controller, or an invalid ID if none.
  • GetUniqueName – Returns the string name assigned by the Unit Manager when this unit was spawned.

Unit Placement

Controllers provide helper methods for positioning their possessed unit:

  • PlaceUnitAtExit – Places the unit at the level’s designated exit point.
  • PlaceUnitAtEntity – Places the unit at a specific entity’s world position.

Unique Names

Both controllers and units maintain a unique name string. The base controller provides a virtual SetUniqueName() method that subclasses override to generate names from their specific context (e.g., player slot number, AI template name).


Component Reference

GS_UnitControllerComponent (Base)

The base controller class. All controller types inherit from this. It handles possession, placement, unique name generation, and the notification plumbing. Extend this class to create custom controller types.


GS_PlayerControllerComponent

Extends GS_UnitControllerComponent. The player controller adds input handling on top of the base controller pattern. It registers itself with the Unit Manager via RegisterPlayerController and integrates with the Input Data subsystem to feed player input into the possessed unit’s movement and action systems.

Typical entity setup for a player controller:

  • GS_PlayerControllerComponent
  • GS_InputDataComponent – Holds the current input state
  • GS_PlayerControllerInputReaderComponent – Reads raw input into InputData
  • One or more InputReactorComponents – Translate input data into movement vectors or action triggers

GS_AIControllerComponent

Extends GS_UnitControllerComponent. The AI controller provides the same possession interface as the player controller but is driven by AI logic rather than player input. It does not connect to the input subsystem. Instead, it exposes hooks for behavior trees, scripted sequences, or custom AI decision-making to issue commands to the possessed unit.


API Reference

Request Bus: UnitControllerRequestBus

Commands sent to a specific controller. ById bus — addressed by the controller’s entity ID.

  • PossessUnit(AZ::EntityId targetUnit) → void – Possesses the target unit. Broadcasts the PossessedTargetUnit notification.
  • DePossessUnit() → void – Releases the currently possessed unit.
  • PlaceUnitAtExit() → void – Moves the possessed unit to the level’s exit point.
  • PlaceUnitAtEntity(AZ::EntityId targetEntity) → void – Moves the possessed unit to the world position of the target entity.
  • GetUnit() → AZ::EntityId – Returns the entity ID of the currently possessed unit.
  • GetUniqueName() → AZStd::string – Returns the unique name of this controller.

Notification Bus: UnitControllerNotificationBus

Events broadcast by a controller. ById bus — addressed by the controller’s entity ID. Subscribe to receive updates about controller state changes.

  • PossessedTargetUnit(AZ::EntityId unitId) – Fired when this controller possesses a unit. GS_PlayerControllerInputReaderComponent listens to this to begin routing input to the new unit.

Virtual Methods

Override these when extending the base controller. Always call the base implementation.

  • PostActivateProcessing() → void – Called after the controller’s Activate() completes. Override to add custom initialization that depends on the controller being fully active.
  • SetUniqueName() → void – Called during initialization to set the controller’s unique name. Override to generate names from your project’s naming scheme.

Usage Examples

C++ – Possessing a Unit

#include <GS_Unit/GS_UnitBus.h>

// From a controller, possess a unit
GS_Unit::UnitControllerRequestBus::Event(
    GetEntityId(),
    &GS_Unit::UnitControllerRequestBus::Events::PossessUnit,
    targetUnitEntityId
);

C++ – Querying a Controller’s Unit

#include <GS_Unit/GS_UnitBus.h>

AZ::EntityId possessedUnit;
GS_Unit::UnitControllerRequestBus::EventResult(
    possessedUnit,
    controllerEntityId,
    &GS_Unit::UnitControllerRequestBus::Events::GetUnit
);

Script Canvas

Possessing and de-possessing a unit from a controller:

Getting the unit possessed by a controller:


Extension Guide

Use the UnitController ClassWizard template to generate a new controller with boilerplate already in place — see GS_Unit Templates.

Create custom controllers by extending GS_UnitControllerComponent. This is the standard approach for game-specific controller logic such as party management, vehicle control, or specialized AI behaviors.

Header (.h)

#pragma once
#include <GS_Unit/GS_UnitBus.h>
#include <Source/Controllers/GS_UnitControllerComponent.h>

namespace MyProject
{
    class MyCustomController : public GS_Unit::GS_UnitControllerComponent
    {
    public:
        AZ_COMPONENT_DECL(MyCustomController);

        static void Reflect(AZ::ReflectContext* context);

    protected:
        void PostActivateProcessing() override;
        void SetUniqueName() override;
    };
}

Implementation (.cpp)

#include "MyCustomController.h"
#include <AzCore/Serialization/SerializeContext.h>

namespace MyProject
{
    AZ_COMPONENT_IMPL(MyCustomController, "MyCustomController", "{YOUR-UUID-HERE}");

    void MyCustomController::Reflect(AZ::ReflectContext* context)
    {
        if (auto serializeContext = azrtti_cast<AZ::SerializeContext*>(context))
        {
            serializeContext->Class<MyCustomController, GS_Unit::GS_UnitControllerComponent>()
                ->Version(0);

            if (AZ::EditContext* editContext = serializeContext->GetEditContext())
            {
                editContext->Class<MyCustomController>(
                    "My Custom Controller", "Project-specific controller logic")
                    ->ClassElement(AZ::Edit::ClassElements::EditorData, "")
                        ->Attribute(AZ::Edit::Attributes::Category, "MyProject")
                        ->Attribute(AZ::Edit::Attributes::AppearsInAddComponentMenu, AZ_CRC_CE("Game"));
            }
        }
    }

    void MyCustomController::PostActivateProcessing()
    {
        GS_UnitControllerComponent::PostActivateProcessing();
        // Custom post-activation logic — connect to AI systems, load profiles, etc.
    }

    void MyCustomController::SetUniqueName()
    {
        // Generate a unique name from your project's naming scheme
        uniqueName = AZStd::string::format("CustomController_%d", m_controllerIndex);
    }
}

See Also

For component references:

  • Unit Manager – Spawning and lifecycle management
  • Units – Unit entity setup and the GS_UnitComponent
  • Input Data – Input state and reaction components used by player controllers
  • Movement – Movement components driven by controller input

Get GS_Unit

GS_Unit — Explore this gem on the product page and add it to your project.

4.14.3 - Input Data

The decoupled input pipeline — reading hardware events on the controller, storing state on the unit, and reacting to state changes through reactor components.

The Input Data subsystem bridges the controller entity (where hardware input is read) and the unit entity (where input is acted on). The controller reads raw device events and routes named event values into the unit’s GS_InputDataComponent. Reactor components on the unit then convert that stored state into movement vectors or game actions.

This separation keeps units ignorant of input bindings — any controller can possess any unit without the unit needing to change.

For usage guides and setup examples, see The Basics: GS_Unit.

 



How It Works

Unit Input Handling Pattern Graph

Breakdown

The input pipeline has three stages. Each stage is a separate component on the unit entity, and they run in order every frame:

  • Stage 1 — Read: GS_PlayerControllerInputReaderComponent – Reads raw input events from the active input profile and writes them into GS_InputDataComponent.
  • Stage 2 — Store: GS_InputDataComponent – Holds the current frame’s input state — button presses, axis values — as structured data.
  • Stage 3 — React: Reactor components – Read from GS_InputDataComponent and produce intent: movement vectors, action triggers, etc.

All reactor components downstream of the store stage read from the same GS_InputDataComponent, so there is no duplicated hardware polling and no risk of two reactors seeing different input states for the same frame.
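As a rough mental model, the store stage behaves like a name-to-value map with change notification. The following sketch is illustrative only (InputStore and onChanged are hypothetical stand-ins, not the GS_Unit API), but it mirrors the update, read, and clear semantics of the store stage:

```cpp
#include <cassert>
#include <functional>
#include <string>
#include <unordered_map>

// Hypothetical stand-in for the GS_InputDataComponent name -> value store.
struct InputStore
{
    std::unordered_map<std::string, float> states;

    // Analogue of the InputStateChanged notification sent to reactors.
    std::function<void(const std::string&, float)> onChanged;

    void UpdateEventState(const std::string& name, float value)
    {
        states[name] = value;
        if (onChanged) { onChanged(name, value); } // notify downstream reactors
    }

    float GetEventState(const std::string& name) const
    {
        auto it = states.find(name);
        return it != states.end() ? it->second : 0.0f; // unknown events read as 0
    }

    void ClearInputState()
    {
        for (auto& entry : states) { entry.second = 0.0f; }
    }
};
```

Because every reactor reads the same store, a single UpdateEventState call is observed identically by all of them within the frame.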


Component Reference

GS_InputDataComponent

Stores the current input state for a unit as a name-value map. Receives input from the controller via InputDataRequestBus. Broadcasts state changes to reactor components via InputDataNotificationBus.

Data: AZStd::unordered_map<AZStd::string, float> inputEventStates

Also listens to UnitNotificationBus::UnitPossessed to store the owning controller reference.


GS_InputReactorComponent (Base)

Listens to InputDataNotificationBus. Tracks a set of event names (inputReactEvents). Fires the correct virtual based on zero-crossing:

  • HandlePressed(stateName) – Value changed 0 → non-zero
  • HandleHeld(stateName) – Value remains non-zero
  • HandleReleased(stateName) – Value changed non-zero → 0

GS_InputAxisReactorComponent (Base)

Variant for analogue axis input. Fires HandleAxisChanged(stateName, value) whenever the value changes, without pressed/held/released semantics.


KeyboardMovement_InputReactorComponent

Converts four discrete key events into a 2D movement axis. Maps moveUp/moveDown/moveLeft/moveRight event names to +1/-1 contributions on X and Y, then calls MoverContextRequestBus::SetMoveInputAxis("x"/"y", value).

  • moveUpEvent – Event name for the forward/up key
  • moveDownEvent – Event name for the back/down key
  • moveLeftEvent – Event name for the left key
  • moveRightEvent – Event name for the right key

JoyAxisMovement_AxisReactorComponent

Directly maps two joystick axis event names to the mover context X and Y axes via SetMoveInputAxis.

  • xAxisName – Input event name for the horizontal axis
  • yAxisName – Input event name for the vertical axis

GS_PlayerControllerInputReaderComponent

Extends GS_Core::GS_InputReaderComponent. Sits on the controller entity. Routes input to the possessed unit on HandleFireInput. Updates its possessedUnit reference when PossessedTargetUnit fires. See Player Input Reader for full details.


API Reference

Request Bus: InputDataRequestBus

Commands sent to a specific unit’s input data component. ById bus — addressed by unit entity ID.

  • UpdateEventState(AZStd::string stateName, float value) → void – Writes the value for the named event and broadcasts InputStateChanged.
  • ClearInputState() → void – Zeros all stored values and broadcasts ClearInput.
  • GetEventState(AZStd::string stateName) → float – Returns the current float value for the named event.

Notification Bus: InputDataNotificationBus

Events broadcast to reactor components on the unit. ById bus — addressed by unit entity ID.

  • InputStateChanged(AZStd::string stateName, float value) – Fired when any input event value changes. Reactor components compare against their tracked events and respond.
  • ClearInput() – Fired when all input states are cleared.

See Also

For detailed component pages:

For related components:


Get GS_Unit

GS_Unit — Explore this gem on the product page and add it to your project.

4.14.3.1 - Input Reactor

Base and concrete reactor components — convert unit input state into pressed/held/released events or axis changes for movement and actions.

Input reactor components sit on the unit entity and listen to InputDataNotificationBus. They are the unit-side response layer that turns stored input state into game actions — movement vectors, ability triggers, UI navigation, etc.

For usage guides and setup examples, see The Basics: GS_Unit.

Input Reactor component in the O3DE Inspector

 



How It Works

Each reactor declares which event names it cares about (inputReactEvents). When InputDataNotificationBus::InputStateChanged fires, the reactor compares the incoming stateName against its list. If it matches, it compares the new value against the previously stored value and calls the correct virtual method:

  • 0 → non-zero – HandlePressed
  • non-zero → non-zero – HandleHeld
  • non-zero → 0 – HandleReleased
  • any change – HandleAxisChanged (axis variant only)

Reactors store lastInputValues per event name to track zero-crossings between frames.
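The zero-crossing logic can be sketched in isolation. ZeroCrossingTracker and InputEdge below are hypothetical names used for illustration; the real reactor dispatches to its virtual methods rather than returning an enum:

```cpp
#include <cassert>
#include <string>
#include <unordered_map>

// Which virtual the reactor would invoke for a given value change.
enum class InputEdge { Pressed, Held, Released, None };

// Hypothetical reactor core: compares the incoming value against the last
// stored value per event name, mirroring the rules listed above.
struct ZeroCrossingTracker
{
    std::unordered_map<std::string, float> lastInputValues;

    InputEdge Classify(const std::string& stateName, float newValue)
    {
        const float last = lastInputValues[stateName]; // defaults to 0
        lastInputValues[stateName] = newValue;

        if (last == 0.0f && newValue != 0.0f) { return InputEdge::Pressed; }
        if (last != 0.0f && newValue != 0.0f) { return InputEdge::Held; }
        if (last != 0.0f && newValue == 0.0f) { return InputEdge::Released; }
        return InputEdge::None; // 0 -> 0: nothing to dispatch
    }
};
```

A ClearInput broadcast corresponds to classifying every tracked event against a new value of 0, which yields Released for anything currently held.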


Base Classes

GS_InputReactorComponent

Base class for discrete button/key input. Extend this to react to named input events with press, hold, and release semantics.

Virtual methods to override:

  • HandlePressed(AZStd::string stateName) – Called when the event value crosses from 0 to non-zero.
  • HandleHeld(AZStd::string stateName) – Called each frame while the event value remains non-zero.
  • HandleReleased(AZStd::string stateName) – Called when the event value crosses from non-zero to 0.

Fields:

  • inputReactEvents – List of event name strings this reactor listens for
  • lastInputValues – Map of {eventName → float} tracking previous values for zero-crossing detection

GS_InputAxisReactorComponent

Base class for analogue axis input. Extend this for joystick, trigger, or any continuous value that should not use pressed/held/released semantics.

Virtual methods to override:

  • HandleAxisChanged(AZStd::string stateName, float value) – Called whenever the axis value changes.

Concrete Reactors

KeyboardMovement_InputReactorComponent

Converts four discrete directional key events into a 2D movement axis. Maps moveUp/moveDown/moveLeft/moveRight to +1/-1 contributions on the X and Y axes. On any key press or release, recomputes the combined axis and calls MoverContextRequestBus::SetMoveInputAxis("x"/"y", value) on the unit.

  • moveUpEvent – Event name for the forward/up key
  • moveDownEvent – Event name for the back/down key
  • moveLeftEvent – Event name for the left key
  • moveRightEvent – Event name for the right key

Internal accumulation fields:

  • xAxisPositive / xAxisNegative – Accumulated +X and -X contributions
  • yAxisPositive / yAxisNegative – Accumulated +Y and -Y contributions
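The accumulation can be sketched as follows. KeyboardAxisState, OnKey, AxisX, and AxisY are hypothetical illustration names; the real component forwards the combined values through MoverContextRequestBus::SetMoveInputAxis rather than exposing getters:

```cpp
#include <cassert>
#include <string>

// Hypothetical sketch of the four-key -> 2D axis accumulation. The field
// names mirror the reference above (xAxisPositive, etc.).
struct KeyboardAxisState
{
    float xAxisPositive = 0.0f, xAxisNegative = 0.0f;
    float yAxisPositive = 0.0f, yAxisNegative = 0.0f;

    // A press contributes 1 to its direction; a release contributes 0.
    void OnKey(const std::string& eventName, bool pressed)
    {
        const float v = pressed ? 1.0f : 0.0f;
        if (eventName == "moveUp")    { yAxisPositive = v; }
        if (eventName == "moveDown")  { yAxisNegative = v; }
        if (eventName == "moveRight") { xAxisPositive = v; }
        if (eventName == "moveLeft")  { xAxisNegative = v; }
    }

    // Combined values that would be sent via SetMoveInputAxis("x"/"y", ...).
    float AxisX() const { return xAxisPositive - xAxisNegative; }
    float AxisY() const { return yAxisPositive - yAxisNegative; }
};
```

Note that opposing keys cancel to zero rather than the last key winning; that matches simple additive accumulation, and is one of several reasonable design choices here.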

JoyAxisMovement_AxisReactorComponent

Directly maps two joystick axis event names to the Mover Context X and Y movement axes. Calls MoverContextRequestBus::SetMoveInputAxis("x"/"y", value) directly from HandleAxisChanged.

  • xAxisName – Input event name for the horizontal axis
  • yAxisName – Input event name for the vertical axis

API Reference

Reactors themselves do not expose a request bus. They consume InputDataNotificationBus and produce calls to MoverContextRequestBus or other action buses.

InputDataNotificationBus (consumed)

  • InputStateChanged(stateName, value) – Triggers zero-crossing comparison and calls HandlePressed/HandleHeld/HandleReleased/HandleAxisChanged.
  • ClearInput() – Resets all lastInputValues to zero, triggering releases for any held inputs.

MoverContextRequestBus (produced by movement reactors)

  • SetMoveInputAxis("x", value) – Sets the horizontal axis input (−1 to 1) on the unit’s MoverContext.
  • SetMoveInputAxis("y", value) – Sets the vertical axis input (−1 to 1) on the unit’s MoverContext.

Extension Guide

Use the InputReactor ClassWizard template to generate a new input reactor with boilerplate already in place — see GS_Unit Templates.

Create custom reactor components by extending GS_InputReactorComponent or GS_InputAxisReactorComponent.

#pragma once
#include <GS_Unit/InputData/GS_InputReactorComponent.h>

namespace MyProject
{
    class AbilityInputReactor : public GS_Unit::GS_InputReactorComponent
    {
    public:
        AZ_COMPONENT_DECL(AbilityInputReactor);
        static void Reflect(AZ::ReflectContext* context);

    protected:
        void HandlePressed(const AZStd::string& stateName) override;
        void HandleReleased(const AZStd::string& stateName) override;
    };
}

In Reflect(), add the event names you want to listen to:

// In your component's activation or reflection, populate inputReactEvents:
inputReactEvents.push_back("ability_primary");
inputReactEvents.push_back("ability_secondary");

See Also


Get GS_Unit

GS_Unit — Explore this gem on the product page and add it to your project.

4.14.3.2 - Player Input Reader

Controller-side component that reads hardware input events and routes them to the possessed unit’s input data component.

GS_PlayerControllerInputReaderComponent is the bridge between the O3DE input system and the GS_Unit input pipeline. It sits on the controller entity alongside GS_PlayerControllerComponent and is responsible for reading raw hardware events and routing them to whichever unit is currently possessed.

For usage guides and setup examples, see The Basics: GS_Unit.

Player Input Reader component in the O3DE Inspector

 



How It Works

Inheritance Chain

GS_PlayerControllerInputReaderComponent extends GS_Core::GS_InputReaderComponent, which listens to AzFramework::InputChannelNotificationBus. When a hardware input fires, the base class matches the event against the configured GS_InputProfile and calls HandleFireInput(eventName, value).

Routing Input

HandleFireInput(eventName, value) checks whether a unit is currently possessed. If so, it routes the event to the unit:

GS_Unit::InputDataRequestBus::Event(
    possessedUnit,
    &GS_Unit::InputDataRequestBus::Events::UpdateEventState,
    eventName, value
);

If no unit is possessed, the input is silently dropped.
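The routing rule can be sketched in isolation. InputRouter and its members are hypothetical illustration names; the real component forwards events over InputDataRequestBus and receives possession changes via the notification bus rather than through callbacks:

```cpp
#include <cassert>
#include <functional>
#include <string>

// Stand-ins for AZ::EntityId and its invalid value.
using EntityId = unsigned long long;
constexpr EntityId InvalidEntity = 0;

// Hypothetical sketch: forward matched events to the possessed unit,
// silently drop them when nothing is possessed.
struct InputRouter
{
    EntityId possessedUnit = InvalidEntity;

    // Stand-in for InputDataRequestBus::UpdateEventState on the unit.
    std::function<void(EntityId, const std::string&, float)> updateEventState;

    // Mirrors HandleFireInput: route only when a unit is possessed.
    bool HandleFireInput(const std::string& eventName, float value)
    {
        if (possessedUnit == InvalidEntity) { return false; } // dropped
        updateEventState(possessedUnit, eventName, value);
        return true;
    }

    // Mirrors the PossessedTargetUnit callback: retarget all future input.
    void OnPossessedTargetUnit(EntityId unitId) { possessedUnit = unitId; }
};
```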

Tracking the Possessed Unit

The component listens to UnitControllerNotificationBus::PossessedTargetUnit on its own controller entity. When the controller possesses a new unit, this callback fires and the component updates its local possessedUnit reference. All future input calls then route to the new unit automatically.

Script Canvas - Enabling and Disabling Input Groups

Input Groups handling nodes in the O3DE Script Canvas


Setup

Add GS_PlayerControllerInputReaderComponent to the controller entity alongside:

  1. GS_PlayerControllerComponent — the controller that manages possession
  2. A GS_InputProfile asset assigned to the input reader’s inspector slot — defines which input events to listen for and their names

The input reader does not need to be configured per-unit. Changing possession automatically redirects all future input.


API Reference

GS_PlayerControllerInputReaderComponent does not expose a public request bus. It consumes two buses and produces one:

Consumed

  • AzFramework::InputChannelNotificationBus (OnInputChannelEvent) – Receives raw hardware input from O3DE. Matched against the GS_InputProfile.
  • UnitControllerNotificationBus (PossessedTargetUnit(unitId)) – Updates the local possessedUnit reference when the controller possesses a new unit.

Produced

  • InputDataRequestBus (UpdateEventState(eventName, value)) – Routes matched input events to the possessed unit’s GS_InputDataComponent.

Virtual Methods

  • HandleFireInput(AZStd::string eventName, float value) – Called by the base class when a matched input event fires. Routes to the possessed unit if valid.

See Also


Get GS_Unit

GS_Unit — Explore this gem on the product page and add it to your project.

4.14.4 - Units

What makes an entity a unit — the GS_UnitComponent, entity configuration, collision setup, and links to movement subsystems.

A “unit” in GS_Play is any entity that can be possessed by a controller and driven through gameplay. The GS_UnitComponent is the marker that transforms an ordinary entity into a unit. It provides the possession interface, unique naming, standby awareness, and registration with the Unit Manager.

Units are the characters, vehicles, creatures, or any other controllable actors in your game. They do not contain decision-making logic themselves — that comes from the controller that possesses them. A unit provides the body: movement, collision, visuals, and stats. The controller provides the brain: player input or AI logic.

For usage guides and setup examples, see The Basics: GS_Unit.

Unit component in the O3DE Inspector

 



How It Works

Registration

When a GS_UnitComponent activates, it registers itself with the Unit Manager. This allows the manager to track all active units and respond to CheckIsUnit queries. When the component deactivates, it unregisters.

Possession

Units are possessed by controllers through the UnitRequestBus:

  1. A controller calls Possess(controllerEntityId) on the unit.
  2. The unit stores the controller reference and broadcasts UnitPossessed on UnitNotificationBus.
  3. The controller can now issue commands to the unit’s subsystems (movement, actions, etc.).
  4. Calling DePossess() clears the controller reference.

Standby

Units receive standby signals from the Unit Manager. When entering standby, the unit broadcasts UnitEnteringStandby on its notification bus. Child components (movers, input reactors, etc.) listen for this and pause their processing. UnitExitingStandby reverses the process.


GS_UnitComponent Reference

Request Bus: UnitRequestBus

Commands sent to a specific unit. ById bus — addressed by the unit’s entity ID, multiple handlers.

  • Possess(AZ::EntityId possessingController) → void – Assigns a controller to this unit.
  • DePossess() → void – Removes the current controller from this unit.
  • GetController() → AZ::EntityId – Returns the entity ID of the currently possessing controller.
  • GetUniqueName() → AZStd::string – Returns the unique name assigned to this unit.

Notification Bus: UnitNotificationBus

Events broadcast by a unit. ById bus — addressed by the unit’s entity ID.

  • UnitPossessed(AZ::EntityId controller) – Fired when a controller possesses this unit.
  • UnitEnteringStandby() – Fired when this unit enters standby.
  • UnitExitingStandby() – Fired when this unit exits standby.

Virtual Methods

  • SetUniqueName() → void – Called during initialization to generate the unit’s unique name. Override in subclasses to use project-specific naming.

Setup

Unit Entity Configuration

A minimal unit entity requires:

  1. GS_UnitComponent – Registers the entity as a unit and provides the possession interface.
  2. Movement components – At least a mover and optionally a grounder for ground detection.
  3. PhysX collider – For physics interaction and ground detection.

A fully featured unit entity typically includes:

  • GS_UnitComponent
  • GS_MoverContextComponent – Aggregates movement input from movers
  • A mover component (e.g., GS_3DFreeMoverComponent, GS_3DSlideMoverComponent, or GS_PhysicsMoverComponent)
  • A grounder component (e.g., GS_PhysicsRayGrounderComponent) for surface detection
  • PhysX Rigid Body and Collider components
  • Mesh or Actor component for visuals

Unit Collider Configuration

Collision layers used for a unit collider, as seen in the Entity Inspector.

Units require properly configured PhysX collision layers to interact with the environment and other units. If you have not set up your PhysX Collision Layers or Groups yet, refer to the Setting Up Your Project Environment guide.

Typical collision layer assignments:

  • Unit layer – The unit’s own collider. Collides with environment and other units.
  • Ground detection layer – Used by grounder raycasts. Collides with terrain and walkable surfaces only.

See Also

For component references:


Get GS_Unit

GS_Unit — Explore this gem on the product page and add it to your project.

4.14.4.1 - Movement

The complete movement subsystem — movers, grounders, movement influence, and the movement profile asset for configuring unit locomotion.

The Movement subsystem handles all unit locomotion. It is mode-driven: a single named mode is active at a time, and only the mover and grounder whose mode name matches will run each tick. This makes locomotion fully composable without any mover combining logic.

The subsystem has four layers:

  • Mover Context — Central hub. Transforms raw input into camera-relative and ground-projected vectors, owns mode switching, context states, and profile management.
  • Movers — Translate MoverContext input into physics forces on the rigid body. One mover is active at a time per mode.
  • Grounders — Detect ground contact, slope, and surface normal. Write results to the MoverContext and trigger mode switches.
  • Movement Influence — Spatial zones and global fallbacks that supply the active GS_UnitMovementProfile.

For usage guides and setup examples, see The Basics: GS_Unit.

 



Architecture

Movers & Grounders Pattern Graph

Breakdown

A unit carries multiple Mover and Grounder components, and their activation changes constantly based on the unit’s state. When a mover is active, it freely drives the unit’s movement according to its own behaviour. At any moment, through internal or external forces, the Movement, Rotation, or Grounding state can change, which deactivates the old movers and activates the new one to continue controlling the unit.

Input
    ↓  InputDataNotificationBus::InputStateChanged
Input Reactor Components
    ↓  MoverContextRequestBus::SetMoveInputAxis
GS_MoverContextComponent
    ↓  ModifyInputAxis() → camera-relative
    ↓  GroundInputAxis() → ground-projected
    ↓  MoverContextNotificationBus::MovementModeChanged("Free")
GS_3DFreeMoverComponent  [active when mode == "Free"]
    ↓  AccelerationSpringDamper → rigidBody->SetLinearVelocity
GS_PhysicsRayGrounderComponent  [active when grounding mode == "Free"]
    ↓  MoverContextRequestBus::SetGroundNormal / SetContextState("grounding", ...)
    ↓  MoverContextRequestBus::ChangeMovementMode("Slide")  ← when slope too steep
GS_3DSlideMoverComponent  [active when mode == "Slide"]

The Slide mover activates when the unit is walking on too steep an angle. It takes control of the unit, slides it down the hill, then restores the previous movement behaviour.


Movement Profile

GS_UnitMovementProfile is an asset that holds locomotion parameters. Movers read from the active profile via activeProfile pointer, which the MoverContext updates whenever the profile changes.

  • LocomotionStyle (enum) – Movement style archetype (affects animation matching).
  • moveSpeed (float) – Target movement speed (m/s).
  • speedChangeSmoothing (float) – Lerp factor for speed transitions between profiles.
  • canSprint (bool) – Whether this profile allows sprinting.
  • canJump (bool) – Whether this profile allows jumping.
  • canRoll (bool) – Whether this profile allows rolling.

Create movement profiles as data assets in the Asset Editor. Assign a default profile to the GS_MoverContextComponent in the Inspector. Spatial influence zones can override the active profile at runtime.

Reading Movement Profile Data - ScriptCanvas


Movement Influence

Movement Influence component in the O3DE Inspector

The MoverContext selects the active profile by priority:

  1. Influence listMovementInfluenceFieldComponent adds its profile when the unit enters its trigger volume. Multiple fields stack by priority.
  2. Global fallback — if the influence list is empty and allowGlobalInfluence is true, falls back to GlobalMovementRequestBus::GetGlobalMovementProfile.
  3. Default profile — the defaultMoveProfile assigned directly on the GS_MoverContextComponent.
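The priority order above can be sketched as a simple selection function. MoveProfile, InfluenceEntry, and SelectActiveProfile are hypothetical illustration names standing in for the real types:

```cpp
#include <algorithm>
#include <cassert>
#include <vector>

// Stand-in for GS_UnitMovementProfile.
struct MoveProfile { float moveSpeed = 0.0f; };

// One entry per influence field the unit is currently inside.
struct InfluenceEntry { int priority; const MoveProfile* profile; };

// Hypothetical sketch of the three-tier selection described above.
const MoveProfile* SelectActiveProfile(
    const std::vector<InfluenceEntry>& influences, // tier 1: influence fields
    bool allowGlobalInfluence,
    const MoveProfile* globalProfile,              // tier 2: global fallback
    const MoveProfile* defaultProfile)             // tier 3: default
{
    if (!influences.empty())
    {
        // Highest priority wins when multiple fields overlap.
        auto best = std::max_element(influences.begin(), influences.end(),
            [](const InfluenceEntry& a, const InfluenceEntry& b)
            { return a.priority < b.priority; });
        return best->profile;
    }
    if (allowGlobalInfluence && globalProfile) { return globalProfile; }
    return defaultProfile;
}
```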

MovementInfluenceFieldComponent

Inherits PhysicsTriggerComponent. On TriggerEnter, calls MoverContextRequestBus::AddMovementProfile(entityId, profile*) on the entering unit. On TriggerExit, calls RemoveMovementProfile(entityId).

Request Bus: MovementInfluenceRequestBus

ById bus — addressed by the influence field’s entity ID.

  • GetPriority() → int – Returns this field’s priority. Higher values take precedence when multiple fields overlap.

GlobalMovementRequestBus

Broadcast bus — single global instance.

  • GetGlobalMovementProfile() → GS_UnitMovementProfile* – Returns the global fallback movement profile. Used by the MoverContext when no influence fields are active.

Sub-Sections

  • Mover Context — Input transformation, mode switching, context states, profile management
  • Movers — Mover base class and concrete movers (GS_3DFreeMoverComponent, GS_3DSlideMoverComponent)
  • Grounders — Grounder base class and concrete grounders (GS_PhysicsRayGrounderComponent)

See Also

  • Units — Unit entity setup and GS_UnitComponent
  • Unit Controllers — Controllers that manage possession and drive input routing
  • Input Data — The input pipeline that feeds movement reactors
  • Stage Data — Handler for current stage functionality and global effects
  • Springs Utility — Spring-damper functions used by movers and grounders
  • Physics Trigger Volume — Physics overlap that triggers functionality.

Get GS_Unit

GS_Unit — Explore this gem on the product page and add it to your project.

4.14.4.1.1 - Mover Context

The central movement hub — transforms raw input into camera-relative and ground-projected vectors, manages mode switching, context states, and movement profiles.

GS_MoverContextComponent is the central state store for all movement on a unit. It sits between the input layer and the mover/grounder layer, holds all shared movement data, owns the mode-switching logic, and manages the active movement profile.

Movers and Grounders never communicate directly — they all read from and write to the MoverContext.

For usage guides and setup examples, see The Basics: GS_Unit.

Mover Context component in the O3DE Inspector

 

Contents


Input Transformation Pipeline

Raw input from reactor components arrives as a Vector2 (rawMoveInput). The MoverContext applies two sequential transformations each time input changes.

Step 1 — ModifyInputAxis() — Camera-Relative

For a player unit (detected by “Player” tag): fetches the active camera’s world transform, flattens its forward and right vectors to the XY plane, then computes:

modifiedMoveInput = camForward * rawInput.Y + camRight * rawInput.X

For AI units (or when inputOverride = true): uses world cardinal axes (Y=forward, X=right).
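As a standalone illustration (not the GS_Play source, which uses AZ::Vector3 and the active camera’s transform), the flatten-and-combine step can be sketched like this; Vec3 and the helper names are illustrative:

```cpp
#include <cassert>
#include <cmath>

// Minimal sketch of the camera-relative transform described above.
struct Vec3 { float x, y, z; };

// Flatten a camera axis onto the XY plane and normalize it.
static Vec3 FlattenToXY(Vec3 v)
{
    v.z = 0.0f;
    const float len = std::sqrt(v.x * v.x + v.y * v.y);
    if (len > 1e-6f)
    {
        v.x /= len;
        v.y /= len;
    }
    return v;
}

// modifiedMoveInput = camForward * rawInput.Y + camRight * rawInput.X
Vec3 CameraRelativeInput(Vec3 camForward, Vec3 camRight, float rawX, float rawY)
{
    const Vec3 f = FlattenToXY(camForward);
    const Vec3 r = FlattenToXY(camRight);
    return { f.x * rawY + r.x * rawX, f.y * rawY + r.y * rawX, 0.0f };
}
```

With a camera pitched down (forward has a Z component), pushing the stick forward still yields a purely horizontal direction, because the pitch is discarded before the combination.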

Step 2 — GroundInputAxis() — Ground-Projected

Projects modifiedMoveInput onto the ground plane using the current groundNormal:

groundedMoveInput = modifiedMoveInput - groundNormal * dot(modifiedMoveInput, groundNormal)

Direction is then normalized and the original magnitude is restored. Movers use groundedMoveInput so movement always follows the surface contour, even on slopes.

Both steps run automatically whenever SetMoveInputAxis is called and again whenever SetGroundNormal is called (ground changed under a moving unit).
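The projection formula above can be sketched standalone (Vec3 and the helper names are illustrative, not GS_Play types):

```cpp
#include <cassert>
#include <cmath>

// Sketch of GroundInputAxis(): remove the component of the input along the
// ground normal, then renormalize and restore the original magnitude.
struct Vec3 { float x, y, z; };

static float Dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
static float Length(Vec3 v)      { return std::sqrt(Dot(v, v)); }

Vec3 GroundProjectInput(Vec3 input, Vec3 groundNormal)
{
    const float magnitude = Length(input);
    const float d = Dot(input, groundNormal);

    // groundedMoveInput = input - groundNormal * dot(input, groundNormal)
    Vec3 p{ input.x - groundNormal.x * d,
            input.y - groundNormal.y * d,
            input.z - groundNormal.z * d };

    // Normalize the projected direction, then restore the input magnitude.
    const float len = Length(p);
    if (len > 1e-6f)
    {
        const float scale = magnitude / len;
        p.x *= scale; p.y *= scale; p.z *= scale;
    }
    return p;
}
```

On a slope, the result tilts the flat input vector to follow the surface while keeping its length, which is why movers can multiply it directly by speed.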


Mode Switching

The MoverContext maintains three independent mode strings, each with a current and last value:

| Mode Type | Current | Last |
|---|---|---|
| Movement mode | currentMoveMode | lastMoveMode |
| Rotation mode | currentRotateMode | lastRotateMode |
| Grounding mode | currentGroundingMode | lastGroundingMode |

ChangeMovementMode("Free") updates the string and broadcasts MoverContextNotificationBus::MovementModeChanged("Free"). Each Mover and Grounder component checks whether its own named mode matches the broadcast and activates or deactivates itself accordingly.

RevertToLastMovementMode() swaps current and last — enabling one-step undo (e.g., return from “Slide” to whatever was active before).
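The current/last pairing behaves like a two-slot history; a minimal sketch of that behavior (names are illustrative, not the GS_Play implementation):

```cpp
#include <cassert>
#include <string>
#include <utility>

// Two-slot mode history, mirroring ChangeMovementMode / RevertToLastMovementMode.
struct ModeState
{
    std::string current;
    std::string last;

    void Change(std::string mode)
    {
        last = current;
        current = std::move(mode);
        // (the real component also broadcasts MovementModeChanged here)
    }

    void Revert() { std::swap(current, last); }
};
```

Because Revert() is a swap rather than a stack pop, calling it twice returns to the starting state; the history is exactly one level deep.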


Context States

A generic key-value store for runtime state flags used by movers and grounders:

AZStd::unordered_map<AZStd::string, AZ::u32> contextStates

SetContextState(name, value) writes and broadcasts MoverContextNotificationBus::ContextStateChanged.

Known state keys:

| Key | Set by | Values | Meaning |
|---|---|---|---|
| "grounding" | PhysicsRayGrounder | 0 = Falling, 1 = Grounded, 2 = Sliding | Ground contact state |
| "StopMovement" | Various | 1 = stop | Signals the Free Mover to zero velocity |

Movement Profile Management

The MoverContext selects the active GS_UnitMovementProfile* by priority:

  1. Influence list — AddMovementProfile(influencer, profile*) is called by MovementInfluenceFieldComponent when the unit enters a trigger volume. Multiple fields stack additively by priority via MovementInfluenceRequestBus::GetPriority.
  2. Global fallback — if influenceList is empty and allowGlobalInfluence is true, checks GlobalMovementRequestBus::GetGlobalMovementProfile.
  3. Default profile — the asset-configured defaultMoveProfile on the component.
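The three-step priority can be sketched standalone. Profile stands in for GS_UnitMovementProfile, and the highest-priority-wins rule among influence fields is an assumption about how stacked fields resolve; the framework queries each field’s priority via MovementInfluenceRequestBus::GetPriority:

```cpp
#include <cassert>
#include <climits>
#include <utility>
#include <vector>

// Illustrative sketch of the profile priority resolution described above.
struct Profile { const char* name; };

const Profile* SelectActiveProfile(
    const std::vector<std::pair<int, const Profile*>>& influences, // (priority, profile)
    const Profile* globalProfile, bool allowGlobalInfluence,
    const Profile* defaultProfile)
{
    // 1. Influence list: pick the highest-priority field profile (assumption).
    const Profile* best = nullptr;
    int bestPriority = INT_MIN;
    for (const auto& [priority, profile] : influences)
    {
        if (profile && priority > bestPriority)
        {
            bestPriority = priority;
            best = profile;
        }
    }
    if (best) { return best; }

    // 2. Global fallback, if allowed and available.
    if (allowGlobalInfluence && globalProfile) { return globalProfile; }

    // 3. Default profile configured on the component.
    return defaultProfile;
}
```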

When the profile changes, MoverContextNotificationBus::MovementProfileChanged(profile*) is broadcast. Movers update their cached activeProfile pointer.

Standby behavior:

  • UnitEnteringStandby → clears the influence list, broadcasts empty mode names (disables all movers/grounders)
  • UnitExitingStandby → re-evaluates profile priority, re-broadcasts current mode names to re-enable the correct movers

Editor-Exposed Fields

| Field | Description |
|---|---|
| defaultMoveProfile | Asset reference to the fallback GS_UnitMovementProfile |
| allowGlobalInfluence | Whether to accept the global movement profile as a fallback |
| startingMoveMode | Mode name to broadcast on activate (one-frame delayed to allow all components to start first) |
| startingRotateMode | Rotation mode name to start with |
| startingGroundingMode | Grounding mode name to start with |
| maxWalkDegrees | Slope angle (degrees) above which movement is no longer considered grounded |

API Reference

Request Bus: MoverContextRequestBus

Commands sent to a specific unit’s MoverContext. ById bus — addressed by unit entity ID.

Input Axis:

| Method | Parameters | Returns | Description |
|---|---|---|---|
| SetMoveInputAxis | AZStd::string axisName, float axisValue | void | Sets the raw input on “x” or “y”. Triggers ModifyInputAxis() and GroundInputAxis(). Clamps total magnitude to 1.0. |
| GetMoveInputAxis | — | AZ::Vector2* | Returns pointer to rawMoveInput. |
| GetModifiedMoveInputAxis | — | AZ::Vector3* | Returns pointer to camera-relative modifiedMoveInput. |
| GetGroundMoveInputAxis | — | AZ::Vector3* | Returns pointer to ground-projected groundedMoveInput. |

Ground State:

| Method | Parameters | Returns | Description |
|---|---|---|---|
| SetGroundNormal | AZ::Vector3 newNormal | void | Updates the ground normal and re-projects modifiedMoveInput. |
| GetGroundNormal | — | AZ::Vector3* | Returns pointer to current ground normal. |
| GetSlopeDirection | — | AZ::Vector3* | Returns pointer to the current slope direction vector. |
| GetSlopeAngle | — | float* | Returns pointer to the current slope angle (dot product with up). |
| GetMaxWalkAngle | — | float* | Returns pointer to the cosine of maxWalkDegrees. |

Context States:

| Method | Parameters | Returns | Description |
|---|---|---|---|
| SetContextState | AZStd::string stateName, AZ::u32 stateValue | void | Writes a state value and broadcasts ContextStateChanged. |
| GetContextState | AZStd::string stateName | AZ::u32 | Returns the current value for the named state. |

Mode Switching:

| Method | Parameters | Returns | Description |
|---|---|---|---|
| ChangeMovementMode | AZStd::string targetMoveMode | void | Sets current move mode and broadcasts MovementModeChanged. |
| RevertToLastMovementMode | — | void | Swaps current and last move mode. |
| ChangeRotationMode | AZStd::string targetRotateMode | void | Sets current rotation mode and broadcasts RotationModeChanged. |
| RevertToLastRotationMode | — | void | Swaps current and last rotation mode. |
| ChangeGroundingMode | AZStd::string targetGroundingMode | void | Sets current grounding mode and broadcasts GroundingModeChanged. |
| RevertToLastGroundingMode | — | void | Swaps current and last grounding mode. |

Profile Management:

| Method | Parameters | Returns | Description |
|---|---|---|---|
| AddMovementProfile | AZ::EntityId influencer, GS_UnitMovementProfile* profile | void | Adds an influence-field profile to the priority list and re-evaluates. |
| RemoveMovementProfile | AZ::EntityId influencer | void | Removes an influence-field profile and re-evaluates. |

Notification Bus: MoverContextNotificationBus

Events broadcast by the MoverContext. ById bus — addressed by unit entity ID. Movers and Grounders subscribe to this.

| Event | Parameters | Description |
|---|---|---|
| MovementModeChanged | AZStd::string modeName | Fired when the movement mode changes. Movers activate or deactivate based on their m_moveModeName. |
| RotationModeChanged | AZStd::string modeName | Fired when the rotation mode changes. |
| GroundingModeChanged | AZStd::string modeName | Fired when the grounding mode changes. Grounders activate or deactivate based on their m_groundModeName. |
| ContextStateChanged | AZStd::string stateName, AZ::u32 value | Fired when a context state value changes. |
| MovementProfileChanged | GS_UnitMovementProfile* profile | Fired when the active movement profile is replaced. Movers update their activeProfile pointer. |

Virtual Methods

Override these when extending the MoverContext.

| Method | Description |
|---|---|
| ModifyInputAxis() | Transforms rawMoveInput into modifiedMoveInput (camera-relative). Override for custom camera or axis mapping. |
| GroundInputAxis() | Projects modifiedMoveInput onto the ground plane into groundedMoveInput. Override for custom surface projection. |

Script Canvas Examples

Changing movement mode:

Reverting to the previous movement mode:

Setting a context state flag:


See Also

  • Movers — Mover components that read from the MoverContext
  • Grounders — Grounder components that write ground state to the MoverContext
  • Movement — Movement system overview
  • Input Data — The input pipeline that feeds SetMoveInputAxis

Get GS_Unit

GS_Unit — Explore this gem on the product page and add it to your project.

4.14.4.1.2 - Movers

Mover base class and concrete mover components — translate MoverContext input into physics-based unit motion via named mode activation.

Movers translate the processed movement input from the MoverContext into physics forces on the unit’s rigid body. Each mover activates for a specific named mode — when the MoverContext broadcasts MovementModeChanged("Free"), only the mover whose m_moveModeName matches "Free" activates. All others deactivate. This makes the locomotion system fully mode-driven and composable.

For usage guides and setup examples, see The Basics: GS_Unit.

Mover component in the O3DE Inspector

 

Contents


Class Hierarchy

GS_MoverComponent (base — mode-aware activation, tick management)
  └── GS_PhysicsMoverComponent (adds RigidBody cache + validity check)
        ├── GS_3DFreeMoverComponent    (mode: "Free")
        └── GS_3DSlideMoverComponent   (mode: "Slide")

GS_MoverComponent (Base)

Tick: AzPhysics::SystemEvents::OnPostSimulateEvent (after the physics step). Optional debugForceTick uses AZ::TickBus for debugging without physics.

Mode-Aware Activation

Listens to MoverContextNotificationBus::MovementModeChanged and RotationModeChanged. Compares the broadcast mode name against its own m_moveModeName and m_rotateModeName. Calls ToggleMovement(true/false) and ToggleRotation(true/false) accordingly. The physics post-simulate handler is only registered when the mover is active, saving ticks when inactive.

Per-Tick Processing

Each tick:

  1. CheckCanOperate() — validates that the required pointers and state are valid
  2. HandleMovement() — calculates and applies movement (no-op in base)
  3. HandleRotation() — calculates and applies rotation (no-op in base)

Key Fields

| Field | Description |
|---|---|
| m_moveModeName | Mode string this mover activates for (set in Activate()) |
| m_rotateModeName | Mode string this mover’s rotation activates for |
| movementActive / rotationActive | Whether movement/rotation processing is currently running |
| delta | Cached frame delta time |
| activeProfile | Pointer to current GS_UnitMovementProfile (updated via MovementProfileChanged) |
| debugForceTick | If true, uses game tick instead of post-simulate (for debugging without physics) |

GS_PhysicsMoverComponent

Extends GS_MoverComponent. On activate, fetches AzPhysics::RigidBody* from the entity. CheckCanOperate() verifies the rigid body pointer is valid before allowing tick processing.


Concrete Movers

| Component | Mode Name | Description |
|---|---|---|
| GS_3DFreeMoverComponent | "Free" | Standard 3D locomotion — camera-relative, spring-damped velocity and rotation |
| GS_3DSlideMoverComponent | "Slide" | Slope-sliding locomotion — activated by the grounder when slope exceeds maxWalkAngle |

API Reference

Movers consume the following bus events:

Consumed

| Bus | Event | Description |
|---|---|---|
| MoverContextNotificationBus | MovementModeChanged(modeName) | Activates or deactivates movement processing based on mode name match. |
| MoverContextNotificationBus | RotationModeChanged(modeName) | Activates or deactivates rotation processing based on mode name match. |
| MoverContextNotificationBus | ContextStateChanged(stateName, value) | Allows movers to react to state flags (e.g. "StopMovement"). |
| MoverContextNotificationBus | MovementProfileChanged(profile*) | Updates the cached activeProfile pointer. |

Virtual Methods

Override these when extending any mover:

| Method | Parameters | Returns | Description |
|---|---|---|---|
| ToggleMovement | bool on | void | Called when the movement mode activates or deactivates. |
| ToggleRotation | bool on | void | Called when the rotation mode activates or deactivates. |
| HandleMovement | — | void | Called each tick when movement is active. Override to implement movement logic. |
| HandleRotation | — | void | Called each tick when rotation is active. Override to implement rotation logic. |
| CheckCanOperate | — | bool | Returns true if the mover has everything it needs to run this tick. |

Extension Guide

Use the Mover ClassWizard template to generate a new mover with boilerplate already in place — see GS_Unit Templates. The template offers two base class options: Physics Mover (default) for rigid-body locomotion, and Base Mover for transform-only movement.

Extend GS_PhysicsMoverComponent to create a custom physics-driven mover.

#pragma once
#include <Source/Unit/Mover/GS_PhysicsMoverComponent.h>

namespace MyProject
{
    class MyCustomMover : public GS_Unit::GS_PhysicsMoverComponent
    {
    public:
        AZ_COMPONENT_DECL(MyCustomMover);
        static void Reflect(AZ::ReflectContext* context);

    protected:
        void Activate() override;

        void HandleMovement() override;
        void HandleRotation() override;

    private:
        // Mode name must be set in Activate():
        // m_moveModeName = "MyMode";
        // m_rotateModeName = "MyMode";
    };
}

In HandleMovement(), read from the MoverContext and write to the rigid body:

void MyCustomMover::HandleMovement()
{
    AZ::Vector3* groundedInput = nullptr;
    GS_Unit::MoverContextRequestBus::EventResult(
        groundedInput, GetEntityId(),
        &GS_Unit::MoverContextRequestBus::Events::GetGroundMoveInputAxis
    );

    // Guard against missing input or an unassigned movement profile
    if (!groundedInput || groundedInput->IsZero() || !activeProfile) return;

    AZ::Vector3 targetVelocity = *groundedInput * activeProfile->moveSpeed;
    m_rigidBody->SetLinearVelocity(targetVelocity);
}

See Also


Get GS_Unit

GS_Unit — Explore this gem on the product page and add it to your project.

4.14.4.1.2.1 - 3D Free Mover

Standard 3D locomotion — camera-relative movement and rotation driven by spring-damped physics, with optional slope slowdown.

GS_3DFreeMoverComponent is the standard mover for walking characters. It reads the ground-projected input vector from the MoverContext, applies optional slope attenuation, computes a destination velocity, and drives the rigid body via AccelerationSpringDamper. Rotation is handled separately using QuaternionSpringDamper to face the last non-zero movement direction.

Mode names: "Free" (both movement and rotation)

For usage guides and setup examples, see The Basics: GS_Unit.

Mover component in the O3DE Inspector

 

Contents


How It Works

HandleMovement()

Called each post-physics tick while movement mode is "Free":

  1. Reads groundedMoveInput pointer from MoverContextRequestBus::GetGroundMoveInputAxis.
  2. If EnableSlopeSlowdown is true, attenuates the input vector on steep slopes (see Slope Slowdown).
  3. Multiplies the attenuated direction by CalculateSpeed() to get destinationVelocity.
  4. Fetches rigidBody->GetLinearVelocity() as the current velocity.
  5. Calls GS_Core::Springs::AccelerationSpringDamper(position, velocity, cachedAcceleration, destinationVelocity, moveHalflife, delta).
  6. Applies the result: rigidBody->SetLinearVelocity(velocity).

The cachedAcceleration is a member field that persists between frames to produce smooth acceleration response even on sudden direction changes.

HandleRotation()

  1. Reads groundedMoveInput from the MoverContext.
  2. Updates lastDirection only when groundedMoveInput is non-zero — so the unit keeps facing the last direction when stopped.
  3. Computes target yaw from lastDirection using atan2f.
  4. Calls GS_Core::Springs::QuaternionSpringDamper(currentRotation, cachedAngularVelocity, targetRotation, rotateHalflife, delta).
  5. Applies cachedAngularVelocity.Z to rigidBody->SetAngularVelocity().
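Step 2, latching lastDirection only on non-zero input, can be sketched standalone (the struct and names are illustrative, not GS_Play types):

```cpp
#include <cassert>

// Keeps the unit facing its last movement direction when input goes to zero,
// mirroring step 2 of HandleRotation().
struct FacingTracker
{
    float lastX = 0.0f;
    float lastY = 1.0f; // default facing +Y

    void Update(float inputX, float inputY)
    {
        // Only latch a new facing when the input is meaningfully non-zero.
        if (inputX * inputX + inputY * inputY > 1e-8f)
        {
            lastX = inputX;
            lastY = inputY;
        }
    }
};
```

Without this latch, releasing the stick would feed a zero vector into the yaw calculation and snap the unit back to a default facing.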

StopMovement State

When ContextStateChanged("StopMovement", 1) fires, the mover zeros both velocity and cached acceleration, then clears the state back to 0. This allows other systems (e.g. landing from a jump, ability wind-up) to cleanly halt the unit.


Slope Slowdown

When EnableSlopeSlowdown is true, the mover reduces movement speed as the slope angle increases toward maxWalkAngle. The attenuation curve uses an exponent applied to the dot product of the movement direction against the uphill slope direction:

attenuation = 1.0 - uphillSlowStrength
              * pow(dot(moveDir, slopeDir), uphillSlowExponent)
              * remap(slopeAngle, startSlowAngle, maxWalkAngle, 0, 1)

The slowdown begins at startSlowAngle degrees and reaches maximum reduction at maxWalkAngle. At or beyond maxWalkAngle, the grounder will switch the unit to "Slide" mode regardless.
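A standalone sketch of the attenuation, under two stated assumptions: remap is a clamped linear remap, and a non-positive uphill dot (moving downhill or across the slope) produces no slowdown:

```cpp
#include <cassert>
#include <cmath>

// Assumed clamped linear remap of value from [lo, hi] to [0, 1].
static float Remap01(float value, float lo, float hi)
{
    const float t = (value - lo) / (hi - lo);
    return t < 0.0f ? 0.0f : (t > 1.0f ? 1.0f : t);
}

// Sketch of the slope-slowdown attenuation formula above.
float SlopeAttenuation(float uphillDot, float slopeAngleDeg,
                       float uphillSlowStrength, float uphillSlowExponent,
                       float startSlowAngle, float maxWalkAngle)
{
    const float d = uphillDot > 0.0f ? uphillDot : 0.0f; // assumption: downhill is unaffected
    return 1.0f - uphillSlowStrength
                * std::pow(d, uphillSlowExponent)
                * Remap01(slopeAngleDeg, startSlowAngle, maxWalkAngle);
}
```

With the default settings (strength 0.5, exponent 2.5, slowdown starting at 30°), walking straight uphill at the max walkable angle halves the input, while any slope shallower than 30° leaves it untouched.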


Speed Calculation

CalculateSpeed() lerps curSpeed toward activeProfile->moveSpeed each frame:

curSpeed = AZ::Lerp(curSpeed, activeProfile->moveSpeed, expf(-speedChangeSmoothing * delta));

speedChangeSmoothing is read from activeProfile. This produces a smooth speed transition when the movement profile changes (e.g. entering a slow zone).
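The exponential factor is what makes the transition framerate-independent. A standalone sketch of the pattern, using the 1 − e^(−k·Δt) factor convention (under which larger smoothing values converge faster; which convention the real CalculateSpeed() uses determines whether larger values mean more or less smoothing):

```cpp
#include <cassert>
#include <cmath>

// Framerate-independent exponential smoothing toward a target speed,
// in the spirit of CalculateSpeed(). Names are illustrative.
float SmoothSpeed(float current, float target, float smoothing, float dt)
{
    // 1 - e^(-k*dt): approaches the target at the same rate regardless of
    // how the elapsed time is sliced into frames.
    const float t = 1.0f - std::exp(-smoothing * dt);
    return current + (target - current) * t;
}
```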


Editor-Exposed Settings

| Field | Default | Description |
|---|---|---|
| moveHalflife | 0.1 | Spring halflife for movement velocity. Smaller = snappier acceleration. |
| rotateHalflife | 0.2 | Spring halflife for rotation. Smaller = faster turn response. |
| EnableSlopeSlowdown | true | Whether steep slopes reduce movement speed. |
| uphillSlowStrength | 0.5 | Maximum speed reduction factor at the max walkable angle (0 = no reduction, 1 = full stop). |
| uphillSlowExponent | 2.5 | Curve shape of the slowdown ramp. Higher = slower onset, steeper drop near max. |
| startSlowAngle | 30° | Slope angle at which slowdown begins. |

Extension Guide

Extend GS_3DFreeMoverComponent to override movement or rotation behavior while keeping the base spring physics.

#pragma once
#include <Source/Unit/Mover/GS_3DFreeMoverComponent.h>

namespace MyProject
{
    class MyFreeMove : public GS_Unit::GS_3DFreeMoverComponent
    {
    public:
        AZ_COMPONENT_DECL(MyFreeMove);
        static void Reflect(AZ::ReflectContext* context);

    protected:
        void HandleMovement() override;
    };
}

void MyProject::MyFreeMove::HandleMovement()
{
    // Call base for standard spring movement
    GS_3DFreeMoverComponent::HandleMovement();

    // Add custom post-processing (e.g. strafe penalty, footstep IK correction)
}

See Also


Get GS_Unit

GS_Unit — Explore this gem on the product page and add it to your project.

4.14.4.1.2.2 - Slide Mover

Slope-sliding locomotion — automatically activated when slope exceeds maxWalkAngle, drives the unit down the slope via spring-damped physics with optional input resistance and automatic recovery.

GS_3DSlideMoverComponent handles slope-sliding locomotion. It activates automatically when the grounder detects a slope angle exceeding maxWalkAngle, and deactivates when the slope eases or the unit slows enough to recover.

Mode names: "Slide" (both movement and rotation)

For usage guides and setup examples, see The Basics: GS_Unit.

Mover component in the O3DE Inspector

 

Contents


How It Works

HandleMovement()

Called each post-physics tick while movement mode is "Slide":

  1. Reads slopeDir from MoverContextRequestBus::GetSlopeDirection.
  2. Computes destinationVelocity = slopeDir * baseSlideSpeed.
  3. If EnableInputSlideResistance is true, reads groundedMoveInput and applies an opposing influence:
    destinationVelocity += groundedMoveInput * InputResistanceInfluence
    
    This allows the player to push slightly against or across the slide direction.
  4. Fetches rigidBody->GetLinearVelocity() as the current velocity.
  5. Calls GS_Core::Springs::AccelerationSpringDamper(position, velocity, cachedAcceleration, destinationVelocity, moveHalflife, delta).
  6. Applies the result: rigidBody->SetLinearVelocity(velocity).
  7. Calls FinishSliding() to check whether the slide should end.

HandleRotation()

  1. Reads slopeDir from the MoverContext.
  2. Computes target yaw from slope direction using atan2f — unit faces down the slope.
  3. Calls GS_Core::Springs::QuaternionSpringDamper(currentRotation, cachedAngularVelocity, targetRotation, rotateHalflife, delta).
  4. Applies cachedAngularVelocity.Z to rigidBody->SetAngularVelocity().

Finish Sliding

FinishSliding() is called every movement tick. It returns true and triggers recovery when both conditions are met:

  • The current slope angle is below minSlideAngle
  • The unit’s current speed is below minSpeedToStop

On recovery:

MoverContextRequestBus::ChangeMovementMode(recoveryMoveMode);
MoverContextRequestBus::ChangeRotationMode(recoveryRotateMode);

Both modes default to "Free", returning the unit to standard locomotion.
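The slide target velocity and the recovery test can be sketched standalone (Vec3 and the parameter names are illustrative, not GS_Play types):

```cpp
#include <cassert>
#include <cmath>

// Sketch of the slide destination velocity and FinishSliding() recovery check.
struct Vec3 { float x, y, z; };

Vec3 SlideDestinationVelocity(Vec3 slopeDir, float baseSlideSpeed,
                              Vec3 groundedInput, bool enableResistance,
                              float inputResistanceInfluence)
{
    // destinationVelocity = slopeDir * baseSlideSpeed
    Vec3 v{ slopeDir.x * baseSlideSpeed,
            slopeDir.y * baseSlideSpeed,
            slopeDir.z * baseSlideSpeed };
    if (enableResistance)
    {
        // destinationVelocity += groundedMoveInput * InputResistanceInfluence
        v.x += groundedInput.x * inputResistanceInfluence;
        v.y += groundedInput.y * inputResistanceInfluence;
        v.z += groundedInput.z * inputResistanceInfluence;
    }
    return v;
}

// Both conditions must hold before the unit recovers to the recovery modes.
bool ShouldFinishSliding(float slopeAngleDeg, float speed,
                         float minSlideAngle, float minSpeedToStop)
{
    return slopeAngleDeg < minSlideAngle && speed < minSpeedToStop;
}
```

Requiring both a shallow slope and low speed prevents the unit from popping back into "Free" mode while it is still carrying slide momentum onto flatter ground.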


Editor-Exposed Settings

| Field | Default | Description |
|---|---|---|
| moveHalflife | 0.1 | Spring halflife for slide velocity. |
| rotateHalflife | 0.15 | Spring halflife for rotation toward slope direction. |
| baseSlideSpeed | 8.0 | Target speed along the slope direction (m/s). |
| EnableInputSlideResistance | true | Whether player input can partially resist or redirect the slide. |
| InputResistanceInfluence | 2.0 | Scalar applied to groundedMoveInput when computing resistance. Higher = more player control. |
| minSlideAngle | 20° | Slope angle below which the unit may recover (if also slow enough). |
| minSpeedToStop | 1.5 | Speed threshold below which the unit may recover (if also on shallow slope). |
| recoveryMoveMode | "Free" | Movement mode to switch to on recovery. |
| recoveryRotateMode | "Free" | Rotation mode to switch to on recovery. |

Extension Guide

Extend GS_3DSlideMoverComponent to override slide behavior while keeping the base spring physics and recovery logic.

#pragma once
#include <Source/Unit/Mover/GS_3DSlideMoverComponent.h>

namespace MyProject
{
    class MySlide : public GS_Unit::GS_3DSlideMoverComponent
    {
    public:
        AZ_COMPONENT_DECL(MySlide);
        static void Reflect(AZ::ReflectContext* context);

    protected:
        void HandleMovement() override;
    };
}

void MyProject::MySlide::HandleMovement()
{
    // Call base for standard slope physics
    GS_3DSlideMoverComponent::HandleMovement();

    // Add custom post-processing (e.g. audio trigger, particle effect on slide)
}

See Also

  • Movers — Mover base class and class hierarchy
  • GS_3DFreeMoverComponent — Standard free-locomotion mover
  • Mover Context — Provides slopeDir, groundedMoveInput, and mode switching
  • Grounders — Detect slope angle and trigger the switch to Slide mode
  • Springs Utility — AccelerationSpringDamper and QuaternionSpringDamper used internally

Get GS_Unit

GS_Unit — Explore this gem on the product page and add it to your project.

4.14.4.1.2.3 - 3D Strafe Mover

Aim-relative strafing movement for first-person and third-person units. Not yet implemented.

The 3D Strafe Mover provides aim-relative movement where the unit strafes relative to its facing direction. This is the standard movement model for first-person and third-person shooters where the camera or aim direction determines the movement frame.

This component is not yet implemented. This page will be updated with full API reference and usage documentation when the component is available.

For usage guides and setup examples, see The Basics: GS_Unit.


See Also


Get GS_Unit

GS_Unit — Explore this gem on the product page and add it to your project.

4.14.4.1.2.4 - Side Scroller Mover

2D side-scrolling movement constrained to a horizontal plane. Not yet implemented.

The Side Scroller Mover provides two-dimensional movement constrained to a horizontal plane with vertical jump support. This is the standard movement model for side-scrolling platformers, beat-em-ups, and other 2D gameplay projected in a 3D environment.

This component is not yet implemented. This page will be updated with full API reference and usage documentation when the component is available.

For usage guides and setup examples, see The Basics: GS_Unit.


See Also

  • Movement – Movement subsystem overview
  • Movers – All available mover components

Get GS_Unit

GS_Unit — Explore this gem on the product page and add it to your project.

4.14.4.1.2.5 - Grid Step Mover

Tile-based grid movement for turn-based or grid-locked unit locomotion. Not yet implemented.

The Grid Step Mover provides discrete, tile-based movement for units that move in fixed steps along a grid. This is the standard movement model for turn-based RPGs, tactics games, puzzle games, and any project that uses grid-locked locomotion.

This component is not yet implemented. This page will be updated with full API reference and usage documentation when the component is available.

For usage guides and setup examples, see The Basics: GS_Unit.


See Also

  • Movement – Movement subsystem overview
  • Movers – All available mover components

Get GS_Unit

GS_Unit — Explore this gem on the product page and add it to your project.

4.14.4.1.3 - Grounders

Grounder base class and concrete grounder components — detect ground contact, compute ground normals and slope data, and drive mode switching on the MoverContext.

Grounders run each physics tick to determine whether the unit is in contact with the ground, what the slope looks like, and whether the surface is walkable. They write their findings to the MoverContext and switch movement modes when the ground state changes.

For usage guides and setup examples, see The Basics: GS_Unit.

Grounder component in the O3DE Inspector

 

Contents


Class Hierarchy

GS_GrounderComponent (base — mode-aware activation, tick management)
  └── GS_PhysicsRayGrounderComponent  (grounding mode: "Free")

GS_GrounderComponent (Base)

Tick: AzPhysics::SystemEvents::OnPostSimulateEvent (after the physics step).

Mode-Aware Activation

Listens to MoverContextNotificationBus::GroundingModeChanged. Compares the broadcast mode name against its own m_groundModeName. Calls ToggleGrounder(true/false) accordingly. The physics post-simulate handler is only registered when the grounder is active.

Per-Tick Processing

Each tick:

  1. CheckCanOperate() — validates required pointers and state
  2. HandleGrounding() — performs ground detection and updates MoverContext (no-op in base)

Key Fields

| Field | Description |
|---|---|
| m_groundModeName | Grounding mode string this grounder activates for |
| delta | Cached frame delta time |

Concrete Grounders

| Component | Grounding Mode | Description |
|---|---|---|
| GS_PhysicsRayGrounderComponent | "Free" | Raycast-based grounder with coyote time, slope detection, and manual gravity |

API Reference

Virtual Methods

Override these when extending any grounder:

| Method | Parameters | Returns | Description |
|---|---|---|---|
| ToggleGrounder | bool on | void | Called when the grounding mode activates or deactivates. |
| HandleGrounding | — | void | Called each tick when the grounder is active. Override to implement ground detection logic. |
| GroundingStateChange | AZ::u32 newState | void | Called when ground contact state changes. Override to react to grounding transitions. |
| CheckCanOperate | — | bool | Returns true if the grounder has everything it needs to run this tick. |

Consumed Buses

| Bus | Event | Description |
|---|---|---|
| MoverContextNotificationBus | GroundingModeChanged(modeName) | Activates or deactivates this grounder based on mode name match. |

Produced Calls

| Bus | Method | Description |
|---|---|---|
| MoverContextRequestBus | SetGroundNormal(normal) | Updates the ground normal used by the MoverContext for input projection. |
| MoverContextRequestBus | SetContextState("grounding", value) | Sets grounding state: 0 = Falling, 1 = Grounded, 2 = Sliding. |
| MoverContextRequestBus | ChangeMovementMode(modeName) | Switches to "Slide" when slope exceeds maxWalkAngle, or back to "Free" on recovery. |

Extension Guide

Use the Grounder ClassWizard template to generate a new grounder with boilerplate already in place — see GS_Unit Templates. The template offers two base class options: Physics Ray Grounder (default, includes raycast, coyote time, and gravity) or Base Grounder for fully custom detection.

Extend GS_GrounderComponent to implement custom ground detection.

#pragma once
#include <Source/Unit/Grounder/GS_GrounderComponent.h>

namespace MyProject
{
    class MyGrounder : public GS_Unit::GS_GrounderComponent
    {
    public:
        AZ_COMPONENT_DECL(MyGrounder);
        static void Reflect(AZ::ReflectContext* context);

    protected:
        void Activate() override;

        void HandleGrounding() override;
        void GroundingStateChange(AZ::u32 newState) override;

    private:
        // Set in Activate():
        // m_groundModeName = "Free";
    };
}

In HandleGrounding(), run your detection and write results to the MoverContext:

void MyGrounder::HandleGrounding()
{
    AZ::Vector3 groundNormal = AZ::Vector3::CreateAxisZ();
    bool isGrounded = false;

    // ... custom detection logic ...

    GS_Unit::MoverContextRequestBus::Event(
        GetEntityId(),
        &GS_Unit::MoverContextRequestBus::Events::SetGroundNormal,
        groundNormal
    );

    AZ::u32 state = isGrounded ? 1 : 0;
    GS_Unit::MoverContextRequestBus::Event(
        GetEntityId(),
        &GS_Unit::MoverContextRequestBus::Events::SetContextState,
        "grounding", state
    );
}

See Also


Get GS_Unit

GS_Unit — Explore this gem on the product page and add it to your project.

4.14.4.1.3.1 - 3D Free Grounder

Raycast-based grounder for standard 3D locomotion — detects ground contact with coyote time, applies manual gravity when airborne, switches to Slide mode on steep slopes.

GS_PhysicsRayGrounderComponent is the standard grounder for walking characters. Each physics tick it casts a ray downward from the unit’s capsule base, evaluates the slope, applies gravity when airborne, and updates the MoverContext with ground normal and grounding state.

Grounding mode name: "Free"

For usage guides and setup examples, see The Basics: GS_Unit.

Grounder component in the O3DE Inspector

 

Contents


How It Works

Each post-physics tick, the grounder runs two operations in order:

  1. GroundCheck() — determines contact, slope, and normal
  2. GroundingMove() — applies gravity or snap force based on contact result

Ground Check

GroundCheck() casts a sphere (matching the capsule base radius) downward from the unit’s feet:

  1. If the ray hits geometry within groundCheckDistance:
    • Computes slope angle from the hit normal vs. world up.
    • Updates MoverContextRequestBus::SetGroundNormal(hitNormal).
    • If slope angle ≤ maxWalkAngle (from the MoverContext): reports Grounded.
    • If slope angle > maxWalkAngle: reports Sliding.
    • Resets coyote timer.
  2. If no hit:
    • Increments coyoteTimer.
    • If coyoteTimer < coyoteTime: still reports Grounded (coyote grace period).
    • If coyoteTimer ≥ coyoteTime: reports Falling.
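The decision logic above can be sketched standalone. The enum values match the "grounding" context state (0/1/2); the function shape and names are illustrative:

```cpp
#include <cassert>

// Sketch of the GroundCheck() decision logic with coyote time.
enum class GroundState { Falling = 0, Grounded = 1, Sliding = 2 };

GroundState EvaluateGroundState(bool rayHit, float slopeAngleDeg, float maxWalkAngle,
                                float& coyoteTimer, float coyoteTime, float dt)
{
    if (rayHit)
    {
        coyoteTimer = 0.0f; // contact resets the grace period
        return slopeAngleDeg <= maxWalkAngle ? GroundState::Grounded
                                             : GroundState::Sliding;
    }
    coyoteTimer += dt;
    // Still report Grounded during the coyote grace period.
    return coyoteTimer < coyoteTime ? GroundState::Grounded : GroundState::Falling;
}
```

The grace period means a unit stepping off a ledge keeps its grounded movement for a few frames, which is what makes late jump inputs feel fair.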

Grounding Move

GroundingMove() applies vertical forces based on the current state:

Grounded: Applies a downward TimedSpringDamper to keep the unit snapped to the surface without bouncing.

Falling: Applies manual gravity — adds gravityScale * AZ::Physics::DefaultGravity to the rigid body’s linear velocity each frame:

AZ::Vector3 velocity = rigidBody->GetLinearVelocity();
velocity.SetZ(velocity.GetZ() + gravity * delta);
rigidBody->SetLinearVelocity(velocity);

Sliding: Does not override vertical velocity — the Free Mover’s AccelerationSpringDamper handles the slide direction entirely.


Grounding State Changes

GroundingStateChange(newState) is called whenever the ground state changes:

| State | Value | Trigger | Action |
|---|---|---|---|
| Falling | 0 | Lost ground contact beyond coyote time | SetContextState("grounding", 0) |
| Grounded | 1 | Ray hit within walkable angle | SetContextState("grounding", 1) |
| Sliding | 2 | Ray hit but slope > maxWalkAngle | SetContextState("grounding", 2), ChangeMovementMode("Slide") |

When returning to Grounded from Sliding, the grounder calls ChangeMovementMode("Free") to restore locomotion.


Editor-Exposed Settings

| Field | Default | Description |
|---|---|---|
| groundCheckDistance | 0.15 | Raycast distance below the capsule base to detect ground. |
| capsuleRadius | 0.3 | Radius of the sphere used for the downward cast (should match capsule). |
| coyoteTime | 0.12 | Seconds of grace period before reporting Falling after losing ground contact. |
| gravityScale | 1.0 | Multiplier on AZ::Physics::DefaultGravity applied when airborne. |
| snapHalflife | 0.05 | Spring halflife for the ground-snap TimedSpringDamper. Smaller = tighter snap. |

Extension Guide

Extend GS_PhysicsRayGrounderComponent to add custom ground detection or state reactions.

#pragma once
#include <Source/Unit/Grounder/GS_PhysicsRayGrounderComponent.h>

namespace MyProject
{
    class MyGrounder : public GS_Unit::GS_PhysicsRayGrounderComponent
    {
    public:
        AZ_COMPONENT_DECL(MyGrounder);
        static void Reflect(AZ::ReflectContext* context);

    protected:
        void GroundingStateChange(AZ::u32 newState) override;
    };
}

void MyProject::MyGrounder::GroundingStateChange(AZ::u32 newState)
{
    // Call base to handle mode switching
    GS_PhysicsRayGrounderComponent::GroundingStateChange(newState);

    if (newState == 0)
    {
        // Unit became airborne — trigger jump animation, etc.
    }
}

See Also

  • Grounders — Grounder base class and class hierarchy
  • Slide Mover — Activated when this grounder reports Sliding state
  • Mover Context — Receives ground normal and state from this grounder
  • Springs Utility — TimedSpringDamper used for ground snap

Get GS_Unit

GS_Unit — Explore this gem on the product page and add it to your project.

4.14.5 - Templates

ClassWizard templates for GS_Unit — unit controllers, input reactors, mover components, and grounder components.

All GS_Unit extension types are generated through the ClassWizard CLI. The wizard handles UUID generation, cmake file-list registration, and module descriptor injection automatically.

For usage guides and setup examples, see The Basics: GS_Unit.

python ClassWizard.py \
    --template <TemplateName> \
    --gem <GemPath> \
    --name <SymbolName> \
    [--input-var key=value ...]

 

Contents


Unit Controller

Template: UnitController

Creates a Controller component that lives on the Controller Entity (the player or AI controller, not the Unit Entity itself). Manages which Unit is currently possessed, and coordinates possession/unpossession events to sibling controller-side components.

Generated files:

  • Source/${Name}ControllerComponent.h/.cpp

CLI:

python ClassWizard.py --template UnitController --gem <GemPath> --name <Name>

Post-generation: Implement OnPossess(unitEntityId) and OnUnpossess() to route activation to the correct input readers and other controller components. The controller entity is persistent; the unit entity is swappable.
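A minimal sketch of that possession handoff, assuming the generated `OnPossess(unitEntityId)` / `OnUnpossess()` hooks described above. `EntityId` and `PossessedUnit` here are illustrative stand-ins, not GS_Unit or AzCore types:

```cpp
// Sketch of a controller-side possession handoff. The controller entity is
// persistent; the possessed unit reference is swappable. Hypothetical names.
#include <cassert>

struct EntityId
{
    unsigned long long value = 0;
    bool IsValid() const { return value != 0; }
};

class SketchControllerComponent
{
public:
    void OnPossess(EntityId unitEntityId)
    {
        m_possessedUnit = unitEntityId;
        // Route activation here: enable input readers, camera follow, etc.
    }

    void OnUnpossess()
    {
        // Deactivate controller-side systems before dropping the reference.
        m_possessedUnit = EntityId{};
    }

    EntityId PossessedUnit() const { return m_possessedUnit; }

private:
    EntityId m_possessedUnit; // which unit this controller currently drives
};
```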

See also: Unit Controllers — full extension guide with header and implementation examples.


Input Reactor

Template: InputReactor

Sits on the Unit Entity. Subscribes to named input events from InputDataNotificationBus (broadcast by an InputReaderComponent on the Controller side) and translates them into MoverContextRequestBus calls.

Generated files:

  • Source/${Name}InputReactorComponent.h/.cpp

CLI:

python ClassWizard.py --template InputReactor --gem <GemPath> --name <Name>

Post-generation:

  • Register event name strings in inputReactEvents in Activate().
  • Implement each OnInputEvent(name, value) handler to call the appropriate MoverContextRequestBus method.
  • Multiple reactors can coexist on a Unit, each handling different input channels.
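The register-then-dispatch shape described in the bullets above can be sketched as a plain name-to-handler map. This is an illustrative reduction — the real reactor subscribes to `InputDataNotificationBus` and forwards to `MoverContextRequestBus`, neither of which appears here:

```cpp
// Sketch of named-input-event dispatch: register event name strings in
// Activate(), then route each OnInputEvent(name, value) to its handler.
// Hypothetical class; not the generated InputReactor component.
#include <cassert>
#include <functional>
#include <string>
#include <unordered_map>

class SketchInputReactor
{
public:
    void Activate()
    {
        // Register the event name strings this reactor listens for
        // (stands in for populating inputReactEvents).
        m_handlers["Jump"] = [this](float value) { m_lastJump = value; };
        m_handlers["Move"] = [this](float value) { m_lastMove = value; };
    }

    void OnInputEvent(const std::string& name, float value)
    {
        if (auto it = m_handlers.find(name); it != m_handlers.end())
        {
            it->second(value); // forward to the matching handler
        }
        // Unregistered names are ignored; another reactor may own them.
    }

    float m_lastJump = 0.0f;
    float m_lastMove = 0.0f;

private:
    std::unordered_map<std::string, std::function<void(float)>> m_handlers;
};
```

Silently ignoring unregistered names is what lets multiple reactors coexist on one Unit, each claiming its own input channels.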

See also: Input Reactor — full extension guide with examples.


Mover Component

Template: Mover

Creates a Mover component that drives an entity’s movement each tick. Activates when GS_MoverContextComponent switches to the matching mode name, then calls HandleMovement() and HandleRotation() while active.

Two base classes available via --input-var:

| Option | Base Class | Movement Drive | Physics Required |
|---|---|---|---|
| Physics Mover (default) | GS_PhysicsMoverComponent | rigidBody linear/angular velocity, post-simulate tick | Yes — PhysicsRigidBodyService |
| Base Mover | GS_MoverComponent | TransformBus world translation, AZ::TickBus | No |

Generated files (Physics Mover):

  • Source/${Name}PhysicsMoverComponent.h/.cpp

Generated files (Base Mover):

  • Source/${Name}MoverComponent.h/.cpp

CLI:

# Physics Mover (default):
python ClassWizard.py --template Mover --gem <GemPath> --name <Name> \
    --input-var mover_base="Physics Mover"

# Base Mover:
python ClassWizard.py --template Mover --gem <GemPath> --name <Name> \
    --input-var mover_base="Base Mover"

Post-generation:

  • Set m_moveModeName and m_rotateModeName strings in Activate() to match the mode names configured in your MoverContext.
  • Implement HandleMovement() and HandleRotation(). The Physics Mover template comes pre-filled with spring-damper stubs using AccelerationSpringDamper and QuaternionSpringDamper.
  • In CheckCanOperate(), lazy-fetch any additional context pointers your mover needs.
  • One Mover per movement mode. All Mover components on a Unit Entity coexist — the MoverContext activates only the matching one at a time.
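The one-mover-per-mode rule in the last bullet can be sketched as follows. The names (`SketchMover`, `TickMovers`) are hypothetical; in the framework the mode matching is done by `GS_MoverContextComponent`, not by iterating a vector:

```cpp
// Sketch of mode-matched activation: several movers coexist on a unit, but
// only the one whose mode name matches the context's current mode runs.
#include <cassert>
#include <string>
#include <vector>

struct SketchMover
{
    std::string moveModeName; // set in Activate() to match the MoverContext
    int ticksHandled = 0;
    void HandleMovement() { ++ticksHandled; }
};

void TickMovers(std::vector<SketchMover>& movers, const std::string& currentMode)
{
    for (auto& mover : movers)
    {
        if (mover.moveModeName == currentMode)
        {
            mover.HandleMovement(); // only the matching mover drives movement
        }
    }
}
```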

See also: Movers — full mover architecture and extension guide with code examples.


Grounder Component

Template: Grounder

Creates a Grounder component that determines ground state (Falling / Grounded / Sliding) each tick and reports it back to GS_MoverContextComponent. Active when the MoverContext switches to the matching grounding mode name.

Two base classes available via --input-var:

| Option | Base Class | Detection | Extras |
|---|---|---|---|
| Physics Ray Grounder (default) | GS_PhysicsRayGrounderComponent | Downward raycast from base class | Coyote time, spring position correction, gravity application |
| Base Grounder | GS_GrounderComponent | Bring your own detection | Tick only |

Generated files (Physics Ray Grounder):

  • Source/${Name}PhysicsRayGrounderComponent.h/.cpp

Generated files (Base Grounder):

  • Source/${Name}GrounderComponent.h/.cpp

CLI:

# Physics Ray Grounder (default):
python ClassWizard.py --template Grounder --gem <GemPath> --name <Name> \
    --input-var grounder_base="Physics Ray Grounder"

# Base Grounder:
python ClassWizard.py --template Grounder --gem <GemPath> --name <Name> \
    --input-var grounder_base="Base Grounder"

Post-generation:

  • Set m_groundModeName in Activate().
  • For Physics Ray Grounder: The base class handles the raycast, spring correction, and gravity. Override HandleGrounding() and GroundingStateChange() only to add custom reactions — call the base first.
  • For Base Grounder: Implement HandleGrounding() with your detection method. Call GroundingStateChange(0/1/2) when state changes (0 = Falling, 1 = Grounded, 2 = Sliding), and SetGroundNormal() on the MoverContext when on a surface.
  • Typically one grounder per unit type. Swap the grounder via mode name for units that switch between ground-based and other locomotion.
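One detail of the Base Grounder contract above is easy to get wrong: `GroundingStateChange(0/1/2)` should fire on transitions, not every tick. A minimal sketch of that edge-triggered reporting, with hypothetical names standing in for the `GS_GrounderComponent` hooks:

```cpp
// Sketch of edge-triggered state reporting for a custom grounder.
// 0 = Falling, 1 = Grounded, 2 = Sliding. Hypothetical wrapper class.
#include <cassert>

class SketchGrounder
{
public:
    // Returns true if a state change was reported this tick.
    bool ReportState(unsigned newState)
    {
        if (newState == m_state)
        {
            return false; // no transition, nothing to report
        }
        m_state = newState;
        ++m_changesReported; // stands in for GroundingStateChange(newState)
        return true;
    }

    unsigned m_state = 0; // starts Falling
    int m_changesReported = 0;
};
```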

See also: Grounders — full grounder architecture and extension guide with code examples.


See Also

For the full API, component properties, and C++ extension guide, see the Framework API pages for GS_Unit.

For ClassWizard templates across the other GS_Play gems, see each gem’s Templates page.



4.14.6 - 3rd Party Implementations

For usage guides and setup examples, see The Basics: GS_Unit.



4.15 - 3rd Party API

API for 3rd party support.

How to handle 3rd party support.

5 - Learn

Lessons for quick or extended learning.

Lessons covering the various features: installation, extended tutorials on deeper usage, and full project creation.


How This Section Is Organized

Video Tutorials are a single video or a playlist covering a targeted topic.

Lessons are guide documents that explain, step by step and with supporting detail, how to accomplish a targeted goal. Lessons tied to a video tutorial include a link to the corresponding Video Page.

5.1 - Lessons

Index of Lesson Guides

Lesson Guides List

5.1.1 - Understanding GS_Play

What is GS_Play and what does it do?

GS_Play Methodology and Purpose

GS_Play is an intermediate-to-advanced game development and production framework. Because of this it can rapidly create prototypes and prove out gameplay, yet it is deeply extensible and customizable, allowing the project to grow and the game to become exactly what you want to make. This does mean the tools are not as “out of the box” as more beginner-friendly options. You should already know how to make videogames, or be actively studying how to develop features, to get the most out of this framework. Check out the library of lessons and guides to dig into any GS_Play feature you’d like to explore.

Due to its modularity, your project should only need to target the features most relevant to your intended gameplay and genre, then build on and around the framework to satisfy any additional needs.

GS_Play is built on the idea of simple, intuitive patterns for every feature — patterns that let you think about how to deploy the functionality, not how it works under the hood. Because of this core tenet, you should be able to reason about what the premade functionality can do, and what custom features you want to contribute to that pool, to author your project rapidly and precisely.


What GS_Play Supports

The core functionality is oriented around character-centric, live-action gameplay. This can be slow and subtle — a point-and-click adventure or survival horror where the action revolves around exploration, investigation, and choosing your own path. It can also be extended to high-paced action: character-driven unit gameplay, supported by cinematic performer visuals, where you attack, roll, jump, dodge, and traverse a complex world. Enter and exit game-time cinematics, or transition into rich, fully authored sequences. The PhantomCam system can keep pace with anything you throw at it.

That is not to say other styles and genres are off the table. Many GS_Play feature sets are genre-agnostic and simply complete the needs of a full production. GS_UI can serve as straightforward menus and indicators, but with the dynamic animation system, wealth of widgets, and precise control of input focus, it can anchor heavily UI-reliant gameplay. With deeply extensible unit control, AI, and input handling, you can pursue the group formation and pathing an RTS requires. Any genre, supported by Audio, VFX, Cinematics, and gameplay systems working together.

The ultimate goal for GS_Play is to get you from a blank canvas to the end credits, and everything in between.


Best Practices

Focus on learning the patterns of each feature set. There are many base elements ready to use from the start, but as you work, keep asking: “What would I do if I needed X for my game?” That question is the right lens for every feature — it keeps you thinking about your game’s needs, not the framework’s internals.

Target only what your game requires. GS_Play’s modularity means you are not obligated to use every gem. Start with the features that directly serve your genre and core loop, then expand from there as the project demands it.

Lean on the EBus pattern. Most systems communicate through request and notification buses. Understanding how to listen for state changes and issue requests is the skill that unlocks the entire framework.

Review Best Practices for full coverage.


Specs

15+ gems, with 30+ feature sets across them.

Feature sets covering

  • Operation — Settings, hardware compatibility, and startup sequencing.
  • Character & Action — Unit controls and actions, with and against the world around them.
  • Environment — Rich environmental development: time, day/night cycles, and sky configuration.
  • Cinematics — Camera work, sequencing, dialogue, and character performance.
  • Game Feel — Audio, UI and 3D effect bursts, post processing, and motion-based feedback.

5.1.2 - Simple Project Setup

Easy setup to get started.

A guide to get everything up and running rapidly.

Video Tutorial

Embed youtube guide.

Link to video_tutorials.

5.1.2.1 - Configure Project

Configure Project to run GS_Play.

Install Gems

Configure Project Gems

Add Project Dependencies

CMakeLists.txt — add the gem API packages to your project target’s build dependencies:

<ProjectGemName>.Private.Object STATIC
    BUILD_DEPENDENCIES
        PUBLIC
            Gem::GS_Play_Core.API
            Gem::GS_<your desired module>.API

Setup Project Environment

Environment Setup

Need Help?

Check out the Project Setup Video Tutorial (Links to video_tutorial tag for setup video)

5.1.2.1.1 - Setting Up the Physics Environment

Setting up your Physics Environment

These are the necessary details to create your project and have it run reliably with all GS_Play feature sets.


Physics

Image showing the standard PhysX Configuration necessary to support all GS_Play features.

Collision Layers

Collision Layers configuration panel

Image showing the standard PhysX Collision Layers necessary to support all GS_Play features.

NoCollision Layer

Environment Layer

Doodad Layer

Trigger Layer

Unit Layer

Pulse Layer

Interact Layer

Regions Layer

Collision Groups

Collision Groups configuration panel

Image showing the standard PhysX Collision Groups necessary to support all GS_Play features.

UnitGrounding Group

Triggering Group

Pulse Group

Doodad Group

Interact Group

AllButEnviro Group

OnlyRegions Group


Set Ground Collision Layer

In order to properly introduce a GS_Unit to the default scene you need to create an object with the “Environment” collision layer to stand on.

The easiest way to start is to change the collision layer of the default “Ground” entity in the level.

5.1.2.2 - Prepare Managers

Prepare Managers to run GS_Play.

Start Creating Manager Prefabs

  • Blank Entity
  • Create Prefab
  • Add to Game Manager

Need Help?

Check out the Project Setup Video Tutorial (Links to video_tutorial tag for setup video)

5.1.2.3 - Prepare Startup

Prepare Startup to run GS_Play.

Start Creating Manager Prefabs

  • Blank Entity
  • Create Prefab
  • Add to Game Manager

Need Help?

Check out the Project Setup Video Tutorial (Links to video_tutorial tag for setup video)

5.1.2.4 - Prepare Camera

Prepare Camera to run GS_Play.

Start Creating Manager Prefabs

  • Blank Entity
  • Create Prefab
  • Add to Game Manager

Need Help?

Check out the Project Setup Video Tutorial (Links to video_tutorial tag for setup video)

5.1.2.5 - Get Ready to Start!

Prepare Gameplay to run GS_Play.

Resources to start using specific features.

Good things to know before starting.

You’re ready to start!

5.1.3 - Understanding Agentic Guidelines

Human-facing explanation of how the Agentic Guidelines page works — its structure, language rules, and how to keep it accurate. A how-to reference, not an open invitation to edit.

What the Agentic Guidelines Page Is

The parent page — Agentic Guidelines — is not a documentation page in the normal sense. It contains no prose meant for a human to read comfortably. It is a structured seed document, intended to be pasted or loaded into an AI agent session before any GS_Play framework work begins.

Think of it as a mission briefing card. It is dense, compressed, and declarative. Every line exists to be parsed and acted on by a machine — not to be understood leisurely by a person.

Why it exists: Agents working without this document make predictable, repeatable errors. They guess EBus names incorrectly, place EnableForAssetEditor in the wrong reflection context, write polymorphic containers with the wrong pointer type, or generate C++ where a scripting answer was needed. The agentic guidelines page is the countermeasure for all of those failures.

What it is not: It is not a tutorial, an overview, or a feature summary. Those documents live elsewhere in the framework documentation. This page exists only to give an agent precise, actionable constraints and lookup data before it starts working.


Who Uses This Framework and Why It Matters

The framework has two distinct user groups. This distinction determines what kind of guidance an agent should give, and it is the single most important thing to keep in mind when editing this document.

Designer / Scripter

This user works entirely inside the O3DE editor. They configure components in the entity inspector, connect logic with Script Canvas or Lua, set up prefabs, and tweak exposed properties. They never write C++. For this user, an agent should provide guidance about:

  • Which components to add to an entity
  • How to configure component properties
  • How to wire EBus events through Script Canvas nodes
  • How to structure prefabs and entity hierarchies

Engineer / Developer

This user writes C++ to extend the framework. They inherit from GS_Play base classes, create new gems or project-level systems, write new components, and build on top of the existing architecture. For this user, an agent should provide guidance about:

  • Which base class to inherit from
  • How to implement EBus interfaces
  • How to register new types (dialogue effects, pulse types, etc.)
  • How to structure reflection, module registration, and CMake targets

Engineers have full freedom to combine gems. GS_Complete is the framework’s reference integration layer — it demonstrates cross-gem patterns but is not a restriction on what engineers can do in their own projects. An engineer may freely include headers from multiple gems in their project code.


How the Document Is Structured

The page uses three tiers of information. Understanding these tiers is the most important thing before contributing.

Tier 1 — Invariants

Hard rules with no exceptions. These live at the very top of the document in the INVARIANTS section. An agent that reads only the first 600 tokens of the document will encounter all of them.

Invariants are:

  • Specific, not general (“polymorphic containers use BaseT*”, not “use the right container type”)
  • Complete enough to act on without reading anything else
  • Framed as prohibitions or exact requirements — never suggestions

Do not add to the Invariants section unless a rule is a hard technical constraint that has no contextual exceptions and causes a silent failure or editor malfunction when violated.

Tier 2 — Ontology and Reference

This is the structural description of what the system is: the SYSTEM ONTOLOGY paragraph, the GEM INDEX table, the EBUS DISPATCH REFERENCE, and the HOT PATHS section. This tier answers the question: what is this system and how do I operate it?

This tier is:

  • Table-first. Prose is minimal.
  • Accurate. A single wrong EBus method name here means an agent fails silently.
  • Compressed. One row per concept, not one paragraph per concept.

Tier 3 — Conditional Depth

This tier does not provide information directly. It tells the agent when to stop and go deeper. The CONTEXT ANCHORS, CONFIDENCE THRESHOLDS, and CLARIFICATION TRIGGERS sections all belong here. They point the agent toward more information and define the exact conditions under which to seek it.


The Token Grammar

The document uses a small, fixed vocabulary of bracketed tokens. These act as machine-readable instruction labels. Agents across all backends pick them up reliably through pattern matching.

| Token | Meaning | Location |
|---|---|---|
| [INVARIANT] | Hard constraint. Never violate. No exceptions. | INVARIANTS section |
| [ANCHOR] | Stop and read the linked documentation before proceeding. | CONTEXT ANCHORS, inline in HOT PATHS |
| [MEMORY] | Persist this content if a memory system is available. | MEMORY PROTOCOL section |
| [ASK] | Do not infer. Pause and ask the user before proceeding. | CLARIFICATION TRIGGERS, inline in HOT PATHS |
| [ANTIPATTERN] | What not to do and briefly why. | ANTIPATTERN CATALOG section |

Use only these tokens. Do not invent new ones. Tokens work because they are stable — an agent learns to treat [INVARIANT] as a certain category of instruction. A new, undiscussed token may be ignored or misinterpreted.

If you genuinely believe a new token category is needed, document the reasoning and discuss it before adding it. The bar is high: the existing five tokens cover almost every instruction type needed.


Editing the Agentic Guidelines: Step-by-Step Workflow

Step 1 — Identify what changed

Before opening the file, answer these questions:

  • Was there a codebase change (new EBus, new component, renamed method)?
  • Was there an error an agent actually made — something it got wrong that this document should have prevented?
  • Is there a new common task that agents are regularly asked to do with no hot path covering it?
  • Is there a new documentation page that an agent should be directed to?

If none of these are true, you probably do not need to edit the parent document. Editing it without a concrete reason adds noise.

Step 2 — Locate the right section

Use this map to find where your change belongs:

| Type of change | Target section |
|---|---|
| A rule that must never be broken | INVARIANTS |
| A new gem or renamed manager | GEM INDEX |
| A new or renamed EBus or method | EBUS DISPATCH REFERENCE |
| A new common code task | HOT PATHS |
| A mistake agents keep making | ANTIPATTERN CATALOG |
| A decision that needs user input | CLARIFICATION TRIGGERS |
| A documentation page that gives deep context | CONTEXT ANCHORS |
| Something agents can safely assume | CONFIDENCE THRESHOLDS |
| A memory entry agents should persist | MEMORY PROTOCOL |

If your change doesn’t fit any of these cleanly, it is probably better expressed in the framework’s technical documentation pages rather than here.

Step 3 — Apply the minimum change

Do not add surrounding context, explanations, or improvements to adjacent content. The agentic document is optimized for token density — adding prose that an agent does not need to act on degrades its quality.

Apply the minimum correct change:

  • One new row in a table
  • One new [INVARIANT] entry
  • One revised method name
  • One new numbered step in a hot path

Resist the impulse to expand.

Step 4 — Verify accuracy

Before saving, verify every name, method, and path in your change against the source:

  • EBus method names — Open the source header for that bus. Copy the method name exactly. Do not paraphrase.
  • Doc paths — Confirm the path actually exists. A broken anchor is worse than no anchor.
  • Base class names — Confirm against the gem’s public Include/ headers.
  • TypeIds — Do not add TypeId values unless you have copied them from GS_{Name}TypeIds.h.

Step 5 — Apply the self-test

Read only the section you changed and ask: does an agent reading only this section know what to do, or does it still need to infer?

If it still needs to infer, the entry is incomplete. If it specifies exact names, exact locations, exact conditions, and exact patterns — it is ready.

Step 6 — Update this contributing page if structure changed

If you added a new section to the parent document, added a new token type, changed the tier structure, or changed any guidance about how to contribute — update this page to match. The two documents must stay in sync. See the section below: When to Update This Page.


Section-by-Section Editing Reference

BOOTSTRAP PROTOCOL

Edit when: The loading order or session re-entry behavior of the document should change.

Rules:

  • Keep this section under 8 lines. It is the first thing an agent reads.
  • Instructions must be ordered. Numbered list only.
  • Do not add explanatory prose. Each line is an action, not a description.

INVARIANTS

Edit when: A new silent-failure rule has been discovered through a real agent error, or an existing invariant is no longer accurate.

Qualifying test for a new invariant — all must be true:

  1. It is a specific technical rule, not a general principle.
  2. Violating it causes a failure that is silent or extremely hard to diagnose.
  3. It applies in all contexts, with no exceptions.
  4. It cannot be reliably derived from general O3DE knowledge.

Format:

`[INVARIANT]` **Short title.** One or two sentences of exact, actionable constraint.
Optionally: a code block showing the correct pattern.

Do not add O3DE general best practices here. They do not belong in the invariants list — agents already know them.


MEMORY PROTOCOL

Edit when: A new section is added to the parent document that should be persisted to memory, or a memory key is renamed.

Rules:

  • Memory keys are lowercase with underscores: gs_play_invariants, not GS Play Invariants.
  • Only add memory entries for sections that are dense enough to be worth re-loading from memory rather than re-reading.
  • The freshness rule (source code wins over stored memory) must never be removed.

SYSTEM ONTOLOGY

Edit when: The engine version changes, the build toolchain changes, or the platform target changes.

Rules:

  • The attribute table is for hard facts only: version numbers, language, build system, platform.
  • The “Core architectural rules” bullets must remain at three or fewer. If you need to add a fourth, one of the existing three should be absorbed into a more general statement.
  • Do not add GS_Complete or internal gem restrictions as architectural rules. These are framework-internal development constraints, not user-facing rules.

GEM INDEX

Edit when: A new gem is added, a gem is renamed, a manager component is added or renamed, a primary bus is renamed, or a documentation path changes.

Rules:

  • One row per gem. Do not split a gem across two rows.
  • The dependency graph must be updated any time the gem dependency structure changes.
  • Doc paths must be real paths. Verify before adding.
  • GS_Complete remains in the index as the integration layer reference — it documents a real gem. Its description should clarify it is Genome Studios’ reference layer, not a restriction on user code.

EBUS DISPATCH REFERENCE

Edit when: A new EBus is added to any gem, a method is renamed, or the dispatch policy of a bus changes.

Rules:

  • Method names are copied from source headers. Not paraphrased, not summarized — exact.
  • Include only the most commonly-called methods. This is a quick-lookup reference, not a full API.
  • If a bus is entity-addressed, the Dispatch column must say ById. If it is a singleton broadcast, the column says Broadcast. If it is a listener-only bus, say Listen.
  • Adding a new gem requires a new ### {GemName} buses subsection.

HOT PATHS

Edit when: A new common task emerges that agents are regularly asked to perform, or an existing hot path’s steps become inaccurate.

Qualifying test for a new hot path — all must be true:

  1. Agents are asked to do this task regularly.
  2. There is a correct pattern that is non-obvious or differs from standard O3DE conventions.
  3. There is a genuine way to get it wrong that the hot path would prevent.

Format:

### HOT PATH N — Short task description

1. First discrete action.
2. Second discrete action.
3. Third discrete action.

`[ANCHOR]` Condition requiring deeper reading → /docs/path/to/page/

Rules:

  • Steps are single, discrete actions. Not compound sentences.
  • No prose between steps.
  • Add an [ANCHOR] at the end if there is a documentation page that provides necessary deeper detail.
  • Hot paths are for engineer (C++ extension) tasks. Designer tasks belong in component or scripting documentation, not here.

ANTIPATTERN CATALOG

Edit when: A real agent error is identified that this document should prevent, or an existing antipattern is no longer relevant.

Qualifying test — all must be true:

  1. An agent actually made this mistake, or will predictably make it without the warning.
  2. The failure mode is silent or ambiguous — not a compile error that would surface immediately.
  3. The correct alternative is already documented (in INVARIANTS or HOT PATHS).

Format:

`[ANTIPATTERN]` **Short title.** One sentence describing the failure. One sentence pointing to the correct behavior or its location in this document.

Do not add antipatterns that are just general bad practices. Every entry here should represent a failure mode specific to GS_Play or to O3DE patterns that agents commonly get wrong.


CONFIDENCE THRESHOLDS

Edit when: A GS_Play-specific assumption has been confirmed safe (add to “proceed” list) or a pattern that agents incorrectly assume they know turns out to need verification (add to “stop” list).

Rules:

  • The “proceed” list contains patterns where O3DE general knowledge is accurate and sufficient.
  • The “stop” list contains patterns where GS_Play diverges from general O3DE knowledge, or where API details are too precise to guess.
  • Do not add items to the “proceed” list unless they have been verified accurate across the framework.

CONTEXT ANCHORS

Edit when: A new documentation page is added that meaningfully fills a knowledge gap an agent would otherwise have to guess around, or an anchor’s target path changes.

Qualifying test for a new anchor:

  1. There is a specific documentation page at a specific path.
  2. An agent working without that page would make structural errors on this task.
  3. The condition is narrow enough to fire at the right moment (“before writing unit movement code”) — not so broad that it triggers on general O3DE work.

Format:

`[ANCHOR]` **Before doing [specific task]:**
→ Read /docs/path/to/page/ for [what the agent needs from it — be specific].
→ `[ASK]` Optional inline clarification prompt if a user decision is required here.

Note on GS_Complete anchors: GS_Complete is a valid reference target. An anchor pointing to it as a reference (“review GS_Complete for examples”) is appropriate. An anchor that frames it as a restriction (“you must implement this there”) is not — that is a framework-internal structural convention, not a rule that applies to user project code.


CLARIFICATION TRIGGERS

Edit when: A new category of user decision is identified where agents predictably infer the wrong answer, or an existing trigger is too broad to fire usefully.

Qualifying test:

  1. The agent cannot determine the correct answer from documentation alone — it requires user intent.
  2. Agents predictably get this wrong when left to infer.
  3. Getting it wrong has real consequences (wrong code generated, wrong scope, irreversible decisions).

The USER CONTEXT trigger at the top of this section must never be removed. It is the most important clarification in the document — it gates all subsequent behavior between designer/scripter guidance and engineer guidance.

The two documentation sections referenced in USER CONTEXT are:

  • The Basics (/docs/the_basics/) — Scripter and designer documentation. Covers component usage, property configuration, and scripting for all gems.
  • Framework API (/docs/framework/) — Engineer documentation. Covers C++ interfaces, EBus signatures, base class architecture, and reflection patterns for all gems.

Both sections cover the same featureset across the same gem set: core, cinematics, environment, interaction, performer, phantomcam, ui, unit, ai, item, rpstats. Update the USER CONTEXT doc links if any gem is added, renamed, or its documentation path changes in either section.


What Makes Bad Agentic Content

Understanding what to avoid is as important as understanding what to write.

Prose explanations. The parent document must never read like a tutorial. “The EBus system in O3DE works by…” is wasted tokens. An agent already knows what EBus is. It needs the specific name and specific dispatch syntax.

Vague headings. “Common Patterns”, “Unique Details” — these tell neither a human nor a machine what type of instruction to expect. Every section heading in the parent document is also a category of instruction.

Soft language. “You should consider”, “it is generally recommended” — agents do not weight soft language correctly. If something must be done a specific way, say so directly. Preferences belong in other documentation pages, not here.

Information without a trigger condition. Raw information with no instruction about when to use it adds cognitive load without adding value. Every fact in the parent document is attached to a pattern, a constraint, a lookup trigger, or a warning.

Framework-internal conventions framed as user-facing rules. Conventions about how the framework’s own gem structure is organized (like where cross-gem components live within the framework itself) should not appear as [INVARIANT] or [ANTIPATTERN] entries. Engineers extending the framework in their own projects have full freedom to organize code as their project requires.

Outdated EBus names or method signatures. This is the most damaging type of error in the document. An agent trusting a wrong bus name fails silently, and the diagnosis will be non-obvious. Before adding or updating anything in the EBus section, verify against the current source header.


Maintaining Accuracy

When any of the following changes in the codebase, update the corresponding section:

  • New EBus added → EBUS DISPATCH REFERENCE: add a table row.
  • EBus method renamed → EBUS DISPATCH REFERENCE: update the row; scan HOT PATHS for any direct references.
  • New gem added → GEM INDEX: new row and dependency-graph update; EBUS DISPATCH REFERENCE; CLARIFICATION TRIGGERS USER CONTEXT gem list (both /docs/the_basics/ and /docs/framework/ paths).
  • New Manager Component added or renamed → GEM INDEX: update the Manager and Primary Bus columns.
  • New documentation page published → CONTEXT ANCHORS: add an anchor if it fills a knowledge gap.
  • Documentation page moved or renamed → CONTEXT ANCHORS: update the path; if it is a gem root page in The Basics or Framework, also check the CLARIFICATION TRIGGERS USER CONTEXT links.
  • O3DE API change affecting Reflect() patterns → INVARIANTS and HOT PATHS 1 and 5: verify the code blocks still compile.
  • New common agent task identified → HOT PATHS: add a new numbered path.
  • New silent-failure pattern discovered → ANTIPATTERN CATALOG: add an entry.
  • GS_Play base class API change → CONFIDENCE THRESHOLDS: verify the "stop" list is still accurate.

When to Update This Contributing Page

This contributing page should stay in sync with the parent document’s structure. Update it when:

  • A new section is added to the parent document — add a corresponding entry to the Section-by-Section Editing Reference above.
  • A token is added to the grammar — add it to the Token Grammar table.
  • The tier structure changes — update the “How the Document Is Structured” section.
  • The distinction between user contexts changes (new user type, renamed category) — update “Who Uses This Framework”.
  • Any rule in the section-specific guidance becomes outdated.

This page is allowed to have prose. It is written for humans. Explanations, reasoning, and examples are all appropriate here. The parent document is the one that must remain dense and machine-optimized.


Testing Your Contribution

Before saving any change to the parent document, apply this test:

Read only the section you changed. Ask: does an agent reading only this section know exactly what to do, or does it still need to infer?

If it still needs to infer, the entry is incomplete. It is ready when it specifies:

  • Exact names (method names, class names, bus names)
  • Exact locations (file paths, doc paths, section names)
  • Exact conditions (the trigger that fires this instruction)
  • Exact patterns (the correct code form or step sequence)

A strong contribution eliminates a whole class of agent errors. A weak contribution adds words without eliminating uncertainty. When in doubt, add nothing here; write the knowledge into the relevant framework documentation page instead and let an [ANCHOR] point to it.

5.2 - Video Tutorials

Index of video-based tutorials.

See our GS_Play YouTube channel.

Tutorial Sets

5.2.1 - Simple Project Setup

Easy setup to get started.

A guide to get everything up and running quickly.

Video Tutorial

Embed YouTube guide.

Link to video_tutorials.