
Test Zero logo
Test Zero
2D Unity Game Development Journal

Find Test Zero on Steam

🌐 View this journal in other languages: 한국어 🇰🇷

This journal explores the journey and challenges we faced while developing a 2D shooter in Unity as part of the INTD450 course at the University of Alberta. We hope it helps anyone interested in game development, especially with Unity, learn from our experiences.

Note: Anything in this journal might be outdated or not best practice. It is meant to be a learning resource, not a definitive guide. Always refer to the latest Unity documentation and community resources for up-to-date information.

Table of Contents

Basics in Unity

Back to the Top

If you have even a little experience with programming languages, you have probably heard of object-oriented programming (OOP). However, in many game engines, including Unity, it also helps to understand the Entity-Component-System (ECS) pattern. According to Wikipedia, "An ECS comprises entities composed from components of data, with systems which operate on the components."

Entities are the objects in your game, such as a player or an enemy. Components are the data and functionality that tell an entity how to behave, such as its position, health, or what it can do. Systems are the logic that processes the components of entities; in other words, systems control the behavior of entities based on their components.

GameObjects and Components

Back to the Top

In Unity, GameObjects are the entities. An interesting thing is that GameObjects can also work like folders: you can organize your GameObjects in a hierarchy, where a GameObject can have child GameObjects.


Object Hierarchy

Notice how objects are organized in parent-child relationships, allowing for grouped transformations and better scene organization.


Scene View

Components can include scripts, physics properties, renderers, and more.

Each GameObject can contain multiple components that define its behavior and appearance. In this scene, each wall and ceiling object contains a BoxCollider2D component, which allows them to interact with other objects in the game world. These BoxColliders are represented by the green outlines visible in the scene view.

While these objects also have SpriteRenderer components (which would normally make them visible), they aren't visually apparent in the scene because this level uses a tile-based approach for visuals rather than individual sprites for background and each collision object.

Camera & UI

The Camera is the viewpoint of the game; it determines what is visible on the screen. UI (User Interface) consists of the visual elements that let players interact with the game, such as menus, buttons, and the HUD (Heads-Up Display).

Camera

Back to the Top


Scene View (left) and Game View (right) with 2D view

White outlines represent the camera's viewport in the Scene View, showing what will be visible in the Game View. The Game View displays the actual game as players will see it.


Scene View with 3D view

Although this game is 2D, you might have to consider the camera's perspective and how it affects the player's view of the game world. Sometimes, it helps to change camera perspective to see how the game looks from different angles, especially when debugging or designing levels.

Cinemachine

Back to the Top

Cinemachine is a powerful package in Unity that provides advanced camera controls and features. It makes it simple to create complex camera behaviors without writing much code.


Cinemachine Camera Transition

Adding several Cinemachine Cameras with different priority values allows you to switch between cameras based on the player's actions or game events. For example, you can have a camera that follows the player, another that focuses on a specific area, and a third that provides a cinematic view during cutscenes.
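As a rough illustration, switching cameras by priority can be as simple as changing a number at runtime. This sketch assumes the Cinemachine 2.x API (CinemachineVirtualCamera); the class, field names, and values are hypothetical:

using UnityEngine;
using Cinemachine;

public class CameraSwitcher : MonoBehaviour
{
    [SerializeField] private CinemachineVirtualCamera _followCam;
    [SerializeField] private CinemachineVirtualCamera _cutsceneCam;

    // The virtual camera with the highest Priority wins, so raising one
    // above the other triggers a blended transition between them.
    public void EnterCutscene()
    {
        _cutsceneCam.Priority = 20; // hypothetical values; only the ordering matters
        _followCam.Priority = 10;
    }

    public void ExitCutscene()
    {
        _cutsceneCam.Priority = 0;
    }
}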

UI

Back to the Top


Scene View (left) and Game View (right)

This is an example of a simple UI button setup. An interesting thing is that these UI elements are not visible in the Scene view, but they are visible in the Game view. This is because UI elements are rendered on top of the game world, allowing players to interact with them without interfering with the game objects.


Scene View with camera zoomed out

If you zoom out the scene view, you can see how the UI elements are positioned relative to the camera.


Overlay

This is an example of a UI overlay that displays the player's health and ammo. The overlay is positioned in the top left corner of the screen, and it updates in real time as the player takes damage or uses ammo.
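As a minimal sketch of how such an overlay might be updated, assuming a UI Image with its Image Type set to Filled (the names here are hypothetical, not the actual HUD code):

using UnityEngine;
using UnityEngine.UI;

public class HealthOverlay : MonoBehaviour
{
    [SerializeField] private Image _healthFill; // a Filled-type Image acting as the bar

    public void SetHealth(float current, float max)
    {
        // fillAmount expects a ratio in [0, 1].
        _healthFill.fillAmount = Mathf.Clamp01(current / max);
    }
}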

Input Handling

There are two main ways to handle user input in Unity: the Old Input System and the New Input System.

Old Input System

Back to the Top

The Old Input System is very simple and easy to use.

public class OldInputSystem : MonoBehaviour
{
    void Update()
    {
        // Fires only on the frame the key goes down.
        if (Input.GetKeyDown(KeyCode.Space))
        {
            Debug.Log("Space key was pressed");
        }
        // Fires every frame while the key is held.
        if (Input.GetKey(KeyCode.Space))
        {
            Debug.Log("Space key is being held");
        }
    }
}

One of the ways to handle user input is through the Update() method, which is called once per frame.

Note that Input.GetKeyDown() checks if a key was pressed down during the current frame, while Input.GetKey() checks if a key is being held down. If I hold the space key, the first log will be printed once while the second log will be printed every frame until I release the key.

We are not sure if there is another way to manage user input in the Old Input System. However, this polling approach is not ideal for all cases, as it can lead to performance issues if not managed properly.

New Input System

Back to the Top

The New Input System provides a more flexible and efficient way to handle user input. Instead of checking for input in the Update() method, you can use events to respond to user input. This allows you to handle input more efficiently and reduces the need for constant polling.

Understanding these terms would be helpful when using the New Input System:

  • Subscription: A way to register a method to be called when an input action is triggered.

  • Action: Represents a specific input action, such as "Jump" or "Fire". Actions can be bound to multiple input devices (keyboard, gamepad, etc.).

  • Action Map: A collection of related actions, allowing you to group input actions together.

  • Input Action Asset: A file that defines input actions and their bindings.


Input Action Asset

An Input Action Asset is a file that defines the input actions and their bindings (key or button combinations). There can be multiple action maps within a single Input Action Asset, allowing you to organize your input actions based on different contexts (e.g., gameplay, menu navigation).

In the example above, there are two action maps: "Player" and "UI". The "Player" action map contains actions for movement, jumping, and shooting, while the "UI" action map contains actions for navigating menus.

The idea of having multiple action maps is to allow you to switch between different sets of input actions based on the current context of the game. However, you can manage all input actions within a single action map if you prefer simplicity.
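Here is a minimal sketch of the subscription style, assuming the Input System package is installed and a "Jump" action exists in your Input Action Asset (the class and field names are hypothetical):

using UnityEngine;
using UnityEngine.InputSystem;

public class JumpInput : MonoBehaviour
{
    [SerializeField] private InputActionReference _jumpAction; // assign the "Jump" action in the Inspector

    private void OnEnable()
    {
        // Subscription: register a method to be called when the action triggers.
        _jumpAction.action.performed += OnJump;
        _jumpAction.action.Enable();
    }

    private void OnDisable()
    {
        // Always unsubscribe to avoid dangling callbacks.
        _jumpAction.action.performed -= OnJump;
        _jumpAction.action.Disable();
    }

    private void OnJump(InputAction.CallbackContext context)
    {
        Debug.Log("Jump was triggered");
    }
}

Notice that no Update() polling is involved; the callback only runs when the bound input actually fires.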

Optimization

There are several ways to optimize your Unity game for better performance. Here are some tips I found useful:

The Update() method is called once per frame, which can lead to performance issues if used excessively. Instead, consider using events or coroutines to handle input and other time-sensitive actions.

Events

Back to the Top

Let's say you want to run some code whenever the player is close to an object.

void Update()
{
    if (Player != null)
    {
        float distance = Vector3.Distance(Player.transform.position, transform.position);
        if (distance < 5f)
        {
            Debug.Log("Player is close to the object.");
        }
    }
}

If you use the Update() method, it will calculate the distance every frame. If there are many objects in the scene and each object has this script, it can lead to performance issues.

The more complicated the logic in the Update() method, the more performance issues you will encounter.

Colliders are among the most commonly used components; they allow you to detect collisions between objects.

void OnCollisionEnter2D(Collision2D collision)
{
    if (collision.gameObject.CompareTag("Player"))
    {
        // Perform the desired action when the player collides with this object
        Debug.Log("Player collided with the Opt object.");
    }
}

You can use the OnCollisionEnter2D() method to detect collisions. This method is called only when a collision occurs, which reduces the number of calculations and improves performance.

Coroutines

Back to the Top

According to Unity's documentation, "Coroutines provide an excellent way of easily managing things that need to happen after a delay or over the course of time. They prevent Update methods from becoming bloated with timers and the other workings required to achieve the same outcome with a different approach."


Spot light (left) and Spot light with coroutine (right)

It is possible to achieve the same effect with the Update() method, but it would be easier to manage the code with a coroutine.

public class SpotLightController : MonoBehaviour
{
    [SerializeField] private float _time = 3f;
    [SerializeField] private float _rotationAngle = 30f;
    [SerializeField] private bool _rotateLeft = true;
    private Transform _lightHead;

    private void Awake()
    {
        _lightHead = transform.GetChild(0);
        StartCoroutine(RotateLightHead()); // the method call is type-safe; the string overload also works
    }
}

As soon as the script is initialized, it starts the RotateLightHead coroutine, which keeps rotating the light head continuously across frames.

IEnumerator RotateLightHead()
{
    while (true)
    {
        int direction = _rotateLeft ? 1 : -1;

        // First rotation
        yield return RotateByAngle(direction * _rotationAngle);
        // Return to origin
        yield return RotateByAngle(-direction * _rotationAngle);

        // Second rotation (opposite direction)
        yield return RotateByAngle(-direction * _rotationAngle);
        // Return to origin
        yield return RotateByAngle(direction * _rotationAngle);
    }
}

IEnumerator RotateByAngle(float angle)
{
    float startAngle = _lightHead.localRotation.eulerAngles.z;
    if (startAngle > 180f) startAngle -= 360f; // Normalize angle
    float targetAngle = startAngle + angle;
    float elapsedTime = 0f;

    while (elapsedTime < _time)
    {
        float currentAngle = Mathf.Lerp(startAngle, targetAngle, elapsedTime / _time);
        _lightHead.localRotation = Quaternion.Euler(0f, 0f, currentAngle);
        elapsedTime += Time.deltaTime;
        yield return null;
    }

    _lightHead.localRotation = Quaternion.Euler(0f, 0f, targetAngle);
}

RotateByAngle is a coroutine that rotates the light head by a specified angle over a given time period. I initially thought of coroutines as a way to run logic in the background (multithreading), but they actually run on the main thread: each yield pauses the coroutine, and Unity resumes it on a later frame.

Debugging

Back to the Top


Console Tab

The Console tab is a powerful tool for debugging your Unity game. It allows you to see log messages, warnings, and errors generated by your scripts. If the Console tab is not visible, it can be opened from Window > General > Console.

Debug.Log("This is a log message."); // White text in the console
Debug.LogWarning("This is a warning message."); // Yellow text in the console
Debug.LogError("This is an error message."); // Red text in the console

Log messages can be printed like above. Warnings are not critical, but they can indicate potential issues that may affect your game in the future. Errors indicate that something went wrong; compile errors will even prevent your game from running until they are resolved.

If there are any errors in your code, they will be displayed in the console. Checking the debug messages is the first step in debugging your game.

Unity's console is not available in the built game, so we used an in-game debug console to check messages.


Development Build

When building, the Development Build option can be enabled to see the console in the built version. (Personally, I prefer the in-game debug console because you can filter messages.)

Build

Back to the Top

Although the game works perfectly in the Unity editor, it may not work as expected when built. Sometimes, it does not even run at all because of critical errors.


Script Execution Order

Some errors, especially null reference errors, may be caused by issues with the Script Execution Order. You can adjust this order in Edit > Project Settings > Script Execution Order. Ensuring that scripts initialize in the correct sequence helps prevent errors, particularly when scripts depend on each other (for example, when using the Singleton pattern).

Key points in Test Zero

Player Control

The PlayerController script handles most of the core gameplay mechanics for the player, including movement, health, input, and special abilities.

Movement & Controls

Back to the Top

Basic Movement: Directional movement with running, walking backward, jumping, and air dodging.

Dodge System: Different animations and distances depending on the state (grounded, walking back, or airborne). Includes invincibility during dodge.

Coyote Time: Implements a short grace period after leaving the ground to allow more responsive jumping.
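A rough sketch of coyote time might look like this (a hypothetical simplification, not the actual PlayerController code):

using UnityEngine;

public class CoyoteJump : MonoBehaviour
{
    [SerializeField] private Rigidbody2D _body;
    [SerializeField] private Transform _groundCheck;   // empty child at the player's feet
    [SerializeField] private LayerMask _groundLayer;
    [SerializeField] private float _jumpVelocity = 12f;
    [SerializeField] private float _coyoteTime = 0.1f; // grace period in seconds

    private float _coyoteCounter;

    private void Update()
    {
        bool grounded = Physics2D.OverlapCircle(_groundCheck.position, 0.1f, _groundLayer);

        // Reset the counter while grounded; count down after leaving the ground.
        _coyoteCounter = grounded ? _coyoteTime : _coyoteCounter - Time.deltaTime;

        // The jump is allowed while the counter is still positive, even if the
        // player walked off a ledge a few frames ago.
        if (Input.GetKeyDown(KeyCode.Space) && _coyoteCounter > 0f)
        {
            _body.velocity = new Vector2(_body.velocity.x, _jumpVelocity);
            _coyoteCounter = 0f; // consume the grace window to prevent double jumps
        }
    }
}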

Combat & Stats

Back to the Top

Gun Mechanics: Manages fire rate, reload speed, spread, ammo, and bullet types (e.g., ricochet, penetration).

Bullet Time: Slows down global time for a short duration using a separate gauge. Player speed and animation adjust accordingly (see the sketch after this list).

Damage System: Player can take and recover damage, with visual/audio feedback and temporary invincibility (i-frames).
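The bullet-time sketch mentioned above, based on Time.timeScale (hypothetical names and values; the real system also drains a gauge and adjusts player speed and animation):

using System.Collections;
using UnityEngine;

public class BulletTime : MonoBehaviour
{
    [SerializeField] private float _slowScale = 0.3f;
    [SerializeField] private float _duration = 2f;

    public IEnumerator Activate()
    {
        Time.timeScale = _slowScale;
        // Keep the physics step in sync with the slowed time scale.
        Time.fixedDeltaTime = 0.02f * Time.timeScale;

        // A realtime wait ignores timeScale, so the effect lasts the
        // intended wall-clock duration.
        yield return new WaitForSecondsRealtime(_duration);

        Time.timeScale = 1f;
        Time.fixedDeltaTime = 0.02f;
    }
}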

This controller serves as the central hub for player-related gameplay, enabling responsive control, dynamic combat, and integration with other systems like audio and UI.

Enemies

To manage multiple enemy types efficiently, we created an abstract base class called EnemyBase.cs. All six enemies (including bosses) inherit from this base, allowing shared logic for health, damage handling, and death.

Shared Behaviors

Back to the Top

Health Management: Each enemy has maxHealth, currentHealth, and a resourceAmount rewarded upon death.

Damage Handling: Supports both instant and over-time damage.

Status Effects: Applies status effects from bullet modifiers, such as the corrosive effect.

Flexibility and customization

Back to the Top

Enemies can override methods like Die() to customize behavior (e.g., custom death animation).

Utility functions like Smite() or ZeroResourceAmount() help with scripted kills or disabling rewards.

Additionally, each enemy can be equipped with its own unique behavior or ability logic, allowing for further AI customization without affecting the shared base. For example, some enemies may summon minions, or change attack patterns based on health.

This system makes it easy to create new enemy types while keeping the core logic centralized and maintainable.
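Based on the fields and methods named above, the base class might be sketched like this (the actual EnemyBase.cs surely differs in detail):

using UnityEngine;

public abstract class EnemyBase : MonoBehaviour
{
    [SerializeField] protected float maxHealth = 100f;
    [SerializeField] protected int resourceAmount = 10; // reward granted on death
    protected float currentHealth;

    protected virtual void Awake() => currentHealth = maxHealth;

    public virtual void TakeDamage(float amount)
    {
        currentHealth -= amount;
        if (currentHealth <= 0f)
            Die();
    }

    // Subclasses override this for custom death animations or effects.
    protected virtual void Die()
    {
        // Grant resources, play effects, then remove the enemy.
        Destroy(gameObject);
    }

    // Utility helpers for scripted kills and disabling rewards.
    public void Smite() { currentHealth = 0f; Die(); }
    public void ZeroResourceAmount() => resourceAmount = 0;
}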

Shooting

The shooting system combines bullet behavior, weapon modifiers, and aiming logic to create a responsive and flexible combat mechanic.

Bullet Logic

Back to the Top

Each bullet is an independent object with its own speed, direction, lifetime, and damage. Upon hitting an enemy, it applies damage and optionally triggers effects like healing, corrosive DoT, or combo bonuses depending on the bullet type.

Hit Detection: Uses OnTriggerEnter2D to detect enemy contact.

Visual Feedback: Displays hitmarkers and floating damage numbers.

Bullet Types: Includes variants like ricochet, penetration, lifesteal, corrosive, tracking, and combo bullets. Each type has unique behavior (e.g., bouncing off walls or seeking enemies).
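A stripped-down bullet might look like this sketch; it reuses the hypothetical EnemyBase from the previous section and omits the type-specific behaviors:

using UnityEngine;

public class Bullet : MonoBehaviour
{
    [SerializeField] private float _damage = 10f;
    [SerializeField] private float _lifetime = 3f;

    private void Start() => Destroy(gameObject, _lifetime); // despawn after its lifetime

    private void OnTriggerEnter2D(Collider2D other)
    {
        // Apply damage on enemy contact, then remove the bullet.
        if (other.TryGetComponent(out EnemyBase enemy))
        {
            enemy.TakeDamage(_damage);
            Destroy(gameObject);
        }
    }
}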

Aiming

Back to the Top

The Aim script handles arm and head rotation based on mouse position. It also flips the player’s sprite to face the correct direction. This ensures that the aiming visuals stay accurate and immersive.

Head & Arm Tracking: Tracks mouse position in real time unless paused.

Sprite Flipping: Ensures player faces toward the cursor direction.

Angle Clamping: Prevents unnatural head rotation by limiting angle ranges.
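The core of the arm tracking might be sketched as follows, assuming an orthographic 2D camera (hypothetical names; the real Aim script also handles the head, pausing, and per-part angle clamping):

using UnityEngine;

public class AimSketch : MonoBehaviour
{
    [SerializeField] private Transform _arm;
    [SerializeField] private SpriteRenderer _bodySprite;

    private void Update()
    {
        // With an orthographic camera, the z component can be ignored.
        Vector3 mouseWorld = Camera.main.ScreenToWorldPoint(Input.mousePosition);
        Vector2 direction = mouseWorld - _arm.position;

        // Rotate the arm toward the cursor.
        float angle = Mathf.Atan2(direction.y, direction.x) * Mathf.Rad2Deg;
        _arm.rotation = Quaternion.Euler(0f, 0f, angle);

        // Flip the body so the player faces the cursor.
        _bodySprite.flipX = direction.x < 0f;
    }
}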

This system enables diverse shooting behaviors with minimal changes to the core structure.

Procedural Map Generation

Back to the Top

Procedural content generation (PCG) is a technique used to create game content algorithmically rather than manually. In Test Zero, we implemented a procedural map generation system to create unique levels each time the game is played.


Map Segments for Procedural Map Generation

However, you have to define rules for the algorithm to follow. Hand-authored map segments are used as building blocks for the procedural generation. Each segment is designed to fit together seamlessly, allowing for a variety of combinations. Once all the map segments are prepared and the rules are defined, the procedural generation algorithm can create a unique map layout each time the game is played.


Checking boundary of map segments

To ensure that the segments fit together correctly, we check the boundary of each segment. This is done by verifying that the segment boundaries are convex polygons and testing whether any two segments overlap. If they do not overlap, the segments can be placed next to each other without gaps.
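A much simplified version of that placement test, using axis-aligned bounds instead of convex polygons (names are hypothetical):

using System.Collections.Generic;
using UnityEngine;

public static class SegmentPlacer
{
    // Returns true if the candidate segment fits without overlapping
    // any segment that has already been placed.
    public static bool CanPlace(Bounds candidate, List<Bounds> placed)
    {
        foreach (Bounds existing in placed)
        {
            if (existing.Intersects(candidate))
                return false;
        }
        return true;
    }
}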


A completed level with procedural map generation

Although PCG is a powerful technique, it can be challenging to implement. It requires careful planning and design to ensure that the generated content is fun and engaging for players.

A notable example of procedural content generation in games is The Classrooms, which uses a blend of hand-authored segments and algorithmic generation to create fresh map layouts on each playthrough. This approach demonstrates how PCG can enhance replayability and variety while maintaining carefully designed gameplay experiences.

Procedural Gun Generation

Back to the Top


Procedurally generated weapons

To encourage replayability and variety, the game features a procedural gun generation system that creates randomized weapons with different stats, appearances, and rarities.

Generation Logic

Guns are generated through an interactable GunCreate station. Each generated gun has:

Rarity Tier: Common, Uncommon, Rare, Legendary

Part Levels: Barrel, Frame, Magazine (randomly distributed within max tier level)

Grip Type: Determines base stats like damage and fire rate

Bullet Type: Added based on rarity to affect combat behavior (e.g., lifesteal, tracking)

A pity system ensures higher-tier guns appear periodically (e.g., every 20 guns guarantees a Legendary).
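A rarity roll with a pity counter could be sketched like this; the probabilities are illustrative placeholders, not the game's actual values:

using UnityEngine;

public enum Rarity { Common, Uncommon, Rare, Legendary }

public class GunRarityRoller
{
    private const int PityThreshold = 20; // every 20 guns guarantees a Legendary
    private int _gunsSinceLegendary;

    public Rarity Roll()
    {
        _gunsSinceLegendary++;
        if (_gunsSinceLegendary >= PityThreshold)
        {
            _gunsSinceLegendary = 0;
            return Rarity.Legendary;
        }

        float roll = Random.value; // uniform in [0, 1]
        if (roll < 0.02f) { _gunsSinceLegendary = 0; return Rarity.Legendary; }
        if (roll < 0.12f) return Rarity.Rare;
        if (roll < 0.40f) return Rarity.Uncommon;
        return Rarity.Common;
    }
}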

Legendary Guns

Legendary weapons are defined separately using a data structure (LegendaryGunData) and have handcrafted stats and unique bullet types that cannot be rolled procedurally.

Recycling & Interaction

Guns can be recycled for partial resource refunds based on rarity.

Players can spawn multiple guns at once by holding the interact button.

Stat panels and visuals update in real time when hovering over a gun.

This system delivers meaningful weapon variety while keeping generation rules controlled and expandable.

Animation (Rigging)

Back to the Top


Bone

To create fluid character movement and reduce the need for frame-by-frame sprites, I used 2D skeletal animation via Unity's built-in 2D Animation Package.

Bone Rigging

Characters are composed of multiple sprite parts (e.g., torso, arms, legs) connected through a bone hierarchy. Each bone controls a specific body part, allowing for smooth and reusable animations like walking, jumping, or aiming.

Animator


Animator

Animations like idle, walk, jump, and dodge are handled through Unity’s Animator Controller, using state transitions to create fluid animation.

This setup enables reusable animations with minimal sprite assets and consistent motion quality.

2D Light & Particles

Lights and particles are visual effects that enhance the game's atmosphere and player experience. Proper use of 2D lighting can create mood, highlight important gameplay elements, and add depth to scenes. Particle systems provide dynamic feedback for actions like shooting, explosions, or environmental effects, making the game world feel more alive and responsive.

When discussing VFX, it's also important to consider Post-Processing. Post-processing lets you apply effects such as bloom, color grading, and motion blur to enhance your game's visuals. While these techniques can significantly improve the game's atmosphere and polish, they are not covered in this journal.

2D Light

Back to the Top

2D lighting in Unity offers a streamlined alternative to 3D lighting, focusing on ease of use for 2D projects. By default, 2D lights affect sprites based on their color and alpha, but do not simulate light direction or cast shadows. Achieving more advanced effects—like realistic shading or shadow casting—requires additional techniques, such as custom shaders or sprite adjustments (see this YouTube example).

Understanding how textures interact with lighting is important: sprites designed with gradients or baked-in shading can better reflect the influence of 2D lights, resulting in more visually appealing scenes.

Particles

Back to the Top

Particles are used for visual effects like bullet impacts, explosions, and environmental details. Unity's Particle System allows for complex effects with minimal performance impact.


Healing Particle

You could imagine a healing effect that spawns particles around the player when they pick up a health item.


Spark Particle with 2D Light

Unity's built-in Particle System does not natively support 2D lights (Light2D). As of Unity 6 (2025), this limitation remains, and Unity has stated they are not planning to add Light2D support to the Particle System (last update in 2023).

// This script is a workaround to add a Light2D to each particle in the Particle System.
// The field declarations were not part of the original snippet and are assumed here.
using System.Collections.Generic;
using UnityEngine;

public class ParticleLights : MonoBehaviour
{
    [SerializeField] private GameObject m_Prefab; // prefab containing a Light2D
    private ParticleSystem m_ParticleSystem;
    private ParticleSystem.Particle[] m_Particles;
    private readonly List<GameObject> m_Instances = new List<GameObject>();

    void Awake()
    {
        m_ParticleSystem = GetComponent<ParticleSystem>();
        m_Particles = new ParticleSystem.Particle[m_ParticleSystem.main.maxParticles];
    }

    void LateUpdate()
    {
        int count = m_ParticleSystem.GetParticles(m_Particles);

        // Create one pooled light instance per live particle.
        while (m_Instances.Count < count)
            m_Instances.Add(Instantiate(m_Prefab, m_ParticleSystem.transform));

        bool worldSpace = (m_ParticleSystem.main.simulationSpace == ParticleSystemSimulationSpace.World);
        for (int i = 0; i < m_Instances.Count; i++)
        {
            if (i < count)
            {
                // Move each light to its particle's current position.
                if (worldSpace)
                    m_Instances[i].transform.position = m_Particles[i].position;
                else
                    m_Instances[i].transform.localPosition = m_Particles[i].position;
                m_Instances[i].SetActive(true);
            }
            else
            {
                // Hide lights whose particles have died.
                m_Instances[i].SetActive(false);
            }
        }
    }
}

This code was suggested by Unity's community as a workaround for the lack of native Light2D support in the Particle System. It works by instantiating a Light2D prefab for each particle, positioning it to match the particle's location. While this approach can create the illusion of 2D lighting on particles, it may impact performance if used with large numbers of particles.


Performance Issue - Particle System with Light2D

Each particle system emits a large number of particles, and each particle has a Light2D component attached. After testing this approach, I found that it led to significant performance issues, especially when multiple particle systems were active simultaneously. The overhead of instantiating and managing numerous Light2D components for each particle can cause frame rate drops and stuttering in the game.

// This script is a compromise: instead of one Light2D per particle, it drives a
// single Light2D per Particle System. The field declarations were not part of
// the original snippet and are assumed here.
using UnityEngine;
using UnityEngine.Rendering.Universal; // Light2D (URP)

public class ParticleAverageLight : MonoBehaviour
{
    [SerializeField] private GameObject m_LightInstance; // object carrying the Light2D
    private ParticleSystem m_ParticleSystem;
    private ParticleSystem.Particle[] m_Particles;
    private Light2D light2D;

    void Awake()
    {
        m_ParticleSystem = GetComponent<ParticleSystem>();
        m_Particles = new ParticleSystem.Particle[m_ParticleSystem.main.maxParticles];
        light2D = m_LightInstance.GetComponent<Light2D>();
    }

    void LateUpdate()
    {
        int particleCount = m_ParticleSystem.GetParticles(m_Particles);
        if (particleCount == 0)
        {
            return;
        }

        // Calculate the average position and lifetimes of all particles.
        Vector3 averagePosition = Vector3.zero;
        float totalLifetime = 0f;
        float totalRemainingLifetime = 0f;

        for (int i = 0; i < particleCount; i++)
        {
            averagePosition += m_Particles[i].position;
            totalLifetime += m_Particles[i].startLifetime;
            totalRemainingLifetime += m_Particles[i].remainingLifetime;
        }
        averagePosition /= particleCount;
        m_LightInstance.transform.SetParent(m_ParticleSystem.transform);

        bool worldSpace = (m_ParticleSystem.main.simulationSpace == ParticleSystemSimulationSpace.World);
        if (worldSpace)
            m_LightInstance.transform.position = averagePosition;
        else
            m_LightInstance.transform.localPosition = averagePosition;

        // Fade the light out as the particles expire.
        float lifetimeRatio = totalRemainingLifetime / totalLifetime;
        light2D.intensity = Mathf.Lerp(0f, 1f, lifetimeRatio); // adjust the max intensity as needed

        // Activate the light.
        m_LightInstance.SetActive(true);
    }
}

My solution was to use a single Light2D component per particle system and dynamically adjust its intensity and range based on the overall behavior of the particles. This approach provided a visually pleasing effect while avoiding the performance issues caused by attaching a Light2D to every particle.

While 2D lighting and particles offer unique visual possibilities, you may find that 3D projects provide more flexibility and fewer limitations in Unity. If you prefer to avoid the constraints of 2D workflows—such as limited lighting options or particle system integration—consider focusing on 3D projects, where Unity's lighting and VFX systems are more robust and widely supported.

Shader

Back to the Top

Shaders are scripts that tell the GPU how to render graphics. They control how objects are drawn, including their colors, textures, and effects.


Lava Shader

I also experimented with creating a lava shader. To allow the player to cross the lava, we plan to add a raft mechanic. For the visual effect, I used a Noise texture to simulate the look of flowing lava (tutorial). Additionally, I applied UV distortion to animate the surface, making the lava appear as if it's moving.

FMOD (Audio)

Back to the Top

FMOD is a robust audio middleware that integrates seamlessly with Unity, enabling advanced audio features such as real-time mixing, spatialization, and dynamic event control.

We selected FMOD for its flexibility in handling complex audio scenarios—like adaptive music that responds to gameplay and context-sensitive sound effects. FMOD’s event-driven workflow makes it straightforward to trigger and manage sounds based on player actions, such as switching gun types.

While we didn’t leverage every FMOD feature, it proved valuable for organizing and customizing audio, improving both workflow and the overall sound experience in our game.

Discrete parameter bug


Discrete Parameter in FMOD

Whenever the gun type changes, RuntimeManager.StudioSystem.setParameterByName("GunType", gunType); is called to update the FMOD parameter. However, there is a bug in FMOD that causes the sound not to change immediately when the parameter is set to a discrete value.
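In context, that call might live in a small helper like this sketch (a hypothetical class; only the setParameterByName call is from our actual code):

using FMODUnity;
using UnityEngine;

public class GunAudio : MonoBehaviour
{
    // Called whenever the equipped gun changes; gunType indexes the
    // discrete "GunType" parameter defined in FMOD Studio.
    public void OnGunTypeChanged(int gunType)
    {
        RuntimeManager.StudioSystem.setParameterByName("GunType", gunType);
    }
}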


Nested Event (Discrete Parameter)

I am not sure what causes this bug; however, nesting events was the solution suggested by the FMOD community.
