
Final Major Project:
'404: Exit not found' (2025)

​Word Count: 5500 words

Introduction

404: Exit not found is a solo-developed psychological horror game, created as the final major project of my degree. This project reflects my ability to work independently across the entire development pipeline, encompassing design, programming, and production.

 

From the outset, I prioritised efficient systems and an iterative development process grounded in testing and technical refinement. Many of the mechanics were designed to scale, allowing complexity to grow alongside the project’s evolving scope.

 

What follows is a breakdown of that development process from concept through to completion. (Read more about the research and design here.)

Concept & Execution

01

Concept & Execution

I’ve always been drawn to psychological horror, not just for its atmosphere but for the craft behind it. With 404: Exit not found, I set out to explore the genre from a technical angle - learning how space, sound, pacing, and ambiguity can be used to unsettle players, and using that process to grow my own skillset. Inspired by P.T. (2014), I focused on creating a liminal environment where tension builds through repetition rather than exposition. I also drew from games like Amnesia: The Dark Descent (2010) and Layers of Fear (2023) to shape my approach to diegetic storytelling, sound design, and the role of subtlety in horror. This led to a game built around a procedurally generated looping hallway, where audio and visual cues steadily build a sense of unease.​


Project Kanban style Trello board

Though scoped for the full semester, I spent the first few weeks ideating and exploring different options before settling on the final concept, so my active development period was approximately 8–9 weeks. As in many of my previous projects, I used an Agile methodology with a Scrumban (Atlassian, 2024) approach - maintaining a Kanban board for ongoing tasks while loosely structuring sprints around key deliverables. This allowed me to balance planning with creative flexibility and adapt to the unpredictable nature of game development.

 

Within my production planning, I prioritised developing core systems first - the looping mechanic, scene management, and trigger-based event planning - leaving polish and quality-of-life improvements for later sprints to avoid destabilising gameplay. Weekly reviews within my sprint pattern helped me log bugs and identify and respond to emerging issues as they happened. To avoid burnout, I also alternated between tasks with different mental loads - for example, switching between environment building (a huge task in itself), UI work, and audio editing and creation. This structured yet adaptable workflow helped me stay engaged and maintain a clear overview of the project’s evolving structure, giving me space to experiment while keeping momentum consistent - essential for solo development under time pressure.

Sustainable development

02

Sustainable Development

As a developer, I see game design not just as a creative and technical discipline, but as a medium for reflecting on and challenging social inequalities. While this wasn’t my conscious starting point, the values that underpin my practice, such as equity, inclusion, and accessibility, align closely with Sustainable Development Goal 10: Reduced Inequalities (United Nations, 2025), which advocates for fairer opportunities in both local and global contexts.

​

Throughout this project, the narrative subtly explores themes of systemic neglect and social invisibility, centring on the experience of a child left behind, both physically and metaphorically, by a world that failed to notice. Though fictional, the game draws on real patterns of abandonment and marginalisation. Rather than relying on horror tropes that reinforce stigma or stereotype, I used environmental storytelling to create a more thoughtful kind of fear - one that encourages players to think about how people become forgotten or unreachable within broader systems.

​

Beyond the narrative, I focused on accessibility in the game’s design - avoiding mechanics that demand quick reflexes and focusing instead on intuitive, user-friendly interactions. Alongside clear visual and audio cues that guide players, these decisions are part of a wider effort to make the game more understandable and playable for as many people as possible. Individually, these might seem like small decisions, but together they show my commitment to making the game accessible, which I see as essential, even as a solo developer.

​

SDG 10 isn’t just a theme in this project; it reflects values that shape how I work and how I think about the games industry more broadly. As a Women in Games Ambassador (2016), and someone who has often been in the minority in technical spaces, I’m acutely aware of the structural barriers marginalised voices continue to face, and I challenge those barriers through how I build: creating clean, modular systems with clear documentation that invite reuse, collaboration, and understanding.

Systems & Architecture

04

Core Systems and Architecture

Hallway Spawner

The procedural looping hallway system formed the backbone of my game, so I began by developing the HallwaySpawner script. It dynamically instantiates hallway segments as the player progresses, enabling seamless transitions that support both narrative pacing and gameplay flow. My goal was an extensible, efficient system that balances narrative and standard hallways while managing resources through controlled instantiation and cleanup.

​

Initially, I experimented with a teleportation mechanic that repositioned the player between hallways via a trigger in a small transition room (see below - left). However, this broke immersion and felt mechanically disjointed. To improve this, I designed a system that instantiates hallway prefabs one at a time, aligning the entry point of each new segment precisely with the exit of the previous one, allowing for smooth visual continuity (see below - right).

Early iteration - Teleporting loop prototype

Final mechanism demo - HallwaySpawner system

To ensure accurate alignment, I implemented the AlignHallway() method. Rather than relying on fixed offsets or grid-based alignment, which would limit hallway design flexibility, this method dynamically calculates the positional delta between the new hallway’s entry point and the previous hallway’s exit point.

 

The new hallway is then positioned and rotated so that its entrance transform matches the exit transform, offset by the vector difference between their world-space positions.
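To make this concrete, below is a minimal sketch of the alignment idea, assuming each hallway prefab exposes entry and exit point transforms; the names are illustrative rather than the project’s exact code, and the full AlignHallway() implementation is shown in the screenshot that follows.

using UnityEngine;

public class HallwayAlignmentSketch : MonoBehaviour
{
    // Rotates and moves the new hallway so its entry point sits exactly on the
    // previous hallway's exit point, regardless of either prefab's geometry.
    public static void AlignHallway(Transform newHallway, Transform newEntry, Transform previousExit)
    {
        // Rotate the whole hallway so the entry point's orientation matches the exit's.
        Quaternion rotationDelta = previousExit.rotation * Quaternion.Inverse(newEntry.rotation);
        newHallway.rotation = rotationDelta * newHallway.rotation;

        // After rotating, shift the hallway by the remaining world-space offset
        // between the previous exit and the (now rotated) entry point.
        Vector3 positionDelta = previousExit.position - newEntry.position;
        newHallway.position += positionDelta;
    }
}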

HallwaySpawner C# Script - AlignHallway()

This approach accounts for variations in hallway geometry, enabling precise connections even in irregular layouts such as T-junctions, dead ends, or maze-like structures. As a result, the system ensures modular alignment without gaps or misplacement, maintaining spatial continuity crucial in a looping environment designed to disorient the player.

Procedural hallway generation alignment diagram

The SpawnNextHallway() method handles instantiating and positioning the next hallway prefab in the sequence. Once aligned, the spawner activates a blocking object (such as a locked door) behind the player, sealing off the previously spawned hallway via its own Hallway script component. This limits player access to only two hallways at once, regulating and strategically guiding gameplay flow and improving performance by restricting active scene areas.

The HallwaySpawner also maintains a queue of spawned hallways. When the maximum count (maxHallways) is exceeded, the oldest hallway is destroyed and removed from memory. This queue-based approach, along with the clear separation between HallwaySpawner and Hallway logic, results in clean, maintainable code and modular level design. Each hallway prefab manages its own state using the Hallway script, while the spawner coordinates sequencing and alignment.
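A rough sketch of that spawn-and-cleanup flow is shown below. It assumes a stand-in HallwaySegment component exposing entry/exit transforms and a blocker object; the names and structure are illustrative, with the actual SpawnNextHallway() shown in the screenshot that follows.

using System.Collections.Generic;
using UnityEngine;

// Minimal stand-in for the project's per-hallway component.
public class HallwaySegment : MonoBehaviour
{
    public Transform entryPoint;
    public Transform exitPoint;
    public GameObject blocker;                    // e.g. a locked door behind the player

    public void ActivateBlocker() => blocker.SetActive(true);
}

public class HallwaySpawnerSketch : MonoBehaviour
{
    [SerializeField] private HallwaySegment[] hallwayPrefabs;
    [SerializeField] private int maxHallways = 2;

    private readonly Queue<HallwaySegment> spawnedHallways = new Queue<HallwaySegment>();

    private void SpawnNextHallway(HallwaySegment previous)
    {
        // Instantiate the next segment (selection logic omitted here).
        HallwaySegment next = Instantiate(hallwayPrefabs[Random.Range(0, hallwayPrefabs.Length)]);

        // Snap the new segment onto the previous one (see the alignment sketch above).
        Quaternion rotationDelta = previous.exitPoint.rotation * Quaternion.Inverse(next.entryPoint.rotation);
        next.transform.rotation = rotationDelta * next.transform.rotation;
        next.transform.position += previous.exitPoint.position - next.entryPoint.position;

        // Seal off the hallway behind the player.
        previous.ActivateBlocker();

        // Track spawned segments and destroy the oldest once the cap is exceeded.
        spawnedHallways.Enqueue(next);
        if (spawnedHallways.Count > maxHallways)
        {
            Destroy(spawnedHallways.Dequeue().gameObject);
        }
    }
}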

HallwaySpawner C# Script - SpawnNextHallway()

Hallway generation & clean up visualisation

While this system generally improved performance by limiting scene complexity, I encountered frame drops at moments when new hallways were instantiated and old ones destroyed. Early builds with lightweight prefabs ran smoothly, but later versions with detailed assets and built-out environments introduced visible lag. Gameplay remained fluid otherwise, indicating the issue stemmed from runtime overhead and garbage collection.

Given more development time, I would experiment with implementing object pooling to mitigate this. Reusing inactive hallway instances instead of destroying and creating new ones could reduce memory spikes and improve consistency in asset-heavy scenes - preserving the immersion during transitions.

The order in which hallways spawn was also consciously designed to balance narrative progression and player experience by mixing narrative hallways with variable numbers of regular hallways. Narrative hallways represent key story beats and are designed to appear only once and in a fixed sequence, as defined in the inspector. Between these, the system inserts a random number of regular hallways, with the range also specified in the inspector. These can repeat and appear in any order, maintaining unpredictability while supporting the overarching narrative.

HallwaySpawner C# Script - PickNextHallway()

This behaviour is governed by the PickNextHallway() method, which checks whether the required number of regular hallways has been spawned before moving on to the next narrative one. If so, it advances the narrative; if not, it selects a regular hallway at random. 
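As a text-based companion to the PickNextHallway() screenshot above, here is a hedged sketch of that selection logic; the field names and inspector ranges are illustrative assumptions, but the branching follows the behaviour described.

using UnityEngine;

public class HallwayPickerSketch : MonoBehaviour
{
    [SerializeField] private GameObject[] narrativeHallways;   // fixed order, each used once
    [SerializeField] private GameObject[] regularHallways;     // repeatable, random order
    [SerializeField] private int minRegularBetween = 1;        // range set in the inspector
    [SerializeField] private int maxRegularBetween = 3;

    private int narrativeIndex;
    private int regularSpawnedSinceNarrative;
    private int regularRequiredBeforeNextNarrative;

    private void Awake()
    {
        regularRequiredBeforeNextNarrative = Random.Range(minRegularBetween, maxRegularBetween + 1);
    }

    private GameObject PickNextHallway()
    {
        // Once enough regular hallways have appeared, advance to the next narrative beat.
        if (narrativeIndex < narrativeHallways.Length &&
            regularSpawnedSinceNarrative >= regularRequiredBeforeNextNarrative)
        {
            regularSpawnedSinceNarrative = 0;
            regularRequiredBeforeNextNarrative = Random.Range(minRegularBetween, maxRegularBetween + 1);
            return narrativeHallways[narrativeIndex++];
        }

        // Otherwise pad the loop with a random, repeatable regular hallway.
        regularSpawnedSinceNarrative++;
        return regularHallways[Random.Range(0, regularHallways.Length)];
    }
}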


HallwaySpawner - Unity inspector view

The modular design lets narrative events play out reliably, while varied regular hallways break the rhythm to keep tension high and make each run feel slightly different. After some difficulty assessing the effectiveness of the narrative hallways and scare sequences - largely because my familiarity with them made it hard to judge their impact objectively - I received feedback suggesting a randomisation mode for testing. I implemented this by shuffling the sequence of narrative hallways and concealing their order within the inspector on a set button press. This helped reduce designer bias and allowed me to approach testing with fresher eyes, making it easier to evaluate pacing and anticipation and to better emulate the player experience.

Modular Event Planner

Another key system was the ModularGameEventPlanner, a flexible system designed to orchestrate sequential game events through various triggers such as timers, player interactions, and trigger colliders. I used this approach to avoid hardcoding behaviours and to develop my skillset in building scalable, maintainable event management systems suitable for both professional teams and solo developers.

​

The system manages a list of event steps (EventSequenceStep), each representing a discrete gameplay or narrative unit triggered in a defined order, either automatically or via player/environmental interaction. This sequencing allowed for precise control over pacing and progression, preventing premature triggers and enabling deliberate narrative timing, which was critical in shaping the horror game’s tension and overall sense of narrative guidance.

Each step includes a descriptive name, a trigger type, an optional source object, a configurable delay, and a UnityEvent (Unity Technologies, 2025a) callback - making the system flexible and easy to extend.
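As an illustration, a step with those fields might be expressed as a serializable class along the following lines; the enum values and names are assumptions based on the description rather than the project’s exact code.

using UnityEngine;
using UnityEngine.Events;

public enum StepTriggerType { Automatic, TriggerCollider, PlayerInteraction, TimedDelay, AfterPreviousStep }

[System.Serializable]
public class EventSequenceStepSketch
{
    public string stepName;                 // descriptive label shown in the inspector
    public StepTriggerType triggerType;     // how this step is activated
    public GameObject triggerSource;        // optional collider or interactable driving the step
    public float delay;                     // configurable delay before the callback fires
    public UnityEvent onStepTriggered;      // behaviour configured directly in the editor
}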

I also created the system to support multiple trigger types, including automatic start, trigger colliders, player interactions, timed delays, and chained delays following previous steps. This variety allows both predefined scripted sequences and player-driven events, reducing linearity and giving players the autonomy to shape their own gameplay experience. To manage this, the TryActivateStep() method (shown right) interprets each trigger type and sets up the appropriate behaviour, whether subscribing to interaction events or starting delay timers.

​

By leveraging Unity’s built-in event system, the design enables behaviours to be configured directly in the editor without additional coding, supporting rapid iteration and bridging the gap between design intent and implementation. This workflow mirrors common industry practices and significantly reduced the time it took to build each narrative hallway, which was essential given the scope of the project and that I handled all development myself.

ModularGameEventPlanner C# Script - TryActivateStep()

Event planner usage example - Narrative hallway  (Unmute for audio)

I chose to handle the timing element through a modular delay system - once a step is triggered, a countdown begins, and when it elapses, a UnityEvent fires and the sequence progresses. This keeps pacing control simple and adaptable, making it easy to tweak timings without major rewrites. To avoid tight coupling between components, I exposed public trigger methods (e.g. OnTriggerEnteredExternally, ReceiveTriggerEnter, ReceiveInteraction) so that other scripts could interact with the event planner without relying on direct references. This hugely improved flexibility across systems and aligned with sustainable, scalable coding practices.

Custom Event Planner Editor script


Event Planner - Editor view

Creating a custom editor (Unity Technologies, 2017) for the ModularGameEventPlanner was also a new challenge for me, as I hadn’t previously worked with Unity’s editor scripting or property drawers. I wanted to improve the usability of the system during development, so I built a custom inspector that cleanly displayed each event step, auto-expanded UnityEvents, and made the workflow far more efficient to iterate on.

​

To support development efficiency, I also prioritised robustness (Sky, 2023) by building in boundary checks, logging, and input validation to catch misconfigurations early and avoid runtime errors. These defensive strategies were especially useful during iterative cycles, where stability and clear debugging tools made a significant difference.

Narrative & Player interaction

05

Narrative and Player Interaction

Narrative building

Designing the narrative hallways was the most creatively challenging part of the project for me. Coming from a primarily technical background, I initially found it difficult to balance subtlety, pacing, and thematic cohesion, yet this process quickly became one of the game’s most rewarding elements. I treated each hallway like a short film scene, aiming for a clear visual hook, a contained emotional beat, and some form of escalation; whether through environmental storytelling, audiovisual effects, or the traditional jump scare. This approach was informed by research into narrative pacing techniques focused on tension-building through controlled timing and player perception (see full research breakdown here).

​

In building the hallway environments, I intentionally used implied space, designing areas that suggest something beyond the player’s immediate reach to induce curiosity and unease, as well as adding environmental assets like decayed books and broken objects to align with the theming and narrative of the game. I also added more verticality through balconies and landings, creating liminal spaces that feel both confined and open and further intensifying the atmosphere of disquiet.

​

As mentioned above, the HallwaySpawner inserts a random number of regular hallways between narrative segments to break up story pacing and reduce predictability. I chose to keep this structure consistent until the final loop, where I deliberately made an exception. Through testing, I found that placing the final narrative hallway directly after the chase sequence created a stronger emotional payoff, as the stark shift in tone heightened the psychological tension and helped bring the arc to a more impactful close.

The screenplay-style script I wrote (shown right) acted more as a guide during planning. Early feedback suggested writing narrative events in this form, which helped me clearly visualise each scene’s flow and narrative. This approach also reflects common industry practice, where narrative designers often collaborate using screenplay formats or storyboards to map out the player experience before committing to full implementation.

​

​In addition to the main narrative, I included a couple of small easter eggs to reward exploration. Some of the child’s drawings scattered throughout the hallways have codes scrawled on their backs - these codes correspond to radio stations that players can tune into, revealing additional backstory and narrative layers. I built this in to encourage players to engage with the environment more deeply and to add an extra dimension to the storytelling without interrupting the core gameplay flow.​​

Narrative hallway rough script (use arrows to navigate)

Beyond the creative and technical challenges, this part of the project deepened my understanding of how narrative can be embedded into systems, space, and pacing; not just delivered through text or cutscenes. I made deliberate choices to prioritise environmental storytelling and sensory cues over dialogue or exposition, keeping interactions simple so players could focus on atmosphere rather than mechanics. This approach not only served the horror but also supported accessibility, allowing a wider range of players to engage with the experience on their own terms. I'm committed to designing games that are both artistically ambitious and inclusive, and this project pushed me to think more critically about how to achieve that balance.

Chase Sequence

I designed the chase sequence to deliver a sharp spike in intensity just before the game’s conclusion, providing a contrast to the otherwise slow, more suspenseful pacing of the game. I started the process by sketching out the maze layout on Procreate, experimenting with dead ends, looping paths, and intersections intended to disorient the player (shown below – left).


Initial maze planning sketch

Maze built in Unity - Baked NavMesh in blue

After planning, I built the maze in Unity and baked a NavMesh (Unity Technologies, 2025b) to define the doll’s navigable area (shown above – right). To create the actual chase mechanic, I developed the ChaseTrigger script, which enables the doll to dynamically track the player using Unity’s NavMeshAgent and triggers a respawn if caught. Instead of a traditional fail state, being caught causes a glitch effect and displays the word 'RUN' before respawning the player at the maze’s start - keeping the player disoriented and hunted without revealing the doll’s next position (see below). This loop was designed to sustain tension without breaking immersion or punishing the player unfairly.
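A simplified sketch of that chase loop is below, assuming a NavMeshAgent on the doll and a plain distance check for the catch condition; the glitch effect and 'RUN' prompt are omitted, and the names and values are illustrative rather than the project’s exact code.

using UnityEngine;
using UnityEngine.AI;

public class ChaseSketch : MonoBehaviour
{
    [SerializeField] private NavMeshAgent dollAgent;
    [SerializeField] private Transform player;
    [SerializeField] private Transform mazeStartPoint;
    [SerializeField] private float catchDistance = 1.2f;

    private void Update()
    {
        // Continuously steer the doll towards the player's current position.
        dollAgent.SetDestination(player.position);

        // If the doll reaches the player, respawn them at the maze entrance rather
        // than ending the run (glitch feedback would be triggered here).
        if (Vector3.Distance(dollAgent.transform.position, player.position) <= catchDistance)
        {
            player.position = mazeStartPoint.position;
        }
    }
}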

ChaseTrigger C# Script - Section of Update()

During the chase, the doll's footsteps grow louder as it gets closer, creating a creeping sense of threat that players feel rather than see immediately. I wanted players to question whether they were being watched, followed, or simply imagining it. Working on this system pushed me to think narratively while developing technically, as I wanted to utilise the environment and effects to create a more adrenaline-fuelled loop than a typical chase sequence.

Respawn during chase sequence  (Unmute for audio)

Terminal Ending

The terminal-based ending brings the game full circle, locking the player in place as the ending reveals itself on screen. There's no way to move or look away, only the growing sense that something has been watching all along. The screen glitches as the audio distorts, and the game resets. It then mirrors the introduction almost exactly, but now the player knows more - and still can’t change what happens.

​

Developing this sequence required disabling all player input and camera control, lerping to a fixed-angle camera view, and slowly fading to black to give the illusion of interacting directly with a computer screen. The terminal UI was designed as a diegetic element, embedded within the world rather than overlaid on top of it, to enhance immersion. Text is then typed line by line using a coroutine, with variable timing to control pacing and tension. I layered ambient audio including low-frequency glitches, whispering, and synthetic interference, synchronised with the visual text distortion to blur the boundary between in-world events and meta-narrative cues.

However, the visual sequence underwent several iterations before reaching its final form. Initially, I used a single typewriter-style font, but early feedback suggested it lacked the corrupted, unstable quality I intended. I tried overlaying two TextMeshPro components - one with normal text, one with glitch text - but the variable width of the fonts in use caused misalignment across lines. I also experimented with character-by-character font swapping in TextMeshPro (Unity Technologies, 2025c), but this proved unfeasible as it doesn’t support runtime glyph updates between fonts.

​

Eventually, I created the coroutine, GlitchFlickerLoop, which rebuilds the string at timed intervals, substituting random characters with glitched variants using a secondary font (Hackattack SDF). This preserved spacing and order while simulating real-time glitching, finally delivering the visual degradation I wanted.
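A hedged sketch of that flicker loop is shown below. It leans on TextMeshPro's <font> rich-text tag to render individual characters in a secondary font asset; the probabilities, interval, and field names are illustrative assumptions rather than the project’s exact values.

using System.Collections;
using System.Text;
using TMPro;
using UnityEngine;

public class GlitchFlickerSketch : MonoBehaviour
{
    [SerializeField] private TextMeshProUGUI terminalText;
    [SerializeField] private string glitchFontName = "Hackattack SDF";   // secondary font asset
    [SerializeField, Range(0f, 1f)] private float glitchChance = 0.05f;
    [SerializeField] private float flickerInterval = 0.08f;

    private void Start()
    {
        StartCoroutine(GlitchFlickerLoop(terminalText.text));
    }

    private IEnumerator GlitchFlickerLoop(string baseText)
    {
        var builder = new StringBuilder();
        while (true)
        {
            builder.Clear();
            foreach (char c in baseText)
            {
                // Occasionally render a character in the glitch font, keeping the
                // string's length and ordering intact.
                if (!char.IsWhiteSpace(c) && Random.value < glitchChance)
                    builder.Append($"<font=\"{glitchFontName}\">{c}</font>");
                else
                    builder.Append(c);
            }
            terminalText.text = builder.ToString();
            yield return new WaitForSeconds(flickerInterval);
        }
    }
}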

TerminalEnding C# Script - Real time font change

TerminalEnding C# Script - Final line control

To reinforce the breakdown of reality at the end, I layered looping glitch-text with distorted sound effects and a semi-transparent image of the doll that fades in behind the terminal; with the intention of resembling a reflection, suggesting the entity is now directly behind the player. This not only tied the ending back to earlier doll encounters but also maintained immersion without relying on a jump scare or cutscene.

​

The final line, 'you’ll never leave.', rounds off this segment as it is typed repeatedly until the screen fills, mimicking a system overload. I then followed this with a final burst of glitch distortion and a hard cut to black before returning to the starting screen. On post-completion replays, a boolean flag tracked by the GameIntroManager changes the intro text as a narrative reward and a reminder that the loop never ends. However, I ensured that the flag reliably resets if the player quits to the menu mid-game, allowing repeated loops without locking players out of future full playthrough attempts.

​

Before this project, I’d predominantly used TextMeshPro to display static UI text. However, for the ending sequence, I needed more control over pacing and presentation, which led me to explore how to manipulate it through code. Through this process I learned how to manipulate TextMeshProUGUI components at runtime to produce real-time effects including character-by-character text rendering through coroutines, glitch overlays using embedded tags, and font asset switching mid-line. 

Terminal ending (Unmute for audio)

Interactable & Inspectable Objects

The interaction system is built around a lightweight interface, IInteractable, which defines a contract for all objects the player can interact with. This covers everything in the game, from inspecting drawings to opening doors and triggering one-off events through the event manager. Any object implementing IInteractable must expose an Interact() method and an OnInteracted event, allowing external systems to listen for and respond to player input without tightly coupling behaviour. For example, the InspectableObject class implements IInteractable and exposes a UnityEvent for behaviour definition via the Inspector; its interface implementation explicitly throws on event access, since the UnityEvent is used in practice for editor convenience.
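A minimal sketch of what such a contract might look like (the exact signatures in the project may differ):

using System;

public interface IInteractable
{
    // Raised after the object has been interacted with, so other systems
    // (e.g. the event planner) can respond without direct references.
    event Action OnInteracted;

    // Called by the player interaction system when the interact input fires.
    void Interact();
}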

​​

Object_Interact Script - Inspect vs Interact

This approach separates interaction from object logic, making it easy to add new interactive elements without rewriting core systems. As with many of my other systems, I designed this with scalability and maintainability in mind, applying core object-oriented principles, particularly encapsulation (Janssen, 2023) and the open–closed principle (Janssen, 2018) where I can, to ensure new functionality can be added with minimal changes to existing code.

 

At the core of this system lies raycasting from the centre of the screen, which detects objects implementing IInteractable. When the player presses the interact button within range, the system calls the object’s Interact() method, triggering behaviours such as opening doors, inspecting items, or initiating scripted events. This clear separation between player input handling and object-specific responses reduces coupling and supports extensibility.
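As a rough illustration of that pattern, assuming the legacy Input system, an 'E' interact key, and an arbitrary interaction range (all illustrative assumptions rather than the project’s exact setup):

using UnityEngine;

public class InteractionRaySketch : MonoBehaviour
{
    [SerializeField] private Camera playerCamera;
    [SerializeField] private float interactRange = 2.5f;

    private void Update()
    {
        // Cast a ray from the centre of the viewport into the scene.
        Ray ray = playerCamera.ViewportPointToRay(new Vector3(0.5f, 0.5f, 0f));

        if (Physics.Raycast(ray, out RaycastHit hit, interactRange) &&
            hit.collider.TryGetComponent(out IInteractable interactable))
        {
            // A UI prompt would be shown here; the interaction fires on input.
            if (Input.GetKeyDown(KeyCode.E))
            {
                interactable.Interact();
            }
        }
    }
}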

Object inspection

Door interactions - Responsive UI

When implementing the UI for these objects, my focus was on reflecting the player’s current focus by displaying contextual text such as 'Interact', 'Open', 'Close' or 'Locked' (see above - right). Although the prompts currently include some hardcoded elements, the system largely relies on ScriptableObjects for prompt management, improving consistency and, for potential future use, facilitating localisation.

​

For the objects designated as inspectable, the system enables detailed examination by allowing players to pick them up and rotate them using mouse input (see above - left). During inspection, player movement and camera controls are disabled, and a spotlight is activated to highlight the object and draw attention. To maintain immersion and avoid jarring positional changes, the object’s original transform is saved, allowing it to be smoothly returned to its original state once inspection ends. After some issues during testing, I also temporarily disabled colliders on inspected objects to prevent unwanted collisions within the environment.

​The rotation of objects whilst being inspected is driven by mouse delta values and calculated within world space, with a configurable speed to balance responsiveness and user control (see right). The system also works to accommodate optional, object-specific behaviours during inspection; for instance, the RadioTuner script activates only while its associated radio is being inspected. This keeps object-specific functionality encapsulated, avoiding added complexity in the core interaction system.​
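A rough sketch of that world-space rotation follows; the input axes, speed, and method name are illustrative assumptions rather than the project’s exact code.

using UnityEngine;

public class InspectRotationSketch : MonoBehaviour
{
    [SerializeField] private float rotationSpeed = 120f;

    private void RotateInspectedObject(Transform inspected)
    {
        // Read this frame's mouse movement.
        float deltaX = Input.GetAxis("Mouse X");
        float deltaY = Input.GetAxis("Mouse Y");

        // Rotate around world axes so the motion matches the mouse regardless of the
        // object's own orientation: horizontal drag spins it around the world Y axis,
        // vertical drag around the world X axis.
        inspected.Rotate(Vector3.up, -deltaX * rotationSpeed * Time.deltaTime, Space.World);
        inspected.Rotate(Vector3.right, deltaY * rotationSpeed * Time.deltaTime, Space.World);
    }
}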

Object_Interact Script - Object inspection control

Currently, the Object_Interact script manages raycasting, UI prompt handling, and interaction state transitions. In retrospect, further decoupling these responsibilities - for example, by isolating transform manipulation or inspection logic into dedicated components - could enhance maintainability and promote reuse. Nonetheless, the system has performed reliably across a wide variety of interactive objects within the game environment.

Radio

Radio tuning using easter egg frequency - Narrative radio station, audio created using Narakeet & Audacity (Unmute for audio)

The radio system plays a central narrative and mechanical role, functioning as both an environmental storytelling tool and a player-driven means of progression. It anchors key sequences with fragmented broadcasts that slowly unveil the backstory and heighten the sense of unease. Tuning becomes both a literal and symbolic act; players must actively search for meaning, aligning with the game’s themes of distorted memory, perception, and space (see above).

​

Each radio station is defined by a ScriptableObject containing a stationPosition float (-1 to 1) and a reference to its own unique audio clip. These are instantiated through RadioStationBehaviour components in the scene, which handle fade logic and continuously check tuning proximity. Separating data and behaviour in this way enabled the rapid addition of new stations without altering the core system - a valuable feature as narrative complexity increased.

To create an intuitive radio interaction, I built a system around a single float value representing tuning position. The main RadioTuner script maps mouse input to this normalised float, which then drives multiple systems simultaneously; it controls the physical dial's rotation using localEulerAngles (Unity Technologies, 2025d), updates a UI tuning indicator via RectTransform (Unity Technologies, 2025e), and calculates a mapped radio frequency display between 88.0 and 108.0 MHz to maintain a sense of realism. All visual and auditory feedback is synchronised through this one float, ensuring the system remains consistent and responsive.
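A hedged sketch of how one normalised float can drive the dial, the UI indicator, and the frequency readout together is shown below; the sensitivity, angles, and field names are illustrative rather than the project’s exact values.

using TMPro;
using UnityEngine;

public class RadioTunerSketch : MonoBehaviour
{
    [SerializeField] private Transform dial;
    [SerializeField] private RectTransform tuningIndicator;
    [SerializeField] private TextMeshProUGUI frequencyLabel;
    [SerializeField] private float sensitivity = 0.5f;
    [SerializeField] private float indicatorTravel = 100f;   // half-width of the indicator track, in pixels
    [SerializeField] private float maxDialAngle = 135f;      // dial rotation at tuningPosition = ±1

    private float tuningPosition;   // normalised -1..1; everything below is driven by this value

    private void HandleTuningInput()
    {
        // Mouse movement nudges the normalised tuning position.
        tuningPosition = Mathf.Clamp(
            tuningPosition + Input.GetAxis("Mouse X") * sensitivity * Time.deltaTime, -1f, 1f);

        // Physical dial rotation.
        dial.localEulerAngles = new Vector3(0f, 0f, tuningPosition * maxDialAngle);

        // UI indicator position along its track.
        tuningIndicator.anchoredPosition = new Vector2(
            tuningPosition * indicatorTravel, tuningIndicator.anchoredPosition.y);

        // Mapped frequency readout between 88.0 and 108.0 MHz.
        float frequency = Mathf.Lerp(88f, 108f, (tuningPosition + 1f) * 0.5f);
        frequencyLabel.text = $"{frequency:F1} MHz";
    }
}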

RadioTuner C# Script - HandleTuningInput()

Radio tuning crossfade - (Unmute for audio)

When the player tunes within a defined tuningThreshold of a station’s position, that station fades in over time while the transitioning static fades out, handled using time-based interpolation of AudioSource.volume. If the player tunes away or approaches a different station, the system crossfades accordingly, creating a smooth and deliberate sense of feedback (see left). Additional static sound effects play briefly when tuning starts or ends, reinforcing the tactility of the interaction. 

I explored several approaches before settling on this implementation. In earlier iterations I used hardcoded floats and enums for stations, which quickly became inflexible and error-prone. Switching to ScriptableObjects allowed the stations to be treated as data, making it easier to reorder and update them without rewriting code. I also considered using UnityEvents and delegates but felt they added unnecessary complexity; given the linear logic, simple float comparisons with coroutines for fade transitions proved more efficient and easier to manage.

​

​For input, I wanted tuning to feel deliberate and precise, so the dial is designed to be sensitive without being frustrating. To improve accessibility and accommodate players less interested in manual tuning, I also added a right-click function that snaps to the next active station, balancing usability with narrative discovery.

Telephone

The phone serves as one of the game’s key responsive interactive objects, contributing to both pacing and worldbuilding through diegetic audio. This reinforces the core narrative idea of intrusion - of something reaching out to the player uninvited.​

Example of phone interaction & audio - (Unmute for audio)

The telephone system is built using a PhoneInteraction script that implements the shared IInteractable interface, maintaining consistency with other interactables across the game. Its behaviour is modular, meaning one can assign optional ringing audio (phoneRingAudio), a phone line audio clip (phoneVoiceAudio), and a linked object to activate on use (such as a door or event trigger). These elements are conditionally triggered depending on the phone’s current state, which toggles between 'idle' and 'in use'.

​

The interaction is handled via ObjectClicked(), which enforces a short lock to prevent double clicks, and runs either the pickup or putdown sequence based on the isPhoneInUse flag. This flag controls both the phone’s visual state (switching between idle and active prefabs) and its behaviour, from sound effects (pickupSound, putdownSound) to triggering other objects (activateOnUse). To ensure clear, immediate player feedback, I avoided coroutines or blended animations, favouring a simple authored sequence. This keeps the interaction clean and ensures each phone moment is deliberate and responsive.
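A simplified sketch of that pickup/putdown toggle, reusing the field names described above but otherwise making illustrative assumptions about the implementation:

using UnityEngine;

public class PhoneInteractionSketch : MonoBehaviour
{
    [SerializeField] private GameObject idlePhone;
    [SerializeField] private GameObject activePhone;
    [SerializeField] private AudioSource audioSource;
    [SerializeField] private AudioClip pickupSound;
    [SerializeField] private AudioClip putdownSound;
    [SerializeField] private AudioClip phoneVoiceAudio;
    [SerializeField] private GameObject activateOnUse;       // e.g. a door or event trigger
    [SerializeField] private float clickLockDuration = 0.5f;

    private bool isPhoneInUse;
    private float nextAllowedClickTime;

    public void ObjectClicked()
    {
        // Short lock to prevent accidental double clicks.
        if (Time.time < nextAllowedClickTime) return;
        nextAllowedClickTime = Time.time + clickLockDuration;

        // Toggle state and swap between the idle and in-use visuals.
        isPhoneInUse = !isPhoneInUse;
        idlePhone.SetActive(!isPhoneInUse);
        activePhone.SetActive(isPhoneInUse);

        if (isPhoneInUse)
        {
            if (pickupSound != null) audioSource.PlayOneShot(pickupSound);
            if (phoneVoiceAudio != null) audioSource.PlayOneShot(phoneVoiceAudio);
            if (activateOnUse != null) activateOnUse.SetActive(true);
        }
        else if (putdownSound != null)
        {
            audioSource.PlayOneShot(putdownSound);
        }
    }
}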

Audio

06

Audio

While I had previously created features like the footstep audio system, which made its implementation smoother, this project marked my first time using Unity’s audio capabilities beyond basic effects. This led me to build several custom scripts to handle non-linear, ambient audio designed to make players feel uneasy without relying on predictable cues. For the environmental sound, I wrote a RandomAudio script that plays one of several audio clips (creaks, distant knocks, etc.) at random intervals between 30 and 120 seconds. Each sound is spatialised by spawning a temporary AudioSource at a random position within a set range around the player using Random.insideUnitCircle (Unity Technologies, 2025f), and is destroyed afterwards. I also adjusted the volume and distance using spatialBlend, rolloffMode, and maxDistance, with the goal of making the gameplay experience feel more unpredictable.
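A rough sketch of that ambient one-shot pattern, using the interval and spatialisation settings described above; the radius and distance values are illustrative.

using System.Collections;
using UnityEngine;

public class RandomAudioSketch : MonoBehaviour
{
    [SerializeField] private Transform player;
    [SerializeField] private AudioClip[] ambientClips;       // creaks, distant knocks, etc.
    [SerializeField] private float minInterval = 30f;
    [SerializeField] private float maxInterval = 120f;
    [SerializeField] private float spawnRadius = 8f;
    [SerializeField] private float maxDistance = 15f;

    private IEnumerator Start()
    {
        while (true)
        {
            yield return new WaitForSeconds(Random.Range(minInterval, maxInterval));

            // Pick a random point around the player on the horizontal plane.
            Vector2 offset = Random.insideUnitCircle * spawnRadius;
            Vector3 position = player.position + new Vector3(offset.x, 0f, offset.y);

            // Spawn a temporary, fully spatialised AudioSource and clean it up
            // once the clip has finished playing.
            var temp = new GameObject("AmbientSound");
            temp.transform.position = position;
            var source = temp.AddComponent<AudioSource>();
            source.clip = ambientClips[Random.Range(0, ambientClips.Length)];
            source.spatialBlend = 1f;                         // fully 3D
            source.rolloffMode = AudioRolloffMode.Linear;
            source.maxDistance = maxDistance;
            source.Play();
            Destroy(temp, source.clip.length);
        }
    }
}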

RandomAudio Script


RandomAudio - Unity inspector view

To evoke paranoia and create the illusion that something is following the player, I created a GhostFootsteps script that hooks into the player's existing footstep system via OnFootstepPlayed. It plays a delayed duplicate of their last footstep sound, a fixed distance behind them. The effect is triggered randomly after a set amount of in-game time and only lasts for short bursts, controlled by a coroutine that toggles it on and off. I wanted this small manipulation to make players question whether they're just hearing their own footsteps - or something else mimicking them.​
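A hedged sketch of that echoed-footstep idea is below, assuming the footstep system exposes an event that passes the last clip played; the burst timings, delay, and offset values are illustrative.

using System.Collections;
using UnityEngine;

public class GhostFootstepsSketch : MonoBehaviour
{
    [SerializeField] private Transform player;
    [SerializeField] private float echoDelay = 0.4f;
    [SerializeField] private float distanceBehind = 3f;

    private bool echoActive;

    private void Start()
    {
        // The real system would also subscribe here, e.g.
        // footsteps.OnFootstepPlayed += HandleFootstepPlayed;
        StartCoroutine(BurstRoutine());
    }

    private IEnumerator BurstRoutine()
    {
        // After a random stretch of play time, enable the echo for a short burst only.
        while (true)
        {
            yield return new WaitForSeconds(Random.Range(60f, 180f));
            echoActive = true;
            yield return new WaitForSeconds(Random.Range(5f, 10f));
            echoActive = false;
        }
    }

    private void HandleFootstepPlayed(AudioClip lastFootstepClip)
    {
        if (echoActive)
        {
            StartCoroutine(PlayDelayedEcho(lastFootstepClip));
        }
    }

    private IEnumerator PlayDelayedEcho(AudioClip clip)
    {
        yield return new WaitForSeconds(echoDelay);

        // Replay the player's last footstep a fixed distance behind them.
        Vector3 behindPlayer = player.position - player.forward * distanceBehind;
        AudioSource.PlayClipAtPoint(clip, behindPlayer);
    }
}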

​

I also used Unity’s Audio Mixer groups to route everything through a master mix, and applied low-pass filters at specific moments to create a muffled audio effect. For the radio segments, I used Narakeet to produce the voice lines and created the final audio in Audacity, lowering the sample rate and applying band-pass EQ and static to simulate an AM radio signal. All of this was about building tension in a way that couldn’t be easily predicted: instead of overly scripted events, I leaned into randomness and systems-driven behaviour, which taught me a lot about Unity’s built-in audio components I hadn’t used before.

UI

07

UI

I designed the UI to be minimalist and unobtrusive, maintaining immersion while supporting accessibility and responsiveness. The central PauseMenu script I built handles all transitions between UI states, manages cursor visibility and lock states, and controls both audio and post-processing effects.

Within this menu, I implemented sliders for mouse sensitivity and master volume using Unity’s UI system. Sensitivity changes apply in real time to the FirstPersonController, allowing players to tune input precisely without restarting. Additionally, volume adjustments use logarithmic scaling (via Mathf.Log10) and an AudioMixer, offering perceptually accurate control and avoiding the flat volume ranges that often plague linear sliders.
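As a small illustration of that logarithmic mapping (the exposed mixer parameter name 'MasterVolume' is an assumption):

using UnityEngine;
using UnityEngine.Audio;

public class VolumeSliderSketch : MonoBehaviour
{
    [SerializeField] private AudioMixer masterMixer;

    // Hooked up to the UI slider's OnValueChanged (slider range 0..1).
    public void SetMasterVolume(float sliderValue)
    {
        // Convert the linear slider value to decibels so perceived loudness changes
        // evenly across the slider; clamp away from zero to avoid -infinity.
        float dB = Mathf.Log10(Mathf.Max(sliderValue, 0.0001f)) * 20f;
        masterMixer.SetFloat("MasterVolume", dB);
    }
}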

​

To reinforce the game’s psychological horror atmosphere, I also layered subtle post-processing during pauses using Unity's URP Volume system. For instance, the game applies a chromatic aberration override during pauses, subtly distorting visuals to imply disorientation or rupture in reality. I also added a custom glitch effect controlled by a GlitchEffect script for these UI states. 

​

The glitch shaders themselves are based on an open-source asset by Keijiro Takahashi (2016), but I extended their use through a modular control script. This script modulates parameters like scan line jitter, colour drift, and vertical jump dynamically across both gameplay and UI states. There are two core intensity levels that I built in - a full glitch state for climactic narrative events, and a more restrained version used in menus and transitions. By using coroutines and randomised distortion patterns, I kept the visual language consistent across the game’s layers. Ultimately, glitching became more than a visual effect; it served as a narrative tool to reinforce the game's themes, most notably of an unstable reality.

Main Menu

Pause Menu

Settings Menu

PauseMenu - Pause method

GlitchEffect Script - Subtle glitch coroutine

Final Word

08

Final Word

404: Exit not found has been the culmination of everything I’ve learned across three years of game design, development, and self-driven experimentation. It challenged me not just to build something more complex and technically robust, but to move beyond linear design and have confidence in my skills.

 

I built nearly every aspect from scratch, writing 47 scripts, creating 22 custom environment hallway prefabs, and spending countless hours iterating, testing, and refining. What I’ve shared here is only a fraction of the work that went into the final game - but I’m proud of how much of myself is in it. Showing it publicly for the first time at the end-of-year showcase and receiving such positive feedback was an incredibly rewarding conclusion to the project.

​

More importantly, it gave me space to experiment, make fast decisions, and adapt. The game isn’t perfect, but it reflects the range of technical and creative ability I’ve worked hard to develop. I finished something ambitious, complex, and personal - and I’m excited to see where that takes me.

References

References

Amnesia: The Dark Descent (2010) Xbox, PlayStation, PC [Videogame]. Helsingborg: Frictional Games.

Atlassian (2024) Scrumban: Mastering Two Agile Methodologies. Atlassian. Available at: https://www.atlassian.com/agile/project-management/scrumban [Accessed 18 May 2025].

Janssen, T. (2018) SOLID Design Principles Explained: The Open/Closed Principle with Code Examples. Stackify. Available at: https://stackify.com/solid-design-open-closed-principle/ [Accessed 18 May 2025].

Janssen, T. (2023) OOP Concept for Beginners: What is Encapsulation. Stackify. Available at: https://stackify.com/oop-concept-for-beginners-what-is-encapsulation/ [Accessed 18 May 2025].

Layers of Fear (2023) Xbox, PlayStation, PC, macOS [Videogame]. Krakow: Bloober Team.

P.T. (2014) PlayStation 4 [Videogame]. Tokyo: Konami Digital Entertainment.

Sky, D. (2023) What Does it Mean for Software to be Robust? Medium. Available at: https://medium.com/@Sky_Hustle/what-does-it-mean-for-software-to-be-robust-319f4f26677b [Accessed 5 May 2025].

Takahashi, K. (2016) KinoGlitch. GitHub. Available at: https://github.com/keijiro/KinoGlitch [Accessed 26 May 2025].

United Nations (2025) Goal 10 | Reduce Inequality within and among Countries. United Nations. Available at: https://sdgs.un.org/goals/goal10 [Accessed 21 May 2025].

Unity Technologies (2017) Unity Manual: Custom Editors. Available at: https://docs.unity3d.com/560/Documentation/Manual/editor-CustomEditors.html [Accessed 13 May 2025].

Unity Technologies (2025a) Unity Manual: UnityEvents. Available at: https://docs.unity3d.com/6000.1/Documentation/Manual/unity-events.html [Accessed 13 May 2025].

Unity Technologies (2025b) Unity Scripting API: NavMesh. Available at: https://docs.unity3d.com/6000.1/Documentation/ScriptReference/AI.NavMesh.html [Accessed 14 May 2025].

Unity Technologies (2025c) TextMesh Pro Documentation | Unity UI | 2.0.0. Available at: https://docs.unity3d.com/Packages/com.unity.ugui@2.0/manual/TextMeshPro/index.html [Accessed 20 May 2025].

Unity Technologies (2025d) Unity Scripting API: Transform.localEulerAngles. Available at: https://docs.unity3d.com/6000.1/Documentation/ScriptReference/Transform-localEulerAngles.html [Accessed 19 May 2025].

Unity Technologies (2025e) Unity Scripting API: RectTransform. Available at: https://docs.unity3d.com/6000.1/Documentation/ScriptReference/RectTransform.html [Accessed 19 May 2025].

Unity Technologies (2025f) Unity Scripting API: Random.insideUnitCircle. Available at: https://docs.unity3d.com/6000.0/Documentation/ScriptReference/Random-insideUnitCircle.html [Accessed 19 May 2025].

Women in Games (2016) About Individual Ambassadors. Women in Games. Available at: https://www.womeningames.org/ambassadors/about-individual-ambassadors/ [Accessed 23 May 2025].


© 2025 By Shahrzad Beevis
