
Hi, I'm Radu!

Enthusiastic Creative Media and Game Technologies student on the path to becoming a skilled game programmer. Explore my evolving portfolio to observe my journey and achievements in the world of game development.

Projects

PC / Unreal Engine / 2 months / solo

Into The Backrooms (2024)

For this elective, I decided to create a horror game using the first-person template. Escape the enigmatic backrooms by collecting seven peculiar images. But beware, for a menacing entity prowls these eerie corridors, seeking to intercept any who dare to cross its path.

AR / Unity / 2 months / team of 4

De Buitenschool AR (2024)

With the help of our AR application designed for young explorers aged 12 to 15, visitors can embark on an immersive journey through the rich history and culture of Glimmen's primary school.

PC / Unity / 2 months / solo

FPS Microgame: Leveling System (2024)

For the game programming focus track, I opted to implement a leveling system feature using the existing codebase of the FPS Microgame template. The game features a dynamic leveling system with escalating enemy waves, providing strategic depth as players retain upgrades but lose skills and coins upon death, enriching gameplay with enhanced progression and customization.

VR / Unity / 3 months / team of 5

Fast Food Worker Simulator (2023)

This VR game serves as a training simulator for fast-food workers in a future where 3D-printed food from recycled materials is the norm and efficiency is the key. Players embark on a journey to master the unique skills and procedures required in this cutting-edge culinary environment.

PC / Unity / 1 month / solo

Delusions (2023)

This project was developed for the game design focus track of the second year. You take the role of a person living with OCD and embark on a surreal journey through an alternate dimension, where you must clean houses, confront unique challenges, and navigate distorted realities.

Education

Hanze University of Applied Sciences

Creative Media and Game Technologies (2022 - 2026)

Currently learning game development and several related topics, such as:

  • 2D / 3D Development

  • VR / AR Development

  • Mobile Development

  • Game Programming

  • Unity

  • Unreal Engine

  • 2D Art

  • 3D Modeling / Animation

  • Game Design Theories

  • Scrum Methodology

  • Level / Narrative Design

  • User Experience / User Interface

  • User Research

University of Bucharest

Computer Science (2021 - 2022)

I studied Computer Science for one year in my home country but later realized I didn't fully enjoy it. Some useful and interesting topics I learned include:

  • x86 Assembly

  • Python

  • Computer Architecture

  • Linux / VirtualBox

  • Web Development

Skills

Technical Skills

  • C# / C++

  • Unity

  • Unreal Engine

  • Git / GitHub

  • Blender

  • Figma

  • Trello

Soft Skills

  • Able to efficiently manage my time and prioritize my work

  • Able to plan my work and set goals

  • Able to constantly adapt to new work environments

  • Able to problem-solve through observation, brainstorming, and logical reasoning

  • Experienced at working within a team and being an active team member

  • Experienced at communicating and presenting ideas

  • Experienced at researching and troubleshooting encountered problems

About Me

My name is Radu Duicu, also known by my online alias, "capnRadu". At the age of seven, I embarked on an exciting gaming journey that introduced me to classics like Counter-Strike and Minecraft. Since then, I've continuously immersed myself in various gaming genres, forging new connections and cultivating enduring friendships that continue to this day.

During my high school years, my interest in programming and editing began to grow. Over those four years, I honed my skills in C++ and built a foundation of critical thinking. While solving coding challenges was enjoyable, I craved a means to channel my creativity through programming. This led me to wonder how lines of code could culminate in a captivating gaming experience on people's screens. Concurrently, I delved into editing software such as Photoshop and Sony Vegas, initially as a personal hobby. As time progressed, I grew proficient in these tools, relishing the hours I dedicated to each project.

With a passion for programming firmly established, I decided to pursue computer science in my home country, Romania. However, upon reflection, I realized that this field did not wholly satisfy my aspirations. Nevertheless, certain courses provided invaluable knowledge, particularly in Python and computing systems architecture through experimenting with x86 Assembly, nurturing my critical thinking.

My journey ultimately led me to Hanze University of Applied Sciences, where I am currently a second-year student. Here, I have been exposed to game industry programs and have had the privilege of collaborating with exceptional individuals on group projects. It is within this environment that I have discovered my passion for becoming a game programmer.

Get Your Shit Together

PC / GameMaker Studio 2 / 2 months / team of 5

About

This project was my first journey into game programming. The task was to create a game tailored for first-year international students preparing to study abroad. The core concept revolved around 'wayfinding': a guide to help newcomers navigate their new environment, adapt to different people, and embrace a new lifestyle.

Two additional requirements were fostering intercultural competence, allowing players to learn about themselves and others, and integrating design ethics from Chris Nodder's book 'Evil by Design: Interaction Design to Lead Us into Temptation'. To fulfill these objectives, we used GameMaker Studio 2, my first experience with a game engine.

Work Method

I was in a project group with four other members, and I was assigned the programmer role. My team created all of the assets and UI elements in the game, and I tied everything together, alongside developing the mechanics and the gameplay loop, to create the final product, 'Get Your Shit Together!'.

During the seven weeks we spent developing the game, we used Trello for Scrum sprints and stand-ups so that we stayed up to date with each other's tasks; I would work on new mechanics or improve the game, then ask my team for feedback to check that everything matched our vision.

Story

The story of the game is that you, the main character, have just started your studies at 'Happy University' and must find your way through the city maze to reach the university. What's more, you have to work on yourself and improve aspects of your life to beat the maze. Along the way, you will encounter new people who will help you escape the maze and teach you new things about the adventure you have embarked on.

There are three main NPCs you have to talk to, and each will give you a task to accomplish before you can progress further in the maze. These represent common tasks a student can encounter while pursuing their studies and living alone. Once all the tasks are completed and you have gained all the useful insights from the NPCs, you are ready to exit the maze and confidently embrace student life.

How does the game meet the criteria?

The wayfinding aspect of the game is symbolized by the maze, representing the challenges an international student faces when beginning their studies in a foreign country. The maze serves as a metaphor for the anxieties of living independently, meeting new peers, finding accommodation, managing finances, and self-discovery. To enhance this metaphor, a black-and-white filter overlays the protagonist's life initially, reflecting a sense of dullness and uncertainty. However, as players progress through tasks and gain insights from the NPCs, the world gradually regains color, representing personal growth and transformation.

The intercultural competence requirement is fulfilled through the three key NPCs, known as "failed students." Despite their failures, these characters provide valuable lessons about student life and its obstacles. By interacting with them, the main character learns about the nuances of the student experience and gains intercultural understanding. The term "failed students" highlights the notion that failure can lead to valuable insights and personal growth.

Obtained Skills

  • GameMaker Language

  • Team management with Agile Scrum

Check out the project

Gallery

© capnRadu. All rights reserved.

BOS!

PC & Android / Unity / 4 months / team of 6

About

'Bos!', previously known as 'Living Isle', is the second game project completed during the second semester of the first year. The assignment entailed developing a Unity 3D game that fulfilled one of nine specified cases. Collaborating with a team of five other members, we selected the 'Wildlife' case at the beginning of the semester. Our client, the Staatsbosbeheer National Forest Service, sought a tool to raise children's awareness of the diverse wildlife inhabiting their national parks.

While the target audience focused on children, the product aimed to broaden their understanding that humans are an integral part of the natural world. The project requirements were to conduct and evaluate research with the target audience, use game design tools such as game design documents and one-pagers, and develop a publishable game.

Work Method

The project marked a significant milestone in my skill development. I took charge of programming the core mechanics, laying the foundation for the Android mechanics, which another team member later refined and enhanced. Additionally, I contributed to world-building efforts and implemented sound effects. Among the mechanics I developed were camera drag and rotation, the upgrade system, the income system, and the animal unlock system. My team focused on other aspects of the game, including UI design, modeling, wireframing, the gameplay loop, the narrative, and conducting research.

Throughout the development process, we organized our tasks into weekly sprint sessions and held daily stand-ups to monitor progress. Our aim was to create an idle game that captivated and engaged children, highlighting the diverse wildlife found in the Netherlands.

Gameplay

The goal of the game is to build a healthy ecosystem together with the animals you meet, and learn about their strengths! The more animals you help, the wealthier the environment becomes. Interact with animals, discover their needs, and decorate their habitats! The unique selling points are the creative decoration of the animal lots, the number of animals and upgrades, and the information about the animals.

How does the game meet the criteria?

Our product aims to provide an immersive experience that captivates players through actions that involve helping animals and their habitats. We carefully designed the gameplay to strike a balance where it engages children without promoting excessive dependence on the game, discouraging extended periods of indoor play. To address this concern, we implemented a mechanic that restricts access to the game unless a code provided by a Staatsbosbeheer forest ranger is entered. This encourages children to explore real-life wildlife and fosters a connection to nature.

In addition, our product caters specifically to children by incorporating adorable animal characters, a vibrant world with nature-inspired colors, an easily comprehensible user interface, and a visually appealing comic-like environment achieved with the help of shaders. These elements collectively create an experience that is both entertaining and educational, stimulating children's awareness of wildlife while keeping them engaged in a child-friendly setting.

Obtained / Refined Skills

  • C#

  • Unity

  • Game balancing

  • Team management using Agile Scrum and daily stand-ups

  • Developing game progression and income system

Check out the project

Gallery PC

Gallery Android

Gallery Older Versions


Delusions: Shadows of the Mind

PC / Unity / 1 month / solo

About

In undertaking the game design focus track assignment, the task involved the creation of a polished and playable game level. This encompassed not only the actual development of the level but also the formulation of a comprehensive game design document elucidating the underlying concept. The project demanded the application of diverse game design theories, spanning mechanics, narrative, aesthetics, and various other facets of game development. Moreover, the assignment granted a degree of flexibility, allowing us to select assets that would help in the game's evolution, provided that these resources were sourced independently.

Work Method

The structured work method entailed initiating the game design process by formulating a comprehensive game design document (GDD). This began with a high-level overview of the game, outlining the target audience and the external and internal goals, and detailing mechanics, aesthetics, story, and technology, supported by concept images, wireframes, and a schematic illustrating the game's flow. Next, the document emphasized the originality of the game design concept and its innovative contributions to games, genres, or mechanics, while articulating the intended user experience. The GDD then shifted focus to level design, detailing item placement, player progression, and the overall flow of the level, and explaining how elements such as flow, pacing, chokepoints, and aesthetics contribute to the desired player experience, with indications of the designed game genre. Finally, the work moved on to the practical implementation of the level design: sourcing assets (with clear documentation of asset sourcing in the GDD) and constructing a playable space enriched with an objective, collectibles, pickups, marks, or quests aligned with the game concept, featuring a recognizable beginning and ending.

For an in-depth explanation of the game flow and design choices, you can read the complete GDD on the itch.io page below.

Gameplay

In this game, players step into the shoes of an individual navigating the challenges of obsessive-compulsive disorder, confronting the daily struggle of maintaining a balance between work and an overpowering urge for cleanliness. The narrative takes an unexpected turn when the protagonist's cleaning routine becomes a gateway to an alternate dimension. In this surreal realm, identical houses present opportunities to escape, but only by meeting strict cleaning challenges within tight time constraints, enduring disturbances and visual distortions.

The game introduces a dual challenge as players contend not only with physical chaos but also with relentless intrusive thoughts that intensify as cleaning falters. The pressure mounts, with the impending consequences of failing to satisfy compulsions heightening the stakes throughout the immersive experience.

Unique Selling Points

"Delusions: Shadows of the Mind" distinguishes itself through a blend of unique features that collectively contribute to a captivating gaming experience. The game's standout characteristics include a visually distinct aesthetic achieved through the combination of VHS and PS1 art styles, fostering a nostalgic yet striking atmosphere. Notably, the game delves into mental health themes, specifically addressing the protagonist's struggle with OCD, offering players a nuanced exploration that enhances narrative depth and emotional connection.

The game excels in crafting a psychologically charged horror atmosphere, employing disturbing visuals, eerie soundscapes, and unsettling environments that heighten player emotions and perceptions. The introduction of surreal environments in an alternate dimension adds an element of mystery and unpredictability, challenging players to navigate through distorted realities and enhancing overall immersion. Interactive engagement is elevated through the implementation of skill checks tied to the cleaning process, requiring players to actively participate and make skill-based decisions that directly influence the game's atmosphere.

Additionally, the game adopts a puzzle-driven gameplay approach, incorporating environmental puzzles that demand critical thinking and strategic decision-making, thereby enhancing player engagement and contributing to the overall challenge.

Obtained Skills

  • Game design

  • Level design

  • Narrative crafting

  • Psychological horror design

  • Skill-based gameplay implementation

  • Puzzle design

  • Challenges implementation

  • Dramatic progression

  • Feedback loops

  • Difficulty progression

Check out the project

Gallery


Fast Food Worker Simulator

VR / Unity / 3 months / team of 5

About

This game, crafted at the beginning of the second year, represents my first project where I experimented with VR development. Working within a team of 5 members, I took charge of the programming aspect of the project. Our task was to use games as a tool to explore the potential applications of futuristic technologies. The challenge was to conceptualize and create a game set 30 years in the future, envisioning its look, gameplay mechanics, and target audience.

Questions surrounding the next breakthrough in VR gaming and the upcoming mainstream technology in the gaming industry were explored. The design constraints included the use of Unity as the development platform, adherence to a visually fitting style, and ensuring playability for a live demonstration. Ultimately, we had to position ourselves within the context of an evolving future.

Work Method

We've envisioned a future as the setting for our game where 3D-printed fast food from recycled materials has become a staple of the culinary world due to its convenience, affordability, and sustainability. People lead busy lives and value quick, cost-effective meals. The game is a playful yet informative tutorial simulation designed to train new fast-food workers on efficiently using 3D printers for food preparation. 3D-printed fast food has contributed to an eco-friendlier dining culture.

Customers appreciate the commitment to recycling and reducing plastic waste. Furthermore, the game has become an integral part of training programs for fast-food workers. Aspiring employees must complete the simulation to acquire the necessary skills for their roles. The training provided by this game has significantly improved efficiency in fast-food establishments, ensuring that the 3D printing technology is used to its fullest potential, reducing waiting times and enhancing customer satisfaction.

However, one potential challenge that needs to be addressed is the initial hesitation and discomfort that many restaurant workers may experience when introduced to this technology, potentially leading to a sense of discouragement. Our solution tackles this challenge by offering an interactive tutorial game that presents essential knowledge in an engaging and enjoyable manner, empowering workers to embrace the future confidently.

Gameplay

The primary goal of the game is to train new fast-food workers in the efficient and skillful use of 3D printers. Players, taking on the role of aspiring employees, engage in a tutorial simulation that guides them through various challenges representative of real-world fast-food kitchen situations. Customers have the option to choose from three distinct recipes—hamburger, hotdog, and the daily special—allowing them to personalize their orders based on their preferences.

The overarching objective is for players to successfully navigate and master the essential skills involved in handling 3D printers, ensuring the proper printing and assembly of the received orders. Moreover, players are tasked with maintaining the 3D printers, ensuring their optimal functionality.

Core Mechanics: NPCs

The first version of the NPCs was the most basic one and laid the foundation for how they would be programmed later. The core aspects of the NPC were: spawn, move to the counter, order, receive the order, and move to the destroy point. For the NPC's movement, I used empty GameObjects as waypoints, and the MoveTowards and Distance functions for moving between them. For the ordered object, I used an array holding all the possible order options and a variable that randomly picks one of them. For managing the NPC's current state, I used a switch statement that toggles between the different states. Once the NPC prefab is instantiated, the script first chooses an order, then changes the state to "spawn" using an IEnumerator. During the "spawn" state, the script updates the position and checks whether the NPC has reached the next waypoint (i.e. the counter). If it has, the script gets a reference to the canvas, instantiates the NPC text prefab, and finally changes the state to "order". During this state, the NPC waits for the ordered object. If an object enters its collider, the script verifies that the object is the ordered one by comparing tags, and checks that the state is "order", so the player can only serve the order during this state. Finally, the state changes to "destroy", and the NPC moves to the destroy point, where it is destroyed. For spawning customers, I made a separate script that uses the InvokeRepeating function to instantiate the NPC prefab.
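The spawn/order/destroy flow described above can be sketched engine-agnostically. This is a minimal Python illustration of the state machine, not the actual Unity C# script; the class and method names are assumptions for illustration only.

```python
import random

# Possible orders, mirroring the array of order options described above.
MENU = ["hamburger", "hotdog", "daily special"]

class Npc:
    """Minimal sketch of the customer state machine: spawn -> order -> destroy."""

    def __init__(self):
        self.order = random.choice(MENU)  # pick a random order on spawn
        self.state = "spawn"

    def reach_counter(self):
        # Called once the NPC arrives at the counter waypoint.
        self.state = "order"

    def receive(self, item_tag):
        # Only accept an item while waiting at the counter, and only
        # if its tag matches the chosen order (the tag comparison above).
        if self.state == "order" and item_tag == self.order:
            self.state = "destroy"  # walk to the destroy point
            return True
        return False

npc = Npc()
npc.reach_counter()
served = npc.receive(npc.order)  # serving the correct item succeeds
```

In the real project, the movement between waypoints would use Unity's MoveTowards each frame; here only the state transitions are modeled.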

The second version of the NPCs added more mechanics to the previous one, like a timer bar for the order, a more complex order check for a hamburger, and improvements to the movement. The NPC script remained the same in structure but was improved to facilitate the new mechanics. In the beginning, I wasn't sure how to program the check for whether a complex order made of multiple ingredients was fulfilled, so I searched for a tutorial on making a food game to gain insights. In the end, I found that a good solution is to have a recipe index that uses powers of ten to build the full recipe. For example, imagine a simple cheeseburger made of a bottom bun, patty, cheese, and top bun. Given that the recipe must always have a bottom bun, the bottom bun's index is 1, followed by the patty, which is 10, the cheese, which is 100, and the top bun, which is 1000. So the recipe index of a burger made of a bottom bun, patty, cheese, and top bun is 1111. Once the NPC is instantiated, it first randomly generates the ordered recipe index. This is done with a for loop over the length of the order recipe array, which is set in the inspector to the total number of ingredients. In addition to the previous script, after the order index is generated, the script uses a list to find how many spawned NPC prefabs there are, for movement purposes. During the "spawn" state, the script checks whether there are multiple NPC prefabs and adjusts the movement accordingly, to avoid bugs like NPCs moving through each other or waiting in the exact same position. After the NPC has reached the counter, the script changes the text according to how many ingredients the order should have, based on the recipe index. It also instantiates a timer bar for the order. During the new "order" state, the NPC waits for the order until the timer bar is empty. To check whether the order is correct, the script verifies that the entered collision is in fact a hamburger, and if it is, compares the NPC's recipe index to the hamburger's order index, generated in a separate script. If the timer runs out, or the order is received, the NPC moves towards the destroy point.
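The powers-of-ten recipe index described above reduces a whole recipe to a single comparable integer, with each ingredient contributing one decimal digit. A small Python sketch of the idea (function and dictionary names are mine, for illustration):

```python
# Each ingredient occupies one decimal digit of the recipe index,
# following the cheeseburger example above.
INGREDIENT_INDEX = {
    "bottom bun": 1,
    "patty": 10,
    "cheese": 100,
    "top bun": 1000,
}

def recipe_index(ingredients):
    """Sum the per-ingredient indexes into one comparable number."""
    return sum(INGREDIENT_INDEX[name] for name in ingredients)

ordered = recipe_index(["bottom bun", "patty", "cheese", "top bun"])  # 1111
built = recipe_index(["bottom bun", "patty", "top bun"])              # 1011
correct = (ordered == built)  # False: the cheese digit is missing
```

A nice property of this encoding is that a mismatch pinpoints which ingredient is wrong: each decimal digit corresponds to exactly one ingredient slot.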

The third and final version added new recipes, difficulty progression, a star rating, new NPC models, a dynamic timer, and comments in the script. Building on the recipe index mechanic, I added two more possible recipes: a hotdog, made of a bun and a sausage, and "today's special" (the player can serve any recipe or ingredient). During the Start method, the script now checks how many child objects the NPC prefab has (these are the models), randomly chooses one while keeping a reference to its animator, and destroys the rest of the children. Regarding multiple orders, the script first chooses a random number between 1 and 3 (hotdog, hamburger, today's special), which represents the first order. For difficulty progression, the script chooses a second order, this time between 0 and 2 (nothing, hamburger, hotdog), but only if there have been more than 9 customers. The script also takes into account the number of ingredients in the order and adds extra time to the base timer. After this, the script generates two recipe indexes for the two orders. During the "spawn" state, in addition to the behavior of the older version, the script now also updates the model's animation by changing a bool variable; at the end of this state, the text is updated for the first order. Moreover, there is now one more state, "order 2", which works the same as "order" but for the second order. To check the received order, the OnCollisionEnter method first checks the recipe index of the first order. The script then checks whether there is a second order, and if there is, the timer is reset, the customer text is updated for this order in the same way as in the "spawn" state, and the state changes to "order 2". Throughout all of this, the script also tracks whether the received order is correct or the timer ran out, and updates the number of stars accordingly.
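The difficulty rules above (a possible second order after more than 9 customers, and a timer that scales with ingredient count) can be sketched as follows. The constants and function names are assumptions for illustration, not values from the project:

```python
import random

BASE_TIME = 10.0          # assumed base timer, in seconds
TIME_PER_INGREDIENT = 2.0  # assumed extra time per ingredient

def roll_orders(customers_served):
    """Pick the customer's orders, mirroring the rules described above."""
    first = random.randint(1, 3)       # 1=hotdog, 2=hamburger, 3=today's special
    second = 0                         # 0 = no second order
    if customers_served > 9:           # difficulty kicks in after 9 customers
        second = random.randint(0, 2)  # 0=nothing, 1=hamburger, 2=hotdog
    return first, second

def order_timer(ingredient_count):
    # More ingredients in the order means more time on the timer bar.
    return BASE_TIME + TIME_PER_INGREDIENT * ingredient_count
```

Early customers therefore always get a single order, while later ones may carry two, and bigger recipes grant proportionally more time.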

Core Mechanics: Recipes

For the first version of the recipe script that handles burger building, I took into account the tutorial mentioned earlier. The ingredients we chose to have in the game were bottom bun, patty, cheese, bacon, and top bun. The script first references the prefabs of these ingredients and stores their indexes in a variable. It is important to note that this script is attached to the bottom bun prefab, which means the recipe must always start with the bottom bun. Inside the OnCollisionEnter method, the script checks whether the object is an ingredient by comparing tags, calls the AddIngredient function, and adds the respective ingredient's index to the order index, which the NPC script uses to check whether the received order is correct. The AddIngredient function handles how the ingredient is placed on the bottom bun. The newly instantiated ingredient's position takes into account the spacing between it and the previously added ingredient, if any. After this, the new ingredient is made a child of the bottom bun GameObject, and the bottom bun's collider is updated to also fit the size of the newly added ingredient.

The final version of the recipe script works the same as the previous one but improves the spacing and supports new ingredients. At this point, the ingredient list was updated to bottom bun, patty, cheese, tomatoes, onions, salad, and top bun. There is also a new script that handles building the hot dog, which is structurally identical. During the Start method, the spacing variable is set to half the size of the bottom bun's box collider, for more precise spacing. Furthermore, the DestroyObject function is invoked to destroy the object after some time in case it is not used in a recipe. Building the order index remains essentially the same as before. The AddIngredient function, however, is improved. Because an instantiated ingredient spawns at the origin point of the bottom bun, which sits at the center of the model, the two models would intersect if the ingredient were placed directly at the current spacing offset. For this reason, at the beginning of AddIngredient, the spacing first grows by half the new ingredient's box collider size, placing the spawn point where the new ingredient's origin should be: right on top of the bottom bun, or of the previously added ingredient, if any. The ingredient is then instantiated, set as a child, and the bottom bun's box collider is updated. Finally, the spacing grows by half the new ingredient's box collider size once more, marking the new top of the stack for the next ingredient, just as with the bottom bun.

If I had had more time to work on the project, I would have fixed some of the bugs this mechanic has. One of the bugs, also shown in the video, concerns the placement of the ingredients: if the bottom bun is facing the right way up, building proceeds normally, but if it is upside down, the ingredients still spawn on top of the bottom bun, and the box collider's size is still updated upward.

Core Mechanics: 3D Printers

During the first stages of development, we had to choose how to handle object instantiation. I programmed two mechanics for this: a 3D printer that spawns objects when a button is pressed, and in-hand instantiation, which spawns an object and grabs it in the hand at the same time. The 3D printer mechanic uses the OnTriggerEnter and OnTriggerExit methods to check whether the button is pressed, invoking whichever function is assigned to the press or release. The SpawnObject function instantiates a given GameObject at the spawn point. The in-hand instantiation script inherits from XRBaseInteractable and overrides the OnSelectEntered method to instantiate a GameObject and grab it at the same time.

Once we chose to go with the 3D printer, I developed the second version of the mechanic. Structurally, it is the same as before, with only one addition to the SpawnObject function: the object is now instantiated only if the isInstantiated bool variable is false, i.e. only while the printing space is free, in order to limit the number of spawned objects.
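The spawn gate above is a simple occupancy flag. A minimal Python sketch of the idea (class and method names are assumptions, not the project's C# identifiers):

```python
class Printer:
    """Sketch of the printer's spawn gate: print only while the space is free."""

    def __init__(self):
        self.is_instantiated = False  # is the printing space occupied?
        self.spawned = 0

    def spawn_object(self):
        if self.is_instantiated:
            return False              # space still occupied, ignore the press
        self.is_instantiated = True
        self.spawned += 1
        return True

    def object_removed(self):
        # Called when the printed object leaves the printing space.
        self.is_instantiated = False
```

Pressing the button repeatedly therefore produces at most one object until the previous one is taken off the printer.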

The final version of the 3D printer mechanic features new additions, such as a cleaning mechanic, smoke particles, and difficulty progression. Spawning the object remains the same as before, except that a new object can only be instantiated if the printer's health is above 0. The printer damage is handled by another script. In the Start method, the script sets the printer health and the damage bar position, and disables the smoke particles' emission rate over time. Inside the Update method, the script checks whether more than 14 customers have been instantiated, for difficulty progression, and enables the damage bar. After this, the smoke particles' emission rate is changed based on the printer's health. The OnTriggerStay method handles the cleaning of the 3D printer. The script checks whether the object is a sponge by comparing tags; if the printer's health is not at its maximum and the damage bar is enabled, the cleaning can happen. At first, I thought of using the Transform.hasChanged property, as it seemed suitable for what I needed, but I later decided to program the sponge's movement check manually, as it allowed for more personalization. So, if the sponge is moving, the printer's health increases. Finally, the IncreaseDeterioration function is assigned in the inspector and is called when spawning an object, to decrease the printer's health and update the damage bar. The sponge cleaning itself is handled by the CleaningObject script, which uses an IEnumerator to check the movement: the startPos variable stores the current position of the sponge, and the finalPos variable stores its position 0.1 seconds later. If the difference between startPos and finalPos falls within a certain interval, the sponge is considered to be moving.
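The movement check and cleaning gate described above can be sketched engine-agnostically. This Python illustration samples the sponge position twice and treats it as "scrubbing" when the displacement falls inside an interval; the thresholds and function names are assumptions, not the project's values:

```python
import math

MIN_MOVE = 0.01  # ignore jitter below this displacement (assumed)
MAX_MOVE = 1.0   # ignore teleport-like jumps above this (assumed)

def is_scrubbing(start_pos, final_pos):
    """True if the sponge moved a plausible scrubbing distance in 0.1 s."""
    dx = final_pos[0] - start_pos[0]
    dy = final_pos[1] - start_pos[1]
    dz = final_pos[2] - start_pos[2]
    moved = math.sqrt(dx * dx + dy * dy + dz * dz)
    return MIN_MOVE < moved < MAX_MOVE

def clean(printer_health, max_health, bar_enabled, moving, heal=1):
    # Cleaning only applies while the damage bar is shown and health is
    # below max, mirroring the OnTriggerStay checks described above.
    if bar_enabled and printer_health < max_health and moving:
        printer_health = min(max_health, printer_health + heal)
    return printer_health
```

Sampling displacement over a short window is cheaper and more tunable than reacting to every transform change, which matches the decision to avoid Transform.hasChanged.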

Core Mechanics: Ads

Later in the development of the game, we decided to implement intrusive ads. To do this, I made a script that handles the game menu alongside the ads. In the Start method, it repeatedly invokes the DisplayAd function, which activates the ad canvas. In the Update method, the menu's position is updated to follow the player's head, and the number of stars is refreshed. Furthermore, if an ad appears, the script checks the distance between the ad canvas and one of the player's hands and destroys the ad accordingly.
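The distance test that dismisses an ad can be sketched as follows; the function name and the 0.15-unit grab radius are hypothetical, not taken from the project:

```python
def should_destroy_ad(ad_pos, hand_pos, grab_radius=0.15):
    """Destroy the ad canvas once one of the hands comes within
    grab_radius of it (Euclidean distance between the two positions)."""
    dist = sum((a - h) ** 2 for a, h in zip(ad_pos, hand_pos)) ** 0.5
    return dist <= grab_radius
```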

Obtained Skills

  • VR development

  • Performance optimizations

  • Future literacy

  • Serious game design

  • Futuristic conceptualization

Check out the project

Gallery

© capnRadu. All rights reserved.

FPS Microgame: Leveling System

PC / Unity / 2 months / solo

About

The objective of the game programming focus track was to simulate a professional game studio environment by implementing a specific feature within an existing codebase, the FPS Microgame. This process involved three primary phases: research, including understanding the codebase, studying implementations in other games, and compiling a list of specifications; technical design, which entailed creating a plan for the feature's implementation; and implementation, involving building the feature, iterating on it, and refining the design structure as needed.

Work Method

As a feature, I decided to implement a leveling system. My personal learning goals for the project included gaining a comprehensive understanding of the FPS Microgame codebase through experimentation, and improving my ability to plan scripts ahead by envisioning and sketching how different mechanics would interact with each other.

Regarding the research phase, the feature draws inspiration from two distinct games: Brotato and Hades. In Brotato, players face successive waves of enemies, earning XP, coins, and health boosts upon elimination; various menus, like the level-up and shop menus, offer upgrades, items, and weapon combinations. In Hades, players pursue persistent upgrades across runs, embracing failure for narrative and gameplay advancement; they accumulate resources, engage in dialogue, and unlock upgrades in a cyclical journey. These influences converge into a dynamic leveling system that enriches player progression and narrative depth. Based on this, the requirement list for the leveling system included XP and coin gain, a level-up menu, a shop menu, a stats menu, and persistent upgrades.

Gameplay

The leveling system in the FPS Microgame is a pivotal and dynamic element that transforms the gaming experience into a series of challenging enemy waves. From the main menu, players can initiate a new run, and in the event of their death during a run, they have the option to respawn, starting from the initial wave. However, all acquired skills and coins are forfeited, while persistent upgrades chosen up to that point and the current level are retained, introducing a lasting strategic layer to the gameplay. The run persists until the player actively opts to restart it or return to the main menu.

Each wave presents escalating difficulty, characterized by an increase in the number of enemies, their health, and the corresponding rewards. As players navigate through these waves, each successful elimination yields XP points and coins, forming the foundational mechanic of the leveling system. At the end of each wave, players encounter specific menus: the level-up menu, triggered only if the player levels up, allows them to upgrade health, speed, or damage by predefined values based on their level-up amount; the shop menu lets them spend collected coins on new skills, and higher levels of those skills, from a randomly generated list. These skills include HP regeneration, reload speed, critical damage, critical chance, life steal, and coin gain. The stats menu, shown alongside both, lets players review the stats resulting from their chosen upgrades and skills.

The primary role of this feature is to enhance player progression and strategic decision-making. Beyond that, it provides a sense of accomplishment and customization, enriching the overall gaming experience.
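The per-wave escalation described above (more enemies, more health, bigger rewards) could be expressed as a small scaling function; the growth constants below are illustrative, not the game's actual tuning:

```python
def wave_stats(wave, base_enemies=3, base_health=100, base_reward=10):
    """Scale enemy count, enemy health, and per-kill reward with the
    1-based wave index. All growth rates are assumed tuning values."""
    step = wave - 1
    return {
        "enemies": base_enemies + 2 * step,
        "health": base_health * (1 + 0.25 * step),
        "reward": base_reward + 5 * step,
    }
```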

Technical Design

Various UML diagrams were used in development to visually represent the feature's systems, aiding in understanding design, code architecture, and implementation. The final version is summarized by a comprehensive set of UML diagrams, including object, flowchart, and class diagrams, offering a clear and organized depiction of the underlying mechanics.

The initial diagram highlights gameplay object instantiation, emphasizing multiplicity. In the FPS Microgame's main scene, a hoverbot and a Patrol GameObject were present, utilizing the PatrolPath script. To integrate the leveling system, the WaveManager script spawns multiple enemies. Reusing the Enemy_HoverBot prefab created a problem with shared patrol routes, resolved by creating a prefab with unique routes. Two further issues emerged: the Enemy_HoverBot lacked a PatrolPath reference, resolved by obtaining it from the parent, and destroying a hoverbot left its Patrol GameObject behind, fixed by checking whether the EnemyController component on its child was null, signaling the hoverbot's destruction. The second part illustrates the Skill script and its instances, representing the skills in the shop menu, with further details in the class diagram section.

The flowchart provides a detailed, systematic illustration of how the entire system operates within the game, visually breaking down the interactions, processes, and decision points that drive its logic during gameplay.

The class diagram visualizes the object-oriented side of the system, offering insights into the classes, attributes, methods, assembly definitions, and relationships within the feature. The WaveManager script serves as the foundational component, managing player upgrades, resource persistence, and the wave system. Key functionalities include the NextWave and SpawnEnemies methods for enemy spawning and progression, along with a ResetWave method for initializing a new run. It has a composition relationship with the GameFlowManager script for sound effects at wave starts.

During final testing, I identified a chance to improve the wave system. The original design, relying on a fixed enemy count to progress, felt static. Building on playtesting insights, I introduced a second iteration with a 45-second wave timer that spawns mini waves once the previous enemies are eliminated. This dynamic change enhanced the gameplay, encouraging players to be more attentive and strategic.

The GameFlowManager script controls the game flow, checking objectives, player death, and scene transitions. It has a composition relationship with PlayerResources, loading the level-up menu on level-up and the shop menu otherwise.

PlayerResources maintains player resource information, with methods like GainCoins, GainXP, and HandleXP for updating values upon enemy elimination. It has composition relationships with WaveManager and SkillManager to update persistent values.

The Health script manages player and enemy health through the RegenHP, Heal, TakeDamage, HandleDeath, and Kill methods. It has an aggregation relationship with PlayerResources for coin and XP updates, and composition relationships with WaveManager for health upgrades and SkillManager for HP regeneration updates.

SkillManager handles skill persistence, levels, and stats, using InstantiateSkills to spawn the skills in the shop menu. The Skill script, attached to each skill prefab, initializes skill values and updates UI text; it has a composition relationship with SkillManager for communication and value updates. The BuySkill script manages skill purchases and upgrades, checking and updating player coins through WaveManager and stats via SkillManager.

ChooseUpgrade handles upgrades within the level-up menu via its Upgrade method, with a composition relationship with WaveManager for persistent value updates. PlayerCharacterController controls player movement, with a composition relationship with WaveManager for applying speed upgrades. WeaponController maintains the weapons system, aggregating WaveManager for damage upgrades. The remaining scripts handle UI, projectiles, and enemy AI.
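The second-iteration wave loop (a 45-second timer plus mini waves once the previous enemies are down) can be sketched as a per-frame tick function; the state layout and return labels here are hypothetical, not the WaveManager's actual API:

```python
def wave_tick(state, dt, enemies_alive):
    """Advance the 45 s wave timer by dt. The wave ends when the timer
    runs out; a mini wave spawns whenever the previous enemies are gone."""
    state["time_left"] -= dt
    if state["time_left"] <= 0:
        return "wave_over"
    if enemies_alive == 0:
        return "spawn_mini_wave"
    return "running"
```

In this sketch, `state` would be reset to `{"time_left": 45.0}` at the start of each wave.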

Reflection

Implementing the feature was a fulfilling journey that contributed significantly to achieving my personal learning goals. In the initial stages, delving into the existing codebase allowed me to grasp the underlying mechanics and understand their complex communication. The realization that the scripts are organized into multiple assembly definitions, though initially frustrating, turned out to be an invaluable learning opportunity. Utilizing UML diagrams was crucial for planning and visualizing system interactions, and creating mechanics similar to beloved games was both exciting and motivating, leading to a cohesive leveling system structure.

Reflecting on past weaknesses, I've experienced significant growth. Adopting a disciplined approach to project initiation with thorough planning has enhanced my ability to anticipate interactions between elements, leading to a more coherent coding process. Transitioning from impulsive to methodical coding has reduced issues like spaghetti code and repetition. Over time, I've addressed my lack of experience in implementing features within an existing codebase, sharpening my skills through my contributions to the Microgame.

Obtained Skills

  • Feature implementation

  • Working with existing codebase

  • Researching

  • Technical design

  • Iterating

  • Working with assembly definitions

  • Planning and organizing scripts

  • UML diagrams

Check out the project

Gallery

© capnRadu. All rights reserved.

De Buitenschool AR

AR / Unity / 2 months / team of 4

About

This game was my first project experimenting with AR development. Working in a team of 4, I took charge of the programming side of the project, alongside the UI design. Our goal was to create a solution to a specific problem of a real client assigned to us, following an Agile process to develop, iterate on, and test the solution to the client's design brief.

Our client was De Buitenschool, whose main goal was to transform the primary school in Glimmen into one of the most appreciated and striking top locations in the region. To do this, they needed a solution that would bring them a large number of visitors. Our solution is an augmented reality application that informs visitors about the architecture, history, art, and culture of the place through an interactive and immersive experience. The target audience we chose to design the app for is children from 12 to 15 years of age.

Work Method

Until this project, I had never developed an application or game using augmented reality, so I thought it would be an interesting area to work in. To familiarize myself with the concept, I downloaded the Unity AR Foundation samples and experimented with each one, testing it and reading through the scripts that handled its mechanics. Since our app would focus mainly on image recognition, with multiple images that needed to be scanned, recognized, and matched to specific prefabs, I followed several tutorials covering the basics of Unity's image recognition feature. Besides programming the application's functionality, I also designed the UI menu that showcases the information from the stickers.

Gameplay

The app is accompanied by a main flyer given to visitors when they first set foot on the premises, with a QR code from which they can download the app. Once the app is installed, the flyer can be scanned, displaying a mini-story and the overall objective: explore the surroundings, find stickers, and scan them.

After a sticker is scanned, users must complete a puzzle, an image related to the sticker's subject, so the target audience feels more inclined to find out what they receive after completing it. It is worth mentioning that the puzzle mechanic was developed entirely by another team member and was not implemented in the application. After the puzzle is completed, users can read the presented information, or press the history button to read about the past.

The app presents information for 4 different stickers, one of which is secret: visitors must first scan the other 3 stickers to unlock it. There is also the inventory menu, which displays the information from unlocked stickers, so users can read it anytime without having to revisit the school.
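The unlock gating for the secret sticker can be sketched as a small tracker; the class, method, and sticker identifiers below are made up for illustration:

```python
class StickerTracker:
    """Counts unique scanned stickers, ignoring the main flyer and the
    secret sticker itself; the secret unlocks after the other three."""
    EXCLUDED = {"flyer", "secret"}  # illustrative identifiers

    def __init__(self):
        self.scanned = set()

    def register(self, sticker_id):
        if sticker_id not in self.EXCLUDED:
            self.scanned.add(sticker_id)  # a set ignores repeat scans

    @property
    def secret_unlocked(self):
        return len(self.scanned) >= 3
```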

Development

Regarding functionality, the PrefabTracking script handles image recognition and prefab instantiation through an array of GameObjects holding the prefabs. Each prefab must share the exact name of its target image in the AR image library, so when one of those images is tracked, the prefab with the matching name is instantiated. After the prefab is created, its scale and position are set so that it appears on the right side of the tracked image.

The FlyerEnable script handles information about the prefabs. When a prefab is created, the script checks that it is not the main flyer or the secret sticker, and that it wasn't already present in the scene, before increasing the scanned stickers count. This matters because the secret sticker should unlock only once the first 3 stickers have been scanned. The rest of the script, together with the StickerStatusUI script, handles the UI in the inventory menu, specifically the locked/unlocked text.

The ScannedStickers script only keeps track of the number of scanned stickers, passing the value to the SecretStickerMechanic script, which handles the status of the secret sticker: once the other 3 stickers have been scanned, its content becomes available. Finally, the ButtonScript handles the logic of the "History" and "Back" buttons present inside each prefab, while the ButtonsUI script handles the button logic and functions in the inventory menu.
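The name-matching lookup the PrefabTracking script relies on amounts to a simple search by name; this Python sketch stands in plain dicts for prefabs, so the names and structure are illustrative only:

```python
def find_prefab(tracked_image_name, prefabs):
    """Return the prefab whose name exactly matches the tracked image's
    name, or None when no prefab is registered for that image."""
    return next((p for p in prefabs if p["name"] == tracked_image_name), None)
```

Because the match is exact, a renamed prefab silently stops spawning, which is why the naming convention matters in the real project.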

Obtained Skills

  • AR development

  • Wireframing

  • UI design

  • User researching with target audience

  • Communicating and fulfilling client's needs

Check out the project

Gallery

© capnRadu. All rights reserved.

Into The Backrooms

PC / Unreal Engine / 2 months / solo

About

The objective of this elective was to gain a comprehensive understanding of Unreal Engine 5 by creating a project that incorporates interactivity through blueprints. Using one of the standard templates as a foundation, we were free to develop either a game or a movie demonstrating the ability to work with Unreal's visual scripting system.

I decided to use the first-person template to develop a horror game. Initially, I was uncertain about the genre of game I wished to create, and after experimenting with several concepts, none provided the motivation I needed to delve into Unreal Engine. However, remembering the Slender Man game, which I used to enjoy back when it was released, I realized it could serve as an ideal starting point for this project.

Combining the gameplay mechanics of the Slender Man game with the eerie atmosphere of the backrooms map seemed fitting. I also drew inspiration from Dead by Daylight's skill check mechanic. Having previously implemented similar skill check logic in a Unity project focused on game design, I felt comfortable with how the mechanic works, although adapting it to Unreal Engine presented its own challenges.

Gameplay

The player finds themselves trapped in the eerie depths of the backrooms, tasked with collecting seven scattered images to secure their escape. Each image retrieval presents a challenge: players must pass a skill check upon interaction to collect it, and failure means the image respawns elsewhere in the maze, prolonging their escape.

Armed with only a flashlight and their stamina, the player must navigate the dimly lit corridors and manage the battery power as they search for the elusive images. Along the way, limited interactables are scattered through the maze, such as energy drinks for stamina and batteries for the flashlight.

The relentless entity lurking within the maze poses a constant threat: if the player is spotted, it will pursue them until they are caught or until they escape its sight. With danger waiting around every corner, the player must outsmart the entity, master the skill checks, and collect the images to escape the haunting depths of the backrooms.

AI

Animations and Footsteps

The AI has a custom animation blueprint that utilizes two animations from the Unreal Engine Manny character: idle and walk forward.

Using a blendspace, we smoothly transition between these animations based on the entity's speed, determined by the speed variable in the animation blueprint, reflecting the entity's velocity.

Inside the Manny walk forward animation, a custom notify triggers with each step, rather than using the play sound function, to ensure 3D sound accuracy. This custom notify, handled in the animation blueprint, plays the footstep sound at the entity's location, applying attenuation settings for proximity awareness.

The footstep sound itself undergoes modification through attenuation settings and a meta sound source, introducing pitch shift variations for each play. Additionally, the output sound is amplified by 0.5 for enhanced clarity.

Logic

The AI setup includes the main entity blueprint, a blackboard with a behavior tree, and an AI controller used by the main blueprint to run the behavior tree. The behavior tree decides the AI's logic based on the "seePlayer" variable. When "seePlayer" is not set, meaning the AI doesn't see the player, the entity roams the map, pausing three seconds between selecting new random destination points. If the variable is set, meaning the AI sees the player, it starts chasing and, if necessary, kills the player.

The value of "seePlayer" is determined in the main blueprint, which uses the pawn sensing component. In the "on see pawn" event, if the detected pawn is the player, "seePlayer" is set to true. Left like this, the AI would pursue the player forever, because the pawn sensing component has no "on stop seeing pawn" event, so that condition must be assessed manually. During the begin play event, we schedule the "lose player" event to fire every 0.8 seconds, resetting "seePlayer" to false. To summarize: when the entity detects the player, it sets "seePlayer" to true and begins pursuit; the "losePlayer" event then resets it to false, and the cycle repeats. Thus, if the entity genuinely loses sight of the player, the "losePlayer" event fires last and "seePlayer" stays false. To make this transition as unnoticeable as possible, we minimize the pawn sensing component's sensing interval, so "seePlayer" swiftly reverts to true if the entity still perceives the player when "losePlayer" fires.

Additionally, within the "on see pawn" event, if the "has detected player" value is false, a sound cue is played to alert the player that they are being followed, and the value is set to true. The "losePlayer" event resets this value to false. Without this variable, the sound would play every time the "on see pawn" event triggered, in this case every 0.001 seconds.
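The see/lose pattern (a periodic reset plus a one-shot alert flag) can be modeled as a tiny state machine; the class and attribute names below are illustrative stand-ins for the blueprint variables:

```python
class SightSensor:
    """Models the pattern above: 'on see pawn' sets seePlayer true and
    plays the alert cue once per sighting; the periodic 'lose player'
    event resets both flags. If the entity still sees the player, the
    next sensing tick flips seePlayer straight back to true."""

    def __init__(self):
        self.see_player = False
        self.has_detected = False
        self.alerts_played = 0

    def on_see_pawn(self):
        self.see_player = True
        if not self.has_detected:
            self.alerts_played += 1  # sound cue fires once per sighting
            self.has_detected = True

    def lose_player(self):
        self.see_player = False
        self.has_detected = False
```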

Tasks

Regarding the actual tasks, the roam task selects a random position on the map for the entity's movement, while the chase player task directs the entity towards the player's location instead of a random one. The kill player task triggers the "kill player" event from the main entity blueprint, which uses a sphere trace to inflict damage on any intersecting actors. Within the first-person character blueprint, the "anydamage" event responds to the sphere trace damage, playing a sound and displaying the jumpscare widget before restarting the level. To ensure this sequence occurs only once, we use the "wasKilled" variable: it prevents the events from repeating between the event's triggering and the level restart, allowing the necessary delays and animations to unfold without the event being activated multiple times.

Speed and animation rate

The remaining logic deals with the entity's walking speed and animation rate, which vary across three distinct speed modes, each with a corresponding animation rate scale: normal speed uses 300 and 0.6, chasing speed uses 400 and 1, and slowed speed uses 200 and 0.5.

In the roam task, if the entity's current walk speed differs from the designated normal speed, it is reset to that value alongside the animation rate scale. The chase task performs the same adjustments, but with the chase speed and rate scale.

The slowed movement and rate scale are not set by the entity itself, but by the player activating the flashlight. The first-person character blueprint contains the flashlight mechanic: the "slowentity" event is triggered while the flashlight is on, using a sphere trace to apply damage to the entity if it is hit. In the entity blueprint, the "anydamage" event sets the "wasSlowed" variable to true, adjusting the walk speed and animation rate scale accordingly.

In the chase player behavior tree task, the walk speed and rate scale are set to the chasing values only if the entity is not slowed by the flashlight, because no behavior tree task checks whether the entity is slowed. Unlike the roam task, here we also check that "wasSlowed" is false, because the chase task is not continuous: the roam task only fires again once the AI reaches its target destination, but since "seePlayer" briefly flips to false and back to true, the behavior tree reruns the roam task and then the chase task. In the event tick we check whether "wasSlowed" is true and set it to false after a 0.2-second delay. This verifies whether the flashlight is still being pointed at the entity: if it is not, "wasSlowed" ends up false, and the behavior tree tasks restore the expected speed and rate scale values.
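The three speed modes resolve to a simple priority: the flashlight slow wins over chasing, which wins over roaming. A sketch using the values from the description (300/0.6, 400/1, 200/0.5); the function name is made up:

```python
def movement_params(chasing, was_slowed):
    """Return (walk_speed, animation_rate_scale) for the current mode.
    The flashlight slow overrides the chase values, mirroring the
    wasSlowed check performed in the chase task."""
    if was_slowed:
        return 200, 0.5
    if chasing:
        return 400, 1.0
    return 300, 0.6
```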

First Person Character

Sprinting

For the sprinting mechanic, I followed a tutorial that explained how sprinting works. In the tutorial, sprinting could only be triggered by pressing the left shift key and then the W key. Because I wanted the option to initiate sprinting either by pressing left shift first and then W, or W first and then left shift, I had to extend what I learned from the video.

Starting with the first option: after the left shift key is pressed, we first check if the player's movement is active, set isSprinting to true, and check if canSprint is true, which is set when the W key is pressed or released. If so, we can start sprinting: we increase the footstep rate, update the max walk speed to a higher value, and change the camera FOV. When the key is released, isSprinting is set to false, and if canSprint is true, it means we stopped sprinting, so we reset the footstep play rate and the max walk speed, then reverse the camera FOV to its initial value. Starting the sprint by pressing W first and then left shift works exactly the same, but with the isSprinting and canSprint variables interchanged. If sprinting could only start with left shift first, the isSprinting variable would not have been needed.

Besides the ChangeFov timeline, other things happen when sprinting starts or ends. When we start sprinting, we add the stamina bar widget to the viewport and retrigger the drain stamina event every 0.2 seconds. Inside this event, we subtract 3 from the stamina value, clamping it between 0 and 100. If the stamina drops to 0 or below, or the player's movement is not active, then either we ran out of stamina and can't sprint anymore, or the movement is inactive and the stamina should not drain. In either case we set isSprinting and canSprint to false so the event is not retriggered, then redo what would normally happen when the sprint key is released: reset the footstep play rate and max walking speed, reverse the camera FOV, and retrigger the replenish stamina event. We also check if canGasp is true, and if so, play the gasp sound and set the variable to false. When we stop sprinting, we do the same as in the drain stamina event, but with the replenish stamina event.

Besides the retriggerable delay, there is a simple delay so the replenish stamina event starts slightly after sprinting finishes. In this event, we check the values of isSprinting and canSprint. If both are true, sprinting has begun, so we set canReplenishStamina to false so the retriggerable delay doesn't start again. If both are false, we increase the stamina by 2, clamping it between 0 and 100, and check whether the stamina has reached 100. If it has, we set canReplenishStamina to false and remove the stamina bar widget from the viewport if it is on screen. If not, we set canReplenishStamina to true and check whether the stamina is at least 6 and canGasp is not yet true; if so, we set canGasp to true. This ensures the gasp sound plays only once the stamina has recovered above 6. Without this condition, the sound would play every time the player ran out of stamina, so spamming the sprint key would trigger the gasp sound each time.
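The stamina bookkeeping reduces to clamped additions: drain subtracts 3 per 0.2-second tick and replenish adds 2, both held in the 0-100 range. A sketch (function names are illustrative):

```python
def drain_stamina(stamina, amount=3):
    """One drain tick: subtract, then clamp to [0, 100]."""
    return max(0, min(100, stamina - amount))

def replenish_stamina(stamina, amount=2):
    """One replenish tick: add, then clamp to [0, 100]."""
    return max(0, min(100, stamina + amount))
```

Clamping on both sides means neither repeated draining nor repeated replenishing can push the value outside its valid range.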

Flashlight

When the flashlight input (the F key) is pressed, we check if the player's movement is active. If so, we play a sound and check the value of isBatteryEmpty. If it is false, we check whether the spot light is visible. If it is not, the flashlight must be powered on, so we set the spot light to visible and run a sequence of methods: first, the drain battery event with a retriggerable delay, then adding the flashlight bar widget to the viewport, then retriggering the slow entity event.

In the drain battery event, we subtract 1 from the battery, clamping the value between 0 and 100, and check whether the battery has dropped to 0 or below. If it has, we are out of battery, and, similar to the drain stamina event, we redo what also happens when the flashlight is turned off: setting the spot light's visibility and removing the flashlight bar from the viewport if the battery is at 100 and the bar is on screen.

There is also the removeWidget custom event, used by the battery interactable. When we pick up a battery, we set isBatteryEmpty to false and check if the battery is at 100 or above. If so, and if the flashlight bar widget is on screen and the spot light is not visible, we remove the widget from the screen.

Leaning

The leaning input action is triggered by pressing either Q or E. In short, this function stores the target location for the camera. In the select node, there are three options: the first and third store the locations of the left and right lean targets, and the middle one holds the default camera location.

The actual change of camera location happens inside the tick event, where the camera is moved from its current location to the target location by interpolation. Up to this point the leaning would work, but the camera wouldn't rotate to the 45-degree angle; it would stay level. To fix this, we also change the rotation of the controller using interpolation and a rotator: the pitch and yaw come from the controller, while the roll is selected between the rotations of the left and right lean targets.

Depth of field

For the auto depth of field, a custom event triggers a line trace starting from the camera location and extending 10,000 units forward. Based on the location of the hit relative to the camera, we change the depth of field using the aperture post-process setting. To make this work in game, the function is triggered in the begin play event every 0.2 seconds.

Head Bobbing

For the head bobbing effect, I first made three blueprint classes with legacy camera shake as the parent, one for each head bobbing mode (idle, walk, and sprint), each with different camera shake settings. In the first-person character blueprint, the headbob custom event cycles between the three camera shake modes based on the player's velocity: if the velocity is not greater than 0, the player is idle; if it is greater than 0 but less than 401, the player is walking; and if it is greater than 401, the player is sprinting. To make this work in game, the custom event is called in the event tick.
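The mode selection is a pair of threshold checks on the velocity. A sketch follows; since the original description doesn't say how a velocity of exactly 401 is classified, this version folds it into walking:

```python
def headbob_mode(velocity):
    """Pick the camera shake class from the player's speed:
    0 or below -> idle, up to the 401 threshold -> walk, beyond -> sprint."""
    if velocity <= 0:
        return "idle"
    if velocity <= 401:
        return "walk"
    return "sprint"
```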

Interaction

For displaying the interact dot, the tick event first performs a line trace; if the hit actor implements the interact blueprint interface, the interact widget is added to the viewport, and if not, it is removed. Furthermore, if the interact input (the left mouse button) is pressed and the line trace hits something, the interactable event from the blueprint interface is triggered, letting every actor that implements the interface know we are interacting with it.

Heart Beat

To update the heartbeat volume and speed based on how close the entity is to the player, the tick event first checks whether 1 minus the distance to the entity divided by 1500 is greater than 0; we only want to update the heartbeat values when the entity is within 1500 units of the player. If it is, we multiply the value checked in the branch by the maximum heartbeat volume, in this case 13, while clamping the result. We do the same for the heartbeat speed, but with a minimum of 0.5 and a maximum of 1. Once the values are set, the heartbeat sound is played at the heartbeat volume, paced by its speed using the delay node.
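The proximity scaling works out to a normalized closeness factor, t = 1 - distance/1500, mapped onto the volume (max 13) and speed (clamped to 0.5-1) ranges. A sketch with the values from the description (the function name is made up):

```python
def heartbeat_params(distance, max_range=1500, max_volume=13):
    """Return (volume, speed) for the heartbeat, or None when the
    entity is at or beyond max_range and the sound should not update."""
    t = 1 - distance / max_range
    if t <= 0:
        return None
    volume = min(max_volume, max(0, t * max_volume))
    speed = min(1.0, max(0.5, t))
    return volume, speed
```

At zero distance this yields full volume and speed; at half range the volume halves and the speed bottoms out at its 0.5 floor.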

Energy Drinks and Batteries

For the batteries, once we interact with one, we first check if the battery value is already at 100 or above. If it is, we play an error sound indicating that the pickup isn't needed. Otherwise, we add 10 to the battery value (clamping it), play a sound, destroy the actor, and finally call the remove widget event from the first-person character, which checks whether the battery is at 100 or above and, if so, removes the flashlight bar widget and sets isBatteryEmpty to false.

The energy drink blueprint does the same thing, except it adds to the stamina and has no custom event checking whether the stamina is full: stamina replenishes on its own, and the replenish event in the first-person character blueprint already checks its value. Because the flashlight battery does not replenish, a separate event was needed to check the battery value.

Images and Skill Checks

Regarding the collectable images, there are 7 in total. Each image has a blueprint class whose parent is the image collectable blueprint, which handles the setup of the images as well as the skill checks. Each image has two possible spawn points, to allow for replayability. When the player interacts with an image, they get a skill check: passing it collects the image, while failing it respawns the image at the next spawn point. The image collectable blueprint keeps track of the spawn points in the spawnPoints array variable.

In the begin play event, we choose a random value between 0 and 1, since each image has only two spawn points. Then the update spawn point event is called, which cycles through the spawnPoints array and updates the image's location and rotation to those of the chosen spawn point. After this event runs, the actor is set to not hidden, and we call the reference image manager event, which stores a reference to the image manager blueprint.

The image manager blueprint keeps track of the collected images and lets the skill check and the image collectable blueprint communicate with each other. In its begin play, we count how many images are in the game, store the number in the images count variable, and set the total images variable of the image collect widget to that count. The image collect widget is displayed when an image is collected and informs the player how many images have been collected and how many are left. Back in the image collectable blueprint, the interactable event from the blueprint interface first creates the skill check widget.
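
The spawn-point selection and cycling described above can be sketched in C++ as follows; the struct and member names are assumptions standing in for the blueprint's variables and events:

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// Simplified stand-in for an actor transform (location + rotation).
struct Transform { float x = 0, y = 0, yaw = 0; };

// Sketch of the image collectable blueprint's spawn-point logic.
struct CollectableImage {
    std::vector<Transform> spawnPoints;  // two per image in this game
    std::size_t current = 0;
    Transform location{};

    // Begin play: start at a random index (0 or 1), chosen by the caller here.
    void BeginPlay(std::size_t randomIndex) {
        current = randomIndex % spawnPoints.size();
        location = spawnPoints[current];
    }

    // Failed skill check: cycle to the next spawn point, wrapping around.
    void UpdateSpawnPoint() {
        current = (current + 1) % spawnPoints.size();
        location = spawnPoints[current];
    }
};
```

The modulo wrap-around means the same event works unchanged if an image were ever given more than two spawn points.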

This widget is composed of three sub-widgets, each using a material instance, that together form the full skill check: the first circle widget, which is the full white circle; the black section, which moves; and the red section, which is the target the player must hit to pass. In the graph view's preconstruct event, we set the red circle section to a random angle between 0 and 360, store a reference to the image manager, and disable the player's movement and input.

In the tick event, we check whether the right mouse button is pressed; while it is not, we keep updating the position of the black section of the circle. A branch here clamps the black section's angle between 0 and 360.

In the image collectable, where we spawn the skill check widget, we also set the rotation speed and store a reference to the image blueprint, so the skill check knows which image spawned it. We then add the skill check to the screen, play a sound, and disable the image until the result of the skill check is known, because we don't want to update the new spawn point or destroy the image prematurely.

Back in the skill check, when the skill check input (the right mouse button) is pressed, we set the pressed value to true so the black section stops moving, and we check the angle at which it stopped. If the value is within the target bounds, the skill check is passed; otherwise, it is failed. Because the skill check needs to communicate with the image manager blueprint, we use a blueprint interface with a skillcheckresult function that takes a bool indicating the result and an image collectable object reference.
After checking the angle, we call this interface function with the pass value set to the branch result and the instantiator value set to the skill check instantiator, which was stored earlier in the image collectable blueprint when the skill check was spawned. We then re-enable player movement and input and remove the skill check widget.

Back in the image manager blueprint, when the skill check result event fires, we check whether the skill check was passed. If not, we play a fail sound, update the spawn point of the image that instantiated the skill check, and lower the skill check speed, but only if it is not already at its minimum value of 1.5. If it was passed, we play a success sound, destroy the image that instantiated the skill check, increase the skill check speed, and update the collected image count on the collect widget. We also check whether the player has collected all the images and, if so, open the end level.

To explain why the logic is structured this way: the image manager needs to know which image spawned the skill check so it can apply the right settings based on the result. Without the skill check result event, I would interact with an image, spawn the skill check, and fail it, but since there are 7 images in the game, I wouldn't know which image's spawn point to update.
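
The pass/fail test and the speed adjustment can be sketched in C++. The tolerance (the red section's half-width) and the speed step are assumptions for illustration; only the 1.5 minimum speed comes from the description above:

```cpp
#include <algorithm>
#include <cassert>
#include <cmath>

// The skill check passes if the angle where the black section stopped lies
// within the red target section. Angles are in degrees; the comparison
// handles the 360 -> 0 wrap-around via shortest angular distance.
bool SkillCheckPassed(float stopAngle, float targetAngle, float tolerance) {
    float diff = std::fabs(
        std::fmod(stopAngle - targetAngle + 540.0f, 360.0f) - 180.0f);
    return diff <= tolerance;
}

// Image manager response: speed drops on a fail (floored at the minimum of
// 1.5) and rises on a pass. The step size of 0.25 is an assumed value.
float NextSkillCheckSpeed(float speed, bool passed, float step = 0.25f) {
    if (passed) return speed + step;
    return std::max(1.5f, speed - step);
}
```

The wrap-around matters: stopping at 355 degrees should still pass when the target sits at 5 degrees, which a naive absolute difference would miss.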

Obtained Skills

  • Proficiency in Unreal Engine

  • Familiarity with the first person template

  • Blueprint visual scripting

  • Behavior trees and tasks

  • AI logic

  • Horror game development

Check out the project

Gallery

© capnRadu. All rights reserved.