Life Of Bean is a college project which was submitted for my year two FMP. It's a single-player, story-driven FPS game with a focus on visual effects and lighting. The section of the game described and included within this post features a fully voice acted cutscene, a tutorial firing range, and an arena level.
How to access-
Life Of Bean can be accessed exclusively through this site and my GitHub. Links to all of these are below
Github Source - https://github.com/AlfieRichards/Life-Of-Bean
Project Build - https://github.com/AlfieRichards/Life-Of-Bean/releases/tag/v1.0-beta
Cutscene - https://youtu.be/70P5DSOyhZI
Development Process-
Below are the weekly writeups which I made documenting my development process and the various troubles which occurred along the way.
Week 1 6/03/23-
Overview
During this week, my focus was primarily on written work rather than practical implementation. I worked on creating my brainstorming and proposal documents, which will serve as the foundation for my upcoming design document. Following the guidelines provided by my tutor, I started by writing the rationale section of my proposal, where I outlined the concept for my game, how it will be developed, and my motivation behind it. Subsequently, I delved into my brainstorming document, initially assuming it would be a straightforward task. However, upon receiving feedback from my tutor, I realised that it required more effort and had to be elevated to a higher standard. Although it was time-consuming and challenging, I managed to complete it within the week, with only a few sections remaining.
Proposal
For my game concept, I decided to focus on a single-player first-person shooter (FPS) game, drawing inspiration from the vibrant community and following that exists for such games. As an avid player of FPS games for several years, I also sought input from others on what they would like to see in a game. Through research, I explored other popular games within the community, such as "I Expect You To Die VR," which sparked the idea for my project - a single-player story game set in the 1980s spy culture.
Brainstorming
This week, I made progress on my brainstorming document by conducting extensive research on my target audience and demographic. This involved delving into the world of first-person shooters, top-down shooters, and game theories to identify where my game, "Life Of Bean," would fit within the larger landscape of video games. By understanding the preferences of my audience and their expectations, I was able to refine my ideas and make more informed decisions about the direction of the project.
In addition to researching the genre and audience, I also explored different input methods and control schemes. By considering the strengths and limitations of each approach, I was able to determine the best way to design the game mechanics to provide an enjoyable experience for the players. Furthermore, I studied typography, colours and their connotations to ensure that the game's visual elements would be cohesive and consistent, thereby avoiding brand dissociation.
Furthermore, I conducted thorough research on sound effects, music and dialogue, along with their usage in other games. Using this knowledge I researched various pieces of music and sound effects which could be used within my project; these will be used to build an effective soundscape and general auditory experience for my project.
Another critical component of my project is art style. I conducted research into this by studying the various design elements and techniques used in popular spy style games such as the Hitman series. Using this research I have been able to design a visual aesthetic that would help my game stand out and capture the attention of my target audience.
I also worked on my brainstorming document this week, conducting research to further develop my ideas. I delved into the genre of spy games, identifying where my game, "Life Of Bean," would fit within the landscape. Additionally, I researched and compared the differences between top-down and three-dimensional games, analysing how my game would be experienced on these platforms. I also delved into other aspects such as gameplay mechanics and user interface design, drawing inspiration from various games to inform my decisions.
What's Next
In the upcoming week, I plan to continue refining my brainstorming document and start working on my design document. I want to ensure that I make progress on these tasks to avoid being overwhelmed with written work towards the end of the project. By staying organised and adhering to my plan, I aim to create a solid foundation for the development phase of my game project.
Furthermore, I will be focusing on incorporating feedback from my tutor and peers to further improve my proposal and brainstorming documents. I will also continue researching relevant game mechanics, art styles, and storytelling techniques to enhance the overall concept of my game.
One aspect that I will be prioritising in my design document is the game's story and narrative. As a single-player story game, the plot and characters play a crucial role in engaging players and immersing them in the game world. I will be developing a detailed storyline, and creating well-defined characters with distinct personalities and arcs. Additionally, I will be exploring various storytelling techniques, such as cutscenes, dialogue, and interactive elements, to deliver an engaging and memorable narrative experience for players.
Another important aspect of my design document will be the gameplay mechanics. As an FPS game, the gameplay needs to be engaging, challenging, and enjoyable. I will be researching and experimenting with different gameplay elements, such as combat mechanics, level design, and player progression systems, to create a seamless and immersive gameplay experience. Additionally, I will be considering the game's difficulty curve, pacing, and replayability factors to ensure that players are motivated to continue playing and exploring the game world.
In addition to the story and gameplay mechanics, I will also be focusing on the visual and audio aspects of the game in my design document. I will be researching and experimenting with different art styles, colour palettes, and visual effects to create a visually appealing and cohesive game world. Similarly, the audio elements, such as background music, sound effects, and voice acting, will contribute to the overall immersion and player experience. I will be carefully selecting and integrating appropriate audio elements to enhance the game's atmosphere and storytelling.
Furthermore, I will be considering the user interface (UI) and user experience (UX) design in my design document. The UI needs to be intuitive, easy to navigate, and visually appealing. I will be researching and designing UI elements, such as menus, HUDs, and interactive elements, that are user-friendly and enhance the overall gameplay experience. Additionally, I will be focusing on the UX design, considering factors such as player feedback, player guidance, and accessibility, to ensure that the game is enjoyable and accessible to a wide range of players.
Week 2 13/03/23-
Overview
During this week, my focus was again primarily on written work rather than practical implementation. I worked on creating my design document and schedule, which will serve as the foundation for my upcoming practical project. The design document is what I will follow throughout my project to guide me on the practical work which I do. This will serve as a point of reference to make sure that I am staying on track with the project overall and that I continue to meet the conditions set by the brief. Alongside the design document, I also created a detailed schedule outlining the tasks that need to be completed and deadlines for each task. This schedule will help me manage my time and ensure that I stay on track with the project. I also made sure to factor in additional time for unexpected issues or changes that may arise during development.
Design Document
This week I both began and completed my design document, which will guide my practical project and serve as the basis for it. I first started creating this by evaluating some of the research in my brainstorming document and coming to conclusions. The first section which I completed was on my Target Market and Demographic. Here, after analysing all of my previous research, I concluded that I would market my product towards the age range of 16-24, on the PC platform, for keyboard and mouse. I also then thought of several ethical considerations which I would need to pay attention to whilst developing my practical project, such as stereotyping, violence, and player data usage.
Following this I then began writing up sections about the overall gameplay of my project. Here I broke down the player's controls for keyboard, mouse and controller, splitting them into sections with short explanations of what each will do. I made sure to use common control layouts for keyboard, mouse and controller so that my game is intuitive to play for anyone with prior experience playing FPS titles on the PC platform. After completing the sections on Player Input I began making weapon designs and deciding what would be available to the player. I researched various weapons platforms and systems and concluded what would fit my project the most, along with coming up with statistics for these which can be used within my weapons system when it's made. These include values such as damage, fire rate, magazine size and reload time. Next I planned out the appearance, statistics and functionality of my enemies, drawing inspiration from various other sources and games such as Killer Bean.
After completing these sections on gameplay I began designing my levels and the overall layout of my game. To do this I first loaded up an image processing application and began vaguely sketching out ideas on how a firing range might look within my project. To aid with this I took inspiration from real life firing ranges and their appearance, along with my prior knowledge of how they work and the designs of them. Using all of this I was able to produce a rough design for my initial levels which I could follow to produce a scene for my game. I then continued this approach for the second section of my game which is intentionally similar to the first as it still takes place in the same firing range. Following this I began designing the final section of my game. This is the large arena at the end where the player will have to test their newfound skills against waves of enemies to beat the game. For this I initially drew inspiration from various real life counterparts such as indoor Airsoft fields, and then began my same process of designing in the image processing application.
Once these designs were completed I moved onto the next sections of my design document which would all focus on the visual appearance and aesthetic of my game. First I drew conclusions from the initial colour palettes proposed in my brainstorming document. One set of colours stood out to me in particular as the correct match for my project so I decided that these would be the colours I use. I also researched into the various connotations of these colours which will help to decide how I apply them in the practical. After this I concluded my prior research into fonts and picked out some of the ones which I think suited my project best. For this I picked several fonts to make sure that all potential bases for my project were covered. This meant that whenever I needed a font such as for a larger body of text, title or subtitle I would have one ready which fitted the theme of my project. This section was relatively easy and quite enjoyable as I personally like looking into various typography and it's one of my favourite elements of design.
Next I decided what render pipeline I would be using for my project. This is significant as in my chosen engine (Unity) there are several different render pipelines each with their own different set of abilities and tools within them. For this I decided to use Unity's High Definition Render Pipeline as to me it is the most visually appealing and I think it would be best suited for my project due to the range of tools which it gives me. These include functions such as scriptable shaders and shader graphs. These can be used to create complex materials which were not previously possible in other pipelines. Another great functionality is the particle system which can be used to effectively and dynamically create large amounts of physics based particles within a scene at a time without significantly reducing the overall performance.
I also then looked into post processing and the various elements of it which could be applied to my project. One of the ones I focused on primarily was bloom. This allows you to over-expose certain sections of a scene, creating a sort of glow around them and adding to the overall visual appearance of a project. This is especially effective in areas with bright lights such as flashlights or fire, and looks excellent on reflective objects too, such as a bullet casing. Other post-processing options include depth of field and motion blur. These will likely be used within my project to add depth to the overall visual appearance, however they will need to be carefully considered due to their potential performance impact overall.
Finally, after completing these sections I drew conclusions on the audio sections of my game. First I decided on a music genre which my game would primarily be using, along with selecting a couple of pieces of music which match the game's aesthetic and vibe. After this I also decided on various locations where sound effects would be used within my project, such as gunshots and reloading sounds. These can all be sourced from a website called StoryBlocks, which I found whilst conducting my initial research for this project. Within this section I also wrote out the initial script for the cutscene and gameplay of my project. This will be submitted to various voice actors so that my game can have fully functional dialogue and story, especially within cutscenes. This script also features stage directions which will be used when animating and designing the large cutscene which my project begins with.
What’s Next
Next week I plan to begin my practical work. I will start by creating various key systems such as player movement, audio systems, and other core functionality of my game. I will also go over the written work which I have already done with my tutor to make sure that it is of adequate quality. These systems will likely take up the entire week due to their complexity and the time it takes to complete and debug them. They will also be developed completely from scratch for this project, as I don't have any previously made systems which I would be happy using here.
Furthermore I will be working on various 3D models for my project if I get the time. First I will be making a fully rigged model for the main character, Agent Bean; this will be used in all cutscenes and gameplay, so it is important that I get it done soon. I also plan to model the initial firing range which will be used for the first two rooms the player is in. All this should be relatively quick due to the simplistic design of my game's characters and the relatively simple model of a firing range which I will be making.
Week 3 20/03/23-
Overview
This week I focused primarily on my practical work for the first time since starting this project. I made several sophisticated systems which will be used throughout the rest of my project such as my weapons system, audio system and player movement system. These will be used in combination with various other scripts and objects within my project to make a functional game which fits my brief.
Thanks to my quick progress on these I also had time to work on various 3D models such as the initial firing range, player model, and weapon model. These were completed relatively quickly thanks to the large amount of motivation which I had during this time. I also got a large amount of work done on the cutscene section, which is why I chose to divert from my schedule and do some of the scripts now. I figured that in the end they were more important and should be prioritised over the visuals of my game.
Practical Project
The first thing I decided to do was my weapon and debug systems. These will be extremely important throughout my project so it makes sense to prioritise them over anything else at the moment.
Weapons System
Functions-
My weapons system needs to have some key functionality to make it usable in my game. First is the basic functionality which lets it work as a weapon:
Raycast Firing
Set ammo count
Set magazine size
Reloading
Running out of ammo
Then there are several less integral but still important functions such as:
FireRate limit
Different fire modes
Animations
Camera shake
Damage
And finally some features which are just nice to have like:
Sound
Modularity
Ease of use
Ease of customisation
For the core functionality I began by making an extremely simple weapons system using raycast firing and with all the key features I set out above. This was done by using some simple code from the Unity documentation on input methods and raycasts.
Below in Figure 1 is the code for getting the user input. I decided to use Input.GetButton to allow this script to be much more versatile with input methods. This allows me to easily implement controller support at a later date within my project. You can also see some of the code which was added later to work out what fire mode the player is using.
Figure 1 - A simple piece of code to get player input and fire the weapon.
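A rough sketch of this kind of input check is below; the button name, fire-mode field and method names are placeholders chosen for illustration rather than the project's exact code:

```csharp
using UnityEngine;

public class WeaponInput : MonoBehaviour
{
    public bool isAutomatic;   // fire mode, set in the inspector (placeholder name)

    void Update()
    {
        // Input.GetButton stays true for every frame the mapped button is held,
        // which suits fully automatic fire and works with any device bound to
        // "Fire1". GetButtonDown only fires on the initial press, for semi auto.
        bool wantsToFire = isAutomatic ? Input.GetButton("Fire1")
                                       : Input.GetButtonDown("Fire1");

        if (wantsToFire)
            TryFire();
    }

    void TryFire()
    {
        // Ammo checks and the raycast itself live elsewhere in the weapon script.
        Debug.Log("Fire requested");
    }
}
```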
After this I made a simple ammunition system. This will mean that the player has a set amount of times they can fire the weapon before they need to reload it. This reload will then subtract from the player's total ammunition as is shown in Figure 2.
Figure 2 - This function handles reloading of the player's weapon. First it checks to see if the player has reloaded without a full magazine and that they have spare ammo. If they do, the script works out how many bullets would need to be added to the player's current mag. It does this by subtracting the current amount from the mag size set by me in the inspector. It then checks to see if you actually have enough spare ammo to fill the magazine. If you do it adds the required spare ammo to the mag and subtracts it from your remaining spare ammo. If you don't have enough to fill the mag but do have some spare ammo it will add it onto your current mag and then set your spare ammo to 0. The flowchart used when coming up with this process is shown in Figure 3.
Figure 3 - The flowchart which was used to design the reload functionality of the weapons system. The code for this is above in Figure 4.
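A minimal sketch of the reload logic described in Figure 2 is below, assuming inspector-set fields named magazineSize, currentAmmo and spareAmmo (names chosen for illustration, not necessarily the project's own):

```csharp
using UnityEngine;

public class WeaponAmmo : MonoBehaviour
{
    // Values set in the inspector (placeholder names).
    public int magazineSize = 30;
    public int currentAmmo = 30;
    public int spareAmmo = 90;

    public void Reload()
    {
        // Only reload if the magazine isn't already full and there is spare ammo.
        if (currentAmmo >= magazineSize || spareAmmo <= 0)
            return;

        int needed = magazineSize - currentAmmo;

        if (spareAmmo >= needed)
        {
            // Enough spare rounds to fill the magazine completely.
            currentAmmo += needed;
            spareAmmo -= needed;
        }
        else
        {
            // Not enough to fill it: add what's left and empty the reserve.
            currentAmmo += spareAmmo;
            spareAmmo = 0;
        }
    }
}
```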
Finally I made the piece of code shown in Figure 5 which would actually fire the weapon. This will fire a raycast from the player's aimpoint if they have ammunition remaining in their magazine, and subtract one round from it.
Figure 5 - This code fires the weapon whenever the player has enough ammo to do so and is not currently reloading.
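A sketch of that basic firing step might look something like the following, building on the ammo fields above; the camera reference, range value and field names are assumptions:

```csharp
using UnityEngine;

public class WeaponFire : MonoBehaviour
{
    public Camera playerCamera;     // the player's view camera (placeholder reference)
    public float range = 100f;
    public int currentAmmo = 30;
    public bool isReloading;

    public void Fire()
    {
        // Don't fire while reloading or with an empty magazine.
        if (isReloading || currentAmmo <= 0)
            return;

        currentAmmo--;

        // Raycast from the centre of the player's view.
        if (Physics.Raycast(playerCamera.transform.position,
                            playerCamera.transform.forward,
                            out RaycastHit hit, range))
        {
            Debug.Log($"Hit {hit.collider.name}");
            // Damage and impact effects would be applied here.
        }
    }
}
```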
Once all this core functionality was completed I could begin adding some of the more sophisticated sections. The first thing I wanted to do was limit the rate at which the player can fire their weapon, along with adding semi and fully automatic modes; this was done using the code in Figure 6.
Figure 6 - This code is a modification of the original way in which I fired the weapon. Now with the addition of a check to see if the weapon is set to automatic or semi automatic. The most important part of this is to not stop trying to fire the weapon until the fire button is released. This allows fully automatic weapon fire.
To integrate a delay between shots which works even while the weapon is semi automatic I invoke a shot delay function which will prevent the weapon from being fired again until a set amount of time has passed. This prevents usage of things such as auto clickers.
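A sketch of how that delay might be wired up with Invoke is below; the button name, delay value and field names are placeholders rather than the project's exact implementation:

```csharp
using UnityEngine;

public class WeaponFireControl : MonoBehaviour
{
    public bool isAutomatic = true;
    public float shotDelay = 0.1f;   // minimum time between shots, in seconds

    private bool canFire = true;

    void Update()
    {
        bool wantsToFire = isAutomatic ? Input.GetButton("Fire1")
                                       : Input.GetButtonDown("Fire1");

        if (wantsToFire && canFire)
        {
            canFire = false;
            FireShot();
            // Re-enable firing after the delay. Because the gate applies even to
            // individual presses, spamming the trigger (or an auto clicker)
            // can't exceed the intended fire rate.
            Invoke(nameof(ResetShotDelay), shotDelay);
        }
    }

    void FireShot() { /* raycast and ammo handling go here */ }

    void ResetShotDelay() => canFire = true;
}
```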
After this I was playing a game and noticed that weapons fire out of the player's head and not the gun itself. This makes sense in a way, as the player will always hit where they are actually looking, however it's quite unrealistic as the weapon could be completely obstructed yet still hit where the player is looking. This is currently how the weapons work within my project, which I have now realised is going to become an issue as my game's level design progresses, along with going against my focus on realism. To fix this issue I made an entirely new method of firing the weapon, as is shown in Figure 8, with an illustration of the issue I have been having in Figure 7.

Figure 7 - This is a simple illustration of the issue which I have been having with weapons. If I were to fire the weapon whilst using head raycasts I would hit the target just fine, even though my weapon is completely obstructed. With my new system this is prevented and the weapon will just hit the box.
Figure 8 - This new method of firing the weapon uses 2 raycasts and a linecast to decide where the weapon will hit and if it will hit where it's aiming. I think that this is a much better system than the usual firing out of the head as that is extremely unrealistic. First a raycast is fired from the head as usual, this will be used as the shot location if the weapon is unobstructed. Next a physics linecast is used to see if there is a clear line from the front of the weapon to where that first raycast hit. This prevents the player from shooting through things if they have a line of sight. If there are no obstructions the player will shoot from their camera. If there is an obstruction there will be another ray cast in a straight line from the front of the weapon and that will be where it hits instead.
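A sketch of that two-raycast-plus-linecast approach is below. The camera and muzzle references are assumed to be assigned in the inspector, and the small offset on the linecast end point is just there so the check doesn't count the target's own surface as an obstruction:

```csharp
using UnityEngine;

public class ObstructionAwareFire : MonoBehaviour
{
    public Camera playerCamera;   // the player's view camera
    public Transform muzzle;      // empty transform at the front of the weapon
    public float range = 100f;

    // Returns where the shot lands, or null if nothing was hit at all.
    public RaycastHit? FireRay()
    {
        // 1. Ray from the player's view to find the intended hit point.
        if (Physics.Raycast(playerCamera.transform.position,
                            playerCamera.transform.forward,
                            out RaycastHit viewHit, range))
        {
            // 2. Check for a clear line from the muzzle to that point, backing the
            //    end point off slightly so the hit surface itself isn't counted.
            Vector3 toHit = viewHit.point - muzzle.position;
            Vector3 end = viewHit.point - toHit.normalized * 0.01f;

            if (!Physics.Linecast(muzzle.position, end))
                return viewHit;   // unobstructed: shoot where the player is looking
        }

        // 3. Obstructed (or nothing in view): fire straight out of the muzzle instead.
        if (Physics.Raycast(muzzle.position, muzzle.forward, out RaycastHit muzzleHit, range))
            return muzzleHit;

        return null;
    }
}
```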
Debugging System
Once this was completed I began work on my debugging system. Though simple this allows me to debug any problems which may occur in the future with relative ease even in project builds. The debugging system is opened with a simple set of keybinds; one to unlock the mouse and one to show the GUI. The code for these is shown in Figure 9. Once the debugging GUI is open the user can select what they want to see on screen with buttons along with seeing their current framerate. This is shown working in Figure 10 and was made using the code in Figure 11.
Figure 9 - This code is used to open and close the debugging GUI along with toggling the cursor state. The section above is duplicated 4 times for opening and closing the GUI and locking and unlocking the mouse.
Figure 10 - These are images of the debugging controls and the various information it can give the user. The debugging controls can be used to toggle on and off sections of the GUI along with viewing your framerate. More sections will be added as the project progresses to help with any potential problems that arise with things such as enemy AI or audio playback.
Figure 11 - The top section of code is used for the buttons, it uses the OnGUI function which I found from the Unity documentation. It uses this to draw a box and then overlap it with a button used to toggle the other windows. The code for one of these other windows is below and shows how text can be written to the screen using GUI.Label.
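A stripped-down sketch of this kind of overlay is below; the keybinds, window layout and FPS read-out are placeholders rather than the project's exact debugging GUI:

```csharp
using UnityEngine;

public class DebugOverlay : MonoBehaviour
{
    private bool showGui;
    private bool showFps;

    void Update()
    {
        // Toggle the GUI and the cursor lock with (assumed) keybinds.
        if (Input.GetKeyDown(KeyCode.F1))
            showGui = !showGui;

        if (Input.GetKeyDown(KeyCode.F2))
        {
            bool unlocking = Cursor.lockState == CursorLockMode.Locked;
            Cursor.lockState = unlocking ? CursorLockMode.None : CursorLockMode.Locked;
            Cursor.visible = unlocking;
        }
    }

    void OnGUI()
    {
        if (!showGui) return;

        // A box with a button overlapping it, used to toggle the other windows.
        GUI.Box(new Rect(10, 10, 160, 90), "Debug");
        if (GUI.Button(new Rect(20, 40, 140, 25), "Toggle FPS"))
            showFps = !showFps;

        // One of the toggleable windows, writing text to the screen with GUI.Label.
        if (showFps)
            GUI.Label(new Rect(20, 70, 140, 25),
                      $"FPS: {1f / Time.unscaledDeltaTime:F0}");
    }
}
```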
Modelling And Animation
This week I also designed, modelled and animated sections of my game's initial cutscene and characters. First I created reference sheets for some of the characters in my game, such as the protagonist, Agent Bean. I also modelled Carlos (a coffee bean), and their reference sheets can be found below in Figure 12.
After this I began modelling the Beans in Blender. I first created a base bean from which my other bean models would derive; this base bean is the one used for Agent Bean himself. Once Agent Bean had been completed and rigged in Blender, with some help from the Rigify tool and the Blender documentation, I used copies of his model to easily change clothing and other aspects of my characters without the need to completely remodel and re-rig all of them, which would have been extremely time consuming had I not made them in this way.
Figure 12 - These are the initial designs for two of the main characters within my project. Agent Bean, and Carlos. I spent a few hours developing ideas for these and drawing inspiration from various characters in other games along with outfits that I saw online. A good example of this is shown in Figure 13 with Cappuccino bean from the short animated film Killer Bean, which heavily inspired my own bean Carlos.
Figure 13 - Cappuccino bean from the short animated film Killer Bean
https://youtu.be/qyYHWkVWQ4o
Once all of these various designs were completed I began modelling my beans. I started by making the distinct capsule shape and then adding arms and legs. Once this was completed I made the initial rig using Rigify and manual weight painting so that I could test it would all work correctly. Following this I made an outer shell in sculpt mode which would be the bean's clothing; this was then rigged in the same way. Finally I added everything together with some textures to create a finished rigged bean. This process is shown in Figure 14 and is identical to the one used to create Carlos bean shown in Figure 15.
Figure 14 - A step by step breakdown of the modelling process for Agent Bean. Pictured on the left is the original capsule body, with the clothing layer in the middle, and the final model on the right.
Figure 15 - Agent Bean (pictured on the left) and Carlos Bean (pictured on the right) standing in an elevator together.
What’s Next
Next week I plan to model and animate a CutScene portion of my project. This will be a driving scene. I also plan to model the main room which the player will be in and plan the rest of the scenes in the game. Finally I will make the targets in the rooms actually swing when they are moved which should be relatively easy to implement. I will also likely make one or two test renders of the Beans to make sure that they look how I want them to along with learning more about rendering in Blender.
Conclusion
I'm really pleased with how this week unfolded. I successfully completed all the tasks on my agenda and handled my time management skillfully, despite a minor reshuffling of priorities. By adhering to my design document and effectively managing my time, I believe I am making significant progress towards meeting the requirements of the project brief.
Week 4 27/03/23-
Overview
This week I focused mostly on the modelling and animation side of my project. I managed to get all of the first section of my cutscene modelled, animated and rendered. I also modelled and rigged the weapons for the player, did two test renders, and modelled the first section of firing range for my project.
Furthermore, I continued the week by bringing the firing range into the game and setting it up to work with lighting, reflections and more. However, this led to me getting extremely stuck on one issue which I still have yet to fix due to time constraints and a drive failure limiting the progress I can make this week.
Modelling And Animation
Agent Bean
First I began by rigging and weight painting Agent Bean. This was a relatively simple endeavour; however, his non-humanoid shape made the clothes particularly challenging to rig.
To do all this I followed several online tutorials and guides from various sources such as YouTube, blogs and the Blender community Discord server. After I had learnt vaguely how to do it, the model was relatively easy to rig. With some trial and error I was able to get everything working, and the only place I really had any issues was with the clothing deformation. This was a struggle since I had rigged the body separately from the clothing, making it extremely difficult to get the mesh to deform correctly. However, after a day or two of trying I managed to solve this issue for the most part, only needing to remove sections of the collar from Agent Bean’s suit.
Weapons
Once this was completed I began modelling a weapon for the player. This would be the M16A4 which I had planned to use in my design document. This was actually quite easy as I have a good knowledge of weapons and their functionality. The main difficulty here was getting some of the more complicated geometry on the lower part of the weapon, as the complex curves can be quite challenging to make. I also ran into one issue with this model when testing it, being that the faces on the magazine and grip did not show in Unity. After around an hour or two of experimenting I found out that the faces were non-triangular n-gons, which Unity couldn't work out how to render, so it just decided to cut them instead. To fix this I just had to select these faces in Blender and triangulate them in the mesh settings.
Test Renders
Once all this rigging, weight painting and modelling was done I then decided to do a few test renders to make sure that everything was ready to be used for more complex animations and posing. The first one of these was of the weapon: a simple render of the weapon in a crate in a nicely lit scene. This render is shown below in Figure 16 and led to me realising several issues with the weapon. First, I had modelled the grip incorrectly and it required more curvature around the edges. I also realised that the barrel and handguard were far too long and required shrinking and resizing to look correct. Once all of these issues were fixed I was able to create the render below.
Figure 16 - This is a simple render of one of the weapons that will be used by the player in my project. This was quite easy to make due to my experience and knowledge with weapons platforms and I think the render came out really nicely.
Once this was done I decided I should do a slightly more sophisticated render making use of Agent Bean so that I could test his rigging and weight painting. This render was also to be done in a small section of the firing range that the player would be in.
First I modelled the firing range shown in Figure 17 for the player using the designs in Figure 18. This was extremely easy, as it's just a hollow box with some minor detailing, and it was able to be produced very quickly. This was completed with no issues due to its simplicity and allowed me to quickly continue with my work on the render. I began by posing Agent Bean in a suitable location for the image shown in Figure 19. Once this was done I positioned the weapon in the scene and posed the bean's hands around it, before adding some detailing such as ejected shells and a muzzle flash. Once this was all done I did a render along with research into Blender's compositing functionality and how to add post processing effects such as bloom. For this I used this YouTube video https://youtu.be/Tu3U6wD7lu4 which effectively explained the different ways of adding bloom and some of the basics of the compositing layer. This all led to the final product shown in Figure 19.
Figure 17 - A screenshot of the scene whilst it was in development. I had just posed Agent Bean and am about to add the weapon into the scene.
Figure 18 - The designs for the various levels within my project with the firing range on the left.
Figure 19 - The final render. I am extremely happy with this and I think it looks excellent. The overly intense bloom really complements the game's cartoon aesthetic and emphasises my focus on visual aesthetics. Text saying "Bean Industries" was later added to the wall in Photoshop.
Cutscene
This week I also designed, modelled, animated and rendered the first section of my games introductory cutscene. This features an intense driving sequence with various effects such as smoke, fog, and tire tracks.
First I found a car model which would be suitable for this scene. I found this Bugatti Vision GT model available on Sketchfab which I could use royalty free. (https://sketchfab.com/3d-models/bugatti-vision-gt-caafea640b7d4b4fb26a16a2ce072181) I then heavily modified this model to suit my needs by changing the colours, interior, and exterior in various ways such as making the doors and wheels functional for animation which can be seen in Figure 20.
Figure 20 - A side by side comparison of my altered Bugatti model which was used in the cutscene compared to the original.
Once the car model and bean were all prepared for the scene I planned the route which they would take and modelled the terrain around it. For this I used the bezier curve tool in blender to create a smooth path for the car to follow, I then shaped the terrain in a way that meant the section the car would drive on would always be flat. This meant that I didn't have to worry about any potential issues with it clipping through the floor.
Once this was done I hand animated the path which the car would take along the route which I had planned, making sure that the car handled in a believable and realistic way. This path is shown in Figure 21 along with an image of the terrain.
Figure 21 - The path which I planned for the car to follow. On the right is an image of the terrain showing the flattened section for the car.
Once this was all done I had to figure out how to do various effects such as dust and tire tracks on the ground. For the tire tracks I researched various ways which I could do them and came to the conclusion that I could use a particle emitter on each wheel which would drop a tire track object. For the dust I decided against using volumetric smoke due to my lack of experience with it and the increased rendering time it would bring. Instead I decided to use a metaball based system with particle emitters again. This actually ended up working much better than I initially thought that it would as can be seen in Figure 22.
Figure 22 - A still frame from the final render of the driving cutscene. The lighting looks excellent thanks to the volumetric fog throughout the scene, especially when combined with the quite effective smoke effect and the tire tracks (circled in red).
Practical Project
After all this work on cutscenes, animation and modelling was done I decided to get back to work on the actual game itself. First I imported the firing range model into the game's scene and configured the lighting, shadows and reflections as can be seen in Figure 23.
Figure 23 - An in editor screenshot of my project once everything was imported and set up.
Lighting, Shadows and Reflections
Since I am using HDRP for my project I have quite a lot of freedom when it comes to post processing and lighting within my project. For now due to time constraints I will be doing a simple set of lights and reflection probes.
For the lighting in my scene I first made the physical lights emissive for the bloom effect; however, these don't actually contribute much lighting to the scene, so I put an area light underneath them and just used the emission for effect. These were used in combination with a light probe system, which covers the area where the player can move to give them almost the effect of real time lighting without the performance hit. All of my lights are baked and static for efficiency and performance. I also have reflection probes scattered throughout the scene, allowing reflective materials such as metals to look correct.
What’s Next
Next week I plan to do some changes and detailing. I will be making an upgraded version of an audio system which I made around two years ago. The main difference is that I will be adding support for 3D audio and emitting sound from objects in the scene, along with adding dynamic reverb at a later date. Along with this I will also be doing some simple changes to the scene, such as making the targets movable and making it so that they swing when they do. I will also be adding Discord integration to my project so that when you are playing it, it shows in your Discord status.
These are all relatively simple and short changes compared to the previous week, however some of them will actually be quite complicated. I'm currently unsure how to make the targets swing, so that will likely take quite some time to figure out.
Conclusion
I'm satisfied with the outcome of this week. I successfully completed all the tasks I had planned and maintained a good handle on my time, even though some tasks were done in a different order. By adhering to my design document and managing my time effectively, I am confident in meeting the expectations outlined in the project brief.
Weeks 5 and 6 3/04/23 - 10/04/23
Overview
This week I plan on implementing several small changes such as making my audio system, making targets move, and adding Discord integration. These should all be relatively simple to do and give me most of the week free to polish other things which I've done so far, and have a little break from the chaos which was last week.
Week 5
So this week started off brilliantly: I turned on my computer and noticed everything was extremely slow. After some digging I found out my main SSD, with everything on it, was running at around 0.3 MB/s compared to the 400 MB/s it was meant to be at. Realising this was the end of this SSD, I turned off my computer and waited for a new hard drive to be delivered, which took around three days. I figured that the SSD likely had limited read and write cycles left and I didn't want to risk any more damage occurring by using it in this state.
Once the hard drives had arrived I installed them and then booted off of a live Linux USB so I didn't have to install an OS. Using this I copied all my files from my secondary SSD onto a hard drive. Once this was done I reset that secondary SSD and installed Windows onto it, which took around a day in total. Finally, after that was all done and updated, I copied everything from the broken SSD onto the secondary one and the hard drive, and then duplicated one hard drive onto the other so that I had a full backup. This whole process took up my entire week and was extremely stressful, meaning I could do no work on my project this week. Luckily it was already going to be quite a laid back week with not much project work planned, but it does now mean next week will be even more of a rush.
Audio Manager
Within my audio manager I need several pieces of core functionality: 3D audio, one-shot audio played on the player, looping background sound, sound which persists across scenes, and ease of use.
Sound Class
First I began by making a serializable class which will be used for each sound in my project. This class will contain various information about the sound such as its name, volume, pitch, the mixer it plays on, etc. To do this I looked into some simple sources on how to do these classes such as this tutorial. https://tij.medium.com/c-back-end-store-data-as-class-objects-and-store-in-a-list-and-how-to-reuse-those-data-8361faaf79c2
This made it really easy to understand what I had to do so that I could implement it within my own project as is shown in Figure 24.
Figure 24 - The sound class used by my audio system, this holds all the relevant information related to playing a sound and will be stored in a list in the audio manager.
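A sketch of what such a class might contain is below; the exact fields in the project's own Sound class may differ:

```csharp
using UnityEngine;
using UnityEngine.Audio;

[System.Serializable]
public class Sound
{
    public string name;                 // used to look the sound up by name
    public AudioClip clip;

    [Range(0f, 1f)]   public float volume = 1f;
    [Range(0.1f, 3f)] public float pitch = 1f;

    public bool loop;                   // e.g. for background music
    public AudioMixerGroup mixerGroup;  // which mixer group the sound plays through

    [HideInInspector] public AudioSource source;  // assigned by the audio manager at runtime
}
```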
Audio Manager
The audio manager is the next component of my audio system; this will handle the importing and playing of any sounds within my project. I began by making it a singleton: since this audio manager will be persistent across scenes, there is potential for issues where multiple copies of it could exist, and the singleton prevents that from happening. Next I needed to iterate through all the potential sounds which it could play. For this I made a public list of Sounds which uses the Sound class I made before; this stores all of the potential sounds for the audio manager to play. I then added a foreach loop that iterates through all of these and creates a new AudioSource component for each one, setting its various values to the information from the Sound class. I did one of these loops for sounds and one for music, as is shown in Figure 25.
Figure 25 - The loops iterating through sounds and music in my audio management system.
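A sketch of that setup, reusing the Sound class sketched earlier, might look like this; the singleton pattern and field names are assumptions about the implementation:

```csharp
using UnityEngine;

public class AudioManager : MonoBehaviour
{
    public static AudioManager Instance;

    public Sound[] sounds;
    public Sound[] music;

    void Awake()
    {
        // Singleton: destroy duplicates so only one manager survives scene loads.
        if (Instance != null && Instance != this) { Destroy(gameObject); return; }
        Instance = this;
        DontDestroyOnLoad(gameObject);

        // Create an AudioSource for every entry and copy its settings across.
        foreach (Sound s in sounds) CreateSource(s, s.loop);
        foreach (Sound m in music)  CreateSource(m, true);   // music is assumed to loop
    }

    void CreateSource(Sound sound, bool loop)
    {
        sound.source = gameObject.AddComponent<AudioSource>();
        sound.source.clip = sound.clip;
        sound.source.volume = sound.volume;
        sound.source.pitch = sound.pitch;
        sound.source.loop = loop;
        sound.source.outputAudioMixerGroup = sound.mixerGroup;
    }
}
```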
After this I began on my PlaySound function. This will play sounds using the audio sources made earlier. It does this by first finding the sound we are looking for, searching by its name using Array.Find (a static method on System.Array). If the sound is not found, an error message is logged to the console. If the sound is found, it runs the .Play() function on the corresponding audio source. I also made a PlayMusic function, shown in Figure 26, which does pretty much the exact same thing but for music.
Figure 26 - The playsound function in my audio manager, there is also a play music one which is essentially the same.
Once this was completed I finally added a StopAll function; this allows me to stop every single source currently playing audio, which is extremely useful when switching between scenes. I also added a StopSpecific function, shown in Figure 27, which allows me to stop specific sounds from playing based on their names. This will be useful for changing music mid-scene or something like that.
Figure 27 - The function in my audio manager for stopping individual sounds based on their names, this allows you to instantly stop any currently playing sound.
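Continuing the AudioManager sketch above, the play and stop functions described here might look roughly like this (Array.Find needs `using System;`); they are illustrative rather than the project's exact code:

```csharp
// These methods sit inside the AudioManager sketch above; add `using System;`.
public void PlaySound(string soundName)
{
    Sound s = Array.Find(sounds, sound => sound.name == soundName);
    if (s == null)
    {
        Debug.LogError($"AudioManager: sound '{soundName}' was not found.");
        return;
    }
    s.source.Play();
}

public void StopAll()
{
    // Stop every source this manager created, e.g. when switching scenes.
    foreach (Sound s in sounds) s.source.Stop();
    foreach (Sound m in music)  m.source.Stop();
}

public void StopSpecific(string soundName)
{
    Sound s = Array.Find(sounds, sound => sound.name == soundName);
    if (s != null && s.source.isPlaying)
        s.source.Stop();
}
```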
Discord Integration
To add Discord integration into my project I first needed to make a Discord application. To do this I opened the Discord developer portal, logged in with my account, then made a new application and made sure to set it up to be used as a status activity and not as a Discord bot. I then added a brief description, name and cover image.
Once this was done I began researching the Discord API to try and find out what I would need to do for my project. I found from the Discord developer documentation (https://discord.com/developers/docs/intro) that I would need to import the Discord API into my project. This was quite easy and I just downloaded the files I needed and stuck them in my project.
Next, I started making a script. I began by making it a singleton: since this Discord manager will be persistent across scenes, there is potential for issues where multiple copies of it could exist, and the singleton prevents that from happening. Next I logged my script into the Discord application that I made earlier. I did this by making a new instance of Discord, shown in Figure 28, which I had referenced at the top of my script.
Figure 28 - The small section of code which lets me communicate with the Discord application that I made earlier
Once this was done I could begin adding some functionality. The first thing I did was make it so that the object would be destroyed if Discord was not running. This prevents it from running unnecessarily in the background and stops any potential issues where Discord wouldn't be found. I then made an UpdateStatus function, with help from the documentation, which will be the main functionality of my system.
The UpdateStatus function first makes a new Discord activity and gets the current Discord activity manager; it then passes in the various information needed to display a status (image and text). I then use the activity manager to display this current activity. If this fails it will log a warning message stating that it could not connect to Discord, and it will then destroy itself again for the same reasons as before.
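A sketch of how this could be wired up with the Discord Game SDK's C# wrapper is below. The client ID, image key and status text are placeholders, and the flag and exception handling are assumptions about the setup rather than the project's exact code:

```csharp
using UnityEngine;

public class DiscordManager : MonoBehaviour
{
    public static DiscordManager Instance;

    private const long ClientId = 0;   // placeholder: the application ID from the developer portal
    private Discord.Discord discord;

    void Awake()
    {
        // Singleton so only one manager survives scene loads.
        if (Instance != null && Instance != this) { Destroy(gameObject); return; }
        Instance = this;
        DontDestroyOnLoad(gameObject);

        try
        {
            discord = new Discord.Discord(ClientId, (ulong)Discord.CreateFlags.NoRequireDiscord);
        }
        catch (Discord.ResultException)
        {
            // Discord isn't running: remove the manager rather than keep retrying.
            Destroy(gameObject);
            return;
        }

        UpdateStatus("In the firing range");
    }

    void Update() => discord?.RunCallbacks();   // the SDK needs this called every frame

    public void UpdateStatus(string details)
    {
        var activityManager = discord.GetActivityManager();
        var activity = new Discord.Activity
        {
            Details = details,
            Assets = { LargeImage = "cover" }   // placeholder asset key uploaded to the application
        };

        activityManager.UpdateActivity(activity, result =>
        {
            if (result != Discord.Result.Ok)
            {
                Debug.LogWarning("Could not connect to Discord.");
                Destroy(gameObject);
            }
        });
    }
}
```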
Swinging Targets
Next I want to make the targets in my scene movable and make it so that they swing around realistically when I move them. This was quite the process to figure out and took me the entire rest of the week.
First I had to make a system so that the player could interact with objects in the scene. This was extremely easy, as all I had to do was draw a ray from the player's camera and see if the object it collided with had a specific tag. If it does, I just run a function on the script attached to that object. For this interaction script I made a simple update loop, shown in Figure 29, checking to see if the object had been interacted with; if it had, it would run the desired functions.
Figure 29 - The basic update loop to check and see if the object had been interacted with. I check to see if it's a target script as this interaction script is designed to be universal and work for multiple completely different types of objects.
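A sketch of this interaction check is below; the key, tag name, range and the TargetController type it calls into are placeholders chosen for illustration (the target script itself is sketched after Figure 30):

```csharp
using UnityEngine;

public class PlayerInteraction : MonoBehaviour
{
    public Camera playerCamera;
    public float interactRange = 3f;

    void Update()
    {
        if (!Input.GetKeyDown(KeyCode.E))
            return;

        // Ray from the player's camera; only react to objects carrying the expected tag.
        Ray ray = new Ray(playerCamera.transform.position, playerCamera.transform.forward);
        if (Physics.Raycast(ray, out RaycastHit hit, interactRange)
            && hit.collider.CompareTag("Interactable"))
        {
            // Tell whatever script is on the object that it has been interacted with.
            var target = hit.collider.GetComponent<TargetController>();
            if (target != null)
                target.Interact();
        }
    }
}
```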
Next I wrote the target control section; this moves the target between two user-set positions. This was really simple and just uses the code in Figure 30, with Vector3.MoveTowards moving the target parent towards one of two points.
Figure 30 - The block of code used to move the target between two user defined locations. It's extremely simple, robust, and easy to adjust.
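Paired with the interaction sketch above, the target mover might look something like this; TargetController, pointA and pointB are placeholder names, with the real script simply moving the target parent between two inspector-set points:

```csharp
using UnityEngine;

public class TargetController : MonoBehaviour
{
    public Transform pointA;
    public Transform pointB;
    public float moveSpeed = 2f;

    private bool movingToB;

    public void Interact()
    {
        // Each interaction flips which point the target heads towards.
        movingToB = !movingToB;
    }

    void Update()
    {
        Vector3 destination = movingToB ? pointB.position : pointA.position;
        transform.position = Vector3.MoveTowards(transform.position,
                                                 destination,
                                                 moveSpeed * Time.deltaTime);
    }
}
```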
Now that this is all done I need to make the target swing somewhat realistically so it looks a bit better. My first thought on how I'd do this was by using a cloth simulation. After extensive research into how cloth simulations work in the Unity Docs I managed to set a simple one up on my target; it swung perfectly as it was moved and I was surprised at how easy it was to implement.
However, when I shot the target I wanted there to be a hole in it. I did this by instantiating a prefab on the target which makes a hole in it. The issue with this is that when the cloth swung it left the holes behind, floating in space, as is shown in Figure 31.
Figure 31 - Hole prefabs floating in space nearby the cloth which they are meant to be anchored to.
I now know that this is because cloth simulations don't actually move the object. The cloth's position and rotation never change, but the individual vertices on it are deformed and moved. This is all well and good, but it means that when I make the holes a child of the cloth they just float nearby. Within Unity you can get the transforms of the individual vertices to anchor the holes to them, however this is also problematic. The vertices are all in the corners as it's just a bevelled plane, and without repeatedly subdividing the entire thing, which would make the simulation much more computationally expensive, this can not be done. Not to mention it's rather complicated. This method was suggested to me in Figure 32 by a member of the Unity Discord server.
Figure 32 - A member of the Unity support Discord server explaining the vertex method to me.
When I explained that this wouldn't work they then suggested in Figure 33 that I could use decals and a custom shader to make this happen. This would prove to be an absolute nightmare.
Figure 33 - The same user suggesting that I use decals using the HDRP decal projector and custom shaders.
Now the main issue with this is the incomprehensible complexity of programming your own shaders. Unity completely remade their shader system in HDRP, meaning that code from the legacy pipeline isn't remotely transferable. Not to mention that the decal projector requires a Unity decal shader and will complain if you do anything else. My way around all this was to make an exact copy of the decal shader and then rewrite it, largely through trial and error, so that it created holes. Somehow I managed to adapt an old method of doing this in the legacy pipeline to almost work using my own custom programmed shader. However it was incredibly inconsistent and difficult to use, so I decided to abandon this idea too.
After wasting a day on this I finally decided to do my own research and discovered hinge joints; surely these would be the solution to all of my problems. Thanks to a different member of the Unity community who pointed me in the right direction I was finally able to make targets that swing in my intended way. After some explanation shown in Figure 34 I was told to just “Make a bunch of Rigidbodys in a line and connect them all with hinge joints”.
Figure 34 - Me explaining what I wanted using a few diagrams which I made. The green line here is a target, when it moves in the direction of the red arrow it swings. The blue dots on it are objects which I need to stay stuck to whilst it moves (holes).
After this user's albeit vague explanation I was able to make a chain of hinge joints which swung my target when the top one was moved. This was absolutely perfect. However, there would be issues again. In this attempt I'd had no idea how to actually move a hinge joint and had found my own strange way of doing it, shown in Figure 35, by exploiting what can only be described as a bug in Unity.
Figure 35 - In this code I move the top hinge joint in a very strange way. I found that when I moved the connected anchor in the inspector it would move the top hinge joint, so logically I concluded that I would just do this in script. The first few lines just work out the difference between the anchor's current location and where I want it to go, and then move it accordingly. The issue was that this did nothing when I tried it in game (probably because you really are not supposed to do this). I then naturally found a way round this issue by enabling and disabling autoConfigureConnectedAnchor on the hinge joint. From what I can tell this forces a reset on the joint, making it completely recalculate its location, and somehow this worked perfectly and accomplished everything that I wanted it to. This was excellent news and I let myself rest now that this was finally all done.
Except, it's still broken. After doing a test build I found this solution worked perfectly in the Unity Editor, but in builds it did not work at all. I'm still unsure why this happened, but I figure it's something to do with resetting autoConfigureConnectedAnchor; perhaps this isn't exposed to my scripts in builds? I was completely back to the drawing board. I knew hinge joints were still the route to go, however I had no idea how to use them.
I questioned this in the Unity Discord and had a pleasant exchange with a member there who told me to look in the Player.log file to see what was happening. I asked them if this was in the build and where it could be found, to which they politely replied “ffs. google Unity log files”. I did exactly this, ignoring their snideness, and after researching the Unity Logs page there was nothing said about logging or debugging in builds or built projects. I then questioned their response, saying that the issue I had was in the build not in game, and yet again got a rude response followed by “do you think I am telling you this for the good of my health?”. A different person then tried to help me by linking me to the relevant documentation, to which the rude person then complained that “he shouldn't need you to google that for him”. Finally, after some more useless back and forth, another person interjected and explained to me that the Player that the documentation speaks of is actually the built game. This is extremely strange considering there was no mention of this on the Unity documentation for logging. I commented on how odd it was that it wasn't mentioned in the documentation and the same rude person gave me a great response of “there is a link at the bottom of the page you can click to whinge about it”. I decided to follow their advice and leave a report on the documentation explaining it was a little vague when explaining how to debug builds. In conclusion this whole 3 hour long experience was completely useless and the logging made no difference.
After once again consulting the Unity Discord a few hours later, and getting some suitably useful responses from people complaining about the way I formatted my code and telling me that I don't know how to read, I finally got a useful answer. The same person that told me to use hinge joints originally responded by saying that I should just move the top one's rigidbody. Somehow I hadn't considered this the entire time.
Following this I totally redid all of my code to move the target by just moving the top joint's rigidbody; however, as usual, there were issues. Whenever I moved the top joint the target would just spin around in a circle and get very confused.
I decided to consult the only person who had actually helped me throughout this entire process with the message shown in Figure 36.
Figure 36 - I am trying here to politely ask the same person for help who's helped me with everything else, expecting a response or just to have to wait until they're free. Naturally this was too much for the wonderful people of the Unity Discord and a random person interjected with a sarcastic response telling me to “don't ping if you are sorry 😛”. After this useful exchange I gave up and managed to figure out the issue myself. It turns out joints behave quite strangely and I would need a third joint; my issues with this are illustrated below in Figure 37.
Figure 37 - This diagram shows the different hinge locations and the effect which they had when moving the target. Image one is with only two hinge joints, with the top and bottom set to kinematic; this made the paper not move at all. Image three has three hinge joints, one in the middle and two on either end. Logically I thought this would work, but apparently not, as it just spun around in the middle and didn't move. Following this I realised the middle joint was the pivot point, so I moved it up to the top like in image two and finally got the desired result. This now all works perfectly in editor and in builds and is finally done.
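A sketch of the final approach is below: a kinematic rigidbody at the top of the hinge chain is moved directly, and the jointed bodies below it swing from the physics simulation. The field names and the MovePosition-based movement are assumptions about how this could be wired up rather than the project's exact code:

```csharp
using UnityEngine;

public class SwingingTargetMover : MonoBehaviour
{
    public Rigidbody topJointBody;   // kinematic rigidbody at the top of the hinge chain
    public Transform pointA;
    public Transform pointB;
    public float moveSpeed = 2f;

    private bool movingToB;

    // Called by the interaction script to flip the destination.
    public void Interact() => movingToB = !movingToB;

    void FixedUpdate()
    {
        Vector3 destination = movingToB ? pointB.position : pointA.position;
        Vector3 next = Vector3.MoveTowards(topJointBody.position,
                                           destination,
                                           moveSpeed * Time.fixedDeltaTime);

        // MovePosition on a kinematic rigidbody moves it through the physics step,
        // so the hinge-jointed bodies hanging below it react and swing naturally.
        topJointBody.MovePosition(next);
    }
}
```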
What's Next
Next week I will be taking a break from Unity to focus on the second part of my cutscene, this will all need to be planned and animated within the week. I will also be fixing a few small issues with my player movement and animations, along with making a few changes to the games branding by adding a logo, splash screen, and title to the Unity project. I’d also like to add a feature that lets me conditionally hide public values in the inspector. It's not the most useful but would make some scripts much easier to work with.
Conclusion
I'm feeling really positive about this week's progress. I accomplished everything on my to-do list and maintained a well-structured schedule, even though the order of tasks shifted slightly. By following my design document closely and efficiently managing my time, I believe I am on track to meet the project brief.
Weeks 7, 8 and 9 17/04/23 - 24/04/23 - 01/05/23
Overview
My goal for this week is to make the entire second half of my cutscene; this includes animating it, modelling it, rendering it, and editing it. I expect this to take the entire week, and any spare time will be spent on making some pieces of branding such as the logo and splash screen for my game. I will also need to refine the script, plan all of the animation paths, and finally design the soundscape for the entire cutscene.
Scriptwriting
Before I could do anything on my cutscene I needed to make sure that my script is up to date and suitable for the project. The final version of this script is below in Figure 38.
Figure 38 - The final script for this cutscene. The green colour represents cutaways, lilac is stage directions, bold blue text is Carlos, blue text is Agent Bean, and finally yellow represents the first part of the cutscene.
This script was quite challenging to make, as I had to capture a mix of comedy and seriousness in my game. I first wrote a script focusing primarily on comedic effect and then took away large amounts of this to overlay a more serious tone. An example of this is the juxtaposition of Carlos seriously saying that many people have gone missing and Agent Bean showing no expression at all. I think that in the end I got this balance about right, and the player learns all the basic information that they need to know about the game and their objectives.
Once this initial writing was all done I began finding voice actors. Luckily I am friends with Pete Brown of Microsoft’s audio department, who was able to voice Carlos for me. He also provided several useful pointers, further refining the script to make it sound more natural and easier for him to deliver. Once this section was done I got a friend with a suitable voice to record Agent Bean’s line. This was my first time getting other people to voice act in a project of mine and it was definitely something new. I had to tailor the script to better suit their voices and ways of speaking, along with tidying up the audio which they sent me; these were all things which I hadn't had to do before.
Movement paths
Once everything was voice acted and ready to go I began planning the cutscene movement paths. These are the routes which the beans would walk through the scene and the speeds at which they would move. To do this I first planned out roughly how the scene would look and then drew out various sections of it. I made sure to make the paths cross in ways which meant the characters would not collide, as is shown in Figure 39.
Figure 39 - One of the pieces of work I did when planning this section of the cutscene. This is the path which the Beans will take throughout the underground section of Bean Industries. The green path is the path of Carlos, while the Lilac one is the path of Agent Bean. There is one other bean in the scene for detail which is represented in orange.
Animation
Once everything was planned and my audio was ready I began animating the actual scene. This would prove to be extremely challenging due to my lack of experience and several things would need to be learnt throughout the week.
The first thing I did was make a walk cycle for the beans. When I previously modelled them I made sure they all used the same rig and weight painting, allowing them to be easily animated with a shared animation set. I began making the walk cycle by finding an image of a suitable human gait cycle online, as is shown in Figure 40.
Figure 40 - The human gait cycle reference image used to animate the beans.
Once I had this gait cycle I traced the movement by modifying the rotations of the bean's rig in Blender to create an effective gait cycle. This was definitely not perfect, primarily due to my lack of animating experience, but for now at least it is definitely adequate.
To use this animation within my cutscene I had to use Blender's Non-Linear Animation editor (NLA) along with its Dope Sheet and Timeline. The only one of these tools I had any experience with was the Timeline, which was unlucky since I would barely be using it for this animation.
With the NLA I could easily add animation clips to the objects within my scene and blend them together in a visually appealing way. This was rather complicated to learn and understand, especially because of one key issue. At home I was on the latest version of Blender, 3.5, where the scale functionality in the NLA worked as intended and scaled animation clips by speeding them up or slowing them down to fit within the desired frames. This was all fine until I went to college, where the computers used Blender 3.4.1. It turns out that specifically in version 3.4.1 (https://projects.blender.org/blender/blender/issues/101130) this scale functionality is almost completely broken: instead of changing the speed, it just changes the frames at which the animation starts and ends. This meant all of my animations were cut off halfway through when scaled, completely breaking the loop. The only solution was to install an up-to-date version of Blender on the college computers.
Once this first section of the walk cycle had been completed I had a pretty decent grasp on methods of animation in Blender. Though it had been difficult to learn, I was now able to make simple looping animations quite quickly. Next I had to do several custom animations such as Agent Bean pressing the button in an elevator shown in Figure 41 or the vault door opening shown in Figure 42. The elevator button one was extremely simple as it was just a singular hand move, in comparison to the door opening which took me a full day to animate due to its complexity.
Figure 41 - Agent Bean pressing one of the elevator buttons by moving his hand and depressing the button. I had to make sure that the button itself would move in sync with his hand to press it in a realistic way.
Figure 42 - Agent Bean opening the vault door towards the start of the cutscene. This was extremely challenging to animate as I had to make everything move in sync with the door handle spinning, bars retracting, and the arm moving.
Once this was all done it was basically just repeated walk cycles and camera movements throughout the scene. The only other thing with its own custom animation was a firing scene. This was actually quite complicated to do as I had to work out how to animate particle volumes effectively. For this I keyframed the density of the smoke and the flame of the gun's muzzle flash shown in Figure 43, along with a light at the same time to add effect to the flash.
Figure 43 - The muzzle flash in question mid animation. The scene is illuminated by the light with the muzzle flash itself not actually contributing to the scene's lighting.
Editing and sound
Once all this animation was done, rendered and ready to go, I began the next stage: editing and producing the final cutscene. I started by importing all of my media into Premiere Pro and then began compositing the scene. First I ordered all of the clips on my timeline and trimmed them accordingly, then added various transitions and cuts to composite them together effectively. Once this was done I added the voice acting: I imported all of the audio, set it up to play at the correct points, and trimmed it so that it fitted correctly. With all of this done, I began composing the soundscape for my cutscene.
Finally I put everything together to produce a cohesive, concise and completed cutscene to use in my project.
Figure 44 - A screenshot of the timeline in Premiere Pro; the audio sections at the bottom overlap to create a detailed and dynamic soundscape.
Branding
I wanted to make some cohesive branding for my project such as a logo and a splash screen. For the logo shown in Figure 45 I used a section from an earlier test render and cropped it down and tidied up the edges. This was really easy and only took me around 10 minutes to complete.
Figure 45 - The logo in photoshop for Life Of Bean.
Once this was done I made the splash screen shown in Figure 46. I wanted to follow a similar style to the default Unity one, which I can't remove without buying a licence. I first laid out my text and worked out where I wanted everything to go, then made a new render of Agent Bean with a transparent background. For this I used the same test render, hid all of the background, and changed the bean's pose.
What's Next
The main things left to do now are writing up my test plan, correcting some issues in my earlier written work, fixing a few minor in-game issues, implementing enemy AI and animations, implementing menus, and a few more things. I'm getting a lot closer to finishing now and will be focusing much more on making sure that my written work is perfect.
Conclusion
I'm quite pleased with how this week turned out. I successfully completed all the tasks I had planned and effectively managed my time, albeit with a slight rearrangement of priorities. By adhering to my design document, I feel confident that I am meeting the requirements of the project brief.
Week 10 8/05/23
Overview
For this week, my main focus will be developing the enemy AI and designing the main menu for my game. Given the complexity of my plans for the menu, I anticipate it taking a substantial amount of time to create. As a result, I expect these two tasks to occupy the entirety of the week.
The implementation of the enemy AI is a critical aspect of my project's overall gameplay. I will focus on designing intelligent and dynamic behaviours for the enemies, ensuring they provide engaging and strategic encounters for the player. I will also be modelling and animating them in a visually appealing manner which fits their intended design.
Simultaneously, I will be putting a lot of effort into making a user friendly and visually appealing main menu for my game. This menu serves as the player's entry to the game and it will be the first thing they see when beginning the game. I plan to design a visually captivating menu that reflects the game's aesthetic and theme, while also ensuring intuitive navigation and functionality. Creating an immersive and polished main menu will contribute to a positive first impression for players and enhance their overall experience.
Enemy AI
I decided to work on the enemy AI first. I will be splitting this into sections to make it easier to manage my time and deliver an effective system: first the models, then the animations, pathfinding, and finally the combat systems.
Models
First I decided to make a 3D model for the enemy AI using Blender. I decided to use the player model shown in Figure 47 as a base and then modify it to suit my enemy design.
Figure 47 - The player model for my game. This model is very simple and I have already rigged it and prepared it to be used within the Unity game engine. This should make it simple to adapt for any other models which I need to make for my game.
Once I had prepared this base for my model I began altering it to match my designs shown in Figure 48. These designs make it easy for me to know the changes which must be made to the original model.
Figure 48 - The design for the enemy which I have made this week. They feature a camouflage outfit and will be carrying an M16 in game as was planned in my design document.
Using these designs I began making my model. The first thing I did was change the textures on the clothing to match my designs; the main changes here were the jacket, shirt and trouser textures. Once this was done I added a beret-style hat to complete the aesthetic, as is shown in Figure 49.
Figure 49 - The model for an enemy bean within my game. Featuring camouflage clothing, a beret style hat, and an M16 rifle.
Animations
Now that the model was done I could begin work on rigging and animating it. Luckily I have a friend who animates weapons in games professionally and could show me how to do this (https://karsonjulien8.wixsite.com/karsonj). I started by rigging the gun to be used in animation. For this I set up a system of bone parenting and inverse kinematics so that everything can be moved with ease. There is one bone which moves the entire gun; the enemy's hands follow it using inverse kinematics, so wherever I put the gun the arms and hands of the enemy will follow, as is shown in Figure 50.
Figure 50 - The selected bone here moves the entire weapon and also causes the hands to follow the weapon.
Once this was done I added a secondary bone to control the weapon's magazine. This is also a child of the bone which moves the entire weapon; however, moving this bone separates the magazine from the weapon, as is shown in Figure 51.
Figure 51 - The green bone allows me to move the magazine, and the left hand will follow it. This makes it extremely easy to create weapon animations without disturbing the location of the actual weapon, and overall makes the whole animation process much easier.
Once this was all set up I began animating the various movements for the bean. For now these are relatively simple, primarily due to the limited time I have to work on everything this week. Thanks to preemptively adding inverse kinematics to my rig, it's now very easy to animate this model in a natural and realistic way.
Pathfinding and Combat
Now that all of the modelling and animation is done I can begin working on the actual movement and functionality of my enemy. For this I will be using Unity's NavMesh functionality, which allows me to easily implement realistic enemy agent movement within my game. Using this component I can define areas where the enemy can and can't path, and update these dynamically to deal with moving objects such as doors or vehicles. I will be using this in combination with a simple state machine system, which lets me easily get the enemy to do different things and blend between them.
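To give a clearer picture of what I mean by a simple state machine, below is a minimal sketch of the kind of structure I'm describing; the enum, field names, and attack range are illustrative rather than taken from my actual script.

```csharp
using UnityEngine;
using UnityEngine.AI;

// Illustrative sketch: a NavMeshAgent driven by a three-state machine.
[RequireComponent(typeof(NavMeshAgent))]
public class EnemyController : MonoBehaviour
{
    private enum State { Patrolling, Chasing, Attacking }

    [SerializeField] private float attackRange = 8f; // illustrative distance

    private NavMeshAgent _agent;
    private State _state = State.Patrolling;
    private Vector3 _lastSeen; // last position the player was spotted at

    private void Awake() => _agent = GetComponent<NavMeshAgent>();

    // Called by the sight check when the player is spotted.
    public void ReportSighting(Vector3 position)
    {
        _lastSeen = position;
        _state = State.Chasing;
    }

    private void Update()
    {
        switch (_state)
        {
            case State.Patrolling:
                Patrol(); // wander to random NavMesh points (sketched after Figure 52)
                break;
            case State.Chasing:
                _agent.SetDestination(_lastSeen); // head to where the player was last seen
                if (Vector3.Distance(transform.position, _lastSeen) < attackRange)
                    _state = State.Attacking;
                break;
            case State.Attacking:
                Attack(); // roll for a hit and damage the player (sketched after Figure 54)
                break;
        }
    }

    private void Patrol() { /* covered in the patrol sketch below */ }
    private void Attack() { /* covered in the attack sketch below */ }
}
```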
The first state I decided to make was the random patrolling functionality of my enemy AI. This allows it to walk around within a set area, checking that the player isn't there. To do this I use Unity's NavMesh.SamplePosition function, which I found in the API documentation, to pick a random point on the NavMesh within a range and path to it. The code for this is shown in Figure 52.
Figure 52 - This code allows me to pick a random point on the AI NavMesh within a range and path the enemy to it. It also draws a debug ray at that location so that in the editor I can easily tell what the AI is trying to do.
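Since Figure 52 is only a screenshot, here is a rough sketch of the patrolling approach it shows, assuming a NavMeshAgent on the enemy; the variable names and patrol range are placeholders rather than my exact code.

```csharp
using UnityEngine;
using UnityEngine.AI;

// Sketch of random patrolling: snap a random point onto the NavMesh and path to it.
[RequireComponent(typeof(NavMeshAgent))]
public class EnemyPatrol : MonoBehaviour
{
    [SerializeField] private float patrolRange = 15f; // illustrative radius

    private NavMeshAgent _agent;

    private void Awake() => _agent = GetComponent<NavMeshAgent>();

    private void Update()
    {
        // Only pick a new destination once the current one has been reached.
        if (!_agent.pathPending && _agent.remainingDistance <= _agent.stoppingDistance)
        {
            // Random point around the enemy, snapped onto the NavMesh.
            Vector3 candidate = transform.position + Random.insideUnitSphere * patrolRange;
            if (NavMesh.SamplePosition(candidate, out NavMeshHit hit, patrolRange, NavMesh.AllAreas))
            {
                _agent.SetDestination(hit.position);
                // Debug ray so the chosen point is visible in the Scene view.
                Debug.DrawRay(hit.position, Vector3.up * 2f, Color.green, 2f);
            }
        }
    }
}
```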
Once this was done I began on the next state, chasing. The idea is that when the enemy sees the player, it begins to chase them by pathing to wherever it last saw them. To do this I use the code in Figure 53 and a simple trigger box on the enemy representing everything it can see. Whenever the player is within this box and the enemy has a line of sight to them, the enemy switches to chasing.
Figure 53 - This code is used for detecting the player. Whenever the player is within the sight collider, the AI checks whether it has an unobstructed line of sight to them, using a Unity Linecast to make sure it isn't trying to path through a wall. Once this check passes, the _lastSeen variable is assigned and the enemy paths to it. One of the interesting things about this AI is that it has no reference to the player at all and only ever knows where it last saw them. This prevents any potential issues with it somehow pathing to the player through walls and allows the player to actually escape and get away.
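As the figure itself isn't reproduced here, below is a hedged sketch of how a sight check like this can be written; the "Player" tag and the obstacle layer mask are assumptions, not necessarily what my scene uses.

```csharp
using UnityEngine;

// Sketch of the sight check: the trigger collider on the enemy acts as its
// field of view, and a Linecast rules out anything solid in the way.
public class EnemySight : MonoBehaviour
{
    [SerializeField] private LayerMask obstacleMask; // walls and other sight blockers

    public Vector3 LastSeen { get; private set; }
    public bool PlayerSpotted { get; private set; }

    private void OnTriggerStay(Collider other)
    {
        if (!other.CompareTag("Player")) return;

        // Only react if nothing solid sits between the enemy and the player.
        if (!Physics.Linecast(transform.position, other.transform.position, obstacleMask))
        {
            LastSeen = other.transform.position; // remember where the player was, not the player itself
            PlayerSpotted = true;                // the state machine can now switch to chasing
        }
    }
}
```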
Now that the chasing code is done I will add in the attacking code as is shown in Figure 54. This is really simple as it just needs to deal damage to the player.
Figure 54 - This code first checks to see if the random number is equal to 2. This means that the enemy only has a 1 in 3 chance of hitting the player. Once this is done it checks that the player still exists, that it can see the player, and that it's definitely hitting the player. When this is done it just subtracts some health from the player and invokes a reset to add a suitable delay between each shot.
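For completeness, here is a rough sketch of shooting logic along the lines Figure 54 describes; the damage value, delay, and the minimal PlayerHealth stand-in are all illustrative rather than my actual code.

```csharp
using UnityEngine;

// Minimal stand-in for whatever health script the player really uses.
public class PlayerHealth : MonoBehaviour
{
    public int health = 100;
    public void TakeDamage(int amount) => health -= amount;
}

// Sketch of the attack: a 1-in-3 hit chance, a visibility check, then a delay.
public class EnemyAttack : MonoBehaviour
{
    [SerializeField] private int damagePerHit = 10;        // illustrative values
    [SerializeField] private float delayBetweenShots = 1.5f;
    [SerializeField] private LayerMask obstacleMask;

    private bool _canShoot = true;

    public void TryShoot(PlayerHealth player)
    {
        if (!_canShoot || player == null) return;

        _canShoot = false;
        Invoke(nameof(ResetShot), delayBetweenShots); // suitable delay between shots

        // Integer Random.Range excludes the upper bound, so (0, 3) returns 0, 1 or 2:
        // checking for a single value gives the 1-in-3 hit chance described above.
        bool hit = Random.Range(0, 3) == 2;
        bool visible = !Physics.Linecast(transform.position, player.transform.position, obstacleMask);

        if (hit && visible)
            player.TakeDamage(damagePerHit);
    }

    private void ResetShot() => _canShoot = true;
}
```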
All of this code combined makes up the enemy AI for my game. It is very simple at the moment; however, it's more than adequate for the task at hand and is capable of doing everything I wanted it to do.
Main menu
For my main menu I have decided to make a primarily 3D environment and only use Unity UI assets for the buttons and overlays. This will give my game a unique artistic flair, along with helping me effectively fit my brief and appeal to my target audience.
Models
First off I began by modelling the basic scene. For this I first modelled a wall, desk and computer monitor as is shown in Figure 55. This took me about a day to do due to having to find suitable textures and come up with a design that suited my vision.
Figure 55 - This shows the computer desk and computer stand in Blender.
Once this was done I began modelling the paper files that will open and close to show all of the settings information. This was really easy as they are essentially just two planes that I merged together in Blender.
Animations
Animations were pretty simple as all I needed to do was make the folders open. The main difficulty was rigging and weight painting them so that the paper did not clip through the folder when it was closed. After researching similar issues other people had on Stack Overflow, I found that using a boolean union to join the paper to the folder would fix my issues. This combines both meshes into a single connected mesh and makes everything much easier. Once this was done I was able to rig, weight paint, and animate the folders to get the end result in Figure 56.
Figure 56 - The rigged and weight painted folders ready to be animated.
Functionality
Now that this is all done I can import it into Unity and make the menu actually functional. First I added text to the monitor; this is how the player navigates the menu. Then I added buttons onto each file, which gradually fade in once the files are opened. Once these base parts were in I began writing the code. First I made a system to navigate the menu through the text on the computer screen: you use the arrow keys to go up and down, and Enter to select. Whenever you press the up or down arrows an integer is incremented or decremented; this integer is then checked and used to open the correct page with the code in Figure 57.
Figure 57 - This code opens the corresponding menu by playing the required opening animations and fading in the text.
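As a rough illustration of this navigation system, the sketch below iterates an index with the arrow keys and opens a page on Enter; the page names and the Animator trigger naming are placeholders rather than my actual setup.

```csharp
using UnityEngine;

// Sketch of the keyboard navigation: arrow keys move an index, Enter opens the page.
public class MainMenuNavigation : MonoBehaviour
{
    [SerializeField] private string[] pageNames = { "Play", "Settings", "Quit" };
    [SerializeField] private Animator folderAnimator;

    private int _selectedIndex;

    private void Update()
    {
        // Iterate the index with the arrow keys, wrapping at either end.
        if (Input.GetKeyDown(KeyCode.DownArrow))
            _selectedIndex = (_selectedIndex + 1) % pageNames.Length;
        if (Input.GetKeyDown(KeyCode.UpArrow))
            _selectedIndex = (_selectedIndex - 1 + pageNames.Length) % pageNames.Length;

        // Enter opens whichever page the index currently points at.
        if (Input.GetKeyDown(KeyCode.Return))
            folderAnimator.SetTrigger("Open" + pageNames[_selectedIndex]);
    }
}
```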
Once this was all done I made it so that the buttons would actually change the corresponding settings. This was really easy as I have made main menus prior to this and could for the most part just copy the code from them. Finally I made everything save using PlayerPrefs, so that the user's settings persist and stay in sync with the pause menu settings as well.
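A minimal sketch of what this PlayerPrefs layer can look like is below; the key names and default values are illustrative. Supplying a default to each getter is the kind of thing that avoids settings starting at zero on a fresh install.

```csharp
using UnityEngine;

// Sketch of the settings persistence: reads and writes go through PlayerPrefs,
// and each getter supplies a fallback default so a fresh install never starts
// with zeroed values.
public static class SettingsStore
{
    public static float Sensitivity
    {
        get => PlayerPrefs.GetFloat("sensitivity", 1f);   // 1f is the default
        set { PlayerPrefs.SetFloat("sensitivity", value); PlayerPrefs.Save(); }
    }

    public static int TargetFps
    {
        get => PlayerPrefs.GetInt("targetFps", 60);       // 60 fps by default
        set { PlayerPrefs.SetInt("targetFps", value); PlayerPrefs.Save(); }
    }
}
```

Because both menus read and write the same keys, the main menu and the pause menu naturally stay in sync.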
What's Next
Next week I need to finish everything that's left. This will be the pause menu, arena level, and anything else that comes to mind when testing. After that I will be doing the initial testing, getting my test plan done, and doing the evaluation.
Conclusion
Overall I think this week went really well. I've got everything done that I wanted to and have managed to stay on schedule, albeit in a slightly different order. I've effectively stuck to my design document and managed my time well, and I think I'm in a good position to meet the brief.
Week 11 and 12 15/05/23 - 22/05/23
Overview
During these weeks, my main focus will be making the main level for my game, the pause menu and fixing any issues which may come up. I will dedicate my time to developing the arena level, where the core gameplay will take place. This will involve designing the layout, creating interactive elements, implementing enemy AI, and fine-tuning the overall experience. Building a compelling and engaging arena level is crucial to provide players with an exciting and immersive gameplay environment.
In addition to level development, I will also work on creating a pause menu. The pause menu serves as an essential feature in providing players with options to adjust settings, resume and quit the game. Designing a user-friendly and intuitive pause menu is important to enhance the overall user experience and ensure seamless gameplay transitions.
Finally, testing will play a significant role in my project this week. I will assemble a team of testers who will provide valuable feedback and help debug the game. This collaborative effort will assist in identifying areas that require improvement, optimising performance, and ensuring the game meets the desired standards of quality.
Pause Menu
The first thing that I decided to work on was my pause menu. To do this I made the minimalistic UI shown in Figure 58, which fitted the designs laid out in my design document.
Figure 58 - This pause menu is very simplistic and features a clean and aesthetically pleasing design whilst delivering all of the relevant information and options to the player.
Once this was all done I next implemented all of the code for the pause menu. This was really easy as it was essentially the same as the code used within my main menu, with the only real changes being the ones shown in Figure 59.
Figure 59 - This is a section of the code which controls my pause menu, the main difference between this and the main menu is the addition of a sensitivity slider along with options to return to the main menu.
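As a small sketch of those two differences, the snippet below wires a sensitivity slider straight to PlayerPrefs and adds a return-to-main-menu handler; the PlayerPrefs key and the scene name are placeholders rather than my actual values.

```csharp
using UnityEngine;
using UnityEngine.SceneManagement;
using UnityEngine.UI;

// Sketch of the pause-menu additions: a sensitivity slider saved to PlayerPrefs
// and a handler that returns to the main menu scene.
public class PauseMenuExtras : MonoBehaviour
{
    [SerializeField] private Slider sensitivitySlider;

    private void Start()
    {
        // Load the saved value (with a default) and write changes back as the slider moves.
        sensitivitySlider.value = PlayerPrefs.GetFloat("sensitivity", 1f);
        sensitivitySlider.onValueChanged.AddListener(v => PlayerPrefs.SetFloat("sensitivity", v));
    }

    public void ReturnToMainMenu()
    {
        Time.timeScale = 1f;                // unpause before leaving the scene
        SceneManager.LoadScene("MainMenu"); // placeholder scene name
    }
}
```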
Arena Level
With the pause menu done, I began work on the arena level. This will be split into two main sections: the model and the gameplay.
Arena Model
To make the arena model I first consulted my design document. Here I planned a basic layout for my arena, along with some reference imagery of what it will look like. Using this information I constructed a more detailed design that is shown in Figure 60.
Figure 60 - This design includes the arena on the right and the firing ranges on the left. This is what was used to make the model of the arena and to plan how it will be laid out within Unity.
After the designing was done I started to model the arena in Blender. First I made an assortment of prefab buildings, which are shown in Figure 61. These will be randomly tiled throughout the scene to make up the main body of the arena.
Figure 61 - These buildings were made out of extruded and loop cut cubes in the program Blender. I plan to randomly scatter these along the floor of my arena using the Blender hair particle tool.
Once these buildings were done I made the main building of the arena and scattered them throughout it using the hair particle tool in Blender. At this stage I also textured the structures and the main arena building and added some lighting as is shown in Figure 62.
Figure 62 - A picture within Blender of the arena with my prefab buildings, lighting and textures.
After making all of this I realised the arena was still rather sparsely decorated. To remedy this issue I made more prefabs of various stacks of coloured barrels and scattered these throughout the scene. This made it feel much more populated and generally better looking as is shown in Figure 63.
Figure 63 - The final appearance of my arena scene. This features all of the various models and lighting that will be included within Unity once it is imported and set up.
Gameplay
The first thing I did once I'd imported everything into Unity was get the player in and fix any issues that came up. There were a few colliders that were fixed really easily; however, there was one key issue which stood out to me.
The main issue I found when importing and setting up the lighting was with the player. For some reason the scene would be illuminated correctly, but the player would be extremely bright as is shown in Figure 64.
Figure 64 - Any non static object within my scene looked like this. It is extremely bright to the point of blinding whilst the rest of the scene is illuminated perfectly.
After extensive research I realised that this was caused by the light probes within my scene. For some reason they were extremely bright and were causing all non-static objects to appear like this. Frustratingly, there is no built-in feature in Unity to change the intensity of light probes, so I would have to find or make my own. I first found an existing tool to change light probe intensity; however, it was around seven years old and borderline dysfunctional due to the way it checked whether lighting was baked or realtime. I modified this tool to work with the latest versions, as shown in Figure 65, and implemented it within my project after researching spherical harmonics and tetrahedrons. A useful source for this was the Unity documentation (https://docs.unity3d.com/ScriptReference/Rendering.SphericalHarmonicsL2.html).
Figure 65 - The main things I had to change were any references to the bakingOutput. These all referenced extremely old API functionality which no longer works in more modern versions.
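To show the core idea rather than the full tool, here is a minimal sketch of scaling baked light probe intensity via their spherical harmonics coefficients; the menu path and multiplier are illustrative, and this is not the original tool I modified.

```csharp
using UnityEngine;
using UnityEngine.Rendering;
#if UNITY_EDITOR
using UnityEditor;

// Rough sketch of the operation at the heart of the light probe tool:
// scale every baked spherical harmonics probe to dim overly bright probes.
public static class LightProbeDimmer
{
    [MenuItem("Tools/Dim Light Probes")]
    private static void DimProbes()
    {
        const float multiplier = 0.5f; // illustrative darkening factor

        // Grab the baked coefficients, scale them, then assign the array back.
        SphericalHarmonicsL2[] probes = LightmapSettings.lightProbes.bakedProbes;
        for (int i = 0; i < probes.Length; i++)
            probes[i] *= multiplier; // SphericalHarmonicsL2 supports scalar scaling

        LightmapSettings.lightProbes.bakedProbes = probes;
    }
}
#endif
```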
After this issue was resolved I implemented the enemy spawners and scoring system. For this I used a level manager script which would handle the score, time played, waves, enemy spawning, and everything else I may need.
To make this I first added a timer system, as shown in Figure 66. This counts up over time and displays the elapsed time throughout the round and when the player finishes. It is also used to handle wave spawning and adds another layer of competition and scoring to the game.
Figure 66 - This code counts up every second based on Unity's Time.deltaTime, which I researched in the documentation. The total is then converted into minutes and seconds using the modulo operator.
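A minimal sketch of a timer along these lines is below; the Text field is a placeholder for whatever UI element actually displays the time.

```csharp
using UnityEngine;
using UnityEngine.UI;

// Sketch of the round timer: accumulate Time.deltaTime and format it as
// minutes:seconds using the modulo operator.
public class RoundTimer : MonoBehaviour
{
    [SerializeField] private Text timerText;

    private float _elapsed;

    private void Update()
    {
        _elapsed += Time.deltaTime;          // seconds since the round started

        int totalSeconds = Mathf.FloorToInt(_elapsed);
        int minutes = totalSeconds / 60;
        int seconds = totalSeconds % 60;     // modulo gives the leftover seconds

        timerText.text = $"{minutes:00}:{seconds:00}";
    }
}
```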
After this was done I implemented the systems for spawning enemies on their various spawn points. This was really easy and whenever the player eliminates them all there is a 3 second delay before the next wave of spawning begins. This is all done with the code in Figure 67.
Figure 67 - This simple piece of code configures all the relevant variables when it comes to spawning enemies. It works out how many enemies need to be spawned along with the current wave number and then sets all that to be used for the enemy spawning.
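Below is a rough sketch of wave spawning in this style; the prefab, spawn point fields, and the scaling rule for enemies per wave are all placeholders rather than my actual values.

```csharp
using UnityEngine;

// Sketch of the wave logic: when no enemies are left, wait three seconds,
// bump the wave number and spawn the next batch on the spawn points.
public class WaveSpawner : MonoBehaviour
{
    [SerializeField] private GameObject enemyPrefab;
    [SerializeField] private Transform[] spawnPoints;
    [SerializeField] private float delayBetweenWaves = 3f;

    private int _wave;
    private int _aliveEnemies;

    private void Start() => Invoke(nameof(SpawnWave), delayBetweenWaves);

    // Called by an enemy's health script when it dies.
    public void OnEnemyKilled()
    {
        _aliveEnemies--;
        if (_aliveEnemies <= 0)
            Invoke(nameof(SpawnWave), delayBetweenWaves); // 3 second gap before the next wave
    }

    private void SpawnWave()
    {
        _wave++;
        int enemiesToSpawn = _wave + 2; // illustrative scaling rule

        for (int i = 0; i < enemiesToSpawn; i++)
        {
            Transform point = spawnPoints[i % spawnPoints.Length]; // reuse points if needed
            Instantiate(enemyPrefab, point.position, point.rotation);
            _aliveEnemies++;
        }
    }
}
```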
Testing
After around three days of testing with a varied set of people, I had quite the list of problems, each with its own solution requiring varying amounts of time and research. A table of these is below.
Problem | Cause | Solution | Research |
Enemy footsteps overlap when chasing | Playing one shot sounds without a delay | Add a delay and use playSound instead | https://docs.unity3d.com/ScriptReference/AudioSource.PlayOneShot.html |
Sound directionality isn't that obvious | Audio rolloff set incorrectly | Change rolloff to linear and decrease the maximum distance | https://docs.unity3d.com/ScriptReference/AudioRolloffMode.html |
When you kill one AI they all die | Using GameObject.Find damaged all of the enemies, not just the one you hit | Get the hit object from the raycast instead | https://docs.unity3d.com/ScriptReference/Physics.Raycast.html |
Holes not working in initial scene | Incorrect layer on targets | Changed target layer to damageable | https://docs.unity3d.com/ScriptReference/LayerMask.html |
Cursor invisible when transition to main menu from cutscene | Not resetting cursor lock state to none | Set cursor lock state to none | https://docs.unity3d.com/ScriptReference/Cursor-lockState.html |
Scene Indexes all wrong | Adding new scenes had changed the index numbers of pre existing ones | Reorder the scenes in the build settings | https://docs.unity3d.com/ScriptReference/SceneManagement.Scene-buildIndex.html |
Debug menus broke | Unsure (maybe OnGUI) | Leaving as is since it's unnecessary in the builds anyway | |
No mouse in main menu from cutscene | Not resetting cursor lock state to none | Set cursor lock state to none | https://docs.unity3d.com/ScriptReference/Cursor-lockState.html |
Default sensitivity is 0 so can't move the mouse | Never actually set default playerprefs values | Add default values for playerprefs | https://docs.unity3d.com/ScriptReference/PlayerPrefs.GetInt.html |
Default fps is 30 | Never actually set default playerprefs values | Add default values for playerprefs | https://docs.unity3d.com/ScriptReference/PlayerPrefs.GetInt.html |
Cutscene not fitting correctly | The object the video plays on is too small | Correct the size of the object | https://docs.unity3d.com/ScriptReference/Video.VideoPlayer.html |
No cutscene volume control | Forgot to add volume control to the cutscene | Added a slider and new audio mixer channel | https://docs.unity3d.com/Manual/AudioMixer.html |
Cutscene playing every time you open the game | Not saved whether or not it's been opened before | Saved whether it has been opened in playerprefs | https://docs.unity3d.com/ScriptReference/PlayerPrefs.GetInt.html |
Jumping slides when you land | Incorrect physics material on the player | Added friction to the players physics material | https://docs.unity3d.com/Manual/class-PhysicMaterial.html |
Fullscreen bool not saving | Never actually set playerpref value | Set to dynamic float in inspector | https://docs.unity3d.com/ScriptReference/PlayerPrefs.GetInt.html |
AI hitting 100% of the time | Incorrect code for getting random values | Changed the code so it gets a number between 1 and 100 | https://docs.unity3d.com/ScriptReference/Random.Range.html |
Overlapping voice audio | Voice lines overlap due to usage of playOneShot sound | Added a new function to the audio manager that stops the current voice line and plays a new one | https://docs.unity3d.com/ScriptReference/AudioSource.PlayOneShot.html |
Quit button | Quit button exits the game instead of changing scene | Set it to change the scene instead | |
Conclusion
Overall I think this week went really well. I've got everything done that I wanted to, and the testing caught so many issues which I wouldn't necessarily have noticed. I've effectively stuck to my design document and managed my time well according to my schedule, and I think I'm in a good position to meet the brief and finish this whole project on time.
Final Evaluation
Overall, I am very satisfied with the results of my game. I have developed a complex and visually engaging project with numerous well-functioning features that effectively appeal to my target market. At present, there are no known game-breaking bugs, ensuring a consistent and enjoyable gaming experience, thanks to my testing efforts.
Firstly, I believe that the project was a great success. Right from the start, I was aware of the ambitious nature of this project along with the tight time constraints I had. However, despite the immense pressure and the fact that this is, by far, the most complex project I have ever worked on, I have successfully developed a complete game that fulfils all of its intended purposes while maintaining an aesthetically pleasing design. Features such as the cutscene and weapons system posed challenges that made me uncertain about their implementation, almost leading me to abandon the project. Nevertheless, I persevered through all the obstacles and emerged as a more skilled developer in the process.
However, despite this success, there are still some issues. Many desired features had to be cut from the game, including the grenade section of the firing range, which would have been a valuable addition. Another significant feature that was omitted was the ability to crouch. Although technically present in the game, I couldn't effectively lower the player's height, which prevented it from being properly implemented. Prioritising this feature would have greatly enhanced the overall gameplay experience, and its absence highlights how I can become engrossed in finer details, such as the cutscene. In hindsight, I should have considered the challenges associated with developing the crouching system earlier and given less weight to less relevant elements such as the cutscene. The cutscene, while a time-consuming aspect of my project, didn't significantly contribute to the gameplay. Although I am satisfied with its final outcome, I could have utilised my time more efficiently.
I believe that the decisions made to cut features and change certain aspects demonstrate my ability to make critical decisions under time pressure and adjust my schedule accordingly. This is further emphasised by my decision-making process regarding when to deviate from the schedule and when to adhere to it. For instance, I chose to postpone the cutscene to allocate more time for core gameplay elements and prioritised the visual elements towards the latter stages of the project.
If there's one thing I could do differently with this project, it would be to incorporate more gameplay and story elements. I invested considerable time in aspects like cutscenes and the main menu, whereas my primary focus should have been on the gameplay itself. Ideally, I should have included around 30 minutes of story-driven gameplay, adding depth beyond the current arena setup.
Based on the feedback I received from my peers, I gathered that the game is generally well-received and enjoyed. However, some players find it challenging to defeat the enemies and suggested that I "weaken the enemies" to enhance the overall experience. Aside from this specific feedback, all other comments were overwhelmingly positive. Players expressed their enjoyment of the "Menu Scene", "Cutscenes", "Voice lines", "Visual effects", and "Lighting" highlighting the game's overall polish.
However, I also received a variety of negative feedback from my peers. One key issue identified was the absence of particle effects, such as bullet tracers, weapon flashes, or blood effects. These effects were intended features that would have greatly enhanced my project, but had to be omitted due to time constraints. Additionally, there were complaints regarding the user interface in my project. Some individuals felt that the section at the end of the arena, where the player's score is displayed and the next action can be chosen, was visually lacking and did not blend well with the rest of the UI. A similar complaint was raised about the cutscene slider, which used a default Unity slider, detracting from the visual impact. Many of these issues, including the cutscene slider, should have been addressed. However, resolving the issues with the arena would have required a complete redesign of that particular scene. Originally, I had planned for a file to open, displaying all the player's scores, resembling the main menu. Unfortunately, time constraints prevented me from implementing this feature. Finally, it was noted that there was no skip button for the credits scene, which lasts for a considerable duration. Regrettably, this was a feature I completely overlooked and failed to include.
Aside from these few minor issues and complaints, the feedback was generally really positive, and everyone seemed to enjoy my game. A competitive element even developed among the testers to see who could achieve the highest scores in the arena, with some playing for more than 10 minutes straight to reach a score of 150. This shows that my game was successful among my target audience and that I have created an enjoyable experience for the end user.
Finally, I believe that I adhered closely to my design document. I rarely deviated from the original plan and only made a few minor adjustments as I progressed, addressing things I had initially overlooked or believed could be enhanced. The only significant change I made was to the sound, opting for an entirely different genre of music. I believe these modifications were beneficial and generally improved the game compared to how it would have been if I had strictly followed the original design. This is particularly true for the new music, which I consider a substantial improvement over the initial choice.
Overall, I'm really happy with how Life Of Bean turned out. Everything went pretty smoothly, and I've delivered a project that exceeded my initial expectations. Unlike my last major project, it is nearly bug-free, showcasing significant improvement in my development and testing process. Moreover, this project stands out visually as one of the best things I've ever created, particularly with elements like the cutscene, which was a completely new endeavour for me. I believe that this project has not only helped me enhance existing skills but also allowed me to acquire new ones, such as animation. In general, it has been a great success.