Week 4: Gameplay Design
Monday 21st - Friday 25th February 2022
TO DO: GAMEPLAY DESIGN
PURPOSE FOR DESIGN
Over the last few weeks, we have run into a number of problems and changes to our scope. A lot has happened in a short amount of time, and now we must take all these changes and apply them to our design. Over the next couple of days, Kiera and I have agreed to focus on gameplay design: outlining exactly what will happen at each stage of the game and showing this through visual demonstrations, animations and mock-ups, as well as other design-thinking processes such as experience maps. I intend to finalise this process by creating a new, refined Project Scope, which should provide us with the tools to begin development.
WHAT WILL WE BE DOING?
We will be working together to map out the full gameplay experience. I plan to create a series of video demonstrations to showcase how players will interact in the space and across the 4 screens. It's important for us to map out the 4 screens and pinpoint exactly what will happen at each phase. We can use experience maps to identify how players should feel throughout, and evaluate their effectiveness through testing in the upcoming weeks.
GOALS FOR THE DAY
Create a series of gameplay demonstrations. Map out the full experience across four screens, create experience maps and refine Project Scope - what is necessary at each stage of the experience?
TEAM WORK - WHERE DO WE BEGIN?
I am going to begin this process by drawing up a basic storyboard to illustrate the main phases of the game and outline how it has changed over time. From here, Kiera and I will work together to map out the full experience across 4 screens on her wall!
QUICK PLAN FOR THE NEXT FEW DAYS
Monday 21st - Sunday 27th February
Evaluate changes we have to make to gameplay and set-up - as a result of recent discussions and problem-solving.
Illustrate gameplay experience - storyboards and annotations.
Map out full experience with Kiera - create video/animation demonstration.
Create refined Experience Map and Beat Chart.
Refine and finalise Project Scope.
TO DO: CHANGES WE'VE MADE
Over the last 4 weeks, we have encountered many changes with our scope and a lot of problems that we've had to find quick solutions for! To start the day, I am going to outline the changes we've made and the design choices we have followed through with, to prepare myself for development.
In Week 2, we established an idea for gameplay that we are both happy with. This idea requires players to fly a bird across the environment, to collect items which trigger layers of a soothing soundtrack. This idea allows control of both the mind and body, and requires players to move steadily to collect the necessary items. This also provides players with agency, giving them control over how the soundtrack sounds and placing them in a space unique to them. The idea of a bird represents freedom, during moments where players might feel trapped in their own thoughts and feelings. These elements of gameplay work cohesively with the music and visual design to provide players with a cathartic experience, encouraging both emotional and physical regulation.
INTERACTION / INPUT
We began the project knowing we wanted to incorporate physical movement into the experience, knowing how important this is to practising self-care. Our initial idea involved the use of a single Touch Controller to guide the bird through the environment; however, we have recently come across 'webcam/image processing', which requires no controller, just a webcam and gesture control. This means players will be able to guide the bird using their palm, with no need for additional equipment. This should encourage players to move slowly and make the most of the physical aspect of the experience. Compared to our initial idea, this is both more playful and more beneficial to players. It also means we can create a separate program for our game, which will run completely online (no need for Unity Hub).
As I mentioned before, webcam processing means working in a completely different program to what we're used to. There will be no need for Unity Hub, which we are most familiar with, so having regular discussions with our programming lead will be extremely important. We are scheduled to talk with James Stallwood in the next week or so about how things will work and how he plans to get this set up for us; he is currently working on a similar project and is interested in our use of gesture and projection. We should have a demo by the end of next week which we can play around with, and can start thinking about how to set up a space with a potential projector. From what I know already, the program will run on its own server over the web, so we can easily link a laptop/PC to a projector via HDMI to run our first tests.
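We won't know exactly how James's program works until we see the demo, but as a rough sketch of the idea, the bird could follow a normalised palm position from the webcam with heavy smoothing, so it drifts gently rather than snapping to the hand. Everything here (names, resolution, smoothing value) is my own assumption for illustration, not his implementation:

```python
# Hypothetical sketch: mapping a normalised palm position (0-1, which we
# assume the webcam-processing layer will give us each frame) onto the
# bird's on-screen position, with exponential smoothing so the bird
# drifts gently rather than snapping to the hand.

SCREEN_W, SCREEN_H = 1920, 1080  # assumed projector resolution
SMOOTHING = 0.1                  # lower = slower, calmer bird movement

class Bird:
    def __init__(self):
        # Start in the centre of the screen.
        self.x, self.y = SCREEN_W / 2, SCREEN_H / 2

    def follow_palm(self, palm_x, palm_y):
        """Move a fraction of the way towards the palm each frame."""
        target_x = palm_x * SCREEN_W
        target_y = palm_y * SCREEN_H
        self.x += (target_x - self.x) * SMOOTHING
        self.y += (target_y - self.y) * SMOOTHING
        return self.x, self.y

bird = Bird()
for _ in range(60):  # one second of frames at 60 fps
    bird.follow_palm(0.9, 0.2)  # palm held at the top-right of the frame
```

The smoothing constant is the lever that matters for us: the lower it is, the slower the bird responds, which should naturally reward slow, controlled hand movement.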
THE SET UP
We have concluded that, for now, we will put the box construction on pause. With our new input method in place, we are currently unsure exactly how to set the space up. We have to consider the positions of laptops, webcams, screens and projectors, which could change drastically over time. As a result, we will be informing both Chris O'Connor and Andy Brooke of these changes and will contact them when the time is right. For now, we have agreed to experiment with our own structure and set-up, using existing panels and walls to transform our own space into the Getaway.
NEXT STEPS FOR THE PROJECT
Next steps for the project will be to refine and finalise gameplay, giving us everything we need to begin development. From there, we can move on to asset creation and sound production, as well as working out how we will set up the physical space. This week is about getting gameplay set in stone before development begins.
TO DO: GAMEPLAY STORYBOARD
The main gameplay experience - as we have made a number of changes, the game idea has developed a lot over time. I am going to refine the concept today by using storyboards and experience maps to demonstrate the refined game idea and intended player experience.
This storyboard visualises the main stages of the game and what each phase should consist of. This makes it easier to break the experience down into individual components, to prepare us for development. By doing this simple task, I have already formed a list of design questions that we need to answer to ensure everything works how we expect it to.
I can take these questions to the next meeting with Kiera and come up with solutions for this.
Will there be UI to begin the game, or will it play automatically?
Will the bird have an idle mode to allow players to take a break/pause? - Touched on this before.
How long will the experience last? - Approx 10 mins, allow time for players to sit and listen to the music they created.
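To make the second and third questions concrete before the meeting, here is a hypothetical sketch of how the bird's state could be decided each frame: an idle mode when no palm is detected, and a wind-down once the approximate 10 minutes are up. All names and values here are placeholders for discussion with Kiera, not decisions:

```python
# Hypothetical sketch of two open design questions: the bird idles when
# no palm is detected (a natural pause), and the session winds down after
# roughly ten minutes so players can sit and listen to their soundtrack.

SESSION_LENGTH = 10 * 60  # seconds; approx. 10-minute experience

def game_state(palm_detected, elapsed_seconds):
    """Return what the bird should be doing at this moment."""
    if elapsed_seconds >= SESSION_LENGTH:
        return "wind_down"   # let players sit and listen to their music
    if not palm_detected:
        return "idle"        # bird perches, giving players a break
    return "flying"

print(game_state(True, 120))    # mid-session, palm visible
print(game_state(False, 120))   # player takes a break
print(game_state(True, 650))    # past the ten-minute mark
```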
PLAYER EXPERIENCE: Breakdown
Student is feeling overwhelmed by their workload and student responsibilities.
Student visits the 'Getaway' located on campus.
Student enters the 'Getaway' space and is presented with a 360-degree projection of a mountainscape environment and soothing music.
Player interacts using the palm of their hand to guide a bird on screen through the mountains, across each wall.
Player hovers over 'music triggering' objects and notices layers of the soundtrack develop over time.
Player observes their environment and listens to the calming music.
Player leaves the 'Getaway' space, feeling calm and relaxed.
TO DO: MAP OUT THE FULL EXPERIENCE WITH KIERA
We have established the main phases of gameplay and now plan to map out the full experience across 4 screens on one wall. This will demonstrate gameplay within the space and show how interactions will work across walls, with the development of music over time.
I began by sketching out a full environment on a wide piece of paper attached to the wall, consisting of different elements within each region. I then split the environment into 4, to represent the 4 individual screens. As I was doing this, Kiera was drawing and cutting out the interactive elements, such as the bird and the music-triggering items, which in this case we have chosen to be snowflakes! While discussing what these might be, we thought light, subtle snowfall could be nice in the environment, with some flakes made more distinct to symbolise the interactive objects.
THE FULL GAMEPLAY EXPERIENCE
Red Objects - Interactive
I split the game into four phases, one per screen. There shouldn't be an increased level of difficulty over time, or any sense of challenge at all; the difference between these phases will be the number and placement of musical objects. We previously agreed that around 3-4 musical objects (snowflakes) will trigger one layer of music, so we should have around 5 layers of music overall, developing over time. Each screen also consists of a slightly different scene, to vary the design of the environment and keep players interested and engaged, avoiding repetition and boredom.
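The snowflake-to-layer rule above can be sketched as a simple counter. This is only an illustration of the logic, assuming 3 snowflakes per layer and the base track playing from the start (both still open to change during testing):

```python
# Hypothetical sketch of the rule above: roughly every 3 snowflakes
# collected unlocks one more layer of the soundtrack, capped at 5 layers.

SNOWFLAKES_PER_LAYER = 3
MAX_LAYERS = 5

def active_layers(snowflakes_collected):
    """How many layers of the soundtrack should currently be playing."""
    # Layer 1 plays from the start so players never sit in silence.
    unlocked = 1 + snowflakes_collected // SNOWFLAKES_PER_LAYER
    return min(unlocked, MAX_LAYERS)

print(active_layers(0))   # base track only
print(active_layers(3))   # first screen's snowflakes collected
print(active_layers(12))  # everything unlocked
```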
As we did this activity, we were constantly speaking about next steps, ideas and what needs to be done for it to work. I noted down our discussion which is as follows:
NOTES THROUGHOUT PROCESS
Begin with 3 interactive snowflakes on the first screen, this should trigger the first track to ease players into it. Music should already be playing to make players comfortable in the environment, to avoid silence.
Potential use of User Interface to begin the game; otherwise players will be unaware of how to begin.
Maybe an indication of how to play in the space - but I feel as though this could be demonstrated on a simple pamphlet outside the space.
Snowflakes to represent the musical objects - will slowly fall amongst other snowfall. (Slightly lower opacity, not as distinctive)
We spoke about the opportunity for User Interface, but we don't want to take away from the overall experience. We don't expect players to have to read text or information beforehand; we intend for it to be a light-hearted experience which doesn't require too much thought - just relaxing play and satisfaction. During testing, we can ask our users whether they need more indication of how to play, in the form of simple User Interface or an on-screen HUD, because we can't continue to make these decisions without input from our target audience and potential consumers! That said, to speed up development, we can make rapid decisions depending on their priority and severity.
To further demonstrate gameplay, I plan to create a stop-motion animation using our visual experience map and add in the additional features necessary. This animation should demonstrate the interaction between the bird and the snowflakes, leading to the development of music over time. This will help with development, so we know exactly how many assets to create and how many layers of music to produce. This information is especially important for our Sound Designer, Harry Williams.
GETAWAY: ANIMATION DEMO
WHAT DOES THIS SHOW?
This video demo showcases each interaction between the bird (the player) and the snowflakes, which represent the 'music triggering items'. It shows that after interacting with 3-4 snowflakes per screen, a second, third and fourth layer of the soundtrack develops. Once every snowflake has been interacted with, players can continue to soar through the environment or simply sit, walk or stand and listen to the soundtrack they have created over time. The movement of the bird in this demo represents how players will guide it through the environment using the palm of their hand, via hand recognition and image/webcam processing.
REFLECTION & EVALUATION OF GAMEPLAY EXPERIENCE
We are both happy with the idea in general and believe it has the potential to go further, and even be applied to different health scenarios. I feel the most important thing is to get the interaction set in stone, as it is a major component of our intended player experience. This interaction MUST encourage users to move slowly and in a controlled way, as this is a big part of physical regulation. As well as this, we must ensure our sound is designed carefully, with a heavy focus on the patterns I studied last Semester. This way, we can provide people with an experience that is well thought through and meaningful. Every design decision we made last Semester must be brought to life, within reason. I think so far we have great foundations for a meaningful experience. Now we must focus on development: ensuring players move slowly, ensuring the music has its desired effect on the way players feel in the moment, and that the visuals are engaging enough to maintain interest throughout. For now, I plan to create a breakdown of the experience, highlighting every point and listing the components that need to be developed as a result.
TO DO: REFINED EXPERIENCE MAP & BEAT CHART
WHAT WILL I BE DOING?
Now that we have established gameplay and know what should happen at each stage of the game, I am going to make a refined player Experience Map and Beat Chart, outlining every component ready for development to fulfil each stage of the game!
I refined my previous Experience Map from Week 2 with our new ideas for gameplay, incorporating the use of webcam processing and gesture control into the main interactions.
HOW CAN WE FURTHER DEVELOP OUR IDEA?
I took note of the opportunities at each stage of the game. Most of these involve the use of User Interface to guide the experience, which some may find useful. This will have to be done in a way that doesn't require too much thought or concentration. Rather than a text format, I feel a small gesture animation on screen would work best. This would prevent any misunderstanding or confusion.
Hands down the most important aspect of our experience is achieving our desired level of effectiveness. As outlined at the start of the project (re-cap), our Project Goals involve the encouragement of emotional and physical regulation, through design choices that help reduce anxiety and stress. We are able to do this with a heavy focus on visual and sound design, as well as the use of a physical space to achieve full immersion and provide players with a space to escape. However, we cannot simply make these decisions and expect them to fall into place; we must conduct user testing in the upcoming weeks to ensure gameplay has its desired effect on players and we get the intended result from the entire project! How? We can use our experience maps to understand exactly how players should feel at each stage of the game and, when testing the different components, evaluate whether or not these expectations are fulfilled (through Question & Answer, and monitoring heart rates before and after the experience).
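As a hypothetical sketch of how we might summarise the heart-rate part of that testing, we could compare each tester's reading on entry and exit and look at the average change. The numbers below are made-up placeholders, not real test data:

```python
# Hypothetical sketch of the heart-rate check mentioned above: compare
# each tester's reading before and after the session. A positive average
# change means heart rates dropped, i.e. players left calmer.

def average_change(before, after):
    """Mean drop in bpm across testers (positive = calmer on exit)."""
    changes = [b - a for b, a in zip(before, after)]
    return sum(changes) / len(changes)

before_bpm = [82, 90, 76]  # placeholder readings taken on entry
after_bpm = [74, 83, 72]   # placeholder readings taken on exit

drop = average_change(before_bpm, after_bpm)
print(f"Average drop: {drop:.1f} bpm")
```

Paired with the Question & Answer responses, even a simple summary like this would tell us whether the experience maps' intended feelings are actually being reached.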
Next steps will be to outline each component during each stage of the game to gather a better understanding of exactly what we have to make in order for our game to function. I plan to do this in the form of a Beat Chart similar to how I did in Week 2.
TO DO: THE BEAT CHART - inspired by Scott Rogers
WHAT WILL I BE DOING?
Scott Rogers suggests using 'a Beat Chart' to identify the individual components that make up a game. This outlines each element of each stage including the use of haptics, User Interface and mechanics. Breaking down our game like this will prepare us for development by enabling us to create a more refined Project Scope.
THE BEAT CHART
Using my refined Experience Map, I have broken down each phase into individual components.
HOW HAS OUR IDEA CHANGED?
In comparison to the last Beat Chart I made (Ideation), the main differences are the method of input/interaction and the opportunity for UI. As we have established our new input to be gesture, there will be no need for haptics. Most of the features have remained the same; however, the change in input has immensely changed the way we plan to develop the game.
NEXT STEPS - where do we go from here?
Finalise the Project Scope! Using the Project Scope we made in Week 2, inspired by our Project Proposal, I will be making refinements based on our new gameplay idea and Beat Chart to create a detailed component and asset list.
TO DO: REFINE AND FINALISE PROJECT SCOPE
PREVIOUS SCOPE: https://docs.google.com/document/d/1Rm-Pr1NB6pO-GOjd3Cd0ich0Ur28gmVEdkeT6Dhntaw/edit?usp=sharing
WHICH COMPONENTS ARE NEEDED TO FULFIL THE INTENDED PLAYER EXPERIENCE?
Can be found in my Sketchbook
Last Semester, Emma Raey suggested a way for us to split our gameplay experience into Visual, Audio/Verbal, Haptic and Ludic. I have applied this method to our gameplay experience to separate the components:
Visual:
Hand-drawn art style
Engaging visual design
Mountainscape environment - inspired by Hallstatt, Austria
Small, intricate animations
Bird animation to collect snowflakes
Audio/Verbal:
Uniquely composed soundtrack - piano track separated into 5-6 layers
Snowflakes to trigger layers of music
Nature sounds to accompany the soundtrack
Ludic:
Fly the bird around the environment to collect 'snowflakes', which trigger layers of music
Become immersed in the soothing music and visual design
Use 'gesture' to guide the bird around the environment
I can now create a detailed 'Component List' to outline every element needed to achieve the full gameplay experience for Getaway! I will be splitting this into Animations, Mechanics, Scripts etc, to prepare ourselves for development and be aware of what we need to create.
In Semester 2, Year 1 Game Development, our tutor taught us the basic structure of a Project Scope. I will be finishing the day with a refined Project Scope for Getaway to get us started with development. Using this component list, I can document a detailed 'Asset List' which outlines the expectation of each asset, its design and its function within the game. This Project Scope should contain milestones, deadlines and goals too.
NEW PROJECT SCOPE:
This document outlines the features that must be present to create the game, a breakdown of assets, how these assets will be presented, and our key milestones.
I have documented everything we need to know regarding the creation of assets and their expectations in game, including the environment and particular animations. However, the only thing not set in stone at this moment is the set-up of the space. This should become more refined over time as we discover an appropriate way to structure the space, and which specific equipment to use.
HOW CAN WE USE THIS?
This should prepare us to distribute development tasks between us efficiently. We identified our individual roles and responsibilities in Week 2, and now it's down to us to begin the necessary tasks. As Art Lead, this document will be especially helpful when distributing asset creation between us and creating the necessary documents to record progress and provide feedback. It'll be important for me to state the expectation of each asset beforehand to ensure development goes as planned.
We now have every aspect of our game set in stone, minus the set-up of the space. We have agreed to look into the equipment we might use and how we will structure the space over the next few weeks, but for now, we aim to tailor its design around the experience itself. According to our schedule, next week (Week 5) should be spent focusing on the appearance of the game, preparing us for asset creation in Week 6. For now, I aim to spend the rest of the week planning our short film/documentary for the project so we can begin filming around Phase 3!
Next tasks for this week: