
Week 11:
Playtesting

Tuesday 10th - Friday 13th May 2022

WEEK 11 - TO DO: PLAYTESTING!

 

PURPOSE FOR TESTING

Today we are continuing our playtests to gather feedback and take notes on ways to improve and further develop our experience. We are running casual playtests throughout the day, allowing people to comment on interaction, gameplay and how they feel about the experience overall. On the 13th May, we have a scheduled playtesting session with students/alumni from WSA.

​

WHAT WILL WE BE DOING?

Kiera and I will be setting up one of the laptops and running one or two servers for players to test on, since we know we can duplicate any code edits/changes across both servers. We have also set up two screens for further playtesting, to get a better idea of how our setup will work at our Degree Shows.

​

GOALS FOR THE WEEK​

By the end of the week, we should have a series of playtests documented. We must gather feedback and take notes on suggestions and improvements, so we can make the necessary changes and progress with development.


TEAM WORK - WHERE DO WE BEGIN?

We will begin by setting up gameplay and talking our first tester through the process of the session.

MINI PLAN

Tuesday 10th

  1.  Casual Playtesting throughout the day

​

 Friday 13th May

  1. Playtesting session, 2 - 4 pm

PLAYTESTING

​TEST 1

Tuesday 10th May

​TESTER: Adam Procter

DURATION OF TEST: 10 minutes​

 

PROCESS:

We set up the game and talked Adam through the issues we were having, and we spoke about possible solutions and ways we could improve.

COMMENTS & FEEDBACK

​

  • Can edit live code to help debug things

  • Wrist detection still seems a little difficult

  • PoseNet Script - draw the skeleton, to see how detection works and how we can improve detection of the right wrist (see the sketch after this list).

  • Speak with James about the camera on screen to show skeleton and wrist detection.

  • Up and down movement to be improved.

  • Background at the degree show - what will be behind players? Will the background interfere at all?

  • Snowflakes to fall only after the camera/script/detection has initialised.

  • Potential idea to have a loading screen, to allow the server to sync and load before players can begin.
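As a reference for the 'draw skeleton' idea above, this is roughly the kind of debug overlay we have in mind. We haven't seen inside James's PoseNet script, so this is only a minimal sketch assuming a p5.js + ml5.js setup; the keypoint name 'rightWrist' comes from PoseNet's standard keypoint set, and everything else here is illustrative.

```javascript
// Minimal debug overlay sketch (assumes p5.js and ml5.js are loaded in the page).
// Illustrative only - our real detection lives in James's PoseNet script.
let video;
let poses = [];

function setup() {
  createCanvas(640, 480);
  video = createCapture(VIDEO);
  video.size(width, height);
  video.hide();

  // ml5's PoseNet wrapper fires 'pose' whenever new estimates arrive.
  const poseNet = ml5.poseNet(video, () => console.log('PoseNet ready'));
  poseNet.on('pose', results => { poses = results; });
}

function draw() {
  image(video, 0, 0, width, height);

  for (const { pose, skeleton } of poses) {
    // Draw the skeleton so we can see how stable the tracking is.
    stroke(0, 255, 0);
    for (const [a, b] of skeleton) {
      line(a.position.x, a.position.y, b.position.x, b.position.y);
    }
    // Highlight the right wrist - the keypoint our interaction relies on.
    const wrist = pose.keypoints.find(k => k.part === 'rightWrist');
    if (wrist && wrist.score > 0.3) {
      noStroke();
      fill(255, 0, 0);
      ellipse(wrist.position.x, wrist.position.y, 20, 20);
    }
  }
}
```

Seeing the skeleton alongside the confidence score for the right wrist should make it easier to work out whether the jitter comes from the model or from how players hold their arm.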

​TEST 2

Tuesday 10th May

​TESTER: Josh

DURATION OF TEST: 10 minutes​

 

PROCESS:

We briefly talked Josh through the experience and what was required of him. He began the game and we discussed the experience throughout, gathering feedback in real time.

COMMENTS & FEEDBACK

​

  • Would like the snowflakes to disappear once interacted with, to feel more satisfying

  • Objective not clear because of this - how do you know when you've done anything if it's not visualised?

  • Fascinated by the interaction/use of gesture

  • Was excited to see animations placed in the environment

  • Liked the idea of being projected across screens

  • Music was nice

  • Overall found the experience relaxing

EVALUATION OF TESTS

​

THINGS THAT WORKED

From our brief playtesting session this afternoon, we managed to gather some good suggestions on how to improve the experience, as well as what works and what people enjoyed about it. Visually, we are happy with the outcome of our design and don't feel there is anything left to improve in terms of asset production. In addition to this, the music worked well with the visuals to create a relaxing atmosphere/tone, which helped to calm players.

​

ISSUES WE FACED

From talking to testers and reading over their feedback, it's clear that there is an issue with the detection/interaction and with initially starting up the experience. Players are unaware of both how to play and what to do, which leads to confusion. Because of this, we may need to consider a way to indicate how to play, and how exactly to use the 'wrist' to interact. On the detection side, tracking is still extremely jittery and we're unsure why! When we speak with James on Thursday, we can talk through the current issue and perhaps touch on the idea of changing the body part detected to something that requires less effort (such as the palm or a finger).

​

HOW CAN WE OVERCOME THIS?

One way to overcome the issue with the wrist detection is simply to use another body part to control the bird, although this may take more time than we have. The next most important thing to cover is establishing a way to introduce the experience to players, so we don't throw them in blind with no idea of how to interact or what to do. This could be a short loading screen/intro before the experience begins, with an indication of how to move/position the wrist and what the aim is, although we did like the idea of players discovering it for themselves. Lastly, we must ensure that once players have interacted with the snowflakes, these disappear and indicate that the player has made progress.
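If we do go down the palm/finger route with James, one option we could raise is a hand-tracking model such as ml5.js's Handpose, which returns 21 hand landmarks. The sketch below is only a rough illustration of that idea under those assumptions - none of it is in our current build - using the model's 'palmBase' landmark to drive the bird's height.

```javascript
// Illustrative only: tracking the palm with ml5.js Handpose instead of the PoseNet wrist.
// Assumes p5.js and ml5.js are loaded in the page; not part of our current build.
let video;
let palmY = null; // normalised 0..1 - the value we would map to the bird's height

function setup() {
  createCanvas(640, 480);
  video = createCapture(VIDEO);
  video.size(width, height);
  video.hide();

  const handpose = ml5.handpose(video, () => console.log('Handpose ready'));
  handpose.on('predict', results => {
    if (results.length > 0) {
      // 'palmBase' is the model's landmark at the base of the palm: [x, y, z] in pixels.
      const [, y] = results[0].annotations.palmBase[0];
      palmY = y / height;
    }
  });
}

function draw() {
  image(video, 0, 0, width, height);
  if (palmY !== null) {
    // Visualise the value we'd feed into the bird's vertical movement.
    noStroke();
    fill(255, 0, 0);
    ellipse(width / 2, palmY * height, 16, 16);
  }
}
```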

PLAYTESTING SESSION WITH SETUP​

Tuesday 10th May - Friday 13th May

​PROCESS:

With our setup already prepared, we thought it would be best to gather insight from playtesters regarding the setup of the space as well as the experience. The setup itself is a large part of the intended player experience, as it enhances the level of immersion, providing them with an increased level of engagement and interest. This is extremely important for our initial project goal, so testing with our 'two screen' setup today has been really beneficial.

​

Setup can be found here: Testing Setup

​

We lined up the panels next to each other, with room for players to walk and move around as they interact. The projectors were placed up against each wall, each connected to one of our laptops running the two game servers. The positioning of our setup meant that players could control the bird on one screen and eventually glide through to the next, although the positioning of the webcams has to be incredibly specific.


13th May - UPDATE

Today, we are hosting a testing event from 2-4 pm where people will be coming in and providing us with feedback on gameplay/experience/interaction, and I will document the results below. We have set up the laptop for them to interact with, as we no longer have the projectors - they are currently being used for the fashion degree shows.


PLAYTESTING VIDS​

COMMENTS & FEEDBACK

(From all sessions)

​

I also supplied testers with a 'Feedback Form' which can be found here: https://docs.google.com/forms/d/e/1FAIpQLScmHQNH0-0w9Rlojpn2uFh2zmZgLYPq0kgajO-snNi5RaBZcg/viewform?usp=sf_link

​

  • Should put in a loading screen to wait for the snowflakes to fall before players can begin - this playtester had to wait before he could start. Loading screen could last around 5-10 seconds. Can be used to introduce the game.

  • Interaction quite uncomfortable and didn't work so well

  • The bird kept pinging back and forth

  • Had to begin gameplay on both screens by pressing the button on each - how do we make them start simultaneously?

  • Unsure of how to play - a small indication or instruction would've been nice.

  • Player held wrist wrong and kept rotating their arm rather than holding it still - this would prevent the bird from flying correctly.

  • Interaction with each screen and the position of the webcams led to the bird moving on both screens at once.

  • Performance of the interaction is less effective once projected, as the player isn't in the best position for the webcams to pick up movement. In our final setup for the Degree Shows, the webcams must be placed parallel to players and in their line of sight.

EVALUATION OF TESTS

​

THINGS THAT WORKED

We successfully projected gameplay onto two screens, enabling players to move around as they interacted. Players moved slowly and the overall outcome of the experience was what we had hoped for. Testers commented on the relaxing sound and engaging visuals, so we know these two components are working successfully and we no longer have to worry about them. Most of our concern is now around the interaction itself.

​

ISSUES WE FACED

Players are still struggling to get used to the 'wrist' interaction: they rotate their arm, resulting in a jittery flying bird and the webcams' inability to detect their wrist. This caused confusion and frustration, which we want to avoid at all costs. In addition to this, the position of the webcams today meant that as players interacted with one screen, it triggered the other, which can be overcome by a simple adjustment to the position of the panels and the webcams. Lastly, to begin the game, players had to press the 'Begin' button on both screens, which was a little inconvenient, especially if we plan to hide the devices themselves. To overcome this, we'll have to begin gameplay ourselves.
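For now we'll start gameplay ourselves, but if we later want a single press of 'Begin' to start both screens at once, one possible approach - purely a sketch, not something we've built, and assuming the two game clients can reach a small Node server - would be to broadcast a start event over WebSockets with socket.io:

```javascript
// Sketch only: a tiny Node server that rebroadcasts one "begin" press to every screen.
// Assumes both game clients can connect to this server - not part of our current setup.
const { Server } = require('socket.io');
const io = new Server(3000, { cors: { origin: '*' } });

io.on('connection', socket => {
  console.log('screen connected:', socket.id);

  // Whichever screen (or a hidden admin page) presses Begin emits 'begin';
  // the server rebroadcasts 'start-game' so every screen starts together.
  socket.on('begin', () => io.emit('start-game'));
});

// On each game client (using the socket.io client library):
//   const socket = io('http://<server-address>:3000');
//   socket.on('start-game', () => startGameplay());  // startGameplay is our own hook
//   beginButton.onclick = () => socket.emit('begin');
```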

​

HOW CAN WE OVERCOME THIS?

We must talk with James about the wrist interaction and see if we can get this changed. We now only have around two weeks left, which isn't a lot of time to be changing scripts and developing new processes! We should also take into consideration the position of the webcams for our final Degree Shows, but we have time to prepare this after hand-in. The most important thing for now is the gameplay, and we must make sure it's both intuitive and effective. The most important elements to refine are the following:


TO DO LIST - REFINEMENTS

​

  • Discuss potential changes to 'wrist detection' - how can we change this to fingers or palm instead?

  • Create a 'loading' screen to allow the server to start up.

  • Import animations into both servers

  • Provide links for both servers for projection use

  • One server for access on computers/other devices (this must have all tracks within it).


NEXT STEPS

We discussed refinements with James and ways we could improve the interaction. Due to our limited time frame, it is too close to hand-in to begin developing a new process/program to detect the palm/fingers, so we will have to work with the detection we currently have. This weekend, I plan to spend my time working on the 'loading screen', which should introduce the game and explain a little about how to play, to ensure players interact correctly and fully understand the experience.
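As a starting point for that loading screen, I'm picturing something as simple as the sketch below: a full-screen overlay with a short 'how to play' note that holds for the 5-10 seconds suggested in testing while the server syncs, then fades out to reveal the game. The wording, styling and timing are all placeholders, not final.

```javascript
// Rough loading/intro screen sketch - wording, styling and timing are placeholders.
const LOAD_TIME_MS = 8000; // roughly the 5-10 seconds suggested in playtesting

function showLoadingScreen() {
  const overlay = document.createElement('div');
  overlay.id = 'loading-overlay';
  overlay.style.cssText =
    'position:fixed; inset:0; background:#0b1d33; color:#fff;' +
    'display:flex; align-items:center; justify-content:center;' +
    'text-align:center; font-family:sans-serif; z-index:999;';
  overlay.innerHTML = `
    <div>
      <h1>Loading...</h1>
      <p>Hold your right wrist up and move it slowly to guide the bird.<br>
         Fly through the falling snowflakes to make them disappear.</p>
    </div>`;
  document.body.appendChild(overlay);

  // Hold the intro while the server/snowflakes sync, then fade it out.
  setTimeout(() => {
    overlay.style.transition = 'opacity 1s';
    overlay.style.opacity = '0';
    setTimeout(() => overlay.remove(), 1000);
  }, LOAD_TIME_MS);
}

showLoadingScreen();
```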

​

Week 12 should be spent refining the final version of our game, importing all animations into the environment and providing links to our servers that work on all devices, so people can set up the experience from the comfort of their own home.


