Angle Jungle was a great project, and on its completion we wanted to share our work with the wider community for more feedback, and potentially to explore a formal assessment of whether the game actually achieved its transformational objectives. We therefore wrote a paper and submitted it to CHI PLAY. Fortunately it was accepted, and we were invited to present the game. My intention at the conference was to connect with people who could help formally assess Angle Jungle, as well as to keep an eye out for good connections.
What I loved about this talk was that by the end of the work the team had a playable deliverable that made engaging with language an essential part of the experience, not at all like the typical chocolate-covered broccoli that tends to happen. I definitely connected with the speaker and will remain in touch with them.
Religion is a hot-button issue (as usual), and I loved this talk because it opened my mind to serious academic consideration of religious discussion in video games. Particularly interesting was its attempt to categorize players into behavioral categories.
Introduction: Angle Jungle is an award-winning puzzle game built in 15 weeks by a team of students at Carnegie Mellon University’s Entertainment Technology Center for Pennsylvania’s Intermediate Unit 1. Angle Jungle has value for first graders and above, though its primary purpose is as a supplement for 4th to 6th graders learning basic geometry.
Platform: iOS | Time: 15 weeks | Role: Game Designer | Team Size: 4
Design Goal: The goal of the project was to achieve the following transformations in our target demographic:
Primary Transformation: Build familiarity with the angle system by having players solve puzzles using a mechanic that encodes the numeric and spatial representations of angles
Introduce positive and negative angles
Introduce clockwise and anticlockwise rotation
Introduce angles greater than 180 degrees
Build familiarity with the protractor tool
Design Challenges: We faced a number of design challenges during this project:
Protractor tool introduction
Finding a mechanic that made angles essential
Crafting fun and engaging puzzles
Crafting additional sources of motivation
My Contributions: As the game designer on the project I took the lead on directing our creative efforts. My efforts helped create a well-received, fun, and engaging experience that made a strong attempt at achieving our transformational goals. Other areas in which I made significant contributions were:
An ideation process that created the main mechanic of the game
In this article I will chronicle my design process in creating Angle Jungle, an award-winning transformational puzzle game; then how I went about creating the puzzles within the experience; and finally the lessons learned.
Angle Jungle is an award-winning educational puzzle game for fourth to sixth graders studying geometry. At the start of development our requirements were up in the air. Following discussions with our client, we settled on the following objectives:
From our paper prototypes, we chose to refine two based on feedback.
In parallel we began creating digital prototypes based on these paper prototypes.
Our breakthrough moment came when Jesse Schell, a faculty member at the ETC, pointed out that though these games used angles, both could be played without thinking about angles. We needed to make an angles-essential experience. This priceless notion led us to create Angle Jungle’s progenitor, which we called Treasure Hunter.
Treasure Hunter’s mechanic encoded the relationship between the numeric and spatial representations of angles. This was achieved by having players use numeric representations to create spatial representations in order to solve a puzzle. We believed this embodied a system where angles were essential. We then began refining Treasure Hunter.
After positive feedback from playtesting we next created a digital prototype.
In the above video, players slot numeric values into a beam maker, which creates a spatial value. A certain spatial value is required to hit an objective, solve the puzzle, and receive treasure. This digital prototype then went through many more iterations.
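To make the encoding concrete, here is a minimal sketch of the kind of check the beam maker performs. This is an illustration only; the function names and the wrap-around behaviour are my assumptions, not the game's actual implementation.

```python
# Sketch of the beam-maker check: slotted numeric gem values combine
# into a single spatial angle, which must match the puzzle's target.
# (Illustrative only; not the actual Angle Jungle implementation.)

def beam_angle(gem_values):
    """Combine slotted numeric gem values into one spatial angle (degrees)."""
    return sum(gem_values) % 360  # wrap angles of 360 or more back around

def solves_puzzle(gem_values, target_angle):
    """A puzzle is solved when the beam's angle hits the target exactly."""
    return beam_angle(gem_values) == target_angle % 360

print(solves_puzzle([30, 45], 75))    # True: 30 + 45 = 75
print(solves_puzzle([90, 120], 180))  # False: 90 + 120 = 210
```

The point of the sketch is that the player manipulates numbers but the game evaluates a spatial result, so reasoning about the numeric-spatial relationship is unavoidable.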
At this point in development we had the foundations for an experience. What was needed next was to design that experience.
How does one go about creating an experience? There are infinite ways, but we began by considering the difficulty curve of our experience.
The above graph is an abstract difficulty curve which displays a sequence of tense and release cycles of increasing difficulty. This curve would form the underlying foundation of our experience.
With an idea of what we wanted the experience to look like, next we conceptualized the elements within the greater experience. The inspiration for this process came from a number of sources including the learning materials of our target demographic.
Our aim was essentially to gamify our target demographic’s learning material. We would achieve this through gameplay elements that attempted to capture the kinds of problems students faced in the classroom. These gameplay elements would form the core components of the experience.
Whilst conceptualizing our gameplay elements we also considered the possibility that the puzzle may not be intrinsically motivating enough for players. Therefore we created two additional supporting motivational factors.
A gender-neutral character that needed assistance (inspired by Jesse Schell’s Lens of Help). Supporting characters are common in educational experiences, and there is research on their potential benefits for players, so we hoped this would augment learning within our experience.
In addition we created The Cabin. The Cabin would contain rewards in the form of treasures and trophies. It would act as a motivational element by creating Golden Expectations (expectations of reward) through the aesthetic use of empty shelves, as well as serve as a measure of game progress.
We also recognized the need to space out our rewards for better impact. We therefore arranged rewards into evenly spaced intervals.
Together these pieces could further flesh out the difficulty curve of our experience. The peaks of our difficulty curve would now commonly correspond to the introduction of gameplay elements, and the dips would be periods of rest at The Cabin.
The experience needed more, though; it cried out for substance in the form of puzzle content.
Transformational Puzzle Complexity
With a high-level view, and the fundamental elements of the experience in mind we went about crafting a set of transformational puzzles.
This process resulted in a jumbled pile of puzzles. This was a good first step, but it did not fit the experience structure we wanted. We therefore turned to a mighty tool: the spreadsheet.
The spreadsheet consisted of a column for each gameplay element, whose values we incrementally increased to raise puzzle complexity. This tool complemented the design process as we created more puzzles based on these new complexity constraints.
Two additional considerations came to mind during this process:
Include drops in puzzle complexity when introducing new gameplay elements to allow for more effective tutorials.
Have the majority of learning occur early when complexity is low.
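The spirit of that spreadsheet can be sketched as a small table: one column per gameplay element, with values ramped up level by level and a deliberate dip whenever a new element appears. All element names and counts below are illustrative assumptions, not our actual level data.

```python
# Illustrative complexity table: one row per level, one column per
# gameplay element. (Element names and counts are invented; they are
# not the real Angle Jungle data.)
LEVELS = [
    # (angle_gems, beam_splitters, blockers)
    (1, 0, 0),  # level 1: tutorial, complexity kept low
    (2, 0, 0),
    (3, 0, 0),
    (1, 1, 0),  # level 4: beam splitters introduced -> complexity dips
    (2, 1, 0),
    (3, 2, 0),
    (1, 1, 1),  # level 7: blockers introduced -> complexity dips again
    (2, 2, 1),
    (3, 2, 2),
]

def complexity(row):
    # Crude proxy: total number of puzzle elements in the level.
    return sum(row)

print([complexity(r) for r in LEVELS])  # [1, 2, 3, 2, 3, 5, 3, 5, 7]
```

Note the dips at the element-introduction levels (the tutorial consideration above) and the gentle overall ramp, which keeps most of the learning early while complexity is low.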
The result of this work was a structure of thirty levels which we then playtested.
Although initial playtests were largely positive they revealed two design issues:
Lack of Angle Diversity – A small number of angle values occurred very frequently across the experience, meaning less exposure to different angle values.
One Gem Solutions – Solutions that required only one angle gem on more complex levels meant less interaction with different angle values.
Both issues were detrimental to our goal of building familiarity with the angle system. Therefore, two methods of analysis were used to solve these issues:
Angle Distribution Analysis – Counts of each angle value used.
Angle Solution Analysis – A comparison of solution angles against angle values used.
These methods revealed a number of such ‘issue’ levels.
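Both analyses are simple to express; the sketch below shows the idea on invented level data (the dictionaries and angle values are hypothetical, not our real levels).

```python
from collections import Counter

# Hypothetical level data: the angle gems available in each level,
# and the gems used by the intended solution.
levels = [
    {"gems": [30, 30, 45, 90], "solution": [30, 45]},
    {"gems": [30, 60, 90],     "solution": [90]},       # one gem solution
    {"gems": [30, 30, 30, 45], "solution": [30, 30]},
]

# Angle Distribution Analysis: count how often each angle value is used.
distribution = Counter(angle for lv in levels for angle in lv["gems"])
print(distribution)

# Angle Solution Analysis: flag levels whose solution needs only a
# single gem, meaning little interaction with different angle values.
one_gem_levels = [i for i, lv in enumerate(levels, start=1)
                  if len(lv["solution"]) == 1]
print(one_gem_levels)  # [2]
```

A distribution dominated by a few values signals low angle diversity, and any flagged level is a candidate for redesign.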
The result of iteratively applying this analysis was that both complexity and angle diversity were maintained and improved. This ultimately meant a better attempt at achieving our transformational goal.
At the end of the project we ended up with a concrete primary transformational objective, and several secondary transformational objectives.
Build familiarity with the angle system by having players practice solving puzzles using a mechanic that has an encoded relationship between the numeric and spatial representations of angles.
Sharon Carver – ‘The actual angle choices at the various levels and the angle meter seemed to work well and COULD promote learning of the concepts and spatial relations of angles, as long as students don’t game the system’.
In addition to our primary transformational objective we took the opportunity to introduce a number of secondary transformational objectives in manners that were natural extensions of the core experience (providing the experience with more puzzle content).
Protractor Tool Usage
To solve a puzzle, players had to work out the angle they needed to make. This was difficult for some playtesters and therefore provided a natural opportunity to introduce a protractor scaffolding tool.
By making this tool available we built in the protractor in a manner that was of natural, clear benefit to our players. We hoped by doing so to build familiarity with and appreciation of the tool by creating a puzzle environment where it was undoubtedly helpful. Playtesting showed that this strategy ‘seemed’ to work.
Sharon Carver – ‘I especially like the meter that shows the full 360 degrees while the player is working on selecting angles. It would definitely be worth testing the impact’
Introduce both anticlockwise and clockwise rotation, and angle addition and subtraction.
Angles Above 180
Expose students to angles greater than 180 degrees.
Whilst exposing students to our core mechanic (an encoding between the numeric and spatial representation of angles), initial levels would allow brute force approaches to be rewarded in order to draw in the player with easy rewards.
Allowing for such ‘brute force’ (choices made without solid reasoning) approaches resulted in the following criticism being raised:
What if players are not doing the thinking you want?
In defense of brute force, we responded with the following counterpoints:
Absolutely mindless play is rare, so since the use of numeric angle values is essential even with a brute force approach, players are likely to at least reason about this aspect of the game.
Supporting brute force approaches makes the experience more accessible (we had first graders reach level 22 with help!).
Brute force approaches are only reasonably satisfying in low complexity puzzles (playtesters who solely practiced a brute force approach experienced frustration on more complex puzzles).
Most importantly though, we admitted that when complexity was low players would not have to think ‘much’. This was intentional. The experience allowed it for a deeper purpose.
We intended to combine that brute force motivation together with puzzle complexity as a transformative tool to incentivize a ‘logical’ approach. As puzzle complexity slowly increased the experience would naturally create skill appropriate ‘teachable moments’ for teachers to capitalize on.
The results of this process created an experience that contained:
Suitable learning and puzzle complexity curves
An appropriate pattern of tense and release
Appropriately interspersed rewards
An exposure to a wide variety of angle values
A mechanic where angles were essential (encoded the relationship between spatial and numeric representations of angles)
As part of the educational game project my team was working on, we were required to build a reward system. This system took the form of a trophy room that would display the trophies players had earned. After playtesting, though, we found we had created an expectation of treasure which we were not fulfilling. The following is a gameplay video in which players collect treasure chests at the end of each level.
So in order to fulfill this expectation we created additional art assets with which to fill the empty room. Here we faced a dilemma: we did not want to force players to watch treasure being added to the room at the end of every level, as this would be far too disruptive to the game experience. So how does one fulfill the expectation of reward without forcibly having the player see the reward appear?
One thing helped us in this regard: we had already designed fixed reward intervals through the trophy system, which brought players to the trophy room to observe each new trophy being added.
In our experience we had periods of fixed visitation where the player was guaranteed to see the Trophy Room. Looking at the experience more methodically, we were giving trophies at the following intervals (we had thirty levels).
One and thirty were absolutely necessary since they began and ended the experience. The others were decided based on the difficulty curve designed in previous weeks. Again we asked ourselves the question: how does one fulfill the expectation of reward without forcibly having the player see the reward appear?
As part of the Game Design course taught by Jesse Schell at Carnegie Mellon’s Entertainment Technology Center, we were required to create whatever game experience we wished. The one requirement was that it be excellent! So I created Dominate, a tabletop strategy game!
In addition to creating the experience we prepared a marketing and rule sheet, as well as a written record of our iterative, playtest-driven process. The following are materials from my playtest notes.
Date: March 29th 2017
Purpose: Playtesting initial concept
Time: 20 minutes
A significant number of broken rules
Two of the construction resources, sheep and wood, were unnecessary
Need a method of counting the different resources, can’t keep track of it mentally
How do I know when I won?
Playtesters had trouble counting tokens
Giving players the choice of resource location made resource placement polarized and clumped
Since there were no restrictions on village placement, players would build lots of villages around themselves, making the game drag on longer
Made counting easier
Made rule about connecting villages
Limit the construction of villages and temples to speed up the game
Gave temples life
Made a lose condition possible (all of an enemy’s temples reaching zero HP)
Wrote up rule set
Needed a document to playtest rules with
Date: March 30th 2017
Purpose: First playtest with largely functioning game
Male – 21
Male – 23
Time: 40 minutes
Fireball needs a chance to hit; I didn’t like knowing I would lose for sure
Who casts first should be based on a dice roll; again, I didn’t like knowing I would lose for sure
The rules for village placement are confusing
Found resource collection rate difficult to count
Liked the strategic element in fireballing then converting enemy villages
Players had a good time
Players wasted a lot of time counting resources
Found an issue where a player who placed their temple in a certain pattern became blocked from building
Both my playtesters were programmers
Allowed world to wrap around itself
Avoid the issue caused by the limit of three building connections per building
Fixed in rule sheet to clarify village placement
Clarification based on request
Added an initiative system so the spell phase was not a guaranteed thing
Stop the feeling that you were guaranteed to lose
Added a 2:1 conversion of resources to belief
People seemed to enjoy the spell phase more than the build phase, so I wanted to charge up the spell phase. It was also one method of increasing the utility of resources, making investment in resource growth more useful.
Date: March 30th 2017
Purpose: First iteration of rule sheet, introduction of game to more ‘casual players’
Male – 28
Female – 30
Time: 45 minutes
Make the game board bigger!
Color code the villages!
Board is so cluttered, can’t see anything!
Don’t need initiative rolls every time, just do contest rolls on build if wanting to build in the same spot (everyone declares where they are planning to build then builds)
Playtesters got bored waiting for their turn
Playtesters didn’t read the rules at all
Playtesters had great difficulty counting belief and resources
Playtesters found the world wrap rule super hard to visualize
Both my playtesters were more artistic individuals and casual game players – from the previous playtest it seems that my game is more suited to strategy game fans
Playtesters converted all their resources into belief, as they found that part most fun
Playtester thought the strategy of high belief would work, but lost because they had no resource base to sustain that burst of belief
Made game board bigger
Made color coded tiles and villages
Made one's own villages easier to see
Introduced contest rolls on build
A way to allow free-for-all building while resolving cases where two players want to build in the same place
Touched up rule page
Added more pictures in case people didn't want to read
Date: 5th April 2017
Purpose: Second iteration of rule sheet and 1v1v1 setting
Male – 26
Male – 30+
10 minutes to understand rules
40 minutes to play game
Include pictures of tiles on instructions
So what is the victory condition?
Mention influence earlier
Use the word adjacent, it’s clearer
Clarify construction rules, they are not clear
Mention that villages at 1 development level cannot be destroyed
Typo on spells, town not village
Watching these playtesters reading the rules showed that I needed to change the information order to make the document easier to process
Playtesters were confused that they needed to select separate colors
Playtesters placed tiles on top of each other which I needed to verbally clarify
Playtesters found the phrasing of various parts of the rules confusing, and had to jump back and forward in the rule book to understand the rules
Players found the overlap rule confusing
Players found counting the resources wasn’t too bad
Player suggested using higher value counters to make collection of resources faster
Players suggested a counting tool to keep track of how much you need to collect
Players suggested bidding resources to win the spell phase
Players suggested building should not be simultaneous but instead be one after another like before
Players suggested a thematic change to lightning bolt
Player had difficulty understanding the rules at first but then got into the game
Players felt the counting of belief and resources was most tedious
Reduced the cost of fireball to 1 but introduced a probability of it missing (the intention being to create more tension when attacking)
Create a balanced fireball spell with an element of chance
Added image of village and temple to rule set
Wanted a visual indicator of what was what for easier understanding
Made a resource/belief tracker for easier counting
Wanted players to focus on the game rather than counting chips
Added 2-1 conversion to rule sheet
Improve the rulesheet
Made a variety of fixes to the rule sheet, e.g. reordered sections, clarified victory conditions, made explicit that tiles don’t stack, clarified construction rules, and explicitly stated that players are assigned colors
Improve the rulesheet
Date: 8th April 2017
Purpose: Third iteration of rule sheet and 1v1v1 setting
10 minutes to understand the rules
1 hr 20 minutes to play game
Playtester complained that reading the rules felt like studying
A very interesting moment came when players said there was no need for chips: use the income tracker to keep track of how much you have instead
Playtesters mentioned income tracker could use a zero
Playtesters suggested having some visual indicator for turn order
Players wanted the resource and belief tokens on the income tracker to be more obvious
Playtesters found the income tracker awkward to use, and wanted more numbers on it instead of having to do arithmetic
Playtesters wanted a more efficient way of removing and adding villages to the board, and suggested making color coded physical representations of the village which could be placed and removed from the board
Playtesters suggested carefully considering how to manage the player who would lose the game early – either give them incentives to stay after losing, design it so they can continue and have an incentive to stay, or accelerate the game to end quickly
Playtesters suggested trying 1v1 or 2v2 game format.
For the first time I explained as little as possible and had playtesters read the rules and play; I only had to explain the income tracker.
Playtesters understood how to generate the board, and do the initial game setup
Had to explain the income tracker
I needed to explain how to represent development levels, how to use the income tracker, and how to use a d6 to represent the temple’s HP
Players never used the offering mechanic
With three playtesters the maximum amount of belief/resources reached around 15-16
What happened was a Mexican-standoff moment where each player had direct access to attack the other players’ temples. Thanks to the element of chance in the spell phase, the weakest player actually won: one player destroyed another, then the weakest player won the spell phase of the next turn and killed the remaining player before they could retaliate
Changed the income tracker to the warchest, a tool for keeping account of how much resource and belief a player has
Completely eliminate the need to use chips for keeping track of a player's belief and resources
Kept the offering mechanic
Wanted to test how it would affect a game when used properly; it was designed to reduce the power of the spell phase and also to offset the advantage of a guaranteed first cast
Changed the income tracker to the warchest and also added a zero to it
Completely removed the need to use chips to represent the amount of resources you had, allowing players to focus even more on the core experience
Date: 9th April 2017
Purpose: Wanted to test what 1v1 was like
Male – 21
Time: 25 minutes
Playtester got upset and felt cheated by the game because they didn’t fully understand the rule that a building may connect to at most three adjacent buildings
The playtest was short and the other player lost very quickly; the playtester wasn’t happy at all and felt cheated by the game
The problem was they were in a situation where they could not build anything anywhere – I think a solution for the 1v1 game mode would be to give players two temples rather than one, adding more skill to it
Used the offering mechanic to cast spells first
Made three game modes: 1v1v1; 2v2 (two players with two temples each); and 1v1 (each player has two temples)
Avoid the disastrous playtest happening again by giving a single player two temples
Date: 9th April 2017
Purpose: Wanted to test out what the 1v1 with two temples was like
Time: 33 minutes
The dynamic was certainly different: two allied temples were placed back to back
The other two were on the sides of the map
What ended up happening was that the middle two gained lots of resources, which built up over time; the aggressive village tactic was eventually overcome by resource snowballing, the central allied players won, and the two outer players forfeited before the end of the game
Found that placing resource chips (chips that represent the resource income of a tile) made counting resources much faster; will do it in future playtests
Added resource tokens onto village and temple tiles
Making counting of resource income much faster
Date: 10th April 2017
Male – 24
Time: 42 minutes
Initially I was doing well, then the playtester converted a critical village and I lost
Playtester liked the idea of converting resource to belief
Told me that playing required multidimensional thinking: resource gain, blocking, and long-term growth
Resources became so important because of the offering system
Winning required finding critical villages and capturing them, and anticipating your enemy’s offering
Playtester commented that the warchest system was good, but they didn’t mind the old system of counting chips one by one
Playtester appreciated new method of displaying village and resources on map
Found it hard to find resource tiles since the tiles were in a pile
Playtester found a better way of arranging belief and resource tokens on the warchest: keep them by the side so as not to obstruct the numbers. Will update that in the rule set
Improve warchest by having tokens not obscure the warchest
Made a box with compartments to make it much easier to find the piece you needed
Reduce the hassle in finding game pieces
Added the resource and belief token representations to the rules
Speed up the process of counting resources and belief
Date: 10th April 2017
Male – 21
Male – 21
Male – 22
Time: 40 minutes
Asked if resources were generic
Couldn’t find a use for belief
Confused about building only within area of influence
Found the village upgrade table super confusing; they thought it cost one to upgrade to level 2
Got confused by a line that said build first, cast spells last
Highly disliked the whole three-adjacent-village thing
6×6 feels small for 4 players
Game suffers from the same problem as RISK, where one player clearly snowballs to victory
Feels like you know who is going to win from the start based on position
Playtesters said to consider a larger map and multiple temples
Playtesters suggested giving temples some resistance to fireballs
Read the rules in 6 minutes – skimmed it
Allied players placed their temple in a resource-rich but locationally disadvantaged position, were unable to get lucky enough to break out of their bad positioning, and so lost the game
Playtesters did not know the adjacent-first rule and so placed tiles thinking they could place anywhere, which they said messed up the game for them
Removed the rule of adjacent to three
Players did not like this rule, and players (including myself) often forgot to keep to it
Changed the phrase ‘resource cost’ to ‘construction cost’, along with the phrasing around construction and upgrade of villages
To clarify this
Added new rule for temple damage
Made temples resistant to fireballs to reduce likelihood of player losing in one turn
Made changes to rule set based on confusions from playtest
Improve the ruleset
Date: 11th April 2017
Male – 28
Time: 42 minutes
Destroyed temple should become empty
Board still needs to be bigger, still feels cluttered but is improved from before
Fun game, liked the warchest system
Moving around the map, some places are hard to reach
Didn’t want to place 1-belief villages as it was suboptimal
Inert villages seem weird in 1v1; didn’t think to convert my own because it felt like I already owned it
I would play again
Real time strategy board game
Wished there was another dimension to movement
Player went crazy in converting to belief to try and take me out quickly
I invested in building up resources and eventually snowballed to victory
When a temple is destroyed it becomes empty
A more sensible outcome, and a reward for the player who destroyed the temple
Clarified offering rules in rule sheet
Improve the rule sheet
What Went Right
Warchest system was a marked improvement over the old system of counting chips. The warchest cleared up the playspace and created an easy way for players to keep track of their resources without fussing around with chips. This allowed them to focus on the game.
The new method of representing income and belief made collecting resources at the start of the turn much easier; before, a significant amount of time was wasted counting. This was a marked improvement.
Adding dice rolls to attacking heightened the tension in the game and had a positive effect on gameplay.
Once players got over learning the rules they had generally positive feedback about the experience, particularly that throughout the game players had the option of several interesting choices.
Adding the resource-to-belief conversion rule was highly appreciated. It created a good reason to invest in growing one’s village network, so that a player had more resources to convert to belief. Players would now avoid wastefully placing villages that weren’t connected to a resource. This helped address the problem I had seen in my first playtest of arbitrarily built villages.
The way the game was designed allowed it to scale very easily in grid size, number of players, temples per player, and resource tiles per column. This design supported a variety of game modes (1v1, 2v2) which felt distinct, making the game more accommodating to different numbers of players.
Procedural generation of the board helped make the board experience fresh each time, increasing replayability.
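As a rough illustration of that scalability, board setup can be parameterized over grid size and resource density. The generator below is a sketch under my own assumptions; the real generation rules for Dominate aren't documented here.

```python
import random

def generate_board(size=6, resource_tiles_per_column=2, seed=None):
    """Randomly scatter resource tiles, a fixed number per column.

    Illustrative sketch only, not Dominate's actual setup procedure.
    """
    rng = random.Random(seed)
    board = [["plain"] * size for _ in range(size)]
    for col in range(size):
        # Pick distinct rows in this column to hold resource tiles.
        for row in rng.sample(range(size), resource_tiles_per_column):
            board[row][col] = "resource"
    return board

# Each run with a different seed yields a fresh board layout.
for row in generate_board(seed=1):
    print(" ".join("R" if tile == "resource" else "." for tile in row))
```

Because the parameters are independent, the same routine covers a bigger map for four players or a denser resource spread for 1v1.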
What Went Wrong
Playtesters didn’t spend much time reading the rules, so they made suboptimal choices in the game, got upset, and felt cheated. Placement of temples and villages was particularly bad: if placed incorrectly, the game could be lost unless players got lucky with die rolls.
As one playtester pointed out, my game suffers from the RISK problem where one player visibly snowballs to victory. This caused forfeiting to occur multiple times to save time, because the odds were clearly stacked against the player. RISK attempted to address this problem with country cards that gave bonus armies; perhaps something equivalent would help my game.
Procedural generation of the board acted as a double-edged sword. If the board was generated in a way that made blocking a player’s progress easy, new players felt upset and cheated (in tandem with point 1).
During spring break we had the chance to playtest a digital prototype of our game. The game consisted of five puzzles, and the intention of the playtest was to see whether our target demographic and client (Colonial School) liked the game, and to gather their thoughts. Feedback from both the teacher and our target demographic was as follows:
Kids liked the game
Thought it was easy, wanted more challenge
Understood the mechanic immediately
Completed the game within 5 minutes
When asked about characters they wanted they mentioned all kinds of animals they saw in the jungle
Again asked for a wrestler
Had no major complaints about art or mechanic or story
One kid wanted dragons
One kid recognized it was a maths game but kept playing
Asked for more levels!
Teacher liked the game
Said reverse angle gems (move in opposite direction) would be fine but only on advanced levels
Wanted some source of competition, so the star rating system should have a total for students to compete against each other
Teacher said using games to teach the angles of shapes would be fine
Teacher said students are not taught physics at their level (leaving physics out is a good idea)
Improv is a skill we use every single day; it is a facet of how we deal with the unknown, and its development has incalculable benefits for our lives. Whilst at the Entertainment Technology Center, I found the following exercises most useful:
I Own This Place
In this exercise we would each receive a card from a pack of playing cards, which assigned us a number. Based on that number we would adopt a status between extreme high and low.
Learning the concepts of high and low status, as well as their traits, has allowed me to reflect on myself. Not only do I better recognize status traits in others, but I intend to use this knowledge: I aim to exhibit higher-status traits and avoid lower-status ones, as I feel this is essential for many things, including the leadership positions I aim for in my career.
Different Language Conversation
This exercise involved sitting in a semi-circle, and talking to each other in different languages.
My takeaway was a reinforcement of how important it is to pay attention even when you don’t understand. In and out of the industry we will have conversations where we don’t understand the ‘lingo’ of the speaker, such as when listening to highly technical speakers. Listening intently in those cases improves the conversation by respecting the speaker, and allows for a smoother transition to a language one does understand.
This week was spent working on UX changes as well as polish to the game.
A number of UX changes were made:
One Gem Solutions
One-gem solutions are cases where a single gem solves a puzzle outright, letting players ignore the rest of the gems in the level. This work consisted of fixing a number of one-gem solutions that appeared during playtesting.
Changed the protractor tool tutorial to an earlier level, then introduced it again in a later level to hopefully increase the probability that players will use it.
During our playtest it was revealed that repeatedly slotting and removing a gem could be used as a cheat to beat a level. We solved this issue technically by checking for slot actions, and not allowing a win to occur if a gem had been slotted within a short time window.
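The check described might look like the following minimal sketch (a Python stand-in for the actual game code; the names `SlotTracker` and `SLOT_COOLDOWN`, and the one-second value, are illustrative assumptions):

```python
import time

# Hypothetical cooldown (seconds) after a slot action during which a win
# cannot be registered; the game's real value may differ.
SLOT_COOLDOWN = 1.0

class SlotTracker:
    """Tracks the last time a gem was slotted or removed."""

    def __init__(self):
        self.last_slot_time = None

    def on_slot(self, now=None):
        # Record the moment a gem was slotted into (or removed from) a slot.
        self.last_slot_time = time.monotonic() if now is None else now

    def can_win(self, now=None):
        # Disallow a win if a slot action happened within the cooldown,
        # defeating the rapid slot-and-remove exploit.
        if self.last_slot_time is None:
            return True
        now = time.monotonic() if now is None else now
        return (now - self.last_slot_time) >= SLOT_COOLDOWN
```

Passing `now` explicitly makes the check easy to unit-test without real waiting.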
We reconsidered the flow of the first-time play experience. Initially, first-time players started directly at level one; the intention was to grab players' attention by showing them the most interesting thing first. This was changed to start with the map instead because:
It was our actual homepage.
Many other games followed a standard of showing the map first rather than introducing the gameplay.
At the start of week twelve, polishing the game was at the forefront of our minds. Design-wise, we continued to struggle with small but vitally important decisions, namely the visual representation of angles during gameplay and the introduction of our scaffolding tool (the protractor from week eleven).
We met with Jessica Hammer on Thursday to get a perspective on what we had done and the issues facing us. She told us the following:
Clarify our learning goals and organize them into a table
Make red and blue gems' beam movement uniform, so red always goes anticlockwise and blue always goes clockwise
Reconsider the visual representation of clockwise movements
She was interested in the protractor tool introduction and suggested we put it on level three, where we introduce nothing new and so cognitive load is not high
Jesse to the Rescue!
Following this we met with Jesse Schell on the evening of the same day. Being the masterful designer he is, Jesse gave us a suggestion of displaying the spatial representation of the angle.
Jesse's suggestion was that when the beam rotated clockwise, the beam maker would pop out the full 360-degree representation and subtract from it as the beam moved past 0. When the beam rotated anticlockwise, the sector would instead grow as the beam moved.
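A minimal Python sketch of this sector behaviour as we interpreted it (the function name and the 0–360 input convention are illustrative assumptions, not the shipped implementation):

```python
def sector_sweep(degrees_moved, direction):
    """Degrees of filled sector shown for a beam that has rotated
    `degrees_moved` (0-360) from its starting position.

    Interpretation of the design (an assumption, not the shipped code):
    - anticlockwise: the sector grows with the beam, so it equals the
      degrees moved;
    - clockwise: a full 360-degree disc pops out, and the sector shrinks
      from 360 as the beam sweeps.
    """
    if direction == "anticlockwise":
        return degrees_moved
    if direction == "clockwise":
        return 360 - degrees_moved
    raise ValueError("direction must be 'clockwise' or 'anticlockwise'")
```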
We implemented this feature, then spent the rest of the week playtesting the levels we had, and weeding out one gem solution angles.
Starting Week 11 we finished creating digital versions of our remaining puzzles. In addition we began working on the various aspects of the game that we presented to our playtesters at the end of Week 10.
We added a map to replace the original level select screen. The new map would serve two functions.
It would display the progression of the game to the player
Create a more visually appealing method of level selection
We also implemented a reward system in the form of trophies added to one's treasure room after completing a 'boss level'. We hoped such an addition would be a motivational factor for completing the game.
Later in the week Jesse Schell played the game, and suggested a new way to show the treasure room. Instead of having trophies placed on the desk, have shelves arranged in a geometric way with numbers on them to reinforce the central theme of angles. In addition to this we considered including random treasures, which we hoped would add a surprise factor.
During Week Ten we prepared designs for the final levels of the game. These levels were in line with the complexity metrics we established during Week 9.
During this process we also documented our puzzles, and their solutions. This document would not only help recreate these puzzles during development, but could be handed off to teachers as a supporting document.
Meanwhile we began preparation for The Entertainment Technology Center's playtest day. This would involve members of our target demographic visiting our project rooms to playtest our game. For this day we came up with a number of questions to ask our playtesters, and prepared video and screen recording equipment to capture gameplay footage.
On Playtest day we had five groups of playtesters. Each group played the game for approximately fifteen minutes. We then conducted a short interview with them, and found several good insights such as:
They really enjoyed the game, we never had a case of a bored playtester
Even when playtesters got stuck they cried out for help, and we had cases of playtesters working together to solve puzzles
The protractor tool was useful, but since there was no clear tutorial, playtesters only found it by accident
Playtesters liked the art, music as well as the treasures we would reward them with
Playtesters didn’t object to the main character, but found certain animations weird
Recently we have been working to create an educational game on angles. Part of that requires designing puzzles that try to provide educational value. The following blog post is a continuation of a look at our process.
The most important part when analyzing our puzzles was first to recognize our puzzle metrics. Initially these metrics were as follows:
Number of slots
Number of gems
We began our first pass using these metrics to craft the thirty puzzles that would form the core structure of our game. The process essentially boiled down to a table with these metrics as columns. We incrementally increased the metrics until key climax moments, which we referred to as 'boss levels'. Following a boss level we dropped the metrics to allow for the introduction of a new system in a simpler environment.
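The ramp-then-drop pacing behind that table can be sketched as follows (all numbers and names here are illustrative, not our actual tuning values):

```python
def ramp_metrics(num_levels, boss_levels, base=1, step=1, drop=2):
    """Sketch of ramp-then-drop pacing: a metric increases by `step`
    each level, and after a boss level drops by `drop` to make room
    for introducing a new system in a simpler environment."""
    values = []
    current = base
    for level in range(1, num_levels + 1):
        values.append((level, current))
        if level in boss_levels:
            current = max(base, current - drop)  # rest after the climax
        else:
            current += step
    return values
```

For example, with a boss at level 4 the metric climbs 1, 2, 3, 4 and then falls back before climbing again.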
Our first pass at developing the puzzles allowed us to create the initial structure of the experience. On further examination, points three and four actually had more depth to them. We broke these points into each and every gem value. This additional depth warranted further analysis.
We then went about constructing a meaningful method of presenting what we called ‘angle distribution’. Using this we mapped out each and every gem per level. This method of analysis revealed several levels that were problematic for different reasons such as:
High angle overlap
Had no garbage
Levels that were similarly structured
These key points conflicted with our main educational objective of improving familiarity with both numeric and visual representations of angles. For one, a large number of similar angles meant that exposure to different angle values in the 360-degree system was lower. So for our second pass we redesigned certain levels, adding in garbage and choosing angle gems carefully to avoid overlap.
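The angle-distribution pass can be sketched as a small analysis routine (a stand-in, with illustrative thresholds and names, for the spreadsheet-style mapping we actually did):

```python
from collections import Counter

def analyze_level(gems, solution_gems):
    """Flag the problems our 'angle distribution' pass looked for.
    `gems` is every angle gem in a level, `solution_gems` the gems used
    by the intended solution; both are lists of degree values."""
    counts = Counter(gems)
    issues = []
    # High angle overlap: the same angle value appears many times,
    # lowering exposure to different values in the 360-degree system.
    if any(count > 2 for count in counts.values()):
        issues.append("high angle overlap")
    # No garbage: every gem is part of the solution, so there are no
    # decoy gems forcing the player to weigh angle values.
    if Counter(solution_gems) == counts:
        issues.append("no garbage")
    return issues
```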
On making a third pass at the levels we again found a problem. Our third pass took the form of playing the levels, and we found that some gems were direct solutions to problems in hard puzzles.
Although it is good that players are able to discern such a solution, we felt that doing so meant engaging less with the angle gems in the level, since several other gems were left out of the solution entirely. Thus we weeded such scenarios out during our third pass.
Essentially the process boiled down to a number of steps:
Carefully study the components within our structure
Extrapolate areas for further fine grained analysis
Develop a tool for analysis
Apply the tool
Identify and address problem areas
Replay the experience
Using this process we iteratively analyzed our puzzles, redesigning when necessary to ensure levels had particular solutions with minimal overlap. Now with a clear design process, all that's left to do is playtest and hope the design works!
I was fortunate enough to be able to attend this year's Game Developers Conference (GDC). Whilst there I had the pleasure of attending its Game Design Workshop.
The Game Design Workshop took place over two days. Both days included a general session and an elective, and I attended both Day 1 and Day 2. The following is a brief account of the experience. If you are an aspiring designer and have the opportunity to attend the workshop, it is a must-do event!
On the workshop's first day we played SiSSYFiGHT. After playing a few rounds, we were asked to come up with a new theme for the game. This involved each team member writing sticky notes, grouping them, then voting on a theme.
With a theme of ‘artists vying for attention in the art world’ we added a steal mechanic. The steal mechanic would allow the attacking player to gain the points the other player lost. We quickly found that this mechanic made the game go on infinitely.
We then changed our chosen mechanic to Favor. The Favor mechanic gave a single point to the player of our choosing. This mechanic was better balanced, and encouraged cooperative behavior.
Game of Games
The first elective I chose was Game of Games, run by Marc LeBlanc. For the elective we created a system (we did a card game) with a single rule. Our rule had players first play two cards of the same suit, then play another two cards which had to add up to the higher of the last played pair. We then were instructed to merge our game with another.
Fortunately the merge was easy, as the other game employed a similar rule (another card-constraint rule, where the total of two cards needed to add up to an odd number). We repeated this system-merge process four times, until finally we had to merge sixteen different systems.
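Our single rule can be captured as a small validity check (a sketch; representing cards as `(value, suit)` tuples is an assumption for illustration):

```python
def valid_sequence(first_pair, second_pair):
    """Checks the two-step rule from our Game of Games card system.
    Step 1: the first two cards must share a suit.
    Step 2: the next two cards must add up to the higher value of the
    first pair."""
    (v1, s1), (v2, s2) = first_pair
    if s1 != s2:
        return False  # first pair must be the same suit
    (v3, _), (v4, _) = second_pair
    return v3 + v4 == max(v1, v2)
```

For example, a 7 and 3 of hearts followed by any 5 and 2 is a valid sequence, since 5 + 2 equals the higher card, 7.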
During this 'ordeal' the hardest part was merging a card-based system with a dice-based system. Our first attempt at this merge was setting up two asymmetric games played simultaneously against each other, but this was not a satisfactory outcome.
We continued to struggle until Marc LeBlanc allowed us to cut from the system during merging, the only constraint being to keep the core components (the dominoes, dice, and cards). At that point we brainstormed and came up with a way of dealing with this: combining the systems through a medium they all shared, which was numbers.
What we created was a game of matching numbers against eleven dice that were rolled at the start and remained fixed throughout the game. Cards and dominoes were played sequentially, and a player had to produce a pair of numbers matching a die's number to claim it. The winner claimed six of the eleven available dice.
By the end of the workshop Marc LeBlanc introduced the MDA framework, an awesome way of designing a game.
At the beginning of week 9 we had our halves presentation. Following this we met Jesse Schell on Tuesday, and presented our thoughts on how we would go about designing our puzzles. His suggestion was simple.
JUST MAKE PUZZLES. Worry about the details later.
So that is what we did.
The inspiration for our puzzles came from a combination of two sources:
The teaching material that our client used
A map of element complexity against time
The process of considering elemental complexity began with a consideration for the interest curve of the experience. Essentially we wanted an initial large peak then a period of rest, followed by ascending peaks with rests until a climax at the end.
When designing puzzles, Level Design for Games by Phil Co suggested listing the elements of a game and systematically designing puzzles with incrementally harder arrangements of those elements.
In our case we intended to use the elements to increase complexity, while keeping the problems fundamentally the same (problems related to the 360-degree angle system). The elements of our game were:
Receivers & Obstacles
With these elements we created a table of levels against elements, and incrementally increased the number of elements. When a new element was introduced we would drop other elements to lower the difficulty, letting players more clearly grasp the new element.
As part of my Masters in Entertainment Technology I am working on an educational game project at The Entertainment Technology Center. My team aims to essentially create a living 360-degree angle system for fourth to sixth graders to interact with whilst solving puzzles. We hope that through our demographic's interaction with this system we will:
Clarify misconceptions about the system
Build a familiarity with the system through puzzles which require students to use estimation
In approaching this problem we have gone through an extensive ideation process, and the result is that we finally nailed down a core mechanic that makes considering angles essential. The following is a prototype of what we came up with:
Currently in our project we are at a point where we have to create the puzzles that will make up the heart of our educational game. To do this properly requires the creation of an interest curve; but not just any interest curve! As well as needing to be an entertaining experience, we must go one step further and include the element of educational value.
With the objective of gamifying the material that our client uses to teach their students, we began designing an interest curve. The first part of this process was to study the material, which took the form of Common Core sheets.
We looked at each of the sheets, and broke down the different tasks involved which were as follows:
1. Create an angle using a protractor
2. Obtuse, acute, right, and straight problems
3. Visual identification of obtuse, acute, right, and straight
4. Identification of obtuse, acute, right within different shapes
5. Given a protractor diagram identify the angle
6. Estimate an angle between two points
7. Find the missing angle given a total angle
8. Find supplementary angles
9. Find complementary angles
10. Find missing angles in a cross shape
11. Find angles in portions of a circle
12. Find the angles in a triangle
Next, from these tasks we identified those best suited to the game we had created: 1, 2, 3, 5, 6, 7, 8, 9, 11, and 12.
In parallel we created a number of game elements to help us create these problems:
Receivers & Obstacles
We then identified the core gameplay challenges our player will face:
Dragging angle gems into beam generators/receivers
Removing angle gems from beam generators/receivers
Value decisions between angle gems
Clockwise angle gem addition problems
Anticlockwise angle gem addition problems
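The clockwise and anticlockwise addition problems above reduce to modular arithmetic in the 360-degree system. A minimal sketch (treating clockwise rotation as a negative contribution is our illustrative assumption, not necessarily the game's exact rule):

```python
def beam_angle(gem_values, direction="anticlockwise"):
    """Resulting beam angle after slotting `gem_values` (in degrees)
    into a beam maker: the gems' angles add together, wrapped into the
    360-degree system. Clockwise rotation is modelled here as a
    negative contribution (an assumption for illustration)."""
    total = sum(gem_values)
    if direction == "clockwise":
        total = -total
    return total % 360
```

So slotting 90- and 45-degree gems anticlockwise yields a 135-degree beam, while the same gems clockwise wrap around to 225 degrees.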
Given our design and the students' curriculum, we made some assumptions about these challenges:
We consider clockwise movement a more advanced topic
Increasing complexity means increasing challenge, which can be achieved with more mirrors, angle gem slots, and receivers with obstacles
Now with these elements we imagined an interest curve.
A brief description of the process you used to create your adventure. Include any brainstorming notes, etc.
I began the process of creating my adventure with a theme/fantasy. I had a number of ideas, including:
A sports adventure theme
A wild west themed game
A game with vampires
I settled on doing something set in the time period of the Roman Civilization. In particular I loved the setting of the movie Gladiator so my intention was to recreate a similar storytelling experience.
Next I searched for an interest curve that roughly mapped onto what I wanted to create.
Next, based on the five points on the interest curve, I imagined the main scenes of the story, with a brief description of what I wanted to achieve in each scene and the main story beats.
Capture – I wanted the player to be captured.
Training Ground – A scene in the gladiator house of them learning skills and familiarizing themselves with their new world
Gladiator Battle 1 – First gladiator battle, high intensity
Villanus Mansion – A more social situation, with a puzzle
Gladiator Battle 2 – Last gladiator fight, high intensity, kill the boss to win one’s freedom, or kill each other.
I was inspired by the game Shadow of Rome, and wanted a system that supported both combat and social situations. I could have used Roleplaying 101, but I instead chose a system from a tabletop RPG I had played before: Vampire: The Masquerade (VTM). More specifically I used Vampire: The Dark Ages (a medieval setting) for its armour and weapons.
To flesh out my world I needed to perform significant research, namely:
Be aware of the different types of gladiators to give my players and generated enemies some grounding in the world
I also wanted to include animals at one point, so I found applicable stats.
Made a list of important characters and some of their traits to help me roleplay them.
Each scene needed a map so I drew one, including details about who was in each scene.
Refamiliarize myself with VTM’s leveling scheme, social and combat systems.
Found example stats to base my NPCs on.
There were also a number of things I did not do:
I thought of adding in some currency and letting players buy equipment, but felt this might add too much complexity.
I thought of adding special sections such as chariot racing, but left them out due to the added complexity.
All of this I compiled into a long supporting document I used whilst DM’ing that I will include in the following section.
We began week 8 by preparing our digital prototype for playtesting, iterating on various artistic and functional elements including sound and animations. The following was used for our first internal digital playtest.
Based on our focus on Treasure Hunter at the end of Week 5, we added various design additions to the idea, which was shaping up to be a dungeon adventure where players:
Could move around a character
Had an inventory (method of dealing with many gems in a level)
Could defeat monsters (requested by our audience)
Could pick up gem bags (a method of incrementally introducing gems to our puzzles)
Features 1 and 2 were integrated into the following early prototype.
A New Perspective
We met with a designer from Zynga who was visiting The Entertainment Technology Center. She had a look at our idea and advised us to focus on our core mechanic, which was slotting gems into the beam maker.
So based on the feedback we:
Removed gem bags.
Made our main character stationary. The character would now be an assistant acting as a guide (akin to the Dora the Explorer games), giving advice, information, and hints but not actually solving the puzzle directly.
Constrained problems to 180 degrees only, because the teacher requested it.
Finally created 10 levels at the end of the week.
Simultaneously our artist continued to make aesthetic progress.