
C2C Digital Magazine (Fall 2022 - Winter 2023)

Colleague 2 Colleague, Author


Modalities and experiences: Unlocking the gamified metaversity

By Annie Els, Mike Jones, and David J. Swisher, Indiana Wesleyan University


As interest in virtual reality, mixed media, and the metaverse has grown, the Learning Experience Design and Faculty Enrichment teams at Indiana Wesleyan University (along with several departments and programs) have been actively exploring the possibilities and potential applications, researching the data and learning science behind their use, and talking with many vendors and potential partners to better understand how to develop effective learning experiences in the metaverse for students and faculty alike.
 
There is often a legitimate fear of technology and innovation, yet when seen through the lens of iterative change, we may realize we adapt faster than we think.  Mark Rober is an engineer, inventor, and popular YouTube creator.  His YouTube channel (which has over 22 million subscribers) features science, engineering, and design demonstrated through crazy ideas that make learning fun.  The following excerpt is from a TEDx presentation he gave at the University of Pennsylvania.




 
SOURCE: Video segment excerpted from “The Super Mario Effect - Tricking Your Brain into Learning More,” a presentation by Mark Rober at TEDxPenn, available online at: https://youtu.be/9vJRopau0g0.  Mark Rober’s YouTube channel can be viewed here: https://www.youtube.com/c/MarkRober.


Notice that the output is completely the same…you’re still pushing the buttons in a pre-specified sequence and timing to achieve some kind of objective.  But re-framing the approach makes a HUGE difference in motivation and persistence.  This is a good example of what we’re referring to when we talk about learning experience design: Which would you rather take, a 32-page written TEST?  Or an interactive GAME with an objective?
 
It is important to recognize that there is a difference between modality and experience.  The modalities are HOW we deliver the experience: In-Person, Online, Hybrid, Virtual Reality, Metaverse, or some combination thereof.

 

 

SOURCE: Illustration by Mike Jones, Copyright © 2022 Indiana Wesleyan University.


As shown below, experiences happen within the Modalities. These could be projects, scenarios, discussions, games, and gamification.
 



SOURCE: Illustration by Mike Jones, Copyright © 2022 Indiana Wesleyan University.

 
This helps to visualize the difference. And to be clear, you can have multiple experiences within any one of the modalities.  In the Mark Rober clip, we saw the same learning experience described, but with two different modalities…one of which was far more effective than the other.
 
If we look at these different modalities, we can begin to see where the overlaps are and the differences between them:
 




SOURCE: Illustration by Mike Jones, Copyright © 2022 Indiana Wesleyan University.
 
 
We’ve got Mixed Reality (MR), which sits in the middle and touches on all three of the other types of extended reality; then you have Augmented Reality (AR) and Virtual Reality (VR).  And at the bottom we’ve got 360-degree video, which can be implemented in any of the three other modalities.
 
So let’s take a moment to think about how these are actually implemented in the workplace or education.   
 

Mixed Reality



In this technology, 3D objects or sets of instructions are mapped to real-world objects.  The state of these 3D objects can be tracked by the mixed reality hardware, and the digital asset overlays change based on the real-world objects’ current state or the required scenario.  In this example, you can see how the digital fetus is superimposed right on top of the physical patient simulator. The instructor can change the scenario to whatever learning is required.  Students can examine the orientation and actions of the baby from their specific vantage point in the room.  As the state of the physical simulator changes, the mixed reality overlay changes accordingly.
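To make the idea concrete, here is a minimal, hypothetical sketch (written in Python purely for illustration; an actual mixed reality application would run on dedicated MR hardware and tooling) of how a scenario-and-state map can drive which digital overlay is displayed.  Every name, file name, and threshold below is invented for the example.

```python
# Illustrative sketch only: a state-driven overlay map for a mixed reality
# patient simulator. Names (SimulatorState, OVERLAYS, asset files) are
# hypothetical and not drawn from any actual product.

from dataclasses import dataclass

@dataclass
class SimulatorState:
    scenario: str          # e.g., "normal_delivery" or "breech"
    fetal_heart_rate: int  # beats per minute reported by the simulator

# Each scenario maps to the 3D asset and pose that should be overlaid
# on the physical simulator.
OVERLAYS = {
    "normal_delivery": {"asset": "fetus_vertex.glb", "orientation_deg": 0},
    "breech":          {"asset": "fetus_breech.glb", "orientation_deg": 180},
}

def select_overlay(state: SimulatorState) -> dict:
    """Pick the digital overlay that matches the simulator's current state."""
    overlay = dict(OVERLAYS[state.scenario])
    # Simple rule: visually flag distress if the reported heart rate is low.
    overlay["highlight_distress"] = state.fetal_heart_rate < 110
    return overlay

if __name__ == "__main__":
    # The instructor switches the scenario; the overlay updates accordingly.
    for state in (SimulatorState("normal_delivery", 140),
                  SimulatorState("breech", 95)):
        print(state.scenario, "->", select_overlay(state))
```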
 

Augmented Reality



Somewhat similar to this mixed reality environment, augmented reality overlays 3D assets onto the real world via the camera on a mobile device.  In this case, everything except the room itself is virtual.  For this case study, I created a randomized virtual patient that could be instantiated in any environment through the student’s mobile device running the app.  Since there was no physical patient simulator or actual stethoscope, a stethoscope image appeared on the screen of the mobile device.  To interact, the student would move their phone until the stethoscope image came in contact with the correct placement for listening to the patient’s heartbeat.  The phone would then play the sound of the heartbeat through the device’s speakers and vibrate at the same rate as the randomized heartbeat.  In this way, there was no cost involved for a patient simulator, a stethoscope, or a doctor’s office.  Students could instantiate a patient wherever they were and use their mobile device to practice as many times as they wanted.
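The heartbeat interaction described above boils down to a small piece of logic.  Here is a minimal, hypothetical sketch (in Python for readability; the actual app would have been built on a mobile AR framework rather than in Python) showing the idea: a heart rate is randomized per session, and the audio and vibration cues fire only when the on-screen stethoscope lands close enough to the correct auscultation point.  All function names, coordinates, and thresholds are invented for the example.

```python
# Illustrative sketch only: the placement-check and feedback logic for a
# virtual-patient heartbeat exercise. Not the actual mobile app's code.

import math
import random

def randomized_heart_rate() -> int:
    """Pick the virtual patient's heart rate for this practice session (bpm)."""
    return random.randint(55, 110)

def distance(a: tuple, b: tuple) -> float:
    """Euclidean distance between two normalized screen positions."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

def stethoscope_feedback(reticle_xy, auscultation_xy, heart_rate_bpm,
                         tolerance=0.05):
    """
    If the on-screen stethoscope reticle is close enough to the correct
    auscultation point, return the cues the device should produce: the
    heartbeat audio plus a vibration pulsed at the same rate.
    """
    if distance(reticle_xy, auscultation_xy) > tolerance:
        return None  # wrong placement: no sound, no vibration
    seconds_per_beat = 60.0 / heart_rate_bpm
    return {"play_audio": "heartbeat.wav",
            "vibrate_interval_s": round(seconds_per_beat, 2)}

if __name__ == "__main__":
    bpm = randomized_heart_rate()
    # The student sweeps the phone across the patient; positions are normalized.
    print(stethoscope_feedback((0.40, 0.60), (0.42, 0.61), bpm))  # correct spot
    print(stethoscope_feedback((0.10, 0.10), (0.42, 0.61), bpm))  # wrong spot
```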

 

360-Degree Photo / Video



Our next application of these technologies is 360-degree video, which is accessible through non-immersive technologies like an Internet browser, where a student can click and drag the image to look around.  These can also be deployed in fully immersive environments such as a head-mounted display, or even simpler technologies like Google Cardboard, which can be assembled cheaply at home.  In this implementation, you can take students, employees, or trainees literally anywhere you can place a 360-degree camera.
 
  • Need to take a student around the world to immerse them in a place or a culture?  Done.
  • Need to enable remote employees to attend an all-employee meeting and sit with their peers?  Done.
  • Want to place an IP-enabled 360-degree camera on the sidelines of a professional sports venue and then sell the IP address access for thousands of dollars to anyone in the world?  Done.





 
SOURCE: Google Earth 360 View captured by Mike Jones © 2022 Indiana Wesleyan University

 

Hopefully you can see the power of these technologies and have a better understanding of what they are.  Now let’s jump into the more complex ideas of virtual reality and the metaverse.   
 


Virtual Reality



Virtual Reality (VR) has been represented in LOTS of movies over the last couple of decades.  Shown here are just a few you may have seen, and there are many more; in fact, I counted some three dozen movies in which virtual reality featured heavily in the narrative.
 




SOURCE: Illustration by David Swisher, with marketing images provided by the respective movie distributors.  Copyright © 2022 Indiana Wesleyan University.

 
So what is virtual reality, exactly?  According to Jeremy Bailenson, it’s “the computer-generated simulation of a three-dimensional image or environment that can be interacted with in a seemingly real or physical way by a person using special electronic equipment, such as a helmet with a screen inside combined with hand tracking or gloves and/or body suits fitted with sensors.”
 
For many of us involved in educational technology, our first exposure to an early form of virtual reality came through virtual worlds like Second Life.  These were screen-based representations of imaginative places built around exploration and social interaction, where anything’s possible, and over time they evolved to allow immersive VR options.
 
Second Life debuted in 2003, and by 2007, a number of universities were active in the space.  Early adopters included the Alliance Library System (Peoria, Illinois), which set up the first library system in Second Life, called InfoIsland, and Ball State University (Muncie, Indiana), “where English instructor Sarah Robbins used Second Life to teach students research and composition skills.”  In 2006, Harvard Law School offered a course called “CyberOne: Law in the Court of Public Opinion” in Second Life. [Jennings, N. & Collins, C. (2007).  “Virtual or Virtually U: Educational Institutions in Second Life,” International Journal of Social Sciences 2, 3.  World Academy of Science, Engineering, and Technology (WASET).  https://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.180.3529&rep=rep1&type=pdf]
 
Early Second Life spaces and educational simulations in virtual reality were typically shared experiences, but they could be contained or localized.  Most allowed those with the time, energy, and funding to create simulation worlds that could house walkthroughs and scenarios for rehearsal and familiarization.
 
The initial image below shows our colleague and adjunct ID Dr. Jase Teoh’s avatar, and the video below shows a health assessment classroom which Mike Jones developed in the space.  This was a closed environment where only the instructor and fellow classmates could interact with the learning spaces.


 


SOURCE: Screenshot from Second Life by Jase Teoh.  Copyright ©2007 by Jase Teoh. 

 
 

 

SOURCE: Screenshot recording of app & interactions created by Mike Jones, Copyright © 2022 Indiana Wesleyan University.


The next image is from David’s doctoral program, in a space called SpotOn3D, and it was used for conversation about creativity and innovation.  The program started out in Second Life, but after we ran into some unsavory characters in inappropriate avatars who kept crashing our classes, we moved to SpotOn3D, an open-world platform with closed environments.

 




SOURCE: Screenshot from virtual class at George Fox University held in SpotOn3D by David Swisher.  
 

What you see here is a place in that platform called “SpiritLife Cathedral,” and it was designed to be eclectic and theoretical, employing concepts that couldn't happen that way in real life (thus ideal for a virtual world): The waterfall is gorgeous and the water actually flows, the trees rustle in the wind, there’s realistic stained glass in the background, and a library of great books from our faculty mentor and related subjects sits in the rear. That’s our instructor (Len Sweet) on the carpet in the center, and up against the rock, that’s David (in virtual worlds, his avatar dresses like Indiana Jones with a fedora and has wings so he can fly).  Some of us were hovering, some swimming, and some seated in deep thought.  But what you can't really see in that screenshot is that Adele's James Bond theme song "Skyfall" was playing in the background and snow was gently falling inside. That was a perfect learning environment for a class on imagination, creativity, & metaphor!
 
One of the most helpful resources we have found to really understand VR is Jeremy Bailenson’s book Experience On Demand: What Virtual Reality Is, How It Works, and What It Can Do (Norton, 2018).
 




 
Bailenson was an early adopter and scholarly thought leader on virtual experience and the science and evidence behind its learning power.  He runs the Virtual Human Interaction Lab at Stanford University, where a significant amount of VR testing and research is being generated, and he's ALSO the co-founder of Strivr (a corporate training VR solutions pioneer).
 
In the book, he explains:  
When VR is done right, all the cumbersome equipment - the goggles, the controller, the cables - vanishes.  The user becomes engulfed in a virtual environment that simultaneously engages multiple senses, in ways similar to how we are accustomed to experience things in our daily "real" lives. This is distinctly different from other media experiences, which only capture fragmented aspects of what our senses can detect.  For instance, the sounds you hear in good VR don't come from a speaker rooted in one place, but instead, they are spatialized, and they get louder or softer depending on the direction you are facing (or if you are in a tracked environment, how close you move to the source of the sounds).  When you look at something in VR, it is not framed by the dimensions of a monitor, or television set, or movie screen.  Instead, you see the virtual world as you see the real one.  When you look to the left or right, the virtual world is still there.
[Bailenson, Jeremy (2018).  Experience On Demand: What Virtual Reality Is, How It Works, and What It Can Do.  Norton, p. 45.]

As many of you no doubt know from researching quality online learning, this is what we call presence, and enhancing that sense of presence is one of the most important aspects of delivering a quality and effective learning experience online.  Most mediated forms inevitably leave a perceptual distance barrier that has to be intentionally bridged; not so with VR.
 
As Bailenson explains further:
 
Study after study has shown the experiences that people have in VR have an impact on them.  Their behaviors can change, and these changes don't disappear right away.  This leads to a conclusion that captures the considerable promise and peril of the medium.  VR feels real, and its effects on us resemble the effects of real experiences.  Consequently, a VR experience is often better understood not as a media experience, but as an actual experience, with the attendant results for our behavior. (p. 46)
 


Why Learning Flourishes in Virtual Reality




At the SIDLIT conference in 2022, Dr. Jase Teoh and Dr. David J. Swisher co-presented a paper which explored theory from Mikhail Bakhtin and Lev Vygotsky and how it informed our approach to virtual learning. [Swisher, D. & Teoh, J. (2022).  Bakhtin, Vygotsky, & the Metaverse: Why Learning Flourishes in Virtual Reality.  SIDLIT Conference, July 27, 2022. Swisher & Teoh will be re-presenting that session as a Professional Development workshop through Colleague2Colleague on January 25, 2023 at 4:00 pm (Central).  For more information, visit: https://colleague2colleague.org/c2c-pd-events/]

 




SOURCE: Presentation title slide by David J. Swisher.  Copyright © 2022 Jase Teoh & David J. Swisher.
 
 

The list below outlines the five key areas that explain why Virtual Reality can be such a powerful tool for facilitating learning:
 


Metaverse Learning Contexts are…



Carnivalesque


They provide spaces where label-free, boundary-less dialogue can occur…a place where critique, commentary, and questioning can happen without fear of power dynamics…and an environment where learners often feel freer to speak up or ask questions because they’re virtually represented rather than “in person.”  Shy introverts and fearful back-seaters often feel empowered to engage more fully.
 

Dialogic


With the use of simulated environments and avatars, metaverse spaces are an ideal place to encourage the free flow of ideas and perspectives, fostering dialogue without silencing or excluding alternate voices.  They allow students to develop new meanings and modify existing ones through realistic interactions that are “safe” because they’re virtual.
 

Gamified


Metaverse environments are ideal playgrounds for developing gamified learning by leveraging the core mechanics of game design as motivational factors to help learners desire and achieve success.  Often they don’t even realize they’re learning because the immersive environment and the gameplay itself are fun and rewarding.
 

Scaffolded for ZPD (zone of proximal development)


VR platforms provide an optimal setting for social interaction with More Knowledgeable Others, addressing the gap between what learners can master on their own and what they can learn with guidance from others. It is literally the “construct” where they construct their learning!
 

Present


And most of all, they provide an immersive sense of presence.  They bridge the perceptual distance barrier created by the screen and immerse the learner in “thereness” (what it means to be & feel truly “there” and present in the experience).
 
Thus, Virtual Reality enables a carnivalesque context with inherent dialogic interactions and gamified motivation factors using scaffolded Zones of Proximal Development…a modality and context where learners experience “presence” in an immersive way that was previously impossible to replicate through other forms of media.
 
 

Immersive & Empathetic Potential



VR has also proven to be a very powerful method of teaching empathy.  Two great examples are:
 



SOURCE: Screenshot of app’s marketing image 



Traveling While Black – “a cinematic VR experience that immerses the viewer in the long history of restriction of movement for black Americans.” The viewer shares an intimate series of moments with several of the patrons of the historic Ben's Chili Bowl in Washington DC as they reflect on their experiences of restricted movement and race relations in the U.S. You see and hear what it’s like to be treated differently simply because of the color of your skin.
 



SOURCE: Screenshot of app’s marketing image 


Clouds Over Sidra – a 360-degree web VR film about the Syrian refugee crisis narrated in first-person perspective by a refugee who takes you around the encampment and shares what life is like there.  It can be viewed on YouTube and via other forms of web-based media, but when viewed in VR through a Head-Mounted Display, it is immersive and emotionally moving.  You connect with Sidra and her plight better than with any other form of media.




 
The introduction to Jeremy Bailenson’s Experience on Demand describes Mark Zuckerberg’s visit to the author’s Virtual Human Interaction Lab at Stanford, where Bailenson introduces first-time VR users to “the plank,” which he says is “one of the most effective ways to evoke the powerful sensation of presence that good VR produces.”  [Bailenson, Experience On Demand, 2.]
 
When a team of us from IWU National & Global visited a VR escape room as part of our metaverse explorations, I (David) decided to try this one out for myself.  It’s mind-blowing!  You step into a virtual elevator, push a button, and then watch (through the crack in the doors) as the elevator takes you “up” 80 stories.  When you arrive, you’re presented with a six-foot wooden plank extending out from the elevator over the cityscape, and you can look “down” and see the ground 80 stories below.  That experience is what you will see in this video.








SOURCE: Richie’s Plank Experience Trailer © 2017 Toast

 
Now, you know for a fact that you’re standing on a hard, level floor in a room a safe distance away from other obstacles, and you’ve seen others try it (and probably laughed as they reacted, because it’s so obviously “fake” from the observer’s vantage point).  But when you strap on the VR goggles (the head-mounted display) so your eyes see only the virtual world, with headphones over your ears so you hear only the sounds of the in-game experience, all of your dominant senses are telling you otherwise.  You “feel” like you’re really there.

Often people gasp, clutch their heart, bend their knees to lower their center of gravity.  Some even crawl.  Pulses quicken, and fear often sets in.  And yet, it’s all in your mind.  There is no plank or elevator, and you’re standing on solid ground.  But tell that to your dominant senses!
 
In the words of another famous VR-themed movie, “There is no spoon...you’ll see that it is not the spoon that bends, it is only yourself.” [The Matrix, directed by The Wachowski Brothers, produced by Joel Silver (Warner Brothers Village Roadshow Pictures, 1999), DVD.]
 
In a systematic review published in the British Journal of Educational Technology, Di Natali and colleagues share, “the main advantage of IVR seems related to the possibility for users to have first-hand experiences that would not be possible in the real world (Freina & Ott, 2015), simultaneously offering unique opportunities for experiential and situated learning, as well as promoting students’ motivation and engagement.”  [Di Natali, et al. (2020), “Immersive virtual reality in K-12 and higher education: A 10-year systematic review of empirical research.”  British Journal of Educational Technology, 51(6), 2006–2033.  doi:10.1111/bjet.13030]
 
Some examples of actual uses of immersive VR that they studied include:

  • Visiting unreachable places such as the Roman Augusta Emerita archeological site
  • Embodying abstract concepts such as chemical bonding and mathematical functions
  • Building spatial representations of microscopic environments (such as blood cells) or macroscopic places (such as the solar system) by exploring them in first person and interacting with their elements
 
The solar system is often one of the most challenging science concepts to convey because, to get it all to fit on one model or slide, the sun is inevitably shrunk far out of scale and the distances between the gas giants and the outermost planets are compressed.  But to fully grasp the massive size of the sun, the variability of the planets, and the vastness of space, you have to experience it for yourself…which could take decades and millions of dollars in real life, or you can fly students through it in VR and show them first-hand.
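A quick back-of-the-envelope calculation shows the problem.  Using approximate published values (the Sun’s diameter is roughly 1.39 million km, Earth orbits about 150 million km out, and Neptune about 4.5 billion km out), a classroom model that draws the Sun as a 10 cm ball would need to place Neptune roughly 300 meters away to stay in scale:

```python
# Back-of-the-envelope scale check with rounded, approximate values.
# It shows why a to-scale solar system cannot fit on a single slide.

SUN_DIAMETER_KM = 1_392_000
EARTH_ORBIT_KM = 149_600_000        # about 1 astronomical unit
NEPTUNE_ORBIT_KM = 4_495_000_000

MODEL_SUN_DIAMETER_M = 0.10  # suppose the Sun is drawn as a 10 cm ball

# Meters of model space per kilometer of real space.
scale = MODEL_SUN_DIAMETER_M / SUN_DIAMETER_KM

print(f"Earth would sit about {EARTH_ORBIT_KM * scale:.1f} m from the 10 cm 'Sun'")
print(f"Neptune would sit about {NEPTUNE_ORBIT_KM * scale:.0f} m away")
```

Run it and Earth lands about 11 meters from the model Sun, with Neptune over 300 meters away, which is exactly the kind of scale an immersive fly-through can convey and a slide cannot.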
 
All of these are actual contexts where VR has been used and the results reported in research studies.
 
 

Cognitive, Psychomotor, & Affective Potential



Immersive VR can also be used to train specific cognitive, psychomotor, and affective skills that are relevant in educational contexts (Jensen & Konradsen, 2018).
 
In another research article in the academic journal Virtual Reality, Lemmens, Simon, and Sumter (2022) explain: “In the past decade, virtual reality (VR) has been successfully applied to treat various psychological problems, including social anxiety (Emmelkamp et al. 2020), fear of flying (Cardo et al. 2017), post-traumatic stress disorder (Botella, et al. 2015), and fear of heights (Diemer et al. 2016).”  [Lemmens, J.S., Simon, M., & Sumter, S.R. (2022).  “Fear and Loathing in VR: The Emotional and Physiological Effects of Immersive Games.” Virtual Reality, 26, 223–234. https://doi.org/10.1007/s10055-021-00555-w]
 
Bailenson describes how one therapist helped first responders and bystanders who were present at the 9/11 attacks overcome irrational responses to things like stairways and glassy buildings by re-creating the scenarios and walking patients through what they observe and feel in a controlled environment (one that can’t hurt them) so they can process their fears and anxieties in a constructive way.   [Bailenson, Experience On Demand, Ch. 5, “Time Machines for Trauma,” 136-149.]
 
Lemmens, Simon, & Sumter explain, “The effectiveness of VR as a therapeutic tool, particularly in the context of exposure therapy, is mainly attributed to the immersive quality of this technology. Highly immersive technological systems such as VR lead to a strong sense of situational presence, which can trigger emotional reactions identical to those experienced in similar real-life situations (Alghamdi et al. 2017). Repeated exposure to virtual experiences and subsequent emotional responses reduce negative affective symptoms in real life (Cardo et al. 2017).”  [Lemmens, Simon, & Sumter (2022).  “Fear and Loathing in VR.”]
 
An example of this is depicted in the experiment shown in the video below, where a patient’s fear of spiders or insects is being treated by introducing the fear-inducing scenario in a controlled environment so the therapist can talk the patient through their responses and reduce that anxiety.
 






SOURCE: AR Phobia Treatment Application © 2020 SmartTek Solutions


Defining the Metaverse



The definition of the metaverse is somewhat easier to digest because there are many parallels between the metaverse and the computer-generated simulations we have already discussed.
 
In this example of a metaverse platform, I created a virtual reality experience in AltspaceVR, a metaverse platform provider.  Here we can hold virtual meetings and presentations, meet with prospective students, or just hang out and shoot some virtual reality hoops (when we access it using a headset that’s compatible with AltspaceVR).







SOURCE: Recording of app & interactions created by Mike Jones, Copyright © 2022 Indiana Wesleyan University.


 
This can be a full-on 360-degree, interactable virtual reality experience, or it can be accessed through the AltspaceVR app and browsed and explored as a 2D experience.  In the 2D version of this metaverse example, the viewer can move around using their mouse or keyboard, but they would have a hard time with interactable elements such as picking up a basketball and shooting some hoops, or examining a widget and interacting with its controls.  In the 2D instance, you would have more of a “stand-to-communicate“ experience instead of an interactable one.
 
So our first differentiators between virtual reality and the metaverse are access and immersion.  The metaverse has a higher level of accessibility, whereas virtual reality offers better immersion.
 
The Metaverse is also a much more complex ecosystem since it has had more time to mature than Virtual Reality (although virtual reality is quickly growing in its ecosystem as well).  
 
It is important that we differentiate the term metaverse from the company Meta.  Meta (previously Facebook) offers its own hardware (the Oculus Quest) and software (Horizon Worlds and Horizon Workrooms), as well as access through its hardware to many other vendors and metaverse providers.  But there are plenty of metaverse platforms that are not owned, accessed through, or controlled by the company Meta.
 
As you can see in the image (and linked report) below, the metaverse comprises a massive collaboration of developers, creators, users, security experts, economies, and platforms.  This also creates an entirely new opportunity for business, from advertising to product and design pre-visualization and online sales...plus just about anything else your mind can imagine.
 




SOURCE: NewZoo Consulting, “Metaverse Ecosystem Diagram 2022.”
 

As you can see in this graphic, there are many collaborators in this ecosystem called the metaverse:
  • Gateways/Platforms
  • Avatars and Identity
  • User Interfaces & Immersion
  • Economy (pay-to-play/collect)
  • Social Interaction
  • Infrastructure/Digital Twins
  • Artificial Intelligence
  • Visualization
  • Advertising
  • Connectivity


Technological Benefits


ACCESS


As mentioned already, the metaverse offers increased access.  In this way, the metaverse currently offers more equity to its users: it can be accessed through Internet browsers, metaverse computer applications, mobile devices, virtual reality headsets, and mixed reality headsets. Yet it can also offer less protection than moderated VR experiences.  Although some metaverse vendors have safety tools in place, bad actors are always looking to prey on those who are not prepared.  In a world where there are no actual sticks and stones, words can still be dangerous.  Preparing users to protect themselves will soon become part of every industry’s cybersecurity awareness programs...especially as some metaverse platforms can now bring your physical computer directly into metaverse or mixed reality environments.
 

ANONYMITY


As with most tools, what can initially look like a bad thing can open the door to good.  One aspect of the metaverse and virtual reality is that they allow users to inhabit an avatar that they design.  In this way, they can look however they want to look when operating in these spaces.  As educators of primarily adult students, we often get pushback from students about turning on their Zoom cameras for a synchronous class.  Now, with these technologies, a student who was previously hesitant to engage via video for fear of judgment over their appearance or their environment can join, free to be whatever avatar they’d like to be.  Another well-documented benefit is that students, trainees, and customers tend to speak more freely and quickly while also contributing in new ways.
 

REACH


Another aspect of the metaverse is reach.  Whether we are in the business of education and training or in corporate environments and communication, our reach into our internal organizations and out to our customer base is very important.  For example, in our educational environment at IWU, we have an acronym we use in thinking about the work we do: “LEAP - Love & Educate Anyone Possible.”  The idea is to be where our customers need us to be, when they need us to be there.  You can replicate this idea within your own organization’s training and interactions, as well as in extending your reach to your customer base.
 
If it can be done in the real world, you can probably do it in the Metaverse!
 
Here is just a short list of ways the Metaverse can become a part of your organization’s landscape:
  • Language Learning
  • Interactive Project Collaboration
  • Field Trips
  • Scenario/Scene Examinations
  • Recruiting
  • College Fairs
  • Social & Networking
  • Conferences & Presentations
  • Graduations
 
In higher ed, we call it the Metaversity, and we consider it a fourth modality for delivering accredited education for our students.
 
Now, how can we mix all these up for educational purposes instead of just entertainment?  We do that by applying learning science to these modalities of delivery. One way this can be done is using something called gamification.
 


Gamification



Gamification is about taking something that isn’t a game and applying game mechanics to increase user engagement, happiness, and loyalty, with the purpose of changing behavior.

It’s important to note that by including game-like activities in your course (crossword puzzles, flip cards, etc.), you are likely to increase motivation for your students, but a gamified course goes beyond that.
 
So let us identify some common gamification elements needed to fully gamify a learning experience; a quick search will reveal that these same elements show up again and again in successful gamified designs.  A brief code sketch following the descriptions below shows how several of these elements can work together in practice.
 
Some students from Brigham Young University provide a really helpful summary of the elements in this way:  [Erickson, A., Lundell, J., Michela, E., & Pfleger, I. (n.d.). The students' guide to learning design and research: Gamification. EdTech Books. Retrieved on August 22, 2022 from https://edtechbooks.org/studentguide/gamification?book_nav=true&nav_pos=748]
 

FREEDOM TO FAIL


Freedom to fail means giving students the chance to experiment and fail without pressure or fear of irreversible damage. Video games incorporate this element by offering players multiple lives and opportunities to start from a check-point, rather than at the very beginning each time. The freedom to fail is important in maintaining student motivation, because it encourages experimentation in problem-solving and fosters persistence through difficult tasks. Related to this idea of freedom to fail is the freedom to choose, or the opportunity to decide one’s own path to reach the goal.
 

RAPID FEEDBACK


Rapid Feedback allows students to evaluate their own learning, see the results of their efforts, and make decisions about strategies and next steps. Immediate feedback, especially when paired with repeated chances to implement that feedback, can be an effective learning tool (Simões, et al., 2013). In games, immediate feedback can be seen in earning points, advancing levels, unlocking achievements, earning badges, and moving up on a leaderboard.
 

PROGRESSION


Progression gives the player the impression of advancement by (a) increasing the difficulty of obstacles (e.g., more capable opponents, limited resources, more complex missions) and (b) enhancing the player's ability (e.g., extra resources, new powers, leveling up, experience, increased skill).
 

STORYTELLING


In an educational setting, a story functions as a way to put learning into a meaningful context, thus increasing engagement and motivation. The most important principles of storytelling are character, setting, and plot held together by the conflict of the story. Keep in mind, good use of story may be as simple as providing a meaningful problem to solve with the learned material.
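Taken together, these mechanics are less about flashy graphics than about a few simple rules wired into the course logic.  The following is a minimal, hypothetical sketch (in Python, purely illustrative; it is not drawn from any actual LMS or IWU course) of how freedom to fail, rapid feedback, and progression can interact: failed attempts cost nothing, passing attempts earn points immediately, and accumulated points unlock levels and badges.

```python
# Illustrative sketch only: toy gamification logic combining freedom to fail,
# rapid feedback, and progression. Names and point values are invented.

from dataclasses import dataclass, field

@dataclass
class Learner:
    name: str
    points: int = 0
    level: int = 1
    badges: list = field(default_factory=list)

POINTS_PER_LEVEL = 100

def attempt_activity(learner: Learner, score: float, passing: float = 0.7) -> str:
    """Score one low-stakes attempt and return immediate feedback."""
    if score < passing:
        # Freedom to fail: no penalty, just an invitation to try again.
        return "Not yet -- review the hint and dig again (no points lost)."
    learner.points += int(score * 50)            # rapid feedback: instant points
    if learner.points >= learner.level * POINTS_PER_LEVEL:
        learner.level += 1                       # progression: level up
        learner.badges.append(f"Dig Site {learner.level - 1} Cleared")
    return (f"Nice find! {learner.points} pts, level {learner.level}, "
            f"badges: {learner.badges}")

if __name__ == "__main__":
    student = Learner("apprentice archaeologist")
    for s in (0.5, 0.8, 0.95, 0.9, 1.0, 0.9):    # repeated practice attempts
        print(attempt_activity(student, s))
```

Storytelling is then the wrapper that gives those points, levels, and badges their meaning, which is exactly what the course example below does.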
 
 
For a more in-depth view of gamification elements, refer to the gamification taxonomy Wilk Oliveira and colleagues presented at the 2019 International Conference on Advanced Learning Technologies (ICALT) [Toda, A., Oliveira, W., Klock, A., Toledo Palomino, P., Pimenta, M., Gasparini, I., Shi, L., Bittencourt, I., Isotani, S., & Cristea, A. (2019). A Taxonomy of Game Elements for Gamification in Educational Contexts: Proposal and Evaluation. 10.1109/ICALT.2019.00028], or to the Digital2Learn podcast featuring Oliveira and this model.
 
 

Gamification in Action in IWU Education



Annie and David recently worked with Dr. Frank Poncé and a few other team members to gamify a Fine Arts General Education course. The content covers Ancient and Early Western Art. We thought that the Indiana Jones theme would be really fun to parody. So we ran with the idea! Raiders of the Lost Art: Appreciating Ancient and Early Western Art!


 


SOURCE: Screenshots from FINA 179 course at IWU.  Copyright © 2022 Indiana Wesleyan University.


Students are assigned as apprentice archaeologists who are required to learn various elements of art appreciation.




SOURCE: Screenshots from FINA 179 course at IWU.  Copyright © 2022 Indiana Wesleyan University.
 


Then we send them to dig sites for continued practice applying those elements of art appreciation.





SOURCE: Screenshots from FINA 179 course at IWU.  Copyright © 2022 Indiana Wesleyan University.
 

Students are provided choice as they move through the content, curate their own ancient artwork, and "pitch" their artwork to the Gallery Collector.




 

SOURCE: Screenshots from FINA 179 course at IWU.  Copyright © 2022 Indiana Wesleyan University.
 

They are given the freedom to fail through low-stakes, repetitive practice activities that build their confidence with the elements of art appreciation. Rapid feedback is provided throughout those low-stakes practice experiences.





SOURCE: Screenshots from FINA 179 course at IWU.  Copyright © 2022 Indiana Wesleyan University.
 

Students progress through the dig sites. 




SOURCE: Screenshots from FINA 179 course at IWU.  Copyright © 2022 Indiana Wesleyan University.
 

They earn awards and badges as they go.





SOURCE: Screenshots from FINA 179 course at IWU.  Copyright © 2022 Indiana Wesleyan University.
 

And a major gamification element used in this course is storytelling!







SOURCE: IWU Media. Copyright © 2022 Indiana Wesleyan University.



Students begin as apprentice archaeologists; as they build their skills, they curate their own ancient artworks, then classify, analyze, and evaluate the value of the artwork in a presentation to a Gallery Collector. If their work is good enough, they earn a big payday when their gallery of curated artwork is accepted into a new wing at a Gallery.  [For their work on this media-rich & well-gamified FINA 179 course, the Learning Experience Design Team at Indiana Wesleyan University received the 2022 “Outstanding Online Course Award” (and a positive accolade from keynote Bryan Alexander) from Colleague2Colleague at the Summer Institute for Distance Learning & Instructional Technology (SIDLIT) in August 2022: https://www.youtube.com/watch?v=0KoWQ0hpMTM]


Moving Beyond: The Vast Potential



Now, imagine we took that gamified foundation and made it immersive with VR, or even metaverse learning contexts.  




SOURCE: Adobe Stock #414284811, used under license.
 
 
We all know the value of field trips for students constructing and applying their knowledge, but those can be costly in terms of time and travel expenses...especially if we're studying ancient Greek, Roman, or Egyptian art.
 
But there's an awesome immersive VR experience where students can enter Queen Nefertari's tomb (which is largely closed to the public) and walk around to virtually touch, feel, and explore.  Or we could go to an actual dig site (like we saw with Sidra) and meet an archaeologist and see their context...maybe even practice unearthing a find, brushing it off, or categorizing it.
 
And with VR (especially when it's gamified), we have permission to fail.  In VR, if you drop the artifact on your first try and it breaks, no big deal...just try again.  Not so in real life!
 
But even with that, perhaps we're "digging in the wrong place"?  What if we could take it a step further...into the metaverse?
 
Imagine an assignment that lives on. Students could place their artifacts and their research into a Metaverse museum where they could showcase their own work and explore each other’s finds.  It could even be a collection that grows over time (since that's easy to do in metaverse spaces), allowing students to present their artifacts to not only their classmates, but to the entire world in vivid, interactable 3D!
 
Teachers could even bring their classes to this virtual museum to show examples of student work and to teach lessons, allowing students to pick the objects up, hold them, enlarge them, turn them around to explore every detail, all without fear of damaging or breaking the artifacts since, after all, they're only virtual.
 
 

Looking to the Future



So why now?  Why are we focusing on this here at IWU National & Global?  And why do we believe it’s essential that we be actively talking about this as educators, educational technologists, and distance & online learning enthusiasts?   
 
Well to answer that, look closely at this image.  What do you see?      





SOURCE: Adobe Stock #6433940, used under license.



This is more than just an ocean scene.  It’s a wave that is starting to crest.  It doesn’t look like much in those early stages; in fact, before it starts to break with those whitecaps (like you see here), it’s barely even perceptible…it’s just a swell, just a rise in the ocean surface. But that swell is laden with kinetic energy, and beneath the surface, a cyclic action is happening that’s propelling energy and water molecules to swirl.  And as that swell grows, it becomes a larger wave until finally it starts to break the surface.  That’s the moment we’re in right now.
 
As we’ve shared, virtual reality and its implications for learning have been around for a couple of decades.  Those are the early swells.  But now it has moved mainstream.  Corporate workforce training companies (in fact, 3 of the top 5…even Wal-Mart) have invested heavily in VR-based training [Petrov, Christo (2022).  “45 Virtual Reality Statistics that Rock the Market in 2022.”  Retrieved from: https://techjury.net/blog/virtual-reality-statistics/ on Jan. 6, 2023], and Jeremy Bailenson’s company (Strivr) has made virtual reality simulations for major NFL teams…and has documented impressive learning outcomes (translation: undeniable improvements in their players’ and teams’ scoring records), to such an extent that many more teams are now signing on and wanting to use the technology, too. [Bailenson, Experience On Demand, Ch. 1, “Practice Made Perfect,” 14-43]
 
Then, in late 2021 we all saw Facebook re-brand itself as “Meta.”  That was a very intentional and strategic move…because they see the power and potential of VR and believe it’s the future of all things social.


 


SOURCE: Adobe Stock #268044046, used under license.
 


One of the essentials to good surfing is being there, watching, waiting, observing so that you spot where the swell is emerging and position yourself to be ready when it starts to crest so that you can ride it. Like this guy.
 
Most of us missed the first opportunity.  But that’s understandable.  When the swell was just beginning, it was easy to miss.  But now we know what’s happening.  We see the swells, and they’re ideal.  They're perfect for surfing.  We also see where and how those waves are starting to break.  And that’s where we want to be.
 
We’re seeing a paradigm shift in learning technology that’s converging with business strategy, corporate training, social media growth, and infrastructure investment: Everything’s coalescing around the rise of the metaverse.  And it’s a paradigm shift on par with the emergence of online learning two decades ago, a time when IWU’s leadership wisely saw the opportunity and seized the moment.
 




SOURCE: Adobe Stock #166097997, used under license.


 
If we miss this moment, we’ll be paddling fast and furious after a wave that has long since passed us by.
 
But if we get this right, here’s what it looks like…it’s a beautiful thing.   
 







SOURCE: Adobe Stock #115718796, used under license.

 
We get to ride the momentum of energy that’s been converging in all these areas.  And we get to ride that barrel wave of opportunity while others look on and wish they had seized the moment.
The early adopters who are already using VR and exploring metaverse spaces become eager enthusiasts who celebrate our arrival and advocate for us.   
 
Of course, not every university is prepared for this like we are.  Those that are hesitant, fearful, or unwilling to embrace the rising swell and seize the moment are going to have a much different experience.

 





 
SOURCE: The Perfect Storm.  Used under 17 U.S. Code § 107



The same opportunity for growth that we are able to ride because we’ve strategically positioned ourselves to embrace the opportunity will become an overwhelming force that latecomers and fearful universities won’t be able to stand against.   
 
Instead of riding the cresting wave to a bright future, those who are afraid or want to “wait & see” are going to face something more like what’s pictured above.
 
So we challenge you to consider and explore the opportunities before us so that we can once again be on the cresting side of a generation-shaping, epically transforming wave.
 
Indiana Wesleyan University is actively partnering with VictoryXR (and other providers) to develop metaverse learning spaces for student experiences.  Chief among them is a partnership grant which enables us to set up a digital twin campus, obtain 40 Oculus Quest 2 headsets, work with over 400 virtual classroom spaces, and offer at least 4 learning experiences in the metaverse within the next academic year.  [Indiana Wesleyan University is in the 2nd round of the VictoryXR/Meta pilot project featured in this video on the metaverse pioneers of higher education: https://www.youtube.com/watch?v=ucdLhN3yn10]
 




SOURCE: Victory XR, “Virtual Reality Metaversity: What You Need to Know about a Metacampus.”  



We look forward to dialogue with others who are actively exploring the potential of virtual reality, metaverse, and gamification to improve student learning.



 

References


Bailenson, J. (2018).  Experience On Demand: What Virtual Reality Is, How It Works, and What It Can Do.  Norton.
 
Di Natali, et al. (2020).  “Immersive virtual reality in K-12 and higher education: A 10-year systematic review of empirical research.”  British Journal of Educational Technology, 51(6), 2006–2033.  doi:10.1111/bjet.13030
 
Erickson, A., Lundell, J., Michela, E., & Pfleger, I. (2018). Gamification. In R. Kimmons, The Students' Guide to Learning Design and Research. EdTech Books. Retrieved from https://edtechbooks.org/studentguide/gamification.

Jennings, N. & Collins, C. (2007).  “Virtual or Virtually U: Educational Institutions in Second Life,” International Journal of Social Sciences 2, 3.  World Academy of Science, Engineering, and Technology (WASET).  https://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.180.3529&rep=rep1&type=pdf   
 
Lemmens, J.S., Simon, M., & Sumter, S.R. (2022).  “Fear and Loathing in VR: The Emotional and Physiological Effects of Immersive Games.”  Virtual Reality, 26, 223–234.  https://doi.org/10.1007/s10055-021-00555-w
 
The Matrix, directed by The Wachowski Brothers, produced by Joel Silver (Warner Brothers Village Roadshow Pictures, 1999), DVD.
 
Petrov, Christo (2022).  “45 Virtual Reality Statistics that Rock the Market in 2022.”  Retrieved from: https://techjury.net/blog/virtual-reality-statistics/ on Jan. 6, 2023.
 
Rober, M. (2018).  “The Super Mario Effect - Tricking Your Brain into Learning More.”  Presentation at TEDxPenn.  Available online at: https://youtu.be/9vJRopau0g0.
 
Swisher, D. & Teoh, J. (2022).  Bakhtin, Vygotsky, & the Metaverse: Why Learning Flourishes in Virtual Reality.  SIDLIT Conference, July 27, 2022.
 
Toda, A., Oliveira, W., Klock, A., Toledo Palomino, P., Pimenta, M., Gasparini, I., Shi, L., Bittencourt, I., Isotani, S., & Cristea, A. (2019).  A Taxonomy of Game Elements for Gamification in Educational Contexts: Proposal and Evaluation.  International Conference on Advanced Learning Technologies (ICALT).  doi:10.1109/ICALT.2019.00028
 
Victory XR (2022), “Virtual Reality Metaversity: What You Need to Know about a Metacampus.”  Retrieved from: https://www.youtube.com/watch?v=H3eZiOT5u28&feature=youtu.be
 
VictoryXR (2022).  "VictoryXR Announces Launch of 'Metaversities' in U.S. in Partnership with Meta". Retrieved from: https://youtu.be/ucdLhN3yn10
 

 
 

About the Co-Authors


 



Annie Els, M.Ed., is a Learning Experience Designer at Indiana Wesleyan University. She serves as Lead ID for the innovative Doctorate of Business Administration program. Annie loves to explore new tools of the trade with the intent of creating excellent learning experiences for students as well as teaching other instructional designers about her findings. Because of this, she has jumped on board the strategic team that will be spending this coming year learning how to build virtual reality worlds. With a background as an elementary school teacher, her work in gamification in Higher Education comes naturally. She was the project lead on the team that won SIDLIT’s 2022 Outstanding Online Course award. Additionally, Annie is fascinated by the neuroscience of learning. She is married to Graeme, and together they have two sons. In her spare time, you can find her weaving fiber art on her loom.

Her email is Annie.Els@indwes.edu. 
 




Mike Jones, MFA, serves in IWU’s Innovations and Partnerships office as their Mixed Media/Gamification Producer and XR Developer on the National and Global campus. Mike weaves the magic of the movies and storytelling in collaboration with faculty and staff to bring life to our online courses through course videos, faculty introductions, flipped classrooms, and the gamification of course content as well as interactive videos and extended reality environments. In addition, he engineers and edits the weekly Digital2Learn Podcast and the bi-weekly D2L Behind the Scene of Scholarship podcast. He also hosts N&G webcasts on video related topics, and trains interested faculty and staff in the use of learning technology as well as acting for the screen. His latest assignment is to spearhead IWU’s first deployment of XR technologies for the classroom. Outside of his work with IWU, Mike volunteers as a Lead Film Instructor and missionary with www.insideoutglobal.org and produces videos for organizations such as Habitat for Missions, The United Way, and Grant County Veteran’s Court. His production company, Parable Pictures, LLC, has also worked in collaboration with IWU to video The Next Great Awakening Tour with Jim Garland and David Barlow and create the documentary, As You Were. Other video products he has created include a joint venture called, Fishing Guide in a Box and an award-winning documentary filmed on location in New Zealand.

His email is Mike.Jones@indwes.edu.


 


Dr. David J. Swisher is a Senior Learning Experience Designer at Indiana Wesleyan University who also coordinates the 3rd party technology integrations with our Brightspace LMS.  He serves as Lead ID for the technology and ministry leadership programs and Experiential Learning courses for Innovation & Partnerships and is part of the strategic team that has been exploring and pursuing metaverse learning options.  Previously he was the Director of Learning Management Technologies at Tabor College and Classroom Technology Coordinator at Kansas State University’s Polytechnic campus.  Outside of IWU he is active in the Metaverse Learning Community sponsored by Leadership Network and TheChurch.Digital.  He has been active in Colleague2Colleague and SIDLIT for 17 years, has served on the Steering Committee for many years, and is a former Chair as well as “Outstanding Technical Support Staff” award recipient. 

He may be reached at david.swisher@indwes.edu.

A team image follows.

