Playful Assessment

The MIT Education Arcade, in collaboration with the Teaching Systems Lab, is creating a game-based assessment system that gives students and teachers access to ongoing, authentic assessments that measure multiple learner outcomes in a deep and robust way. We are working to re-envision the role of formative assessment in math classrooms, using digital games as a vehicle to assess learners’ growth in fundamental knowledge as well as in cognitive and non-cognitive skills such as persistence and creativity. We will do this with an ongoing game-based assessment model that doesn’t rely on single observations or a single type of game mechanic, but instead gathers data continuously over time and across contexts and standards. We will research the psychometric qualities of the assessment results so that the system could ultimately replace tests, and the implementation model of short, frequent interactions will make it feasible to integrate into classrooms. Our goal with this project is not only to create one game-based assessment system, but to develop a process that makes this type of tool more feasible to design and build, in order to reshape classroom assessment practices.
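To make the contrast with one-off testing concrete, here is a minimal sketch in Python of how continuously collected gameplay evidence could be folded into running competency estimates. The event fields, the competencies named in the comments, and the exponentially weighted average are hypothetical illustrations, not the project’s actual assessment model.

```python
from dataclasses import dataclass
from datetime import datetime
from collections import defaultdict
from typing import Iterable

# Hypothetical evidence record: one scored observation emitted during play.
@dataclass
class EvidenceEvent:
    student_id: str
    competency: str          # e.g. "2D-3D reasoning" or "persistence" (placeholders)
    score: float             # 0.0-1.0, how well the interaction went
    timestamp: datetime

def running_estimates(events: Iterable[EvidenceEvent],
                      alpha: float = 0.2) -> dict[str, dict[str, float]]:
    """Fold a stream of events into per-student, per-competency estimates.

    Uses an exponentially weighted average so that recent play sessions
    count more than old ones -- one simple way to track growth over time
    rather than relying on a single observation.
    """
    estimates: dict[str, dict[str, float]] = defaultdict(dict)
    for ev in sorted(events, key=lambda e: e.timestamp):
        prior = estimates[ev.student_id].get(ev.competency, ev.score)
        estimates[ev.student_id][ev.competency] = (1 - alpha) * prior + alpha * ev.score
    return estimates
```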

So far we have completed the background research phase and are moving into the design phase. We have selected the topic area of geometric measurement and dimension, including the relationship between 2D and 3D shapes, and we have brainstormed possible directions for the game’s core mechanics and premise. Our goal is to create an environment in which students demonstrate their conceptual understanding of geometry by imagining and building shapes to solve modeling problems. It will be open-ended enough that players can solve problems in multiple ways, yet constrained enough to yield interactions and choices that can feed assessment models. These core goals will drive our next steps of exploring and prototyping game ideas. Beyond the game design itself, we are preparing to expand our team with a data analyst who has machine learning skills, we have been researching existing frameworks that could be used for data collection and analysis, and we have begun designing a modular system architecture that will allow all the elements of the assessment system to be developed in parallel and work together smoothly.
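The modular architecture is still being designed, so the following Python sketch only illustrates the idea: if telemetry collection, the assessment engine, and reporting each sit behind a small interface, the pieces can be built in parallel and swapped out later. All module names and method signatures here are placeholders, not the project’s actual design.

```python
from abc import ABC, abstractmethod

# Hypothetical module boundaries for a game-based assessment system.

class TelemetryCollector(ABC):
    """Receives raw interaction events from the game client."""
    @abstractmethod
    def record(self, event: dict) -> None: ...

class AssessmentEngine(ABC):
    """Turns accumulated telemetry into competency estimates."""
    @abstractmethod
    def estimate(self, student_id: str) -> dict[str, float]: ...

class ReportingService(ABC):
    """Exposes results to teachers and, eventually, back to the game."""
    @abstractmethod
    def summary(self, student_id: str) -> dict: ...

# Each part of the team works against these contracts; swapping in a real
# machine learning model later would only change the AssessmentEngine
# implementation, not the game client or the reporting layer.
```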

Looking ahead, over the next few months we will identify additional competencies and iteratively align them with the game mechanics. We plan to have a digital prototype by the summer, with which we can collect data to iteratively refine the assessment and game mechanics and begin to develop the assessment machinery. The API, database, and server architecture will also be put in place to support these efforts. All of this will set us up to polish the game and run psychometric studies in the following year. Additional work, such as designing ways for the assessment results to feed back into the game experience and creating dashboards and visualizations for teachers, is beyond the scope of the current project plans but would give the game-based assessment tool a much greater impact down the road.
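The project does not specify a server stack, so purely as a hypothetical illustration, here is what a minimal event-ingestion API could look like in Python using FastAPI, with an in-memory list standing in for the database. The endpoint paths, payload fields, and storage are all assumptions made for the sketch.

```python
from datetime import datetime
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

# Hypothetical payload for a single gameplay interaction sent by the client.
class GameEvent(BaseModel):
    student_id: str
    session_id: str
    action: str            # e.g. "rotated_solid" (placeholder action name)
    detail: dict = {}
    timestamp: datetime

# A real deployment would write to the project's database; an in-memory
# list keeps the sketch self-contained.
EVENTS: list[GameEvent] = []

@app.post("/events")
def ingest_event(event: GameEvent):
    """Accept one telemetry event from the game client."""
    EVENTS.append(event)
    return {"stored": len(EVENTS)}

@app.get("/students/{student_id}/events")
def list_events(student_id: str):
    """Return the raw events for one student, e.g. for the assessment engine."""
    return [e for e in EVENTS if e.student_id == student_id]
```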

Contact us about this project!

  • Yoon Jeon Kim, Research Scientist

    Yoon Jeon “YJ” Kim is a research scientist at the Teaching Systems Lab. Her research centers on the design and development of learning and assessment in technology-rich environments, particularly video games and simulations. She has also been working closely with teachers to co-design curricula that incorporate emerging technologies in STEM domains, supporting “21st Century Skills” such as systems thinking and science inquiry.
