Researchers Adapt Minecraft to Teach Spatial Reasoning

School of Education and Social Policy faculty members Marcelo Worsley and David Uttal have received a $750,000 National Science Foundation grant to create a multimodal platform based on the popular Minecraft video game, an effort designed to help elementary and middle school students develop spatial reasoning and computational thinking skills.

The new interface will allow children to interact with Minecraft not only through keyboard commands but also through speech, physical blocks, and eye gaze. The multi-faceted approach will give them a chance to practice important cognitive skills in a more natural and productive setting, the researchers said.  

“The ability to use spatial reasoning and think like a computer scientist will be essential for future STEM (science, technology, engineering, math) workers,” said Worsley, assistant professor of learning sciences, electrical engineering and computer science. “Our research can potentially create learning pathways to STEM careers for underrepresented young learners.”

Minecraft is a type of virtual sandbox where users gather and place 3D objects – mainly cubes – to build various constructions. Educators have adapted the game to incorporate their own curricular content; some use it to help teach problem-solving, critical thinking, and other skills.

In the Northwestern study, about 400 students from Evanston/Skokie School District 65 will try the proposed multimodal learning system over three years. The researchers will test several theories about spatial reasoning and computational thinking through laboratory and afterschool learning club studies.

Once the software and design have been created, they will be freely available to the public to help increase the number of students who are exposed to the STEM fields.

“Our project investigates the future of multimodal interfaces,” said Uttal, professor of education and psychology at Northwestern and director of the Spatial Intelligence & Learning Center. “The ability to interpret everyday language and express an idea through speech or gestures while collaborating are key components of the next-generation multimodal, voice-enabled interface.”

This grant leverages a $235,000 award from the National Science Foundation on Multimodal Learning Analytics (MMLA). In particular, Worsley is developing tools and approaches for studying student learning across a variety of data streams, such as speech, gestures, eye gaze, skin conductance, and indoor location.

Worsley and Uttal will use multimodal analytics, in conjunction with more traditional approaches from psychology and the learning sciences, to examine the processes and practices associated with computational thinking and spatial reasoning in their innovative interface for Minecraft.

"MMLA makes it possible to explore learning environments -- whether classrooms or design studios -- that have been difficult to investigate before,” Worsley said. “New types of sensors and data mining techniques now mean we can move beyond just looking at pre-tests and post-test and instead look at the process that students follow.” 

By Julie Deardorff
Last Modified: 10/25/18