In a high school biology classroom, a student attached three sticky electrodes to his arm and then made a fist. As his hand closed, the simple robotic hand on his desk closed too. 

What caused the robotic hand to move? The electrodes picked up the electrical signals in the student’s muscles and sent them to a specialized circuit board connected to his laptop. On the laptop, a computer program—written by the student and his lab partners—translated the signals into digital outputs that controlled the robot.
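The signal chain can be pictured with a minimal, hypothetical sketch: muscle-activity samples come in, a simple rule decides whether the fist is clenched, and a command goes out to the hand. The function name, threshold, and units below are illustrative stand-ins, not the students' actual code.

```python
def hand_command(samples, threshold=0.5):
    """Return 'close' when average muscle activity exceeds a threshold.

    `samples` is a window of raw muscle-signal readings; the threshold
    value here is an arbitrary illustration, not a calibrated one.
    """
    # Rectified mean amplitude: a relaxed arm produces small readings,
    # while a clenched fist produces large swings in both directions.
    level = sum(abs(s) for s in samples) / len(samples)
    return "close" if level > threshold else "open"

print(hand_command([0.05, -0.02, 0.04, -0.03]))  # relaxed arm
print(hand_command([0.9, -1.1, 1.2, -0.8]))      # clenched fist
```

In the classroom setup, the output of a function like this would be written to the circuit board's digital pins to drive the robotic hand's motor.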

This successful initiative took place in 2023, when seven high school biology teachers piloted a classroom unit on neural engineering—an emerging field that integrates neuroscience, engineering design, and programming. Ido Davidesco, an assistant professor in the Lynch School of Education and Human Development, co-led the development of the experimental curriculum as part of a project funded by the National Science Foundation (NSF).


Ido Davidesco. Photo: Matthew Healey

Davidesco, a cognitive neuroscientist, runs the Lab-to-Classroom Research Group, which works at the intersection of neuroscience and education. The lab uses the tools of neuroscience, such as eye-tracking technologies and electroencephalography (EEG), to figure out what’s happening in students’ brains as they learn. Through projects like the neural engineering unit, they also aim to teach K–12 students about neuroscience itself.

“I feel that understanding how the brain learns can help students become better learners,” Davidesco says. “I also think neuroscience can be a good hook. Students are naturally interested in understanding how their own brains work, so even if they’re not that interested in science in general, their interest in neuroscience can be used as leverage to engage them in science more broadly.”

Improving the curriculum with the help of artificial intelligence

Davidesco recently received a three-year, $1.5 million NSF grant to build upon his neural engineering project. With this new grant, he and his collaborators (including Aaron Kyle of Duke University’s Biomedical Engineering Department, Bianca Montrosse-Moorhead of the University of Connecticut’s Neag School of Education, and Leslie Bondaryk of the Concord Consortium) will redesign their neural engineering tools and curriculum to add an artificial intelligence (AI) component.


Participating teachers in the Engineering x Science x AI workshop. Photo: Matthew Healey

Davidesco’s team designed the original curriculum to be used in a core biology course, rather than a computer science or engineering course, so that it could reach more students. (Eighty-nine percent of high schools offer biology courses, while 49 percent offer computer science courses.) But they found that students who had never tried computer programming before struggled with the coding portions of the unit. 

“That led us to think about how we could potentially incorporate AI tools to provide better support to students and teachers,” Davidesco explains.

The updated curriculum will include an AI agent that can help students who get stuck think through the steps required to create their code. 

“Of course, we don’t want the AI agent to do the thinking for the student,” Davidesco says. “A big part of this work will be to find the right balance between too little assistance and too much.”


A participant in the Engineering x Science x AI workshop places sensors on the right arm. Photo: Matthew Healey

Davidesco’s team is working with two New England high school biology teachers to design the AI agent and test it with their students. Their goal is to have a revised, AI-enhanced neural engineering unit ready for fall 2026, when they’ll recruit 10 more teachers to pilot it in their classes. The researchers will then refine the tools and curriculum for a second group of teachers to implement in 2027. Ultimately, about 20 New England teachers and 500 students will experience versions of the unit and provide their feedback.

Once classes complete the unit, the researchers will measure improvements in the students’ engineering design skills, computational thinking, and ability to prompt the AI agent (an increasingly useful skill). If the unit proves effective, the team’s next step will be to conduct a randomized experiment—where some classes have access to the AI agent and others don’t—to isolate the agent’s impact.


WHAT IS COMPUTATIONAL THINKING?

This structured approach to solving problems draws on concepts fundamental to computer science, including: 

  • decomposition (breaking down complex problems into smaller parts),
  • pattern recognition,
  • abstraction (filtering out unnecessary details to focus on core information), and
  • algorithmic thinking (creating step-by-step instructions to solve a problem).
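A toy illustration (hypothetical, not from the curriculum) shows how the four concepts might play out in the unit's flex-detection problem:

```python
def rectify(samples):
    """Decomposition: one small sub-problem, handled on its own."""
    return [abs(s) for s in samples]

def mean(values):
    """Decomposition: another self-contained sub-problem."""
    return sum(values) / len(values)

def is_flex(samples, threshold=0.5):
    # Abstraction: only the average amplitude matters, not every sample.
    # Pattern recognition: flexes show up as sustained high amplitude.
    # Algorithmic thinking: rectify, then average, then compare.
    return mean(rectify(samples)) > threshold

print(is_flex([0.9, -1.0, 1.1]))     # strong activity
print(is_flex([0.02, -0.01, 0.03]))  # near-silent signal
```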

Na’ama Av-Shalom, postdoctoral research associate, helping a teacher and student at the workshop

While student experiences and outcomes are the direct focus of the project, the researchers will also pay close attention to the participating teachers, according to Na’ama Av-Shalom, a postdoctoral research associate in Davidesco’s lab.

“It’s definitely important to us,” she says, “to figure out what kinds of supports teachers, especially life sciences teachers, need to implement this curriculum that involves so much computing, computational thinking, and engineering.”

“Students are naturally interested in understanding how their own brains work, so ... their interest in neuroscience can be used as leverage to engage them in science more broadly.”

—Ido Davidesco

Assistant Professor, Lynch School of Education and Human Development
Principal Investigator, Lab-to-Classroom Research Group

Using eye-tracking data to understand student engagement

Building on his commitment to classroom-oriented technology, Davidesco is spearheading several initiatives that bring sophisticated laboratory tools into everyday learning environments. One such study, funded by a 2021 NSF CAREER award, employs eye-tracking glasses to study student attention and engagement during actual college courses.


Undergraduate Research Assistant Rebekah Che wearing eye-tracking glasses. Photo: Xiaorui Xue

He began this research last semester with a pilot study involving 15 Boston College students taking an undergraduate applied psychology class. They wore eye-tracking glasses to class a few times during the semester, and software on mobile phones collected eye-movement data from the glasses, including gaze direction, scan paths, fixation duration, and blink frequency. On the days students wore the glasses, the instructor briefly paused the lecture every five to 10 minutes to ask them what they were thinking about. Students used an online survey tool to choose one of five answers: the class content, how well they understood the class content, personal matters unrelated to class, their current physical or emotional state, or other. At the end of class, they took a short quiz about the day’s lecture.

Preliminary findings from the pilot show that during class, students’ minds wandered about a third of the time.

“Which sounds like a lot,” suggests Davidesco, “but it’s actually consistent with previous literature.” Also consistent with previous studies: the more students’ minds wandered, the lower their scores on the end-of-class quizzes. “What is new,” according to Davidesco, “is the addition of the eye-tracking data, which we are currently analyzing.”

To refine this analysis, Davidesco is scaling up the research this semester with another class of BC undergraduates. His team plans to use machine learning to identify eye-movement patterns that indicate attention and inattention. With that information, they can conduct future studies using only the eye-tracking glasses, without interrupting students with questions about their attention.
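As a rough illustration of that machine-learning step (entirely hypothetical: the team's actual features and models aren't described here), a simple classifier could learn which combinations of eye-movement statistics tend to accompany self-reported mind-wandering:

```python
# Hypothetical sketch: a nearest-centroid classifier over two invented
# features (mean fixation duration in ms, blinks per minute), trained on
# windows labeled by in-class attention probes. The feature values and
# their link to attention are illustrative assumptions, not findings.

def centroid(rows):
    """Average each feature across a list of feature tuples."""
    n = len(rows)
    return tuple(sum(r[i] for r in rows) / n for i in range(len(rows[0])))

def train(labeled):
    """labeled: list of (features, label) pairs -> {label: centroid}."""
    by_label = {}
    for feats, label in labeled:
        by_label.setdefault(label, []).append(feats)
    return {label: centroid(rows) for label, rows in by_label.items()}

def predict(model, feats):
    """Assign the label whose centroid is closest (squared distance)."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(model, key=lambda label: dist2(model[label], feats))

probes = [((250, 8), "on-task"), ((240, 10), "on-task"),
          ((120, 22), "mind-wandering"), ((130, 25), "mind-wandering")]
model = train(probes)
print(predict(model, (245, 9)))   # resembles the on-task windows
print(predict(model, (125, 24)))  # resembles the mind-wandering windows
```

Once such a model is trained against the self-report probes, new eye-tracking windows could be labeled without pausing the lecture, which is the point of this phase of the study.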

“If we know how attention fluctuates in real-world classroom learning, instructional design can be improved.”

—Xiaorui Xue

Doctoral Student


Xiaorui Xue

The ultimate goal of this research, explains Xiaorui Xue, a doctoral student on the research team, is to help teachers better pace their lectures and lessons. 

“If we know how attention fluctuates in real-world classroom learning,” she says, “instructional design can be improved.” For example, by understanding typical attention patterns, teachers can decide when to shift from lecturing to something more interactive—or when to simply offer students a brain break.

Davidesco wants to ensure that students participating in his research can learn from it too. At the end of the BC pilot study, his team led two class sessions during which students developed research questions, then analyzed their own data to reach preliminary conclusions.

“That’s a unique experience to have as an undergrad,” Davidesco says. “It’s a common thread in my work that we treat students not only as research participants but also as research collaborators.”


CORE IDEAS

“Students are expected to use laboratory tools connected to computers for observing, measuring, recording, and processing data. Students are also expected to engage in computational thinking, which involves strategies for organizing and searching data, creating sequences of steps called algorithms, and using and developing new simulations of natural and designed systems.” 

—Next Generation Science Standards, Appendix F


Read more about how Lynch School faculty conduct research with educators.

Studying brain function in the classroom

Lynch School faculty are using brain imaging technology to develop new pedagogical techniques.

Lab-to-Classroom Research Group

This group conducts research at the intersection of neuroscience, cognitive science, psychology, and education.
