Purpose: This project team will fully develop and test a product for educators that organizes and interprets student solutions to mathematical problems. Mathematical problem solving is at the heart of effective mathematics learning because it requires students to think deeply about content and employ higher-order thinking. However, assessing problem solving is difficult. Multiple-choice questions assess only the final answer and ignore the process. And when students do show their work, it is often messy, incomplete, and hard to follow, making it difficult for educators to analyze and assess.

Project Activities: During Phase I in 2020, the team developed a prototype of a dashboard for teachers to view, monitor, and gain insight from students' digital work in mathematics to inform instruction. The prototype presents insights to support teachers' pedagogical approaches, identify areas for personalized intervention, and plan math discussions that address student misconceptions and support peer-to-peer learning. At the end of Phase I, researchers completed a pilot study with three grade 5 teachers and 75 students. Results demonstrated that the prototype functioned as planned, that educators could use it to track student progress, and that teachers agreed it facilitated ways to sharpen instruction and deepen student learning. In Phase II, the team will fully develop a machine learning engine that automatically categorizes student problem-solving strategies and provides insights to teachers, as well as training materials to support implementation. After development is complete, the research team will conduct a pilot study to assess the feasibility, usability, and promise of the product to support educators in understanding individual students' problem-solving strategies.
The study will include 60 grade 5 classrooms; half will be randomly assigned to use the product, and half will continue with business-as-usual procedures. Researchers will employ a mixed-methods design to assess how educators use the product to inform instruction, including researcher-developed interview protocols, teacher logs, and classroom observations. To examine pre- to post-test differences between students in treatment and control classrooms, researchers will employ items from the Mathematics Assessment Project and from NAEP. Researchers will gather cost information using the "ingredients method" and will include all expenditures, such as personnel, facilities, equipment, materials, and training.

Product: Teachley Problem-Solving Assessment will organize and interpret student solutions to math problems. The product's dashboard will present insights to help teachers view, monitor, and assess student work to inform their instruction. The main components will include a machine learning engine designed to generate automated prompts asking students to justify their work and to automatically categorize, sort, and present student work; a user interface designed to present student work cleanly; and teacher professional development resources to support implementation.