The committee met to reach consensus on which instructional materials would be field-tested in addition to Agile Mind, the materials currently being tested.
After reviewing the agenda and renewing commitments to norms, participants reviewed answers from the sales reps for Agile Mind, Glencoe, and Digits that were generated at the adoption committee meeting on January 26.
The team reviewed the layers of purpose for completing an IMET screening on all of the instructional materials under consideration.
Participants worked in grade-alike partnerships to complete an IMET screening using Non-Negotiables for Glencoe and Digits.
Once an IMET had been completed for each grade level of the Glencoe and Digits instructional materials, the committee reviewed a cross-reference of their work. It revealed that Digits did not meet Non-Negotiable criteria in several areas of the Grade 7 materials and in one area of the Grade 8 materials, while Glencoe met the Non-Negotiables in every area at each grade level.
The committee used a circle protocol to discuss these results as well as other insights gained through the IMET process. The discussion was followed by a vote, in which Glencoe was unanimously chosen as the instructional materials to be piloted alongside Agile Mind.
Details of the piloting process, including dates, were discussed. Participants will pilot with at least one grade level of students. Both Agile Mind and Glencoe are to be evaluated using a piloting scoring sheet, which uses a 4-point rubric and asks participants to rate the instructional materials on a variety of indicators addressing content, instructional approaches, differentiation, assessment, engagement, teacher friendliness, and format. At the close of the meeting, participants were asked to fill out the piloting scoring sheet for Agile Mind, with which they already have piloting experience.
Participants practiced using the Circle Forward protocol to start the day.
The adoption team saw a presentation from Glencoe, spent some time looking at the materials, and filled out a pre-screening form based on our district’s philosophy for math as well as our agreed-upon screening criteria.
Participants then saw a presentation from Digits, spent some time looking at those materials, and completed the same pre-screening form for them.
The adoption team used the Circle Forward protocol to discuss what we were thinking and wondering after seeing presentations and pre-screening. Overwhelmingly we felt that we needed time to take a deeper dive with the materials and “lift the hood” to see beyond the presentations and packaging.
The IMET was introduced as a tool for objectively looking at instructional materials and their relationship to standards. We spent some time learning about the Non-Negotiable metrics presented by IMET and then worked in grade-alike teams to apply the IMET to both Glencoe and Digits. All teams had surveyed the first few Non-Negotiable metrics when the afternoon drew to a close.
The Circle Forward protocol was used to discuss what we were thinking and wondering after using the IMET. Our initial impression was that none of the materials was as strong as we had hoped, and we expect that completing the IMET will give us stronger evidence to support a decision on which materials to pilot in addition to Agile Mind (which is currently being piloted).
The adoption committee reviewed their norms and set out to examine the essential question of the day: What is the relationship between standards and materials?
Participants read an article about how to adopt and implement curriculum. The committee felt it gave them guidelines for looking critically at publishers’ materials and for understanding what it really means to be aligned with the Common Core and the Mathematical Practices. The article also discussed the importance of professional development when rolling out a new curriculum: it is imperative to start out on the right foot and to ensure that everyone teaching a topic or unit is prepared and knowledgeable about the curricular materials and standards.
The committee spent time unwrapping standards in anticipation of building Frameworks to assist with objective feedback during the pilot. Unwrapping the standards was at first a daunting task, but working with colleagues to break the process into smaller chunks made it more manageable. This is an important process because it focuses attention on what we as educators should be emphasizing in our instruction: standards, depth of knowledge, and how these are connected and built across the year and across grade bands. Looking at the standards individually and using the template helped us see the bigger picture and identify what students should know and will learn based on the standards.
At the end of the day, the committee saw a presentation from Agile Mind and used their agreed-upon criteria to complete a pre-screening of the Agile Mind materials.
Scaling Google Sheet - Participants analyzed the criteria from “Criteria for Selecting Understanding Based Curriculum Materials” (UbD - Seif) and assigned values to each criterion for selecting mathematics materials in order to rank their importance. Before voting, participants added criteria to bring the list to 15.
HomeRoom - Participants looked at district data and reviewed data samples. No test or standard has been in place long enough to establish a data trend for comparison, and no test provides a statistical basis for judging the effectiveness of any recently used curriculum in our district. However, by looking at a number of ancillary assessments and measurements, we may be able to draw some conclusions about program effectiveness. STAR and some stability in state assessments will make program evaluation easier in the future.
Committee Powering Results / CCSS Powered Standards - Participants compared and contrasted the results from our MMA Day 1 power standards exercise. Looking at the table data, we compared what we believed to be the power standards with what the CCSS identifies as power standards. There were both similarities and differences.
UbD Reading - Participants looked at the 11 criteria from this reading, took a deeper look at each of them, and then developed a few more for our Scaling Google Sheet.
Power Noticings - We looked at the committee powering results and the Common Core powered standards and discussed grade-level and vertical thoughts on our noticings. We discussed how some of the standards the committee valued as major clusters were labeled only as additional in the Common Core powered standards. We then analyzed why we made the decisions we did and how that will affect the way we go about choosing our next set of materials.
CCSS Publisher’s Criteria - We looked at what criteria are used to evaluate materials for adoption. Much of the language focused on consistency and mathematical progression; the main ideas presented were focus, rigor, and coherence. We compared and contrasted this information with the UbD criteria and used it as a tool to scale each criterion by importance (from 1 to 8).
The Adoption Process - Participants learned how the adoption process works (Board Policy 2020 and Procedure 2020P) and why MSD is adopting materials for middle level math right now. The reasons included findings in the PDK Audit, our adoption cycle timeline, and our status as being in year 3 of piloting materials, which is the maximum allowed by policy and procedure.
Norms- Participants read a framing article about norms and used IdeaScale to suggest, vote upon, and adopt norms for the middle level math adoption committee.
Math Philosophy Statement- Participants read a draft of Marysville School District’s philosophy statement, vetted the draft, and suggested changes.
Powering Standards - Participants learned the rationale for powering standards locally (avoiding “curricular chaos,” benefiting vertical alignment, maximizing connections among standards, and establishing a guaranteed and viable curriculum, the number one factor in high-performing schools). Participants used the filtering criteria of endurance, leverage, and readiness for the next level of learning to examine standards, then used IdeaScale to suggest likely power standards and vote on them.
Learning List - Participants were introduced to Learning List, a third-party program that conducts unbiased reviews of instructional materials. Participants spent some time learning about the tool and exploring the various materials available through it, with a focus on Agile Mind.