PM1 to PM3 Predictive Modeling: Using Fall Baselines to Forecast Spring Outcomes

The FAST system's progress monitoring design, with three administrations per year (PM1 in fall, PM2 in winter, PM3 in spring), makes meaningful predictive modeling possible. For school leaders and data coaches, the ability to look at a student's PM1 score and forecast their PM3 outcome is a game-changer for intervention planning.

The research behind this is compelling, and the practical applications are immediate.

The Correlation Between PM1 and PM3

Data from large Florida districts and NWEA linking studies show strong linear relationships between fall baselines (PM1) and spring outcomes (PM3):

  • ELA: Correlation coefficient r is approximately 0.77
  • Math: Correlation coefficient r is approximately 0.76

These are strong correlations (an r of about 0.77 means PM1 explains roughly 59% of the variance in PM3 scores), making PM1 a reliable "leading indicator" of where a student will end up on PM3. Not perfect (the remaining variance reflects instruction, intervention, attendance, and other factors), but strong enough to support actionable predictions.

What the Correlation Means in Practice

PM1 Scores Predict PM3 Levels

We can map PM1 scores directly to predicted PM3 outcomes to identify "Action Zones." Using Grade 3 ELA as an example:

PM1 Score Range   Predicted PM3 Outcome   Action Zone
Below 171         Predicted Level 1       Intensive intervention needed
171-180           Predicted Level 2       Bubble zone: intervention can shift to L3
181-194           Predicted Level 3       On track, monitor for regression
195+              Predicted Level 4/5     Enrichment and acceleration
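
The mapping above can be sketched as a small lookup. This is a minimal sketch using the illustrative Grade 3 ELA cut points from the table (171, 181, 195); verify these values against the current official FAST score tables before using them operationally.

```python
# Map a Grade 3 ELA PM1 scale score to a predicted PM3 outcome and
# action zone. Cut points come from the illustrative table above and
# should be checked against the district's current FAST score tables.

GRADE3_ELA_CUTS = [
    (171, "Predicted Level 1", "Intensive intervention needed"),
    (181, "Predicted Level 2", "Bubble zone: intervention can shift to L3"),
    (195, "Predicted Level 3", "On track, monitor for regression"),
]

def predict_zone(pm1_score: int) -> tuple[str, str]:
    """Return (predicted PM3 level, action zone) for a PM1 scale score."""
    for upper_bound, level, action in GRADE3_ELA_CUTS:
        if pm1_score < upper_bound:
            return level, action
    return "Predicted Level 4/5", "Enrichment and acceleration"

print(predict_zone(168))  # deep Level 1: intensive intervention
print(predict_zone(178))  # bubble zone candidate
print(predict_zone(201))  # enrichment and acceleration
```

Run against a full PM1 roster, this produces the zone label for every student in seconds.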

The Intervention Opportunity Zone

Students scoring in the upper half of the predicted Level 2 range (e.g., 176-180) are the prime candidates for "Bubble" intervention. They are projected to fall just short of passing; accelerating their growth trajectory by even 5-10% can carry them over the proficiency cut.

Conversely, students in the lower half of the predicted Level 3 range (181-185) are "At-Risk Proficient" students who need "protective" intervention to ensure they don't slide back into Level 2.

Building a PM1-Based Prediction System

Step 1: Establish Your Baseline Data

After PM1 administration (typically September-October), pull every student's scale score and current achievement level. This is your starting snapshot.

Step 2: Calculate Distance to Cut

For each student, calculate:

  • Distance to the next level boundary above (how far to the next level up)
  • Distance to the level boundary below (how close to sliding down)

Students within 8-10 points of any boundary are in the "action zone."
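
Both distances can be computed in one pass. A minimal sketch, again assuming the illustrative Grade 3 ELA boundaries; the 10-point band matches the upper end of the article's 8-10 point guidance.

```python
# Compute each student's distance to the nearest level boundaries and
# flag students inside the "action zone" band. Boundaries are the
# illustrative Grade 3 ELA cut points; swap in the official cuts for
# your grade and subject.

BOUNDARIES = [171, 181, 195]   # L1/L2, L2/L3, L3/L4-5 cuts (illustrative)
ACTION_BAND = 10               # points; the article suggests 8-10

def distance_to_cuts(score: int) -> dict:
    above = [b for b in BOUNDARIES if b > score]
    below = [b for b in BOUNDARIES if b <= score]
    return {
        "to_next_level_up": min(above) - score if above else None,
        "to_boundary_below": score - max(below) if below else None,
    }

def in_action_zone(score: int) -> bool:
    d = distance_to_cuts(score)
    return any(v is not None and v <= ACTION_BAND for v in d.values())

print(distance_to_cuts(176))  # 5 points below the L3 cut, 5 above the L2 cut
print(in_action_zone(176))    # True
```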

Step 3: Apply Historical Growth Rates

If you have prior-year data, calculate the typical PM1-to-PM3 growth for students at similar starting points. This gives you a school-specific growth expectation, which is often more accurate than generic predictions.
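
One simple way to build that school-specific expectation: take the median PM1-to-PM3 gain of last year's students who started near the same PM1 score. The band width and the sample records below are illustrative.

```python
# Estimate a school-specific PM1->PM3 growth expectation: the median
# gain among prior-year students who started within +/- `band` points
# of this student's PM1 score. The prior-year pairs are hypothetical.

from statistics import median

def expected_growth(prior_year, pm1_score, band=10):
    """Median PM1->PM3 gain for similar prior-year starting scores."""
    gains = [pm3 - pm1 for pm1, pm3 in prior_year
             if abs(pm1 - pm1_score) <= band]
    return median(gains) if gains else None

# Hypothetical prior-year (PM1, PM3) score pairs
prior = [(170, 184), (175, 186), (178, 192), (182, 190), (195, 203)]
print(expected_growth(prior, 176))  # median gain of the four nearby starters
```

With a real prior-year file, the same function gives a defensible, local growth expectation for any starting score.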

Step 4: Classify Students into Action Groups

Based on the PM1 prediction, sort students into intervention tiers:

Group                     Description                                               Recommended Action
Green: On Track           PM1 predicts L3+ on PM3                                   Monitor, enrich, maintain
Yellow: Bubble Up         PM1 predicts high L2; intervention could push to L3       Targeted Tier 2, focus on weakest strand
Orange: Growth Target     PM1 predicts L1-High or L2-Low; sub-level gain possible   Tier 2/3, target next sub-level boundary
Red: Intensive            PM1 predicts deep L1; significant gap                     Tier 3 intensive, target sub-level gain
Blue: At-Risk Proficient  PM1 predicts low L3; could slide to L2                    Monitoring + targeted support on weak areas
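
The sorting rule can be sketched as a threshold ladder over the predicted PM3 score (PM1 plus an expected-growth estimate). The cut points are the illustrative Grade 3 ELA values, and the 5-point bubble band is an assumption you should tune locally.

```python
# Sort students into the color-coded action groups based on a predicted
# PM3 score. L2_CUT/L3_CUT are illustrative level boundaries;
# BUBBLE_BAND (assumed 5 points) defines "high L2" and "low L3".

L2_CUT, L3_CUT = 171, 181
BUBBLE_BAND = 5

def action_group(predicted_pm3: int) -> str:
    if predicted_pm3 >= L3_CUT + BUBBLE_BAND:
        return "Green: On Track"
    if predicted_pm3 >= L3_CUT:
        return "Blue: At-Risk Proficient"   # low L3, could slide to L2
    if predicted_pm3 >= L3_CUT - BUBBLE_BAND:
        return "Yellow: Bubble Up"          # high L2, close to the cut
    if predicted_pm3 >= L2_CUT - BUBBLE_BAND:
        return "Orange: Growth Target"      # low L2 / high L1
    return "Red: Intensive"                 # deep L1, significant gap

for score in (190, 183, 178, 169, 155):
    print(score, action_group(score))
```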

The PM2 Check-In: Adjusting the Forecast

PM2 (administered mid-year, typically December-January) is the critical checkpoint. By comparing PM1-to-PM2 growth against expected PM1-to-PM3 trajectories, you can determine:

On Track

The student has grown at or above the expected rate. Maintain current intervention.

Stalled

The student has shown minimal growth from PM1 to PM2. This is an immediate red flag. If a bubble student shows flat velocity at PM2, intervention must be changed (not just continued) before the PM3 window.

Possible adjustments:

  • Increase intervention frequency (2x/week to 4x/week)
  • Change the intervention type (switch from comprehension to fluency if that is the underlying bottleneck)
  • Change the interventionist (sometimes a different teaching style unlocks growth)
  • Add a confidence/test-stamina component if the student knows the content but underperforms on the CAT

Regression

The student's PM2 score is lower than PM1. Investigate immediately:

  • Attendance issues?
  • Social-emotional factors?
  • Was PM1 an anomaly (artificially high due to lucky guesses on the CAT)?
  • Has instruction changed (new teacher, different curriculum)?
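
The three statuses above reduce to a simple velocity check. This sketch assumes PM2 falls roughly halfway through the PM1-to-PM3 window, so the midpoint target is half the expected full-year gain; the 3-point stall tolerance is an assumption to absorb normal score noise.

```python
# Classify a student's PM2 status by comparing actual PM1->PM2 growth
# against the expected pace. Assumes PM2 sits at the midpoint of the
# PM1->PM3 window; STALL_TOLERANCE (3 points) is an illustrative buffer.

STALL_TOLERANCE = 3  # scale-score points

def pm2_status(pm1: int, pm2: int, expected_pm1_to_pm3_gain: int) -> str:
    actual = pm2 - pm1
    midpoint_target = expected_pm1_to_pm3_gain / 2
    if actual < 0:
        return "Regression"
    if actual < midpoint_target - STALL_TOLERANCE:
        return "Stalled"
    return "On Track"

print(pm2_status(176, 183, 12))  # grew 7 vs. midpoint target 6 -> On Track
print(pm2_status(176, 177, 12))  # grew 1 vs. target 6 -> Stalled
print(pm2_status(176, 172, 12))  # lost ground -> Regression
```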

Predictive Modeling for School-Wide Planning

Projecting School Grade Components

By applying the PM1 prediction model to every student in the school, administrators can project:

  1. Predicted proficiency rate (percentage of students predicted to score L3+ on PM3)
  2. Predicted learning gains rate (percentage of students predicted to show a gain)
  3. Predicted lowest quartile gains (percentage of bottom 25% predicted to show a gain)

These projections, made in October, allow schools to identify exactly how many additional students need to be moved to hit target thresholds for each school grade component.

Example School-Wide Analysis

A school with 300 tested students might find after PM1:

  • 42% predicted proficient (need 50% for an "A" component)
  • That means 24 additional students need to cross into Level 3
  • Of those 24, 16 are in the "Yellow: Bubble Up" zone within 10 points of the cut
  • If intervention succeeds with 75% of the bubble group (12 students), the school still needs 12 more from other zones
  • The school can now plan resource allocation with precision
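
The arithmetic in this worked example is easy to reproduce for any school. All numbers below come from the example itself; only the helper function name is ours.

```python
# Reproduce the school-wide analysis: how many additional students must
# cross into Level 3 to hit a target proficiency rate, and how far the
# bubble group gets you. Figures come from the worked example above.

import math

def students_needed(total, predicted_rate, target_rate):
    """Additional proficient students needed to reach the target rate."""
    return max(0, math.ceil(total * target_rate) - round(total * predicted_rate))

total = 300
gap = students_needed(total, 0.42, 0.50)
print(gap)                  # 24 students must cross into Level 3

bubble = 16                 # Yellow-zone students within 10 points of the cut
converted = round(bubble * 0.75)
print(converted)            # 12 expected to convert at a 75% success rate
print(gap - converted)      # 12 more needed from other zones
```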

Using PM1 Data with Cross-Subject Analysis

The predictive power of PM1 data increases significantly when combined with cross-subject analysis. A student's PM1 Math score combined with their PM1 ELA score creates a richer profile:

  • Strong Math, Weak ELA PM1: This student is a Technical Bubble candidate in ELA. Their math ability suggests cognitive capacity for higher performance. Target ELA-specific intervention.
  • Weak Math, Strong ELA PM1: Technical Bubble candidate in Math. Leverage their reading ability (they can engage with word problems and math text) to accelerate math growth.
  • Weak Both PM1: Broader intervention needed, but prioritize the subject where they are closest to a cut point.
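
These three profiles amount to a two-flag decision table. A minimal sketch, where "proficient" stands in for whatever threshold (e.g., the Level 3 cut) the district uses for each subject:

```python
# Build a cross-subject profile from PM1 proficiency flags in both
# subjects. The proficiency threshold itself (e.g., the L3 cut) is
# assumed to be applied upstream when setting these booleans.

def cross_subject_profile(math_proficient: bool, ela_proficient: bool) -> str:
    if math_proficient and not ela_proficient:
        return "Technical Bubble in ELA: target ELA-specific intervention"
    if ela_proficient and not math_proficient:
        return "Technical Bubble in Math: leverage reading to accelerate math"
    if not math_proficient and not ela_proficient:
        return "Weak both: prioritize the subject closest to a cut point"
    return "Proficient in both: monitor and enrich"

print(cross_subject_profile(True, False))
print(cross_subject_profile(False, False))
```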

Common Pitfalls in Predictive Modeling

Pitfall 1: Over-Reliance on a Single Data Point

PM1 is one snapshot. Students who were sick, anxious, or had a bad testing day may have artificially low scores. Cross-reference with classroom performance before making permanent group assignments.

Pitfall 2: Ignoring the Confidence Interval

Remember the Standard Error of Measurement (SEM). A PM1 score of 292 and a PM1 score of 298 may look different, but within the SEM, they could represent the same true ability. Do not over-segment students based on small score differences.
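
A quick guard against over-segmentation: treat two scores as meaningfully different only when their +/- 1 SEM bands do not overlap. The SEM value below is a placeholder; use the SEM reported in the test's technical documentation.

```python
# Check whether two scale scores differ beyond measurement error.
# If the bands (score +/- SEM) overlap, the scores may reflect the same
# true ability. SEM = 4 is an illustrative placeholder, not the actual
# published SEM for any FAST test.

SEM = 4

def meaningfully_different(score_a: int, score_b: int, sem: int = SEM) -> bool:
    """True only if the +/- 1 SEM bands around each score do not overlap."""
    return abs(score_a - score_b) > 2 * sem

print(meaningfully_different(292, 298))  # 6-point gap, within 2*SEM -> False
print(meaningfully_different(292, 305))  # 13-point gap -> True
```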

Pitfall 3: Setting It and Forgetting It

PM1 predictions are a starting point, not a destiny. The whole purpose of PM2 is to validate or adjust the prediction. Schools that assign intervention groups in October and never revisit them until April are wasting the most powerful feature of the FAST system.

Pitfall 4: Not Acting Fast Enough

If PM1 data arrives in October and intervention groups are not formed until November, that is 4-6 weeks of lost instruction. Build the intervention schedule before PM1 data arrives, using prior-year data as a placeholder, and slot students in as soon as PM1 results are available.

Key Takeaways

  1. PM1 is a strong predictor of PM3 with correlations around 0.76-0.77. Use it to forecast and plan, not just report.
  2. Identify your Action Zones immediately after PM1: Bubble Up, Growth Target, Intensive, and At-Risk Proficient.
  3. PM2 is for course correction, not just progress reporting. If velocity is flat, change the intervention before PM3.
  4. Project school-wide outcomes using PM1 data to quantify exactly how many students need to move and where the resources should go.
  5. Combine PM1 with cross-subject data for richer student profiles and more targeted intervention planning.
