SGP is a software package used to calculate student growth percentiles and growth projections/trajectories from large-scale longitudinal education assessment data, such as test scores, portfolios, or end-of-course grades. These measures can identify students in need of extra support, evaluate current educational programs, and help close achievement gaps between high and low performers.
An SGP is calculated by comparing a student's score on the most recent assessment with the scores of students who had similar results in prior testing windows (the student's academic peers). The result is a percentile between 1 and 99 indicating how the student's growth compares with that of students with similar MCAS performance histories; the higher the number, the greater the progress.
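The peer-comparison idea above can be sketched in code. Note that the actual SGP package is written in R and estimates conditional percentiles with quantile regression over several years of prior scores; the function below is only a simplified illustration that computes an empirical percentile among peers sharing one prior score, and all names in it are hypothetical.

```python
from bisect import bisect_left

def growth_percentile(prior, current, cohort):
    """Illustrative (not the SGP package's method): percentile rank of
    `current` among students in `cohort` -- a list of
    (prior_score, current_score) pairs -- whose prior score equals
    `prior`, i.e. the student's academic peers."""
    peers = sorted(c for p, c in cohort if p == prior)
    if not peers:
        raise ValueError("no academic peers with this prior score")
    # Fraction of peers whose current score falls below the student's,
    # scaled and clamped to the 1-99 range used for reported SGPs.
    rank = bisect_left(peers, current) / len(peers)
    return max(1, min(99, round(rank * 100)))

# Toy cohort: four peers with prior score 240, one with 260.
cohort = [(240, 245), (240, 250), (240, 255), (240, 260), (260, 270)]
print(growth_percentile(240, 256, cohort))  # the student outgrew 3 of 4 peers -> 75
```

A student scoring 256 here grew more than three of the four peers with the same prior score, so the sketch reports a growth percentile of 75.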
SGP stands for Student Growth Percentile, a measure of student progress over time relative to other students with comparable MCAS performance histories. Teachers and other stakeholders can use this data to analyze trends and identify patterns in student achievement, narrow gaps between high and low performers, and focus efforts on improving teaching practices and classroom environments.
The sgpData_INSTRUCTOR_NUMBER table provides anonymized instructor information associated with each student test record. This information is combined with the sgpData_STUDENT_PERCENTILES and PROJECTIONS_WITH_INSTRUCTOR_NUMBER tables to associate estimated student growth percentiles and projections/trajectories with an instructor.
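The join described above can be sketched as follows. This is a hypothetical pandas illustration, not the SGP package's own workflow (which is R-based); the miniature tables and their column names (`STUDENT_ID`, `YEAR`, `SGP`, `INSTRUCTOR_NUMBER`) are assumptions made for the example, not the package's actual schema.

```python
import pandas as pd

# Hypothetical miniature versions of the tables named in the text;
# column names are illustrative assumptions, not the real schema.
student_percentiles = pd.DataFrame({
    "STUDENT_ID": [101, 102, 103],
    "YEAR": [2023, 2023, 2023],
    "SGP": [62, 35, 88],
})
instructor_numbers = pd.DataFrame({
    "STUDENT_ID": [101, 102, 103],
    "YEAR": [2023, 2023, 2023],
    "INSTRUCTOR_NUMBER": [7, 7, 9],
})

# Attach the anonymized instructor to each student's SGP record,
# then summarize the median SGP for each instructor.
merged = student_percentiles.merge(instructor_numbers, on=["STUDENT_ID", "YEAR"])
by_instructor = merged.groupby("INSTRUCTOR_NUMBER")["SGP"].median()
print(by_instructor.to_dict())  # {7: 48.5, 9: 88.0}
```

Merging on both the student identifier and the year keeps each record tied to the correct testing window before any instructor-level summary is computed.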
Example: a student with an SGP of 20 in both math and ELA made more progress than only 20 percent of students with comparable MCAS performance histories, indicating weaker-than-typical growth. Conversely, a student with an SGP of 50 in both subjects grew as much as or more than half of similar students, indicating typical progress, and an SGP well above 50 indicates stronger-than-typical progress.
Although these relationships would still exist if SGPs could be measured on standardized tests without measurement error, it is essential to remember that estimated SGPs are noisy measures of latent achievement traits, so small differences should not be taken as meaningful differences in student performance. Our analyses suggest that many of the differences seen in Figure 2 could be attributable to contextual influences or teacher sorting rather than to true changes in aggregated SGPs.