Welcome to the 2018 M-STEP Reports Webcast for School and District Assessment Coordinators.
This presentation covers the Spring 2018 M-STEP reports that are currently available in the
Dynamic Score Reporting Site which can be accessed through the Secure Site.
If you need access to the Secure Site, please go to www.michigan.gov/mstep
and select the "Secure Site Training Documentation" link that is found under the Professional Development heading.
In this presentation, I will begin by building some background regarding the type of data that is
included in the reports on the Dynamic Score Reporting Site, as well as the difference between the
Overall Performance Levels and the Claim Performance Indicators that are provided on the
English language arts and mathematics reports.
First, I will review the different types of data that are provided in the various reports in the Dynamic Score Reporting Site.
Next, I will examine the difference between performance levels, which are based on scale scores, and claim performance indicators.
Then, you will see what data is included in the student level M-STEP reports, followed by the aggregate level M-STEP reports.
Finally, I will discuss the appropriate use of the data provided in the Dynamic Score Reporting Site.
There are two types of data reported in the M-STEP reports: student-level and aggregate – or group – data.
It is important to note that the type of data determines how the data in a report is appropriately used.
Here we have the two types of data: student level and aggregate. Student-level data reflects one student's performance
on the M-STEP. It provides the educator with a snapshot of what a student knows and is able to do relative to
Michigan's academic standards at one point in time.
Aggregate data reports measure groups of students' performance relative to Michigan's standards at one point in time.
This data can be used to look at program effectiveness, curriculum alignment, how a building's
program is serving students in different demographic groups, or how a program serves students in one grade level over time.
It can also be used to compare program effectiveness in one building with another building in the district,
with the district as a whole, or even with the state.
It is important to note that both student-level and aggregate data are important and necessary
when considering the education schools provide to Michigan students.
However, keep in mind the different types of data are intended for different purposes, and should be used accordingly.
Next, we will discuss Performance Levels and Claim Performance Indicators.
The M-STEP reports provide information on what a student knows and is able to do in a content area as a whole, as well as for the claims
within each content area in mathematics and ELA.
Performance Levels and Claim Performance Indicators are used to report this information.
Let's see what you already know. Pause this video and take a moment to write on scratch paper:
What are Performance Levels, and what are Claim Performance Indicators?
Think about: what they are, how they are the same, and how they are different.
This slide lists and defines each of the overall performance levels.
These are based on the scale scores that determine whether a student scored Advanced, Proficient, Partially Proficient,
or Not Proficient in the assessed content-area.
Each of these Performance Levels is based on what students know and are able to do relative to Michigan's content standards.
They are descriptors of content-area performance.
An Advanced performance level indicates that performance has exceeded grade-level content standards;
Proficient indicates performance that meets grade-level content standards;
Partially Proficient indicates performance that shows a partial understanding of grade-level content standards;
and Not Proficient indicates performance showing a minimal understanding of grade-level content standards.
This is a sample of the Performance Levels as seen on the M-STEP reports.
The scale score ranges for each performance level are shown below the graph.
Also shown on this graph are the Performance Levels – Not Proficient is shown in red, Partially Proficient in yellow,
Proficient is shown in green, and Advanced in blue.
This is an example of the claim performance indicator bar.
This bar provides context to the student's claim performance by showing the
student's claim-level performance relative to the range of possible performance for each claim.
If a student scores "attention" on a claim, the educator can see on the bar where in the range of scores for this indicator the student scored.
This slide shows you the Drill Down path that is a feature of the Dynamic Score Reporting Site. As a note - users can begin drilling down on
any report that has the drill-down feature; it is not necessary to start at the District Comprehensive Report.
The user can select the school name on the District Comprehensive Report to open the School Demographic Report for that school.
The School Demographic Report drill-down enables the
user to select the link in the Number of Students Assessed column to open a Student Roster Report
which includes the students represented in the aggregated group.
While in the Student Roster Report, the user can drill down further by selecting the student name to open
an Individual Student Report.
After a user has selected the link to drill down into the next report, a breadcrumb area appears
below the ISD/District/School entity information.
Each report name in the breadcrumb is an active link.
To return to any previous report, the user selects the report name in the breadcrumb.
Next, we will discuss the M-STEP reports that are available in the Dynamic Score Reporting Site. This slide lists the
aggregate data reports. Aggregate reports summarize the performance of groups of students and are helpful
to educators in reviewing the effectiveness of programs and curricula in schools.
The Comprehensive Report provides a comparison of student achievement among schools within a district or districts within an
ISD. The Demographic Report provides a comparison of student achievement across
demographic sub-groups such as gender and race or ethnicity.
Both the Demographic Report and the Comprehensive Report summarize the mean scale score and performance levels for the total
number of students assessed.
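To make this concrete, here is a minimal sketch of the kind of summary these two reports display – a mean scale score and the percentage of students at each performance level – computed from hypothetical student records; the field names and values are illustrative, not the reporting system's actual data.

```python
from collections import Counter
from statistics import mean

# Sketch of the summary a Comprehensive or Demographic report displays: the
# mean scale score and the percentage of students at each performance level.
# The records and field names here are hypothetical.

students = [
    {"scale_score": 1470, "level": "Not Proficient"},
    {"scale_score": 1485, "level": "Partially Proficient"},
    {"scale_score": 1512, "level": "Proficient"},
    {"scale_score": 1530, "level": "Advanced"},
]

mean_score = mean(s["scale_score"] for s in students)
counts = Counter(s["level"] for s in students)

print(f"Students assessed: {len(students)}")
print(f"Mean scale score: {mean_score:.1f}")
for level in ("Not Proficient", "Partially Proficient", "Proficient", "Advanced"):
    print(f"{level}: {100 * counts[level] / len(students):.0f}%")
```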
The Expectation Analysis report provides detailed information on student performance in social studies.
This report shows the number of students falling within identified percentage ranges of points earned over points possible for social
studies only.
The Target Analysis Report provides relative strength and weakness information on target-level performance for
English language arts and mathematics.
The Science Field Test District Summary report is a basic summary report displaying a frequency
distribution of the average – or mean – percent of points earned and showing where a district falls on the statewide distribution.
The report does not include constructed response results and is not intended to provide any indication of student proficiency.
The data from the field test will be used to determine how well the field test items measure the intended standards.
A note about the Science field test – in 2018 and 2019, the M-STEP science test is a field test.
Results from a field test are not intended to provide information on student achievement. Instead, field test results verify the adequacy of
testing procedures and the statistical characteristics of new test items or new test forms.
Field test data also informs the test development process regarding the quality and performance of the new items based on state
academic standards.
So, the Science Field Test District Summary Report, which will be available after the initial release of aggregate reports, will display district
and state level aggregate raw score data. This report is the only report that contains any science data –
no individual level or other aggregate report contains science field test data.
Next, we will look at the student level data reports.
You can see in this table the title of each report with a short description of each of the student level reports.
The Student Record Labels are printed and sent to schools as sticker-labels for inclusion in the student's CA-60 folder.
Additional copies are available on the Secure Site by selecting Student Record Labels from the Reports drop-down menu.
They summarize student performance levels in each content area assessed.
Individual Student Reports – also called ISRs – are reported by content area;
a student has a separate ISR for each content area assessed.
Parent Reports summarize student achievement by content area and are printed and sent to schools to be delivered to parents.
This report includes a letter from Interim Superintendent Sheila Alles to parents regarding the report results and provides
resources for parents.
The Student Roster provides overall proficiency information for the aggregate groups and for the rostered students,
and it also includes student level achievement data for educators in a sortable form.
Educators can sort by student name, scale score, and ELA or mathematics claim or social studies discipline
while in the Dynamic Score Reporting System.
The last student level report is the Student Overview.
The overview provides scale score, performance level and claim or discipline information in a summary format for all content
areas assessed.
This section will discuss the aggregate-level reports:
which includes the Comprehensive Report, Demographic Report, Target Analysis Report, and the Expectation Analysis Report.
Also included is a preview of the Science Field Test District Summary Report.
This is a sample of the district level comprehensive report.
The comprehensive report contains entity proficiency information by grade and content and is available at the ISD and district levels.
As shown in this example, the All Schools graph displays proficiency information for all schools in the district,
and the user can also select one school (or district, in the ISD report) to view a graph of the school (or district)'s proficiency information.
The school name in blue allows the user to drill down to the School level Demographic Report.
The drill down is only available on the District level Comprehensive Report.
The comprehensive report can be used to view district- or school-wide proficiency information.
The Demographic report provides a comparison of students by grade and content, aggregated across user-selected demographic
groups, showing the percentage of students scoring at each performance level.
This slide shows an example of a School Demographic Report.
You can see the graphs on the left highlighting the performance levels according to aggregate group.
The All Students graph is the default view, and users can select any demographic subgroup to display a graph that contains the performance
level data for the selected demographic group.
In the table format, users can view and compare the number of students assessed, the mean scale score, and the percentages
of students – for all students as well as for each listed demographic subgroup – scoring Not Proficient, Partially Proficient,
Proficient, and Advanced, along with the combined percentage scoring Proficient or Advanced.
The School Demographic Report includes a drill-down feature that allows users to select the blue number in the Number of Students
Assessed column to open a Student Roster Report.
Next, I will discuss the Target Analysis Report. It is important to note that the Target Analysis Report is not like the other aggregate reports –
it does not report proficiency data. Instead, it reports relative areas of strength or weakness compared to the aggregate
group's performance on the test as a whole, for English language arts and mathematics only.
This report is only available for ELA and math because they are adaptive assessments.
The Target Analysis Report is available at the state, ISD, district, and school levels.
The blue upward pointing triangle indicates an area of relative strength, the yellow circle indicates that an assessment target is neither
a strength nor weakness, and an orange downward pointing triangle indicates a relative weakness –
again, as compared to the aggregate group's performance on the test as a whole.
There is also an asterisk symbol that indicates there is insufficient data to report.
On this report, in order to make a valid determination of the relative strength or weakness –
or neither – of an assessment target, there must be a minimum of 15 unique students assessed per target,
3 unique items per target, and 25 responses per target.
If any one of these requirements is not met, then an asterisk is displayed.
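As an illustration, the following minimal sketch applies these three thresholds; the function name and inputs are hypothetical, but the minimums come directly from the rule just described.

```python
# Sketch of the Target Analysis suppression rule: an indicator is reported
# only when all three minimums are met; otherwise an asterisk is displayed.
# The thresholds come from the rule above; the function and inputs are
# illustrative.

def target_symbol(unique_students, unique_items, responses, indicator):
    if unique_students >= 15 and unique_items >= 3 and responses >= 25:
        return indicator  # "strength", "neither", or "weakness"
    return "*"  # insufficient data to report

print(target_symbol(20, 4, 30, "strength"))  # strength
print(target_symbol(12, 4, 30, "strength"))  # * (fewer than 15 students)
```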
The number of students assessed is displayed for the report.
Next, the Claim is displayed, followed by a list of the Assessment Targets within each claim.
The Assessment Targets are mapped to the Claims and Michigan's Academic Standards in the Crosswalk documents that are available
on the M-STEP webpage at www.michigan.gov/mstep.
You can view these documents to see how the claims, assessment targets, and content standards are grouped.
The right column displays the symbol representing the relative strength and weakness indicators for each assessment
target. The Target Analysis Report can be used to determine relative areas of strengths and weaknesses
for the represented aggregate group.
The Expectation Analysis Report provides the percentages of points earned by grade and content area expectations in each discipline.
This report presents social studies data only for grades 5, 8, and 11.
The report provides an overview of performance by content expectation;
however, users should keep in mind that the number of items assessed on each expectation may be small.
The table identifies each expectation assessed, followed by the number of students assessed on that expectation,
the average percentage of points earned, and the number of students scoring in each of four percentage groupings:
0-25%, 26-50%, 51-75%, and 76-100%.
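To make the groupings concrete, here is a minimal sketch of how a student's percent of points earned might be binned; the bin labels come from the report, while the boundary handling and data are assumptions.

```python
from collections import Counter

# Sketch of the Expectation Analysis percentage groupings: each student's
# percent of points earned on an expectation falls into one of four bins.
# The bin labels come from the report; the boundary handling and the data
# here are illustrative assumptions.

def percent_bin(points_earned, points_possible):
    pct = 100 * points_earned / points_possible
    if pct <= 25:
        return "0-25%"
    elif pct <= 50:
        return "26-50%"
    elif pct <= 75:
        return "51-75%"
    return "76-100%"

# (earned, possible) for four hypothetical students on one expectation
scores = [(1, 4), (2, 4), (3, 4), (4, 4)]
print(Counter(percent_bin(e, p) for e, p in scores))
# Counter({'0-25%': 1, '26-50%': 1, '51-75%': 1, '76-100%': 1})
```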
The Science field test District Summary Report will be available after the initial release of reports.
The report will contain frequency distribution graphs that display aggregated raw score information about
student performance for the district and the state.
The x-axis of the graph will display mean points earned percent ranges,
and the y-axis will display the percentage of districts scoring in each points earned range.
The report will be a series of graphs – one for the overall content area and one for each domain – Physical Science,
Life Science, and Earth Science.
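As a rough illustration of such a graph, the sketch below builds a comparable frequency distribution from hypothetical district values; it assumes the matplotlib library and invented data, not the actual report output.

```python
import matplotlib.pyplot as plt

# Sketch of the described frequency-distribution graph: the x-axis shows
# mean-points-earned percent ranges, the y-axis the percent of districts in
# each range. The district values below are hypothetical.

district_mean_pcts = [42, 48, 55, 61, 58, 67, 73, 50, 45, 62]
bins = list(range(0, 101, 10))
# Weight each district so bar heights sum to 100 (percent of districts).
weights = [100 / len(district_mean_pcts)] * len(district_mean_pcts)

plt.hist(district_mean_pcts, bins=bins, weights=weights, edgecolor="black")
plt.xlabel("Mean points earned (%)")
plt.ylabel("Percent of districts")
plt.title("Overall science (hypothetical field-test data)")
plt.show()
```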
Watch the Spotlight for more information about when this report will be available.
As a reminder, this report is based on data from field test items.
The field test data will be used to determine how well the field test items measure the intended standards.
This data is not intended to provide proficiency information in relation to the Michigan K-12 Science Standards or domains.
Decisions about school improvement goals, curriculum, or other instructional decisions should be based on locally developed science
assessments that measure student achievement based on the Michigan K-12 Science Standards.
In this section we will identify the information provided in the Student-Level reports: the Individual Student Report,
Parent Report, Student Roster, and Student Overview report.
Student-level reports contain federally protected student information; therefore, the information in these reports must
be used in accordance with the Family Educational Rights and Privacy Act – or FERPA.
All of the images used in this presentation have been de-identified and use mocked up data so that no actual student performance information
is shared.
This is the Individual Student Report with identifying information removed for privacy purposes.
There are three main sections in the ISR, and they are marked with 1, 2, and 3.
First is the entity information and the student demographic information.
This section details the school, district, and ISD where the student took the test.
Next is identifying information about the student – you will see the student's name, grade, gender, date of birth, ethnicity,
whether the student has a disability, is an English learner or former English learner,
and whether the student received any designated supports or accommodations when taking the M-STEP.
In the second area – marked by the 2 on this report - you will see the overall content performance.
This example is a mathematics report.
The student's scale score, 1512, is listed, and the student's Performance Level is indicated –
in this case, Proficient.
You can also see the margin of error for the student's score displayed in gray.
Along the bottom of the Performance scale, you can see the scale score ranges for each Performance level.
1409-1477 for Not Proficient, 1478-1499 for Partially Proficient, and so on.
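To illustrate how a scale score maps onto these ranges, here is a minimal sketch. Only the first two cut ranges appear on this sample report; the Advanced cut in the sketch is a labeled placeholder, since cut scores vary by grade and content area.

```python
# Sketch mapping a scale score to a performance level using cut ranges like
# those printed below the graph. The Not Proficient and Partially Proficient
# ranges are taken from this sample report; ADVANCED_CUT is a hypothetical
# placeholder, since actual cut scores vary by grade and content area.

ADVANCED_CUT = 1528  # placeholder for where Advanced begins, illustration only

def performance_level(scale_score):
    if scale_score <= 1477:
        return "Not Proficient"        # 1409-1477 on the sample report
    elif scale_score <= 1499:
        return "Partially Proficient"  # 1478-1499 on the sample report
    elif scale_score < ADVANCED_CUT:
        return "Proficient"
    return "Advanced"

print(performance_level(1512))  # Proficient, matching the sample ISR
```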
Next you see in table form the Subject, Scale Score, Margin of Error, Performance Level, and Student Growth Percentile.
Student Growth Percentiles will be available after the initial release of reports.
The third section has detailed information on claim performance.
You will recall from a previous slide that the claim performance indicators are indicators of performance within each claim.
The Claim performance indicator bar shows how the student performed on the claim relative to the range of possible performance
on that claim. In this example, you can see that this student has earned an "attention" indicator for the
Concepts and Procedures claim and for the two claims that are combined for reporting –
Problem Solving and Modeling and Data Analysis – and has earned "adequate progress" in the Communicating Reasoning
claim.
Again, the Claim Performance Indicators are used only for English language arts and mathematics
because these are computer adaptive tests; raw score data – that is, points earned out of points possible –
is not a valid representation of student achievement on a computer adaptive test.
This Individual Student Report is a Social Studies report. What is different for Social Studies from the ELA (and Mathematics)
reports is the Disciplines reported.
You can see that instead of Claim Performance Indicators, there are raw scores reported –
points earned and points possible – organized by discipline;
in Social Studies the disciplines are History, Geography, Civics and Government, Economics, and Public Discourse.
Raw scores are not comparable across different forms of the test –
for example, 4 points out of 7 possible points for student A is not the same as 4 points out of 7 possible points for student B, because raw
scores are not equated across test forms.
Also, when reviewing raw score data with relatively small numbers of items, be sure to use caution when making large-scale
decisions.
Remember that science is not reported for individual students because the 2018 M-STEP Science test was a field test.
On this page, you can see the raw scores for each Grade Level Content Expectation reported.
The Discipline level is also reported – as it was on page one – showing which assessment expectations belong with which discipline.
In this example, the expectations highlighted are in the History Discipline.
As you go down the column of raw scores for each assessment expectation,
you can see which assessment expectations were answered correctly, and which were missed by this student.
Please remember that this report provides detailed student achievement
information about what a student knows and is able to do at one point in time.
It is important to use formative assessment, classroom observation, and other local data when
making instructional decisions for individual students.
Summative, standardized statewide assessments are intended to provide school- and district-level
information about student achievement for use in program evaluation and to inform school improvement initiatives or curricular decisions.
Parent reports are printed and sent to schools to be distributed to parents.
The Parent Report begins with the Superintendent letter.
This letter describes the information that can be found in the Parent report, and
provides resources that are available to parents on the MDE's M-STEP webpage.
As a reminder: science field test data is not included on any student level report.
The Superintendent letter contains an explanation for parents regarding the science field test.
On page 1 of the parent report is the Overall Performance information for English language arts
including the scale score and the associated Margin of Error.
Next is claim performance information.
Below the claim information on page 1 are definitions for common assessment-specific terms found on the Parent report.
This includes a definition for Claims, Disciplines, Claim Performance Indicator, and Margin of Error.
Page 2 of the Parent report includes the overall scale scores for the remaining content areas tested.
This example is a 5th grade report, so ELA was reported on page 1, Mathematics on page 2 and Social Studies is also on page 2.
At the bottom of page 2 are the definitions for the Performance levels – Advanced, Proficient, Partially
Proficient, and Not Proficient.
Please go to www.Michigan.gov/mstep for a Parent Report video that reviews the Parent Reports in further detail.
Next, we have the Student Roster report. New in 2018 is the Overall Proficiency Summary section of the report.
The Student Roster report displays the number and percentage of valid tests scoring in each performance level
for the state, district, school, and the rostered group of students, based on
the user's selections.
It also includes individual student data for the selected group of students.
There are extensive filter options on the Student Roster Report.
Users can build a roster report by filtering on grade, content area, reporting code, performance level,
demographic groups – including gender, ethnicity, economically disadvantaged, English learner or former English learner,
homeless students, migrant students, and students with disabilities – and by student name.
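To make the filter behavior concrete, here is a minimal sketch of the kinds of selections the roster supports, applied to hypothetical records; the field names are illustrative, not the reporting system's actual schema.

```python
# Sketch of roster filtering: select grade, content area, and a demographic
# group to build the rostered set. Field names and records are illustrative.

students = [
    {"name": "Student A", "grade": 5, "content": "Mathematics",
     "level": "Proficient", "english_learner": False},
    {"name": "Student B", "grade": 5, "content": "Mathematics",
     "level": "Partially Proficient", "english_learner": True},
    {"name": "Student C", "grade": 5, "content": "ELA",
     "level": "Advanced", "english_learner": True},
]

# Build the roster: grade 5 mathematics, English learners only.
roster = [
    s for s in students
    if s["grade"] == 5
    and s["content"] == "Mathematics"
    and s["english_learner"]
]
print([s["name"] for s in roster])  # ['Student B']
```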
Once the report has been built, the first column is the Student Name column.
After each student name is a small "i" – this is a hover-over feature.
Users can hover over this "i" and view the student's UIC and date of birth.
This is helpful when there are two students on a roster report with the same or similar name.
Next are the Scale Score, the Student Growth Percentile – or SGP – the Margin of Error, and then the Performance Level.
The Overall Scale Score is also displayed in graph form in the fifth column.
The last columns contain claim performance indicators for English language arts and mathematics,
while for science and social studies it displays raw score data for each discipline – this will be shown on the next slide.
This view shows a Student Roster report with the Points Earned over Points Possible as is displayed on the social studies reports.
This view shows a Student Roster report that has been sorted by Scale Score by the user.
There are multiple sort options on the Student Roster report. The student list defaults to alphabetical order.
By selecting the word "Students", the list of students sorts to reverse alphabetical order.
Selecting a second time sorts the reports back to alphabetical order.
Users can also sort by Scale Score – shown here. Selecting the words "Scale Score" will sort the reports from the highest to lowest scale
score.
Selecting a second time sorts the reports from lowest to highest.
The final sort is the Claim-level sort. You can see on the right that the words "Reading", "Writing", "Listening", and "Research" are all
blue – indicating they are links. The user can select a claim and sort by performance level indicator within that claim.
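Conceptually, these toggling sorts behave like the following minimal sketch; the records and field names are hypothetical.

```python
# Sketch of the roster's toggling sorts: the first selection sorts one way,
# selecting the same column again reverses it. Records are hypothetical.

roster = [
    {"name": "Baker", "scale_score": 1512},
    {"name": "Adams", "scale_score": 1470},
    {"name": "Chen", "scale_score": 1530},
]

# Default view: alphabetical by student name; selecting "Students" reverses it.
by_name = sorted(roster, key=lambda s: s["name"])
by_name_rev = sorted(roster, key=lambda s: s["name"], reverse=True)

# "Scale Score": first selection sorts highest to lowest, second reverses.
by_score_desc = sorted(roster, key=lambda s: s["scale_score"], reverse=True)
by_score_asc = sorted(roster, key=lambda s: s["scale_score"])

print([s["name"] for s in by_score_desc])  # ['Chen', 'Baker', 'Adams']
```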
Users can also select a student name to drill down into the Individual Student Report for that student.
This is the Student Overview Report. It provides summary student level data for each content area in which the student tested on a
single page. This 5th grade sample has English Language Arts, Mathematics, and Social Studies data.
There are three primary data points shown in the Student Overview Report.
First – highlighted by the number one on this image, is the Scale Score.
As in the other reports, the scale score is reported, along with the scale score ranges shown below the graph.
Second – highlighted by the number 2 on this image, is the margin of error.
Again, as in other reports, the margin of error is represented graphically by the gray area as well as in the table beneath.
Third is the Performance level.
Again, this is shown graphically as well as in the table as shown here.
Claim performance indicators are reported for English language arts and mathematics,
and raw score data is reported by discipline for social studies.
Finally, be sure to sign up for the weekly Spotlight on Student Assessment.
You'll receive weekly up-to-date information about upcoming deadlines, assessment task reminders,
updates about the administration of the assessments, report information and much more.
You can follow the link on this slide, or go to the M-STEP webpage at www.michigan.gov/mstep
and scroll down to select the "Spotlight" icon to sign up.
If you have any questions, please feel free to contact us via email at mde-oeaa@michigan.gov
or by phone at 1-877-560-8378 and select option 3. Thank you for watching!