EVAAS

Education Value Added Assessment System

Login for SAS EVAAS Reporting

https://ncdpi.sas.com/evaas/login.jsp

Managing Your Account

When an administrator creates an account for you to view SAS EVAAS reporting, you will receive an email message with your login name and password. When you log in, you will need to enter this password exactly as it appears in the message. Passwords are case sensitive.

Changing Your Password

When you receive login information for a new account, a random system-generated password will be emailed to you. You will be required to change this password the first time you log in. After that first login, you may change your password again at any time. To do so, click My Account in the global menu at the top of the screen. Then click the Change Password button. In the pop-up window that appears, enter the current (old) password. Enter the new password, which must be at least 8 characters long. Then enter the new password a second time for verification. Click the Submit button. Your password has now been changed.

General Navigation

The Tab Menu

Above each report, you will see a series of tabs. You will notice as you navigate through your reports that these tabs change. On each page you will see only the tabs that are appropriate for the report you are currently viewing.

Under each tab is a drop-down menu with a list of options. To see the list of options, roll your cursor over a tab. Then, click on an option to select it. For example, to view a different report, roll your cursor over the Reports tab. You will see a list of all reports that are available to you. Click on a report name to view that report.


The Global Menu

At the top of each page, you will see the global menu. This menu contains options that are available to you at all times, regardless of which report you are currently viewing.

Home: Click this link to return to the Welcome page.
Search: Click this link to search for individual students or groups of students with similar characteristics.
Admin: Roll over this link to see a list of options for the Admin Tool. This tool allows State, District, and School Admins to create and modify accounts, email users, and run usage reports. NOTE: This functionality is only available to persons with administrator privileges.
My Account: Click this link to change the password or email address for your account.
Help: Click this link to open the help file for the page you are currently viewing. The help files contain specific information about your reporting.
Contact Us: Click this link if you are experiencing technical difficulties. It will open a separate window into which you may type a message to SAS EVAAS technical support.
Logout: Click here to log out of the reports. For security reasons, it is important to log out if you plan to leave your browser window open.

Introduction to SAS EVAAS Reporting

SAS EVAAS reports provide a wealth of diagnostic information:

§  Value Added Reports provide detailed information about the progress rates of an individual school or district.

§  Diagnostic Reports allow you to identify patterns of progress among subgroups in a school or district within the same grade and subject.

§  Performance Diagnostic Reports allow you to identify patterns of progress among subgroups in a school or district by their predicted Performance Level (Levels I – IV).

§  Summary Reports provide a comparison of the progress rates of students at all schools in a district for each test, subject, and grade.

§  Student Reports present a table of available student scores in each subject tested. The accompanying graph provides the student's percentile, the school's mean percentile, and the district's mean percentile for each test administered.

§  Student Projections Reports present a table of available student scores in the subject tested, followed by the student's probability of achieving a particular level of proficiency on subsequent tests. A graph of these data accompanies each table.

Report estimates were completed via SAS EVAAS methodology and software from SAS Institute Inc. SAS EVAAS methodology uses up to five years of available test scores for individual students, merged longitudinally, to provide the best estimates of student achievement for a school or district. Scores for all students, even those with partial data, are included in the analyses. Test scores for all subjects are analyzed at the same time, improving the precision of the estimates. The effect of schools on the rate of academic progress is estimated from this database.


Value-Added Report

The Value-Added Report offers a conservative estimate of a school's or district's effectiveness. This report compares the progress of students at each school or district to the state average. This comparison indicates how a school or district influences student progress in each subject tested.

As you view and interpret the Value-Added Report, it's important to understand what each of the values in the table represents.

How effective is this school or district?

Start with the column labeled School (or District) vs. State Avg. In this column, you will see Above, Below, or NDD (Not Detectably Different). These designations indicate how much progress students at this school or district made compared to others in the state. The Above, Below, and NDD designations are based on the School (or District) Effect. This value is a conservative estimate of how effective the school or district has been in the selected test and subject.

• To be labeled Above, a school or district must have an Effect significantly higher than the state average (2 standard errors above).

• Likewise, to be labeled Below, a school or district must have an Effect significantly lower than the state average (2 standard errors below).

• Effects within 2 standard errors of the state average are labeled NDD (Not Detectably Different).

How is the Effect Calculated?

The Effect is the difference between the Mean Student Score and the Mean Predicted Score. The Mean Predicted Score is what we would expect this school's or district's students to score, on average, based on their past performance. The Mean Student Score indicates what the students actually achieved, on average, in the most recent test administration. Compare the Mean Predicted Score to the Mean Student Score to see if the students' average scores are in line with what they were expected to score. If the Mean Student Score is significantly higher than the Mean Predicted Score, then students at this school or district scored higher than expected, indicating that the school or district is doing a good job on average with this grade and subject.

You may notice that the Effect is not exactly the difference between the Mean Student Score and Mean Predicted Score. The reason is that the Effect is estimated using a methodology that ensures greater statistical precision and reliability.
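The classification rules above can be sketched in a few lines of code. This is only an illustration of the 2-standard-error rule, not the actual EVAAS computation: as noted, EVAAS estimates the Effect with a more sophisticated longitudinal model, so the simple observed-minus-predicted difference used here is a stand-in.

```python
# Illustrative sketch only (NOT the actual EVAAS methodology): apply the
# 2-standard-error rule to label a school or district Above, Below, or NDD.
# The Effect here is approximated as the simple difference between the
# Mean Student Score and the Mean Predicted Score.

def classify_effect(mean_student_score, mean_predicted_score, std_error):
    """Return 'Above', 'Below', or 'NDD' per the 2-standard-error rule."""
    effect = mean_student_score - mean_predicted_score
    if effect > 2 * std_error:
        return "Above"       # significantly above the state average
    if effect < -2 * std_error:
        return "Below"       # significantly below the state average
    return "NDD"             # Not Detectably Different

print(classify_effect(355.0, 350.0, 2.0))  # → Above (effect 5.0 > 4.0)
print(classify_effect(348.0, 350.0, 1.5))  # → NDD (-2.0 is within ±3.0)
```

Note that an NDD label covers a range of Effects on either side of zero, which is why the TIP below recommends considering the size of the Effect itself.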

TIP: Among the NDD schools and districts, there will be a range of Effects. Some of the NDD schools and districts will have large positive or negative Effects, while others will be closer to zero. Consider the size of this number in conjunction with local knowledge about the school or district when drawing conclusions about its effectiveness.

Navigation

To see the Value-Added Report for a different school, choose from the drop-down list under the Schools tab.

To see the Value-Added Report for a different subject, choose from the drop-down list under the Subjects tab.

To see this school’s Diagnostic Report for the currently selected test and subject, click on the underlined Effect.


Value-Added Summary Report

This report indicates how effective each school in the district has been in reading and math for grades three through eight and for tested high school subjects. Only those schools to which you have access are included in the report. For End of Grade reporting, all grades for the currently selected subject are displayed. For End of Course reporting, all subjects are displayed. If your district has reporting for SAT, values will be displayed for Math, Verbal, and Composite.

Green: students in this school made significantly more progress in this subject than students in the average school in the state.

Yellow: the progress of students in this school was Not Detectably Different from the progress of students in the average school in the state.

Light Red: students in this school made significantly less progress in this subject than students in the average school in the state.

N/A: indicates that no data is available for this school for the test and subject in the most recent year.

Navigation

To view the Value-Added Summary Report for another test or subject, select from the tabs above the table.

Student Pattern Report

This report disaggregates progress for specific groups of students that you choose. The Student Pattern Report enables you to see how effective the school has been with the lowest, middle, and highest achieving students in the group you have selected.

NOTE: Use caution when interpreting this report; the subgroup means come from a more liberal statistical process and are less conservative than the estimates of a school's influence on student progress found in the School Value-Added Report. This report should be used as a diagnostic tool only, not for accountability purposes.

Creating a Student Pattern Report

To create a report, go to a school’s Diagnostic Report and click on the subject name in the leftmost cell of the table. You will see a list of students who tested in that subject in the most recent year.

You can also access this list of students by selecting Student Pattern List from the Reports menu and then selecting a school.

In this list you will see each student's name, Predicted Score, Observed Score, the State Percentile for the most recent year, the Performance Level for the most recent year, and the name of the school where the student tested. You can sort this list by clicking on the underlined column headers. To see a Student Report, click on a student's name.

To select students for your report, click the check boxes next to the students' names. To generate a report, you must select at least 15 students who had both Predicted and Observed Scores. You may choose all students by clicking Select All at the bottom of the page. Although check marks will appear next to each student's name, only students with both current and previous years' scores will be included in the report. Deselect All clears your selections. When you have finished selecting students for the report, click Submit.
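The 15-student minimum can be expressed as a simple check. This is a hypothetical sketch for clarity, not EVAAS code; the field names are illustrative:

```python
# Hypothetical sketch of the selection rule described above: a report can
# be generated only when at least 15 of the selected students have both a
# Predicted and an Observed Score. Field names are illustrative.

MIN_STUDENTS = 15

def can_generate_report(selected):
    """selected: list of dicts with 'predicted' and 'observed' keys (may be None)."""
    complete = [s for s in selected
                if s.get("predicted") is not None and s.get("observed") is not None]
    return len(complete) >= MIN_STUDENTS
```

This mirrors the behavior described above for Select All: students missing either score may appear checked, but they do not count toward the minimum and are excluded from the report.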

Interpreting the Report

The students you selected for the report are divided into three groups (Low, Middle, and High) based on their predicted scores. A predicted score is an expected score based on the student's previous testing history. Students are assigned to one of these three groups based on where their predicted score falls in the distribution of selected students. The names of the students in each subgroup are listed in a table at the bottom of the report.

The green Reference Line represents the amount of progress students in each subgroup must make in order to maintain their level of achievement from year to year. The blue bars on the graph represent the Mean Observed Minus Predicted Score for each of the three groups of students. Standard errors are shown in red for each group; the standard error allows you to establish a confidence band around the estimate. Bars above the line indicate that students in that subgroup made better than average progress; bars below the line indicate that students made less than average progress.
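The grouping step can be sketched as follows. The report divides students by where their predicted score falls in the distribution, but the exact cut points EVAAS uses are not documented here, so this illustration assumes simple tertiles:

```python
# Hypothetical sketch of the Low/Middle/High grouping described above.
# Assumption: simple tertiles of the predicted-score distribution; the
# actual EVAAS cut points may differ.

def group_students(students):
    """students: list of (name, predicted_score) tuples.
    Returns a dict mapping 'Low', 'Middle', 'High' to lists of names."""
    ranked = sorted(students, key=lambda s: s[1])  # ascending by predicted score
    n = len(ranked)
    third = n // 3
    return {
        "Low": [name for name, _ in ranked[:third]],
        "Middle": [name for name, _ in ranked[third:n - third]],
        "High": [name for name, _ in ranked[n - third:]],
    }

roster = [("A", 340), ("B", 352), ("C", 347), ("D", 360), ("E", 335), ("F", 355)]
print(group_students(roster)["Low"])  # → ['E', 'A']
```

Each bar on the graph then summarizes one of these groups: its mean observed-minus-predicted score, with a standard error for the confidence band.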

Student List

The student list allows you to drill down to academic achievement information for individual students. Columns with underlined headings allow sorting. The student's predicted test score for the subject appears next to his or her name, followed by the student's observed score, the student’s Performance Level, and the name of the school in which the student was enrolled during the most recent testing window. A student's predicted score is an expected score, based on his or her performance on previous tests, assuming the student is in the average school in the state.

Click on the student's name to display his or her Student Report. The Student Report contains the student’s scale scores, percentiles, and state performance levels for all subjects and years tested. The information is presented in both a table and a graph.

Student Report

The Student Report contains all available test scores for an individual student, along with the student's percentile and state performance level for each test and subject. The accompanying graph provides the student's percentile, the school’s mean percentile, and the district’s mean percentile for each test administered.


Interpreting the Graph

The graph provides a picture of the student's history in the tested subject.

Blue Dot: Estimated mean percentile for the district where the student was tested. If the blue line is not visible on the graph, the district and school percentiles are identical and the lines overlap.

Green Diamond: Estimated mean percentile for the school.

Red Triangle: Student's percentile in the years tested. Sometimes a student has two data points for the same grade. This indicates that the student was tested in the same grade in two different years, suggesting that he or she was either retained in that grade or repeated the subject. When viewing the graph, compare the student's line to the district or school line for the same grade.

Parallel lines: The student's progress is similar to that of the average student in the district or school.