1. Method of Interpreting Results
1.1. Norm-Referenced
1.1.1. Function: "Describes student performance according to relative position in some known group".
1.1.2. Illustrative Instruments: "Standardized aptitude and achievement tests, teacher-made survey tests, interest inventories, and adjustment inventories".
1.2. Criterion-Referenced
1.2.1. Function: "Describes student performance according to a specified domain of clearly defined learning tasks".
1.2.2. Illustrative Instruments: "Teacher-made tests, custom-made tests from test publishers, observational techniques".
2. Grading
2.1. Types of Grading and Reporting Systems
2.1.1. Traditional Letter-Grade System
2.1.1.1. "Assigns a single letter grade for each subject. This system is concise and convenient, the grades are easily averaged, and they are useful in predicting future achievement".
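The claim that letter grades "are easily averaged" can be sketched in Python. The 4-point scale below is a common convention, not from the source, and the function name is illustrative:

```python
# Map letter grades to a conventional 4-point scale and average them.
# The point values are one common convention, not from the source text.
POINTS = {"A": 4.0, "B": 3.0, "C": 2.0, "D": 1.0, "F": 0.0}

def grade_point_average(grades):
    """Mean grade points for a list of letter grades."""
    return sum(POINTS[g] for g in grades) / len(grades)

gpa = grade_point_average(["A", "B", "B", "C"])  # (4 + 3 + 3 + 2) / 4 = 3.0
```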
2.1.2. Pass-Fail System
2.1.2.1. "Often used for courses taught under a pure mastery learning approach".
2.1.3. Checklists of Objectives
2.1.3.1. "These reports typically include ratings of progress toward the major objectives in each subject-matter area".
2.1.4. Letters to Parents/Guardians
2.1.4.1. "Letters make it possible to report on the unique strengths, weaknesses, and learning needs of each student and to suggest specific plans for improvement".
2.1.5. Portfolios of Student Work
2.1.5.1. "A carefully constructed portfolio can be an effective means of showing student progress, illustrating strengths, and identifying areas where greater effort is needed".
2.1.6. Parent-Teacher Conferences
2.1.6.1. "A flexible procedure that provides for two-way communication between home and school".
2.2. Functions of Grading and Reporting Systems
2.2.1. Instructional Uses
2.2.1.1. The report "clarifies the instructional objectives, indicates the student's strengths and weaknesses in learning, provides information concerning the student's personal-social development, and contributes to the student's motivation".
2.2.2. Reports to Parents/Guardians
2.2.2.1. These reports "should help parents understand the objectives of the school and how well their children are achieving the intended learning outcomes of their particular program".
2.2.3. Administrative and Guidance Uses
2.2.3.1. "Grades and progress reports serve a number of administrative functions. They are used for determining promotion and graduation, awarding honors, determining athletic eligibility, and reporting to other schools and prospective employers".
3. Major Considerations in Validation
3.1. Content
3.1.1. "How well the sample of assessment tasks represents the domain of tasks to be measured and how it emphasizes the most important content".
3.2. Construct
3.2.1. "How well performance on the assessment can be interpreted as a meaningful measure of some characteristic or quality".
3.3. Assessment-Criterion Relationship
3.3.1. "How well performance on the assessment predicts future performance or estimates current performance on some valued measures other than the test itself".
3.4. Consequences
3.4.1. "How well use of assessment results accomplishes intended purposes and avoids unintended effects".
4. Determining Reliability
4.1. Test-retest
4.1.1. Measure of stability: "give the same test twice to the same group with some time interval between tests, from several minutes to several years, then correlate the two sets of scores".
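The "correlate the two sets of scores" step can be sketched as a Pearson product-moment correlation. The scores below are hypothetical and `pearson_r` is an illustrative helper, not from the source:

```python
from statistics import mean

def pearson_r(xs, ys):
    """Pearson product-moment correlation between two score lists."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

first = [78, 85, 62, 90, 71]   # same test, first administration (hypothetical)
second = [80, 83, 65, 92, 70]  # same students, after a time interval
stability = pearson_r(first, second)  # near 1.0 indicates stable scores
```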
4.2. Equivalent forms
4.2.1. Measure of equivalence: "give two forms of the test to the same group in close succession, then correlate the two sets of scores".
4.3. Test-retest with equivalent forms
4.3.1. Measure of stability and equivalence: "give two forms of the test to the same group with an increased time interval between forms, then correlate the two sets of scores".
4.4. Split-half
4.4.1. Measure of internal consistency: "give test once; score two equivalent halves of test; correct correlation between halves to fit whole test by Spearman-Brown formula".
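The Spearman-Brown correction named above steps the half-test correlation up to an estimate for the full-length test. A minimal sketch, with an assumed half-test correlation of 0.60:

```python
def spearman_brown(r_half, n=2):
    """Spearman-Brown prophecy formula: estimated reliability of a
    test n times as long as the one that produced r_half."""
    return n * r_half / (1 + (n - 1) * r_half)

# If the two halves correlate 0.60 (hypothetical), the whole test's
# estimated reliability is 2(0.60) / (1 + 0.60) = 0.75.
r_full = spearman_brown(0.60)
```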
4.5. Kuder-Richardson and coefficient alpha
4.5.1. Measure of internal consistency: "give test once; apply Kuder-Richardson or Cronbach's alpha formula".
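The alpha formula can be sketched from a students-by-items score matrix; with dichotomous (0/1) items, coefficient alpha reduces to Kuder-Richardson formula 20. The response matrix below is hypothetical:

```python
from statistics import pvariance

def cronbach_alpha(item_scores):
    """Cronbach's alpha from a students x items score matrix.
    With 0/1 item scores this is equivalent to KR-20."""
    k = len(item_scores[0])  # number of items
    item_vars = [pvariance([row[i] for row in item_scores]) for i in range(k)]
    total_var = pvariance([sum(row) for row in item_scores])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

# Four students, three 0/1 items (hypothetical data).
responses = [[1, 1, 1],
             [1, 1, 0],
             [1, 0, 0],
             [0, 0, 0]]
alpha = cronbach_alpha(responses)  # (3/2)(1 - 0.625/1.25) = 0.75
```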
4.6. Interrater
4.6.1. Measure of consistency of ratings: "give a set of student responses requiring judgmental scoring to two or more raters and have them independently score the responses, then correlate the two sets of scores".
5. Nature of Assessment
5.1. Maximum Performance
5.1.1. Function: "Determines what individuals can do when performing at their best".
5.1.2. Illustrative Instruments: "Aptitude tests, achievement tests".
5.2. Typical Performance
5.2.1. Function: "Determines what individuals will do under natural conditions".
5.2.2. Illustrative Instruments: "Attitude, interest, and personality inventories; observational techniques; peer appraisal".
6. Form of Assessment
6.1. Complex-Performance Assessment
6.1.1. Function: "Measurement of performance in contexts and on problems valued in their own right".
6.1.2. Illustrative Instruments: "Hands-on laboratory experiments, projects, essays, oral presentations".
6.2. Select-Response Test
6.2.1. Function: "Efficient measurement of knowledge and skills, indirect indicator".
6.2.2. Illustrative Instruments: "Standardized multiple-choice test".
7. Use in Classroom Instruction
7.1. Diagnostic
7.1.1. Function: "Determines causes (intellectual, physical, emotional, environmental) of persistent learning difficulties".
7.1.2. Illustrative Instruments: "Published diagnostic tests, teacher-made diagnostic tests, observational techniques".
7.2. Summative
7.2.1. Function: "Determines end-of-course achievement for assigning grades or certifying mastery of objectives".
7.2.2. Illustrative Instruments: "Teacher-made survey tests, performance rating scales, product scales".
7.3. Formative
7.3.1. Function: "Determines learning progress, provides feedback to reinforce learning, and corrects learning errors".
7.3.2. Illustrative Instruments: "Teacher-made tests, custom-made tests from textbook publishers, observational techniques".
7.4. Placement
7.4.1. Function: "Determines prerequisite skills, degree of mastery of course goals, and/or best mode of learning".
7.4.2. Illustrative Instruments: "Readiness tests, aptitude tests, pretests on course objectives, self-report inventories, observational techniques".