CURRICULUM-BASED MEASUREMENT NORMING FOR READING FLUENCY AND WRITTEN EXPRESSION FOR FRENCH IMMERSION STUDENTS IN SCHOOL DISTRICT #57

By
Sylvie St-Pierre
B. Ed. (Elem.), Université du Québec à Montréal, 1985
B. A., Université de Montréal, 1982

PROJECT SUBMITTED IN PARTIAL FULFILLMENT OF THE REQUIREMENTS FOR THE DEGREE OF MASTER OF EDUCATION IN CURRICULUM AND INSTRUCTION

THE COLLEGE OF ARTS, SOCIAL, AND HEALTH SCIENCES

© Sylvie St-Pierre, 2002
UNIVERSITY OF NORTHERN BRITISH COLUMBIA
May 2002

ABSTRACT

Standardized tests are not always appropriate for assessing French Immersion students. In School District #57, Learning Assistance (L.A.) teachers identified the need for an easy, inexpensive and reliable test. Curriculum-Based Measurement (CBM) was a logical choice, as it is directly related to classroom materials and instruction, and it is widely used in the English program to assess reading fluency, written expression and basic mathematics skills. The purpose of this project was to develop French CBM probes for reading fluency and written expression, and to develop local norms for the French Immersion program. The specific measures selected were Words Read Correctly, Total Words Written and Words Spelled Correctly. Norming tables were created with the data obtained during three norming periods. These tables will permit L.A. teachers and classroom teachers to assess and monitor students' progress efficiently and inexpensively. This report explains in detail the steps taken to develop the reading fluency and written expression probes, the administration procedures and the scoring rules. It also verifies the reliability and the stability of the probes over time. The various probes are shown to be equivalent within grade.

TABLE OF CONTENTS

Abstract iii
Table of Contents iv
List of Tables vi
List of Figures vii
INTRODUCTION 1
METHODS 4
  Subjects 4
  Instruments 5
    Development of Reading Fluency Norming Probes 5
    Development of Written Expression Norming Probes 8
  Procedure 9
    Reading Fluency Probes Administration 9
    Written Expression Probes Administration 10
  Analysis 11
  Ethics Approval 11
RESULTS 12
  Preliminary Analysis 12
    Demographics Analysis 12
    Problems with Class Lists 14
  Main Analysis 14
    Descriptive Statistics 14
    Analysis of Probe Difficulty 17
      Analysis of Reading Fluency Probe Difficulty 18
      Analysis of Written Expression Probe Difficulty 19
    Reliability of the Measures 20
    Creation of the Norming Tables and Smoothing 21
DISCUSSION 23
  Issues Raised by Data 23
    Students' Growth 23
    Reading Fluency Probe Difficulty 24
    Written Expression Probe Difficulty 25
  Reliability of the Measures 25
  Norming Tables 26
  Limitations 26
SUMMARY 27
  Implications for Further Research 27
  Implications for Practice 29
REFERENCES 31
Appendix A  Examples of Primary (gr. 2) and Intermediate (gr. 5) Reading Probes (Teacher's Copies) 34
Appendix B  Graphemes and Sight Words Introduced (-), Worked On (+) and Acquired (·) from Grade One to Seven 37
Appendix C  Table of Graphemes for Grade Five and Assessment of Probes 39
Appendix D  CBM Written Expression Probes 41
Appendix E  Directions for Administration of Written Expression Probes and Scoring Procedure 45
Appendix F  Directions for Administration of Reading Probes and Scoring Procedure 49
Appendix G  Letters of Permission and Information Letter to Parents 52
Appendix H  Graph with Raw Scores and Graph with a Smoothed Curve (Grade Six) 56
Appendix I  Descriptive Statistics of Written Expression (TWW) Results for Three Norming Periods 58
Appendix J  Norming Table for Total Words Written (TWW) and Words Spelled Correctly (WSC) for Grade Six 60

LIST OF TABLES

Table 1  Number of French Immersion Students (n) per School and per Grade in SD57 5
Table 2  Number of Students (n) Being Tested During the Three Norming Periods 12
Table 3  Percentage of Boys and Girls per Grade Participating in the CBM Norming 12
Table 4  Number of Students (n) and Percentage for Each Reading Fluency and Written Expression Probe 13
Table 5  Number of Students (n) per Grade for Each Written Expression Probe in Fall 13
Table 6  Descriptive Statistics of Reading Fluency Results for Three Norming Periods 14
Table 7  Descriptive Statistics of Written Expression (WSC) Results for Three Norming Periods 16
Table 8  Reading Fluency Probe Differences Across Norming Periods for Grades One to Seven 18
Table 9  Written Expression Probe Differences Across Norming Periods for Grades One to Seven 19
Table 10  Pearson Correlation for Words Read Correctly Scores Between Norming Periods 20
Table 11  Pearson Correlation for Words Spelled Correctly Scores Between Norming Periods 21
Table 12  Norming Table for Reading Fluency - Grade Three 22

LIST OF FIGURES

Figure 1  Mean and SD by Grade and Norming Period for Reading Fluency (WRC) 15
Figure 2  Mean and SD by Grade and Norming Period for Written Expression (WSC) 17

INTRODUCTION

Learning Assistance teachers are often asked to evaluate students and establish programs for students who are academically or behaviorally challenged. Requests for such assessment come from teachers, parents, in-school administrators, school board administrators and Ministry of Education officials. In order to assess a student efficiently and accurately, Learning Assistance teachers need to use tests that have adequate normative data and are easy to administer, inexpensive, and reliable. Often, commercial standardized tests are used, as they offer norming tables so that students can be compared with other students of their age or grade level. However, these tests fail to demonstrate students' progress accurately (Deno, 1985, 1992; Marston, 1989). Consequently, they are not very practical for decision-making about a student's instructional program. Commercial tests have many other disadvantages: they are expensive, time consuming, and not considered valid in many cases, as they are not related to any specific curriculum (Fuchs & Deno, 1994).

Curriculum-Based Measurement is a standardized measurement system that measures basic skills in reading, spelling, written expression and mathematics. Its assessment focuses on measurements that are observable, such as counting words read correctly in one minute and counting words spelled correctly during three minutes given a story starter (Marston, 1989).
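To make these two measures concrete, here is a minimal sketch of how they reduce to simple tallies. It is illustrative only and not part of the project; the study's actual scoring follows the rules in Appendices E and F, and the names here are mine.

```python
# Illustrative sketch only: simplified versions of the CBM tallies.

def words_read_correctly(words_attempted: int, errors: int) -> int:
    """WRC: words read aloud in one minute minus reading errors."""
    return words_attempted - errors

def written_expression_counts(sample: str, lexicon: set[str]) -> tuple[int, int]:
    """TWW counts every word written; WSC counts words found in a
    reference lexicon (a stand-in for 'correctly spelled')."""
    tokens = [t.strip(".,!?;:").lower() for t in sample.split()]
    tww = len(tokens)
    wsc = sum(1 for t in tokens if t in lexicon)
    return tww, wsc

print(words_read_correctly(words_attempted=98, errors=4))                   # 94
print(written_expression_counts("le chat mange", {"le", "chat", "mange"}))  # (3, 3)
```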
The advantages of CBM are numerous and important: (a) the tests are tied to the student's curriculum (the materials in which instruction occurs) rather than to a series of problems created by commercial test developers (Deno, 1985); (b) the tests are quick to administer and facilitate frequent administration by teachers and educators; (c) the tests can have multiple forms (Baker & Good, 1995); (d) the tests are inexpensive to produce in terms of time and expense (Deno, 1985; Marston, 1989); (e) they measure academic behaviors in the basic skills that are observable in a specific domain (Deno, 1985; Deno, Marston, Mirkin & Lowry, 1982; Marston, 1989); and (f) the tests are relatively unobtrusive (Deno, 1985; Marston, 1989). However, it is important to remember that CBM is one indicator and does not preclude using other specific tests to pinpoint particular problems in academic areas (Shinn & Bamonto, 1998).

Many studies have demonstrated the reliability and validity of Curriculum-Based Measurement as an indicator of student progress in the basic skill areas (Shinn & Hubbard, 1992). CBM results were also compared to teachers' holistic ratings of students' reading proficiency and were found to correlate highly (Marston, Mirkin, & Deno, 1984; Marston, 1989). Deno (1985) also found that "all of the curriculum-based measures were highly correlated with performance on the standardized, norm-referenced tests except for the word meaning test" (p. 222).

School District #57 (SD57, 1995a) recognized all the advantages that Curriculum-Based Measurement offers to teachers and specialists who make decisions about student placement and educational programs. In order to get assistance for a student who needs services beyond the resources available at his or her school, educators have to be able to assess certain skills frequently to present a precise picture of the child's performance. CBM was adopted because it matches the problem-solving process identified by the School Support Services (School District #57, 1996b) and the one defined by Salvia and Ysseldyke (1991), i.e., screening, program planning, pupil progress monitoring, and program evaluation. Students of the French Immersion program were included in the CBM for Mathematics (Walraven & MacMillan, 2000), but not in the earlier CBM for Reading Fluency and Written Expression (SD57, 1996a), as English was not their language of instruction. In its 1995 report, the district committee on the development of local norms for Curriculum-Based Measurement recommended developing such norms for the French Immersion program (School District #57, 1995a). Some tests, such as the Bilan Qualitatif de l'Apprentissage de la Lecture (Campeau-Filion & Gauthier, 1984) and the Test de Rendement pour Francophone (n.d.), were available, but these tests were developed for children with French as their first language. They were not valid for French Immersion students, whose first language is English. Hence, the purpose of this project is to develop CBM norming tables for Reading Fluency and Written Expression for the French Immersion program.

METHODS

This project closely follows the steps used in other projects, as it is a replication of the English CBM Reading Fluency and Written Expression norming (SD57, 1996b). However, some variations in procedures are explained in more detail, as the use of the French language created the need for new scoring rules and new reading probes.

Subjects

School District #57 offers French Immersion from Kindergarten to grade 12.
French Immersion is a program for non-Francophone children. Both parents generally speak English at home and have at best a limited background in French. Students are immersed in the French language from the first day of school. Most of the schooling is done in French. English instruction starts in grade three or four, depending on the class organization. The students have a choice of three French Immersion schools in Prince George. Two schools offer the program from Kindergarten to grade seven and one school from Kindergarten to grade five. The students of the latter school then go to the district middle school for grades six and seven. All the students continue on to secondary school, where French becomes a smaller portion of their instruction. By grade 12, students have only one course in French.

The population for this study consisted of the entire within-district population of French Immersion students of School District #57. There were between 240 and 321 students tested, depending on the testing period. A detailed distribution by school and grade is shown in Table 1.

Table 1
Number of French Immersion Students (n) per School and per Grade in SD57

Grade   School A   School B   School C   School D   Total per grade
1       18         29         17         -          64
2       16         14         18         -          48
3       12         19         16         -          47
4       16         6          13         -          35
5       13         11         11         -          35
6       -          12         21         8          41
7       -          16         25         10         51
Total   75         107        121        18         321

All the French Immersion students from grade one to seven were tested for both Reading Fluency and Written Expression. Grade one students were not tested in the first two periods, as they had not yet received sufficient instruction to develop basic skills (Shinn, 1989).

Instruments

Development of Reading Fluency Norming Probes

Three reading probes per grade level were developed from texts used in class. All the elementary French Immersion teachers in the district sent me a list of readers or books that they use in their classroom during a school year. Most primary teachers use readers from the series Je lis, j'écris; À mots découverts; Contes Roses/Jaunes; Mirabelle; and Baluchon. Intermediate teachers use a variety of texts from novels and Science or Social Studies textbooks. A selection of passages or complete stories was chosen as probe material for each grade level. Three probes per grade were considered sufficient for this project, as there were only four schools to assess and the sample size was small. Two examples of probes, one for a primary grade and one for an intermediate grade, are in Appendix A.

The selection of probes was an important procedure, as it was essential that probes be equivalent in difficulty within each grade level (Marston & Deno, 1982). Probe difficulty must increase as grade level increases. Tilly and Carlson (1992) recognize the difficulty of choosing materials that are representative of each grade level's expectations. They warn "that large differences in text difficulty makes drawing valid conclusions from student test data more difficult" (p. 10). It is important to choose materials that fall within a specific reading level.

Readability was an important step in the norming project. School District #57 used Fry's Readability Graph to determine the readability of its probes (SD57, 1995b). Unfortunately, this test could not be used for French texts. Thus, tables to establish the readability had to be created. I based the criteria on a sequential table established by the Abbotsford School District in British Columbia (n.d.).
For each grade level, their table shows the sounds introduced, worked on, and mastered during that school year. I built tables indicating the sounds expected to be mastered in each grade level during a school year (Appendix B). I included the sight words expected to be mastered in grade one and grade two in their respective tables. I kept passages that had more than 80 appropriate words for the grade level in the first 100 words. There is an example of the assessment of three probes in a detailed grade level table in Appendix C.

I used a second method to verify that the difficulty level of the probes was appropriately chosen for the grade. Six to seven students from various grade levels read probes from a different grade level than their own. If an average reader from grade five, assessed the previous year, had difficulty reading more than 20 words in the first 100 words of a grade four probe, the probe was reconsidered for a higher level. I also showed the texts to teachers of different grade levels and asked for feedback.

In addition to the level of difficulty, other criteria had to be considered when choosing the probes. The stories should not be written as poems or plays, include many unusual proper nouns, or use extensive dialogue (Shinn, 1989; Tilly & Carlson, 1992). The length criterion was quite difficult to achieve. The guidelines suggest that texts should be approximately 150 words at the primary grade levels (SD57, 1995b) and at least 250 words at the intermediate grade levels. Grade one to three texts were shorter, as French Immersion texts are generally shorter than comparable grade level texts for English as a first language.

Texts chosen in this project were a combination of texts students are expected to read in Language Arts, Science or Social Studies. There was a possibility that some texts might have already been read in class by certain students, as the materials were taken from the classroom curricula. Fuchs and Deno (1994) note that degree of familiarity can become "a source of measurement error" (p. 19) and that "conclusions about the students' general level of proficiency could then be overly optimistic" (p. 19). Due to varying levels of familiarity, the validity of the probes will be difficult to assess. However, statistical tests comparing the level of difficulty among the same grade level probes will show whether a text was possibly too familiar for all students, or just for certain students.

Each text was retyped so that pictures would not provide clues, and so that type style differences would be minimized. Two different copies of each text were prepared. The examiner used a copy that had the number of words per line and a space to write the results. The students read the copy with the words only.

Development of Written Expression Norming Probes

The starting sentences in the writing probes were translated from English. These probes, developed by SD57 teachers, used criteria developed by Tilly and Carlson (1992). Three out of six possible probes were chosen using two selection criteria. The first criterion was that a starter should not elicit controversial stories. The second was basic vocabulary, so that grade one and two students would be able to understand the sentence. The written expression probes are in Appendix D.
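As a rough illustration of the passage-screening rule described above (more than 80 grade-appropriate words among the first 100), the check reduces to a simple count. This sketch is mine, not the project's; `is_appropriate` stands in for a lookup against the Appendix B grapheme and sight-word tables.

```python
# Sketch of the screening rule; the predicate is a stand-in for a lookup
# against the grade-level grapheme/sight-word tables (Appendix B).
from typing import Callable

def passes_screen(passage: str, is_appropriate: Callable[[str], bool],
                  window: int = 100, threshold: int = 80) -> bool:
    first_words = passage.split()[:window]
    return sum(1 for w in first_words if is_appropriate(w)) > threshold

# Toy example standing in for a grade-two lexicon.
grade_two = {"le", "chat", "est", "dans", "la", "maison"}
print(passes_screen("le chat est dans la maison", grade_two.__contains__,
                    window=6, threshold=4))  # True
```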
Two teachers, one teaching French Language Arts in a secondary school and one who had taught from Kindergarten to grade five in French Immersion, met with me to establish standardized rules for scoring the written passages of the students. We based the rules on the Written Expression Scoring Rules established by the CBA Institute of the University of Oregon (Baker, Collins & Goodwin, 1992). Our committee added several rules to the list defining "What is a correctly spelled word?" (pp. 94-95) established by Baker et al. For example, students had to write the acute and grave accents correctly to get full score. Abbreviations could replace an English series of words; for example, NASA was acceptable. English words were rejected except for proper names such as films, persons and cities. The committee felt that rules about grammar should be included in the list (Appendix E) as a reminder to look at each word as a separate entity regardless of the grammar.

Procedures

Reading Fluency Probes Administration

I translated the scoring procedure of the CBA Training Institute of the University of Oregon (Baker, Collins & Goodwin, 1992) to determine what counted as a word read (Appendix F). Learning Assistance teachers met for half a day to be instructed in the administration and scoring of the tests, and to listen to texts read by students on a tape recorder. Each L.A. teacher scored two readings for each of three grade levels. They compared their scores and discussed the differences heard on the tape. Notes were taken about possible mistakes in pronunciation, accents and liaisons (tying two words together; for example, un ami is pronounced "un nami"). Unfortunately, there was not enough time to listen to all the grade levels. Comments and questions were exchanged throughout the testing period when someone was confused or uncertain about a word. After the first norming period, one L.A. teacher was replaced by a new teacher. I reviewed the directions and the scoring rules with her before the second testing period.

For each grade level, students were divided into three groups: A, B and C. The L.A. teacher assigned a letter to the students following the class list provided by each teacher. Each school started with a different letter/reading probe. In the fall, the L.A. teacher from School A assigned probe A to the first student on the class list, and then cycled through the list for the remaining students. The L.A. teacher from School B assigned probe B first, and School C assigned probe C first. This method was used to avoid over-representing one probe, which could happen if all the groups at each grade level had an odd or even number of students. Students read a different probe in the winter and spring periods. For example, if a student read probe B in October (fall), he/she read probe C in January (winter) and probe A in April (spring).

The first norming period was in late October. Only students in grade two to grade seven were tested. L.A. teachers administered the reading tests in their own schools. The L.A. teacher from School C administered the reading tests to grades 6 and 7 of School D, as there is no L.A. teacher for the French Immersion program in that school. In the last norming period, she also administered all the reading probes in School B. All students were recorded on tape to increase the accuracy of the scoring. L.A. teachers were able to listen many times to a student's reading if it was not clear or too fast.
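The assignment scheme just described (each school starts its class list on a different probe, cycles through A, B and C, and rotates each student to the next probe in later periods) can be sketched as follows. The code is illustrative, not part of the project's materials.

```python
# Sketch of the probe-assignment and rotation scheme described above.
from itertools import cycle

PROBES = ["A", "B", "C"]

def assign_fall_probes(class_list: list[str], start: str) -> dict[str, str]:
    """Cycle through the probes down the class list, beginning at `start`."""
    offset = PROBES.index(start)
    rotated = PROBES[offset:] + PROBES[:offset]
    return dict(zip(class_list, cycle(rotated)))

def next_period_probe(current: str) -> str:
    """A student who read B in fall reads C in winter and A in spring."""
    return PROBES[(PROBES.index(current) + 1) % 3]

fall = assign_fall_probes(["s1", "s2", "s3", "s4"], start="B")
print(fall)                    # {'s1': 'B', 's2': 'C', 's3': 'A', 's4': 'B'}
print(next_period_probe("B"))  # 'C'
```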
The examiners had to follow specific directions to ensure uniformity. The directions were translated from the Directions for 1-minute Administration of Reading Passages of Baker, Collins and Goodwin (1992). Students were asked to read to the best of their abilities. They were told they would be stopped after one minute. If students hesitated on a word for three seconds, the L.A. teachers had to say the word and mark it as incorrect. The administration procedure and the scoring rules are given in Appendix F.

Written Expression Probes Administration

Each French Immersion teacher administered the written expression probes to his or her own class. Teachers had specific directions (Appendix E) to follow to ensure the uniformity of the administration of the tests. Each school started with a different probe. All the students of Schools A and D started with probe A, School B students with probe B, and School C students with probe C. In January and April, students had a different starter sentence (or probe) to write about.

Students were asked to continue a story starter for three minutes. They had one minute to gather ideas. They were not allowed to ask for words in French or to get help with spelling. Only I did the scoring, to ensure uniformity in the process (Shinn, 1989). I followed the rules defined by the committee members. At times, I found unexpected spelling. I noted these cases to ensure similar marking in the event of future occurrences.

Analysis

All the results for both Reading Fluency and Written Expression were put into tables per school and per grade level. I used Excel 97 to record the data and to do statistical tests. I used a one-factor Analysis of Variance (ANOVA) to assess probe difficulty at each grade level. The Pearson correlation test was used to establish the equivalence and stability coefficients for the scores over time and across probes. The correlation value r is a measure of stability and equivalence: stability of the repeated testing of students, and equivalence of the different probes used for testing.

Ethics Approval

The project was approved by School District #57 and the University of Northern British Columbia Ethics Committee. In order to keep the students' results confidential, student library numbers were used in all testings, and school names were replaced by letters. I also sent a letter to all the parents of French Immersion students explaining the project (Appendix G). Letters of approval are in Appendix G.

RESULTS

Preliminary Analysis

Demographics Analysis

The number of participants ranged from 240 to 317 students during the three norming periods. The results in Table 2 show that the number of students participating in the written expression assessment is different from the number of students participating in the reading fluency assessment in each of the norming periods. The difference in numbers is due to absentees during testing days, and to students moving in or out of schools. The increase in the spring is largely due to the inclusion of grade one students.

Table 2
Number of Students (n) Being Tested During the Three Norming Periods

Period   Reading Fluency (n)   Written Expression (n)
Fall     249                   244
Winter   240                   248
Spring   317                   300

The total number of French Immersion students registered in April was 320. The population consisted of 47.7% boys and 52.3% girls. Table 3 shows the percentage of boys and girls for each grade level. Most grades had a noticeable difference in the boy/girl ratio.
Table 3
Percentage of Boys and Girls per Grade Participating in the CBM Norming

                  Grade 1   Grade 2   Grade 3   Grade 4   Grade 5   Grade 6   Grade 7   Total
Number of boys    32        18        27        16        13        22        25        153
Percentage (%)    50.0      37.5      57.5      45.7      37.1      54.0      49.0      47.7
Number of girls   32        30        20        19        22        19        26        168
Percentage (%)    50.0      62.5      42.5      54.3      62.9      46.0      51.0      52.3
Total students    64        48        47        35        35        41        51        321

The percentage of students reading each probe was as evenly distributed as possible. In each school and for each testing period, students were evenly assigned a different probe. Table 4 shows the number of students and the percentage of the population for each probe and for all the testing periods.

Table 4
Number of Students (n) and Percentage for Each Reading Fluency and Written Expression Probe

        Fall                        Winter                      Spring
        Reading       Writing       Reading       Writing       Reading       Writing
Probe   n      %      n      %      n      %      n      %      n      %      n      %
A       84     33.7   73     30.0   79     32.9   104    41.9   103    32.5   93     32.3
B       83     33.3   74     30.3   79     32.9   72     29.0   103    32.5   105    36.5
C       82     33.0   97     39.8   82     34.0   72     29.0   111    35.0   90     31.2
Total   249    100    244    100    240    100    248    100    317    100    288    100

The percentage of students using each of the three writing probes is not the same for each probe, as the number of students varies from one school to another. Table 5 shows the number of students per probe for each grade level for the fall norming period. In grade six, for example, there are 19 students who wrote in response to probe C and only eight students who wrote in response to probe A in the first norming period. Only six students used probe B in grade four.

Table 5
Number of Students (n) per Grade for Each Written Expression Probe in Fall

          Grade 2   Grade 3   Grade 4   Grade 5   Grade 6   Grade 7   Total
Probe A   16        11        16        12        8         10        73
Probe B   12        18        6         11        12        15        74
Probe C   17        14        13        10        19        24        97
Total     45        43        35        33        39        49        244

Problems with Class Lists

Learning Assistance (L.A.) teachers used class lists to assign probes. In the second norming period, lists were not reviewed and students new to the school were not included. For the last testing period, each L.A. teacher used recent lists to ensure that all students were given both tests.

Main Analysis

Descriptive Statistics

Table 6 shows various statistics for each grade level and each norming period for Reading Fluency.

Table 6
Descriptive Statistics of Reading Fluency Results for Three Norming Periods

Grade     Period   Mean    SD      Min   Max   Skew    Kurtosis
Grade 1   Spring   22.21   18.58   1     75    1.24    0.87
Grade 2   Fall     29.45   21.01   3     75    0.66    -0.62
          Winter   42.43   25.80   10    140   1.28    3.02
          Spring   55.48   30.89   8     126   0.51    -0.46
Grade 3   Fall     57.09   21.34   23    117   0.65    0.62
          Winter   67.69   25.39   28    124   0.44    -0.89
          Spring   71.07   25.10   30    137   0.46    -0.09
Grade 4   Fall     87.31   21.23   43    127   -0.26   -0.49
          Winter   91.32   22.16   33    128   -0.75   0.26
          Spring   96.20   20.70   44    135   -0.19   -0.08
Grade 5   Fall     75.15   19.78   41    119   0.36    -0.20
          Winter   78.70   21.81   35    119   0.05    -0.66
          Spring   83.55   21.77   44    122   0.13    -0.83
Grade 6   Fall     69.13   17.36   39    116   0.59    0.40
          Winter   75.64   14.91   44    112   0.15    0.11
          Spring   76.54   16.40   46    120   0.47    0.49
Grade 7   Fall     77.34   22.64   30    136   0.09    -0.07
          Winter   80.20   22.48   41    142   0.51    -0.05
          Spring   80.30   22.87   31    123   -0.10   -0.50

For the most part, the data are positively skewed across all grades and norming periods, but close to the normal distribution value of 0. The kurtosis values are a mix of small positive and negative values, close to a normal distribution. In grade two, for the fall, there is an odd value for the kurtosis and the skew: both values are very high.
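The statistics in Tables 6 and 7 were produced with Excel 97; an equivalent computation in Python is sketched below on made-up scores. Note that scipy's default skew and kurtosis are population estimates, which can differ slightly from Excel's bias-corrected SKEW and KURT.

```python
# Sketch only: reproduces the Table 6/7 statistics for one grade and period.
import numpy as np
from scipy import stats

wrc = np.array([31, 44, 52, 57, 60, 63, 68, 75, 88, 117])  # hypothetical scores

print(round(wrc.mean(), 2), round(wrc.std(ddof=1), 2), wrc.min(), wrc.max())
print(round(stats.skew(wrc), 2), round(stats.kurtosis(wrc), 2))  # skew, excess kurtosis
```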
The means increase from fall to winter and from winter to spring at each grade level. There is a more noticeable increase in grades two and three from fall to winter. As shown in Figure 1, the increase is minimal for grades four, five, six and seven between certain testing periods. As a result, the curve on the percentile graph has some raw scores for winter under the fall curve, or very close to it. A graph with raw scores and a graph with a smoothed curve are in Appendix H. Smoothing was necessary and will be discussed in more detail in a subsequent section.

[Figure 1. Mean and SD by Grade and Norming Period for Reading Fluency (WRC): line graph of mean and SD values by grade, 1 to 7.]

The grade four mean is greater than the grade five, six and seven means in all the norming periods. Students read more words in grade four than in grades five, six and seven. There is also a decrease in words read from grade five to six.

Table 7 shows the descriptive statistics of the results for written expression. Only the results for Words Spelled Correctly (WSC) are in the table. The results for Total Words Written (TWW) are in Appendix I. They follow the same pattern as the WSC.

Table 7
Descriptive Statistics of Written Expression (WSC) Results for Three Norming Periods

Grade     Period   Mean    SD      Min   Max   Skew    Kurtosis
Grade 1   Spring   10.40   8.32    1     38    1.53    2.41
Grade 2   Fall     8.47    5.30    0     21    0.60    -0.64
          Winter   12.78   7.67    1     33    0.79    0.01
          Spring   15.91   7.58    4     30    0.31    -1.13
Grade 3   Fall     16.28   8.52    2     44    0.97    1.38
          Winter   20.91   11.43   5     66    1.42    3.85
          Spring   21.23   11.12   6     62    1.19    2.76
Grade 4   Fall     23.94   9.58    2     42    0.09    -0.03
          Winter   30.35   10.68   5     54    0.00    0.21
          Spring   31.56   8.49    13    56    0.28    0.91
Grade 5   Fall     30.58   10.19   7     49    -0.30   -0.59
          Winter   36.73   13.38   9     59    -0.47   -0.40
          Spring   39.31   11.30   19    76    0.89    1.97
Grade 6   Fall     37.49   12.35   14    65    0.41    -0.18
          Winter   45.43   14.43   18    72    0.07    -0.91
          Spring   45.49   12.40   23    77    0.37    -0.19
Grade 7   Fall     46.47   12.78   26    73    0.23    -1.03
          Winter   51.82   12.94   30    84    0.66    -0.05
          Spring   51.94   14.95   27    89    0.42    -0.42

For the most part, the data are positively skewed across all grades and norming periods, except for two norming periods in grade five. The kurtosis values are a mix of small positive and negative values, close to a normal distribution. There is a skew of 0 in grade four for the winter testing period, suggesting a normal distribution of the curve. In grade three, the kurtosis values are high and they correspond with higher positive values of skew.

The means increase consistently from grade one to seven, as shown in Figure 2. In grades two, five and six, the mean in the fall is lower than that of the previous grade in the spring. In all the grades, the mean increases noticeably between fall and winter but not as much between winter and spring.

[Figure 2. Mean and SD by Grade and Norming Period for Written Expression (WSC): line graph of mean and SD values by grade, 1 to 7.]

Analysis of Probe Difficulty

It is important that probes within the same grade level offer the same challenge to students. I used two methods to verify the probe difficulty. I used the single-factor analysis of variance (ANOVA) test to compare the means among the probes. The ANOVA results for reading fluency and written expression are in Tables 8 and 9.
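For reference, the single-factor ANOVA comparing the three probe means within one grade and period (run in Excel for the project) is equivalent to the following sketch with hypothetical scores.

```python
# Sketch with hypothetical scores for three probes at one grade and period.
from scipy import stats

probe_a = [25, 31, 38, 42, 50, 55]
probe_b = [22, 28, 35, 40, 47, 53]
probe_c = [27, 30, 36, 44, 52, 58]

f_stat, p_value = stats.f_oneway(probe_a, probe_b, probe_c)
print(f"F = {f_stat:.2f}, p = {p_value:.3f}")
# 'ns' in Tables 8 and 9 corresponds to p > .05; 'sig' to p <= .05.
```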
An alpha level of .05 was chosen. In Tables 8 and 9, the letters ns mean there is no significant difference among the three probes. The letters sig indicate that there is a significant difference among the probes. The probes are ranked from most to least difficult.

Analysis of Reading Fluency Probe Difficulty

Table 8 indicates that of the 19 tests of differences for probe difficulty, there were only two instances, spring grade two and spring grade seven, of a significant difference. This is taken as evidence of a general lack of difference among probes within grade.

Table 8
Reading Fluency Probe Differences Across Norming Periods for Grades One to Seven
(probe means within each period are listed from most to least difficult)

Grade     Period   Probe means               Significance
Grade 1   Spring   21.35, 21.41, 23.89       ns
Grade 2   Fall     25.00, 28.06, 35.67       ns
          Winter   35.13, 40.13, 51.44       ns
          Spring   39.47, 55.07, 73.07       sig
Grade 3   Fall     50.36, 55.75, 64.80       ns
          Winter   61.23, 68.94, 72.62       ns
          Spring   62.19, 68.93, 83.50       ns
Grade 4   Fall     79.81, 84.83, 96.67       ns
          Winter   86.75, 93.64, 94.00       ns
          Spring   88.56, 98.94, 100.39      ns
Grade 5   Fall     73.09, 74.36, 77.75       ns
          Winter   77.67, 78.82, 79.80       ns
          Spring   73.82, 87.33, 89.70       ns
Grade 6   Fall     64.58, 65.77, 76.69       ns
          Winter   68.17, 76.46, 81.29       ns
          Spring   73.31, 75.83, 80.14       ns
Grade 7   Fall     72.59, 75.63, 83.71       ns
          Winter   71.67, 78.93, 89.38       ns
          Spring   69.71, 81.40, 91.20       sig

Analysis of Written Expression Probe Difficulty

I used the scores of the Words Spelled Correctly (WSC) variable to compare the level of difficulty of the writing probes. I analyzed the results for the number of Words Spelled Correctly, and not the Total Words Written results, as the two sets are highly correlated (Marston, 1989). Analysis of one set can be transposed to the other series of results. In Table 9, the one-factor ANOVA test indicates there is a significant difference among the probes 10 out of 18 times.

Table 9
Written Expression Probe Differences Across Norming Periods for Grades One to Seven
(probes are listed from most to least difficult within each period)

Grade     Period   Probe means                       Significance
Grade 1   Spring   B 6.80,  A 9.67,  C 14.50         sig
Grade 2   Fall     B 5.75,  C 8.71,  A 10.25         ns
          Winter   C 8.29,  A 14.33, B 14.82         sig
          Spring   A 12.83, B 16.94, C 17.29         ns
Grade 3   Fall     B 11.94, C 17.93, A 21.27         sig
          Winter   C 13.72, A 23.31, B 28.50         sig
          Spring   A 16.53, B 21.06, C 28.73         sig
Grade 4   Fall     B 12.83, C 24.00, A 28.06         sig
          Winter   B 27.27, C 28.67, A 34.69         ns
          Spring   B 27.75, C 32.38, A 37.00         ns
Grade 5   Fall     B 25.73, C 31.60, A 34.17         ns
          Winter   C 32.33, B 35.25, A 41.50         ns
          Spring   A 37.91, B 38.55, C 41.15         ns
Grade 6   Fall     B 29.58, C 39.63, A 44.25         sig
          Winter   C 43.00, A 44.33, B 51.86         ns
          Spring   A 39.25, B 44.57, C 54.13         sig
Grade 7   Fall     B 40.07, C 46.13, A 56.90         sig
          Winter   A 49.38, C 50.56, B 60.56         ns
          Spring   A 43.08, B 51.84, C 62.80         sig

In the fall, the order of the probes is the same for all grades. Note that the group that wrote in response to probe B in the fall was given probe C in winter and probe A in spring. The meaning of these patterns is discussed in a later section. Students who wrote on probe A had the best results. There was a significant difference among the probes in all the grades except for grade five.

Reliability of the Measures

I used the Pearson correlation test to evaluate the stability of the probes over time (from fall to spring). I compared the scores of the fall to those of winter, the scores of winter to those of spring, and, for the entire period of testing, the scores of fall to those of spring. Table 10 indicates a consistent relationship for Words Read Correctly (WRC) between the norming periods for grades two and five: the r value is consistent for all the periods. In grades three and seven, the correlation is higher in the winter and spring periods.
In grades four and six, contrary to what is expected about the progression of reading skills, the correlation does not decrease over the six-month fall-spring period. The highest correlation value is in grade five.

Table 10
Pearson Correlation for Words Read Correctly Scores Between Norming Periods

Grade     r Fall-Winter (3 months)   r Winter-Spring (3 months)   r Fall-Spring (6 months)
Grade 2   .77                        .78                          .77
Grade 3   .79                        .80                          .70
Grade 4   .74                        .73                          .80
Grade 5   .87                        .87                          .86
Grade 6   .76                        .69                          .79
Grade 7   .68                        .81                          .74
* Grade one was tested only in spring.

Table 11 indicates a higher correlation between the Words Spelled Correctly (WSC) scores in the winter-spring period for all the grades. It decreases in the fall-spring period for most grade levels, as the period covered is longer and thus more prone to fluctuation in the students' writing abilities or motivation. The r value increases in grade five only, going from .52 to .60. The r values for writing are lower than the r values for reading, and thus less stable.

Table 11
Pearson Correlation for Words Spelled Correctly Scores Between Norming Periods

Grade     r Fall-Winter (3 months)   r Winter-Spring (3 months)   r Fall-Spring (6 months)
Grade 2   .57                        .71                          .48
Grade 3   .73                        .75                          .72
Grade 4   .42                        .45                          .30
Grade 5   .32                        .52                          .60
Grade 6   .66                        .77                          .65
Grade 7   .49                        .84                          .56
* Grade one was tested only in spring.

Creation of Norming Tables and Smoothing

In order to build the norming tables, all the students' results were ranked into percentiles from 1 to 99. Raw scores and percentiles for each testing period were put on one graph for each grade level. The resulting graphs show the students' improvement in the skills tested through the testing periods. Unfortunately, some results were lower than the results of the previous testing period. As a result, some curves were below the curve of the next norming period. Hence, the curves needed to be smoothed. I hand-smoothed most curves in the three categories reported on: Words Read Correctly (WRC), Total Words Written (TWW) and Words Spelled Correctly (WSC). I generally lowered the scores in the fall and winter, instead of increasing them, so that cut-off scores would not be artificially inflated. I changed the scores as minimally as possible, to just below the score of the later testing period. I had to make so many changes that I did not try to make the curves as smooth as possible. I did not modify any graph for grade one, as students read and wrote only in the spring.

I created the norming tables using the data of the modified graphs. Some percentile data were available directly from the results. Some other data needed to be pulled from the curve. Table 12 is an example of a norming table for reading fluency for grade three students. The two sets of scores for written expression, i.e., Total Words Written (TWW) and Words Spelled Correctly (WSC), are in the same norming table for each grade. An example is in Appendix J.

Table 12
Norming Table for Reading Fluency - Grade Three (Words Read Correctly)

Percentile   Fall WRC   Winter WRC   Spring WRC
99           112        121          129
95           88         109          115
90           80         101          102
85           76         94           96
80           71         88           90
75           68         83           88
70           66         78           84
65           64         75           79
60           63         70           76
55           61         65           72
50           57         60           70
45           56         58           68
40           52         55           64
35           47         53           61
30           45         51           57
25           42         50           54
20           39         48           51
15           35         40           43
10           31         38           38
5            27         34           36
1            24         28           32
Description column (highest to lowest percentile bands): Well Above Average, Above Average, Average, Below Average, Well Below Average.
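A sketch of the norming-table construction and the smoothing constraint just described follows, on hypothetical data. The project smoothed the curves by hand, lowering fall and winter values rather than raising later ones; the clamp below imitates that direction of adjustment.

```python
# Sketch: percentile cut scores per period, with earlier periods clamped so
# no percentile's cut score exceeds the next period's (hypothetical data).
import numpy as np

PERCENTILES = [99, 95, 90, 85, 80, 75, 70, 65, 60, 55,
               50, 45, 40, 35, 30, 25, 20, 15, 10, 5, 1]

def norm_column(scores):
    return [round(np.percentile(scores, p)) for p in PERCENTILES]

def smooth(fall, winter, spring):
    """Lower fall/winter values that exceed the next period's value."""
    winter = [min(w, s) for w, s in zip(winter, spring)]
    fall = [min(f, w) for f, w in zip(fall, winter)]
    return fall, winter, spring

rng = np.random.default_rng(0)
fall_raw, winter_raw, spring_raw = (rng.normal(m, 20, 45) for m in (57, 68, 71))
cols = smooth(norm_column(fall_raw), norm_column(winter_raw), norm_column(spring_raw))
for p, f, w, s in zip(PERCENTILES, *cols):
    print(f"{p:>3} {f:>5} {w:>5} {s:>5}")
```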
DISCUSSION

Many statistical tests are required to develop reliable assessment tools. If the tools are reliable, the norming tables emerging from the results can be considered reliable as well. Consequently, it is important to examine the results of the statistical tests performed in this project. The sensitivity of the probes to students' growth, the differences between the probes and, finally, the reliability of the probes will be discussed.

Issues Raised by Data

Students' Growth

Overall, the curve of the means for reading fluency (Figure 1) shows a normal upward curve that plateaus in grades six and seven. The curve for CBM reading fluency in English also shows an increase from grade one to seven, with a slower growth rate at the upper intermediate levels (SD57, 1996a). The two curves were judged to be similar. In grades two and three, the curve shows a steady improvement from one period to the other. In these grades, much emphasis is on reading skills instruction, with the result that students demonstrate considerable reading growth in a short time period. Grades six and seven students show less growth in their reading skills, as indicated by the plateau shown on the curve. This corresponds to a shift in reading instruction focus. In the upper intermediate grades, the emphasis is on reading as a way to obtain information through the application of skills taught in earlier grades.

For written expression, there is a constant increase from grade one to seven. This was not observed with the English written expression (SD57, 1996a). A second effect is also apparent. Students improve between each period, but more noticeably between fall and winter. The means in the fall for grades two, five and six are slightly lower than the mean of the previous grade level for the spring period. This phenomenon is called the Summer Effect: students do not use their writing skills as much in the summer, and their spelling is not as sharp as in winter or spring. The saw-tooth effect is similar to the one observed with the English CBM.

Reading Fluency Probe Difficulty

Comparable assessment instruments are essential when they are used to compare students' abilities. It is therefore important to build instruments that are comparable in difficulty over extended periods of time, as they offer "several instructionally related advantages over measuring student performance on ever-changing or increasingly more difficult samples of material" (Fuchs & Deno, 1994, p. 20). The results obtained from the methods used to verify the level of difficulty of the probes have an impact on the effectiveness and the reliability of the norming tables created in this project. The results show no significant differences among the reading fluency probes for all the grade levels, indicating that probes are of equal difficulty. Although the English reading fluency probes within grade were also judged to be of equal difficulty, greater variation among those probes was apparent.

Using texts from the classroom curricula brings a risk of some students being familiar with a probe. The results could be affected by this situation, and the norming scores could be higher than if all students read a new text, especially if the sample is small. However, it is part of the CBM characteristics to use texts studied in class. This situation happened during the course of this project. In grade two, three students had read one of the texts. However, their scores were average and did not stand out from the others. It did not affect the results.
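The Summer Effect noted above can be read directly off the Table 7 means; the small sketch below (mine, not the project's) flags the affected grades.

```python
# Sketch: flag grades whose fall WSC mean falls below the previous grade's
# spring mean, using the means reported in Table 7.
spring_means = {1: 10.40, 2: 15.91, 3: 21.23, 4: 31.56, 5: 39.31, 6: 45.49}
fall_means = {2: 8.47, 3: 16.28, 4: 23.94, 5: 30.58, 6: 37.49, 7: 46.47}

for grade, fall in fall_means.items():
    prev_spring = spring_means.get(grade - 1)
    if prev_spring is not None and fall < prev_spring:
        print(f"Grade {grade}: fall mean {fall} < grade {grade - 1} spring mean {prev_spring}")
# Flags grades 2, 5 and 6, matching the saw-tooth pattern in Figure 2.
```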
Written Expression Probe Difficulty

The selection of probes is also important for the written expression norming. The results were greatly affected by the group of students using a specific probe. In the first norming period, probe A has the highest mean in all the grades. These students, with the exception of grades four and five in winter and grade four in spring, always had the most words correctly spelled. This can be explained by the fact that they are all from the same school and that they participated in a program to enhance written expression: they had used the program Write On (Hart, 2001) since the beginning of the school year. The students who had the fewest correctly spelled words, except for grade four (winter and spring) and grade seven (winter), are from the same school. The probes' order changes, but the groups using them always stay in the same position. Even though the ANOVA test suggests a difference in difficulty between the probes in 10 instances out of 19, the difference can be attributed to the strength of the students. It is evident that a school effect has influenced the scores.

Reliability of the Measures

The r values demonstrate that the reading fluency probes were stable over time and across probes. The range of r values is wider for the French Immersion CBM (.68 to .87) than for the English CBM (.77 to .89) reported in the school district technical report (SD57, 1996a). The median in English is .85 and the median in French is .78; I regard these two values as comparable. Values from the Pearson test demonstrate that the reading probes are stable over time.

The correlation between the written expression probes is not as satisfactory as for the reading probes. The r values for Words Spelled Correctly vary greatly from one period to the other for all the grades, except for grade three. The range of r values is from .30 to .77. Similar data for the English CBM are not available for comparison. However, the correlation coefficients for Total Words Written are available; they range from .48 to .69 (SD57, 1996a). The median value in English is .62 and the median in French is .58; the two are similar. In both instances, the correlation coefficients are lower than the values for reading. The decrease in stability for the written expression probes can be explained by the fact that reading and writing are different cognitive activities. Reading a text for one minute does not require creativity or inspiration as writing does. Therefore, students' performance in written expression can be affected by their mood or motivation on any given day. Hence, more uneven performances are to be expected.

Norming Tables

We were successful in our attempt to create local norms for the French Immersion program. The regularity of the smoothed curves across testing times and the similarity to the English norming curves are a demonstration of this.

Limitations

The population of students registered in French Immersion was quite small, with a maximum of 321 students. In order to have more students in the study, the researcher would have had to include French Immersion students from another school district. This option would have contradicted the purpose of establishing local school district norming tables and created many logistical problems. The number of students varied for each testing period due to student absentees.
The limited time allotted to French Immersion Learning Assistance teachers in each school did not permit the testing of students who were absent for a long period.

SUMMARY

The objectives of this project were to develop efficient and inexpensive tools for Learning Assistance teachers to use with French Immersion students, and to create norming tables for reading fluency and written expression. The project was based on the standard methods used in the School District #57 norming study (1996a). Three reading fluency probes for each of grades one to seven were developed. The ANOVA test shows no significant difference in difficulty among the probes. The written expression probes show significant differences in difficulty for many probes, but the grouping of the students had an effect on the results, as one group was stronger in written expression skills. Based on all the preceding analysis, both sets of probes, reading fluency and written expression, are considered reliable testing instruments. The norming tables created in this project are a tremendous asset for the French Immersion program. They provide logical and usable norms for reading fluency and written expression. All the results show similar reliability to the previous English study (1996a).

Implications for Further Research

French Immersion students start receiving formal English instruction in grade three or four, depending on the composition of their classes (split or one grade only). Even though they can read in English at that point, they do not spell in English as well as their peers who have had English instruction since Kindergarten. Anecdotal discussions suggest that by grade seven, French Immersion students have caught up with their counterparts from the English program. It would be interesting to compare the French Immersion students' CBM scores for reading fluency and written expression to CBM scores in their first language after English instruction has begun. One might suppose that they would write more in their first language, as they are more comfortable in English. One might also expect no significant difference between their reading fluency scores, as they have read and been read to in English from a very young age.

While much research has demonstrated the "validity of CBM reading as a measure of general English reading proficiency, including comprehension" (Marston & Deno, 1982; Shinn & Good, 1992), there is no general consensus about second languages. A study by Baker and Good (1995) provides "initial support for the validity of CBM reading as a measure of English reading proficiency, including reading comprehension, for bilingual students" (p. 572). However, Bertin (1988) demonstrates in her study that knowing the linguistic code and the meaning of each word individually is not sufficient to comprehend the text as a whole. As an educator with fifteen years of experience, I support Bertin's view. Students may be able to decode extremely well but may not know the meaning of the words or the idea expressed. Developing a French CBM test for comprehension, such as a cloze test, and comparing the scores with CBM scores for reading fluency could enhance our knowledge about this topic.

CBM has proven reliable for assessing the basic skills of students in French Immersion. If these scores were compared to the achievement letter grades French Immersion students receive at school, would they be highly correlated? Do the CBM scores give a realistic picture of the academic performance of the students?
It would be interesting to compare letter grades in academic subjects, such as Language Arts, Social Studies, Mathematics and Science, to CBM scores for reading fluency and Mathematics. That research was done for the English students and indicated a positive correlation between CBM scores and grades (Fewster, 2000).

Implications for Practice

Creating norming tables for French Immersion is certainly an asset for French Immersion L.A. teachers and classroom teachers. The norming tables are easy to use, and the results give a good idea of where a student's basic skills stand compared to others. Teachers often need to evaluate the progress of a grade one or grade two student and help parents make a decision about his or her future placement. It is in the early primary grades that schools, parents and teachers prefer to decide if French Immersion is the right program for a child. CBM testing is quick to conduct, and teachers themselves can administer it frequently.

CBM is also a useful tool for quickly assessing a large group of students. In the fall, teachers often do not know all their students' abilities and cannot ask for support from their L.A. teacher at that point. The scores generated by CBM allow them to evaluate the students quickly and give the necessary support early in the year.

My data are available for a research study comparing boys' and girls' scores. Knowing whether one of the genders has much lower scores in reading fluency or written expression could help teachers evaluate and adapt their teaching strategies to the gender composition of their class.

The norming tables created in this project were designed to assess students of School District #57. Exporting the norming tables to other districts would have to be done with caution. Other school districts' French Immersion programs are in different contexts, which may affect the learning progression of their students. For example, school districts might have only one French Immersion school, a big turnover of teachers every year, many trilingual students, an advisor, full-time L.A. teachers, and so on. The norming tables might not be reliable in these different contexts. However, I am available to help any school district that may want to develop CBM probes and norming tables for their French Immersion students.

REFERENCES

Abbotsford School District (n.d.). Sequential grapheme table. Abbotsford, BC: Author.

Baker, S., Collins, V., & Goodwin, M. (1992). Administration and scoring of Curriculum-Based Measurement (pp. 88-89). CBA Training Institute, University of Oregon.

Baker, S. K., & Good, R. (1995). Curriculum-Based Measurement of English reading with bilingual Hispanic students: A validation study with second-grade students. School Psychology Review, 24(4), 561-578.

Bertin, C. (1988). Le rôle des stratégies de lecture dans la compréhension des textes en langue étrangère [The role of reading strategies in foreign language text comprehension]. Canadian Modern Language Review, 44(3), 527-535.

Campeau-Filion, F., & Gauthier, G. (1984). Bilan qualitatif de l'apprentissage de la lecture [Qualitative status of reading acquisition] (2nd ed.). Manuel de l'examinateur [Administrator manual]. Presses de l'Université du Québec.

Deno, S. L. (1985). Curriculum-Based Measurement: The emerging alternative. Exceptional Children, 52(3), 219-232.

Deno, S. L. (1992). The nature and development of curriculum-based measurement. Preventing School Failure, 36(2), 5-16.

Deno, S., Marston, D., Mirkin, P., & Lowry, L. (1982).
The use of standard tasks to measure achievement in reading, spelling, and written expression: A normative and developmental study (Research Report No. 87). Minneapolis: University of Minnesota, Institute for Research on Learning Disabilities.

Fewster, S. A. (2000). School-based evidence for the validity of curriculum-based measurement norms in School District #57. Master's thesis, University of Northern British Columbia, Prince George, British Columbia, Canada.

Fuchs, L., & Deno, S. (1994). Must instructionally useful performance assessment be based in the curriculum? Exceptional Children, 61(1), 15-24.

Hart, N. (2001). Write On. Prince George, BC: School District #57 (Prince George).

Marston, D. B. (1989). A Curriculum-Based Measurement approach to assessing academic performance: What it is and why do it. In M. R. Shinn (Ed.), Curriculum-based measurement: Assessing special children (pp. 18-78). New York: Guilford Press.

Marston, D., & Deno, S. L. (1982). Implementation of direct and repeated measurement in the school setting. Minneapolis: University of Minnesota, Institute for Research on Learning Disabilities. (ERIC Document Reproduction Service No. ED 226048)

Marston, D., Mirkin, P., & Deno, S. (1984). Curriculum-Based Measurement: An alternative to traditional screening, referral, and identification. The Journal of Special Education, 18(2), 110-117.

Microsoft (1997). Excel 97 [Computer software].

Salvia, J., & Ysseldyke, J. E. (1991). Assessment (5th ed.). Boston: Houghton Mifflin.

School District #57. (1995a). Report of the committee on the development of local norms for Curriculum Based Measurement. Prince George, BC: School District #57.

School District #57. (1995b). Curriculum Based Measurement. Prince George, BC: School District #57.

School District #57. (1996a). Guidebook for the use of Curriculum Based Measurement in School District #57. Prince George, BC: School District #57.

School District #57. (1996b). School Support Services: Practices, organization, and principles. Prince George, BC: School District #57.

Shinn, M. R. (1989). Identifying and defining academic problems: CBM screening and eligibility procedures. In M. R. Shinn (Ed.), Curriculum-based measurement: Assessing special children (pp. 90-129). New York: Guilford Press.

Shinn, M. R., & Bamonto, S. (1998). Advanced applications of Curriculum-Based Measurement: "Big ideas" and avoiding confusion. In M. R. Shinn (Ed.), Advanced applications of Curriculum-Based Measurement (pp. 1-29). New York: Guilford Press.

Shinn, M. R., & Hubbard, D. D. (1992). Curriculum-Based Measurement and problem-solving assessment: Basic procedures and outcomes. Focus on Exceptional Children, 24(5), 1-20.

Test de Rendement pour Francophone [Performance test for Francophones]. (n.d.).

Tilly III, W. D., & Carlson, S. (1992). Creating measurement materials. CBA Training Institute, University of Oregon.

Walraven, G., & MacMillan, P. (2000). Curriculum Based Measurement norming for math (calculation) in School District 57. University of Northern British Columbia, Prince George, British Columbia, Canada.

Appendix A
Examples of Primary (gr. 2) and Intermediate (gr. 5) Reading Probes (Teacher's Copies)

Nom:  Niveau:  Date:

Dans ma maison

Dans le salon, on se repose, on lit et on parle. C'est là qu'on reçoit la visite. C'est là aussi qu'on regarde la télévision. Voici la cuisine. Papa et maman travaillent beaucoup ici. Souvent, la table est dans la cuisine. Alors, la famille mange dans la cuisine.
L'heure du repas est un moment bien agréable. Dans la chambre à coucher, il y a un lit pour dormir. On se repose et on s'amuse aussi dans sa chambre. Mon ami a des lits superposés. Je suis bien dans mon lit quand il fait froid. Les lits ont des couvertures chaudes et un bon oreiller. Je fais souvent de beaux rêves. Souvent, je me cache sous le lit. Je saute aussi dessus. C'est agréable de lire au lit. J'aime bien parfois manger dans mon lit. Ce qui est important, c'est que je peux y dormir.

Nombre de mots lus:  Nombre d'erreurs:  Nombre de mots lus correctement:
(Probe 2-C; cumulative word counts per line on the teacher's copy: 3, 14, 20, 27, 30, 36, 43, 50, 58, 70, 80, 86, 96, 106, 112, 119, 123, 129, 136, 146)

Nom:  Niveau:  Date:

Un loup-garou comme mari?

Il était une fois, au nord du Grand Lac des Esclaves, dans le Territoire du Nord-Ouest, une jeune femme inuit qui habitait avec ses deux frères. Elle n'était pas encore mariée. Ses deux frères s'inquiétaient pour son avenir. Or, un jour, un très bel homme vint leur rendre visite. Elle fut vite impressionnée par la gentillesse de cet inconnu. Ce jeune homme se montra très habile et très responsable. Les deux frères l'aimèrent bien. Ils crurent qu'il serait un bon mari pour leur petite soeur. Quelques semaines après, le jeune visiteur demanda à cette jolie dame de l'épouser. Elle l'aimait beaucoup et accepta de le marier. Heureux, les époux emménagèrent dans leur maison. La nuit de noces, la mariée fut soudainement réveillée par des bruits. Elle entendit des hurlements. La jeune épouse voulut que son mari aille voir ce qui se passait dehors. À sa grande surprise, son mari n'était pas là. Elle décida donc d'aller voir d'où venaient ces bruits. Elle ne vit rien. Elle retourna se coucher. Quelques heures plus tard, elle entendit encore des bruits. Mais, cette fois-ci, ils venaient de l'étable. Les animaux s'agitèrent. Pour se protéger, elle apporta une fourche. Un animal était là... c'était un loup.

Nombre de mots lus:  Nombre d'erreurs:  Nombre de mots lus correctement:
(Probe 5-C; cumulative word counts per line on the teacher's copy: 5, 16, 25, 35, 44, 55, 65, 75, 85, 94, 104, 112, 119, 128, 138, 149, 160, 171, 179, 188, 197, 206, 209)

Appendix B
Graphemes and Sight Words Introduced (-), Worked On (+) and Acquired (·) from Grade One to Seven

Sounds tracked from grade one to seven: letter sounds; simple syllables (ba, mo); inverse syllables (vowel-consonant); consonant-vowel-consonant syllables; simple sounds (on, ou, en, in, un, an, er, et, ai, ei, au, oi, ch, qu, ui, eu); silent "h"; open and closed "o"; complex sounds (ph, eau, ain, ein); syllables with two consonants (br, fl, gr, ...); m before p and b (campagne, chambre); complex sounds eu, oeu
- Complex sounds: eu, oeu (ex: oeuf, noeud)
- "s" = "z" (ex: fraise)
- "ent" ending in verbs (ex: aiment)
- "ent" as a sound (ex: lentement)
- "tion"
- "ille"
- "ail", "aille"
- "eil", "eille"
- "ouille", "ouil"
- "oeil", "euille", "ueille", "eil"

Sight words 1:
A, a, aller, allons, arrive, au, aussi, autre, avec, beau, belle, bien, bon, bonjour, bonne, ce, ces, cette, comme, content, contente, dans, de, des, deux, dit, du, elle, elles, en, et, faire, fait, gros, il, il y a, ils, j'ai, je, je suis, jouer, la, le, leur, leurs, lui, ma, mais, manger, marche, mes, moi, mon, ne pas, notre, nous, on, ou, par, petit, petite, plus, pour, près, qu', que, qui, sa, sans, si, son, sur, ta, te, toi, tous, tout, toute, très, tu, un, une, va, vas, vous, votre, viens, venez, vite, voir, vois, voit, veut, veux, y

Sight words 2:
Aux, achète, appelle, apportons, autour, autre, bientôt, cette, ces, chante, chez, chien, combien, depuis, derrière, écris, en, enfants, faites, froid, il y a, jamais, jouons, leur, lis, monsieur, neige, nos, on, par, parce que, peur, plus, pourquoi, pris, quel, qu'est-ce que c'est, sait, sans, sien, soi, tire, toujours, toute, travail, très, va-t-en, vieux, voici, vos

Appendix C
Table of Graphemes for Grade Five and Assessment of Probes

[In the original, this appendix tabulates each grade-five sound with its status ((-) introduced, (+) to work on, (·) acquired) and its number of occurrences in each of the three grade-five probes: Probe A "Mon anniversaire", Probe B "Mouvements de l'océan", and Probe C "Un loup-garou comme mari?". The sounds assessed are: complex sounds (ph, eau, ain, ein); syllables with 2 consonants (br, fl, gr, ...); m before p and b (campagne, chambre); complex sounds eu, oeu (boeuf, noeud); "s" = "z" (fraise); "ent" ending in verbs (ils marchent); "ent" as a sound (lentement); "tion"; "ille"; "ail"/"aille"; "eil"/"eille"; "ouille"/"ouil"; "euil"/"euille"/"ueille"; "ueil"/"oeil"; and others. The per-sound counts were not cleanly recoverable from this copy. The summary rows, as far as recoverable, were:]

Others: Probe A 1 (magnifiques); Probe B 4 (d'est, en est, équateur, superficiels); Probe C 7 (s'inquiétaient, impressionnée, esclaves, inuit, homme, or, femme)

Words with more than 3 syllables: Probe A 4 (magnifiques, merveilleuse, marguerites, anniversaire (2)); Probe B 1 (superficiels); Probe C (Territoire, s'inquiétaient, impressionnée, gentillesse, responsable)

Hard words (others, plus words with more than 3 syllables not already included in others): Probe A 5; Probe B 4; Probe C 10

Easy words: Probe A 95; Probe B 96; Probe C 90
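The tallies in Appendix C were made by hand. For readers who want to repeat a similar check on a new passage, a rough automated pass is sketched below in Python; this is an illustration, not part of the original project. The syllable count is a crude vowel-group heuristic, and the set of "other" difficult words is a hypothetical stand-in for the assessor's own list.

```python
import re

VOWELS = "aeiouyàâäéèêëîïôöùûü"

def syllable_estimate(word: str) -> int:
    """Approximate syllable count as the number of vowel groups.
    A rough heuristic only; French syllabification is more subtle
    (silent final e, semivowels, and so on)."""
    return len(re.findall(f"[{VOWELS}]+", word.lower()))

def flag_hard_words(text: str, other_hard: set[str]) -> dict:
    """Flag candidate 'hard' words in a probe: words of more than
    three (estimated) syllables plus any word in the assessor's
    own list of difficult words."""
    words = re.findall(r"[a-zàâäéèêëîïôöùûüç'-]+", text.lower())
    long_words = [w for w in words if syllable_estimate(w) > 3]
    listed = [w for w in words if w in other_hard]
    return {"long_words": long_words, "listed": listed,
            "hard_total": len(set(long_words) | set(listed)),
            "word_total": len(words)}

# Example with the opening of probe 5-C; "esclaves" and "inuit" are
# taken from the "others" row of Appendix C.
sample = "une jeune femme inuit qui habitait avec ses deux frères"
print(flag_hard_words(sample, {"esclaves", "inuit"}))
```

Any such pass would only be a first screen; the final difficulty judgment stays with the assessor, as it did in the project.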
Appendix D
CBM Written Expression Probes

A
Numéro de l'élève: Niveau: Date:
Écris une histoire qui commence par ...
Hier, un singe est entré dans l'école en passant par la fenêtre et ...
TME: TMBE:

B
Numéro de l'élève: Niveau: Date:
Écris une histoire qui commence par ...
Je jouais dehors lorsque tout à coup des extra-terrestres ...
TME: TMBE:

C
Numéro de l'élève: Niveau: Date:
Écris une histoire qui commence par ...
J'ai trouvé un crayon magique et ...
TME: TMBE:

Appendix E
Directions for Administration of Written Expression Probes and Scoring Procedure

Directives pour l'administration du test CBM d'écriture
(Directions for Administration of Written Expression Probes)

Matériel: test, chronomètre, les directives

1- Assurez-vous que les élèves aient un crayon bien aiguisé et une gomme à effacer.
2- Assurez-vous qu'ils connaissent leur numéro d'élève.
3- Donnez les feuilles de test aux élèves et demandez-leur d'écrire leur numéro d'élève et de tourner la feuille à l'envers lorsqu'ils ont fini.
4- Dites ces directives (n'ajoutez pas d'autres directives, s.v.p.): Vous allez écrire une histoire. Premièrement, je vais vous lire une phrase qui est un début d'histoire. Vous allez ensuite continuer l'histoire (écrire ce qui arrive après ce début d'histoire). Je vais vous donner 1 minute pour penser à l'histoire et ensuite vous allez écrire pendant 3 minutes. Il ne faut pas oublier qu'il faut écrire le mieux possible. Si vous ne savez pas comment écrire un mot, écrivez-le du mieux possible, comme vous pensez. Vous ne pouvez pas me demander des mots en français. (Suivez bien cette règle; cela va ralentir leur écriture s'ils vous posent des questions. Il faut 3 minutes d'écriture sans interruption.) Faites du mieux que vous pouvez. Avez-vous des questions? (pause) Voici le début de l'histoire: ...
5- Après avoir lu le début de l'histoire, laissez les élèves réfléchir pendant 1 minute. Vérifiez qu'aucun élève ne commence à écrire. Après 30 secondes, dites: Vous devriez être en train de réfléchir à ... (phrase de l'histoire).
6- Après que la minute soit terminée, dites: Commencez à écrire maintenant. Commencez à compter la période de 3 minutes.
7- Encouragez les élèves à se concentrer sur le travail à accomplir.
8- Après 90 secondes, dites: Vous devriez être en train d'écrire au sujet de (sur) ... (début de l'histoire).
9- Après 3 minutes, dites: Arrêtez, déposez vos crayons. Vous avez bien travaillé. Merci.
10- Ramassez les feuilles. Assurez-vous qu'ils ont écrit leur numéro d'élève.

C.B.M. - Écriture - Règles de correction (Scoring Procedure)

Le test vérifie deux habiletés en écriture: l'aisance à écrire et l'épellation. Pour que les résultats soient comparables entre les écoles, il est important que les correcteurs/trices appliquent les mêmes règles pour compter le nombre total de mots écrits et le nombre total de mots bien épelés. En cas de doute, il faudra contacter la coordinatrice du projet.

1- Comment compter les mots:

a) L'apostrophe: une lettre suivie d'une apostrophe sera comptée pour un mot car la lettre remplace un mot.
Exemples: L'écureuil - 2 mots (le écureuil); J'ai - 2 mots (je ai); Ce n'est pas - 4 mots (ce ne est pas)

b) Le trait d'union: si le trait d'union unit 2 mots qui existent séparément, on comptera 2 mots pour ce groupe. Par contre, si un des mots n'existe pas séparément, on comptera 1 mot pour ce groupe de mots.
Exemples: Porte-feuille - 2 mots; Peut-être - 2 mots; Est-ce que - 3 mots; Pré-test - 1 mot

c) Les mots anglais: chaque mot sera compté.

d) Les mots inventés: les mots inventés seront comptés.

e) Les abréviations: les abréviations seront comptées pour un mot.
Exemples: BBQ - 1 mot; T.V. ou tv - 1 mot; CD - 1 mot

f) Les nombres: les nombres doivent être écrits avec des lettres.
Exemples: Les 2 chaises - 2 mots; J'ai 8 ans - 3 mots; J'ai vingt-cinq ans - 5 mots
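These counting rules are mechanical enough to automate. The sketch below is an illustration only (the project scored by hand): it applies rules 1a, 1b, 1e and 1f, and the small STANDALONE set is a hypothetical stand-in for the lexicon a scorer would need in order to decide whether the parts of a hyphenated group exist on their own.

```python
# Hypothetical mini-lexicon for the hyphen rule (1b): a hyphenated
# group counts one word per part only if every part stands alone.
STANDALONE = {"porte", "feuille", "peut", "être", "est", "ce", "que",
              "vingt", "cinq"}

def total_words_written(text: str) -> int:
    """Count Total Words Written (TWW) following rules 1a-1f.
    Digits are skipped (rule 1f: numbers must be written in letters);
    apostrophe contractions count as two words (rule 1a); hyphenated
    groups count per part if all parts stand alone, else one word
    (rule 1b); any other token, including invented or English words
    and abbreviations, counts as one (rules 1c-1e)."""
    count = 0
    for token in text.split():
        token = token.strip('.,!?;:"()').lower()
        if not token:
            continue
        if token.isdigit():              # rule 1f: digits not counted
            continue
        # rule 1a: "l'écureuil" -> "l" + "écureuil" = 2 words
        for part in [p for p in token.split("'") if p]:
            sub = [s for s in part.split("-") if s]
            if len(sub) > 1 and all(s in STANDALONE for s in sub):
                count += len(sub)        # rule 1b: all parts exist alone
            else:
                count += 1               # rule 1b fallback, rules 1c-1e
    return count

print(total_words_written("J'ai vingt-cinq ans"))  # 5, as in rule 1f
print(total_words_written("Les 2 chaises"))        # 2, digit skipped
print(total_words_written("Pré-test"))             # 1, per rule 1b
```

The rules for the second score, words spelled correctly, follow.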
2- Comment compter les mots bien épelés:

On ne doit pas tenir compte des différentes règles de grammaire de la langue française.

a) L'accord des verbes au sujet: si le verbe existe avec l'épellation du texte, il sera compté comme "bon". Il ne doit pas s'accorder avec le sujet.
Exemples: Ils mange - 2 mots bien épelés; Nous avons bus - 3 mots bien épelés

b) L'accord des verbes aux temps et modes: si le verbe existe avec l'épellation du texte, il sera compté comme "bon". Une mauvaise utilisation du temps ou du mode du verbe ne doit pas être pénalisée.
Exemples: Ils ont mangé - 3 mots bien épelés; Vous pense - 2 mots bien épelés; J'ai mettre - 3 mots bien épelés

c) L'accord des adjectifs et des noms: les noms et adjectifs ne seront pas pénalisés s'ils ne sont pas accordés en genre (féminin/masculin) et en nombre (singulier/pluriel).
Exemples: Les beau fille - 3 mots bien épelés; Le petite chaises vert - 4 mots bien épelés

d) Les déterminants: le choix erroné d'un déterminant (article, adjectif possessif) ne sera pas pénalisé si le mot est bien épelé.
Exemples: Ma chapeau - 2 mots bien épelés; La école - 2 mots bien épelés

e) Les homophones: le choix erroné d'un homophone ne sera pas pénalisé si le mot utilisé est bien épelé.
Exemples: Les chiens on mangé l'heure os. - 6 mots bien épelés; Mait yeux sons bleus - 3 mots bien épelés

f) Les accents: les accents sur les voyelles doivent être précis (bonne direction).
Exemples: Tres fache - 1 mot bien épelé; Le gateau - 1 mot bien épelé

g) Les noms propres de personnes ou d'animaux: si l'orthographe des noms propres de personnes ou d'animaux varie dans le texte, l'orthographe du mot le plus utilisé sera prise en considération.
Exemple: Sabrina, Sabrinae, Sabrinae - 2 mots bien épelés

h) La majuscule au début d'une phrase: on ne comptera pas une faute si le premier mot de la phrase n'a pas de majuscule.

i) Les lettres inversées: si la lettre inversée produit un mot bien orthographié, ce mot sera accepté. Mais si la lettre produit un mot qui n'existe pas ou qui n'est pas bien écrit, il ne sera pas considéré comme mot bien épelé.
Exemples: Doule au lieu de boule - 0 mot bien épelé; Bon au lieu de don - 1 mot bien épelé

j) Les abréviations: les abréviations sans points entre les lettres seront considérées comme bien orthographiées.
Exemples: La tv - 2 mots bien épelés; La NASA - 2 mots bien épelés
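Taken together, rules 2a to 2j reduce to one test: does the written form exist in French, regardless of agreement, tense, determiner or homophone choice? A minimal sketch under that reading follows; the LEXICON here is a hypothetical stand-in for a full word-form list (in practice, a spell-checker dictionary containing every valid inflected form), and special cases such as proper names (rule 2g) and reversed letters (rule 2i) are left to the scorer.

```python
# Hypothetical lexicon of valid French word forms; a real scorer
# would load a complete spell-checker word list here.
LEXICON = {"ils", "mange", "mangent", "les", "beau", "beaux",
           "fille", "filles", "ma", "mon", "chapeau"}

def words_spelled_correctly(words: list[str]) -> int:
    """Count Words Spelled Correctly (WSC) under rules 2a-2j: a word
    is 'correct' if the written form exists in the language, even
    when agreement, tense, determiner or homophone choice is wrong."""
    return sum(1 for w in words if w.lower() in LEXICON)

print(words_spelled_correctly(["Les", "beau", "fille"]))  # 3 (rule 2c)
print(words_spelled_correctly(["Ma", "chapeau"]))         # 2 (rule 2d)
```

Existence in the lexicon also covers rule 2f, since a form with a wrong or missing accent is simply not a valid word form.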
Appendix F
Directions for Administration of Reading Probes and Scoring Procedure

Directives pour l'administration du test CBM de lecture
(Directions for Administration of Reading Probes)

Matériel: copie de l'élève (probe A, B ou C); copie du professeur (avec le nombre de mots par ligne); un chronomètre; une enregistreuse à cassette

1- Placer la copie de l'élève devant lui.
2- Placer la copie du professeur dans un cartable (ou autre) devant vous de façon à ce que l'élève ne voie pas ce que vous écrivez.
3- Dites ces directives à l'élève: Quand je dirai "vas-y", tu pourras commencer à lire à voix haute du début du texte (montrez le début avec votre doigt). Lis le texte du mieux que tu peux. Je vais t'arrêter après 1 minute. Si tu as de la difficulté avec un mot, je t'aiderai. As-tu des questions?
4- Dites le nom de l'élève ou son numéro de bibliothèque dans l'enregistreuse. Dites ensuite: Vas-y. Commencez le chronomètre quand l'élève dit le premier mot. Si l'élève hésite sur le premier mot du texte pendant 3 secondes, dites-lui le mot, inscrivez-le comme une faute et commencez le chronomètre quand il dira le deuxième mot.
5- Suivez la lecture sur votre copie. Soulignez ou encerclez les mots que l'élève lit incorrectement.
6- Si l'élève hésite ou ne sait pas un mot pendant 3 secondes, dites-lui le mot et inscrivez-le comme une faute.
7- À la fin de la minute, inscrivez une barre oblique (/) pour indiquer la fin de la lecture et dites: Arrête.
8- Remerciez l'élève et dites-lui un mot d'encouragement. Passez à l'élève suivant.

* Occasionnellement, un élève lira à toute vitesse, c'est-à-dire qu'il lira rapidement et sans expression. Dites-lui que ce n'est pas un test de vitesse et qu'il doit lire du mieux qu'il peut. Recommencez la lecture.

Règles de correction pour CBM lecture (Scoring Procedure)

1- Mots lus correctement:
a) Mots bien prononcés
b) Mots corrigés par le lecteur lui-même
c) Mots répétés (comptés pour un mot)
d) Mots dits avec un accent différent (dialecte)
e) Mots insérés (ils sont ignorés)

2- Mots considérés comme "incorrects":
a) Mots mal prononcés ou changés pour un autre mot (même si c'est une substitution logique)
b) Mots omis
c) Hésitations
d) Mots dits dans le mauvais ordre (inversés)

3- Règles spéciales:
a) Nombres numéraux et ordinaux
b) Mots avec apostrophes
c) Mots avec trait d'union
d) Abréviations
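In practice, scoring a reading probe reduces to simple arithmetic on the marked teacher's copy: the cumulative counts printed beside each line give the number of words attempted in the minute, and the errors marked under rule 2 are subtracted. A minimal sketch of that calculation (an illustration, not part of the project's procedure):

```python
def words_read_correctly(cumulative_counts: list[int],
                         last_full_line: int,
                         extra_words: int,
                         errors: int) -> int:
    """Compute Words Read Correctly (WRC) for a one-minute reading.

    cumulative_counts: running word totals printed at the end of each
      line on the teacher's copy (e.g. probe 2-C: 3, 14, 20, ...).
    last_full_line: 1-based index of the last line the student
      finished; 0 if the minute ended during the first line.
    extra_words: words read past that line before the slash mark.
    errors: mispronounced, omitted, hesitated (3 s) or out-of-order
      words, per the rules above.
    """
    attempted = (cumulative_counts[last_full_line - 1]
                 if last_full_line else 0) + extra_words
    return attempted - errors

# Probe 2-C example: the student finished line 3 (20 words), read
# 4 more words, and made 2 errors: 24 attempted, so WRC = 22.
print(words_read_correctly([3, 14, 20, 27, 30], 3, 4, 2))
```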
Appendix G
Letters of Permission and Information Letter to Parents

SCHOOL DISTRICT NO. 57 (PRINCE GEORGE)
1894 Ninth Avenue, Prince George, B.C. V2M 1L7
Phone: (250) 561-6800 Fax: (250) 561-6801
www.schdist57.bc.ca

October 23, 2001

Sylvie St. Pierre
Teacher
College Heights Elementary School

Dear Sylvie:

This letter is to confirm our telephone conversation regarding your request to develop reading and writing norms for the French Language Arts curriculum as a part of your master's thesis. You spoke with Dave Devore in September and received tentative permission to expand the request for a project. Although the School District Curriculum Department is not planning the development of French CBM norms in the next few years, we do give permission in principle for this project to go forward. However, the permission of each of the principals of the four schools mentioned will need to be obtained before you initiate the project in their school. Your letter of September 26th outlines the project very well. As we discussed during our conversation, you have agreed to use the students' numbers rather than their names on the data in order to protect confidentiality. Please also forward to me a copy of the UNBC Ethics Committee approval.

The school district receives many requests for research projects. Although we do try to support many of these projects, in principle, and encourage schools to accept research students into their schools, we are unable to provide any release time for projects of this nature.

If you have any questions, please do not hesitate to call me. Good luck with your project.

Sincerely,

Bonnie Chappell
Director of Curriculum & Instruction

BC/hg
cc: D. DeVore, Director; C. Anserello, School Services Administrator; French Immersion Principals

UNIVERSITY OF NORTHERN BRITISH COLUMBIA
Research Ethics Board

MEMORANDUM

To: Sylvie St. Pierre, Peter MacMillan, Education Program
From: Alex C. Michalos, Chair, Research Ethics Board
Date: December 12, 2001
Re: Ethics Proposal 2001.1207.112, Curriculum Based Measurement Norming Tables for Reading Fluency and Written Expression for SD #57 French Immersion Program

Thank you for submitting the above noted proposal to the UNBC Research Ethics Board for review. Your proposal has been approved and you may begin your research. If you have any questions regarding the above, please feel free to contact me.

Sincerely,

Dr. Alex C. Michalos, Chair
UNBC Research Ethics Board

January 2002

Dear Parents,

During this school year, Learning Assistance Teachers (L.A.T.) in the French Immersion schools are conducting a series of tests in reading and writing with French Immersion students. These tests, Curriculum-Based Measurement (CBM), have been used in our School District for many years for all students at the elementary levels, but in English only. In a 1995-96 report, the School District recommended the development of probes and norming tables for French Immersion. As a U.N.B.C. graduate student in Education, I have taken this endeavor on as the final project for my Master of Education degree, and it has been approved by the School District Administration. Being a French Immersion Learning Assistance Teacher myself, I am taking on a task that is part of the L.A.T.'s regular responsibility, thus developing a project that is relevant for the students' ongoing evaluation.

Although the results will be part of the University project, neither student names nor school names will be used in the project or in the report to the school board. Library numbers are used to keep track of student data. However, each school L.A.T. will have access to student names and results, as assessing is part of their regular responsibility.

Testing is done in October, January and April for grades two to seven. Grade one students are tested only in April. The series of three tests will establish norming tables so that educators have a benchmark for comparing students at different periods of the school year. These tests are easy to administer and measure student progress in the same curriculum used in the French Immersion classroom. It is very difficult to find standardized tests for French programs, and the results of this project will help both L.A.T.s and classroom teachers to develop programs relevant to the needs and skills of the students.

If you have any concerns or questions about the tests or the UNBC project, you can contact your school L.A.T. or myself, Sylvie St-Pierre, L.A.T. (UNBC student), at College Heights Elementary School at 964-4408 or at home at 562-9268. Merci, thank you! I would like to reiterate and assure you that both school and student names are kept confidential.

Sincerely,

Sylvie St-Pierre
French Immersion teacher, grade 5/6
French Immersion Learning Assistance Teacher
École College Heights Elementary School

Appendix H
Graph with Raw Scores and Graph with a Smoothed Curve (Grade Six)

[Two figures. The first, "Words Read Correctly in Grade Six", plots raw WRC scores (0 to 130) against percentiles (0 to 100) for three series: Fall WRC, Winter WRC and Spring WRC. The second, "Words Read Correctly in Grade Six Smoothed", plots the same three series after smoothing. Only the captions, axis labels and legends were recoverable from this copy.]

Appendix I
Descriptive Statistics of Written Expression (TWW) Results for Three Norming Periods

Descriptive Statistics of Written Expression (TWW)
(Grade one students were tested in spring only.)

Grade  Period   Mean    SD     Min  Max  Skew   Kurtosis
1      Spring   15.48  10.50     1   48   1.17    1.34
2      Fall     11.82   7.22     1   31   0.60   -0.39
2      Winter   17.41   9.19     4   41   0.80    0.03
2      Spring   19.12   8.16     5   34   0.08   -1.04
3      Fall     20.23   9.44     5   50   1.12    1.54
3      Winter   25.20  12.71     8   73   1.18    2.80
3      Spring   24.86  11.66     8   68   1.18    2.98
4      Fall     28.43  11.24     2   50  -0.15    0.19
4      Winter   35.26  11.17     6   64   0.02    0.96
4      Spring   36.53   8.99    21   61   0.51    0.45
5      Fall     35.12  10.15     8   52  -0.47    0.31
5      Winter   40.97  13.86    10   64  -0.43    0.50
5      Spring   44.77  12.24    24   84   0.83    1.90
6      Fall     42.62  12.70    19   71   0.38   -0.25
6      Winter   50.76  13.69    27   78   0.31   -0.92
6      Spring   51.03  12.53    28   81   0.50    0.06
7      Fall     50.00  13.66    30   82   0.25   -0.75
7      Winter   57.10  13.20    35   90   0.65    0.01
7      Spring   56.70  14.41    30   92   0.22   -0.45
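The statistics above were produced in Excel 97. For readers wishing to reproduce the same summary for one grade and norming period, a sketch follows; the scores shown are invented, and the bias-corrected skewness and excess kurtosis options are assumed here to match Excel's SKEW and KURT functions.

```python
import numpy as np
from scipy import stats

def describe(scores):
    """Summary statistics in the form reported in Appendix I."""
    x = np.asarray(scores, dtype=float)
    return {
        "n": int(x.size),
        "mean": round(float(x.mean()), 2),
        "sd": round(float(x.std(ddof=1)), 2),   # sample SD, as in Excel
        "min": int(x.min()),
        "max": int(x.max()),
        "skew": round(float(stats.skew(x, bias=False)), 2),
        "kurtosis": round(float(stats.kurtosis(x, fisher=True,
                                               bias=False)), 2),
    }

# Illustrative only: invented TWW scores, not project data.
print(describe([19, 27, 35, 41, 44, 46, 52, 58, 63, 71]))
```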
Appendix J
Norming Table for Total Words Written (TWW) and Words Spelled Correctly (WSC) for Grade Six

Total Words Written and Words Spelled Correctly, Grade Six

            Fall         Winter       Spring
Percentile  TWW   WSC    TWW   WSC    TWW   WSC
99           70    64     77    71     80    72
95           66    62     71    68     76    67
90           63    53     66    62     70    66
85           56    49     66    58     62    63
80           51    47     63    54     59    59
75           50    45     59    53     55    56
70           49    44     55    50     54    53
65           45    43     52    48     51    52
60           43    40     51    47     49    51
55           42    38     50    46     48    50
50           42    38     49    45     47    49
45           40    37     47    42     46    48
40           39    35     44    40     44    46
35           38    32     43    38     42    45
30           37    29     40    36     40    43
25           35    28     39    34     38    41
20           32    27     37    33     36    39
15           29    26     36    31     34    38
10           28    23     34    29     32    32
5            25    22     31    24     31    29
1            20    16     28    19     28    24

[A Description column in the original bands the percentile ranks from Well Above Average at the top, through Above Average, Average and Below Average, to Well Below Average at the bottom; the exact cut-points were not recoverable from this copy.]
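A norming table such as this one pairs each percentile rank with the corresponding score in the grade's distribution; Appendix H shows the raw percentile curve and its smoothed version for reading fluency. The sketch below illustrates one way such a column could be generated and smoothed. Numpy's linear-interpolation percentile and a simple 3-point moving average are assumptions standing in for whatever Excel procedure the project actually used, and the scores are invented.

```python
import numpy as np

PERCENTILES = [99, 95, 90, 85, 80, 75, 70, 65, 60, 55,
               50, 45, 40, 35, 30, 25, 20, 15, 10, 5, 1]

def norming_column(scores):
    """Score at each percentile rank, rounded to whole words."""
    return [int(round(np.percentile(scores, p))) for p in PERCENTILES]

def smooth(column, window=3):
    """3-point moving average, keeping the endpoints; one simple way
    to produce a smoothed curve like the one in Appendix H."""
    out = list(column)
    for i in range(1, len(column) - 1):
        out[i] = round(sum(column[i - 1:i + 2]) / window)
    return out

# Illustrative only: invented fall TWW scores for one grade.
fall = [20, 25, 28, 29, 32, 35, 37, 38, 39, 40, 42, 42,
        43, 45, 49, 50, 51, 56, 63, 66, 70]
col = norming_column(fall)
for p, raw, sm in zip(PERCENTILES, col, smooth(col)):
    print(f"{p:>2}  raw {raw:>3}  smoothed {sm:>3}")
```

Smoothing each measure's column independently can leave small irregularities between related measures, which is one reason the project inspected the raw and smoothed curves side by side before finalizing the tables.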