I wish someone had told me results weren’t going to happen overnight…
I must first confess that I am a data nerd and look at data a lot, both formative and summative. In my early years as a dual language principal, I crunched numbers and used the data from the central office the way my colleagues in monolingual buildings did. The students were not as successful as we wanted them to be; there was an achievement gap between White students and students of color (Black, Brown)/LEP/EC students. Since I was in a school within a school, the traditional teachers said the students were low because they were learning in Spanish. I knew that was not what the research said should be the case, but that is what the data said. I did not know what to say or do next. I kept thinking about the other pieces of the puzzle that were missing: How were students doing in their native language? What did the Latino group look like across traditional and dual language classrooms? It was at that moment that I realized our school and district did not have consistent measures to know how students were doing in Spanish. I began asking other programs what they were doing and got a variety of answers: nothing formal, teacher-created assessments, translated versions of the English assessments, formal measures in Spanish for reading/math (DRA, MCLASS, ARC, Kid-Biz, I-Ready), and measures for Spanish language (STAMP, Apple). Honestly reflecting now, I did not know what some of these assessments were, how to order them, whether they were good quality, how I would pay for them, and so on. I contacted the central office people in charge of content areas, and they were honest in saying they had no idea what to recommend. The dual language support person was also no help, but she was willing to purchase materials or assist us with whatever decisions needed to be made. I just knew we didn't have anything and were not using data in the target language to look across language domains and content knowledge.
I brought together my instructional team and coaches in separate settings and shared the limited data picture. There were a lot of head nods and an understanding of why we needed assessments. At that point I pulled out the guiding principles standards about assessment, and we all agreed we had no evidence for the indicators. We began asking various companies for samples to get an idea of what the choices were. Once we had some options, we selected a Spanish reading assessment that mirrored our English one (DRA). After that, when the district selected a new assessment, we were ready to advocate for a Spanish tool as well. Over the years for Spanish reading, we have used Spanish DRA, Fountas and Pinnell, a school-developed measure similar to Fountas and Pinnell (for upper grades), MCLASS, I-Station, and American Reading Company. For mathematics, we have used Envision, Investigations, and Ready Math. There are lots of options for math, so most school systems are able to purchase the Spanish version. However, many times the materials are not as fully developed in Spanish; some companies only have the student work, games, and whole-class lessons. I strongly encourage your teachers to review all components side by side with the English components so you can see what is really available. I have also had the experience of a company sending Spanish samples that were excellent, but when the purchased math materials arrived, the quality of translation was not as strong and there were many errors. The company asked our teachers to “edit” for them and send the corrections so they could be fixed. Overall, our experience with math assessment is that the Spanish assessment was comparable to the English assessment. To measure Spanish language, our school initially used the IPT in Spanish (we also used the IPT in English before WIDA) and found the results helpful in guiding our program decisions, particularly regarding grammar for production in speaking and writing.
Our school district transitioned to the STAMP assessment, which we administer in grades 5 and 8, though we have also experimented with using it in grade 2 and grades 4-5. We analyze the results annually to look at student progression and language acquisition at the end of each grade span. The speaking section seems harder for students; they consistently perform lower in this area than in the other language domains.
Once we had target language data, we were able to have deeper dialogue about student progress across both languages. Some students were low in both languages and needed support in the content area and/or language development. Other students were high in both languages and needed enrichment in both. Still other students were high in one language and low in the other; those students needed additional support and explicit instruction to bridge their knowledge of content and language from one language into the other. This type of information was very helpful for teachers as they were planning whole and small group instruction and targeting students for specific skill development. It allowed teachers to know who needed support for language development, content development, or both.
For the data report in our school improvement plan, our school district's annual testing report continues to include only English accountability measures (state end-of-grade tests). In dual language, we know that is only half the picture of a child's development. We calculate student progress in both languages and are able to report on both (see sample table) to our school community. This table shows how students in each of our subgroups are doing at the beginning, middle, and end of the year in Spanish on the F & P measure. The last two columns are English measures that are more “test like” and show student progress in English. What this table tells us is that students overall are able to read in Spanish on grade level. When students take a “test” in English, White (English-speaking) students perform at the same level. Hispanic and LEP students show a much wider gap between their Spanish and English proficiency. Thus, we believe our students do not need more reading instruction; they can read well in Spanish. They need support transferring their knowledge into English and also support with test taking. While our system is far from perfect, we have found analyzing student data across the languages very helpful.
The frustrating part is figuring out how to use this data to advocate with our central office and parent community, who review the English-only data picture and compare our school to other schools. In those comparisons, LEP students usually perform better in other schools than they do in ours. As a principal, I fully own and know we need to target our LEP students and provide them the time and targeted instruction they need. However, often there is no additional digging into what the LEP populations look like in other schools in our system. Many LEP students in other schools are not Spanish speakers living in poverty; there is a large Asian population that moves in and out of our school system every 3-5 years, mostly for graduate studies at the local universities. Those LEP students are very different. I also enjoy sharing that our students (all subgroups) grow 3-5 times more than students at any other school, so the larger picture is not always about proficiency. I don't wait for someone to ask me about the data. When I have meetings with central office staff or parent groups, I talk about the data, how we compare to other schools, and what our school's next steps are based on what we think the data is telling us. I learned over fifteen years ago that you have to “stay the course” and keep going back to the other measures you have and the rate of growth.