Using Data in Discussions of Sustainability: Some thoughts from a mid-point LSC
I appreciate the opportunity to share some thoughts about data and sustainability from an evaluator's perspective. First, a brief backdrop. The LSC with which I work is just past the midpoint of its funding period, providing professional development (p.d.) to K-6 teachers in 48 schools across a rural five-county area to support implementation of kit-based science modules. The modules were selected to align with the state's grade-level standards for science. A participating teacher is introduced to the kits one at a time, over a three-year period, principally through summer institutes with academic-year follow-up. Participation early in the project was based on individual volition, so the first cohort represents the more "open" teachers; the third cohort has more of the "reluctant" teachers, who received stronger administrative encouragement (or mandate) to attend. This school year, the first cohort of teachers is completing its first full cycle of implementing all four units for their respective grade levels.
As I see it, the role of evaluation for an initiative like an LSC is to work in four areas, each of which has formative and summative aspects:
In addition, evaluators play two important process roles with the project:
Thinking about sustainability involves looking closely at impact and residue, as well as using the "story" of the initiative to make a case for the efforts to continue. The right kinds of data, appropriately used, can help the LSC stay on course, make needed changes, document its effects, and make its case to stakeholders. But focusing on inappropriate data can distract the initiative from its work. How do you tell the difference? Here are some thoughts for reaction and conversation.
Probably the area with the greatest potential for "use and abuse" of data is looking for the effect of LSC professional development on student performance. Increasing student learning is ultimately what we're all about, and it is foremost on the minds of important stakeholders (especially local administrators). So there is great pressure to "sell" the success of the initiative as quickly as possible, using student impact data that resonate with the local audience. The perils here are many, and they revolve around the question of what data to look at, and when. Issues include the following:
The list above is by no means exhaustive, but represents a few of the issues we've discussed in the unfolding evaluation of the LSC I work with.
Let's take a look at the first bullet. As I mentioned, there is great pressure to "sell" the success of the initiative as quickly as possible. I advocate, however, for an "Orson Welles" approach. As he used to say in the commercial, "We will sell no wine before its time." Similarly, we should not try to "sell" LSC impact on students until the activities have borne fruit and that fruit has had a chance to ripen. We must ask at what point in the project's work it is reasonable to expect that students would feel a significant effect of their teachers' professional development. In my LSC, we've taken the position that we won't look for changes in student performance until participating teachers have had the opportunity to implement the materials and strategies consistently throughout the school year. Since the LSC design takes teachers three years to begin using all the modules, we are just now at the point where data on teacher implementation might be worth examining in a systematic manner with respect to broad student effects.
Explaining this to teachers and administrators has been something of a challenge, but not as tough as we expected. Once we help them think it through, they agree that a teacher who doesn't implement the materials consistently and effectively probably won't see the same effects as a teacher who does. And a teacher who is implementing only one six-week module, no matter how skillfully, should probably not expect much change in students' "total science" scores on the state assessment. Consistent, comprehensive, and effective implementation is the "trigger point" for expecting to see the student impact that local personnel want the project to report. It's a matter of monitoring to tell when we've reached that point, as sketched below.
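To make that decision rule concrete, here is a minimal sketch of the trigger-point check in Python. The field names, the 0-to-1 rating scales, and the 0.75 threshold are hypothetical illustrations I've chosen for the example, not the project's actual instrument.

```python
# Hypothetical sketch of the "trigger point" check described above.
# Field names, rating scales, and the threshold are illustrative
# assumptions, not the project's actual monitoring instrument.
from dataclasses import dataclass

@dataclass
class TeacherImplementation:
    modules_taught: int     # modules actually used this school year
    modules_expected: int   # full complement for the grade level
    consistency: float      # 0-1 rating from observation/interview data
    effectiveness: float    # 0-1 rating from observation/interview data

def at_trigger_point(t: TeacherImplementation,
                     min_rating: float = 0.75) -> bool:
    """True when implementation is comprehensive, consistent, and
    effective enough to expect measurable student impact."""
    comprehensive = t.modules_taught >= t.modules_expected
    return (comprehensive
            and t.consistency >= min_rating
            and t.effectiveness >= min_rating)

# Example: all four modules in use, but effectiveness still developing.
teacher = TeacherImplementation(4, 4, consistency=0.8, effectiveness=0.6)
print(at_trigger_point(teacher))  # False: not yet at the trigger point
```

The point of the sketch is simply that all three conditions must hold at once; comprehensive but superficial use, or skillful use of a single module, should not trip the trigger.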
Of course, we haven't been sitting on our hands for four years, waiting for the magic moment. To help the project build its case and keep stakeholders informed, we have gathered and analyzed data on intermediate steps of participants' journey toward full implementation. An oversimplified version of that journey could be outlined as follows (this should be viewed as an interacting set of elements, not a linear sequence). I've included a few examples of the types of data we have examined and how they were useful to the project.
For brevity, I mention only the participant-related aspects of the "implementation journey." There are also system-related aspects (administrative support, resources, collegial interactions, etc.) that interact with the participant aspects and for which we collect data as well.
As I mentioned above, we are at the point in the project's schedule where the first cohort of teachers has been oriented to the full complement of modules for their grade levels and has used each module with students. We are now gathering data to gauge the degree of current implementation. This includes both observation and interview data, with a rubric to assess the status of key characteristics, resulting in a classification of each school's implementation level. I'm still hesitant to tie this to student performance, but it is a big question from the local folks, one that will affect discussions about sustaining the LSC's p.d. and ongoing support functions. So the project needs us to start looking at the issue, and we'll do it as best we can. Politically, the state assessment must be featured in the data and subsequent analysis, even though we know concerns exist over the assessment's alignment and scope relative to the LSC materials. Again, we'll do the best we can to use the assessment data in ways that make sense and are defensible in terms of the LSC goals. The political aspects of how the project needs to make its sustainability case do affect what we look at and how.
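As one illustration of how such a rubric might translate ratings into a school-level classification, here is a hedged sketch. The characteristic names, the 1-to-4 scale, and the cut scores are assumptions made for the example, not the project's actual rubric.

```python
# Hypothetical sketch of rubric-based classification: each key
# characteristic is rated 1-4 from observation and interview data,
# and the average rating maps to a school implementation level.
# Names, scale, and cut scores are illustrative assumptions.

def classify_school(ratings: dict) -> str:
    """Map per-characteristic rubric ratings (1-4) to an
    implementation level for the school."""
    avg = sum(ratings.values()) / len(ratings)
    if avg >= 3.5:
        return "full implementation"
    elif avg >= 2.5:
        return "operational"
    elif avg >= 1.5:
        return "emerging"
    return "beginning"

# Example ratings for one school (characteristic names are invented).
ratings = {
    "use of kit materials": 3,
    "inquiry-oriented instruction": 2,
    "alignment with grade-level standards": 3,
    "embedded assessment": 2,
}
print(classify_school(ratings))  # -> "operational"
```

Whatever the actual cut scores, the design choice matters: a transparent, rubric-anchored classification gives local stakeholders something defensible to discuss before any linkage to assessment scores is attempted.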
With that in mind, here are some questions to think about and, I hope, respond to during our panel conversation.