
Source: The NEA Foundation for the Improvement of Education (NFIE) was created by the members of the National Education Association (NEA) in 1969 and is sustained by their continuing support. The big ideas of teachers, education support professionals, and higher education faculty and staff become reality with resources, technical assistance, and funding from the foundation. Together, The NEA Foundation and America’s educators Think Big!

Using Data about Classroom Practice and Student Work to Improve Professional Development for Educators

Two Approaches

Can all or some of these data help teachers be more effective and, if so, how? This issue brief offers a starting point for investigating the possibilities and challenges of using a variety of data to improve teachers’ knowledge and skills on the job. Broadly speaking, data may be used in two equally important ways to improve professional development:

  1. as the actual substance of professional development, as educators convene with each other to study student work and analyze the instructional practice that produced such results
  2. as a basis for making decisions about educators’ on-the-job learning, including decisions pertaining to professional development resource allocation, content, and delivery

Either way, Audrey Champagne, a scientist and educator at the State University of New York–Albany and an expert on the National Assessment of Educational Progress, offers important words of caution. “Data are collected to serve specific purposes,” she says. For example, “Using student achievement data collected for accountability purposes to plan or evaluate professional development is a highly questionable practice.” Not all data are alike; they have different purposes and potential uses and answer different questions. Any serious consideration of using data in the service of professional development must encompass the training necessary to build educators’ knowledge and capacity to use the data tools available to them. In reference to video data, for example, Ronald Gallimore suggests that media literacy becomes an essential tool in certain circumstances: “This is not watching TV. That’s one of the problems we run into. Most of us are not prepared to analyze video. So one of the initial things has to be learning how to do that.” Jeffrey Watson of the Wisconsin Center for Education Research observes that when it comes to professional development and using data, “each can be used to improve the other.”

In the sections that follow, we examine how data analysis can be the focus of teacher learning and how it can help guide sound decision-making about professional development. We also provide an overview of the capacity issues that schools and school districts must address before they can use data effectively for professional development purposes. This is followed by a list of resources and guiding questions identified by those knowledgeable of the appropriate uses of data in education.

The Substance of Teacher Learning

In 1996, following a two-year nationwide study, The NEA Foundation published a report on high-quality professional development. The report concluded that when teachers analyze and discuss instructional practice and the resulting samples of student work, they experience some of the highest caliber professional development available. Although such opportunities are still too rare in teachers’ daily working lives, it is clear that good professional development delves deeply into the data — the teaching practice and student work that together make up what Audrey Champagne calls the “artifacts” of learning.

High-Quality Professional Development . . .

  • improves student learning
  • fosters better subject-matter knowledge, greater understanding of learning, and a full appreciation of students’ needs
  • helps educators meet the needs of students who learn in different ways and come from diverse backgrounds
  • is designed and directed by teachers, and includes the best principles of adult learning
  • allows enough time for inquiry, reflection, and mentoring and is part of the normal working day
  • balances individual priorities with school and district needs, and advances the profession as a whole
  • is sustained, rigorous, and adequate to the long-term change of practice
  • makes best use of new technologies
  • is directed toward teachers’ intellectual development and leadership
  • is site based and supportive of a clear vision for student achievement

Source: The NEA Foundation

Lesson study, a process long used by Japanese educators and now gaining favor in the United States, incorporates data analysis at the heart of the professional development experience. The data in question include the lessons taught and observations of these lessons by other teachers. Teachers have the opportunity to discuss the lessons and their analyses, while refining their practice according to what the evidence suggests works well with their students. The process has the advantage of being highly contextualized and relevant to a particular teaching situation, unlike much educational research, which offers globalized findings that may or may not hold true under various circumstances.

The Council for Basic Education’s Schools Around the World and Academy for Teaching Excellence provide an especially rich approach to using data to improve instruction, with an emphasis on mathematics and science achievement in nine countries. These programs offer teachers a protocol for examining the student work from their own classrooms and for discussing whether or not the work is good enough to meet high standards. The process guides teachers on how best to intervene with those students who did not achieve. The student work — the data — provides the evidence for serious discussions about teaching and learning. Teachers also conduct much of the analysis of instruction and student work on the Internet, with reference to academic achievement standards.

International Comparisons

Schools Around the World maintains an international database of student work. This international perspective on the data can yield profound insights about the methods used by American educators, as researchers James Stigler and James Hiebert have found: “Because our models of how teaching should look are so widely shared and so familiar, they become nearly invisible. We come to believe that this is the way teaching must be. When we observe teaching in other countries, these accepted and unquestioned practices are brought to light, and we see that we teach the way we do because we choose to teach this way. This realization is crucial because it opens up new possibilities for how we might improve teaching.”

A digital video library of the type being assembled by LessonLab begins to make these observations possible. LessonLab’s mathematics and science database contains video from a thousand classrooms around the world. One veteran mathematics teacher who spent three months at the lab, for example, observed more teaching by other teachers through studying the video than he had during his entire fourteen years working in the schools. He was able to view and compare many different teachers teaching the same concept. Variations in their instructional strategies helped him to understand the strengths and weaknesses of his own instruction.

Building Professional Knowledge

Most teachers value information that is directly applicable to their day-to-day practice and linked to specific instructional objectives and learning concepts. While academic research produces general knowledge, practitioners seek just-in-time data that help them improve their work with students. Ronald Gallimore, James Hiebert, and James Stigler believe the time has come for educators to find ways of creating a “professional knowledge system.” They point out that “other professions have created ways to accumulate and share knowledge. In medicine there is a case literature; a physician can read the latest reports from other physicians who have tried and refined new ways of treating specific illnesses. Lawyers have the case law; they can follow the interpretations of laws as they evolve through court decisions.” A knowledge base for education should similarly go beyond results from standardized tests. It must leverage data that document in many forms the art and science of teaching, describe student work in all its complexity, and help practitioners understand the consequences of different instructional strategies.

“Learning through Teaching”

Michèle Foster of Claremont Graduate University has organized pedagogical laboratories known by the acronym L-TAPL (Learning through Teaching in an After-School Pedagogical Laboratory). They serve students of color in the public schools of Los Angeles and Oakland, California, with a new site scheduled to begin in Trenton, New Jersey, in the fall of 2003. In these laboratories, data about teaching and student learning are made meaningful and useful for teachers because they get a chance to observe an accomplished teacher in an actual classroom setting working with pupils like those whom they teach in their own classrooms. They can also discuss what they have seen with the master teachers and their peers. “I tried to incorporate the best principles of professional development into the pedagogical laboratories, giving teachers sustained opportunities to observe a master teacher at work with urban pupils, the chance to teach and watch their peers teach, to give feedback and receive feedback on their own practice, and to have access to researchers,” says Michèle Foster. “I wanted to provide teachers with opportunities for recursive cycles of learning through observation, practice, inquiry, and reflection.”

One of the many positive aspects of these labs is the close interaction between teachers and researchers. “Collecting data and giving it to people who look at it and say that we’re not doing our job is not helpful,” says Subira Kifano. She is a master teacher for the pedagogical laboratory in Los Angeles and has been certified by the National Board for Professional Teaching Standards. Ms. Kifano explains that researchers who study the labs’ work “talk to us about how we’ve moved the students, and how the children have been engaged in the lessons, and what’s working in terms of our instructional practice. I think that kind of data is important. The question that we’re always asking ourselves is, ‘What does good teaching look like with African-American and Latino students?’” The laboratories focus on many different types of data: observations and videotapes of teaching practice, teachers’ journals, transcriptions of teacher conversations, student work, and student attitudes toward learning, lest “we forget that their intellect is inseparably tied to their emotions,” says Ms. Kifano. It reminds us again that integrating data into professional development experiences need not be limited to conversations about standardized test scores. The efforts in Los Angeles and Oakland also suggest that teachers are perhaps more inclined to pay attention to and apply the lessons from data that derive from their own practice and their students’ work.

Challenges to Overcome

There are many familiar obstacles to incorporating video analysis, classroom observations, student work samples, and other forms of data into teachers’ professional conversations and professional learning. The most pervasive may be the lack of time in the school day and contract year. The scarcity of partnerships between school faculty and higher education faculty of the type established by Michèle Foster, which bring researchers and practitioners together in a common learning community, is another enduring concern. Other capacity issues are discussed more fully below.

Some obstacles, however, have little to do with resources and much to do with long-standing patterns in the way schools, districts, and state agencies conduct their business. “There’s a time and place for all data,” notes Edward Szopinski, the principal of a Milwaukee middle school that has experimented in the use of data in many aspects of its instructional program and operations. “One of the greatest challenges,” he says, “is to present data in a useful way at the right time for the right purpose. The publication of standardized scores and the comparison of the good to the bad performers create a mistrust of data among teachers.” Oftentimes policymakers and the education community maintain a climate in which data are collected and used primarily in ways that may result in adverse consequences for schools and individual educators (loss of funding, poor evaluations, etc.). In such cases, many begin to view data analysis with suspicion rather than as something with the potential to improve their professional knowledge and skills. Those who work with video are especially attuned to such perceptions, because a camera in the classroom can be intimidating under any circumstances, let alone situations where evaluation consistently takes precedence over professional growth. Creating a sense of confidence that data will be used fairly and appropriately is essential, both for incorporating data analysis into the heart of teachers’ learning and for making crucial decisions about professional development.

Making Sound Decisions

“Do teachers trust data?” asks Sarah Mason of the Wisconsin Center for Education Research. Her work in the state shows that “teachers have some qualms about data accuracy, data sources, and the quality of assessments and what they actually measure. They also have problems with the timeliness of data, and they know that just looking at data isn’t going to tell them how to fix what they find.” Many are also wary of a tendency to track students along a particular academic route based on their standardized test results. Such concerns reflect the complexity of using data for different ends: compliance, diagnosis, and remedy. Most often data are used to gauge compliance with various state or federal requirements, although many schools and school districts are now investigating the potential to use data for other purposes. James H. Johnson, the author of Data-Driven School Improvement, notes that “a systematic analysis of data that are already being collected can reveal previously unseen patterns and opportunities for improvement.” But Warren Simmons, executive director of the Annenberg Institute for School Reform, reminds us, “The important thing about a test is not its general validity, but its validity for a specific purpose. Tests that may be valid for holding schools accountable may not be appropriate for making high-stakes decisions” in other areas. Clearly this is a field through which one must step cautiously. Sarah Mason offers practical guidance: “Inferences and conclusions reached through data analysis should be based on multiple sources and measures and reviewed by school staff for ‘face validity’ — i.e., do the data . . . fit their own observations?” Assuming that nontrivial issues such as accuracy, validity, and reliability are resolved, data can allow educators to identify areas where student performance must improve. 
The capacity to disaggregate these data by various student characteristics, furthermore, helps to uncover the needs of particular groups of students.
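The subgroup breakdown described above can be sketched in a few lines of Python. Everything below — the record layout, the group labels, and the scores — is purely hypothetical, invented for illustration; real student data systems are far richer:

```python
from collections import defaultdict

# Hypothetical assessment records; field names and values are illustrative only.
records = [
    {"group": "ELL", "score": 62},
    {"group": "ELL", "score": 58},
    {"group": "non-ELL", "score": 78},
    {"group": "non-ELL", "score": 81},
    {"group": "non-ELL", "score": 74},
]

def disaggregate(records, key="group"):
    """Average scores for each subgroup, alongside the overall average."""
    by_group = defaultdict(list)
    for r in records:
        by_group[r[key]].append(r["score"])
    summary = {g: sum(s) / len(s) for g, s in by_group.items()}
    summary["ALL"] = sum(r["score"] for r in records) / len(records)
    return summary

print(disaggregate(records))  # the "ALL" average can mask large subgroup gaps
```

The overall average here looks unremarkable, while the subgroup averages reveal a sizable gap — which is precisely the point of disaggregation.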

From Problem to Solution

But the step from diagnosis to remedy may not be obvious. As Jeffrey Watson explains, “The problem is how do you go from year-end test scores to actually knowing what to do next week?” If data from criterion-referenced assessments suggest, for example, that few eighth graders in a district are performing to standard in algebra, should the district devote more time to mathematics, rewrite the curriculum to focus on depth rather than breadth of understanding, or help teachers change their instructional strategies in some way? Perhaps a combination of these or other interventions would be best. E.D. Hirsch, for one, believes that “educational data are difficult to apply in a dependable way because of contextual variables that change from classroom to classroom and from year to year, and that drown out the effects of single or multiple interventions.”

The solution may rest in part with a “cycle of inquiry,” a process favored by the Annenberg Institute for School Reform. (See AUSD use of Cycle of Inquiry.) Using such a process, educators can make carefully reasoned decisions about their professional development based in part on data from the context within which they work (including the face validity of those data), and then follow through with subsequent analysis to determine if their professional development produced the desired outcomes. The process is continuous and requires timely and periodic access to multiple forms of data. It applies to goals for students’ academic performance, as well as to data about school climate or parental involvement — indeed any aspect of schooling that educators hope to improve through professional development and other interventions. Warren Simmons stresses that merely having data is unhelpful without the “ongoing planning, problem solving, and inquiry that bring people together to look at the data and make decisions and take actions on that data.”
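The continuous process described above can be sketched as a simple loop. The step names paraphrase the text; the pass-rate threshold and the stand-in lambda functions are invented placeholders, not part of the Annenberg Institute’s actual process:

```python
def cycle_of_inquiry(collect_data, analyze, plan_pd, implement, max_cycles=3):
    """Repeatedly gather data, analyze it, act, and re-check the outcome."""
    history = []
    for cycle in range(max_cycles):
        data = collect_data()          # multiple sources and measures
        findings = analyze(data)       # includes staff "face validity" review
        plan = plan_pd(findings)       # choose a professional development response
        implement(plan)
        history.append((cycle, findings))
        if findings.get("goal_met"):   # desired outcome reached? stop early
            break
    return history

# Toy run with stand-in steps (placeholders for real district processes):
history = cycle_of_inquiry(
    collect_data=lambda: {"pass_rate": 0.55},
    analyze=lambda data: {"goal_met": data["pass_rate"] >= 0.7, **data},
    plan_pd=lambda findings: "content-focused summer institute",
    implement=lambda plan: None,
)
print(len(history))  # prints 3: the goal is never met, so every cycle runs
```

The point of the loop structure is the one Warren Simmons makes: collecting data once is not the process; revisiting the same questions with fresh data is.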

A working hypothesis is central to the process. For example, a number of school districts benchmark their mathematics and science achievement using TIMSS data, and make professional development decisions accordingly. Some look closely at strategies commonplace in the TIMSS high-achieving nations, and posit that certain professional development practices favored in those locations hold the potential to yield improvements in student performance closer to home. As described above, lesson study is one such professional development strategy. Deborah Nelson, writing for the Consortium for Policy Research in Education, explains further that “based on data demonstrating the relative weakness of teachers’ content knowledge in the United States compared to the content knowledge of teachers in high-achieving nations, [the benchmarking districts] are increasing the number of content-related professional development opportunities for teachers. These can take the form of summer institutes, graduate courses, or school-based support from content specialists. Other TIMSS-related professional development offerings address pedagogical strategies related to teaching for understanding and inquiry-based learning.”

Decision-making of this sort is complex. How, for example, does one account for significant contextual differences between schools and classrooms in Singapore and those in Ohio? Will such differences unravel the benefits of a professional development intervention? How do we know if an intervention is or is not working? A cycle of inquiry helps because it constantly revisits key questions and attempts to “make meaning of the data.” But this requires a deep and abiding commitment by districts and state education agencies. As a rule, the “more inventive data sources require greater efforts to collect and interpret them.” Sending test results back to schools once a year is wholly insufficient, as are any number of patchwork measures that alone fall short of serious changes to the systems in place to make professional development and instructional decisions at every level, from the school to the statehouse. Capacity — technological, organizational, and human — to deal with such complexity quickly emerges as a major issue for those seeking to use data for professional development purposes, or any aspect of educational improvement.

Creating the Capacity to Use Data

Recommending one or more of the many data-management products currently available to the education market remains outside the scope of this issue brief. Adopting data-management systems, whether home-grown or proprietary, forms only one part of a larger strategy to establish the capacity to use data both as the substance of teacher learning and as a tool for professional development decision-making.

Building such capacity includes but extends well beyond the technology of data management. Jeffrey Watson explains that “using data is iterative,” meaning that educators must have the time and tools to cycle through steps involving inquiry, data acquisition, management, analysis, application, and evaluation. Capacity issues arise at each step of the cycle; human resources expertise, for example, is required at every point. Careful strategic planning at both the school and district levels is essential, especially when managing data over multiple years and across multiple grades, according to Mr. Watson. He has identified a short list of the technical and organizational capacities that educators most often find they need as they begin to use data for school improvement and professional development purposes.

Using data well requires technical and organizational capacities:

Technical:
  • Information Systems/Technology Expertise
  • Database/Spreadsheet Knowledge
  • Analysis Skills
  • Graphing and Reporting Capability
  • Assessment Expertise
  • Evaluation Expertise
  • Problem Solving Skills

Organizational:
  • Strategic Planning Processes
  • Knowledge of Outside Resources
  • Staff Retention Systems
  • Effective Communications
  • Staff Support
  • Team Building Opportunities
  • Knowledge of Group Dynamics

Source: Jeffrey Watson, Wisconsin Center for Education Research

What happens to the data if key staff leave? It’s an important question to answer during a strategic planning process, because staff turnover is inevitable, and it speaks to the ability of the school or district to maintain the integrity of the data over time, and therefore their usefulness. To accommodate any number of possible changes, leadership and expertise in the areas identified by Jeffrey Watson must be distributed and sustained throughout the education system, from the budget office to the classroom. Many of these leadership roles are best filled by teachers. The pedagogical laboratories established by Michèle Foster rely on exemplary teachers who have a long and successful record of teaching urban students and who serve as master teachers. According to Dr. Foster, “The labs are organized to capitalize on the expertise of teachers and create new leadership roles for them.” Teachers’ union representatives are well positioned to collaborate with district administrators to define such roles and the skills and knowledge necessary to fulfill them.

Strategic planning should also address the ethics of using data. According to Sarah Mason, “to say that a school makes appropriate and ethical use of data implies that the school has taken the necessary precautions to ensure that data are accurate, valid, and reliable and that the analytical process is complete, equitable, and fair. . . . Schools need to become aware of these standards. . . . ” Ensuring that data are used ethically and fairly can also improve educators’ trust and willingness to integrate data analysis into their work. Videotape of a beginning teacher’s classroom instruction collected for professional development purposes, for example, should as a matter of policy have no part in a formal performance evaluation. Such policy matters are best examined and defined by teachers’ elected representatives working in partnership with school administrators.

Trust, sound policy, and fair and equitable use of data all help in “cultivating a desire to transform data into knowledge,” to use Sarah Mason’s phrase. Milwaukee middle school principal Edward Szopinski believes that these factors help to create a working culture that “values the measurement of school performance.” He feels strongly that school faculties working collaboratively should have the power to make decisions based on the analysis of the data available to them, and that data should be employed “as much to ask questions as to answer them.” School culture and collegiality, he says, have as important a role to play in using data for professional development purposes as the most sophisticated information management systems. The district, too, must look closely at how it supports teachers and schools as they use data in the service of professional learning. As Warren Simmons explains, “After you do the analysis, you have to have a professional development system that builds capacity in a differentiated way. You’re going to have low-capacity schools, moderate-capacity schools, and high-capacity schools. How does the district’s professional development system serve those schools not only in a differentiated manner, but also in a way that’s equitable and leads them all to reaching high standards?”

Conclusion

Like most changes in education, using data in the service of professional development requires system-wide thinking and realignment. A decade of professional development work by The NEA Foundation strongly suggests that education staff must play leadership roles in such work if it is to reach its full potential. The many complex issues surrounding the use of data in education are best examined collaboratively by practitioners, school administrators, union representatives, higher education faculty, and state agencies. Professional development improves when such collaboration forms an integral part of the education system. It is a sine qua non of using data in professional learning.

Resources:

Incorporating Data into Professional Learning:

  • “Candid Camera.” May 1999. Steve Olson. Teacher Magazine. www.teachermagazine.org/tm_printstory.cfm?slug=08candid.h10
  • “A Knowledge Base for the Teaching Profession: What Would it Look Like and How Can We Get One?” June/July 2002. James Hiebert, Ronald Gallimore, and James W. Stigler. Educational Researcher. www.aera.net/pubs/er/pdf/vol31_05/AERA310502.pdf
  • “Understanding and Improving Classroom Mathematics Instruction: An Overview of the TIMSS Video Study.” September 1997. James W. Stigler and James Hiebert. Phi Delta Kappan. www.pdkintl.org/kappan/kstg9709.htm
  • Learning through Teaching in an After-School Pedagogical Laboratory: www.theeditors.com/review/ltapl/
  • Lesson Study Research Group: www.teacherscollege.edu/lessonstudy/
  • LessonLab: Teachers Making Their Work Visible: www.lessonlab.com/software/demo.asp
  • Looking at Student Work: www.lasw.org
  • Schools Around the World: www.s-a-w.org
  • TIMSS USA: nces.ed.gov/timss/index.asp
  • U.S. National Research Center for the Third International Mathematics and Science Study (includes the International Curriculum Analysis Database): ustimss.msu.edu

Using Data to Guide Professional Development Decisions:

  • Data-Driven School Improvement. 1997. James H. Johnson. ERIC Clearinghouse on Educational Management. www.ed.gov/databases/ERIC_Digests/ed401595.html
  • From Research to Practice and Back Again: TIMSS as a Tool for Educational Improvement. 2000. Marlies A. Dunson. Consortium for Policy Research in Education. www.cpre.org/Publications/rb30.pdf
  • “Schools Discovering Riches in Data.” June 12, 2002. Lynn Olson. Education Week. www.edweek.org/ew/ew_printstory.cfm?slug=40data.h21
  • Strengthening California’s Teacher Information System. 2002. Camille E. Esch, et al. The Center for the Future of Teaching and Learning.
  • Turning Data into Knowledge: Lessons from Six Milwaukee Public Schools. April 2002. Sarah Mason. Wisconsin Center for Education Research. www.wcer.wisc.edu/publications/workingpaper/abstract/Working_Paper_No_2002_3.asp
  • Using Data for School Improvement. 1998. Lorraine Keeney. Annenberg Institute for School Reform. www.annenberginstitute.org/images/using_data4.pdf
  • Using Data to Improve Teacher Induction Programs. 2002. The NEA Foundation. www.nfie.org/publications/inductionib.pdf
  • Using TIMSS to Inform Policy and Practice at the Local Level. 2002. Deborah I. Nelson. Consortium for Policy Research in Education. www.cpre.org/Publications/rb36.pdf
  • Data available through state education agencies: www.nfie.org/usingdata.htm

Guiding Questions

Incorporating Data into Professional Learning

What types of data can practitioners use to reflect on and make changes to their teaching?

  • Samples of student work?
  • Lesson plans?
  • Classroom observations?
  • Video of instructional practice?
  • International, national, and state databases?
  • School, district, and union survey data?
  • Other data?

What supports do practitioners need to incorporate data analysis into their ongoing professional learning?

  • Time to observe colleagues and collaborate with higher education faculty?
  • Protocols for analyzing samples of student work and/or video of instructional practice?
  • School-based instructional coaches knowledgeable in the subject areas, the local context, and the analysis and application of data?
  • Ongoing consultations with educational researchers?

What can partnering organizations (including higher education institutions) do to help practitioners build and learn from a professional knowledge base?

  • Provide assistance with videotaping, archiving, and making readily accessible examples of instructional practice?
  • Link samples of student work and samples of instructional practice to subject-matter standards?
  • Provide mechanisms to help practitioners compare and contrast instructional strategies?

How can union representatives and administrators establish a school culture that emphasizes and values the use of data for professional development purposes?

  • Create incentives to engage in lesson study with colleagues?
  • Provide risk-free opportunities to experiment with data collection, analysis, and application?
  • Establish a climate in which data are used fairly and ethically?
  • Link data analysis and professional development to school improvement plans?

Using Data to Guide Professional Development Decisions

What types of data can we use to guide decisions about professional development?

  • Aggregated and disaggregated standardized test score data?
  • Data from more proximal student assessments?
  • Data about teachers and teaching practice?
  • Practitioner opinion surveys?
  • International comparisons?
  • Professional development needs assessment data?

Do practitioners, school administrators, and others have ready access to the right kinds and amounts of data to guide their professional development decisions? Are data reported to decision-makers in a meaningful way and useful format?

Does the district/school have an effective, ongoing cycle of inquiry to ensure that data inform professional development decision-making in appropriate ways?

Do our data suggest a causal relationship between a particular approach to professional development and student achievement? Do we have evidence that this causal relationship would persist in different school and classroom contexts?

Creating the Capacity to Use Data for Professional Development Purposes

What technical and organizational capacity and information infrastructure do our district and schools need to use data for professional development purposes?

  • What process and systems do we have for collecting, managing, and preserving data? Who is responsible for this?
  • Have we taken steps to ensure that the systems and infrastructure are established in consultation with the end-users?
  • Are data made available in a timely way and in an accessible format to practitioners, school administrators, and other decision-makers?
  • How do we ensure that data are well maintained in the event of turnover among key staff?
  • Who and/or what process ensures that the data are accurate, valid, and reliable?

Does the district’s professional development system provide school faculties with the skills to use data effectively?

What are the costs (including practitioner time) associated with collecting, managing, preserving, and using data for professional development purposes? Have we made adequate provision in our budgets?

Copyright © 2003 by The NEA Foundation for the Improvement of Education, Washington, DC. This paper is available at www.nfie.org, along with additional information about the foundation’s professional development work. Special thanks to Erma Anderson, Audrey Champagne, Michèle Foster, Ronald Gallimore, Subira Kifano, Sarah Mason, Sandra Menzer, Elizabeth Powers, Warren Simmons, Robert Spielvogel, Carol Stoel, Edward Szopinski, and Jeffrey Watson for helping The NEA Foundation investigate the issues discussed in this issue brief. Financial assistance for this work was provided by the National Education Association, The Ford Foundation, and The Pew Charitable Trusts.
