In an earlier blog post, I defined confirmation bias and described how coaching and video recording can support a teacher's reflective decision making in a way that avoids seeking only data that reinforces existing beliefs while dismissing data that challenges an existing opinion or practice.
As I worked with that material, I realized that the same problem of confirmation bias exists in the work of PLCs as teachers examine a data wall, current student tests, or student work such as writing samples.
My discussions with teachers across grade levels and content areas suggest that too often data conversations do not lead to productive actions, leaving teachers feeling that the time invested had no payoff in increased student learning.
I wonder if confirmation bias might be part of the problem. I believe it occurs when teachers try to explain why the results are as they are rather than focusing on what student behaviors would produce an increase in learning and what teacher behaviors are likely to elicit those student behaviors.
Larry Cuban’s report identifies a Use of Data Cycle for moving from data to action.
I believe that whether the setting is a single teacher with a coach or administrator, a PLC team or department, or an entire school staff, the questions raised by the facilitator when reviewing student data are of utmost importance. Those questions can reduce the impact of confirmation bias, create a no-blame environment, and produce a culture of continuous improvement.
I’ve recently suggested that coaches refer to the Questions Wall rather than the Data Wall. The purpose of the data is to generate questions. As we study student work or scores, the questions that emerge are what will most likely generate our future learning. The purpose of the facilitator’s questions is to generate participant questions that will direct the future learning for teachers and then students.
Here are some sample facilitator questions:
What patterns emerge in your initial look at the data?
Where does the data surprise you, with either high or low student performance?
How comfortable are you that this data correctly reflects the level of student learning? Why or why not? If not, how might we collect additional data to decide?
If there is a level of comfort that the data reflects student learning, the questions continue.
Where do you see indicators of student learning that you would like to focus on improving?
What do we know about those students’ past and current learning behaviors?
What behaviors/actions do those students need to engage in to create the desired learning outcomes?
What teacher behaviors on our part could initiate, motivate, and support those student behaviors?
Now we initiate the teacher behaviors and begin to look for the student behaviors we desire. If they appear, we continue and then assess whether those student behaviors generate the student learning that was our goal. If the student behaviors do not appear after a time, we examine how we might change the teacher behaviors to elicit them from students.
I was working with a high school science department looking at goal setting for next year’s student achievement. The chemistry team was exploring students’ end-of-year biology scores and, in conversation with the biology teachers, setting goals for next year’s end-of-year chemistry results. They identified students who scored 85% plus on a final biology assessment, as well as some who scored 70-85% who they felt should have goals of 85% plus on the chemistry final.
When we explored what these students would need to do to reach this goal, one item that emerged was that some of the students would need to build math skills to succeed in chemistry. The team decided to request the students’ math backgrounds to assess which students likely needed this extra support, rather than waiting for problems to emerge later.
One initial teacher behavior option explored was the creation of a tutoring group called “math to succeed in chemistry,” perhaps co-taught by a math and science teacher. Students and their parents might be encouraged to begin the year taking part in this extra student behavior to build a successful chemistry outcome.
I continue to be encouraged that when teachers do goal setting for learner outcomes and identify the needed student behaviors for success, strategies for “teaching” quickly emerge.
May 18th, 2014 at 8:30 am
I loved the questions you have posed for teachers to ponder. So often, they get stuck on simply analyzing the data without interpreting what the data really means and considering the possible actions they can take.
Today I read an article in the NY Times entitled, “Who Gets to Graduate?”
http://www.nytimes.com/2014/05/18/magazine/who-gets-to-graduate.html?emc=edit_th_20140518&nl=todaysheadlines&nlid=34661376&_r=0
In light of the research presented in the article, I want to propose some additional questions.
What impact does the article have on deciding how you might shift students’ mindsets so they can succeed?
What data would you need to collect in order to do this?
What strategic supports would you need to put this into practice (i.e., resources, colleagues, etc.)?
May 20th, 2014 at 4:15 am
Roni….
Thanks…. great article that I will share. College ready and career ready mean a lot more than math and reading scores.