Wednesday, April 29, 2015
The Assessment Range: Using Data To Meaningfully Affect Learning
The Assessment Range: Using Data In The Classroom To Meaningfully Affect Learning

by Terry Heick

If you do not already have a plan for the data before giving the assessment, you're already behind.

Among the challenges of assessment, this concept (as it applies to formal academic classrooms designed to promote mastery of Common Core standards or similar) is near the top. Without a direct input into an instructional design guide embedded within a dynamic curriculum map, an assessment is just a hurdle for the student: one they might clear, or one that might trip them up. And let's talk about how much we, as teachers, like to jump hurdles for others.

This is the third time in as many weeks that I've written about assessment, which usually means something is bothering me that I can't quite figure out. In Evolving How We Plan, I pointed contentiously at the "unit" and "lesson" as impediments to personalized learning. Simply put, most planning templates in most schools used by most teachers on most days do not allow the data to be easily absorbed. They're not designed for students; they're designed for curriculum. Their audience is not students or communities, but rather administrators and colleagues. These are industrial documents.

Depending on what grade level and content area you teach, and how your curriculum is packaged, what you should and are reasonably able to do with the data might differ. But put roughly, teachers administer quizzes and exams, and do their best to "re-teach." Even in high-functioning professional learning communities, teachers are behind before they give their first test. Their teaching just isn't ready for the data.

What Should Assessments "Do"?

In The Most Important Question Every Assessment Should Answer, I outlined one of the biggest of the many big ideas that revolve around tests, quizzes, and other snapshots of understanding: information.
In short (depending on the assessment form, purpose, context, type, etc.), the primary function of assessment in a dynamic learning environment is to provide the data to revise planned instruction. It tells you where to go next, like a bat's echolocation.

Unfortunately, assessments aren't always used this way, even when they're designed to be. Instead, they're high drama that students "pass" or "fail." They're matters for professional learning communities and artifacts for "data teams." They're designed to function, but instead they just parade about and make a spectacle of themselves. Within PLCs and data teams, the goal is to establish a standardized process to incrementally improve teaching and learning, but the minutiae and processes within these tools can center themselves over the job they're supposed to be doing. We learn to "get good" at PLCs and data teams the same way students "get good" at taking tests. Which is crazy and backwards, and no wonder education resists innovation.

To teach a student, you have to know what they do and do not know. What they can or cannot do. "They" doesn't refer to the class, either, but to the student. That student: what do they seem to know? How did you measure it, and how much do you trust that measurement? This is fundamental, and in an academic institution, more or less "true."

Yet "the 'constructivist' paradigm ... is not compatible with the 'conventional' paradigm of external examinations" (Galbraith, 1993). Constructivism, depending as it does on the learner's own knowledge creation over time through reflection and iteration, seems to resist modern assessment forms that seek to pop in, take a snapshot, and pop back out. These snapshots are taken with no "frames" waiting for them within the lesson or unit. They're just grades and measurements, with little hope of substantively changing how and when students learn what.
Teachers As Learning Designers

There is the matter of teaching practice working behind the scenes here: what teachers believe, and how those beliefs inform their practice, including the design of assessments and the management of data.

In 2003, in Classroom Assessment Practices and Teachers' Perceived Self-Assessment Skills, Zhicheng Zhang and Judith A. Burry-Stock separated "assessment practices and assessment skills," explaining that they "are related but different constructs. Whereas the former pertains to assessment activities, the latter reflects an individual's perception of his or her skill level in conducting those activities. This may explain why teachers rated their assessment skills as good even though they were found inadequately prepared to conduct classroom assessment in several areas."

Assessment design cannot exist independently from instructional design or curriculum design. A "fantastic test" is as useful as a "brilliant telescope." Applied how?

In The Inconvenient Truths of Assessment, I said that "It's an extraordinary amount of work to design precise and personalized assessments that illuminate pathways forward for individual students - likely too much for one teacher to do so consistently for every student." This is a challenge not because personalizing learning is hard in itself, but because personalizing learning is hard when you use traditional units (e.g., genre-based units in English-Language Arts) and basic learning models (e.g., direct instruction, basic grouping, maybe some tiering, etc.).

Change the tools, and you can change the machine; change the machine, and you can change the tools. The question can then be asked: How can we design learning along chronological (time) and conceptual (content) boundaries so that the learning requires the data to create itself? Adaptive learning algorithms within certain #edtech products are coded along these lines.
So how do we do this face-to-face, nose-to-book, pen-to-hand? If we insist on using data-based and research-grounded ed reform models, this is crucial, no?

Backwards Planning Of A Different Kind

Assessments are data creation tools. Why collect the data if it's not going to be used? This is all very simple: Do not give an assessment unless the data is actually going to change the future learning for *that* student.

Think about what an assessment can do. Give the student a chance to show what they know. Act as a microscope for you to examine what they seem to understand. Make the student feel good or bad. Motivate or demotivate the student. De-authenticate an otherwise authentic learning experience.

Think about what you can do with assessment data as a teacher. Report it to others. Assign an arbitrary alphanumeric symbol in hopes that it symbolizes-student-achievement-but-can-we-really-agree-what-that-means-anyway? Spin it to colleagues or parents or students. Overreact to it. Misunderstand it. Ignore it. Use it to make yourself feel good or bad about your own teaching: like you're "holding students accountable" with the "bar high," or like no matter what you do, it's still not enough.

Grant Wiggins (whose work I often gush over) and colleague Jay McTighe are known for their Understanding by Design template, a model that depends on the idea of backwards design. That is, when we design learning, we begin with the end in mind. These "ends" are usually matters of understanding: I want students to know this, be able to write or solve this, etc.

What if, however, we designed backwards from the data points? Here, the data wouldn't necessarily be the "end" (gross), but somewhere closer to the middle, serving more noble causes. And around this middle, we'd build in mechanisms to accept and react to that information. We'd have a system that expected a certain amount of "proficiency" and "non-proficiency."
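To make the idea of "expecting" both proficiency and non-proficiency concrete, here is a minimal, purely hypothetical sketch of what that pre-planned routing might look like if coded, the way some adaptive #edtech products do it. The function name, the 0.80 cutoff, and the standard labels are my illustrative assumptions, not anything from a real product or from this article:

```python
# Hypothetical sketch: planning "backwards from the data points."
# The threshold and names below are illustrative assumptions only.

PROFICIENT = 0.80  # assumed mastery cutoff for a single standard


def next_step(scores: dict[str, float]) -> dict[str, list[str]]:
    """Route each assessed standard for one student into a planned response.

    The key idea: the plan for the data exists *before* the assessment
    is given, so every possible outcome already maps to a next move.
    """
    plan = {"extend": [], "reteach": []}
    for standard, score in scores.items():
        if score >= PROFICIENT:
            plan["extend"].append(standard)   # expected proficiency
        else:
            plan["reteach"].append(standard)  # expected non-proficiency
    return plan


# One student's snapshot from a small series of assessments
result = next_step({"RL.9-10.1": 0.9, "RL.9-10.2": 0.55})
print(result)  # {'extend': ['RL.9-10.1'], 'reteach': ['RL.9-10.2']}
```

The point of the sketch is only that the two branches are written before any student sits the assessment; the data fills in a plan that already exists, rather than triggering an improvised reaction afterward.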
Two weeks into the "unit" (if we insist on using units), we're waiting on crucial data from a small series of diverse assessments (non-threatening assessments, maybe?) so that we know what to do and where to go next. We already have a plan for it before we even start. We're ready to use the data to substantively, elegantly, and humanely revise what we had planned. We cannot move on without the data, or else we're just being ridiculous.

Otherwise, we keep the conveyors running while the bottles crash off the belts all around us.

adapted image attribution: flickr user vancouverfilmschool