Mobile technologies are an integral part of our daily lives. Where is the closest gas station? Ask Siri. Which toaster is best for my needs? Check customer reviews on Amazon.com. Going out to dinner with friends? Ask Yelp for a good restaurant within five miles of your house, make reservations on OpenTable, and forward the reservation to your friends, complete with driving directions. Mobile technologies have made our lives easier and are transforming the way we work and get things done. It isn't about the devices themselves, but about what they allow us to do. How can we translate this savvy use of technology into classroom learning experiences?
Too often, there is little similarity between the very connected world we live in and the world inside of a school. What does it look like when a school’s technological experience does mirror the real world?
In Sue Scott’s 8th grade language arts class at Preston Middle School in Fort Collins, Colorado, students work and create in ways that would be unlikely in a non-technology-infused environment. It is important to point out that students’ projects are not about technology; specific learning goals are accomplished around writing, researching, communicating, collaborating, and presenting. Technology is merely the enabling factor.
Recently, Mrs. Scott worked collaboratively with the district’s instructional technology trainer, Rhonda Summerlin, to develop what they dubbed the “Passion Project,” in which students focused on a topic or idea they were passionate about. Layers of technology were embedded in the project, but only as a means toward the end.
First, students brainstormed using Google Docs, reflecting on things they really wanted to learn, research, create, or discover. Students individually narrowed the topics down to three top choices and completed a free write on individual goals and how they would get there, any help they would need and how they would work on it weekly, what might be easy and what might be difficult about the project, and what would need to be done in class versus outside of school. They shared their writing with their teacher via their class assignment folder and added it to their digital portfolio.
Students used Blogger to post weekly about their chosen projects, including their successes, setbacks, valuable research, etc. They also enriched their blog posts with pictures, videos, and other media. In addition, they used Diigo to help them conduct their research, highlighting and annotating relevant articles in the cloud. Diigo also allowed them to create collaborative groups and share their research with their teachers.
Students reviewed goals every three weeks, looking at what they had accomplished and what they needed to do before the next checkpoint. They used a rubric in their class assignment folder for the checkpoints and conferenced with Mrs. Scott and Ms. Summerlin to show evidence of goal completion.
Each student created a two-minute digital presentation of their work to pitch their ideas to their peers, parents, and teachers. They created these presentations with no coaching, in order to measure growth in presentation skills as compared with their final presentations.
Using Blendspace, the students created digital portfolios of their passion projects, which included brainstorming and goal setting documents, valuable research, pitch time presentations, a link to their blogs, and pictures and videos of them in pursuit of their passion.
In preparation for final presentations, students watched exemplar presentations by kids their age, such as TED Talks and Ignite-style presentations. They used TodaysMeet to record the pros and cons of the presentations as well as other observations. They then collaborated on Google Docs as to what their final presentations should include and created rough outlines. The outlines became the bases for the presentation scripts, which they also created in Google Docs and shared with peers, who used the commenting feature to give feedback. They then revised their presentations.
The students developed rubrics for their final presentations via Google Docs with guidance from their teachers, using the rubrics to create their final digital presentations and practicing in rotations with different peer groups for feedback.
In a culminating event, Preston Middle School invited the community to hear the students' presentations in a TED Talk/Ignite-style event. Students had 3‒5 minutes to talk about their passion projects with their digital presentations playing in the background. Later, students reflected on the experience on their blogs.
This is just one example, among many, in which technology enhanced students' experiences of learning, creating, and producing in the classroom. The Framework for 21st Century Learning focuses on having students create, innovate, problem-solve, think critically, communicate, and collaborate. When we leverage the power of technology, as Mrs. Scott and Ms. Summerlin did, we are preparing our kids well.
A former elementary and middle school principal, Dr. Howard Pitler is McREL's executive director of digital solutions. He is co-author of the second editions of Using Technology with Classroom Instruction That Works and Classroom Instruction That Works, and was the lead developer of McREL's Power Walkthrough® classroom observation software. He can be reached at firstname.lastname@example.org or followed on Twitter at @hpitler.
For most occupations, routine continuing education is necessary to keep current with new and changing policies, procedures, and technologies and is critical to job expertise and career advancement. Why is it, then, that educators too often view professional development (PD) opportunities with a touch of dread and angst? From our conversations with teachers, we know these feelings are often rooted in concern about the relevancy of the PD and a lack of time to apply what’s been learned. In this post we offer a few suggestions for PD planners to address those challenges, based on our experiences working in schools and districts.
As mentioned in a previous blog post, PD sessions ideally should be spread across the entire school year, giving teachers time to absorb the material and integrate what they are learning into their classrooms in a more meaningful way.
We also know that developing positive and open classroom environments helps students learn. Setting these same standards for teachers’ PD encourages a more meaningful and engaging adult learning experience, helping teachers feel more comfortable with sharing ideas, less anxious when struggling with new concepts, and at ease in acknowledging what they don’t know.
Building relationships and developing trust between the PD leader, teachers, and administrators is key, especially when multiple PD sessions will occur throughout the year. Begin developing those relationships prior to the first session by planning one-on-one meetings with each key stakeholder in the building or by scheduling an all-staff meeting. Encourage building administrators to attend the PD sessions to demonstrate that the learning is a priority and that teachers will be supported as they learn and apply their new skills in the classroom.
Consider these strategies for setting a positive tone in PD sessions:
Elicit teachers’ strengths and expertise by asking what they already know about the subject.
Encourage teachers to consider new ideas on the subject. Very few people know everything there is to know about a subject; keeping an open mind is essential to trying out new ideas and strategies.
Ask teachers what is important to them as learners and as program participants.
Ask teachers what concerns they have about the PD program. Even if some factors are outside a PD leader's control, at least teachers will know their concerns have been acknowledged.
We used these key strategies when piloting and field testing McREL's mathematics formative assessment program, the Assessment Work Sample Method (AWSM). Middle school teachers attended twelve 45-minute sessions embedded in the school day throughout the year. Over the course of the AWSM pilot, PD leaders developed a sense of trust and openness with teachers by taking time to build relationships and inviting administrators to attend sessions. Teachers shared their thoughts and concerns, related what they knew about formative assessment, shared strategies they used in their classrooms, and considered new ideas about formative assessment.
Developing a positive learning environment does not happen in one session—making time for these conversations throughout the year will further strengthen the trust and openness that are key to an engaging and meaningful PD experience for teachers.
Sarah Gopalani and Jesse Rainey conduct quantitative and qualitative analyses in support of McREL’s research and evaluation projects.
How does student work inform instruction? I read Katrina Schwartz's MindShift blog post, "How Looking at Student Work Keeps Teachers and Kids on Track," and immediately found connections to McREL's Institute of Education Sciences (IES) study of a formative assessment model for middle school math, now completing its third year. Not only does Ms. Schwartz highlight the use of student work as a method for improving student learning and teacher practice—a cornerstone of our study—but she also relates this to mathematics.
For the IES study, McREL's math and assessment specialists developed a program that provides teachers with authentic samples of student math work and an interactive, hands-on experience to increase their use of high-quality formative assessment practices. The program, called AWSM (shorthand for Learning to Use Formative Assessment with the Assessment Work Sample Method), has a goal of increasing middle school students' math achievement. (Read our AWSM success story.)
AWSM differs from other formative assessment professional development in that it:
focuses specifically on middle school mathematics,
features supportive peer review of colleagues' assessment practices,
centers on the use of student work samples,
is job-embedded, and
fosters teacher collaboration.
Teacher data from our pilot tests are quite promising, and many of the points that Ms. Schwartz identified in her blog post were evident in our study. For example, Ms. Schwartz noted that often students don't have a "clear sense of what a great project would look like." AWSM is designed to help teachers articulate a clear learning goal, select or develop a mathematically rich aligned task, and communicate the criteria for successful task completion to students. With clear success criteria in place, students can track their own progress, provide descriptive feedback to peers, and make adjustments in their own learning.
Interestingly, resisting the pressure to grade every assignment is a challenge for teachers in the AWSM program. AWSM promotes using descriptive feedback, rather than grades, when students are learning a new concept or skill, and, for some teachers, this conflicts with current practice and beliefs. When students receive a letter grade or number score on an assignment, they tend to pay little attention to the descriptive feedback offered by teachers, often missing important recommendations that could help them improve their work. Ms. Schwartz notes that looking at student work can bring the focus back to the learning goals, and this is echoed in our work with AWSM.
Using high-quality formative assessment practices requires many teachers to make significant shifts in their practice, as illustrated by these teacher comments gathered at the end of the pilot test:
"I used to think formative assessment was about the teacher knowing where students are in the learning process. Now I know that formative assessment must include students so that they understand how to improve their own learning."
"I used to think I had to grade everything. Now I know I can provide descriptive feedback and allow students to take action."
"It's the dimensions of clear learning goals and success criteria that have most impacted my instruction. I think I was always clear about what was being learned, but I needed to be more explicit about sharing this information with my students."
The AWSM team is disseminating study findings at conferences across the U.S. in 2014 and 2015. If you'd like more information about the program, please feel free to contact me directly at email@example.com.
Kathleen Dempsey has more than 30 years of experience as a teacher and administrator. She provides technical assistance and professional development to K‒12 mathematics educators and state education agencies, and she serves as the director of the North Central Comprehensive Center (NCCC), administered by McREL.
Out of curiosity, I recently asked 60 teachers attending a conference session on formative assessment to explain the difference between “summative” and “formative” assessment. To my surprise, the first volunteer described formative assessment as “the formal assessments we give kids to find out what they really know.” Other participants' responses varied, from descriptions of in-class observations to a general belief that any assignment a teacher uses to measure progress is a formative assessment, including online tests administered quarterly by the school district to gather program data.
When asking the question, I had mistakenly believed that most participants would easily describe the two as processes that provide assessment of learning (summative assessment) and assessment for learning (formative assessment).
In a 2007 article in Phi Delta Kappan magazine, Margaret Heritage, assistant director for professional development at the National Center for Research on Evaluation, Standards and Student Testing (CRESST) at UCLA, defines formative assessment neither as a test nor as a high-stakes standardized assessment, but as a process of feedback in which a teacher learns about a student’s current level of understanding to determine the next learning steps for that student. Sounds simple enough. Yet, as the teachers’ responses showed, there is still a lot of variation in how teachers define the two types of assessments.
So, why is the concept of formative assessment still so confusing? Part of the fault lies with educational jargon. Educators tend to use the term “formative assessment” to describe a whole host of opportunities to gather evidence of student learning.
In a 2005 book, Preparing Teachers for a Changing World, Lorrie Shepard, et al. tackled this confusing terminology, defining formative assessment as “assessment carried out during the instructional process for the purpose of improving teaching or learning.” What differentiates formative assessment from other classroom-based assessments (such as interim and benchmark assessments) is, first, that the evidence of student learning is not graded and, second, that the information is used immediately to inform instruction. Feedback is a critical part of the process.
My colleagues at McREL and I recently piloted a new mathematics formative assessment program, the Assessment Work Sample Method (AWSM), in a Colorado Springs school district, providing professional development for middle school math teachers to help them learn how to implement classroom formative assessment using authentic student work samples. (Read our Success Story here.) One teacher in the pilot said, “AWSM has helped me realize that differentiation is crucial. It helps me look for the outliers in my classes—the ones who are overachieving and the ones that are falling behind.”
When implemented effectively, formative assessment provides ongoing feedback to students about where they are relative to their goals, it equips them with resources and suggestions for further exploration, and it encourages questions that propel the learning process. Formative assessment matters because it has been shown to help students learn. Not only does it help with cognitive processes, but it also fully engages students with their learning.
I recently read a blog post on developing innovation by George Couros, a principal with the Parkland School Division in Stony Plain, Alberta, Canada. I'm a regular reader of Mr. Couros' blog, "The Principal of Change," but this one struck a particular chord with me.
In his blog post, Couros refers to Carol Dweck's work on "fixed" versus "growth" mindsets. In an interview with the OneDublin.org education blog's founder, Dr. Dweck differentiates between the two mindsets, explaining:
"In a fixed mindset students believe their basic abilities, their intelligence, their talents, are just fixed traits. They have a certain amount and that’s that, and then their goal becomes to look smart all the time and never look dumb. In a growth mindset students understand that their talents and abilities can be developed through effort, good teaching and persistence. They don’t necessarily think everyone’s the same or anyone can be Einstein, but they believe everyone can get smarter if they work at it."