Monday, November 24, 2014

Standards Based Quiz Spotlight: the Anatomy of Mastery Learning Cycles

Roughly speaking, there is one quiz per "I can" statement. Students receive permission to take the quiz once they have completed a one-on-one conversation with me about the "I can" statement, called the "Hot Seat." Each quiz typically consists of five questions designed at the application level of Bloom's Taxonomy. Most of the questions are problems that assume an understanding of the concepts, integrate the vocabulary, and resemble problem set questions. 

The quizzes are online and hosted on Moodle. Moodle is a powerful LMS because it supports randomized questions pulled from a question bank. This matters in an asynchronous class because the questions differ on each attempt, whether for a new student or for the same student on a retake. The reshuffling allows for retakes and preserves academic integrity whether a student is the first or the last to take a quiz on a given topic. 
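The idea behind a randomized question bank can be sketched in a few lines of Python. This is only an illustrative model of the behavior, not Moodle's actual implementation; the bank contents and function name are made up:

```python
import random

# Hypothetical question bank for one "I can" statement. In Moodle this
# would be a question category, not a Python list.
QUESTION_BANK = [
    "question 1", "question 2", "question 3", "question 4",
    "question 5", "question 6", "question 7", "question 8",
]

def build_quiz(bank, n_questions=5):
    """Draw a fresh random subset of questions for each quiz attempt."""
    return random.sample(bank, n_questions)

# Each call produces a (likely) different five-question quiz, so two
# students -- or the same student on a retake -- rarely see the same set.
attempt_1 = build_quiz(QUESTION_BANK)
attempt_2 = build_quiz(QUESTION_BANK)
```

With a bank several times larger than the quiz length, repeat attempts overlap only partially, which is what makes retakes fair without writing a new quiz each time.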

Students must earn 4/5, or 80%, to pass the quiz. This threshold is consistent with the school culture and my expectations. Once they pass the quiz, they earn level three out of four, which is meeting expectations. Most of the questions are multiple choice or calculation questions. I generally avoid multiple choice, but I use it when I can include every possible answer as a choice. For example, when figuring the probability in Punnett square problems, the only possible answers are 0, 25, 50, 75, or 100 percent. I also add the choice "there's not enough information to determine." With these types of problems, multiple choice can effectively gauge understanding, assuming there have been other assessments in the learning cycle. Some "I can" statements can't be quizzed at the application level with multiple choice or calculation questions. In those cases, there are alternatives: building a model, writing a lab report, completing a lab, or a case study. 
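Why is the answer space so small? A single-gene Punnett square always has four boxes, so every genotype probability is a multiple of 25%. A short illustrative script makes the point (this is not part of my quiz system, just a demonstration):

```python
from collections import Counter
from itertools import product

def punnett_probabilities(parent1, parent2):
    """Percent chance of each offspring genotype in a single-gene cross.

    Parents are two-allele strings like "Aa". Genotypes are normalized
    with sorted() so "aA" and "Aa" count as the same genotype.
    """
    offspring = Counter(
        "".join(sorted(a1 + a2)) for a1, a2 in product(parent1, parent2)
    )
    total = sum(offspring.values())  # always 4 boxes in the square
    return {geno: 100 * count // total for geno, count in offspring.items()}

# A cross of two heterozygotes: the classic 25/50/25 split.
print(punnett_probabilities("Aa", "Aa"))  # {'AA': 25, 'Aa': 50, 'aa': 25}
```

Since every count is out of four boxes, the only percentages that can ever appear are 0, 25, 50, 75, and 100, which is exactly why multiple choice works here.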

If students fail a quiz, there are certain tasks they have to complete, which differ based on the number of times they have taken the quiz. For every failed attempt, students have to make corrections and fill out a form describing their errors.

Students also have to do the following:
  • after the first attempt: students complete any skipped problems from the learning cycle problem set. Earlier in the cycle, students solved mandatory problems and as many optional problems as they felt they needed. After failing a quiz, the optional problems become mandatory. The hope is that practicing more problems will help students review and prepare for a second attempt. 
  • after the second attempt: students complete at least one remediation activity. The remediation activities for a learning cycle may include online readings, simulations, extra problems, and/or student-made videos and problem sets. 
  • after the third attempt: students have to create their own set of problems and include solutions. In many ways, this last option is similar to the mastery projects in the next phase of the learning cycle. 
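The escalating policy above boils down to a simple cumulative lookup, which can be summarized as a sketch (the function and task names are my own shorthand here, not software I actually run in class):

```python
def retake_requirements(failed_attempts):
    """Cumulative tasks a student must finish before the next quiz attempt.

    Mirrors the retake policy: corrections after every failure, plus one
    escalating task per additional failed attempt.
    """
    if failed_attempts < 1:
        return []  # no failures yet, nothing extra required
    tasks = ["submit corrections and an error-description form"]  # every failure
    if failed_attempts >= 1:
        tasks.append("complete the optional problems from the problem set")
    if failed_attempts >= 2:
        tasks.append("complete at least one remediation activity")
    if failed_attempts >= 3:
        tasks.append("create an original problem set with solutions")
    return tasks
```

Note that the requirements accumulate rather than replace each other, so each retake costs more than the last, which is the point of the policy.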

Once students pass the standards based quiz, they are able to move on to the next learning cycle. If they wish to further explore the same topic and/or show a higher level of understanding, they can complete mastery projects before moving on to the next learning cycle. 

Sunday, November 16, 2014

Quarter One Reflections

One quarter into the school year, I have a solid grasp of the effects of the changes I've made. Here are my chief thoughts about quarter one.

Standards Based Grading
The transition to standards based grading has been mostly smooth. This year, I have a much better handle on what my students know and do not know. The SBG gradebook on Haiku is easy to use. The color codes make it easy to see which standards each student or class section is still working on, which has helped me identify which students need targeted intervention.

Standards Based Gradebook on Haiku

At first, it took students some time to understand the concept of "I can" statements and my particular system for showing learning. They seem to have figured out the system. 

The most noticeable difference is the quality of my reports. I've always struggled with writing first quarter reports because I barely feel like I know my students by that point in the school year. This time around, I had plenty to say. Rather than including general fluff, my reports focused on what my students knew and were able to do and on the ideas and skills they still found troubling. Combined with my comments on major assignments, my general impressions, and my suggestions moving forward, the reports are much more informative. 

Haiku LMS
The new learning management system is quite effective. The layout is beautiful and the interface is intuitive. I have consolidated many of my online tasks within Haiku: recording and sharing grades, assigning and collecting student work, hosting a repository of resources, and running interactive components like polls, practice quizzes, and discussions. In the past, many of these roles would have been offloaded to separate tools. I'd like to move my actual quizzes to Haiku, but it does not support randomized questions from a test bank, so I still need Moodle for that purpose. 

Haiku can be a bit buggy, though. There is a limit to how many objects can be embedded on one page, and some students complained of frustratingly long loading times. A student suggested that I make more use of subpages, so now each step of the learning cycle is housed on its own page. This has significantly improved loading speeds.

Subpages on Haiku

Asynchronous learning
As mentioned in a previous blog post, asynchronous learning continues to allow students to submit their best work and internalize a growth mindset. Most students are keeping a reasonable pace, though there are students who I believe could work faster. I've made some changes this year that will hopefully help students adjust to the responsibility of setting their own pace. The most important change, made at the request of a student, was allowing students to create their own weekly plans.

A student's weekly plan

Creating the plans takes a lot of time, so I've been encouraging students to send me their plans over the weekend, with varying degrees of success. At the very least, students are using less class time to create their plans and are getting better at working while waiting for my confirmation that their plans are satisfactory. For students who struggle with this task, I've started collaborating with them on pacing calendars that cover a few weeks, rather than letting them work alone on their weekly plans. 

Mastery projects
A handful of students have elected to complete mastery projects. In most cases, these projects have been good enough to help other students learn the content. My library of student-made teaching materials is growing, and some students have already taken advantage of it to prep for a quiz. I recently added a leaderboard to acknowledge students who have completed mastery projects, in hopes of motivating a few more.

Mastery Project Leaderboard

Quiz retakes 
This year, I have a better handle on whether students are ready to take or retake quizzes. The hot seats have been a nice addition. The only problem I've seen with them is when students opt to take the quiz a few days after completing the hot seat discussion. 

After the first batch of quizzes, I added a few layers of permissions for quiz retakes. In addition to submitting quiz corrections and explanations of their mistakes, students have to complete one more task to earn permission for a retake. Making students clear a few hurdles seems to help them take each attempt more seriously. 

The switch to inquiry based labs has proved to be most effective with asynchronous learning. Last year, I tried a combination of inquiry and full class labs. I struggled with the students who reached the labs first, and with figuring out whether later students should use last year's data. It became confusing for students to know whether they were working with this year's or last year's data sets, and it prevented me from adjusting procedures. 

For the full class synchronous labs, students working at a slower pace had to rush through content or temporarily skip steps in order to be "ready" for labs. Now that students design most of their own labs, there is no confusion about what data to use and no need to worry about skipping or rushing through steps - students do labs when they are ready.

So far, I've managed to keep up with the demand for lab materials. I place small lab kits on labeled lunch trays around the edge of the counter space. Since different students perform different labs, I only need a small amount of material available for any one lab. The trick is to have several labs prepared simultaneously and to anticipate when students will be ready for future labs. Below you can see how I organize lab materials.

DNA extraction lab materials
UV bacteria lab materials
Protein Synthesis model exploration materials

Upcoming changes
In the upcoming quarters, I'd like to incorporate some synchronous projects to help me experiment and think through PBL and 20Time in future years. I also want to offer optional content and let students who work ahead design their own parts of the course. 

Friday, November 7, 2014

The Sweet Sauce: Reflecting on "Flipped Learning: Gateway to Student Engagement"

I recently finished reading "Flipped Learning: Gateway to Student Engagement." The book chronicles the paths several teachers took from Flip Class 101 to flipped learning. Typically, teachers who go down the flipclass road start by assigning students to watch videos at home and complete traditional homework at school. Their next destination, often called the second iteration or flipped learning, can take many forms: mastery, PBL, UbD, Explore-Flip-Apply, or even my model, Mastery Learning Cycles. What distinguishes these iterations is that the focus shifts to learning and how best to facilitate it. 

The "final" destination of each teacher varies based on personality and philosophy, strengths and weaknesses, interests and preferences, as well as the culture of the school and support from administration. The most salient conclusion from this book is that there really are countless varieties of the "sweet sauce." To borrow an analogy from the last chapter, Prego became successful because, rather than trying to design the ONE spaghetti sauce preferred by most people, it departed from its competitors by creating varieties of sauces, which turned out to be best for a variety of people. Everyone has their own favorite flavor of spaghetti sauce, just as flipped teachers eventually adopt the model that works for them. 

So far, my sweet sauce seems to be a combination of Standards Based Grading and Mastery Learning Cycles. Students are assessed on how well they understand specific objectives and demonstrate important competencies, rather than merely being evaluated through points or averages. Students demonstrate mastery of these competencies by progressing through modified 5E learning cycles at their own pace. The flipped videos are used in only one phase of the learning cycle to supply content knowledge, and in other phases just to provide instructions for labs and other important procedures. This iteration will continue to evolve and hopefully approach my sweet sauce more closely. 

The other useful application of the Prego analogy relates to student choice. Just as consumers are free to choose the spaghetti sauce they prefer, many of the flipped teachers highlighted in the book eventually offered choices to their students: choice in how they acquire content (my video? another teacher's video? textbook? website? simulation?), how they process or apply content (lab? problem set? game?), and how they demonstrate learning of the content (quiz? project? one-on-one chat?). 

Flipped learning is indeed a gateway to student engagement.