Showing posts with label Standards_Based_Learning. Show all posts

Saturday, August 20, 2016

Recent Course Updates and Future Plans


It’s been a while since I wrote an article for my blog. I’m unsure whether it was due to lack of inspiration, distraction, complacency, or some combination of reasons. My class has continued to evolve, and I’ve made important changes to the course. I hope to continue improving my course and reflecting on it online.

Since the last blog post, these are the changes I’ve made to the course:
  • SBG improvements: science process standards that span units are now part of my SBG focus.
  • Flirtation with gamification: a leaderboard and other graphics showing the number of level 4s and mastery projects completed by individuals and classes.
  • More voice & choice: robust offerings of optional units and mastery projects.
  • Differentiation in content delivery: an iBook that accompanies most of the videos.
  • Lab report improvements: less focus on formal lab report writing and greater emphasis on flexibly formatted lab write-ups.
  • More flexible hot seats: students decide how to show they understand the standards rather than answering questions from me.

Upcoming this year:
  • Personalized learning continuum: as I continue to work on voice & choice and differentiation, there will be entire learning cycles that all students choose for themselves. Rather than offering this choice only to students who finish the course earlier than others, there will be two stopping points where every student selects a learning cycle from a menu of topics.
  • Claim Evidence Reasoning: as I moved away from the traditional lab report format, I was proud to see improvements in overall quality, yet many students needed more direction. I will use the technique of Argument Driven Inquiry, also known as Claim-Evidence-Reasoning, for lab assignments. As a department, we agreed to adopt Claim-Evidence-Reasoning for lab reports because it focuses students on the important elements of experiment analysis.

I am happy to report that the journey that started with my shift to flipped learning has opened avenues for the course that I would not have predicted. These changes have led to a more engaging, rigorous and authentic experience for students.

Wednesday, June 24, 2015

Looking Back on my First Year of Standards Based Grading

The move to standards based grading solved many issues from last year. Conversations shifted away from percentages and toward learning. When a student struggled, we talked about specific learning targets rather than scores or whether an assignment had been turned in. Equally significant was the shift in my role. As I wrote at the end of last year's reflection blog post, I felt like a study hall monitor, spending the bulk of my time checking off assignments. This year, I'm proud to report I spent most of my time answering questions, giving feedback and challenging students as they tried to convince me they understood learning targets.

I had a better clue about what my students knew and did not know. I was better able to communicate the progress of each student, and many students had a much clearer idea of what they needed to work on. Admittedly, a number of my students shared that they didn't pay attention to the "I can" statements. I believe a solution is to change the hot seat discussions: students will decide how they want to prove to me they understand an "I can" statement before gaining permission to take a summative quiz.

The tracking system and grade book were clear and helpful. All activities were aligned to learning targets. In previous years, students argued they could learn without completing certain assignments; I heard that criticism much less this year. Even if some students did not pay attention to the "I can" statements, they were aware of what they needed to do and why they needed to complete certain assignments.

Most important was my students' buy-in to revising and redoing assignments. Many of my students appeared to have, or to develop, a growth mindset regarding science class - or at least this particular science class. Of course, some students didn't like the quiz error forms and the other obstacles I put in place as prerequisites for redoing quizzes, but they all wanted the redos.

Given that some students paid very little attention to the standards, it's no surprise that only a handful of students opted for mastery level on the standards. Some students shared a reluctance to complete the projects because they were fearful of falling behind. Students also questioned my prohibition on using class time for these projects - a decision made to encourage students to move forward and not fall behind, which I am currently rethinking.

There are other tweaks that come to mind. I wonder if I should require students to earn a perfect score on the quizzes to earn proficient, instead of 80%. The argument makes sense: if students really understand a concept, they should be able to answer 5 out of 5 application-level questions. This would require expanding my question bank and opening quiz attempts to five, instead of three, and I would have to adjust my retake policy to account for the increased attempts. Alternatively, I could count the most recent score on a summative assessment, not the highest. If a student retakes a quiz and earns a lower score, then that score would be used to evaluate the student. Again, if a student truly understands a concept, they should be able to pass a similar assessment a few days after a previous attempt; if they earn a lower score on a subsequent attempt, then arguably the student did not really understand the learning target.
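
As a quick illustration, the difference between the two policies can be sketched in a few lines of code (the function name and sample scores are hypothetical, not part of my actual grade book):

```python
def grade_standard(scores, policy="highest"):
    """Score of record for a standard, given quiz attempt scores
    (out of 5, earliest attempt first). Purely illustrative."""
    if not scores:
        return None  # no evidence yet
    if policy == "highest":
        return max(scores)
    if policy == "most_recent":
        return scores[-1]
    raise ValueError(f"unknown policy: {policy}")

attempts = [3, 5, 4]  # the student peaked on the second attempt
print(grade_standard(attempts, "highest"))      # 5
print(grade_standard(attempts, "most_recent"))  # 4
```

Under the "most recent" policy, the student above would drop from a 5 to a 4, which is exactly the incentive to retain understanding that I'm weighing.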

The most important adjustment I need to make is crafting learning targets that span several units. This year, I focused on the content standards specific to certain units. This was a decision of convenience. Unfortunately, the result was inconsistent and informal tracking and assessment of important science process skills like organizing and analyzing data and using evidence to support claims.

Even though I'm rethinking some of the finer details and execution of standards based grading, I have enough evidence that the model works. I look forward to a second year of standards based grading. 

Friday, January 23, 2015

Baby Steps to Standards Based Grading & Differentiation

Flickr : radhika_bhagwat
Susan Reslewic, a colleague of mine, and I recently discussed making a transition to standards based grading in her course. Susan agreed that I could share the contents of her email:

"Getting the feeling that standards-based-grading goes hand in hand with differentiation...I think it [standards based grading] could really support my efforts to better differentiate.  When I looked at the physics tests today (all over the map: some failing grades, some perfect scores plus), I just felt like I wish the kids who "did poorly" could say "oh, I know a and b, but not x, y and z".  It frustrates me that some kids are going to get their test back and see a score in the 70s and then go to the place of "I did poorly. I don't understand physics. I hate physics. I hate science!" I wish instead...the grade communication focused on what the kid can and cannot do (yet).  As I write this I think maybe a first step is for me to provide detailed comments on the test next to the grade.... Basically saying here's where you are excelling and here are things you need to work on."

Susan makes some of the most compelling arguments for standards based grading (SBG): it helps teachers differentiate, and it lets students know what they know and don't know.

As I read Susan's email, I was reminded how overwhelmed and excited I felt about the idea of SBG. I responded with some thoughts included below about taking baby steps to SBG. 

Transitioning to Standards Based Grading: 
If you want to try baby steps to standards based grading, the easy way is to start reorganizing your tests, quizzes and other assessments. Label each test question with the idea being tested - maybe even grouping those questions together. For example, perhaps questions 1-5 are about calculating velocity and 6-15 are about applying Newton's Three Laws. Don't report a final total percentage on the test; instead, report percentages on each group of questions: 80% or 4/5 on calculating velocity, 70% or 7/10 on applying Newton's Three Laws. Then, have students retake only the portions of tests and quizzes that fall below a certain percentage. For example, you may tell a student, "You mastered calculating velocity, but you have to retake a quiz on Newton's Three Laws."
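
To make the regrouping concrete, here is a minimal sketch of per-standard score reporting (the question data and standard names are illustrative only, not from an actual grade book):

```python
from collections import defaultdict

# Illustrative only: (question number, standard, points earned, points possible)
responses = [
    (1, "calculating velocity", 1, 1),
    (2, "calculating velocity", 0, 1),
    (3, "calculating velocity", 1, 1),
    (4, "Newton's Three Laws", 1, 1),
    (5, "Newton's Three Laws", 1, 1),
]

earned, possible = defaultdict(int), defaultdict(int)
for _, standard, got, out_of in responses:
    earned[standard] += got
    possible[standard] += out_of

# Report a percentage per standard instead of one overall grade.
for standard in possible:
    pct = 100 * earned[standard] / possible[standard]
    print(f"{standard}: {earned[standard]}/{possible[standard]} ({pct:.0f}%)")
```

The same total test score can hide very different per-standard profiles, which is exactly what this breakdown surfaces.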

In addition, make each benchmark quiz about only one idea; if students pass the quiz, they have mastered the idea. Just like the tests, students only retake failed quizzes. Of course, you will need multiple versions of comparable tests and quizzes. It may take years to build a robust question bank, but perhaps start with 2 or 3 versions of each test or quiz.

Another quick step is to label the problem set and homework questions with the specific idea being practiced. You could even identify the basic and challenge questions within each subset of questions: certain questions can be mandatory for all, and excelling students can attempt the challenge questions. Over time, you can build a library of remedial activities and other resources to help students with particular skills or topics. If students fail a part of a test or quiz, you can point them to specific activities and resources that target their deficiency.

These are some baby steps that will not take much tweaking of course structure. They should help you gain some of the benefits of an SBG course, but keep in mind that these are temporary fixes. Not only does it take major structural changes to implement SBG, but a comparable shift in mindset must also occur.

Monday, November 24, 2014

Standards Based Quiz Spotlight: the Anatomy of Mastery Learning Cycles

Roughly, there is one quiz per "I can" statement. Students receive permission to take the quiz once they have completed a one-on-one conversation with me about the "I can" statement, called the "Hot Seat." Typically, each quiz consists of five questions designed at the application level of Bloom's Taxonomy. Most of the questions are problems that assume understanding of the concepts, integrate the vocabulary, and are similar to problem set questions.

The quizzes are online and hosted on Moodle. Moodle is a powerful LMS because it supports randomized questions pulled from a question bank. This is important in an asynchronous class because the quiz questions are different on each attempt - whether for a new student or the same student on a subsequent attempt. This reshuffling of questions allows for retakes and preserves academic integrity whether a student is the first or the last to take the quiz on a given topic.

Students must earn 4/5 or 80% to pass a quiz. This score is consistent with the school culture and my expectations. Once they pass the quiz, they earn level three out of four, which is meeting expectations. Most of the questions are multiple choice or calculation questions. I generally prefer not to use multiple choice questions, but I do use them when I'm able to input every possible choice as an answer. For example, when figuring the probability in Punnett Square problems, the only possible answers are 0, 25, 50, 75 or 100 percent. I also add the choice "there's not enough information to determine." With these types of problems, multiple choice can effectively gauge understanding - assuming there have been other assessments in the learning cycle. Some "I can" statements can't be quizzed at the application level with multiple choice or calculation questions. In those cases, there are alternatives - building a model, writing a lab report, completing a lab or a case study.
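
As a quick check on that claim, a one-gene cross really can only produce probabilities of 0, 25, 50, 75 or 100 percent; a small sketch shows why (the function is purely illustrative, not a tool I use in class):

```python
from itertools import product

def punnett_probability(parent1, parent2, recessive="a"):
    """Percent of offspring from a one-gene (monohybrid) cross
    that are homozygous recessive. Purely illustrative."""
    # Cross every allele of parent 1 with every allele of parent 2,
    # normalizing genotype order so "aA" counts the same as "Aa".
    offspring = ["".join(sorted(a + b)) for a, b in product(parent1, parent2)]
    return 100 * offspring.count(recessive * 2) // len(offspring)

print(punnett_probability("Aa", "Aa"))  # 25
print(punnett_probability("Aa", "aa"))  # 50
print(punnett_probability("AA", "aa"))  # 0
```

With only four boxes in the square, every outcome count is 0 through 4, so the answer choices cover the whole space.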

If students fail a quiz, there are certain tasks they have to complete, which differ based on the number of times they have taken the quiz. For every failed attempt, students have to make corrections and fill out a form describing their errors.

 Students also have to do the following:
  • after the first attempt: students complete any skipped problems from the learning cycle problem set. Earlier in the cycle, students solved the mandatory problems and as many optional problems as they felt they needed. After failing a quiz, the optional problems become mandatory. The hope is that practicing more problems will help students review and prepare for a second attempt.
  • after the second attempt: students complete at least one remediation activity. The remediation activities for a learning cycle may include online readings, simulations, extra problems and/or student-made videos and problem sets.
  • after the third attempt: students have to create their own set of problems and include solutions. In many ways, this last option is similar to the mastery projects in the next phase of the learning cycle. 
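
The escalating requirements above can be summarized as a simple lookup; this is just an illustration of the rules, not actual course software:

```python
def retake_requirements(failed_attempts):
    """Tasks required before the next quiz attempt, keyed to how many
    attempts the student has failed so far. Illustration only."""
    # Corrections and an error-description form apply after every failure.
    tasks = ["make corrections and fill out an error-description form"]
    if failed_attempts >= 1:
        tasks.append("complete the optional problems from the problem set")
    if failed_attempts >= 2:
        tasks.append("complete at least one remediation activity")
    if failed_attempts >= 3:
        tasks.append("create an original problem set with solutions")
    return tasks

for n in (1, 2, 3):
    print(n, retake_requirements(n))
```

The requirements are cumulative: a student on a fourth attempt has worked through every layer of review.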

Once students pass the standards based quiz, they are able to move on to the next learning cycle. If they wish to further explore the same topic and/or show a higher level of understanding, they can complete mastery projects before moving on to the next learning cycle. 

Sunday, November 16, 2014

Quarter One Reflections

A quarter into the school year, I have a solid grasp of the effects of the changes I've made. Here are my chief thoughts about quarter one.

Standards Based Grading
The transition to standards based grading has been mostly smooth. This year, I have a much better handle on what my students know and do not know. The SBG grade book on Haiku is easy to use. The color codes make it easy to see which standards each student or class section is still working on. This has helped me identify which students need targeted intervention.

Standards Based Gradebook on Haiku

At first, it took students some time to understand the concept of "I can" statements and my particular system for showing learning. They seem to have figured out the system. 

The most noticeable difference is the quality of my reports. I've always struggled with writing first quarter reports because I barely feel like I know my students well enough by that point in the school year. This time around, I had plenty to say. Rather than including general fluff, my reports focused on what my students knew and were able to do and on the ideas and skills they still found troubling. Adding this component to my comments about performance on major assignments, my general impressions and suggestions moving forward, the reports are much more informative.

Haiku LMS
The new learning management system is quite effective. The layout is beautiful and the interface is intuitive. I have consolidated many of my online tasks within Haiku - recording and sharing grades, assigning and collecting student work, housing a repository of resources and hosting interactive components like polls, practice quizzes and discussions. In the past, many of these roles would have been offloaded to separate tools. I'd like to move my actual quizzes to Haiku, but it does not support randomized questions from a test bank, so I still need Moodle for that purpose.

Haiku can be a bit buggy, though. There is a limit to how many objects can be embedded on one page, and some students complained of notoriously long loading times. A student suggested that I make more use of subpages, so now each step of the learning cycle is housed on its own page. This has significantly improved loading times.

Subpages on Haiku

Asynchronous learning
As mentioned in a previous blog post, asynchronous learning continues to allow students to submit their best work and internalize a growth mindset. Most students are keeping a reasonable pace, even though there are students who I believe can work faster. I've made some changes this year which hopefully will help students adjust to the responsibility of setting their own pace. The most important change, made at the request of a student, was allowing students to create their own weekly plans.

A student's week plan

Creating the plans takes a lot of time, so I've been trying to encourage students to send their plans to me over the weekend - with varying degrees of success. At the very least, students are using less class time to create their plans and are becoming better at working while waiting for my indication that their plans are satisfactory. For students who struggle with this task, I've started to collaborate with them to create pacing calendars for a few weeks, rather than letting them work alone on their weekly plans.

Mastery projects
A handful of students have elected to complete the mastery projects. In most cases, these projects have been good enough to help other students learn the content. My library of student-made teaching materials is growing, and some students have already taken advantage of this library to prep for a quiz. I recently added a leaderboard to acknowledge students who have completed mastery projects - in hopes of motivating a few more projects.

Mastery Project Leader-board

Quiz retakes 
This year, I have a better handle on whether students are ready to take quizzes or retake quizzes. The hot seats have been a nice addition. The only problem I've seen with the hot seats is when students opt to take the quiz a few days after completing the hot seat discussion. 

After the first batch of quizzes, I added a few layers of permissions for quiz retakes. In addition to submitting quiz corrections and explanations of their mistakes, students have to do one more thing to gain permission for a retake. Making students go through a few obstacles seems to help them take each attempt more seriously.

Labs 
The switch to inquiry-based labs has proved most effective with asynchronous learning. Last year, I tried a combination of inquiry and full class labs, and I struggled with whether students who got to the labs first should use last year's data. It became confusing for students to know whether they were using this year's or last year's data sets, and it also prevented me from adjusting procedures.

For the full class synchronous labs, students working at a slower pace had to rush through content or temporarily skip steps in order to be "ready" for labs. Now that students design most of their own labs, there is no confusion about what data to use and no need to worry about skipping or rushing through steps - students do labs when they are ready.

So far, I've managed to keep up with the demand for lab materials. I place small lab kits around the edge of the counter space on labeled lunch trays. Since different students perform different labs, I only need to make a small amount of materials available for one particular lab. The trick is to have several labs prepared simultaneously and to anticipate when students will be ready for future labs. Below you can see how I organize lab materials.

DNA extraction lab materials
UV bacteria lab materials
Protein Synthesis model exploration materials

Upcoming changes
In the upcoming quarters, I'd like to incorporate some synchronous projects to help me experiment and think through PBL and 20Time in future years. I also want to offer optional content and let students who work ahead design their own parts of the course. 

Wednesday, August 20, 2014

Spotlight on Mastery Choices: the Anatomy of Mastery Learning Cycles

In this spotlight, we'll explore the mastery assignments of the mastery phase. In each learning cycle, students will have the opportunity to extend their thinking on specific standards beyond the application level in Bloom's taxonomy. They'll also be able to choose the nature of the assignment and even within the options, there is room for differentiating the level of difficulty. Depending on the learning cycle, these mastery assignments may be organized into choice boards, 2:5:8 boards, tic tac toe boards, think dots or cubing boards. I'll explain some of the options below.

Choice Boards
In some learning cycles, I use choice boards. Choice boards typically have nine project choices, of which students select one. Each choice encompasses all of the relevant learning standards; therefore, only one project is needed. The Analysis, Evaluation and Creation levels of Bloom's taxonomy are equally represented. In addition, different learning preferences are represented - students have a choice of modality: video, article, essay, cartoon, poem, etc.


2:5:8 Board
The 2:5:8 board gives students options between levels of difficulty. The rule is that students have to complete assignments that add up to ten; for example, a student may select one "2"-level and one "8"-level assignment, or two "5"-level assignments. I also added a "10"-level assignment, where students can opt for just one assignment at a higher level of difficulty. I used Bloom's taxonomy again to determine which activities are level 2, 5, 8 and 10.


Think Dots
I use Think Dots similarly to choice boards. In both cases, students complete only one option. The major difference is that, theoretically, students don't actually choose the assignment; they roll a die to determine which project to complete. Think Dots can work well when there aren't significant differences between rolling a "one" or a "three" - in either case, students are basically completing the same assignment, but the details are different. Other teachers use Think Dots differently, but I like using it this way to encourage students to be okay with rolling the die and doing whatever assignment is randomly selected. There's a neat online die that one can use if physical dice are unavailable.


Tic Tac Toe
The Tic Tac Toe board is an effective variant of the choice board. Again, the assignments are aligned to specific standards as well as levels of Bloom's taxonomy. I use tic tac toe boards when I have a variety of standards that are too difficult to encapsulate in one project. In this case, I still have nine project options, of which students have to select three. I can set up specific rules to force students to select specific types of projects. For example, in the board below, students have to go from top to bottom, either in the same column or along a diagonal. In that case, students are forced to select one project from each of the three rows. Each row has three options aligned to the same standards. The result is that students cover all of the standards but have some choice in the combination of assignments.



My hope is that, by using these strategies, students will complete higher order assignments to demonstrate mastery of specific standards. I'll be sure to reflect on the effectiveness of these tools at a later date.

Tuesday, August 12, 2014

Spotlight on the Mastery Phase - the Anatomy of Mastery Learning Cycles

Flickr // Powazny
Based on recent questions via Twitter after FlipCon14 regarding the specifics of Mastery Learning Cycles (MLC), I've decided to write periodic blog posts about the inner workings of this model. This series, the Anatomy of MLC, will spotlight aspects of the model. In this first post, we'll explore the mastery phase.

Recall MLC consists of Explore, Flip, Apply and Mastery phases. More information about the other phases can be found in the initial blog post about MLC or future posts in this series. The Mastery phase is the culminating component of a learning cycle and consists of the following:



  1. Mandatory "hot seat" Discussion - a one-on-one or small group discussion where students are orally "quizzed" to determine whether they are ready to take the unit quizzes.
  2. Mandatory Standards-based Quizzes - randomized Moodle quizzes aligned to specific standards and the application level of Bloom's taxonomy. Students can choose to, or be required to, retake the quizzes until they demonstrate proficiency in each standard. Students who fail are required to complete a metacognition form for each quiz outlining their errors, as well as make corrections. Students can return to earlier phases and/or complete remediation activities before retaking a quiz.
  3. Optional Standards-based Mastery Assignments - higher order thinking assignments which are aligned to Bloom's taxonomy levels of Analysis, Evaluation and Creation. These projects will push the thinking of students and are based on specific standards. Completing the projects will move a student from application (3/4) to mastery (4/4) level.
The implication of this setup is that students can pick and choose the standards for which they want to demonstrate application or mastery understanding. The thinking here is that students are required to be able to apply their learning of all standards but should be able to decide on which standards they want to extend their thinking. Some students will be more interested in certain standards; some students will find certain standards more difficult. An added benefit of this approach in an asynchronous course is that optional assignments give students an opportunity to catch up to the pace of other students without losing core content.

Friday, August 8, 2014

My Plunge into Standards Based Grading

Cody Hough // Wikipedia

After short-lived flirtations, I'm plunging into the Standards Based Grading (SBG) pool. Of all the recent changes to my course, SBG requires the biggest paradigm shift. Like many other updates to my biology course, my entry point was the flipped class - another testament to the benefits of flipped instruction.

What is SBG?
For those who are unfamiliar with standards based grading, this video should offer a good summary contrasting it with the traditional model of Assessment-Based Grading.


How am I approaching SBG in year one?
First, I've identified the standards. For each unit, I listed each learning objective and rephrased it as an "I can" statement. (Thanks to @mrsebiology for the inspiration.) I've redesigned each unit as mastery learning cycles centered around these standards. I plan to acknowledge four levels of progress for each standard - does not meet (no evidence), approaching (explaining), meeting (applying) and exceeding (mastery).

First draft sample of a few genetics objectives
Second, I've tailored the learning materials (videos, readings, labs, etc.) to the standards. This was an eye-opening process because several materials that I've used in the past did not meet a specific standard; I was forced to delete them. The other important revelation was that some activities required too much effort in return for how little they aligned with the standards. For example, if a lab took several days to complete but barely addressed one standard, I either modified it or chose an alternative. I forced myself to prioritize the "need to know" content and skills over the "nice to know" material.

Third, I strategically designed assessments to align to the standards. Every quiz, lab or test question will be intentionally designed or modified to address the standards - again, a very eye-opening process. I'm especially excited about this change because I will be able to generate informative data about each student. No longer will a student or parent see a vague "88%" on a report. The 88% could hide that a student struggled mightily with one topic while excelling at the others. Instead of quoting numbers, I will be able to state the exact nature of their areas of strength and weakness.

I have a lot to learn and am so glad that I have a wonderful group of teachers in my PLN.

So far, I've been using the following SBG resources:
I would love it if you could reply with your own list of Standards Based Grading resources.

Monday, June 30, 2014

Onto my Second Iteration of Flipped Learning: A Post FlipCon14 reflection



If my personal lessons from FlipCon13 were about the logistics of video making and setting up a flipped class, then this year's lessons were all about the "second iteration" (as Troy Cockrum frequently mentions) - tweaks to my flipped class and connecting with the community of flipped educators.

Last year during FlipCon13, so many flipped veterans said over and over again that flipped learning is not about the videos, but that was a message a baby flipper like myself could not internalize. Now that I have a YouTube library of good but not great videos and I'm primed to tackle more important questions of inquiry, project based learning, standards based grading, student blogging and 20 percent time, I really have internalized that mantra.

This year during FlipCon 14, I strategically selected sessions that would help me improve my second iteration of flipping. This meant that most sessions were part of the "Beyond Flip class" strand. Some of my thoughts and take-home lessons regarding a few sessions are below.

Keynote: "Living in Beta" with Molly Schroeder
This was a wonderful keynote that challenged me to further promote working in beta, or experimentation and revision in my class. Since I use a mastery model, I feel pretty confident that my students are usually working in beta. They're encouraged and even mandated to revise their work until proficiency. The real lesson I took away from Molly's session is that I need to be comfortable with allowing myself to work in beta. If companies like Google can fail with tons of unpolished products and still be seen as a successful company, then surely I can dare to fail as well.

Flipping DI with Lee Dewitt
This was a timely session for me since one of next year's goals is to differentiate instruction. The pre work and the session gave me some neat ideas about how to differentiate instruction. Although I'm happy with the Mastery Choice boards as my main vehicle to offer student choices, I can see the benefit of mixing things up. Perhaps some learning cycles will work best with choice boards, a 2-5-8 menu, a RAFT assignment, tiering or cubing. I'll play around with these options during the summer.
The most exciting thing I learned from Lee's session is how she scaffolds mastery in her course. My 8th graders struggled with staying on track and I'm hoping a better transition to self paced learning will minimize these issues.

Making the Grade with Jennifer Haze
This session was advertised primarily about standards based grading, although I learned some neat tricks about formative assessment as well. I really like her peer to peer techniques in formative assessment like "quiz, quiz, trade" and "find the matching answer." Adding these techniques to peer instruction will enhance the synchronous offerings in my asynchronous course.

Engaging videos with Jonathan Thomas-Palmer
My videos are serviceable and usually get good ratings in terms of learning. However, they are not particularly engaging. I do use the design techniques Jonathan mentioned in his session, like changing the screen every few seconds, using transitions sparingly, using an external microphone and limiting background light for the picture-in-picture feature. After the first few videos, I even included questions throughout the video to encourage students to pause and think. The most important thing Jonathan said, and that I needed to hear, was that I need to enjoy myself in the videos, like I usually did during live direct instruction in the past. I'm too formal in my videos and need to make sure I'm having fun when I record them. I don't see videos as the most important aspect of my class and will most likely avoid redoing most of my videos; however, I will use that important piece of advice when making new ones.

Innovative Pedagogies with Julie Schell
I've been looking forward to Julie's presentation since I missed her at FlipCon13. I already incorporate peer instruction into my class at least once per learning cycle. It has been successful, and most students rated this strategy favorably, even more so than flipping. In this presentation, Julie explained and demonstrated how Just in Time Teaching and Peer Instruction work together. The Just in Time Teaching (JITT) technique requires students to answer two conceptual questions and submit one feedback question (e.g., what they found most difficult or what they still wonder about) regarding a concept they learned in a coverage assignment outside of class (e.g., a flipped video or a reading). The teacher reviews these responses prior to class and uses them to generate ConcepTests for peer instruction. The hardest part of Peer Instruction is generating the higher order, engaging questions; JITT can help me generate more of them.

In addition to these wonderful sessions, I met great people whom I already follow on Twitter. What a wonderful community of welcoming educators! I look forward to further connecting and learning with my PLN and to incorporating lessons learned from FlipCon14 into my class.

Friday, May 2, 2014

Interview about Asynchronous Learning and Standards-Based Grading


I had the privilege of chatting with Jonathan Bergmann on his radio show, The Flipside, on the BAM Radio Network. We talked primarily about my journey to flipped instruction and standards-based grading. The interview is linked here.

Aside from my nagging habit of saying "direct instructional days" rather than "direct instruction days," I thought the interview ran smoothly. I can also see why Jon was an award-winning educator; even as an interviewer, he was captivating, reminding me of his keynote address at FlipCon13.

The messages I hope were conveyed during this interview:
  1. Flipped learning frees up class time and creates more opportunities for greater engagement and individualized learning.
  2. Asynchronous learning allows for differentiation.
  3. It's possible to adopt aspects of flipped instruction piecemeal, and it is also possible to successfully adopt flipped instruction (and mastery learning) wholesale, without a long transition period.
  4. Middle school students can thrive in a flipped class.
  5. The traditional assessment-based system of grading is broken because it can hide what students truly do and do not understand.
  6. Standards-based grading is the solution to that broken system. 

Thursday, April 10, 2014

The Urgent Need for Standards-Based Grading

I love having moved to an asynchronous flipped course. However, it is clear to me that the next major paradigm shift must be toward standards-based grading. 

The asynchronous nature of the course works well for many reasons, which I have mentioned in the past. My major struggle is that the model seems to encourage, or at least allow, students to submit late, and typically useless, assignments at the end of the quarter. Most of these assignments are irrelevant by quarter's end because students have already completed their summative assessments; students submit them solely to raise their averages. While the flipped model reduces the opportunity for the typical "playing school," clearly some of that is still happening in my course! I could ban submission of assignments after students have moved on to subsequent steps, but that would treat the symptom rather than the cause.

The culprit seems to be grades, or at least the traditional assessment-based grading system. I incorporate an assignment-completion percentage into the quarter averages to encourage students to do their work, but this seems wrong to me. If we have to assign a grade, shouldn't it be based exclusively on what students have learned, rather than on behavior, participation, assignment completion, and timeliness? Don't get me wrong: I understand why these factors are included in grades. Teachers want to encourage certain behaviors while discouraging others, and the easiest method is building specific behaviors into the grading system. Unfortunately, the result is inflated grades for compliant students and deflated grades for noncompliant students. Rather than grades reflecting learning, grades merely correlate with learning in the traditional system. This is an odd paradigm when you really think about it! 

Standards-based grading can be the solution. Students are graded exclusively on how well they demonstrate mastery of learning objectives or standards. Students can choose which learning activities (readings, videos, labs) to complete. They can redo assignments to learn or practice objectives before opting for a graded objective check or mastery quiz. If students want or need to retake a summative assessment, they can revisit some or all of the learning activities, but I won't factor the completion rate into the grade. I haven't figured out the logistics, but I am sure that my current grading system needs a makeover.

Friday, October 25, 2013

Introducing Alternative Assessment & Peer Instruction in the Flipped Class: Week 7 reflections

Students only had an exploration, a video, a problem set, and a Moodle quiz this week because I wanted to add some breathing room for students to catch up to my pace. I also instituted peer instruction at the beginning of lessons. Students are all over the place, anywhere from one week behind to one week ahead, and the number of students working ahead has increased in the last day or so. After next week, I plan to speed up the pace a bit and offer more synchronous activities.

Successes:
One cool thing that has emerged from post-quiz conferences is pinpointing each student's conceptual issues. In previous weeks, I suggested that struggling students work through pre-planned remediation modules after a quiz. The pre-planned modules are divided into topics, so students always had a choice of how to spend their time. But these conferences have better equipped me to recommend certain tasks, or to create remedial tasks on the spot. A few examples will make the point. Two students struggled on the same quiz but had different issues: one student couldn't decipher the difference between codominance and incomplete dominance, while the other had trouble solving blood-typing Punnett squares, which involve codominance. I told the first student to create a list of traits (the weirder the better) and imagine the appearance of a heterozygote when the trait displayed codominance and when it displayed incomplete dominance. I pointed the second student to a blood-typing reading and an online practice quiz that were already part of the remedial module. I'm also adding more remedial activities to the modules because these conferences are uncovering areas of confusion that I did not anticipate in the planning phase. It would have been difficult to identify and offer specialized remediation without the post-quiz conferences built into the class period, courtesy of the flipped model. My eduwin for the week is using post-quiz conferences to suggest tailor-made remediation.

I really love adding a synchronous activity to the beginning of class. It changes the feel: it has slowed things down a bit and made the class feel whole. I'm glad that I opted for peer instruction instead of a tracking journal at the start of class. The tracking journal could have helped with goal setting and slowed down the pace, but it would not have had the added benefits of peer instruction. I posted a scenario as a multiple-choice question that was engaging and conceptually rigorous. Students thought about their answers individually, with no help, jotted them down, and closed their eyes to vote. I recorded the class results. Then students found someone with a different answer and tried to convince them to change it. After a few minutes, students re-voted. I shared the results of both votes and asked students to explain their reasoning, making sure to include students who had been convinced of the correct answer. I then revealed the answer and had time for explanations and clarification. Once the session was over, I let students resume their work. Students were overwhelmingly enthusiastic about this change. We all craved communal time, and I'm ecstatic that our communal time was a research-based approach that was still student-centered. It was the best of all worlds. 

Adjustments:
I expected a two-year rollout of the flipped class: year one is flipped videos with associated guided notes and forms, explore-flip-apply, Moodle quizzes, and asynchronous learning, while year two will add standards-based grading, blogging, and voice & choice. I'm starting to question whether I should institute voice & choice earlier. I have a student or two who have struggled with the Moodle quizzes, despite post-quiz conferences and retakes. Even though there are some real issues with learning the material, the lack of partial credit has been a real detriment here. Moving forward, I will have a paper version of the quiz to administer to the small number of students who struggle. In addition, I ought to institute voice & choice sooner so that students can display learning in alternative ways. I've already opened things up a bit: I've asked some students to create their own problems and solve them to convince me of their understanding. Even though I've done this informally and on an as-needed basis, it has worked nicely. I ought to build in these alternative opportunities more often and, eventually, allow all students to choose from a menu of ways to demonstrate knowledge. 

The other change I'm thinking about instituting a year earlier than originally planned is an element of standards-based grading. In a recent response to my ongoing course feedback form, a student mentioned that not all of the activities were necessary to understand the material. I, of course, agree; this is one of the main reasons I planned to institute SBG next year. In the interim, I've decided to reach out to individual students and discuss making certain activities optional. I won't open things up for all students just yet, not until I've thought through effective accountability and grading systems. At the very least, I can tell students to complete as many or as few problem-set questions as they need.