By Craig Forcese

Full Professor
Faculty of Law


Flipping a First Year Mandatory Law Class: Results

I am now in my fourth year of "flipping" my law classes.  For past discussions of my initial experiences, see here.  For a slightly more academic treatment of flipped law teaching, co-authored with Professor Peter Sankoff (Alberta), see here.

Because I am always asked this question: I did not move to flipped teaching because I was dragged there by unhappy students.  Students appeared very happy with my conventional teaching, as reflected in teaching evaluations.  Instead, I dragged students to the flipped approach because I was concerned that students were not performing on exams to the level I thought they should after sitting with their bums in a seat, listening to me, and engaging in Socratic discussion for a total of 40 hours.  Put another way, I concluded I was not being very useful as a teacher.  In truth, flipped teaching is harder, but as I have said before, it is more fulfilling.


This past semester, I flipped a first year law class -- my first time doing so.  This is a class every law student must take.  While we offer multiple sections of the course, students coming into first year have limited ability to customize their schedule to take a particular section with a particular professor.  This means that students are unlikely to self-select to my section because I advertise a flipped teaching methodology.  As a consequence, this is my first flipped experience with a relatively "random" sample of students.  I had 78 students in this class.

This was also a difficult subject matter for a flip -- or certainly a more difficult subject matter than my first flip (Administrative Law).  Introduction to Public Law and Legislation is a buffet topic designed to bring students up to speed with the public law infrastructure in Canada.  (The table of contents of the course text gives some sense of scope). This course is roughly the equivalent of the Foundations of Canadian Law subject mandatory for National Committee on Accreditation students.  However, I go substantially beyond the basics and we spend a lot of time on things like election law, access to information law, conflict of interest law, and lobbying regulation.

Generally speaking, students start with little to no understanding of how our system of law and government works.  On top of this, the subject does not lend itself to a case-based approach to instruction -- in many of the areas we cover there are basically no cases.  It is, in other words, a very different course from the other first year topics students are covering, and they report finding it confusing to be drilled with cases in their common law courses and then to have to think more structurally in Public Law.

The Flip

For the short version of how I ran the flip: I pre-recorded podcast and videocast lectures, posted online before (usually long before) the classroom session.  In the classroom session, I coached the class through active learning exercises designed to "put in play" the themes and information contained in the podcasts.  I used a variety of techniques, but the most common were what I call "blink" or "five minute" hypotheticals.  These are problem-based exercises in which I temper the conventional Socratic method by adding a discussion element.  Specifically, I posed (usually on the projection screen) a short hypothetical, and students then had between 2 and 5 minutes to discuss it with their seatmates before I resorted to my call sheet for discussion.  Again, the hypos were designed to reinforce the material covered in the podcast lectures.

Outside of class, I had students complete readings (reduced in length relative to my conventional, pre-flipped reading load).  I also deployed "feedback" quizzes done out of class.  Basically, these are online true/false quizzes designed to compel students to reflect on materials covered in the "passive" learning podcasts.  I also had them complete three in-class "two-minute essays".  That is, I asked every student to write one sentence on something they learned in the just-completed learning unit, and one sentence on something that remained murky.  I used the data collected in this manner to detect common misunderstandings and difficulties, and then tailored both the feedback quizzes and subsequent in-class hypotheticals to work over important weaknesses.

And because I am committed to experiential instruction -- defined broadly to be "things that might be useful to know when students graduate" -- I also had them complete several out-of-class "public law lawyering" exercises in which they applied some of the substantive legal tools they were learning about in class.  Students submitted these to me and I provided modest feedback on what were largely fairly mechanical exercises.

These assignments and the quizzes were pass/fail, with a pass set at 70%; students repeated each exercise until they scored that B.

A full version of my syllabus is here.


As discussed here, there is considerable debate about the merits of flipped versus conventional teaching.  This being law school, much of this debate is entirely fact-free, lacking any empirical foundation for resolving the dispute one way or another.  At times, the arguments can resemble a Monty Python skit ("When I was my students' age, I walked to school in bare feet, through the snow, ate cold gravel for breakfast, and lived at the bottom of a lake.  And I learned lots of law, so how I learned law must be the best way to teach it.")

One of the reasons for the lack of data is the general lack of interest in pedagogical research in law schools.  The other is the question of design: specifically, how do you compare different teaching methodologies when there is no control group?

I have not solved that control group problem -- but I can compare results from a flipped teaching environment with results from my prior iterations of the same course, taught using a conventional approach (which in my case, was a lecture-based class with some Socratic). 

In my conventional approach, the exam was worth 100% of the grade.  I have since sworn off that destructive practice: students now "bank" marks before the exam through their participation in the feedback quizzes and experiential assignments.  But I continue to set an exam worth most of the grade (65% this year).  And I continue to evaluate in exactly the same way, with a detailed issue-spotting exam.  And I continue to share past exams and answer keys with each cohort of students in advance of the exam.  Put another way, while my teaching methodology has changed, my exam methodology has not, in the slightest.

That means I can compare raw exam scores across years as a loose proxy for teaching outcomes stemming from different teaching methodologies.  This is, of course, a ridiculously inadequate measure when it comes to scientific rigour.  But I believe it is better than nothing.

When I last did this style of comparison (with Administrative Law), the student grades were five raw percentage points higher on the post-flipped class exam than on the post-conventional class exam.  This was a modest, but notable difference.  But most importantly, with the flipped class the mark distribution was markedly different.  Put simply, there were fewer grossly underperforming students.

I had an identical outcome with my flipped class in Public Law and Legislation this past Fall.  First, the average raw mark was again five percentage points higher than the last time I taught the course.  And again, the mark distribution was very different.  I reproduce the mark distribution for three different academic years.  Two of these years involved conventional teaching (2008 and 2010, the latter being the last time I taught this course before this year).  The third involved flipped teaching (2015).  The bars in the graph represent the percentage of students in each grade category.

Readers will note that there were still a number of raw grade failures in 2015, and also a sizeable number of marginal passes (D).  From D+ upward, the mark distribution follows a "normal curve".  I could not say that at all about 2008 and 2010 -- the mark distributions for those years are either skewed to low grades (2008) or essentially flat (2010).


This chart actually masks another reality: the failures in 2015 were "near failures" -- mostly in the 45% range.  This was not true in earlier years, where the failures were often dramatic.  (In the result, when the 2015 exam grade was tabulated with the assignment work, I had no failures overall in 2015.  In past years, even an aggressive shifting of the grading curve by as much as 10 percentage points to meet faculty marking guidelines -- setting the class mean at B -- still left a sizable number of failures.)
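The mechanics of that curve shift can be sketched in a few lines.  To be clear, the marks below are invented for illustration -- they are not my actual class data, and the assumed B target of 73% is mine, not a faculty rule.  The idea is simply: add a constant to every raw mark to bring the class mean up to the target, cap the adjustment at 10 percentage points, and then count how many failures survive.

```python
# Hypothetical illustration of shifting a raw mark distribution upward
# so the class mean lands at a target grade (a B, assumed here to be 73%),
# capped at 10 percentage points, then counting surviving failures.
raw_marks = [38, 42, 48, 55, 58, 61, 64, 66, 68, 70, 72, 75, 78, 82, 88]  # invented data

target_mean = 73.0
mean = sum(raw_marks) / len(raw_marks)
shift = min(10.0, target_mean - mean)   # never shift more than 10 points
curved = [m + shift for m in raw_marks]

failures = sum(1 for m in curved if m < 50)  # assuming 50% is the pass line
print(round(shift, 1), failures)
```

Even after a shift of nearly nine points in this toy example, the weakest student still fails -- which is the pattern I saw in the pre-flip years, and which the 2015 assignment marks eliminated.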

Even more revealing is a chart showing the proportion of the class "below B" and "B or above" for these three years -- again looking strictly at raw exam grades. 



I acknowledge once more that this style of analysis is imperfect.  But overall, these results affirm the results from my earlier flips: students who complete my flipped class demonstrate greater competency on a law school problem-solving hypothetical exam than do students who complete my classic lecture/Socratic course.  One response to this (from the Monty Python-style skeptics) might reasonably be: you were just a crappy teacher and you got better, regardless of methodology.  This is entirely possible, but in my defence, my pro-flip results are also consistent with the empirical data from other disciplines reviewed in the article I wrote with Peter Sankoff, noted above.

But whatever the case: having invested considerable effort in rebuilding my pedagogy, these are gratifying results.  A side benefit of the flip is that the constant interaction and feedback orientation of the active learning component means I gather intelligence on what works and what doesn't, in a way I did not with conventional teaching.  With time, it will be interesting to see whether I can figure out enough to intervene early and successfully with those students who continue to struggle at the bottom of the class.