By Craig Forcese

Full Professor
Faculty of Law

Email: cforcese[at]uottawa.ca

Twitter: @cforcese

 

Bleaching Law

Welcome to my blog on teaching law.  I have entitled it "bleaching law".  This is an obvious contraction of "blog" and "teaching".  It is also a play on words, capturing the endless struggle of a law professor to present as neat, tidy, proper and well-starched things that are emphatically not, like the standard of review in administrative law.  All opinions are my own and do not reflect on anyone else whom I work with, for or around.

Wednesday
Jan 20, 2016

Deploying Experts in a Flipped Classroom: Active Use of a Passive Learning Resource

My pitch, from March 2015, for a collaborative effort to create "virtual expert" resources for use in an active learning, experientially oriented classroom.  I am currently in the midst of interviewing international law practitioners for the purpose of bringing "virtual experts" into my Public International Law class.  I will be setting up a special website with the results, in the hope that other international law teachers may find them useful.

 

[Video: Deploying Experts in a Flipped Classroom: Active Use of a Passive Learning Resource | Craig Forcese | Univ. of Ottawa | ILT 2015, from Michele Pistone on Vimeo]

Tuesday
Dec 29, 2015

Flipping a First Year Mandatory Law Class: Results

I am now in my fourth year since I started "flipping" my law classes.  For past discussions of my initial experiences, see here.  For a slightly more academic treatment of flipped law teaching, co-authored with Professor Peter Sankoff (Alberta), see here.

Because I am always asked this question: I did not move to flipped teaching because I was dragged there by unhappy students.  Students appeared very happy with my conventional teaching, as reflected in teaching evaluations.  Instead, I dragged students to the flipped approach because I was concerned that students were not performing on exams to the level I thought they should after sitting with their bums in a seat and listening to me and engaging in Socratic discussion for a total of 40 hours.  Put another way, I concluded I was not being very useful as a teacher.  In truth, flipped teaching is actually harder, but as I have said before, it is more fulfilling.

Background

This past semester, I flipped a first year law class -- my first time doing so.  This is a class every law student must take.  While we offer multiple sections of the course, students coming into first year have limited ability to customize their schedule to take a particular section with a particular professor.  This means that students are unlikely to self-select to my section because I advertise a flipped teaching methodology.  As a consequence, this is my first flipped experience with a relatively "random" sample of students.  I had 78 students in this class.

This was also a difficult subject matter for a flip -- or certainly a more difficult subject matter than my first flip (Administrative Law).  Introduction to Public Law and Legislation is a buffet topic designed to bring students up to speed with the public law infrastructure in Canada.  (The table of contents of the course text gives some sense of scope). This course is roughly the equivalent of the Foundations of Canadian Law subject mandatory for National Committee on Accreditation students.  However, I go substantially beyond the basics and we spend a lot of time on things like election law, access to information law, conflict of interest law, and lobbying regulation.

Generally speaking, students start with little to no understanding of how our system of law and government works.  On top of this, the topic does not lend itself to a case-based approach to instruction -- in many of the areas we cover there are basically no cases.  It is, in other words, a very different course from the other first year topics students are covering, and they report finding it confusing to be drilled with cases in their common law courses and then to have to think more structurally in Public Law.

The Flip

For the short version of how I ran the flip: I pre-recorded podcast and videocast lectures and posted them to the internet before (usually long before) the classroom session.  In the classroom session, I coached the class through active learning exercises designed to "put in play" the themes and information contained in the podcasts.  I used a variety of techniques, but the most common were what I call "blink" or "five minute" hypotheticals.  These are problem-based exercises in which I temper the conventional Socratic method by adding a discussion element.  Specifically, I posed (usually on the projection screen) a short hypothetical, and students then had between 2 and 5 minutes to discuss it with their seatmates before I resorted to my call sheet for discussion.  Again, the hypos were designed to reinforce the material covered in the podcast lectures.

Outside of class, I had students complete readings (reduced in length relative to my conventional, pre-flip reading load).  I also deployed "feedback" quizzes done out of class.  Basically, these are online true/false quizzes designed to compel students to reflect on the material covered in the "passive" learning podcasts.  I also had them complete three in-class "two minute essays": I asked every student to write one sentence on something they learned in the just-completed learning unit, and one sentence on something that remained murky.  I used the data collected in this manner to detect common misunderstandings and difficulties, and then tailored both the feedback quizzes and subsequent in-class hypotheticals to work over important weaknesses.

And because I am committed to experiential instruction -- defined broadly as "things that might be useful to know when students graduate" -- I also had them complete several out-of-class "public law lawyering" exercises in which they applied some of the substantive legal tools they were learning about in class.  Students submitted these to me and I provided modest feedback on what were largely fairly mechanical exercises.

These assignments and the quizzes were pass/fail, with the pass mark set at 70%; students repeated the exercise until they scored that B.

A full version of my syllabus is here.

Results

As discussed here, there is considerable debate about the merits of flipped versus conventional teaching.  This being law school, much of this debate is entirely fact-free, lacking any empirical foundation for resolving the dispute one way or another.  At times, the arguments can resemble a Monty Python skit ("When I was my students' age, I walked to school in bare feet, through the snow, ate cold gravel for breakfast, and lived at the bottom of a lake.  And I learned lots of law, so how I learned law must be the best way to teach it.")

One reason for the lack of data is the general lack of interest in pedagogical research in law schools.  The other is the question of design: specifically, how do you compare different teaching methodologies when there is no control group?

I have not solved that control group problem -- but I can compare results from a flipped teaching environment with results from my prior iterations of the same course, taught using a conventional approach (which in my case, was a lecture-based class with some Socratic). 

In my conventional approach, I used 100% exams.  I have since sworn off that destructive practice, and now students do "bank" marks prior to the exam through their participation in the feedback quizzes and experiential assignments.  But I continue to have an exam worth most of the grade (65% this year).  And I continue to evaluate in exactly the same way, with a detailed issue-spotting exam.  And I continue to share past exams and answer keys with each cohort of students in advance of the exam.  Put another way, while my teaching methodology has changed, my exam methodology has not, in the slightest.

That means I can compare raw exam scores across years as a loose proxy for the teaching outcomes stemming from different teaching methodologies.  This is, of course, a ridiculously inadequate measure when it comes to scientific rigour.  But I believe it is better than nothing.

When I last did this style of comparison (with Administrative Law), the student grades were five raw percentage points higher on the post-flipped class exam than on the post-conventional class exam.  This was a modest, but notable difference.  But most importantly, with the flipped class the mark distribution was markedly different.  Put simply, there were fewer grossly underperforming students.

I had an identical outcome with my flipped class in Public Law and Legislation this past Fall.  First, the average raw mark was again 5 percentage points higher than the last time I taught the course.  And again, the mark distribution was very different.  I reproduce the mark distribution for three different academic years.  Two of these years involved conventional teaching (2008 and 2010, the latter being the last time I taught this course before this year).  The third involved flipped teaching (2015).  The bars in the graph represent the percentage of students in each grade category.

Readers will note that there were still a number of raw grade failures in 2015, and also a sizeable number of marginal passes (D).  From D+ upward, the mark distribution then follows a "normal curve".  I could not say that at all about 2008 and 2010 -- the mark distributions for those years are either skewed to low grades (2008) or essentially flat (2010).

[Chart: raw exam mark distribution by grade category, 2008, 2010 and 2015]

This chart actually masks another reality: the failures in 2015 were "near failures" -- mostly in the 45% range.  This was not true in earlier years, when the failures were often dramatic.  (In the result, when the 2015 exam grades were tabulated with the assignment work, I had no failures overall in 2015.  In past years, even an aggressive shifting of the grading curve by as much as 10 percentage points to meet faculty marking guidelines -- setting the class mean at B -- still left a sizeable number of failures.)
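(For anyone who wants to replicate this sort of tabulation with their own grade sheets, the mechanics are trivial.  Below is a minimal sketch in Python; the grade cut-offs and sample marks are invented for illustration and are not my data or my faculty's grading scale.)

```python
from collections import Counter

# Minimal sketch: bin raw exam marks into letter-grade categories and
# report the percentage of the class in each category. The cut-offs and
# the sample marks are invented for illustration only.

CUTOFFS = [(80, "A"), (70, "B"), (60, "C"), (50, "D"), (0, "F")]

def letter(mark):
    """Return the letter-grade category for a raw mark."""
    for floor, grade in CUTOFFS:
        if mark >= floor:
            return grade
    return "F"

def distribution(marks):
    """Percentage of students falling in each grade category."""
    counts = Counter(letter(m) for m in marks)
    return {grade: round(100 * n / len(marks), 1) for grade, n in counts.items()}

# Invented sample marks, not real class data:
print(distribution([82, 74, 71, 68, 63, 55, 47, 76, 69, 58]))
```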

Even more revealing is a chart showing the proportion of the class "below B" and "B or above" for these three years -- again looking strictly at raw exam grades.

[Chart: proportion of the class below B vs. B or above, raw exam grades, 2008, 2010 and 2015]

I acknowledge once more that this style of analysis is imperfect.  But overall, these results affirm the results from my earlier flips: students who complete my flipped class demonstrate greater competency on a law school problem-solving hypothetical exam than do students who complete my classic lecture/Socratic course.  One response to this (from the Monty Python-style skeptics) might reasonably be: you were just a crappy teacher and you got better, regardless of methodology.  This is entirely possible, but in my defence, my pro-flip results are also consistent with the empirical data from other disciplines reviewed in the article I wrote with Peter Sankoff, noted above.

But whatever the case: having invested considerable effort in rebuilding my pedagogy, I find these results gratifying.  A side-benefit of the flip is that the constant interaction and feedback orientation of the active learning component of the course means I gather intelligence on what works and what doesn't, in a way I did not with conventional teaching.  With time, it will be interesting to see whether I can figure out enough to intervene early and successfully with those students who continue to struggle at the bottom of the class.

Monday
Nov 2, 2015

Marrow and Moonshine: The Use and Misuse of Law School Summaries

As the leaves fall and the days shorten, the law school begins to hum with end-of-term, pre-exam activity.  During this period, first year students facing the prospect of law school exams for the first time turn to that authoritative source of systematic, unbiased and carefully researched data: the upper year student.  The latter, having run the exam gauntlet, delight in sharing their war stories.  They also share their definitive course summaries.  "Psst, want a summary, buddy?  Works like a charm.  Guaranteed A."

There is some legal Latin that students need to know in first year and, if they didn't learn it then, need now.  (Actually, there isn't really any useful legal Latin, but we like to speak dead languages every once in a while to justify the monopoly on legal knowledge.)  In that tradition, here's the most important Latin for any student considering a canned summary: caveat emptor.  Technically, this translates roughly as "buyer beware".  In the world of summary sharing, I translate it as "what the heck did you expect would happen?"

I know, law profs always want you to do the work and not take shortcuts.  Very high-minded of us, really.  So here's why.

1. Garbage in, Garbage Out

Every year -- and I mean every year -- there are exam answers that are bizarrely incorrect in improbably consistent ways.  Actually, back up.  There are answers that are improbably consistent.  Improbably consistent answers are evidence of collaboration or plagiarism.  You are off to a bad start in academic fraud world if the explanation for the improbably consistent answer is "I was copying from someone else's summary".

But return to the answers that are both improbably consistent and also bizarrely incorrect.  Usually these are answers that rely on cases and principles I haven't taught for years.  And I haven't taught them for years because they have been overturned.  That is, they are no longer correct.  In other instances, the improbably consistent answers are simply misconstruals of the law.

This is exactly the disease that flows from the sharing of canned summaries, passed down from student to student over the generations.  It's like there is some fundamental flaw in the DNA of that summary that expresses in the form of uncanny errors on the exam.  Maybe the DNA was irradiated.  I think a lot of canned summaries were written in Chernobyl.

Needless to say, the students with the bizarrely incorrect, improbably consistent answers received predictably consistent, lousy grades.

It's a little bit like that credit card commercial: "What's in your wallet?"  Well, what's in your summary?  It could be lethal and contagious.

2. The Medium is Not the Message

Even good summaries are not security blankets.  They are just more readings.  Students read them.  They may even memorize them.  Presumably that creates comfort.  I doubt it produces better marks.  Or more correctly, I doubt it produces better marks than would be the case if they were used properly.

And there is only one proper summary: the one you do yourself.  That's more work.  Exactly.  The only -- and I really mean only -- virtue of a summary is to force you to sit down and consolidate the course.  Organize the material in your mind, digest it, spot what you don't really understand, correct those gaps.  Once you do your own summary, that's your studying.  You understand in ways that passive consumption of a canned summary will never allow.

Let me put this in caps, bold and italics: THE SUMMARY IS A PROCESS, NOT A PRODUCT.

If you skip the process part and look for the Holy Grail of all canned summaries, you are presumably also the sort of person who envisages time in the Lac-Leamy Casino as your chief income-generation strategy.  Keep those dice rolling.

3. Team B Testing

That's not to say that every student's home-baked summary is good.  Many are probably pretty rudimentary or just plain wrong.  And so there is a role for canned summaries (and for group work where students subdivide prep of a summary): once you have gone from A to Z in preparing your own summary, it is useful to juxtapose your work product against that of others.  This is Team B testing of your Team A.  If there are inconsistencies, you need to turn your mind to this question: which is correct?  Research it.  Fix it.  Go into an exam with eyes wide open.  But remember: Team B is just a backstop.  Canned or group-produced summaries never replace first doing your own.

4. The Summary Stays in the Bag

I have seen summaries that must be longer than the full transcripts of all the lectures I give in the class.  Students bring them into exam rooms on wheels.  They put them on the desk (with a thud).  And in the exam, they leaf madly through the thing looking for I don't know what.

Doom. 

I always, always design exams for the student who knows their stuff, not for the one who decides to learn it during the exam.  The latter won't have time to finish, because they spend writing time flipping through their telephone book.  That's not a security blanket.  That's cement-encased feet.

I suppose there may be summaries with comprehensive indexes and navigation tools, but I doubt that is common.  And without a shadow of a doubt, the time spent on that sort of editing could be better deployed on, oh, learning law, playing Xbox, staring blankly at the wall.  Whatever.

Yes, every once in a while, even a student following the advice above on self-prep of summaries may be stumped and need a quick look at the summary.  But the student who did their own summary will usually find that they have assimilated the knowledge and don't need to treat the exam as a time-constrained research exercise.  The summaries stay mostly closed.

5. The Summary is Rice Pasta

I have started eating rice pasta.  I can never get the quantities right.  When it boils it always seems to reduce.  Weird.  

A summary should be rice pasta.  It needs to be boiled down.  What should be on the desk next to the exam writer is the two-page Rosetta stone.  This Rosetta stone translates exams into grades.  What is it?  Once you figure out most subjects, you'll find that they can be reduced to a decision tree.  There are legal tests that produce outcomes that trigger other tests.  The secret is to plug the facts on the exam into the proper spot in the decision tree.  And if you've taken the time to boil the summary down to that decision tree, you can simply follow the tree and note on your exam each step that arises.  At the end, your paper computer almost automatically spits out a reasoned outcome, trolling all sorts of marks with it.
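To put the same idea in another medium: a decision tree is really just a chain of nested conditionals, where the outcome of one test selects the next.  Here is a minimal sketch in Python, using an entirely invented two-step "test" -- the step names and branches are hypothetical, not real doctrine:

```python
# Minimal sketch of a decision-tree summary. The two-step "test" below is
# entirely invented for illustration; it is not real doctrine.

def apply_hypothetical_test(facts):
    """Walk the facts through each branch, noting every step taken --
    on an exam, each noted step is where the marks are."""
    steps = []

    # Step 1: an invented threshold question.
    if not facts.get("public_body"):
        steps.append("Threshold: not a public body, so the regime does not apply.")
        return " ".join(steps)
    steps.append("Threshold: public body, so the regime applies.")

    # Step 2: the outcome of step 1 triggers the next (invented) test.
    if facts.get("discretionary_decision"):
        steps.append("Discretionary decision: the deferential branch applies.")
    else:
        steps.append("Non-discretionary decision: the stricter branch applies.")

    return " ".join(steps)

# Plug the exam facts into the tree and let the "paper computer" reason:
print(apply_hypothetical_test({"public_body": True, "discretionary_decision": False}))
```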

More than all this: the student who has this decision tree has thought about his or her subject through the optic of a problem-solving matrix, and not through the useless alternative of "memorized knowledge".  And that means (I earnestly believe) he or she is much more likely actually to spot issues on a hypo exam.

All of this is to say that the decision tree is the most effective way known to humanity of converting student knowledge into grades on a hypo exam.  

Moreover, if you write the summary and then take the supplemental step of producing the decision tree, you will know the material so well that you may actually be authorized to give advice to the first years who come after you.  Oh, and how about that eventual job practicing law?  All that time and info in law school may stick a little better.  You have got to the marrow.

So there you go.  In November, students need to decide whether their exam strategy will be "strive to do well because I know the genetic code of my stuff and can convey it" or instead simply hope that the moonshine they get from other students among the lockers doesn't give them a whopper of a hangover.

Ask the best students -- the ones with consistently good grades -- what decision they made.  I do all the time.  It's inevitably variations on choice 1.  Maybe they lie to me to make me happy.  But I doubt it.

Thursday
Mar 26, 2015

The Law Professor as Public Citizen: Measuring Public Engagement in Canadian Common Law Schools

Before I knew I would soon be "living the dream", I decided to embark on a data-rich study of "public engagement" by Canada's 600 common law professors.  After seemingly endless hours reviewing law school websites and number crunching in Excel, the results are now going to print, (2015) 36 Windsor Review of Legal and Social Issues, and I have posted the penultimate version to SSRN here.  Readers should be attentive to the important methodological caveats I include in the article.  The results should be considered more for what they tend to rebut than for what they prove.  Nevertheless, my hope is that the article will help, in a small way, to make empiricism fashionable in law school decision-making.

The abstract reads:

This article asks whether there is room for the law professor/public citizen in today’s law schools. It does so by measuring indicators of professor “public engagement” with constituencies outside of academia, such as government, civil society and media. As evidence for its inquiry, the article reviews a comprehensive data set collected from the public web profiles of Canada’s 600 full, associate and assistant common law professors. These data suggest that common law professor public engagement remains part of the tradition of the Canadian legal academy. More than that, there is no support for the view that public engagement diminishes scholarly productivity. Nor is there evidence that mainstream media participation distracts professors from conventional scholarship – in fact, the most media active professors appear to have above average net publication tempos. In terms of institutional implications, public engagement does no harm to law school reputation, and indeed there is a moderate positive correlation between the net level of public engagement represented on law professor web profiles and reputational rankings, such as they are. The connection between media presence and institutional reputation is more complex, and there are data suggesting little positive correlation between reputation and media presence. However, when one potentially anomalous case of a law school with a striking media footprint but a lower reputational score is discounted, there is a moderate positive correlation between a law school’s media presence and reputational rankings. In sum, until a more comprehensive survey is undertaken, this article constitutes the best available evidence that law professors can be (and often are) teacher/scholar/public citizen.