Editor’s note: This article appeared in The Recorder newspaper November 24, 1999. It remains timely because pass rates have continued to decline, while law schools and the State Bar claim they haven’t changed a thing.

Explaining the Inexplicable
The Bar pass rates are down, and no one seems to know why - or care.
On
my first day of law school, I was shocked
when the torts professor mocked the idea
that the law had anything to do with justice.
Now I realize that's as much of a cliche
among lawyers as saying that the bar exam
has nothing to do with law school or with
success in the practice of law.
But that is small comfort to the 3,769
applicants who just found out they failed
the July 1999 exam. If law school doesn't
prepare them for it - and if the skills
it tests bear no relation to the ones
needed to practice - at what, exactly,
have they failed?
Failing bar applicants are left scratching their heads all alone, because most ABA-accredited California law schools aren't asking themselves what the bar tests, how it tests it, or what they can do to help their students pass.
The pass rate has been the lowest in 10 years - for four Bar exams in a row.
They should be asking, because the pass rate on each of the last four exams has been the lowest in 10 years. Overall pass rates for the last two February exams came in at about 40 percent. And just over half the test-takers passed the last two July exams. This year's July pass rate of 51.2 percent is the lowest in 12 years.
Pass rates have fallen between 12 and 20 percentage points at nearly half the California law schools accredited by the American Bar Association. (See table.) Although we don't yet have statistics by school for both exams in 1999, given the pass rate, this discouraging trend is assuredly continuing.
The Bar sees no trend.
"What trend?" says State Bar Director of Admissions Jerome Braun. According to Braun, there's been no change in the difficulty of the exam or the way it is graded. He says the pass rate reflects the preparedness and capability of the people taking the exam.
If the bar exam hasn't been graded any
differently in the past four administrations,
then there must be a huge difference in
the population of applicants taking the
exam. I asked the schools listed here
what they were doing differently. Had
they changed their admission standards?
Grading policies? Faculty? You will not
be shocked to learn that the schools said
nothing much has changed.
Braun and the State Bar say the mix of students taking the exam is different each time, implying that the exam itself is a rock-solid model of consistency and standardization.
"Each group who takes the examination
is unique - unique skills, unique qualifications
and unique preparation," Braun told
The Recorder. "So I would
expect some fluctuation."
But if the outcome changes - not once
but repeatedly, and not a little but dramatically
- it's hard to believe all the players
contributing to that outcome when they
say nothing's changed.
If the pass rate's changed, something's changed.
Or, as Yogi
Berra might say, if something's changed,
something's changed.
Could it be that the Bar is raising
the bar on exam grading?
To answer that question, a little background
on exam grading is in order.
There are two stages in the grading of an exam. First, essays and performance tests (PTs) are given a numerical score on a hundred-point scale.
Scaling isn't doing its job.
Second, the numerical score is adjusted by a process known as "scaling," so that - theoretically - the score reflects the same level of achievement as it did in past years, with a scaled score of 1440 required to pass. That's what's supposed to ensure consistency. But it clearly isn't doing the job.
Braun,
who has directed the admissions office
for 10 years, refuses to even attempt
an explanation of scaling, saying the
process is like the Rule Against Perpetuities
to him. He says the process is "arcane"
and likens it to "rocket science,"
but says scaling brings consistency to
the grading of the written exam. "We
are satisfied that the pass rate represents
the same level of achievement for July
of 1989 to July of 1999 to July 2003,"
he says.
I asked Steven Klein, a psychometrician
and consultant to the Bar on the examination,
to explain the scaling process. What follows is not a complete explanation; it focuses on the part of scaling intended to stabilize the pass rate.
The difficulty of the multiple-choice portion of the exam, or MBE, is kept in check by repeating 60 questions on each test. After comparing success on the 60 repeated questions with success on the remainder of the exam, the MBE examiners adjust the scores up or down. The adjustment compensates for differences in difficulty, so that an MBE score represents the same level of achievement from one exam to the next.
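For readers who want the mechanics, here is a rough sketch of how repeated questions can anchor scores across exams. It is my own simplified illustration, not the MBE examiners' actual formula, and it assumes the common approach of comparing this group's performance on the 60 repeated questions with an earlier group's performance on the same questions; the function name and every number are invented.

    # Simplified, hypothetical sketch of anchor-question equating - not the
    # MBE examiners' actual method. The 60 repeated questions act as a
    # yardstick: if this year's takers do worse on those same questions than
    # an earlier group did, the new exam form is presumed harder, and raw
    # scores are shifted up to compensate (and vice versa).

    def equate_mbe_score(raw_score, anchor_mean_now, anchor_mean_before):
        """Shift a raw MBE score by the gap in average performance on the
        60 repeated questions between this exam and an earlier one."""
        difficulty_shift = anchor_mean_before - anchor_mean_now
        return raw_score + difficulty_shift

    # Hypothetical numbers: this year's group averaged 42 of the 60 repeated
    # questions, an earlier group averaged 45, so the new form looks harder
    # and every raw score is adjusted up by 3 points.
    print(equate_mbe_score(raw_score=120, anchor_mean_now=42.0, anchor_mean_before=45.0))  # 123.0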
Overall pass rates by law school, 1997 vs. 1998

Law School                                1997 (%)   1998 (%)   Drop (pct. points)
California Western School of Law            69.6       49.3         20.3
Thomas Jefferson School of Law              45.5       26.3         19.2
Loyola Law School                           75         60           15
Whittier Law School                         61.7       45           16.7
Pepperdine University School of Law         79         65           14
Univ. of San Diego School of Law            74.1       61.2         12.9
McGeorge School of Law                      71         53.5         17.5
Univ. of San Francisco School of Law        82.6       67.2         15.4
Univ. of Santa Clara School of Law          73         59.7         13.3

Source: General Bar Examination statistics, State Bar of California
Because the adjusted MBE is presumed to hold a constant level of difficulty, the written exam score - essays and PTs - is then matched to the MBE score.
Let's say 3,000 California applicants
get MBE scores at the 70th percentile
or above. Then the 3,000 California applicants
with the highest written scores will be
matched, highest to highest, and second
highest to second highest, with the MBE.
Their written score is given the MBE score
with which it matches.
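To make that matching concrete, here is a bare-bones sketch of the step Klein describes. It is an illustration, not the Bar's actual procedure; the scores are invented, and the real process involves further statistical adjustments that this ignores.

    # Bare-bones, hypothetical sketch of the rank-matching step: written
    # scores and MBE scores are each sorted from best to worst, and each
    # written score inherits the MBE score holding the same rank.

    def match_written_to_mbe(written_scores, mbe_scores):
        """Pair each written score with the MBE score of the same rank:
        highest with highest, second highest with second highest, and so on."""
        written_ranked = sorted(written_scores, reverse=True)
        mbe_ranked = sorted(mbe_scores, reverse=True)
        return list(zip(written_ranked, mbe_ranked))

    # Five hypothetical applicants. The best written score takes on the best
    # MBE value, the second best takes the second best, and so on down.
    for written, scaled in match_written_to_mbe(
            written_scores=[72, 65, 60, 58, 55],
            mbe_scores=[155, 148, 140, 133, 121]):
        print(f"written {written} -> scaled as {scaled}")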
It is theoretically possible, Klein
says, for a person who got all 60s on
her essays to pass the California Bar,
if her written score was among the top
third, say, of all written scores and
thus matched to a passing MBE score. So,
no one can look a Bar applicant in the
eye and say, "This is the grade you
are aiming for." The only guidance
possible is "1440," which is
a result of several complex mathematical
manipulations.
"What you need to pass is a 'passing' score."
It is the
equivalent of saying, "What you need
to pass is a passing score."
How's that for giving guidance to applicants?
To prove that the pass rate has been stable over the past 20 years, Klein suggests looking only at first-time takers from ABA schools, which naturally means looking only at July bar exam results. Then Klein says we must throw out results from two of those exams - July 1984 and July 1994 - which he says were flukes. Even then, the pass rate has varied among first-time ABA applicants by as much as 8 percent.
In sum, the raw scores of the bar are
so arcanely manipulated that it isn't
possible to tell an applicant what grade
he or she needs to pass. That's done in
the name of stabilizing the pass rate.
Proof that it does stabilize the pass
rate can be seen only by looking at the
results achieved by one group of applicants
and only for July exams, with exceptions for high and low results that can't be explained.
ABA schools see no reason to change.
Most ABA-accredited
schools remain convinced that the recent
low pass rates aren't their fault, and
see no reason to change their approach
to preparing students for the exam. Even
schools that admit that the drop in the pass rate deserves scrutiny have not done much
about it. Instructors remain resistant
to interference with their teaching or
testing. And ABA schools refuse to teach
to the bar.
Klein, who has consulted on California's
bar exam for 25 years, says the way the
exam is graded has not changed. He says
the failing of most failing students is
their lack of ability to apply facts to
law.
I believe that. I know the California
Bar Exam is not a trick test whose tricks
can be learned from a commercial purveyor.
The exam tests knowledge, yes, but it
also tests legal reasoning. What exactly
do academics mean when they say they don't
teach to the bar? Could they mean to say
law schools don't consider inculcating
legal reasoning to be their responsibility?
Law schools blame the State Bar.
There is
a disconnect between the law schools and
the State Bar that disserves Bar applicants.
The law schools don't understand, really,
how the bar is graded. And they don't
seem to care. And in blaming the Bar for
the drop in pass rates, they deny their
own responsibility to clearly teach and
test legal reasoning.
But the Bar bears some of the blame,
too, for hiding the ball. Putting aside
the scaling mystique, the Bar does not
disclose how an essay gets a 75. Granted,
it invites law school professors and deans
to one of two sessions where graders calibrate
the grades they give. But it closes the
first calibration session, where the reasoning
of the graders is exposed.
"Teaching to the Bar" means teaching legal reasoning.
It is there
that the law professors would discover
that "teaching to the bar" is exactly
what law schools are supposed to be doing.
Vivian Dempsey
is a faculty member at San Francisco Law
School, and has been a grader of the California
Bar Exam. She has taught The Writing Edge
Bar Review Course since 1985.