Systems Check: Any Belief Gaps? (Part 2)

Inspired by a recent Twitter conversation about the ‘belief gap’ in education (educators’ lack of belief in certain students’ abilities to learn, that is), I spent a post last week diving a bit deeper into the idea. There, I basically drafted behind one of the discussion’s key posts and offered the following:

  1. Belief gap is more than an individual-teacher-level shortcoming operating in many individual classrooms; it can also be expressed at education’s system levels.
  2. These systemic expressions of belief gap — in particular via flawed, belief-gap-driven educational practices — lead to more dire results for more students than many might realize.
  3. These systemic expressions of belief gap create blind spots for both practitioners and critics, and these blind spots ultimately force the improvement conversation into stalemate.

I did not include, however, much in the way of examples or illustrations to bring these rather abstract ideas into the concrete. Here, then, is a ‘Part 2’ to do just that.


Let’s start with the term ‘flawed, belief-gap-driven practices’. By this, I mean any educational philosophy, curriculum, program, or practice that begins from the idea that, ‘done any other way, [INSERT STATISTICALLY LAGGING DEMOGRAPHIC GROUP HERE] kids won’t succeed’, but that actually works in reverse of its intent. In my book, I dedicate all kinds of space to just these types of practices. Here are some quick examples, however, with a key problem of each practice noted in parentheses:

  • Slashing science and social studies instructional time to create more time for reading instruction (this example, from Kansas in 2012, will look familiar to many teachers; something similar was filed just yesterday in Massachusetts; look around a bit yourself and you’ll see it’s happening a lot), then filling that time with reading-skills-emphasizing, content-neutral instructional approaches like Balanced Literacy. (A problem because content knowledge is the key to effective reading comprehension.)
  • To ‘meet all students where they are’, differentiating instruction as diversely as possible to accommodate each student’s academic ability level, preferred learning style, and/or interests. (A problem because differentiation, even done by design, does not guarantee that students of widely differing abilities will successfully access and master the same content. Though differentiating across heterogeneously grouped classrooms was meant to address ability tracking, it really just produces a lot of tracking in plain sight. Another problem? ‘Learning styles’ isn’t actually a thing, which makes the time we spend accounting for them quite wasteful.)
  • Striving for student engagement at the expense of academic rigor. (A problem because keeping kids interested in a task does not necessarily mean they’re taking away the academic pieces that will enable readiness for future tasks, academic or non-academic. ‘Building a love for reading’ by reading only YA fiction or graphic novels, for example, likely won’t position a student well for the literary-structural demands they’ll face in their college-required lit classes. Though it’s fair enough to answer that not all kids are college bound, I’m skeptical that a YA-forged love of reading will be enough to power many kids, as adults, through the terms and conditions of their cellphone contracts and/or health insurance documents. Plain and simple: academic tasks that stretch students do matter to their growth, regardless of where they’re headed after K-12.)

Blowing right past the problematic considerations in each, education bets on practices like these all the time, rolling them, packed in lots of resource materials and associated professional development, into schools and districts, even into entire states. And in turn, teachers’ implementation of these practices often becomes required by the criteria of teacher-evaluation systems.

And this is where belief gap becomes more than something existing in the minds and subsequent actions of individual teachers: once adopted by those teachers’ employing organizations as professionally expected practice, the belief gap becomes truly systemic.

So what kinds of effects do these flawed, belief-gap-driven practices have on student outcomes? It varies everywhere, of course. Looking at Minneapolis Public Schools can be instructive, however: it is a large urban district regularly under fire for its demographic achievement gaps (and one whose data, full disclosure, I once spent a lot of time studying as an employee).

In the late 2000s, and in reaction to years of troubling results in math, the district rolled in a number of changes. To give kids what it considered a more appropriate and effective mathematical foundation, it committed to a constructivist math-instructional approach* based on the Investigations in Number, Data and Space curriculum series in the elementary grades; and to improve uniformity and adherence to standards, it constructed a comprehensive instructional framework (scope and sequence, lesson resources, benchmark assessments, and so on) it dubbed Focused Instruction.

*NOTE: The way-too-short version of ‘constructivist learning theory’ is that individuals each construct their own understanding according to their own experiences, knowledge, and mental frameworks. Constructivist math instruction, then, follows this theory by not prescribing ways of reaching mathematical answers; rather, it teaches a range of strategies for solving problems and leaves students to choose the ones that work best for them. It’s a way of teaching math that has been widely criticized, even touching off what some have called the ‘Math Wars’. For an oft-cited critique of this approach, see Kirschner, Sweller, and Clark’s ‘Why Minimal Guidance During Instruction Does Not Work’.

The graphic below breaks down MPS’s performance on Minnesota’s grade 11 math exam from 2006 to 2008 (chosen from lots of available indices because it occurs closest to students’ post-K-12 realities), and it shows that a change of some kind was most definitely in order: overall proficiency in the district hovered around the 20% mark, with large disparities between subgroups’ percents-proficient. Clearly, whatever had gone on prior to 2008 was not doing a reliable or equitable job of producing students who were post-K-12-math-ready.

[Figure: MPS grade 11 math, percent proficient by subgroup, 2006-2008]

After 2008, then, came the system changes: the constructivist instructional approach, a ready-or-not requirement of Algebra in 8th grade to ‘raise the floor’, the centralized schedule of units and assessments, lots of associated professional development for teachers, etc., etc. — all based on a hunch, apparently, that such methods would produce more students genuinely post-K-12-ready in math, reducing the large gaps between historically privileged and non-privileged subgroups.

Looking at the state test results since that time, it would appear that at least a few pieces of this hunch are off.

See the graphic below, for instance (from the Minnesota Department of Education’s Data Center). The graphs show the five-year trends (2010-2014) of Minneapolis Public’s black (on the left) and white (on the right) students on the state’s grade 11 math exam. The green line in each represents statewide results for the same demographic.

[Figure: MPS vs. statewide grade 11 math proficiency trends, 2010-2014: black students (left), white students (right)]

As you can see: six years into implementation of the belief-gap-driven practices, both the black and white demographics bounce up and down, but each shows a lower percentage of proficient students in 2014 than it did in 2010. As the 2014 students have had the most time learning within the so-called enhanced system, and as the system’s teachers have had more time to master its changes, it should be rather shocking to see them recording the lowest percent-proficient in the trend lines provided.
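For readers who want to recreate these side-by-side trend views themselves, here is a minimal sketch of the comparison in Python. It assumes you’ve exported (or hand-entered) the percent-proficient figures from the MDE Data Center into a simple CSV; the filename and the column names (year, group, district_pct, state_pct) are hypothetical stand-ins, not the actual export format.

```python
# Sketch: district vs. statewide percent-proficient trends for two subgroups.
# Assumes a CSV with hypothetical columns: year, group, district_pct, state_pct
import pandas as pd
import matplotlib.pyplot as plt

df = pd.read_csv("mps_gr11_math.csv")  # hypothetical filename

fig, axes = plt.subplots(1, 2, figsize=(10, 4), sharey=True)
for ax, group in zip(axes, ["Black", "White"]):
    sub = df[df["group"] == group].sort_values("year")
    ax.plot(sub["year"], sub["district_pct"], marker="o", label="MPS")
    ax.plot(sub["year"], sub["state_pct"], color="green", label="Statewide")
    ax.set_title(f"{group} students, grade 11 math")
    ax.set_xlabel("Year")
    ax.set_ylabel("Percent proficient")
    ax.legend()

plt.tight_layout()
plt.show()
```

The same structure works for any pair of subgroups, including the ethnic and free/reduced-lunch comparisons shown below.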

Let’s look at another large ethnic demographic historically struggling in Minneapolis Public, just to see how the hunch is playing out elsewhere. Below: on the left, math test results from the last five years for the district’s Hispanic 11th graders; on the right, once again, the district’s white 11th graders. Here you’ll also see a pattern, if not as steep a one, of steady decline through the past four years.

[Figure: MPS grade 11 math proficiency trends, 2010-2014: Hispanic students (left), white students (right)]

And how about income differences? What if we take ethnicity out of it altogether and compare the percent-proficient of all the district’s white 11th graders to the percent-proficient of the district’s white 11th graders who qualify for free/reduced lunch? Here they are, with the free/reduced-lunch group again on the left. Again, more steady decline.

[Figure: MPS grade 11 math proficiency trends, 2010-2014: white free/reduced-lunch students (left), all white students (right)]

And if that all looks rough, you really don’t want to see how Minneapolis Public’s ‘Balanced Literacy’-based reading instruction is setting kids up to perform on the now Common Core-aligned reading tests. (If interested, the link to the state data site is above. Reach out if you have questions about how to navigate.)

Now, of course I know there’s a lot of information these data sets leave out. I focused on only one test, and I didn’t follow cohort groups to watch how they’re doing over time (a very good monitor of program effectiveness, by the way), factor in all the kids who have left the district or opted out of testing (telling data points in themselves), or break things down in all kinds of other ways. Very simply, this piece has to wrap up somewhere. (Believe me, though: I have done many more comprehensive dives in other situations, and the views aren’t usually much better.)
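For the curious, that cohort-following idea is easy to sketch. The snippet below assumes test records sit in a pandas DataFrame with hypothetical columns (year, grade, group, pct_proficient); the point is simply that you track the same entering class as it moves up a grade each year, rather than comparing different students at the same grade.

```python
# Sketch: follow a single cohort across years instead of comparing
# different students at the same grade each year.
# Assumes a DataFrame with hypothetical columns: year, grade, group, pct_proficient
import pandas as pd

def cohort_trend(df: pd.DataFrame, start_year: int, start_grade: int, group: str) -> pd.DataFrame:
    """Percent-proficient trend for one subgroup's cohort.

    The cohort entering start_grade in start_year is assumed to reach
    grade (start_grade + k) in year (start_year + k).
    """
    rows = []
    for k in range(13 - start_grade):  # follow the cohort through grade 12
        year, grade = start_year + k, start_grade + k
        match = df[(df["year"] == year) & (df["grade"] == grade) & (df["group"] == group)]
        if not match.empty:
            rows.append({"year": year, "grade": grade,
                         "pct_proficient": match["pct_proficient"].iloc[0]})
    return pd.DataFrame(rows)

# Example: the class that was in grade 8 in 2010 reaches grade 11 in 2013.
# trend = cohort_trend(results, start_year=2010, start_grade=8, group="Black")
```

A flat or rising line for a cohort would be a much stronger sign a program is working than any single grade’s year-over-year wobble.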

In all, the data shown here should at least reinforce the idea that, despite many systemic changes in this one district, the percentages of all students proficient on this particular exam — not just those on the wrong side of belief and achievement gaps — are declining over time. Though system leaders often chalk this decline up to flaws at the implementation level, blaming teachers when their reforms don’t generate improvements, it’s high time that we look at the changes the systems require and ask whether they’re the right ones for our kids.
