The American institution of public schooling is considered one of our greatest strengths–but why? What is the purpose of schools, and why do we force our children to attend them? The response you’ll most often hear is that school prepares our students for the task they must all ultimately undertake: becoming successful, responsible adults–that’s what we usually mean by the term education. Yet there is an odd contradiction implicit in this attitude: if the purpose of education is to prepare children for the “real world”, then why do we need schools? It’s generally understood that the schools themselves are not the “real world”, so they must necessarily be artificial worlds: invented realities designed to shelter our children from the responsibilities of adulthood until they are “ready” for them. It is no great secret that experiential learning is by far the most effective kind, yet the fierce irony of preparing our children for the “real world” by sequestering them from it entirely apparently goes unnoticed. This idea of preparing our children to be adults by treating them like anything but–what I call the “real world” fallacy–is a perfect encapsulation of everything that is wrong with our current way of thinking about school, education, and children in general.
Politicians, parents, and teachers all realize that something is amiss with the current system of schooling, but nobody seems to agree as to what that something is. Without knowing exactly what’s wrong or how we might fix it, one possible solution would be to make our schools competitive, starting with our teachers. The idea is that natural market forces would encourage teachers and schools to provide the best value possible for their students, so that they could be rewarded proportionately. Better paradigms would naturally take over as everyone rushed to emulate the most successful (and wealthiest) teachers, and the schools that employed them. Unfortunately, this scenario is at present a distant fantasy. The problem is not just that teachers are underpaid, or that bad teachers are difficult to fire, or that teachers and schools have almost no control over what or how they teach, or that parents have little choice as to which school their child attends. These may be contributing factors to greater or lesser degrees, but they are all just symptoms of a single common problem: with the possible exception of college professors, teachers are not hired, fired or paid based on actual performance.
I’m not the first to say this, of course. There have been significant movements for decades to increase so-called “teacher accountability”, but all have ultimately failed because they all start from within the same broken system. One of the most obvious problems with trying to increase teacher accountability from within the current system is that most students have any given teacher for only one subject, in a single class, for less than a year, so that teacher’s individual contribution to their overall success is often negligible. But even this is merely reflective of the fundamental problem shared by all such “reform” movements, regardless of method: every one of them is based on the judgements and decisions of adults, not students. The inevitable result is that no matter the technique, the measure of teacher and student performance ends up being simply whatever adults say it is.
In any other kind of market, this attitude would immediately be apparent for what it is: statist favoritism. A politician who proposed that we determine how much money car manufacturers got based not on how many cars they sold, but on which kinds of cars they made, would be run out of town on a rail–the very idea offends our democratic sensibilities. It seems obvious that they would be unfairly granting themselves the power to give the money to whomever they liked, regardless of what the people actually buying and driving the cars think, and it seems equally obvious that this is profoundly unjust. Yet because as a culture we still believe in the tenet “father knows best”–in the literal if no longer metaphorical sense–the idea of students choosing their own teachers and schools seems positively ridiculous.
Like most statist regimes, our education system is a self-fulfilling prophecy: tests written by adults are used to measure knowledge of subjects chosen by adults to judge the efficacy of teachers hired by adults following curricula and teaching methods designed by adults in order to produce an “ideal citizen” conceived of by adults, and all without the consultation or even consent of a single actual student. It’s a system designed not only to undercut the agency of its victims, but to make them complicit: students are forced into submission and bombarded with dogma to convince them it’s all “for their own good”, leading them to perpetuate the practice when they’re adults in order to resolve their cognitive dissonance. What’s worse, many adults quite enjoy their privilege and authority over children and see it as just reward for all the years of schooling and taking orders they themselves had to suffer through. Like all oppressors, these types of adults in particular are rightly afraid of what might happen if students were given free rein. If we allow students to learn whatever they want, for example, what’s stopping them from doing whatever they want? Or thinking whatever they want? They might even start trying to decide what to believe, or how they ought to act, all by themselves! Where would that leave us?
The idea that students are even capable of figuring out what’s best for themselves, without adults forcing their hand, is considered an oddly radical one. Indeed, the general sentiment seems to be that if left to their own devices children will do exactly what is worst for their own interests. Without adult “guidance”, the thinking goes, children’s natural tendencies would lead them to devolve into total chaos. Speak of a world where children do “whatever they want” and most people conjure mental images of surreal dystopias full of children eating themselves sick with sweets, robbing toy stores and candy shops blind, and destroying or defacing everything they can reach–eventually losing all traces of “civilized” behavior including rule of law, rationality, and even language. (The parallels to apologist rationalizations from other statist regimes throughout history are, again, striking.) Unfortunately, all the evidence we have stands in direct contradiction to this line of thinking. Children may be less experienced and capable than adults, but that does not make them any less intelligent. Indeed, anthropological evidence as well as modern-day examples such as the Sudbury Valley School show that when left to their own devices–when allowed to choose not only what they spend their time doing, but also who they spend it with, including their teachers–children prosper and succeed. They are capable of self-governance, self-regulation, and self-education. Not only do they naturally learn all the skills they need to succeed in the “real world”, they have a ton of fun doing it! What’s more, they continue learning and having fun throughout the rest of their adult lives.
Adults having fun? What nonsense! Our culture has a deep-seated resistance to the idea that fun, enjoyable things can have intrinsic value. Indeed, the general attitude takes the exact opposite stance: virtuous things are implicitly understood to be unpleasant, and things you enjoy doing are implicitly understood to be bad. We feel much more legitimate about doing “what’s good for us” when it feels like work, and we expect our children to learn to feel the same. But what exactly is “work”, anyway? The ready answer you’ll receive from any member of our society is that work is the opposite of play. Work is boring and stressful, while play is interesting and enjoyable. Work is necessary and mature, while play is frivolous and childish. Work is what adults do because they have to, and play is what children do because they can.
In fact, it seems that one of the very first lessons school teaches us is the difference between children and adults: namely, that children have to be forced to be miserable whereas adults have learned to inflict misery upon themselves. We are taught very early on that work is unpleasant but necessary, and that school’s purpose is to prepare children for work by putting them through a diluted form of the same sorts of tedium and drudgery. The great tragedy of this attitude is that, although it feels like an unshakable foundation of human existence, it’s actually fairly recent, starting only about ten thousand years ago in some parts of the world, and even more recently in others. Ten thousand years doesn’t sound very recent, but humans have been around in more or less their current form for over two hundred thousand years, and during the vast majority of that time we lived as hunter-gatherers who made no distinction between “play” and “work” at all. Nor should we! The so-called “play” of all animals, including humans, is simply practice for adulthood–that is, education. Kittens play by creeping, stalking and pouncing. Well, what do adult cats do to “earn their living”? Why, they creep, stalk and pounce! Puppies play by running and fighting–and what do adult wolves do? They run and fight! Human children play by being curious, creative, and social–and for hundreds of thousands of years that was the exact job description of an adult human. What on earth happened?
Agriculture happened. The introduction of agriculture into our societies allowed us to dramatically increase our populations and facilitated the rise of art, written language, large-scale communities such as towns and cities, specialization of labor, and ultimately industry–but it came with a steep cost. Health, height and happiness all went down, infant mortality and disease went up, and tightly-knit, egalitarian tribal societies gave way to sharply stratified, authoritarian patriarchies. Curiosity, creativity and complex social interactions were no longer associated with success–instead, backbreaking physical labor and deference to authority were the new ways to get ahead. Even then, “getting ahead” was measured less by personal wealth or happiness (which were determined almost exclusively by birthright) and more by the extent to which you avoided hanging or starving to death. Within this system, as before, children were expected to learn how to become adults by being exposed to the adult world–only now, instead of consisting of curiosity and resourcefulness, the adult world consisted of forced labor and keeping your head down. Most adults were little better than slaves, resulting in most children being treated as the slaves of slaves. This is the conception of education and the model of schooling that we’ve inherited, and it stands unchanged in its fundamentals to this day.
Thankfully, the industrial revolution leveled the playing field somewhat with the rise of the middle class, and the subsequent information age is well on its way to burying these feudal social structures for good. Yet the educational and cultural paradigms they fostered persist, hundreds of years later. We need to get rid of the implicit assumption that children must be tamed, and that it is the purpose of education to do so. The information age is rapidly creating an environment where our inborn tendencies towards curiosity, creativity, and socialization are once again beneficial traits–yet our system of education actively suppresses those very qualities. Our public schools are becoming a liability rather than a strength, and the reason is that the “real world” is no longer anything like our schools. The sooner we realize this, and the sooner we put our children’s lives and educations back into their own hands, the better off our country and all future generations will be.