
Posted in General

There isn't a teacher shortage. Not really. But there is a shortage of districts and states that are successfully attracting people to teaching careers. If I can't get a dealer to sell me a Lexus for $1.98, that does not mean there is an automobile shortage. The "teacher shortage" is really a shortage of $1.98 teachers.

Something is wrong. Not only do we have a drastic drop in the number of proto-teachers in the pipeline, but the profile of the teacher pool is off. The teacher pool is overwhelmingly female and white. Males and minorities are not represented in the teaching force in numbers that remotely resemble the demographics of our student population.

So how do we get and keep the teachers that we need?

After all, it ought to be easy. No other profession gets to pitch itself to every single young person who could possibly pursue it. So what are we missing?

To understand how to recruit teachers, we just have to remember how the teachers we already have found their way to the classroom. And the most important thing to remember is how they start.

It's not a deep, complicated thing. Almost every teacher in a classroom started out as a student in a classroom, and that student had two simple thoughts--

1) I kind of like it here in school.

2) I can see myself doing that teaching thing.

That's it. If we get a student to harbor those two thoughts in his teenaged cranium, we have successfully created the seed from which a future teacher could grow. But looking at those two thoughts can also tell us where our edugardening has gone awry.

Kind of like it here.

No excuses. Speak when you're spoken to. School to prison pipeline. Assumption that black and brown students are a problem. Crumbling buildings. Lack of even basic supplies like books and paper. Curriculum that is centered on test prep. 

None of these are going to make a student feel as if school is just like a second home. And schools that carry the greatest weight of discrimination and mistreatment are the greatest anti-recruitment. If you have made a student feel unwanted, unwelcome and unsupported in school at age fifteen, why would that same student consider returning to school at age twenty-two?

I can see myself doing this.

The most fundamental part of this is the modeling of staff. It's hard (not impossible, but damn hard) to imagine myself doing a job if I can't see anybody like myself doing the job.

Beyond that, students will be influenced by what they think the job is, the job that they see teachers doing. Are male teachers of color responsible for breaking up all the fights in the building? Do coaches get to follow a different set of rules than other staff? Do lady teachers have to keep their heads down and never talk back to a male boss? Do some teachers spend half their time doing drill and drill and worksheets and dull, boring drill? Any such unwritten rules are noted by students, and factor into how appealing the job might be.

Do students see that teachers struggle financially, holding down extra jobs to make ends meet? Do students see their teachers treated with respect? Do students see teachers supported with resources and materials, or do they have to buy supplies out of their own pockets? Do they see the job turned into a low pay, low autonomy, de-professionalized drudge? These factors also affect whether students can see themselves living the teaching life.

The Path

Of course, there's more care required for these early seeds to reach full flower. College teacher-preparation programs may support the fledgling teacher or throw more obstacles in the path (I often wonder how many male teachers of color we lose to repeated "Well, what the hell are you doing here?"). Then we get to the luck of the draw with the match-up for student teaching, and finally, the problem of individual district hiring practices.

The Circle

And then we arrive back in the classroom, where the person who was once a student may have to withstand one more assault on their desire to teach. And we don't have time to get into all of that yet again.

Retention is a huge problem, easily as big as recruitment, but here's the irony-- the recruitment problem and the retention problem are the same problem, because the best way to recruit the teachers of tomorrow is by giving support and respect to the teachers of today. You cannot dump all over today's teachers and expect students to say, "Oh, yeah, I'd love to jump into that pool of pooh." You cannot reduce teaching to mindless meat widget drudgery and expect students to say, "Yes! Someday I want to be a soul-sucked functionary, too."

Of course, there are folks out there for whom the death of the teaching profession is a goal, not a problem. But for the rest of us, the path is relatively simple and clear. Elevate and support the teaching profession, and the people who look at it in action every day will want to join in. If you want good seeds, you have to tend to the plants that are already growing.


Posted in Education Policy

[image: steam engine]

Personalized Learning is getting the hard sell these days. It's marketable for a number of reasons, not the least of which is that nobody really knows what Personalized Learning is.

What it suggests is something appealing, like Individualized Education Programs for everyone. Personalized Learning fans like to trot out exemplars like Chugach, Alaska, a remote, tiny town where the school system created a program in which each student had her own personal path to graduation, with her own projects, content, and assessment.

While there are plenty of problems with the Chugach thing, it's a good example of what most of us think Personalized Learning would mean. An educational program custom designed for each individual learner. Custom designed like a meal at a restaurant where you can choose the protein and spices and sauces and dishes and means of cooking and order exactly what you are hungry for.

But as Personalized Learning rolls out, that's not what it's like at all.

From the College Board's personalized SAT prep courtesy of Khan Academy through bold plans like this IBM personalized education pitch, what's actually being rolled out is something else entirely. This is just path-switching.

The Brand X that we're supposed to be escaping, the view of education that Personalized Learning is supposed to alter, the toxin for which Personalized Learning is the alleged antidote is an education model in which all students get on the same car of the same train and ride the same tracks to the same destination at the same time. That's not what's actually going on in public schools these days, but let's set that aside for the moment.

Real personalized learning would tear up the tracks, park the train, offer every student a good pair of hiking shoes or maybe a four-wheeler, maybe even a hoverboard, plus a map of the territory (probably in the form of an actual teacher), then let the student pick a destination and a path and manner of traveling.

But techno-personalized learning keeps the track and the train. In the most basic version, we keep one train and one track and the "personalization" is that students get on at different stations. Maybe they occasionally get to catch a helicopter that zips them ahead a couple of stops. (Think the old SRA reading program.)

Pat completes the first computer exercise in the module. An algorithm (cheerfully mis-identified as "artificial intelligence" because that sounds so super-cool) checks Pat's answers and the particular configuration of incorrect answers, by which the algorithm assigns the next exercise to Pat. Rinse and repeat. Pat is still on the train, but now there's a small web of tracks that he must travel. But Pat is still a passenger on this train, choosing no part of the journey, the destination, nor the means of travel.
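If you want to see how un-magical that path-switching is, here's a minimal sketch in Python-- my own illustration, with made-up module names and rules, not any vendor's actual code. The whole "intelligent" trick is a branching table that maps a pattern of wrong answers to the next stop on the track:

```python
# A hypothetical sketch of the "small web of tracks": exercise sequencing
# driven by the pattern of missed questions. All module names and rules
# here are invented for illustration.

# Each rule maps a set of missed question numbers to the next module.
NEXT_MODULE = {
    frozenset():        "module_2",          # no errors: ride ahead to the next stop
    frozenset({3}):     "fractions_review",  # missed #3: detour to a review siding
    frozenset({3, 7}):  "module_1_retry",    # missed #3 and #7: back around the loop
}

def assign_next(missed_questions):
    """Pick Pat's next exercise from his pattern of wrong answers."""
    key = frozenset(missed_questions)
    # No matching rule? Default to repeating the module.
    return NEXT_MODULE.get(key, "module_1_retry")

print(assign_next([]))      # module_2
print(assign_next([3]))     # fractions_review
print(assign_next([3, 7]))  # module_1_retry
```

Notice that Pat appears nowhere in that table as a decision-maker. Every possible destination was laid down before he ever boarded.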

That is in fact one of the key ways to identify whether you've got actual personalized learning or not-- how prominent is the voice of the student? If the pitch is "Our super-duper AI will analyze student performance and assign an appropriately awesome module to enhance learning swellness," that is not personalized learning at all, but Algorithmically Mediated Lessons (h/t Bill Fitzgerald).

That's the bait and switch to watch out for. The promise is hugely flexible and open-ended, even project-based, learning that is adapted to every individual learner. The delivery more often is the chance to pay big bucks for what is essentially a proprietary library of exercises managed by a proprietary software algorithm that doles the assignments out based on a battery of pre-made standardized tests and quizzes. That is not personalized learning. You cannot have personalized learning without persons. That includes persons making the decisions about what the students do. That includes using knowledge of the person who is the student, and not handing out materials created by someone who has never met the students (and created the exercises before the student ever stepped into the classroom).

That impersonal education is not automatically terrible, and often has a place in education-- but it's not personalized learning.

And it's worth noting that the one train, one track model was abandoned by public education ages ago. Differentiated instruction, IEPs, authentic assessment, project-based learning, and a thousand other methods have been tried and adopted by classroom teachers who routinely work to meet students where they are and craft instruction to suit their personal needs. That's one of the great ironies of the bait and switch, the algorithmically mediated lessons-- in the majority of US classrooms, when it comes to personalization, Faux Personalized Learning is actually a step backwards. The personalized bait-and-switch is about getting teachers to trade in their shiny hoverboards for rusty steam engines.


Posted in Education Technology

The cliché is a fifty-year-old asking some ten-year-old student for help in making the computer work. Having trouble working with your device or your software? Just grab one of those digital natives to handle it for you!

Well, not so fast. Here's Jenny Abamu at Edsurge saying what I've been arguing for over a decade-- our digital natives are hugely naïve about technology.

With the adoption of any new technology, there's a curve. In the 1910s, if you owned an automobile, you were also a reasonably savvy mechanic who knew how to work on his own machine. But in the century since, cars have become so advanced that fewer and fewer car owners can actually repair their own vehicles.

It's a simple fact of marketing-- early adopters may be willing to know the nuts and bolts of the tech, but to expand my market, I have to be able to say to the non-savvy buyer, "Don't worry-- the tech will take care of everything for you." I have to make the tech user-friendly, and the friendlier it is, the less my customers need to know. The goal is to move from a product that only an aficionado can handle to a product that any dope can use. We are well into Any Dope territory with computer tech (spoiler alert: Linux is not the PC wave of the future).

Fifteen to twenty years ago, I could count on a few students in each class who could code. I used student helpers to build the school website from scratch. But nowadays I have to explain to my students how to save a photo that they like online, or how to use a Google doc. And students at the New Media Consortium Summer Conference echo that:

“Something you can do to prep your students for college is to have one day where you host a workshop on using Google Docs,” suggested Alejandra Cervantes, a junior at UCLA, in response to a question from an educator about the best way to support high school students heading to college. “Something simple like that can be pretty instrumental in helping them succeed in classes in the future.”

And yes-- that quote and the article it's from raise their own set of issues. Because Google is working hard to inject themselves into the ed world, and they're not doing it just to be great humanitarians, so pieces like the Edsurge piece are meant to keep banging the drum that your student must know how to use Brand X Software or she'll fail at life.

And yet there is all this cool stuff to use, and my students don't have a clue. They know Snapchat, Instagram, a little Twitter, and whatever the hot app of the week is (developers who think they can come up with an educational app that students will use enthusiastically for a year, starting months from now-- those developers have a naivete problem of their own). There are pieces of software that let them collaborate on projects-- they don't know how to use any of them. There are tools for including art and images and videos in one project and they don't know how to use any of them. And why do we keep reading stories about somebody who lost a job or a college spot because they posted something stupid online? Because the vast majority of my students have no idea how the interwebs actually work.

In some cases it is tunnel vision-- they just use what they use, which is what they picked up from friends or the pre-loaded software on their device. In many cases, it's lack of access. A Pew Research Report from 2015 says that 17.5% of households with children have no internet access. That does not seem out of line with my own student population (though virtually all of my students have their own smartphones).

I have beaten my head against this cyberwall for years. I was hugely excited about the possibilities of web-based projects in which students could take 15 or 20 different works of literature and show a web of relationships between them-- far more complex stuff than could be managed in a traditional paper. But when I gave them the assignment, what I got was a traditional linear paper with each paragraph on its own page, linked so that the reader could go forward or back a paragraph.

I am not a thoughtless technophile, and I never implement tech just to do it. If it's not useful, I don't care. Where it is useful (I have replaced the traditional English teacher keep-em-writing practice of a paper journal with mandatory blogging for my students), I embrace it. But I have had to train and explore and learn myself first, because my digital natives are like people who have grown up in a big metropolitan city but only know their way around their own two-block neighborhood and don't even know the actual names of the streets there.

If you want to get your students into the technofuture, you are going to have to lead them there, just like you have to with Shakespeare and critical realism and new vocabulary words. That's the implication of this kind of article for teachers. The implications for people who think giving standardized tests on over-the-net software is a good idea-- well, that's another discussion (spoiler alert: it's a bad idea).


Posted in General

This is a line often included in one of those self-reported stories that people feel compelled to share when they discover they are talking to an English teacher. It's not quite as popular as those standards "I Always Hated English Class in High School" or "I Hate To Read" or the super-popular "I Guess I'll Have To Watch My Grammar When I'm Around You." Just today, someone once again summed up her experience by citing what someone, years ago, told her: "You'll Never Be a Writer."

"You'll Never Be a Writer" is different story because, first, it has nothing to do with feelings you had when you were younger, which are perhaps something adult you might want to keep to yourself (nobody of my age need proudly share that classic tale "The Year I Memorized the Shape of Farrah Fawcett's Right Breast in a Red Swimsuit"), nor does the story "You'll Never Be a Writer" include a thinly veiled prediction/criticism of someone's poor social behavior

"You'll Never Be a Writer" is a sad story of crushed dreams and truncated aspirations. But it's also wrong. Sometimes it's just meant as conversational filler, so I would hate to be that guy and correct someone who's just trying to make pleasantries (on the other hand, I am an English teacher and it's possible that I take great joy in correcting others at inappropriate moments). But here's the basic drift of what I have to say about this.

Now, YNBAW is sometimes a pronouncement on economic realities. "Writing," folks say, "is not a real job with which you can support a single grown human, let alone a whole family of them." I always assumed that I would write when I grew up, and I always assumed that I would never make enough to support myself, which was fine because I wanted to teach. I didn't care that I would never make serious money (anyone who wants to prove me wrong by giving me a lucrative book deal or syndication gig is welcome to contact me here). "Writing's very nice and all," many a student and parent have said to me, "but you can't really make a living at it, can you?"

Well, yes and no. Writing the Great American Novel is not terribly lucrative, and creating the next Highly Profitable Property doesn't necessarily require great writing chops (looking at you, Stephenie Meyer and Dan Brown).

But if your goal is not to become a rich and famous fiction writer, other writing jobs exist. Virtually every specialized field in the world is primarily populated by people who know the field, but cannot communicate effectively about it. I have former students who became technical writers, nature writers, and sports writers. Being able to write is important to the writing life, but having a topic that you are knowledgeable and passionate about-- that's huge, too. When a student says, "Well, I'd really like to be a writer, but I really want to work in the widget industry, too," that student's solution is right in front of her.

"You'll Never Be a Writer" is wrong for other reasons as well, the most notable of which is that we are living in a text-based world. Thanks to the internet, we communicate more than ever via the written (typed) word. In both our work and personal worlds, it's now hugely important to be able to say just what you mean, and equally important to be able to read hat others write critically and carefully.

In the years ahead, you will write reports for your job. You will communicate with friends and family via text. You may very well court and couple with the use of text. If you enter politics, you will have to explain yourself through text. If you are an activist for a cause, some of your communication will be through text. Whatever it is you want to say, and whatever audience you want to say it to, you are likely to write it.

It may not bring fortune or fame. But it remains the best way to communicate and store ideas and feelings across space and time. Much of human history has been spent searching for ways to record, transmit and store our various languages; digitization represents a new step forward in that process, meaning that the composing and arranging of language has become even more important.

Regardless of what someone told you in some misguided attempt to crush your dreams or slap you upside the head with a cold, fishy slab of reality, they were wrong. Good or bad, inspired or flat, enthusiastic or grudging, because you are alive today, you are a writer. You will always be a writer. Make the best of it you can, because you will always be a writer. Search for your voice and find your way, because like it or not--

You will always be a writer.


Posted in Education Technology

Facebook absolutely insists on showing me "top stories." Every time I open the Facebook page, I have to manually switch back to "most recent," because even though the Facebook Artificial Smartitude Software thinks it knows what I most want to see, it can't figure out that I want to see the "most recent" feed. Mostly because the Facebook software is consistently wrong about what I will consider Top News.

Meanwhile, my Outlook mail software has decided that I should now have the option of Focused, an email view that ranks my emails according to... well, that's not clear, but it seems to think it is "helping" me. It is not. The Artificial Smartitude Software seems to work roughly as well as rolling dice to decide the ranking of each e-mail. This is not helpful.

I pay attention to these sorts of features because we can't afford to ignore new advances in artificial intelligence-- a whole lot of people think that AI is the future of education, that computerized artificial intelligence will do a super-duper job directing the education of tiny humans, eclipsing the lame performance of old-school meat-based biological intelligence.

Take, for instance, this recent profile in Smithsonian, which is basically a puff piece to promote a meat-based biological intelligence unit named Joseph Qualls. Now-Dr. Qualls (because getting meat-based biological intelligence degrees is apparently not a waste of time just yet) started his AI business back when he was a lonely BS just out of college, and he has grown the business into... well, I'm not sure, but apparently he used AI to help train soldiers in Afghanistan, among other things.

To his credit, Qualls in his interview correctly notes one of the hugest issues of AI in education or anywhere else-- What if the AI's wrong? Yes, that's a big question. It's an "Other than that, how did you like the play, Mrs. Lincoln?" question. It's such a big question that Qualls notes that much AI research is not driven by academics, but by lawyers who want to know how the decisions are made so they can avoid lawsuits. So, hey, it's super-encouraging to know that lawyers are so involved in developing AI. Yikes.

Still, Qualls sees this rather huge question as just a bump in the road, particularly for education.

With education, what’s going to happen, you’re still going to have monitoring. You’re going to have teachers who will be monitoring data. They’ll become more data scientists who understand the AI and can evaluate the data about how students are learning.

You’re going to need someone who’s an expert watching the data and watching the student. There will need to be a human in the loop for some time, maybe for at least 20 years. But I could be completely wrong. Technology moves so fast these days.

So neither the sage on the stage nor the guide on the side, but more of a stalker in the closet, watching the data run across the screen while also keeping an eye on the students, and checking everyone's work in the process. But only for the next couple of decades or so; after that, we'll be able to get the meat widgets completely out of education. College freshmen take note-- it's not too late to change your major to something other than education.

Where Qualls' confidence comes from is unclear, since a few paragraphs earlier, he said this:

One of the great engineering challenges now is reverse engineering the human brain. You get in and then you see just how complex the brain is. As engineers, when we look at the mechanics of it, we start to realize that there is no AI system that even comes close to the human brain and what it can do.

We’re looking at the human brain and asking why humans make the decisions they do to see if that can help us understand why AI makes a decision based on a probability matrix. And we’re still no closer.

I took my first computer programming course in 1978; our professor was exceedingly clear on one point-- computers are stupid. They are fast, and they are tireless, and if you tell them to do something stupid or wrong, they will do it swiftly and relentlessly, but they will not correct for your stupid mistake. They do not think; they only do what they're told, as long as you can translate what you want into a series of things they can do.
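To make that point concrete, here's a toy example of my own-- nobody's real product, just three lines with one stupid mistake in them:

```python
# A toy illustration, not anyone's actual software: the machine executes
# the instruction, never the intention.
def average(scores):
    # The bug: dividing by a hard-coded 10 instead of len(scores).
    # The computer will not notice, object, or compensate-- it will
    # produce the wrong answer swiftly and relentlessly, every time.
    return sum(scores) / 10

print(average([80, 90, 100]))  # prints 27.0, exactly as instructed
```

The machine never asks whether 10 was what you meant. That's the level of "intelligence" we're working with.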

Much of what is pitched as AI is really the same old kind of stupid, but AI does not simply mean "anything done by a computer program." When a personalized learning advocate pitches an AI-driven program, they're just pitching a huge (or not so huge) library of exercises curated by a piece of software with a complex (or not so complex) set of rules for sequencing those exercises. There is nothing intelligent about it-- it is just as stupid as stupid can be, but implemented by a stupid machine that is swift and relentless. But that software-driven machine is the opposite of intelligence. It is the bureaucratic clerk who insists that you can't have the material signed out because you left one line on the 188R-23/Q form unfilled.

There are huge issues in directing the education of a tiny human; that is why, historically, we have been careful about who gets to do it. And the issues are not just those of intelligence, but of morals and ethics as well.

We can see these issues being played out on other AI fronts. One of the huge hurdles for self-driven cars is the moral question-- sooner or later a self-driven car is going to have to decide who lives and who dies. And as an AP story noted just last week, self-driven car software also struggles with how to interact with meat-based biological intelligence units. The car software wants a set of rules to follow all the time, every time, but meat units have their own sets of exceptions and rules for special occasions etc etc etc. But to understand and measure and deal with and employ all those "rules," one has to have actual intelligence, not simply a slavish, tireless devotion to whatever rules someone programmed into you. And that remains a huge challenge for Artificial So-called-intelligence. Here are two quotes from the AP story:

"There's an endless list of these cases where we as humans know the context, we know when to bend the rules and when to break the rules," says Raj Rajkumar, a computer engineering professor at Carnegie Mellon University who leads the school's autonomous car research.

"Driverless cars are very rule-based, and they don't understand social graces," says Missy Cummings, director of Duke University's Humans and Autonomy Lab.

In other words, computers are stupid.

It makes sense that Personalized Learning mavens would champion the Artificial Stupidity approach to education, because what they call education is really training, and training of the simplest kind, in which a complicated task is broken down into a series of simpler tasks and then executed in order without any attention to what sort of whole they add up to. Software-directed education is simply that exact same principle applied to the "task" of teaching. And like the self-driven car fans who talk about how we need to change the roads and the markings and the other cars on the highways so that the self-driven car can work, software-driven education ends up being a "This will work well if you change the task to what we can do instead of what you want to do." You may think you can't build a house with this stapler-- but what if you built the house out of paper! Huh?! Don't tell me you're so stuck in a rut with the status quo that you can't see how awesome it would be!

So, they don't really understand learning, they don't really understand teaching, and they don't really understand what computers can and cannot do-- but outside of that, AI-directed Personalized Learning fans are totally on to something.

And still, nobody is answering the question-- what if the AI is wrong?

What if, as Qualls posits, an AI decides that this budding artist is really supposed to be a math whiz? What if the AI completely mistakes what this tiny human is interested in or motivated by? What if the AI doesn't understand enough about the tiny human's emotional state and psychological well-being to avoid assigning tasks that are damaging? What if the AI encounters a child who is a smarter and more divergent thinker than the meat widget who wrote the software in the first place? What if we decide that we want education to involve deeper understanding and more complicated tasks, but we're stuck with AI that is unable to assess or respond intelligently to any sort of written expression (because, despite corporate assurances to the contrary, the industry has not produced essay-assessment software that is worth a dime, because assessing writing is hard, and computers are stupid)?

And what if it turns out (and how else could it turn out) that the AI is unable to establish the kind of personal relationship with a student that is central to education, particularly the education of tiny humans?

And what if, as is no doubt the case with my Top Stories on Facebook, the AI is also tasked with following someone else's agenda, like an advertiser's or even a political leader's?

All around us there are examples, demonstrations from the internet to the interstate, of how hugely AI is not up to the task. True-believing technocrats keep insisting that any day now we will have the software that can accomplish all these magical things, and yet here I sit, still rebooting some piece of equipment in my house on an almost-daily basis because my computer and my router and my ISP and various other devices are all too stupid to talk to each other consistently. My students don't know programming or the intricacies of the software that they use, but they all know that Step #1 with a computer problem is to reboot your device, because that is the one computer activity that they all practice on a very regular basis.

Maybe someday actual AI will be a Thing, and then we can have a whole other conversation about what the virtues of replacing meat-based biological intelligence with machine-based intelligence may or may not be. But we are almost there in the sense that the moon landings put us one step closer to visiting Alpha Centauri. In the meantime, beware of vendors bearing AI, because what they are selling is a stupid, swift, relentless worker who is really not up to the task.
