Tuesday, September 24, 2019

Keep the records spinning

Joe Biden's confusing and stumbling reference to keeping the "record player on" has been hashed out and ridiculed on Twitter quite thoroughly. I won't revisit the educational concept he was trying to touch on there (that preschoolers need to "hear more words" or even possibly classical music), or even detail the history of how record players have been used in educational settings over the years. (The movie Conrack used the record player in particularly memorable and cinematic ways, though the movie has been criticized for playing on the "white savior" myth.)

This is an image of a shirt for sale at The Bitter Southerner, a magazine I highly recommend. And their merchandise sales benefit the Appalachian Citizens' Law Center.


For me, a more interesting aspect of Joe Biden's gaffe is that he was obviously trying to cover up his mention of TV as a possibly beneficial influence on children. His brain accessed and uttered a quickly conjured replacement (itself a fascinating linguistic phenomenon) that seemed vaguely more educational and verbal than TV.

Over the years, there have been mixed reports about the value of children's programming on TV. This recent report on a study seems to show significant benefits. Children are "hearing words" on TV, of course, but they're also being exposed to distracting imagery, as well as possibly violence, sex, and loads of advertising (if they flip the channel from PBS).

I think if I had kids and let them watch much TV, I would do so with the captioning on. Then the kids are "reading" TV as well, and also being exposed to the craft of dialogue in a certain way. (This would help a future screenwriter.)

As a kid growing up in the 70s and 80s, I watched a lot of TV. As the Albert Brooks character says in Defending Your Life, "it was everything to me." Although I can think of a lot of negative things I learned from TV, I also remember getting interesting career ideas, like wanting to be a photographer while watching The Odd Couple or becoming interested in news reporting while watching Lou Grant. (In fact, I ended up learning and using both those skills in my early career working in the newspaper business.) This was before the "golden age" of TV, as represented by the high-minded cable shows, but whatever TV lacked in thoughtfulness, it also had a certain innocence and light-heartedness that seems largely missing today. Even the comedies now have a cynical or satirical edge.

At its best, TV exposes us to ideas and worlds we do not know. We shouldn't be so quick to dismiss it (or record players, which have been making a real comeback in recent years; what's old is new again).

Tuesday, September 3, 2019

Students as Customers

Professors bristle at the ever-growing call to treat students in higher education as "customers" or "clients," and for good reason. As I've written about elsewhere, the need to compare students to anything is strange. Why do we need an analogy? And if this is the appropriate analogy, what are the implications for education?

Obviously, professors don't like this comparison (or change in labels) because it implies that students are paying for grades or paying to be treated well. It suggests that we are here to "serve" students in obsequious fashion and to treat education like a "product," reducing the classroom experience to a transaction.

More than anything, perhaps, professors think of the old adage that "the customer is always right," and wonder if this slippage of terminology will lead to a further erosion of authority and standards. If a student demands a grade change, does that mean the customer gets what they want?

As anybody who has ever worked in retail or the service industry will tell you, though, the customer isn't always right. Often the customer is wrong. While efforts may be made to make a customer happy, sometimes the customer is incorrect about the facts or misguided in what they think they deserve, and a worker may endanger the business or waste valuable time or resources by bending to the customer's wishes.

A drunk customer who wants more to drink but has been cut off is not "right" in demanding more, and the bartender would be unwise to serve him. A customer who demands an employee break company rules or ethical standards is not right. A customer who demands something for free, when it isn't really warranted, is really just engaging in a kind of theft by acquiescence. That's not fair, ethical, or "right."

Beneath consideration would be the customer who becomes rude or loud, or who even verbally abuses a waitress or salesperson, just because they believe being the customer allows it. This is a customer who is not worth a business's time, no matter how tempted an employee might be to do whatever he or she can to calm the person down.

Students should be treated like customers in certain contexts, e.g., when registering and paying for classes, when eating at the cafeteria, or when buying books at the campus bookstore. Professors and everyone else at the college have a duty to be polite and professional with students, and to treat them with compassion and human understanding, but that's not the same as treating them like customers. There's no need for a new model or analogy, although we can all think about how to improve our day-to-day behavior and interactions with students.


Monday, August 26, 2019

Honorary Doctors

Honorary doctorates have long been an interest of mine, for reasons that are hard to articulate. It's curious how guarded and jealous humans are when it comes to titles, and it's rather fascinating how elevated the title of "Doctor" in particular has become in America. Territorial arguments abound about who is entitled to call himself "Doctor," even with earned doctorates (those outside of academia may wonder why anyone except physicians uses the title, whereas those within academia point to the history and note that an M.D. is a professional doctorate). Honorary doctorates, of course, raise all sorts of suspicions. They are used for fund-raising or to entice guest speakers; they are given to those who are less than deserving. They trivialize education and demean earned degrees. Some universities never give them, as a matter of policy and practice. But still they proliferate, and we are weirdly fascinated when rock musicians or movie actors receive honorary doctorates. They probably shouldn't actually use the title, though. As this writer points out, celebrity figures, even if they are superstars in their field, who invoke or prefer the title "Dr." after receiving an honorary doctorate may be regarded as vain, supercilious, or worse.

The honorary doctorate, as the writer of the linked article also notes, has been diluted through its overuse as a reward for those giving commencement addresses. Even Kermit the Frog received an honorary Doctorate of Amphibious Letters, which is cute but awfully silly. Yet I would argue that the perfectly legitimate purpose of the honorary doctorate, whether the recipient uses the title or not, is to honor those like Maya Angelou (the main subject of the article above), who, even though she did not have much in the way of formal education, achieved great things in a field with academic connections (literature).

Other deserving figures in the past have been similarly honored with doctorates. These include Mark Twain, who apparently liked wearing his Oxford cap and gown in all sorts of circumstances, and Samuel Johnson, who had a master's degree but was given an honorary doctorate by Trinity College for his dictionary, among other things. Others took to calling him Dr. Johnson (an honorific still applied as shorthand), and it fit because this man was a true leader in his field of scholarship. More recently, the classical guitarist Christopher Parkening -- a pioneer in his field who founded a guitar program at a university when he was only 22 -- never earned a college degree, though he was awarded a much-deserved honorary Doctor of Music in 1983. 

Mark Twain


Discussions abound about who should be called "Doctor." Some say the title should be reserved for medical doctors, but Ph.D.s push back by noting that the original word had specific connections to teaching and scholarship. Confusingly, not all countries award doctorate degrees to physicians, and in Britain, surgeons have historically gone by the title "Mister."

The value of a true honorary degree (given to honor accomplishments) can best be understood by the examples that seem meaningful and even moving. When I mentioned to my wife the other day that Benjamin Franklin had little formal education but did have an honorary doctorate that he cherished (he referred to himself as "Doctor"), she about fell off her barstool. (Yes, this was in fact a "cocktail napkin" conversation.) The fact that this undisputed genius, a man generally regarded as one of the greatest humans who ever existed, a man responsible for countless inventions and scientific discoveries, did not have a college degree is astonishing. But I think she was more astonished to think how Benjamin Franklin, even with his many accomplishments, still had a need to be considered "educated" in a way that was certified by credentials and paper. We can call this vanity, but it speaks more to the continuing value that degrees have in the minds of people. That's not such a bad thing. As younger men, Franklin and Twain -- both from modest backgrounds -- were too busy (and too cash-poor) to devote much time to formal education, but they still valued education and wanted to be associated with the academy. They certainly have that status now, so why shouldn't they have enjoyed it while still alive?

Monday, August 5, 2019

Home Alone

In this finely written and witty recap of Season 2 of Stranger Things, reviewer Rebecca Farley calls the setting of the series "the land where parents do not care where you are." It's a funny reference to the non-concern of Nancy and Mike's parents in particular, but it's also a telling window into the showrunners' understanding of the 1980s. Elsewhere in the recap, Farley shows some awareness that things were indeed more or less like this in the 1980s, so I don't think her joke is a sarcastic jab at the show not being realistic, as in, "Oh, come on, what kind of parents don't know where their kids are?"

Such a reaction from a younger viewer would not be surprising, however. As the movement celebrating "free-range parenting" demonstrates, giving kids freedom is not a new concept, though people somehow have to argue for its return. The cultural resistance to this kind of childhood liberty is pretty recent, springing largely from stoked-up fears about child abduction but also from a kind of generational reaction to what might be called the more "hands-off" approach to parenting of the 1960s-1980s. The children grew to become the parents they thought their own parents should have been, for better or worse.

Generation X is sometimes known as the generation that dealt with epidemic-level rates of parental divorce, but they (we) also grew up in a time in which parents were often exploring their own options and, perhaps in response to their parents' stricter parenting style, giving kids more freedom and the ability to explore.

This was not all bad. While there may have been times when we needed more guidance and supervision, we also learned to be independent and to take care of ourselves. We developed skills in things like cooking, shopping, and car repair. We learned to drive as soon as we could. There was certainly no need for a term like "adulting," which is not to say we were all mature or even all that together. (This was also the era in which things like PDAP were born.)

The fact that Nancy and Mike's father has no idea where his kids are is funny, but he would hardly be the only father in the neighborhood to be that nonchalant about it. It's an in-joke about the times, but it's also pretty true to reality. His reaction reminds me of the parents in films like A Nightmare on Elm Street, where the terror originates with the misplaced revenge focus of parents who are otherwise out of touch with their kids as they indulge in their own passions. (In this same episode, Nancy's mom is drinking and talking on the phone. Maybe it's not by chance that this character shares a first name with the first and most venerable of Elm Street's children.)

Allowing kids independence is not such a new concept, however, and not peculiar to the 1970s-80s. In my American Literature class, my students read an 1880s story by Ambrose Bierce called "Chickamauga." The story is ostensibly about a young boy playing in the woods who happens upon a Civil War battle. Without fail, there are always students in class who are primarily appalled or concerned about a young (fictional) child of about 5 who is allowed to wander off by himself! (The war itself is not as concerning as the safety of this young child.) I have to explain that children playing by themselves, even at such a young age, was not unusual at the time, and it's only in recent decades that we have come to be so panicky about childhood. (On the other hand, that is also an overstatement, as the Victorians had their own hangups and anxieties about childhood that manifested in other ways.)


Children have long been a staple of horror and suspense, of course, because we naturally fear the corruption of innocence and because the state of childhood is so unlike any other time of life. Bierce's story isn't primarily a horror story, though it can be read fruitfully that way if one chooses to dwell on the imagery, and it certainly is a story about the horrors of war. The whole thing reminds me of the way we tend to think other countries are more "dangerous" than America, even with our outrageous rate of gun deaths and other societal ills. Childhood is a kind of "othered" land, a place where we no longer live, so we naturally fear it (as well as idealize it) and consider it as somehow more dangerous or fraught with difficulty than adulthood. Considering the previous prevalence of childhood disease and the basic responsibility parents have to protect and nurture kids physically, perhaps this is not surprising. Perhaps those parents in Stranger Things really are bad parents, and that's why their kids are subjected to all manner of supernatural trials. But those kids sure seem to be having a lot of fun even amidst all the danger.

Wednesday, May 15, 2019

Reality Bites, Reconsidered

As Reality Bites marks its 25-year anniversary, there has been much discussion and reevaluation of this now seemingly iconic film. As guests appearing on a recent TV documentary about Generation X seemed to agree, the film has aged a lot better than expected and represents the zeitgeist far more than we were willing to admit at the time.

When this film was released, the Gen-Xers who were its target audience saw it, of course, although not in the numbers required to make it a hit. We also pushed back against its narrative and the reality represented by its characters. We reacted like Troy would have, sickened by the obvious commercial packaging of the movie and taken aback by the broad stereotypes of our generation.

The thing is, this aspect of the film is what makes it ring true for wiser Gen-Xers looking back now. Writing about the film at the time, Roger Ebert complained that it was "blind ... to its own realities" and that the film needed more acknowledgment of certain obvious aspects of the story (like the fact that Troy was a jerk and Lelaina not a very good filmmaker).

Pizza figures into Reality Bites in interesting ways. Of course it does. (Photo by Justine!)


It's perhaps true that the story could have done more with these potential points of conflict, but I think the film and filmmakers were actually very much aware of these realities and how the characters embodied them. There's a scene in the movie where the Winona Ryder character (Lelaina, supposedly a valedictorian) is asked to define irony and can't do it. To me, this scene suggests a deeper aspect to this story, as written by a screenwriter very much of her generation at the time (I don't think many of us were aware of that fact, either). Lelaina doesn't get irony in the same way that post-grunge singer Alanis Morissette didn't "get" it. (But then, of course she did. The song was ironic.)

When you're a Gen-Xer, you don't need to define irony because there's just too much of it. We did sit around and watch Good Times and talk about the Brady Bunch, but we didn't think it was all that ironic. (We didn't think it was all that serious, either.) As has also been pointed out by the online commenting class, the characters in this movie consume a lot of commercial products for people so apparently dead set against crass commercialism. But that actually seems pretty valid as an aspect of Generation X. We complained and even whined about commercialism, but we didn't feel we could do that much about it. We also liked Diet Coke and Quarter Pounders (as Troy notes). We saw no contradiction there. We didn't invent this stuff; we just consumed it. We didn't see a lot of choices. Yes, I will have this Quarter Pounder and Diet Coke. What else are you offering? 

It seemed we didn't have a lot of choices about life either. Roger Ebert makes much of the slacker who is "supposed to be" wise or interesting because he doesn't work, but that's not why alleged "slackers" struck that pose in the early 1990s. The economy really was in the toilet, and jobs for young people, even educated young people, were scarce. The slacker thing was more about dealing with one's circumstances and learning to survive in a less-than-amazing job. (Sarcasm is often part of survival.) Perhaps Ebert related more to the parents in this movie, who represent the parents of Generation Xers in very interesting ways, projecting that sense of disappointment even as they seem unwilling to acknowledge their own responsibility (both personal and collective-economic) in shaping this "reality" for dissatisfied 20-somethings.

Yes, we were whiny and thought we were great filmmakers and artists when maybe we weren't. But we also knew the irony of all that, even if we didn't want to admit it. It's like when The Simpsons did the parody of the 1990s music festival, and the Gen-X-type character said he "didn't even know" if he was being sarcastic anymore. We don't tend to think about it as a separate concept because it's just too central to everything as we see it.

Gen Xers became cynical (and yes, ironic and sarcastic) as a survival mechanism, not as some sort of expression of cultural coolness. Many of us just happened to internalize that quality as we got older, and so we understood that kind of conversation. The Ebert review makes some good points about the film as such, but this Washington Post review gets a lot closer to the truth about what the movie would come to represent, especially for a review of the time. Maybe Ebert just didn't "get it."*

As I mentioned, a lot of us Gen Xers didn't "get it" either. Winona Ryder was our generation's star, and we usually trusted her, but we weren't sure about what they were representing here. We weren't this whiny and self-indulgent, were we? The answer is no; probably only the coolest of us came off like Ethan Hawke in this movie (Ebert's right about his performance also -- he's unlikable but also conveys a certain injured quality -- what's amazing is how different this is from some of his other characters of the time). Most of us were just struggling and searching and trying to put it together, and we weren't ready to have this kind of light focused on us yet or to make fun of ourselves -- even if it was pretty damn accurate.

*Ebert also gets some facts about the movie wrong in his review, which is an astonishingly common feature of his reviews. 

Sunday, April 14, 2019

I apologize (that you're an idiot)

There has been much writing in recent years about the "non-apology," especially as we become more embroiled in the era of "outrage politics" and the so-called "he said/she said" negotiations surrounding "MeToo" accusations. Sometimes, those accused of "inappropriate behavior" or comments, or those accused of other crimes, don't really want to apologize and don't really feel like they did anything wrong, so they say, "I'm sorry if anybody was offended" or "I'm sorry if I hurt your feelings."

As psychologists remind us (though it should be obvious), these kinds of words are not really apologies at all. They are in the same neighborhood as saying, "No offense, but ..."

Honestly, when one must offer a non-apology, I find the best way is to go with the words of Rhett Butler, who told a young hot-head whom he obviously found pathetic and did not want to humiliate in a physical fight, "I apologize again for all my shortcomings."

Rhett Butler "apologizes."


As this New York Times piece reminds us, Southern speech, for all its supposed politeness and eloquence, is often passive-aggressive and many polite exchanges are designed to reinforce certain social codes and put people "in their place," so to speak. (An earlier blog of mine addressed this in a more general way.) Rhett Butler's apology is really more of "I'm sorry you can't see what an idiot you are," or, as this definition in Urban Dictionary puts it, "I'm sorry I don't think or act like an asshole like yourself." As the UD says, it's a "gentleman's" phrase, a polite way of telling someone, more or less, to "fuck off." It's nice to have this kind of language at your disposal, and the gentleman has the advantage of being able to deploy it and thereby cause more consternation than actual offense. He goes on his way, like Rhett Butler, to tour the grounds and find someone else to make sport of.

Thursday, March 14, 2019

College Admissions, Gen X style

This New York Times writer makes many of the points I have been thinking about regarding the otherwise pathetic college-admissions scandal. The scandal is shocking for its involvement of celebrities and huge amounts of money, but in other ways it seems just a routine example of how corrupt and unseemly the admissions process has become. The system is corrupt at the most basic levels of assessment. We are often basing admissions not on student ability but on test preparation and, arguably, more abstract values associated with success in this area, such as motivation and organizational skills.

My wife and I remember this differently, but I don't recall many people studying for or worrying about the SAT in the 1980s. There were test-preparation booklets and courses available, but the general consensus as I remember it was that you couldn't really study for this test, just as you couldn't really study for an IQ test. It was designed by experts in assessment to measure your ability to think and the likelihood of your success in college. (This viewpoint was confirmed by Brandon in a recent viewing of a 90210 episode, in which he says "you can't study" for the SATs, even as his always-in-for-a-scam friend Steve Sanders spends hundreds of dollars on a prep package. Of course Brandon was being naive, and I know the fact that I have been watching 90210, in reruns no less, says little for my own intelligence or credibility.)

Granted, the stakes were lower in the 1980s. Admission into the Ivies was still very difficult and mysterious, but getting into the flagship public universities like the University of Texas at Austin was much easier. Fewer people were attending college, and those who knew they were college bound tended to have a pretty good idea, more or less, of where they were going. As has been well documented in writing and films about Generation X, parents were much more hands-off, as well. This had its advantages and disadvantages -- perhaps better fleshed out in another piece.

Photograph by with an eye. Used under Creative Commons license. https://creativecommons.org/licenses/by-nc-sa/2.0/legalcode
This is basically a problem with assessment, in that once again, a test designed for one purpose is being used in a different way and becoming just one more hoop that people will jump through in whatever way they see fit.

The New York Times piece mentions paying "editors" to help with college-admissions essays as well. This might seem innocent enough, but I can tell you from personal experience that students often want more than revision suggestions or basic editing assistance, and, in many cases, I'm sure the consultant ends up being the main author of the essay. We end up with a seemingly holistic piece of evidence that really tells us nothing about the student. This is corrupt, yes, but it can even happen without money being exchanged, such as when a student is simply able to enlist the help of an older sibling, parent, or teacher.

Those objecting to this point of view (that admissions corruption is a matter of degree) argue that these minor acts of subversion obviously do not compare to the fraud and outright bribery alleged in the FBI sting. There are many comments under the N.Y. Times op-ed that say this very thing, and it is true enough as a basic matter of fact. "Getting help" on an essay is not a crime, and in many cases is encouraged as part of the process, an acknowledgement that writing is a collaborative activity involving conversation and the exchange of ideas. On moral and ethical levels, however, we're really just talking about how far one is willing to go. This doesn't mean that a student "shouldn't" get help or advice (or even a paid tutor) in writing a college-admissions essay; I have even helped students myself. We should be reminded, though, that what a college is (or should be) really interested in is the student's own abilities and voice.

Friday, March 1, 2019

What is a No. 1 pencil?

A former grad-school professor of mine enjoyed touting the benefits of No. 1 pencils. We had never heard of such a thing. Apparently, not many people have, especially in the age of standardized testing and the ubiquity of the No. 2 pencil. My professor believed the pencils to be superior for writing in books because the marks could be more easily erased, the lead being softer. (On the standard American scale, a No. 1 corresponds to the softer B grade of graphite, while the familiar No. 2 is an HB.) Years later, I bought a box of No. 1 pencils and found they weren't that easy to locate. (This must have been before Amazon started carrying everything.) When I did find a box, I was disappointed to learn that in the era of modern pencils, the "No. 1" designation is sometimes just a marketing gimmick and bears no relation to the softness of the lead. I think I will buy another box now (on Amazon) to test this theory in a more empirical way. Surely the graphic-arts industry demands a wide variety of pencils, whatever the numbering system.
Photo by Hafiz Issadeen. (Used under Creative Commons license. Image is unaltered.) 

Wednesday, February 13, 2019

Batman and Commissioner Gordon

Just a few days after the Aurora, Colorado, tragedy, it had already become tiresome to read the articles and random Internet postings musing on the connections of this horrific event to the new Batman movie -- that is, speculations on whether the movie influenced the way this crime was carried out, whether the shooter actually called himself the "Joker," and whether such violent movies can inspire real-life violence.

This general topic came to mind again the other day when I was discussing the common fear of clowns and the "trickster" archetype in literature and folklore. (We were discussing Native American oral traditions, where the trickster is particularly prevalent, but we see the figure in almost all cultures.) It occurred to me, more or less talking my thoughts out in class, that Bugs Bunny represents a pretty good American trickster character (teaching and revealing foolishness through his deceit), as does the Joker in Batman. The Joker, like the figure of the Jester, can be said to be connected to both the trickster archetype and to the scary-clown trope. The Joker is particularly "uncanny" in that he is not just a person made up to look misshapen or disfigured (as all clowns are), but a clown who is actually disfigured. Although the Joker is mostly just an evil bad guy with selfish motives, he also likes to draw out the conflict in spectacular ways and to be deceitful even when it's not necessary -- just as a way of exposing the weaknesses of Batman and others.

Maybe the shooter in Colorado considered himself a kind of trickster figure, although it is doubtful. He more likely saw himself in the vengeful role. There's a rather sad irony that Batman himself represents the lonely outsider, mysterious and cut off from society, a man who has been driven to sadness by violent crime and channels that sadness into trying to rid society of the crime that has burdened his own life and turned him into an aloof and solitary figure. In this sense, Batman represents all of us (city dwellers) in modern society, in that he is weighed down by existential loneliness and puzzled by the rise of violent crime in his once-loved urban home. Of course, he turns this loneliness and detached sadness into something positive, by fighting on the right side of the law, instead of exploding outward. The Joker is the "bizarro" version of Batman, his projected alter ego as well as his archenemy.

The whole thing has made me think about the appeal of Batman in our modern age, why he sometimes seems more interesting, with his human frailties and brooding personality, than the hip Spider-Man or the paragon of squaredom, Superman. When I was younger, I loved Batman and Robin in the old campy TV series. Of course, I never thought of the series as campy, and I'm sure that never occurred to other kids, either, although maybe the Marvel fans thought it was all a little goofy. I thought it was just good, colorful adventure with a cool car and interesting female characters in strange costumes that stirred something deep inside my 8-year-old male self.

Weirdly, as much as I liked Batman and his car, I was especially fascinated by Commissioner Gordon, the older guy who was officially in charge at the police station, yet also enjoyed an almost mystical connection to Batman. He was one of the few outside people who could reach Batman if he wanted. The commissioner as played by Neil Hamilton on the 1960s TV series (an actor who originally worked as a model and was known for being "strikingly handsome" as a younger man) was a neat-looking guy with silver hair and nice suits. He had an interesting, more obscure title. He was older, but didn't look like a grandfather type. He had a telephone and a wood-paneled office. That's bad-ass. To me, he seemed like Batman's boss. That's better than being Batman, right?

J.K. Simmons as Comm. Gordon in those Justice League movies I haven't seen. Photo from IMDB.


In the more recent Batman remakes and reboots, Commissioner Gordon has been played by the incredible actor Gary Oldman and, in his pre-commissioner days, by the interesting younger actor Ben McKenzie, who manages to keep a low public profile even as he has enjoyed starring roles on three major TV series. (The older version and appearance of Commissioner Gordon is so iconic that McKenzie once apparently dressed as his character's older self for Halloween.) It's telling that Oldman in particular would take on this character. He doesn't do boring roles, and he doesn't do one-dimensional characters. He understands that the Commissioner isn't just a police official; he's the boss and a guy with almost as much angst and dark memories as the Batman himself.

Addition: I remembered that Commissioner Gordon is also the father/uncle of Batgirl, or at least the most popular version of that character (not the character in the ill-fated George Clooney movie). How cool is that?

Batgirl (Image by FotoToad. Used under permission of Creative Commons license. No changes made.)