Sunday, January 3, 2010

Yule Know Next Time

Ah, December! The darkest month of the year, when the daylight dwindles, minute by minute, until the winter solstice marks the turning point. Humans crave light at this dark time, and electricity allows us to indulge this craving shamelessly. No wonder the ancients turned the solstice into a weeks-long festival of lights.

Let’s see how much you know about this very important astronomical event. The winter solstice appears on calendars and ephemerides as an exact date and time (to the minute) that changes every year. Do you know how the time is determined? Is it actually the shortest day of the year? Is it actually the first day of winter?

The answers, surprisingly, are yes-but and no. The solstice is indeed the day with the least daylight, but it rarely feels like it: the earliest sunset of the year occurs several days before the official date (and the latest sunrise several days after it), thanks to a quirk of orbital geometry called the equation of time. The official date and time mark the moment the sun reaches its southernmost point in the sky, the movement of the sun out of Sagittarius and into Capricorn. And you thought astrology was irrelevant!

As for its being the first day of winter, that’s a lot of hooey. I have no idea who decided winter “begins” in late December, but it’s nonsense. Meteorologists sensibly reckon December 1 as the beginning of the winter season. The ancient Celts, also sensibly, held that winter begins midway between the autumnal equinox and the winter solstice, at the holiday they called Samhain (pronounced sowan) and we now know as Halloween. Interesting that we now switch from daylight saving to standard time right around Samhain, at the beginning of November, proving perhaps that old ideas don’t die, they just get repurposed.

In the Celtic calendar, all the seasons begin midway between an equinox and a solstice, and each has its holiday. Spring starts at the beginning of February, at a festival once called Imbolc and now “celebrated,” much corrupted, as Groundhog Day. Summer begins with Beltane, now called May Day, and autumn begins in early August at a holiday called Lammas or Lughnasa. Sadly, this occasion is no longer remembered on American shores, though I believe it is still observed in Ireland. This older calendar is why the winter solstice is known to us confusingly as both the first day of winter and the time of the midwinter festival.

In Scandinavia, which experiences the most profound winter darkness on the European continent, and in the countries where the Vikings and Norsemen carried their traditions, the weeks leading up to and following the winter solstice were known as Yule. When I looked up Yule in my dictionary, however, I was dismayed to see it defined only as “the feast of the nativity of Jesus Christ.” I’m sorry, but it is not. The derivation admits that the word is the Old Norse name for a pagan midwinter festival. Yes, exactly. That festival was celebrated for thousands of years before the Christians co-opted it, and I object to its total omission from the definition. I have the same objection when I encounter the saying that “Jesus is the reason for the season.” Again, no offense intended, but he is not; at least, not originally and not solely. Celebration of the season predates Jesus by millennia.

Cultures that experience the change of seasons generally have a long festival at midwinter, a festival of lights to chase back the darkness and call upon the sun to return. The Chinese have the Festival of the Lanterns. Even as far south as the Mediterranean, there is Hanukkah, an eight-day event. The Romans had the Saturnalia in mid-December, during which gifts were exchanged. Saturn was the god of harvests; consider the symbolism of holding his festival months after harvest is complete and months before spring planting will begin, when the earth seems barren. Also consider that saturnalia has come to mean a drunken orgy, and we can guess that people celebrated rather lustily, making the beast with two backs to encourage the land to be fruitful once more, a primitive form of magick.

In a festival of lights, people light up the dreary darkness, whether with pre-electricity candles and Yule logs and firecrackers or with modern icicle lights and glittery tinsel and rainbow LEDs. Here in the United States, we string lights on everything from trees to porch columns to fences and place electric candles in our windows like beacons of hope. We bring evergreens into our homes as a symbol that life will be renewed, that the deadness of winter cannot defeat the vitality hidden within soil and branch and seed.

The colors of the season are obvious choices. Evergreens supply the only spots of living color in the white and brown and gray monotony of the bleak winter landscape, and green’s opposite is the cheery and warming red that can lift our spirits and gladden our hearts. Nature herself loves this combination, exemplified by holly’s shiny green leaves and bright red berries. People have been decorating their homes at midwinter with living green branches and garlands of aromatic pine and cedar and sprigs of white-berried mistletoe (sacred to the Druids) for almost as long as homes have existed, and the Christmas tree is just the most recent expression of that impulse.

Pagan or Christian, the midwinter festival celebrates coming out of the darkness and the rebirth of the world. You don’t have to be a Christian to celebrate Yule, just a human who craves the return of the sun and its light and warmth. A pagan, after all, is simply one who dwells in the country, close to Nature and sensitive to her rhythms and moods and mysteries.

In March I’ll post a piece about the origins of Easter, another pagan holiday co-opted by the Christians. Watch for it!

This is article 19 in a continuing series. © 2010 Christine C. Janson

Monday, December 7, 2009

Ignorance and Arrogance

Many years ago, I was given a paperback copy of a book called The Mother Tongue: English and How It Got That Way. So crammed are my shelves with unread books that I’ve only just got around to reading it. I stumbled across it by chance a week ago and was drawn to it because I had covered some of the same territory in an earlier article, Mispellers of the World, Untie! I wanted to see what the author, Bill Bryson, had to say and see whether I had left out anything major.

Most of the book is a discussion of research and histories done by others. Nowhere does Mr. Bryson mention any original research done by him, and his name does not appear in the bibliography, so I’m guessing he has not published in this field previously. None of the books in the bibliography was published before 1931, and there are no primary sources, such as Chaucer’s Canterbury Tales. There is no information about the author at all; his academic background and current profession are unknown. According to the blurbs all over the cover, this book was a hardcover bestseller and well reviewed; it even earned the accolade of “scholarly” from the Los Angeles Times. The writing style is breezy and fun, more magazine than academe, which of course adds immensely to readability. But the author betrays such a fundamental lack of understanding of the basic structures of English that I am astounded he got his book published.

In his chapter on English grammar and its oddities, Mr. Bryson states that in English, “A noun is a noun and a verb is a verb largely because the grammarians say they are.” He supports this by giving a list of nouns that are also verbs, such as charge and towel. But the statement is arrant nonsense. A noun is a word that functions as a noun; a verb is a word that functions as a verb. In the sentence “The charge appeared on my statement,” the word charge is being used as a noun, and grammarians are as powerless to turn it into a verb as they are to turn it into gold. Only native speakers can decide how to use a given word, setting its function through use, and if they want to change its use, they have to construct a new sentence for it. The fact that English has so many multifunctional terms is a tribute to its unique versatility. A word’s function (noun, verb, whatever) is not revealed until it is actually used; no one can look at the isolated word charge and declare it noun or verb, because it has the potential to be either. This is a very neat trick, not a shortcoming, and cannot be done in many other languages.
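
For the programmers among my readers, the machines agree with me: a statistical part-of-speech tagger cannot label a word until it sees the word in use. Here is a minimal sketch in Python, assuming the NLTK library and its default English tagger (the exact tags printed depend on the model, but charge should come out a noun in the first sentence and a verb in the second):

import nltk

# One-time downloads for the tokenizer and the default English tagger
# (resource names as used in classic NLTK releases).
nltk.download("punkt")
nltk.download("averaged_perceptron_tagger")

for sentence in ["The charge appeared on my statement.",
                 "They charge my card every month."]:
    tokens = nltk.word_tokenize(sentence)
    # The tagger decides noun vs. verb from context, not from the word alone.
    print(nltk.pos_tag(tokens))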

Claiming that “the parts of speech are almost entirely notional,” Mr. Bryson offers the examples “I am suffering terribly” and “My suffering is terrible.” He says the grammarians would call suffering a verb in the first but a noun in the second, but in his opinion both sentences use “precisely the same word to express precisely the same idea.” Well, no. Technically, the first suffering is a present participle, a verbal adjective, and the second is a gerund, a verbal noun, both of which are derived from the same verb, suffer. It is thus not at all odd that they should express the same idea, but the verb has been inflected in different ways, to form a participle (adjective) to use in the present progressive tense and to form a gerund (noun) to use as a subject. Every English verb has the ability to become a noun or an adjective by the addition of -ing; which it is is strictly a matter of how it’s used. As a native English speaker, the author has automatically used terrible to modify the gerund and terribly to modify the participle even as he claims they are modifying “precisely the same word,” proving that the language center in his brain is operating better than the reasoning center. It isn’t a noun or an adjective because grammarians say it is; it’s a noun or adjective because that’s how it’s functioning. There’s nothing “notional” about it.

Having said one puzzlingly harebrained thing, Mr. Bryson reveals even deeper ignorance of how his language works (the language, remember, he has dared to write a book about). In the same paragraph, he writes, “Breaking is a present tense participle, but as often as not it is used in a past tense sense (‘He was breaking the window when I saw him’). Broken, on the other hand, is a past tense participle but as often as not it is employed in a present tense sense (‘I think I’ve just broken my toe’) or even future tense sense (‘If he wins the next race, he’ll have broken the school record’).” These cavils reveal such a complete misunderstanding of basic grammar I am left breathless. Throughout the book he cites Fowler, Copperud, and other well-known grammarians, but he has clearly been too selective in actually reading them. No authorities are cited in this section, but the lack of support for his pet peeve didn’t stop him from ranting. No research went into these inanities. There is nothing here but gibberish.

First off, there is no such thing as a “present tense” or “past tense” participle; a participle is an adjective and has no tense. Participles, present and past, are used to form various tenses. The present participle is used to form the progressive tenses (present, past, future, and perfect): I am walking, I was walking, I will be walking, I have been walking, I had been walking, I will have been walking. Likewise, the past participle is used to form the perfect tenses: I have walked, I had walked, I will have walked. Reexamine the statements that “present tense participles” are often used in a “past tense sense” and vice versa, and you realize they make no sense at all, present, past, or future.

It gets worse. Mr. Bryson follows this arrogant demonstration of ignorance with one of boneheaded wrongness. I can only quote; paraphrase will not suffice. “A noun…is generally said [to denote] a person, place, thing, action, or quality. That would seem to cover almost everything, yet clearly most actions are verbs and many words that denote qualities—brave, foolish, good—are adjectives.” These arguments are meant to shore up the assertion that “the parts of speech must be so broadly defined as to be almost meaningless.”

Not in my universe, bub. He has ignored or overlooked the fact that a noun expresses an action or quality in a different way than a verb or an adjective does, and it is not uncommon to have closely related words from the same root in multiple functional categories (e.g., sleep as noun, sleep as verb, sleepy or sleeping as adjective, sleepily as adverb) so that statements about a topic can be made in multiple ways. How is this a failing? The noun is bravery, the adjective is brave; both describe a quality, and each can be used to express a thought about heroism. How does that render the categories of noun and adjective themselves meaningless? Wouldn’t it be a bitch if we always had to use sleep as a noun and cast every sentence to accommodate that inflexibility?

I repeat, in English a word is characterized by how it is used, and native speakers decide how any given word may be used by using it that way and being understood. Mr. Bryson would seem to prefer a language in which the nouns were always and forever nouns and referred very solidly and concretely to things, and so on. This is not only impossible, it is supremely undesirable. It takes away all possibility of wordplay and inventiveness, not to mention growth and change.

I cannot believe this book was ever subjected to an editorial eye. No editor worth her salt would have allowed this nonsense to stand. Although I enjoyed other sections of the book, once I had read this chapter, I could no longer trust any statement the author made that I didn’t already know to be true. “Scholarly,” my ass. I rather doubt the reviewer read the whole thing. Who knows what other idiocies lurk beneath the breezy exposition? I usually resell or donate my unwanted books, but this one is going in the recycling bin as too worthless and too dangerous to pass on. I am as puzzled and outraged as if I had come across arguments for a flat earth in a book on geography.

This is article 18 in a continuing series. © 2009 Christine C. Janson

Hey! You Talkin’ ta Me?

Reality shows sometimes resort to on-screen captioning when the dialogue goes mumbly or gets scattered by noise. News programs do the same thing, for example, for heavy accents and 911 tapes. People do not speak orthographically; what they say must be interpreted into the symbols we call writing, and those symbols include more than letters. These on-screen transcriptions do a reasonably good job of presenting the spoken word within the standard expectations of spelling and usually (not always, alas) get the homophones correct. But punctuation is a different story. In particular, nobody seems to understand that direct address requires distinctive treatment to avoid syntactical hash.

We interpret the spoken word differently than we do the written word. The spoken word is always context specific; the written word is always outside the context and must be specified. If the person next to you says “The house is on fire!” you hear the urgency, and you can probably turn and see the flames and feel the heat. If you read the same words, you’re probably far from the fire in time and space, but the quotation marks tell you someone actually spoke the words, and the exclamation point conveys the sense of urgency. Like tone and emphasis in speech, punctuation works with syntax to create meaning in a written communication.

When you address someone directly, that is, call them by their name or title or honorific (e.g., sir), that instance of direct address is grammatically isolated from any other part of the sentence in which it appears: we say to Bob, “I heard, Bob, and I laughed.” In speech, we can emphasize this separation by a slight pause, but because Bob is actually present, context alone makes the meaning clear. In writing, we must isolate the address with commas as a parenthetical element that is not participating in the grammar. (I.e., the commas are not replicating the spoken pause so much as they are visually fencing off that which is grammatically irrelevant.) If we do not set it off, Bob becomes the direct object of the verb heard because, grammatically speaking, that is how we must read the sentence as written: “I heard Bob and I laughed.”
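
For the skeptics among my readers, here is the same point made by a machine. This is a toy demonstration in Python using the NLTK library; the little grammar is my own invention, built just big enough for this one sentence, and it treats a comma-fenced name as a vocative standing outside the clause. Run it, and the comma-less version yields exactly one parse, with Bob as the object of heard:

import nltk

# A toy grammar (mine, purely illustrative): a comma-fenced name
# is a vocative standing outside the clause's grammar.
grammar = nltk.CFG.fromstring("""
    S   -> CL | CL 'and' CL
    CL  -> NP VP | NP VP ',' VOC ','
    VOC -> 'Bob'
    VP  -> 'heard' NP | 'heard' | 'laughed'
    NP  -> 'I' | 'Bob'
""")
parser = nltk.ChartParser(grammar)

for tokens in [["I", "heard", "Bob", "and", "I", "laughed"],
               ["I", "heard", ",", "Bob", ",", "and", "I", "laughed"]]:
    print(" ".join(tokens))
    for tree in parser.parse(tokens):
        # Without commas, Bob can parse only as the object of 'heard';
        # with them, only as a vocative.
        print(tree)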

Failure to set off a direct address with a comma may cause great embarrassment or great amusement. A desktop sign seen in a recent catalog reads “Work with me people.” This is clearly advice to work with egotists, not a direct address pleading for cooperation, which would require a comma: “Work with me, people.” “John get the phone” is pidgin for “John gets (or is getting or got) the phone”; a demand that the phone be answered requires an indication of direct address and the imperative: “John, get the phone.” “Don’t hassle me dad” is a Briton’s command to leave his father alone; “Don’t hassle me, Dad” is a plea from son to father for some peace. Note the capital D on Dad. A title used as an address is capitalized: “This is my aunt Mary,” but “Welcome, Aunt Mary!” Now note that without the comma to show direct address, this becomes a command: “Welcome Aunt Mary!”

Another thing that is always set off with commas because it has no grammatical role is an interjection. Many interjections appear in company with the exclamation point that underlines their emphasis: Hey, you, outta my yard! (interjection followed by direct address); Oh, man, is it cold! (two interjections back to back, or perhaps again an interjection followed by direct address); Haven’t seen you in, jeez, 30 years! (euphemistic interjection in midsentence). Swear words and obscenities not participating in the grammar are set off as interjections: “Shit, where’s my cell?” but “Get your shit together.”

There is a new animated Christmas special premiering this December called Yes Virginia. The first word is an interjection, and the second is a direct address, providing two imperatives for separating these words with a comma. I suppose it’s too late for them to change all the ads and titles and unembarrass themselves? Gee, guys, that’s a shame—on you.

If you’re talkin’ ta me, baby, you better get it right.

This is article 17 in a continuing series. © 2009 Christine C. Janson

Yawning Emptiness

Humans are communicative critters. We trumped the animal kingdom’s grunts and whistles by inventing language, made up words and rules for stringing them together to yield meaning. After a few millennia to work out the kinks, we rose above ourselves with poetry, drama, rhetoric, and logic. We figured out how to record the words and preserve them, from cuneiform to alphabets to binary code, from clay tablets to parchment to CD-ROM. From the beginning, we also devised ways to subvert the communication that is the very reason for language. We invented lies and other prevarications, giving rise to legal systems for determining guilt and teasing the truth out of conflicting accounts. And we found ways to use words to say nothing at all.

Saying nothing at all ranges from the long-windedly verbose, like the seasoned politician who can speak stirringly for an hour and convey not one phoneme of real meaning, to the monosyllabically iterative, like those who “um” after every third or fourth word.

In current American idiom, there are several go-to phrases for saying nothing and filling the silence while you gather your thoughts. Two of the most dreaded are “you know” and “like.” Speakers in the habit of using these cannot be listened to for very long, because after the first two, every succeeding “you know” or “like” elicits a bigger wince, until the listener risks whiplash or assault charges.

There are also words that are mindlessly overused to the point that they lose all meaning. Right now, when I hear “amazing,” I am no closer to knowing what the speaker means, beyond general approval, than if he had not spoken. I’ve also heard enough of “actually,” which doesn’t actually mean anything most of the time. As for “toe-tally,” I’m not going there. These words could all be replaced with the nonsense syllable blah without changing the informational content one bit. However, blah would not carry the emotional charge, the thumbs-up of “amazing” or the emphasis of “toe-tally.”

Another way of using language to convey no information beyond emotional content was, until recently, not permitted in public. Now, only G-rated movies are guaranteed to be free of four-letter words and swearing, and obscenities can be heard on cable channels other than the pay-through-the-nose premiums. I am not against this. The words exist; people use them; I use them; it is unreal to portray the world entirely without them. However, I am no more inclined to listen to “fucking” three times per sentence than I am to listen to “you know” at the same frequency.

The constant bleeping of four-letter words on reality shows is bad enough. I am astonished that people resort to them as a matter of course, especially in front of cameras, knowing they will be aired (and bleeped) on national television. Swearwords are intensifiers; they allow us to express pain, ill will, frustration, and anger without being specific. But a heartfelt “Jesus H. Christ!” when you stub your toe is one thing. A routine “Eat your fuckin’ vegetables, for Christ’s sake” at the dinner table is another. These words have no intended meaning beyond the expression of negative emotion, i.e., there is no actual reference to sex acts or deities. Language like this is a slap in the face, a confrontational way to say “Hey! Wake up! Listen to me! I mean it!” It’s hard for me to believe people are so ready to slap family, friends, and strangers alike.

Constant bleeping is bad enough; worse is the constant cussing on scripted shows such as The Sopranos and Deadwood. Lured to watch by rave reviews, I have never sat through an entire episode of either, because after the first ten or twenty uses of fuck and goddamn, about five to ten minutes, I’ve had enough and hit the remote. Slap someone often enough and they’ll go numb. Intense language loses intensity through overuse, until intensity can only be maintained by increased density of use. When every utterance is redlining it linguistically (“The fuckin’ thing don’t fuckin’ work unless I fuckin’ beat on it”), the intensifiers lose all effect, and we are left with emptiness that echoes with negativity. The speaker is saying nothing just as vehemently as he can, shouting “Blah!” at top volume every few syllables.

I will, reluctantly, concede that perhaps people do talk to each other like this, with complete disrespect and belligerence, even within families. Reality TV is unpleasant proof of the ubiquity of bad language. I will not concede that such language is either necessary or acceptable as dialogue.

Drama may reflect life, but it’s life with most of the quotidian details mercifully left out. Real people visit a bathroom every few hours. That doesn’t mean we have to watch the characters in a play or movie interrupt the action to do the same in the name of verisimilitude. Unless it’s part of the story, we aren’t subjected to belches, nail biting, hiccups, nose blowing, or a thousand other common human acts. We don’t need to see every mouthful of food chewed and swallowed. We don’t want characters to spew “you know” and “like” multiple times in every sentence even though real people do, because they’re boring and annoying and turn the dialogue into Swiss cheese, riddled with empty spaces. And there’s no reason we should have to listen to a lot of meaningless cuss words that have had all the intensity sucked out of them. To hear “Fuck you!” once in a two-hour movie is shocking. To hear it thirty or forty times in a one-hour episode is just a bore, lots and lots of empty space between meaningful words. So much emptiness makes me yawn and go elsewhere, for characters who reveal the story through their words instead of slapping me silly with them.

This is article 16 in a continuing series. © 2009 Christine C. Janson

Sunday, November 29, 2009

Posting-It Notes

Congratulations! You have a blog! You have an outlet for all the thoughts in your head and experiences in your life, a way to communicate with the entire world one on-line reader at a time.

Perhaps you use it as a journal for personal reminiscences and ponderings. Perhaps you have chosen a very specific topic, no doubt your own obsession, be it baseball or Star Trek or puppets. (Mine, of course, is language.) Perhaps you see it as your version of Oprah’s O magazine, presenting a variety of material all centered on your world view. Or perhaps you intend it as a way to keep in touch with family and friends, an ongoing version of the e-mailed Christmas letter detailing everyone’s busy doings.

The act of posting material to a blog, no matter its intended purpose, is equivalent to publishing it, i.e., making it public. You may choose to restrict the size of that public, or the quality of your writing may restrict it for you. I can envision several basic scenarios for composing and posting a blog entry. Each will produce a very different caliber of material, which will greatly influence whether others might want to read that blog or return to it. These scenarios apply to blogs that are primarily essays; those that provide a service or database or carry far more images and videos than words have other means to attract followers. However, a visitor is more likely to view an image or video if there are words to lure him in…

Slapdash and Sloppy
You’ve just been out on an exhilarating mountain hike and can’t wait to share the experience. Seconds after walking in the door, your hiking boots are off and you’re at the keyboard, logging into your blog. First you upload and arrange half a dozen photos and caption them minimally (e.g., “Me and Joey at the top”). Then, with the cursor settled into the composing window, you start to write. Your thoughts spill out one after the other as they occur, with no attempt to order or arrange them for a sense of progression or continuity, never mind paragraphs. You’re typing too fast to worry about typos. You aren’t concerned about and might not recognize misplaced commas, sentence fragments, or subject-verb disagreements. Clichés abound because originality takes time and thought. You fall easily into texting and chat room habits, abbreviating all sorts of things into technoglyphs (for example, l8r). When your thoughts have been exhausted or dinner beckons, you hit Post, wait impatiently for confirmation, and log out.

Prognosis: Poor. Only friends and relatives will have any reason to slog through this stream-of-consciousness lack of style, replete with misspellings, random punctuation, grammatical hash, and other aggravations that hinder comprehension and enjoyment. The writer himself isn’t interested in reading it, hasn’t bothered to go back to correct errors or organize. Even if the experience was extraordinary, the attempt to capture it in words was haphazard and lazy at best and a failure at worst. The account serves mainly as a record of events, and the purpose does not encompass either poetry or philosophy. This is the writing equivalent of a crayoned drawing, appropriate for family viewing on the refrigerator door, not good enough for a frame or display in the living room.

It is my impression and my fear that this is the method employed by a great many bloggers. If I am correct in this impression, there is a whole lot of unreadable crap floating around in cyberspace. That’s okay, provided nobody expects me to read any of it. (Unless, of course, we share genes, in which case I will find it charming, just as I would find the artwork on the fridge charming.)

Considered and Careful
You’ve had a great idea for your blog on wallpaper. Before logging in, you spend some time thinking not only about what you want to say but what order you should say it in. Perhaps you even jot down a few notes to refer to as you compose. You take your time while writing, think about structure and flow as you go. As a prose stylist, you reach for metaphor and simile, enjoy alliteration and humor. You choose the best images and arrange them to work with the text, caption them pithily. You then read over what you’ve written, correcting typos and punctuation, consulting a dictionary for suspect spellings, perhaps even reaching for a thesaurus to avoid using the same descriptors again and again. You discover and correct a sentence fragment, a discordant subject and verb, a dangling participle, an unclear antecedent. One more quick read satisfies you that your piece is presentable, and you post it. If you’re the skeptical type or just enamored of your own writing, you go immediately to the blog to view it and perhaps read it one more time. Once it’s been posted, you probably won’t change it and may never read it again.

Prognosis: Fair. Because the writer’s subject is near to her heart and she presumably has some expertise, the blog has a good chance of being interesting. Because she has taken the time and trouble to edit the original composition, it also has a chance of being both readable and comprehensible. A blog that is interesting and comprehensible will attract followers, if only among those who share her obsession. This is the writing equivalent of an artwork created for a gallery show, worthy of being framed for viewing by the people who visit the gallery, some of whom will appreciate it more than others. Forethought and afterthought have both been used to narrow the focus and streamline the progression; editing has been applied to root out distracting errors and points of possible confusion. A few errors will no doubt slip through, but not enough to be annoying.

This, in my opinion, is the minimum level of effort required to turn out a readable blog. Juicy content will only balance bad writing up to a point. Dry content will have even less weight. Regardless of content, better writing will always mean more readers. I would be willing to read a piece written to this standard, but the chances that I would go back for more are only 50-50, depending on the topic and the style.

Prepared and Perfected
It’s time to work up a new piece for your blog about politics, a subject about which your opinions have the weight of knowledge and experience. From a handy list of ten to twenty topics, you pick one and begin turning it over in your head, figuring out not only what you have to say but how to organize your presentation into a coherent whole. You want to create a suck-’em-in beginning, an opinionated, informative, entertaining center, and a satisfying conclusion. You love to find a title that’s clever or punny and will resonate with multiple meanings as the reader moves through the piece. The perfect opening sentence can take several days to construct, but once you have it the rest of the piece falls into place behind it in your head.

You then compose your first draft, whether on paper or directly on the computer with Word (or whatever). As you write, you stop constantly to go back and make changes and check how the argument is developing. Once the draft is complete, you edit ruthlessly. Any mistakes missed while composing, including uncertainties of fact as well as spelling, are found and corrected now. Dictionary, thesaurus, and other references are in reach or standing by. You transpose words, sentences, even blocks of text to improve the flow, add things you forgot or thought of later and delete things that are irrelevant or interrupt the argument, no matter how interesting they may be. You agonize over word choice, not just for meaning but for meter and music, and delight in wordplay and truly original use of words. Irony, hyperbole, synecdoche, and all those other curiously named literary tricks are part of your writing toolkit.

Finally, after going through the thing ten or twenty or thirty times, you have a piece that is perfect grammatically and polished stylistically. When you go to your blog and hit New Post to open the composition window, instead of typing, you simply paste in a copy of the file created with all the tools available in Word. You then go through it to restore lost formatting and format it further with the blog tools. After one final readthrough to ensure there are no errors, you post the piece and immediately go to view it. You can’t help but read through it one more time because you’re always tickled when you see your work “published.” It’s entirely possible you’ll find one or more small errors despite all the earlier editing and proofreading, and you make the effort to edit and repost the piece. Over the next few weeks you may actually go back and tweak the piece a bit as you think of ways to make it even better.

Prognosis: Excellent. This is the level of effort, talent, and sheer fussiness required to turn out a piece of writing that will delight as well as inform. Even if readers don’t share his obsession, his passion and persuasiveness will capture their interest. That interest will not be diverted by errors or infelicities of language but deepened by appreciation of his wit and his heart. This is the writing equivalent of a masterpiece, evidence of qualification for the rank of master, worthy of display, if not in a museum, then at least in a highly frequented public space. This is his best, created with every tool and skill at his command. There are no errors of composition or of fact (well, maybe a little one now and then; nobody’s perfect). The reader can simply enjoy the story or argument as it unfolds, be edified by reliable information and entertained by cleverness.

A blog written to this standard will attract numerous readers once word gets out, because it can be enjoyed by those who don’t share the writer’s genes or passion for the topic but can appreciate his skillful expression of it. This is the standard I strive for in my own writing. I would dearly love to discover blogs of this caliber on any number of subjects. Suggestions are welcome.

Whether you think my blog isn’t as good as all that or you think I’m meeting my goals splendidly, your comments are also welcome.

This is article 15 in a continuing series. © 2009 Christine C. Janson

Thursday, November 19, 2009

Comma-cal Anarchy

The idea that commas represent oral or mental pauses is a touchy-feely excuse for punctuation anarchy. My mother, bless her, would say to me, “Wherever I would pause in speaking, I put in a comma.” I would explain patiently that there were real rules for commas based on actual grammatical structures. She would listen without arguing until I finished, and after a moment of silence, she would say, “Well, I use a comma wherever I would pause in speaking.” We had this exact conversation at least three or four times. One literally could not tell my mother anything she didn’t want to hear. How’s your ear-brain connection functioning?

Myself, I have no memory at all of being taught the correct use of commas in school, elementary or high. I placed out of Freshman Composition, so I have no idea what’s taught at the college level, but I’m betting they don’t spend much time on commas either. I learned most of my grammar in Latin class, and I learned about commas by reading a style manual called Words Into Type when I was twelve, followed by a graduate course during my training as a copy editor.

I believe many people use commas more or less at random, the “pause = comma” dictum their only guideline. Even in the classroom, mistakes often go uncorrected, and when a change is made, the mere deletion or insertion of a comma teaches nothing if there is no explanation. Indeed, the change itself may be wrong, as ignorance, misinformation, and idiosyncrasy are widespread, even among those we might account experts, such as English teachers. The bewildered student may well wonder why this comma was added or that one was taken away; few are interested or persistent enough to actually plumb the mystery on a case-by-case basis. Chances are the person who made the change can’t explain it logically anyway.

Years ago, as part of a class on writing for publication, I wrote several magazine-style articles, all of which used commas according to the rules. The instructor routinely added commas between subjects, between verbs, or between objects when there was more than one in a given sentence. Not one of them was grammatically defensible. His argument was that the reader “needed a break” within a long sentence. Hemingway I am not; my sentences do run long, and thus he was adding an extraneous comma or two to nearly every sentence. (I have since taught myself to construct shorter sentences, which is the correct fix.) He was a trained writer, a published author, and he was just plain wrong.

The rules for basic comma use in the construction of English sentences are actually very simple, very straightforward, and very logical. Summed up very briefly, commas are used between independent clauses, after introductory elements (some leeway here), between the items in a list, and to set off parenthetical elements. Believe it or not, that pretty much covers ninety percent of all licit commas. The rules can be bent for stylistic reasons, but you should know them before you start bending them.

Failure to use commas where they are required is one thing. Sticking them in pointlessly occurs just as often. Again summed up very briefly, do not use a comma (unless it is part of a list or parenthetical element) between a subject and verb or verb and object, between an adjective and its noun, after a conjunction, or to set off restrictive material. Again, believe it or not, that’s pretty much it.

Unfortunately, to use these rules, you have to be able to identify a subject, verb, and object, a clause, a conjunction, and both restrictive and nonrestrictive elements. Perhaps that’s where everything gets fuzzy and mysterious. So much easier to just stick in a comma at every pause…

These grammar-based rules are the standard in any well-edited publication, such as the New York Times, should you care to verify this assertion with a piece of writing other than my own. Notice that following these rules gives readers very clear traffic signals as they move along. Here’s a comma followed by a conjunction, so a new clause is coming up, a new subject, a new thought. A comma placed within a clause (for instance, between dual predicates: “I slapped him, and listened to him cry”) violates that expectation, and the reader has to back up and regroup to grasp the meaning. Syntactically, it’s like a stop sign in the middle of the block instead of at an intersection, and we all know how annoying those are. Profligate use of commas can make it necessary to read every sentence two or three times to determine structure and sense.
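
Even a machine can read these traffic signals, and a machine is fooled by a misplaced stop sign exactly as a reader is. A toy sketch in Python (the pattern and the function are my own heuristic, nothing more):

import re

# Comma followed by a coordinating conjunction: the traffic signal
# announcing a new independent clause.
CLAUSE_BREAK = re.compile(r",\s+(and|but|or|nor|for|so|yet)\s+")

def clauses(sentence):
    # Split at each signal, keeping the conjunction with the clause it opens.
    parts = CLAUSE_BREAK.split(sentence)
    return [parts[0]] + [conj + " " + rest
                         for conj, rest in zip(parts[1::2], parts[2::2])]

print(clauses("I slapped him, and he cried."))
# Two clauses, just as signaled.
print(clauses("I slapped him, and listened to him cry."))
# Fooled: the signal promises a new clause, but no new subject follows.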

Because the rules are so clear cut, if you give the same piece of writing to two trained editors, they will make exactly the same comma changes, both insertions and deletions. Moreover, they will be able to explain the reason for every change, and I guarantee the word pause will not come up.

This is article 14 in a continuing series. Formerly posted as Comma Chameleon. © 2009 Christine C. Janson

Sunday, November 15, 2009

Edless and Be-Eded

Let’s talk about participles!

Now that I’ve just lost half of my potential audience, let me assume that the readers who remain know that grammar is not just for geeks. It is a tool, and like any tool, it can be used well or badly. People who disdain grammar or any formal study of how words are put together think they can just set their thoughts down on paper as they occur and they’re done. However, this is like assuming that anything created with crayons is art. What’s good enough for the refrigerator door (think memo or chat room) probably isn’t good enough for a gallery (think magazine or blog), never mind a museum (think hardback or website). Just so for words. Forethought, structure, and judicious editing are all essential to good writing. If it’s a memo to yourself, be as slipshod and sloppy as you like. If it’s something you expect other people to read, you’ve got to follow the rules. An architect can scribble away on the drawing board designing castles in the air, but if he expects other people to live in them, he’d better bring them down to earth and make them structurally sound. Writing is no different. Sentences and paragraphs are constructed, and if they are badly constructed, they will collapse as surely as an unsupported roof and stun the reader with nonsense or aggravation. To avoid that calamity, you have to follow the rules.

Now, let’s talk about participles. (I promise the word dangling isn’t going to come up even once.) Participles are adjectives that are formed from verbs (sometimes from nouns; more on that later) and act as modifiers with verbal force. They describe an action in progress (present participles) or one that has been completed (perfect or past participles).

Present participles end in -ing: a swimming dog, a horrifying accident. They are used to form the progressive tenses: I am walking, he was talking, we will be falling down. Do not confuse present participles with gerunds, which also end in -ing but are verbal nouns: swimming is good exercise; I like painting. Gerunds are lots of fun and often misused and will be the subject of a future article.

Past participles used to be called perfect participles because they are used to form the perfect tenses, but I guess grammarians got tired of all that perfection. For regular verbs, the past participle, like the past tense, ends in -ed; irregular forms must be learned on an individual basis. Memorization involves the infinitive (to be, to go, to do, to teach, to drive, to walk); the present tense (I am, I go, I do, I teach, I drive, I walk); the past tense (I was, I went, I did, I taught, I drove, I walked); and the perfect tenses (I have [had, will have] been, I have gone, I have done, I have taught, I have driven, I have walked). Only the last of these exemplars, walk, is a regular verb.
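
Because the principal parts must simply be memorized, they amount to a lookup table, and a lookup table is easy to sketch. A few lines of Python, purely illustrative, with the verbs taken from the list above:

# Principal parts from the list above: (present, past, past participle).
PRINCIPAL_PARTS = {
    "be":    ("am",    "was",    "been"),
    "go":    ("go",    "went",   "gone"),
    "do":    ("do",    "did",    "done"),
    "teach": ("teach", "taught", "taught"),
    "drive": ("drive", "drove",  "driven"),
    "walk":  ("walk",  "walked", "walked"),  # the lone regular verb: just add -ed
}

def perfect(verb, tense="present"):
    # The perfect tenses are built from a form of 'have' plus the past participle.
    have = {"present": "have", "past": "had", "future": "will have"}[tense]
    return "I " + have + " " + PRINCIPAL_PARTS[verb][2]

for verb in PRINCIPAL_PARTS:
    print(perfect(verb), "|", perfect(verb, "past"), "|", perfect(verb, "future"))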

There is another kind of past participle. It is an adjective formed from a noun that has verbal force. Some of my readers have just gone cross-eyed trying to figure this out, so I will point out that eyed is an example of such a participle. It is formed from a noun (eye) and functions as an adjective (“eyed like an old potato”). This sort of participle causes more problems than the other sorts because of its complexity, not to mention its close association with hyphens as a unit modifier, or temporary compound.

Why should nouns be turned into adjectives, and how can they imply action? That’s just the way English works, and it’s one of the things that make English so fantastically versatile. You can’t say “fair-haired boy” in French, any more than you can say “Diana’s dress”; in French you must employ prepositional phrases and say “the boy with fair hair” (le garçon aux cheveux blonds) and “the dress of Diana” (la robe de Diana).

One very common error with participles is omission of the -ed ending. Sometimes the error is so commonplace it becomes accepted. One example is ice cream. Because it means cream that has been transformed with ice, it is properly called iced cream, but the error has become ingrained in the language. The same thing is happening to iced tea, a participle modifying a noun, which is more commonly seen as ice tea, which is two nouns side by side. We know what it means, but grammatically it is uncoordinated.

Things get even more complex with modifiers created by combining an adjective with a participle formed from a noun, as in cross-eyed. Here too, the common error is to leave the participial -ed ending off the noun portion. An old-fashion candy is like a blue-eye boy, grammatically uncoordinated; these modifiers should be old-fashioned and blue-eyed. Logically, an old-fashion candy might be candy that prefers old fashions; we need the verbal force of the participle to give the sense that the candy has been fashioned in a time-honored way. Similarly, a blue-eye boy might collect them, not have them; here the verbal undertone implies that he has been endowed with eyes of blue. To quote one grammar manual,* “With past participles…the noun being modified is the object of the verb underlying the participle.” Thus, in blue-eyed boy, boy is grammatically the object of eyed, which is not the case with the noun in blue-eye; blue modifies eye, but eye does not refer to boy in the way eyed does.
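
This noun-into-participle pattern is regular enough that a machine can apply it, spelling quirks and all. A toy function in Python (the names are mine, and real English spelling has exceptions this little sketch ignores):

VOWELS = "aeiou"

def participle(noun):
    # Noun + -ed, with two common spelling rules: a silent e takes just -d
    # (shape -> shaped), and a single final consonant after a single vowel
    # doubles (stem -> stemmed).
    if noun.endswith("e"):
        return noun + "d"
    if (len(noun) >= 3 and noun[-1] not in VOWELS
            and noun[-2] in VOWELS and noun[-3] not in VOWELS):
        return noun + noun[-1] + "ed"
    return noun + "ed"

def unit_modifier(adjective, noun):
    # Hyphenate adjective + noun-based participle into a unit modifier.
    return adjective + "-" + participle(noun)

for pair in [("blue", "eye"), ("old", "fashion"), ("fan", "shape"),
             ("long", "stem"), ("red", "tail")]:
    print(unit_modifier(*pair))  # blue-eyed, old-fashioned, fan-shaped, ...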

What follows is a sampling of ed-less constructions culled from random sources. These samples are presented as they appeared, with or without hyphens, because the proper use of hyphens is a subject so involved it could well carry three or four articles.

From an ad for Kendall-Jackson wines, we have “Red Tail Hawk”; this should be red-tailed hawk, hyphenated, not capitalized. This beautiful raptor has an entry in the dictionary as well as in any field guide to birds, and one wonders why they didn’t bother to look it up before featuring it prominently in their ad. Because participles are so often used in temporary compounds, the dictionary won’t always be able to resolve problems, but in this case laziness was at fault.

A catalog page of four shoe styles, all with fringe, weirdly gets it wrong and right in the same space: the incorrect “fringe bootie” occurs three times, right next to the single correct “fringed clog.” Another catalog offers “one dozen long-stem roses,” but it should be offering long-stemmed ones. Yet another catalog says its “long-sleeve, mock-neck top is semi-fit,” thus making the ed-less error twice; I’d rather have a top that is long-sleeved and semi-fitted. (Don’t get me started on mock-neck; it isn’t really a turtleneck, but I guarantee you the neck is real.) The editing at Martha Stewart Living magazine is usually faultless, but a recent issue had both “fan-shape leaves” and “balloon-shape calyxes”; both of these noun-noun combos need to be shaped up.

A grocery store circular offered savings on “select sodas,” but I am certain the sodas are neither superior nor distinguished, just selected to go on sale. A clothing catalog described a sweater as having “full-fashion sleeves,” using an adjective-noun combo when what’s needed is an adverb and participle. Full-fashion sleeves might appeal to a trendy fashionista; fully fashioned sleeves are a hallmark of good construction in a knitted garment (not a knit garment). Finally, trout that have been split apart and laid open to resemble a butterfly are butterflied trout; the genetic engineers haven’t got around to breeding butterfly trout yet.

Now we turn from the edless to the be-eded, where people get confused and put -ed on the wrong word. The J. Jill catalog offered a “capped-sleeve cardigan sweater.” Cardigan sweater is redundant, and the sleeves are not capped with anything; they are cap sleeves, and the cardigan is cap-sleeved. The Talbots catalog wants us to consider a “mix-stitched cardigan,” but there is no mix stitch in knitting that I know of. The cardigan instead has a mix of stitches and is mixed-stitch. The West Elm catalog has for sale a “blocked paisley print duvet cover,” but I’m sure the print has not been blocked in any way. The action here is not blocking but printing. The use of a carved block of wood to stamp a fabric with a repeating design is known as block printing, and this item should have been described as a block-printed paisley duvet cover. The final example has no specific source but is an error I have seen again and again, involving confusion between advance (adjective) and advanced (participle). Advanced ticket sales must involve some new technology for selling tickets; advance ticket sales offer tickets in advance of the event.

Please don’t get down on me for taking examples from catalogs, advertisements, and magazines. These publications are less likely to be subjected to a trained editorial eye (some magazines excepted) and thus more closely represent the language as it is used by the average person, educated to some extent in its use but not dedicated to its study. Through chat rooms and blogs and e-zines, writings by such people are being “published” more widely than ever, and they are one of the groups Crotchets is intended to reach.

My point is that communication is not automatic. You have to work for it; you have to enable it; you have to construct it. For that you need to have the right tools and know how to wield them. Those tools include dictionaries and style manuals. Anyone who writes something intended to be read by another, even if it’s just a blog or a catalog description, owes it to her readers to use those tools to get it right. As with ice tea, we may understand it even if it isn’t exactly right, but we’ll have a better chance of understanding if it is right.

Communication is a two-way street. If you will do your best to be comprehensible, I’ll do my best to comprehend you. Write like you don’t give a damn, and neither will I; I’ll turn the page, close the window, surf away, and you will have wasted your time and mine. Do you want to reach people or repulse them? The choice is yours. (You might want to avoid mentioning participles in the opening sentence.)

*The McGraw-Hill Handbook of English Grammar and Usage, which I generally find far too apologetic and accommodating.

This is article 13 in a continuing series. © 2009 Christine C. Janson