Instant messaging, texting, and chat rooms are fueling the invention of an entirely new “written” language, one designed specifically for keypads and keyboards and mini-screens. This is the context of real-time, live communication via electronic devices with screens measured in millimeters, be it laptop or telephone. In this context, brevity is celerity. The fewer inputs you need to make, the more quickly your communication goes out and the “conversation” continues. The fewer inputs you receive, the more “words” can appear—and be comprehended—at one go. Brevity is also a plus when your thumbs are flailing like demented drumsticks over the tiny keys of your smartphone.
About a year ago, I coined the word technoglyph to describe the variety of symbols used to facilitate this new form of written communication. I am ready now to assert that we need this word and attempt to define it.
There are many ways to shorten and compact the standard written forms of words and phrases, and technoglyph is an umbrella term that includes all of them. Chief among these methods is abbreviation, but of a newfangled sort. The old-fashioned rule was that abbreviations formed by taking the first letter of each word should be capitalized. The multiple steps required to obtain a capital on dumbphone keypads and mini-keyboards have shut this rule down, leaving us with ttfn, j/k, lol, and many others. The capital I of the first person is also giving way to lower case, and the use of an i for an I qualifies as a technoglyph, as it is not standard English (although it may become so if this habit becomes ingrained). Many of the new abbreviations common to texters are for social phrases that acknowledge the live connection in as few letters/inputs as possible (ttfn, for example).
Soundalikes also qualify as technoglyphs. For example, “c” is a technoglyph for the word see, as “u” is for you. They are not abbreviations. I cannot comfortably call them homophones, because they aren’t strictly words, although the definition of homophone has been extended to include single characters. As a puzzle enthusiast, I see them more as rebuses, symbols or pictures representing a soundalike word. Furthermore, in the texting context, “c” and “u” are more than simply homophone or rebus; they are not interchangeable with other soundalikes. For instance, “c” also sounds like sea, but I doubt a sailor is going to text “Set out to c 2day.” Likewise, a shepherd will not text anyone “Lost u 2day” to report a missing ewe, because it will be misunderstood. The “c” and “u” are not so much symbols or soundalikes as mnemonics for specific standard English words that occur constantly in interpersonal communications.
Numerals are also used as soundalikes/rebuses/mnemonics for common English words and sounds. They can occur alone (2 for to, 4 for for) or in conjunction with letters or pieces of words (2nite, l8er). As time and input savers, they work splendidly for very common words and expressions, not so well for the more exotic, un42nately.
Emoticons were among the earliest technoglyphs. There is an entire glossary of these punctuation-based faces, winking and frowning and sticking out their tongues. Three finger strikes (a colon, a hyphen, and a closing parenthesis), and we have communicated “I am smiling/happy/pleased.” That’s a lot of punch for three strikes! (Actually, I suppose the shift needed for the parenthesis would make it four strikes, but still.)
The use of @ to stand for at greatly broadens the application of an old handwritten symbol once confined to matters of math and commerce. Also on the upsurge is $ to stand for money in general as well as for dollar specifically. We don’t use £ for pound because it isn’t available on a keyboard, never mind a keypad. That fact allows me to classify $ and @ as technoglyphs.
Now let me attempt to define technoglyph. It may be an abbreviation, but one designed for the limitations and niceties of real-time communication. It may be a soundalike, but only for words in common use for personal interactions. It may be a rebus, but it will have only one solution. It may be an old handwritten symbol revamped for vigorous new-age use. It may involve nonstandard use of a standard word, as an i for an I. In each case, I believe the word that best describes the reduction process is mnemonic. Each of these symbols, abbreviations, rebuses, and soundalikes is intended to remind us of a standard English word or phrase with the fewest possible key strikes.
Here is one possible definition: A technoglyph is a mnemonic for a common word or phrase, devised specifically for communication (especially real-time live communication) via keyboard or keypad in applications such as instant messaging, chat rooms, and texting.
This is article 21 in a continuing series. © 2011 Christine C. Janson
Monday, January 17, 2011
Yule Know Next Time, Round 2: Deep Research
Ah, December! The darkest month of the year, when we lose daylight at the depressing rate of more than two minutes per day, until the winter solstice marks the turning point. Humans crave light at this dark time, and electricity allows us to indulge this craving shamelessly. No wonder the ancients turned the solstice into a weeks-long festival of lights.
Let’s see how much you know about this very important astronomical event. The winter solstice appears on calendars and ephemerides as an exact date and time (to the minute) that changes every year. Do you know how the time is determined? Is it actually the shortest day of the year? Is it actually the first day of winter?
The answers, surprisingly, are no and no. The shortest day of the year in fact occurs several days before the official date of the solstice. The official date and time instead mark the movement of the sun out of Sagittarius and into Capricorn. And you thought astrology was irrelevant!
As for its being the first day of winter, that’s a lot of hooey. I have no idea who decided winter “begins” in late December, but it’s nonsense. Meteorologists sensibly count December first as the beginning of the winter season. The ancient Celts, also sensibly, held that winter begins midway between the autumnal equinox and the winter solstice, at the holiday they called Samhain (pronounced sowan) and we know as Halloween, which was also their New Year’s Eve. Interesting that we now switch from daylight saving to standard time at the end of October, proving perhaps that old ideas don’t die; they just get repurposed.
In the Celtic calendar, all the seasons begin midway between an equinox and a solstice, and each has its holiday. Spring starts at the beginning of February with a festival once called Imbolc and now “celebrated,” much corrupted, as Groundhog Day. Summer begins with Beltane, now called May Day, and autumn begins in early August at a holiday called Lammas or Lughnasa. Sadly, this occasion is no longer remembered on American shores, though I believe it is still observed in Ireland. This older calendar is why the winter solstice is known to us confusingly as both the first day of winter and the time of the midwinter festival.
In Scandinavia, which experiences the most profound winter darkness on the European continent, and in the countries where the Vikings and Norsemen carried their traditions, including the British Isles, the weeks leading up to and following the winter solstice were known as Yule. When I looked up Yule in my dictionary, however, I was dismayed to see it defined only as “the feast of the nativity of Jesus Christ.” I’m sorry, but it is not. The derivation admits that the word is the Old Norse name for a pagan midwinter festival. Yes, exactly. That festival was celebrated for thousands of years before the Christians coopted it, and I object to its total omission from the definition. I have the same objection when I encounter the saying that “Jesus is the reason for the season.” Again, no offense intended, but he is not, at least, not originally and not solely. Celebration of the season predates Jesus by millennia.
Americans wish each other a Merry Christmas; in England it’s a Happy Christmas. But 2,000 years into the Christian era, the Swedes still greet their fellows with Good Yule (God Jul), and in France, it’s Joyous Noël (Joyeux Noël). Merriam-Webster’s defines noël as meaning Christmas or carol, says it’s derived from the Latin natalis, meaning birth, and cites its earliest documented use as 1811, in a book of old French songs. I find that almost impossible to believe. The Oxford English Dictionary defines noël simply as a song, with the same first-citation date, but refers the reader to nowel. This word it defines as a joyous shout, akin to hurray or hallelujah, with a 14th-century citation from Chaucer, although it also traces the origin back to the Latin natalis. Again, I’m not buying it. The French had no French word for Christmas before the 19th century and then “chose” a word that means song? Seems more than a little odd to me. That’s like the English wishing everyone a Happy Carol.
Nowel began life as a joyful cry and evolved to mean a joyful song sung at a joyful holiday. What are the chances that Noël, and the festival to which it refers, is actually as old in western Europe as Yule is to the north? Excellent, in my opinion. For that matter, what are the chances that Noël is derived from the ancient word Yule brought to French territory by the Norsemen who settled in Normandy? It’s less of a stretch from Yule to Noël than it is from natalis to Noël, that’s for sure. Etymologists, consider this a challenge!
Cultures that experience the change of seasons generally have a long festival at midwinter, a festival of lights to chase back the darkness and call upon the sun to return. The Chinese have the Festival of the Lanterns. Even as far south as the Mediterranean, there is Hanukkah, an eight-day event. The Romans had the Saturnalia in mid-December, during which gifts were exchanged. Saturn was the god of harvests; consider the symbolism of holding his festival months after harvest is complete and months before spring planting will begin, when the earth seems barren. Also consider that saturnalia has come to mean a drunken orgy, and we can guess that the Romans celebrated rather lustily, making the beast with two backs to encourage the land to be fruitful once more, a primitive form of magick.
In a festival of lights, people light up the dreary darkness, whether with pre-electricity candles and Yule logs and firecrackers or with modern icicle lights and glittery tinsel and rainbow LEDs. Here in the United States, we string lights on everything from trees to porch columns to fences and place electric candles in our windows like beacons of hope. We bring evergreens into our homes as a symbol that life will be renewed, that the deadness of winter cannot defeat the vitality hidden within soil and branch and seed.
The colors of the season are obvious choices. Evergreens supply the only spots of living color in the white and brown and gray monotony of the bleak winter landscape, and green’s opposite is the cheery and warming red that can lift our spirits and gladden our hearts. Nature herself loves this combination, exemplified by holly’s shiny green leaves and bright red berries. People have been decorating their homes at midwinter with living green branches and garlands of aromatic pine and cedar and sprigs of white-berried mistletoe (sacred to the Druids) for almost as long as homes have existed, and the Christmas tree is just the most recent expression of that impulse.
Pagan or Christian, the midwinter festival celebrates coming out of the darkness and the rebirth of the world. You don’t have to be a Christian to celebrate Yule with a joyful song, just a human who craves the return of the sun and its light and warmth. A pagan, after all, is simply one who dwells in the country, close to Nature and sensitive to her rhythms and moods and mysteries.
In March I’ll post a piece about the origins of Easter, another pagan holiday coopted by the Christians. Watch for it!
This is article 19 in a continuing series. © 2011 Christine C. Janson
Sunday, January 3, 2010
Yule Know Next Time
Ah, December! The darkest month of the year, when we lose daylight at the depressing rate of more than two minutes per day, until the winter solstice marks the turning point. Humans crave light at this dark time, and electricity allows us to indulge this craving shamelessly. No wonder the ancients turned the solstice into a weeks-long festival of lights.
Let’s see how much you know about this very important astronomical event. The winter solstice appears on calendars and ephemerides as an exact date and time (to the minute) that changes every year. Do you know how the time is determined? Is it actually the shortest day of the year? Is it actually the first day of winter?
The answers, surprisingly, are no and no. The shortest day of the year in fact occurs several days before the official date of the solstice. The official date and time instead mark the movement of the sun out of Sagittarius and into Capricorn. And you thought astrology was irrelevant!
As for its being the first day of winter, that’s a lot of hooey. I have no idea who decided winter “begins” in late December, but it’s nonsense. Meteorologists sensibly count December first as the beginning of the winter season. The ancient Celts, also sensibly, held that winter begins midway between the autumnal equinox and the winter solstice, at the holiday they called Samhain (pronounced sowan) and we now know as Halloween. Interesting that we now switch from daylight saving to standard time at the end of October, proving perhaps that old ideas don’t die; they just get repurposed. In the Celtic calendar, all the seasons begin midway between an equinox and a solstice, and each has its holiday. Spring starts at the beginning of February with a festival once called Imbolc and now “celebrated,” much corrupted, as Groundhog Day. Summer begins with Beltane, now called May Day, and autumn begins in early August at a holiday called Lammas or Lughnasa. Sadly, this occasion is no longer remembered on American shores, though I believe it is still observed in Ireland. This older calendar is why the winter solstice is known to us confusingly as both the first day of winter and the time of the midwinter festival.
In Scandinavia, which experiences the most profound winter darkness on the European continent, and in the countries where the Vikings and Norsemen carried their traditions, the weeks leading up to and following the winter solstice were known as Yule. When I looked up Yule in my dictionary, however, I was dismayed to see it defined only as “the feast of the nativity of Jesus Christ.” I’m sorry, but it is not. The derivation admits that the word is the Old Norse name for a pagan midwinter festival. Yes, exactly. That festival was celebrated for thousands of years before the Christians coopted it, and I object to its total omission from the definition. I have the same objection when I encounter the saying that “Jesus is the reason for the season.” Again, no offense intended, but he is not, at least, not originally and not solely. Celebration of the season predates Jesus by millennia.
Cultures that experience the change of seasons generally have a long festival at midwinter, a festival of lights to chase back the darkness and call upon the sun to return. The Chinese have the Festival of the Lanterns. Even as far south as the Mediterranean, there is Hanukkah, an eight-day event. The Romans had the Saturnalia in mid-December, during which gifts were exchanged. Saturn was the god of harvests; consider the symbolism of holding his festival months after harvest is complete and months before spring planting will begin, when the earth seems barren. Also consider that saturnalia has come to mean a drunken orgy, and we can guess that people celebrated rather lustily, making the beast with two backs to encourage the land to be fruitful once more, a primitive form of magick.
In a festival of lights, people light up the dreary darkness, whether with pre-electricity candles and Yule logs and firecrackers or with modern icicle lights and glittery tinsel and rainbow LEDs. Here in the United States, we string lights on everything from trees to porch columns to fences and place electric candles in our windows like beacons of hope. We bring evergreens into our homes as a symbol that life will be renewed, that the deadness of winter cannot defeat the vitality hidden within soil and branch and seed.
The colors of the season are obvious choices. Evergreens supply the only spots of living color in the white and brown and gray monotony of the bleak winter landscape, and green’s opposite is the cheery and warming red that can lift our spirits and gladden our hearts. Nature herself loves this combination, exemplified by holly’s shiny green leaves and bright red berries. People have been decorating their homes at midwinter with living green branches and garlands of aromatic pine and cedar and sprigs of white-berried mistletoe (sacred to the Druids) for almost as long as homes have existed, and the Christmas tree is just the most recent expression of that impulse.
Pagan or Christian, the midwinter festival celebrates coming out of the darkness and the rebirth of the world. You don’t have to be a Christian to celebrate Yule, just a human who craves the return of the sun and its light and warmth. A pagan, after all, is simply one who dwells in the country, close to Nature and sensitive to her rhythms and moods and mysteries.
In March I’ll post a piece about the origins of Easter, another pagan holiday coopted by the Christians. Watch for it!
This is article 19 in a continuing series. © 2010 Christine C. Janson
Monday, December 7, 2009
Ignorance and Arrogance
Many years ago, I was given a paperback copy of a book called The Mother Tongue: English and How It Got That Way. So crammed are my shelves with unread books that I’ve only just gotten around to reading it. I stumbled across it by chance a week ago and was drawn to it because I had covered some of the same territory in an earlier article, Mispellers of the World, Untie! I wanted to see what the author, Bill Bryson, had to say and see whether I had left out anything major.
Most of the book is a discussion of research and histories done by others. Nowhere does Mr. Bryson mention any original research done by him, and his name does not appear in the bibliography, so I’m guessing he has not published in this field previously. None of the books in the bibliography was published before 1931, and there are no primary sources, such as Chaucer’s Canterbury Tales. There is no information about the author at all; his academic background and current profession are unknown. According to the blurbs all over the cover, this book was a hardcover bestseller and well reviewed; it even earned the accolade of “scholarly” from the Los Angeles Times. The writing style is breezy and fun, more magazine than academe, which of course adds immensely to readability. But the author betrays such a fundamental lack of understanding of the basic structures of English that I am astounded he got his book published.
In his chapter on English grammar and its oddities, Mr. Bryson states that in English, “A noun is a noun and a verb is a verb largely because the grammarians say they are.” He supports this by giving a list of nouns that are also verbs, such as charge and towel. But the statement is arrant nonsense. A noun is a word that functions as a noun; a verb is a word that functions as a verb. In the sentence “The charge appeared on my statement,” the word charge is being used as a noun, and grammarians are as powerless to turn it into a verb as they are to turn it into gold. Only native speakers can decide how to use a given word, setting its function through use, and if they want to change its use, they have to construct a new sentence for it. The fact that English has so many multifunctional terms is a tribute to its unique versatility. A word’s function (noun, verb, whatever) is not revealed until it is actually used; no one can look at the isolated word charge and declare it noun or verb, because it has the potential to be either. This is a very neat trick, not a shortcoming, and cannot be done in many other languages.
Claiming that “the parts of speech are almost entirely notional,” Mr. Bryson offers the examples “I am suffering terribly” and “My suffering is terrible.” He says the grammarians would call suffering a verb in the first but a noun in the second, but in his opinion both sentences use “precisely the same word to express precisely the same idea.” Well, no. Technically, the first suffering is a present participle, a verbal adjective, and the second is a gerund, a verbal noun, both of which are derived from the same verb, suffer. It is thus not at all odd that they should express the same idea, but the verb has been inflected in different ways, to form a participle (adjective) to use in the present progressive tense and to form a gerund (noun) to use as a subject. Every English verb has the ability to become a noun or an adjective by the addition of -ing; which it is is strictly a matter of how it’s used. As a native English speaker, the author has automatically used terrible to modify the gerund and terribly to modify the participle even as he claims they are modifying “precisely the same word,” proving that the language center in his brain is operating better than the reasoning center. It isn’t a noun or an adjective because grammarians say it is; it’s a noun or adjective because that’s how it’s functioning. There’s nothing “notional” about it.
Having said one puzzlingly harebrained thing, Mr. Bryson reveals even deeper ignorance of how his language works (the language, remember, he has dared to write a book about). In the same paragraph, he writes, “Breaking is a present tense participle, but as often as not it is used in a past tense sense (‘He was breaking the window when I saw him’). Broken, on the other hand, is a past tense participle but as often as not it is employed in a present tense sense (‘I think I’ve just broken my toe’) or even future tense sense (‘If he wins the next race, he’ll have broken the school record’).” These cavils reveal such a complete misunderstanding of basic grammar that I am left breathless. Throughout the book he cites Fowler, Copperud, and other well-known grammarians, but he has clearly been too selective in actually reading them. No authorities are cited in this section, but the lack of support for his pet peeve didn’t stop him from ranting. No research went into these inanities. There is nothing here but gibberish.
First off, there is no such thing as a “present tense” or “past tense” participle; a participle is an adjective and has no tense. Participles, present and past, are used to form various tenses. The present participle is used to form the progressive tenses (present, past, future, and perfect): I am walking, I was walking, I will be walking, I have been walking, I had been walking, I will have been walking. Likewise, the past participle is used to form the perfect tenses: I have walked, I had walked, I will have walked. Reexamine the statements that “present tense participles” are often used in a “past tense sense” and vice versa, and you realize that his statements make no sense at all, present, past, or future.
It gets worse. Mr. Bryson follows this arrogant demonstration of ignorance with one of boneheaded wrongness. I can only quote; paraphrase will not suffice. “A noun…is generally said [to denote] a person, place, thing, action, or quality. That would seem to cover almost everything, yet clearly most actions are verbs and many words that denote qualities—brave, foolish, good—are adjectives.” These arguments are meant to shore up the assertion that “the parts of speech must be so broadly defined as to be almost meaningless.”
Not in my universe, bub. He has ignored or overlooked the fact that a noun expresses an action or quality in a different way than a verb or an adjective does, and it is not uncommon to have closely related words (cognates) in multiple functional categories (e.g., sleep as noun, sleep as verb, sleepy or sleeping as adjective, sleepily as adverb) so that statements about a topic can be made in multiple ways. How is this a failing?? The noun is bravery, the adjective is brave; they both describe a quality, and each can be used to express a thought about heroism. How does that render the categories of noun and adjective themselves meaningless? Wouldn’t it be a bitch if we always had to use sleep as a noun and cast every sentence to accommodate that inflexibility?
I repeat, in English a word is characterized by how it is used, and native speakers decide how any given word may be used by using it that way and being understood. Mr. Bryson would seem to prefer a language in which the nouns were always and forever nouns and referred very solidly and concretely to things, and so on. This is not only impossible, it is supremely undesirable. It takes away all possibility of wordplay and inventiveness, not to mention growth and change.
I cannot believe this book was ever subjected to an editorial eye. No editor worth her salt would have allowed this nonsense to stand. Although I enjoyed other sections of the book, once I had read this chapter, I could no longer trust any statement the author made that I didn’t already know to be true. “Scholarly,” my ass. I rather doubt the reviewer read the whole thing. Who knows what other idiocies lurk beneath the breezy exposition? I usually resell or donate my unwanted books, but this one is going in the recycling bin as too worthless and too dangerous to pass on. I am as puzzled and outraged as if I had come across arguments for a flat earth in a book on geography.
This is article 18 in a continuing series. © 2009 Christine C. Janson
Hey! You Talkin' ta Me?
Reality shows sometimes resort to on-screen captioning when the dialogue goes mumbly or gets scattered by noise. News programs do the same thing, for example, for heavy accents and 911 tapes. People do not speak orthographically; what they say must be interpreted into the symbols we call writing, and those symbols include more than letters. These on-screen transcriptions do a reasonably good job of presenting the spoken word within the standard expectations of spelling and usually (not always, alas) get the homophones correct. But punctuation is a different story. In particular, nobody seems to understand that direct address requires distinctive treatment to avoid syntactical hash.
We interpret the spoken word differently than we do the written word. The spoken word is always context specific; the written word is always outside the context and must be specified. If the person next to you says “The house is on fire!” you hear the urgency, and you can probably turn and see the flames and feel the heat. If you read the same words, you’re probably far from the fire in time and space, but the quotation marks tell you someone actually spoke the words, and the exclamation point conveys the sense of urgency. Like tone and emphasis in speech, punctuation works with syntax to create meaning in a written communication.
When you address someone directly, that is, call them by their name or title or honorific (e.g., sir), that instance of direct address is grammatically isolated from any other part of the sentence in which it appears: we say to Bob, “I heard, Bob, and I laughed.” In speech, we can emphasize this separation by a slight pause, but because Bob is actually present, context alone makes the meaning clear. In writing, we must isolate the address with commas as a parenthetical element that is not participating in the grammar. (I.e., the commas are not replicating the spoken pause so much as they are visually fencing off that which is grammatically irrelevant.) If we do not set it off, Bob becomes the direct object of the verb heard because, grammatically speaking, that is how we must read the sentence as written: “I heard Bob and I laughed.”
Failure to set off a direct address with a comma may cause great embarrassment or great amusement. A desktop sign seen in a recent catalog reads “Work with me people.” This is clearly advice to work with egotists, not a direct address pleading for cooperation, which would require a comma: “Work with me, people.” “John get the phone” is pidgin for “John gets (or is getting or got) the phone”; a demand that the phone be answered requires an indication of direct address and the imperative: “John, get the phone.” “Don’t hassle me dad” is a Briton’s command to leave his father alone; “Don’t hassle me, Dad” is a plea from son to father for some peace. Note the capital D on Dad. A title used as an address is capitalized: “This is my aunt Mary,” but “Welcome, Aunt Mary!” Now note that without the comma to show direct address, this becomes a command: “Welcome Aunt Mary!”
Another thing that is always set off with commas because it has no grammatical role is an interjection. Many interjections appear in company with the exclamation point that underlines their emphasis: Hey, you, outta my yard! (interjection followed by direct address); Oh, man, is it cold! (two interjections back to back, or perhaps again an interjection followed by direct address); Haven’t seen you in, jeez, 30 years! (euphemistic interjection in midsentence). Swear words and obscenities not participating in the grammar are set off as interjections: “Shit, where’s my cell?” but “Get your shit together.”
There is a new animated Christmas special premiering this December called Yes Virginia. The first word is an interjection, and the second is a direct address, providing two reasons to separate these words with a comma. I suppose it’s too late for them to change all the ads and titles and unembarrass themselves? Gee, guys, that’s a shame—on you.
If you’re talkin’ ta me, baby, you better get it right.
This is article 17 in a continuing series. © 2009 Christine C. Janson
Yawning Emptiness
Humans are communicative critters. We trumped the animal kingdom’s grunts and whistles by inventing language, making up words and rules for stringing them together to yield meaning. After a few millennia to work out the kinks, we rose above ourselves with poetry, drama, rhetoric, and logic. We figured out how to record the words and preserve them, from cuneiform to alphabets to binary code, from clay tablets to parchment to CD-ROM. From the beginning, we also devised ways to subvert the communication that is the very reason for language. We invented lies and other prevarications, giving rise to legal systems for determining guilt and teasing the truth out of conflicting accounts. And we found ways to use words to say nothing at all.
Saying nothing at all ranges from the long-windedly verbose, like the seasoned politician who can speak stirringly for an hour and convey not one phoneme of real meaning, to the monosyllabically iterative, like those who “um” after every third or fourth word.
In current American idiom, there are several go-to phrases for saying nothing and filling the silence while you gather your thoughts. Two of the most dreaded are “you know” and “like.” Speakers in the habit of using these cannot be listened to for very long, because after the first two, every succeeding “you know” or “like” elicits a bigger wince, until the listener risks whiplash or assault charges.
There are also words that are mindlessly overused to the point that they lose all meaning. Right now, when I hear “amazing,” I am no closer to knowing what the speaker means, beyond general approval, than if he had not spoken. I’ve also heard enough of “actually,” which doesn’t actually mean anything most of the time. As for “toe-tally,” I’m not going there. These words could all be replaced with the nonsense syllable blah without changing the informational content one bit. However, blah would not carry the emotional charge, the thumbs-up of “amazing” or the emphasis of “toe-tally.”
Another way of using language to convey no information beyond emotional content was, until recently, not permitted in public. Now, only G-rated movies are guaranteed to be free of four-letter words and swearing, and obscenities can be heard on cable channels other than the pay-through-the-nose premiums. I am not against this. The words exist; people use them; I use them; it is unreal to portray the world entirely without them. However, I am no more inclined to listen to “fucking” three times per sentence than I am to listen to “you know” at the same frequency.
The constant bleeping of four-letter words on reality shows is bad enough. I am astonished that people resort to them as a matter of course, especially in front of cameras, knowing they will be aired (and bleeped) on national television. Swearwords are intensifiers; they allow us to express pain, ill will, frustration, and anger without being specific. But a heartfelt “Jesus H. Christ!” when you stub your toe is one thing. A routine “Eat your fuckin’ vegetables, for Christ’s sake” at the dinner table is another. These words have no intended meaning beyond the expression of negative emotion, i.e., there is no actual reference to sex acts or deities. Language like this is a slap in the face, a confrontational way to say “Hey! Wake up! Listen to me! I mean it!” It’s hard for me to believe people are so ready to slap family, friends, and strangers alike.
Constant bleeping is bad enough; worse is the constant cussing on scripted shows such as The Sopranos and Deadwood. Lured to watch by rave reviews, I have never sat through an entire episode of either, because after the first ten or twenty uses of fuck and goddamn, about five to ten minutes, I’ve had enough and hit the remote. Slap someone often enough and they’ll go numb. Intense language loses intensity through overuse, until intensity can only be maintained by increased density of use. When every utterance is redlining it linguistically (“The fuckin’ thing don’t fuckin’ work unless I fuckin’ beat on it”), the intensifiers lose all effect, and we are left with emptiness that echoes with negativity. The speaker is saying nothing just as vehemently as he can, shouting “Blah!” at top volume every few syllables.
I will, reluctantly, concede that perhaps people do talk to each other like this, with complete disrespect and belligerence, even within families. Reality TV is unpleasant proof of the ubiquity of bad language. I will not concede that such language is either necessary or acceptable as dialogue.
Drama may reflect life, but it’s life with most of the quotidian details mercifully left out. Real people visit a bathroom every few hours. That doesn’t mean we have to watch the characters in a play or movie interrupt the action to do the same in the name of verisimilitude. Unless it’s part of the story, we aren’t subjected to belches, nail biting, hiccups, nose blowing, or a thousand other common human acts. We don’t need to see every mouthful of food chewed and swallowed. We don’t want characters to spew “you know” and “like” multiple times in every sentence even though real people do, because they’re boring and annoying and turn the dialogue into Swiss cheese, riddled with empty spaces. And there’s no reason we should have to listen to a lot of meaningless cuss words that have had all the intensity sucked out of them. To hear “Fuck you!” once in a two-hour movie is shocking. To hear it thirty or forty times in a one-hour episode is just a bore, lots and lots of empty space between meaningful words. So much emptiness makes me yawn and go elsewhere, for characters who reveal the story through their words instead of slapping me silly with them.
This is article 16 in a continuing series. © 2009 Christine C. Janson
Posting-It Notes
Congratulations! You have a blog! You have an outlet for all the thoughts in your head and experiences in your life, a way to communicate with the entire world one on-line reader at a time.
Perhaps you use it as a journal for personal reminiscences and ponderings. Perhaps you have chosen a very specific topic, no doubt your own obsession, be it baseball or Star Trek or puppets. (Mine, of course, is language.) Perhaps you see it as your version of Oprah’s O magazine, presenting a variety of material all centered around your world view. Or perhaps you intend it as a way to keep in touch with family and friends, an ongoing version of the e-mailed Christmas letter detailing everyone’s busy doings.
The act of posting material to a blog, no matter its intended purpose, is equivalent to publishing it, i.e., making it public. You may choose to restrict the size of that public, or the quality of your writing may restrict it for you. I can envision several basic scenarios for composing and posting a blog entry. Each will produce a very different caliber of material, which will greatly influence whether others might want to read that blog or return to it. These scenarios apply to blogs that are primarily essays; those that provide a service or database or carry far more images and videos than words have other means to attract followers. However, a visitor is more likely to view an image or video if there are words to lure him in…
Slapdash and Sloppy
You’ve just been out on an exhilarating mountain hike and can’t wait to share the experience. Seconds after walking in the door, your hiking boots are off and you’re at the keyboard, logging into your blog. First you upload and arrange half a dozen photos and caption them minimally (e.g., “Me and Joey at the top”). Then, with the cursor settled into the composing window, you start to write. Your thoughts spill out one after the other as they occur, with no attempt to order or arrange them for a sense of progression or continuity, never mind paragraphs. You’re typing too fast to worry about typos. You aren’t concerned about and might not recognize misplaced commas, sentence fragments, or subject-verb disagreements. Clichés abound because originality takes time and thought. You fall easily into texting and chat room habits, abbreviating all sorts of things into technoglyphs (for example, l8r). When your thoughts have been exhausted or dinner beckons, you hit Post, wait impatiently for confirmation, and log out.
Prognosis: Poor. Only friends and relatives will have any reason to slog through this stream-of-consciousness lack of style, replete with misspellings, random punctuation, grammatical hash, and other aggravations that hinder comprehension and enjoyment. The writer himself isn’t interested in reading it, hasn’t bothered to go back to correct errors or organize. Even if the experience was extraordinary, the attempt to capture it in words was haphazard and lazy at best and a failure at worst. The account serves mainly as a record of events, and the purpose does not encompass either poetry or philosophy. This is the writing equivalent of a crayoned drawing, appropriate for family viewing on the refrigerator door, not good enough for a frame or display in the living room.
It is my impression and my fear that this is the method employed by a great many bloggers. If I am correct in this impression, there is a whole lot of unreadable crap floating around in cyberspace. That’s okay, provided nobody expects me to read any of it. (Unless, of course, we share genes, in which case I will find it charming, just as I would find the artwork on the fridge charming.)
Considered and Careful
You’ve had a great idea for your blog on wallpaper. Before logging in, you spend some time thinking not only about what you want to say but what order you should say it in. Perhaps you even jot down a few notes to refer to as you compose. You take your time while writing, think about structure and flow as you go. As a prose stylist, you reach for metaphor and simile, enjoy alliteration and humor. You choose the best images and arrange them to work with the text, caption them pithily. You then read over what you’ve written, correcting typos and punctuation, consulting a dictionary for suspect spellings, perhaps even reaching for a thesaurus to avoid using the same descriptors again and again. You discover and correct a sentence fragment, a discordant subject and verb, a dangling participle, an unclear antecedent. One more quick read satisfies you that your piece is presentable, and you post it. If you’re the skeptical type or just enamored of your own writing, you go immediately to the blog to view it and perhaps read it one more time. Once it’s been posted, you probably won’t change it and may never read it again.
Prognosis: Fair. Because the writer’s subject is near to her heart and she presumably has some expertise, the blog has a good chance of being interesting. Because she has taken the time and trouble to edit the original composition, it also has a chance of being both readable and comprehensible. A blog that is interesting and comprehensible will attract followers, if only among those who share her obsession. This is the writing equivalent of an artwork created for a gallery show, worthy of being framed for viewing by the people who visit the gallery, some of whom will appreciate it more than others. Forethought and afterthought have both been used to narrow the focus and streamline the progression; editing has been applied to root out distracting errors and points of possible confusion. A few errors will no doubt slip through, but not enough to be annoying.
This, in my opinion, is the minimum level of effort required to turn out a readable blog. Juicy content will only balance bad writing up to a point. Dry content will have even less weight. Regardless of content, better writing will always mean more readers. I would be willing to read a piece written to this standard, depending somewhat on the topic and the style, but the chances that I would go back for more are only 50-50, again dependent on the topic and style.
Prepared and Perfected
It’s time to work up a new piece for your blog about politics, a subject about which your opinions have the weight of knowledge and experience. From a handy list of ten to twenty topics, you pick one and begin turning it over in your head, figuring out not only what you have to say but how to organize your presentation into a coherent whole. You want to create a suck-’em-in beginning, an opinionated, informative, entertaining center, and a satisfying conclusion. You love to find a title that’s clever or punny and will resonate with multiple meanings as the reader moves through the piece. The perfect opening sentence can take several days to construct, but once you have it the rest of the piece falls into place behind it in your head. You then compose your first draft, whether on paper or directly on the computer with Word (or whatever). As you write, you stop constantly to go back and make changes and check how the argument is developing. Once the draft is complete, you edit ruthlessly. Any mistakes missed while composing, including uncertainties of fact as well as spelling, are found and corrected now. Dictionary, thesaurus, and other references are in reach or standing by. You transpose words, sentences, even blocks of text to improve the flow, add things you forgot or thought of later and delete things that are irrelevant or interrupt the argument, no matter how interesting they may be. You agonize over word choice, not just for meaning but for meter and music, and delight in wordplay and truly original use of words. Irony, hyperbole, synecdoche, and all those other curiously named literary tricks are part of your writing toolkit. Finally, after going through the thing ten or twenty or thirty times, you have a piece that is perfect grammatically and polished stylistically. When you go to your blog and hit New Post to open the composition window, instead of typing, you simply paste in a copy of the file created with all the tools available in Word. You then go through it to restore lost formatting and format it further with the blog tools. After one final readthrough to ensure there are no errors, you post the piece and immediately go to view it. You can’t help but read through it one more time because you’re always tickled when you see your work “published.” It’s entirely possible you’ll find one or more small errors despite all the earlier editing and proofreading, and you make the effort to edit and repost the piece. Over the next few weeks you may actually go back and tweak the piece a bit as you think of ways to make it even better.
Prognosis: Excellent. This is the level of effort, talent, and sheer fussiness required to turn out a piece of writing that will delight as well as inform. Even if readers don’t share his obsession, his passion and persuasiveness will capture their interest. That interest will not be diverted by errors or infelicities of language but deepened by appreciation of his wit and his heart. This is the writing equivalent of a masterpiece, evidence of qualification for the rank of master, worthy of display, if not in a museum, then at least in a highly frequented public space. This is his best, created with every tool and skill at his command. There are no errors of composition or of fact (well, maybe a little one now and then; nobody’s perfect). The reader can simply enjoy the story or argument as it unfolds, be edified by reliable information and entertained by cleverness.
A blog written to this standard will attract numerous readers once word gets out, because it can be enjoyed by those who don’t share the writer’s genes or passion for the topic but can appreciate his skillful expression of it. This is the standard I strive for in my own writing. I would dearly love to discover blogs of this caliber on any number of subjects. Suggestions are welcome.
Whether you think my blog isn’t as good as all that or you think I’m meeting my goals splendidly, your comments are also welcome.
This is article 15 in a continuing series. © 2009 Christine C. Janson