Sunday, November 28, 2010

The Chosen

The line goes on forever
standing in the drifts
holding what’s left
of their lives
in frozen hands
against shivering bodies.

Merely a spectre and spectator alone
against the boxcars
I watch them shuffle
torn shoes not withstanding
against the snow
worn coats a worthless shield
and caps of no use against the wind

Trains come and go
but the line is eternally longer
coming from the gray storm
leading only to darkness
snapping dogs, boots, and
machine guns keep it moving

Their eyes tell them
what their mind rejects
The trains are arrivals only
there will be no departures

There are young
and old
some were rich
but nothing matters now

Pasts varied
and every walk a different one
but that which is same
is the tie that binds them to destiny
manifest
in the yellow star
upon the torn coats
and thin sleeves
of those with fearful eyes

I watch as they go
Only ashes await.

Monday, August 16, 2010

Rosen is Right--we can't read, but we can click.


In a world where technology surges forward at an incredible rate, unforeseen consequences seem to haunt every passing milestone of achievement. Christine Rosen of The New Atlantis argues that technology has given the image precedence over the written word, and that because of the image (both moving and still) literacy is declining, especially among the younger generations. She presses the point further in her assertion that the waning appetite for the written word and printed materials signals a declining intellectual mean in our society. The evidence presented here will show that the written word and literacy are in fact declining, and the statistics strongly imply that the rise of the image can contribute to intellectual decline. But in order to see the issue more clearly, we must look at two of the main culprits in this evolution, television and the Internet, and examine their effects on literacy.

In the 1930s television was introduced to the United States, bringing the moving image into the home. Since then the television has become the archetype of the visual medium replacing the written one, and the visual medium has grown. In 1950 only 9% of US households had a television, but as of 2009 Nielsen reports that almost 99% of all US households had one (Media Trends Track). Certainly the image was on the rise, but was the written word suffering at all? The answer is yes, and on all fronts. Since 1970 newspapers have lost more than 10 million readers, household spending on books is currently at a 20-year low (Crain), and in 2006 publishers reported a drop in the number of published books to the tune of 18,000 fewer publications (Naisbitt). Meanwhile the hours spent on television have only increased with time (as much as 28 hours a week for school-aged children, according to the Kaiser Family Foundation (Shapley)), while grades have continued to drop, in some cases in direct relation to television viewing, as Krista Conger of Stanford University reports. Conger found that children with a television in their room consistently scored "between seven and nine points lower on standardized mathematics, reading and language arts tests" than children who did not (Conger). Despite televised educational programs aimed at younger viewers, the test scores don't lie: television is a liability, not an asset.

But what of the Internet? Some argue that the Internet is the hallmark of the information age, the implication being that information and education are inextricably linked. Rand Spiro, a professor of educational psychology at Michigan State University, says that "this new kind of learning [from the Internet and digital media] is the kind we most need for this increasingly complex world" (Spiro). Mark Bauerlein, an English professor at Emory University, feels differently. In an article entitled "The New Bibliophobes," Bauerlein wrote that "[K]ids read and write more words than ever before, but reading scores for high school seniors have been flat since the 1970s and down since the early 1990s" (Bauerlein). He also cited a 2008 report from Strong American Schools, which found that "43 percent of two-year college students and 29 percent of four-year college students end up in a remedial class in reading, writing, or math" (Bauerlein). Motoko Rich of the New York Times likewise questioned the new wave of "e-literacy" in a 2008 article entitled "Online, R U Really Reading?" Rich noted that test results measuring reading ability have "declined or stagnated." She also cites Kaiser Family Foundation research showing a 25% increase in Internet usage among children under 18 in just five years, from 1999 to 2004, with children spending longer online with each passing year (Rich). Digital immersion is undeniably up, and test scores are undeniably down. In spite of the claims of Spiro and others that online literacy is equal to traditional literacy, the evidence provided by Rich and the statistics from Bauerlein show that e-literacy and digital saturation do not improve test scores.

Does the new paradigm of e-literacy then suppress mental development, as Rosen suggests? Research cited by Paul Tough and Caleb Crain strongly suggests that it does. Betty Hart and Todd R. Risley, child psychologists at the University of Kansas, conducted a study of early intellectual and mental development in young children (Tough). Surprisingly, they found that, depending on a young child's circumstances, IQ and intelligence can not only be developed and increased but also suppressed and hindered. In short, a child's upbringing and education can and will shape their intelligence as they grow up, and a child's IQ can be impacted negatively. Hart and Risley also found that IQ at young ages was indicated by signs such as vocabulary size and word usage in interaction with others. This correlates with research from the University of Washington that Caleb Crain reported on: their 2007 study "revealed that babies aged between eight and sixteen months know on average six to eight fewer words for every hour of baby DVDs and videos they watch daily" (Crain). Hart and Risley provided conclusive evidence that IQ and intelligence can be influenced at a young age, and the University of Washington linked television viewing to inhibited vocabulary and language skills in infants and toddlers. This strongly suggests that, at best, programming for infants (in spite of its presumably educational format) is no substitute for human interaction and limits their vocabulary, and at worst it may even inhibit intellectual growth and stimulation. While not utterly conclusive, these results do not reflect well on the idea of using images to educate.

However, not everyone agrees that digital literacy is inferior, or that the written word is slowly eroding from its place in the average household. In "The Postliterate Future" author John Naisbitt argues that "It is not that the word is going away. It's not either/or…" (Naisbitt), but all signs (declining book sales, bankrupt newspapers, and the like) point to the contrary. Some teachers, like Hiller Spires and Pru Cuper, feel that "Rather than attempting to reverse this natural trend [of online reading], we decided to capitalize on it by providing technology-enhanced learning opportunities" (Spires and Cuper), and the Reading Online website offers many more articles in a similar vein. Several blogs and other online materials also favor meeting students where they already are: plugged in and online. The common flaw of this position, however, is the consistent lack of verifiable research even suggesting that "e-literate" students have better test scores and a higher level of literacy.

The simple fact remains that as more readers come of age, their reading grades continue to fall while the time they spend in front of the television and plugged into the Internet continues to climb. The new digital age has brought some marvelous inventions, but it has also brought distractions and an environment where constant digital multitasking is normal and reading a book is not. As the digital age surges forward, children are spoon-fed more and more information. They have less need to search for it and assess it for themselves; the ever-present screen is ready and willing to do that for them. They will invariably gain some level of knowledge, but the more pressing question is: will they gain understanding?


Works Cited

Bauerlein, Mark. "The New Bibliophobes." Educational Horizons Winter 2010: 84-91. ERIC. Web. 4 Apr. 2010.

Conger, Krista. "TV in Bedrooms Linked to Lower Test Scores." Stanford News. Stanford University, 13 July 2005. Web. 13 Apr. 2010.

Crain, Caleb. "Twilight of the Books." The New Yorker. 24 Dec. 2007. Web. 12 Apr. 2010.

"Media Trends Track." Television Bureau of Advertising :: TVB Online. The Nielsen Company, Sept. 2009. Web. 13 Apr. 2010.

Naisbitt, John. "The Postliterate Future." The Futurist Mar.-Apr. 2007: 24-30. ProQuest. Web. 28 Mar. 2010.

Rich, Motoko. "Literacy Debate: Online, R U Really Reading?" The New York Times [New York] 27 July 2008. Print.

Shapley, Dan. "Kids Spend Nearly 55 Hours a Week Watching TV, Texting, Playing Video Games..." The Daily Green. 20 Jan. 2010. Web. 12 Apr. 2010.

Spires, Hiller, and Pru Cuper. "Literacy Junction: Cultivating Adolescents’ Engagement in Literature Through Web Options." Reading Online. International Reading Association, Sept. 2002. Web. 13 Apr. 2010.

Spiro, Rand. "Pioneering a New Way of Learning in a Complex and Complicated World." College of Education - Teaching - Michigan State University. Spring 2002. Web. 13 Apr. 2010.

Tough, Paul. "What It Takes to Make a Student." The New York Times [New York] 25 Nov. 2006. Print.


Saturday, July 17, 2010

Financial Behavior: Then and Now




Much has changed in America over the last one hundred years. Corporations have come and gone, Wall Street has had its ups and downs, and every year brings a new standard in fashion. Additionally, the American mentality itself has shifted on almost every level, from diet to finances. The average American's philosophy of money, and of how to handle it, has evolved. But is it a good evolution? Are we climbing the so-called evolutionary ladder, or getting caught up in harmful mutations? Or worse, are we devolving in our understanding of financial principles? To fully understand this, we must track our progress and see what the last one hundred years have brought us. How did our grandparents and great-grandparents handle money?

In 1910 the Sears, Roebuck and Co. catalogue contained the following line: "...buying on credit is folly." (D. Ramsey, "Dumping Debt", 2008) In 1922 Henry Ford wrote in his book My Life and Work, "…another rock on which business breaks is debt." Later he declared, "…most, would bestir themselves very little were it not for the pressure of debt obligations. If so, they are not free men and will not work from free motives. The debt motive is, basically, a slave motive." ("Quotes by and about Henry Ford", http://www.abelard.org/ford/ford4_quotes.php) These quotes give us a glimpse of the pre-1950 mentality regarding money. Debt was regarded as foolish, or even a sin (D. Ramsey), and many of that generation were fondly remembered for "saving for a rainy day" despite what they may have wanted at the time. During the Depression of the 1930s there was in fact a negative savings rate, but it was driven by the need to survive rather than by a desire for wealth and luxury. The Bureau of Economic Analysis, in this year's National Income and Product Accounts tables, reports that the average savings rate after the Second World War was nearly 10%, and that during the war it rose as high as 16%. What this means is that saving was considered a necessary and prudent part of life by our grandparents and great-grandparents. It was simply the responsible thing to do. But the paradigm was already beginning to shift by the end of the 1950s.

In 1950 Frank McNamara created the first widely used charge card, the "Diners Club" card (Wikipedia). After the war America had become prosperous, and after pinching pennies and hanging on by so little for so long, the desire to have their day in the sun was strong. By 1960 Americans had charged over $340 million in today's dollars on their newfound credit cards. By 1970 the bill was $7 billion spent via credit card, with an unbelievable 22% delinquency rate. ("Just One Word: Plastic", http://money.cnn.com/magazines/fortune/fortune_archive/2004/02/23/362195/index.htm) The credit card age had arrived with a bang, and it was almost instantly an industry measured in billions, not millions. The discipline was sliding away. The splurge had started.

Meanwhile, personal debt climbed steadily from 1950 to 1965 and began to taper off by the early 1970s. In the late 70s the debt trend began to rise yet again, but the ratio of debt to income was still below 60%. What this essentially means is that for every $100 of income, less than $60 was owed in personal debt. All of this changed in 1985, when the average American came to owe more than 60% of their income to their debtors. Since then the trend has only worsened, with the gap between income earned and money owed shrinking to nothing. ("Is Household Debt too High?", William R. Emmons, Federal Reserve Bank of St. Louis) Essentially, since the year 2000 Americans have owed as much as they have earned, or more, and in 2007 Americans had a negative savings rate of 10%. (Ramsey)
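To make the ratio concrete, here is a quick sketch in Python. The figures are illustrative household numbers chosen to match the thresholds described above, not data from the cited Emmons paper:

```python
def debt_to_income_pct(total_debt, annual_income):
    """Personal debt expressed as a percentage of annual income."""
    return 100.0 * total_debt / annual_income

# A pre-1985 household in this picture owes less than $60 per $100 earned:
print(debt_to_income_pct(55, 100))   # 55.0, below the 60% threshold

# A post-2000 household owes at least as much as it earns:
print(debt_to_income_pct(105, 100))  # 105.0, debt exceeds income
```

The same function makes the 1985 tipping point easy to state: it is simply the year the average household's result crossed 60.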

Our nation is facing an enormous financial crisis, and these statistics show us the road that has brought us to where we are today. The big question now is “why?” and what does the future hold for our nation?

In the years between the Second World War and the 80s, the idea of personal debt became more and more accepted in the eyes of the public. The creation of the universal credit card and its sudden widespread distribution, as well as the concept of "keeping up with the Joneses," began to permeate society. By the 1980s credit cards were fast becoming the norm, and the idea of saving was quickly becoming nonstandard. Suddenly the buzzword "rainy day" had been replaced by a new one, "leverage," and the idea of borrowing money to get ahead was even being taught in colleges and universities. By the 1990s the new craze was "90 Days Same as Cash," and furniture and electronics stores were capitalizing on the fact that 70% of all same-as-cash accounts were not paid off within 90 days. (MoneyCrashers.com, "Stay Away From The '90 Days, Same As Cash' Trap" by Erik Folgate)

All of these ideas pushed an attitude known as "instant gratification." Whether or not this was intentional is beside the point; the marketing message was clear: you can have it now. The consumer may have started with the mentality of "I can," but over time it shifted to "I must." Financing and credit have brought previously unaffordable technology and possessions within our grasp, and since humans tend to possess little self-control, the sudden ease of purchasing an item without cash proved to be a fatal combination. An example can be seen at a simple fast food outlet, McDonald's. Not long ago McDonald's did not accept credit or debit cards, and its average ticket was under $5 per customer. After it began accepting debit and credit cards, the average order total jumped to over $7 per customer. (Ramsey) Additionally, CreditCards.com reports that as of September 2009 there were more than 1 billion credit cards actively in circulation around the world. In 2008 Visa, MasterCard, Discover, and American Express chalked up nearly $2 trillion spent with their cards, an $800 billion increase over five years. The long and short of these statistics is that Americans simply cannot say "no," and increasingly so as time goes on. It is this kind of mentality that has brought us to the place where more Americans declare bankruptcy every year than graduate from college. (PBS.org, "Affluenza…Diagnosis," http://www.pbs.org/kcts/affluenza/diag/what.html)

What, then, does the future hold? Ultimately the marketplace has only aided the shift from the responsibility of our grandparents to the instant gratification of the current generation. As the market has brought us newer cars, nicer houses, and more desirable goods, it has also brought us a way to attain these things instantly, without any sense of actually earning them. A free market can and does generate the most desirable goods and a great amount of wealth in the process. How that wealth is handled is the fault of neither the goods nor the marketplace; the responsibility lies solely at the feet of the consumer. The market will always rise to meet the demands of the consumer, and the fact that it delivers instant lines of credit and 90-days-same-as-cash only shows that the consumer is willing and ready to use such devices to afford luxuries that are perhaps beyond their means. Ultimately the consumer is the one with the buying power, and it is their choice how they use it. The last 45 years have been a slow road to financial ruin because of willful ignorance on the part of the consumer, but an educated consumer would be a far greater force to be reckoned with. Our task is to clean up the mess we have made and teach the next generation what our grandparents knew all along: discipline and responsibility are the keys to true wealth.