The American economic system is organized around a basically private-enterprise, market-oriented economy in which consumers largely determine what shall be produced by spending their money in the marketplace for those goods and services that they want most. Private businessmen, striving to make profits, produce these goods and services in competition with other businessmen; and the profit motive, operating under competitive pressures, largely determines how these goods and services are produced. Thus, in the American economic system it is the demand of individual consumers, coupled with the desire of businessmen to maximize profits and the desire of individuals to maximize their incomes, that together determine what shall be produced and how resources are used to produce it.
An important factor in a market-oriented economy is the mechanism by which consumer demands can be expressed and responded to by producers. In the American economy, this mechanism is provided by a price system, a process in which prices rise and fall in response to relative demands of consumers and supplies offered by seller-producers. If a product is in short supply relative to the demand, the price will be bid up and some consumers will be eliminated from the market. If, on the other hand, producing more of a commodity results in reducing its cost, this will tend to increase the supply offered by seller-producers, which in turn will lower the price and permit more consumers to buy the product. Thus, price is the regulating mechanism in the American economic system.
The important factor in a private-enterprise economy is that individuals are allowed to own productive resources (private property), and they are permitted to hire labor, gain control over natural resources, and produce goods and services for sale at a profit. In the American economy, the concept of private property embraces not only the ownership of productive resources but also certain rights, including the right to determine the price of a product or to make a free contract with another private individual.
Exceptional children are different in some significant way from others of the same age. For these children to develop to their full adult potential, their education must be adapted to those differences.
Although we focus on the needs of exceptional children, we find ourselves describing their environment as well. While the leading actor on the stage captures our attention, we are aware of the importance of the supporting players and the scenery of the play itself. Both the family and the society in which exceptional children live are often the key to their growth and development. And it is in the public schools that we find the full expression of society's understanding — the knowledge, hopes, and fears that are passed on to the next generation.
Education in any society is a mirror of that society. In that mirror we can see the strengths, the weaknesses, the hopes, the prejudices, and the central values of the culture itself. The great interest in exceptional children shown in public education over the past three decades indicates the strong feeling in our society that all citizens, whatever their special conditions, deserve the opportunity to fully develop their capabilities.
"All men are created equal." We've heard it many times, but it still has important meaning for education in a democratic society. Although the phrase was used by this country's founders to denote equality before the law, it has also been interpreted to mean equality of opportunity. That concept implies educational opportunity for all children — the right of each child to receive help in learning to the limits of his or her capacity, whether that capacity be small or great. Recent court decisions have confirmed the right of all children — disabled or not — to an appropriate education, and have ordered that public schools take the necessary steps to provide that education. In response, schools are modifying their programs, adapting instruction to children who are exceptional, to those who cannot profit substantially from regular programs.
Discoveries in science and technology are thought by "untaught minds" to come in blinding flashes or as the result of dramatic accidents. Sir Alexander Fleming did not, as legend would have it, look at the mold on a piece of cheese and get the idea for penicillin there and then. He experimented with antibacterial substances for nine years before he made his discovery. Inventions and innovations almost always come out of laborious trial and error. Innovation is like soccer; even the best players miss the goal and have their shots blocked much more frequently than they score.
The point is that the players who score most are the ones who take the most shots at the goal — and so it goes with innovation in any field of activity. The prime difference between innovators and others is one of approach. Everybody gets ideas, but innovators work consciously on theirs, and they follow them through until they prove practicable or otherwise. What ordinary people see as fanciful abstractions, professional innovators see as solid possibilities.
"Creative thinking may mean simply the realization that there's no particular virtue in doing things the way they have always been done," wrote Rudolph Flesch, a language authority. This accounts for our reaction to seemingly simple innovations like plastic garbage bags and suitcases on wheels that make life more convenient: "How come nobody thought of that before?"
The creative approach begins with the proposition that nothing is as it appears. Innovators will not accept that there is only one way to do anything. Faced with getting from A to B, the average person will automatically set out on the best-known and apparently simplest route. The innovator will search for alternate courses, which may prove easier in the long run and are bound to be more interesting and challenging even if they lead to dead ends.
Highly creative individuals really do march to a different drummer.
Personality is to a large extent inherent — A-type parents usually bring about A-type offspring. But the environment must also have a profound effect, since if competition is important to the parents, it is likely to become a major factor in the lives of their children.
One place where children soak up A characteristics is school, which is, by its very nature, a highly competitive institution. Too many schools adopt the "win at all costs" moral standard and measure their success by sporting achievements. The current passion for making children compete against their classmates or against the clock produces a two-layer system, in which competitive A-types seem in some way better than their B-type fellows. Being too keen to win can have dangerous consequences: remember that Pheidippides, the first marathon runner, dropped dead seconds after saying: "Rejoice, we conquer!"
By far the worst form of competition in schools is the disproportionate emphasis on examinations. It is a rare school that allows pupils to concentrate on those things they do well. The merits of competition by examination are somewhat questionable, but competition in the certain knowledge of failure is positively harmful.
Obviously, it is neither practical nor desirable that all A youngsters change into B's. The world needs all types, and schools have an important duty to try to fit a child's personality to his possible future employment: it is the A-types, after all, who are best suited to top management.
If the preoccupation of schools with academic work was lessened, more time might be spent teaching children surer values. Perhaps selection for the caring professions, especially medicine, could be made less by good grades in chemistry and more by such considerations as sensitivity and sympathy. It is surely a mistake to choose our doctors exclusively from A- type stock. B's are important and should be encouraged.
That experiences influence subsequent behaviour is evidence of an obvious but nevertheless remarkable activity called remembering. Learning could not occur without the function popularly named memory. Constant practice has such an effect on memory as to lead to skilful performance on the piano, to recitation of a poem, and even to reading and understanding these words. So-called intelligent behaviour demands memory, remembering being a primary requirement for reasoning. The ability to solve any problem or even to recognize that a problem exists depends on memory. Typically, the decision to cross a street is based on remembering many earlier experiences.
Practice (or review) tends to build and maintain memory for a task or for any learned material. Over a period of no practice, what has been learned tends to be forgotten; the adaptive consequences of this may not seem obvious. Yet dramatic instances of sudden forgetting can be seen to be adaptive. In this sense, the ability to forget can be interpreted to have survived through a process of natural selection in animals. Indeed, when one's memory of an emotionally painful experience leads to serious anxiety, forgetting may produce relief. Nevertheless, an evolutionary interpretation might make it difficult to understand how the commonly gradual process of forgetting survived natural selection.
In thinking about the evolution of memory together with all its possible aspects, it is helpful to consider what would happen if memories failed to fade. Forgetting clearly aids orientation in time, since old memories weaken and the new tend to stand out, providing clues for inferring duration. Without forgetting, adaptive ability would suffer; for example, learned behaviour that was correct a decade ago may no longer be. Cases are recorded of people who (by ordinary standards) forgot so little that their everyday activities were full of confusion. Thus forgetting seems to serve the survival of the individual and the species.
Another line of thought assumes a memory storage system of limited capacity that provides adaptive flexibility specifically through forgetting. In this view, continual adjustments are made between learning or memory storage (input) and forgetting (output). Indeed, there is evidence that the rate at which individuals forget is directly related to how much they have learned. Such data offer gross support for contemporary models of memory that assume an input-output balance.
With the start of BBC World Service Television, millions of viewers in Asia and America can now watch the Corporation's news coverage, as well as listen to it.
And of course in Britain listeners and viewers can tune in to two BBC television channels, five BBC national radio services and dozens of local radio stations. They are brought sport, comedy, drama, music, news and current affairs, education, religion, parliamentary coverage, children's programmes and films for an annual licence fee of £83 per household.
It is a remarkable record, stretching back over 70 years — yet the BBC's future is now in doubt. The Corporation will survive as a publicly-funded broadcasting organisation, at least for the time being, but its role, its size and its programmes are now the subject of a nation-wide debate in Britain.
The debate was launched by the Government, which invited anyone with an opinion of the BBC — including ordinary listeners and viewers — to say what was good or bad about the Corporation, and even whether they thought it was worth keeping. The reason for the inquiry is that the BBC's royal charter runs out in 1996, and the Government must decide whether to keep the organisation as it is, or to make changes.
Defenders of the Corporation — of whom there are many — are fond of quoting the American slogan: "If it ain't broke, don't fix it." The BBC "ain't broke", they say, by which they mean it is not broken (as distinct from the word "broke", meaning having no money), so why bother to change it?
Yet the BBC will have to change, because the broadcasting world around it is changing. The commercial TV channels — ITV and Channel 4 — were required by the Thatcher Government's Broadcasting Act to become more commercial, competing with each other for advertisers, and cutting costs and jobs. But it is the arrival of new satellite channels — funded partly by advertising and partly by viewers' subscriptions — which will bring about the biggest changes in the long term.
Rumor has it that more than 20 books on creationism/evolution are in publishers' pipelines. A few have already appeared. The goal of all will be to try to explain to a confused and often unenlightened citizenry that there are not two equally valid scientific theories for the origin and evolution of the universe and life. Cosmology, geology, and biology have provided a consistent, unified, and constantly improving account of what happened. "Scientific" creationism, which is being pushed by some for "equal time" in the classrooms whenever the scientific accounts of evolution are given, is based on religion, not science. Virtually all scientists and the majority of non-fundamentalist religious leaders have come to regard "scientific" creationism as bad science and bad religion.
The first four chapters of Kitcher's book give a very brief introduction to evolution. At appropriate places, he introduces the criticisms of the creationists and provides answers. In the last three chapters, he takes off his gloves and gives the creationists a good beating. He describes their programmes and tactics, and, for those unfamiliar with the ways of creationists, the extent of their deception and distortion may come as an unpleasant surprise. Since their basic motivation is religious, one might have expected more Christian behavior.
Kitcher is a philosopher, and this may account, in part, for the clarity and effectiveness of his arguments. The non-specialist will be able to obtain at least a notion of the sorts of data and argument that support evolutionary theory. The final chapter on the creationists will be extremely clear to all. On the dust jacket of this fine book, Stephen Jay Gould says: "This book stands for reason itself." And so it does — and all would be well were reason the only judge in the creationism/evolution debate.
It was 3:45 in the morning when the vote was finally taken. After six months of arguing and a final 16 hours of hot parliamentary debates, Australia's Northern Territory became the first legal authority in the world to allow doctors to take the lives of incurably ill patients who wish to die. The measure passed by the convincing vote of 15 to 10. Almost immediately word flashed on the Internet and was picked up, half a world away, by John Hofsess, executive director of the Right to Die Society of Canada. He sent it on via the group's on-line service, Death NET. Says Hofsess: "We posted bulletins all day long, because of course this isn't just something that happened in Australia. It's world history."
The full import may take a while to sink in. The NT Rights of the Terminally Ill law has left physicians and citizens alike trying to deal with its moral and practical implications. Some have breathed sighs of relief; others, including churches, right-to-life groups and the Australian Medical Association, bitterly attacked the bill and the haste of its passage. But the tide is unlikely to turn back. In Australia — where an aging population, life-extending technology and changing community attitudes have all played their part — other states are going to consider making a similar law to deal with euthanasia. In the US and Canada, where the right-to-die movement is gathering strength, observers are waiting for the dominoes to start falling.
Under the new Northern Territory law, an adult patient can request death — probably by a deadly injection or pill — to put an end to suffering. The patient must be diagnosed as terminally ill by two doctors. After a "cooling off" period of seven days, the patient can sign a certificate of request. After 48 hours the wish for death can be met. For Lloyd Nickson, a 54-year-old Darwin resident suffering from lung cancer, the NT Rights of the Terminally Ill law means he can get on with living without the haunting fear of his suffering: a terrifying death from his breathing condition. "I'm not afraid of dying from a spiritual point of view, but what I was afraid of was how I'd go, because I've watched people die in the hospital fighting for oxygen and clawing at their masks," he says.
A report consistently brought back by visitors to the US is how friendly, courteous, and helpful most Americans were to them. To be fair, this observation is also frequently made of Canada and Canadians, and should best be considered North American. There are, of course, exceptions. Small-minded officials, rude waiters, and ill-mannered taxi drivers are hardly unknown in the US. Yet it is an observation made so frequently that it deserves comment.
For a long period of time and in many parts of the country, a traveler was a welcome break in an otherwise dull existence. Dullness and loneliness were common problems of the families who generally lived distant from one another. Strangers and travelers were welcome sources of diversion, and brought news of the outside world.
The harsh realities of the frontier also shaped this tradition of hospitality. Someone traveling alone, if hungry, injured, or ill, often had nowhere to turn except to the nearest cabin or settlement. It was not a matter of choice for the traveler or merely a charitable impulse on the part of the settlers. It reflected the harshness of daily life: if you didn't take in the stranger and take care of him, there was no one else who would. And someday, remember, you might be in the same situation.
Today there are many charitable organizations which specialize in helping the weary traveler. Yet, the old tradition of hospitality to strangers is still very strong in the US, especially in the smaller cities and towns away from the busy tourist trails. "I was just traveling through, got talking with this American, and pretty soon he invited me home for dinner — amazing." Such observations reported by visitors to the US are not uncommon, but are not always understood properly. The casual friendliness of many Americans should be interpreted neither as superficial nor as artificial, but as the result of a historically developed cultural tradition.
As is true of any developed society, in America a complex set of cultural signals, assumptions, and conventions underlies all social interrelationships. And, of course, speaking a language does not necessarily mean that someone understands social and cultural patterns. Visitors who fail to "translate" cultural meanings properly often draw wrong conclusions. For example, when an American uses the word "friend", the cultural implications of the word may be quite different from those it has in the visitor's language and culture. It takes more than a brief encounter on a bus to distinguish between courteous convention and individual interest. Yet, being friendly is a virtue that many Americans value highly and expect from both neighbors and strangers.
Scattered around the globe are more than 100 small regions of isolated volcanic activity known to geologists as hot spots. Unlike most of the world’s volcanoes, they are not always found at the boundaries of the great drifting plates that make up the earth’s surface; on the contrary, many of them lie deep in the interior of a plate. Most of the hot spots move only slowly, and in some cases the movement of the plates past them has left trails of dead volcanoes. The hot spots and their volcanic trails are milestones that mark the passage of the plates.
That the plates are moving is now beyond dispute. Africa and South America, for example, are moving away from each other as new material is injected into the seafloor between them. The complementary coastlines and certain geological features that seem to span the ocean are reminders of where the two continents were once joined. The relative motion of the plates carrying these continents has been constructed in detail, but the motion of one plate with respect to another, cannot readily be translated into motion with respect to the earth’s interior. It is not possible to determine whether both continents are moving in opposite directions or whether one continent is stationary and the other is drifting away from it. Hot spots, anchored in the deeper layers of the earth, provide the measuring instruments needed to resolve the question. From an analysis of the hot-spot population it appears that the African plate is stationary and that it has not moved during the past 30 million years.
The significance of hot spots is not confined to their role as a frame of reference. It now appears that they also have an important influence on the geophysical processes that propel the plates across the globe. When a continental plate comes to rest over a hot spot, the material rising from deeper layers creates a broad dome. As the dome grows, it develops deep fissures (cracks); in at least a few cases the continent may break entirely along some of these fissures, so that the hot spot initiates the formation of a new ocean. Thus just as earlier theories have explained the mobility of the continents, so hot spots may explain their mutability (inconstancy).
It’s a rough world out there. Step outside and you could break a leg slipping on your doormat. Light up the stove and you could burn down the house. Luckily, if the doormat or stove failed to warn of coming disaster, a successful lawsuit might compensate you for your troubles. Or so the thinking has gone since the early 1980s, when juries began holding more companies liable for their customers’ misfortunes.
Feeling threatened, companies responded by writing ever-longer warning labels, trying to anticipate every possible accident. Today, stepladders carry labels several inches long that warn, among other things, that you might —surprise! — fall off. The label on a child’s Batman cape cautions that the toy “does not enable user to fly.”
While warnings are often appropriate and necessary — the dangers of drug interactions, for example — and many are required by state or federal regulations, it isn’t clear that they actually protect the manufacturers and sellers from liability if a customer is injured. About 50 percent of the companies lose when injured customers take them to court.

Now the tide appears to be turning. As personal injury claims continue as before, some courts are beginning to side with defendants, especially in cases where a warning label probably wouldn’t have changed anything. In May, Julie Nimmons, president of Schutt Sports in Illinois, successfully fought a lawsuit involving a football player who was paralyzed in a game while wearing a Schutt helmet. “We’re really sorry he has become paralyzed, but helmets aren’t designed to prevent those kinds of injuries,” says Nimmons. The jury agreed that the nature of the game, not the helmet, was the reason for the athlete’s injury.

At the same time, the American Law Institute — a group of judges, lawyers, and academics whose recommendations carry substantial weight — issued new guidelines for tort law stating that companies need not warn customers of obvious dangers or bombard them with a lengthy list of possible ones. “Important information can get buried in a sea of trivialities,” says a law professor at Cornell Law School who helped draft the new guidelines. If the moderate end of the legal community has its way, the information on products might actually be provided for the benefit of customers and not as protection against legal liability.
A history of long and effortless success can be a dreadful handicap, but, if properly handled, it may become a driving force. When the United States entered just such a glowing period after the end of the Second World War, it had a market eight times larger than any competitor, giving its industries unparalleled economies of scale. Its scientists were the world’s best, its workers the most skilled. America and Americans were prosperous beyond the dreams of the Europeans and Asians whose economies the war had destroyed.
It was inevitable that this primacy should have narrowed as other countries grew richer. Just as inevitably, the retreat from predominance proved painful. By the mid-1980s Americans had found themselves at a loss over their fading industrial competitiveness. Some huge American industries, such as consumer electronics, had shrunk or vanished in the face of foreign competition. By 1987 there was only one American television maker left, Zenith. (Now there is none: Zenith was bought by South Korea’s LG Electronics in July.) Foreign-made cars and textiles were sweeping into the domestic market. America’s machine-tool industry was on the ropes. For a while it looked as though the making of semiconductors, which America had invented and which sat at the heart of the new computer age, was going to be the next casualty.
All of this caused a crisis of confidence. Americans stopped taking prosperity for granted. They began to believe that their way of doing business was failing, and that their incomes would therefore shortly begin to fall as well. The mid-1980s brought one inquiry after another into the causes of America’s industrial decline. Their sometimes sensational findings were filled with warnings about the growing competition from overseas.
How things have changed! In 1995 the United States can look back on five years of solid growth while Japan has been struggling. Few Americans attribute this solely to such obvious causes as a devalued dollar or the turning of the business cycle. Self-doubt has yielded to blind pride. “American industry has changed its structure, has gone on a diet, has learnt to be more quick-witted,” according to Richard Cavanagh, executive dean of Harvard’s Kennedy School of Government. “It makes me proud to be an American just to see how our businesses are improving their productivity,” says Stephen Moore of the Cato Institute, a think-tank in Washington, DC. And William Sahlman of the Harvard Business School believes that people will look back on this period as “a golden age of business management in the United States.”
Being a man has always been dangerous. There are about 105 males born for every 100 females, but this ratio drops to near balance at the age of maturity, and among 70-year-olds there are twice as many women as men. But the great universal of male mortality is being changed. Now, boy babies survive almost as well as girls do. This means that, for the first time, there will be an excess of boys in those crucial years when they are searching for a mate. More important, another chance for natural selection has been removed. Fifty years ago, the chance of a baby (particularly a boy baby) surviving depended on its weight. A kilogram too light or too heavy meant almost certain death. Today it makes almost no difference. Since much of the variation is due to genes, one more agent of evolution has gone.
There is another way to commit evolutionary suicide: stay alive, but have fewer children. Few people are as fertile as in the past. Except in some religious communities, very few women have 15 children. Nowadays the number of births, like the age of death, has become average. Most of us have roughly the same number of offspring. Again, differences between people and the opportunity for natural selection to take advantage of them have diminished. India shows what is happening. The country offers wealth for a few in the great cities and poverty for the remaining tribal peoples. The grand mediocrity of today — everyone being the same in survival and number of offspring — means that natural selection has lost 80% of its power in upper-middle-class India compared to the tribes.
For us, this means that evolution is over; the biological Utopia has arrived. Strangely, it has involved little physical change. No other species fills so many places in nature. But in the past 100,000 years — even the past 100 years — our lives have been transformed but our bodies have not. We did not evolve, because machines and society did it for us. Darwin had a phrase to describe those ignorant of evolution: they “look at an organic being as a savage looks at a ship, as at something wholly beyond his comprehension.” No doubt we will remember a 20th century way of life beyond comprehension for its ugliness. But however amazed our descendants may be at how far from Utopia we were, they will look just like us.
When a new movement in art attains a certain fashion, it is advisable to find out what its advocates are aiming at, for, however farfetched and unreasonable their principles may seem today, it is possible that in years to come they may be regarded as normal. With regard to Futurist poetry, however, the case is rather difficult, for whatever Futurist poetry may be — even admitting that the theory on which it is based may be right — it can hardly be classed as Literature.
This, in brief, is what the Futurist says: for the past century, conditions of life have been continually speeding up, till now we live in a world of noise and violence and speed. Consequently, our feelings, thoughts and emotions have undergone a corresponding change. This speeding up of life, says the Futurist, requires a new form of expression. We must speed up our literature too, if we want to interpret modern stress. We must pour out a large stream of essential words, unhampered by stops, or qualifying adjectives, or finite verbs. Instead of describing sounds we must make up words that imitate them; we must use many sizes of type and different colored inks on the same page, and shorten or lengthen words at will.
Certainly their descriptions of battles are confused. But it is a little upsetting to read in the explanatory notes that a certain line describes a fight between a Turkish and a Bulgarian officer on a bridge off which they both fall into the river — and then to find that the line consists of the noise of their falling and the weights of the officers: ‘Pluff! Pluff! A hundred and eighty-five kilograms.’
This, though it fulfills the laws and requirements of Futurist poetry, can hardly be classed as Literature. All the same, no thinking man can refuse to accept their first proposition: that a great change in our emotional life calls for a change of expression. The whole question is really this: have we essentially changed?
If ambition is to be well regarded, the rewards of ambition — wealth, distinction, control over one’s destiny — must be deemed worthy of the sacrifices made on ambition’s behalf. If the tradition of ambition is to have vitality, it must be widely shared; and it especially must be highly regarded by people who are themselves admired, the educated not least among them. In an odd way, however, it is the educated who have claimed to have given up on ambition as an ideal. What is odd is that they have perhaps most benefited from ambition — if not always their own then that of their parents and grandparents. There is a heavy note of hypocrisy in this, a case of closing the barn door after the horses have escaped — with the educated themselves riding on them.
Certainly people do not seem less interested in success and its signs now than formerly. Summer homes, European travel, BMWs — the locations, place names and name brands may change, but such items do not seem less in demand today than a decade or two ago. What has happened is that people cannot confess fully to their dreams, as easily and openly as once they could, lest they be thought pushing, acquisitive and vulgar. Instead, we are treated to fine hypocritical spectacles, which now more than ever seem in ample supply: the critic of American materialism with a Southampton summer home; the publisher of radical books who takes his meals in three-star restaurants; the journalist advocating participatory democracy in all phases of life, whose own children are enrolled in private schools. For such people and many more perhaps not so exceptional, the proper formulation is, “Succeed at all costs but avoid appearing ambitious.”
The attacks on ambition are many and come from various angles; its public defenders are few and unimpressive, where they are not extremely unattractive. As a result, the support for ambition as a healthy impulse, a quality to be admired and fixed in the mind of the young, is probably lower than it has ever been in the United States. This does not mean that ambition is at an end, that people no longer feel its stirrings and promptings, but only that, no longer openly honored, it is less openly professed. Consequences follow from this, of course, some of which are that ambition is driven underground, or made sly. Such, then, is the way things stand: on the left angry critics, on the right stupid supporters, and in the middle, as usual, the majority of earnest people trying to get on in life.
Why do so many Americans distrust what they read in their newspapers? The American Society of Newspaper Editors is trying to answer this painful question. The organization is deep into a long self-analysis known as the journalism credibility project.
Sad to say, this project has turned out to be mostly low-level findings about factual errors and spelling and grammar mistakes, combined with lots of head-scratching puzzlement about what in the world those readers really want.
But the sources of distrust go way deeper. Most journalists learn to see the world through a set of standard templates (patterns) into which they plug each day’s events. In other words, there is a conventional story line in the newsroom culture that provides a backbone and a ready-made narrative structure for otherwise confusing news.
There exists a social and cultural disconnect between journalists and their readers, which helps explain why the “standard templates” of the newsroom seem alien to many readers. In a recent survey, questionnaires were sent to reporters in five middle-size cities around the country, plus one large metropolitan area. Then residents in these communities were phoned at random and asked the same questions.
Replies show that compared with other Americans, journalists are more likely to live in upscale neighborhoods, have maids, own Mercedeses, and trade stocks, and they’re less likely to go to church, do volunteer work, or put down roots in a community.
Reporters tend to be part of a broadly defined social and cultural elite, so their work tends to reflect the conventional values of this elite. The astonishing distrust of the news media isn’t rooted in inaccuracy or poor reportorial skills but in the daily clash of world views between reporters and their readers.
This is an explosive situation for any industry, particularly a declining one. Here is a troubled business that keeps hiring employees whose attitudes vastly annoy the customers. Then it sponsors lots of symposiums and a credibility project dedicated to wondering why customers are annoyed and fleeing in large numbers. But it never seems to get around to noticing the cultural and class biases that so many former buyers are complaining about. If it did, it would open up its diversity program, now focused narrowly on race and gender, and look for reporters who differ broadly by outlook, values, education, and class.
When I decided to quit my full-time employment, it never occurred to me that I might become a part of a new international trend. A lateral move that hurt my pride and blocked my professional progress prompted me to abandon my relatively high-profile career, although, in the manner of a disgraced government minister, I covered my exit by claiming “I wanted to spend more time with my family”.
Curiously, some two-and-a-half years and two novels later, my experiment in what the Americans term “downshifting” has turned my tired excuse into an absolute reality. I have been transformed from a passionate advocate of the philosophy of “having it all”, preached by Linda Kelsey for the past seven years in the pages of She magazine, into a woman who is happy to settle for a bit of everything.
I have discovered, as perhaps Kelsey will after her much-publicized resignation from the editorship of She after a build-up of stress, that abandoning the doctrine of “juggling your life”, and making the alternative move into “downshifting” brings with it far greater rewards than financial success and social status. Nothing could persuade me to return to the kind of life Kelsey used to advocate and I once enjoyed: 12-hour working days, pressured deadlines, the fearful strain of office politics and the limitations of being a parent on “quality time”.
In America, the move away from juggling to a simpler, less materialistic lifestyle is a well-established trend. Downshifting — also known in America as “voluntary simplicity” — has, ironically, even bred a new area of what might be termed anti-consumerism. There are a number of bestselling downshifting self-help books for people who want to simplify their lives; there are newsletters, such as The Tightwad Gazette, that give hundreds of thousands of Americans useful tips on anything from recycling their cling-film to making their own soap; there are even support groups for those who want to achieve the mid-’90s equivalent of dropping out.
While in America the trend started as a reaction to the economic decline — after the mass redundancies caused by downsizing in the late ’80s — and is still linked to the politics of thrift, in Britain, at least among the middle-class downshifters of my acquaintance, we have different reasons for seeking to simplify our lives.
For the women of my generation who were urged to keep juggling through the ’80s, downshifting in the mid-’90s is not so much a search for the mythical good life — growing your own organic vegetables, and risking turning into one — as a personal recognition of your limitations.
No translations are included, but these are all past exam questions, so they should be easy to find.