Introduction
In the following lines I will share some of the Opinion Editorial examples that I found while researching on the Internet and in publications such as The New York Times.
Because they come from a reputable publication like The New York Times, I considered them reliable and therefore decided to use them as models when creating my own Opinion Article.
-----------------------------------------------------------------------------------------------
Example 1
Source: The New York Times - The Opinion Pages
Author: Bill Keller
Article Title: How to Die
ONE morning last month, Anthony Gilbey awakened from anesthesia in a hospital in the east of England. At his bedside were his daughter and an attending physician.
The surgery had been unsuccessful, the doctor informed him. There was nothing more that could be done.
“So I’m dying?” the patient asked.
The doctor hesitated. “Yes,” he said.
“You’re dying, Dad,” his daughter affirmed.
“So,” the patient mused, “no more whoop-de-doo.”
“On the other side, there’ll be loads,” his daughter — my wife — promised.
The patient laughed. “Yes,” he said. He was dead six days later, a few months shy of his 80th birthday.
When they told my father-in-law the hospital had done all it could, that was not, in the strictest sense, true. There was nothing the doctors could do about the large, inoperable tumor colonizing his insides. But they could have maintained his failing kidneys by putting him on dialysis. They could have continued pumping insulin to control his diabetes. He wore a pacemaker that kept his heart beating regardless of what else was happening to him, so with aggressive treatment they could — and many hospitals would — have sustained a kind of life for a while.
But the hospital that treated him offers a protocol called the Liverpool Care Pathway for the Dying Patient, which was conceived in the 90s at a Liverpool cancer facility as a more humane alternative to the frantic end-of-life assault of desperate measures. “The Hippocratic oath just drives clinicians toward constantly treating the patient, right until the moment they die,” said Sir Thomas Hughes-Hallett, who was until recently the chief executive of the center where the protocol was designed. English doctors, he said, tell a joke about this imperative: “Why in Ireland do they put screws in coffins? To keep the doctors out.”
The Liverpool Pathway brings many of the practices of hospice care into a hospital setting, where it can reach many more patients approaching death. “It’s not about hastening death,” Sir Thomas told me. “It’s about recognizing that someone is dying, and giving them choices. Do you want an oxygen mask over your face? Or would you like to kiss your wife?”
Anthony Gilbey’s doctors concluded that it was pointless to prolong a life that was very near the end, and that had been increasingly consumed by pain, immobility, incontinence, depression and creeping dementia. The patient and his family concurred.
And so the hospital unplugged his insulin and antibiotics, disconnected his intravenous nourishment and hydration, leaving only a drip to keep pain and nausea at bay. The earlier bustle of oxygen masks and thermometers and blood-pressure sleeves and pulse-taking ceased. Nurses wheeled him away from the wheezing, beeping machinery of intensive care to a quiet room to await his move to “the other side.”
Here in the United States, nothing bedevils our discussion of health care like the question of when and how to withhold it. The Liverpool Pathway or variations of it are now standard in most British hospitals and in several other countries — but not ours. When I asked one American end-of-life specialist what chance he saw that something of the kind could be replicated here, the answer was immediate: “Zero.” There is an obvious reason for that, and a less obvious reason.
The obvious reason, of course, is that advocates of such programs have been demonized. They have been criticized by the Catholic Church in the name of “life,” and vilified by Sarah Palin and Michele Bachmann in the pursuit of cheap political gain. “Anything that looks like an official protocol, or guideline — you’re going to get death-paneled,” said Dr. Ezekiel Emanuel, the bioethicist and expert on end-of-life care who has been a target of the rabble-rousers. (He is also a contributing opinion writer for The Times.) Humane end-of-life practices have quietly found their way into cancer treatment, but other specialties lag behind.
The British advocates of the Liverpool approach have endured similar attacks, mainly from “pro-life” lobbyists who portray it as a back-door form of euthanasia. (They also get it from euthanasia advocates who say it isn’t euthanasia-like enough.) Surveys of families that use this protocol report overwhelming satisfaction, but inevitably in a field that touches families at their most emotionally raw, and that requires trained coordination of several medical disciplines, nursing and family counseling, the end is not always as smooth as my father-in-law’s.
The less obvious problem, I suspect, is that those who favor such programs in this country often frame it as a cost issue. Their starting point is the arresting fact that a quarter or more of Medicare costs are incurred in the last year of life, which suggests that we are squandering a fortune to buy a few weeks or months of a life spent hooked to machinery and consumed by fear and discomfort. That last year of life offers a tempting target if we want to contain costs and assure that Medicare and Medicaid exist for future generations.
No doubt, we have a crying need to contain health care costs. We pay more than many other developed countries for comparable or inferior health care, and the total bill consumes a growing share of our national wealth. The Affordable Care Act — Obamacare — makes a start by establishing a board to identify savings in Medicare, by emphasizing preventive care, and by financing pilot programs to pay doctors for achieving outcomes rather than performing procedures. But it is barely a start. Common sense suggests that if officials were not afraid of being “death-paneled,” we could save some money by withholding care when, rather than saving a life, it serves only to prolong misery for a little while.
But I’m beginning to think that is both questionable economics and bad politics.
For one thing, whatever your common sense tells you, there is little evidence so far that these guidelines do save money. Emanuel has studied the fairly sketchy research and concluded that, with the possible exception of hospice care for cancer patients, measures to eliminate futile care in dying patients have not proved to be significant cost-savers. That seems to be partly because the programs kick in so late, and partly because good palliative care is not free.
Even if it turns out that programs like the Liverpool Pathway save big money, promoting end-of-life care on fiscal grounds just plays into fears that the medical-industrial complex is rushing our loved ones to the morgue to save on doctors and hospital beds.
When I asked British specialists whether the Liverpool protocol cut costs, they insisted they had never asked the question — and never would.
“I don’t think we would dare,” said Sir Thomas. “There was some very nasty press here in this country this year about the Pathway, saying it was a way of killing people quickly to free up hospital beds. The moment you go into that argument, you might threaten the whole program.”
In America, nothing happens without a cost-benefit analysis. But the case for a less excruciating death can stand on a more neutral, less disturbing foundation, namely that it is simply a kinder way of death.
“There are lots of reasons to believe you could save money,” said Emanuel. “I just think we can’t do it for the reason of saving money.”
During Anthony Gilbey’s six days of dying he floated in and out of awareness on a cloud of morphine. Unfettered by tubes and unpestered by hovering medics, he reminisced and made some amends, exchanged jokes and assurances of love with his family, received Catholic rites and managed to swallow a communion host that was probably his last meal. Then he fell into a coma. He died gently, loved and knowing it, dignified and ready.
“I have fought death for so long,” he told my wife near the end. “It is such a relief to give up.”
We should all die so well.
----------------------------------------------------------------------------------------------
Example 2
Source: The New York Times - The Opinion Pages
Author: Joe Nocera
Article Title: Has Apple Peaked?
If Steve Jobs were still alive, would the new map application on the iPhone 5 be such an unmitigated disaster? Interesting question, isn’t it?
As Apple’s chief executive, Jobs was a perfectionist. He had no tolerance for corner-cutting or mediocre products. The last time Apple released a truly substandard product — MobileMe, in 2008 — Jobs gathered the team into an auditorium, berated them mercilessly and then got rid of the team leader in front of everybody, according to Walter Isaacson’s biography of Jobs. The three devices that made Apple the most valuable company in America — the iPod, the iPhone and the iPad — were all genuine innovations that forced every other technology company to play catch-up.
No doubt, the iPhone 5, which went on sale on Friday, will be another hit. Apple’s halo remains powerful. But there is nothing about it that is especially innovative. Plus, of course, it has that nasty glitch. In rolling out a new operating system for the iPhone 5, Apple replaced Google’s map application — the mapping gold standard — with its own, vastly inferior, application, which has infuriated its customers. With maps now such a critical feature of smartphones, it seems to be an inexplicable mistake.
And maybe that’s all it is — a mistake, soon to be fixed. But it is just as likely to turn out to be the canary in the coal mine. Though Apple will remain a highly profitable company for years to come, I would be surprised if it ever gives us another product as transformative as the iPhone or the iPad.
Part of the reason is obvious: Jobs isn’t there anymore. It is rare that a company is so completely an extension of one man’s brain as Apple was an extension of Jobs. While he was alive, that was a strength; now it’s a weakness. Apple’s current executive team is no doubt trying to maintain the same demanding, innovative culture, but it’s just not the same without the man himself looking over everybody’s shoulder. If the map glitch tells us anything, it is that.
But there is also a less obvious — yet possibly more important — reason that Apple’s best days may soon be behind it. When Jobs returned to the company in 1997, after 12 years in exile, Apple was in deep trouble. It could afford to take big risks and, indeed, to search for a new business model, because it had nothing to lose.
Fifteen years later, Apple has a hugely profitable business model to defend — and a lot to lose. Companies change when that happens. “The business model becomes a gilded cage, and management won’t do anything to challenge it, while doing everything they can to protect it,” says Larry Keeley, an innovation strategist at Doblin, a consulting firm.
It happens in every industry, but it is especially easy to see in technology because things move so quickly. It was less than 15 years ago that Microsoft appeared to be invincible. But once its Windows operating system and Office applications became giant moneymakers, Microsoft’s entire strategy became geared toward protecting its two cash cows. It ruthlessly used its Windows platform to promote its own products at the expense of rivals. (The Microsoft antitrust trial took dead aim at that behavior.) Although Microsoft still makes billions, its new products are mainly “me-too” versions of innovations made by other companies.
Now it is Apple’s turn to be king of the hill — and, not surprisingly, it has begun to behave in a very similar fashion. You can see it in the patent litigation against Samsung, a costly and counterproductive exercise that has nothing to do with innovation and everything to do with protecting its turf.
And you can see it in the decision to replace Google’s map application. Once an ally, Google is now a rival, and the thought of allowing Google to promote its maps on Apple’s platform had become anathema. More to the point, Apple wants to force its customers to use its own products, even when they are not as good as those from rivals. Once companies start acting that way, they become vulnerable to newer, nimbler competitors that are trying to create something new, instead of milking the old. Just ask BlackBerry, which once reigned supreme in the smartphone market but is now roadkill for Apple and Samsung.
Even before Jobs died, Apple was becoming a company whose main goal was to defend its business model. Yes, he would never have allowed his minions to ship such an embarrassing application. But despite his genius, it is unlikely he could have kept Apple from eventually lapsing into the ordinary. It is the nature of capitalism that big companies become defensive, while newer rivals emerge with better, smarter ideas.
“Oh my god,” read one Twitter message I saw. “Apple maps is the worst ever. It is like using MapQuest on a BlackBerry.”
MapQuest and BlackBerry.
Exactly.
-----------------------------------------------------------------------------------------------
Example 3
Source: The New York Times - The Opinion Pages
Author: Maureen Dowd
Article Title: It Goes With Everything, Even Blue Hair
SAVANNAH, Ga.
GROWING up, I did not think of black as an alluring color.
When you misbehaved, nuns in black habits, brandishing rulers, bore down on you. When a relative died, my mom wore rustling black rayon.
When my Irish great-aunts went to work for rich American families, they wore black maids’ uniforms. Our family dog, Scottie, bit anyone wearing black, even my brothers in their prom tuxedos.
Black was the color of despair, decadence, death, nightmares and vampire capes. It was the color, priests warned, that your soul would turn if you sinned.
But part of becoming a woman is realizing the mythic power of the little black dress. It makes you thinner and more chic, no matter how stunted your fashion sense, and gives you dash.
I first saw it in old movies: Rita Hayworth vamping in strapless black satin in “Gilda”; Marilyn Monroe sparkling in a barely-there Orry-Kelly beaded dress in “Some Like It Hot”; Natalie Wood winning Steve McQueen’s heart with a low-cut black dress in “Love With the Proper Stranger.”
And of course, the gorgeous black Givenchy cocktail dress Audrey Hepburn wore munching a pastry in front of Tiffany’s one morning — a look so embedded in the DNA of American culture that Tina Fey feyly evokes it on the cover of the new Entertainment Weekly, complete with upswept hair, long gloves, cigarette holder and Cat curled around her neck.
Others consider that image the shimmering height of the L.B.D., or little black dress. But not André Leon Talley, the imperious impresario of a new exhibition on the colorful subject at his eponymous gallery in the art museum of the Savannah College of Art and Design.
“It’s not the most iconic or important little black dress ever made,” dismissively notes the fashion czar, who himself favors comfortable yurt-like garments and size-15 Uggs. (He owns nine pairs of black and bark Uggs.) He points in the direction of an L.B.D. he finds far more compelling. I’m startled to see a male mannequin gussied up in a see-through black lace dress worn over spanking white boxers, black socks and shoes that would have dazzled Louis XIV, the Carrie Bradshaw of his day.
“This is what Marc Jacobs wore to the Met Costume Gala, a man-dress from Comme des Garçons,” Talley says. “It was a seminal moment in style for a man to go there, perfectly accessorized with diamanté buckled black matte leather court shoes that he designed himself.”
No doubt.
Talley, a contributing editor at Vogue and a correspondent at “Entertainment Tonight,” said he was inspired to mount the show after seeing Anna Wintour in a classic black Chanel dress, now framed in a shadow box on the cherry-red wall.
It was Coco Chanel and Vogue, after all, who popularized “the little nothings,” as they were known then, on Oct. 1, 1926, when the magazine published an illustration of a blouson black crepe de Chine sheath, predicting that every woman would aspire to have it in her closet, just as every man wanted a Model T in the garage.
“Chanel craved the power and independence of men,” says Gioia Diliberto, who has written a novel and a play about the couturière and who contributed an essay to the show’s catalog. “So in her designs, she borrowed the ease, comfort and muted palette of men’s clothes to create a style of pared-down elegance for women that liberated them from furbelows and froufrou confections.”
Talley says he collected a cavalcade of designer dresses from his friends in materials from “neoprene scuba diving fabric to latex to chicken feathers.” (Really ostrich feathers.)
In the rows of black glamour can be found Whoopi Goldberg’s Chado Ralph Rucci caftan — with a snake necklace strangling the mannequin’s neck; Sarah Jessica Parker’s buttery leather pleated Prabal Gurung; L’Wren Scott’s own design, a sexy wool and lace number that she wore to the Golden Globes with her boyfriend, Mick Jagger; a Tom Ford Chantilly lace and beaded concoction based on Goya’s portrait of the Duchess of Alba.
“Lady Gaga wore that with blue hair,” Talley confides.
Renée Zellweger contributed a navy ribbon-candy-style dress. “Sometimes you wear midnight blue as black,” Talley opined.
I start to feel paranoid when my friend André offers a disquisition on how to identify a “good black” hue versus a “bad black” one, and which blacks don’t match. I’d assumed black was black.
“You don’t want a harsh black or a dead black that looks like an old bunker that’s been oxidized through years of neglect in a barren warehouse,” he says. “A good black is an electrifying black. It should be about dreams of beauty.”
Gazing at my green T-shirt, Talley murmurs scornfully, “Sea foam, I suppose,” before turning back to the inanimate but far more enticing mannequins. “When in doubt, you go to your best little black dress, not to your wimpy, seaweedy, outre-mer sea foam or your wretched yellow lemon drop.”
He sums up the staying power of noir style with a line straight out of film noir: “It’s just something that you know is right, even if it’s wrong.”
-----------------------------------------------------------------------------------------------
Example 4
Source: The New York Times - The Opinion Pages
Author: Paul Krugman
Article Title: Don't Cry for Me, America
Mexico. Brazil. Argentina. Mexico, again. Thailand. Indonesia. Argentina, again.
And now, the United States.
The story has played itself out time and time again over the past 30 years. Global investors, disappointed with the returns they’re getting, search for alternatives. They think they’ve found what they’re looking for in some country or other, and money rushes in.
But eventually it becomes clear that the investment opportunity wasn’t all it seemed to be, and the money rushes out again, with nasty consequences for the former financial favorite. That’s the story of multiple financial crises in Latin America and Asia. And it’s also the story of the U.S. combined housing and credit bubble. These days, we’re playing the role usually assigned to third-world economies.
For reasons I’ll explain later, it’s unlikely that America will experience a recession as severe as that in, say, Argentina. But the origins of our problem are pretty much the same. And understanding those origins also helps us understand where U.S. economic policy went wrong.
The global origins of our current mess were actually laid out by none other than Ben Bernanke, in an influential speech he gave early in 2005, before he was named chairman of the Federal Reserve. Mr. Bernanke asked a good question: “Why is the United States, with the world’s largest economy, borrowing heavily on international capital markets — rather than lending, as would seem more natural?”
His answer was that the main explanation lay not here in America, but abroad. In particular, third world economies, which had been investor favorites for much of the 1990s, were shaken by a series of financial crises beginning in 1997. As a result, they abruptly switched from being destinations for capital to sources of capital, as their governments began accumulating huge precautionary hoards of overseas assets.
The result, said Mr. Bernanke, was a “global saving glut”: lots of money, all dressed up with nowhere to go.
In the end, most of that money went to the United States. Why? Because, said Mr. Bernanke, of the “depth and sophistication of the country’s financial markets.”
All of this was right, except for one thing: U.S. financial markets, it turns out, were characterized less by sophistication than by sophistry, which my dictionary defines as “a deliberately invalid argument displaying ingenuity in reasoning in the hope of deceiving someone.” E.g., “Repackaging dubious loans into collateralized debt obligations creates a lot of perfectly safe, AAA assets that will never go bad.”
In other words, the United States was not, in fact, uniquely well-suited to make use of the world’s surplus funds. It was, instead, a place where large sums could be and were invested very badly. Directly or indirectly, capital flowing into America from global investors ended up financing a housing-and-credit bubble that has now burst, with painful consequences.
As I said, these consequences probably won’t be as bad as the devastating recessions that racked third-world victims of the same syndrome. The saving grace of America’s situation is that our foreign debts are in our own currency. This means that we won’t have the kind of financial death spiral Argentina experienced, in which a falling peso caused the country’s debts, which were in dollars, to balloon in value relative to domestic assets.
But even without those currency effects, the next year or two could be quite unpleasant.
What should have been done differently? Some critics say that the Fed helped inflate the housing bubble with low interest rates. But those rates were low for a good reason: although the last recession officially ended in November 2001, it was another two years before the U.S. economy began delivering convincing job growth, and the Fed was rightly concerned about the possibility of Japanese-style prolonged economic stagnation.
The real sin, both of the Fed and of the Bush administration, was the failure to exercise adult supervision over markets running wild.
It wasn’t just Alan Greenspan’s unwillingness to admit that there was anything more than a bit of “froth” in housing markets, or his refusal to do anything about subprime abuses. The fact is that as America’s financial system has grown ever more complex, it has also outgrown the framework of banking regulations that used to protect us — yet instead of an attempt to update that framework, all we got were paeans to the wonders of free markets.
Right now, Mr. Bernanke is in crisis-management mode, trying to deal with the mess his predecessor left behind. I don’t have any problems with his testimony yesterday, although I suspect that it’s already too late to prevent a recession.
But let’s hope that when the dust settles a bit, Mr. Bernanke takes the lead in talking about what needs to be done to fix a financial system gone very, very wrong.
-----------------------------------------------------------------------------------------------
Example 5
Source: The New York Times - The Opinion Pages
Author: Charles M. Blow
Article Title: Don't Mess With Big Bird
Mitt Romney’s Big Bird swipe during Wednesday’s debate raised some hackles: PBS’s, many on social media and mine.
Romney told the debate moderator, Jim Lehrer:
“I’m sorry, Jim. I’m going to stop the subsidy to PBS. I’m going to stop other things. I like PBS. I love Big Bird. I actually like you, too. But I’m not going to — I’m not going to keep on spending money on things to borrow money from China to pay for it.”
Those are fighting words.
Social media, and others, exploded in Big Bird’s defense.
PBS itself issued a tersely worded statement on Thursday, saying:
“Governor Romney does not understand the value the American people place on public broadcasting and the outstanding return on investment the system delivers to our nation. We think it is important to set the record straight and let the facts speak for themselves.”
Exactly! What they said!
Big Bird is the man. He’s 8 feet tall. He can sing and roller skate and ride a unicycle and dance. Can you do that, Mr. Romney? I’m not talking about your fox trot away from the facts. I’m talking about real dancing.
Since 1969, Big Bird has been the king of the block on “Sesame Street.” When I was a child, he and his friends taught me the alphabet and the colors and how to do simple math.
Do you know how to do simple math, Mr. Romney? Maybe you and the Countess Von Backward could exchange numbers.
Big Bird and his friends also showed me what it meant to resolve conflicts with kindness and accept people’s differences and look out for the less fortunate. Do you know anything about looking out for the less fortunate, Mr. Romney? Or do you think they’re all grouches scrounging around in trash cans?
I know that you told Fox News this week that you were “completely wrong” for making that now infamous 47 percent comment, but probably only after you realized that it was a drag on your poll numbers. Your initial response was to defend it as “inelegantly stated” but essentially correct. That’s not good, sir. Character matters. Big Bird wouldn’t have played it that way. Do you really believe that Pennsylvania Avenue is that far away from Sesame Street? It shouldn’t be.
Let me make it simple for you, Mr. Romney. I’m down with Big Bird. You pick on him, you answer to me.
And, for me, it’s bigger than Big Bird. It’s almost impossible to overstate how instrumental PBS has been in my development and instruction.
We were poor. My mother couldn’t afford day care, and I didn’t go to preschool. My great-uncle took care of me all day. I could watch one hour of television: PBS.
When I was preparing for college and took the ACT, there were harder reading passages toward the back of the test. Many had scientific themes — themes we hadn’t covered at my tiny high school in my rural town. But I could follow the passages’ meanings because I had watched innumerable nature shows on PBS.
I never went to art or design school. In college, I was an English major before switching to mass communications. Still, I went on to become the design director of The New York Times and the art director of National Geographic magazine.
That was, in part, because I had a natural gift for it (thanks mom and dad and whatever gods there may be), but it’s also because I spent endless hours watching art programs on PBS. (Bob Ross, with his awesome Afro, snow-capped mountains and “magic white,” will live on forever in my memory.)
I don’t really expect Mitt Romney to understand the value of something like PBS to people, like me, who grew up in poor, rural areas and went to small schools. These are places with no museums or preschools or after-school educational programs. There wasn’t money for travel or to pay tutors.
I honestly don’t know where I would be in the world without PBS.
As PBS pointed out:
“Over the course of a year, 91 percent of all U.S. television households tune in to their local PBS station. In fact, our service is watched by 81 percent of all children between the ages of 2-8. Each day, the American public receives an enduring and daily return on investment that is heard, seen, read and experienced in public media broadcasts, apps, podcasts and online — all for the cost of about $1.35 per person per year.”
PBS is a national treasure, and Big Bird is our golden — um, whatever kind of bird he is.
Hands off!