I was a small child when the Russians launched Sputnik in 1957--still at the age when recess was the highlight of the school day for me and my friends. Current world events were rarely deemed worthy of our attention. In this case, though, we all knew something extraordinary had happened: there was something in the night sky that people--strange people in a faraway place--had put there. Of all those tiny points of light, one was blinking and doing loops around the planet.
As a child of the space age, my life was probably affected by Sputnik in more ways than I can imagine. When history turns a corner, we never know what might have happened if it had taken another road. But two things became apparent fairly quickly:
1) Adults were frightened. Our tribal enemies had inexplicably become more powerful.
2) We were somehow inadequate. There was a sense, which penetrated all the way down to the third grade, that we either weren't learning the right stuff or we weren't learning it well enough. We lost art class and started doing more arithmetic.
On another extraordinary day, in 1969, I was at my job in a small trailer beside the largest building in the world: The Boeing Company's manufacturing hangar in Everett, Washington. No one was working. Portable black-and-white TV sets (color sets were still rare) had been hauled in and set up on drafting tables, wires run to the nearest outlets, stools arranged so that everyone could see a screen. To this day, I remember the awe and pride we all felt when Neil Armstrong's boot touched lunar soil. If they had never been repeated, I would still remember some of the words that were spoken that day: "The Eagle has landed," "One small step . . . ."
I was still a kid. I looked around the room at some of the senior members of my group--engineers who had designed and built the 747, that monstrosity of metal which somehow, miraculously, stayed aloft. I knew that it was people like them--with similar skills and education--who had made this thing happen. I knew then that I--and my generation--had been had.
The American engineers, designers, and programmers who put humans on the moon were, for the most part, at least fifteen or twenty years older than I was. They had not had the benefit of the emergency curriculum reform that forced my generation to do all those extra math problems.
Looking back, though, I count myself lucky. We lost art class. Many of today's kids are also losing recess.
Thursday, October 4, 2007
Tuesday, August 28, 2007
Sunk Costs
Wealthy Bush backers have launched a new ad campaign to persuade voters in certain states to back the war. Wounded soldiers and families of the dead have been recruited to tell viewers how they feel: they don’t want their sacrifices to have been in vain.
Organizers of this campaign have had the leisure to seek out individuals whose reaction to trauma and unthinkable loss suits their purposes. But how many other amputees or survivors feel differently? How many would say “Don’t let this happen to somebody else”?
The question is purely rhetorical, of course, because regardless of how people feel, feelings should never be the basis of decision making—especially when it comes to decisions of monumental importance. Rational decision makers take people’s feelings into consideration—but the key word is rational.
Two years ago, Barry Schwartz pointed out the propensity of the president to fall for the sunk cost fallacy—the notion that because we’ve already spent something to gain something, we’re justified in spending more. And more.
Evidently the president didn’t read that article.
People who think for a living—economists, philosophers, psychologists, and decision experts—have known this for a long time: past expenditures are irrelevant to making rational decisions about the future. It’s an illusion to think that we can compensate in any way for what’s already been lost by “staying the course.” What’s spent or lost cannot be regained, but future costs can be avoided.
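For readers who like to see the arithmetic spelled out, the decision experts' point can be sketched in a few lines of code. The numbers here are invented purely for illustration; the only thing the sketch demonstrates is that the amount already spent appears identically on both sides of any comparison, so it can never change which option is better:

```python
# A toy illustration of why sunk costs are irrelevant to a rational
# decision: only future costs and benefits differ between options,
# so whatever has already been spent cancels out of the comparison.

def net_future_value(future_benefit, future_cost):
    # A rational decision compares options on this quantity alone.
    return future_benefit - future_cost

sunk = 500  # already spent; identical no matter what we choose next

stay  = net_future_value(future_benefit=100, future_cost=300)
leave = net_future_value(future_benefit=0,   future_cost=50)

# Subtracting the sunk cost from both options changes neither comparison:
assert (stay > leave) == ((stay - sunk) > (leave - sunk))
print("better option:", "stay" if stay > leave else "leave")
```

However large `sunk` is made, the comparison comes out the same—which is exactly what the economists mean when they say past expenditures are irrelevant to decisions about the future.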
For people of my generation, who remember Viet Nam, the parallels between the two wars are impossible to ignore, and this is one of them: continual reminders that many have given their lives, or otherwise sacrificed, for the cause. Implied is the notion that if we quit, we somehow let them down.
Of course we should honor them—those who courageously accept the fateful place their government has assigned to them. But in deciding where to go from here—wherever “here” may be—we need to look forward, not backwards, along the path. Our responsibility is to those not yet maimed, killed, or traumatized. It is they whose futures depend on decisions made now.
If there are rational reasons for Americans to continue fighting in Iraq, I’m open to hearing about them. But efforts to influence me through fallacy and false sentiment only make me angry.
Wednesday, August 15, 2007
A Mind Worth a Moment
If we pause to think about the atrocities people routinely inflict on one another, such as the Nazi Holocaust, most of us ask the same two questions:
- Would ordinary people do such horrific things?
- Could the same thing happen here and now?
Stanley Milgram’s work in the early 1960s answered both questions: yes.
In a series of experiments that began at Yale, Milgram employed ordinary adults in what they believed to be a study about learning. On the instructions of a white-coated authority figure, the volunteers were told to deliver increasingly high-voltage shocks to a person in another room, an actor they believed to be another volunteer. The shocks weren’t real, but the subjects believed they were. As the shocks increased in intensity, the subjects heard sounds—such as moaning, pleading, pounding on the wall—that led them to believe they were torturing another human being.
Shockingly (pardon the pun), virtually every one of the subjects of these experiments participated far beyond the point where they thought they were inflicting pain. Fully 65% thought they were doing serious harm to (or even, in some cases, killing) their experimental partner. Yet, urged on by authority, they continued.
Everyone needs to know this: We are social animals, programmed to obey. Given the right circumstances, any one of us can become an agent of misery and mayhem. Milgram’s work is doubly important because it could not be duplicated today. Modern experimental ethics wouldn’t allow it because of the psychological stress experienced by the subjects.
Milgram had a knack for asking new questions about the human psyche and exploring them in unusual ways. Though he’s best known for the experiments on obedience, he made other notable contributions, in his all-too-short life, that furthered our understanding of ourselves and society. He developed original and effective ways to study prejudice, the effects of living in cities, mental maps, and the “small-world phenomenon” (also known as the notion of “six degrees of separation”).
Stanley Milgram was born this day in 1933.
Saturday, July 28, 2007
Doing the Math
What’s math?
When most people use the word “math,” they mean arithmetic: adding, subtracting, multiplying, and dividing. These are skills every adult needs to function—to pay bills, buy groceries, make sure they end up with the right number of kids at the end of the day.
What do high school and college teachers mean by math? They mean algebra, geometry, trigonometry, and calculus. These are skills often needed by researchers, engineers, and computer programmers but not by people in many other types of work.
To enter most four-year colleges in the U.S., most students need to be familiar with some basic principles of higher mathematics. Those who enter careers in business or the humanities, however, are unlikely ever to use them.
For decades, public education has provided for the needs of all students, whether they were gifted or challenged in math, whether their brains were ready for algebra and calculus or whether they weren’t.
Those who were developmentally ready for higher mathematics completed at least Algebra 2 in high school, often going directly to college. Others took more time to learn, perhaps (if their life plan required more schooling) starting at a two-year college before transferring to a university. Many got jobs and later returned to school for further training. (Brain development for many people isn’t complete until they’re well into their twenties, by which time certain tasks have become easier.)
The bulldozer of “educational reform” and No Child Left Behind (NCLB) has changed all that. The politicians who run education in my state, among others, have decreed that every child “should” graduate high school ready for college-level math. A federally mandated “high-stakes test” requires students to do lots of problems that go way beyond arithmetic—and to do them, furthermore, in the second semester of their sophomore year. If they fail (and about half the 10th-grade students in the state do), they are labeled as failures and given penance in the form of more tests, summer school, and mandatory classes that replace their elective courses.
Here are some of the results:
- More drop-outs. High school students who’ve failed the state test in earlier grades drop out, reluctant to fail again when it counts the most. (Students who disappear from school these days are rarely reported as “drop-outs,” though. Many schools list them as “transferred,” because NCLB penalizes schools whose kids drop out.)
- Elective classes cancelled. Courses like art, drama, music, drafting, shop, home economics, PE, and other electives disappear, as more and more students are required to take remedial classes to prepare them for THE test. (In other words, lots of kids are deprived of the classes they like most—classes where they can show off their talents and develop skills that might lead toward a satisfying career.)
- Fewer high-level courses. Talented students lose opportunities to take high-level and special courses—courses like physics, anatomy, statistics, and second- or third-year chemistry—because teachers and money are increasingly needed for remedial classes. (Schools are judged on how many students pass THE test, not on how many reach their real potential in high school.)
Americans have bought into the myth that public education is in shambles and our kids are failing. That perception isn’t going to change overnight. But there are a few things American voters ought to know, regardless of what the headlines say:
- The “math” kids supposedly don’t know is not arithmetic. With very few exceptions, kids can do arithmetic.
- Teachers aren’t stupid, subversive, or opposed to public discussions about education. They’re just discouraged and tired of being blamed for nonexistent problems.
- Not every child can or should learn at the same pace in every subject area. A kid who takes longer than others to learn algebra may be light-years ahead in other subjects. Kids need to be allowed to learn at their own pace in many different areas.
- Comparisons with students in other countries are virtually always meaningless for many reasons, including the fact that all American students are often compared only to the college-ready kids in other cultures.
Before NCLB, kids weren’t being left behind, unless it was because of the hellacious inequality of educational funding in America.
They weren’t being left behind, but now they are.
Wednesday, July 18, 2007
Let Me Count the Fallacies
The facts: A few high school kids in California, protesting certain issues regarding immigration, ran a Mexican flag up a pole with an American flag under it, upside down.
One result: A frantic glut of emails hastily sent out by countless Internet users to everyone on their mailing lists. (Those who felt this item was worthy of my time and attention happened to be, somewhat to my surprise, well-educated adults.)
The much-forwarded email shows pictures of the kids and the flags. The text accompanying these photos is so full of fallacious reasoning that it would be humorous, if it weren’t so dangerous. Here’s a sample (fallacies added, in italics):
- ”I predict this stunt will be the nail in the coffin of [sic] any guest-worker/amnesty plan on the table in Washington.” (ad hominem and hasty generalization)
- ”Pass this along to every American citizen in your address book and to every representative in the state and federal government.” (bandwagon)
- ”If you choose to remain uninvolved [by not forwarding the email], do not be amazed when you no longer have a nation to call your own (slippery slope) nor anything you have worked for left since it will be ‘redistributed’ to the activists while you are so peacefully staying out of the ‘fray.’” (appeal to emotion, straw man, misrepresentation, exaggeration)
- ”Check history, it is full of nations/empires that disappeared when its citizens no longer held their core beliefs and values.” (non sequitur)
Speaking of core beliefs and values, how does instigating hatred against a few teens who have the guts to try to make a political and moral statement square with the values of a country that purports to value free speech?
Some months ago, many Muslims throughout the world became enraged because a series of cartoons depicting Muhammad appeared in Danish newspapers. Swarms of Internet messages circulated in the U.S., denouncing their attitudes as examples of extremism and hypersensitivity.
I suppose it’s safe to assume that those who denounced the Muslims are not the same people who forwarded the email about the California teens. After all, to criticize the Muslims for overreacting and then do so themselves would be—well, illogical. Right?
Thursday, July 12, 2007
The Rock and the Hard Place
This fall, my state is forcing me to participate in the great all-American either-or fallacy: In order to vote in the presidential primary, I have to register as a Democrat or Republican.
What if I don't want to vote for either party? What if I want to vote for what I believe rather than for some "platform" cobbled together by people I don't know and don't trust? What if I want to vote for the planet rather than a person? Or for the interests of future generations rather than my own?
What if I don't want to be labeled?
My hunch is that the candidates--the best of them, anyway--don't like this two-party system, either. But in American politics, only two trains pull out, and if you're not on one of them, you're left standing at the station.
Here's what really scares me. Supposedly, there's a balance of power in American government among the executive, legislative, and judicial branches. Supposedly, they keep each other in check. But in reality, there's now a fourth power in politics, one that permeates the whole process but, like dark matter in the universe, is invisible to the naked eye: the power of the Parties. Party leaders, accountable to no one but themselves and their own vested interests, make decisions of monumental importance.
Watergate should have been a warning to us. And what about the strange, disproportionate power of Dick Cheney?
By voting for my preferred candidate, I’ll be inadvertently supporting a system that is subverting everything my vote stands for. My vote will be tinged by the proverbial red or blue of party politics and possibly rendered meaningless by the antiquated electoral college system. But it is my vote, my little voice, and the only thing worse than having it rendered meaningless by others would be not to cast it at all.
Wednesday, July 4, 2007
Supreme Anachronism
With so much that needs to be accomplished on this shrinking planet, it’s discouraging to hear about yet another set of contentious, regressive 5-to-4 decisions from the Supreme Court. With two recent Bush appointees on the bench, the nation can only watch as the Court blithely turns history on its head, undoing the decisions of previous courts (such as the recent ruling effectively outlawing voluntary school desegregation plans). It’s depressing to think what kind of damage may be done in coming years to important legislation protecting the environment, human rights, and free speech.
The Supreme Court (which, incidentally, did not spring forth fully fledged from the minds of the Founding Fathers but rather evolved into its present form) needs a little work in my opinion.
First, we should increase the number of justices. There’s nothing magic about the number 9. Given the gravity of the decisions the Court has to make, I think the committee should be larger. Also, there should be an even number of justices so they have to listen to each other, negotiate, and compromise in order to reach a decision. The practice would be good for them.
Second, there should be term limits and health standards. Since we don’t get to vote on the justices, we shouldn’t be stuck with them for decades—especially when they’ve become dotty or decrepit to the point where they can’t find the bathroom or stay awake during arguments.
Finally, let’s get some people on the bench who know something about something besides the law. Laws don’t exist in an idyllic universe, like Plato’s forms. They’re entwined with the material world in which things are happening that would have astonished the authors of the Constitution. Shouldn’t we have people involved in making important decisions who have deep knowledge—doctors involved in making decisions about medicine, teachers about education, scientists about science, engineers about technology?
If we want truly impartial Supreme Court Justices to thread the modern world through the eye of the Constitution, we’d best start programming robots to do it. If we want human beings to make decisions that will move us along toward greater civility and enlightenment, we’d better make sure they’re as apolitical, well-adjusted, and well-informed as human beings can be.
Thursday, June 21, 2007
Cat Turd Receptacles
On the rare occasion when I sit down and actually watch a little television, it doesn’t take long to be reminded why these occasions are so rare.
I happened to pass through the living room yesterday while my husband was watching a program about Isambard Kingdom Brunel, undoubtedly one of the most innovative engineers to grace the planet (and one whose projects can still be admired in many places in England). That’s the good thing about television these days—with a few spare minutes, you can actually learn something, and the quality of many documentaries is outstanding.
Then came the first advertisement. A woman is scooping litter from her cat’s box. Her expression dramatizes her complete revulsion at having to do the task, as she carries the bag at the end of her arm out to the trash. But wait! There’s a solution!
For a mere 40 bucks or so, you can have not one but two receptacles to hold used litter until you’re good and ready to take it out to the trash! Each container is equipped with a supply of plastic liners (the high-tech equivalent of the plastic grocery bags I use, I guess), and a mechanical closure in each container crimps the top so odors don’t escape.
Now we see the same woman again, beaming from ear to ear, carrying out another bag of cat waste—after some of it, purportedly, has ripened for a week. She looks like it’s her birthday and she’s just spotted the cake! (Interestingly, this second baggie appears to be very much the same size and shape as the baggie in the first scene, leaving one to wonder if her cat sometimes uses the toilet.)
Okay, let’s think about this for a minute.
For years, health experts have been urging people to be more active. We’re encouraged to park some distance from the door at a mall or grocery store so we get a little more exercise going to and from the car. Yet here are several happy customers endorsing a product that, at most, can save them a few steps to the garbage cans. These women (like me) look as though a little bit more exercise wouldn’t do them a bit of harm.
Hurry! Call now!
Good grief.
Who are these people who sit around thinking up kooky ideas for products nobody ever needed and nobody ever will? And who buys these gadgets?
And what would Isambard think about the way some modern engineers choose to apply their intelligence?
Wednesday, June 20, 2007
Of Lame Ducks
Since the beginning of 2007, President George W. Bush’s approval ratings in U.S. polls have yet to break 40%, and lately, they’ve been hovering around 33%. Apparently, two out of three Americans disapprove of his war, environmental policies, politics, or economics. Last fall, voters sent a clear message by electing Democratic majorities in both the House and the Senate.
Things don’t get much lamer than that for a Republican President.
Meanwhile in Great Britain, Tony Blair—faced with similar differences of opinion with his constituents—gracefully and graciously stepped aside as Prime Minister. Gordon Brown will replace him on June 27, and that will be that. Blair can go on to do good works as a private citizen and be gratefully remembered for his less controversial accomplishments in office.
So here’s my question: Why are we stuck with George W., and he with us, for the next 18 months? Why can’t we have a system that allows a lame duck president to make a dignified exit and let the rest of us get on with things?
When the Constitution was drafted, it could take weeks after an election just to figure out who the winners were. Messages traveled as fast as horses could trot. Things took time. But today, should it still take months or years to get the barge of state turned around?
In the U.S., it’s unthinkable for a President to quit—as Richard Nixon, uniquely, was forced to do—regardless of whether or not he (or she) can reasonably be expected to do a decent job in office. During the months when Bill Clinton was being harried and harassed about his sex life (and so many Americans were patiently trying to explain the headlines to their elderly grandparents), what if stepping aside gracefully had been an option? (I’m not saying he would have or should have stepped down. I’m just asking “what if?”)
Would we lose a lot by doing away with primaries and shortening the election season? Would candidates use more of their time for debating real issues and less for mud-slinging? Would people who really have a mandate from the people be able to get something accomplished, without the Executive and the Legislative branches canceling each other out (as happened today, when Bush vetoed a bill allowing funding of embryonic stem cell research)?
It’s been well over 200 years, and the framers of the Constitution didn’t do a bad job, all things considered. But one or two things have changed since then. Maybe it’s time we started looking around at what works well in other countries and asking whether our practices are best practices.
Bush is outnumbered in both houses of Congress and under constant attack by members of his own party. This duck isn’t lame. He’s on his belly without a leg to stand on. Why should he and all of us Americans have to keep on pretending that he speaks for the nation?
Sunday, June 17, 2007
"Duke Lacrosse Prosecutor"
The stacked adjectives in the headlines are bad enough. But even though three young men have finally been exonerated, once and for all, of despicable acts they didn't commit, there's still plenty of damage to be done by rapacious and over-zealous attorneys.
Prosecutor Mike Nifong admitted wrongdoing and apologized. He's been humiliated and disbarred. A nation given to pondering things might ask questions like these:
- "How often does this kind of thing happen when the truth never comes to light?"
- Should prosecutors have as much power as they do to choose what charges should be filed against people? (Should they, for example, be allowed to use outrageous charges to coerce defendants into pleading guilty to lesser crimes?)
- Would it be possible to change the focus of our judicial system from an adversarial, win-or-lose contest between two sides to one in which the focus is on finding truth and minimizing suffering?
- Does some people's suffering count more than other people's?
Nifong did a rare thing for an attorney--or for anyone under so much public scrutiny: he admitted he was wrong. But is that enough? Not as long as there's a dollar to be made by other attorneys. Now lawyers for the accused athletes are talking about criminal and, no doubt, civil charges. The spectacle will be played out in the media as long as anyone still has enough sense of outrage or morbid curiosity to tune in.
To what purpose? Will the accused men and their families get back the days they spent worrying or the nights they spent obsessing about what might happen? No. But on the other hand, Mike Nifong will never again get carried away with his own arguments and pull out all the stops to convict an innocent defendant. Is there any point to putting him and his family through months and years of what those other families went through?
I think it's time we quit confusing "justice" with "getting even." There's no such thing as getting even. If deliberately harming people achieves nothing but income for attorneys and a sense of satisfaction on the part of those who want revenge, then maybe it's time to question our values.