Monday, August 31, 2009

The Gravity of Goals

Newton formulated a theory of gravity where the attraction between two objects depends on their masses and the distance between them. Normally this theory is only applied to matter, but I think it is also relevant to the way we choose between goals.

The "Gravity of Goals" equation would reformulate Newton's model like this: my "attraction" to a goal depends on the amount I value the goal (mass) and the amount of time it will take to satisfy it (distance). The longer it takes to satisfy the goal, the less likely I will pursue it. The more I value a goal, the more likely I will pursue it. Faced with multiple goals, the goal I pursue will be the one with the highest payoff (measured by subjective calculations of combined "mass" and "distance").

Imagine an asteroid hurtling through space, irresistibly pulled by a number of planets, some larger than others. That's pretty much what we are, just in a more subjective way.
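As a toy illustration, the reformulation above can be sketched in a few lines of code. Everything here is invented for the example: the inverse-square form (the post doesn't commit to an exact exponent), the goals themselves, and the numbers attached to them.

```python
# Toy sketch of the "Gravity of Goals" idea, modeled loosely on Newton's
# F = G * m1 * m2 / r^2. Here "mass" is how much I value a goal and
# "distance" is the time it takes to satisfy it. All goals and numbers
# below are made up for illustration.

def attraction(value, months_to_satisfy):
    """Pull of a goal: grows with value, shrinks with time-distance."""
    return value / months_to_satisfy ** 2

goals = {
    "eat lunch now":         attraction(value=2,  months_to_satisfy=0.001),
    "finish degree":         attraction(value=90, months_to_satisfy=24),
    "write a novel someday": attraction(value=70, months_to_satisfy=120),
}

# Faced with multiple goals, the one pursued is the one with the
# strongest pull -- which is often the near-term goal, despite its
# small "mass."
strongest = max(goals, key=goals.get)
print(strongest)  # -> eat lunch now
```

Note the asymmetry this produces: a goal valued 45 times more ("finish degree") still loses to lunch, because distance enters the formula squared.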

Sunday, August 30, 2009


I was thinking about hippies today and realized that really they're just Americans who have adopted elements of Indian culture. I started going down the list of things I think make a hippie a hippie, and I came up with:
  1. Practicing yoga rigorously (not that stuff at the gym)
  2. Sympathy or explicit belief in an Eastern Religion (Hinduism, Buddhism)
  3. Belief in relative morality
  4. Pacifistic or non-violent bent
  5. Tendency to prefer natural remedies and cures over chemicals (exemplified in, e.g., ayurveda)
  6. A tendency to renounce the world (derived from the concept of sannyasa)
Notice how all of these ideas have roots in, or are affiliated with, India and Indian culture! Of course, there are fundamental differences, and [full disclosure: I'm Indian] I think hippies understand these ideas in more naive ways than Indians (epitomized by the hippie saying "Why can't we just all get along?"). But still I'm fascinated by the connection.

From my personal experience, I always thought Indian culture was confined to Indian communities, mostly because no one around me knew anything about it at all. But now that seems not to be the case—and it makes sense. Although Indians and Americans have begun to interact only recently (beginning, chiefly, with Swami Vivekananda), the effects of this interaction cannot just vanish into thin air. I think today I found at least one of those effects.

Saturday, August 29, 2009

The In Crowd

All through grade school I believed in the social hierarchy. I believed that it was more prestigious to hang out with some groups of kids rather than others, or that there was something intrinsically better about being in certain company. To hang out with the cool kids, then, was success, and not to do so was to fail—not only to fail, but to be stuck hanging out with the other failures.

So of course I worked as hard as I could for "success," and in so doing I treated other kids based on which group they belonged to. If they were part of the "upper" group, then I tried my hardest to be congenial and friendly and impressive; but if they were part of the "riffraff," the kids with faces covered in pimples who sat in the corner discussing anime, then I made sure to end the conversation quickly, lest a cool kid catch me and implicate me by association.

But I never made it into the in crowd. I spent most of my school days alone, confused, not realizing that in pursuing the people who "should" have been my friends, I ended up alienating the people who would have been my friends.

Friday, August 28, 2009

Humility in the Classroom

When my political philosophy professor mentioned how he invites debate, pushback, and skepticism, I think he made a mistake...and sure enough, within an hour a kid took advantage of the invitation to make a complete ass of himself in front of the entire class.

In relation to game theory, and specifically the Prisoner's Dilemma, the professor was talking about how some societies are more cooperative than others, using his experiences in Minnesota and New Orleans as examples. He said people in Minnesota are extremely cooperative, whereas, as he put it, people in New Orleans wouldn't cooperate even if you shot them. He then went on to cite research showing that whether we cooperate depends on what everyone else is doing; if everyone is cooperating, then we tend to cooperate too, and vice versa.

Then the ass raised his hand. I think you're wrong about this, he said, and I think I understand why you are wrong too. He then proposed some theory of geographical determinism, claiming that people in colder climates need to be more cooperative in order to survive the winter, and that this explains the discrepancy between Minnesota and New Orleans. Uh-huh, said the professor; that would also explain perfectly why the Russians are such cooperative people...

Of course there's nothing wrong with honest intellectual inquiry. If you see things another way, go ahead and share it. But do it in a respectful way: the classroom is no place for picking fights and trying to show how "smart" you are.

Afraid of discouraging students, professors have taken to letting this kind of arrogance go unchecked, happy that students are at least speaking up. But this is dangerous. I think students may get the misconception that the rest of the world will be just as patient, leaving them with an inflated ego, an argumentative personality, and a head full of empty ideas. No, students need to understand that the classroom has no room for intellectual pageantry and that there is a reason they are not the ones lecturing; the sooner they understand this, the better off everyone will be.

But let's not go to the other extreme either. I am not talking about complete deference to authority and seniority. Like most things in life, being a good student requires striking a balance--in this case between independent thinking and deference. It's a difficult balance to attain, but humility (which is appallingly scant in university classrooms) makes it possible.

Wednesday, August 26, 2009

Fairness in Economics

One of the first things you learn in a social science class is the difference between positive and normative analysis. The two are treated as something like opposites: positive analysis seeks to understand how things are, whereas normative analysis seeks to figure out how things should be. Another way of putting it is that normative analysis is value-laden, whereas positive analysis is not. Positive analysis can inform normative analysis, but the two are kept decidedly separate.

Recently in my political philosophy class, the professor was discussing how economists view fairness. He cited surveys in which economists, asked about fairness, did not know how to answer the question; they could not understand what "fairness" meant, or what it was (and those with a background in Pareto efficiency will have an inkling why). He also talked about a book that goes on for pages and pages only to conclude that fairness is a preference—like your taste in wine or where you buy your furniture. Somehow, said my professor, it seems they're missing the point.

These economists seem to have come to understand the H. economicus model so well that they are actually starting to think like it, to incorporate its values, to live it. Remember that H. economicus was meant to be a strictly positive model. Whether we like it or not, people are result-oriented (which is part of what the model says)--but that by no means implies that we should be that way. The model should fit you, not the other way around.

These economists, however, are going the other way around, converting themselves from H. sapiens to H. economicus; or, in other words, they are understanding their positive model in a normative way.

There are, broadly, two possible explanations for why this is occurring. First, it could be that H. economicus-type people self-select into the economics discipline, so the study itself has no effect on them. Second, perhaps economists have trained themselves to think in the H. economicus direction for so long that it has in fact rubbed off on them, so that is all they see.

If so, that mindset needs to end. Fairness should be central to economics. What's the point of having wealth if various factions are going to waste it all fighting over it?

Update: I spoke too soon: it turns out that theories of rationality must necessarily contain normative notions--and H. economicus is a model of rational action. However, I stand by my broader point that fairness matters in economics, and it seems economists don't take it seriously enough. Intuitively, it seems that fairness is not only something people do care about, but should care about.

Wednesday, August 19, 2009

Vanishing Bats

Today on NPR I heard a story about how the most common species of bat in North America seems to be on the path to extinction due to a fungal disease called White Nose Syndrome. As hundreds of thousands of bats die, bat scientists are calling "crisis!" and starting to mount efforts to prevent extinction.

Part of the story also noted how bats perform a tremendous service to farmers as natural insect control, and hinted that extinction would be a catastrophe for the ecosystem. For now, the report said, conservation biologists are carefully monitoring the situation.

Now if White Nose Syndrome is caused by humans (and that's a big if), then I would understand the cause for alarm. But that's not what was being talked about; instead, the report focused on how the decline of bats is a calamity in and of itself.

As someone with minimal experience in ecology, I may be wrong on this, but I feel this approach is misguided. It seems that if an event (including extinction) happens on its own in nature, then it is helpful rather than harmful, and ultimately works for the benefit of the system. After all, the Earth has been sustaining life for hundreds of millions of years and has been doing just fine without our help. Just because we are now marginally aware of the complex ecology around us does not mean that we need to correct for perceived "imbalances," especially when these corrections can lead to real imbalances of their own.

Of course, this entire argument is contingent on whether White Nose Syndrome is caused by humans or not. And there very well may be other facts to bear in mind. Nonetheless, I sense that we environmentalists have a knee-jerk reaction for preserving species diversity—and I want to suggest that this may not be a good thing.

Tuesday, August 18, 2009

A Helping Hand in the Bowels of Bureaucracy

It can be quite depressing to work in the Division for Substance Abuse Policy at the Governor's Office because you have to constantly consider social problems most people don't have to worry about. Our work is all about things like kids binge-drinking at age 13 and meth ravaging entire communities, which make you realize how desperate some problems have become. I've asked several people at the office how they cope with this. Their reply: "It helps knowing that I'm doing something to help."

Through the thickets of reports, paperwork, and legalese, though, it's hard to see how my work contributes to anything. It's all so far removed from the people and their problems.

Today, however, that gap was bridged—if only temporarily—when I inexplicably got a call from a distraught woman in San Diego who told me that she was desperately seeking help for her brother-in-law, a meth addict in Tucson. I have no idea how her call ended up reaching my cubicle, but once it did I seized the opportunity to do right. I told her I'd figure out what to do, and, flushed, I ran to the director of the division, who nonchalantly said she had some contacts in Tucson and called the woman back within 15 minutes...

That's it. I didn't even really help her directly. But it was enough for me: knowing that I was part of a process that brought some peace and consolation to a fellow human being is more than enough.

Monday, August 17, 2009

From Rights to Duties

Recently there has been a lot of brouhaha over the belligerency and unruliness at health care town hall meetings. Often these disruptive people yell slogans about their rights, which they take to mean "entitlements to do whatever you want"—including bringing automatic weapons. This sentiment has surfaced in a really garish way right now, but it's always been there, if only latently. David Sedaris, for example, has a funny story in Me Talk Pretty One Day where he mentions a man at a Chicago movie theater who refused to turn off his transistor radio. When the usher was called, he started arguing about how we live in a free country.

I think these anecdotes signal a general sentiment that confuses rights with complete license, freedom with impunity. Scholars will no doubt point out that these people are completely misunderstanding the concepts, but even so I think these misunderstandings arise out of the very language of rights itself:

Rights are essentially an egocentric concept: with rights, you take for granted the entitlements society extends to you while ignoring the work everyone else is constantly doing to uphold those rights.

In other words, the problem with rights is that they obscure the fact that one man's right is everyone else's duty. If I have the right to free speech, it is only because everyone else in the community takes it upon themselves to refrain from silencing me, especially when they do not like what they hear. If we are a freer people, it is only because everyone is working harder at fulfilling their duties to themselves, their family, and their neighbors.

The concept of duty presents a better way of understanding how we should structure our social relations. However, this switch in thinking is not as radical as one might think. Rights and duties ultimately both represent the same values, like two sides of the same coin: if everyone is fulfilling their duties, then everyone’s rights are being respected. The difference between the two is that rights focus on how the individual receives respect from everyone else, whereas duties focus on how the individual gives respect to everyone around him. 

It's a small switch in thinking that makes a number of key differences. First off, it makes us more concerned about other people. In this new paradigm, instead of asserting rights (and thus focusing on my needs over everyone else's), my mind will turn to fulfilling duties (considering what I need to do to help others satisfy their needs).

To be sure, I’m not talking about subordinating the needs of an individual for the sake of the collective. Rather, I’m speaking to the fact that a person can control only whether or not he fulfills his duties towards others, not whether others fulfill their duties towards him. An individual, on his own, has no power to defend his own right; he can only try to persuade others to respect his right. And if these former right-violators decide to change their right-violating ways, then what they are really doing is sacrificing their personal interests for the sake of maintaining society’s broader moral ideals—a principle commonly known as duty. Thus, even rights (a very individual-centric concept) contain within them notions of duty, making it difficult to say that a duty paradigm does not respect the individual.

The second benefit of the duty paradigm is that it makes more explicit the idea that social balance is only the result of work, and is fragilely maintained. Unlike rights, which seem to exist automatically until violated, duties are left undone until they are fulfilled. The Declaration of Independence can say what it wants about inalienability and God-guarantees, but history shows that it is only humans who protect the rights of others, and humans can as easily trample rights as they can uphold them.

What this shift to duties will not do, however, is make our moral problems any easier. Just as we try to understand what rights apply to what situations, we'll have to try to understand what duties apply to what situations. But I do think we will be working with a more socially responsible system of struggling through moral problems.

Sunday, August 16, 2009

District 9

Today I saw "District 9," a grisly film about life twenty years after aliens land. Although the plot was rather clichéd (i.e., outcast hero sacrifices himself to save his newly found friends), I appreciated the film for the two main takeaways it gave me:

1. For me, the film acted as a case study of how badly humans react to abrupt change, new social problems, and, most importantly, major profit opportunities. In this film, the "profit opportunity" is the ability to access the power of alien weapon technology, which for some reason only works with their DNA. Corporations, governments, and warlords are all hungry for this technology, and are willing to use violence, double-dealing, or any other means necessary to get it. It's not that I see people as fundamentally evil, but I think humans are very susceptible to corruption when the stakes are that high.

2. The film reminded me that fulfillment can be found even in the most wretched conditions through self-sacrifice—and conditions don't get any worse than for Wikus van der Merwe, our hero, who finds himself a fugitive from human society after an accident causes him to begin transforming into an alien. While Wikus finds this transformation painful and miserable, vested interests around the world are exuberant, since Wikus can now access alien weapons technology. This makes him the "world's most valuable corporate artifact"; no longer human, but a prize, a resource; and to be degraded to this state is, it seems, as low as one can possibly go. Yet whether the world recognizes it or not, Wikus is able to demonstrate and assert his humanity by sacrificing himself. Even as he lies dying he is victorious knowing that he is what he thinks he is, regardless of what the world around him thinks.

Saturday, August 15, 2009

Private Schools

Recently I've been finding out that quite a few people I know go to private preparatory elementary and secondary schools. This, to me, is quite a perplexing discovery, since it seems that as long as you don't live in a bad area (and these kids don't) the public schools are quite good. I naturally started to wonder what made these schools special, what made them worth the extra commute, extra money, and extra hassle, but I didn't get around to finding out.

Today, however, my curiosity got the better of me. My only research method involved browsing the schools' websites (which, out of courtesy, I'm withholding), but that was enough for my purposes.

For the most part, I saw what I expected to see: small class sizes, a "personal commitment to your child," various high-flown language classes, and robust drama and arts programs. The websites seemed like they could do with some Mozart in the background (after all, research proves his music enhances learning!), but the general aura was one of class and refinement.

As I said, this was all expected. But I was completely floored when I saw the price: Tuition alone costs at least as much as my total cost to attend university! And that's for the cheaper school. The more expensive one costs twice as much, and fees, textbooks, and food are extra.

Now, I get the idea that education is the most valuable thing that you can give to your child, and that this is about their future and everything. But I don't understand how that much money can make a worthwhile difference in the way a kid learns his ABCs, or long division, or even middle-school algebra. At the college level you can talk about spending on the "university brand," but that's only for when you get there.

In college, no one gives a damn where you went to high school, let alone before that. In the end, it really all washes out. The private-school kids sit next to the public-school kids, and any stamp of distinction is immediately worn away.

Friday, August 14, 2009

The Rhetoric of Proselytization

As a non-Christian, I've seen evangelicals make a fair number of attempts to proselytize me. With Bibles in their hands and worn smiles on their faces, they patiently try to make me understand that Jesus Christ is my only salvation, that I've been living my life wrong all this time (even though they have no idea what it means to live a Hindu life).

Although I oppose proselytism on principle, I sympathize with the intention. The idea, it seems, is that it feels miserly and selfish to have found something that has been such a positive and transformational force in one's own life and not share it with others. One wants to spread the "good news," so to speak.

But in spite of my sympathies, even well-meaning and clean-hearted evangelicals come off as annoying and intrusive—and it's not an accident.

The problem is that evangelicals forget one of the most fundamental rhetorical dictums: know thy audience. For some reason, evangelicals seem to believe that the best way to talk to people who do not believe in the validity of the Bible is to endlessly quote the Bible. They like to start with John 3:16 ("For God so loved the world..."), take a tour through Matthew and Mark, and then make a powerful close with John 14:6 ("I am the way and the truth and the life..."), the clincher that unequivocally proves the necessity of conversion. They look at you with eyes (and sometimes mouths) that say, if it's written in the Bible how can you contradict? And if you do contradict, you only get another Bible quote.

This is surely an exercise in futility. So, evangelicals, if you are to undertake the very difficult task of proving that only your way of worship is legitimate, please read up on some rhetoric first.

Wednesday, August 12, 2009

Adulthood Doesn't Happen On Its Own

I always assumed that when kids grow up, they naturally become more responsible, socially-aware, and self-directed. These values were just supposed to grow on you, like facial hair. Not true.

The background story for how this realization came about is rather long and unnecessary, but the point is that it doesn't seem like adulthood happens on its own anymore. Of course, the body will age and grow and the facial hair will come without asking. But those qualities that make adults adults don't.

I'm talking about that kind of adulthood which drives my mother to cook for us every day, even though she doesn't want to. I'm talking about the kind of adulthood that pushed my dad to work harder when I was born, because he realized that my health and well-being were in his hands. I'm talking about that kind of adulthood that makes people understand they need to confess to their mistakes, even if it hurts their ego. I'm talking about the kind of adulthood where people know in their bones that their needs are not the only ones that matter.

These values are the true hallmarks of adults, and they don't come automatically—or easily. Every day we must make an effort to inculcate these ideals, to improve ourselves, so that we can inspire the coming generation as we strive to live up to the promise of the previous one.

Tuesday, August 11, 2009

Not All Kinds of Consumer Confidence Are Equal

With the worst of the financial crisis seemingly over, economists and talking heads are now looking at the path to recovery--a path they say depends on so-called "consumer confidence." The title of Fareed Zakaria's recent article basically sums up the spirit: "Get out the wallets: The world needs America to shop."

As someone who has studied a little economics, I understand where they are coming from. Consumption spending makes up roughly two-thirds of GDP, and the American market is one of the biggest in the world. As businesses see higher sales volumes, they are able to hire more workers, make investments, and circulate the money back into the economy, which keeps us all afloat. On paper, the numbers present a convincing argument.

The problem, though, is that the numbers do not translate into sustained, healthy, long-term prosperity. For example, having scores of people buy houses they cannot afford would certainly count as consumer confidence, but no one would call that healthy or desirable. The same goes if everyone maxed out their credit cards and buried themselves in debt. These are the very problems the financial crisis so starkly exposed, and, if anything, the severity of the panic should teach us not to make those mistakes again. Rather than reading headlines about negative savings rates and massive foreclosures, I think most of us would like to see households maintain or move toward financial well-being, make prudent investments, and take other steps to ensure long-term financial security. National prosperity, after all, begins with personal prosperity, and personal prosperity depends in large part on sound spending habits.

The key, as always, is moderation. Households do need to enter the marketplace, but they should do so in a level-headed way. The numbers from this level-headed approach may not look as good as those generated by "irrational exuberance," but, lest we forget, five years from now that's not what we'll care about.

Monday, August 10, 2009

"Cash for Clunkers"

Talk of the town: "Cash for Clunkers" (a government program that gives rebates to people who trade in their old cars for more fuel-efficient models) has run through $1 billion in about a week, which is quite something considering that that $1 billion was supposed to last until November at least.

The current angle: The national conversation has been focusing on what this running out of money means. The left calls it a success. The right calls it a failure.

What really matters: What makes "Cash for Clunkers" different from a pure subsidy is its environmental component. This is crucial because it is this difference that makes it a public-interest program--the public interest, of course, being that greener cars lead to cleaner air and ease some foreign policy problems. Now, some people have suggested that the process of junking the clunkers greatly dampens the environmental benefit of greener cars. This claim may be bogus, but these are the kinds of questions people should be asking and answering. The focus of the conversation should really be on whether the program is achieving its purported objectives, and achieving them efficiently; not on the program's popularity.

Sunday, August 9, 2009

Bhagavad Gita, Chapter 12 Verse 16

Today I was reminded of one of my favorite lines from the Bhagavad Gita, Chapter 12 Verse 16:

सर्वारम्भपरित्यागी यो मद्भक्त: स मे प्रिय:
sarvarambha parityagi yo mad-bhaktah sa me priyah

To paraphrase, it means that one should renounce the idea that one is the beginning or end of anything. Thus, to think, for example, that I discovered the cure, I came up with the design, I am the one that made this happen is nothing but sheer ignorance.

Isaac Newton, who probably has one of the biggest claims to bragging rights in the history of mankind, also probably has one of the best explanations for why he doesn't make his claim: "If I have seen further," he said, "it is by standing on the shoulders of giants." Trite, yes, but only because it is true. And it deserves more repetition. Quotes like these make the researcher who boasts of curing a disease think twice about skipping over all the foundational work that scores of people did to make that discovery possible.

However, this verse doesn't just apply to major acts of creation and discovery. The subtle and more day-to-day meaning is that behind all of our undertakings lies a motivating and influencing force. Parents, teachers, books, and media all shape our actions, and part of having a healthy outlook is recognizing this fact.

On paper this all seems so obvious, of course, but it's surprising how often we forget. We like to think of ourselves as intelligent, but we forget about the people who refined that intelligence and directed it into constructive channels. We like to think of ourselves as good people, but we forget about the people that have helped instill those values in us.

After internalizing the principle of this verse, sincere and matter-of-fact humility should come on its own. After all, when one goes through the list of people one's indebted to, how can one feel anything but humility?

Saturday, August 8, 2009


When I was in the 7th grade, I learned that exponents do not distribute over the terms inside parentheses. That is:
(a + b)² ≠ a² + b²
My teacher hadn't mentioned the FOIL method for performing distributive multiplication (that was for 8th grade, I came to find out), so I just assumed that it hadn't been discovered yet. How nice and clean would it be, though, if there were a way to distribute that exponent! So I worked on coming up with a rule.

Ambitious as I was, however, I didn't have any mathematical chops. I couldn't simplify it, or manipulate it, or modify it to come up with an elegant solution, so I tried the only other strategy I knew: guess-and-check. a² + b² + a + b (no); a² + b² + 4a - b (no); a² + b² + 3a / b (no)...

For the next week this became my obsession. I thought about it during class (ignoring whatever else was going on), while eating, before sleeping. Guess-and-check is, after all, not a very quick or reliable method, so it took a lot of computing. But finally, I came up with something that seemed like it would work:
(a + b)² = a² + b² + 2ab.
I double-checked it and triple-checked it; I tested it with fractions, decimals, irrational numbers, whatever math I knew at the time; and when I was absolutely sure it worked, I showed it to my parents.

It was meant to be one of those conversations that starts off casually enough at first, but then quickly escalates to life-changing proportions. I imagined my parents would jump out of their seats when they realized they had been raising a genius all along. There would be a press conference the next day of course, so I had already started planning how I'd strike that delicate balance between humility and healthy pride. And I had good reason to be proud. I had just solved one of the most perplexing mathematical problems of our time. Mathematicians would rejoice. Scientists would cheer. It would be a new world...
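In that same guess-and-check spirit, the rediscovered rule can be verified numerically today in a few seconds; the test values below are arbitrary stand-ins for the fractions, decimals, and irrationals mentioned above.

```python
# Brute-force check of the rediscovered rule (a + b)^2 = a^2 + b^2 + 2ab,
# over a handful of arbitrary test values: integers, fractions, and
# irrationals.
import math

test_values = [0, 1, -3, 0.5, 2 / 7, math.sqrt(2), math.pi]

for a in test_values:
    for b in test_values:
        lhs = (a + b) ** 2
        rhs = a ** 2 + b ** 2 + 2 * a * b
        # Floating-point arithmetic, so compare with a tolerance
        # rather than exact equality.
        assert math.isclose(lhs, rhs, abs_tol=1e-9), (a, b)

print("rule holds for all tested pairs")
```

Of course, no amount of guess-and-check constitutes a proof; that's what the FOIL expansion waiting in 8th grade was for.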

Friday, August 7, 2009

How many government employees does it take to change a light bulb?

Cassandra, who heads the Arizona Governor's Office for Children, Youth, and Families, where I work, had requested that a light bulb be installed in her office. She wanted something that would sit over her head or something so that she could read better.

So she puts in her request with the powers that be, and a week later a guy comes down to her office to appraise the situation. He nods, and then comes back with two more people: a guy holding some kind of meter, and a "lighting hygienist." The lighting hygienist takes a look around, turns on all the lights (including the lights under the desk, which Cassandra never uses), and has Cassandra sit in her chair. Well, the lighting is fine, she says, but I think your computer monitor may be angled too high...

Thursday, August 6, 2009

Languages and Codes

A personal observation with, I think, universal application:

After getting a fairly solid grasp on my third language, Hindi, I started to realize that there was a fundamental difference between the way my brain processes the languages that I know (English/Spanish/Hindi) and the ones I do not.

The languages I do know seem like, well, languages; which is to say that I cannot help but find meaning in those words, that it feels as if the meaning were inherent. When I do try to hear my languages as an outsider would (that is, sounds divorced from meaning), I'm shocked at what I take for granted. For example, have you ever stopped to think how caveman-like the word "food" is? It sounds like primitive grunting.

Foreign languages, however, don't seem like languages at all, but rather like codes. Like substitution ciphers, they don't seem to exist in their own right; they merely encrypt the languages I already know. In other words, even though all languages employ arbitrary sounds to describe ideas, the sounds that my languages employ feel more "right." I don't think I'm alone in this experience, judging from the way I've seen people learn vocabulary in a new language:

asdflkj = dog
eoriur = cat
poiuwer = tree

Here, the mystery language on the left is being "deciphered" on the right. Notice that when we interpret the equal sign, we don't make a distinction between the idea of "dog," which is beyond words, and the word "dog" itself, which is peculiar to English. Thus, when most of us see such vocabulary lists, we do not see two equally valid arbitrary sounds for a concept, but rather one arbitrary sound (on the left) and one concept (on the right).

This language vs. code idea also shows up in the way we parody foreign languages. For example:

Italian: Montebello, mama mia deliciosso, wella comma to my pizza shoppa!
German: Gutten heiten schitinen hassengurten.
Chinese: Ping pong dong hai ni ma ni pay

I think that people who don't know these languages will agree with me that if these parodies are read in the right voice, they sound close enough to the actual languages. People who do know these languages will, of course, disagree—which brings me to my next point: it's almost impossible to parody a language that one knows fluently. I suspect this is because we cannot separate sound from meaning in our own language as we do with others, so the parodies come off as gibberish.

For now I'm on the fence about whether this "linguistic ethnocentrism" is pernicious or just a harmless natural instinct. Whatever the case may be, though, for the foreign-language learner, this scheme offers some useful heuristics for judging success: if you can no longer parody the language you're learning, if the words in your new language feel "right" and inherently meaningful, you're probably becoming fluent.

Sunday, August 2, 2009

*Throat Clearing Sound*

That last post took a lot longer than I expected, and it's not because I wasn't sure what I wanted to talk about.

I've barely started to write this blog, but I'm starting to truly understand what it means to have one's own "voice" in writing. Subconsciously, it seems that what I'm really trying to do is imitate and synthesize the voice of writers I like, such as David Sedaris or David Foster Wallace. No wonder it's taking me so long to write my posts.

Ultimately, then, I'll have to find my own voice. I don't know how that's going to happen, or how I'll even be able to tell (do fish know they swim in water?), but my hope is that things will become clearer as I keep this up.


I've never really thought of myself as a blogger, but here I am. The idea is that I'll get in some good writing practice this way.

Like most unexpected turn of events, this one happened rather suddenly. At least, the idea came to me only recently, but now that I think about it I guess I've been heading this direction for a while. I've always admired and envied good writing--you know, the kind that feels confessional instead of premeditated--and normally I leave it at that. But now I've decided that instead of idly admiring and envying, I'll take steps to develop the craft--and, as they say, practice makes perfect.