T-Minus Thirty-Five Hours

At this point, Kim and I have pretty much completed everything major that needs to be done before we leave. We are organized. We are prepared. We are surprisingly calm.

This is, of course, terrifying.

I’ve been telling people that I’m really not worried about the becoming-a-parent element of this whole adventure, and it’s true. What I am worried about: 22 hours of travel, to a foreign land we have only the slightest understanding of, on the way to becoming a parent.

But really, what’s probably going on is that I have only so much room for panic in my brain. So ask me again next week.

Meanwhile, we leave for the airport in 32 hours. We leave Cleveland at 6 am Wednesday for Washington Dulles, where we get to sit quietly and stress some more for five hours. And then it’s fourteen hours from Dulles to Beijing, where we will cavalierly engage in casual conversation and board games. By which I mean freak out some more.

I’m sure this comes as no surprise to those of you who are already parents, however you arrived there. Feel free to laugh at me as much as you want. Just not right now, okay?

Stealing Music, Again

I’m pretty sure anyone who’s even tangentially related to the music industry is contractually obligated to weigh in on the Great NPR Stealing Music Fiasco of Ought-Twelve.

So here are my thoughts, in response to Jonathan Coulton’s very interesting ruminations. This was originally left as a comment on his post, but it’s pretty much guaranteed to be buried, so…

Thought experiment for the Free-Culture and anti-copyright folks out here: Let’s say you make something. You distribute it digitally. And because you believe in Free Culture, you insist that it be distributed for free.

Then someone starts charging for it.

How does that make you feel? Do you feel that you, as the creator, have the right to determine how your creation is distributed?

I think even more than a practical issue — which, let me be clear, is certainly a big issue — this is an issue of principle. How would you feel if the foo were on the other shoot? Not great, I suspect.

But there’s also a practical element that doesn’t seem to get talked about much. There are a lot of Free-As-In-Beer flag-wavers outraged that the gub’mint might step in and shut down file-sharing hubs. “What right,” they demand, “does the government have to determine how culture should be shared?”

But here’s what confuses me: What right do the flag-wavers have to determine how an artist’s work should be shared? Do you presume to know better than the creators of the works how their work benefits them, or benefits society?

We talk about “the music industry” as though it’s this faceless monolith. But the facts are (as usual) a lot messier. Yes, some artists can make a living on the road, and giving away their music (or allowing it to be given away) benefits them. But there are musicians for whom this is exactly reversed: Live performance earns nothing; music sales and licensing are everything. Most are probably in the middle. But are we going to insist that all musicians take it on the road, or give it up? Seems kind of counter-productive to the goal of diversifying art and encouraging experimentation. Do we really want a world where only the one percent (if you’ll pardon the allusion) of musicians can make a decent living?

People, think this through to its logical conclusion. We live in a world where money is necessary. The less money that can be made by making music, the fewer musicians we’ll have. Yes, there will be the super-successful, there will be the ones who do it for love alone, and there will be those — like our esteemed host — who find their own niche and make it work. But I don’t see how it can be denied that fewer rewards for making music will ultimately result in fewer musicians. Is that really what we want? Is that the price we’re willing to pay for “free” music?

Personally, I’d rather pay Mr. Coulton than get Rebecca Black for free.

Am I alone in thinking this way?

For more of my thoughts on the matter, you may be interested in this post from a few years back.

Evolution: A Dialogue

I love my family.

For those of you who don’t know, I’m the youngest of ten siblings. I have nineteen nieces and nephews, two grand-nephews, and a brand new grand-niece. That being the case, it probably doesn’t surprise you much that we have dramatically different ideals, faiths, and political beliefs. We pretty much cover a large swathe of the spectrum of ideology: we have liberals, conservatives, and libertarians; Democrats and Republicans; Catholics, Protestants, Buddhists, atheists, and agnostics — and some of those don’t match up the way you’d expect. Anyway: big group, lots of diversity. No surprise there.

What is surprising is the fact that, on the whole, we all maintain civility and respect for one another. Sure, we’ve pretty much learned that there are times when certain topics must be avoided, but we’ve also learned how to interact in a gratifyingly adult manner when those topics do come up…and to extend love and respect in spite of our disagreements.

So I was very pleased when my oldest sister, Caroline, responded via e-mail to my previous post. She’s a Christian who holds very different beliefs from mine, and also a very smart lady, so we ended up getting deep into discussion about the validity of evolution and the interplay of science and religion. And though we ended up agreeing to disagree—as so often must be the case—I enjoyed the conversation so much that I asked her permission to share it with you.

So here it is, unedited. Let it never be said that adults cannot disagree civilly about fundamental matters.

Dear Joseph,

I read your blog post this morning, and though I don’t expect to change your mind about anything, I do feel I need to respond.

Dear Republican candidates: This is why we don’t take you seriously

The recent news about a former climate-change denier changing his tune — in a study funded by fossil-fuel interests, no less — got me thinking. Well, that and the seemingly endless series of Republican debates. In watching coverage of the debates, something kept nibbling at the back of my mind, something I couldn’t put my finger on. But I finally figured it out.

In talking about global warming and evolution (and in some cases, both at once!) the Republican candidates tend to fall back on some variant of this phrase:

“It’s just a theory.”

Evolution? Just a theory. Global warming? Just a theory.

Let me back up for a second and lay out some disclosure: I believe that — no, wait a minute; strike “believe.” Evolution is real. We know evolution is real because we see it in action. Ever hear of MRSA? Methicillin-resistant Staphylococcus aureus has become a serious problem in hospitals and nursing homes over the past few years. S. aureus is a bacterium that usually lives pretty harmlessly on human skin. Occasionally, though, it can flare up into relatively serious infections. Historically, these infections have been pretty easily treated with penicillin or other antibiotics. Then came MRSA. This nasty little critter dodges most of what we would normally throw at it, forcing doctors to bring out the big guns. Where did it come from?

Evolution. Wide use of traditional antibiotics killed off, by definition, only those strains of staph susceptible to traditional antibiotics. What was left were the ones that had mutated in such a way that traditional antibiotics didn’t wipe them out. New drugs, hardier bugs. Survival of the fittest. Sound familiar?
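For the programmers in the audience, that selection process is simple enough to sketch as a toy simulation. Everything here — the population size, the 95% kill rate, the starting fraction of resistant mutants, the function name `apply_antibiotic` — is an illustrative assumption, not data from any real study; the point is only that susceptible strains die off and resistant ones take over.

```python
import random

def apply_antibiotic(population, kill_prob=0.95):
    """One round of treatment: each susceptible bacterium dies with
    probability kill_prob; resistant ones shrug it off. Survivors then
    repopulate to the original size, offspring inheriting a random
    survivor's trait."""
    survivors = [b for b in population
                 if b == "resistant" or random.random() > kill_prob]
    return [random.choice(survivors) for _ in population]

random.seed(0)
# Start with 1% resistant mutants in a population of 10,000.
pop = ["resistant"] * 100 + ["susceptible"] * 9900

for generation in range(5):
    pop = apply_antibiotic(pop)

resistant_share = pop.count("resistant") / len(pop)
print(f"Resistant after 5 rounds of treatment: {resistant_share:.0%}")
```

Run it and the resistant share climbs from 1% toward essentially the whole population within a handful of rounds — nobody designed the resistance; the drug simply removed the competition.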

(As an aside: I happen to agree with Newt Gingrich that recognizing the truth of evolution doesn’t mean you can’t also believe in a divine Creator. Unless you take the Bible as word-for-word accurate, and believe that the world was created in six twenty-four-hour days — in which case, I’d like to ask you some questions about Genesis 1:27 vis-a-vis Genesis 2:22, among others — there’s nothing in evolution that precludes the idea of a Creator guiding the mutations that result in evolution. In fact, I tend to find that idea more elegant.)

So, that’s evolution. Global warming? I’ll say I’m not nearly as up on the science here, but I’m willing to take the word of ninety-seven percent of the people whose job it is to know about this stuff. Because the alternative is a laughable global conspiracy with basically no upside for the alleged conspirators. (But you know what? Even if global warming is a complete fabrication, what the hell is wrong with working to reduce waste? That’s the fundamental goal of proponents of action on global warming, you know: to reduce waste. Fossil fuels are absurdly inefficient, and thus expensive far out of proportion to the benefits they provide. If we can come up with more efficient, less wasteful, less expensive ways of doing things, why wouldn’t we? I don’t know about you, but my parents taught me that waste was bad. But anyway.)

My point in making these disclosures is to acknowledge that having these views predisposes me not to take seriously anyone who dismisses them. But you know, it’s a big world and a free country; believe what you want to believe.

The problem is when you try to support your beliefs by saying these things are “just theories.” And that’s the point I want to make here. (I know, it took me long enough.) When you say that global warming or evolution is “just a theory,” you’re either displaying 1.) a dismaying level of ignorance about the way science works, or 2.) a cynical willingness to pretend to such ignorance if you think it makes you more electable.

Here’s why I say that: “Theory” in common parlance and “theory” in the context of science are two very different things. Anyone who took a single high-school-level science class ought to know this. Outside of science, we use the word “theory” to indicate an untested idea. It’s the start of the process. If you say “I’ve got a theory: It could be bunnies,” you’re essentially announcing your intention to explore the idea that bunnies could be at the root of your problems.

But in science, that’s called a hypothesis. A theory is what happens when a hypothesis has been rigorously explored. In other words, a hypothesis becomes a theory only after evidence has been gathered.

Now, this isn’t the end of the process by any means. Scientists are always re-evaluating theories to ensure they still hold up. That’s the great thing about science: You have all these really freakin’ smart people constantly checking to make sure everything works the way we think it does, so we don’t have to. And yes, sometimes new evidence arises that disproves a theory, even a long-held one. But that doesn’t change the fact that theories are based on evidence, not just wild speculation.

So dismissing a scientific theory — especially one as well-tested as evolution — as “just a theory” is simply absurd. It’s like saying Earth is “just a planet.” The Grand Canyon is “just a big hole.” America is “just a country.” (U-S-A! U-S-A!) What I’m saying is that it makes you look ignorant. And then we all laugh at you. Because we’re mean.

If you want to fall back on scientific skepticism, we can all discuss things rationally, like adults. Point out holes in theories and sic the scientists on each other. I have no problem with that. There are plenty of things that well-meaning adults disagree on, and there’s just so much we don’t know. But if you try to pretend that you know more than science does, using words that betray a fundamental misunderstanding of one of the basic precepts of science, well, it makes it hard for folks who know better to take you seriously. You might as well debate the existence of gravity.

You do believe in gravity, right?

A Chain of Causality

I should be working. I’m working on a review, on the Xbox 360 — a system made by Microsoft — a company that probably would not exist if it weren’t for Windows — an operating system that definitely would not exist if not for the Macintosh. Once I’m done playing the game, I’ll write the review (as I’m writing this) on the most reliable computer I’ve ever owned, a Mac Mini.

I very likely would not be doing this for a living if it weren’t for the desktop-publishing experience I had before graduating college. First was high school, where I designed flyers for my first band on an original Mac. (In Zapf Chancery. I know. Shut up.) Then came college, during which I taught myself the rudiments of more complex graphic design by laying out my next band’s first CD on a Mac at a Kinko’s. Desktop publishing skills — especially those on a Mac — looked great on a resume in 1996, which probably helped me get my first job out of college, at P.S.X., which later became The Official U.S. PlayStation Magazine. And if they didn’t help me get the job, they certainly helped me succeed at it, and keep it for over 10 years. For that entire time, I worked daily on a Mac. It was my livelihood.

But before that, I learned the rudiments of programming — something that’s served me well in critical thinking as well as in basic web design — on an Apple IIe in grade school. For my seventh-grade computer project, I hand-coded an interactive version of Stephen King’s The Eyes of the Dragon. For my eighth-grade project, the assignment was to write a program that generated a color picture on the screen. I ended up crafting a pretty impressive (for the time, and the technology) image of Gene Simmons’ face, in full makeup — then went a step further than required by animating it: he appeared on the screen, then turned his head and breathed fire. I still have that 5 1/4″ floppy somewhere.

Last night, I was checking e-mail and reading my RSS feeds on my iPad in between texting my wife on my iPhone. I had music going in the background, streaming from iTunes to an AirPort attached to my stereo, controlled by the iOS Remote app. A notification window popped up on my iPad from the AP News app: Steve Jobs had died. Of course we all knew this was coming, and I certainly felt a sadness for all the ideas that might now be unrealized, a certain concern that Apple might lose some of that creative spark that had made it such an interesting company. But I also thought, “Oh geez, ridiculous hyperbole incoming!” I winced in anticipation of all the maudlin blog posts and frothing overstatement of Jobs’ influence on the world.

Then I thought about the work I needed to do today, and the 360, and Microsoft, and Windows. I thought about that Apple IIe and the string of Macs and sitting in Kinko’s at 11:30 at night aligning pictures. I thought about the music I was hearing, the music I’d sold online, the fact that I was sitting comfortably on the couch with an amazingly powerful and usable computer sitting in my lap like a hardcover book. I thought about the fact that I never again have to take a road trip without bringing my entire music collection with me. And I realized that much of what I’d be reading about Steve Jobs might not be hyperbole after all.

No, he didn’t feed the poor or cure a disease or land on the moon. He wasn’t even the person directly responsible for creating the Apple, the Mac, the iPod, the iPhone, or the iPad. But he drove those creations. As a result of that drive, my life is better: I can communicate with loved ones more easily, I can work more efficiently, I can enjoy more pleasurable road trips. He didn’t “change the world” in the sense that, say, the inventor of the printing press did. But what he was in charge of has inarguably changed my world for the better, in many different ways. And he’s done the same for millions of others. And that’s definitely something to be proud of. That’s more than the vast majority of us could ever dream of doing.

So here’s to you, Steve Jobs. Thanks for taking the lead on so many projects that have improved my life. Here’s hoping Apple will honor your legacy by continuing your vision. But if not, that won’t diminish the impact you had.