Tuesday, December 9, 2008

Your Media Isn't Yours: DRM as a Culture Limiting Force


(credit to Thomas Hawk, under a CC-Attribution-Non-Commercial license)

So imagine one day you walk into your home after a long day of work and decide you want to read, say, “Harrison Bergeron,” by Kurt Vonnegut. You head into the living room and towards the bookshelf to grab your copy of the excellent short story collection, “Welcome to the Monkey House.” But you are surprised to find that your entire bookshelf is no longer in your home. All of your books have disappeared along with it. The door was unmolested and nothing else has been taken, but for some reason, your entire book collection (and all it represented in spent money, intellectual feats accomplished, and personal development) has simply disappeared.

It sounds like an entirely impossible situation. You’d be as surprised to see your entire DVD collection vanish or your stack of Wired Magazine back-issues cease to be. But the modern media industry wants you to believe that something like this could happen to your entire purchased digital music collection. In fact, the media industry thinks that this is not only reasonable, but that it’s their right.

It sounds like a company is robbing you of your right to own your own property, which can mean only one thing: you can bet digital rights management (DRM) is to blame. I’ve written about this subject before (here), but it hasn’t gotten any better.

Here’s a recent example. In September, the iTunes music store was faced with a possible royalty hike from the music industry. This royalty hike would have been mandated by the federal government, and it would have cost iTunes something like 66% more per track to operate its digital music store. This left iTunes with only one possible course of action: shutting down its music store. It’s true; iTunes was within one government ruling of choosing to no longer operate its successful digital music store, all because the music industry, in concert with the government, could condition rights to that music on higher royalties.

Markets opening and closing because of price fluctuation is nothing new. But one unfortunate side effect of closing down a DRM-controlled music store is that purchases from that store become useless. Crippling the rights management system for a digital music file turns it from a purchased asset into a meaningless pile of ones and zeros.

Nowhere was this more evident than in the late-September WalMart digital music fiasco. WalMart announced that, since it no longer sold DRM-controlled music, it was shutting down its DRM servers. The backlash was instant, of course, because the early adopters of the WalMart digital music store, who still had some DRM-controlled music, stood to be punished for purchasing it. Because of the backlash, WalMart soon backed down from this radical decision. But from that point forward, the ability of companies to literally wipe clean a customer’s digital purchases was not just a slippery-slope possibility. It was a close call.
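To make the "meaningless pile of ones and zeros" point concrete, here is a minimal sketch of the dependency at issue. Everything in it is hypothetical (no real store works exactly this way), but the shape is the same: the purchased file is ciphertext, and the key to unlock it only ever lives on a server the seller can switch off.

```python
# Hypothetical sketch of server-dependent DRM playback; not any real store's protocol.

class LicenseServerGone(Exception):
    """The rights server has been shut down."""

def fetch_key(track_id, server_online):
    # The purchased file never ships with its own decryption key.
    if not server_online:
        raise LicenseServerGone("no server, no key, no music")
    return b"key-for-" + track_id.encode()

def play(track_id, encrypted_track, server_online):
    key = fetch_key(track_id, server_online)  # the store's servers must still exist
    # ... decrypt `encrypted_track` with `key` and send the audio to the speakers ...
    return "playing " + track_id

print(play("some_purchased_track", b"...", server_online=True))   # works today
try:
    play("some_purchased_track", b"...", server_online=False)     # the store shuts down...
except LicenseServerGone:
    print("...and the 'purchase' is now just noise")
```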

The list of DRM-crippled technology is long (this blog series details 35 products that sacrifice customer satisfaction for rights management, and this feature counts down 25 reasons why DRM as it exists is bad for the consumer). But there are some success stories in between the horror stories (Netflix and the Mac have finally gotten along, for instance, with Netflix bringing its DRM-protected Watch Instantly streaming to Apple’s platform). DRM isn’t an inherently bad thing; it gives creators a way to stop their customers from freely distributing products to the point of saturating the market and rendering the artists’ work meaningless.

Back to the bookshelf analogy. When you buy a book, its inherent physicality makes it impossible to let three people read it at once. The rights management on books is built into the nature of the books. But if we could instantly reproduce those books into identical copies, we’d need a new way to make sure that those copies aren’t replacing purchased versions in the homes of possible purchasers. Being able to manage rights is good.

The way that DRM works now, however, is akin to allowing a band of rights enforcers to break into your home and take all of your books if they start fearing for their profit margins. If the rights granters decide that it’s not profitable to be a book company anymore, they could make all of the books they have created disappear right out of the hands of those who have purchased them. Until DRM is fixed, your digital files are in this kind of danger.

And in a culture that is moving quickly toward nearly entirely digitally distributed media, we suddenly find our access to creative endeavors limited by the will of a small group of media conglomerates. It’s starting to look like our culture itself is under the stranglehold of outdated laws and outdated media companies, and not controlled by our collective will to grow and push forward as a society.

I don’t know about you, but that scares the hell out of me.

Thursday, November 27, 2008

The more things change

“I would never date a Republican. They’re all religious fanatics that think they can tell me what to do with my own body. They’re racist homophobes and think that is how God wants them to be. All they care about is tax cuts for the rich and want to abolish social programming because they think poor people are welfare crack-whores who deserve what they get. How can I let someone like that touch me? I mean, what if I got pregnant and they made me keep the baby? It’s about time that we had someone in office who will bring change and not use people’s fears and prejudices to drive his own agenda.”

All right, I made that conversation up. Kind of. I have been told all of those things by various supporters of the Democratic Party, just not in succession. There seems to be a strong belief among liberal, educated youth that they are tolerant, generous, and righteous while Republicans are not. The argument goes that Democrats support social programming; Republicans don’t. Democrats believe in individual rights; Republicans don’t. Democrats are all educated and intelligent; Republicans are rich and greedy or stupid, backwater bible-thumpers.

This is pure, old-fashioned stereotyping and prejudice, often coming from the very people who decry racism and prejudice of all (or most) forms and proclaim Obama’s victory as the victory of enlightenment over prejudice. The mistake they make is in assuming that racism is only about skin color. Today, the bigger issue is one of class and geography. Democratic youth believe that they are part of the educated elite and look down on people from Southern states, who are stereotyped as uneducated and ignorant. That stereotype is, incidentally, untrue. Republicans are actually more likely to have a 4-year degree and equally likely to have a graduate degree.

Offensive stereotypes don’t need to be proven wrong, of course, but I will offer just one more example of how ridiculous some of this stereotyping is. The belief that Republicans do not care about the poor is mind-bogglingly (is that a word?) absurd. In fact, there is evidence that Republicans give more of their income to charity, on average, than Democrats. There is clearly a belief that government should be less involved in social programming, but the rationale is that the private sector is better equipped to provide these services. You may disagree with this philosophy, but you can’t say with any sort of intellectual honesty that it means Republicans do not care about the poor. Unless, of course, you are letting your prejudices speak for you.

This was a historic election and Americans should be proud of the choice they have made, but in no way does this signify the end of racism and prejudice. It just reminds us that prejudice goes much further than skin deep.

Monday, November 24, 2008

SBO Roundup 1: Bond vs. Barack, and more


(above: obama bond, from worth1000.com contributor sudiptatatha)

This is a new feature I'm trying out. I see a lot of things in my daily life that I want to talk about on this blog but that don't amount to enough content for a full post. Some things just don't get at the heart of the unwavering inertia of our complacent society confronted with the equally unswerving momentum of technology and culture moving inexorably forward.

So, this feature, the "SBO Roundup," will be a chance to briefly hit on the things that wouldn't otherwise get covered on Stars Blink Out. This time, I'm going to be talking about the new James Bond movie (particularly what makes it both a product and victim of some current and dated movie trends), Barack Obama's new approach to the "fireside chat," Michael Chabon's "The Yiddish Policemen's Union," and a quick comment on Kanye West's new record.
  • This weekend, I had a chance to see the newest James Bond movie. It's called "Quantum of Solace" (arguably the worst Bond title yet). I'm not a huge Bond fan by any means, but to some extent, the Bond formula has permeated our spy movie viewing experience: campy, a little sex, and good old smooth James Bond. But a new paradigm has sort of eclipsed that one. It's the paradigm of the "Bourne" movies: grittier, darker, and with a more conflicted protagonist. Arguably, the first Bond reboot, "Casino Royale," delivers on the Bond premise with touches of grit and realism; "Quantum" seems to have strayed pretty far into "Bourne" territory by its still edifying, but certainly not scoring-a-Bond-girl-edifying, conclusion. It's a strong, interesting action movie, but I recommend caution for Classic Bond fans.
  • Barack Obama, as President-elect in a transition period, has, as previously mentioned, started the conversation with the American people already. A week or so back, that conversation took a new step. Obama posted a video on YouTube that was essentially a fireside radio chat, but instead of a fire there are some law books, and instead of radio it's the Internet. It's an intriguing new approach to getting the executive's core aims and values in front of a national audience. Watch it, and see what you think.
  • The premise of Michael Chabon's "The Yiddish Policemen's Union" is that a not-so-well-known, entirely non-fictional World War II-era plan to make a portion of Alaska into a Jewish settlement, instead of being thrown out as ludicrous like it was in reality, has been enacted. The world's Jews all converge on a little portion of Alaskan wilderness and continue their already-tenuous interactions with the world. The novel uses the plan (and the eventual rise and fall of a sort of messiah figure) to discuss what homeland, isolation, guilt, sin, and salvation mean in today's world. I highly recommend this book for any Jew who can feel, somewhere inside of them, buried deep, a disconnect between their identity as a Jew, their expectations of salvation, and their duties and hardships as a human. I also recommend it to anyone who recognizes that conundrum within themselves.
  • Kanye West. What a crazy dude. His last album, "Graduation," was dripping with synthesizers and samples, surprising the listener with unexpected sounds around every corner. His new record, called "808s & Heartbreak," does some sonic surprising, but the premise sort of curtails the possibility for any real shock: every single track features nearly zero samples, relying instead on vintage 808 drums and synths; every track features not a single rap from Kanye, but melodies sung through an auto-tune device. It's strange to hear a sample-heavy rapper turn into a synth-heavy pop vocalist, but the result is at least new and intriguing. Any unevenness present on "Graduation" is smoothed out on "808s." More importantly, master of mashup and surprise Kanye West has managed to mash together disparate instruments and styles instead of disparate samples. The result is a more subtle, more progressive approach to mashup. The record also serves as a sort of comment on the fact that as Kanye moves farther from hip-hop and closer to straight-forward pop, his fame (or infamy) grows. This might also underscore how rough, angry hip-hop will always be outsold by straight-up pop. Listen to my favorite track, Robocop (until the RIAA finds this YouTube clip).
So that's what's crossed my mind recently. Stay tuned for more full-fledged, traditional Stars Blink Out posts in the near future, featuring our new expanded line-up.

Monday, November 10, 2008

Intellectual Psychosis

The hallmark of psychosis is the inability to decipher what is real - or, to believe something is real even when some (or all) evidence proves otherwise. Real things are consistent, and the reality that the psychotic perceives is very inconsistent. Without going into too much detail about my own mental illness, in my psychotic state, I perceived things that never came to fruition. And I'm not the only one: Pete Earley, author of Crazy, chronicles the lives of many individuals suffering from mental illness. Most, if not all, had at least some sort of psychotic thought; one even believed he was a prophet, needing to tell the world of the Messiah's return in 2007. The movie A Beautiful Mind, for example, tells of a mathematician who finally recognizes his artificial reality when one of his psychotic beliefs proves impossible: a little girl he has hallucinated for years never gets any older.

If Dr. John Nash, entrenched in his pseudo-reality, could make such a discovery, then why can't most people, who do not suffer from such illnesses, do the same? We can laud Hollywood's portrayal of this schizophrenic man, but rarely do we - the audience - scrutinize our beliefs the same way.

For instance, a recent conversation with my Rabbi proved futile: of course the dinosaurs existed, he proclaimed, they just died in the flood! This is in stark contrast to Kent Hovind or Ken Ham's view, where the dinosaurs were saved in the Ark and actually lived with people. Never mind that the fossil strata and geological column prove both claims erroneous.

But if you've presupposed your conclusion (regardless of its correctness), any syllogism will do: the premises may not be falsifiable - or, if you want, you could even back into your premises by treating your conclusion as evidence for them (e.g., dinosaurs must have died out in a flood, because God's obviously perfect word predicted an irrefutable flood). Such arguments may be formally valid, but they prove nothing. Strictly speaking, they are best described as dogma.

Inevitably, the dogmatic person will see evidence contrary to their beliefs. Faced with rejecting what might be a lifetime of teaching (and perhaps even a culture built around such teachings), the person will fall back on their intuition: "I've got to go with my gut." I've coined a term for such faulty thinking: intellectual psychosis.

It's not psychotic, though, to have grounded faith, or even to use your gut. As a government auditor, I'm trusted to be honest, but no amount of testing could prove that I am, and always will be, an honest person. When we begin a relationship, we may have no personal experience with which to predict the relationship's success. But we try nonetheless, and hope for our partner's requited love and fidelity. We need our gut instincts. Many relationships don't work out, but we still have faith that, one day, one might. Grounded faith, intuition, conviction - these things are important.

Equally important, at least when figuring out why our gut tells us to do something, is critical thinking. We're afraid of being wrong. But it takes more guts to admit that you're wrong than to brainlessly assume you're right. Intellectual psychosis is a treatable malady. Try some critical thinking; it's good for both the head and the stomach.

Thursday, November 6, 2008

Meet Our Newest Staff Members!

I don't want to say too much, but you might have noticed that the "contributors" section has two new names in it. It's true: I've invited two new individuals to join me in writing for "Stars Blink Out." The names listed there are not, as is no surprise, their real names, as I'm going to leave it to each of them to regulate how they are known on this blog. And no promises that both have things lined up to post in the nearly near future (one does, I think). But they are here to bring us some variety, a break from me, etcetera.

One is a guy that I've known for maybe a year and a half (feels like longer), and the other is a guy that I've known for my whole life (feels like longer), my brother. Each has their own outlook on the world and their own perceptions of what constitutes "relevance" in this rapidly changing world.

But enough with the boring stuff. Let me just say that I look forward to having them on board and to giving you, the reader (and also me), more to read and more to think about.

-Stephen

Change.gov: It's Starting To Look Like Yeah, We Maybe Can


(a background wallpaper from http://obama4thewin.com/)

A lot of people out there are starting to wonder. How many times can we say "Yes We Can" before we have to start actually doing something? How often can we call for "Change" before the absence of that change starts to show?

President-elect Barack Obama is listening to those people. He is faced with an in-between period during which he has been elected but not inaugurated; he has support but no way to leverage it just yet. He has responded by doing something unprecedented. He has set up Change.gov.

A website for a president-elect certainly doesn't sound too impressive. But that's before you dive in and see what the site actually does. Not only does it document the obvious media attention, it calls for submissions on what the average American thinks our vision should be. It asks for average American stories. It outlines policies, plans, and specific methods for effectuating the change for which we have all been calling. It even has a blog. An American president-elect running a blog!

I know, it's not a fixed economy or a solved health care crisis. But it certainly is a giant step in an entirely changed direction. I like where this is going so far.

NOTE: I also have a straight-up Obama entry in the works, so be on the lookout for that.

Thursday, October 23, 2008

John Green's "Paper Towns," And Then Some

(above: Paper Towns covers)

Any discussion of John Green almost invariably includes his brother, Hank Green, and the project they started on the first of January in 2007. The project was called "Brotherhood 2.0." The two brothers alternated making a video blog entry every weekday for one whole year. The result was a conversation, had in public, about what it means to grow up and to be brothers and what it means to be in a community. The result was a fellowship of young people, self-named Nerdfighters, with the common goals of thinking and appreciating each other and appreciating themselves. This unintended result of the project is likely its most impressive.

But another stated impetus for the project, the one on which I wish to focus, was for these two brothers to get to know each other as adults. Hank and John really only knew each other as their younger selves, the two brothers who grew up together. This project was a way to learn how each of the brothers had grown up, what kind of adult person each one had turned into.

It's a lofty goal, and it's one that the brothers Green took seriously. The project lasted the whole year, and Hank and John still make videos to this day. Over the course of these videos, the full architecture of each of their personalities came into relief, and we got to know them as well as they got to know each other.

That brings me to John Green the young adult author. He's written some award-winning books (An Abundance of Katherines! Looking for Alaska!), and over the course of his videoblogging project, he has been writing his newest book. It's called Paper Towns, and I'm here to discuss a really important thematic similarity between the book and Brotherhood 2.0.

In essence, a teenage boy called Q finds himself drawn into the enigma of a teenage girl named Margo, his next door neighbor and the object of his desire. The two share a night of pranks (inarguably a classic "night of passion" without the classic trappings of a "night of passion"), and then, as in all of John Green's books, we make camp inside of our male narrator's mind and examine what happens when an important figure in his world disappears.

Also as in Green's previous works, the absent female cornerstone becomes far more important in our protagonist's mind than in the narrative. Put differently, the male lead's conception of the absent female lead is far more important throughout the novel than the actual female lead. In point of fact, calling Margo a female lead is pretty hugely misleading, as she is thought about more in this novel than she is actually in it.

That said, Paper Towns can be seen as a capstone on some of the important ruminations in John Green's previous works. In Paper Towns, more so than in those previous books, people are misunderstanding each other, and it's having serious consequences for their development and interaction. People have these thin versions of each other in their minds, and the book chronicles those thin versions filling out.

Paper Towns is an engrossing, witty, real-feeling book.
It crackles at times with the same intensity and clarity that characterized Alaska and Katherines. But it finally grabs some of the side themes of those works and hammers at them. It describes something we have all done. We have all deified or vilified others. We have all underestimated the complexity of those around us. The book shows us, in a pretty visceral way, the effects of this kind of drastic misapprehension.

This is clearly the textbook that informs all of Brotherhood 2.0. John Green and Hank Green had these thin versions of each other in their minds, and throughout their video project, these versions filled out.
Maybe even more impressively, the thin versions of the band geeks, literature nerds, and D&D dorks inside of us all fill out, and we learn to appreciate each other more completely through the connection of good books, inside jokes, and even charity projects. I recommend the book highly, but I recommend the Brotherhood 2.0 and Nerdfighting experience even more highly.

If you believe that people are complex and important and awesome, and thinking otherwise is dangerous, check out Brotherhood 2.0. If you need more proof of that danger, read Paper Towns.

Wednesday, October 8, 2008

Where We Should Put Our $700 Billion: NPR

(above: Michael Slatoff's disorienting photograph)

I understand so little about this recent financial crisis. In fact, as with most things I know anything about, most of my knowledge comes from reading, listening and asking questions in conversation. But in this case, one source has made these conversations a little easier. That source is NPR.

Earlier this year, approximately May-ish, NPR's "This American Life" did a story called "The Giant Pool of Money." This story explained, in clear, concise, and engaging language, the genesis of the mortgage crisis. When I heard this story, while I didn't become an expert, I was finally able to start thinking about the issue in 3 dimensions, turning it around and pulling it apart, examining even the internal consistency of NPR's account.

And that's what the story meant to do. It didn't mean to explain it all or to make us all understand. It meant to get us thinking about the complex issues, to get us talking about them.

If that wasn't enough, as the global financial crisis developed, "This American Life" followed their first financial story up with another one called "Another Frightening Show About the Economy." This hour-long episode updated the financial story and tackled the substance of the bailout plan that's been floating around. The two minds behind these shows also started their own podcast on money matters, called "Planet Money."

Again, these NPR stories and podcasts don't want to clear the whole thing up or make it sound simpler than it is. All of this careful, enlightening work by the NPR staff merely seeks to make us talk, to make us think. NPR's coverage encourages, above all else, conversation.

As I said, nearly any pseudo-expertise I have comes from conversation. And for my money (bad pun!!), NPR is doing more for that conversation than even economists and politicians right now. You should give it a listen. And if you are a financial expert, you should let me know if NPR got anything really wrong. You know, we'll have a conversation.

Sunday, October 5, 2008

Spore: A Case-Study in DRM

(above: the title card of sorts for "Spore")

Recently, "Sim City" pioneer and creator of "The Sims," Will Wright, came out with a new simulation game. This one is called "Spore." It's meant to be a sort of Sim-everything, in that you start the game playing "Sim Amoeba" and end up with "Sim Interstellar Diplomacy," hitting every Sim-step in between.

I'm not here to talk about how well the game achieves this rather lofty goal. Instead, I'm here to talk a bit about "Spore" as a case of bad DRM or, at the very least, people finally noticing DRM and responding to it.

Let's start from the beginning. When you buy "Spore," you throw the disc in your computer and install the game. But before you can play it or get involved in the Sim-circus, you have to register online. You submit some information on yourself and you get the game essentially unlocked. Basically, your right to access your digital copy of "Spore" is granted.

But let's say that you have three computers, and you want to install "Spore" on all three. Well and good, the "Spore" team says. You can register online up to three times for different rights to different digital copies. Now let's say one of those computers malfunctions and you want to install "Spore" a fourth time. Without a specific dispensation from the publisher, your crack at the digital right of playing "Spore" is gone.

Another hypothetical: say the Sim-team decides that "Spore" isn't as popular as it used to be, and say they decide to stop running their registration service for "Spore." In this case, you can't unlock your digital rights to play "Spore" anymore. At all. Your copy of the game has become, as one Amazon review says, a colorful plastic coaster.
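As a rough sketch of how this kind of activation scheme behaves (hypothetical logic only, not the actual SecuROM implementation shipped with "Spore"), the whole arrangement boils down to a counter that lives on a server you don't control:

```python
# Hypothetical sketch of limited-activation DRM; not the actual SecuROM implementation.

MAX_ACTIVATIONS = 3
activations = {}        # serial number -> number of machines activated so far
service_running = True  # the publisher can flip this off whenever it likes

def activate(serial):
    """Return True if this install is allowed to run, False otherwise."""
    if not service_running:
        return False                 # registration service gone: colorful plastic coaster
    used = activations.get(serial, 0)
    if used >= MAX_ACTIVATIONS:
        return False                 # fourth install (dead computer or not): denied
    activations[serial] = used + 1
    return True

serial = "SPORE-0000-0000"           # hypothetical serial number
print([activate(serial) for _ in range(4)])  # [True, True, True, False]
service_running = False
print(activate(serial))                      # False for everyone, forever
```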

And speaking of Amazon reviews, that brings us to the response to "Spore." You'd think that Amazon would be a wealth of information on how good the game actually is. Instead, the people who have bought "Spore" have lashed out against the DRM on the game. There are tons of 1-star reviews of the game already posted, and nearly every one cites the DRM as the reason for the bad review.

So what do we take away from this? I suppose we learn that, yeah, a game can be revolutionary and do things games have never done before, but people won't buy it if they can't use it the way they want to. As CBC podcaster Jesse Brown said on his program, "Search Engine," consumers of Will Wright's games love the idea of creating their own worlds and their own players, but they also demand to play those games on their own terms. They want to expand Wright's idea of making your own rules within the game to how the game is distributed.

The key point to take away from this is that even though fighting oppressive DRM is a geek concept now, a) it is spreading into the mainstream consumer environment, and b) it becomes pretty serious when your target audience is a geek audience.

We have a right to get pissed when a company wants to turn our "purchases" into lease agreements. It's time for copyright rules to reflect our understanding of what "copy" means. And what "right" means.

(lots of info from Fred Benenson's blog, via BoingBoing)

Friday, September 26, 2008

Laurie Anderson PSAs



Laurie Anderson has been a favorite of mine for a long time. She's great at mixing the goofy with the enlightening. The above clip is a great example: something totally absurd that barely makes sense as a sort of illustration of her worldview.

Check out this one for something related to current events, even though it's approximately 20 years old. Or this one about women and money.

(credit to smashing telly for getting me on the right track for this one)

Wednesday, September 24, 2008

New Facebook: My Take

(above: my actual Facebook page, available through a link to the right)

New Facebook does the following very good things:
  • Forces you to prioritize your applications: you can only keep a small number of application boxes on your main profile page, and the rest of them must go on your "boxes" tab. This finally eliminates the MySpacey ugliness of poorly designed, graphics-heavy application boxes all over a profile.
  • Tabbed browsing of relevant profile information: if you only want someone's information or favorites, it's all neatly collected in their "info" tab. If you want their pictures, you hit the "pictures" tab. And the news feed also has a tab. Oh, speaking of...
  • Combines all recent friend information in one place: gone are the days of checking news feed, wall, posted items, and so on just to see what is happening with your friends (or yourself!) at a given time. The combined feed gives you an overview of ALL recent activity.
  • A comprehensive, organized right sidebar on the main page: I watched a pointless little YouTube video mocking this specific feature and contending that the internet is ALWAYS left justified. Turns out, interestingly enough, that some of his rant was in his video's info sidebar, which was on the (you guessed it) right side of the YouTube video page. And not only that, Facebook's sidebar has all the relevant info you might want, including a late-breaking addition of an application link menu, a change made as a result of constant and meticulous refining by the Facebook development team.
  • Generally cleaner, without wasted space: yeah, the pages take up more space in a browser window and get rid of the white space on the two sides, but I always thought that screen efficiency and clean design were good things.
  • The ability to comment on everything: the dialogue-starting ability to comment on, for instance, stati is awesome; I have commented on quite a few people's stati and received reciprocal comments on my own. It's instant, compact, and it stays current and easy to use.
That's just an at-a-glance list. But even if you don't buy that new Facebook is superior, it's time to clam it, 'cause Facebook isn't about to host two separate architectures for their site just so that a few change-averse people can avoid getting used to something new. I myself am usually change-averse, but this time, even if the change isn't, in your eyes, far superior, it sure is permanent!

Tuesday, September 16, 2008

Andrew Jarecki's "Capturing the Friedmans"


The name "Jarecki" should jump out to fans of the modern documentary. A man named Eugene Jarecki made a movie called "Why We Fight," which is, in my opinion, the finest documentary I have seen. Apparently this knack for documentary is a family trait: Eugene's brother Andrew Jarecki made his directorial debut with his documentary, "Capturing the Friedmans."

The film captures a family in turmoil. In a semi-affluent New York neighborhood, a father and son, the Friedmans, are accused of molesting multiple children at a computer class taught in their basement. The film, though, starts with some benign, almost endearing footage of the family joking with each other and just acting like a family. It's not until about ten minutes in that we realize the film is not about how the family functions, but how the family falls apart over the course of the investigation and trial of the father and youngest son.

It's an absurd premise. Making a documentary about a trial is not unheard of. Neither is making a documentary about a family, or making a documentary about a failing marriage or about a troubled individual giving in to temptation. What this film does is blend all of these together to arrive at a complex, interwoven portrait of the nature of truth in an American family and an American court.

And that's what sets this film apart from a lot of the pack of modern documentaries. It does tell a story or two (or more), but its main premise is to expose the ambiguity of truth in our legal system and in our lives. The question is not whether the father and son are guilty of the molestation charges against them. The question is, who else might be guilty, and of what? Or even, what is guilt?

In one expert scene, a lawyer relates a touching account of his client finally admitting to having a problem and deciding to plead guilty. But intercut with his account is the client's story of being manipulated by that very same lawyer into pleading guilty against his own better judgment and out of desperation. The truth probably hovers somewhere in between, and Jarecki does an expert job of letting us sort of wander around this in-between-ness of truth and come to our own conclusions (or even be satisfied by the roundness and symmetry of this lack of conclusions).

One similarity between Eugene Jarecki's expert "Why We Fight" and Andrew Jarecki's "Capturing the Friedmans" is the nearly entirely absent hand of the filmmaker. Every once in a while, you can hear the director ask a question in an interview, but by and large, there are no voice-overs or on-screen appearances by the filmmakers. These films appear self-organizing, as opposed to the products of the directors.

"Why We Fight" was a clean, impactful, persuasive film essay, while "Capturing the Friedmans" is a messy, ambiguous, emotional nonfiction narrative. But both, for entirely different reasons, feel like they sprang fully formed from reality. And that's basically what I feel a great documentary is meant to accomplish.

Thursday, September 11, 2008

LHC: Fictional Crisis Not Actually Averted!

(above: there it is, folks. the LHC's detector. no idea where i got the image from)

The LHC started yesterday. Nearly every blog I read that has regular updates had a post today saying something along the lines of "The LHC Started! We're Not Dead! Crisis Averted!" This is nonsense, and not because I think the world is going to end and we are in danger, but because the LHC hasn't C'd any H's yet.

That's right, the Large Hadron Collider has not even collided anything yet. The "tiny black holes" and other unpredictable, beautiful, GIANT DANGEROUS SCIENCE doesn't happen until things start smashing into each other some time next month. All that was tested yesterday is the LHC's ability to get particles up to speed and circulating inside the ring. I certainly do not believe that the LHC will destroy the world, but even if it is going to accomplish this task, it can't happen until collisions start next month.

I know that bloggers are often rigor-averse (are they? are we? am I allowed to say that? etc.), but a little science knowledge would prevent misleading posts like the rash I saw today. Without getting into too much sciency detail, saying that we averted a crisis when the dangerous part hasn't even been attempted is like saying you averted the crisis of your car's engine blowing up by checking the tire pressure before leaving the garage. Or maybe even preventing a nuclear meltdown by checking that your pen works before starting the reactor. Pure nonsense.

Almost as much nonsense as thinking that the LHC can destroy the world. But that is a subject for a later date.

Nonsense-purveyors: boingboing, io9. Nonsense-destroyers: WWdN.

Tuesday, September 9, 2008

Making Fun of Hipsters is the Hip Thing to Do!

(above: Toothpaste For Dinner, on hipsters)

“Adbusters” is a magazine I’d had no contact with before now. After doing a little research, I see that the magazine shoots for small press but big change, something I certainly can get behind. It’s slickly designed and has features on how you can feel like you are changing the world. It appears the magazine harnesses the hipster aesthetic to try to sell to a niche independent market of hip young people.

Notice that I used the word “hipster” to describe the magazine. That’s a term that gets thrown around a lot these days without really having a definition. In this case, all I really mean is that the magazine uses the current trend of sleek, sexy design and style to hawk its substance to a young demographic. But really, all that my stilted definition of “hipster” does is show just how poorly-defined the term actually is.

Enter, once again, “Adbusters.” The magazine recently featured an article called “Hipster: The Dead End of Western Civilization.” The article, by Douglas Haddow, tries to delineate exactly what a hipster is and how this current trend of hipsterism means the downfall of the legitimacy of culture. Since his article has these twin aims, I will discuss each in turn, first his definition and second his claims about the significance of hipsterism.

As far as the definition of “hipster” is concerned, Haddow makes a valiant effort. His essay is interlaced with vignettes from the “hipster” world, including clubs, all-night parties, and the like. These little scenes really do establish just which crowd he means when he talks about hipsters. Haddow equates hipsterism with a shifting sense of brand and self in the service of instant relevance, and that shiftiness is certainly a large part of what hipsterism is all about.

This, I think, does get at a fundamental commonality among all of those that we might call hipsters. Hipsters are strongly interested in being on the cutting edge, on being the trendsetters and culture creators. There is an emphasis on recentism inherent in hipsterism. (Incidentally, this same recentism motivates most articles on hipsterism, but that would be a digression.)

But something is missing from Haddow’s definition. Most people you might call a hipster would bristle at the insinuation. Hipsters rarely self-identify (Andy Warhol likely was the closest to self-identifying as a hipster). The term isn’t a general description of a culture. It inherently only reflects the negative side of this culture. Inasmuch as “hipster” is a pejorative term, Haddow nails what it means.

This brings me to Haddow’s second point. He argues that hipsters, because of his definition, are leading to a dead end in cultural and societal development. This claim is flawed for two main reasons: it imputes a negative trait to a largely indefinable group, and it ignores a good amount of history in marketing and culture.

Firstly, hipsterism, as I understand it and explained it above, is a negative quality of culture that is currently pretty dominant. It would be fair to say that this negative trait is bad for culture, but so is the vapidity that made “Epic Movie” profitable and the selfishness that lets 3rd world children starve. In other words, while self-interested superficial recentism is certainly a bad thing, it, more often than not, comes in concert with many other things, like political awareness, political action, and the development of art and culture. Hipsterism is just a rather unfortunate side effect of being young and trying to be cool.

That brings me solidly to my second point: the emphasis on recentism and superficiality is not new. It is what drives advertising and marketing. But even if you don’t care for advertising and marketing, you need look no further than the politically influential movements of the past to see hipsterism at work. The beat movement, arguably, had elements of hipsterism. So did the hippie movement (no surprise there) and the grunge movement. That the modern era has elements of this does not eliminate its relevance or spell doom for culture. If anything, the current election serves to demonstrate the influence of young voters and young, politically active students, despite the fact that some of them evince some elements of hipsterism.

So in essence, Haddow’s article is first and foremost well written and a good read. It even gets a lot of things right about hipsters. But the article arrives at some false dichotomies and some false conclusions. Haddow has fallen prey to the common belief that things are totally different now than they ever have been before. The call has gone out many a time in the past that Western Civilization was being threatened by “those crazy kids” and their “rebellion without a cause.” But 50 years has shown us that many of these movements weren’t death knells, but cultural cornerstones.

All that said, unfortunately for Haddow, this article will likely be forgotten long before the political influence of hipsters is forgotten.

NOTE: Thanks, Melanie, for the article. And also, see hipster runoff for examples of why hipsterism is hilarious, interesting, important, goofy, meaningless, and ridiculous.

Sunday, September 7, 2008

J.J. Abrams Talks "Fringe," Shark Jumping

(above: press image for "Fringe," yoinked from io9.)

J.J. Abrams did a conference call with press about his newest show, called "Fringe." He's the guy who sort of jump-started "Lost," "Alias," "Felicity," and "Cloverfield," and is now working on the new "Star Trek" film. So he's got some serious geek-cred and is well-established as a relevant voice in modern television.

That's why a conference call with him about his upcoming television work is a big deal. And in this particular call, he said something that I thought said a lot about what people expect from television and how he delivers it. Specifically, he said that "Fringe" was going to be "jumping the shark" early and often. What he means is that the show will take the things we see as indicating the outlandishness of television and bring them out early.

For example, as Abrams says about "Lost," he brought out a weird, supernatural monster and a polar bear very early in that series to establish just how far the show was willing to go. He doesn't mention "Alias," but in that show, he, within the first episode, killed off a character that was shaping up to be a main force in the show. Both are great examples of doing really stupid, really strange things very early in a series to shake up expectations of the show.

After seeing the pilot cut of "Fringe," I can verify that Abrams does take similar risks and shows you stuff you do not expect from this show within this first episode. It starts like a police procedural and drifts quickly into "X-Files" type territory, hitting some big twists and unexpected plot points, even within the first half of the pilot.

I guess this might be the thing that makes an Abrams project unique. Most shows establish early in their first season that they have set boundaries they are going to live within. They do this by not surprising us too much in those early episodes. Abrams never really goes this route. The reason "Lost" blew up was that we never really did establish what the show exactly was about, let alone what genre it was. We still haven't really found this out.

"Fringe" is set up in a similar way. It hovers through a few genres and shows us a number of things that just can't make sense yet. It leaves us only comfortable enough to accept the confusion it creates. While this is a great strategy for making intelligent, challenging, ground-breaking television, "Lost" might have been the first time this strategy translated into giant viewing numbers. ("The Prisoner," for instance, via similar techniques, had some followers, but didn't score the same share and revenue as "Lost" has managed to score.)

It's nice to get a little insight into how Abrams exploits this tension between the comfort of the expected and the thrill of the unexpected in his shows. Even if it's not as brilliant as "Lost," "Fringe" pays off on this philosophy. You settle into the rhythms of the show just long enough to be really pleased with the syncopation.

Head over to io9 for their coverage of the conference call about "Fringe."

A Brief Note: J.J. Abrams also tried this approach much less effectively with the ill-fated show "Six Degrees." I personally liked it, and I could tell it was struggling to get out of its drama-rut and into more interesting Abrams territory. I guess it just never got a chance to get out of that rut before it was cancelled.

Monday, August 25, 2008

"Stuff White People Like" in a Special Feature Called "Stuff Stephen Sorta Likes Now"


(above: "Stuff White People Like" founder, Christian Lander, with a bunch of things that white people like)

Ok, this site is growing on me. Very seriously. I stand by what I said previously, but after reading the Onion A.V. Club interview with the creator, the site makes more sense. And also hits home a bit more. I think "Stuff White People Like" is still a stupid name, and the idea that this is how white people behave is ridiculous. But the idea that the website is criticizing the things its creator recognizes are ridiculous about himself is much more fascinating.

The guy describes his site as a mirror for himself, to see that most of the stuff he likes is ludicrous and deserves a little lighthearted making-fun-of. In this context, a site called something closer to "Stuff We Like" might give his message more force. That so many people find its stereotyping legitimate and funny is still a little disconcerting. The site's popularity still relies on a culture that loves simple and easy categorizations or generalizations.

But the creator seems less concerned with overgeneralization and stereotyping than with a little self-deprecation. When I see the site as he describes it, as the mirror that its creator intends it to be, the content is really growing on me.

An example, from the article on "The Wire": "In white culture, giving away information about a film or TV series is considered as rude as spitting on your mothers grave. It is an unforgivable offense."

Again, I'd prefer something like "In our culture..." or "For many people...", and I would also prefer some more elegant prose, but there is no denying the humor in this ridicule of an obviously idiotic, and obviously quite widespread, attitude.

OK, I admit. I am now almost a fan of "Stuff White People Like." Crap.

Thursday, August 21, 2008

HP Ads: Celebrities and their Data



Over at Global Nerdy, a blog I do not follow (I found the link through BoingBoing), they've collected some great YouTube videos of celebrity endorsements for computers. Above is the Jerry Seinfeld clip from HP, which is pretty funny. The rest of the collected HP clips are at the bottom, and they stand out from most computer ads in that they show people interacting with data, not props. Usually, they don't even show the faces of the celebs; they just let the data about them speak to who they are. The photos, music, and information of these people are presented in a literally hands-on fashion. It can be pretty fascinating to watch.

I could go on about identity through data, the increasingly "personal" nature of computers, or even a personal favorite topic, information aesthetics, but I'll just let you watch and enjoy these clever ads.

LINK!

More Joss Whedon Fanboy Ranting: Buffy Episode "Family"

(above: a still from the "Buffy" episode entitled "Family")

I will apologize up front for going off in a very fanboy direction in this post, focusing on specific details of a specific episode of a specific television show. I just happened to really like the "Buffy" episode I saw tonight. It's called "Family," and it gets right what a lot of teen dramas get wrong.

Let me first do a short, spoiler-free post, and then I will move on to the bigger stuff. Basically, now-regular cast member Tara is celebrating her birthday when her family shows up and reminds her of some troubling things in her family history. The way that Tara and the gang of other regular Buffy supporters, known affectionately as the Scooby Gang, deal with Tara's uncertainties and her family reveals a lot about what makes the show work: the genuineness of the interaction between characters. The episode contrasts Tara's real, backwards family with her much more accepting and loving "family" in Sunnydale. "Buffy" excels as a "Doctor Who" style mythology-driven show as well as a high school drama, but when those two elements combine to bring out fully-developed characters and their interactions, as in this episode, that is when the show is at its best.

Ok. There is the spoiler-cleansed version. Now below, spoilers follow, so do not read this unless you have seen the episode. I mean it. It's some MAJOR spoilerage for people that have not seen the episode or the ones leading up to it. It will ruin your experience of the episodes leading up to this one as well as this one itself. Please return after watching.

In the previous few episodes, the recent arrival of Dawn in the regular cast and in Buffy's life has challenged the notion of family in the series. Buffy's recent discovery that Dawn isn't actually family shook up Buffy's perception of what family is supposed to be. In the previous episode, however, Buffy expresses her feelings to Dawn that she will always see her as a sister, even if her actual status as family isn't entirely legit.

So when Tara's witchcraft-hating family shows up in Sunnydale, Tara has to deal with the same kinds of issues. Her father tries to pressure her into leaving her now-close friends so as not to expose them to her dark side or her more insidious practices of witchcraft. Of course Tara feels pressured by her family, thus setting off the more conventional Buffyesque plot, including spells to hide this demonic side, development of the season's major plot, and some stuff with Spike and Harmony, etcetera.

The heart of the episode, however, is not in the witchcraft and slaying plot, but in the story of Tara slowly realizing that an oppressive family is no family at all in the end. The Scooby gang, through a touching and pitch-perfect confrontation with Tara's father, show that Tara has found a new, more supportive family with her friends and the love of her life, Willow.

I'm sort of glossing over the obvious parallels between a family that is unsupportive of a witch daughter and a family that is unsupportive of a lesbian daughter. Cousin Beth (deftly played by clearly brilliant actress Amy Adams) even accuses Tara of living "G-d knows what kind of lifestyle." But the fluidity of romantic relationships crops up a lot in Buffy (and Doctor Who and Torchwood), so I'll deal with that, possibly, in another post.

Besides, this fluidity of romantic love is outshined in this episode by the fluidity of the concept of family. The capstone on the episode comes when Tara has chosen her supportive Sunnydale family over her paranoid and somewhat abusive actual family. In the final scene, Willow acknowledges that she is proud of Tara for becoming a fully formed individual, a woman of power and individuality, despite coming from a pretty messed up family. Tara seems to attribute this new found inner strength to her Sunnydale family, specifically Willow, in the final dialogue of the episode.

Whedon has stated that a lot of "Buffy" is about self-actualization, specifically with his female leads. He has acknowledged that he developed the concept as a response to the horror movie cliche of the helpless girl who screams a lot and is run down and killed. As a result of this impetus for the series, some of the strongest moments in "Buffy" consist not of big boss fights or killing, but of characters finding comfort with themselves and with the important people in their lives. "Family" is no exception; seeing Tara come into her own and find a new "family" is a very rewarding experience.

Wednesday, August 6, 2008

"Dr. Horrible's Sing-Along Blog" Better than LonelyGirl15


(Above: promo image for Dr. Horrible)

Ok. "Dr. Horrible's Sing-Along Blog" is a super hero story. It's also a sort of drama-adventure. It's a love story. It's certainly a comedy. And it's generally pretty awesome.

And even while "Dr. Horrible" is pretty brilliant and genre-defying, it's also format-defying. It's a video diary, a weblog, a 3-episode internet television series, a movie, and a musical. It's available for paid download on iTunes. It's available streaming free from Hulu. It's coming out on an extras-laden DVD, including musical commentary. It's even going to be a soundtrack CD.

And all of this from master of teen sci-fi and fantasy Joss Whedon as a diversion during the writers' strike. Whedon and his pals (including Neil Patrick Harris and Nathan Fillion) had nothing to do during the strike, so they did this. It was released in three parts with 2-day intervals between each part, and now that it's done, it's modestly sweeping the entertainment world, online and off. "Dr. Horrible" was inexpensive to produce, it's high quality, and it has reached a wide audience over the internet alone. Not bad for people who were just out to make each other laugh at a stupid super villain story.

Something like this has to have an impact on how television is made and distributed, doesn't it?

Go watch the trailer at YouTube, then the whole thing at Hulu. You won't be disappointed; this internet-only series created by a group of friends is clearly leaps better than most television series created by leagues of professionals and executives. And stay tuned: I mentioned an upcoming DVD and CD, but there are also whispers about a possible Act 4...

NOTE: I also mentioned LonelyGirl15 in the title of this post, and since that prototypical blog-fiction series just came to an end, expect a post about it in the nearish future. Or maybe about blog-fiction in general...

Tuesday, July 29, 2008

mwesch's Digital Ethnography Introduction to YouTube


(above: screenshot from MadV's "One World" campaign, via the wikiwiki)

One of my personal favorite thinkers working with emergent technology and culture right now, Professor Michael Wesch, has posted a new video on his YouTube account. He has previously tackled topics such as the meaning of Web 2.0 and technology's role in involving students in learning. His work inspired me to do my final senior thesis on Web 2.0 in secondary education (I can post that if there is an interest...).

And now Professor Wesch has released a new video, over here. This lecture and accompanying video presentation is intended to be an overview or introduction to the community that YouTube supports and has created. It makes sense, the video seems to be saying, that if YouTube is a community, anthropologists should be able to examine and think about this community. Thus, Prof. Wesch and his digital ethnography group tackled this task, and the result is summed up in the video. It's an hour long, but it's one hell of an hour.

The video also references some other fascinating user generated content in its exploration of YouTube as a culture and community. Some of the best examples are a cut-up tribute to remix and DIY culture set to a Regina Spektor song, a compilation video of various YouTubers displaying their distilled philosophies and mantras on the palms of their hands, and a particularly touching message from a community member that has found support within this community in a time of hardship, a time soon after the death of his infant child.

These examples are touching, life-affirming and often brilliant. The things that I love about YouTube are summed up in Prof. Wesch's talk on the community.

I offer one final entry to the overwhelming bestiary that is the multitude of flavors of community-building on YouTube: the Vlog Brothers. In a subsequent entry, I will talk more in depth about these guys, but here is one example. In this video, over here, writer John Green geeks out over "The Catcher in the Rye" and calls for an English-lesson-style online book club to talk about it. Not just an "I like it / I don't like it" discussion, but a literary analysis.

What is amazing about the video is not that John Green has the audacity to suggest that a group of people do a non-mandatory summer reading project on a classic book, but that people, as indicated in these video responses, will actually do it. People are looking forward to it.

So even Professor Michael Wesch can't begin to imagine the ongoing effects of our emergent technologies becoming our culture. That's why it's so fascinating; none of us can imagine it, but it's happening anyway, and it's making a difference in how people live.

Thursday, July 17, 2008

New Radiohead Video Claims It Uses No Cameras, But...


(Above: a shot from Radiohead's new "House of Cards" video)

Radiohead claim their new video for "House of Cards," available here, uses no cameras. Let's first recap what cameras are, then talk about their process, and, if it still seems necessary after both of those discussions, talk about why "no cameras" may not have been the best choice of words.

A camera is a device that collects reflected light, then structures the collected information into a two-dimensional portrayal of three-dimensional space.

The device used for Radiohead's video, as explained in the behind-the-scenes clip here, is a device that collects reflected light (in this case, laser light bounced back from the subject), then structures the collected information (using computers) into a three-dimensional-seeming, two-dimensional portrayal of three-dimensional space.

So in other words, the "non-camera" used here is still basically a camera, but saying "a music video made with no cameras" sounds better than "a music video that uses fancy new and complicated cameras."

The whole thing also reminds me of this Polyphonic Spree video, a music video made with "no video whatsoever." In actuality, it's just successive still images shown quickly one after another to create the illusion of motion. Or, wait, that actually sounds a lot like the definition of motion pictures.

In both cases, the processes are noteworthy for being creative approaches to film, the former reaching forward to 3-D imaging and the latter reaching backward to the early days of film, but both use different types of cameras despite being hyped as camera-less. Another lesson in never believing the hype, I suppose.

Don't get me wrong, both come out looking pretty cool.

But here's the fascinating part about Radiohead's video: the whole thing had to be created from a giant data set, one that could be used for many possible applications and visualizations. As a result, Radiohead have released the entire data set from which the video is constructed over at Google Code. The data set is available for free, and now anyone can use it to construct their own version of the 3-D imaging video. Best part is, people are actually doing it!
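If you're curious what "using the data" might actually look like, here's a minimal sketch in Python (not the official viewer, and not affiliated with the project). It assumes each frame of the released data is a plain CSV file with x, y, z, and intensity columns, and the filename below is just a placeholder; it reads one frame and plots the points as a flat scatter, brighter where the scan intensity is higher.

import csv
import matplotlib.pyplot as plt

def load_frame(path):
    """Read one frame of point data: rows of x, y, z, intensity."""
    xs, ys, intensities = [], [], []
    with open(path, newline="") as f:
        for row in csv.reader(f):
            if len(row) < 4:
                continue  # skip malformed or empty rows
            x, y, z, intensity = map(float, row[:4])
            xs.append(x)
            ys.append(y)
            intensities.append(intensity)
    return xs, ys, intensities

# "frame_00001.csv" is a hypothetical filename; point this at any frame from the set.
xs, ys, intensities = load_frame("frame_00001.csv")
plt.scatter(xs, ys, c=intensities, s=1, cmap="gray")
plt.gca().set_aspect("equal")
plt.axis("off")
plt.show()

From there, swapping the flat scatter for a rotating 3-D view, or animating across frames, is mostly a matter of taste, which is exactly the point of releasing the data.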

NOTE: Stereogum brought this video and its circumstances to my attention.

Thursday, July 10, 2008

The Philosophy of the Possible: Why "Solaris" Succeeds


Above: an image from "Solaris"
A quick warning: this review contains no major spoilers, but it does contain minor ones.

I finally saw Steven Soderbergh's version of "Solaris," my first exposure to this story. While many critics saw the film as succeeding or failing to varying degrees, many agreed that it was essentially a space-bound love story. But for those of you who felt the love story was rather undeveloped, or that the ideas in the film seemed more complex and interesting than the plot let on, I can assure you that you are not alone.

The story is simple enough science fiction: a new planet dubbed "Solaris" has been discovered, but something strange is happening to those researching it. The something strange, in this case, is that people who are important to those researchers begin appearing in the flesh on the research satellite above the mysterious planet. In George Clooney's case, it's his long-dead wife. For another researcher, it's his still-alive but earthbound son. And for some, it's far more complex.

The thread uniting these apparitions is that the resurrected individuals are among the most important people in the lives of those to whom they appear. They also remember their lives as their counterparts remember them, whether that memory is accurate or not.

Given the current state of scientific discovery, there are many right answers (and many wrong answers) to the question of how these beings actually appear. Even a little scientific forecasting can give us an interesting story about Higgs bosons and Higgs fields holding these new creatures together (this is the route the film takes, in fact). But eventually that science will be proven wrong or right. Eventually the science becomes no more than a MacGuffin, nothing more than the "Higgs field" holding the movie together. And finally, as science catches up to science fiction, something remains beyond those predictable scientific explanations, something more lasting and more intriguing.

That remaining something in "Solaris" is a series of questions, some about humanity and some about things above and beyond what humanity can see or figure out.

Regarding the former, the obviously captivating question is whether these apparitions are "life," are "human." The film asks whether these newly created individuals, whose experiences are limited to those remembered by other people, are actually people themselves. Sure, they are facsimiles of the dead or departed, but are they just copies? What does it mean to kill them? What does it mean to love them?

But even while these provocative questions about memory, loss, guilt, humanity and psychology drive the film's narrative, an even more fascinating question from the second category hangs just behind the veil of the space opera: What are these people? How are they formed, apparently, out of something so ephemeral as memories? And why do they pop into existence in this place alone?

In short, the driving question, the question that drags "Solaris" out of pulp sci-fi territory, even out of psychological thriller territory, and into the philosophy of the possible, is Why?

The location and mechanisms of the apparition phenomenon imply that the planet, Solaris, is creating them. The planet is apparently receiving something from the people studying it from above, decoding it, and sending back something else, something it has encoded. In short, Solaris is trying to communicate.

And that is what fascinates me most about "Solaris." While some science fiction amounts essentially to stories told in space, great science fiction proposes fundamental questions about the world and ourselves and then ventures a guess as to how they might, or might not, be answered. "Contact" did this in a similar manner. "2001" did, too. Even "Sunshine," while not universally loved, attempted this kind of ideological exploration.

In one telling exchange, George Clooney's character asks one of the apparitions what Solaris wants. The film recognizes the inherent human bias in expecting any intelligence to want something from us, and the apparition responds, "Why do you think it has to want something?" "Solaris" is willing to get past the trappings of conventional science fiction and ask this difficult question, among a plethora of other equally interesting ones.

This kind of fundamental exploration of what it means to be alive, this kind of casting off of our human biases in favor of something inexplicable, is what makes "Solaris" a truly lasting, truly invigorating science fiction movie.

NOTE: Stanislaw Lem, author of the original novel from which "Solaris" is adapted, has also expressed concern about calling the film a "love story." The ideas he discusses in this article underscore just what makes "Solaris" far more interesting than your average pulp science fiction story.

Wednesday, July 9, 2008

"Stuff White People Like" in a Special Feature Called "Stuff Stephen Dislikes"


Above: the McDonald sandwich, at one point the Most Expensive Sandwich.

You know what white people love nowadays? They love "Stuff White People Like."

Here's some background. This website has been gaining popularity over the course of almost a year now. It purports to chronicle the things that every white person likes. This includes fancy sandwiches, graduate degrees, and girls with bangs. The site has been emailed around from white person to white person, each one laughing more than the last. The site's popularity culminated with the release of a book by the same name early this month.

But something is wrong. Sure, the site has become more and more popular, but the vast majority of white people have probably never heard of it. For instance, a portion of the white community doesn't even have internet access at home. An even larger portion can't afford "fancy sandwiches." In fact, the number one thing that white people like is not "outdoor performance clothes" like "North Face." It's probably NASCAR.

It's true, the white people who send funny emails to each other and spend most of their days on the internet and drink Starbucks love the things that "Stuff White People Like" tells them they love. But the white people who live in small towns and know how to care for cows and can't afford a latte every day wouldn't even understand the appeal of "Stuff White People Like."

Maybe the site is just misnamed. Maybe it means "Stuff Upper Middle Class / Yuppie White People Like," though that also might prove to be untrue. But I think the problem goes to something deeper, something more human than just the name the creators chose.

The problem comes from another thing that a lot of people, white or not, like: a lot of people like oversimplifying things. A lot of people like categorizing each other. They like other races to be homogenized, separate. They like stereotypes.

And nowhere is this more evident than on "Stuff White People Like." The site even mentions the white person's apparently hypocritical love of diversity, saying that white people think having easy access to multinational restaurants is the height of diversity. This article explains the tendency to view other cultures as something to "Experience," something definable, something purchasable. But the site's very premise is to oversimplify the white population in a similar manner, reducing the individual to a stereotype.

The black population in America has worked long and hard to overcome society's pressure on them to fit into categories, to be simple and to be what we expect them to be. Their struggle is not over, and it remains one of America's lasting problems. But a problem like that cannot go away in a society that lets our desire to describe and simplify our world affect the way we see and treat each other.

I would like to think we live in a world that celebrates diversity, not makes a sham of it, not discourages it. Sure, "Stuff White People Like" is good for a laugh. But then again so were blackface minstrel shows back in the day. Does that make either one less damaging?

NOTE: for a much more well-thought-out article hitting on some of the same themes as this post, see here. Here's a quote: "The reason the phrase 'it's funny because it's true' has become a shorthand for things that are neither (a) funny nor (b) particularly true is because humor is rarely truly satirical when its targets also make up the bulk of its audience."

Tuesday, July 8, 2008

Commentary on Audio Fidelity in the Form of a Coldplay Review


Above: Coldplay's newest record.

I am a Coldplay fan. I am unashamed to admit it. I consider my "guilty pleasures" to simply be "pleasures," and Coldplay has been one of those pleasures for many years, dating back to early high school.

And whether you are a fan or not, it'd be hard for you to get through the months preceding a Coldplay record release without hearing about it. Mainstream media covers it, YouTube offers sneak peeks, posters go up, and eventually an epic Mac ad comes out featuring those doe-eyed British songsters.

This burst of market exposure, Coldplay-related or otherwise, is certainly a product of our times. Widely available media translates directly to widely available marketing planks. It also translates to widely available digital bootlegs and pre-release streamed copies of records. But another unfortunate side effect of the explosion of exposure space is the diminished fidelity of any audio this exposure brings us.

Let me illustrate this general principle with the specific case of Coldplay's newest record. The record was made available as a stream a few weeks before release, so many people got their first taste of the new material from a low-quality streaming music player on Coldplay's website. I personally got my first taste this way. Which is too bad.

After getting my CD copy of the record and taking some time to climb inside it and explore the space it creates, I found that Coldplay's songwriting abilities have not improved drastically since their inception. I found, instead, that this new record (entitled "Viva la Vida or Death and All His Friends") is a sonically complex and invigorating album.

From the opener, "Life in Technicolor," the blippy synthesizer serves as a sort of beacon in the middle of a cavern of other sounds, the waves of other synthesizers crashing all around as the dulcimer-sounding rhythms and acoustic guitar push the song wider and wider.

The opening instrumental is also a microcosm of the rest of the record. Every track finds a way to carve out an interesting sonic territory and then proceeds to push to the edges of that territory. The Brian Eno production is no doubt responsible for much of this, but the band manages to provide some fascinating sounds and textures for Eno to work with. The songs are essentially as well written as anything on "A Rush of Blood to the Head" and certainly better than almost all of "X&Y," but the fabric of these songs is something new for Coldplay; the band is using the bricks of its arena-rock and adult-contemporary image to build something more intricate, more expansive.

Now flash back to my first taste of the record. I was sitting in front of my computer, Coldplay.com widget loaded and mouse on the play button. The suspense was killing me. Then came my first audio taste of "Life in Technicolor."

What a disappointment. Through that compressed audio stream, "Life in Technicolor" sounded like a muddled "X&Y" outtake. It flitted by, leaving little to no impression. And the rest of the album delivered nothing spectacular, with the wide expanse of "Lost!" sounding more like a drone and the punch of "Violet Hill" sounding like a listless, empty pop song.

Imagine my surprise when I finally got my hands on the full-quality CD. The album came alive, expanding in size and scope. The songs were still a little bit of a disappointment, but the sound and the structure revealed the sonic spelunking expedition the record actually is.

I'm not going to claim that my experience is typical of the average Coldplay listener. On the contrary, most people probably don't care what level of fidelity their music comes in; they just want the music. Good on those people. I just hope that this trend of low-quality YouTube videos and crappy record streams doesn't change the sound of my music any more than it already has.

NOTE: I had the same experience with Sigur Ros's new record, which I now believe is one of their finest. If anyone has had similar experiences, I would love to hear them.

Tuesday, July 1, 2008

Fine, Then Jack Kerouac Isn't a Writer Either

(image source: the Sports Oasis)

The Bissinger vs. Deadspin adventure is basically over, and Deadspin has marked the occasion with a level-headed email exchange with Mr. Bissinger. This one is not nearly as interesting as the video I posted about a few weeks ago, because a) there is not nearly as much swearing and yelling, and b) the two parties have already made their positions clear. But, as I told my friend Hudi (thanks for the link again, by the by), the email exchange serves the important purpose of showing that, even without the histrionics, Mr. Bissinger is no less wrong.

Check it out here.

Also note, Mr. Bissinger says that what bloggers do "isn't writing. That is just vomiting on the page." But the great classic writer Truman Capote once said that Jack Kerouac's writing style in "On the Road" was not writing, just typing. Kerouac is now considered by some to be as entrenched a classic writer as Capote...

Thursday, May 29, 2008

This Post is Entirely Safe For Work

(Drew Curtis, founder of Fark)

The popular trademark blog Trademork covered a story about the phrase "not safe for work" late last year. Apparently, popular news blog / forum Fark.com applied for a trademark to protect their use of the acronym "NSFW." The application was filed in November of 2007 and was denied in March 2008.

I guess there are three ways to view this. One is that Fark wanted to protect their use of the mark, that they wanted to be the only ones permitted to use the acronym. This seems pretty unlikely, since the acronym is so widely used that Fark would have a hell of a time trying to sue everyone who uses it. Fark also officially issued a statement that this was not their intention (here).

Another possibility: Fark had an elaborate joke planned that uses the trademark. Even if this is true, as they hint in their statement, I can't imagine it being very funny.

The way I see it, this case is an example of the trademark process working successfully. The mark is so widely used by internet forums and blogs that it has no particular association with Fark. Maybe Fark thinks that when people think of content that is "not safe for work," they think immediately of Fark. But it seems to me Fark just doesn't understand the real function of trademarks. Trademark isn't a method for getting "squatter's rights" to a phrase just by being the first to use it. It's a method for protecting a brand.

People tend to see a trademark as a big stick to hit people with if they use your mark. But trademark should probably operate more like a wall, a structure that separates companies and their brands from each other. It grows naturally from the culture of branding in modern commerce, and it operates in the marketplace the way walls operate in a mall: the stores are kept separate so that customers can tell them apart and choose which they will use. This is the ostensible function of trademark law, too.

That's why a trademark on such a functional phrase just doesn't make sense in this context. The name "Fark" and any symbols associated with it are trademarkable but "NSFW" isn't, in the same sense that "Starbucks" is trademarkable but "hot coffee" isn't.

References:
Trademork Article
Trademark Office Documents