Rumination


I can’t believe I’ve been “blogging” for over seven years. How the hell did that happen?

Actually, I think it's been longer. If I remember correctly, my first blog was on some service whose name I simply cannot remember now, until I ran across Blogger in 2000. Then I switched, using their service to run a blog I hosted on server space my then-employer let me use for free; they even let me use their nameserver for my domain name … drewspace.com. That name has since gone to someone or something else. But I did manage to pull all the old archives into my web space here. Here are the first posts I have a record of, from August 2000.

This boggling (bloggling?) stretch of time occurred to me once I saw Ross Mayfield’s recent post about how he’s been blogging for five years. Of course, he’s much more industrious than I, what with a company of his own and writing that’s a heck of a lot more focused and, well, valuable. But of course, social software has been his professional focus for quite a while, whereas for me it’s been more of a fitful obsession.

“Social software” is turning out to be the monster that ate everything. Which only makes sense. The Web is inherently social, and so are human beings. Anything that better enables the flow of natural social behaviors (rather than more artificial broadcast/consume behaviors) is going to grow like kudzu in Georgia.

Anybody thinking of social software as a special category of software design needs to wake up and smell the friends list. Everything from eBay to Plaxo is integrating social networking tools into their services, and Google is looking to connect them all together (or at least change the game so that all must comply or die of irrelevance).

Poor old blog

I looked at my blog (this thing I’m writing in now) today and the thought that surfaced, unbidden, was “poor old blog.”

I felt bad because I haven’t been writing here like I used to, so sure I get the “poor” part — poor pitiful blog that isn’t getting my attention.

But where on earth did “old” come from? Besides the fact that “poor old whatever” is a common figure of speech, it felt a little shocking coming to the front of my brain while looking at a blog. I mean, I wouldn’t say “poor old iPhone” if I hadn’t picked one up for a week (and if I owned one to begin with).

I mean, blogs are still new, right?

But here he is (I'm convinced my blog is a "he," but I have no idea why, really). My blog, like all the other blogs, just taken for granted now. Blogs — part of the permanent landscape, like plastic grocery bags and 24-hour gas stations.

It was such a big deal not long ago, but now here they are, blogs, sitting around watching younger, nimbler channels giddily running around their feet without a care in the world. The Twitters, Jaikus, and Facebook apps. The Dopplrs, Flickrs, and the rest.

It’s like somebody took a hammer to the idea of “blog” and it exploded, skittering into a million bits, like mercury.

So that’s what’s been up. I’ve been twittering, facebooking (yeah, it’s a verb, as far as I’m concerned), text-messaging… even the occasional “instant message” through the venerable old AIM or iChat, even though now that’s starting to feel as antiquated as a smoke signal or carrier pigeon.

Whatever chance I ever had of keeping my focus long enough to write sound, thorough paragraphs has lately been eviscerated to a barely throbbing stump.

I wonder if my poor old blog will rally? If it’ll show these whippersnappers it’s not done for yet? Like in the sports movies, you know, where the old batter who everybody thinks is all washed up slams another one over the bleachers?

I don’t know. All I know right now is, there’s my blog. With its complete sentences, its barely-touched comment threads. Its antiquated notion of being at a domain-named location. Its precious permalinks & dated archives, like it’s some kind of newspaper scholars will scan on microfiche in future generations.

Doesn’t it know that everything’s just a stream now? Everything’s a vapor trail?

Poor old blog.

Conference fatigue: I have it. I've been to several in the last six months, and I'm now burned out. For a few months, anyway.

But UX Week was actually very good. Top-shelf, in fact.

So, there it is, a glimmer on my blog that I am in fact still alive. I have lots of bloggy ideas, but they’ll have to wait until I can think in more than 2-3 sentences at a time.

[Image: OK Computer album cover, via Wikipedia]

I don’t know the exact release date, but I do know that it was right about ten years ago that I first heard OK Computer.

In May of '97, I had just finished my MFA in Creative Writing at UNC Greensboro. But I had no job prospects. I'd had a job lined up at a small press out of state, run by some dear friends and mentors of mine, but money issues and a new baby meant that, at the last minute, I had to turn the opportunity down. (I handled it horribly, and lost some dear friends because of it.)

My future, or my identity in a sense, felt completely unmoored. The thing I’d assumed for two years I’d be doing after finishing my degree was no longer an option; I’d fallen out of love with teaching, and didn’t really have any good opportunities to do that anyway. All I had going was this nerdy hobby I’d been obsessing on for some years: the Web.

So, I needed a job, and ended up talking my way into a technical writing gig in the registrar’s office of my MFA alma mater. I wouldn’t be editing poems and fiction for a press or a journal (as I’d gotten used to doing and thinking of myself as doing) but writing tutorials and help instructions for complicated, workaday processes and applications. But at least I’d be on the “Web Team” — helping figure out how to better use the Web for the school. I’d been working with computer applications, designing them and writing instructions for them, off and on in my side-job life while I’d been in grad school, so it wasn’t a total stretch. It just wasn’t where I imagined my career would take me.

That summer, in a fit of (possibly misplaced) optimism and generosity, my new employer sent me to a posh seminar in Orlando to learn better Photoshop skills. And one of the presenters there was the guy who made some of the most collected X-Files trading cards around, and an acknowledged wunderkind with digital mixed-media collages. (Cannot find his name…)

As I was waiting to see this guy’s presentation, and people were filing into the presentation room, he was setting up and had a slideshow of his creepy graphics going onscreen. And this spooky, ethereal, densely harmonic, splintery music was playing over the room’s speakers. I was feeling a little transfixed.

And, of course, when I asked him later what it was, he said it was Radiohead’s OK Computer.

Here's the thing: I'd heard Radiohead interviewed on NPR by Bob Woodward about a month or so before, where they discussed the new album, the band's methods, and how they recorded most of it in Jane Seymour's ancient country mansion. They played clips from it throughout, and I remember thinking, "Wow, that's just too over the top for me … a little too strange. I guess I won't be getting that album — sounds like experimentation for its own sake."

It’s just one of a thousand times this has happened to me — conservative, knee-jerk reaction to something, only to come to embrace it later.

Something about this music connected with me on a deep level, at that time in my life and with a lot of things going on in my own head. It *sounded* like my own head. To some degree it still does, though now it feels more like a remnant of a younger self. The music still feels quite right, quite relevant now, but I hear different things in it.

So. This just occurred to me. Had to share. I’m on record as a huge Radiohead fan, even though I realize this isn’t exactly a unique thing to be. I’ve found every release of theirs to be fascinating, challenging, and rewarding once it has a chance to settle in. (Not a huge fan of Thom Yorke’s solo effort, but I’m glad it’s out of his system, so to speak — then again, who knows, four years from now it may be my favorite thing ever.)

They have a new album coming out sometime this year, if all stars align correctly. Can’t wait.

Wired has a great story explaining the profound implications of Google Maps and Google Earth, mainly because these maps don't have to come from only one point of view; they can capture the collective frame of reference of millions of users across the globe: Google Maps Is Changing the Way We See the World.

This quote captures what I think is the main world-changing factor:

The annotations weren't created by Google, nor by some official mapping agency. Instead, they are the products of a volunteer army of amateur cartographers. "It didn't take sophisticated software," Hanke says. "What it took was a substrate — the satellite imagery of Earth — in an accessible form and a simple authoring language for people to create and share stuff. Once that software existed, the urge to describe and annotate just took off."

Some of the article is a little more utopian than fits reality, but that's just Wired. Still, you can't deny that it really does change, forever, the way the human geographical world describes itself. The main thing, for me, is the stories: because we're no longer stuck with a single, two-dimensional map that can only speak from one or a few frames of reference, we can now look at a given spot of the earth and learn its human context — the stories that happened there to regular people, or to people you might not otherwise know or understand.
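
(A side note for fellow web nerds: the "simple authoring language" Hanke mentions is, I'm fairly sure, KML, the XML dialect Google Earth reads. Just to show how low the bar is, here's a minimal sketch of a volunteer annotation: a single placemark written out with plain Python. The name, description, and coordinates are all hypothetical, made up for illustration.)

    # A minimal sketch of the kind of annotation a volunteer might share:
    # one KML Placemark. (KML is, I believe, the authoring language Hanke
    # is describing.) Every value below is invented for illustration.
    placemark = """<?xml version="1.0" encoding="UTF-8"?>
    <kml xmlns="http://www.opengis.net/kml/2.2">
      <Placemark>
        <name>My grandmother's farm</name>
        <description>Where the family stories happened.</description>
        <Point>
          <coordinates>-79.7920,36.0726,0</coordinates>
        </Point>
      </Placemark>
    </kml>"""

    # Save it; Google Earth can open the resulting file directly.
    with open("annotation.kml", "w") as f:
        f.write(placemark)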

It really is amazing what happens when you have the right banana.

Vegas Lingers

[Photo: This is Vegas]

It’s easy to overlook them. The Skinner-box button-pushers, watching the wheels roll and roll. Surrounded by a ‘paradise’ that still leaves them wanting — and thinking they’ll find it like this.

Vegas was a mixed bag. I guess I’d always seen so many glamorous photos and film shots, even the ones that tried to be ‘gritty’ still managed to put a sort of mythic gleam over everything.

But it's not mythic. It's plastic. It's the progeny of a one-night stand between the Magic Kingdom and TGI Friday's. Inescapable throngs of flip-flopped, booze-soaked denizens, eyes bugged wide by the promise of … what? I'm not even sure. Entertainment, certainly, but another flavor invades, the way saccharin crowds out and leaves a film over every other flavor. Luxury, perhaps. Richness of the kind that first comes to mind when someone says "rich": Trumpism mixed with Hollywood ersatz.

I don’t mean to be so down on it. Really. I’m a big fan of decadent, crazy, outrageous kitsch. But this somehow was so overwhelming, it wasn’t even kitsch. (Definitely not camp.) Now I understand why U2 filmed the video for “I Still Haven’t Found What I’m Looking For” here so many years ago — and that was before it was injected with virtual-reality steroids.

The conference was terrific (except for having trouble escaping the waves of noise and humanity to have a decent conversation). I’m amazed the team made it come together as well as they did given the circumstances.

Fortunately, most of the time was much happier than I’m letting on here. Check out my iasummit2007 Flickr stream.

For the record: I know plenty of people enjoy Vegas a great deal, and they have fun gambling and seeing shows and everything, and I think that’s actually really great. Some of my family enjoy doing it from time to time, and they seem to always come back smiling. I think it just hit me in a strange way on this trip — but I’m always like that; if there’s a silver lining I’ll find a cloud. I just can’t help noticing the souls that seem to be a little lost, a little vacant behind the marquee-reflecting eyes. But hey, that’s just me.

Colleague Michael Magoolaghan passed along a link to the transcript of Tim Berners-Lee’s testimony before Congress.

Hearing on the “Digital Future of the United States: Part I — The Future of the World Wide Web”

It’s fascinating reading, and extremely quotable. But one part that really struck me is in the first paragraphs (emphasis added):

To introduce myself, I should mention that I studied Physics at Oxford, but on graduating **discovered the new world of microprocessors** and joined the electronics and computer science industry for several years. In 1980, I worked on a contract at CERN, the European Particle Physics Laboratory, and **wrote for my own benefit a simple program for tracking the various parts of the project using linked note cards**. In 1984 I returned to CERN for ten years, during which time I found the need for a universal information system, and **developed the World Wide Web as a side project in 1990**.

While TBL didn't invent the Internet itself, his bit of brilliance made it relevant for the masses. Even though that wasn't his intention right off the bat, it became so as he realized the implications of what he'd done.

But let’s look at the three bits in bold:

  1. He started studying Physics, then decided to follow a side interest in microprocessors.
  2. He created an e-notecard system for himself, on the side of (and to help with) what he was contracted to do at CERN.
  3. He developed a universal version of his notecard system so everyone could share and link together, as a side project in 1990.

Imagine the world impact of those three “side projects”?

This raises a real question for any organization. Does it give its members the leeway for "side" interests? Or are those interests considered inefficient, or just odd?

It’s not that every person is going to invent another Web. It’s more that the few people who might do something like that get trampled before they get started, and that the slightly larger group of people who might do something merely impressive are trampled in the same way.

There was a time when amateurs were the experts — they were the ones who dabbled and learned and communicated in excited screeds and philosophical societies. They were “blessed” to have the time and money to do as they pleased, and the intellectual curiosity to dig in and dirty their hands with figuring out the world.

It could very well be that we're in the midst of a similar rush of amateur dabbling. Just think of all the millionaires now tackling problems like AIDS, malaria, and space flight. Or of the power people now have to simply go remix and remake their worlds. There's an excellent O'Reilly conference keynote I wish I'd seen, but the PDF of the slides gives a decent accounting. Here's an abstract:

Rules for Remixing; Rael Dornfest & Tim O’Reilly

Citizen engineers are throwing their warranties to the wind, hacking their TiVos, Xboxes, and home networks. Wily geeks are jacking Jetsons-like technology into their cars for music, movies, geolocation, and internet connectivity on the road. E-commerce and network service giants like Amazon, eBay, PayPal, and Google are decoupling, opening, and syndicating their services, then realizing and sharing the network effects. Professional musicians and weekend DJs are serving up custom mixes on the dance floor. Operating system and software application makers are tearing down the arbitrary walls they've built, turning the monolithic PC into a box of loosely coupled component parts and services. The massive IT infrastructure of the '90s is giving way to what analyst Doc Searls calls "do-it-yourself IT."
We see all of this as a reflection of the same trend: the mass amateurization of technology, or, as Fast Company put it, "the amateur revolution." And it's these hacks, tweaks, re-combinations, and shaping of the future we're exploring in this year's Emerging Technology Conference theme: Remix.

I saw Mark Frauenfelder on The Colbert Report last night, talking about Make Magazine and the very things mentioned in the abstract above. Colbert marveled at the ingenuity, and I wondered how many people watching would think to themselves: "Hey, yeah! Why not just take things apart and change them to the way I want them???"

It’s on the rise, isn’t it? Wow. Another sea change, and I’m not even 40. What a time to be alive.

All hail side projects and passionate tangents. Long may they reign.

But for now… I gotta get back to work.

This video is amazing. I’m not sure how to describe it. Cory Doctorow on BoingBoing said this about it:

“Web 2.0… the Machine is Us/ing Us,” is deeply moving and incredibly smart. The creator is Michael Wesch, an assistant Cultural Anthropology Prof at Kansas State U, and he has strung together a bunch of animations, text, and screenshots in order to tell the story of “Web 2.0” — and why it matters, and how it’s changing the world. This is as starry-eyed as techno-optimism gets, and it might just choke you up a little, if you care about this stuff.

YouTube – Web 2.0 … The Machine is Us/ing Us

For a year or so now, “innovation” has been bobbing around at the very top of the memepool. Everybody wants to bottle the stuff and mix it into their corporate water supplies.

I’ve been on the bandwagon too, I confess. It fascinates me — where do ideas come from and how do they end up seeing the light of day? How does an idea become relevant and actionable?

There's a recent commercial for FedEx in which a group of pensive executives sits around a conference table, a salt-and-pepper-haired, square-jawed CEO (I assume) at the head of the group and a weak-chinned, rumpled, dorky underling next to him. The CEO asks how they can cut costs (I'm paraphrasing), and the dorky younger guy recommends one of FedEx's new services. He's ignored. But then the CEO says exactly the same thing, and everybody nods in agreement and congratulates him on his genius.

The whole setup is a big cliche. We’ve seen it time and again in sitcoms and elsewhere. But what makes this rendition different is how it points out the difference in delivery and context.

In looking for a transcript of this thing, I found another blog that summarizes it nicely, so I’ll point to it and quote here.

The group loudly concurs as the camera moves to the face of the worker who proposed the idea in the first place. Perplexed, he declares, “You just said what I just said only you did this,” as he mimics his boss’s hand motions.
The boss looks not at him, but straight ahead, and says, “No, I did this,” as he repeats his hand motion. The group of sycophants proclaims, “Bingo, Got it, Great.” The camera captures the contributor, who has a sour grimace on his face.

(Thanks Joanne Cini for the handy recap.)

What the ad also captures is the reaction of an older colleague sitting next to the grimacing dorky guy: he gives him a little nod that shows a mixture of pity, complicity in what just happened, and a sort of weariness that seems to say, "Yeah, see? That's how it works, young fella."

It’s a particularly insightful bit of comedy. It lampoons the fact that so much of how ideas happen in a group environment depends on context, delivery, and perception (and here I’m going to pick on business, but it happens everywhere in slightly different flavors). Dork-guy not only doesn’t get the language that’s being used (physical and tonal), but doesn’t “see” it well enough to even be able to imitate it correctly. He doesn’t have the literacy in that language that the others in the room do, and feels suddenly as if he’s surrounded by aliens. Of course, they all perceive him as alien (or just clueless) as well.

I know I’m reading a lot into this slight character, but I can’t help it. By the way, I’m not trying to insult him by calling him dork-guy — it’s just the way he’s set up in the commercial; I think the dork in all of us identify with him. I definitely do.

In fact, I know from personal experience that, in dork-guy's internal value matrix, none of the posturing means a hill of beans. He and his friends probably make fun of people who put so much weight on external signals; they think of it as a shallow veneer. Like most nerdy people, he assumes that gestures, haircuts, and tone of voice don't affect whether you win the chess match. But in the corporate game of social capital, "presence" is an essential part of winning.

Ok, so back to innovation. There’s a tension among those who talk and think about innovation between Collective Intelligence (CI) and Individual Genius (IG). To some degree there are those who favor one over the other, but I think most people who think seriously about innovation and try to do anything about it struggle with the tension within themselves. How do we create the right conditions for CI and IG to work in synergy?

The Collective Intelligence side has lots of things in its favor, especially lately. With so many collective, emergent activities happening on the Web, people now have the tools to tap into CI like never before — when else in history did we have the ability for people all over the world to collaborate almost instantaneously in rapid conversation, discussion and idea-vetting? Open Source philosophy and the “Wisdom of Crowds” have really found their moment in our culture.

I’m a big believer too, frankly. I’m not an especially rabid social constructivist, but I’m certainly a convert. Innovation (outside of the occasional bit that’s just for an individual privately) derives its value from communal context. And most innovations that we encounter daily were, in one way or another, vetted, refined and amplified by collaboration.

Still, I also realize that the Eureka Moments don’t happen in multiple minds all at once. There’s usually someone who blurts out the Eureka thought that catalyzes a whole new conversation from that “so perfect it should’ve been obvious” insight. Sometimes, of course, an individual can’t find anyone who hears and understands the Eureka thought, and their Individual Genius goes on its lonely course until either they do find the right context that “gets” their idea or it just never goes anywhere.

This tension between IG and CI is rich for discussion and theorizing, but I'm not going to do much of that here. It's all just a very long setup for me to write down something that was on my mind today.

In order for individuals to care enough to have their Eureka thoughts, they have to be in a fertile, receptive environment that encourages that mindset. People new to a company often have a lot of that passion, but it can be drained away long before their 401(k) matching is vested. And is personal glory what these people are after? Well, yeah, that's part of it. But they also want to be the person who thought of the thing that changed everybody's lives for the better. They want to be able to walk around and see the results of that idea. Both of these incentives are crucial ingredients in the care and feeding of the delicate balance that brings forth innovation.

Take the FedEx commercial above. The guy had the idea, and he'll see it executed. Why wouldn't he be gratified to see the savings in the company's bottom line and to see people happier? Because that's only part of his incentive. The other part is for his boss, at the quarterly budget meeting, to look over and say, "X over there had a great idea to use this service, and look what it saved us; everybody give X a round of applause!" A bonus or promotion wouldn't hurt either, but public acknowledgement of an idea's origins goes a very, very long way.

I’ve worked in a number of different business and academic environments, and they vary widely in how they handle this bit of etiquette. And it is a kind of etiquette. It’s not much different from what I did above, where I thanked the source of the text I quoted. Maybe it’s my academic experience that drilled this into me, but it’s just the right thing to do to acknowledge your sources.

In some of my employment situations, I’ve been in meetings where an idea I’ve been evangelizing for months finally emerges from the lips of one of my superiors, and it’s stated as if it just came to them out of the blue. Maybe I’m naive, but I usually assume the person just didn’t remember they’d heard it first from me. But even if that’s the case, it’s a failure of leadership. (I’ve heard it done not just to my ideas but to others’ too. I also fully acknowledge I could be just as guilty of this as anyone, because I’m relatively absent-minded, but I consciously work to be sure I point out how anything I do was supported or enhanced by others.) It’s a well-known strategy to subliminally get a boss to think something is his or her own idea in order to make sure it happens, but if that strategy is the rule rather than the exception, it’s a strong indicator of an unhealthy place for ideas and innovation (not to mention people).

But the FedEx commercial does bring a harsh lesson to bear — a lesson I'm still struggling to learn. No matter how good an idea is, it's only as effective as the manner in which it's communicated. Sometimes you have no control over this; it's just built into the wiring. In the (admittedly exaggerated, but not by much) situation in the FedEx commercial, it's obvious that most of dork-guy's problem is that he works in a codependent culture full of sycophants who mollycoddle a narcissistic boss.

But perhaps as much as half of dork-guy’s problem is that he’s dork-guy. It’s possible that there are some idyllic work environments where everyone respects and celebrates the contributions of everyone else, no matter what their personal quirks. But chances are it’s either a Kindergarten classroom or a non-profit organization. And I happen to be a big fan of both! I’m just saying, I’m learning that if you want to play in certain environments, you have to play by their rules, both written and unwritten. And I think we all know that the ratio of unwritten-to-written is something like ten-to-one.

In dork-guy’s company, sitting up straight, having a good haircut and a pressed shirt mean a lot. But what means even more is saying what you have to say with confidence, and an air of calm inevitability. Granted, his boss probably would still steal the idea, but his colleagues will start thinking of him as a leader and, over time, maybe he’ll manage to claw his way higher up the ladder. I’m not celebrating this worldview, by the way. But I’m not condemning it either. It just is. (There is much written hither and yon about how gender and ethnicity complicate things even further; speaking with confidence as a woman can come off negatively in some environments, and for some cultural and ethnic backgrounds, it would be very rude. Whole books cover this better than I can here, but it’s worth mentioning.)

Well, it may be a common reality, but it certainly isn’t the best way to get innovation out of a community of coworkers. In environments like that, great ideas flower in spite of where they are, not because of it. The sad thing is, too many workplaces assume that “oh we had four great ideas happen last year, so we must have an excellent environment for innovation,” not realizing that they’re killing off hundreds of possibly better seedlings in the process.

I've managed smaller teams on occasion, sometimes officially and sometimes not, but I haven't been responsible for whole departments or large teams. Managing people isn't easy. It's damn hard. It's easy for me to sit at my laptop and second-guess people with responsibilities I've never shared. That said, sometimes I'm amazed at how ignorant and self-destructive some management teams can be as a group. They can talk about innovation or quality or whatever buzzword du jour, and they can institute all sorts of new activities, pronouncements, and processes to further said buzzword, yet do nothing about the major rifts in their own ranks that painfully hinder their workers from collaborating or sharing knowledge. They reinforce (either on purpose or unwittingly) cultural norms that alienate the eccentric-but-talented and give comfort to the bland-but-mediocre. They crow about thinking outside the box while perpetuating a hierarchical corporate system that's one of the most primitive boxes around.

Ok, that last bit was a rant. Mea culpa.

My personal take-away from all this hand-wringing? I can’t blame the ‘system’ or ‘the man’ for anything until I’ve done an honest job of playing by the un/written rules of my environment. It’s either that, or play a new game. To me, it’s an interesting challenge if I look at it that way; otherwise it’s just disheartening. I figure either I’ll succeed or I’ll get so tired of beating myself against the cubicle partitions, I’ll give up and find a new game to play.

Still, eventually? It’d be great to change the environment itself. Maybe I should go stand in front of my bathroom mirror and practice saying that with authority? First, I have to starch my shirts.

Flash

Just one more Sunday with my daughter, dwindling now. Another week of work and daycamp, then a return drive to NC on Saturday, and our month together for 2006 will be over.

This time it went so fast. That’s a cliche, I know. But it did.

At the amusement park last week, kids lined up to ride the big swing ride over and over again. Parents tired of riding it with them, as I did, and so we all stood and watched, our necks crooked upward at our children flung in the wide circle. The sun was starting to drop, and some of us had cameras out trying to pluck images of our children from the screaming, sweaty orbiting ring, one frame at a time.

Cameras flashed. Parents yelled up “yes, I see you! hang on!” Then another flash. And we all had that frozen smile on our faces, the one where the mouth is all joy and wonderment, but the eyes behind the cameras say “I will keep this moment, this moment will never change” (flash) “stop” (flash) “yes, stop there and there” (flash) “and that moment, that smile, I’ll keep that one” (flash) “and that one too! oh god so many look at them pass, too fast” (flash) “too fast, too fast.”

I've been working for a couple of months now on an article for the ASIS&T Bulletin (American Society for Information Science and Technology). It started out as an article version of my "Clues to the Future" presentation, but I soon realized that 1) I couldn't really explain the same stuff very well in a 4,000-word article, and 2) doing so would be a bit redundant with the presentation itself (which is fairly well explained in the text part of the PDF download). I also realized (I guess this is 3) that there were other things I really wanted to say but hadn't yet figured out how to articulate, and this was a good incentive and/or opportunity to do so.

After eight weeks of banging my head against a few walls (both real and virtual), and thanks to the extreme patience of an editor, I think it may at least form the beginnings of what I've had rolling around in my brain.

Writing it was hard. Period. It always has been, at least to do it well. That was true when I wrote fiction and poetry (mostly in a previous life), but I found it especially hard to write a long-form essay for print. I've been so used to writing PowerPoint presentations and blog posts that I was quite out of practice at developing my ideas with any rigor. And I'm still not sure how effectively I've articulated this stuff, but so it goes. One of my favorite quotations ever is E.M. Forster's "I don't know what I think until I see what I say," or some version of that. It's so very true — the act of putting an idea into parsable language inevitably changes it, for good or ill, but hopefully for the better.

It's especially difficult, though, to write about things that don't yet have a solid, agreed-upon vocabulary. I used to curse the philosophy texts I read as a student because they were so full of neologisms — especially those from pesky Germans like Heidegger — but I sort of understand now: they were trying to express ideas that hadn't been expressed yet, and needed rubrics by which to signify them without re-explaining each time.

The piece is called “We Live Here: Games, Third Places, and the Information Architecture of the Future.” Egad, now that I see it here it sounds awfully pompous.

When it’s published I’ll post a link or excerpt or something.

As an adoptee who is also a fan of AM Homes, I was astonished I hadn’t seen this yet.

AM Homes: The Mistress’s Daughter

I follow up with a call. Her voice is low, nasal, gravelly, vaguely animal. I tell her who I am and she screams, “Oh, my God! This is the most wonderful day of my life.” Her voice, her emotion, comes in bursts, like punctuation—I can’t tell if she is laughing or crying.
The phone call is thrilling, flirty, like a first date, like the beginning of something. There is a rush of curiosity, the desire to know everything at once. What is your life like? How do your days begin and end? What do you do for fun? Why did you come looking for me? What do you want?
Every nuance, every detail, means something. I am like a recovering amnesiac. Things I know about myself, things that exist without language—my hardware, my mental firing patterns, parts of me that are fundamentally, inexorably me—are being echoed on the other end, confirmed as a DNA match. It is not an entirely comfortable sensation.
“Tell me about you—who are you?” she asks.

I have to say, it felt powerfully similar to my own experience of meeting my birthparents — but also entirely different. Mine are actually very considerate and kind people.

Homes’ essay is pretty amazing though. Downright devastating.
