Technology

As networked social applications mature, they’re evolving more nuanced ways of constructing and maintaining an identity. Two of the major factors in online identity are how you present yourself, and who you know.

How you present yourself: “Flourishing”

Flourishing is how we ornament ourselves and display ourselves to others. Think of peacocks flourishing their tail-feathers. It’s done to communicate something about oneself — to attract partners, distinguish oneself from the pack toward some end, or even dissuade the advances of enemies.

I don’t know if this behavior has another name, or if someone else has called it this yet. But it’s the best name I can think of for the technologically enhanced version of this behavior.

Humans have always used personal ornament to say something about themselves, from ancient tattoos and piercings, “war paint,” various kinds of dress, engagement and wedding rings, to larger things like their cars and homes. We’ve long used personal ornament to signal to others “I am X” in order to automatically set initial terms of any conversation or encounter.

It expands our context, and makes physical the things about us that our bodies alone cannot communicate. Often these choices are controlled, overtly or subtly, by cultural norms. But in cultures where individual identity is given some play-room, these choices can become highly individual.

So, how has digital networked life changed this behavior? For a while, I’ve thought it’s fascinating how we can now decorate ourselves not only with things we’ve had to buy or make, but with a virtual version of almost anything we can think of, from any medium. My online identity as represented by one or more ‘avatars’ (whether that’s an avatar in an environment like Second Life, or a MySpace profile that serves a similar, though 2-D, purpose) can be draped with all manner of cultural effluvia. I can express myself with songs, movie clips, pictures of products I love (even if I can’t afford them). Our ability to express ourselves with bits of our culture has increased to vertiginous heights.

Just as I started blogging about this thing that’s been on my mind for a while, I thought I’d look to see if anyone has done real work on it. I’m sure there’s a lot of it out there, but one piece I ran across was a paper from Hugo Liu at MIT, entitled “Social Network Profiles as Taste Performances,” which discusses this development at some length. From the introduction:

The materials of social identity have changed. Up through the 19th century in European society, identity was largely determined by a handful of circumstances such as profession, social class, and church membership (Simmel, 1908/1971a). With the rise of consumer culture in the late 20th century, possessions and consumptive choices were also brought into the fold of identity. One is what one eats; or rather, one is what one consumes—books, music, movies, and a plenitude of other cultural materials (McCracken, 2006).

… In the pseudonymous and text-heavy online world, there is even greater room for identity experimentation, as one does not fully exist online until one writes oneself into being through “textual performances” (Sundén, 2003).

One of the newest stages for online textual performance of self is the Social Network Profile (SNP). The virtual materials of this performance are cultural signs—a user’s self-described favorite books, music, movies, television interests, and so forth—composed together into a taste statement that is “performed” through the profile. By utilizing the medium of social network sites for taste performance, users can display their status and distinction to an audience comprised of friends, co-workers, potential love interests, and the Web public.

The article concerns itself mainly with users’ lists of “favorites” from things like music, movies and books, and how these clusters signal particular things about the individual.

What I mean by “flourishing” is this very activity, but expanded into all media. Thanks to ever-present broadband and the ability to digitize almost anything into a representative sample, users can decorate themselves with “quotes” of music, movies, posters, celebrity pictures, news feeds, etc. Virtual bling.

I think it was a major reason for MySpace’s popularity, especially the ability to not just *list* these things, but to bring them fully into the profile, as songs that play as soon as you load the profile page, or movie, music-video, and YouTube clips.

This ability has been present for years in a more nascent form in physical life — the custom ring-tone. Evidently, announcing to all those around you something about yourself by the song or sound you use for your ring-tone is so important to people that it generates billions of US dollars in revenue.

Here’s what I’m thinking: are we far from the day when it’s not just ring-tones, but video-enabled fabric in our clothes, and sound-emitting handbags and sunglasses? What will the ability to “flourish” to others mean when we have all of this raw material to sample from, just like hip-hop artists have been doing for years?

For now, it’s only possible to any large extent online. But maybe that’s enough, and the cultural-quoting handbags won’t even be necessary? Eventually, the digital social network will become such a normal part of our lives that having a profile in the ether is as common and expected as phone numbers in the phone book used to be (in fact, people in their teens and 20s are already more likely to look for a Web profile than even consider looking in a giant paper phone-book).

As physical and digital spaces merge, and the distinction becomes less meaningful, that’s really all it’ll take.

Who you know: “Friending”

Alex Wright has a nice column in the NYT, “Friending, Ancient or Otherwise,” about research showing common patterns between prehistoric human social behavior and the rise of social-network applications.

Academic researchers are starting to examine that question by taking an unusual tack: exploring the parallels between online social networks and tribal societies. In the collective patter of profile-surfing, messaging and “friending,” they see the resurgence of ancient patterns of oral communication.
“Orality is the base of all human experience,” says Lance Strate, a communications professor at Fordham University and devoted MySpace user. He says he is convinced that the popularity of social networks stems from their appeal to deep-seated, prehistoric patterns of human communication. “We evolved with speech,” he says. “We didn’t evolve with writing.”

I’m fascinated with the idea that recent technology is actually tapping into ancient behavior patterns in the human animal. I like entertaining the idea that something inside us craves this kind of interaction, because it’s part of our DNA somehow, and so we’ve collectively created the Internet to get back to it.

It’s not terribly far-fetched. Most organisms that find their natural patterns challenged in some way manage to return to those patterns by adaptation. At least, in my very limited understanding of evolution, that’s what happens, right? And a big chunk of the human race has been relegated to non-tribal community structures for only a tiny fraction of its evolutionary history — makes sense that we’d find a way back.

Regardless of the causes (and my harebrained conjecture aside), who you have as friends is vital to your identity, both your internal sense of self and the character you present externally to the world. “It’s not what you know, it’s who you know” is an old adage, and there’s a lot of truth to it, even if you just admit that you can know a heck of a lot, but it won’t get you anywhere without social connection to make you relevant.

What digital networks have done is make “friendship” something literal, and somewhat binary, when in fact friendship is a highly variable and messy business. Online, a “friend” could be just about anyone from a friend-of-a-friend, to someone you ran into once at a conference, to someone from high school you haven’t actually spoken to in 10 years but who, just for grins, is on your Facebook list.

Systems are starting to become more sophisticated in this regard — we can now choose ‘top friends’ and organize friends into categories on some sites, but that still forces us to put people into oversimplified categories that don’t reflect the variability over time that actually exists in these relationships. Someone you were friends with and saw weekly six months ago may have a new job or new interests, or may have joined a new church or gym, and now you’re still “people who keep up with each other” but nothing like you were. Or maybe you just had a fight with a friend and things soured, but didn’t completely split — and months later it’s all good again?

The more we use networks for sharing, communicating, complaining, commiserating, and confessing to our social connections, the more vexing it’s going to be to keep all these distinctions in check. I doubt any software system can really reflect the actual emotional variety in our friendships — if for no other reason than that, no matter how amazing the system is, it still depends on our consciously updating it.

So that makes me wonder: which is going to change more? The systems, or the way we conceive of friendship? I wonder how the activity of friendship itself will feel, look and behave in ten or fifteen years for people who grew up with social networks. Will they meet new friends and immediately consider which “filter” that friend might be safe to see on a personal blog? Will people change the way they create and maintain relationships in order to adapt to the limitations of the systems, or vice-versa (or both)?

I can’t believe I’ve been “blogging” for over seven years. How the hell did that happen?

Actually, I think it’s been longer — if I remember correctly, my first blog was on some service whose name I simply cannot remember now, until I ran across Blogger in 2000. Then I switched over, using their service to run a blog I hosted on server space my then-employer let me use for free; they even let me use their nameserver for my domain name … drewspace.com. That name has since gone to someone or something else. But I did manage to suck all the old archives into my web space here. Here are the first posts I have a record of, from August 2000.

This boggling (bloggling?) stretch of time occurred to me once I saw Ross Mayfield’s recent post about how he’s been blogging for five years. Of course, he’s much more industrious than I, what with a company of his own and writing that’s a heck of a lot more focused and, well, valuable. But of course, social software has been his professional focus for quite a while, whereas for me it’s been more of a fitful obsession.

“Social software” is turning out to be the monster that ate everything. Which only makes sense. The Web is inherently social, and so are human beings. Anything that better enables the flow of natural social behaviors (rather than more artificial broadcast/consume behaviors) is going to grow like kudzu in Georgia.

Anybody thinking of social software as a special category of software design needs to wake up and smell the friends list. Everything from eBay to Plaxo is integrating social networking tools into their services, and Google is looking to connect them all together (or at least change the game so that all must comply or die of irrelevance).

Poor old blog

I looked at my blog (this thing I’m writing in now) today and the thought that surfaced, unbidden, was “poor old blog.”

I felt bad because I haven’t been writing here like I used to, so sure I get the “poor” part — poor pitiful blog that isn’t getting my attention.

But where on earth did “old” come from? Besides the fact that “poor old whatever” is a common figure of speech, it felt a little shocking coming to the front of my brain while looking at a blog. I mean, I wouldn’t say “poor old iPhone” if I hadn’t picked one up for a week (and if I owned one to begin with).

I mean, blogs are still new, right?

But here he is (I’m convinced my blog is a “he,” but I have no idea why, really). My blog, like all the other blogs, just taken for granted now. Blogs — part of the permanent landscape, like plastic grocery bags and 24-hour gas stations.

It was such a big deal not so long ago, but now here they are, blogs, sitting around watching other, younger, nimbler channels giddily running around their feet without a care in the world. The Twitters, Jaikus, Facebook apps. The Dopplrs, Flickrs and the rest.

It’s like somebody took a hammer to the idea of “blog” and it exploded, skittering into a million bits, like mercury.

So that’s what’s been up. I’ve been twittering, facebooking (yeah, it’s a verb, as far as I’m concerned), text-messaging… even the occasional “instant message” through the venerable old AIM or iChat, even though now that’s starting to feel as antiquated as a smoke signal or carrier pigeon.

If there was ever any chance of keeping focus long enough to write sound, thorough paragraphs, lately it’s been eviscerated to a barely throbbing stump.

I wonder if my poor old blog will rally? If it’ll show these whippersnappers it’s not done for yet? Like in the sports movies, you know, where the old batter who everybody thinks is all washed up slams another one over the bleachers?

I don’t know. All I know right now is, there’s my blog. With its complete sentences, its barely-touched comment threads. Its antiquated notion of being at a domain-named location. Its precious permalinks & dated archives, like it’s some kind of newspaper scholars will scan on microfiche in future generations.

Doesn’t it know that everything’s just a stream now? Everything’s a vapor trail?

Poor old blog.

Julian Dibbell has a marvelous post about how game realities are symptoms — sort of concentrated, more-obvious outcroppings — of a general shift in economic and cultural reality itself. The game’s the thing …

Online Games, Virtual Economies … Distinction between Play and Production

And I’m arguing, finally, that that relationship is one of convergence; that in the strange new world of immateriality toward which the engines of production have long been driving us, we can now at last make out the contours of a more familiar realm of the insubstantial—the realm of games and make-believe. In short, I’m saying that Marx had it almost right: Solidity is not melting into air. Production is melting into play.

I only just heard about the Google Image Labeler via the IAI mailing list.

Here’s a description:

You’ll be randomly paired with a partner who’s online and using the feature. Over a two-minute period, you and your partner will be shown the same set of images and asked to provide as many labels as possible to describe each image you see. When your label matches your partner’s label, you’ll earn points depending on how specific your label is. You’ll be shown more images until time runs out. After time expires, you can explore the images you’ve seen and the websites where those images were found. And we’ll show you the points you’ve earned throughout the session.

So, Google didn’t just assume people would tag images for the heck of it. They built in a points system. I have no idea if the points even mean anything outside of this context, but it’s interesting to see a game mechanic of points incentive, in a contest-like format, being used to jump-start the collective intelligence gathering.
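
Just to make the mechanic concrete, here’s a minimal Python sketch of the match-and-score loop as the description above lays it out. The point values and specificity buckets are placeholders I invented; Google’s actual scoring formula isn’t in the description.

```python
# A sketch of the paired-labeling mechanic described above.
# Point values and specificity buckets are invented placeholders.

POINTS = {"generic": 50, "specific": 100, "very specific": 150}

def score_round(labels_a: set, labels_b: set, specificity: dict) -> int:
    """Award points for every label both partners typed for the same image."""
    points = 0
    for label in labels_a & labels_b:  # only labels both players agreed on count
        points += POINTS[specificity.get(label, "generic")]
    return points

# Example round: both partners typed "dog"; only one typed "beagle",
# so the more specific label earns nothing this time.
a = {"dog", "beagle", "grass"}
b = {"dog", "animal"}
print(score_round(a, b, {"dog": "generic", "beagle": "very specific"}))  # 50
```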

POSTSCRIPT:

Later in the day, I hear from James Boekbinder that this system was invented (if he has it right) by a computer scientist named Luis von Ahn, and Google bought it. He points to a great presentation von Ahn has on Google Video about his approach.

Von Ahn’s description says that people sometimes play the game 40 hours a week, while I’m hearing from other sources that research showed users putting a lot of effort into it for a short time, then dropping off and not coming back (possibly because there’s no persistent or transferable value to the ‘points’ given in the game?).

Via Jay Fienberg, via the IAI discussion list, I hear of this excellent post by Professor David Silver about a talk he gave recently on the Web 2.0 meme.

Silver starts out lauding the amazing, communal experience of blogs and mashups of blogs, RSS feeds, and other Web 2.0 goodness, and then steps back to offer some needed perspective:

then i stepped back and got critical. first, i identified web 2.0 as a marketing meme, one intended to increase hype of and investment in the web (and web consultants) and hinted at its largely consumer rather than communal directions and applications. second, i warned against the presentism implied in web 2.0. today’s web may indeed be more participatory but it is also an outgrowth of past developments like firefly, amazon’s user book reviews, craigslist, and ebay – not to mention older user generated content applications like usenet, listservs, and MUDs. third, i argued against the medium-centricness of the term web 2.0. user generated content can and does exist in other media, of course, including newspapers’ letters to the editor section, talk radio, and viewers voting on reality tv shows. and i ended with my all-time favorite example of user generated content, the suggestion box, which uses slips of paper, pencils, and a box.

I think this is very true, and good stuff to hear. (Even in the peculiar lower-case typing…fun!) Group participation has been growing steadily on the Internet in one form or another for years.

I do think, though, that we hit some kind of tipping point in the last few years. Tools for personal expression, simple syndication, a cultural shift in what people expect to be able to do online, and the rise of broadband and mobile web access — the sum has become somehow much greater than its parts.

Still, I think he’s right that the buzzword “Web 2.0” is mainly an excellent vehicle for hype that gets people thinking they need consultants and new books. (Tim O’Reilly is a nice guy, I’m sure, but he’s also a businessman and publisher who knows how to get conversations started.)

Silver mentions Feevy, a sort of ‘live blogroll’ tool for blogs — it has an excerpt of the latest post by each person on your blogroll. Neato tool. I may have to try it out!

Wired has a great story explaining the profound implications of Google Maps and Google Earth, mainly due to the fact that these maps don’t have to come from only one point of view, but can capture the collective frame of reference from millions of users across the globe: Google Maps Is Changing the Way We See the World.

This quote captures what I think is the main world-changing factor:

The annotations weren’t created by Google, nor by some official mapping agency. Instead, they are the products of a volunteer army of amateur cartographers. “It didn’t take sophisticated software,” Hanke says. “What it took was a substrate — the satellite imagery of Earth — in an accessible form and a simple authoring language for people to create and share stuff. Once that software existed, the urge to describe and annotate just took off.”

Some of the article is a little more utopian than fits reality, but that’s just Wired. Still, you can’t deny that it really does change, forever, the way the human geographical world describes itself. I think the main thing, for me, is the stories: because we’re not stuck with a single, 2-dimensional map that can only speak of one or a few frames of reference, we can now see a given spot of the earth and learn of its human context — the stories that happened there to regular people, or people you might not otherwise know or understand.

It really is amazing what happens when you have the right banana.

I finally got a chance to listen to Bruce Sterling’s rant for SXSW 2007 via podcast as I was driving between PA and NC last week.

There were a lot of great things in it. A number of people have taken great notes and posted them (here’s one example). It’s worth a listen either way — as are all of his talks. I like how Bruce is at a point where he’s allowed to just spin whatever comes to mind for an hour to a group of people. Not because all of it is gold — but because the dross is just as interesting as the gold, and just as necessary.

A lot of this year’s talk was on several books he’s reading, one of which is Yochai Benkler’s The Wealth of Networks. It’s fascinating stuff — and makes me want to actually read this thing. (It’s available online for free — as are some excellent summaries of it, and a giant wiki he set up.)

In the midst of many great lines, one of the things Sterling said that stuck with me was this (likely a paraphrase):

“The distinctions just go away if you’re given powerful-enough compositing tools.”

He was talking about commons-based peer production — things like mashups and remixes, fan art, etc., and how the distinctions between various media (photography, painting, particular instruments, sculpture, etc.) blur when you can just cram things together so easily. He said that it used to be you’d work in one medium or genre or another, but now “Digital tools are melting media down into a slum gully.”

First, I think he’s being a little too harsh here. There have always been amateurs who create stuff for and with their peers, and they all think it’s great in a way that has more to do with their own bubble of mutual appreciation than any “universal” measure of “greatness.” It just wasn’t visible to everyone online across the globe before. I’ve been in enough neighborhood writers’ circles and seen enough neighborhood art-club “gallery shows” to know this. I’m sure he has too. This is stuff that gives a lot of people a great deal of satisfaction and joy (and drama, but what doesn’t?). It’s hard to fault it — it’s not like it’s going to really take over the world somehow.

I think his pique has more to do with how the “Wired Culture” at large (the SXSW-attending aficionados and pundits) seems to be enamored with it, lauding it as some kind of great democratizing force for creative freedom. But that’s just hype — so all you really have to do is say “we’ll get over it” and move on.

Second, though, is the larger implication: a blurring between long-standing assumptions and cultural norms in communities of creative and design practice. Until recently, media have changed so slowly in human history that we could take for granted the distinctions between photography, design, architecture, painting, writing, and even things like information science, human factors and programming.

But if you think of the Web as the most powerful “compositing tool” ever invented, it starts to be more clear why so many professions / practices / disciplines are struggling to maintain a sense of identity — of distinction between themselves and everyone else. It’s even happening in corporations, where Marketing, Technical Writing, Programming and these wacky start-up User-Experience Design people are all having to figure each other out. The Web is indeed a digital tool that is “melting” things down, but not just media.

My obsession with what I call the “game layer” aside, it’s interesting that the mainstream press is now reporting on how using “game mechanics” in business software can create more engaging & useful ways of working with data, collaborating, and getting work done.

Why Work Is Looking More Like a Video Game – New York Times

Rave adapts a variety of gaming techniques. For instance, you can build a dossier of your clients and sales prospects that includes photographs and lists of their likes, dislikes and buying interests, much like the character descriptions in many video games. Prospects are given ratings, not by how new they are — common in C.R.M. programs — but by how likely they are to buy something. All prospects are also tracked on a timeline, another gamelike feature.

(Thanks, Casey, for the link!)

[Image: the glider emblem]

This is delightful. A sort of logo for hacker culture. Not hackers as in criminals (hacker culture calls those people ‘crackers’ among other things) but hackers as in lateral-thinking technology heads.

The graphic … is called a glider. It’s a pattern from a mathematical simulation called the Game of Life. In this simulation, very simple rules about the behavior of dots on a grid give rise to wonderfully complex emergent phenomena. The glider is the simplest Life pattern that moves, and the most instantly recognizable of all Life patterns.

I love this emblem because it really does reference so many things I adore about the internet, what’s happening on it, and the culture that I believe to be the beating heart of it.

Here’s some of the explanation from the Frequently Asked Questions about the Glider Emblem:

The glider is an appropriate emblem on many levels. Start with history: the Game of Life was first publicly described in Scientific American in 1970. It was born at almost the same time as the Internet and Unix. It has fascinated hackers ever since.
In the Game of Life, simple rules of cooperation with what’s nearby lead to unexpected, even startling complexities that you could not have predicted from the rules (emergent phenomena). This is a neat parallel to the way that startling and unexpected phenomena like open-source development emerge in the hacker community… The glider fulfils the criteria for a good logo. It’s simple, bold, hard to mistake for anything else, and easy to print on a mug or T-shirt. It could be varied, combined with other emblems, or modified and infinitely repeated for use as a background
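
The Game of Life itself is almost absurdly simple to implement, which is part of its charm. Here’s a minimal Python sketch (the grid representation and step count are my own choices, purely illustrative), just enough to watch the glider crawl across the plane:

```python
from collections import Counter

def step(live: set) -> set:
    """One generation of Conway's Life. `live` is a set of (x, y) cells."""
    # Count how many live neighbors every candidate cell has.
    neighbors = Counter(
        (x + dx, y + dy)
        for (x, y) in live
        for dx in (-1, 0, 1)
        for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    # A cell lives next turn with exactly 3 neighbors,
    # or with 2 neighbors if it was already alive.
    return {c for c, n in neighbors.items() if n == 3 or (n == 2 and c in live)}

glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
for _ in range(4):
    glider = step(glider)
print(sorted(glider))  # the same five-cell shape, shifted down-right by (1, 1)
```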

I’ve been going on and on about how the internet has given rise to a “game layer” to the world we live in: a sort of subcutaneous skin of data that connects everything, and mirrors the logic of our world. (Hence the number of friends you have on MySpace; the location you’re twittering from in Twitter; which songs you listen to the most on your iPod; the ability to track a UPS package at every turn; and on and on). Everything we attach to the network becomes more data, and if it’s data, it’s game-able.

Hacking itself is a kind of game, and the culture is very playful. I can’t get enough of this idea that “play” and “game,” once expanded some in their meaning and context, show us entirely new frames of reference that help explain what’s happening in the world.

Austin Govella puts a question to me in his post “Does Comcast have the DNA to compete in a 2.0 world?” at Thinking and Making.

Context of the post: Austin is wondering about this story from WSJ, “Cable Giant Comcast Tries to Channel Web TV” — specifically Jeremy Allaire’s comments doubting Comcast’s ability to compete in a “Web 2.0” environment.

At the end of his post, Austin says:

And the more important question, for every organization, how do you best change your DNA to adapt to new ages? Is it as simple as adjusting your organization’s architecture to enable more participation from good DNA? What happens if your internal conversations propagate bad DNA?
This is my question for Andrew: how do you architect community spaces to engender good DNA and fight infections of bad DNA?

My answer: I don’t know. I think this is something everybody is trying to figure out at once. It’s why Clay Shirky is obsessing over it. It’s why Tim O’Reilly and others are talking about Codes of Conduct.

So, when it comes to specifics, I don’t know that we have a lot of templates that we can say work most of the time… it’s so dependent on the kind of community, culture, etc.

However, in general, I think moderation tools that allow the organism to tend to itself are the best way to go. By that I mean “karma” functions that allow users to rate, comment, and police one another to a degree.

That, plus giving users the opportunity to create rich profiles that they come to identify with. Any geeks out there like me know what it’s like to create a quickie D&D character just to play with for the day — you can do whatever you want with it and it doesn’t matter. But one that you’ve invested time in, and developed over many sessions of gaming, is much more important to you. I think people invest themselves in their online ‘avatars’ (if you consider, for example, a MySpace profile to be an avatar — I do), and they’re generally careful about them, if they can be tied to the identity in a real way (i.e. it isn’t just an anonymous ‘alt’).

In short, a few simple rules can create the right structure for healthy complexity.
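
To make the karma idea concrete, here’s a toy Python sketch of that kind of self-policing loop. All the names and thresholds are mine, purely illustrative, not from any real system: the point is just that a member’s accumulated reputation changes how much benefit of the doubt their posts get.

```python
# A toy "karma" loop: members rate each other's posts, and posts from
# low-karma members get hidden after fewer flags than a trusted member's.
# Names and thresholds are illustrative placeholders.

class Member:
    def __init__(self, name: str):
        self.name = name
        self.karma = 0  # net up/down ratings earned across all posts

class Post:
    def __init__(self, author: Member):
        self.author = author
        self.flags = 0

    def rate(self, delta: int):
        self.author.karma += delta  # ratings accrue to the author's reputation

    def flag(self):
        self.flags += 1

    def hidden(self) -> bool:
        # Established members get more benefit of the doubt than fresh 'alts'.
        threshold = 5 if self.author.karma > 10 else 2
        return self.flags >= threshold

post = Post(Member("fresh_alt"))
post.flag(); post.flag()
print(post.hidden())  # True: two flags hide a post from an unknown member
```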

As for Comcast, I suspect the company is generally perceived as a lumbering, last-century-media leviathan. So it’s easy for people like Allaire to make these assumptions. I think I might have made similar assumptions, if I didn’t personally know some of the talented people who work at Comcast now!

What Allaire doesn’t come right out and say (maybe he doesn’t understand it?) is that the Web 2.0 video space isn’t so much about delivering video as about providing the social platform for people to engage one another around the content. Like Cory Doctorow said (and yes, I’m quoting it for like the 100th time), content isn’t king, “conversation is king; content is just something to talk about.”

Having the content isn’t good enough. Having the pipes and the captive audience isn’t good enough either. From what I’ve seen of Ziddio and the like, Comcast is aware of this.

But it’s weird that the story in WSJ only mentions the social web as a kind of afterthought: “Competitors also are adding social networking and other features to their sites to distinguish them from traditional television.” As if social networking is just an added feature, like cup holders in cars. Obviously, WSJ isn’t quite clued in to where the generative power of Web 2.0 really lives. Maybe it’s because they’re stuck in an old-media mindset? Talk about DNA!

Gene puts up a very nice honeycomb diagram for thinking about the capabilities & focus of social software.

Social Software Building Blocks

While doing research for a recent workshop, I came across a useful list of seven social software elements. These seven building blocks–identity, presence, relationships, conversations, groups, reputation and sharing–provide a good functional definition for social software. They’re also a solid foundation for thinking about how social software works.
