Pinterest: digital identity, Stepford Wives edition

Oh, Pinterest.

You’re so pretty. Everything in your world looks sanitized and inspirational.

Your tagline is “organize and share things you love.” You don’t really mean our sticky kids, though, or the gritty streets of NYC on a February Tuesday. That’s for Flickr and Instagram.

You’re about our aspirations. Your purpose is to make us look like designers of our digital lives: clean, controlled, concise. Maybe quirky, just a little.

“Find your niche,” advises our culture’s contemporary mantra for success: “Me, Inc.” The age of Neoliberalism.

Your niche, Pinterest – your passion – is our deep desire for escape from our cluttered excess. We are busy and overloaded, most of us. We’d like to run away and live online, in miniature white screen frames stark and orderly as zen paintings. With witty aphorisms. And tiny, perfect servings of food porn. Your niche is our escapism.

And so you’re booming, Pinterest. Last night, Mashable released a chart showing your rapid rise in user engagement numbers over recent months. You are, without a doubt, the flavour of the week.

And you look and taste great. Hey, I enjoy a decontextualized serving of digital heart-shaped creme brulee (almost) as much as the next person.

But there’s something terribly Stepford Wives about the whole practice.

We Are What We Share
Sure, it’s just a hobby, a pastime. But you make me nervous, Pinterest. Because when I run away and live online in your world, as opposed to on my blog or on Twitter or even Facebook, I’m crossing into a model of digital identity that’s very shiny, but also scary.

It’s “Me, Inc.” without the, um, “me.”

(No, this isn’t about copyright, Pinterest. Yes, that’s what everybody’s on about these days, and, it appears, with good reason: you look to be a bit of a copyright nightmare, with Kafkaesque Terms of Service. According to this lawyer, you have apparently reserved the right to prosecute users for the very copyright violations the Pinterest platform seems designed to support.)

But. My issue isn’t the copyright practices you implicitly encourage.

It’s the identity practices.

Using social media shapes who we are, and how we see ourselves. Social media relies on identity: on handles or names or pseudonyms that represent us and our contributions to the rest of our networks. Pinterest is the same: when I sign up, I get an account, under a name of my choosing. People can see what I share. Being “re-pinned” means what I’m sharing is stuff people want to see.

To our networks, we are what we share.

And on Pinterest, that stuff? Isn’t usually mine. And isn’t encouraged to BE mine.

“Me, Inc.” Without the Me
See, the difference between Pinterest and most of the major social media platforms that have come before is that Pinterest is set up to encourage us to build identity and reputation primarily on the basis of other people’s content.

On Pinterest, sharing your own work goes against the explicit etiquette of the site. Rule #3: “Avoid Self-Promotion.” Sure, “If there’s a photo or project you’re proud of, pin away! However try not to use Pinterest purely as a tool for self-promotion.”

I can see the collective exhale, here. No wonder Pinterest looks kinda like an Ikea catalogue for every facet of human life. Its express purpose is to free us from the awkwardness of self-expression and keep us safely in the realm of the pre-chewed, the market-filtered.

Admittedly, self-promotion on most online platforms gets tiresome. Hey, look at what I did! What I wrote! What I dug out from my back teeth and photographed in extreme closeup!

On Pinterest, I’d just share pictures of somebody else’s perfect teeth. Whitened. Without the accompanying stories of orthodontia or the person’s flossing regimen. Probably not even his or her whole face.

Pinterest is exactly what it claims to be: the digital equivalent of the corkboard I had in my bedroom when I was thirteen. I had me some Bono, some Annie Lennox, a dented centrefold of Thriller. I once tore a page out of a hair salon magazine for a grainy shot of the dude who played Robert Scorpio on General Hospital. I may also have clipped the Volkswagen microbus ad out of chapter six of my geometry text. (Sorry, Mr. Murnaghan.)

These things weren’t me. They were who I wanted to be, in a sense, but in the dream realm. My cutout of Robert Scorpio didn’t actually further my path to becoming a soap opera spy, in any sense. My purloined VW image didn’t actually buy me a car. It was just an early form of brand affinity, a way of performing identity and belonging.

That’s the problem, Pinterest. You’re a grownup version of dress-up, of playing cotton-candy princesses. It’s fun. Play is healthy. But when we build broadly networked aspects of our public selves based largely on these tickle-trunk identities? Especially with stuff that we’ve lifted finders-keepers-style from other people’s equally aspirational magpie nests? We may eventually find ourselves with the identity equivalent of tooth decay.

Because make no mistake: the way social media works, our Pinterest practices ARE shaping our digital identities.

Augmented Reality: The Blurring of Offline & Online Worlds
Social media’s promise is that of an augmented reality: one wherein physical and virtual combine to create a blurring between offline and online.

Most of us who use Facebook or Twitter already live in some version of this reality; our networks of friends live both inside and outside the computer.

By extension, so does our identity, and theirs: we know and understand each other via a combination of physical and digital interactions. To the friend on Facebook whom I haven’t actually seen in person since 1988, I am as much my photos and my status updates and whatever I share of my contemporary life as I am that girl who used to chew her pencils. I hope.

Social media bypassed the gatekeeping of mass media control, and enabled us to become creators as well as consumers.

Identity-wise, this was revolutionary. Instead of sharing who I was via brand or band allegiance, or some other externalized representation of myself, I could actually connect with people – with anybody, anywhere, so long as we happened upon each other’s networks – on the basis of my words and thoughts and images. On the basis of what I created.

I could be known for being me. Or an aspirational version of me. Instead of having a picture of a typewriter pinned to my corkboard, I could write, and build an audience, and gradually – slowly – come to see myself and be seen through that lens. “Writer” became part of my digital identity. And – thanks to the blurring between online and off – my so-called “real” identity too.

Anybody could do it. You could share your work – your words, your pictures, your witty-ish status updates – and engage with the work of others and in so doing build reputation and connections and complex linked networks. Axel Bruns called this produsage. George Ritzer – with a few minor variations – calls it prosumption.

Want to be a photographer? Social media offers access to photography platforms, photography learning opportunities, and photography communities. You can take pictures and share them, with your name attached. You can participate in the sites and networks where other people are sharing photography that appeals to you. If you want to become known there, you can gradually build a presence and an identity and – yes – a niche. If you keep sharing and are generous with your own work and that of others, you may never be Ansel Adams, but you’ll be – in a very genuine way – a photographer.

The Difference Between Curators and Creators
An internet of a billion aspiring photographers, of course, does tend to get clogged. The culture of scarcity which led to my criminal defacement of a geometry textbook back in my misspent youth no longer exists. Instead, we have abundance, or excess. And a need to curate.

Since blogging died the first of its over-reported deaths back in, what? 2007? and Facebook and Twitter began minimizing the centrality of creation and enabling the public sharing of other people’s content, the notion of “curation” has been getting attention. Curation, really, is what librarians and archivists and gallery owners do. It involves more than collection and sharing, in its original context. But increasingly, and with some apoplexy on the part of professional curators, it’s being taken up simply as what you do when you select and share a friend’s great picture, or a New York Times article you loved, or a pin of vintage Snoopy coffee cups.

Curation is as much a part of our digital identity practices as creation, today.

It’s what Pinterest operates on, entirely. But at the express expense of creation. If you search “I wrote this” on Pinterest, for example, you get a gallery of pins that are pretty easily digestible, at a glance, without much depth to click and explore. Commerce. Curation. Not much in the way of creation that could actually be tied to a person’s digital identity or fledgling reputation as a writer.

And that’s no huge deal, if Pinterest is just a sideline in our digital identity practices. But in fact, it extends trends already begun with Tumblr and even, increasingly, Facebook, where frictionless sharing of unidentified content stands in as the means by which we communicate with our networks.

Here’s the thing, identity-wise. If we drop the “creator” part of the equation, people of Teh Internets, we really go back to being consumers, and consumers alone. Because the type of curation Pinterest offers isn’t actually new at all; it just used to involve doing unspeakable things to geometry texts and hair salon magazines.

Style over Substance: Simulated Reality, not Augmented Reality
The things Pinterest enables us to share need to be more or less instantly visually communicable, either in the form of a picture or an image of words, preferably in minimal quantity. It’s well-suited to design and aphorisms. It’s not well-suited to complexity.

Life is complex. In this augmented world of constant engagement and digital self-promotion, it’s exponentially complex. It’s no wonder we want to go live in Pinterest’s perfect white kitchens and surround ourselves with cute pictures of polka-dots and cupcakes.

But online practices become habits. What we see shared shapes what we understand to be shareable, to be palatable.

Taken to its logical conclusion, the practices of Pinterest suggest we’ll stop writing about the stuff stuck in our teeth, or the stories of how our teeth or our selves got broken. (Schmutzie does a beautiful job of taking this apart, this creeping process of self-presentation). We’ll default increasingly to playing dressup in decontextualized, aspirational pictures of other people’s purdy teef. Like in the magazines.

Magazines have always been simulated reality. I like magazines just fine.

But you would not know me from a magazine article about me, if such a thing existed. You might recognize me from a picture, but the meeting – the moment where the physical and the digital selves converge in the same space – would be like meeting a celebrity, a cardboard cutout, not a person with whom you share a regular, intimate interaction in daily life, even if ‘only’ online.

If we trade the produsage model of augmented reality for a simple, Stepford-wife simulated reality, we undermine the premises and promises of social media: the idea that the long tail will ultimately have something for all of us. If we gradually remove ourselves from the creation portion of the creator-curator-consumer model, we’ll end up simply shuffling mass-mediated or market-driven versions of self around Teh Internets, wondering what went wrong.

Or perhaps entirely oblivious, smiling, Stepford-style.

the death of purity

This is not a post about Steve Jobs. At least not mostly.

When I started writing yesterday, I’d considered calling it “Jobs Are Dead.” Small “j” jobs, that is, not Jobs himself. But in the spate of elegies and eulogies spinning round the web today, the title just looks like a misuse of the plural verb.

Yet the two are, perhaps, related. Jobs, Apple CEO and innovator and cultural poster boy for outside-the-box-thinking, was a pretty singular dude. He deserved a lot of the reverence he inspired.

I also suspect that the claims that we will not see his like again are actually accurate.

But that is not entirely due to his own personal singularity.

Rather, I think of it as a marker that the age of singularity is over. While Jobs himself was certainly brilliant, a man with a lifelong and apparently self-sustained vocation, that very figure of manhood – the iconic hero, the exceptional genius – has actually been dying for years.

Our old model of the discrete and stable Enlightenment human, imbued with utter individuality and backed by institutions, is crumbling.

And Jobs’ work – his continual pushing of the envelope, his delivery of connectivity to non-programmers, his pretty white gadgets that revolutionized social media – arguably did as much as anyone to kick it while it was down.

We are, for better or worse, connected and collective and fractured, now, all at once.

You wouldn’t know it today. Today, we are inundated not just with the identity cult of Steve Jobs himself, but with an apologia for identity cults in general, with adulation of singularity and exceptionality. “Jobs was given up for adoption at birth,” read one heavily retweeted gem, “quit school, and STILL changed the world. What’s your excuse?”

Indeed.

Now, I want people to want to change the world. And I thought it was nice to have at least one CEO in the world who claimed the creative, wired outsiders of the world as his own, and vice versa. We all need role models.

But I think Jobs the icon and Jobs the inventor and world-changer were actually at odds, antithetical in their message and their potential impact. The Mac and the iPhone have made the world of connectivity accessible and personal and mobile. They’ve made possible the breakdown of institutions and institutional thinking. They’ve also broken down the structures that support that notion of individual exceptionality: there is no room for Great Men in the cloud. Greatness of scale, perhaps. But all are nodes in the network, all connected.

The institutional breakdown frees a lot of us who owe a debt to Jobs.

But it also opens whole other, very real cans of worms. Worms of debt, and the decay of small “j” jobs, and the kind of society we believe we live in.

Because just as Jobs is gone, so are the jobs. Particularly for the types of people his brand spoke to the most.

I see the stories every day. Richard Florida’s creative class – those of us reputedly liberated by Steve Jobs – is being hollowed out. Our most educated specialists, after years and years of study, face the reality that the academic job market they’ve trained for is, essentially, gone. Universities are caught between their old institutional structure and newly institutionalized corporate realities which make tenure look untenable.

Besides, we have new ways of gathering to share and build and learn together, like the #change11 MOOC I’m involved in, or the Stanford version with its 130,000 enrollees.

The NYT article on the Stanford Open Online Course talks about its potential to disrupt education. I’m all for disrupting education.

But. If the model succeeds – perhaps not this round but over years – what happens to Stanford in the long run?  And to universities in general? And beyond the idea of the university as a bricks-and-mortar institution, to the concept of public education and the jobs affiliated? Sure, many will find creative ways to innovate and monetize and perhaps even deliver and share free knowledge and content. I celebrate that. I’m hoping for that.

But they won’t do it by being isolated specialists in particular canons, unable to speak or understand the discourse of others. They won’t do it by having clear, pure vocations in which the lines are all tidy and what they do and don’t do remains delineated over time.

Yet we still raise and educate kids to think of success on those terms, and to have expectations that their lives can or should work that way. We lionize singular figures from our cultural mythology as purists, nobly certain of their vocation or their goal or their results-driven management style. We praise Steve Jobs for being the model of the very kind of self-made genius that his own inventions worked to undermine.

Fierce independence and inspiration – the capacity to see things differently – are the answer to change only so long as the centre holds.

Similarly, Jobs’ outsider identity and his advice to “stay hungry, stay foolish” only make sense if you assume a stable, institutional PC or IBM-style culture: a machine against which to rage.

If everybody is actually hungry and there is no stable centre, you don’t get innovation when everybody scrambles to be extraordinary. You get collapse. Or bloodshed. Suggesting we all be exceptional all by ourselves just like Steve Jobs?  Ignores the fact that even creative rogue CEOs are backed by the ultimate contemporary institution: corporate power.

I fully agree that Steve Jobs left us a legacy. But it is not to BE him.
***

Those of us who identify with the Jobs/Apple perspective on the world need to accept, to paraphrase Bruce Springsteen, that “the jobs are going, boys, and they ain’t coming back.” We need to stop calling it a “job market,” especially for creatives and academics. It’s a dead model. Our industries do not work that way anymore.

This leaves us with two problems.

The first is this: culture change and social media have blown our systems of money, status, and knowledge wide open. But most of us are still in need of some means of garnering money, status & knowledge, even when there is no institutional centre to define those things or the paths to them.

The second problem is tough for Jobs’ tribe: as the institutional centre’s Swiss cheese holes threaten the entire structure, how do we stand outside?

The notion of purity – always messier than it sounded in the mythologies – is dead. The lines between inside and outside collapse along with the edifice. To make money as an artist, one must become a designer. To make a living writing, one must write to a market, or blog product reviews. Student science conferences on polar climate change are sponsored by BP. The breakdown of boundaries and purity makes it hard not to be complicit in the very things that outsiders have tended to critique about the centre.

Even Apple – yes, beloved Apple – has led the Internet away from the open sharing of the web and towards semi-closed, more profit-modelled apps. Like so many social media shifts, the effects of this have a lot to do with tying capitalism closer and closer to average people’s daily practices. Jobs didn’t talk about that overtly: it didn’t fit the anti-corporate-corporation stance Apple managed so successfully as a brand. If there were ever true purists, they were gone long before he came on the scene.

So if we want to honour Jobs, we do so not by buying the myth of the pure, individualist outsider genius. We do it by using the connectivity Apple was part of enabling.

We are in it together, in this changing economic and environmental and educational climate. Social media enables the possibility of collective knowledge, of distributed action, of working together on a scale never before possible. Maybe we can figure out how to innovate together, and create functional systems that allow for money and meaningful work and some kind of liveable, post-institutional world. Who knows? Maybe.

But we won’t do it by standing alone, trying to be geniuses.

Triumph of the Nerds, he called Apple’s success, once. It’s been clear to the industrial sector for years that the old era’s gone. We nerds have been slower to notice, busy thinking we were on the outside and waiting for our ascendancy in the Brave New World where the creative classes would shine, and our ships would all come in.

I think the ships have sailed, but here we are. The centre does not hold. Yet in this mass of connected people is more knowledge and talent and drive – all mixed in, impure-like, with ambition and complicity and mutual reliance – than even Steve Jobs could have wrapped his visionary head around. If we can only give up on the idea that we need singular geniuses to figure out how to use it.

Now THAT would be a real Triumph of the Nerds.


education, learning, and technology – #change11

One of the most interesting things I’m doing this year – learning-wise, research-wise, and community-wise – is the Change MOOC.

(For those of you not already signed on for this adventure, dimestore recap: a MOOC is a Massive Open Online Course. It’s free. Anyone can register and participate. There are set topics, assignments, and timelines, but you do what you want, via blog or FB or the central discussion threads: in terms of both your contributions and the platforms you use to share them, it’s entirely your choice. There are no gold stars – as yet – or credentials for completion, and no invalid forms of participation so long as you’re respectful of others. It’s a chance, basically, to be part of some coordinated conversations about learning, or whatever the topic, and to make some connections amidst the morass of people IN that conversation. A MOOC usually has participants from around the globe, at least within the English-speaking world. There are close to 2000 people registered for this one, I think.)

MOOCs are catching on these days: Stanford is even offering one. What makes the Change MOOC particularly intriguing is that while Stephen Downes, George Siemens, & Dave Cormier (yep, that Dave) – the godfathers of MOOC – will manage this course, they’re not doing the lion’s share of the facilitation this time.

The course is 36 weeks long. Each week has its own focus, under the overarching umbrella theme of “Change: Education, Learning, and Technology,” or how being connected changes learning. Which, as I see it, is a key site of contemporary cultural shift whether you come at it from the perspective of an educator, a geek, or simply a connected person interested in understanding social media practices more explicitly. Each week is facilitated – readings suggested, an online discussion session hosted – by a researcher or innovator or leader in the particular area being explored: this week, it so happens, is the theme of Digital Scholarship, led by Open University academic and author Martin Weller. Participants are encouraged to “write themselves into the course” by responding to topics, themes and assignments in whatever way they wish. In MOOCs, participants’ input often drives the trajectory of discussion and interaction as much as or more than the facilitator does. It’s networked, distributed learning.

I’ll be facilitating a week on social media identity come spring. I get to take my fledgling research into a classroom-ish setting and explore it with and through the participation of others, many of whom likely will have social media identities to bring to the table. (I’m the second-last week, mind you, so the 2000+ may either have dwindled to 12 by then or blown up to even more gargantuan proportions. We’ll see.)

But even better than getting to teach, I get to participate in the whole shebang. Student, faculty, and researcher all in one. Most of the 36 facilitators are also participants and researchers. Identity-wise, this levelling of hierarchical role separations obviously interests me. But so does the rest of the content.

This week’s Digital Scholarship discussion is particularly interesting. What does the capacity to share ideas outside traditional academic channels mean for scholarship in the 21st century? What will the impact of it be? It’s the impact piece – and the implications for traditional practices – that intrigues me. How do connectivity and the capacity for digital sociality suggest transformations in academia?

One of the readings Martin suggested for the week was the JISC (UK) report (2009) on the Lives and Technologies of Early Career Researchers: as an early career researcher, just starting on my second year of Ph.D. studies, I’ve been conducting my own informal experiment over the past year into the subject.

Am I getting more input and feedback into my research and learning via traditional academic channels, or online?

It’s an unfair question, in a sense, because of scale. I’m in a tiny program, the first Ph.D. in Education in this province. We are a cohort of three, with three additional students starting up this month. While there are a few broad overlaps in subject area, my peers and I share very little in the way of common focus, experience, practice, or expectation. So the level of face-to-face peer input into my research thus far – lovely and supportive though my colleagues are – has been seriously minimal. Tiny. Whereas from the time I started this blog back in January, the combination of my large-ish online community and my research interest in online identities and practices has made a wealth of sharing and input and feedback available to me, in spite of the fact that the majority of that online community are not in any way academics. They are, however, engaged in the culture of blogging and social media, which encourages reciprocity. Academia is generally still disposed and structured to be wary of reciprocity: it comes too close to plagiarism and treads on cherished Enlightenment notions of individual intellectual enterprise.

However, if I only had an online community of three, I might not have had the same experience. But it is very very hard to have an online community of three. The scale of connectivity in digital spaces and the potential for productive sharing, collaboration, and congruence therein is one of the biggest arguments, in my mind, for digital scholarship. Or at least, the digital engagement of scholars. Which usually ends up being digital scholarship, because people engage on topics that interest them.

As Martin Weller points out, though, there’s a conflict here. Research, in most traditional academic conceptions, relies on concepts of control, even where replicability is not required. New technologies are, from an institutional practice point of view, about letting go of control: giving it up to the crowd. And if academia lets go of its controls, of course, how does it validate knowledge? How does it verify and justify its own structures and practices? Yes, connectivity distributes research ideas far more quickly and broadly than traditional journals. At the same time, yes, crowdsourcing is (perhaps) a more vulnerable system of verification than peer review, Nature magazine’s 2006 experiment notwithstanding.

(Sidenote: have been reading Deleuze on Foucault lately. Foucault spoke about the institutional structures of the 19th century as the structures of a disciplinary society, juxtaposed against what he, Burroughs, & Deleuze called societies of control, in which continuous modulation of behaviours shaped by business principles along digital (or spectrum-based, non-analogical) models occurs. Control devolves from confinement by institution to a self-colonizing practice taken up by the crowd, by the individuals within. I think this has – ahem – some resonance for those of us interested in higher education and MOOCs; I’ll explore it soon, in another post.)

The biggest best thing about the MOOC is that it’s a semi-structured opportunity to teach yourself to BE a digital scholar, whether in or out of the academy. To select what’s relevant from the stream. To curate. To share. To work iteratively, publishing ideas that – like this post – aren’t all you could say on the subject, but are at least a start. And letting some stuff go so as not to be entirely colonized, perhaps.

All are welcome, and there are still 34 mind-boggling-packed weeks ahead. Not too late. Think on it. Change MOOC.

I’ll let Dave explain how simple it is, courtesy of our SSHRC research project last year.

welcome to the patriarchy, love mom

In the end, after all the days of buildup and song practice and excitement and charging the camera battery, I missed it.

Josephine was in her upholstered seat doing a potty dance of Saturday Night Fever proportions, so I ran her up the aisle to the bathroom. He got called up first.

And so it was his father who captured Oscar walking across his first stage with his first diploma, his “graduation” out of preschool and into the formal school system of kindergarten. Fitting, perhaps.

Felicitations, says the scroll, in fancy letters. I unrolled it and smiled, at the formality of his name printed across the page.

I cringed too. There he goes.
***

Madeleine Grumet’s 1988 opus Bitter Milk: Women & Teaching says that schooling serves as the delivery of children to the patriarchy.

I picked up the book a couple of weeks back, just as the kids’ preschool year was coming to a close. As an educator, and a student before that, and now a student yet again – someone who has been wrapped up in some form of the system for 35 years – OUCH.

And yet I nodded even as I flinched, reading the words. Grumet put her finger on the piece of this societal project of education that I’ve never been able to quite name, nor shake.

School is the foundation of much of what many of us IN the system want to see change, in schools and in society.

Schooling is powerfully self-replicating, making almost all of us complicit in its protection of its own practices.

Everyone has an opinion about school. Most of us have critiques of school, and schools, and schooling. But no matter the critiques and the shifts – whole language through critical pedagogy to ed reform and a call for standardization – no matter the politics and policies and the thousands of good intentions and spirited efforts and debates, schools march on, surprisingly same from decade to decade. Especially from the vantage point of a six or seven year old kid.

Think about it. The world of kids in the 1950s was very different from the world of kids today. According to our cultural myths, at least, they had mothers at home, were sent outside to play, and had apparent run of their neighbourhoods. They’d never seen a carseat or a DS and most would have had more ashtrays in their homes than screens. Some would have never seen a television. They had fewer toys and books, and from the age of four or five, they were expected to entertain themselves in groups for extended hours of the day. If they’d gone with their fathers to work, which would have been seldom as fathers were not expected to be involved or engaged parents, they’d have encountered masculine, hierarchical environments where people performed discrete tasks.

Less supervision and less attention to their interests, feelings, and desires were simply the norms of the day. They were expected to behave and interact differently from children in this generation.

But when those children of the 1950s went to school, they would have encountered expectations very similar to those Oscar will encounter in September. Admittedly, the disciplinary shift is vast. Oscar’s cohort will not expect to be rapped on the knuckles if they breach the rules of the classroom. But those rules and the subjects they create – subjects who sit, raise their hands, complete discrete tasks independently, and participate in various overt and subtle hierarchies of skill and tribe and class – are remarkably similar.

In spite of the fact that those rules and skills no longer even make for an advantage in the post-1950s job market. What the educational system seems to do best is reproduce itself, getting further and further from cultural value all the time.

We send them into the school system, most of us, with great hopes. Learning. Education. Talisman words. They promise development of our children’s potential, inculcation into the mysteries of consciousness. The lure of the Tree of Knowledge.

What they get – what we all get – is something…other…than that. We get people who learn their place in our culture. In the – however much I flinch at the word – patriarchy, with its implicit hierarchy of gendered behaviours and classed behaviours and racialized behaviours, even as we in our schools and culture pay lip service to inclusion and acceptance and celebration of difference.

That, in the end, is the worldview of the mothers, of the feminized voices within society.

Grumet’s premise is that it is schools – and female primary and elementary teachers, for the most part – who serve to reinforce the nature/culture binary that privileges masculinized “cultured” behaviours over the intersubjectivity of mothers and children, the living with-and-through-another that marks most humans’ early days.

In school, we learn to give over curiosity to passive acceptance, rewarded by praise. We learn to “other” other people, by grades and behavioural sanctions and the message that classrooms as we understand them cannot seem to fail to impart: Some of You Are Doing It Wrong. Some of You are Not Worthy. Some Animals are More Equal Than Others.

It naturalizes the separation of subject and object, of us and them, me and you. It works because it buries its own traces, creating subjects who believe it is simply the way of the world to stand apart, against intersubjectivity and the interwoven world of shared interests. Schools function symbolically, guiding us to adulthood and away not just from the literal worlds of our mothers but from the symbiosis these worlds and their mutual dependence represent.

“Contradicting the inferential nature of paternity, the paternal project of curriculum is to claim the child, to teach him or her to master the language, the rules, the games and the names of the fathers. Contradicting the symbiotic nature of maternity, the maternal project of curriculum is to relinquish the child so that both mother and child can become more independent of each other.” (Grumet, 1988, p. 21)

In other words, schooling creates subjects who internalize the subject/object divide that reinforces patriarchy and so-called culture through the knowledge acquisition and gendering processes that schools and teachers are constructed to see as natural. And parents, products and subjects of the same system, go along, delivering our children to the same inequitable and flawed system even as we gripe collectively about its flaws and failures.

I’ve watched it start this year, in Oscar, as he moved to five days a week in a preschool physically attached to the school he’ll attend next year. He learned excellent French. He also learned a lot about what boys should do. About being shy to be wrong. And about colouring in the lines and thinking skies need to be blue. He learned you can’t talk all the way through Circle Time. And he learned how to court the powerful, how to curry favour in a pecking order and how to spot difference that makes others vulnerable. There was something violent about it all. And yet familiar, utterly familiar.

But that is only because I went through the same process myself. As did you, probably. And so the system goes, self-replicating because we don’t know anything else.

So here I am: mother, educator, student of educational theory. And I have the mother of all dilemmas on my hands.

It’s good, part of me says, to know all these things that school teaches. Not the information ones. The social relations. The power rules. Certainly, we expect O not to talk when others are talking here at home, and it’s useful to know how to handle yourself in a hierarchy.

But. But.

Learning these things makes you subject to them, no matter which end you come out on. I learned all that crap so well that it’s taken me years to begin to unpack it, to live without waiting for a grade, for an external deadline, for a sense of how I measure up against others. I do not want this for my children. I do not want them to be like me.

And so another part of me sits watching this march of normativity start up, and blows smoke at the spectacle and asks Really? All these years of trying to critique the system from within, and you’re going to go ahead and subject your own child and children to the whole shebang?

Really? Can you not come up with an alternative solution?

And when I look that voice in the eye, I am ashamed.
***

We could homeschool, I suppose, or preferably, unschool. I think unschooling is probably – if not necessarily overtly – about trying to uncouple the patriarchy from the educational process. But I am both a product and a purveyor of education in its traditional forms. I have been – gently, maybe, but nonetheless – delivering other people’s children to the patriarchy for years.

A part of what I know how to do, professionally, is a form of serving at the pleasure of the patriarchy itself. I am complicit.

Could I be otherwise? Do I want to be? That, Hamlet, is the question.

I do not believe learning is inherently a patriarchal process, even if the notion of the Tree of Knowledge might be. We do lots of critiques in my classrooms, just as we do in our house. My children, like my students, will inevitably be exposed to the idea that the world and its power relations are constructed, not natural. But could I go further than that, if I were willing and able to carve out the space in this next few years to try to educate my kids myself?

Or would I inevitably replicate what I know, what’s been done to me in the name of learning and becoming “educated”?

Part of me suspects I would. And I wonder if I wouldn’t rather have a nice gentle primary teacher do that to them than do it myself.

Part of me prefers my autonomous life, my space. I was no idealized mother, when my children were infants. I work from home, now, but alone. Part of me fears that I do not know how to function without the patriarchal separation of the domestic and the professional, no matter how specious and unnecessary it may be.

In the end, I suspect that I will deliver my children over to some version of a 1950s classroom. Anything else would shock me. And I assume there will be good in it, and bad, just as there was for most of us.

Yet, sitting here thinking about tiny diplomas and the patriarchy and the world I’d like to live in, I recognize that schooling is a choice.

And I marvel and cringe at the power of a system that makes it so difficult for even those of us most deeply embedded in and privileged by its operations to see other options. Patriarchy for the win, indeed.
***

Do you think Grumet’s assessment is fair?

Those of you who have homeschooled, or unschooled…what was it like? What are its strengths and weaknesses, in practice? Do you end up replicating what you know?

And…what role do you think educational technologies could play in shifting some of the power relations involved in children’s learning? Do the peer-to-peer capacities and real audiences of social media offer any real challenge to the traditional practices of hierarchy in education?


a genealogy of digital identities

Grope. Stumble. Circle around.

I’m fumbling my way towards the methods & methodology choices that will guide my digital identity research. This week, for the first time, my blurry paths collided hard with current events in the world and the social media sphere.

Tom MacMaster, the hoax blogger behind A Gay Girl in Damascus, has personally altered the direction of my dissertation’s methods section.

(Okay, well, him and Twitter. And the mainstream media attention his blog garnered even before he claimed Amina had been kidnapped. And the Orientalism and colonialism and exoticism that still inform how we in the West attend to narratives from the Other, seeing as I doubt somehow that it was a total coincidence that the single identity most Westerners could name from the whole Syrian uprising this spring turns out to be that of…a Westerner.)

I struggle with formalist categories like method. I recognize that they are, in a sense, intended to make things clearer, to parse the broad territory of social science and research and the multitudes therein. For someone like me, more inclined to gradations and overlaps than clear divisions, they confuse. I hover on the borders and boundaries, a millipede with feet in so many camps that headings like “Research Objectives” and “Data” make me feel hopelessly messy, mired in no-man’s-land.

This isn’t a bad thing, only a disorienting one. My work doesn’t fit tidily within the bounds of education alone, or of cyborg anthropology or any other discipline or corner. The straddling that I need to do between discourses and approaches and worldviews helps me unpack methods and methodologies and epistemologies, forces me to continually apply theory to theory in a roundabout kaleidoscope. Patti Lather’s work, which explores validity structures transgressive to traditional scientific methodologies and includes comforting titles like Getting Lost (2007), helps me feel better about the kaleidoscope. My goal, after all, is situated knowledge, rhizomatic knowledge with multiple openings. No one tidy method will ever take me on that kind of exploration.

Every journey has first steps. The two methods I’ve embarked on thus far are themselves straddlers, each bridging the blurry boundaries between methodology and method. One is the material-semiotic method that marks Actor-Network theory and the work of Latour and Haraway and Karen Barad. The other is Foucault’s genealogy.

It is my understanding of the genealogy of digital identity that I’m going to have to revisit after this week.

Just a few days back, somebody asked the question that inevitably comes up whenever I mention genealogy and social media in the same breath: “How could there be social media subjectivity before social media?”

Sure, the platforms I’m working with date only from 2005 or so. But the shifts in the forms of identity performance privileged during that timespan have still been pretty heady. And digital identity scholarship was huge in the 90s. Haraway’s cyborg metaphor, which informs my own concept of social media subjectivity, is from 1985. The narrative forms and subjectivities that the blogosphere made into mass communications could be argued to have their origins in Montaigne. This rhizome has far older roots than appear on the surface.

Genealogy as a philosophical method isn’t much different from genealogy as your great-aunt Louise’s favourite hobby: it’s an historically-focused endeavour that operates on the assumption that our present understandings – of self, of our place in the world, of anything – have precedents and ancestors.

In genealogy, delving into the questions of what or who these ancestors might have been and how they operated is an almost-never-ending, always-partial process of unpacking and tracing and exploring, aimed at re-presenting the present in a broader, more complex, and perhaps counter-intuitive light. Knowing you are a descendant of Marie Antoinette, even whilst you traipse the aisles of Walmart, may imbue you with a sense of grandeur, tragedy, entitlement, or irony, depending on your perspective.

Knowing the ancestors of our notions of who we are when we’re online, when we write ourselves into being, when we engage with each other through identities with visible metrics? I don’t know whether that will imbue us with any grandeur – I’m aiming more for irony – but I hope it will help situate the implications of social media subjectivities within stories and discourses more familiar to higher education, so I can then consider the overlaps and challenges facing academia in the near future.

But. But. One of the historical notions I believed I could refer to and then politely consign to the out-of-date heap came roaring back into play this week, with the furor over the Amina hoax.

The purportedly half-American half-Syrian lesbian passing herself off in interviews with The Guardian (the big one, not the local PEI paper) as “the ultimate outsider” is, of course, actually MacMaster, a white male Master’s student living in Edinburgh.

What that says about white male fantasies of outsider status, the one thing privilege cannot offer, fascinates and entertains me. And affects my perspective on digital identity, because it revives a trope I thought I’d watched die.

In the 1990s, there was a lot of scholarly interest and attention paid to the idea of digital identity. Sherry Turkle and Neil Postman and a whole host of people did fascinating, exploratory work on the emerging digital culture and ideologies of technology and identity and the body in virtual worlds. One of the recurring themes in much of that work emphasized virtual identity and the possibilities of pseudonymous identity performance enabled by computers.

My favourite of these is the story of “Julie” from Allucquere Rosanne Stone’s The War of Desire & Technology at the Close of the Mechanical Age (1995). Julie was the extraordinarily successful and popular female persona of a male psychiatrist in an early CompuServe chatroom. Like Amina’s, Julie’s was a marginalized female persona performed by a mainstream male: Julie claimed to be disfigured and disabled. In a narrative arc rather similar to that of Amina, Julie was ultimately outed by her own excess: while she claimed that her disability left her unable to interact offline with her chatroom community, she wove an increasingly complex narrative of offline antics. The stories created suspicion, and her embodied identity as Sanford Lewin was revealed. The gap between Lewin’s assigned identity and his virtual performance as Julie represented one of the major themes of digital identity scholarship in the ’90s: the possibility of being someone else online.

I thought this particular piece of digital identity ancestry had been rendered largely historical. When I began blogging in 2006, many of the bloggers I read – especially those who wrote about parenting and children – were still pseudonymous. Gradually, that shifted: the digital sociality that emerged out of that blogosphere community is an augmented reality, wherein people regularly meet in person and connect with each other across platforms, including Facebook, which tends to privilege and push towards disclosure of so-called “real” identity. Beyond that, the incursion of capital and sponsorship and the discourse of monetization all emphasized coming out as “oneself,” because a blogger named WineyMommy (names have been changed to protect the innocent) is arguably less likely to get picked up as a writer for the Huffington Post, say. Even if that only pays in reputation and opportunity.

My genealogy, though, will obviously need to consider how speaking the dominant discourse of power impacts reputation and opportunity, even for those purporting to be marginalized voices. It’ll need to reconsider whether even in the neoliberal “Me, Inc” augmented reality of social media, there’s room for performances of subjectivity that don’t match a person’s assigned gender or cultural identity.

Genealogy, as I understand it, is about who can speak, and for whom, and to whom. Grope. Stumble. Circle back on myself and revisit. Thanks, Amina, for complexifying things. I’d hate for my methods section to get, uh, dull.
***

Have you ever had a pseudonymous identity online? If no, why not? If yes, to what extent did this persona line up with your own assigned identity?

Are you the same you across platforms (blogging, Twitter, FB, etc)? What factors affect your decisions about how to present yourself in social media spaces?