Tower of Song

For my birthday back in January, Dave bought me a Leonard Cohen biography.

I opened it and laughed and laughed.

He bought it because Cohen is still alive. David Bowie – who for more than thirty years was my imaginary boyfriend and the person I wanted to be when I grew up – is dead. And I woke up the morning of January 10th and realized:

a) me & Bowie are never gonna have that conversation about identity. dammit.
b) life is short.

But I turned 44 on a paid-for plane trip back from London, so not all was ashes. I gave talks at the LSE and the Tower of London and I made a side pilgrimage to the street where the Ziggy Stardust album cover was shot the week I was born, and I woke up in my own bed the morning after I got back and there was Leonard Cohen waiting for me in the only form he or Bowie will ever be waiting for me, and I opened up the bio and the first lines I saw were Cohen’s poem that begins:

Marita, please find me.
I am almost thirty.

And then I laughed some more and it was only faintly hysterical.
***

But my point was about identity.

Given the timing of the London trip, I turned talks about academic Twitter and orality and literacy into a Bowie tribute of a sort.

Bowie songs made up the titles of half my digital identity posts back when I started this blog. I firmly believe thirty years of watching him navigate time and selves prepped me well to live in a world of hypervisibility and monetized identity and blatant performativity.

Or, you know, social media.

But I’d never fully mapped it out. So before I got into Twitter’s collapsed publics and what Meyer (2015) calls the “smoosh” of orality & literacy, I laid out – with images and lyrics and some core points – the idea that networked identity is very much a Bowie approach to identity.

Fluidity, fragmentation, the vision and chutzpah to stand just on the edge of rising trends and embody them for audiences…notable qualities of successful networked identities.

But for scholars in particular, the core of the Bowie approach is a distinction between role and identity.

It used to be that the personal/professional axis generally divided lives into separate domains, at least where the possessors of said lives took on paid roles outside the domestic realm. In most fields – and certainly in academia – such paid roles tended to be stations within articulated, often hierarchical systems. Or organizations. Or institutions. A person – often a dude – showed up for his or her job and fulfilled clearly-delineated responsibilities that were parallel to those of other people working in similar roles, until such point as a higher-up determined he or she was worthy of another role.

Whatever kind of special snowflake this person secretly imagined him or herself to be had to be enacted off the job, at home, in the personal domain.

Under the sway of this broad societal norm, only those rare animals who became celebrities of some sort or another made their individual identities – their distinctions rather than their interchangeability – the core of their professional lives.

(Enter Bowie. But Bowie was not just any celebrity).

Gossip rags vouch for the fact that a great many people catapulted to fame discover not only that the collapse of personal/professional identity results in a parallel and alarming collapse of privacy, with the public working self taking over 24/7…but also that the public working self quickly becomes a stale trap of typecasting, minimizing their creativity in exchange for a single hypervisible public image they tend to be pilloried for departing from. Alas, poor Fat Elvis.

Bowie may not have been the only celebrity to manage to change successfully with the times AND maintain – some of the 70s aside – a modicum of personal privacy regardless of public scrutiny and visibility…but he did a smashing, savvy job of it, for the most part. As he did a smashing job of messing openly with constraints of gender and sexuality.

Selves were things Bowie picked up and put down, serially, while managing to create an overarching identity as someone so utterly distinct that people repeatedly referred to him as “otherworldly.”

Bowie’s job wasn’t really to be a rock star, like all the other rock stars in the constellation. Bowie’s job was to be Bowie.


***
So what does this have to do with academic Twitter?

Our academic institutions are still built on roles. Tenured roles – or permanent academic roles, for those outside North America – are an endangered species, but the hierarchical, institutional model still conceptualizes labour along roles’ interchangeable, impersonal terms.

My research into academic Twitter over the last couple of years was pretty emphatic that Twitter enables actively participating (or resident) scholars to operate beyond what Boyer (1990) would have called the “hierarchy of functions” of scholarship. Twitter doesn’t just situate users within the realm of networked scholarship…it can enhance their sense of community and engagement in their work in general.

And it can open up paths to the development, performance, and circulation of scholarly identities…even without roles. People who may not have institutional roles or academic jobs or status in the hierarchy can become known for their work and their ideas, via Twitter and broader networks of participation.

This changes things. Junior scholars and grad students and contingent academics can create forms of visibility and legitimacy within the blurred space between media and institutional scholarship that do not match their institutional status, or lack thereof.

But it changes more than who gets to join in some aspects of the academic conversation. It changes how.

When, on your campus, you need someone who does things that seem “digital,” you can look for somebody who has that word (or some other word in permanent danger of becoming imminently outdated…I’m looking at you, elearning) in their title…or you can look for the person who for inexplicable reasons seems to do that stuff. Sometimes it’s the same person, but sometimes it’s better if it’s not.

Because humans are funny. If we approach someone on the basis of their title or role, we tend to approach them within the boundaries of the institutional hierarchy. I suspect many folks in higher ed still secretly associate all things digital with the 1996-era title “webmaster”…which meant you had a seat at neither the faculty nor the admin table.

But if somebody is approached based on an interest in that person’s differentiated, visible, searchable expertise and identity…the conversation changes. The hierarchy is a little bit undermined. The conversation tends to depart far more quickly from what’s happened before to what *might* actually be possible. New things are more likely to emerge.

This isn’t rocket science. It’s just a networked approach to identity and interaction.

***

There’s a catch, though.

Carving out space as a career individual within a society primarily marked by institutionalized roles is one thing. Bowie had uncanny timing and instincts in this regard.

Carving out space as a career individual within a society – and particularly within a sector – primarily marked by the collapse of institutionalized roles is another thing entirely.

We can all be as Bowie as we can muster in the connective tissue of our networks. But those do not – at least for scholars, nor for musicians in the way they once did – an industry make.

A ton of us live here, straddling this strange gap between academic roles that don’t – and may never – exist for us, and academic-ish identities that we use to contribute in the ways we can, whether to institutions or just to the broader conversation.

Paying our rent every day in the Tower of Song, as it were.

Maybe Leonard Cohen, who found himself swindled out of everything in his seventies and hustled his way back on stage, touring til nearly eighty in his sharp gangster suit, is who I oughtta plan to be when I grow up.

Somebody call me when we figure out alternatives?

What Your New Year’s Facebook Posts Really Mean

So I did that Facebook “Year in Review” thing a week or two ago even though I’m moderately sure it serves up some extra layer of data-mining capacity on a platter to Zuckerberg’s new personalized learning minions. Encapsulated in ten photos, my reductive 2015 in review looked…nice.

Really nice. A lot of travel, a lot of family time, a Ph.D earned, a conversation on Twitter with David Bowie’s son. Some excessive (expletive deleted) snow, but otherwise nice.

It left the rejected papers out. The time my son wore the same socks for four days. My posts about alcohol and fascism and friends leaving town all stayed conveniently out of the frame, presumably because Facebook knows these are not the prettiest things upon which to reflect fulsomely at the close of the year. Or perhaps Facebook only *knows* that because nobody much liked those posts.

All in all, it made me appear more or less like an amalgam of the identities I aspire to. Yeh, yeh.

You already knew that about Facebook.

But I think there’s more going on there. Today, on New Year’s Eve, my Facebook feed is a radiant orgy of Auld Lang Syne recollecting the year gone by in (mostly) tranquility and (mostly) appreciation, with a smattering of don’t-let-the-door-hit-you-on-the-way-out, depending on what kind of a year people had & also where they self-identify and perform on the emo-to-chirpy spectrum. It is also, increasingly, a site of exhortations to do better as a society in 2016, a space for calling out the broken social contracts and structural underpinnings that differentiate individuals’ life chances so drastically even in some of the wealthiest countries in the world.

It occurred to me this morning that a thousand years hence, should archaeologists or aliens dig up the remnants of bourgeois North American “civilization,” such as it is, they will be sorely challenged to understand a damn thing about who we were and how we lived without our Facebook feeds.

If we cared about the future, people, we’d be chiseling this stuff into stone.
***

I got a book for Christmas – thanks Santa Dave! – called A Colorful History of Popular Delusions. Like all good gifts for fledgling academics, it has me thinking about work, even while I appear to be lolling in sloth over the holidays.

The book is a cultural history – without excessive depth, but this is not a peer review – of mass phenomena that overtake pockets of society at various intervals: fads, crazes, urban legends, mass hysterias. It details examples of each of these phenomena, from the tulip craze in Holland through the Salem witch trials and McCarthyism, and some of the contributing cultural factors that generated them.

Two things strike me:

  1. We, as humans, are profoundly adaptable – we have, historically, in matters of weeks and even days, on occasion adjusted the norms and compasses of our societies – in ways that seem almost unimaginable later on – in response to triggers that prey upon particularly powerful cultural fears, aspirations, or repressions.
  2. We, as cultures, are profoundly vulnerable to the narratives that we circulate and enact as members of our societies, particularly surrounding fears, aspirations, and repressions.

What does this have to do with Facebook?

Facebook – and more broadly, social media in general…but Facebook remains for the moment the space of the widest participation across demographics even while targeting ads designed to keep people IN their existing demographics – is the stage upon which the battle over dominant cultural narratives is played out.

Social media is where we are deciding who we are, not just as individual digital identities but AS A PEOPLE, A SOCIETY. Or perhaps, as we haven’t quite acknowledged yet, as almost separate societies within the same geopolitical entities, subject to laws and policies that have differential effects on different bodies and identities. Day-to-day, social media is the battleground for the stories we live by. It is the space where our cultural fears, aspirations, and repressions circulate.

Previously, at least as my book loosely outlines it, these narratives tended to be nursed and cultivated through a combination of institutional and moral edicts, generally protecting whatever the status quo was except in times of upheaval wherein individual voices – or, occasionally, intentional power gambits – destabilized those normative belief systems and identities and galvanized new ones around them, even if only for a brief window of time.

I’m not naive enough to think this means we’re free from our institutions, the media perhaps the most outsized and dangerously powerful among them in terms of narrative capacity, but as any of us who have had any level of professional media exposure via social media participation can attest, even the media now draw their sense of the tenor of things from social media, even if they insist on repackaging it in binaries in the process.

This is why hashtag activism matters, and why social media visibility is risky and why posting about mass shootings draws out your weird uncle (who otherwise never acknowledges anything you say) in full Gandalf “YOU SHALL NOT PASS” mode, even if Gandalf wouldn’t approve of his from-my-cold-dead-hands politics.

Facebook and the rest of social media are our day-to-day archive of who we are trying to become.

These are our times and they are fraught and sometimes ugly and we move too fast from fad to fad and whiplash to whiplash in the outrage generator that social media creates, absolutely.

Still, I watch people get a little bit more media literate all the time, make the wizards behind the curtain a little more visible, push back against witch hunts in ways that I’m not sure were possible in closed and isolated societies like 17th century small-town Massachusetts.

Sometimes I have hope that maybe this isn’t all just a one-way sinkhole. Sometimes.
***

Which brings us back to the New Years posts. We live lives of inexorable and relentless change, amplified by the bucket lists and planned obsolescences and precarities and excesses of the kinds of lives Facebook seems designed to reflect. A lot can happen in a year of living one’s Best Life (TM), after all, and if one fails to reflect on it all with sufficient attention, one is committing the ultimate sin of those aiming for Best Lives. My thoughts on the pressure to live our Best Lives are not pretty.

But when I see our collective New Years wishes and reflections and updates and hopes less in the vein of the “yay me” holiday update of wonderfulness and more in the spirit of a mass ongoing narrative conflict in which we try to influence our peers’ understandings of what has meaning and value, of what our repressions are and what our fears and aspirations *should* be…I’m less cynical.

Bring on the New Years posts and wishes and wrap-ups. Maybe these little outpourings help us focus on bits of hope as we cross into a new turn around the sun, bring collegiality to spaces and identities that are often fraught. Even if the aliens and archaeologists never see it all, maybe it makes a difference to the rest of what they dig up someday.

Happy New Year, friends. :)

inequality & networks: the sociocultural implications for higher ed

Next week is #dLRN15 at Stanford. Months of planning and debating and collaborating (and panicking!) all come together to launch an inaugural conference/conversation on Making Sense of Higher Education: Networks & Change.

It’s all Panic At The Disco around here these days, people.
***

There are some serious high hopes embedded and embodied in #dLRN15. Not just for a successful event – though a successful event is a joy forever, as the poets say. Or, erm, something like that. But success is a complex thing, and hopes go beyond the event.

#dLRN15 is grounded in the kind of quiet hopes most of us in higher ed these days don’t talk about all that much: the hopes that things can actually get better. The hopes that research can be conducted and communicated in such a way as to shape the direction of change. The hopes for a future for the spirit of public education, in a time when much in higher ed seems to have been unbundled or disrupted or had its goalposts moved.

Those kinds of hopes are waaaaay too big to lay on the shoulders of any single event or single collection of people…but still, we got hopes, and they underpin the conversations we’re hoping to start through this small, first-time conference next week. We have the privilege of bringing together powerful thinkers like Adeline Koh and Marcia Devlin and Mike Caulfield as keynotes, plus systems-level folks and established researchers and students and grad students and people from all sorts of status positions within higher ed, all thinking about the intersection of networked practices and learning with the institutional structures of higher ed.

However, there’s one strand of conversation, one hope, in the mix at #dLRN15 that I’m particularly attached to. It’s the Sociocultural Implications of Networks and Change in Higher Ed conference theme, and particularly the opening plenary panel of the conference, on Inequities & Networks: The Sociocultural Implications for Higher Ed. I’m chairing, and the ever-thoughtful George Station, Djenana Jalovcic, and Marcia Devlin have agreed to lead the conversation from the stage.

But we need you.

No plenary panel is an island…and while all of us contributing have our own deep ties to this topic, our role is only to start the conversation. Help us make it wider and take it further. Whether you’ll be there or not, your thoughts and input are welcome on the #dLRN15 hashtag or on our Slack channel, or here in the comments. Throw in.

To me, this is the strand that gets at the heart of what education is for, and who it includes, and how, in a time of massive stress: is the digital helping widen participation and equality? Is it hindering?

If the answer to both is “yes,” WHAT NOW?
***

The aim of the panel is to explore how intersectional issues – race, gender, class, ability, even academic status – in higher education are amplified and complexified by digital technologies and networked participation. While digital higher education initiatives are often framed for the media in emancipatory terms, what effects does the changing landscape of higher education actually have on learners whose identities are marked by race/gender/class and other factors within their societies?

We’ll be sharing and unpacking some of the places we get stuck when we think about this in the context of our work as educators and researchers.

What effects do you see digital networks having on inequalities in higher ed? What sociocultural implications do networked practices hold for institutional practices? What are universities’ responsibilities to students who live and learn in hybrid online/offline contexts?

Please. Add your voices, so that this panel becomes more a node in a networked conversation than a one-off to itself. That in itself would pretty much make #dLRN15 a success, in my mind. :)

the morning after we all became social media gurus

One morning, all my friends woke up as experts.

Or rather, thanks to years of what academia had mostly framed as the gauche and wasteful habit of talking excessively to people who lived inside our computers and iPads, many of us whose social and work lives had merged somewhere in the ether of that Third Place/Space woke up with workshops to give, because…academic service. When what was gauche and time-wasting yesterday is The New Black today, it’s handy to have a vanguard of self-taught experts to teach everybody else how to play along.

But what are all these workshops doing, in the context of the academy? Mark Carrigan posed the question of social media as fashion or fad on Twitter this morning. I retweeted his post. We ended up in a conversation that eventually included another three or four colleagues, from a few different countries. THIS is how social media actually works for me, when it works.

These excerpts carry the gist of the conversation better than I could encapsulate it. They also raise questions that I think all of us passing as social media gurus – however unwillingly – in the academy need to grapple with, and soon.

[Tweet screenshots: excerpts from the Twitter conversation]

  • Are the workshops helping…or just making people feel pressured to Do Another Thing in a profession currently swamped by exhortations to do, show, and justify?

[Tweet screenshot]

  • Does the pressure over-emphasize the actual power of social media and encourage people to dig in against it as some kind of new regime, without necessarily having the experiential knowledge to judge whether it could have any value for them?
[Tweet screenshots]
  • How SHOULD we count digital and networked scholarship within the academy? Should we count it at all?
***
FWIW, I think we should, but I’m very wary of how. And so I wonder what happens the morning after we all wake up as experts, so to speak.

I feel like I’ve been here before. Yesterday afternoon, somebody tweeted an old post I wrote four years ago, back when I’d had a personal blog for years and was trying to understand the shift I was seeing in the economy of social media, from relational to market.

[Screenshot: pull quote from the old post, mentioning brands and “a path into the machine”]
It was the words “a path into the machine” that gave me a sense of deja vu.
Because one morning back in about 2008 all my friends woke up as social media gurus. We’d been hobbyists and bloggers and it was kind of wonderful but faintly embarrassing to talk about in polite company and then BOOM people started appearing on Good Morning America and it gentrified and stratified fast.

Switch out “brands” for “institutions” up in the pull quote above and we are living a parallel moment in academia, just a few years late. And the many-to-many communications that the networks were based on risk, once again, being instrumentalized into something broadcast-based and metrics-driven that misses the whole point.

There has been plenty of excellent – and necessary – advocacy for the inclusion of digital, public engagement in academic hiring and tenure and promotions and our general sense of what counts as scholarship.

But the practices that get encapsulated as digital scholarship or networked participatory scholarship straddle two worlds, and two separate logics. One is the prestige economy of academia and its hierarchy and publishing oligopoly and all the things that count as scholarship. The other is social media, which has its own prestige economy.

The overlap goes like this, IMO:
[Image: the overlap between academia’s prestige economy and social media’s]
I never liked Klout’s reduction of influence to metrics – scale of account, reach of posts. Yet the thing these two spheres share – their common language, so to speak – is metrics. And while those of us engaged in the complex logic of influence and prestige in academic Twitter *get* that the ephemerality of a tweet that goes viral isn’t the same as a reputation built at smaller scale over time, and that a broadcast account doesn’t operate on the same terms as a reciprocal one, metrics divorced from context – whether on Klout or in citation counts and h-indexes – do NOT get that.

So if those of us giving workshops to the academy about social media don’t make it really clear that it’s more than metrics – and don’t give people the experiential opportunity to taste what a personal/professional learning network (PLN) feels like and can offer – we have only ourselves to blame when the academy eventually tries to subsume social media into its OWN prestige economy.

The morning is now, kids. It’s been now for a little while but it won’t be forever. Seize the day.

How do YOU think we can best engage scholars and institutions in networked scholarship without selling the farm?

Digital Pedagogy: Hospitality & The Hot Mess

Sometimes, the people you are expecting are not the ones who show up.

Last month, I spent a week facilitating the “Networks” track at the inaugural Digital Pedagogies Lab Summer Institute in Madison, Wisconsin…an immersive, five-day deep dive into the intersections of higher ed and digitally-networked platforms, practices, and pedagogical implications. Heady stuff…and risky stuff, every time, because questions of open & closed educational practices and open & closed academic systems strike at the heart of people’s most deeply-held beliefs about their professions and their professional identities.

But at #digped, it was MY understanding of my profession that got unsettled and re-aligned. Or rather, re-focused.

Because in the (pretty amazing) collection of 25+ professionals who joined my track, at least half were not the faculty, grad students, and maybe teachers I’d expected would come to explore digital pedagogies. They were instructional designers. Librarians. People tasked with the roles of making “the digital” happen in institutions, but people whose pedagogical audiences are as much faculty as conventionally-designated ‘students.’

I should have expected them. I started off in this field as a proto-instructional technologist myself, back before I’d ever heard the word. I began thinking about digital pedagogies pretty much at the point when I began teaching faculty how to teach online.

But the hierarchy of the academy to which we are actively acculturated in higher ed works to make the labour of digital professionals – particularly instructional technologists – invisible. They are not faculty. They are not admin, at least unless they are Directors. They are not much like the other support staff, in the sense that they interface (in most contexts) far less directly with students than with faculty. They are not students.

And yet in the contemporary university, in North America, they are the people most likely to be actively shaping an institution’s pedagogical response to the Internet.

Where pedagogy intersects with all things digital in higher ed, it’s being outsourced. To a class of workers who do not hold an official position in the academic hierarchy.
***

I’m not clutching pearls or defending the academic hierarchy, just noting that some pretty vast gaps exist in its version of higher ed and what it’s for.

Because as higher ed has complexified, whole classes of labour have emerged that have never been fully brought into the academy’s vision of itself, and central parts of that vision, such as pedagogy, have become increasingly isolated from the work of faculty.

I’d argue that these gaps – not the people in them but the gaps themselves – operate to further deprofessionalize the professoriate, ironically. Not to mention that digital adoption and online learning demand pedagogical direction if they are even to begin to do more than just move print-era content and its embedded pedagogical assumptions online. At the same time, tech still tends to be gendered male, so there are other – sometimes conflicting – forms of stratification at work at this strange intersection. And then there’s casualization. And the ever-present question of race in the academy and whose knowledge gets to count. And the fact that digital higher ed spaces in particular face enclosure and corporatization by those who see education as a ripe candidate for disruption or whatever they’ve decided to call it this year.

I suspect the technical term for the whole combo is “hot mess.”

I’d almost given up on trying to unpack it all when Tony Bates wrote a piece last week suggesting there’s little future and no career path in online learning. While a large part of me wants very much to agree with Tony’s reasoning – which runs “in the future, we will need instructors who have the skills to decide when and how to use online learning as part of their jobs, and not see online learning as a specialty of someone else” – I recognize that my desire to agree comes from a place of privilege, since I straddle the roles of instructor and online learning specialist. And much as most of my public work is about encouraging educators and faculty to explore digital literacies and digital pedagogy and digital scholarship, I’m not sure that our need for that future will magically create that future.

Sometimes the people you need – or are expecting – are not the people who show up.

Which is where we circle back to #digped and Wisconsin.
***

My friend Kate Bowles has been talking for a while now about hospitality in education, about being present to who shows up. It may shock her to learn I’ve actually been listening.

But on the Tuesday morning last month in Wisconsin, on day 2 of #digped, when it dawned on me that my vision for the week wasn’t exactly addressing a large chunk of the people who were paying good money to join me for the experience, it was Kate’s voice I heard in the back of my head.

One does not simply *ignore* Kate Bowles. ;)

And so we changed gears midstream, albeit with some grinding of those gears along the way. And the whole week was better for it. Powerful, rich, and full of lessons that I, at least, will take forward into future iterations and future work. And this was thanks in huge part to the generous, exploratory spirit of the many instructional technologists and designers and librarians – as well as the faculty – who made up the Networks track and the range of skills and knowledge and conversations between us all.

We benefit from being hospitable to each other, and opening our narrow hierarchies of specialization. And even those of us who should know better sometimes need reminding.