
I thought the readings for the week were really interesting, if a little celebratory. . . and why shouldn’t they be?  At its core, it would appear that Web 2.0 is really about extending the realm of sociality from the meat world into digital spaces.  Collaborative web tools like RSS, Delicious, etc., herald a fundamentally different web that embraces constellations of networked social connections and allows for a decentralization (democratization?) of control from monolithic content providers.  This certainly seems to be the argument that the folks over at Michigan State make in their article on the development and implementation of a civically-minded, empowerment-inducing mapping application that pushed maps away from their static, technical role toward a more user-determined, “tour” based, and localized narration of spatiality.  I want to do a couple of different things in this post. . . well, two different things really.  First, I want to consider the way that piracy is written into the Web 2.0 narratives we read for today.  Second, I want to think about what comes next. . . what is Web 3.0 and how does it differ from what we’re reading in these articles.

The bittorrent protocol is mentioned briefly in the Diehl et al. piece and elaborated on in more detail in the O’Reilly essay.  While I won’t go into the technical details of how bittorrent works (you can read it in the sem paper if you’re really keen!), I do want to point out a fundamental quality that bittorrent shares with the characterizations of Web 2.0 we get from these texts.  On a technical level, the bittorrent file sharing protocol does harness the power of distributed computing.  Because bittorrent connections aggregate multiple users and put tiny fragments of their content in discussion with one another, the direct P2P model (2 users) utilized by less efficient, slower technologies like Shawn Fanning’s Napster is rendered obsolete.  This technological efficiency has more than technological effects.  As Diehl et al. note, “Granulated syndicated content means that users can choose to bypass many of the places on the Web where that content was generated, and therefore bypass the interface’s rhetorical stance influencing how they display and view the data; the users will more easily become producers of content that is relevant to themselves” (425).  The mass proliferation of granularized content distributed across disparate computer networks enables a resistant culture – a tactical culture, to reference de Certeau – that manipulates the existent web architecture so that it can be redeployed as a subversive – or at least piratical – force.  People can find content in places other than where it was generated and can produce – albeit oftentimes with less social/cultural capital – more content in response. . . to mix my messages, the culturally colonized herd of “empire” strikes back against the Culture Industry by short-circuiting its rather controlled means of media distribution and circulation.
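Since I just invoked the protocol’s distributed design, here’s a minimal sketch of the swarm idea in Python – the peer structure and round logic are my own illustrative simplification (no choking, no trackers, no rarest-first), not the actual protocol:

```python
import random

def swarm_round(peers):
    """One gossip round: every peer offers each other peer one piece
    that peer still lacks. This is the core bittorrent move -- many
    partial sources trading fragments instead of one complete source."""
    for a in peers:
        for b in peers:
            if a is b:
                continue
            missing = a["pieces"] - b["pieces"]  # pieces b still needs from a
            if missing:
                b["pieces"].add(random.choice(sorted(missing)))

# One seeder holding the full file, four leechers holding nothing.
num_pieces = 8
peers = [{"pieces": set(range(num_pieces))}] + [{"pieces": set()} for _ in range(4)]

rounds = 0
while any(len(p["pieces"]) < num_pieces for p in peers):
    swarm_round(peers)
    rounds += 1
```

The point of the toy: the whole swarm completes in a handful of rounds because leechers start serving fragments to each other immediately, rather than queuing up for four serial downloads from the lone seeder.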

So, yes, I agree that the bittorrent protocol is Web 2.0 in the sense that it engenders appropriation and redistribution of content acquired through channels different from the channels that created it; however, I do have a bit of a beef with O’Reilly’s characterization of bittorrent technology as one that “automatically gets better the more people use it.”  If only this were the case!  I understand that O’Reilly is arguing that bittorrent expands the catalogue as more people join the network and contribute their material to it; however, with the exception of a small number of bittorrent networks that are private, managed populations operating on gift economies, most public bittorrent use follows an opportunistic, leech-based model wherein content is consumed but redistribution of content after consumption is not practiced.  While I think that there are pockets on the web – like open-source software development circles, wiki-enthusiasts, and piratical microprivates – that operate on a Web 2.0 “architecture of participation,” O’Reilly’s characterization of bittorrent as a protocol of a new technoethics of participation is a little zealous and celebratory.

So what about the future of the web?  If Web 2.0 is characterized by a sociality of digital space, what comes next?  Many folks see the future of the web in a semantic architecturalization of existing web structures.  The semantic web stakes claim to a universal pragmatics that utilizes a shared Saussurean langue via metadata that allows machines to move through the web like humans . . . without human direction – autonomous robots generating new content and answers.  Working with a combination of languages – a quality O’Reilly attaches to rich user experiences – like XHTML, RDF Schema, and the Web Ontology Language (OWL), the new web doesn’t search for documents but searches through collections of structured databases in order to wade through the incredible amount of content generated by the consumers-as-producers that Web 2.0 made possible.
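To make the semantic web’s basic move concrete, here’s a toy sketch of a triple store with a SPARQL-style pattern query – every name and fact in it is invented for illustration, and real RDF tooling is far more elaborate than this:

```python
# The semantic web's basic move: publish facts as machine-readable
# (subject, predicate, object) triples rather than human-readable pages.
triples = {
    ("GrassrootsMap", "rdf:type", "MappingApplication"),
    ("GrassrootsMap", "builtOn", "GoogleMaps"),
    ("GoogleMaps", "rdf:type", "WebService"),
    ("Delicious", "rdf:type", "SocialBookmarkingService"),
}

def query(s=None, p=None, o=None):
    """Pattern-match triples; None acts as a wildcard variable,
    in the spirit of a SPARQL basic graph pattern."""
    return {t for t in triples
            if (s is None or t[0] == s)
            and (p is None or t[1] == p)
            and (o is None or t[2] == o)}

# "What is GrassrootsMap built on?" -- answerable by a machine
# traversing structured data, with no human reading a page.
answer = query(s="GrassrootsMap", p="builtOn")
```

The query here is answered by matching against structured data rather than by retrieving documents – which is exactly the shift from searching *for* documents to searching *through* databases described above.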

While the semantic web is only one feature of Web 3.0, I think it’s a fascinating one. . . and a possibly Sisyphean task.

Stolley, K. (2009). Integrating social media into existing work environments: The case of Delicious. Journal of Business and Technical Communication, 23(3), 350-371.

This article takes an example of technical communicators integrating social media into existing work environments in the hopes that tools like RSS and Delicious can address “context-sensitive needs” in the workplace.  Some highlights:

  • When the author says “work environments” he means web browsers and project Web sites (CMS?)
  • Modus Operandi:  The author presents a hypothetical case of how technical writers and other folks with a stake in a project could share bookmarks while revising a set of technical documents on a product (a digital audio recorder).  Next, the author discusses how SMA integration can take place in “work environments” by using Kaptelinin and Nardi’s activity theory in relation to interaction design.  Finally, the author argues that the future of technical communication will be one of ever-increasing sophistication and customization that integrate SMAs into multiple technologically driven projects and workplaces.
  • Constellation:  the “regularly occurring coordination of mediating artifacts” (355) – in other words, the integration and use of multiple technological tools to achieve tasks by symbolic-analytic workers.
  • Textual coordination:  the ad hoc strategies of document/artifact use that allow for the most efficient method of extracting information.  It’s important to note that the sort of tool mediation the author argues for in this piece is only useful if it “unobtrusively supports” the individual user’s work activity (369).
  • Working from Slattery’s work positing that SA workers needed more screen space, the author picks up with browser plug-ins as a way to fix this; however, he gets there through a discussion of activity theory, arguing that SMAs provide “functionality (actions) in support of a larger activity” (356).  The SMA Delicious allows users to operationalize bookmarking through unobtrusive browser-based add-ons and plug-ins.  The author argues that the use of SMAs in the interest of completing the document revisions is an example of the “polymotivation of actions” because users can perform a single action (bookmarking) to achieve multiple motives (self- and other-directed).
  • Activities, actions, operations – AT model.
  • Despite the wonderfulness of browser add-ons, these don’t necessarily fix the problem of complexity – a desire toward operationalization – for team-based activities.  So what to do?  Customize!
  • Zotero provides a solution to the problem that the author has. . . that being said, he used tags on bookmarks to collate useful information across users.
  • API – Application Programming Interface – sometimes customizable interfaces of web 2.0 applications.
  • Mediating artifacts have histories that often reflect the efforts and experiences of other people who tried to solve similar issues and modified the tool to achieve those ends.
  • The author argues that TC’ers will need not merely technological know-how in the future but also the ability to draw connections between technological actions and the broader sociality of team-based work.  This is why TC’ers will remain relevant and not be taken over by IT’ers.
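As a concrete gloss on the tag-based collation described above, here’s a minimal sketch of pooling team bookmarks by tag – the users, URLs, and tags are all hypothetical, and this models the folksonomy idea rather than the actual Delicious service or its API:

```python
from collections import defaultdict

# Each user's bookmarks: URL -> set of tags (a shared folksonomy).
bookmarks = {
    "amy":    {"http://example.com/recorder-spec": {"project:dar", "specs"},
               "http://example.com/style-guide":   {"project:dar", "style"}},
    "justin": {"http://example.com/recorder-spec": {"project:dar", "audio"},
               "http://example.com/unrelated":     {"misc"}},
}

def collate(tag):
    """Collect every URL any team member filed under `tag`, and who
    filed it -- the 'polymotivated' action: one personal bookmark
    simultaneously serves the bookmarker and the whole team."""
    urls = defaultdict(set)
    for user, marks in bookmarks.items():
        for url, tags in marks.items():
            if tag in tags:
                urls[url].add(user)
    return dict(urls)

project_sources = collate("project:dar")
```

A shared project tag like the hypothetical `project:dar` is what lets the operationalized, individual act of bookmarking quietly aggregate into a team-level resource.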

Diehl, A., Grabill, J.T., & Hart-Davidson, W. (2008). Grassroots: Supporting the knowledge work of everyday life. Technical Communication Quarterly 17(4), 413-434.

The authors introduce “Grassroots” – a piece of software used to map assets for organizations – as a way to advance three arguments: 1) an argument about the nature of the knowledge work of everyday life, or an argument about the complex technological and rhetorical tasks necessary to solve commonplace problems through writing; 2) an argument about specific technologies and genres of community-based work, about how making maps is an essential genre of this work, and about why the asset map in particular is possibly transformative; and 3) an argument about the making of this piece of software itself – in other words, how did this process express, test, and verify the authors’ theories about writing and knowledge work (414).

  • The authors define knowledge work as “analytical activity requiring problem solving and abstract reasoning, particularly with (and through) advanced information technologies and particularly with and through acts of writing” (415).
  • Writing is often characterized – especially in sociological discussions – as invisible work because when it works, it works well and doesn’t create visible rupture.  Writing is also invisible because it occurs “everyday” in complex ways that have been – to use a term from AT – action-ized and operationalized.
  • Grassroots is a tool based on GoogleMaps that allows for database-driven customizations using overlays that reflect various kinds of data. . . in this case, asset information.
  • The power of the map:  the choice of what to map is a rhetorical one, socially influenced and constrained, and selectively visioned. . . it’s a way of meaning, an epistemological creation – to use de Certeau, it’s a narrativization of space.
  • The thrust for developing Grassroots was a civic one.  The authors wanted a way to empower the members of a community to narrate their own spaces.  GIS was too expensive and far too expert-driven.  Paper maps were too expensive.
  • The asset map used by Amy presented two arguments:  1) where media organizations already exist; and 2) how the dispersal of these media outlets over the geographic grid might allow for a strategic decision concerning the creation of their own media center.
  • Web 2.0 – the authors define it as “the untethering of content and information available via the Web from the very places and pages with which we usually associated that content” (424).  The authors also point out that this untethering doesn’t increase the divide between content and form but actually empowers the wreader, because content can be repurposed into new forms for specific rhetorical situations.
  • There is an argument to be made here about bittorrent piracy communities and the authors’ definition of what constitutes Web 2.0.  REVISIT.  (425)
  • Community change – according to Kretzmann and McKnight – is best achieved through asset-driven, not needs-driven development.
  • The authors did this work because they understand that “communities and citizens are entrenched in a knowledge society, where citizenship is a function of activity and this activity is knowledge work” (431).
  • Fundamentally, the democratization the authors advocate deauthorizes the technical communicator in practice, because this sort of work is “writing” of the kind that we don’t usually see.  We don’t see it because of disciplinary constraints, pedagogical foci, and research practices.

O’Reilly, T. (2005). What is Web 2.0? O’Reilly Media. Retrieved 7 Jan. 2010 from http://oreilly.com/web2/archive/what-is-web-20.html .

  • The web as a platform – Web 2.0 is a collection of practices and principles that organizes and unites a huge array of different websites.
  • The value of a software product is proportional to the scale and dynamism of the data it helps to manage.
  • The author makes use of Shirky’s “long tail” theory to understand that mass buy-in in a couple of really populated industry sectors is not the way to drum up a lot of e-business.
  • I like that the author falls heavily on the side of open-development projects in this piece.
  • Web 2.0 relies on distributed networks – like Napster or bittorrent.
  • Bittorrent is an example of the Web 2.0 principle that “the service automatically gets better the more people who use it” – I don’t necessarily agree.  Relate this discussion to gift economies.  This doesn’t work because people aren’t working on gift economies in large piracy networks, they’re working on leeching.  This is a complication of the architecture of participation mentioned later in the article.
  • Web 2.0 harnesses collective intelligence through hyperlinking, aggregating, indexing, communalizing, socializing, tagging, collaborative filtering, and open source.  All of these processes create the lesson that network effects from user contributions are the key to market dominance in the Web 2.0 era.
  • Harnessing user-generated data is also a key feature of Web 2.0 (think GoogleMaps overlays).
  • Constant incremental software modification and improvement is supplanting traditional software release cycles.  This is the result of new applications being delivered as a service, not a product.
  • Design for lightweightness, hackability, remixability, and syndication in Web 2.0 apps.
  • Innovation occurs in the assembly of multiple innovative software products, not the creation of an all-encompassing monolithic software package.
  • Rich user experiences are facilitated by combinations of numerous languages and media.

4 Responses to “CCR760 – Sharing is Social, Meaning is Automatic: Web 2.0/3.0 – Social Media Readings”

  1. Luce

    Your more explicit discussion of piracy is interesting, and it certainly puts de Certeau on the forefront in ways that the folks writing the articles didn’t. The main question I had while reading through this week, and which was really prominent in reading Harfoush’s piece, is this: what is the effect of framing social media’s civic potential in economic/market/consumerist terms? Not only does that make various body parts twitch, it seems like the coupling or marriage of components that, if we were to examine them separately, were not theoretically meant to be together. I guess the follow-up question this raises for a tech novice like me is this: are social media users to be understood as consumers in the capitalist sense, or consumers in the consume/produce sense? And who is talking about social media as a potentially rich site of anti-capitalist ventures?

  2. Mike

    Justin, I’m wondering about your statement that bittorrent protocols have rendered 1st gen P2P protocols or technologies obsolete. I don’t know that I’ve considered torrent implementations a direct replacement for traditional P2P tools and technologies. I agree that there are inherent efficiencies with bittorrent, but I’m wondering about the typical implementations. I haven’t come across a business app or system that makes use of torrents (within a context of data distribution across an organization). However, there are a range of business apps that make full use of traditional P2P technologies to improve information and file sharing through proprietary clients.

  3. justin

    @Mike: I guess I should have qualified that statement in the context of social sharing in non-organizational contexts. Though I think some folks are still using first-gen P2P for file sharing on networks like KaZaA, I don’t think there are many left. I know bittorrent is usually used for open-source Linux distributions, World of Warcraft updates, and illegal file sharing. Interestingly, the protocol itself must be of some interest, as backers like 20th Century Fox, MTV, and Warner Brothers invested in Bram Cohen’s startup over the last couple of years. This would seem to point to this protocol being useful in the distribution of media over networks in a way that challenges how in the world Netflix and TiVo work.

  4. Mike

    Now that’s very cool. I wasn’t aware of those commercial uses of torrents. Do you see a problem with managing legitimate “bits” in those commercial spaces? I guess I’m wondering how one could assure users that each piece of a data seed is valid and not compromised. Do you have to assume a degree of risk with the more “public” torrents?
