GLORIA

GEOMAR Library Ocean Research Information Access

Filter
  • Bruns, Axel  (8)
  • 1995-1999  (8)
  • 1
    Online Resource
    Queensland University of Technology ; 1998
    In: M/C Journal, Queensland University of Technology, Vol. 1, No. 3 (1998-10-01)
    Abstract: "'Cogito ergo sum' is an insufficient measure of existence within Usenet. ... Without some sort of response beyond interior cogitation there is nothing to be perceived by other Usenet users." (MacKinnon 119) Much early research into computer-mediated communication (CMC) claimed that meaningful online interaction between individuals who didn't know each other 'in real life' was very unlikely, that online communities could never develop -- too restrictive seemed the medium, too lacking in extratextual cues to each participant's identity (like variations of the tone and style of one's language) to build relationships. Such views have been comprehensively refuted by now, of course: 'virtual community' has become one of the CMC researchers' favourite buzzwords, and it is widely accepted that the language of online interaction is rich with newly-invented cues that replace the body language and voice inflection changes that accompany oral communication -- smilies and acronyms are only the most immediately obvious of such tools. In any form of communication, we use these cues mainly to get our own identity across, and to uncover that of others -- beyond the actual content of the message, cues tell us how a speaker feels about what they're saying, whether they're sympathetic, angry, ironic, and more generally hint at a speaker's level of education, interest in the topic at hand, general state of mind, and much more. The different cue system of interaction on the Net may delay the communication of identity particularly for inexperienced users, but won't prevent it altogether -- the many closely-networked groups of participants on Usenet newsgroups are a strong testimony to that fact. The most celebrated benefit of online interaction is that we can now freely choose any identity we'd like to take on: leaving our 'meat', our bodily existence, behind as we 'jack in' to the network (to use William Gibson's terms), we can recreate ourselves in any shape or form we want. But to take on such identity is only half the story, and much like wearing extravagant clothes only in the privacy of one's own home -- on the Net, where merely physical existence is irrelevant, you have to show your identity to exist. Only if you participate will you truly be a part of the online community -- lurkers are nothing but insubstantial shadows of users whose potential for existence hasn't yet been realised. Like streams of subspace energy in Star Trek's transporter rooms, they haven't materialised yet, and only will with the creation of a newsgroup posting or Webpage, or any other form of communication. Even that is not the full story, though: just as oral communication requires at least a speaker and a listener, the presentation of an identity online also needs an audience. Again, too, the nature of the medium means that the presence of an audience can only be confirmed if that audience shows its presence in some way. "Without a visible response, a written statement remains isolated and apparently unperceived -- a persona's existence is neither generated nor substantiated", as MacKinnon writes (119). 
Disembodied as participants in discussion groups are, for their online identities to exist they depend crucially on an engagement in sufficiently meaningful communication, therefore -- this inevitable need to communicate is what makes online communities so strong, in comparison with similar offline groups where group members may simply refuse to communicate and still use this as a strong statement as to their identity. Online, those who choose to stand on the sidelines and sneer, as it were, don't really exist at all. This finding doesn't just apply to newsgroups and other discussion fora: Webpages similarly have little actual existence unless they are viewed -- much like Schrödinger's cat, they exist in a state of potentiality which can only be realised through access. Again, however, Web access doesn't usually leave any obvious traces -- the nature of the Internet as an electronic medium means that a site which has only been accessed once will look just the same as one that has had millions of hits. This is where the growing industry of Web counters and statistics servers comes in, services which offer anything from a mere count of accesses to a page to a detailed list of countries, domains, and referring pages the visitors came from. (And indeed, this very journal keeps track of its access statistics, too.) Descartes's physical-world premise of 'cogito ergo sum' isn't directly applicable to the online world, then. Merely to be able to think does not prove that you exist as an Internet participant; neither, as we have seen, does being able to write, or publish Web pages. As MacKinnon writes, the new credo for the information age has now become "I am perceived, therefore I am" (119) -- videor ergo sum. Only this makes real the disembodied self-chosen identity which computer-mediated communication affords us. References Gibson, William. Neuromancer. London: HarperCollins, 1993. MacKinnon, Richard C. "Searching for the Leviathan in Usenet." CyberSociety: Computer-Mediated Communication and Community. Ed. Steven G. Jones. Thousand Oaks, Calif.: Sage, 1995. 112-37. Citation reference for this article MLA style: Axel Bruns. "Videor Ergo Sum: The Online Search for Disembodied Identity." M/C: A Journal of Media and Culture 1.3 (1998). [your date of access] 〈 http://www.uq.edu.au/mc/9810/videor.php 〉 . Chicago style: Axel Bruns, "Videor Ergo Sum: The Online Search for Disembodied Identity," M/C: A Journal of Media and Culture 1, no. 3 (1998), 〈 http://www.uq.edu.au/mc/9810/videor.php 〉 ([your date of access]). APA style: Axel Bruns. (1998) Videor ergo sum: the online search for disembodied identity. M/C: A Journal of Media and Culture 1(3). 〈 http://www.uq.edu.au/mc/9810/videor.php 〉 ([your date of access]).
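    The abstract above describes Web counters and statistics services that tally anything from raw access counts to the referring pages visitors came from. As a minimal sketch of the kind of counting such a service performs -- assuming, purely for illustration, an Apache-style Combined Log Format file named "access.log"; neither detail comes from the article -- a few lines of Python suffice:

        from collections import Counter

        hits = Counter()        # accesses per page
        referrers = Counter()   # where visitors came from

        with open("access.log") as log:
            for line in log:
                # Combined Log Format quotes the request and the referrer:
                # host - - [time] "GET /page HTTP/1.0" status bytes "referrer" "agent"
                parts = line.split('"')
                if len(parts) < 6:
                    continue                    # skip malformed or referrer-less lines
                request = parts[1].split()
                if len(request) == 3:
                    hits[request[1]] += 1       # count the requested path
                if parts[3] not in ("", "-"):
                    referrers[parts[3]] += 1    # count the referring page

        for page, n in hits.most_common(5):
            print(n, page)                      # the five most-accessed pages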
    Type of Medium: Online Resource
    ISSN: 1441-2616
    Language: Unknown
    Publisher: Queensland University of Technology
    Publication Date: 1998
    ZDB ID: 2018737-3
  • 2
    Online Resource
    Queensland University of Technology ; 1998
    In: M/C Journal, Queensland University of Technology, Vol. 1, No. 2 (1998-08-01)
    Abstract: Memory is everywhere. We remember, more often than not, who and what we are, recognise friends and acquaintances, remember (hopefully) birthdays and anniversaries, and don't forget, as much as we'd sometimes like to, our everyday tasks and duties. But that's just the tip of the iceberg: we also speak of computer memory (usually in the context of needing more to run the latest Microsoft-made memory hog), of digital archives where we store what we don't want to bother our braincells with, and of those storerooms of human knowledge -- libraries -- which are gradually moving from analogue to digital storage as they join the new global memory that is the Internet (according to the visionaries). And then there are the alternatives to this 'official' memory: repressed memories, oppositional views of history, new discoveries that challenge our ideas of the past. It is in this wide field of possible cultural interaction that this, the second issue of M/C, operates. At a time when half the world remembers the first anniversary of Princess Diana's death, with the other half trying desperately to avoid the tabloids' crocodile tears, at a time when most of us are looking forward to forgetting all about the White House sex scandals, and at a time, finally, when cultural commentators the world over are beginning to sort out which events of the past decade, century, and millennium will have been worth remembering, we review the idea of 'memory' from a variety of angles -- some broad, some narrow, some focussed on individual human memory, some on the memory of humanity as such. Our featured M/C guest writer, Canadian scholar Paul Attallah, opens this issue. In his article "Too Much Memory", he covers a lot of ground -- from the growing nostalgia for cultural products of the past to the recovery of political memory of past wrongs, to the memory of Princess Diana and other deceased celebrities. The media, he writes, are today in the business of creating 'pseudo-events' -- but the public are getting better at looking behind the façades: they might come to reject this constant stream of too much (fake) memory. As P. David Marshall writes, the problem becomes even more complicated if you're in Australia, at some distance from the centres of mainstream cultural production. As publicity leaks across the Internet and similar channels, Australians collect 'anticipatory memories' of those pseudo-events created by the media -- before the events even take place in the local channels of popular culture. The result of this phenomenon, Marshall suggests, may be an even stronger hegemonic grip of American broadcast standards. Adam Dodd takes us from memories of events in the immediate future to repressed memories -- of alien abductions. He points out that whatever the truth behind abduction stories, we should take note of the fact that these stories are reported as truth, and promptly rejected by the scientific establishment. This raises age-old questions of the nature of 'reality' in a postmodern world where objectivity has come to be recognised as an unattainable dream. Continuing the extraterrestrial theme, Nick Caldwell turns to the possible revival of 1950s science fiction iconography.
After the cynical 80s with its dark and dirty SF designs, fond memories of the curvy, stylish interstellar dreams of post-war times are beginning to emerge again -- at a time of frantic artistic recycling of works from all eras, and at the dawn of a new millennium where again everything seems possible, perhaps now the rocketship designs of the 50s can finally come true. Axel Bruns returns the focus earth-wards, but remains on the topic of modern technology. He points to the opportunities and threats brought about by Internet archives such as Deja News -- with every newsgroup article at every user's fingertips, the potential for abuse is immense. As the perfect digital memory offered by Deja News is becoming a favourite search tool, it is high time to question the ethical implications of archiving the ephemeral. Paul Mc Cormack's article offers some more general thoughts on the future of the Internet. Comparing what still are the early days of this new medium with the first decades of radio, he suggests that we may 'remember' the future of the Net by learning from the past. The commercialisation of radio after its 'anarchic' childhood may be what's in store for the Internet, too -- despite the obvious differences between the two media. Finally, in her article on "Memory and the Media", Felicity Meakins closes the circle by returning to an issue touched on by Paul Attallah -- the death of Princess Diana. She describes how since Diana's demise the media's rhetoric has changed profoundly to consist almost exclusively of forms of eulogy. Using Speech Act Theory, Meakins identifies the performative function of this rhetoric, and points out how it has influenced our memories of Diana. Citation reference for this article MLA style: Axel Bruns. "Editorial: 'Memory'." M/C: A Journal of Media and Culture 1.2 (1998). [your date of access] 〈 http://www.uq.edu.au/mc/9808/edit.php 〉 . Chicago style: Axel Bruns, "Editorial: 'Memory'," M/C: A Journal of Media and Culture 1, no. 2 (1998), 〈 http://www.uq.edu.au/mc/9808/edit.php 〉 ([your date of access]). APA style: Axel Bruns. (1998) Editorial: 'memory'. M/C: A Journal of Media and Culture 1(2). 〈 http://www.uq.edu.au/mc/9808/edit.php 〉 ([your date of access]).
    Type of Medium: Online Resource
    ISSN: 1441-2616
    Language: Unknown
    Publisher: Queensland University of Technology
    Publication Date: 1998
    ZDB ID: 2018737-3
  • 3
    Online Resource
    Queensland University of Technology ; 1998
    In: M/C Journal, Queensland University of Technology, Vol. 1, No. 4 (1998-11-01)
    Abstract: These days, when we speak of the Internet, electronic networks, or computer-mediated communication (CMC) in general, the term 'cyberspace' all too readily presents itself as a blanket description of these communications systems. Without question it's an attractive and powerful metaphor -- and that "the way we think, what we experience, and what we do every day is very much a matter of metaphor" (Lakoff & Johnson 3) has by now been proven convincingly by the work of cognitive linguists. They have found that "metaphors serve to organise and interpret experience" (Closs Traugott 49), and so the influence, especially on users only just coming to terms with the Net, of an image of computer-mediated communication processes as taking place in a cyber-space of some description is immense: it's no accident that we speak of homepages and attempt to enter Websites and ftp servers, which in turn protect themselves from the outside cyberworld using firewalls. In itself, that's no problem -- we need salient metaphors in order to conceptualise new realms of experience, and a spatial approach to experiencing the Internet comes naturally, given that in 'real life' we interact with a spatial world, after all; we're experts in understanding things through spatial analogies. All those analogies tend to be limited to three- (or, allowing for the temporal dimension, four-) dimensional spatial models, however -- most obviously in the science fiction literature that first popularised the term 'cyberspace', where writers like William Gibson describe the network matrix as a "transparent 3D chessboard extending to infinity" (Gibson 68). Quite obviously, this form of cyberspace basically replicates the structures of the 'real' world: it's merely adding an easy changeability and a suspension of physical laws which material existence cannot offer, but sticking to three dimensions. Outside the VR labs and slick computer graphics-heavy Hollywood movies, however, 'cyberspace' as we experience it today is far removed from such three-dimensional dreams. For the average Internet user, cyberspace has come to stand mainly for the World Wide Web, and here the deeply-ingrained 3D concepts the term conjures up serve to confuse rather than aid an understanding. At the most basic level -- that of the computer screen --, cyberspace inevitably appears in only two dimensions, of course, and it's hard to imagine one's clicking through a range of Webpages as an effortless glide over, or at least a walk through, the global village. On the other hand, with regards to organising the documents that make up that cyberspace, there is no reason why a three-dimensional approach should be favoured over one in any other number of dimensions: as calculating machines without the perceptually-determined preferences of humans, computers can think in four, five, n dimensions just as well as in three. Were we to force our 3D thinking on the machines, in fact, we'd probably end up limiting their usefulness (which in many cases is limited enough as it is). The problem that cyberspace isn't simply a common, Euclidean 3D space is what frequently confuses its human visitors, though: like Homer J. Simpson stepping out of the 2D cartoon into the third dimension, we're disoriented, even frightened -- ask any novice Websurfer who is just trying to find their path through cyberspace. Jumping from site to site, the relative locations to one another of the places they visit remain unclear; 'how do I get to...' 
is one of the most frequently heard questions. Many of the paths through the Web are temporary, after all, and there are many unexpected shortcuts and hidden passages, as well as roadblocks and detours. Again, an overly three-dimensional conceptualisation obscures more than it describes here: while we may well regard the sites and servers of major organisations as the high-rises in the cyberspatial cityscape, entering them through the main foyer (the central Webpage) is just one of our many options: we might just as well 'materialise' directly in a basement room, as it were, by following a link to a specific page on the site, or enter the hidden thirteenth floor by jumping to a page that hasn't been publicised. Hyperlinks are our wormholes in cyberspace, defined only by their beginning and end, with an indefinite distance in between. How do you explain such feats to someone applying a strict 3D thinking to the Web? Stuck with the traditional options of three-dimensional experience, such a person wouldn't even think these actions possible. That cyberspatial geometry is far from fixed makes the situation even worse. The perception of space crucially depends on understanding the relations between lengths, heights, depths, and locations, but in cyberspace the lengths, heights, depths, and locations depend on one's perception: Websites seem extensive or small to us depending exclusively on how much of them we've explored -- largely, this is because there is never any way to view a site in its entirety, simply because there is no vantage point in cyberspace from which to do so (we're always either immersed in a site, looking from within, or at an unknown distance from it, unable to see from without -- but never just close by, where we might survey the whole site). And our perception of relative location depends entirely on the available hypertext links we are aware of -- the sites listed in our bookmarks are merely a step away, next door, but those we're only vaguely aware of and have to hunt through Yahoo! and other search engines for can be a very long way off. (But we can bring them into immediate proximity simply by adding a bookmark later on.) Finally, while we may change the geography of our cyberworld through our own interventions, attaching Webpages to our individual nets by way of links, these may also slip away again unnoticed -- like a boat from its moorings -- by relocating to a different server or by restructuring their site contents. The relative stability of three-dimensional space in the 'real' world just doesn't translate to cyberspace. These problems with the spatial metaphor for the Internet and the Web also affect other metaphors similarly grounded in three-dimensional experience. The most persistent and influential of these is the image of the 'global village' that was invented by Marshall McLuhan. This village model has inherited all the 3D limitations we've already seen, and additionally this particular three-dimensional arrangement also introduces further complications in that the idyllic and simplistic image the word 'village' conjures up hardly fits the confusing, contradictory, multi-lingual, multi-ethnic, and highly-populated nature of the Web -- calling this structure a 'global metropolis' seems to be at least some improvement over the 'global village' model (cf. Bruns sect.
4 bite 15ff.), as it opens up the possibility of individual suburbs with their own local identities, of a city centre with the major communal services, of internal politics amongst opposing factions of citizens, and of a number of outskirts areas with various connections to centre and outside. Again, this thought construct necessarily falls short of describing Internet reality in its entirety, of course -- an even more accurate term than 'global metropolis' may be 'Western-hemispheral n-dimensional population centre of variable shape', but that's a bit of a mouthful, really, and so I'll stick with 'global metropolis' for now, adding the caveat that the apparent three-dimensionality of this image shouldn't be taken for granted. In this, interestingly, cyberspace perhaps isn't all that far removed from the 'real world' -- after all, modern physicists are increasingly convinced that there are more dimensions to the universe than the human eye may be prepared to see. References Bruns, Axel. "'Every Home Is Wired': The Use of Internet Discussion Fora by a Subcultural Community." 1998. 5 Nov. 1998. 〈 http://www.uq.net.au/~zzabruns/uni/honours/thesis.php 〉 . Closs Traugott, Elizabeth. "'Conventional' and 'Dead' Metaphors Revisited." The Ubiquity of Metaphor: Metaphor in Language and Thought. Eds. René Dirven and Wolf Paprotté. Amsterdam: John Benjamins, 1985. 17-56. Gibson, William. Neuromancer. London: HarperCollins, 1993. Lakoff, George, and Mark Johnson. Metaphors We Live By. Chicago: U of Chicago P, 1980. Citation reference for this article MLA style: Axel Bruns. "The n-Dimensional Village: Coming to Terms with Cyberspatial Topography." M/C: A Journal of Media and Culture 1.4 (1998). [your date of access] 〈 http://www.uq.edu.au/mc/9811/village.php 〉 . Chicago style: Axel Bruns, "The n-Dimensional Village: Coming to Terms with Cyberspatial Topography," M/C: A Journal of Media and Culture 1, no. 4 (1998), 〈 http://www.uq.edu.au/mc/9811/village.php 〉 ([your date of access]). APA style: Axel Bruns. (1998) The n-Dimensional Village: Coming to Terms with Cyberspatial Topography. M/C: A Journal of Media and Culture 1(4). 〈 http://www.uq.edu.au/mc/9811/village.php 〉 ([your date of access]).
    Type of Medium: Online Resource
    ISSN: 1441-2616
    Language: Unknown
    Publisher: Queensland University of Technology
    Publication Date: 1998
    ZDB ID: 2018737-3
  • 4
    Online Resource
    Queensland University of Technology ; 1999
    In: M/C Journal, Queensland University of Technology, Vol. 2, No. 8 (1999-12-01)
    Abstract: It used to be so simple. If you turn on your TV or radio, your choices are limited: in Australia, there is a maximum of five or six free-to-air TV channels, depending on where you're located, and with a few minor exceptions, the programming is relatively uniform; you know what to expect, and when to expect it. To a slightly lesser degree, the same goes for radio: you might have a greater choice of stations, but you'll get an even smaller slice of the theoretically possible range of programming -- from Triple J to B105, there's mainstream, easy listening, format radio fodder, targeted at slightly different audience demographics, but hardly ever anything but comfortably agreeable to them. Only late at night or in some rare timeslots especially set aside for it, you might find something unusual, something innovative, or simply something unexpected. And of course that's so. How could it possibly be any other way? Of course radio and TV stations must appeal to the most widely shared tastes, must ensure that they satisfy the largest part of their audience with any given programme on any given day -- in short, must find the lowest common denominator which unifies their audience. That the term 'low' in this description has come to be linked to a negative meaning is -- at first -- only an accident of language: after all, mathematically this denominator constitutes in many ways the most fundamental of shared values between a series of fractions, and metaphorically, too, this commonality is certainly of fundamental importance to community culture. The need for radio and TV stations to appeal to such shared values of the many is twofold: where they are commercially run operations, it is simply sound business practice to look for the largest (and hence, most lucrative) audience available. In addition to this, however, the use of a public and limited resource -- the airwaves -- for the transmission of their programmes also creates significant obligations: since the people, represented by their governmental institutions, have licensed stations to use 'their' airwaves for transmission, of course stations are also obliged to repay this entrustment by satisfying the needs and wants of the greatest number of people, and as consistently as possible. All of this is summed up neatly with the word 'bandwidth'. Referring to frequency wavebands, bandwidth is a precious commodity: there is only a limited range of frequencies which can possibly be used to transmit broadcast-quality radio and TV, and each channel requires a significant share of that range -- which is why we can only have a limited number of stations, and hence, a limited range of programming transmitted through them. Getting away from frequency bands, the term can also be applied in other areas of transmission and publication: even services like cable TV frequently have their form of bandwidth (where cable TV systems have only been designed to take a set number of channels), and even commercial print publishing can be said to have its bandwidth, as only a limited number of publishers are likely to be able to exist commercially in a given market, and only a limited number of books and magazines can be distributed and sold through the usual channels each year. There are in each of these cases, then, physical limitations of one form or another.
The last few years have seen this conception of bandwidth come under increased attack, however, and all those apparently obvious assumptions about our media environment must be reconsidered as a result. Ever since the rise of photocopiers and personal printers, after all, people have been able to create small-scale print publications without the need to apply for a share of the commercial publishers' 'bandwidth' -- witness the emergence of zines and newsletters for specific interest groups. The means of creation and distribution for these publications were and are not publicly or commercially controlled in any restrictive way, and so the old arguments for a 'responsible' use of bandwidth didn't hold any more -- thus the widespread disregard in these publications for any overarching commonly held ideas which need to be addressed: as soon as someone reads them, their production is justified. Publishing on the Internet drives the nail even further -- here, the notion of bandwidth comes to an end entirely, in two distinct ways. First, in a non-physical medium, the argument of the physical scarcity of the publication medium doesn't hold anymore -- space for publication in newsgroups and on Web pages, being digital, electronic, 'virtual', is infinitely expandable, much unlike frequency bands with their highly fixed and policed upper and lower boundaries. New 'stations' being added don't interfere with existing ones here, and so there's no need to limit the amount of individual channels available on the Net; hence the multitude of newsgroups and Websites available. Again, whatever can establish an audience (even just of a few readers) is justified in its existence. Secondly, available transmission bandwidth is also highly divisible along a temporal line, due to the packet-switching technology on which the medium is based: along the connections within the network, information that is transmitted is chopped up into small packets of data which are recombined at the receiver's end; this means that individual transmissions along the same connection can coexist without interfering with one another, if at a somewhat reduced speed (as anyone navigating the Web while downloading files has no doubt experienced). Again, this is quite different from the airwaves experience, where two radio stations or TV channels can't be broadcasting on the same frequency without drowning each other out. And even the reduction of transmission speed is likely to be only a temporary phenomenon, as network hardware is constantly being upgraded to higher speeds. Internet bandwidth, then, is infinite, in both the publication and the transmission sense of the word. If it's impossible to reach the end of available bandwidth on the Net, then, this means nothing less than that the very concept of 'bandwidth' on the Net ends: that is, it ceases to have any practical relevance -- as Costigan notes, reflecting on an all too familiar metaphor, "the Internet is in many ways the Wild West, the new frontier of our times, but its limits will not be reached. ... The Internet does not have an edge to push past, no wall or ocean to contain it. Its size and shape change constantly, and additions and subtractions do not inherently make something new or different" (xiii). But that this is so, that we have come to this end of 'bandwidth' by never being able to come to an end of bandwidth on the Net, is in itself something fundamentally new and different in media history -- and also something difficult to come to terms with. 
All those of courses, all those apparently obvious and natural practices of the mainstream media have left us ill prepared for a medium where they are anything but natural, and even counterproductive. Old habits are hard to break, as many of the apparently well-founded criticisms of the Internet show. Let's take Stephen Talbott as an example here: in one of my favourite passages of overzealous Net criticism, he writes of "the paradox of intelligence and pathology. The Net: an instrument of rationalisation erected upon an inconceivably complex foundation of computerised logic -- an inexhaustible fount of lucid 'emergent order.' Or, the Net: madhouse, bizarre Underground, scene of flame wars and psychopathological acting out, universal red-light district. ... The Net: a nearly infinite repository of human experience converted into objective data and information -- a universal database supporting all future advances in knowledge and economic productivity. Or, the Net: perfected gossip mill; means for spreading rumours with lightning rapidity; ... ocean of dubious information" (348-9). Ignoring here the fundamental problem of Talbott's implicit claim that there are objective parameters according to which he can reliably judge whether or not any piece of online content is 'objective data' or 'dubious information' (and: for whom?), and thus his unnecessary construction of a paradox, a binary (no pun intended) division into 'good' and 'bad' uses, a second and immediately related problem is that Talbott seems to claim that the two sides of this 'paradox' are somehow able to interfere with each other, to the point of invalidating one another. This can easily be seen as a result of continuing to think in terms of bandwidth in the broadcast sense: there, the limited number of channels, and the limited amount of transmission space and time for each channel, have indeed meant that stations must carefully choose what material to broadcast, and that the results are frequently of a mainstream, middle-of-the-road, non-challenging nature. On the Net, this doesn't hold, however: here, the medium can be used for everything from the Human Genome Project to peddling sleaze and pirated 'warez', without the two ends of this continuum of uses ever affecting one another. That's not to say that what goes on in some parts of the Net isn't unsavoury, offensive, illegal, or even severely in violation of basic human rights; and where this is so, the appropriate measures, already provided by legal systems around the world, should be taken to get rid of the worst offenders -- notably, though, this won't be possible through cutting off their access to bandwidth: where bandwidth is unlimited and freely available to anyone, this cannot possibly work. Critical approaches like Talbott's, founded as they are on an outdated understanding of media processes and the false assumption of a homogeneous culture, won't help us in this, therefore: rather, faced with the limitless nature of online bandwidth, we must learn to understand the infinite, and live with it. The question isn't how many 'negative' uses of the Net we can point to -- there will always be an abundance of them. The question is what any one of us, whoever 'we' are, can do to use the Net positively and productively -- whatever we as individuals might consider those positive and productive uses to be. References Costigan, James T. "Introduction: Forests, Trees, and Internet Research." Doing Internet Research: Critical Issues and Methods for Examining the Net. Ed.
Steve Jones. Thousand Oaks, Calif.: Sage, 1999. Talbott, Stephen L. The Future Does Not Compute: Transcending the Machines in Our Midst. Sebastopol, Calif.: O'Reilly & Associates, 1995. Citation reference for this article MLA style: Axel Bruns. "The End of 'Bandwidth': Why We Must Learn to Understand the Infinite." M/C: A Journal of Media and Culture 2.8 (1999). [your date of access] 〈 http://www.uq.edu.au/mc/9912/bandwidth.php 〉 . Chicago style: Axel Bruns, "The End of 'Bandwidth': Why We Must Learn to Understand the Infinite," M/C: A Journal of Media and Culture 2, no. 8 (1999), 〈 http://www.uq.edu.au/mc/9912/bandwidth.php 〉 ([your date of access]). APA style: Axel Bruns. (1999) The end of 'bandwidth': why we must learn to understand the infinite. M/C: A Journal of Media and Culture 2(8). 〈 http://www.uq.edu.au/mc/9912/bandwidth.php 〉 ([your date of access]).
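    The abstract above explains packet switching: transmissions are chopped into small packets which share a connection and are recombined at the receiver's end. The toy Python sketch below illustrates only that idea -- it is not any real network protocol, and the messages, packet size, and stream labels are invented for the example:

        def to_packets(stream_id, message, size=8):
            # chop a transmission into small, numbered packets
            return [(stream_id, seq, message[i:i + size])
                    for seq, i in enumerate(range(0, len(message), size))]

        def reassemble(stream_id, link):
            # recombine one stream's packets at the receiver's end,
            # regardless of how they were interleaved in transit
            chunks = sorted((seq, data) for sid, seq, data in link if sid == stream_id)
            return "".join(data for _, data in chunks)

        # two transmissions coexist on the same connection, interleaved packet by packet
        link = []
        for a, b in zip(to_packets("A", "first transmission"),
                        to_packets("B", "other transmission")):
            link += [a, b]

        print(reassemble("A", link))   # -> first transmission
        print(reassemble("B", link))   # -> other transmission

    Because each packet carries its own sequence number, neither stream drowns out the other -- unlike two radio stations broadcasting on the same frequency, as the abstract notes.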
    Type of Medium: Online Resource
    ISSN: 1441-2616
    Language: Unknown
    Publisher: Queensland University of Technology
    Publication Date: 1999
    ZDB ID: 2018737-3
  • 5
    Online Resource
    Queensland University of Technology ; 1998
    In: M/C Journal, Queensland University of Technology, Vol. 1, No. 2 (1998-08-01)
    Abstract: They may have been obscured by the popular media's fascination with the World Wide Web, but for many Net users, Usenet newsgroups still constitute an equally important interactive tool. While Web pages present relatively static information that can be structured through hypertext links and searched using Yahoo! and similar services, newsgroups are fora for open, interactive discussion on virtually any conceivable subject, amongst participants from around the world -- more than any other part of the Net, they are instrumental in the formation of virtual communities by allowing like-minded individuals to get together, exchange their opinions, and organise themselves. Having emerged from email mailing-lists, newsgroups are among the oldest uses of computer networks for communication; based around a simple exchange of all new postings among the participants, they offer few of the comforts of more modern technologies, and so it is not surprising that a number of services have begun to provide powerful Web interfaces for newsgroup access. One of the oldest and most renowned amongst these sites is Deja News. Since its launch in May 1995, Deja News has expanded to cover around 50,000 different newsgroups, which are contained in a memory archive that is several hundred gigabytes in size, according to some reports (Woods, n.pag.); the company itself boasts accesses by more than three million users each month to the hundreds of millions of articles it has archived ("Company Background", n. pag.). What makes Deja News attractive to so many users are the many search options the service has added to its newsgroups interface. Deja News visitors can search for any article subjects, keywords, newsgroups, or participants' names they are interested in: if you're looking for the absolute latest on the White House sex scandal, search for "Clinton + Lewinski" and limit your search to articles posted in the last few days or hours; if you want to know what newsgroups your co-worker is writing to in her lunch break, simply ask Deja News to find all postings by "Annabel Smith". If you can't quite remember the address for the latest Internet camera site, Deja News will. To put such powerful research capabilities at the fingertips of any Web user raises any number of legal as well as ethical questions, however. To begin with, does Deja News even have the right to archive anyone's and everyone's postings to newsgroups? The simple fact that users' articles are posted openly, for all newsgroup participants to see, does not necessarily automatically imply that the article may be made available to a greater public elsewhere -- by analogy, if you have a casual conversation with a group of people, you aren't usually expecting to see your words in the history books the next day (but analogies between the Internet and 'real life' are always dangerous). Unless you spoke to someone with a photographic memory, you can usually rely on them gradually forgetting what you said, even if you made a fool out of yourself -- that's human. Appearing as the result of a Deja News search, articles lose their original context -- the newsgroup discussion they were part of -- and are given an entirely different one, a context in which by virtue of being so presented they may gain new and potentially questionable authority.
As is so often the case for information on Websites, the information in articles which can thus be found through services like Deja News cannot easily be verified or disproven; due to the loss of context, researchers cannot even gain a feel for a writer's trustworthiness in the same way that seasoned newsgroup members can. Neither may they always detect intentional irony and humour: playful exaggeration may easily appear as deliberate misinformation, friendly oneupmanship as an angry attack. The results of author-based searches may be even more potentially damaging, however. Deja News offers various mechanisms for restricting searches to the articles written by a particular user, culminating in the 'author profile', which can be used to list all posts that user has ever made. One does not need to be paranoid to imagine ways in which such a powerful research tool may be abused -- perhaps even (and most easily) by the Deja News company itself: since, following general Web etiquette, access to the service is free for normal users, Deja News relies on other avenues of income, and doubtlessly it would be tempting to sell the rights to exploit the Deja News database to professional spammers (Internet junk mailers). These could then aim their advertising emails directly towards the most promising target audience -- those who have in their newsgroup postings shown the most interest in a particular product or service. Indeed, Deja News notes, somewhat vaguely, that it "can provide efficient and effective Internet-based marketing for various types of online marketing goals, such as testing messages, building brand awareness, increasing Website traffic or generating leads" ("Company Background", n. pag.). While such uninvited advertising may be annoying to unsuspecting Internet users, more damaging and mean-spirited uses of the 'author profile' can also be imagined easily. What if your prospective new employer finds the comments you made about the company in a newsgroup article last year? What if they check your author profile for your attitude to drugs (did you ever write an article in rec.drugs.cannabis?) or your sexual orientation (any posts to alt.sex.homosexual?)? What if some extremist group targets you over your support for multiculturalism? Thanks to Deja News and similar services, anything you've ever said may be used against you. The virtual walls of cyberspace have ears, come with a perfect memory, and are prepared to share their knowledge with anyone. A valid line of argument against such criticism notes that we are all responsible for our own actions: newsgroups are, after all, public discussion fora that are open to all participants -- if you make a controversial statement in such a place, you must be prepared to suffer the consequences. However, the threat of being taken out of context must once more be emphasised at this point: while the articles that can be found through Deja News appear to accurately reflect their writers' views, the background against which such views were expressed is much more difficult to extract. Furthermore, only very few newsgroup participants will be aware that their postings are continually being archived, since newsgroups generally appear as a fairly ephemeral medium: only the last few days or, at most, weeks of newsgroup traffic are usually stored on news servers. An awareness of being archived would help writers protect themselves -- it may also serve to impoverish newsgroup discussion, however.
Even more importantly, the already-digital, computer-mediated nature of newsgroup discussions has far-reaching implications. Dealing with units of data that come in a handy, easily stored format, services like Deja News tend to archive Usenet newsgroups indefinitely -- your first flames as a 'clueless newbie', years ago, may therefore today still be used to embarrass you. This is the most important new development: analogue, organic, human memory eventually fades; we tend to organise our memories, and remember those things we regard as most important, while others gradually vanish. Many modern legal systems reflect this process by gradually deleting minor and even major offences from a citizen's criminal records -- they forgive as they forget. Other than in cases of extreme Alzheimer's brought about by server crashes and hard disk defects, digital memory, on the other hand, is perfect for unlimited periods of time: once entered into a database, newsgroup articles may be stored forever; ephemeral discussions become matters of permanent record. The computer doesn't forget. While its many Internet accolades bear witness to the benefits users have found in Deja News, the ethical questions the service raises have hardly been addressed so far, least of all on the Deja News Website itself. While apparently the inclusion of a header line "X-noarchive: yes" may prevent articles from being archived by Deja News, this isn't advertised anywhere; neither are there any statements justifying the unauthorised archiving of newsgroups, or any easily accessible mechanisms for users to have their own articles deleted from the archives. As has often been the case on the Internet, a private organisation has therefore become a semi-official institution, simply by virtue of having thought of an idea first; ethics and laws are left behind by technological development, and find that they have some catching-up to do. Of course, none of this should be seen as condemning Deja News as a malevolent organisation out to spy on Internet participants -- in fact, the company so far appears to have shown admirable restraint in declining to exploit its database. Deja News is at the centre of this controversy only by virtue of having implemented the idea of a 'memory of Usenet' too perfectly: Deja News itself is not the problem, but rather those who would use, to their own and possibly sinister ends, information made available without charge by Deja News. Eventually, in any case, Deja News itself may be overwhelmed by its own creation: with the number of Internet users still continually increasing, and with newsgroup articles accumulating, it is gradually getting harder to find the few most important postings amongst a multitude of discussions (in a similar way, search engine users are beginning to have trouble locating the most relevant Websites on any specific topic amongst a large number of less useful 'vanity' homepages). In the end, then, this new 'perfect' digital memory may have to learn an important capability from its analogue human counterpart: Deja News and similar archives may have to learn how to forget articles of lesser significance. A very simple first step towards this process has already been made: since December 1997, junk mail postings ('spam') are being removed from the Deja News archives (Woods, n. pag.).
While such articles, whose uselessness is almost universally agreed upon by the Internet community at large, constitute a clear case for removal, any further deletions will mean a significant step away from the original Deja News goal of providing a complete archive of Usenet newsgroups, and towards increasingly controversial value-judgments -- who, after all, is to decide which postings are worth archiving, and which are irrelevant? If memory is to be selective, who will do the selecting? Eventually (and even if new memory management technologies help prevent outright deletion by relegating less important information to some sort of second-rate, less accessed memory space), it seems, the problem returns to being an ethical one -- of what is archived where and for how long, of who has access to these data, and of how newsgroup writers can regain control of their articles to protect themselves and prevent abuse. Deja News and the Internet community as a whole would be well-advised to address the problems raised by this perfect memory of originally ephemeral conversations before any major and damaging abuse can occur. References "Company Background." Deja News. 1998. 11 Aug. 1998 〈 http://www.dejanews.com/emarket/about/background.shtml 〉 . "Deja News Invites Internet Users to Search the World's Largest On-Line Newsgroup Archive." Deja News. 30 May 1996. 11 Aug. 1998 〈 http://www.dejanews.com/emarket/about/pr/1996/dnpr_960530.shtml 〉 . Woods, Bob. "Deja News Cuts, Increases Content." Newsbytes 8 Dec. 1997. Citation reference for this article MLA style: Axel Bruns. "Archiving the Ephemeral: Deja News and the Ethics of Perfect Memory." M/C: A Journal of Media and Culture 1.2 (1998). [your date of access] 〈 http://www.uq.edu.au/mc/9808/deja.php 〉 . Chicago style: Axel Bruns, "Archiving the Ephemeral: Deja News and the Ethics of Perfect Memory," M/C: A Journal of Media and Culture 1, no. 2 (1998), 〈 http://www.uq.edu.au/mc/9808/deja.php 〉 ([your date of access]). APA style: Axel Bruns. (1998) Archiving the ephemeral: Deja News and the ethics of perfect memory. M/C: A Journal of Media and Culture 1(2). 〈 http://www.uq.edu.au/mc/9808/deja.php 〉 ([your date of access]).
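    The abstract above mentions that including a header line "X-noarchive: yes" in a posting may keep it out of the Deja News archives. As a minimal sketch of how such a header could be attached to an article, using Python's standard email library -- the address, newsgroup, and wording are invented, and the header is conventionally written "X-No-Archive: yes" (header names are case-insensitive):

        from email.message import EmailMessage

        post = EmailMessage()
        post["From"] = "someone@example.com"    # invented address
        post["Newsgroups"] = "alt.test"         # invented newsgroup
        post["Subject"] = "An ephemeral remark"
        post["X-No-Archive"] = "yes"            # ask archiving services to skip this article
        post.set_content("Please treat this posting as ephemeral.")

        print(post)   # the extra header appears alongside the usual ones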
    Type of Medium: Online Resource
    ISSN: 1441-2616
    Language: Unknown
    Publisher: Queensland University of Technology
    Publication Date: 1998
    ZDB ID: 2018737-3
  • 6
    Online Resource
    Queensland University of Technology ; 1999
    In: M/C Journal, Queensland University of Technology, Vol. 2, No. 4 (1999-06-01)
    Abstract: Welcome to the world of pop. Even to announce this issue in such a way seems like a quaint anachronism, a mild nostalgia; the expression echoes the voices of countless TV presenters on Top of the Pops, Beat Club, Countdown, or whatever your local variety was. This association demonstrates that pop has been historically located in the arts and in popular culture as something connected to the 1960s: not so much to the politicisation of musical intent that embodied the late sixties, but to the current of the three-minute-or-less love song, the early Beatles, the vacant but loving repeat of Andy Warhol and his images. The Archies kept it going into the late sixties along with the Monkees and the 1910 Fruitgum Company's beautiful pop bubble "Yummy Yummy Yummy I've Got Love in My Tummy". What pop implied retrospectively was a clear sentiment of unity even as it set up binarisms that separated the serious and significant in popular culture from the ephemera and the momentary, with the perishable products of pop apparently placed quite clearly on the lighter side. This is why there is a nostalgic association between pop and the world: pop implies a simpler unity of the world that is carried momentarily by the pleasure of the song, the image, the dance. It is also why we associate pop with the transitional moments in our lives: it is the music of preteendom, the images of early youth and the moment of unselfconscious dancing to and in front of this aural and visual landscape provided by the very core of the transnational (read "world") culture industries. Those affective connections to cultural products is what pop art plays with and makes the viewer ponder. At the same time, pop styles also move beyond the preteen stage, grow up and change: within the space of a decade, "Yummy Yummy Yummy" mutated to The Buggles' "Video Killed the Radio Star"; within a similar space of years, a suntanned, mirrorshaded George Michael in Wham! became a Sony-battling (mis)user of public toilets; and eventually, even the once united quintamfeminate of Spice World seems inevitably on the course towards diadochal wars. As much as we may look back on personal and public memories with fondness, pop isn't forever caught in a static McHappyland, where nothing ever changes. Or perhaps that is to say that these (and other) pop styles aren't: beyond the stylistic formations, there seems to be a deeper kind of pop, a kind of primordial soup of popness from which a particular species of pop evolves every once in a while, matures, mutates, and discovers whether it is capable of survival even once it's left home. The momentary pleasure of pop is never completely compartmentalised into an historical moment, however much British popular music documentaries try to produce that effect, or however much the postwar generation venerate their particular liminal moments of the 1960s as the most significant. Pop is regenerative. Just when one thinks that the various strands of popular culture have organised themselves completely into niche markets we have the will-to- worldpop, with something like Spice World or Aqua's Barbie Girl. The term 'pop sensibility' -- sensibility to an underlying 'popness' which doesn't equate with any particular style of pop, but pervades all of them -- is useful here, and it informs some of the articles in this issue. 
Pop sensibility is an understanding of the pleasure generated by popular culture, and recognises that in some ways they point to complex relationships between people and cultural forms. It is difficult to explain why one of our editors (David) enjoys a hit by the boygroup Five while the other (Axel) has serious difficulties telling one boygroup from another, or from many of the more forgettable members of the Stock/Aitken/Waterman stable of the early 80s; but it is partly connected to seeing through the various generations of musical style a pop sensibility that has something to do with accessibility and the pleasure generated by that complex simplicity. Pop engages us with what Fiske described as the "art of making do" and thereby is a conduit to the operations of contemporary culture, industrially and culturally. The 'pop' issue of M/C explores this pop sensibility or, in some cases, a pop sensitivity through a variety of channels that should onomatopoeically "pop" into your thinking processes. Martin Laba's feature article "Picking through the Trash" provides the pin to burst cultural studies' reading of the popular bubble, by identifying and then working through the meaning of the supposed detritus of popular culture that doesn't possess the cultural cache of either 'marginal' or 'hip' status. His inspiration remains Don DeLillo's White Noise for its celebration and lament of the popular as it is organised through consumer culture and the various uses made of the apparent ephemera of contemporary culture. Pop, from Laba's perspective, remains the source for understanding the deep structure of the contemporary, and through detailed investigation in the tradition of DeLillo we can unearth the organisation of cultural value. Sean Smith also dances in the light of consumer culture in his tragicomic "Ya Bloody Cappie!", through his sudden realisation that his hard-working consumption practices had been appropriated as a popular culture practice and demographically defined in a way that made them seem as contrived and deplorable as those of the 1980s yuppie. The identification of the cappie, the Face-designed acronym for Consumer of Alternative Pricey Products, presents a crisis of persona for Smith, and leads to a perceptive reading of this shift as evidence of a new "class formation" through a shifted organisation of the self via a form of exclusive cultural capital. Such media stereotyping gone wrong may be partly behind the atrocities committed by members of the often-quoted "trenchcoat mafia" at Littleton, Colorado, but the media have turned a predictably blind eye to their own complicity in the shootings. In "Seen But Not Heard: Pop Culture Scapegoats and the Media Discourse Hierarchy", Nick Caldwell investigates the incredibly repetitive media patterning of establishing cause and effect relationships between outbreaks of youth violence and the usual suspects of cultural artefacts: 'satanic' popular music and grossly violent and antisocial computer games. Caldwell's article finds the discursive proliferation sadly familiar as the media looks to popular culture to stitch together its neverending narrative without the requisite sideways glance at the cultural context of violence. Benign or malignant, media power is also evident in the excitement leading up to and surrounding the release of Star Wars: The Phantom Menace, and we simply couldn't pass by this major artefact of current pop culture in this issue. 
In many ways, Tara Brabazon's "A Red Light Sabre to Go and Other Histories of the Present" is a process of excavation of popular cultural memory. In an elaborate reclamation program, Brabazon establishes Star Wars as a generational benchmark for a certain affectivity or -- in our terms -- pop sensibility that intersects with how cultural experiences are received by that same generation. Linking the Star Wars generation with Generation X (and her academic/pop self), Brabazon weaves a shifted tapestry of the significance of cultural memory in working out contemporary engagements with culture, and thereby presents whole new territories for the investigation of what Raymond Williams called "the structure of feeling". Cultural studies academics unimpressed with George Lucas's storytelling abilities have plenty of other fields to cover, too, though. Diane Railton's "Justify My Love: Popular Culture and the Academy" provides an invigilating examination of where academics have engaged with popular culture. Her critique is with what may be called new Bourdieuian 'distinctions', where popular music is reintegrated into cultural judgments of taste and thereby simply recategorised with shifted monikers of high (legitimate) and low (illegitimate) designations. Railton calls for a realisation of the political nature of academic work on popular culture that moves beyond this new and shifted constitution of cultural elitism. One of the key divides in research into popular music is about authenticity, which often gets reorganised into new categorisations of cultural value. In "Seeing Sound, Hearing Image: 'Remixing' Authenticity in Popular Music Studies" Steve Jones has provided a map through the debates in popular music studies on how the authentic is deployed by scholars. Jones situates the significance of affect in understanding the pop aesthetic and provides some material for how new technologies are shifting the ground on which popular music's authenticity has been built. Two of the remaining articles in this issue also deal with authenticity in various ways, if not necessarily as the term is used by Jones. In the first of these, "Painting Out Pop: 'Andy Warhol' as a Character in 90s Films", Julie Turnock traces more or less authentic portrayals of Andy Warhol (what would a 'pop' issue be without him?) in recent movies. She uncovers how Andy Warhol's blank visage sits uncomfortably with the narrative and content of three films that need the richness of a normative biography. In the process, the films cannot deal with the conceptualisation of pop that Warhol embodied as an artist, where content disappears to surface and repetition. The celebrity persona of Warhol in its contentlessness is Warhol's ultimate canvas, but the films miss this completely. Where Warhol's celebrity refuses its biopic, David Riddell discovers that sports god Wayne Gretzky's retirement reproduces naturally and seamlessly the spectacle of ice hockey into a movie narrative. Riddell's "Wayne's World: The Making of a Hockey Movie" is a close textual reading of Wayne Gretzky's last game in terms of heavily pre- planned causation which transforms the pleasures of the unexpected that are part of watching any sporting event into the constructed celebrity spectacle, throwing into doubt its authenticity as a sports contest. The blur of speed and spontaneity that is ice hockey becomes the blur of celebrity where fact and fabrication are melted together. Warhol and Gretzky (there's an unexpected pairing!) 
as media superstars both represent the way pop is defined by the cultural industries in all its crassness and oversimplification; frequently, though, the media's attention is self-centred, in a continuous desire to rate their popularity and measure it against that of their rivals. Axel Bruns's "What's Pop, and What's Not? Measuring Popularity in the Many-to-Many Age" questions the meaningfulness of these ratings, and debates the significance of the ways the Internet determines popularity (for example through the ubiquitous counters). Playing against the need to construct an audience to sell to someone (and advertisers are of course always welcome at the bustling M/C site itself) is the manner in which the Internet is constructed, used and abused by its surfers. The mythic models of measuring the television audience prove to be inadequate to describe the forms of interactions and sideward hypertext movements on the contemporary Web. Nevertheless, the counting goes on.... Finally, we turn to myths of a different kind. There is a certain pop sensitivity that Adam Dodd's article, "Making It Unpopular: The CIA and UFOs in Popular Culture", identifies in 1950s America. Dodd's provocatively argued piece indicates that a fear of mass hysteria motivated moves by the CIA and other government agencies to debunk through apparent explanation any possibility that UFOs actually existed and were seen. The desire to believe was so strong in the popular will that the American agencies felt compelled to employ propagandistic techniques to manipulate that belief. Although, with the amount of propaganda and misinformation masquerading as fact, we may never know for certain, Dodd presents an interesting case study in the government control and movement of information about a popular cultural phenomenon. From "Yummy Yummy Yummy" to White Noise, from Warhol to Gretzky, from satanic music to academically accepted 'pop', from Star Wars to 'real' UFOs, the scope of this issue of M/C demonstrates the wide reach and diversity of 'the popular'. As issue editors, we hope it will also prove popular with our readers (a pun which had to be made eventually), and won't leave the shallow aftertaste of so much average pop. Much rather, we'd like you to remember once again those 60s pop music shows and agree that "it's a hit!" (And feel free to hit M/C's pages frequently and repeatedly.) P. David Marshall Axel Bruns 'Pop' Issue Editors Citation reference for this article MLA style: P. David Marshall, Axel Bruns. "Editorial: 'Pop'." M/C: A Journal of Media and Culture 2.4 (1999). [your date of access] 〈 http://www.uq.edu.au/mc/9906/edit.php 〉 . Chicago style: P. David Marshall, Axel Bruns, "Editorial: 'Pop'," M/C: A Journal of Media and Culture 2, no. 4 (1999), 〈 http://www.uq.edu.au/mc/9906/edit.php 〉 ([your date of access]). APA style: P. David Marshall, Axel Bruns. (1999) Editorial: 'Pop'. M/C: A Journal of Media and Culture 2(4). 〈 http://www.uq.edu.au/mc/9906/edit.php 〉 ([your date of access]).
    Type of Medium: Online Resource
    ISSN: 1441-2616
    RVK:
    Language: Unknown
    Publisher: Queensland University of Technology
    Publication Date: 1999
    detail.hit.zdb_id: 2018737-3
  • 7
    Online Resource
    Online Resource
    Queensland University of Technology ; 1999
    In:  M/C Journal Vol. 2, No. 5 ( 1999-07-01)
    In: M/C Journal, Queensland University of Technology, Vol. 2, No. 5 ( 1999-07-01)
Abstract: Practically any good story follows certain narrative conventions in order to hold its readers' attention and leave them with a feeling of satisfaction -- this goes for fictional tales as well as for many news reports (we do tend to call them 'news stories', after all), for idle gossip as well as for academic papers. In the Western tradition of storytelling, it's customary to start with the exposition, build up to major events, and end with some form of narrative closure. Indeed, audience members will feel disturbed if there is no sense of closure at the end -- their desire for closure is a powerful one. From this brief description of narrative patterns it is also clear that such narratives depend crucially on linear progression through the story in order to work -- there may be flashbacks and flashforwards, but very few stories, it seems, could get away with beginning with their point of closure and working back to the exposition. Closure, as the word suggests, closes the story, and once reached, the audience is left with the feeling of now knowing the whole story, of having all the pieces necessary to understand its events. To understand how important the desire to reach this point is to the audience, just observe the discussions of holes in the plot which people have when they're leaving a cinema: they're trying to reach a better sense of closure than was afforded them by the movie itself. In linearly progressing media, this seems, if you'll pardon the pun, straightforward. Readers know when they've finished an article or a book, viewers know when a movie or a broadcast is over, and they'll be able to assess then if they've reached sufficient closure -- if their desires have been fulfilled. On the World Wide Web, this is much more difficult: "once we have it in our hands, the whole of a book is accessible to us readers. However, in front of an electronic read-only hypertext document we are at the mercy of the author since we will only be able to activate the links which the author has provided" (McKnight et al. 119). In many cases, it's not even clear whether we've reached the end of the text already: just where does a Website end? Does the question even make sense? Consider the following example, reported by Larry Friedlander: I watched visitors explore an interactive program in a museum, one that contained a vast amount of material -- pictures, film, historic explanations, models, simulations. I was impressed by the range of subject matter and by the ambitiousness and polish of the presentation. ... But to my surprise, as I watched visitors going down one pathway after another, I noticed a certain dispirited glaze spread over their faces. They seemed to lose interest quite quickly and, in fact, soon stopped their explorations. (163) Part of the problem here may just have been the location of the programme, of course -- when you're out in public, you might just not have the time to browse as extensively as you could from your computer at home. But there are other explanations, too: the sheer number of options for exploration may have been overwhelming -- there may not have been any apparent purpose to aim for, any closure to arrive at. This is a problem inherent in hypertext, particularly in networked systems like the Web: it "changes our conception of an ending. Different readers can choose not only to end the text at different points but also to add to and extend it. 
In hypertext there is no final version, and therefore no last word: a new idea or reinterpretation is always possible. ... By privileging intertextuality, hypertext provides a large number of points to which other texts can attach themselves" (Snyder 57). In other words, there will always be more out there than any reader could possibly explore, since new documents are constantly being added. There is no ending if a text is constantly extended. (In print media this problem appears only to a far more limited extent: there, intertextuality is mostly implicit, and even though new articles may constantly be added -- 'linked', if you will -- to a discourse, due to the medium's physical nature they're still very much separate entities, while Web links make intertextuality explicit and directly connect texts.) Does this mark the end of closure, then? Adding to the problem is the fact that it's not even possible to know how much of the hypertextual information available is still left unexplored, since there is no universal register of all the information available on the Web -- "the extent of hypertext is unknowable because it lacks clear boundaries and is often multi-authored" (Snyder 19). While reading a book you can check how many more pages you've got to go, but on the Web this is not an option. Our traditions of information transmission create this desire for closure, but the inherent nature of the medium prevents us from ever satisfying it. Barrett waxes lyrical in describing this dilemma: contexts presented online are often too limited for what we really want: an environment that delivers objects of desire -- to know more, see more, learn more, express more. We fear being caught in Medusa's gaze, of being transfixed before the end is reached; yet we want the head of Medusa safely on our shield to freeze the bitstream, the fleeting imagery, the unstoppable textualisations. We want, not the dead object, but the living body in its connections to its world, connections that sustain it, give it meaning. (xiv-xv) We want nothing less, that is, than closure without closing: we desire the knowledge we need, and the feeling that that knowledge is sufficient to really know about a topic, but we don't want to devalue that knowledge in the same process by removing it from its context and reducing it to trivial truisms. We want the networked knowledge base that the Web is able to offer, but we don't want to feel overwhelmed by the unfathomable dimensions of that network. This is increasingly difficult the more knowledge is included in that network -- "with the growth of knowledge comes decreasing certainty. The confidence that went with objectivity must give way to the insecurity that comes from knowing that all is relative" (Smith 206). The fact that 'all is relative' is one which predates the Net, of course, and it isn't the Internet or the World Wide Web that has destroyed objectivity -- objectivity has always been an illusion, no matter how strongly journalists or scientists have at times laid claim to it. Internet-based media have simply stripped away more of the pretences, and laid bare the subjective nature of all information; in the process, they have also uncovered the fact that the desire for closure must ultimately remain unfulfilled in any sufficiently non-trivial case. 
Nonetheless, the early history of the Web has seen attempts to connect all the information available (LEO, one of the first major German Internet resource centres, for example, took its initials from its mission to 'Link Everything Online') -- but as the amount of information on the Net exploded, more and more editorial choices of what to include and what to leave out had to be made, so that now even search engines like Yahoo! and Altavista quite clearly and openly offer only a selection of what they consider useful sites on the Web. Web browsers still hoping to find everything on a certain topic would be well-advised to check with all major search engines, as well as important resource centres in the specific field. The average Web user would probably be happy with picking the search engine, Web directory or Web ring they find easiest to use, and sticking with it. The multitude of available options here actually shows one strength of the Internet and similar networks -- "the computer permits many [organisational] structures to coexist in the same electronic text: tree structures, circles, and lines can cross and recross without obstructing one another. The encyclopedic impulse to organise can run riot in this new technology of writing" (Bolter 95). Still, this multitude of options is also likely to confuse some users: in particular, "novices do not know in which order they need to read the material or how much they should read. They don't know what they don't know. Therefore learners might be sidetracked into some obscure corner of the information space instead of covering the important basic information" (Nielsen 190). They're like first-time visitors to a library -- but this library has constantly shifting aisles, more or less well-known pathways into specialty collections, fiercely competing groups of librarians, and it extends almost infinitely. Of course, the design of the available search and information tools plays an important role here, too -- far more than it is possible to explore at this point. Gay makes the general observation that "visual interfaces and navigational tools that allow quick browsing of information layout and database components are more effective at locating information ... than traditional index or text-based search tools. However, it should be noted that users are less secure in their findings. Users feel that they have not conducted complete searches when they use visual tools and interfaces" (185). Such technical difficulties (especially for novices) will slow take-up of, and lower satisfaction with, the medium (and many negative views of the Web can probably be traced to this dissatisfaction with the result of searches -- in other words, to a lack of satisfaction of the desire for closure); while many novices eventually overcome their initial confusion and become more Web-savvy, others might disregard the medium as unsuitable for their needs. At the other extreme of the scale, the inherent lack of closure, in combination with the societally deeply ingrained desire for it, may also be a strong contributing factor for another negative phenomenon associated with the Internet: that of Net users becoming Net junkies, who spend every available moment online. 
Where the desire to know, to get to the bottom (or more to the point: to the end) of a topic, becomes overwhelming, and where the fundamental unattainability of this goal remains unrealised, the step to an obsession with finding information seems a small one; indeed, the neverending search for that piece of knowledge surpassing all previously found ones seems to have obvious similarities to drug addiction with its search for the high to better all previous highs. And most likely, the addiction is only heightened by the knowledge that on the Web, new pieces of information are constantly being added -- an endless, and largely free, supply of drugs... There is no easy solution to this problem -- in the end, it is up to the user to avoid becoming an addict, and to keep in mind that there is no such thing as total knowledge. Web designers and content providers can help, though: "there are ways of orienting the reader in an electronic document, but in any true hypertext the ending must remain tentative. An electronic text never needs to end" (Bolter 87). As Tennant & Heilmeier elaborate, "the coming ease-of-use problem is one of developing transparent complexity -- of revealing the limits and the extent of vast coverage to users, and showing how the many known techniques for putting it all together can be used most effectively -- of complexity that reveals itself as powerful simplicity" (122). We have been seeing, therefore, the emergence of a new class of Websites: resource centres which help their visitors to understand a certain topic and view it from all possible angles, which point them in the direction of further information on- and off-site, and which give them an indication of how much they need to know to understand the topic to a certain degree. In this, they must ideally be very transparent, as Tennant & Heilmeier point out -- having accepted that there is no such thing as objectivity, it is necessary for these sites to point out that their offered insight into the field is only one of many possible approaches, and that their presented choice of information is based on subjective editorial decisions. They may present preferred readings, but they must indicate that these readings are open for debate. They may help satisfy some of their readers' desire for closure, but they must at the same time point out that they do so by presenting a temporary ending beyond which a more general story continues. If, as suggested above, closure crucially depends on a linear mode of presentation, such sites in their arguments help trace one linear route through the network of knowledge available online; they impose a linear from-us-to-you model of transmission on the normally unordered many-to-many structure of the Net. In the face of much doomsaying about the broadcast media, then, here is one possible future for these linear transmission media, and it's no surprise that such Internet 'push' broad- or narrowcasting is a growth area of the Net -- simply put, it serves the apparent need of users to be told stories, to have their desire for closure satisfied through clear narrative progressions from exposition through development to end. (This isn't 'push' as such, really: it's more a kind of 'push on demand'.) But at the same time, this won't mean the end of the unstructured, networked information that the Web offers: even such linear media ultimately build on that networked pool of knowledge. The Internet has simply made this pool public -- passively as well as actively accessible to everybody. 
Now, however, Web designers (and this includes each and every one of us, ultimately) must work "with the users foremost in mind, making sure that at every point there is a clear, simple and focussed experience that hooks them into the welter of information presented" (Friedlander 164); they must play to the desire for closure. (As with any preferred reading, however, there is also a danger that that closure is premature, and that the users' process of meaning-making is contained and stifled rather than aided.) To return briefly to Friedlander's experience with the interactive museum exhibit: he draws the conclusion that visitors were simply overwhelmed by the sheer mass of information and were reluctant to continue accumulating facts without a guiding purpose, without some sense of how or why they could use all this material. The technology that delivers immense bundles of data does not simultaneously deliver a reason for accumulating so much information, nor a way for the user to order and make sense of it. That is the designer's task. The pressing challenge of multimedia design is to transform information into usable and useful knowledge. (163) Perhaps this transformation is exactly what is at the heart of fulfilling the desire for closure: we feel satisfied when we feel we know something, have learnt something from a presentation of information (no matter if it's a news report or a fictional story). Nonetheless, this satisfaction must of necessity remain intermediate -- there is always much more still to be discovered. "From the hypertext viewpoint knowledge is infinite: we can never know the whole extent of it but only have a perspective on it. ... Life is in real-time and we are forced to be selective, we decide that this much constitutes one node and only these links are worth representing" (Beardon & Worden 69). This is not inherently different from processes in other media, where bandwidth limitations may even force much stricter gatekeeping regimes, but as in many cases the Internet brings these processes out into the open, exposes their workings and stresses the fundamental subjectivity of information. Users of hypertext (as indeed users of any medium) must be aware of this: "readers themselves participate in the organisation of the encyclopedia. They are not limited to the references created by the editors, since at any point they can initiate a search for a word or phrase that takes them to another article. They might also make their own explicit references (hypertextual links) for their own purposes ... . It is always a short step from electronic reading to electronic writing, from determining the order of texts to altering their structure" (Bolter 95). Significantly, too, it is this potential for wide public participation which has made the Internet into the medium of the day, and led to the World Wide Web's exponential growth; as Bolter describes, "today we cannot hope for permanence and for general agreement on the order of things -- in encyclopedias any more than in politics and the arts. What we have instead is a view of knowledge as collections of (verbal and visual) ideas that can arrange themselves into a kaleidoscope of hierarchical and associative patterns -- each pattern meeting the needs of one class of readers on one occasion" (97). To those searching for some meaningful 'universal truth', this will sound defeatist, but ultimately it is closer to realism -- one person's universal truth is another one's escapist phantasy, after all. 
This doesn't keep most of us from hoping and searching for that deeper insight, however -- and from the preceding discussion, it seems likely that in this we are driven by the desire for closure that has been imprinted in us so deeply by the multitudes of narrative structures we encounter each day. It's no surprise, then, that, as Barrett writes, "the virtual environment is a place of longing. Cyberspace is an odyssey without telos, and therefore without meaning. ... Yet cyberspace is also the theatre of operations for the reconstruction of the lost body of knowledge, or, perhaps more correctly, not the reconstruction, but the always primary construction of a body of knowing. Thought and language in a virtual environment seek a higher synthesis, a re-imagining of an idea in the context of its truth" (xvi). And so we search on, following that by definition end-less quest to satisfy our desire for closure, and sticking largely to the narrative structures handed down to us through the generations. This article is no exception, of course -- but while you may gain some sense of closure from it, it is inevitable that there is a deeper feeling of a lack of closure, too, as the article takes its place in a wider hypertextual context, where so much more is still left unexplored: other articles in this issue, other issues of M/C, and further journals and Websites adding to the debate. Remember this, then: you decide when and where to stop. References Barrett, Edward, and Marie Redmont, eds. Contextual Media: Multimedia and Interpretation. Cambridge, Mass.: MIT P, 1995. Barrett, Edward. "Hiding the Head of Medusa: Objects and Desire in a Virtual Environment." Barrett & Redmont xi-xvi. Beardon, Colin, and Suzette Worden. "The Virtual Curator: Multimedia Technologies and the Roles of Museums." Barrett & Redmont 63-86. Bolter, Jay David. Writing Space: The Computer, Hypertext, and the History of Writing. Hillsdale, N.J.: Lawrence Erlbaum Associates, 1991. Friedlander, Larry. "Spaces of Experience on Designing Multimedia Applications." Barrett & Redmont 163-74. Gay, Geri. "Issues in Accessing and Constructing Multimedia Documents." Barrett & Redmont 175-88. McKnight, Cliff, John Richardson, and Andrew Dillon. "The Authoring of Hypertext Documents." Hypertext: Theory into Practice. Ed. Ray McAleese. Oxford: Intellect, 1993. Nielsen, Jakob. Hypertext and Hypermedia. Boston: Academic Press, 1990. Smith, Anthony. Goodbye Gutenberg: The Newspaper Revolution of the 1980's [sic]. New York: Oxford UP, 1980. Snyder, Ilana. Hypertext: The Electronic Labyrinth. Carlton South: Melbourne UP, 1996. Tennant, Harry, and George H. Heilmeier. "Knowledge and Equality: Harnessing the Truth of Information Abundance." Technology 2001: The Future of Computing and Communications. Ed. Derek Leebaert. Cambridge, Mass.: MIT P, 1991. Citation reference for this article MLA style: Axel Bruns. "What's the Story: The Unfulfilled Desire for Closure on the Web." M/C: A Journal of Media and Culture 2.5 (1999). [your date of access] 〈 http://www.uq.edu.au/mc/9907/closure.php 〉 . Chicago style: Axel Bruns, "What's the Story: The Unfulfilled Desire for Closure on the Web," M/C: A Journal of Media and Culture 2, no. 5 (1999), 〈 http://www.uq.edu.au/mc/9907/closure.php 〉 ([your date of access]). APA style: Axel Bruns. (1999) What's the story: the unfulfilled desire for closure on the Web. M/C: A Journal of Media and Culture 2(5). 〈 http://www.uq.edu.au/mc/9907/closure.php 〉 ([your date of access]).
    Type of Medium: Online Resource
    ISSN: 1441-2616
    RVK:
    Language: Unknown
    Publisher: Queensland University of Technology
    Publication Date: 1999
    detail.hit.zdb_id: 2018737-3
  • 8
    Online Resource
    Online Resource
    Queensland University of Technology ; 1999
    In:  M/C Journal Vol. 2, No. 4 ( 1999-06-01)
    In: M/C Journal, Queensland University of Technology, Vol. 2, No. 4 ( 1999-06-01)
Abstract: Have you noticed the proliferation of access statistics icons on your favourite bands' Websites? How do you feel about being told you're visitor number 10870 to the Star Wars hate page? Have you wondered why you don't gain weight from all the cookies you seem to be getting from your Internet music retailer? Did you sign a complete stranger's online guestbook? Are you annoyed with the dozens of pop-up windows that keep asking you to 'RATE THIS SITE!!!'? Don't worry: it's not you, it's them. You're witnessing the symptoms of existential angst. In most media, to be seen, read, or heard is everything. To have an audience, preferably a large and loyal one, is crucial: in the mass media's view, audience size and share determine popularity, and popularity attracts private and/or public funding. 'Popular', for these media, doesn't mean much active intervention from the people (in contrast to the way the word is often used in cultural studies): 'popularity' means a solid base of dedicated and continuous users, preferably larger than that of their competitors. Commercial Websites must similarly justify their setup and running costs by the number of visitors they attract, but not-for-profit and private Webmasters, too, usually need that knowledge to justify and reward the effort that has gone into the site. A Website without visitors might just as well not exist at all. The problem is that on the Web this form of popularity is almost impossible to determine with any accuracy -- despite the multitude of measuring methods you're likely to be subjected to within just an hour of heavy Web browsing. That's not to say that some major sites on the Web aren't quite obviously major sites: the Amazons, CDnows and Yahoo!s of the Web are clearly visited by thousands, even millions of users each day. But for the majority of medium and minor content providers, the situation is far from clear, especially if the attention is focussed on the relative audience shares between a number of comparable services. These providers have a hard time determining whether they're amongst the leading sites in their field, and whether they're known to and enjoyed by a sufficient share of their target audience. Such difficulties are largely a continuation of similar problems in other media, and so it's worth taking a brief tour through the depths of audience measurement elsewhere. Audience research has become an important industry, but what's often overlooked in the endless battle for better ratings is that those ratings are often quite misleading -- the more so the less material a medium appears. While for culture that is linked to material artefacts (books, CDs, videos, newspapers) some relatively credible circulation, sales and unsold returns figures can usually be obtained (although magazines often multiply these figures by a set number to generate more impressive 'readership' figures), there is no direct-feedback way of gauging how many listeners tune in to a particular radio programme, or watch a certain television show. The number of 'hits' (to borrow a Web term) to a programme cannot be monitored by a station itself; instead, it relies on peoplemeters placed in a selection of supposedly representative households to log such accesses. 
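To make the sampling logic concrete, here is a minimal sketch (a hypothetical illustration, not drawn from the article; the function name and all figures are invented) of how a peoplemeter-style sample might be projected onto a whole population, and of the purely statistical uncertainty such an estimate carries even before the contextual problems discussed below are taken into account:

    import math

    def estimate_audience(sample_viewers, sample_size, population):
        """Project a peoplemeter-style sample onto a whole population.

        Returns the estimated share, the projected audience, and a rough
        95% confidence interval (normal approximation to the binomial).
        Note: this counts tuned-in sets only -- it says nothing about
        whether anyone is actually watching.
        """
        share = sample_viewers / sample_size
        margin = 1.96 * math.sqrt(share * (1 - share) / sample_size)
        return {
            "share": share,
            "audience": share * population,
            "interval": ((share - margin) * population,
                         (share + margin) * population),
        }

    # Invented figures: 600 of 2,000 metered households tuned in,
    # projected onto 6 million TV households.
    print(estimate_audience(600, 2000, 6_000_000))

Even this idealised calculation leaves an interval roughly a quarter of a million households wide; the contextual objections that follow widen it much further.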
Additionally, there is the general question of what consumers do with any product, and whether every access to it can honestly be counted towards its popularity: I may buy the weekend newspaper only for the personal ads, disregarding its editorial content; you may channel-surf across the available TV programmes without really watching any of them attentively -- and alternatively, I may make copies of a CD I've bought for any number of friends; and you may tape a radio programme to listen to (repeatedly, even) at a later time. This real-life context of accesses will usually either escape or confuse peoplemeter devices: they may keep a record of what channels the family TV was tuned in to at any particular time -- but what they cannot record is whether a viewer has fallen asleep, turned the sound off while talking on the phone, or gone to the kitchen to fix dinner; or indeed whether the VCR is at the same time recording another show. Additionally, it is also highly doubtful that households with peoplemeters accurately represent the viewing habits of the wider population: the anecdote that current affairs shows regularly rate extraordinarily well if they include a story about families with peoplemeters is only an obvious example here. The more diverse the range of situational settings for the consumption of a particular medium, the less likely is it that any sample group of consumers can accurately represent the audience as a whole -- and the more we study consumption contexts, the more individualised they appear, as Ang has pointed out for television: "emphasis on the situational embeddedness of audience practices and experiences inevitably undercuts the search for generalisations" which audience research with its scientific approach engages in (164). Above a certain level of situational diversity such generalisations can only find a lowest common denominator which is trivial and largely useless: a certain size of audience may have been tuned in at one time or another, but for how long or with what degree of satisfaction remains unclear. Recent developments in the mass media have only increased the diversity of access situations, however. First, there is the ongoing expansion in available media channels. Where in Australia there used to be only a handful of television networks, for example, the introduction of pay-TV has added dozens more channels, few of which are available to all viewers; and where there used to be only a few daily newspapers, the rise of carrier media such as the World Wide Web now means that readers can make the New York Times or the Süddeutsche Zeitung rather than the Sydney Morning Herald or, heaven forbid, the Courier-Mail their preferred morning paper, if they so desire. Such developments further underline the point that for example "the boundaries of 'television audience', even in the most simple, one dimensional terms, are impossible to define. Those boundaries are blurred rather than sharply demarcated, precarious rather than absolute" (Ang 154). This raises the general problem of defining the exact boundaries of a media market, and the channels through which this market is accessed by producers and consumers. A cultural product's 'popularity', if expressed in the number of accesses to the product, can only possibly be measured with any degree of accuracy at the bottlenecks through which products must pass into and out of the market: for material goods, this is the distribution process, where the number of products (newspapers, books, CDs, etc.) 
shipped can be listed against the number of unsold products returned, and circulation figures can be calculated. (Whatever the means of measurement at these bottlenecks, it is clear that the measurement itself must be automatic, and cannot rely on the users themselves: survey-based audience research results are questionable ab initio, since they are drawn only from that part of the audience that is willing to participate, and thus rule out those users who may variously be less active or less interested, or conversely more suspicious or more active -- and thus too busy to fill in a survey.) For less 'material' cultural products, the bottlenecks reside in the equipment needed to send and receive them: radio and TV sets, for example -- but as we have seen, this bottleneck can be bypassed with the help of sound and video recorders, and new media forms such as the Internet, which provide additional access channels to the older media; it is also a bottleneck that is less accessible to researchers than that on the distributors' side. How many peoplemeters are there next to PCs with TV tuner cards? How should accesses to online editions be figured into the circulation numbers of newspapers? Ironically, unlike electronic broadcast media the Internet does appear to offer a way to directly measure audience access to content, of course: as a 'pull' medium which requires the user to request content individually rather than the provider to send programming indiscriminately, such individual accesses (predominantly to Web pages) can be monitored. But for the same reason that peoplemeter statistics are fundamentally inaccurate, so are Web counter data: accesses ('hits') don't equal readers, since Web browsers may jump elsewhere without having read a whole page, and since proxy servers may access a page once, but redistribute that page to any number of clients. Again, the situational context of access cannot be monitored with such relatively simplistic measures -- and it can be argued that the range of diversity for Web access situations is even greater than it is for other electronic mass media; while TV access (with any degree of attention), for example, remains largely in recreational settings, engaged Web access spreads from these to offices, laboratories, libraries, and cafés. 
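As a toy demonstration of why hits don't equal readers, consider the following sketch (hypothetical throughout -- the log format, hostnames and paths are invented for illustration): a caching proxy shows up as a single client no matter how many readers sit behind it, while a single reader reloading a page inflates the raw hit count.

    from collections import Counter

    def summarise_log(entries):
        """Contrast raw 'hits' with distinct clients in a toy access log.

        Each entry is assumed to be "<client> <path>" -- a stand-in for a
        real server log format. Raw hits overcount readers (reloads,
        abandoned pages) and undercount them at the same time, since a
        proxy fetches a page once and redistributes it to its own users.
        """
        hits = Counter()
        clients = {}
        for entry in entries:
            client, path = entry.split()
            hits[path] += 1
            clients.setdefault(path, set()).add(client)
        for path, count in hits.items():
            print(f"{path}: {count} hits, {len(clients[path])} distinct clients")

    # Invented log: a proxy requesting the same page twice (two readers?
    # one reload? the log cannot tell), plus one direct visitor.
    summarise_log([
        "proxy.example.net /index.html",
        "proxy.example.net /index.html",
        "203.0.113.7 /index.html",
        "203.0.113.7 /reviews.html",
    ])

Neither the three hits on /index.html nor the two distinct clients reveal how many people actually read the page -- which is precisely the counter's problem.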
Cultural producers can still take some information from their access statistics, of course -- no matter how inaccurate the figures, a thousand hits per day are still better than ten, and while page reloads and browsing durations may indicate technical problems or extraneous distractions just as much as attentive engagement, such data too may be useful to some extent. Web publishers may even try to compare their figures with those of other Websites which they regard as competitors in the field. It has become impossible, though, to claim market and audience shares with any degree of accuracy: when the total size of the audience cannot be determined, no percentages can be calculated; ratings-based systems will fail. This is a major shift especially for the entertainment industry, where ratings battles have become notorious; it is a shift directly related to the unregulated, unlimited nature of the online market, where no limits on the number of competitors exist or can be enforced (a situation markedly different from that in the practically closed TV and radio markets in many countries), and it is a shift which may lead to a good deal of paranoia on the part of the established media outlets: on the Web, there is always a danger that upstart competitors could snatch a share of the market (a development, moreover, which wouldn't show early on in any ratings figures). While popularity ratings weren't an exact science at the best of times, then, they are becoming hopelessly inaccurate as media and audiences change -- not just in the case of the Web, but (as we gradually move towards a much-anticipated media convergence) in the case of many others as well. Few media forms will remain unaffected by these developments: as 'pop' music fragments into multitudes of sub-genres, for example, each with their own radio stations (terrestrial as well as online), publications, record labels, CD shops, or even online distribution schemes, does it still make sense to speak of 'popular' music? As we gain access to a global media market with Thai newspapers, Brazilian radio stations, and German TV programmes only a click of the mouse away, is there still a point to local or national ratings figures? 
Such questions haven't necessarily stopped ratings users from relying on them in the past, of course -- Ang's critique of TV audience ratings was published in 1991, but the ratings appear no less important to TV stations now than they did then. Ang expected this: "television institutions ... are likely to continue the quest for encompassing, objectified constructions of 'television audience' -- as the continued search for the perfect audience measurement technology suggests" (155). For newer media like the Web, though, this troubled experience with audience measurement in television and elsewhere, and the many impracticalities of accurately measuring Web audiences, may serve to tame the desire for similarly "conveniently objectified information" (Ang 152) on audience participational patterns -- information which fails to take note of the context of such participation -- before that desire develops into a TV-style obsession with one's own popularity as expressed through ratings and audience sizes. Indeed, once the novelty of Website access statistics has worn off, perhaps this is where we return to a different conception of 'popularity'. As the mass media splinter into collections of specialty channels, as the audience differentiates into individuals belonging to and moving through any number of interest groups in the course of a single day, with each group gradually gaining access to their own channels, and as many-to-many media give certain people (though not everybody) the ability to communicate without the need to subject themselves to mediation by any existing media institution, perhaps the translation of 'popular' as 'from the people' is once again on the ascendancy. And at the very least, as the ratings' accuracy continues to deteriorate, so will their relevance and importance, and cultural producers may feel less strongly the need to appeal to the lowest common taste denominator. That can't be a bad thing. References Ang, Ien. Desperately Seeking the Audience. London: Routledge, 1991. Citation reference for this article MLA style: Axel Bruns. "What's Pop, and What's Not? Measuring Popularity in the Many-to-Many Age." M/C: A Journal of Media and Culture 2.4 (1999). [your date of access] 〈 http://www.uq.edu.au/mc/9906/what.php 〉 . Chicago style: Axel Bruns, "What's Pop, and What's Not? Measuring Popularity in the Many-to-Many Age," M/C: A Journal of Media and Culture 2, no. 4 (1999), 〈 http://www.uq.edu.au/mc/9906/what.php 〉 ([your date of access]). APA style: Axel Bruns. (1999) What's pop, and what's not? Measuring popularity in the many-to-many age. M/C: A Journal of Media and Culture 2(4). 〈 http://www.uq.edu.au/mc/9906/what.php 〉 ([your date of access]).
    Type of Medium: Online Resource
    ISSN: 1441-2616
    RVK:
    Language: Unknown
    Publisher: Queensland University of Technology
    Publication Date: 1999
    detail.hit.zdb_id: 2018737-3