
Posts Tagged ‘web’

I’ve just given a study day about the internet and new media, and it forced me to get my head around some of the jargon and the ideas. Here is my summary of what these terms mean and where the digital world is going.

Web 1.0: The first generation of internet technology. You call up pages of text and images with incredible speed and facility. It’s no different from strolling through a library, only much quicker. The operative verb is I LOOK. I look at pages on the screen just as I look at pages in a book. All content is provided for you – it’s a form of publishing. It may be updated in a way that is impossible when a solid book is sitting on your shelf, but you can’t change the content yourself.

Web 2.0: The second generation of internet technology allows for user-generated content. You don’t just look at the pages, you alter them. You write your own blog; you comment on someone else’s article in the comment boxes; you edit an entry on Wikipedia. And then, by extension, with basically the same technology, you share your thoughts on a social networking site, which means you are commenting not on a static site, but on something that is itself in flux. You have moved from action to interaction; from connection to interconnection. If Web 1.0 is like a digital library, Web 2.0 is like a digital ‘Letter to the Editor’, a digital conference call, a digital group discussion. The verb here is I PARTICIPATE.
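As a minimal sketch of that difference (in Python, a toy illustration of my own rather than any real site's code): a Web 1.0 server simply hands back pages as its owner published them, while a Web 2.0 site also accepts and stores what its readers write back.

```python
# Toy contrast between a read-only site (Web 1.0) and a participatory one (Web 2.0).
# All names and data here are made up for illustration.

# Web 1.0: content is published once; visitors can only read it.
PAGES = {
    "/about": "A page of text and images, written by the site's owner.",
}

def look(path):
    """Web 1.0 verb: I LOOK. Return the page exactly as published."""
    return PAGES.get(path, "404 Not Found")

# Web 2.0: the same site also stores content that its visitors create.
COMMENTS = {}

def participate(path, author, text):
    """Web 2.0 verb: I PARTICIPATE. A reader adds their own content to the page."""
    COMMENTS.setdefault(path, []).append((author, text))

if __name__ == "__main__":
    print(look("/about"))
    participate("/about", "reader42", "Interesting - but what counts as Web 3.0?")
    print(COMMENTS["/about"])
```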

Web 3.0: People disagree about the meaning of Web 3.0, about where the web is now going. I like John Smart's idea of an emerging Metaverse, where the virtual and physical worlds converge. In the world of Web 2.0, of user-generated content and social networking, you stand in the physical/natural/real world and use the new media to help you around that world – the new media are tools. You talk to friends, you share ideas, you buy things that have been suggested and reviewed by others. But in Web 3.0 the new media become an essential part of the world in which you are living: they help to create that world, and you live within them.

The border between Web 2.0 and Web 3.0 is not tidy here, because Web 3.0 is partly about Web 2.0 becoming all-pervasive and continuous, so that your connection with the web and your social network is an essential part of every experience – it doesn’t get switched off. The mobile earpiece is always open to the chatter of others. The texts and status updates of your friends are projected into the corner of your Google Glass (like the speedometers projected onto a car windscreen), so that they accompany what you are doing at every moment; the connection between real and virtual, between here and there, is seamless. The attention you give to every shop or product or street or person is digitally noted, through the head and eye movement sensors built into your glasses and the GPS in your phone, and simultaneously you are fed (into the corner of your glasses, or into your earpiece) layers of information about what is in front of you: reviews of the product, reminders of what you need to buy from the shop, warnings about the crime rate on this street, a note about the birthday and the names of the children of the person you are about to pass, and so on. This is augmented reality or enhanced reality or layered reality.

It’s no different, in essence, from going for a stroll in the late 70s with your first Walkman – creating for the first time your own soundtrack as you wander through the real world; or having the natural landscape around you altered by neon lights and billboards. But it is this experience a thousand times over, so that it is no longer possible to live in a non-virtual world, because every aspect of the real world is already augmented by some aspect of virtual reality. The verb here is I EXIST. I don’t just look at the virtual world, or use it to participate in real relationships; now I exist within this world.
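To make 'layered reality' a little more concrete, here is a rough sketch, entirely invented (made-up coordinates, notes and function names): take a GPS position, find whatever annotations have been pinned nearby, and feed them into the corner of the glasses.

```python
import math

# Hypothetical geotagged annotations: (latitude, longitude, note).
ANNOTATIONS = [
    (51.5074, -0.1278, "Reminder: you still need to buy milk from this shop."),
    (51.5076, -0.1280, "Review: 4 stars for the coffee here."),
    (51.6000, -0.2000, "It's the birthday of the person you are about to pass."),
]

def distance_m(lat1, lon1, lat2, lon2):
    """Approximate distance in metres between two points (haversine formula)."""
    r = 6_371_000  # Earth's radius in metres
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def layer_for(lat, lon, radius_m=100):
    """Return the notes that would be projected into the corner of your glasses."""
    return [note for (alat, alon, note) in ANNOTATIONS
            if distance_m(lat, lon, alat, alon) <= radius_m]

if __name__ == "__main__":
    print(layer_for(51.5075, -0.1279))  # only the two nearby notes appear
```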

Web 4.0: Some people say this is the Semantic Web (‘semantics’ is the science of meaning), when various programmes, machines, and the web itself become ‘intelligent’, start to create new meanings that were not programmed into them, and interact with us in ways that were not predicted or predictable beforehand. It doesn’t actually require some strict definition of ‘artificial intelligence’ or ‘consciousness’ for the computers; it just means that they start doing new things themselves – whatever the philosophers judge is or is not going on in their ‘minds’.
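As a toy illustration of that semantic idea, in plain Python rather than the web's real RDF machinery, and with facts I have made up: knowledge is stored as subject-predicate-object triples, and a simple rule lets the machine derive statements that nobody typed in explicitly.

```python
# Facts stored as (subject, predicate, object) triples - the basic shape of Semantic Web data.
FACTS = {
    ("Rover", "is_a", "dog"),
    ("dog", "is_a", "mammal"),
    ("mammal", "is_a", "animal"),
}

def infer(triples):
    """Keep deriving new 'is_a' facts by transitivity until nothing new appears."""
    known = set(triples)
    while True:
        derived = {(a, "is_a", d)
                   for (a, p, b) in known if p == "is_a"
                   for (c, q, d) in known if q == "is_a" and b == c}
        if derived <= known:
            return known
        known |= derived

if __name__ == "__main__":
    for fact in sorted(infer(FACTS) - FACTS):
        print("derived:", fact)
    # e.g. ('Rover', 'is_a', 'animal') was never written down by anyone
```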

Another aspect of Web 4.0, or another definition, concerns plugging us directly into the web: when the boundary between us and the virtual world disappears. This is when the virtual world becomes physically/biologically part of us, or when we become physically/biologically part of the virtual world. When, in other words, the data is not communicated by phones or earpieces or glasses but is implanted into us, so that the virtual data is part of our consciousness directly, and not just part of our visual or aural experience (think of the films Total Recall, eXistenZ and The Matrix); and/or when we control the real and virtual world by some kind of brain or neural interface, so that – in both cases – there really is a seamless integration of the real and the virtual, the personal/biological and the digital.

If this seems like science fiction, remember that it is already happening in smaller ways. See previous posts on Transhumanism, the MindSpeller project at Leuven (which can read the minds of stroke victims), and this MIT review of brain-computer interfaces. In this version of Web 4.0 the verb is not I exist (within a seamless real/virtual world) but rather I AM this world and this world is me.

Watch this fascinating video of someone’s brainwaves controlling a robotic arm:

And this one, which shows someone controlling first a signal on a screen and then another robotic arm:

So this is someone making things happen in the real world just by thinking! (Which, come to think of it, is actually the miracle that takes place whenever we do anything consciously!)
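For the curious, here is a toy sketch of the general shape of such a pipeline. It is my own invention, not how the systems in those videos actually work, and every number and name in it is made up: sample a signal, classify it crudely, and turn the classification into a command for a robotic arm.

```python
import random

# Pretend 'brainwave' samples; a real BCI would read electrodes and do heavy filtering.
def read_signal(n=64):
    return [random.gauss(0.0, 1.0) for _ in range(n)]

def classify(samples, threshold=0.2):
    """Crude classifier: the sign of the mean amplitude picks a direction."""
    mean = sum(samples) / len(samples)
    if mean > threshold:
        return "RAISE_ARM"
    if mean < -threshold:
        return "LOWER_ARM"
    return "HOLD"

def send_to_arm(command):
    # Stand-in for whatever protocol a real robotic arm would speak.
    print("arm command:", command)

if __name__ == "__main__":
    for _ in range(5):
        send_to_arm(classify(read_signal()))
```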

Any comments? Are you already living in Web 3.0 or 3.5? Do you like the idea of your children growing up in Web 4.0? What will Web 5.0 be?


Jenny McCartney “celebrates” the life of Eugene J Polley, the inventor of the TV remote control, who has recently died. Without him, there would be no such thing as channel-hopping. And who knows, if we hadn’t made the leap from watching to hopping, perhaps we wouldn’t have been psychologically or culturally ready for the next leap from hopping channels to surfing the web.

Polley was an engineer at Zenith, where he worked for 47 years. I put “celebrates” in inverted commas, because McCartney thinks he leaves a dubious legacy.

I am old enough to remember what viewing life was like before the remote control hit the UK, in the days when there were only three channels and you had to make the active decision to haul yourself up from the sofa and press a button to alter them. It was better. If someone wanted to change the channel, etiquette usually demanded that they consult the other people in the room, only moving towards the television once agreement was reached. As a result, you stuck with programmes for longer: since it took a modicum of effort to abandon them, and people are naturally lazy, even slow-burning shows were granted the necessary time to draw you in.

With the arrival of the remote control, the power passed to whoever held the magic gadget in his or her hot little hands. Automatically, the holder of the remote was created king of the living room, and everyone else became either a helpless captive, or an angry dissenter. As the number of channels steadily grew, so did the remote-holder’s temptation to flick between the channels with the compulsively restless air of one seeking an elusive televisual fulfilment that could never be found.

Channel-surfing is a guilty pleasure that should only be practised alone. There is nothing worse than sitting in the same room while someone else relentlessly channel-surfs. It makes you feel as if you are going mad. You hear – in rapid succession – a snatch of song, a scrap of dialogue, a woman trying to sell you a cut-price emerald ring, half a news headline, and an advertising jingle. The moment that something sounds like it might interest you, it disappears. Worse, when you yourself are squeezing the remote, you find that you have now developed the tiny attention span of a hyperactive gnat. Is it any surprise that, now that alternative amusements to the television have emerged, family members are challenging the remote-holder’s solitary rule and decamping to the four corners of the family home with their iPads and laptops?

I know that lamenting the invention of the remote control will – in the eyes of some – put me in the same risibly fuddy-duddy camp as those who once preferred the horse and cart to the motor car, yearned for the days when “we made our own fun”, and said that this email nonsense would never catch on. I don’t care. Listen to me, those of you who cannot imagine life without the zapper: it really was better before.

I think the phrase ‘surfing the web’ is misleading and actually disguises the fragmentary nature of the typical internet experience. If you go surfing (I went once!), you wait patiently and let a lot of inadequate waves pass underneath your board; but as soon as you spot the right wave, ‘your’ wave, you paddle with all your might to meet it properly, leap onto the board, and then ride that wave for as long as you can.

When you find a wave, in other words, you stay with it. You are so absorbed in it, so busy trying not to fall off, that it’s inconceivable you would be looking out of the corner of your eye for a better one. That’s the joy of surfing – the waiting, the finding, and then the 100% commitment to the wave that comes.

That’s why the phrase ‘surfing the web’ doesn’t work for me. The joy of the web, and the danger, is that you can hop off the page at any time, as soon as you see anything else vaguely interesting or distracting. You are half-surfing a particular page, but without any physical or emotional commitment. You can move away to something better or more interesting – that’s the miracle of the web, what it can throw up unexpectedly. But it means that one part of you is always looking over the horizon, into the other field, where to go next; as if non-commitment to the present moment, a kind of existential disengagement, is a psychological precondition of using the internet.

As you know, I am not against the internet. I just wonder what long-term effects it has on us and on our culture. On the internet, everything is provisional. So if we see everything else through the lens of our internet experience, then it all becomes provisional – including, perhaps, even our relationships.

Maybe that’s the word to ponder: ‘provisionality’.


[Image: ‘Internet Forever - Back Cover’ by hotdiggitydogs]

This coming week, the internet turns 40. On 29th October 1969 Leonard Kleinrock and some colleagues crowded round a computer terminal somewhere in California and logged into another one several hundred miles away. It was the particular type of remote connection that proved significant. It was only partially successful. The system crashed two letters into the first word – which was meant to be ‘LOGIN’; and so the first utterance sent across the net was the biblical ‘Lo…’

To choose a moment like this is somewhat arbitrary. There are many other technological shifts of huge significance that could be noted. But this is the one Oliver Burkeman opts for in his fascinating article about the history and implications of the internet. Arpanet, as this first system was called, was funded by US government money that had been released by Eisenhower in the panic after Sputnik. So it was, indirectly, a result of the space race.

Burkeman takes us through the first academic net, early email, the world wide web, search, the generativity of Web 2.0, and then speculates about where it will be in 4 years. He doesn’t dare to go further than that time frame, because change (not just growth) has been exponential, and you would be a fool to imagine you could see much further. It’s fun to reminisce, but it provokes deeper thoughts about how radically the world has changed, together with our ideas about knowledge, community, the self etc…

[Image: ‘WiFi + 17mpbs Internet’ by Don Solo]

One nice quotation is from a science fiction story by Murray Leinster (‘A Logic Named Joe’), written in 1946. Everyone has a tabletop box called a ‘logic’ that links them to the rest of the world. Look at how prescient it is:

You got a logic in your house. It looks like a vision receiver used to, only it’s got keys instead of dials and you punch the keys for what you wanna get . . . you punch ‘Sally Hancock’s Phone’ an’ the screen blinks an’ sputters an’ you’re hooked up with the logic in her house an’ if somebody answers you got a vision-phone connection. But besides that, if you punch for the weather forecast [or] who was mistress of the White House durin’ Garfield’s administration . . . that comes on the screen too. The relays in the tank do it. The tank is a big buildin’ full of all the facts in creation . . . hooked in with all the other tanks all over the country . . . The only thing it won’t do is tell you exactly what your wife meant when she said, ‘Oh, you think so, do you?’ in that peculiar kinda voice.

Another article, by Tom Meltzer and Sarah Phillips, gives a nostalgia trip through various internet firsts: first browser, smiley, search engine, item sold on eBay, YouTube video, etc. My favourite entry is the well-known first webcam, which was trained on a coffee machine in Cambridge University’s computer lab so that people at the end of the corridor could get live updates on whether it was worth making the journey away from the desk or not.

