
Posts Tagged ‘TV’

Every few months we hear about the impending death of television, how everyone has shifted to the internet, to social media, to Web 2.0, to Web 3.0… Yes, there are some shifts, but here in the UK we are watching far, far more TV than just a few years ago.

[Image: tv head, by ElAlispruz]

You can read this recent report from TV Licensing.

Here is the key statistic:

We watch an average of 4 hours 2 minutes of TV a day, up from an average of 3 hours 36 minutes a day in 2006.

Four hours a day! This is an average day in the UK in 2013. Seems like a lot to me.

Here are some of the technological shifts:

  • We have fewer TVs: The average household now has 1.83 TV sets, down from an average of 2.3 sets in 2003.
  • But we’re watching more television on more devices: We watch an average of 4 hours 2 minutes of TV a day, up from an average of 3 hours 36 minutes a day in 2006. A TV Licence covers you to watch on any TV, mobile device or tablet in your home or on the move. In 2012, fewer than one per cent of us watched only time-shifted TV.
  • Premium TV features are on the rise: More than a third of the TV market value in 2012 was from sales of 3D TVs, and sales of jumbo screens (43 inch or more) increased 10 per cent in the past 12 months.
  • Social networks allow us to engage with each other in real-time like never before: between 6.30pm and 10pm, 40 per cent of all tweets are about television shows.

So despite there being more devices and platforms, we are still gathering round the ‘hearth’ of a premium TV at the centre of the home. And instead of being completely absorbed in the entertainment experience, we are tweeting about what we are watching in real-time, which is probably no more than an extension of the chatter that would take place round the TV in previous generations.


TV time should be limited for children, and under-threes should be kept away from television altogether, writes Sarah Boseley.

These are the conclusions of a recent report.

A review of the evidence in the Archives of Disease in Childhood says children’s obsession with TV, computers and screen games is causing developmental damage as well as long-term physical harm. Doctors at the Royal College of Paediatrics and Child Health, which co-owns the journal with the British Medical Journal group, say they are concerned. Guidelines in the US, Canada and Australia already urge limits on children’s screen time, but there are none yet in Britain.

The review was written by psychologist Dr Aric Sigman, author of a book on the subject, following a speech he gave to the RCPCH’s annual conference. On average, he says, a British teenager spends six hours a day looking at screens at home – not including any time at school. In North America, it is nearer eight hours. But, says Sigman, negative effects on health kick in after about two hours of sitting still, with increased long-term risks of obesity and heart problems.

The critical time for brain growth is the first three years of life, he says. That is when babies and small children need to interact with their parents, eye to eye, and not with a screen.

Prof Mitch Blair, officer for health promotion at the college, said: “Whether it’s mobile phones, games consoles, TVs or laptops, advances in technology mean children are exposed to screens for longer amounts of time than ever before. We are becoming increasingly concerned, as are paediatricians in several other countries, as to how this affects the rapidly developing brain in children and young people.”

The US Department of Health and Human Services now specifically cites the reduction of screen time as a health priority, aiming “to increase the proportion of children aged 0 to 2 years who view no television or videos on an average weekday” and to increase the proportion of older children up to 18 who have no more than two hours’ screen time a day.

The American Academy of Pediatrics has also issued guidance, saying “media – both foreground and background – have potentially negative effects and no known positive effects for children younger than 2 years”. The Canadian Paediatric Society says no child should be allowed to have a television, computer or video game equipment in his or her bedroom.

Sigman goes further, suggesting no screen time for the under-threes, rising gradually to a maximum of two hours for the over-16s. Parents should “encourage” no screens in the bedroom, he says, and be aware that their own viewing habits will influence their children.

But what can you do?

The RCPCH’s Professor Blair said there were some simple steps parents could take, “such as limiting toddler exposure as much as possible, keeping TVs and computers out of children’s bedrooms, restricting prolonged periods of screen time (we would recommend less than two hours a day) and choosing programmes that have an educational element.”

But Justine Roberts, co-founder of Mumsnet, said it was hard for parents to compete with technology. “It would be great if someone could invent a lock that could automatically ensure a daily shut down of all the different devices in and around the home after a designated period. Until such a thing is invented, it’s going to be an ongoing battle to keep on top of everything,” she said.

Any thoughts from parents? Is the no-TV ideal possible? Is it realistic? Is it even desirable?


Jenny McCartney “celebrates” the life of Eugene J Polley, the inventor of the TV remote control, who has recently died. Without him, there would be no such thing as channel-hopping. And who knows, if we hadn’t made the leap from watching to hopping, perhaps we wouldn’t have been psychologically or culturally ready for the next leap from hopping channels to surfing the web.

Polley was an engineer at Zenith, where he worked for 47 years. I put “celebrates” in inverted commas, because McCartney thinks he leaves a dubious legacy.

I am old enough to remember what viewing life was like before the remote control hit the UK, in the days when there were only three channels and you had to make the active decision to haul yourself up from the sofa and press a button to alter them. It was better. If someone wanted to change the channel, etiquette usually demanded that they consult the other people in the room, only moving towards the television once agreement was reached. As a result, you stuck with programmes for longer: since it took a modicum of effort to abandon them, and people are naturally lazy, even slow-burning shows were granted the necessary time to draw you in.

With the arrival of the remote control, the power passed to whoever held the magic gadget in his or her hot little hands. Automatically, the holder of the remote was created king of the living room, and everyone else became either a helpless captive, or an angry dissenter. As the number of channels steadily grew, so did the remote-holder’s temptation to flick between the channels with the compulsively restless air of one seeking an elusive televisual fulfilment that could never be found.

Channel-surfing is a guilty pleasure that should only be practised alone. There is nothing worse than sitting in the same room while someone else relentlessly channel-surfs. It makes you feel as if you are going mad. You hear – in rapid succession – a snatch of song, a scrap of dialogue, a woman trying to sell you a cut-price emerald ring, half a news headline, and an advertising jingle. The moment that something sounds like it might interest you, it disappears. Worse, when you yourself are squeezing the remote, you find that you have now developed the tiny attention span of a hyperactive gnat. Is it any surprise that, now that alternative amusements to the television have emerged, family members are challenging the remote-holder’s solitary rule and decamping to the four corners of the family home with their iPads and laptops?

I know that lamenting the invention of the remote control will – in the eyes of some – put me in the same risibly fuddy-duddy camp as those who once preferred the horse and cart to the motor car, yearned for the days when “we made our own fun”, and said that this email nonsense would never catch on. I don’t care. Listen to me, those of you who cannot imagine life without the zapper: it really was better before.

I think the phrase ‘surfing the web’ is misleading and actually disguises the fragmentary nature of the typical internet experience. If you go surfing (I went once!) you wait patiently and let a lot of inadequate waves pass underneath your board, but as soon as you spot the right wave, ‘your’ wave, you paddle with all your might to meet it properly, leap onto the board, and then ride that wave for as long as you can.

When you find a wave, in other words, you stay with it. You are so with it and trying not to fall off it that it’s inconceivable that you would be looking out of the corner of your eye for a better one. That’s the joy of surfing – the waiting, the finding, and then the 100% commitment to the wave that comes.

That’s why the phrase ‘surfing the web’ doesn’t work for me. The joy of the web, and the danger, is that you can hop off the page at any time, as soon as you see anything else vaguely interesting or distracting. You are half-surfing a particular page, but without any physical or emotional commitment. You can move away to something better or more interesting – that’s the miracle of the web, what it can throw up unexpectedly. But it means that one part of you is always looking over the horizon, into the other field, where to go next; as if non-commitment to the present moment, a kind of existential disengagement, is a psychological precondition of using the internet.

As you know, I am not against the internet. I just wonder what long-term effects it has on us and on our culture. On the internet, everything is provisional. So if we see everything else through the lens of our internet experience, then it all becomes provisional – including, perhaps, even our relationships.

Maybe that’s the word to ponder: ‘provisionality’.


No-one doubts, after Egypt, that you can organise a revolution on Facebook. The question for those of us not presently caught up in this kind of political activism is: can you truly socialise there? 

Aaron Sorkin, creator of The West Wing and scriptwriter of The Social Network, was asked in a recent interview what he thought of the way Facebook is changing the nature of our relationships.

I’ve copied the full answer below, but let me highlight the thought-provoking analogy he makes, which is reason for a post in itself:

Socialising on the internet is to socialising what reality TV is to reality.

Here’s the context:

Q: How do you feel about the way Facebook is changing how people relate?

A: I have a 10-year-old daughter who has never really known a world without Facebook, but we’re going to have to wait a generation or two to find out the results of this experiment. I’m very pessimistic. There’s an insincerity to it. Socialising on the internet is to socialising what reality TV is to reality. We’re kind of acting for an audience: we’re creating a pretend version of ourselves. We’re counting the number of friends that we have instead of cultivating the depth of a relationship. I don’t find it appealing. [Playlist, 12-18 Feb, p12]

But aren’t we always acting for an audience? (If you want some thoughts on this go and read Tom Stoppard’s play Rosencrantz and Guildenstern Are Dead.) And what if the distinctions between reality TV and ‘non-reality’ TV (whatever that was/is) and non-TV reality were lost a long time ago?


Be honest. Keep a tally of how many minutes of TV you watch each day. Add it up. What’s the weekly total? And the more interesting question: Has this figure gone up or down over the last few years?

[Image: Chicken watching TV or TV watching chicken?]

Everyone thought that the internet and social media would kill television, just as they thought that cinemas would become extinct with the arrival of the video recorder. But it hasn’t happened.

British viewers watched an average of three hours and 45 minutes of television a day in 2009, 3% more than in 2004, according to research published by the media regulator Ofcom. Here are some thoughts from John Plunkett:

TV continues to take centre stage in people’s evenings, boosted by the popularity of shows such as The X Factor, Britain’s Got Talent and Doctor Who.

Television’s popularity has also been boosted by digital video recorders (DVRs), now in 37% of households – and the introduction of high definition television, now in more than 5 million UK homes.

“Television still has a central role in our lives. We are watching more TV than at any time in the last five years,” said James Thickett, director of market research and market intelligence at Ofcom.

New technology offered viewers an enhanced, easy-to-use viewing experience, with 15% of all viewing time spent watching programmes recorded on to a DVR, he said.

“Unlike VHS, which was such a hassle to set up and record a programme that only a very small proportion of viewing was on video, DVRs give viewers the chance to watch the programmes they really want to watch. It is bringing people back into the living room.”

The UK’s ageing population has also pushed up the figures. Older people are likely to watch more television, with the average 65-year-old watching five hours and 14 minutes a day. And it’s to do with the increasing number of channels too:

Digital television passed the 90% threshold for the first time last year, with 92.1% of homes having digital TV by the first quarter of 2010. The average weekly reach of multichannel television exceeded that of the five main TV channels – BBC1, BBC2, ITV1, Channel 4 and Channel 5 – also for the first time in 2009.

“More people are getting access to a greater number of channels and that’s translating into a greater number of viewing hours per person,” said Richard Broughton, a senior analyst at the audiovisual research company Screen Digest.

“Various people have predicted that the internet would kill off television but we have always said that TV would be here for a long time to come. It’s much harder for broadcasters and production companies to monetise content online, and there are all sorts of things that broadcast can do that online can’t, such as high definition.”

Broughton said viewers were using Facebook and Twitter while watching the television, rather than switching it off altogether. “In many cases television is complemented [by social media platforms] and not necessarily a direct competitor,” he added.

I was about to write that the beauty of cinema is that you are forced to give your attention to one image, and that you have to leave all your other digital distractions behind. But then I remembered a recent visit to the cinema when the guy in front of me was texting even after the film had begun. It breaks your heart…


I still hardly use Facebook. If I remember, I copy these posts onto my homepage. And if someone sends me a message, I try to reply. But I am ‘the wrong side of 40’, and most of my middle-aged friends still prefer email to social networking.

[Image: Nokia e61 smartphone, by Ziębol]

I used to console myself with the idea that Facebook was the past, and that something new would soon step over the digital horizon. It seems I was wrong: Facebook is actually the face of the future.

It’s not just that Facebook’s growth is still exponential (I don’t just mean large, I mean exponential: its rate of growth is itself increasing; the number of active users doubled from 200 million last summer to this month’s 400 million). It’s that our personal identity, and our commercial identity, are becoming defined not by what we consume (shopping), or watch (TV), or search for (Google), but by what we connect with in real-time.

This is why mobile Facebook and all the new smartphone applications will shape the evolution of culture and human consciousness over the next decade. Blogging, by the way, is almost prehistoric by now.

David Rowan explains the background:

What we are witnessing is the ultimate battle for control of the internet. Google, employing the world’s smartest software engineers, has dominated the desktop-internet era for a decade through its unbeatable algorithm-based computing power. But now we’re into the mobile-internet era, [which] Facebook intends to dominate by knowing what we are thinking, doing and intending to spend — wherever we happen to be. As Facebook’s founder Mark Zuckerberg sees it, this “social graph”, built around our friends, family and colleagues, will determine how hundreds of millions of us decide on everything from holidays to cosmetic surgeons. And with Facebook the proprietary gatekeeper — its mobile-phone applications already attracting extraordinary engagement from members — that’s a potential advertiser proposition that Google can only dream of.

It’s not that Mr Zuckerberg is still only 25 and naively arrogant that annoys Google, nor that his company has enticed swaths of senior Google talent. It’s that Facebook’s fast-growing dominance of the “social” internet threatens its rival’s entire business model. If it can sell advertisers access not just to what you’re thinking, but to where you are, who you’re with and what you plan to do, Facebook’s revenues from individually targeted “behavioural” advertising could increase exponentially. And it knows it.

“Google is not representative of the future of technology in any way,” a Facebook veteran boasted to Wired recently. “Facebook is an advanced communications network enabling myriad communication forms. It almost doesn’t make sense to compare them.”

Mr Zuckerberg’s human-powered view of the internet also taps into our yearning, as social creatures, to climb Abraham Maslow’s hierarchy of needs to attain self-actualisation: of the 400 million active Facebook users (up from 200 million last summer), half log on in any given day; they share five billion pieces of content a week and upload more than three billion photos each month. On average, they spend more than 55 minutes a day on Facebook. Those who access it via their mobile devices are “twice as active”. Now do you see why the search gurus in Google’s Mountain View headquarters are so anxious?


[Image: General Electric Color TV, 1960s, by Roadsidepictures]

The debate continues about whether allowing young children to watch TV harms their cognitive development. It flared up a couple of weeks ago when a report commissioned by the Australian government recommended that children under 2 should be banned from watching TV and electronic media such as computer games. So this is about freedom, censorship, the relationship between the personal and the political, the nanny state, etc., as much as it is about child development.

An article by Patrick Barkham looks at some of the scientific and political issues involved. At the centre of everything is the extraordinary way in which the human brain develops. (I prefer the word ‘mind’ to ‘brain’, because ‘mind’ allows us to appreciate that our cognitive relationship with the world is dependent on much more than the neurological condition of the brain. But the brain certainly plays its part.) Barkham reports the findings of Dr Michael Rich, director of the influential Center on Media and Child Health at Boston Children’s Hospital:

Humans have the most sophisticated brain on the planet because it is relatively unformed when we are born. Our brains triple in volume in the first 24 months. We build our brains ourselves, by responding to the environment around us. The biggest part of this is a process called pruning, says Rich, whereby we learn what is significant – our mother’s voice, for instance – and what is not. “TV [is] killing off neurons and the synaptic connections that are made in order to discriminate signals from ‘noise’,” he says.

Experts in child development have found that three things optimise brain development: face-to-face interaction with parents or carers; learning to interact with or manipulate the physical world; and creative problem-solving play. Electronic screens do not provide any of this. At the most basic level, then, time spent watching TV has a displacement effect and stops children spending time on other, more valuable brain-building activities.

Scientists concede that they do not yet know precisely how TV affects cognitive development, not just in terms of understanding the inner workings of the brain but because the way we use television and other electronic screens is changing so rapidly that we do not know how it will affect people by the time their brains stop developing in their mid-20s. But the weight of evidence about the deleterious impact of TV on a child’s ability to learn is alarming…

A more recent article by Helen Rumbelow steps back and looks at the way theories of child development have changed over the last couple of generations. The 1990s was the decade in which we discovered the importance of the first 24 months, and the idea that the right stimulation could boost your child’s chances. This led to playing Mozart to the child in the womb, flashcards as soon as they popped out, and Baby Einstein videos when they could sit up. Now the tide has turned.

[Image: Watching too much TV is bad for your eyes, by | spoon |]

I’m not taking sides here – I don’t know enough, and I don’t have children, and I’ve seen plenty of happy and healthy children grow up with a bit of TV. But for all those anxious parents tortured with guilt and uncertainty, Rumbelow provides some consolation with a quote from Dr Martin Ward-Platt. The evidence, he says, is still too equivocal:

 The farther you get away from deprived populations, the less TV gets watched, and the more parental controls there are, so it is hard to disentangle this stuff.

Of course, the thing that really makes the difference for a baby is interaction with a caregiver and there is nothing we can invent as a people substitute. But if a child watches some TV and is exposed to people for the rest of the time, they will do fine. What we don’t know is where the limit is, where you start to hold children back.

If there is no strong evidence either way, we think it’s much better to say we don’t know, and what’s right for you is probably the best thing for your family.

