How Twitter ruined your life

Twitter replaces an awful lot.  It replaces live sport, breaking news, your actual friends.  It’s great for connecting you to people and events, and almost everything in which I have an interest (museums, galleries, sports teams, newspapers, travel destinations) has an associated Twitter feed, in many cases a better starting point for information than the website itself.  I always try to explain Twitter to people as an information filter: it’s about the information that you gather in, not the information that emanates from you.  Twitter is for your eyes, not your mouth.  My own use of Twitter has changed over the 5+ years I’ve been a user (phrase deliberate).  I now use it in a far more professional context, which may explain why I’ve become more dull over time.

A recent study claimed that over 70% of Twitter users check their feed within 3 minutes of waking up.  Leaving aside this most obvious way that Twitter can ruin your life (addiction), there are several more subtle negative aspects to Twitter.  Guard against them.

1.  Only following people whose opinions you agree with.

Being open to ideas and opinions is important, but following only people who agree with you is likely to cement your position even before a discussion has started.  I’ve had the misfortune to work with one or two people whose confidence in their own rightness was astounding.  If at any point you disagreed with them, you were either an idiot, or someone who had simply not thought enough about the argument: think it through again, and I’m sure you’ll agree with me.  If you are going to argue, it’s important to be open to persuasion.  It’s the discussion that should be important, not the ‘winning’.  It’s also hard to ‘win’ an argument in 140 characters, especially against someone with a long-ish Twitter handle.  For every person who agrees with you fundamentally, try to follow one who doesn’t (unless the first person you followed was the Anne Frank House, for example).  You’ll find your feed has far more balance and you might even come to respect the opinions of those who disagree with you.

2.  The over-thought Bio.

Changing your profile picture on a regular basis is just about acceptable, but changing your Bio is odd.  You are not David Bowie and you don’t need to continually re-invent yourself.  I’m not even sure what the point of a Bio is, and if you’re trying to crowbar some comedy into what you write, stop it.  Stop it now.  There are some things that shouldn’t need to be written: if you have kids, we take it as read that you think they are ‘wonderful’.  If you work in IT, you do not need to state that you have 2.0 kids (that joke became obsolete around the same time as the ZX Spectrum).  Stating that you are ‘partial to the odd glass of wine’ does not make you sound like a lot of fun, just someone without any genuine interests.  The Bio is meant to let people see at a glance whether they wish to follow you, but reading the top 5/10 Tweets from someone’s timeline is a far more reliable way of telling what you’re getting.  It didn’t take me long to find two examples of bafflingly pointless Bios:

‘Editor and professional procrastinator.  Massively confused by the whole thing’


‘Curmudgeon.  Neither in School, nor of school, but by school.  Brace yourself – there may be a kerfuffle’


No, I’ve no idea either.


3.  Your dinner.

No-one cared what you ate for dinner before you were on Twitter, and nothing has changed.  Did you ever take a Polaroid photo of your evening meal and pass it round the office the following day? (note: this is rhetorical, I hope).  By all means post photos of your culinary creations, but to avoid a false sense of self-importance, you must first assume that no-one is going to view them.

4.  Being proud to be blocked.

Blocking people is fairly unusual.  The only accounts I ever block are spam sex-bots with alluring names like @ej35xxx80.  Famous people with lots of followers seem to have endless reserves of patience and will generally threaten blocking before actually doing so; you’ve got to be pretty offensive to have people hit the block button on you.  Being blocked shouldn’t be something to be proud of, but I’ve seen lots of Bios where people are delighted to state that they’ve been blocked by someone they disagree with, which strikes me as wrong.

5.  Protecting your account.

Twitter is public.  That’s pretty much the whole point of Twitter.  If you want to protect yourself from everyone but your nearest and dearest, that’s what Facebook and your real friends are for.  People with 7 followers and a protected account might just be missing the point.  I’d understand if what you’re writing is top secret (maybe you’re working out who really killed Kennedy), but then Twitter is probably not your ideal medium.

And now I’m off to make some truffled eggs.  Photo on Instagram in 5.

Familiarity and contempt

I’ve used the Stephen Fry expression to describe friendship before. The nation’s favourite Wildean uncle claimed that he ‘likes to taste his friends, not eat them’. Aside from the obvious innuendo, it’s a sentiment with which I agree. Some of my favourite people are those that I don’t see for a couple of years, and when we do meet up, it’s as if we’ve never been apart. I’ve just spent a week in the States with a friend I hadn’t seen for 3 years (we keep up only through Twitter) and it led to some of the most enjoyable, entertaining and effortless conversation you could imagine. Some people like to surround themselves with a small group of close friends, and these people act as a kind of social comfort blanket. Friendship lines are drawn, everyone knows which topics are there to be debated and which are off-limits, opinions are generally well-known, and conversation can be dominated by everyday chit-chat.

I’m certainly not saying that the better I know people, the less I like them, or even the less interesting I find them; I do think, however, that the friendship of people I rarely converse with, and meet up with even less often, can be just as valuable. It’s like music and books. Some books you are happy to read and re-read, and there’s some music that you never tire of listening to. There are other books that you loved first time around, but have no desire to read again, at least not in the immediate future. Some music is like this too; I love it, and then I love re-discovering it, but only at a much later date.

As I’m on holiday at the moment, I’ve had the opportunity to do quite a lot of reading. I’ve been reading a couple of authors that I thought I liked a lot: Malcolm Gladwell and Jay McInerney. The more I’ve read of them, the less I like them. Maybe that’s a little strong, but the less interest I have in them; their freshness is notable by its absence. In McInerney’s case, I’ve read him pretty much chronologically, starting with the fantastic ‘Bright Lights, Big City’. His later novels (less so the short stories) resemble less good versions of his earlier work. The themes are similar, the humour more forced, the material less fresh. People say that you write about what you know, but he seems to have written about all that he knows in the first couple of books, and has spent much time re-hashing old material since. Gladwell is an odder case, because I read Outliers (2008), then What the Dog Saw (2009), and then his breakthrough book The Tipping Point (2000). Gladwell certainly has a brilliant easy-reading style, and it has been said of him that he ‘makes you feel as though you are the genius’. It’s a very leading style though, and many of the conclusions that he comes to, which appear watertight at first, do not stand up to any kind of rigorous scrutiny. His standard technique is to take a one-off event, re-tell it as an incredibly entertaining story, and then to draw far-reaching conclusions from this single event that usually challenge general thinking on the subject. Thought- and discussion-provoking, certainly, but hard evidence? Almost certainly not. The more I read, the more I feel that I’m being nudged, albeit very gently, into believing in the genius of Gladwell, and I find that irritating, and just a little bit subversive.

This isn’t the case with all authors. If one reads Orwell chronologically, things culminate with 1984, and all of his other writing and experiences feel like a build-up to it. It helped that he died young, and knew that he was dying, and maybe that’s the key: to die before one’s output starts to tail off. Morrison, Dean and Fitzgerald have nothing duff in their back catalogues; they simply didn’t have time. Conversely, the longer that Jagger or McCartney hang on, the more hapless the material they produce becomes. The same is true of Dave Grohl, who sounds more like an un-edgy, bad Nirvana with each album. I used to think that Dali was a genius, until I realised that I’d seen all the good stuff in the first 10% of his output, and that the rest of his career was a re-hash of former ideas.

Perhaps there’s a limit to creativity, and it’s best to stop when genuine creation becomes harder to come by. Bowie and Picasso managed to stay creative through continual re-invention. They are the genuine outliers: people with whom one can be fully familiar, and feel nothing but admiration for their genius.

Creative Juices

Richard Feynman was one of the greatest minds of the 20th century. He won the Nobel Prize in Physics in 1965; he was involved in the atomic bomb project at Los Alamos while only in his early 20s; he helped to compile the report which unearthed the reason for the Space Shuttle Challenger disaster. That’s all pretty impressive. Perhaps even more special than all this is the fact that he was a brilliantly clever man who was able to inspire every person that he talked to, from fellow Nobel Prize-winning scientists to interested laymen. The Horizon documentary in which he is featured is the best programme ever made about science, and I challenge anyone not to find themselves drawn in and fascinated by his view of the world. And yet there are things he claimed to find difficult to explain: he recounts that at one time he was trying to explain to his father the emission of a photon from an atom as it moves from a higher state to the ground state. His father asked whether the photon was in the atom ahead of time, and he said that it was not: it is the move between the two states that allows the photon to be emitted. He likened it to when his son told him that he could no longer say the word ‘cat’ because his ‘word bag was empty’. We do not have a ‘word bag’, i.e. a finite number of words we can use, nor is the number of times we can say any particular word limited. The words are not in our bodies ahead of time; we form them, just as the photon is not in the atom ahead of time.
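As an aside for the scientifically curious: the photon isn’t stored in the atom, but its energy is fixed entirely by the two states involved. If E₂ is the energy of the higher state and E₁ that of the ground state, then the frequency ν of the emitted photon follows from the Planck relation:

E₂ − E₁ = hν

where h is Planck’s constant. This is why each element emits light only at its own characteristic frequencies: the ‘colours’ are set by the gaps between its energy levels, not by anything carried around inside the atom.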

If anyone is still reading this, I think this is an example of why Feynman was such a brilliant teacher – his use of analogy was so good that he could explain even difficult concepts to anyone willing to listen.

All this serves to introduce what I was really thinking about, and that is the limit to one’s ideas and creativity. Is there a limit, just as we might have a limit to the number of times we can say the word ‘cat’? I’ve been a teacher for 12 years (just starting my 13th) and it’s a good job that I have moved around from school to school and between roles within them. Each of my moves has coincided with the time at which I felt my creativity in that particular role was on the wane. After 5 years as a Head of Department, I felt my creative output was on a downward slope: I’d had a lot of ideas, but I’d rather exhausted them over that period. And it seems I’m not alone. Many hugely creative artists (note that I’m not comparing myself to these people) seem to run out of steam after a certain amount of time: Paul McCartney once changed the face of British music, and now he churns out instantly forgettable pop pap. You can include Mick Jagger here too. There’s the notorious ‘3rd album’ problem faced by singers and bands, where later songs just sound like less good versions of what’s gone before (hello Oasis). Salvador Dali was a real artistic original (though Bosch was doing much the same thing about 450 years earlier), yet when you look at Dali’s work, the same themes and ideas come up time and time again. Françoise Sagan wrote Bonjour Tristesse at the age of 19, and precious little of note afterwards, and there are many authors in the ‘one masterpiece’ club (Harper Lee, Margaret Mitchell).

Some ideas clearly run their course, and there’s no need to keep flogging a dead horse, whilst others are cut off in their prime and leave you desperate for more (12 episodes of Fawlty Towers, and 100 of Birds of a Feather, hardly seems fair). To keep being creative takes a very special individual, or one who is able to reinvent themselves. I’m not sure that many would compare Leonardo da Vinci and David Bowie, but these are the two examples that came to mind first, and I do like to write these blogs in a stream-of-consciousness-esque manner. Da Vinci is probably the greatest polymath of all time, and he managed to remain creative all his life; Bowie is one of those artists who seems willing to produce total tosh at times (Tin Machine) in order to maintain his creative streak – and this provides us with genius such as ‘Heathen’ and ‘Hunky Dory’. Only one idea is needed to make us rich, but it’s those people who retain the ability to be creative right through to the end that I find most impressive.

Here’s some classic Feynman (may need watching twice!):

http://www.bbc.co.uk/archive/feynman/10705.shtml

A Word of Advice for David Mitchell

Fame’s a fickle thing. Many people manage to stay famous for their entire working lives; some by re-inventing themselves (Bowie), others merely because we can’t really forget about them, no matter how hard we try (Princess Diana – and yes, I know she’s dead, though I also suspect that most Mail readers think about her many times daily).

Fame comes late for some people; what did Richard Wilson or Thora Hird do before they were 60? Others find that fame comes to them early, and then leaves them just as quickly; note the cautionary tale of Macaulay Culkin, or Corey Haim (or was it Feldman?). There seems to be a real problem with over-exposure, and never was this more true than in the 1980s. The 80s spawned the Hollywood Brat Pack, who churned out film after film in the latter part of the decade; then the decade ended, and the curtain came down on the careers of Ringwald, McCarthy, Nelson and the twin Coreys. Incidentally, lest you think that this happened only in America, and only to glamorous people, the very same fate befell the ‘never-sure-why-he-was-popular’ Tony Slattery. His brylcreemed side-parting and lascivious grin were rarely far from our screens, and then… nothing: he’d been whisked away as we ushered in a new decade.

Of course, much of this instant fame followed by a similarly instant fall from grace is more about our inability to stick with anything and our low boredom threshold than it is about any lack of talent on the part of the performer. We also don’t like to see people at the top for too long (Kevin Costner), and we get bored of the same old face beaming out at us. Some folk do have an uncrushable longevity about them (Forsythe – unfathomable; Monkhouse – a legend), but most people come and go as we build them up just to sweep them back under the carpet.

And this is what I see imminently about to happen to David Mitchell. Maybe I’ve just been unlucky, but he does seem to be everywhere. What started out as a comedy actor playing a lead role in a funny, original sit-com has now become: flogging said sit-com long after it went over the hill, writing an Observer column, appearing on almost any panel show going, and hosting a raft of 10pm-ish, moderately watchable, nothingish comedy gameshows that seem perfect for the ‘it’s not time to go to bed but I have nothing else to do’ slot. He was undoubtedly funny in Peep Show (series 1-5), but that was largely because he was playing himself, and we identified with him; his vulnerability and insecurities were there for us all to see, and they were funny whilst at the same time making us feel better about ourselves. Now, though, he’s gained confidence, and he’s starting to take the piss out of other people. Surely this shouldn’t be allowed; and yet we’re giving him just the platform from which to do it, with his column, new-found presenting skills and occasional one-liners on Mock the Week.

Can it last? History is against DM, and my advice is not to over-expose. Get back to playing yourself in sit-coms written by other people, and we promise to laugh, mostly with you. Otherwise, you’ll end up like Slattery. I wiki’d Tony S just now, to see what he’s been up to in the last 5 years. Here’s the sum total:

In January 2010, he appeared with Phyllida Law on Ready Steady Cook.

The future’s not bright.