Too much cinema-going leads the brain to make strange connections. You wouldn’t think that Calvary, a small-budget Irish film about a rural priest facing a death threat, would have much in common with The Amazing Spider-Man 2, one of the biggest-budget blockbusters of the year. And yet here we are. They’re the two most recent films I’ve seen, and one thing leaped out at me from both of them: the problem they have in establishing a tone.
(Spoilers for both movies below, though as few of them as I can get away with.)
Up until now, crowdfunding schemes have had one main pitfall: that even though you prepaid for something, you might not get it. Now, with Facebook’s purchase of Oculus VR, a pitfall has emerged on the other end of the success spectrum: that the thing you bought might evolve into something that maybe you wouldn’t have prepaid for in the first place.
The Oculus Rift headset was one of the biggest early success stories on Kickstarter. Virtual Reality is one of those never-was technology dreams, but Oculus’s promise was enough for backers to go for it in droves. It wasn’t just promise either: there was plenty of intelligence behind the Rift headset, and it seemed to keep improving as the months went by, with new versions of the development kit and some highly impressive game demos.
And then yesterday Facebook went and bought Oculus VR for $2 billion. This has not gone down particularly well in the technology press, either because the deal is a betrayal of Oculus’s indie roots, or simply because it makes no sense. Facebook, a company with a major games presence, albeit one that’s hardly on the cutting edge, seems to be buying into Oculus because it sees VR as a new field opening up, and with the recent announcement of Sony’s Project Morpheus, it might be right.
Still, the argument that the purchase doesn’t make much sense is a strong one. Unless Mark Zuckerberg has bought into the notion of The Matrix and sees it as the logical end point of Facebook’s parallel world of social connections, it’s not easy to guess where he’s heading with this. VR headsets may be providing increasingly realistic experiences, but they’re still bulky and obvious—only suited for home use, when you’re alone with a net connection. Vain hope it may be, but I don’t really want things to go that way.
Where VR headsets might be heading can be seen in the convergence of technologies. VR headsets replace reality with something new, which is perfect for games but isolates the user from the world around them. Augmented Reality headsets like Google Glass take the world the user is already in and layer extra information over it. Right now they’re limited in their application, but as they become more sophisticated, the tweaks they make to reality will become increasingly indistinguishable from the real thing. At some point, AR and VR are going to merge, and the choice of just how much of the real world to occlude is going to be left to the user.
With high-definition displays, motion tracking and fast response times, VR headsets are approaching the point where they can deliver a genuinely immersive experience. AR headsets are already extremely lightweight, and you don’t have to look too far in the future to see them being implanted in contact lenses. So maybe this is where Facebook is looking with its purchase of Oculus VR—not the immediate future of immersive gaming but rather the long-term play of a future in which your social world is always with you.
This could yet turn out well for everyone: Facebook certainly (?) isn’t stupid enough to kill off Oculus’s promise as something new in the world of gaming. The goodwill that the company gained over the course of its Kickstarter campaign and subsequently is gone already, but some of it could be clawed back if the hardware and its software ecosystem meet early hopes. Longer term, and more scarily, we might yet be facing a future where Facebook is always in the corner of your eye. That may not be a “Like” button that many are willing to click.
I’m an inveterate fan of the underdog, but sometimes the underdog gets squished. As an example, take Robert Hooke—something of a scientific underdog, despite being an inventor and polymath described as “England’s Leonardo”. It was Hooke’s misfortune that he picked a fight with one of the smartest men in history: master mathematician Isaac Newton, Mister Gravity himself.
Not that picking a fight was something that Hooke was shy about in his later years. In addition to his multifarious talents, he gained a reputation for being cantankerous, vindictive and petty. Once again, Hooke’s problem was that the man he picked a fight with was a spectacular example of cantankerousness, vindictiveness and pettiness.
The third episode of Neil deGrasse Tyson’s Cosmos: A Spacetime Odyssey retells the story of Hooke and Newton as part of Tyson’s celebration of Edmond Halley of Halley’s Comet fame, a contemporary of both men and a notable polymath in his own right. However, as the story is told from Halley and Newton’s point of view, Hooke is shown as Newton saw him: a hunchbacked, dwarfish figure, with lank, greasy hair and a face always in shadow (the lack of a contemporary portrait of Hooke is often blamed on either neglect or deliberate destruction on Newton’s part).
It’s a fascinating story*, to be sure, replete with accusations of plagiarism, a vendetta lasting beyond the grave and some of the most important scientific discoveries of this or any era. Nor does Tyson shy away from Newton’s own strangeness: not only was he far more of a recluse than Hooke, but he also focused much of his time and intellectual energy on alchemy and the search for hidden messages from God in the Bible. It’s hard not to feel that Hooke is a bit hard done by in Tyson’s portrayal—his many achievements are mentioned, albeit more briefly than accusations of plagiarism and credit claimed for other scientists’ work that could just as easily be levelled at Newton.
The feud with Newton was to sink Hooke’s place in scientific history for centuries. Although the two men had very different areas of expertise—Newton was the master mathematician and theoretician, whereas Hooke was an experimenter and thinker in almost every field available—they ended up quarrelling wherever their interests intersected. Famously, Newton’s “standing on the shoulders of giants” comment is often thought not to refer to his illustrious predecessors but to be a pointed jibe at Hooke, who was shorter even than Newton.
When Newton became president of the Royal Society shortly after Hooke’s death, he did much to conceal his predecessor’s achievements. In more recent years, scholars have rescued Hooke’s reputation somewhat, but only those with an interest in the history of science or Restoration-era England are likely to know much about him. Newton, by contrast, is generally reckoned one of the finest minds in history and gets his face plastered across banknotes.
It’s a pity that Cosmos doesn’t even the scales a little more, because otherwise it’s a great show, striking a fine balance between entertainment and education. Tyson conveys the march of our understanding of the universe around us in unapologetically positive tones, and if he doesn’t always match the quasi-mystical sense of wonder of Carl Sagan (to whose Cosmos: A Personal Voyage series Tyson’s namesake show is a sequel/remake), he may yet be delivering something that could inspire a new generation of scientists.
*Told in much more detail, and to my mind more even-handedly, in Neal Stephenson’s massive-yet-fascinating Baroque Cycle of novels.
Terry Gilliam scarred my childhood. Not through a too-young exposure to his surreal and occasionally lewd animations on Monty Python’s Flying Circus (my sense of humour is mostly the result of my dad exposing me to recordings of The Goon Show, again, too young) but rather through his film Time Bandits, one of the greatest and darkest children’s movies ever made. I’m quite sure that the ending, which I’m not going to spoil, resulted in a few disrupted nights for my parents.
Now he’s back with The Zero Theorem, a new movie following the familiar Gilliam theme of a discontented everyman trying to survive in an at-times sadistically unfriendly dystopia. It’s equal parts Kafka and comic book, and the result is a visual feast that barely conceals the symbolism Gilliam shovels into the mix.
(Spoilers for two recent movies below the cut…)
Bungie Software, creators of Halo and the upcoming Destiny, have toyed with the notion of artificial intelligence throughout their games. In particular, they’ve developed the idea of “rampancy”, whereby an AI becomes increasingly self-aware, intelligent and uncontrolled. It wasn’t a concept I was expecting to come across in Spike Jonze’s Her, though it turns out to be central.
(Spoilers all the way below…)
There are two main approaches to Valentine’s Day. The mass-market one, adopted with varying degrees of enthusiasm or resignation, is that of getting involved, making an effort and celebrating that one special person. The sophisticated approach, adopted by cynics and singletons, is that it’s nothing more than an exercise in raising sales of flowers and chocolates, and is best avoided by anyone with a genuinely romantic bone in their body.
I don’t wholly subscribe to either viewpoint.
It’s not much fun being single on Valentine’s Day, when the world is reminding you, in red and pastel pink, how wonderful relationships are. However, it’s not always a lot of fun being in a relationship either, facing a dose of societally mandated pressure to “celebrate” your significant other by splashing cash on cards, flowers and chocolates.
Then again, human beings are crap when it comes to relationships, just as they’re crap at everything else. We’re forgetful, we fall into bad habits and we take the most important things for granted simply because they’re always there. Getting a kick up the arse, even from an unwelcome direction, isn’t a bad thing if it reminds us that, hey, this is something worth a little celebration.
Only once a year though? (Add in a birthday, an anniversary and Christmas and you have four times a year, which still seems a little lacking.) Rampant commercialism doesn’t seem like the best way to set a mood either. It’s very hard not to be cynical when you see stores clearing away Christmas goods just to replace them with an array of Valentine’s Day products. (My local Tesco has a permanent “seasonal products” aisle, so you always know where to go to be reminded of the next thing you’re supposed to spend money on.)
Cynicism, then, may be the healthiest response to Valentine’s Day. However, that ought to be cynicism towards the marketing rather than the message buried deep underneath. Responding to a prompt to do something nice, to give the person that means the most to you a little extra thought, doesn’t mean you’re giving in to capitalism. After all, the form the resulting action takes ought to reflect both you and the relationship you’re in. However weird it may be.
So, opt out of Valentine’s Day and its avalanche of cards, flowers and chocolates by all means. Or opt in, and personalise it. Either way, it ought not to be just one day a year, and any reminder to be a better person ought to be appreciated.
Last week was a week of anniversaries. World-changing anniversaries, in fact, though I’m going to have to make an argument for at least one of them.
The anniversary that got the most column inches was the 30th anniversary of the introduction of the Apple Macintosh in 1984. The launch is best known for that Ridley Scott advertisement invoking Orwell’s Big Brother, but recently a video came to light revealing a launch event in front of the Boston Computer Society.
It’s a fascinating watch, mostly for the fact that it contains so much of the future. The Mac is front and centre, and it’s amazing just how much of what we take for granted in our computers today appeared right at the start of the Mac age. It’s not just the MacOS that still bears the ancestral marks of its progenitor. Every modern desktop/laptop OS can trace its ancestry back to 1984. Amazingly, it’s a trick Apple has pulled off more than once: its iOS is similarly the root from which the modern smartphone/tablet ecosystem arose.
It’s also instructive to watch Steve Jobs at work, long before his keynote speeches grabbed attention around the world. The delivery isn’t as smooth as it later became, but so much of those keynotes is already in place: the idea of the intersection of art and technology, the attention-grabbing video segments, the on-stage demonstrations to wow the audience. Jobs would soon be ousted from Apple, only to return years later and lead it to the top of the industry, and his keynotes would be much more controlled, so getting to see him do a question-and-answer with the original Mac team is a rare treat.
The other anniversary is for an event ten years earlier and one less easy to nail down. Gary Gygax and Dave Arneson’s Dungeons & Dragons also changed the world, albeit in a less direct way than the Mac. It birthed, of course, the roleplaying game (RPG), a combination of board game and improvisational storytelling. RPGs have never been a big industry, but their influence has spread far and wide.
D&D drove an interest in fantasy, and followup RPGs drove interest in science fiction and horror, even as they followed trends in wider culture: Star Wars, Anne Rice, Ghostbusters, etc. RPG players got involved in the growing computer games industry and the entertainment industry, leading to a lot of what is now mainstream. Game of Thrones’ George RR Martin? A roleplayer. Joss Whedon? Roleplayer.
The Mac is still going strong, despite some dodgy moments along the way. D&D has lost its leading position everywhere except cultural memory, but the hobby it kicked off has endured and spread like a weed, its roots and tendrils going everywhere. The Mac changed how we interact with the world. D&D created a new spawning ground for content, and an avenue for storytelling and offbeat genres that wasn’t there before. Happy birthday to them both.
Ah, the joys of following the Northern Ireland news. Every so often, you get served up the kind of insanity that only the combination of parochial religious zealotry and a genuine 17th-century mindset can provide.
This week, it seems that the DUP councillors in Newtownabbey, evidently nostalgic for the days of the Life of Brian controversy, elected to force a shutdown of the Reduced Shakespeare Company’s The Bible: The Complete Word of God (Abridged), just a week before performance time. Because, hey, there’s nothing more important going on in Northern Ireland than a slapstick play that might put a few religious noses out of joint.
Let’s just clarify here: this is the Reduced Shakespeare Company that has been in existence for three decades and has been a fixture on the London theatre scene for much of that time. This is a show that has been around for nearly 20 years, winning awards and being performed around the world in numerous languages. And this is the DUP councillors standing up en masse and doing their best to bully the local arts board into shutting it down without a vote.
It would be funny if it weren’t so predictable. The combination of political power with the certainty of religious faith tends to result in the shutdown of dissenting points of view. Underdog sects and religions can favour freedom of conscience, but history shows that when the boot’s on the other foot, attitudes change. After all, when you’re in possession of the ultimate truth, isn’t it a public good if that’s the only truth that’s going to get promoted?
The trend towards secular government is one that took a long time to hit Ireland, and arguably it still hasn’t hit the North. Everywhere else, there has been pushback, in the form of Texas creationists altering school textbooks, Islamist efforts to marginalise secular Turkish youth, or a UKIP councillor linking gay marriage to recent floods. In Northern Ireland, the linkage between religion and the sectarian divide and the fact that parties from either extreme hold the whip hand means that it’s not so much pushback as an effort to hold onto power (something the Unionists have been doing for decades).
There’s no indication that anyone involved in cancelling the play had actually seen it, or had any interest in seeing it. Whether their chief interest was in “defending Christian values”, grandstanding for a few more votes or simply throwing their weight around, they both overstepped the mark in terms of their electoral mandate and completely undershot in terms of doing something of benefit for the people of Newtownabbey.
It’s one of the best—and best known—corporate mottos of the technology age. Google’s three-word mantra positioned it as something different from the corporate behemoths it disrupted as a scrappy startup. It was also an implicit promise, that as a company it was on the side of the customers it served.
Except that was years ago, and Google is now one of the biggest of those corporate behemoths. Many of its products may still be loved (just witness the Android/iOS flamewars) but the company itself? The reaction to its recent acquisition of Nest shows that beyond a sense of confusion as to what Google was paying all that much money for, Google itself just isn’t trusted all that much.
Nest is an odd little company in and of itself: founded by Tony Fadell, one of the creators of Apple’s iPod, it looks to bring the same design ethos to neglected home products and add some Internet-era connectivity and device awareness. Nest is one of the leading lights of the “Internet of Things”, so it’s not a surprise that Google has swooped in, but the reaction from technology pundits has been broadly negative.
Why? Well, we tend not to trust those with too great a power over us, and Google’s huge collection of information and ability to leverage it gives it immense power. Nest’s connected devices suggest a desire to extend its awareness of our activities even further into our homes and our every waking moment.
Paranoia? Maybe, but in this case they are watching us. Wasn’t Nest doing the same thing already? It was, but there’s a difference between communicating with devices that form part of our lives, and having those devices communicating with a vast database that already contains details on our lives.
A company like Nest, as with Apple, makes its money by selling products to consumers. A company like Google makes its money by selling advertising and licensing software, both to other companies. As long as there’s a viable alternative, it’s vital for Google to retain the goodwill of consumers, but the fact that its money comes from elsewhere means that it’s always going to be a servant of two masters, and the temptation is to rely on PR to deal with consumer attitudes.
This is not to say that Google, much less the people who work for it, is evil, but the company relegated its famous motto to the backburner a while ago. The pressures of capitalism, as practiced in the modern world, don’t leave much room for moral scruples. The law and how to follow or sidestep it is the main limiting factor on the goal of maximising profits.
The fact is, the Nest acquisition is a good deal, expensive though it may be for Google. Nest gets funding and the chance to supercharge its entry into homes around the world. Google gets not only the hardware design expertise of Fadell and his team of ex-Apple employees (something it badly needs) but also that extra angle of attack for visualising our lives in data.
As for whether Google can regain its standing in the eyes of consumers, the question is: does it want to? It’s become one of the world’s largest corporations and it’s hugely profitable. There’s an inevitable sense of sadness at the loss of innocence from those idealistic startup days, but maybe that’s always the price to be paid for success.