Through the looking glass

The image of Seattle being refracted through myopic glasses (Photo credit: Wikipedia)

Some inventions are so mundane that we barely give them a second thought. They just do their job well and everyone takes them for granted. There is one such invention that I rely on every day of my conscious life. Like the majority of the population, I wear spectacles. Without them – I’m lost. I can fumble my way around, especially if I’m familiar with the terrain, but ask me to read anything and I’m sunk.

Nowadays of course, the lenses are fashioned from lightweight plastic, but it wasn’t always the case. When I started wearing glasses at the tender age of 2, they really were made of glass. In those days, optical technology was nowhere near so advanced, so the lenses were thick and heavy. No matter, I have always been grateful for the invention of glass.

Without glass, homes would have no windows and be very draughty and cold. There would be no TVs if there were no screens. Nor would there be any tablets, mobile phones or laptops. There would be limits on how fast cars without windscreens could comfortably travel. Aeroplanes would not be able to fly so high or so fast and there would be no such thing as a skyscraper.

Thanks to volcanic activity, glass occurs naturally but not in a particularly workable form. Stone age man managed to use bits of glass as cutting tools, but that was about as far as it went. Man-made glassmaking was in full swing in the late Bronze Age, but the glass was commonly only used for beads or drinking vessels. It wasn’t until medieval times that man started to make glass window panes.

The building of the Crystal Palace by Joseph Paxton for the Great Exhibition of 1851 in London marked the first real use of glass as a fundamental construction material. During the industrial revolution, glass manufacture became increasingly mechanised and refined and the material became ubiquitous.

Glass fibres have been spun since Roman times, but it was only in the late 18th century that the Chappe brothers from France invented an optical telegraph system. Others experimented with optical fibres, using them for everything from illuminating body cavities to central lighting for the home. Fast forward to today and fibre optic cable forms the very bedrock of the World Wide Web.

Scientists in Turkey have even invented a form of spray-on glass, although the invention has been taken to market by a German company. It is a form of silicon dioxide which can create a flexible and even breathable layer. The substance, when applied, is 500 times thinner than a human hair. It is environmentally friendly, food safe and is quickly finding applications in just about every field of human endeavour.

So the next time you look at a screen, a skyscraper or drive your car or take a flight – be thankful for the material that makes it all possible.

Shields up!

NCC-1701-B (Photo credit: Wikipedia)

In almost every Star Trek episode or film, you can count on a number of things. Someone will say “beam me up”, the crew will face some kind of moral dilemma and someone in a red shirt gets it. The other thing that happens regularly is that the Starship Enterprise will get involved in some kind of tussle. The Captain issues the command “Shields up!” and we will see the pretty little display panel showing the nice safe force fields around the ship.

Once that happens, you know that although the crew might get tossed around a bit, some minor pyrotechnics will go off under a control panel and one of the officers will give dire warnings that the shields are failing. But hey – there’s another half an hour to run, so nothing too bad is going to happen. The shields are a good thing. They keep all that horrible nastiness away from the ship and the Star Trek universe seems to have more than its fair share of horrible nastiness.

As a technologist, I have to deal with shields of a different kind. In my career, the “tussles” I get involved with are much more mundane, yet there are still many who feel the need to resort to a force field. The command words in this case are “I’m not technical”. Don’t even bother trying to explain any of that technology nonsense to me, because I’m not technical. They wear those words like a suit of armour.

The problem is that the very same people wearing those shields typically want to know why it’s going to take so long to get that new bit of functionality or to get their problem fixed. Whenever faced with this situation, unwise words gallop to the front of my brain. “Sorry – it takes us a long time to find the spell components to cast the spell to fix that particular problem” or “Sorry – but the fix-it fairies only work on Fridays”. Thankfully, I’ve always managed to catch them before they escape.

Once during a presentation, a guy in the front row put up his hand and asked me to clarify something I had just illustrated. After a short exchange to understand where the guy was coming from, I tried rephrasing my point in simpler language. He still didn’t get it, so I tried to make it simpler and gave an analogy. Still no luck, so I kept on simplifying until I just ran out of levels and then conscious of wasting the rest of the audience’s time, I told him it was magic, an answer which he accepted with good grace.

OK, so the gap between that guy’s knowledge and the subject matter of the presentation was too great to be bridged in the limited time we had available, but at least the guy was game. He wanted to understand and for that, he gets a lot of points in my book. So if you are one of the people equipped with the “I’m not technical” shields, just hold off hitting the button – take a chance. You might be amazed how technical you really are.

Window replacement

Microsoft Windows 95 operating system cover shot (Photo credit: Wikipedia)

Microsoft has always been an adaptable beast, constantly reinventing itself to suit whatever technology landscape is the current order of the day. Sometimes they are slow to adapt, such as when Bill Gates initially dismissed the internet, but they are quick to catch up.

This week, 17 years after the fanfare of Windows 95, comes the launch of Windows 8. Back in 1995, Take That and Blur were fighting for the number 1 spot in the charts. Sweden, Austria and Finland had just joined the European Union and Netscape had just gone public.

The computing landscape was very different back then. Pretty much every desktop in the world ran Windows, so Microsoft found a ready supply of customers eager to upgrade from the limitations of Windows 3.x to the ultra modern Windows 95 with its plug and play, 32 bit support and long filenames. Even so, Windows 95 didn’t come with a web browser. You had to install the “Plus!” pack in order to get the fledgling Internet Explorer. Thus began the browser wars that led to the downfall of Netscape.

The mood of the launch was very different for Windows 95. Microsoft was very much a company in the ascendancy. They dominated the desktop with Windows and Office and there was absolutely no doubt that the new version of Windows would be a success. They chose “Start Me Up” by the Rolling Stones as a theme tune for the launch campaign as a reference to the brand new start button that nestled in the bottom left of the screen. Wisely, they recorded a new rendition where they removed the words “you make a grown man cry”.

Windows 95 was a runaway success with 1 million copies selling in the first 4 days, 40 million in the first 12 months. Microsoft will be hoping for similar commercial success with the new version of Windows. But the competitive landscape is very different. Windows 8 is not just a desktop operating system, it is also aimed at the very crowded tablet market. It’s quite a battlefield with Android and iOS holding the high ground. Also – Windows 95 was a big step forward from Windows 3.x. Windows 8 comes after a very capable Windows 7 which had little to fault.

Windows 8 has been publicly denounced by Tim Cook, the Apple CEO, as an unholy union not unlike a toaster combined with a fridge. Apple have approached the market with separate operating systems for tablet and desktop and see any operating system that tries to cater to both platforms as a compromise too far.

With the cash cows of Windows and Office looking decidedly venerable, Microsoft need Windows 8 to be successful and the move to a completely new paradigm is brave (even though the old look and feel is still there if you need it). I think they deserve plaudits for that bravery and there is a good chance that just like the ribbon toolbar that came with Office 2007, people will get used to it and come to love it.

Either way – Windows 8 is a landmark event in computing history.

How free should free speech be?

SHOOTING OFF YOUR FACE WON’T HELP FREE SPEECH – NARA – 515409 (Photo credit: Wikipedia)

To say that law enforcement agencies are struggling with how to cope with the direction that technology has taken in the last decade is an understatement. Back when life was simple and there was no social networking and most people were still using landlines, policing was much simpler. If you wanted to be libellous, you needed to print something in a newspaper or a book. If you wanted to organise a heist or a riot or some terrorist activity, you probably had to do it in person.

Celebrities concerned about their reputations have been resorting to age-old legal machinery to protect them, machinery which in the modern day and age is just not up to the job. An injunction granted in a court of law in one country has little or no jurisdiction in another. So you may not be able to name the latest dubious liaison of a premiership footballer, but someone from another country can. It can then be retweeted by someone else and so it goes on until it’s trending and everyone knows.

I happen to think that free speech is one of our most important rights, but people can say and put into bits and bytes some pretty vile rhetoric. The authorities walk a knife-edge between overreacting and contravening the sacred right of free speech and allowing such vitriol to go unpunished. The Crown Prosecution Service has recently decided against prosecuting a man for a homophobic tweet he made about divers Tom Daley and Pete Waterfield, but the fact that they considered prosecution shows how seriously they are taking it.

Then there was the infamous case of Paul Chambers who faced a fine of just under £3000 and a criminal record for his joke tweet about blowing up Robin Hood airport in January 2010 following flight delays. I’m sure the sense of relief he felt when the court overturned the conviction on appeal was palpable but again – it shows the sense of gravity with which these actions are considered.

There is a need for consistency. A man who recently posted a message on Facebook suggesting that all soldiers should die and go to hell after 6 British soldiers died in Afghanistan got off scot-free. Another man who posted vile comments on his Facebook page about the missing girl, April Jones, was sentenced to 12 weeks in jail. Both comments were despicable, so why was one man sent to jail and the other allowed to go free?

The legislation being applied to social media was never designed for the purpose and it is time for a free and fair debate about what is and is not OK to say and print and for new purpose-built legislation to deal with today’s technology. I don’t envy the lawmakers their task though, what a minefield.

Conference season

Beck at Yahoo! Hack Day (Photo credit: Scott Beale)

Sometimes, it feels like I spend my life at conferences, either as an attendee, a speaker or maybe even just for some meetings with people who happen to be there. By and large, they are very well organised and it’s rare that I feel that the time I spend there has been wasted.

A few weeks ago, I had a free pass to a two-day conference down in London. The venue was local, the subject matter (cloud, SOA, mobile and REST) should have been of interest and the price was certainly agreeable, so why did I only attend day 1?

Firstly, the venue didn’t endear itself to me. Although South Kensington is technically in London, it is hardly well-connected. Secondly, the conference took place at Imperial College London, which is a sprawling university campus. Not only that, but the signage telling you where you needed to go was pretty poor. Even then, assuming you actually found the right room, there was a really good chance that the agenda had been changed without notice and you were in the wrong place.

Even if you found the right room, the breakout sessions were only about 45 minutes, so the speaker had just enough time to introduce himself and the subject before wrapping up. The sessions were either too superficial for the familiar or over the head of the novice.

All was not lost though. Over lunch, a colleague and I bumped into Richard Johnson, the American founder of hotjobs.com, in a local pub. His presence was completely coincidental and had nothing to do with the conference. He told us the story of how he remortgaged his house, his business, his dog and his wife to come up with the princely sum of $4m, which he blew on a 30 second advert during the Super Bowl.

His was the first dot-com to do so and Yahoo! acquired the company shortly afterwards. Eventually, Yahoo disposed of the operation and it formed the foundation for monster.com which is one of the biggest internet recruitment companies around.

So although the conference itself had lost its sheen, the trip was not totally wasted. I probably learned more during the half hour chat with Richard (who was a total gent) than I was ever going to learn in the odd 45 minute breakout session.

So in a roundabout way – that conference was worthwhile (even though it wasn’t).

Artificial intelligence?

IBM Watson (Jeopardy at Carnegie Mellon) – How I saved humanity! (Photo credit: Anirudh Koul)

People remember Alan Turing for many different reasons. He was a British mathematician who worked as a codebreaker at Bletchley Park during World War 2. He also went on to become one of the pioneers of computing along with Max Newman. In 1952, Alan Turing was convicted of gross indecency on account of his homosexuality. He accepted treatment with female hormones (chemical castration) rather than go to prison and 2 years later committed suicide. In short, he was a genius who became a victim of his time. Were he born today, no-one would bat an eyelid at his homosexuality.

One of his legacies is the Turing test: a machine could be said to be intelligent if it was indistinguishable from a human in conversation. He suggested that it would be better to come up with a learning machine (like a child’s mind) that could be taught rather than something that simulates an adult mind. There is some debate as to whether a machine has ever really passed this test. I can think of a few humans who would struggle too.

Whenever I’ve seen any proffered example of artificial intelligence, I could not help but be disappointed. Today, however, I attended a very interesting session on IBM’s Watson semantic supercomputer. As supercomputers go, its $3m cost and 4TB of storage are pretty modest. Built on standard hardware and software, it resembles Turing’s child-like mind that can be taught. Indeed, before it can answer any sensible questions in a particular domain, it needs to be fed with information. Lots of it. Even that’s not enough; Watson needs to do further research on everything it reads.

English: IBM’s Watson computer, Yorktown Heights, NY (Photo credit: Wikipedia)

Once loaded up with information, Watson is ready for questions. The first step in being able to answer any question is to parse it into the component elements. From this, Watson can determine the type of question being asked and what sort of answer is expected. It then generates many different permutations of the question to give the greatest chance of coming up with the right answer. For each interpretation of the question, Watson searches (in parallel) its knowledge base to see what answers it can find, coming up with as many as possible.

For each of the possible answers, Watson goes on the hunt for evidence, both for and against. Based on this evidence, obviously incorrect answers are discarded and the remainder scored on the quality and reliability of the evidence. Finally, Watson uses the experience it has gained in answering similar questions in the past to gauge the value of the different types of evidence, comes up with a final confidence rating for each answer and ranks them accordingly. The process is very similar to that used by Doctor Gregory House in the hit TV programme: you write up all the possible answers and cross them out as the evidence goes against them.
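
The discard-then-score-then-rank step can be sketched in a few lines of Python. This is only a toy illustration of the idea, not IBM’s actual pipeline; `evidence_score` is a hypothetical callable returning counts of supporting and refuting pieces of evidence for a candidate answer.

```python
def rank_answers(candidates, evidence_score):
    """Weigh evidence for each candidate answer and rank the survivors.

    evidence_score: hypothetical callable mapping an answer to a
    (support, refute) pair of evidence counts.
    Returns (answer, confidence) pairs, highest confidence first.
    """
    scored = []
    for answer in candidates:
        support, refute = evidence_score(answer)
        if refute > support:
            continue  # evidence goes against it: cross it off the whiteboard
        total = support + refute
        confidence = support / total if total else 0.0
        scored.append((answer, confidence))
    # highest confidence first
    return sorted(scored, key=lambda pair: pair[1], reverse=True)
```

Feed it three made-up candidates where the evidence is stacked against one of them, and that candidate is crossed out while the other two come back ranked by confidence.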

The first outing for Watson was to win the TV quiz game Jeopardy against two former champions. IBM admit that this was little more than a bit of fun and a publicity exercise. Watson is now being geared up for much more serious applications such as medical research and diagnosis support. The applications for such technology are legion and the team freely admit that there are far more valid use cases than they have the time to exploit right now.

For the first time in my life, I am genuinely inspired by an example of artificial intelligence and I will follow Watson’s progress with interest.

Updates needed

raw data snapshot (Photo credit: MelvinSchlubman)

Do you think CT scanners in hospitals pop up messages in the middle of a brain scan telling the operator that there is a software update available? When was the last time your car stopped and flashed up a message saying software update needed? What about your TV? Or your satellite receiver? Probably never.

What about your computer? It seems like I get a message several times a week telling me that something needs an update. If it’s not the operating system, it’s the office software or the virus checking software or maybe a plug-in for my browser. It drives me nuts!

I have a laptop at home and a machine at work. Between them, I probably update some piece of software every day. On one day last week, my laptop needed an update as did the office software running on it and the flash player in the browser oh, and Adobe Acrobat Reader. Not only that, but my iPhone joined in the party and decided that what I really needed was an inferior maps application, so along came iOS6.

It’s a mess. All the time you spend updating all these software components compromises your productivity. Not only that, but all this change is risky. Software vendors seem to have improved at testing their updates, but even so, you always feel like you are taking a gamble when applying all these updates. At the end of the process – will you end up with a working machine or a nice-looking brick?

And why does the software update process have to be so damned invasive? OK – so Acrobat may need an update – but do you think that I really want to know about it when I’ve just opened a document? Some update mechanisms allow you to specify options such as how often to check for updates and how to apply them – which is an improvement, but why do I have to do it separately for every application on the system?

I like the update system for apps on the iPhone and iPad. No painful pop up messages when you are trying to do something, just a little number hovering over the corner of the app store. If you are curious, you can go and see what the updates are and what they do. You can choose when to apply them – like when you plug your phone in for the night to charge. All in all, a very elegant system.

When can we have it in OS X and Windows? A single central source of software updates for the entire machine. No piece of software would be allowed to apply software updates any other way. Simple, elegant and non-invasive. Perfect.

The freedom of wifi

English: This is a 1987 Madge Networks Token Ring 4/16Mbps switchable Network Interface card. It can be slotted in any ISA compatible bus. (Photo credit: Wikipedia)

Look behind pretty much any office computer these days and the chances are that it is connected to the network by an ethernet cable. I can remember when the connectors on the end of ethernet cables were round and when there were other competing network types.

The ethernet protocol works rather like a drunken barroom conversation. Your computer waits for a gap and then shouts whatever it wants to send. If someone shouts at the same time, your machine will stop shouting and wait again for a gap. All in all, it’s a terribly impolite conversation.
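
The shouting-and-backing-off behaviour can be sketched in a few lines of Python. This is a loose illustration of the idea (listen for a gap, then back off for a random, doubling wait when the channel is busy), not the real CSMA/CD hardware logic; `channel_busy` is a hypothetical callback standing in for the carrier-sense circuitry.

```python
import random

def ethernet_send(channel_busy, max_attempts=16):
    """Listen for a gap, shout, and back off while someone else is shouting.

    channel_busy: hypothetical callback returning True while another
    station is transmitting (real hardware senses the carrier instead).
    Returns the attempt number on success, or None after giving up.
    """
    rng = random.Random()
    for attempt in range(max_attempts):
        if not channel_busy():
            return attempt  # found a gap: the frame goes out
        # Busy: pick a random number of slot-times to wait, doubling the
        # window each attempt (binary exponential backoff, capped at 2**10).
        slots = rng.randrange(2 ** min(attempt + 1, 10))
        # (a real adapter would sleep for `slots` slot-times here
        # before listening again)
    return None
```

On a quiet channel the frame goes out on the first attempt; on a hopelessly noisy one the sender eventually gives up, much like a real adapter abandoning a frame after 16 collisions.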

IBM used to have a network protocol called token ring. That was much more polite. There was a token that whizzed around the network. If a computer on the network wanted to say something, it had to wait for the token to come round, grab it and then say its piece. Every computer waited its turn and there was no shouting. The downfall of the token ring network was speed. Being polite is not as efficient as shouting. Token ring was also harder to set up because, unlike ethernet, a token ring network had to be “started up” to get a token into the network.
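
The polite version can be sketched just as briefly. Here a single token circulates and only its holder may transmit; the station names and frame lists are made up for illustration and bear no relation to the real IEEE 802.5 frame format.

```python
from collections import deque

def token_ring(stations, rounds=1):
    """Pass one token around the ring; only the holder may speak.

    stations: list of (name, frames_to_send) pairs in ring order.
    Returns (name, frame) pairs in the order they were transmitted.
    """
    ring = deque(stations)
    transmitted = []
    for _ in range(rounds * len(ring)):
        name, frames = ring[0]          # current token holder
        if frames:                      # the holder sends one frame...
            transmitted.append((name, frames.pop(0)))
        ring.rotate(-1)                 # ...then passes the token on
    return transmitted
```

Run over two laps of a three-station ring, every station gets its turn in order and nobody talks over anyone else, which is exactly the polite behaviour described above.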

Wi-Fi Signal logo (Photo credit: Wikipedia)

Now that we have 3G cellular networks and wi-fi, there is no need for a wire at all, which is just as well. It would be incredibly dull if you had to plug a cable into your mobile every time you wanted to do a Google search or look at where you are on a map.

Most hotels will charge you for wi-fi access and it seems like the posher the hotel, the higher the cost. I don’t know how a hotel receptionist can keep a straight face when they explain to you that not only does your $250 room rate not include wi-fi, but the charge is $25 per day for access. I have no objection to anyone making a living, but I do object to profiteering.

Of course some places offer free wi-fi, but it is rarely free in both senses of the word: cost and freedom. More and more, free wi-fi will require you to register. Often, they will need you to click on a link in an email to confirm your email address. Sometimes, they will even send you a completely random user ID and password – something you don’t have a hope of remembering. If you are on a mobile, it means you are going to need to hunt around for a pen and paper to make a note of it.

Not only that – but many of them will kick you out after a predetermined time, which means that pretty much every time you go to use the Internet – you have to go through the whole palaver of logging back in again. If you are going to offer free wi-fi – please make it completely free.

The rise and fall of the Apple empire

Roman Forum and surroundings (Photo credit: KayYen)

History is littered with stories of civilisations that have grown in stature until they are too big to sustain. Once the edges are so far from the heart, people forget what it was all about and the empire implodes more dramatically than it grew. We studied two of them in history at school; the Greeks and the Romans.

Today, in a way, it’s difficult to imagine Greece having that much power, which is ironic because they probably have more influence on the fate of Europe than any other country right now. The same goes for the Romans. As I was growing up, I was fed on a diet of World War 2 films and Commando comics. In these, Italians were the guys with rubbish equipment who spent half their time retreating and the other half surrendering.

We British have had our imperial moments, but we are very much in decline as far as empires go. One by one, the countries that were once coloured pink on my ancient, dented globe have decided that they want to be independent of British rule.

The same thing is happening more and more to big companies now. If you travelled back in time a few years, you would find Nokia and RIM unstoppable in the mobile phone market. Today, they seem to be in terminal decline. When I was growing up, the word Kodak was synonymous with cameras. It looks like they will wink out of existence once they milk the last bit of value out of their patent portfolio.

There are some eternal survivors out there. IBM have been in trouble before but bounced back. Microsoft had a near death experience when they dismissed the Internet as a fad before waking up and smelling the coffee. Apple have been on the ropes before in the years between Steve Jobs leaving and rejoining, but today they are going from strength to strength.

But Steve Jobs is no more.

I have just upgraded to iOS 6, and although, in general, I am a fan of all things Apple – I am not happy. As part of iOS 6, the once fantastic maps app powered by Google has been replaced by an app produced by Apple. It is inferior in just about every way you can imagine. Yes it has 3D views of the major cities in the world, which is a neat trick where it works, but it feels like a gimmick you play with for half an hour and forget about.

Steve Jobs would never have let it out the door. I can’t help but feel that this is the turning point in the fate of Apple. Of course they have huge resources upon which to draw as did the Romans.
