Cloudy with a chance of... fog?
Conrad Taylor attended the Knowledge Café hosted by Core on 16th April. The evening proved a great success, with lively debate and discussion. Below is a short summary from Conrad.
A personal account by Conrad Taylor of a Gurteen Knowledge Café hosted by Core
David Gurteen promotes the practice of ‘Knowledge Cafés’, a kind of discussion workshop which is structured to encourage creative conversations around a topic, with the aim of bringing the knowledge of the participants to the surface, sharing ideas and insights between them. In process, a Gurteen Knowledge Café is related to the World Café process originated by Juanita Brown and David Isaacs in 1995, but the Gurteen café meetings are run with shorter table-group sessions, and smaller attendance overall. Not only does this make a Gurteen Knowledge Café easier to organise and to host, but with typically forty or so people in the room it is also possible to close out the event with a discussion in the round.
For a number of years David Gurteen has run a series of occasional Knowledge Café events in London. The principle is that an organisation hosts the meeting, providing the venue and some refreshments, and the meeting is open to all comers and free to attend. Note that the Café methodology lends itself very well to internal organisational knowledge sharing, but David’s London Café series is left deliberately open and free, encouraging networking and inter-networking.
The most recent Gurteen Knowledge Café event was held on the evening of 16th April 2014 at the Rubens Hotel by Buckingham Palace, and like the previous café event was generously hosted by Core. Core is a Microsoft business partner company with special interests in secure mobile working for government and business, virtualised managed IT services and the like.
To seed the series of round-table discussions at a Gurteen Knowledge Café, the normal practice is for a presenter, who is generally from the hosting organisation, to speak quite briefly to the proposed topic, winding up with some open questions which the participants can then discuss. In this case, the meeting had been given the title ‘Cloudy with a chance of fog?’ (explanation follows shortly!) After Joyce Harmon of Core had welcomed us and David Gurteen outlined the process for the Café (generally about half the people who come have not attended one of these events before), Core’s senior technology strategist Andrew Driver gave the talk.
The proposition
If you send and receive email, share photos or documents from your computer, or do your banking or shopping online, you are using ‘Cloud’ computing. Hotmail, SkyDrive (now OneDrive), iCloud and Dropbox are all examples of cloud computing which we now take for granted. This is IT consumerisation: allowing an individual or a business to buy their IT the way they might buy any other subscription-based product. Now we have the ‘Internet of Things’, the idea of everyday objects like cars and toasters being connected to everything else. What next? What are the wider implications for the future? As well as the many benefits of a more connected world, should we be concerned about a future led by terms such as Machine Learning and Artificial Intelligence? Further, what is the gap between what we believe and reality?
Defining the Cloud
Now, I had been thinking about the assertions in the above text, and doing some reading around, and it seems that the term ‘cloud computing’ only really became current towards the end of the 2000s, when ‘software as a service’ (SaaS), remote storage and computation-on-demand services became available over the Internet. By this definition, I considered my early use of email and file transfer (via SMTP and FTP) not to be that ‘cloudy’, so early in his presentation I asked Andrew for clarification.
Andrew’s usage is a very wide one; as far as he is concerned, it is a newly-minted term, but it describes arrangements that have been around for a long time. He said, ‘Cloud computing is whenever you have a collection of computers performing a function [for you], but they are not directly your responsibility.’ By this token, the advent of the Internet itself was ‘cloudy’ because it pooled the resources of all the participating networks (owned by companies, universities etc), and the routers forwarding data between them; every one of these items may have been owned by someone, but nobody owned The Internet per se.
Wherever we draw that line, it is clear that people and organisations increasingly use online remote services, some free of charge and some paid for by subscription, to host email accounts and web pages, back up large amounts of data and so on. Helping companies to do this big-time is one of the reasons for Core to be in business.
In the round
Following the table groups session, we gathered our seats into a big circle and David asked us to share as we wished. This session lasted about 40 minutes.
The first person to speak said that in the conversations he had had, the issues seemed to be less technical than socio-political. For example, machine learning might make middle class and managerial professionals redundant, and this could result in serious social dislocations.
Andrew referred to a recent conversation with the person responsible for email at a client organisation that was moving its email out to the Microsoft Office 365 system; the man feared he might be left with nothing to do. No, said Andrew: at present you use the systems you have to facilitate communication in the business, and surely you will continue to have the same job, just using a different technical system.
Several people chipped in with worries about what machine learning and machine ‘intelligence’ might do for a tier of middle class support jobs: amongst paralegals, legal researchers and journalists for example. The top fee earners won’t be threatened, but the ranks who support them might indeed be replaced by expert automated systems.
One rather scary aspect of machine intervention is represented by the research trend towards ‘autonomous killer robots’: drones, missile batteries and battlefield weapons which are coming close to being granted the power to decide whether or not to kill. They may be constrained by their coding, but when there is a need to react quickly, quicker perhaps than human judgement would allow, how long will this remain the case? South Korea has automated gun emplacements along its border with the North (the Samsung SGR-A1 system), currently under human control but capable of being made autonomous.
One lady mentioned that South Korea may be the only country which has actually developed an ethical framework for robotic behaviour, possibly akin to what the science fiction author Isaac Asimov put forward in ‘I, Robot’ and other books. For South Korea it is significant not just because of the defence system mentioned above, but also because they hope to drive towards each Korean home having a robot by 2020.
Robert Harris, in his novel ‘The Fear Index’, suggests that we may set the morals and parameters of robotic systems, but it may still be the case that a system decides its behaviours for itself. His scenario is based on automated decisions in the investment banking industry. Now, one hopes that good decisions would be coded in; but it is often the case that we have lost control of the code, and no-one knows how it is working.
As a thought experiment, someone imagined a self-driving car. A small child runs out in front of the car and the car must act. To the left is a bus stop with eight people in the queue; to the right is a precipitous cliff. Which choice should the vehicle make, and would it make that choice?
One of us raised the issue of how different generations think about privacy behaviours and privacy laws.
The conversation took a turn towards the second question Andrew had launched at us, about the gap between perception or belief on the one hand, and reality on the other. Challenged to explain, Andrew expanded by saying that he was often in conversations with people who he might have expected to have a wider vision, but was coming to appreciate that many senior and experienced people have their mindset in a kind of rut, ill-prepared for what is about to bring radical change. For himself, he thinks it behoves us to show an interest in our future.
Someone recalled the perceptual experiment that asks people to count the number of times that a basketball is passed, and hardly anyone charged with this task notices that someone in a gorilla suit walks right through the shot. It’s what we might call ‘entrained thinking’: the captivating power of mental models. Mental models have their uses, but so, it seems, does naïvety. Assumptions undermine our ability to understand the world, especially in novel contexts and arrangements.
I asked if any of the table groups had addressed the question of ‘the Internet of Things’ and someone replied that yes, on her table they thought it had the potential to create some large security risks and loopholes.
David Gurteen said that as an iPhone user, he recently became aware that when his phone is plugged in to charge in the same room, ‘Siri’ (the natural-language control interface for the telephone) is listening to every second of time and his every word. Siri has imperfect ears, and might hear David and his wife use a phrase in dinner conversation and interject, ‘How can I help you?’ I raised the recent news stories about the Samsung voice-control TVs and the talking Barbie doll, both of which use an Internet link to a natural-language processing system ‘in the Cloud’, and which are therefore also continuously listening to whichever human is in the same room (though soon, perhaps, they will start to listen and react to each other).
Someone remarked that there is a kind of trade-off between gaining increased machine help and losing our privacy and control over our own information. A trade-off along those lines may be perfectly acceptable, were we able to decide about it ourselves. But do we really understand what are the terms of the trade-off? And who is in charge of those terms? Until Edward Snowden enlightened us, how much did we understand about how those trade-offs were handing vast amounts of information about us to the security organisations?
What, for example, are we to make of the harvesting and mass pooling of our medical records and genetic data? It has huge potential to advance medical science through Big Data analysis.
We had a bit of a debate about whether ‘radical transparency’ with respect to our data is asymmetric (they want to know everything about Us but don’t let us know much about Them), or whether the information flow is more symmetric than that.
In closing out, Andrew Driver suggested we check out a book by Peter Fingar called ‘Process Innovation in the Cloud’, which is related to an article called ‘Everything has changed utterly’. The book, he suggested, is not that exciting, but the article is worth a look.
At this point David Gurteen thanked our hosts; he got people’s unanimous agreement that it was OK to share emails amongst us, but we demurred at him sharing those with his toaster. And so we rose, and spent some more valuable informal time networking with the aid of wine and beer generously provided by our hosts.
For the full article please see Gurteen.com