Friday, June 16, 2017

Fractious Guardian debate: Tech in schools – money saver or waster

7 reasons why ‘teacher research’ is a really bad idea
The Guardian hosted an education debate last night. It was pretty fractious, with the panel split down the middle and the audience similarly divided. On one side lay the professional lobby, who saw teachers as the only drivers of tech in schools, doing their own research and being the decision makers. On the other side were those who wanted a more professional approach to procurement, based on objective research and cost-effectiveness analysis. What I heard was what I often hear at these events: that teachers should be the researchers, experimenters, adopting an entrepreneurial method, making judgements and determining procurement. I challenged this - robustly. Don’t teachers have enough on their plate, without taking on several of these other professional roles? Do they have the time, never mind the skills, to play all of these roles? (Thanks to Brother UK for pic.)
1. Anecdote is not research
To be reasonably objective in research you need to define your hypothesis, design the trial, select your sample, have a control, isolate variables and be good at gathering and interpreting the data. Do teachers have the time and skills to do this properly? Some may, but the vast majority do not. It normally requires a post-graduate degree (not in teaching) and some real research practice before you become even half good at this. I wouldn’t expect my GP to mess around with untested drugs and treatments on the back of anecdotal evidence from other GPs. I want objective research by qualified medical researchers.
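To see why this is hard, here is a minimal sketch of just one step, the sample-size arithmetic for a simple two-group trial (the effect size, significance level and power below are my illustrative assumptions, not figures from any real study):

```python
# A minimal sketch of the sample-size arithmetic behind a proper trial.
# Illustrative assumptions: two groups, a modest effect (Cohen's d = 0.3),
# the conventional 5% significance level and 80% power.
import math
from scipy.stats import norm

def n_per_group(effect_size, alpha=0.05, power=0.8):
    """Approximate pupils needed per arm to compare two group means."""
    z_alpha = norm.ppf(1 - alpha / 2)  # 1.96 for alpha = 0.05
    z_beta = norm.ppf(power)           # 0.84 for 80% power
    return math.ceil(2 * ((z_alpha + z_beta) / effect_size) ** 2)

print(n_per_group(0.3))  # about 175 pupils per arm, before any dropout
```

Even a modest effect needs around 350 properly randomised pupils, before dropout – hardly something one teacher can run in a spare period.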
En passant, let me give a famous example. Learning styles (VAK or VARK) were promulgated by Neil Fleming, a teacher, who based them on little more than armchair theorising. They are still believed by the majority of teachers, despite oodles of evidence to the contrary. This is what happens when bad teacher research spreads like a meme. It is believed because teachers rely on themselves and not objective evidence.
2. Not in job description
Being a ‘researcher’ is not in the job description. Teaching is hard, it needs energy, dedication and focus. By all means seek out the research and apply what is regarded as good practice, but the idea that good practice is what any individual deems it to be through their personal research is a conceit. A school is not a personal lab – it has a purpose.
3. Don’t experiment on other people’s children
There is also the ethical issue of experimenting on other people’s children. I, as a parent, resent the idea that teachers will experiment on my children. I assume they’re at school to learn, not to be the subjects of teachers’ ‘research’ projects in tech.
4. Category mistake
What qualifies a teacher to be a researcher? It’s like the word ‘leader’; when anyone can simply call themselves a leader, it renders the word meaningless. I have no problem with teachers seeking out good research, even making judgements about what they regard as useful and practical in their school, but that’s very different from calling yourself a ‘researcher’ and doing ‘research’ yourself. That’s a whole different ballpark. This is a classic category mistake, shifting the meaning of a word to suit an agenda.
5. Entrepreneurial
This word came up a lot: we need more start-up companies in schools. Now that’s my world. I’m an investor, I run an EdTech start-up, and, believe me, that’s the last thing you need. Most start-ups fail and you don’t want failed projects crashing around in your school. But “it teaches the kids how to be entrepreneurs”, said one of the panel. No it doesn’t. Start-ups have agendas. Sure, they’ll want to get into your school, but don’t believe that this is about ‘research’; it’s about ‘referral’. Wait, look, assess, analyse, then try and procure.
6. Teaching tech bias
Technology is an integral part of a school. But it is a mistake to focus solely on teaching tech. There are three types of technology in schools:
School tech – general stuff, website, admin, comms, internet access….
Teacher tech – teacher aids – whiteboards, assessment software…
Learner tech – autonomous learning software
The assumption is that the main issue is Teacher tech. I’d argue that the other two categories are more important. Far better to get your basic infrastructure sorted than some blue-sky augmented reality project in the classroom.
7. Professional procurement
Procurement is difficult. Too often education suffers from ‘device fetish’, buying devices not solutions. The failed tablet debacle is the perfect example. Professional procurement means starting with the question ‘to what problem is this a solution?’, then assessing the options, doing your homework on background evidence and research, a detailed cost-effectiveness analysis (this is tricky – see the sketch below) and a change management plan that includes training needs and solutions. This is a skilled job and few schools have professionals with these skills. Yet this is what Governors and senior managers should demand. Alternatively, procurement should be done at a higher level, for groups of schools, just as JISC has a defined product set which it recommends into Higher Education.
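As a minimal sketch of what the cost side alone involves (every figure below is a hypothetical placeholder, not a real quote):

```python
# A minimal sketch of total cost of ownership (TCO), the starting point for
# any cost-effectiveness analysis. All figures are hypothetical placeholders.
def total_cost_of_ownership(devices, unit_price, years,
                            insurance_per_device_per_year,
                            training, support_per_year):
    """Lifetime cost of a device rollout, not just the sticker price."""
    hardware = devices * unit_price
    insurance = devices * insurance_per_device_per_year * years
    support = support_per_year * years
    return hardware + insurance + training + support

tco = total_cost_of_ownership(devices=300, unit_price=250, years=3,
                              insurance_per_device_per_year=30,
                              training=10_000, support_per_year=8_000)
print(f"TCO: £{tco:,} = £{tco / (300 * 3):,.0f} per device per year")
```

The sticker price (£75,000 here) is little more than half the real cost – which is exactly why ‘device fetish’ procurement goes wrong.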
Back to the debate
The title of the debate was Tech in schools – money saver or waster? The answer, of course, is ‘both’. Technology is always ahead of sociology or culture, which is always ahead of pedagogy. This means that culture always trumps strategy. A school, almost by definition, is a difficult environment for technological change. Its funding, procurement, management structure, job roles, classroom structure and teaching culture can (not always) work against the use of technology. Teachers deliver learning largely in classrooms, a one-to-many teaching environment largely unsuitable for technological disruption, so it is not surprising that technology is a difficult fit in schools – a round, ‘individualised’ peg being squeezed into the ‘one-to-many’ square hole that is the classroom.
Tech has always been in schools
Tech has always been in schools. From the dawn of civilisation, the earliest caches of clay tablets and pottery shards show people learning how to write and draw. Writing, pens, pencils, erasers, clay tablets, paper, books, bells, canes, leather straps, slates and blackboards; each of these has had a profound effect on what is taught and how it is taught. Writing is a skill that had to be taught in schools; with pencils and erasers mistakes could be erased and corrected; slates were sophisticated assessment devices (Lancaster method); paper/papyrus/bark gave us the ability to publish and store our thoughts; books gave us fixed texts that could be taught; printing brought scale to resources; bells regulated the timetable; hideous instruments of punishment regulated behaviour; and the blackboard made teachers turn their backs on learners and broadcast fixed knowledge. We forget that all of these have pedagogic affordances. Technology has always influenced teaching and learning. So the idea that it should not be used is ridiculous. But that doesn’t mean that all technology should be used.
Device fetish
Unfortunately, with the rise of more autonomous technology, the new causes friction when it rubs up against the old. The first computers were calculators. These changed the pedagogy of maths, in that the technology itself had agency and could do more than act as an aid – it could actually calculate faster and more accurately than a human. There was a great deal of angst when they were introduced, as it was thought that they would turn our children into unthinking idiots, unable to do mental arithmetic. What actually happened was a recognition that the tech was a feature of the real world and had to be accommodated.
Subsequent computer devices have, of course, been subject to the same charge, but this was not the main problem. With computers, tablets and mobiles, ‘device fetish’ took over. The device was everything, so education institutions bought them by the skip-load and parachuted them into schools. Procurement was too often about the device and not the delivery of teaching and learning. Just as counting bums on seats is to focus on the wrong end of the learner, so devices focus on the wrong end of the problem. A device is a peripheral that hangs off a network, and as the internet and streaming have become the norm, devices have become less important. It was always the case that doing things was more important than the delivery device, yet far too little attention, analysis and procurement effort went into the software, as opposed to the hardware.
Device fetish: Keep on taking the tablets
The tablet Taliban, led by Apple, insisted on kids being given what is essentially a consumer device. Tablets have poor affordances – it is difficult to write at length, code, create graphics and so on. There is even evidence that they slow progress in writing, as touch-screens make you write shorter sentences with higher error rates. Poor procurement, higher than expected insurance costs, difficulty in networking, poor internet access and a paucity of teacher training and software meant that many did not last. Some were disastrous, especially in the US, and many schools swapped out tablets for laptops.
What is far more useful is a strategic look at technology across the school – school tech, teaching tech and learner tech. Too often the emphasis is on teaching tech, hence the huge spend on whiteboards, tablets and so on. Far less attention is paid to administration and learning tech, which is where, I believe, the efficacy really lies.
School tech
Your website is important as it represents the school to the outside world. Do you have email and comms so that parents and others can contact the school? Do you have a social media presence? Administratively, student support, finance, timetabling, absences and a host of other functions need software to function. Try writing policy documents without a word processor or doing the budget without a spreadsheet. I’d include here the use of tech in teacher, admin and governor training. Modern Governor is used in many schools as a 24/7 training tool for Governors. It’s one of the best spends on tech in school, as it brings all Governors up to speed on their roles and responsibilities. Teacher training should also be considered as should training for other staff. There’s even good online training for catering staff.
Teaching tech
Behind the scenes, lesson planning and sharing can use technology. Teacher training can be revolutionised by technology – from Twitter as CPD to VR as a feedback mechanism for inexperienced teachers. The blackboard and whiteboard are largely teaching technologies. But for me, tech is often best applied outside of the classroom, in the hands of learners not teachers. There is a natural bias towards teaching tech, such as whiteboards, but they have been shown to be of limited efficacy and value.
Learning tech
In the long term, this is by far the most important. Tools such as word processors, spreadsheets, PowerPoint, graphics packages (2D & 3D) and databases are a vital form of technology in schools, as they are mainstays in the real world. They must be made available to learners. So must learning resources, such as Wikipedia and a mountain of Open Educational Resources.
Corrective software is another category: tools that identify errors in spelling, grammar, structure and style. This now includes adaptive software that personalises learning. Then there’s assessment software that can set tests, formative and summative, as well as mark them. However, I’m not such a fan of marking. When, as a parent, did you ever set your child a test or mark them? It turns schools into unnecessarily competitive environments, where there are winners but also, more destructively, even more losers. The focus here should be on effortful learning – blogging, writing, doing things, projects, providing support on independent learning – what used to be called homework.
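As a minimal sketch of how such corrective tools work at their simplest – compare each word against a lexicon and suggest the closest entry (the tiny dictionary is illustrative; real tools use vast lexicons plus grammar and style models):

```python
# A minimal sketch of corrective software: flag words not in a lexicon and
# suggest the closest match. The tiny DICTIONARY is purely illustrative.
import difflib

DICTIONARY = {"learning", "teaching", "technology", "assessment", "school"}

def suggest(word):
    """Return the word if correct, otherwise the closest dictionary entry."""
    if word in DICTIONARY:
        return word
    matches = difflib.get_close_matches(word, DICTIONARY, n=1, cutoff=0.7)
    return f"{word} -> {matches[0]}?" if matches else f"{word} -> no suggestion"

for w in ["teching", "asessment", "school"]:
    print(suggest(w))  # teching -> teaching?, asessment -> assessment?, school
```

Adaptive and assessment software build on the same principle: detect the error, then decide what to do with it.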
AI is here
The new tech kid on the block takes us away from devices, towards software that learns while it delivers. AI now helps us create, curate, consolidate, deliver and assess learning. Of course, it’s not such a new kid, as every learner on the planet with internet access uses Google to search and find things – and Google is pure AI. In fact AI is the new UI (User Interface), as most services you use online – Google, email, social media, Amazon, Netflix – are delivered using AI. AI is also revolutionising interfaces for learning. Siri, VIV, Cortana and Alexa are bringing voice and dialogue into play, reintroducing something that was lost in learning – Socratic dialogue. But this time Socrates is smart software.
Tech is transgressive
Lastly, tech sneaks into schools whether you like it or not. Kids will have smartphones – almost ALL of them. Kids will have laptops, games consoles, smart TVs. Tech is cool. School is not cool. They will game the system. This poses real challenges. Audrey Mullen made a name for herself while still a high school student by making some apposite and powerful recommendations for tech in schools. She abhorred iPads, told teachers to “Save us from ourselves” and ban mobiles from the classroom, and made an appeal for solid administrative software that delivered good services and content. Even teachers have been known to sneak tech into schools for predatory purposes – cameras in toilets, child porn and so on. I’m not against banning tech in classrooms. Classrooms were designed for one-to-many teaching, not tech. Young people see technology as subversive and transgressive. They will game it.
Conclusion
Every school should have a digital strategy. This needs to cover school, governor, teacher and learner tech. There needs to be some reasonable effort made to define and plan for tech in schools then implement professional procurement. If we leave it to erratic, personal, teacher-led ‘research’ (which is not really research at all) we’ll continue to make the same mistakes. History will repeat itself and education will not benefit from technology in the way, I believe, it should.


Tuesday, June 13, 2017

10 recommendations on HE from OEB Mid-Summit (Reykjavik)

Iceland was a canny choice for a Summit. Literally in sight of the house where Reagan and Gorbachev met in 1986 (the Berlin Wall fell in 1989), it was a deep, at times edgy, dive into the future of education. When people get together, face up to rifts in opinion and talk it through – as the Reagan-Gorbachev summit showed – things happen. Well, maybe. Here are my ten takeaways from the event (personal).
1. Haves-have nots
First the place. Iceland emerged up through the Mid-Atlantic Ridge, which still runs right through the middle of the country. Sure enough, while we were there, there were political rifts in the US, with the Comey-Trump farrago, and a divisive election in the UK. It is clear that economic policies have caused fractures between the haves and the have-nots. In the UK there’s a hung Parliament, a split country, looming Brexit negotiations and a crisis in Northern Ireland. In the US, Trump rode into Washington on a wave of disaffection and is causing chaos.
But let’s not imagine that Higher Education lies above all of this. Similar fault lines emerged at this Summit. As Peter Thiel said, Higher Education is like the Catholic Church on the eve of the Reformation: “a priestly class of professors… people buying indulgences in the form of amassing enormous debt for the sort of secular salvation that a diploma represents”. More significantly, he claims there has been a ‘failure of the imagination, a failure to consider alternative futures’. Culture continues to trump strategy.
Higher Education is a valuable feature of cultural life but people are having doubts. Has it become obese? Why have costs ballooned while delivering the same experience? There are problems around costs, quality of teaching and relevance. Indeed, could Higher Education be generating social inequalities? In the US and UK there was a perception, not without truth, that there is a gulf between an urban, economically stable, educated elite and the rest, who have been left to drift into low-status jobs and a loss of hope for their children. The Federal debt held on student loans in the US has topped $1.5 trillion. In the UK, institutions simply raise fees to whatever cap they can. The building goes on, costs escalate and student loans get bigger. Unlike almost every other area of human endeavour, it seems there has been little effort to reduce costs and look for cost-effective solutions.
Recommendation: HE must lower its costs and scale
2. Developed-developing
The idea that the current Higher Education model should be applied to the developing world is odd, as it doesn’t seem to work that well in the developed world. Rising costs, student and/or government debt, dated pedagogy and an imbalance between the academic and vocational render application in the developing world at best difficult, at worst dangerous. I have been involved in this debate and it is clear that the developing world needs vocational first, academic second.
Recommendation: Develop different and digital HE model for developing world
3. Public-private
In an odd session by Audrey Watters, we had a rehash of one of her blogs, about personalised learning being part of the recent rise in ‘populism’. She blamed ‘capitalism’ for everything, seeing ‘ideology’ everywhere. But as one brave participant shouted behind me, “so your position is free from ideology then?” It was the most disturbing session I heard, as it confirmed my view that the liberal elite are somewhat out of touch with reality and all too ready to trot out old leftist tropes about capitalism and ideology, without any real solutions. One question, from the excellent Valerie Hannon, stated quite simply that she was ‘throwing the baby out with the bathwater’. Underlying much of the debate at the summit lay an inconvenient truth: Higher Ed has a widespread and deep anti-corporate culture. This means that the public and private sectors talk past, and not to, each other. This is a real problem in EdTech. Until we start talking to each other, like Reagan and Gorbachev, this wall will not fall.
Recommendation: Stop talking past each other, talk to each other
4. Research-practice
Session after session laid out established and recent research in cognitive psychology and education, which showed the redundancy of the lecture as a core pedagogic technique. Data was shown of shockingly low attendance at lectures in both the US and the UK. The illusion that Higher Ed teaches critical thinking was also exposed by Ben Nelson (critical thinking, by the way, isn’t really a thing in itself). Arum’s study in Academically Adrift, of 2332 students, in 23 institutions, over 4 years, showed a worrying lack of success in critical thinking, complex reasoning and written communication. Harold Bekkering gave a brilliant talk on how we learn through the correction of errors, yet teaching methods fail to recognise this core cognitive fact. Roger Schank eviscerated current pedagogy with its lazy obsessions with lectures and marking. Think of parents, he pleaded: did you ever give your kid a test or mark them?
Recommendation: Don’t lecture me!
5. Teaching v research
Astin’s study of 24,847 students, in 309 institutions, looked at the correlation between ‘faculty orientation towards research’ and ‘student/teaching orientation’ and found them to be strongly negatively correlated. Student orientation was also negatively related to compensation, with “a significant institutional price to be paid, in terms of student development, for a very strong faculty emphasis on research”. This should come as no surprise. Research skills require systematic thinking, attention to detail and an understanding of methods and analysis. Teaching skills require social skills, communication skills, the ability to hold an audience, keep to the right level, avoid cognitive overload, apply good pedagogy and deliver constructive feedback. An additional problem is the exponential growth of journals and, some would say, second and third-rate research. The swing away from teaching towards research over the last 60 years has been well documented by Jencks, Boyer, Massy and Bok.
Recommendation: Research is not a necessary condition for teaching – break the link
6. Building v online
Most campuses look as though they’ve been built by committee, often a rather ugly assembly of disparate buildings – that’s because they have been built departmentally. The architecture reflects the fractured, departmental nature of the organisation. Encouraged by endowments, where alumni want their name, if not in lights, in concrete, the building goes on. Yet the occupancy rates of university buildings show an appalling return on investment. At the same time there is often a small and tactical approach to online delivery. It is perhaps time to consider what John Daniel called a ‘default to digital’ for some courses.
Recommendation: Build less. Balance out the capital budget with a substantial digital budget
7. Inside-outside
HE is unlikely to change from the inside, as culture trumps strategy. Substantial, strategic change – online courses, rebalancing of the academic and vocational, pedagogic and technological shifts – is more likely to come from outside academe, through political policies, technological shifts, new models such as MOOCs/online courses and the use of technology by students. Sure, there is some good and real change happening within HE, but it tends to be, and remain, the work of outliers. The core system is in stasis.
Recommendation: Open up to outside, not just with technology but culturally
8. Tech v anti-tech
In technology, AI was the hot topic, and rightly so. I gave a session devoted to its application in learning, as did others, and it was a recurring theme. To be honest, AI is not really the right phrase; let’s just call it smart software. We had a marvellous talk from Nell Watson on the transformative nature of machine learning, and another from Valerie Hannon making a similar point about the complexity of the problems we face and the need for smart, technological solutions in education. Peter O’Driscoll also showed how tech ‘jerks’ people around in institutions, but rather than retreat into culturally safe, luddite shelters, we need to embrace the technology to do good.
Recommendation: Embrace transformative technology
9. Culture v strategy
Culture trumps strategy. Budgets, chasing ratings, quality systems, building programmes, an obsession with lectures, research-driven teaching and an anti-corporate, inward-looking culture always trump strategy. Change management (planned and executed) is the way to go, and we can learn a lot about how this is done in the outside world – not by writing reports but by creating a sense of urgency and sustained action. No matter how many summits, reports and horizon scans we have, ‘the best way to predict the future is to create it’ (Alan Kay). That means recognising the issues and taking a strategic approach to solutions.
Recommendation: Strategic, costed initiatives with change management
10. Academic v vocational
There’s always been a tension between these two, but the pendulum may have swung way too far towards the academic. Roger Schank and I made passionate pleas for more learning by doing and more apprenticeships. It’s no accident that Germany is Europe’s strongest economy – they have balance in their educational system. And guess what happened – within 48 hours Trump issued a major policy announcement recommending precisely this. We’ve already done this in the UK, with 0.5% of payroll (by law) going towards apprenticeships.
Recommendation: Rebalance academic and vocational
Conclusion

As if by magic, which of course it is not, within 48 hours of our Summit there was a major briefing from the White House about building skills and apprenticeships, exactly what Roger and I had been talking about. (There is a link, which will be revealed later.) It’s a pity that it’s taken a Trump to get this going – but hey – I don’t care where it comes from; good policy is good policy. It is an example of what I was talking about. Paraphrasing Alan Kay, we can take the future into our own hands or let it just happen.


Monday, May 22, 2017

Philosophy of technology - Plato, Aristotle, Nietzsche, Heidegger - technology is not a black box

Greek dystopia
The Greeks understood, profoundly, the philosophy of technology. In Aeschylus’s Prometheus Bound, Prometheus gifts metallurgy, writing and mathematics to man, so Zeus punishes him with eternal torture. This warning is the first dystopian view of technology in Western culture. Mary Shelley subtitled Frankenstein ‘The Modern Prometheus’ and Hollywood has delivered on that dystopian vision for nearly a century. Art has largely been wary and critical of technology.
God as maker
But there is another more considered view of technology in ancient Greece. Plato articulated the philosophy of technology, seeing the world, in his Timaeus, as the work of an ‘Artisan’; in other words, the universe is a created entity, a technology. Aristotle makes the brilliant observation in his Physics that technology not only mimics nature but continues “what nature cannot bring to a finish”. They set in train an idea that the universe was made and that there was a maker, the universe as a technological creation.
The following two-thousand-year history of Western culture bought into the myth of the universe as a piece of created technology. Paley, who formulated the modern argument for the existence of God from design, used technological imagery, the watch, to specify and prove the existence of a designed universe and therefore a designer – we call (him) God. In Natural Theology; or, Evidences of the Existence and Attributes of the Deity, he uses an argument from analogy to compare the workings of a watch with the observed movements of the planets in the solar system, to conclude that both show signs of design and that there must be a designer. Dawkins titled his book The Blind Watchmaker as its counterpoint. God as watchmaker, as technologist, has been the dominant popular philosophical belief for two millennia.
Technology, in this sense, helped generate this metaphysical deity. It is this binary separation of the subject from the object that allows us to create new realms, heaven and earth, which get a moral patina and become good and evil, heaven and hell. The machinations of the pastoral heaven and the fiery foundry that is hell revealed the dystopian vision of the Greeks.
Technology is the manifestation of human conceptualization and action, as it creates objects that enhance human powers, first physical then psychological. With the first hand-held axes, we turned natural materials to our own ends. With such tools we could hunt, expand and thrive, then control the energy from felled trees to create metals and forge even more powerful tools. Tools beget tools.
Monotheism rose on the back of cultures in the Fertile Crescent of the Middle East, who literally lived on the fruits of their tool-aided labour. The spade, the plough and the scythe gave them time to reflect. Interestingly, our first records, on that beautifully permanent piece of technology, the clay tablet, are largely accounts of agricultural produce. The rise of writing and efficient alphabets made writing the technology of control. We are at heart accountants, holding everything to account, even our sins. The great religious books of accounts were the first global best sellers.
Technology slew God
Technology may have suggested, then created God, but in the end it slew him. With Copernicus, who drew upon technology-generated data, we found ourselves at some distance from the centre of the Universe, not even at the centre of our own little whirl of planets. Darwin then destroyed the last conceit, that we were unique and created in the eyes of a God. We were the product of the blind watchmaker, a mechanical, double-helix process, not a maker; reduced to mere accidents of genetic generation, the sons not of Gods but of genetic mistakes.
Anchors lost, we were adrift, but we humans are a cunning species. We not only make things up, we make things and make things happen.
We are makers
Once God was dead, in the Nietzschean sense of a conceptual death, we were left with just technology. Radovan Richta’s theory of Technological Evolution posited three stages – tools, machines and automation. We got our solace not from being created forms but by creating forms ourselves. We became little Gods and began to create our own universe. We abandoned the fields for factories and designed machines that could do the work of many men. What we learned was scale. We scaled agricultural production through technology in the agricultural revolution, scaled factory production in the industrial revolution, scaled mass production in the consumer revolution. Then more machines to take us to far-off places – the seaside, another country, the moon. We now scale the very thing that created this technology, ourselves. We alchemists have learned to scale our own brains.
Machines destroy the Little Gods
Eventually we realised that even we, as creators, could make machines that could know and think on our behalf. God had died, but now the Little Gods are dying. Gods have a habit of destroying their creators and we will return to that agricultural age, an age of an abundance of time and the death of distance. We, once more, will have to reflect on the folly of work and learn to accept that it was never our fate, only an aberration. Technology now literally shapes our conception of place and space, with film, radio, TV and the web. Like spiders we got entangled in our own web, and it now begins to spin us.
Technology not a black box
Technology is not a ‘black box’, something separate from us. It has shaped our evolution, shaped our progress, shaped our thinking – and it will shape our future. It may even be an existential threat. There is a complex dialectic between our species and technology that is far more multifaceted than the simplistic ‘it’s about people not technology’ trope one constantly hears on the subject. That dialectic has suddenly got a lot more complex with AI. As Martin Heidegger said in his famous Spiegel interview, “Only a God can save us”. What I think he meant by this was that technology has become something greater than us, something we now find difficult to even see, as its hand has become ever more invisible. It is vital that we reflect on technology, not as a ‘thing-in-itself’, separate from us, but as part of us. Now that we know there may be no maker God, no omnipotent technologist, we have to face up to our own future as makers. For that we need to turn to philosophy – Plato, Aristotle, Nietzsche and Heidegger are a good start….
The postscript is that AI may, in the end, be the way forward even in philosophy. In the same way that the brain has limits on its ability to play chess or Go, it may also have limits on the application of reason and logic. Philosophical problems themselves may need the power of AI if we are to find solutions to these intractable problems. AI may be the God that saves us….


Thursday, May 04, 2017

10 uses for Amazon Echo in corporates

OK, she’s been in my kitchen for months and I’m in the habit of asking her for Radio 4 while I’m making my morning coffee. She’s useful for music as well, especially when a tune comes into your head. But it’s usually some question I have in my head, or a topic I want some detail on. My wife’s getting used to hearing me talk to someone else while in another room. But what about the more formal use of Alexa in a business? Could its frictionless, hands-free, natural language interface be of use in the office environment?
1. Timer
How often have you been in a meeting that’s overrun? You can set multiple timers on Alexa and she will light up and alert you (softly) towards the end of each agenda item, say one minute before the next one starts. It could also be useful as a timer for speakers and presenters. Ten minutes each? Set her up and she provides both visual and aural timed cues. I guess it would pay for itself at the end of the first meeting!
2. Calendar functionality
As Alexa can be integrated with your Google calendar, you simply say, “Alexa, tell Quick Events to add go to see Tuesday 4th March at 11 a.m.". It prompts you until it has the complete scheduled entry.
3. To do lists
Alexa will add things to a To Do list. This could be an action list from a meeting or a personal list.
4. Calculator
Need numbers added, subtracted, multiplied or divided? You can read them in quickly and Alexa replies just as quickly.
5. Queries and questions
Quick questions or more detailed stuff from Wikipedia? Alexa will oblige. You can also get stock quotes and even do banking through Capital One. Expect others to follow.
6. Domain specific knowledge
Product knowledge, company-specific knowledge – Alexa can be trained to respond to voice queries. Deliver a large range of text files and Alexa can find the relevant one on request.
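For the technically minded, here is a minimal sketch of how such a skill might look, using the Alexa Skills Kit SDK for Python. The intent name, slot and canned answers are my illustrative assumptions, not a real product; the same pattern would serve the quiz app in the next point:

```python
# A minimal sketch of a custom Alexa skill answering company-specific
# questions. 'CompanyKnowledgeIntent', the 'topic' slot and ANSWERS are
# hypothetical stand-ins for your own interaction model and document index.
from ask_sdk_core.skill_builder import SkillBuilder
from ask_sdk_core.dispatch_components import AbstractRequestHandler
from ask_sdk_core.utils import is_intent_name

ANSWERS = {
    "holiday policy": "Staff get 25 days of annual leave plus bank holidays.",
    "expenses": "Submit expenses through the finance portal by month end.",
}

class CompanyKnowledgeHandler(AbstractRequestHandler):
    def can_handle(self, handler_input):
        return is_intent_name("CompanyKnowledgeIntent")(handler_input)

    def handle(self, handler_input):
        slots = handler_input.request_envelope.request.intent.slots
        topic = (slots["topic"].value or "").lower()
        speech = ANSWERS.get(topic, f"I couldn't find anything on {topic}.")
        return handler_input.response_builder.speak(speech).response

sb = SkillBuilder()
sb.add_request_handler(CompanyKnowledgeHandler())
handler = sb.lambda_handler()  # deployed as an AWS Lambda entry point
```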
7. Training
You can provide text briefings (via text to speech) or your own audio briefings. Indeed, you can have as many of these as you want. Or go one step further with a quiz app that delivers audio training.
8. Music
Want to set yourself up for the day or have some ambient music on while you work? Better still, music that responds to your mood and requests – Alexa is your DJ on demand.
9. Order sandwiches, pizza or Uber
As Alexa is connected to several suppliers, you can get these delivered to your business door. Saves all of that running out for lunchtime sandwiches or pizza.
10. Control office environment
You can control your office environment through the Smart Home Skill API. This will work with existing smart home devices, but there’s a developer’s kit so you can develop your own. It can control lights, thermostats, security systems and so on.
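As a minimal sketch of what sits behind that: a Smart Home skill is a Lambda function that receives JSON ‘directives’ and passes them on to your devices. The handler below is simplified (real responses also carry message IDs and correlation tokens), and set_device_power is a hypothetical hook into your own building-control system:

```python
# A simplified sketch of a Smart Home Skill Lambda handling
# 'Alexa, turn on the meeting room lights'.
def set_device_power(endpoint_id, on):
    # hypothetical hook into your building-control system
    print(f"{endpoint_id} -> {'ON' if on else 'OFF'}")

def lambda_handler(event, context):
    directive = event["directive"]
    header = directive["header"]
    if header["namespace"] == "Alexa.PowerController":
        turn_on = header["name"] == "TurnOn"  # "TurnOn" or "TurnOff"
        endpoint = directive["endpoint"]
        set_device_power(endpoint["endpointId"], turn_on)
        # echo a confirmation event back so Alexa can tell the user
        return {"event": {"header": dict(header, namespace="Alexa",
                                         name="Response"),
                          "endpoint": endpoint,
                          "payload": {}}}
```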
Conclusion

As natural language AI applications progress, we will see these business uses become more responsive and sophisticated. This is likely to eat into that huge portion of management time that the Harvard Business Review identified as admin. Beyond this are applications that deliver services, knowledge and training specific to your organisation and to you as an individual. I’m working on exactly this application in training as we speak.


Wednesday, May 03, 2017

AI moving towards the invisible interface

AI is the new UI
What do the most popular online applications all have in common? They all use AI-driven interfaces. AI is the new UI. Google, Facebook, Twitter, Snapchat, email, Amazon, Google Maps, Google Translate, satnav, Alexa, Siri, Cortana and Netflix all use sophisticated AI to personalise in terms of filtering, relevance, convenience, and time and place-sensitivity. They work because they tailor themselves to your needs. Few notice the invisible hand that makes them work, that makes them more appealing. In fact, they work because they are invisible. It is not the user interface that matters, it is the user experience.
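Under the hood, the principle is simple, even if the production systems are not. Here is a minimal sketch of that invisible personalisation – score items against a profile learned from past behaviour and surface the best matches (the profile weights and catalogue below are invented for illustration):

```python
# A minimal sketch of personalised filtering: rank items by how well their
# tags match a user's interest profile. Real services learn these weights
# with large-scale machine learning; the principle is the same.
def recommend(profile, items, top_k=2):
    def score(item):
        return sum(profile.get(tag, 0.0) for tag in item["tags"])
    return sorted(items, key=score, reverse=True)[:top_k]

profile = {"ai": 0.9, "education": 0.7, "sport": 0.1}  # learned weights
catalogue = [
    {"title": "AI in schools", "tags": ["ai", "education"]},
    {"title": "Football highlights", "tags": ["sport"]},
    {"title": "Machine learning 101", "tags": ["ai"]},
]
for item in recommend(profile, catalogue):
    print(item["title"])  # "AI in schools", then "Machine learning 101"
```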
Yet, in online learning, AI UIs are rarely used. That’s a puzzle, as it is the one area of human endeavour that has the most to gain. As Black & Wiliam showed, feedback that is relevant, clear and precise goes a long way in learning. Not so much a silver bullet as a series of well-targeted rifle shots that keep the learner moving forward. When learning is sensitive to the learner’s needs in terms of pace, relevance and convenience, things progress.
Learning demands attention, and because our working memory is the narrow funnel through which we acquire knowledge and skills, the more frictionless the interface, the greater the speed and efficacy of learning. Why load the learner with the extra tasks of learning an interface, navigation and extraneous noise? We’ve seen steady progress beyond the QWERTY keyboard, designed to slow typing down to avoid mechanical jams, towards mice and touch screens. But it is with the leap into AI that interfaces are becoming truly invisible.
Textless
Voice was the first breakthrough, and voice recognition is only now reaching the level of reliability that allows it to be used in consumer computers, smartphones and devices in the home, like Amazon Echo and Google Home. We don’t have to learn how to speak and listen; those are skills we picked up effortlessly as young children – they came naturally. As bots develop the ability to engage in dialogue, they will become ever more useful in teaching and learning.
AI also provides typing, fingerprint and face recognition. These can be used for personal identification, even assessment. Face recognition for ID, as well as thought diagnosis, is advancing, as is eye movement and physical gesture recognition. Such techniques are commonly used in online services such as Google, Facebook, Snapchat and so on. But there are bigger prizes in the invisible interface game. So let’s take a leap of the imagination and see where this may lead over the next few decades.
Frictionless interfaces
Mark Zuckerberg announced this year that he wants to get into mind interfaces, where you control computers and write straight from thought. This is an attempt to move beyond smartphones. The advantages are obvious, in that you think fast and type slow. There’s already someone with a pea-sized implant who can type eight words a minute. Optical imaging (lasers) that reads the brain is one possibility. There is an obvious problem here around privacy, but Facebook claims to be focusing only on words chosen by the brain for speech, i.e. things you were going to say anyway. This capability could also be used to control augmented and virtual reality, as well as communications with the internet of things. Underlying all of this is AI.
In Sex, Lies and Brain Scans, by Sahakian and Gottwald, the advances in this area sound astonishing. John-Dylan Haynes (Max Planck Institute) can already predict intentions in the mind with scans, seeing whether the brain is about to add or subtract two numbers, or press a right or left button. Words can also be read, with Tom Mitchell (Carnegie Mellon) able to spot, from fMRI scans, nouns from a list of 60, 7 times out of 10. His team moved on to train the model to predict words out of a set of 1001 nouns, again 7 times out of 10. Jack Gallant (University of California) reconstructed watched movies purely from scans. Even emotions, such as fear, happiness, sadness, lust and pride, can be read, by Karim Kassam (Carnegie Mellon). Beyond this there has been modest success by Tomoyasu Horikawa in identifying topics in dreams. Sentiment analysis from text and speech is also making progress, with AI systems providing the analysis.
The good news is that there seems to be commonality across humans, as semantic maps – the relationships between words and concepts – seem to be consistent across individuals. Of course, there are problems to be overcome, as the brain tends to produce a lot of ‘noise’, which rises and falls but doesn’t tell us much else. The speed of neurotransmission is blindingly fast, making it difficult to track, and, of course, most of these experiments use huge, immobile and expensive scanners.
The implications for learning are obvious. If we know what you think, we know whether you are learning, can optimise that learning, provide relevant feedback and also assess reliably. To read the mind is to read the learning process, its misunderstandings and failures, as well as its understanding and successful acquisition of knowledge and skills. A window into the mind gives teachers and students unique advantages in learning.
Seamless interfaces
Elon Musk’s Neuralink goes one step further, looking at extending our already extended mind through neural laces or implants. Although our brains can cope with sizeable INPUT flows through our senses, we are severely limited on OUTPUT, with speech or two meat fingers pecking away on keyboards and touchscreens. The goal is to interface physically with the brain, to explore communications but also storage, and therefore extended memory. Imagine expanding your memory so that it becomes more reliable – able to know so much more, have higher mathematical skills, speak many languages, have many more skills.
We already have cochlear implants that bring hearing to the deaf, and implants that allow those who suffer from paralysis to use their limbs. We have seen how brain training in VR can rewire the brain and restore the nervous system in paraplegics. It should come as no surprise that this will develop further as AI solves the problem of interfacing, in the sense of both reading from and writing to the brain.
The potential for learning is literally ‘mind blowing’. Massive leaps in efficacy may be possible, in retained knowledge, retrieval and skills. We are augmenting the brain by making it part of a larger network, seamlessly.
Conclusion

There is a sense in which the middleman is being slowly squeezed out here, or disintermediated. Will there be a need for classrooms, teaching, blackboards, whiteboards, lectures or any of the apparatus of teaching when the brain is an open notebook, ready to interface directly with knowledge and skills – at first with deviceless, natural interfaces using voice, gesture and looks, then frictionless brain communications and finally seamless brain links? Clumsy interfaces inhibit learning; clean, smooth, deviceless, frictionless and seamless interfaces enhance and accelerate it. This all compensates for the weaknesses of the evolved biological brain – its biases, inattentiveness, forgetting, need for sleep, depressive tendencies, lack of download or networking, slow decline, dementia and death. A new frontier has opened up and we’re crossing, literally, into ‘unknown’ territory. We may even find that we will come to know the previously unknowable and think at levels beyond the current limitations of our flawed brains.
