TEDx Glasgow delivered on its theme, ‘A Disruptive World’, before it even began, by itself being disrupted by a fire alarm that had the massed delegates standing in the glorious June sunshine. That this disruption had been triggered by someone having a sly smoke in the alleyway behind the building cemented the fact that this would be an anarchic, very Glaswegian affair. The west coast warmth and banter were maintained throughout the day by presenter Janice Forsyth (@janiceforsyth) and comedian Sanjeev Kohli (@govindajeggy), who ensured the proceedings had an accessible feel both on and off stage.
TED stands for Technology, Entertainment and Design, and is a non-profit organisation started all the way back in 1984. Acting as a platform for ‘ideas worth spreading’, it has grown into an international success story. TEDx is an offshoot which supports local communities in delivering TED-like experiences.
We began with a haunting, unaccompanied folk song from Kathleen McInnes, which took us to the first session. James Watt (@brewdogjames), Founder of BrewDog, spoke about his company’s disruptive approach to customer culture, imploring us over a pre-10am can of lager: ‘Don’t fuck up the culture’. By bringing the external internally, the dedicated customer base (of which I am one) has helped drive BrewDog to international success.
Such openness has its risks, illustrated in animated fashion by James Lyne, Global Head of Security at Sophos (@JamesLyne). The hacker of old has gone, having been replaced by the smiling social engineer of the modern cyber-criminal. When buying credit card details is as simple as visiting a dark-web online shop, itself customer rated and more secure than the banks they’ve breached, it’s clear that the traditional reliance on others to maintain our privacy has gone.
Digital Health is my first and clearest interest, so I was particularly keen to hear from the medical TEDx speakers.
Dr Ravinder Dahiya (@flexsensotronic) from the University of Glasgow introduced the audience to the importance of touch in robotics, sharing the groundbreaking work he and his team have been undertaking in wrapping flexible ‘e-Skin’ over advanced robots and prosthetics. The critical importance of returning this sense to the wounded, and delivering it to the robotic, cannot be overstated, and bodes well for the future of both fields.
Jason Leitch (@JasonLeitch), National Clinical Director at the Scottish Government, was his usual brilliant and urbane self as he took the audience through a crash course in recording vital signs – pulse, respiratory rate, and ‘What matters to me’. This simple question has transformed the relationship between patients and staff in many Scottish hospitals, disrupting the traditional (and regressive) top-down approach to delivering care.
The medical device market, so crucial and valuable in global healthcare, is aching to be disrupted, and Dr Craig Robertson from @Epipole_ltd is doing just that, attacking Diabetic Retinopathy head-on. By developing a high quality, inexpensive fundoscope, linked to the best of cloud-based machine learning, he and his team are bringing 21st century screening to the developing world first, and seeking permission later (not the first time I heard this on the day). He also successfully delivered a live tech demo, and wins my ‘Silicon Cojones’ award. Don’t ask to see the trophy.
Marco Plas, Head of Research at the Wonder Weeks, spoke about the serial disruptions (10!) that occur in the first 20 months of a child’s life. Understanding and responding to these important disruptions, and making the most of the fleeting opportunities they present, is critical in ensuring the very best for children as they grow.
The day was interwoven with frequent breaks, workshops, and speaker Q&As. My hosts, the Digital Health & Care Institute (@DHIScotland – dhi-scotland.com) took over the 2nd floor to present ‘Innovation Avenue’, a showcase of Scotland’s future, where I was able to experience first-hand some of the incredible products being supported by the Innovation Centres (@ic_Scotland – innovationcentres.scot). Two of the highlights were:
Dr David Harris-Birthill, Senior Research Fellow from St Andrews University, demonstrating touch-free pulse and oxygen saturation monitoring of up to 6 people at once using Microsoft Kinect. This could be extremely helpful in remote monitoring waiting areas in urgent care centres and emergency departments, improving safety and saving staff time and resource.
Dr Pablo Casaseca, Senior Lecturer in Signal & Image Processing from the University of the West of Scotland, whose team is cleverly using a mobile phone app for audio analysis of coughs to help monitor respiratory health and predict exacerbations.
The appetite for disruptive and proactive innovation was absolutely clear. As one person described it, ‘We’re moving from asking permission first, and doing it then asking for forgiveness, to just doing it and not stopping until they taser you’ – I may well put this on my coat of arms.
Of course, TEDx isn’t just about medicine. Part of the magic of the event is the wide variety of speakers they assemble on one stage. The subsequent wild mixture of topics stimulates the mind and conversation even further.
So, from the art world we heard from fashion designer Pam Hogg (@PAMHOGGcouture), talking about ‘Divine Disorder’ and the chaotic muse she serves in delivering her incredible and personal work to the catwalks of the globe. From NVA (@_nva_), Creative Director Angus Farquhar premiered a mystical video of his art installation at St Peter’s Seminary in Cardross. He spoke of ‘healing the wounded giant’: a choral piece ringing through the illuminated skeleton of this post-modern ruin. Brianna Robertson-Kirkland (@BreeRob_Kirk) explained how the Castrati, the eunuch rock stars of the classical operatic world, led to the development of a vocal training methodology that shapes singers today.
From the world of business, we heard of the need to innovate in the conservative world of the legal profession from Ruaridh Wynne-McHardy (@RuairidhWM). Steve McCreadie (@TheLensCP) and Dr Mark Payton (@MerciaTech) gave advice on nurturing intrapreneurship and entrepreneurship respectively, and Ellis Watson from DC Thomson (@DC_Thomson) lit a fire under our collective backsides and asked us just to get on with it: ‘Disrupt yourself or Die Trying’.
And then there’s the motivational element, at which TED and TEDx excel. Mark Muller Stuart (@BeyondBorders_) reminded everyone of global conflict, and the role that a small nation such as Scotland can play in Non-State Diplomacy. Luke Robertson (@lukeRobertson) gave a humbling talk on ‘The Other Side of Fear’, and how he recovered from having a pacemaker fitted and undergoing brain surgery to become the first Scot to complete a solo, unsupported and unassisted expedition to the South Pole. Fear can be a powerful motivator, and he encouraged us all to take more from it than it takes from us.
Which leads me to the most touching talk of the day – Laura Beveridge (@wee_munchkin6). By day she works as development officer at Who Cares? Scotland helping young people in care. Laura came from a childhood in care herself, and she spoke bravely and honestly about how fear and bureaucracy got in the way of even the most simple activities that we all take for granted:
Risk assessments for sleepovers.
Sitting on the shore while your friends play in the sea because the wrong kind of staff are present.
Being denied a hug, or even being told that someone loves you, because you’re not a child, but a child in care.
A starker example of the need for disruption, and to rise above fear, could not have been given. As her speech closed, the whole auditorium rose as one to give her a standing ovation.
The day finished as it started, with clear blue skies and warm early summer sun bathing the massed audience. Conversations with strangers continued into the evening, and it was clear that the mission statement of TEDx was being delivered. What is interesting about the TED approach is that, in contrast to more traditional conferences, answers aren’t provided. What you get instead are hints at solutions, and encouragement to communicate, collaborate, and boldly experiment. The call to disruption of the world starts by accepting disruption within.
The NHS has often been described as Stone Age in its adoption of technology, and whilst I wouldn’t be that harsh, it’s not far off. It’s certainly lagged behind the entire time I’ve been in the NHS (which is since my birth), but at times it has flirted with coming up to date. I’d heard rumours that we’re giving it another try, but having recently been at a talk from one of the NHS innovation leads which sounded more like he was reading from a Silicon Valley bingo card, I’ve not been entirely full of hope.
After today, I’m feeling optimistic.
Dr Steve Laitner (@stevelaitner), GP and Freelance Health Consultant and someone I’ve conversed with many times over Twitter, was kind enough to invite me to attend a meeting on ‘High Value Personalised Medicine in the NHS – now and the future’. Personalised medicine is another buzz phrase (a box on the bingo card, if you will) which is slightly hard to define. I understand it as the convergence of advances in medical informatics and biotechnology that will allow super-personalisation of treatment: segmenting populations into smaller groups (hence one of its other names, Stratified Medicine), and running more granular tests, including genomics, at the individual level (hence the other, other name – Precision Medicine). Of course, the NHS wants some of this, but what is not at all clear is how best to approach it.
The meeting brought together an incredible mix of patients, carers, academics, commissioners, third sector executives, scientists, and a few doctors, in an effort to begin to answer some of the questions raised. In Steve’s own words, it was a horizon scanning event, which looked to identify those technologies that could help deliver Personalised Medicine in the NHS.
Dr Fiona Carragher, Deputy Chief Scientific Officer (@depcsofiona), started us off with a clear explanation of Personalised Medicine, breaking it down into its alliterative components: Prevention, Precision, Prediction and Participation. A number of projects are already well underway, including the 100,000 Genomes Project, which has been progressing nicely since it kicked off in December 2012. There are now 13 Genomic Medicine Centres in England, which have generated huge volumes of data that are being used for research and for tailoring diagnosis and treatment in patients with cancer and rare diseases.
Sir Muir Gray (@muirgray), Director of the National Knowledge Service and Chief Knowledge Officer to the NHS, was next up. Bold and entertaining, he claimed that we stand on the verge of the third medical revolution:
The first: Public Health.
The second: (everything else in between).
The third: Mobile Phones.
He stated that the incredible processing, networking, and empowering effects of the mobile phone have transformed every aspect of our society, and it is now hard to find people who don’t have access to one. This revolutionary tool has the power to amplify both the benefits and the harms of medical investigation and treatment, so we need to be more thoughtful than ever in how we apply it.
Sir Muir is also a big fan of maps: the sort that show you the difference in treatment between hospitals. He spoke of a dream that every hospital has one on its wall, highlighting the variation in investigation and treatment of diseases. Personalisation does not mean eliminating this variation, but rather recognising that we need to ask why the variation exists, and understand its cause. As an active participant in the East Sussex Better Together programme in my home CCG, I can attest to the challenge of developing a good understanding of this data to help deliver a high quality integrated health and social care organisation to my locality.
His parting request was for everyone in attendance to help not only create and foster innovation, but also work towards adopting it widely in the NHS. This is a message I have heard many, many times: from providers, commissioners, patients and carers alike. It’s something I’m committed to helping with, and with Sir Muir’s encouragement my resolve has been suitably stiffened.
The final keynote was delivered by Lord Victor Adebowale (@voa1234), who came to ask a series of simple but powerful questions from his notebook:
Why are we pursuing Personalised Medicine, when we have so many other simpler problems to solve?
Who does it benefit?
How do we ensure that this doesn’t widen the gap we know as the Inverse Care law – that care is least available to those that need it the most?
In contrast to Sir Muir, he pointed out that mobile telephony and internet access are not equitably distributed, with 8 million people in the UK having no internet access. I was surprised by this, as were some of the people following my live-tweeting. Sure enough, the ONS Internet Access survey of 2015 shows that only 86% of UK households have internet access.
Lord Adebowale was elected as a people’s peer for his work as chief executive of Turning Point, and I’m delighted to have someone of his focus and intelligence asking these important questions at the highest levels of government. It certainly focused the minds of the attendees, and closed a truly outstanding opening session.
With that we were into the workgroups. I sat with the group discussing Patient Participation and Genomics, while others looked at Personal Health Data and Population Health Management. Each session was led by a domain expert, which set us up nicely for a lively and wide-ranging discussion about how the fundamental tools of Personalised Medicine – shared high quality data, and genomics – might be used in a practical and ethical way within the NHS. Other groups considered big data, and data from wearables and other sources.
There is a clear tension between realising the promise of these two rapidly advancing fields and the need to apply appropriate controls to ensure security and confidentiality. We also need to minimise the risk of harm from their over-application. If we are to make the most of Personalised Medicine in the NHS, we need to make sure that the needs of patients are front and centre, and that we don’t simply rush ahead, justifying our pace by believing we have their best interests at heart.
I approached the consumer end of the genomics market (23andMe) courtesy of a Christmas gift from my parents. I thought little of the implications of being tested. Even with nearly 25 years of medical training and practice I’ve been baffled by some of the results, and have left two locked and unviewed (my Parkinson’s and Alzheimer’s disease risk). I’ve had two patients share their data with me in confusion, and I haven’t been able to help much. This is a dangerous situation, as the scope for health anxiety, fear, and unnecessary investigation and cost is great. Equally, the gold standard work of NHS Geneticists is difficult to scale, so we’ll be left with a heterogeneous and inequitable situation where large scale gene sequencing may be restricted to those with money, leaving them to reap the benefits and harms, whilst those who might benefit the most either wait for access, or receive only a small portion of what might be possible.
I can’t deny that I’ve also personally seen benefit from my genetic data, though. I know now that I am a rapid metaboliser of Proton Pump Inhibitors (clinical grade antacids), which explains why Omeprazole has never settled my heartburn. My elevated risk of Type II Diabetes, only revealed to me when I ran my 23andMe data through a different analysis, was certainly on my mind when I committed to weight loss using the 5:2 diet in the past few months.
As you scale up the data sets, more is possible:
An instantly accessible and accurate national organ and tissue donation registry? No problem.
A fantastically powerful disease screening system, which can be upgraded at the touch of a button? Absolutely.
Can we have cognitive computers deal with all of this while we get on with the more important stuff like speaking to patients? Of course.
When linked to other databases, the possibilities become staggering. It’s certainly seductive stuff, but as Lord Adebowale asked: how do we ensure the benefits and harms are considered and equity is assured?
The other concern about Personalised Medicine, felt principally by providers such as me, is that it adds an extra layer of complexity to our already over-stuffed workload. With burnout in GPs at a record high, it’s going to be an exceptionally tough sell to convince my colleagues that we should be doing more in the limited time we have with patients. The history of innovation in healthcare is not really characterised by freeing up time, but rather by making space into which more work is crammed. I certainly feel quite distant from many of my patients, often like a production line worker, which is dissatisfying for all concerned.
Could it be that Personalised Medicine is different? Assuming a patient spends 10 minutes with their GP once a week, every week of the year (an exceptionally high consultation rate by any standard), that represents less than 0.1% of the patient’s time. It’s not as if their health and wellbeing is paused for the remaining 99.9% of the year. By handing back greater control and ownership of healthcare to patients and their carers, perhaps we can all get much greater use of that time, saving the valuable time in each other’s company for sharing expert knowledge, making decisions based on mutual understanding, exploring options creatively, and perhaps even having some time for empathy – something that Muir Gray feels is the one thing mobile phones cannot replace!
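The 0.1% figure is easy to verify with a back-of-envelope calculation:

```python
# 10 minutes of GP contact per week, as a fraction of a patient's total time.
MINUTES_PER_WEEK = 7 * 24 * 60  # 10,080 minutes in a week

consultation_minutes = 10  # per week -- an exceptionally high consultation rate
fraction = consultation_minutes / MINUTES_PER_WEEK

print(f"{fraction:.2%} of the patient's time")  # 0.10% of the patient's time
```

So even a weekly consulter spends roughly one minute in every thousand with their GP; the other 99.9% of their health happens elsewhere.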
The final panel drew together the discussions, with a summing up from Dan Gosling, from the NHS Personalised Medicine Core Team. He explained that this day was seen as the start of a conversation, one that will be followed up as we move forward.
And there it was. Someone from a dedicated NHS team, attending an event convened to explore this in a serious, democratic, and diverse way. This is why I have hope that the NHS is taking this seriously, and that we can genuinely expect to see some of the incredible promise of Personalised Medicine realised for each of us in the months and years to come.
My hopes were high for some earth-shattering VR announcements from Google yesterday. As I watched the keynote on YouTube and sat through a series of announcements that, on the face of it, were rather underwhelming, I started to feel a little less hopeful.
When the VR came, it was in the form of an announcement of a new name (‘Daydream’), reference specs for VR-ready handsets and headsets (‘coming this fall’), and a peek at the user interface, which looked interesting but somewhere south of what I can already get from the Samsung Gear VR. The inclusion of a Wiimote-style controller was interesting though, and my mind went immediately to the possibility of using this in physical therapy and stroke rehabilitation.
While the future of VR is, in my opinion, very bright and nausea-free, it was the remainder of the keynote that got my neurons firing with possibilities.
Earlier in the event, Google presented a rebranded messaging and video calling pairing in Allo and Duo. Allo I could take or leave – its AI-enhanced auto-reply seems well set to address all my dog-breed-identifying woes. It does, however, highlight the path to a more conversational, human-orientated AI interface, all of which strengthens Google’s core offering of a smart, intuitive natural language interface (more on that later).
Duo introduced something called ‘Knock Knock’ to video calling though, and this is where I saw some potential benefit. Essentially, it is a live video preview of the caller before you answer. This allows the recipient to know a little more about the person calling, presented as a way to gauge the mood of the caller.
Me? I saw a way of pre-triaging a patient before the call begins, without the patient needing to speak or enter data. We’ve seen video footage being used to determine pulse, respiratory rate, and even emotional state in other AI systems – could ‘Knock Knock’ even screen for facial weakness in acute stroke? Perhaps you could even analyse changes to speech against previous calls to determine subtle early changes to voice that can happen during ischaemic events. Whether this is unique to Google Duo or whether you could integrate this into existing WebRTC clients is another matter – the ‘smooth transition’ much emphasised by the presenter is less important in this situation.
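The pulse-from-video idea rests on remote photoplethysmography: each heartbeat causes a tiny periodic change in skin brightness, so averaging the green channel of the face region over time and finding the dominant frequency recovers the heart rate. The following is a minimal sketch of the principle using synthetic frames – an illustration only, not the pipeline of Duo, Kinect, or any real product:

```python
import numpy as np

def estimate_bpm(frames, fps):
    """Estimate pulse from video frames via the mean green-channel signal.

    frames: array of shape (n_frames, height, width, 3), RGB.
    Returns the dominant frequency in the plausible heart-rate band, in bpm.
    """
    signal = frames[:, :, :, 1].mean(axis=(1, 2))  # mean green value per frame
    signal = signal - signal.mean()                # remove the DC component
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    band = (freqs >= 0.7) & (freqs <= 3.0)         # 42-180 bpm, plausible pulses
    peak = freqs[band][np.argmax(spectrum[band])]
    return peak * 60.0

# Synthetic 10 s clip at 30 fps: a 1.2 Hz (72 bpm) 'flush' in the green channel
fps, seconds = 30, 10
t = np.arange(fps * seconds) / fps
rng = np.random.default_rng(0)
green = 128 + 3 * np.sin(2 * np.pi * 1.2 * t)
frames = np.zeros((len(t), 8, 8, 3))
frames[:, :, :, 1] = green[:, None, None] + rng.normal(0, 0.5, (len(t), 8, 8))

print(round(estimate_bpm(frames, fps)))  # 72
```

Real systems need face tracking, motion compensation, and careful handling of lighting changes, but the core signal processing is genuinely this simple – which is what makes the pre-triage idea plausible.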
Changes to Android OS (version N – my money is on ‘Nougat’) were also discussed, but in truth the most interesting announcement of the whole event was Google Home. After a nicely staged display of the advances in the natural language interface of Google Assistant, this device was revealed as the voice-activated front-end of home automation. There has been a slowly growing roster of Internet-of-Things (IoT) devices in the consumer market in recent years, such as Nest and Hive thermostats, Philips Hue lights, and the like. Google Home will provide voice search, internet services, and voice control of IoT home devices. Of course, Amazon got there first with Echo, but Google does have 17 years of weapons-grade search engine experience behind it.
The demo video shows the range of ways in which it could get the model family with 2.4 kids and a science project ready for their day – lovely, but much less interesting than the massive potential for health and social care.
A natural language interface could be enormously helpful in meeting the health & care needs of the older population:
It could be used to easily coordinate carers and update estimated time of arrival, reducing anxiety.
Food could be ordered from online grocers, reducing the need to employ carers for the simplest of care tasks (and thereby reducing cost and easing demand).
Medication reminders could be given in a friendly and simple to follow way, which could then feed back to the patient’s electronic record.
Voice control of home devices would be a gateway to increased use of IoT enabled lamps, heaters, and cooking equipment which would improve accessibility and safety, especially when it comes to family supporting their more dependent members.
In the event of an emergency, Google Home could be used to summon help if the user were unable to get up after a fall, and act as a speakerphone to the emergency services.
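The medication-reminder idea in particular is mostly plumbing once a voice device exposes a prompt-and-confirm hook. A deliberately simplified sketch of that feedback loop follows; everything here (`remind`, the record class, the `confirm` callback) is hypothetical and assumes no real Google Home or NHS API:

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class MedicationRecord:
    """Stand-in for the patient's electronic record (hypothetical)."""
    events: list = field(default_factory=list)

    def log(self, drug, taken):
        self.events.append({"drug": drug, "taken": taken,
                            "at": datetime.now().isoformat()})

def remind(record, drug, dose, confirm):
    """Speak a friendly reminder, then log the patient's spoken response.

    `confirm` stands in for the device's yes/no voice confirmation step.
    """
    prompt = f"Time for your {dose} of {drug}. Have you taken it?"
    taken = confirm(prompt)
    record.log(drug, taken)
    return taken

record = MedicationRecord()
# Simulate the patient answering 'yes' to the voice prompt
remind(record, "ramipril", "morning dose", confirm=lambda prompt: True)
print(record.events[0]["drug"], record.events[0]["taken"])  # ramipril True
```

The interesting part isn't the code, it's the closed loop: a missed confirmation could alert a carer, and a string of confirmations quietly reassures the family and the GP.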
The list of possibilities goes on and on, and multiplies with every connected service – something Google is very good at. Having been at the NHS Hack Day this last weekend, I’ve been struck by how little hacking appears to take place in the Social Care arena – perhaps later this year we could see events where Google Home (and Amazon Echo) are used to provide novel services at low cost?
Google Home can potentially help with something much more human though – the need for company. So many of my older patients live alone, with no-one visiting between carer appointments. This device opens the door to an easy, natural way of communicating with others, playing music, listening to audiobooks and the radio, but also interacting with someone that is always there, ready to talk, 24/7. It may start with traffic and weather reports, pre-canned jokes, and facts about astronomy from Wikipedia, but the ambition of Google and Amazon, powered by the exponential growth in the field of Artificial Intelligence, means that before long the AI in your home will not just be your butler and assistant, but also a friend and companion.
This is partly down to the fact I’m sleeping on a camp bed at my brother’s home with the sunrise peeking through the curtain, but mostly because my brain has already started fizzing with ideas and excitement ahead of my second ever NHS Hack Day.
I first went to NHS Hack Day in January 2015, when it was held in Cardiff. I’d been introduced to it through tweets with AnneMarie Cunningham (@amcunningham), GP and Primary Care Director at the Aneurin Bevan Health Board, who was organising the event. Sold as an opportunity to meet like-minded hackers and geeks, I spent a whirlwind 36 hours working on GWYB – a notification system for patients which triggered communication cascades on the event of their admission to hospital. We even won the Patient Prize for our efforts.
NHS Hack Day is a free to attend event that has been running across the country at weekends since 2012. In ‘Meeting the challenge’, they ask:
How can we build an environment where world-class NHS digital services flourish?
Through leadership that understands technology and is bold enough to modernise the delivery of digital services, including embracing openness.
To this end they sent out the call to all geeks that love the NHS and bring them together in the spirit of adventure, openness, and addiction to coffee.
I arrive at 8:30 at King’s College, London, and pitch in immediately with laying out the bottled water, coffee, tea and bin-bags. Extension cables are daisy-chained together and taped to the floor. I pop my Ricoh Theta S camera onto its tripod and start up Tweetbot in readiness.
By 9 it’s getting busy, and 15 minutes later we’re off at pace. It’s a speed that doesn’t really drop for the subsequent 33 hours. Everyone who has contributed to the Google Document of Pitches through the week is given 60 seconds to pitch to the assembled masses. Here’s my attempt:
Yes, that’s right: I’ve just asked a room of strangers to build a customised 360 video viewing app for Google Cardboard by the next afternoon. I’m nothing if not ambitious.
The pitches range widely, from medical dictionary and haematology data visualiser, to hospital bed finder, bleep replacement, and even personal pollution monitoring. I’m suddenly aware that there are lots of other teams I’d like to join.
10:30 am and I feel like a wall-flower at a speed-dating event.
Once you’ve pitched, you stand around the side of the room with a sheet of A0 paper with the project’s name on it. It starts slowly, but gradually the fact that I have a VR headset and I’m willing to share it attracts people. Several question the scale of what I’m trying to achieve, and as a result I realise that the project needs to change. With the help of some of the people who subsequently become the team, we decide to focus on using the tools at hand, and the skills we share, to explore using VR and 360 video to help treat Phantom Limb Pain (PLP).
At this point it’s probably important to give you a little more information. PLP is a common and distressing complication of amputation. Up to 70% of people who have had an amputation can experience pain, itching, burning or distortion of their missing limb. It’s difficult to treat with medication, and as such a number of psychological and alternative therapies have been developed.
One such treatment is MIRROR THERAPY. First described by Ramachandran in 1995, this uses mirrors to allow patients to view their injured limb as made whole again, using the reflection of the intact limb. This has been shown to reduce pain and distress, both during treatment and on an ongoing basis.
I have two patients with phantom limb pain, and even before coming to NHSHD I’d been wondering about using VR to help treat them. This weekend, it started to look like I might be able to make good on that.
More coffee and a time check. 11:30. We have 6.5 hours left of the day, then a further 6 hours tomorrow, to try and deliver something that will genuinely improve patient care.
My team comprises people with a huge range of different skills and backgrounds. Becky is a coding and digital media student from Brighton. Helen is a registered community nurse with a passion for tech and digital health. Musaddiq is a Java dev with geographic information system skills. Ali is a quantitative analyst. We also have Daniel and Charlotte, both software engineers. Some of the team stay for just day 1, and we’re joined on day 2 by Reno, who’s switched codes from the dark side of finance to join Team Digital Healthcare. It’s an eclectic and excellent bunch – you can meet them all on our site.
Given our target group, the plan is to explore using VR, 360 video, and the Gear VR headset to simulate mirror therapy in a low cost digital way. My hope is that we can develop practical methods of deploying this in a clinical setting and share our findings with the community at large. It also means we get to have fun playing with all the toys, whilst everyone gets a chance to contribute and learn something.
The team splits into three streams:
Charlotte and Daniel start on the website, which we will use to contain our work from the weekend.
Becky, Ali and Musaddiq immediately set to work on the hard coding challenge – looking at Virtual Reality and whether we can mirror a live 360 video stream from the Theta S camera.
Helen & I began collating the research evidence, and constructing a ‘treatment protocol’ from which we could create some simple 360 video footage to test with the team.
Such is the focus of a Hack Day that many of us didn’t really realise that the excellent lunch had been served until the back of the queue bumped into our table. This was despite the food being served right next to us. I guess this was the first proof of the distractive powers of Virtual Reality.
For the remainder of the day each stream worked away on its particular tasks. The website came together quickly and beautifully, built on a WordPress framework. Becky and Musaddiq heroically tackled two things at once:
3d modelling in Unity and then 3D Studio Max, developing some great point-of-view animations of leg therapy
Tests of live streamed 360 video using OBS and YouTube – this was sadly too slow, and there did not appear to be any open source mirror plugins.
Helen introduced me to Slack, a team collaboration tool that I dared to describe as yet another Social Network until I was sternly corrected. Using a technique shamelessly borrowed from the adult entertainment industry, I duct-taped the 360 camera and Gorillapod to my chest to record 5 short series of basic mirror therapy clips. You can see them all here, and watch them yourselves using any VR headset. What was immediately apparent was that by watching and copying the movements you could experience an eerie sensation that the hands you were seeing were, in fact, your own (which in my case they were).
By 6 o’clock the pub and Eurovision were calling, so we all departed.
Day 2. 5:30 am this time. Ukraine won.
Another glorious day, so with coffee in hand I took a few photos of the Embankment and set off to rejoin my slightly smaller team. The shrinkage was offset by the fact that overnight we had been contacted by Reno, who asked to join us. Expanding the reach of the Hack Day using social media is fantastic, and something I hope they facilitate in future. As it was, our hashtag started trending shortly into day one, which was announced by the unwelcome hijacking of our thread by a Russian dating agency.
By 9:30 everyone was up-to-date and the plan for the remaining 6 hours was in place. To up the pace and demonstrate the power of what we were doing to the team, we decided to utilise the ‘Cold Pressor’ test to see whether any of the content we had created could offset the pain of holding your hand in iced water.
The Cold Pressor test can be thought of as the bespectacled, serious cousin of the ice bucket challenge. It is used in research to help provide a controlled and safe painful stimulus. It has already been used, successfully, in demonstrating the efficacy of VR in reducing pain, so I felt it was justifiable to subject Becky, Reno and myself to a bit of light Sunday torture in the name of science.
Despite our rather crude efforts, what we found was quite startling. Becky & I recorded some point-of-view footage of ourselves with both hands inverted, our left arm in an empty bucket. The bucket was duly filled, and we were timed as to how long we could keep our left hand in the iced water.
Becky bowed out at 1 minute 30 seconds. I lasted even less, at 1 minute 10.
We were given a while to recover and then tried again using our personalised 360 video. What we found was that Becky increased the time she tolerated the pain to over 3 and a half minutes. I tried again and stopped at much the same time, with the feeling that I could have gone on if I wished. The sensory confusion of seeing both hands in the air versus the sensation of the left arm in water at near freezing clearly disrupted my perception of pain.
Reno stepped in next to experience the power of VR to distract patients from painful stimuli. Watching ‘Kurios’ by Cirque du Soleil, he breezed through nearly 10 minutes of laboratory-standard agony, smiling much of the time. Having checked his biography, I now see that he is an ultra-runner. This doesn’t diminish his achievement, but it explains the smile.
Next came the crunch. I’d cunningly ensured that three of the team had frozen typing hands, so we awkwardly wrote up our findings, with Becky and Ali also finding the time to crack the problem of mirroring 360 footage in a simple and effective manner. It is this last development that will really help clinicians in creating effective personalised 360 mirror content for patients, and it will form the basis of the next steps I take with my own patients.
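The notes don’t record the exact tooling Becky and Ali used, but the core transform behind mirroring 360 footage is worth sketching: for video stored in the usual equirectangular projection, a plain left-to-right flip of each frame mirrors the entire scene, so a recorded right hand is seen as a left hand. A minimal illustration in Python with NumPy – the function name is mine, and real footage would of course be processed frame by frame from the video file:

```python
import numpy as np

def mirror_equirectangular(frame: np.ndarray) -> np.ndarray:
    """Mirror an equirectangular 360 frame left-to-right.

    Reversing the horizontal (longitude) axis of the full
    equirectangular image reflects the whole sphere, swapping
    left and right in every viewing direction - the transform
    mirror therapy needs. `frame` is an H x W x 3 pixel array.
    """
    return frame[:, ::-1, :]

# Toy 1x4 'frame' with a single red pixel at the far left.
frame = np.zeros((1, 4, 3), dtype=np.uint8)
frame[0, 0] = [255, 0, 0]

mirrored = mirror_equirectangular(frame)
print(mirrored[0, 3])  # the red pixel now sits at the far right
```

In practice a tool such as FFmpeg or OpenCV would apply the same flip to every frame of the clip, but the per-frame operation really is just this single axis reversal.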
3:30 arrived, and the final presentations in front of the judges began. With a brutally marshalled 3 minutes, each team spoke of what they had achieved in the last day and a half, before being grilled by the panel and audience.
We saw a great variety of differing presentations, but what tied them together was the incredible progress everyone had made, and the amazing creativity and skill that had been used in producing extremely polished applications that were, in many cases, ready to use. I was particularly impressed by ‘Outbreak’, a disease-outbreak management system in a box that used Raspberry Pis and tablets to create a pop-up field network. I wasn’t the only one: they took home the star prize. Very well deserved.
So what about Virtual Analgesia? Well, I’m delighted to report that we won a ‘Highly Commended’ prize from panel judge Alan Thomas (@alanroygbiv) for our work on Patient Inclusion. Having had the idea come from patient needs, it was high praise indeed to have this recognised.
6 pm and it was all over, bar the wrestling over the goodies and dividing up the remaining bottled water. I’d been part of 36 hours of intense team work and creativity, and joined a group of new friends and colleagues. Most importantly we had a new tool that clinicians can consider using in managing Phantom Limb Pain. In the coming weeks I hope to share this work with my two patients and see whether they’d like to try this approach. Using VR in this way means that when they wake at 3 in the morning they’ll have something new to try to control the burning pain in the foot that’s no longer there.
This post is not a speech, so I won’t go into detail about how thankful I am for the help I had from my team – I’m banking on the fact that they know this already.
What this post must be is a loud celebration of the amazing work of the NHS Hack Day group, and most of all about the incredible reservoir of passion and talent in the developers, students, clinicians and patients of this country. The challenge of rising demand and shrinking funding of healthcare is not unique to the UK, but we have a National Health Service – free at the point of delivery, with care provided based on need, not the ability to pay. The NHS Hack Days demonstrate that it isn’t just the nurses and doctors that are committed to supporting this unique and precious institution, and that we don’t go into the fight unarmed – there’s an army of geeks out there, and they have some incredible tech to share.
To find out more about the next NHS Hack Day, visit their website www.nhshackday.com or follow them on Twitter @nhshackday – they really are amazing events, and they welcome everyone with a passion for healthcare.
All the notes from my team ‘Virtual Analgesia’ are available on www.virtualanalgesia.net. We’d love to hear from you with any feedback or comments. You can join the discussion on Facebook at ‘VR Doctors’ – just apply to join.
Declaration of Interests
I attended this event in my own time and at my own expense. The hardware and software used was all either open source or owned and operated by the participating team members.
There is something uniquely wonderful about attending Digital Health events.
The mixture of hope, cutting edge technology, and the energy and enthusiasm of the start-up scene vying against the giants of biotechnology makes for invigorating stuff. In my experience, it’s something that is best done in the gentle heat of the West Coast of the United States, as you’ll perceive no change in the atmosphere as you leave the auditorium and go for a coffee on the high streets of Palo Alto. My professional development plan for each year has ‘visit California for med tech event’ as a recurring entry, filed under the ‘recharge soul’ category, for exactly this reason.
So when Felix Jackson, founder of MedDigital & DefinitiveDx, suggested that I consider attending WIRED Health and stick a little closer to home, I have to confess that I was sceptical. How could central London in springtime compete with Silicon Valley?
Remarkably well, as it turns out. WIRED Health 2016, held at the Royal College of GPs in London, was an impressively curated event, with a fascinating and wide-ranging programme that took the attendee from Global Pandemics through Ingestible Perfumes to Longevity Research and AI. It shared some DNA with its American cousins, though, in being a little light on scaled practical application. This blog focuses on the speakers and main content, with my next post looking at the exhibitors’ floor and some of the off-stage conversations.
The day started with Jim O’Neill, Commercial Secretary to the Treasury (WIRED), and Jeremy Farrar (WIRED), talking about the THREAT OF EPIDEMICS. Anyone who has spent time involved in pandemic flu preparation, or who has watched ‘Outbreak’ or ‘Planet of the Apes’, will be aware of the impact of an increasingly connected world. Combine this with the rise of multi-drug-resistant superbugs and you have a powerful cocktail that, if left unchecked, could be killing 10 million people every year by 2050. Jim O’Neill has headed up the UK Independent Review on Antibiotic Resistance, which publishes shortly. Within it, a 10-point plan underpinning the creation of a global surveillance system for Antimicrobial Resistance (AMR) will be laid out.
The need to upgrade our surveillance systems was supported in Jeremy Farrar’s entertaining talk which reviewed his career in infectious disease. Recent outbreaks such as Ebola and SARS demonstrate the speed with which local outbreaks can become global concerns.
AMR is something I am seeing more of in my own practice in the UK, with the rise of antibiotic-resistant gonorrhoea a particular concern locally. Improved technology supporting surveillance, diagnostics, and prescribing, alongside rigorous public health measures, is surely the key – which makes falling funding in this area a real worry.
Next, we were onto more traditional digital tech territory – UNLOCKING THE BRAIN. We began with John Donoghue from the Wyss Center (WIRED), who took the audience through non-drug approaches to treating brain disorders. We were introduced to BrainGate, one of the incredible advances in direct machine/brain interfaces. The last decade has seen users go from moving a cursor on a screen, to Kathy, disabled by a stroke, operating a robot arm to bring a drink to her lips.
Gero Miesenböck adopted the persona of “stern teutonic scientist bent on world domination” as he explained his invention, Optogenetics, to the audience (WIRED). This allows scientists to engineer neurons to respond to light, thereby acting as a tool to determine causes, identify connections, and clarify the mechanisms behind neurophysiology. Progress in sleep science illustrated the power of this transformational tool.
Finally, Vincent Walsh (WIRED) from UCL grounded the discussion with a plain-speaking criticism of the limits of technology and a call to stay true to the principles of good science. His second theme – that success and peak performance are the reward for hard work and persistence – was well received, though I suspect this was down to playing to the egos of the attendees.
I’ve previously been disappointed by the NHS representation at events like these, but today I was gratified to see Dr Mahiben Maruthappu, co-founder of the NHS Innovation Accelerator (WIRED), take the stage to lay out how the NHS hopes to move ‘from the stone age to the digital age’ in five years. Covering familiar ground like the Five Year Forward View, it felt at times a little like a game of ‘Digital Health’ bingo. That said, I was disappointed not to hear about any work in Virtual Reality or Augmented Reality, and once again Primary Care was very much an afterthought. Whilst some of the detail of the plan was given, the fact remains that innovators still find it very difficult to get into the NHS. This clearly has to change.
Having been distracted by the exhibitors’ hall and a free neck massage, I was sadly late for the start of the third section, REDESIGNING MEDICINE, and missed Anna Young from MakerHealth (WIRED) and Alejandro Madrigal from the Anthony Nolan Research Institute (WIRED). I arrived to the wildly imaginative work of Lucy McRae, artist and futurologist (WIRED). It’s difficult to summarise exactly what her work entails, but her presentation ranged from ingestible perfume, through vacuum-packing spa clients in mylar, to exploring the nature of isolation in deep space travellers.
Her message to organisations to facilitate the creative and artistic in their workforce struck a particular chord with me, given the increasingly protocolised nature of medical practice for many doctors.
The remainder of this section dwelt on makers and designers. Geraldine Hamilton from Emulate (WIRED) enthusiastically took us through the award-winning design of her ‘Organ-on-a-chip’ technology. The beautifully simple design has already led to the successful replication of lung, blood vessel, skin, intestine, brain, and liver systems. I couldn’t help but wonder whether this could be combined with optogenetics, or even the revolutionary magnetic tech used by Medisieve.
José Gómez Márquez of Little Devices @ MIT (WIRED) made a well-argued plea for medical devices to break free of their black-box form and allow a new generation of hackers, makers and tinkerers to advance healthcare technology. Finally, Andrew Dawood & Josh Stephenson of Dawood & Tanner (WIRED) demonstrated the practical functionality and scope of 3D printing in healthcare. Contrasted with some of the ‘bigger’ topics, this section was one of the highlights through its focus on the practical application of creativity in digital healthcare.
Our post-prandial topic was HEALTHCARE ONLINE, featuring Jen Hyatt of Big White Wall (WIRED) and Shafi Ahmed of Medical Realities (WIRED). Jen Hyatt is the founder of Big White Wall, an online community where patients with mental health issues can anonymously seek support from peers and therapists. This has been successfully delivered in a number of NHS areas, and is even something that I have recommended to my own patients. The fact that it is not funded in my locality illustrates one of the recurring problems within the NHS – innovation struggles to gain a foothold, and when it does, the pace of adoption is slow.
No such problems with Shafi, the swashbuckling surgeon who recently live-streamed an operation in 360 for the first time (#VRinOR: I have blogged separately about this momentous event). Using advances in VR, he has demonstrated how we can deliver world-class surgical training globally and help address the estimated 5 billion people across the world who cannot access safe surgery.
Democratising access to the best possible medical care was also a theme of Kyu Rhee‘s talk about THE AI DOCTOR (WIRED). Using the analogy of the doctor’s stethoscope, he argued that we are moving towards a time where Clinical AI is as ubiquitous as the totemic set of tubes around the neck of the clinician. There is little argument against the extraordinary power of cognitive computers to collate and make sense of the vast data sets found in healthcare and research, but work needs to be done to ensure that the data that feeds these engines is accurate and comprehensive, and that we don’t leave the consultation models behind in the rush to digitise care.
As we entered the final stretch, WIRED Health wheeled in the big guns to discuss GENOMICS. Fresh from a mad dash from his shower, Craig Venter was his usual brilliant and mind-blowing self during a talk delivered by video link (WIRED). Having heard him speak quite convincingly about sending DNA ‘teleporters’ to Mars at his Exponential Medicine talk of 2014, I was encouraged to hear a more practical tour of his Human Longevity Institute. We are now at the point of the sub-$1,000 hi-res genome, and when this is combined with advances in differential MRI imaging and massive computing power with access to enormous data sets, the predictive possibilities are staggering. Dr Venter showed us how facial appearance, and even the sound of the voice, can be predicted from gene data. At $25,000 a visit it’s not cheap, but with a mission statement that commits to solving the diseases of ageing, the challenges are appreciable. Therein lies my main concern about longevity research – it still appears to be the preserve of the wealthy, with little in the way of trickle-down to the population at large. I’d like to see some of the evangelists for this technology start to explore this a bit more.
We were back to more eye-popping statistics as Ye Yin, Chief Executive of BGI Genomics, took to the stage (WIRED). From the incredibly large – 100 trillion cells in the human body – to the small – only a 4% difference in genes between us and chimpanzees – the incredible power of DNA to determine our fate was explained. BGI Genomics played a key role in the sequencing of SARS, and now looks to develop large-scale public efforts to help prevent disease, such as pre-natal testing for Down’s Syndrome.
Meanwhile, in Glasgow, Jo Montford and her team at the Institute of Cardiovascular and Medical Science, University of Glasgow, are growing red blood cells (WIRED). Using pluripotent stem cells and bioreactors, and fuelled by the enormous global need for blood, they have scaled from 100,000 red blood cells in 2008 to ten billion cells per year in 2014. Given that this is still only 8.8 litres of blood, there is some way to go, but the power to create group-specific blood on a short timescale is a fantastic step forward in regenerative medicine. Apparently they are looking for a name for these new cells: might I suggest Cupriscytes or Novoerythrocytes?
As the day drew to a close, and stomachs started to rumble, we moved on to FOOD AS MEDICINE. Dr Molly Maloof (WIRED) kicked off with the stark facts: much chronic disease is preventable, and food is one of the key factors in prevention. Practising what she calls ‘culinary medicine’, she combines conventional evidence-based medicine, social media, telemedicine, meal delivery kits and apps, blended with a dash of genomics, microbiomics and micronutrient analysis, to deliver a personalised formulation for her patients. Again, I struggled to imagine some of my less activated patients making full use of this, but time will tell whether it trickles down into wider practice.
Talking of microbes, our final speaker was Tim Spector (WIRED), Professor of Genetic Epidemiology at King’s College London, who challenged much of the established dogma surrounding diet. From the harms of high-fat diets, through calorie counting, to the generic approach to consuming healthy foods, experience and evidence have shown that we need to move towards a more personalised approach to dietary science. Combined with advances in gut microbiology research, the important role of pre- and probiotics in maintaining health and wellness, and in combating disease, is clearer now than ever before. This has a personal relevance for me as I suffer from non-alcoholic fatty liver disease (NAFLD), and in a separate conversation after his talk Tim explained how even here gut microbes have a role in metabolising bile acids which, in turn, are pro-inflammatory to the liver. I may get some benefit out of my uBiome analysis after all.
With that, it was off to the reception and further discussions about what we’d heard and what happens next. WIRED Health 2016 absolutely delivered on the content, with as eclectic and expert an audience as one could hope for. This really was west coast thinking delivered in the heart of London. I left energised and full of thoughts about how to convert some of the high-level and cutting-edge science into practical benefit for my patients and colleagues. We’d heard about the commitments the NHS is making to stepping into the 21st century, but there is still a long way to go. We need to move beyond simply listing exciting technologies, and start to make the NHS much more open to innovation from within and without. Everyone who has a passion for health and care improvement has their part to play in this.
DECLARATION OF INTERESTS
My employment status and conflicts are given in the ‘About Me’ section of this website. I attended this event in a personal capacity and not representing my employers. I paid for all expenses myself.
Getting up in the dark at five thirty in the morning for work is never fun, particularly when I’m working a weekend shift. However, on Saturday the 16th of April I was awake before the alarm, and I set off with a spring in my step, because on this particular occasion I was heading off to London to attend one of the Royal Society of Medicine’s regular Innovation summits, courtesy of MedTech Campus.
The RSM’s stated aims are to:
– provide a broad range of educational activities and opportunities for doctors, dentists and veterinary surgeons, including students, and for allied healthcare professionals
– promote an exchange of information and ideas on the science, practice and organisation of medicine, both within the health professions and with responsible and informed public opinion
With a varied year-round programme of events, lectures, and workshops, set in a stunning building bang in the centre of London, it’s not the first time I’ve visited this august institution. It’s also not the first time I’ve seriously considered joining them as a member, a feeling which was undiminished when I left a few hours later. So, bright-eyed and optimally caffeinated, I took my seat.
We started the day talking about sleep, with a polished presentation from Dr Sophie Bostock, Operations Lead at Big Health and a self-professed ‘Sleep Evangelist’. The audience were introduced to Sleepio (@sleepio), a digital sleep improvement programme – a popular, evidence-based approach to the common problems of insomnia and poor sleep, built on Cognitive Behavioural Therapy. Dr Bostock laid out the harm that comes from sleep disruption, and reported that Sleepio users fell asleep 54% faster, woke through the night 62% less, and gained a daytime energy and concentration boost of 58%. I can personally attest to the debilitating effects of sleep deprivation and shift working on my performance and concentration; it is one of the principal concerns I have about the new Junior Doctors’ contract, which is currently causing so much anger over its imposition in England.
Insomnia is one of the most common presenting or associated problems I see in my practice, and standard approaches using hypnotic medication are fraught with the dangers of addiction, over-sedation and tolerance. I have had two patients use Sleepio to great effect, and would love to see it adopted at scale in the NHS. Dr Bostock made the familiar call for health commissioners to be more accepting and open-minded towards digital medicines; Big Health’s commitment to establishing an evidence base, and their dialogue with clinicians like me, go a long way to help here.
I’ve always had an affinity for hands-on medicine, hence my love for urgent primary care. Listening to South African doctor William Mapham describe his history of single-handedly gassing, cutting and resuscitating during rural caesarean sections, I was struck by how performing medicine in the most difficult of circumstances presents hard practical problems that require creative solutions, often with a much greater appetite for risk than we might be comfortable with in the ‘developed’ world. Dr Mapham’s main interest is ophthalmology – 80% of all blindness is preventable or curable, and he has seen first-hand the transformative power of simple surgical interventions. He observed that primary healthcare workers often lacked access to appropriate information, skills, and basic diagnostic tools.
The Vula Eye Health mobile app (@vulamobile) was born from this – a clinical case discussion and information tool that allows these workers, often in remote areas with the slimmest of network connections to the world, to access expert information, carry out eye tests, discuss cases with specialists, and make referrals. He even managed something harder: a live demonstration. I sought him out in the interval, as this app is something that I’ll be able to make use of almost immediately in my own work. It’s a free download on iOS and Android.
The summit did not disappoint when it came to the absolute cutting edge of technology. George Frodsham introduced the audience to Medisieve (@Medisieve), a ground-breaking drug-free malaria treatment that uses the principle of magnetism to dialyse and physically separate parasite-infected cells from the bloodstream, exploiting the naturally occurring paramagnetic properties of the infected cells. With the concept proved, it is heading towards human trials by the end of 2016. This was exciting for a number of reasons: it is a novel, drug-free therapy that not only treats a disease responsible for 200 million cases and 600,000 deaths annually, but could also be developed with engineered magnetic nanoparticles to treat a host of other illnesses, including other infections and cancers. I was particularly delighted to hear George tell an audience member that their organisation had provided the funds that made the very first trial device possible.
We also had an introduction to genome editing with CRISPR/Cas9, TALENs and the intriguingly titled ‘Zinc Finger’ technologies from Katrine Bosley (@ksbosley), President and CEO of Editas. The tools developed by Editas and other biotechnology companies look set to transform the power of medicine and treat diseases that have until now been untouchable. Cementing the promise with real-world examples, Professors Waseem Qasim and Paul Veys from Great Ormond Street Hospital presented the cases of Layla and Harriet, two children with untreatable relapsed acute lymphoblastic leukaemia who have effectively been cured by gene-edited immune cells. This brief paragraph scarcely does justice to the achievement and incredible possibilities of this maturing therapeutic modality.
And then there were the robots – Professor Guang-Zhong Yang of the Imperial College Hamlyn Centre (@ICLHamlynRobots) gave a presentation that moved from the history of robotic surgery, through current applications, to a glimpse into the future of the field. As with all exponential fields, the pace of change in the past few years has been staggering. With ever smaller, flexible robotic instruments, in-vivo mass spectrometry and optical microscopic tools that allow cellular-level surgical work, live biopsy, and even augmented reality visual aids and surgical ‘no-go’ zones to assist the surgeon, Professor Yang painted a picture in which the surgical robot ‘disappears’, delivering superhuman abilities to operators alongside Artificial Intelligence assistance.
Further surgical demonstrations came from Professor Paulo Stanga (@mvr_lab) and Dr Sallusto. Professor Stanga is a Consultant Ophthalmologist and Vitreoretinal Surgeon at Manchester Royal Eye Hospital. He gave a presentation on the Argus II epiretinal prosthesis, aka ‘The Bionic Eye’. This has returned functional vision to a number of patients with Retinitis Pigmentosa, a rare condition that was previously untreatable. He is now moving on to Age-Related Macular Degeneration (ARMD), a far more common problem predicted to affect nearly 200 million people globally by 2020.
Dr Sallusto talked about another world first – a robotic, transvaginal kidney donation and transplant between two sisters. A technically difficult procedure, this landmark of natural orifice robotic surgery built on the efforts of many surgeons over the past few years. Incredible stuff, but the video of this, and of the eye surgery in the post-lunch session, was a little hard to take, with audible groans around the lecture theatre as Professor Stanga incised his patient’s eyeball.
Against all of the high technology and promise of digital healthcare and biotechnology, the presentation that most affected me was that of Lauren Braun (@laurenrbraun), founder of the Alma Sana Project. Lauren spoke about her experience as a summer intern working in a vaccine clinic in Cusco, Peru. Here, she found that the poverty and low literacy of mothers led to late or missed vaccinations in their children, which could have devastating consequences. Vaccine-preventable diseases account for 20% of childhood deaths globally, with 18-22 million children vaccinated late. All of this inspired her to develop an innovative, life-saving bracelet which the child wears as a personal, physical vaccination record reminder.
Her talk took us through the challenges of bringing a simple yet powerful idea to practical reality, and hammered the benefit home with first hand accounts of the difference this has made.
I’m sad to say that I couldn’t stay for the whole day, so missed the final quarter and some fascinating sounding talks, but I am assured all will be available online at the Royal Society of Medicine website in the near future.
As I sat on my train home, reviewing my notes and collecting my thoughts, I was struck by the incredible diversity and scope of innovation that was showcased by the RSM team. From the incredibly hi-tech of Professor Yang’s surgical robots and Professor Stanga’s bionic eyes, through the revolutionary use of gene-editing and magnetic medicine to bring hope to those that had previously been beyond assistance, to the realisation of the promise of mHealth with Sleepio and Vula, and finally to the life-changing difference that one intern can make to the children of the world with a simple bracelet. It may be tough to get up early, but when you do, you are sometimes rewarded with the most incredible sunrise.
All of the presentations and content will be available shortly at the Royal Society of Medicine’s website. You can also view their film ‘Doctors of the Future’ from the 18th of April.
I was in the office, hiding from the rest of the workers and taking a quiet moment in the chief executive’s swivel chair to watch history being made. My father rang from his car to tell me that he’d heard them mention it on BBC Radio 5. Across the globe, people tweeted images of themselves sitting in lecture theatres, standing on the side of the road, and staring slack-jawed at their laptops as Mr Shafi Ahmed, (@ShafiAhmed5) Consultant Colorectal Surgeon at the Royal London Hospital, broke new ground with the world’s first live-streamed operation in Virtual Reality.
I had the pleasure of meeting Shafi back at the Wearable show in March, 2014, where he was exhibiting VR & Augmented Reality (AR) healthcare applications with Amplified Robot. Since then I’ve met him on a few occasions, and each time I’ve left with a mind expanded several sizes larger than when I arrived. He’s a true believer in the transformative and democratic power of Digital Healthcare, particularly when it comes to medical education and reaching out to the less well served parts of the globe.
Shafi is a man for firsts, having also been the first to use Google Glass to live-stream an operation from a first-person point of view (joining other innovators such as Dr Rafael Grossman (@zjgr), a US surgeon who first transmitted live surgical footage in this way back in June 2013). His office is a treasure trove of medical gadgetry. He’s even been known to sweep in for his lectures on a hoverboard.
The event itself had been advertised for the last few weeks, slowly gaining media attention. The Virtual Surgeon platform, developed by Medical Realities in association with Mativision and Amplified Robot, was available as a free download for Google Cardboard, Samsung Gear VR, and Oculus Rift. You could even watch it in a browser, should the thought of viewing the procedure first hand in VR feel too overwhelming.
The clock rolled past the start time as I sat alone in the office finishing off the remains of my cup of tea. Nothing happened. I clicked the logo several times, to be treated to the soundless preview footage on a loop. Things weren’t looking good, and somewhere I imagined a pall of smoke rising above an overheating webserver. I took to Twitter, but was pleased to see that more mundane, real-world and (let’s face it) more important concerns were delaying things slightly – the patient was being made ready.
All of a sudden, it started. I was there, in the operating theatre, and I was transfixed as Mr Ahmed began the laparoscopic procedure, calmly talking us through the steps he was taking to first reduce a hernia, before proceeding to the tumour resection. The stream was a little grainy, much like the other streamed VR footage I have seen, but the audio was clear and the sense of immersion was striking. On both sides of the operating table large screens relayed the operating field, and around Shafi the team worked smoothly as the well-drilled unit they clearly are. Thinking back to my time as a medical student and junior doctor, the only thing that was missing was an anaesthetist quietly completing the Telegraph crossword while the lead surgeon screamed at me for not knowing the branches of the mesenteric artery. Oh, and that smell.
All too soon the real world intruded on the virtual, and I had to return to work. I took the opportunity to introduce some co-workers to VR in the most immediate and graphic way, and I was done.
It’s now a few hours after the broadcast and I’ve had time to think about what I saw and what it meant. I’d love to see the stream in a higher resolution, but this is simply a matter of time and product development. Likewise, I’d have loved some binaural audio. It would have been great to have the laparoscope feed overlaid to the side so I could have had a better view. I’d also have loved it if the assisting surgeon’s hand hadn’t loomed enormously into view as the patient was repositioned, but this was primarily about the patient, so I’ll forgive him that.
I can wax hyperbolic about the possibilities for making surgical training globally accessible, but the most incredible thing about today was that here was a currently available technology, delivered cheaply to a huge global audience, by a bold innovator with a passion for education and adventure. That Daniel Kraft (@daniel_kraft), Medical & Neuroscience chair of the Singularity University, had also made the trip to watch this live, while a news crew stood filming from a safe distance, showed the significance of today’s event. We had the rock star, the press, and leading the show with a tremendous vision of the future was our own Tony Stark: Mr Shafi Ahmed.
Last night I smashed a light bulb while fighting Space Pirates, and teared up as I traced a curve of scarlet flame against the backdrop of the Milky Way. I held my breath as a Blue Whale stared into my eyes from a few feet away. This morning, my shoulders ache from a session defending my castle gate from the marauding hordes with a bow and flaming arrow. All of these things were made possible by one of the most satisfyingly immersive pieces of technology I’ve ever had the fortune to experience: The HTC Vive.
The Vive is HTC & Valve’s foray into the nascent home VR market. Whilst Google Cardboard and Samsung Gear VR deliver great experiences at low or even no cost, you’ll need to drop quite a wedge on getting Vive up and running – a rocket-fueled PC with near-sentient graphics card will take you to about £1000, and the Vive itself costs £750-ish (including delivery). Even if you manage to get your hands on all of this, you’re also going to need a fair amount of space in your home (with decent head-room: more on this later).
Thanks to Sam Watts (@VR_Sam, Producer at Tammeka Games) I was able to skip those requirements and have a turn on his setup, which he had managed to configure in his spare room. Turns out that you don’t really need as much space as you’d think – I was standing in a floor space of about 2m x 3m. Two light stations (wired) are placed at opposite corners of the room, which allow the system to map the environment and the position of the user and the handsets in that space.
Once the headset went on, I was transported for 90 minutes to some incredible, vivid, and utterly convincing other places. When you combine the ultra-low-latency headset with a wide field of view, headphones, and (crucially) two innovative handsets that let you interact with the world you inhabit, the illusion is complete.
Through the Steam Dashboard I sampled a number of experiences:
theBlu: a short demo where you find yourself standing underwater on the prow of a sunken ship. Shoals of fish surround you and scatter at the wave of your hand. Manta rays glide past. Then, a BLOODY ENORMOUS WHALE TURNS UP AND BLOWS YOUR MIND.
Job Simulator: I was a cubicle drone in a large open-plan office. Ostensibly a recreation of turn-of-the-millennium working practices, it’s a hilarious satire of office work that allows you to throw donuts, photocopy coffee cups, and electrocute yourself. All in the first 5 minutes.
The Lab: I was secretly hoping for Half-Life 3, or Portal 3, but instead I got ‘face mounted portal spheres’ – basically a variety of mini-games. I’m criminally underselling it by describing it this way though. Just being back in Aperture Science is worth the entry price. I won’t spoil the surprises, but this is where the burning arrows came in. And personality cubes. God, I *cannot wait* to play HL3 or P3 in this.
Space Pirate Trainer: Dual-wielding laser pistols, and dancing like only a 6′5″, 43-year-old man can, I took down wave after wave of aggressive Space Pirates, and then smashed Sam’s light. The weird thing? I didn’t really notice I’d done it.
Tilt Brush: Sam had saved the best to last. It’s pretty simple to describe: it is a painting tool that lets you draw using ink, ribbons, flames, sparkles or paint splatters on a fully three-dimensional canvas. You can move around and gaze freely at whatever you create, and take animated GIFs and snapshots to share with the world. Of all the experiences of the evening, this was the most magical and perhaps the most revealing, because it shows how incredibly intuitive being in VR is. Anyone who has ever spelled their name in the night air with a sparkler knows how to draw with Tilt Brush, but no one could have imagined how affecting it might be to see that trace persist long enough to walk around. Sam said a friend’s father, himself an artist, pretty much refused to take the headset off. I completely understand.
And then it was done. I wanted to capture my reactions quickly and share them, so I’ll come back to some of my thoughts on health and social care applications of Vive specifically. For now though, I wanted to let you know that VR is here and it works. It works so much better than you might have hoped for, and most excitingly, it’s only the beginning.
Virtual Reality (VR) is a technology older than television, and something I recall very well from the heady, Neuromancer days of the early ’90s. With the launch of commercial products from HTC and Oculus, and Samsung building on the early successes of its Gear VR headset, more people than ever can get their heads into another digital realm and experience first-hand what they could only haltingly and nauseatingly experience in the ‘Dactyl Nightmare’ days. I was pretty keen to put that behind me and see what was new in this field.
With that in mind, on a wet Thursday night on 24th March 2016, I attended the 4th VRLO (Virtual Reality London) meet-up at the Amba Hotel at Marble Arch, London. Hosted by VR & MR production company Rewind, the event is billed as:
(a) regular hands-on social event for professionals who are curious to see what impact virtual reality and applications will have on every aspect of our lives. Get early access to the latest developer kits and applications, immerse yourself in cutting edge applications and network with the people at the forefront of this new medium.
The event was split between two rooms: the first contained all of the exhibitors and hands-on demos, and was the place I spent the entire evening. A second room hosted the presentations, although to be honest most people looked like they were there for the toys and the networking.
My interest in VR, AR (Augmented Reality) and 360 video and audio comes from a place of personal interest as well as a fascination with what these new technologies can offer to health and social care. So armed, I met with the exhibitors, tested their gear, and chatted about MedTech. In the process I discovered a hidden passion for VR Healthcare, the beginnings of practical applications for patients and clinicians, and a rather worrying disregard for basic infection control.
The exhibitors were big and small. The biggest, Sony Playstation and Samsung, were there in full effect, although the Sony VR equipment was noticeably absent. As a result, so were visitors to their stand.
Samsung, on the other hand, were demonstrating their new wireless 360 Camera, the Gear 360.
Announced at Mobile World Congress 2016, this tiny orb houses two wide-angle lenses and a cute tripod, and can record stills and video in high definition. So far, so ‘Ricoh Theta S’. It did exceed my current 360 camera in a number of departments, though: it is splash- and dirt-proof, and it live-streams beautifully, which will play a huge part in the coming growth of 360 media. You only have to look at UK surgeon Mr Shafi Ahmed’s exciting world-first 360 live broadcast of an operation on 14th April 2016 to see the potential for education in healthcare and surgical training. Within primary care, documenting practical procedures and studying doctor–patient interaction immediately spring to mind, as does the ability to rapidly record a person’s home and living space to allow remote occupational therapy and monitoring of social care provision.
There were many app and content developers present, showing work they had done demonstrating the potential to companies and clients, producing training material, and wrapping this content in branded Google Cardboard hardware. If you haven’t had a chance to dip into VR yet, Cardboard is certainly the cheapest way to do so: you can buy a headset for less than £10 and strap your smartphone into it to get a taste of the other side. Spend a little more and you can get a great, comfortable system such as the FreeflyVR (my current preference for Google Cardboard work).
I’ve taken these entry-level headsets to two clinical environments thus far (my surgery and my dentist) to see how patients and professionals fare, and what their feedback might be. The current generation of hardware is a little bulky, especially for dental work, and needs to slim down or risk getting in the way. The optics are fairly basic, which limits the audience somewhat to those with standard-sized heads and a narrow range of visual acuities. You can wear glasses with some of these headsets, but I’ve yet to find one that makes this anything other than an uncomfortable workaround.
Infection control is, in my opinion, the biggest unaddressed issue. Most headsets have soft foam padding which would be a nightmare to clean, and the headsets themselves can be quite intricate, harbouring bugs in all of the nooks and crannies. Watching person after person line up to pop headset and headphones on in a crowded, sweaty room, having just finished a shift seeing record levels of upper respiratory infections and scarlet fever in my surgery, made me a little tense. Work needs to be done on basic protocols to ensure the next big VR event doesn’t turn into a cruise-liner-style outbreak.
One team had their eyes firmly on this area, however: the impressive Kickstarter-funded OPTOVR (@OptoVR). I had a great chat with co-founders Richard Stephens and Tom Jarvis, who took the time to talk through the development story of what they claim is the world’s first portable VR headset with integrated headphones. What interested me was its beautiful, clean look and feel, and the fact that, being made from closed-cell foam, it can be readily cleaned (closed-cell foam is the kind of material used in Croc shoes, so loved by surgeons the world over). Add in the lightweight hardware and beautifully integrated sound system, and I see this as the first VR headset I would consider using in a live clinical environment. Definitely one to watch; you can help fund them on Kickstarter and even attend their launch on 30th March 2016 at Somerset House in London.
Another exhibitor was CURISCOPE – ‘Education adventures in VR & AR’. Ed Barton (@ed_barton), their founder and CEO, was demonstrating a remarkable T-shirt which allows people to gaze into the chest and abdomen of the wearer and see their internal organs in glorious technicolour. The educational possibilities are obvious, but I wonder whether you could also use this approach to help patients better understand their own bodies and anatomy, and the effects of disease and lifestyle on their own health. Imagine showing a young smoker their lungs age and blacken before their eyes! Powerful stuff. This idea is by way of revenge, as Ed had previously scared the life out of me by showing me the ‘Great White Shark’ video in 360. As a diver, I found it particularly terrifying, and a spectacular demonstration of the immersive effects of VR. You can try it yourself using Gear VR on the Oculus store using VRIDEO. Ed is also Kickstarting his Virtuali-Tee and app.
My final stop before leaving was AltspaceVR. Michael Salmon, from NBCUniversal, took me through a demo of this multi-platform social space, running on Gear VR and Oculus that day. I’d previously bought into the criticism of VR as isolating, but this preconception was blown to pieces by spending just a few minutes in this social environment and speaking with other users around the world. What really sealed it for me was the moment a stranger joined me at the piano in the virtual world to play the duet from ‘Big’ without any verbal interaction at all. As I awkwardly picked out the tune using my head cursor, a more accomplished user played the bass line with aplomb.
Shared virtual spaces are not new either (Second Life, anyone?), but VR makes them magically accessible and, combined with the deeply immersive effects of the platform, means that greater levels of telepresence can be easily achieved. Michael and I talked about how this effect could be used to transport patients out of hospital during long stays, or even how group therapy for mental health and other chronic disease groups could become radically more accessible through this approach.
As I left I was struck by how health and social care applications of VR, AR and 360 are never far from the minds of developers and users, and how exciting this particular phase of the growth of this technology is. Many of the ideas can be traced back to the early 90s, but at that time the technology really wasn’t up to the task.
Today is very different.
Not only do we have the processing power and display hardware to deliver an excellent experience on even the most basic of platforms, but we have the internet and social media multiplying the effect yet further. Add to this a population that increasingly dwells in the digital realm daily, and we have a solid base camp from which the digital explorers can set out, confident in the knowledge that where they lead, others will follow. As part of the medical team, I can’t begin to tell you how exciting this is.
DECLARATION OF INTERESTS
My employment status and conflicts are given in the ‘About Me’ section of this website. I attended this event in a personal capacity and not representing my employers. I paid for all expenses myself, and have neither a financial interest nor professional working relationship with any of the individuals or companies listed above.