If knowledge is power, then from the 7th century BC to the 4th century AD, the most powerful women of the classical world were undoubtedly the Oracles of Delphi.
Supplicants would travel far to seek the Pythia’s wisdom, delivered in ecstatic frenzy after inhaling the spirit of Apollo. Such was the cryptic nature of her utterances that prophets would be employed to help travellers make sense of the revelations.
Important decisions in matters of war, trade, marriage, and business were all made with reference to the divine knowledge imparted by the Oracle.
In matters of medicine, doctors have long been the gateway to knowledge, and with that gatekeeping role comes significant power in the war against disease.
On the afternoon of 22nd February I went to Digital Health.London’s ‘Collaborate’ event. It was a celebration of the programme’s first year’s work, an awards ceremony, and an innovation event all rolled into one. They were kind enough to let me speak about Virtual Reality in patient care as well.
There was a fantastic, diverse audience in attendance too, from the cutting-edge 360 live-streaming surgeon Shafi Ahmed, to my colleague Sunil Bhudia (with whom I’m working on the PREVENT-ICU-Delirium project), and the inspirational Molly Watt, who took part in a panel discussion about accessibility and digital participation. It was also brilliant to meet people I know from Twitter face-to-face, like Victoria Betton, who led a session on hacking STPs, and Dr Robert Lloyd, who skilfully MC’d the proceedings.
So why the Delphi reference? Of all the talks, the most fascinating for me was the panel discussion on Artificial Intelligence.
Patients should use Artificial Intelligence to reduce the amount of time they need to spend with healthcare professionals
The panel comprised three heavyweights in the Digital Health field:
Ali Parsa – Founder & CEO of Babylon Health
Dr Ameet Bakhai – Consultant Cardiologist at the Royal Free London NHS Foundation Trust
Professor Nicholas Peters – Professor of Cardiology & Electrophysiologist at Imperial College London
Entertainingly and insightfully chaired by Dr Jordan Schlain, a fellow GP, based in the US and Founder of HealthLoop, the debate fell neatly into those for, against, and balanced on the fence.
Ali Parsa began with a blistering defence of clinical AI, delivering an impassioned argument for how it can meet the yawning gap in healthcare provision around the globe. His team at Babylon have seen an 80% reduction in the conversion of clinical inquiries to video consultations since the introduction of their triage AI. As far as he is concerned, there is a moral duty to implement AI to close the care divide and augment the diagnostic capabilities of clinicians.
It’s difficult to come back against that kind of rhetoric, although I would posit that Babylon’s UK business deals with a small proportion of the range of presentations seen in General Practice, and with a significantly healthier and wealthier cohort. It might be difficult to extrapolate the 80% reduction in demand. With an ongoing trial front-ending general practice in North London, though, I guess time will tell.
The progress in the developing world is also laudable, but I wonder whether it can ever cover the totality of care needs. Perhaps 10% of something is better than 100% of nothing.
The issue of AI treating humans well and not turning them into batteries comes up time and again, and from my own point of view I swing between optimism and existential dread. It’s not unique to patient care, but it does have an interesting twist in that, should AI succeed in helping the patient, it may harm the doctor.
As for empathy, this is stronger ground. I certainly believe that humans need contact with other humans, particularly when it comes to the ritual of the consultation. Who’s to say that empathy is a uniquely human property though? Any pet owner will attest to the ability of their loved companions to deliver comfort without words. In time, empathy may well be better delivered by machine, especially to those raised in the post-millennial world – Homo Digitalis.
Professor Peters was left to adopt the middle ground, adding nuance to the preceding statements. I was particularly taken by his observation that the addiction that doctors have to treating patients, felt as a need deeper than simply a method of paying the bills, coloured their opinion. AI and the fear of being replaced and made worthless affect us all.
The framing of the debate made this last point ironic – there were no patients on the panel discussing the proposition. I raised this point, which was countered with an anecdote about Henry Ford, who reputedly said that if he’d given the public what they wanted, he’d have made a faster horse. Professor Peters also referenced Molly Watt’s support of Apple’s commitment to user interface excellence and accessibility, which comes despite their famous lack of user input. All good points, but I’d still want to hear a patient make them.
The concept of Artificial Intelligence has been around since the 1950s, and we’ve seen this level of enthusiasm before. We’ve also had our hopes dashed before. I believe we’re entering a time when some of that promise will be realised, especially in narrow specialist areas. For this reason, I see the generalist physician outliving the specialist when it comes to head-to-head performance against AI in terms of patient care. Before my GP colleagues get too comfortable, even that advantage will pass in time, leaving the nurses as the most valuable humans in the healthcare system.
It may be that we will co-exist as providers of healthcare, but surely there will come a time when AI will be superior to humans in a number of areas. When this happens, if we truly respect the tradition of medicine, and believe we must first do no harm, maybe doctors should stop diagnosing patients?
And with that, is the best that doctors can wish for to become gatekeepers of knowledge – prophets at Delphi, helping the patient understand the superhuman wisdom of the AI oracle? Or will this knowledge be available to patients directly, leaving doctors shorn of their power?
It’s no surprise that our excitement is tinged with fear.
“The one thing that the NHS cannot afford to do is to remain a largely non-digital system – it is time to get on with IT”
The Health & Care Innovation Expo is now in its third year of showcasing the very best of innovation in the NHS. Hosted by NHS England and held in the steampunk Victorian grandeur of the Manchester Central Conference Centre, the 2016 event gave me 2 packed days of talks, workshops, demonstrations and general flights of wild innovative fancy with a wide range of attendees. The importance of the event was underlined by the prestige and range of speakers, from Professor Sir Bruce Keogh opening the event and chairing numerous panels, to Professor Bob Wachter MD talking about his review into digital usage in the NHS. We even had a hirsute Simon Stevens delivering a keynote and a full hour of Jeremy Hunt’s time, during which he launched the next phase of the Digital NHS roadmap.
In truth there was a little too much to wrap my head around. The show floor was packed with exhibitors large and small, and an interesting range of stands exploring the ‘feature zones’ of New Care Models, NHS Right Care, Digital Health, and Personalised Medicine. Given the long queues for some of the talks, not to mention the numerous pop-up events and side meets, the one innovation we were all in need of was more time.
The announcements were, as tradition dictates, presented in the morning papers, and we heard about the coming year’s targets on the journey to a digitised NHS in 2020. Primary care is in a good place here – in fact, Jeremy Hunt commended GPs for ignoring the government’s advice and ploughing their own furrow when faced with Connecting for Health. Without this, he said, we would be significantly further behind. Interesting advice on avoiding governmental advice there.
The news broke down as follows:
Patients will be able to book appointments, order medications, and download records, US ‘Blue Button’ style, on a revamped www.nhs.uk to be launched at Expo 2017.
Anyone will be able to access detailed stats on performance in key areas such as dementia, diabetes, and learning disabilities.
There will be online access to 111, which can lead to a direct appointment, signposting, or a callback.
By March 2017 there will be a directory of approved apps, with subsequent support for wearables.
A second round of ‘national’ excellence centres will be created, with more detail to follow.
An NHS Digital Academy will be created to teach informatics skills to NHS staff and develop the next generation of Clinical Chief Information Officers and Digital Health Leaders.
Response to these announcements was mixed, both at the expo and in the press. On the one hand, when you combine this with the Tech Tariff (on which there was little news), it’s yet more evidence that the NHS is making good on the promise to step into the 21st century. Entrepreneurs and startups might complain that it doesn’t go far enough, and that the route to approval is still too long-winded and narrow. There was also the usual chorus of disapproval for any non-evidenced interventions in the NHS, and possible wilful misinterpretation of what was being offered as simply a way of fobbing patients off with an app instead of a doctor. Those of us with a role in innovation have a responsibility to ensure that expectations are managed appropriately: Digital Health is NOT a panacea, but is instead another weapon in our fight against illness and social problems. We also need to ensure that evidence is generated and shared whilst trying to balance the pace of technological change against that of traditional research.
My presence at the expo was as innovations lead for my CCGs (Eastbourne, Hailsham & Seaford, and Hastings & Rother), and so it was exciting to be able to share the stage with Professor Sir Bruce Keogh, Dr Mahiben Maruthappu (@M_Maruthappu), Mr Ashish Pradhan & Maria Slater. Our panel, ‘Achieving Innovation at scale in the NHS’, hoped to inform the debate about how we can turn small-scale innovation (which the NHS is brilliant at) into widely adopted, large-scale change (not so good). The vehicle of the NHS Innovation Accelerator, which I have spoken of previously, is beginning to deliver, and I was one of three speakers talking about current NIA products.
Mr Pradhan is a Consultant Subspecialist Uro-Gynaecologist at Cambridge University Hospitals NHS Foundation Trust. Episcissors-60 are fixed-angle episiotomy scissors, used to guide incisions during difficult births at an angle that avoids damage to the anal sphincter and the subsequent problems with continence. Undeniably a brilliant idea, but the point was made that a business case was hard to make, because this cheap intervention actually reduces hospital income down the line! The NHS is littered with such perverse incentives not to innovate, all of which need addressing.
When it came to me, my story was simple – having an excellent product is NOT enough. AliveCor is, undoubtedly, a great product which works very well at identifying asymptomatic Atrial Fibrillation (AF) as well as other rhythm disturbances, but from pilot work and a wider scale roll out in my CCGs, uptake has been slow. This reinforces the need to carefully consider how to manage change when introducing innovation, as well as considering the practical aspects and the need for education and support.
Even so, with lower uptake than expected, we detected 61 new cases of AF which, if treated appropriately, would have significantly reduced the risk of stroke in the target population. In effect, we may have avoided up to 3 strokes per year even in this small group. Numbers like that surely warrant support!
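As a rough sanity check on that claim, here’s a back-of-envelope sketch. The untreated stroke risk and the relative risk reduction from anticoagulation used below are assumed, illustrative round figures, not the CCG’s actual data:

```python
# Illustrative estimate of strokes potentially avoided by anticoagulating
# newly detected AF patients. Risk figures are assumed round numbers for
# illustration only, not audited CCG data.

new_af_cases = 61               # new AF cases detected in the roll-out
annual_stroke_risk = 0.07       # assumed untreated annual stroke risk in AF
relative_risk_reduction = 0.66  # assumed RRR from anticoagulation

expected_untreated = new_af_cases * annual_stroke_risk
strokes_avoided = expected_untreated * relative_risk_reduction

print(f"Expected strokes/year if untreated: {expected_untreated:.1f}")
print(f"Strokes potentially avoided/year:  {strokes_avoided:.1f}")
```

With those assumed inputs the estimate lands just under 3 strokes per year, which is how a figure like “up to 3” can arise from 61 cases.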
It was also great to be able to celebrate East Sussex Better Together and our progress towards a single Accountable Care Organisation. By working together with acute trusts, community trusts, and social care, we are moving towards a world where the “perverse incentives” mentioned in the Episcissors story are a thing of the past. Costs are no longer saved in someone else’s budget.
You could have spread the event over a week and still not had the opportunity to catch the majority of the content. I attended talks about the GP Forward View, Urgent & Emergency Care Innovation, and even learning from high performance and marginal gains theory in a talk called “Black Box Thinking” from Matthew Syed(@matthewsyed). Innovation is more than just technology, and sometimes the change in mental perspective towards one of continual marginal improvement is the most difficult of all.
My personal favourite technology, Virtual Reality (VR), was a little thin on the ground. We saw VR for treating Obsessive Compulsive Disorder from a company called Mindwave Ventures (@mindwave_). They are using VR to create what must be the most disgusting bathroom since Trainspotting to help patients gradually address their fears of contamination. Augmented Reality was showcased by AMA (@AMAapplications), whose Xpert Eye platform will soon be used in my area to allow doctors to remotely visit care home patients. I also have to confess that my day (and probably whole week) was made when I discovered that the MSD team had brought Microsoft HoloLens (@hololens). I can only apologise to everyone who had to experience my excited swearing as I strolled around an alternate reality populated with tigers, sharks, and a ghostly Vitruvian man with a glowing nervous system.
Having spoken at TEDxNHS(@TEDxNHS), it was lovely to meet Dr Jon Holley (@jonnyholley), Dr Manpreet Bains(@manpreetbains_1) and the team again at their stand. The video footage from the event is in the edit and I’m assured will be available soon. It even led to one of the more surreal moments of the event where I got pulled out of a talk on Urgent Care to demonstrate VR to Ruby Wax ahead of her talk on Mindfulness and Mental Health.
I’ve made no secret of my love for the US way of approaching innovation, and how they celebrate the possibilities whilst including patients, especially in the Stanford Medicine X conferences. Thanks to speakers like Roy Lilley(@RoyLilley) who talked energetically about the importance of innovation from the front line, challenged pretty much everyone he spoke to to think differently, and who then danced off to ‘Always look on the bright side of life’ after his talk, I think I can now see the British version of this optimism, and the contagion is spreading.
Innovation now has fewer barriers than ever in the NHS, although those that remain are substantial. It’s over to us to make sure that next year for Expo 2017 we have some real success stories to share, alongside the courage to share and learn from our failures.
DECLARATION OF INTERESTS
I attended in my role as CCG Innovation Lead & Governing Body Member of EHS/HR CCG. As a speaker, all travel and accommodation fees were met by the event organisers. I received no speaker fee.
Oh, I also wore #PinkSocks throughout, in the spirit of #JFDI and #GSD. These were a gift from Eugene Borukhovic (@healtheugene).
I’m awake early. This is partly down to the fact I’m sleeping on a camp bed at my brother’s home with the sunrise peeking through the curtain, but mostly because my brain has already started fizzing with ideas and excitement ahead of my second ever NHS Hack Day.
I first went to NHS Hack Day in January 2015, when it was held in Cardiff. I’d been introduced to it through tweets with AnneMarie Cunningham (@amcunningham), GP and Primary Care Director at the Aneurin Bevan Health Board, who was organising the event. Sold as an opportunity to meet like-minded hackers and geeks, I spent a whirlwind 36 hours working on GWYB – a notification system for patients which triggered communication cascades in the event of their admission to hospital. We even won the Patient Prize for our efforts.
NHS Hack Day is a free-to-attend event that has been running across the country at weekends since 2012. In ‘Meeting the challenge’, they ask:
How can we build an environment where world-class NHS digital services flourish?
Through leadership that understands technology and is bold enough to modernise the delivery of digital services, including embracing openness.
To this end they send out the call to all geeks who love the NHS, bringing them together in a spirit of adventure, openness, and addiction to coffee.
I arrive at 8:30 at King’s College London, and pitch in immediately, laying out the bottled water, coffee, tea and bin-bags. Extension cables are daisy-chained together and taped to the floor. I pop my Ricoh Theta S camera onto its tripod and start up Tweetbot in readiness.
By 9 it’s getting busy, and 15 minutes later we’re off at pace. It’s a speed that doesn’t really drop for the subsequent 33 hours. Everyone who has contributed to the Google Document of Pitches through the week is given 60 seconds to pitch to the assembled masses. Here’s my attempt:
Yes, that’s right: I’ve just asked a room of strangers to build a customised 360 video viewing app for Google Cardboard by the next afternoon. I’m nothing if not ambitious.
The pitches range widely, from medical dictionary and haematology data visualiser, to hospital bed finder, bleep replacement, and even personal pollution monitoring. I’m suddenly aware that there are lots of other teams I’d like to join.
10:30 am and I feel like a wall-flower at a speed-dating event.
Once you’ve pitched, you stand around the side of the room with a sheet of A0 paper with the project’s name on it. It starts slowly, but gradually the fact that I have a VR headset and I’m willing to share it attracts people. Several question the scale of what I’m trying to achieve, and as a result I realise that the project needs to change. With the help of some of the people who subsequently become the team, we decide to focus on using the tools at hand and the skills we share to explore using VR and 360 video to help treat Phantom Limb Pain (PLP).
At this point it’s probably important to give you a little more information. PLP is a common and distressing complication of amputation. Up to 70% of people who have had an amputation can experience pain, itching, burning or distortion of their missing limb. It’s difficult to treat with medication, and as such a number of psychological and alternative therapies have been developed.
One such treatment is MIRROR THERAPY. First described by Ramachandran in 1995, it uses mirrors to allow patients to view their injured limb as made whole again, using the reflection of the other limb. This has been shown to reduce pain and distress, both during treatment and on an ongoing basis.
I have two patients with phantom limb pain, and even before coming to NHSHD I’d been wondering about using VR to help treat them. This weekend, it started to look like I might be able to make good on that.
More coffee and a time check. 11:30. We have 6.5 hours left of the day, then a further 6 hours tomorrow, to try and deliver something that will genuinely improve patient care.
My team comprises people with a huge range of different skills and backgrounds. Becky is a coding and digital media student from Brighton. Helen is a registered community nurse with a passion for tech and digital health. Musaddiq is a Java dev with geographic information system skills. Ali is a quantitative analyst. We also have Daniel and Charlotte, both software engineers. Some of the team stay for just day 1, and we’re joined on day 2 by Reno, who’s switched codes from the dark side of finance to join Team Digital Healthcare. It’s an eclectic and excellent bunch – you can meet them all on our site.
Given our target group, the plan is to explore using VR, 360 video, and the Gear VR headset to simulate mirror therapy in a low cost digital way. My hope is that we can develop practical methods of deploying this in a clinical setting and share our findings with the community at large. It also means we get to have fun playing with all the toys, whilst everyone gets a chance to contribute and learn something.
The team splits into three streams:
Charlotte and Daniel start on the website, which we will use to contain our work from the weekend.
Becky, Ali and Musaddiq immediately set to work on the hard coding challenge – looking at Virtual Reality and whether we can mirror a live 360 video stream from the Theta S camera.
Helen & I begin collating the research evidence and constructing a ‘treatment protocol’ from which we can create some simple 360 video footage to test with the team.
Such is the focus of a Hack Day that many of us didn’t really realise that the excellent lunch had been served until the back of the queue bumped into our table. This was despite the food being served right next to us. I guess this was the first proof of the distractive powers of Virtual Reality.
For the remainder of the day each stream worked away on their particular tasks. The website came together quickly and beautifully, built on a WordPress framework. Becky and Musaddiq heroically tackled two things at once:
3D modelling in Unity and then 3D Studio Max, developing some great point-of-view animations of leg therapy
Tests of live-streamed 360 video using OBS and YouTube – this was sadly too slow, and there did not appear to be any open-source mirror plugins.
Helen introduced me to Slack, a team collaboration tool that I dared to consider as yet another social network until I was sternly corrected. Using a technique shamelessly borrowed from the adult entertainment industry, I duct-taped the 360 camera and Gorillapod to my chest to record 5 short series of basic mirror therapy clips. You can see them all here, and watch them yourselves using any VR headset. What was immediately apparent was that by watching and copying the movements you could experience an eerie sensation that the hands you were seeing were, in fact, your own (which in my case they were).
By 6 o’clock the pub and Eurovision were calling, so we all departed.
Day 2. 5:30 am this time. Ukraine won.
Another glorious day, so with coffee in hand I took a few photos of Embankment and set off to rejoin my slightly smaller team. The shrinkage was offset by the fact that overnight we had been contacted by Reno, who asked to join us. Expanding the reach of the Hack Day using social media is fantastic, and something I hope they facilitate in future. As it was, our hashtag started trending shortly into day one, which was marked by the unwelcome hijacking of our thread by a Russian dating agency.
By 9:30 everyone was up-to-date and the plan for the remaining 6 hours was in place. To up the pace and demonstrate the power of what we were doing to the team, we decided to utilise the ‘Cold Pressor’ test to see whether any of the content we had created could offset the pain of holding your hand in iced water.
The Cold Pressor test can be thought of as the bespectacled, serious cousin of the ice bucket challenge. It is used in research to help provide a controlled and safe painful stimulus. It has already been used, successfully, in demonstrating the efficacy of VR in reducing pain, so I felt it was justifiable to subject Becky, Reno and myself to a bit of light Sunday torture in the name of science.
Despite our rather crude efforts, what we found was quite startling. Becky & I recorded some point-of-view footage of ourselves with both hands inverted, our left arm in an empty bucket. The bucket was duly filled, and we were timed as to how long we could keep our left hand in the iced water.
Becky bowed out at 1 minute 30 seconds. I lasted even less, at 1 minute 10.
We were given a while to recover and then tried again using our personalised 360 video. What we found was that Becky increased the time she tolerated the pain to over 3 and a half minutes. I tried again and stopped at much the same time, with the feeling that I could have gone on if I wished. The sensory confusion of seeing both hands in the air versus the sensation of the left arm in water at near freezing clearly disrupted my perception of pain.
Reno stepped in next to experience the power of VR to distract patients from painful stimuli. Watching ‘Kurios’ by Cirque du Soleil, he breezed through nearly 10 minutes of laboratory-standard agony, smiling much of the time. Having checked his biography, I now see that he is an ultra-runner. This doesn’t diminish his achievement, but explains the smile.
Next came the crunch. I’d cunningly ensured that three of the team had frozen typing hands, so we awkwardly wrote up our findings, with Becky and Ali also finding the time to crack the problem of mirroring 360 footage in a simple and effective manner. It was this last development that will really help clinicians create effective personalised 360 mirror content for patients, and it will form the basis of the next steps I take with my own patients.
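I can’t reproduce the team’s exact solution here, but for equirectangular 360 footage the essence of the mirroring trick is a horizontal flip of each frame: reversing the longitude axis swaps left and right in the viewed scene, so a left hand appears as a right hand. A minimal sketch, assuming frames are handled as height × width × 3 numpy arrays (a real pipeline would apply the same per-frame flip with a video tool):

```python
import numpy as np

def mirror_equirectangular(frame: np.ndarray) -> np.ndarray:
    """Mirror a single equirectangular 360 frame left-to-right.

    Reversing the columns (the longitude axis) mirrors the whole
    scene, which is what simulated mirror therapy needs.
    """
    return frame[:, ::-1, :]

# Toy 2x4 'frame' with one coloured pixel to show the flip
frame = np.zeros((2, 4, 3), dtype=np.uint8)
frame[0, 0] = [255, 0, 0]          # red pixel at the far left
mirrored = mirror_equirectangular(frame)
assert (mirrored[0, 3] == [255, 0, 0]).all()  # now at the far right
```

The same column reversal applied to every frame of a recorded clip gives a mirrored version of the footage without any specialist plugin.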
3:30 arrived, and the final presentations in front of the judges began. With a brutally marshalled 3 minutes, each team spoke of what they had achieved in the last day and a half, before being grilled by the panel and audience.
We saw a great variety of differing presentations, but what tied them together was the incredible progress everyone had made, and the amazing creativity and skill that had gone into producing extremely polished applications that were, in many cases, ready to use. I was particularly impressed by ‘Outbreak’, a disease-outbreak management system in a box that used Raspberry Pis and tablets to create a pop-up field network. I wasn’t the only one: they took home the star prize. Very well deserved.
So what about Virtual Analgesia? Well, I’m delighted to report that we won a ‘Highly Commended’ prize from panel judge Alan Thomas (@alanroygbiv) for our work on Patient Inclusion. Having had the idea come from patient needs, it was high praise indeed to have this recognised.
6 pm and it was all over, bar the wrestling over the goodies and dividing up the remaining bottled water. I’d been part of 36 hours of intense team work and creativity, and joined a group of new friends and colleagues. Most importantly we had a new tool that clinicians can consider using in managing Phantom Limb Pain. In the coming weeks I hope to share this work with my two patients and see whether they’d like to try this approach. Using VR in this way means that when they wake at 3 in the morning they’ll have something new to try to control the burning pain in the foot that’s no longer there.
This post is not a speech, so I won’t go into detail about how thankful I am for the help I had from my team – I’m banking on the fact that they know this already.
What this post must be is a loud celebration of the amazing work of the NHS Hack Day group, and most of all about the incredible reservoir of passion and talent in the developers, students, clinicians and patients of this country. The challenge of rising demand and shrinking funding of healthcare is not unique to the UK, but we have a National Health Service – free at the point of delivery, with care provided based on need, not the ability to pay. The NHS Hack Days demonstrate that it isn’t just the nurses and doctors that are committed to supporting this unique and precious institution, and that we don’t go into the fight unarmed – there’s an army of geeks out there, and they have some incredible tech to share.
To find out more about the next NHS Hack Day, visit their website www.nhshackday.com or follow them on twitter @nhshackday – they really are amazing events, and welcome everyone with a passion for healthcare.
All the notes from my team ‘Virtual Analgesia’ are available on www.virtualanalgesia.net. We’d love to hear from you with any feedback or comments. You can join the discussion on Facebook in ‘VR Doctors’ – just apply to join.
Declaration of Interests
I attended this event in my own time and at my own expense. The hardware and software used was all either open source or owned and operated by the participating team members.
Virtual Reality (VR) is a technology that is older than television, and something that I recall very well from the heady, Neuromancer days of the early ’90s. With the launch of commercial products from HTC and Oculus, and Samsung building on the early successes of its Gear VR headset, more people than ever can get their heads into another digital realm and experience first-hand what they could only haltingly and nauseatingly glimpse in the ‘Dactyl Nightmare’ days. I was pretty keen to put that behind me and see what was new in this field.
With that in mind, on a wet Thursday night on 24th March 2016, I attended the 4th VRLO (Virtual Reality London) meet-up at the Amba Hotel at Marble Arch, London. Hosted by VR & MR production company Rewind, the event is billed as:
(a) regular hands-on social event is for professionals who are curious to see what impact virtual reality and applications will have on every aspect of our lives. Get early access to the latest developer kits and applications, immerse yourself in cutting edge applications and network with the people at the forefront of this new medium.
The event was split between two rooms: the first contained all of the exhibitors and hands-on demos, and was the place where I spent the entire evening. A second room hosted the presentations, although to be honest most people looked like they were there for the toys and the networking.
My interest in VR, AR (Augmented Reality) and 360 video and audio comes from a place of personal interest as well as a fascination with what these new technologies can offer to health and social care. So armed, I met with the exhibitors, tested their gear, and chatted about MedTech. In the process I discovered a hidden passion for VR Healthcare, the beginnings of practical applications for patients and clinicians, and a rather worrying disregard for basic infection control.
The exhibitors were big and small. The biggest, Sony Playstation and Samsung, were there in full effect, although the Sony VR equipment was noticeably absent. As a result, so were visitors to their stand.
Samsung, on the other hand, were demonstrating their new wireless 360 Camera, the Gear 360.
Announced at Mobile World Congress 2016, this tiny orb houses two wide-angle lenses and a cute tripod, and can record stills and video in hi-def. So far, so ‘Ricoh Theta S’. It did exceed my current 360 camera in a number of departments though: splash- and dirt-proof, it also live-streams beautifully, which will play a huge part in the coming growth of 360 media. You only have to look at UK surgeon Mr Shafi Ahmed’s exciting world-first 360 live broadcast of an operation on 14th April 2016 to see the potential for education in healthcare and surgical training. Within primary care, documenting practical procedures and studying doctor-patient interaction immediately springs to mind, as does the ability to rapidly record a person’s home and living space to allow remote occupational therapy and monitoring of social care provision.
There were many app and content developers present showing the work they had done in demonstrating the potential to companies and clients, producing training material, and wrapping this content in branded Google Cardboard hardware. If you haven’t had a chance to dip into VR yet, Cardboard is certainly the cheapest way to do so, as you can buy a headset for less than £10 and strap your smartphone into it to have a taste of the other side. Spend a little more and you can get a great, comfortable system such as the FreeflyVR (my current preference for Google Cardboard work).
I’ve taken these entry-level headsets to two clinical environments thus far (my surgery and my dentist) to see how patients and professionals fare, and what their feedback might be. The current generation of hardware is a little bulky, especially for dental work, and needs to slim down or risk getting in the way. The optics are fairly basic, so limit the audience somewhat to those with standard size heads, and a shallow range of visual acuities. You can wear glasses with some of these headsets, but I’ve yet to find a headset that makes this anything other than an uncomfortable workaround.
Infection control is the biggest unaddressed issue, in my opinion. Most headsets have soft foam padding around the eyepiece, which would be a nightmare to clean. Additionally, the headsets themselves can be quite intricate and would harbour bugs in all of the nooks and crannies. Seeing person after person line up to pop headset and headphones on in a crowded, sweaty room, having just finished a shift seeing record levels of upper respiratory infections and scarlet fever in my surgery, made me a little tense. Work needs to be done here on basic protocols to ensure the next big VR event doesn’t turn into a cruise-liner-style outbreak.
One team had their eyes firmly on this area, however: the impressive Kickstarter-funded OPTOVR ( @OptoVR ). I had a great chat with the co-founders Richard Stephens and Tom Jarvis, who took the time to talk through the development story of what they claim is the world’s first portable VR headset with integrated headphones. What interested me was that it had a beautiful, clean look and feel, and as it is made from closed-cell foam it can be readily cleaned (closed-cell foam is the material used in Croc shoes, so loved by surgeons the world over). Add in the lightweight hardware and beautifully integrated sound system, and I see this as being the first VR headset that I would consider using in the live clinical environment. Definitely one to watch; you can help fund them on Kickstarter and even attend their launch on 30th March, 2016 at Somerset House in London.
Another exhibitor was CURISCOPE – ‘Education adventures in VR & AR’. Ed Barton ( @ed_barton), their founder and CEO, was demonstrating a remarkable T-shirt which allows people to gaze into the chest and abdomen of the wearer and see their internal organs in glorious technicolour. The educational possibilities are obvious, but I wonder whether you could also use this approach to help patients better understand their own bodies and anatomy, and the effects of disease and lifestyle on their own health. Imagine showing a young smoker their lungs age and blacken before their eyes! Powerful stuff. I suggest this idea partly by way of revenge, as Ed had previously scared the life out of me by showing me the ‘Great White Shark’ video in 360. As a diver, I found this particularly terrifying, and a spectacular demonstration of the immersive effects of VR. You can try it yourself via VRIDEO on the Oculus store for Gear VR. Ed is also Kickstarting his Virtuali-Tee and app.
My final stop before leaving was AltspaceVR. Michael Salmon, from NBCUniversal, took me through a demo of this multi-platform social space, which was running on Gear VR and Oculus that day. I’d previously bought into the criticism of VR as isolating, but this preconception was blown to pieces by spending just a few minutes in this social environment and speaking with other users around the world. What really sealed it for me was the moment a stranger joined me at the piano in the virtual world to play the duet from ‘Big’ without any verbal interaction at all. As I awkwardly picked out the tune using my head cursor, a more accomplished user played the bass line with aplomb.
Shared virtual spaces are also not new (Second Life, anyone?), but VR makes them magically accessible and, combined with the deep immersive effects of the platform, means that greater levels of telepresence could be easily achieved. Michael and I talked about how this effect could be used to transport patients out of hospitals during long stays, or even how group therapy for mental health and other chronic disease groups could radically improve the accessibility of this approach.
As I left I was struck by how health and social care applications of VR, AR and 360 are never far from the minds of developers and users, and how exciting this particular phase of the growth of this technology is. Many of the ideas can be traced back to the early 90s, but at that time the technology really wasn’t up to the task.
Today is very different.
Not only do we have the processing power and display hardware to deliver an excellent experience on even the most basic of platforms, but we have the internet and social media multiplying the effect yet further. Add to this a population that increasingly dwells in the digital realm daily, and we have a solid base camp from which all of the digital explorers can set out, confident in the knowledge that where they lead, others will follow. As part of the medical team, I can’t begin to tell you how exciting this is.
DECLARATION OF INTERESTS
My employment status and conflicts are given in the ‘About Me’ section of this website. I attended this event in a personal capacity and not representing my employers. I paid for all expenses myself, and have neither a financial interest nor professional working relationship with any of the individuals or companies listed above.