The Third Screen Revolution in Healthcare Is Before Us

For the past few years, there’s been a lot of innovation and discussion about the second screen – primarily smartphones and tablets – used while watching TV. Most second screen apps provide information and services that complement the content on the big screen.

Microsoft CEO Steve Ballmer has outlined Microsoft’s “three screens and a cloud” strategy. Microsoft wants to be your technology vendor across all three screens – smartphone, PC, and TV – and connect all of them via the cloud. Apple, Google, Amazon, Microsoft, and to some extent Facebook are all converging on slight variations of a shared vision: to be the super-mega technology company across all computing form factors.

Doctors are already making use of EHRs and other clinical data across two screens. Many doctors look up ancillary information on their smartphones during the clinical exam. It’s convenient because they can keep the clinical note in sight and in mind while looking up other information on the smartphone. The key is that they never navigate away from the active clinical note. Some of the information doctors look up on smartphones comes straight from the EHR via a mobile app, while other information comes from the broader Internet and from references such as Epocrates.

For once, healthcare is actually ahead of the mainstream in technology adoption. Clinicians have beaten consumers to semi-ubiquitous use of the second screen. So what about the third screen? When will inpatient clinicians make more effective use of the large screen in most patient rooms?

They probably won’t. It’s generally difficult to read large amounts of text- and number-heavy content (think labs, allergies, and meds) from 10 feet away. Try hooking up your PC to your living room TV to find out for yourself. But there will be an incredible third screen revolution in healthcare. It will happen on eyeware computing platforms, including Google Glass and Meta-View, and I’m sure there will be many others as well.

Each form factor has its strengths and weaknesses. Broadly speaking, form factors can be thought of in terms of the amount of friction between the human and the computer. Laptops and tablets provide larger screens, but aren’t that mobile. Smartphones are mobile, but don’t offer robust data entry options. Eyeware computers can, in time, offer displays that present vast quantities of data efficiently and effectively in three dimensions.

I wore Google Glass for the first time last week. Unfortunately, the screen just isn’t very large, which seriously limits developers looking to build point-of-care applications on the Glass platform. Of course, Glass is a consumer-focused, first-generation product.

As eyeware platforms emerge and mature, we’re going to see a huge wave of point-of-care healthcare application innovation on these platforms. Someone has to solve the EHR backlash problem.

Kyle Samani is a technology enthusiast who is passionate about healthcare, technology, and startups.

  • “For once, healthcare is actually ahead of the mainstream in technology adoption.”

    Well, I don’t know that I’d go quite that far! But smartphones and tablets certainly make a lot more sense than putting a desktop or thin client at every point of care. We’re, unfortunately, not quite to the point of profoundly enriching the interaction between the physician and the patient, since most of what’s out there today is scaled-down apps or a desktop EMR running on an iPad through a Citrix session. But I think we’re heading there… (Sooner or later.)

    I think nirvana would be great usability regardless of platform and no significant loss of patient interaction features when going from desktop to mobile. The exciting day will be when EHR and PM apps are built natively and with the target mobile platform in mind… Say the physician wants to show the patient their lab results? Use the accelerometer and gyroscope to figure out when the physician is rotating or tilting the tablet towards the patient, and shift to a streamlined display of only the selected results (a rough sketch of that tilt detection follows at the end of this comment).

    As for Google Glass, I think the possibilities are pretty exciting for healthcare, but probably (sadly) years away… But one obvious use case could be to take advantage of location and proximity data and face recognition instead of BCMA (also sketched at the end of this comment). Could be a lot quicker than barcodes, and the compliance check could essentially be continuous. Depending on how good Google Glass gets at “reading” information, we might be able to skip buying some EHR-connected equipment, since the import of data into the EHR could be automated just by a provider literally looking at the device display.

    I do wonder if (and maybe worry that) we’ll end up seeing ubiquitous video recording… If it takes off, eyewear computing would have a huge impact — positive and negative — on quality, liability, privacy…

    I sure like the “eyeware” computing portmanteau, though!
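
    For what it’s worth, here’s a minimal, hypothetical sketch of that tilt-to-share idea in Swift using CoreMotion. The class name, roll threshold, and callback are illustrative assumptions, not any vendor’s API:

    ```swift
    import CoreMotion

    /// Hypothetical sketch: watch device motion and flag when the tablet
    /// has been rotated far enough to be facing the patient, so the app
    /// can swap to a streamlined, patient-facing results view.
    final class TiltHandoffDetector {
        private let motion = CMMotionManager()
        private let rollThreshold = 2.0   // radians; tune empirically

        /// Fired with `true` when the screen appears to face the patient.
        var onFacingPatient: ((Bool) -> Void)?

        func start() {
            guard motion.isDeviceMotionAvailable else { return }
            motion.deviceMotionUpdateInterval = 0.1  // 10 Hz is plenty
            motion.startDeviceMotionUpdates(to: .main) { [weak self] data, _ in
                guard let self = self, let attitude = data?.attitude else { return }
                // A large roll means the screen has rotated away from the
                // holder; treat that as the handoff gesture.
                self.onFacingPatient?(abs(attitude.roll) > self.rollThreshold)
            }
        }

        func stop() { motion.stopDeviceMotionUpdates() }
    }
    ```

    An app could pair that callback with the currently selected lab panel, so rotating the iPad swaps to a large-type view of just those results and rotating it back restores the full chart.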
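
    And a similarly hypothetical sketch of the continuous-BCMA idea: require both a face-recognition match and room-level proximity before a medication administration can be charted, falling back to a barcode scan otherwise. Every type, field, and threshold here is an assumption for illustration:

    ```swift
    /// Hypothetical identity signals; not any vendor's API.
    struct IdentitySignals {
        let faceMatchConfidence: Double  // 0...1 from an on-device recognizer
        let withinRoomProximity: Bool    // e.g., from an RTLS or BLE beacon
    }

    /// Returns true when identity is confirmed well enough to chart the
    /// administration; otherwise the workflow falls back to barcodes.
    func canChartAdministration(_ signals: IdentitySignals) -> Bool {
        signals.faceMatchConfidence >= 0.98 && signals.withinRoomProximity
    }
    ```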

  • kylesamani

    Justen

    I thought Epic, Cerner, and MEDITECH had iPad apps by now, no? If not, do you know if those are in the pipeline? Which have you worked with?

    My startup is developing apps for Glass for healthcare. Our general ideas and thinking are quite different from yours. Please contact me offline if you’d like to discuss more. Kyle@pristine.io

  • There are native apps out there, but functionality is often severely limited. Hence the poor folks who try using the desktop version via Citrix… There have been significant steps forward, some even very recently, but there’s still quite a ways to go.

    I’ll definitely stay tuned to what you’re up to with Google Glass. As I said, despite the privacy concerns and so forth, I definitely agree that the possibilities are exciting.
