Columbus, OH — Over dinner last night, some colleagues compared their families by level of device and technology addiction. There’s a wife who can’t stop reading Reddit. A family entirely fitted in iEverything. Pre-teens who can’t go on a 10-minute drive without a handheld game. And – the winner – a set of teenagers who are known for sitting around the living room together texting each other rather than speaking out loud.
Technology has changed us. It’s reinvented how we work, evolved how we communicate, and – it turns out – started to rewire our brains. Barbara Fredrickson, a professor of psychology at the University of North Carolina, Chapel Hill, and her colleagues are about to publish an article that suggests that one measurable toll of all this technology may be on our biological capacity to connect with other people. The theory in neuroscience is that “neurons that fire together, wire together.” What that means for us is that any habit molds the very structure of our brains in ways that strengthen our proclivity for that habit.
When we spend more time staring at a screen than connecting with people, our innate capacity for connection, friendship and empathy atrophies (a fancy medical word for withers away).
One place this tension between tools and human connection is felt most strongly is the exam room. Dr. Bryan Vartabedian, a pediatric gastroenterologist, wrote about it recently on 33 charts. He said patients have a fantasy expectation of what an interaction with their doctor should be like. They want the doctor to look at them and never at a screen. But, they also want “everything flawlessly documented, reviewed, flagged and cross-checked in the EHR.” They know they need the new, but long for the old.
It reminds me of the near-future novels that were popular a few years ago – books like Super Sad True Love Story, How to Live Safely in a Science Fictional Universe, and The Four Fingers of Death. All were set slightly in the future – close enough to be recognizable, far enough out to be a little alien. And, all showed characters with instant access to incredible technology that streamed live connections in the periphery of their vision and augmented every moment of their lives … yet, left them feeling entirely lonely.
It’s a paradox of our time, but also a real challenge for medical practices that are increasingly as dependent on the medical records and tools housed behind their screens as they are on physical examination. The open question is: Can technology be used to build the human connection at the point of care or only compete with it?
There are two projects we’re watching right now that seem to have the potential to use technology to reconnect people:
Digital Nurse Assistant, Xerox
The innovation team at Xerox unveiled their new “DNA” tool at demo day at PARC. The system tracks both nurses and the medications and tests that have been ordered for their patients. A small digital tag recognizes a nurse as she enters a patient room. The patient’s information immediately appears on an in-room display along with any specific data that needs attention – like a need for blood pressure monitoring or the result of a new test. The nurse is able to quickly get the needed information and turn her attention to a personal connection with her patient.
BodyMaps from GE and Healthline
This clever little app might just make it possible for doctor and patient to share a screen rather than be separated by it. It’s designed to be a visual aid in discussing diagnoses and treatment plans that allows users to explore the human body in 3-D. Doctors can show multiple layers of the human anatomy and view systems and organs down to their smallest parts. Beyond detailed anatomy, it also has 200 concise tutorials like “Why do I sneeze?”
Plus, doctors can annotate right on the screen – calling out where an individual’s particular problem is or what a surgery might fix. Those sketches and markups can be emailed to the patient to continue the education and conversation with loved ones. (Could it be even cooler with a little augmented reality? Imagine a quick mashup with Augment that would make a look at a BodyMaps knee seem like a look at your own knee.)
Posted by: Leigh Householder