Chad Ruffin:
Think about it. This child's entire auditory cortex will develop solely in response to that electrical stimulation. And I thought about what this meant for him. And I teared up. Because like him, I had been deaf since before I had learned to speak. But unlike him, I didn't get my cochlear implant until I was 21 years of age. I teared up, because I thought that he would never endure the profound social isolation that comes when you can only talk to one other person, that he would not constantly miss the punchlines of jokes and have his friends say "it's not important" whenever he asked for something to be repeated, that he would not sit silent at a dinner party with his friends simply because he couldn't hear, or that the chancellor of his medical school would not overrule the acceptance of the admissions committee solely because he was deaf. I teared up, because I thought that we had reached a zenith of possibility for deaf people.
But I was wrong. Before me, no other deaf person had successfully completed a surgical residency program. It took 30 interviews over two years, a second two-year research fellowship, an additional year of internship-- all of this with a trial run, if you will-- before I was able to convince my field that I could, indeed, care for an airway patient gasping for air in the dead of night. That left a scar that was deep and wide.
I had already done one two-year research fellowship, published on the far right of the bell curve. I had spoken to heads of state about hearing loss. And on the side, I taught myself electrical engineering to create a hearing aid to help me hear in the operating room. I had done all of these things out of pure passion.
But on the interview trail, instead of hearing, "Wow, you have some drive. It would be great to have you on our team," I was greeted with, "So, you're the deaf boy everyone's been talking about." And while I sat on the sidelines, I watched colleagues with similar credentials get into the top programs. But in a really cool change of events, I was recruited to an otolaryngology residency program at Indiana University. It was chaired by the pioneer of pediatric cochlear implantation.
In contrast to my previous research fellowship, where we focused on bottom-up methods of restoring coding in the hearing nerve, at Indiana, we were really interested in how the brain is changed by deafness in a top-down manner. And you will see how that combination of training parlayed into a powerful background for treating hearing loss. As I immersed myself in surgical training, I became profoundly aware of the limitations of cochlear implants.
These limitations are not obvious, but they can really torpedo a deaf person's desire to succeed in the upper echelons of professional society. I was determined to overcome them. And I'll play a sound simulation and let you experience this.
During the interview process, I was called shifty and aloof. Why was that? That criticism never left my brain. I delved into studying auditory cognition and autism to figure out some way of deconstructing the deaf interview experience so that we could learn from it.
And when you listen to these sound simulations, I want you to pretend that you are an interviewee looking for a job or an employee having a performance review. Recall the stress and anxiety that you feel as a hearing capable person. And then with this simulation, think about what it must be like to have a cochlear implant. Ready to listen?
Chad Ruffin:
So you probably understood most of what was being said. But who was that? Was that Stephen Hawking, or was it Daniel Kraft? It's actually Daniel Kraft. Was he making a statement, or was he being sarcastic? Where is Daniel from? What's his accent? The point is, in an interview situation, people tamp down their body language. So how do I tell whether an interviewer thinks that I would be a great addition to their team or is more concerned that my hearing might endanger patients? How do I do this on the fly while being witty, and charming, and not being shifty and aloof?
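For readers curious how a cochlear-implant sound simulation like the one played here is typically made: the standard technique is a noise vocoder, which keeps only the coarse amplitude envelopes of a handful of frequency bands, discarding the fine detail that carries voice identity and intonation. The sketch below is an illustration of that general technique, not the actual audio processing used in this talk; the function name `vocode`, the channel count, and the band edges are all assumptions.

```python
import numpy as np
from scipy.signal import butter, sosfilt, hilbert

def vocode(signal, fs, n_channels=8, lo=100.0, hi=7000.0):
    """Crude noise-vocoder simulation of cochlear-implant hearing.

    Splits the input into log-spaced frequency bands, extracts each
    band's amplitude envelope, and uses it to modulate band-limited
    noise -- mimicking how an implant conveys envelopes per electrode.
    """
    edges = np.geomspace(lo, hi, n_channels + 1)   # log-spaced band edges
    rng = np.random.default_rng(0)
    out = np.zeros(len(signal), dtype=float)
    for f1, f2 in zip(edges[:-1], edges[1:]):
        sos = butter(4, [f1, f2], btype="bandpass", fs=fs, output="sos")
        band = sosfilt(sos, signal)
        env = np.abs(hilbert(band))                 # amplitude envelope
        carrier = sosfilt(sos, rng.standard_normal(len(signal)))
        out += env * carrier                        # envelope-modulated noise
    return out / (np.max(np.abs(out)) + 1e-12)     # normalize to [-1, 1]
```

With few channels the speech remains largely intelligible in quiet, but speaker identity, accent, and tone of voice are largely stripped away, which is exactly the ambiguity described above.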
The second thing that's really important is the automaticity of hearing. Remember that you had to mentally reach out to figure out what was being said. We call that reaching out cognitive load. And it worsens during stress or anxiety.
It really affects a patient of mine, a 60-year-old lawyer who lost his hearing late in life. In quiet, he now hears 100% of what's being said. But he will tell you that he cannot negotiate a contract between four parties, because he spends so much mental effort just understanding what's being said that he can't think a few steps ahead to best guide the negotiations.
Now the effect of cognitive load on executive functioning sounds super obvious. But in the operating room, I was in uncharted territory. It's a stressful and noisy environment. And people with hearing loss can't lip read because of the surgical mask.
When I hear, I have to parse individual words into sentences, think about what that sentence means. And then I have to perform an action on that thought. Thinking about what you're not hearing is super distracting, so much so that, even today, I have to train friends and colleagues that I'm not being disengaged or shifty and aloof.
Now that we've reimagined the problem, you can see that it's not just about hearing loss, but about giving the brain room to process what's being said and to also do the other functions for which it was designed. And one of these functions is to simply connect with the other human being to whom you are talking. So what are we doing to give that four-year-old boy all the promise in the world?
That brings us to the athenaeum of RuffLabs, where we use our unique insight to create new technology that gets to the heart of communication and hearing. We combine medicine, technology, and advocacy to create unique solutions to these problems. I'll talk about where my field is and then where our approach is different.
Medicine-- sensorineural hearing loss is super difficult to treat. I don't hear what you hear, because not only is there a lack of volume, but also of clarity. Depending on where your hearing loss is, we can implant the inner ear with a cochlear implant like I have, the auditory brainstem, or even the auditory midbrain.
We now understand the limitations of electrical stimulation. And we're working on using lasers and genetic engineering to overcome these problems. You guys saw this yesterday with the optogenetics. We're also using stem cells to regrow inner ear organs.
You saw brain implants, how we can control artificial limbs, and a computer just by thinking about it. This technology could be crucial for a friend of mine who wants to be a neurosurgeon. He signs, but he can't voice for himself. Using this technology, he could communicate with the operating room team while his hands are occupied by surgery.
We know that hearing loss affects the brain. It is the number one modifiable risk factor for dementia. Just put a hearing aid on it. We know that [INAUDIBLE] centers in the auditory cortex are repurposed for visual processing. This is great if you sign. It's not so good if you want to use your ears.
But our field is working to understand and drive these changes for maximal benefit. But now, what you really want to hear-- in true Silicon Valley fashion, I hacked my cochlear implant to create a new way to fire the electrodes and improve hearing. In a eureka moment, it provided the sensations of music that I had never before experienced, the ability to hear all the complex themes of a movie and its score, to hear feminism in Tina Turner's "What's Love Got to Do With It?" One of our cochlear implant audiologists-- she's been programming these devices for 20 years-- stated that she had never seen an implant patient describe music in this manner.
Technology-- we have hearing aids that attempt to recreate the complex processing that goes on in the inner ear and the brain. The FDA has deregulated hearing aids, so now we have over-the-counter devices for mild hearing loss. These devices and open-source hearing aid platforms mean that technology companies can really create smart devices, such as hearing aids we have today that can email mom and let her know that her baby's hearing aid batteries are dead.
At RuffLabs, we're working on a new way of filtering speech that takes into account how we process the world. But the key point in this is we use our unique insight as a springboard to develop the same technology to have wider applications-- not only other neurological disabilities, but also for the normal hearing population.
Advocacy can truly be helped by technology. As it stands, all the technology that we currently have can't overcome the effects of cognitive load on executive functioning. You can think of how hard it might be for someone who lost their hearing later in life. But what about that four-year-old boy? What about me?
For those of us who grew up using degraded hearing to hear and communicate, our entire social construct is built by interactions gleaned from a sphere of hearing that extends only 5 feet from our ears. For normal hearing people, knowing what to say and when to say it is not conscious. It's instinctive. And it's visceral. It's guided by neural circuits built by years upon years of direct conversations and just overhearing people in passing.
Deaf children do not have access to the same neural circuits. So we need to teach deaf people the strategies that I and other successful deaf people use to advocate for ourselves with fluidity, wittiness, and grace. We're working on a national campaign so that both that four-year-old boy and his potential employer can truly know what deaf people are capable of.
You can't always fix a disability in the moment. But you can re-imagine technology and create interventions. All of what I described so far is hugely important. Sensorineural hearing loss affects 15% of Americans. If you are a child with moderate hearing loss, you are more likely to fail a grade. If you are deaf, you are likely to have only a fourth-grade reading level and less than half-- less than half-- of the income or the employment rate of normal hearing people.
So I leave you with this. I really enjoy piloting planes and photography. And I often find myself turning around in my seat to take a picture of the world behind me. Those pictures remind me of where I'm from. But they also frame what lies ahead.
I invite you to help RuffLabs provide the four-year-old boy with a new world. We are currently establishing partnerships and fundraising for our academic laboratory and commercial enterprise. It's my hope that, one day, that four-year-old boy will pilot his own plane into a world of less turbulence and where barriers have been removed for deaf and hard-of-hearing people. It's been an honor talking to you. Thank you very much.