Illustration by Nicholas Konrad  / The New Yorker; Source photographs from Getty

When John Lee Clark was five years old, in 1983, he entered a small Deaf program within a public school near his home in Eden Prairie, Minnesota. Clark was a dreamy kid who dressed in tucked-in button-downs and pressed slacks. He came from a large Deaf family—his father and brother are DeafBlind, his mother and sister are Deaf and sighted—and the family had communicated in American Sign Language (or A.S.L.) for generations. On Clark’s first day of kindergarten, his mother, worried, followed his school bus in her car. When she surprised him at school to ask if he was O.K., Clark said that he was fine but that the bus driver had forgotten how to speak. His mother laughed and reminded him that the driver didn’t know how to speak: she was hearing! “This is a common story among Deaf families,” Clark told me recently. “The gradual dawning that all those mutes could actually talk with one another, but in a very different way.”

In third grade, Clark began a bilingual Deaf program. Instruction was in A.S.L., but students were grouped on the basis of their ability to read English, a second language that Clark accessed only in print. “My literacy was abysmal,” he said. He still has a workbook from that time, in which he answered questions—“What is your favorite sport?” “Who are the members of your family?”—with drawings instead of in English. But he was gifted in A.S.L., and teachers would ask him for help with tricky words. He sometimes pranked them by inventing ostentatiously elaborate versions. The word “heaven” is difficult for A.S.L. learners, involving a precise looping of the hands; Clark added several gratuitous loops.

At twelve, Clark began attending a residential Deaf school, many of whose students came from Deaf families. But, around this time, he began to go blind. Hundreds of thousands of people in the U.S. have some combined hearing and vision loss, but most are older adults and have spent the bulk of their lives hearing and sighted. A much smaller group—about ten thousand, according to some estimates—become DeafBlind earlier in life; a leading genetic cause is Usher syndrome. Clark, his father, and his brother have Usher, which can cause a person to be born deaf and to gradually go blind. At fourteen, Clark started to lose track of A.S.L. conversations. “I was this boy who always said, ‘Say again?,’ who might collide into you,” Clark told me. “So pathetic.” He began reading in Braille, which his father had encouraged him to learn as a child, and started walking with a white cane.

In high school, Clark stopped trying to follow A.S.L. visually and began using tactile reception, feeling words with his hands. This helped, but miscommunication was common. A.S.L. is a fundamentally visual language. The dominant-hand gestures for the words “stamp” and “fun,” for instance, look very similar, except that “stamp” begins near the mouth, whereas “fun” starts at the nose. Yes-or-no questions are signified with raised eyebrows, and sentences can be negated with a shake of the head. When Clark would reply in A.S.L., he’d have no idea how the person was responding, or whether she was still paying attention at all; he said that it was like “talking to a wall.” He attended Gallaudet, a Deaf university in Washington, D.C., with his future partner, Adrean, a sighted-Deaf artist. “It was really when I got married that I noticed more serious problems,” he told me. He would come home from the store without the items that Adrean had requested, and misunderstood the timing of their appointments: “It’d blow up on me, how that information in ASL had failed to register.”

On September 11, 2001, Clark went to a literature class at the University of Minnesota, where he was working toward his bachelor’s degree. When he arrived, his interpreters made the hand shape for “airplane,” and ran it into a raised finger twice. Clark interpreted this as “an airplane hitting two poles,” and assumed that he was hearing about a local-news story—perhaps about a hobbyist in a prop plane hitting telephone wires. It wasn’t until he got home that he learned how much he must have missed: the tear-streaked faces, the TV footage running on loop. (I heard remarkably similar stories about 9/11 and other cataclysmic news events from several DeafBlind people.)

In 2013, Clark attended a training, in Minneapolis, in Protactile, a new movement that was encouraging DeafBlind people to reject the stigma, in American culture, against touch, which often leaves them cut off from the world around them. According to Protactile’s principles, rather than waiting for an interpreter to tell her about the apples available at the grocery store, a DeafBlind person should plunge her hands into the produce bins. If a sighted friend pulls out her phone in the middle of a conversation to check a weather alert, she should bring her DeafBlind interlocutor’s hand to her pocket as well, to understand where the weather forecast is coming from.

Protactile includes a set of practices to make tactile communication more legible. One of its creators, a DeafBlind woman named Jelica Nuccio, showed Clark how it worked. They sat facing each other, their legs touching, and Nuccio rested Clark’s hand on her knee, explaining that, as she spoke, he should tap to indicate that he understood, like nodding—a practice called back-channelling. Nuccio articulated words into Clark’s hand, but also directly onto his arms, back, chest, and lower thighs. In A.S.L., pronouns are articulated as points in space; you might designate Minneapolis as a spot in the air near your left shoulder, and Seattle as a spot near your right, and then those gestures stand in for the cities. Nuccio showed Clark how to indicate them as points on the body instead: a two-fingered press on each shoulder.

“It didn’t feel like a lightning-bolt moment,” Clark told me. “It was all too natural.” But after the training he noticed changes in his household. He and Adrean began using a Protactile principle called co-presence: if she came into a room, she would brush him to let him know that she was there. Before, they’d sat around the table, and whoever sat next to Clark interpreted what the rest of the family said. Afterward, they began eating in informal clusters, allowing for tactile group conversations.

In the years since, Protactile has spread across the country. Today, most DeafBlind adults have heard of Protactile’s call to place touch at the center of their lives. Clark, who has become a leader in the movement, compared it to the Deaf Pride movement of the nineteen-eighties, when more Deaf people began speaking A.S.L. in public, insisting that hearing people gesture back. A few hundred people use Protactile’s communication practices daily—a very small group. Still, several linguists have come to believe that, among some of its frequent users, Protactile is developing into its own language, with words and grammatical structures that have diverged from those of A.S.L. “I am totally convinced that this is no tweak of A.S.L.,” Diane Brentari, one of the premier linguists of sign language, who teaches at the University of Chicago, told me. “This is a new language.” Clark believes that Protactile has the potential to upend centuries of DeafBlind isolation. “It’s an exciting time to be DeafBlind,” he has written. “The single most important development in DeafBlind history is in full swing.”

This past December, I met Clark in an old stone building that houses some of the University of Chicago’s linguistics labs. Clark is tall, with a youthful face. He lives with his partner and their three children, who are hearing and sighted, in St. Paul. He writes poems that are published regularly in Poetry magazine; he won a National Magazine Award, in 2020, for a piece on tactile art, and has both a poetry collection and a book of essays forthcoming from Norton. When we met, I was struck by the similarity between his presence in person and the way he comes across over e-mail; in both, he is affectionately didactic. I had assumed that he would speak through my interpreter, but he insisted on addressing me directly while she watched and translated, so that I could experience the feel of Protactile.

I have a condition called retinitis pigmentosa, the visual component of Usher syndrome, which is causing me to slowly go blind. (My hearing is unaffected.) As Clark and I faced each other, our white canes leaning in a corner of the room, he kneaded my shoulders, and instantly found my baseball cap, which I use as a sort of cane for my face—it saves me from slamming my head into open cabinet doors. “Lots of people with Usher syndrome and R.P. will use these kinds of caps,” Clark said. He playfully pulled it off my head and pressed it to my chest. “If you want to up your game in Protactile,” he said, “then what you’re going to need to do is get rid of that cap and get your hands busy.”

The title of Clark’s new poetry collection, “How to Communicate,” captures what has always been the central problem for DeafBlind people. DeafBlind children living in linguistic isolation can spontaneously develop home signs that their immediate families understand. Laura Bridgman, who lost her sight and hearing to scarlet fever in eighteen-thirties New Hampshire, had signs for “father” (her hand drawn across her cheeks, describing his whiskers) and “spinning wheel” (a rotating hand). But, without a wider community, home signs can’t grow into full languages. In 1837, the educator Samuel Gridley Howe recruited Bridgman to attend what would later be called Perkins, the first American school for the blind, in Massachusetts. Howe had previously visited Hartford’s American Asylum for the Deaf and Dumb, an incubator for what would soon emerge as American Sign Language, but dismissed signing as little more than pantomime. Instead, he and others at Perkins taught Bridgman to read and write English, using raised letters, which she quickly mastered. He relentlessly publicized this achievement, and Bridgman became an international celebrity.

Forty years later, Helen Keller’s mother read Charles Dickens’s account of meeting with Bridgman, and reached out to Perkins, which sent a recent graduate, Annie Sullivan, to educate Keller. Sullivan finger-spelled English words into Keller’s hands, hoping that she would slowly pick up the language, the way infants pick up spoken language. The story of Keller’s breakthrough, as her teacher placed her hand under a stream of water while finger-spelling W-A-T-E-R into the other, is a canonical scene in American history. There’s a bronze statue of Keller at the water pump in the U.S. Capitol, and the moment was immortalized in the 1962 film “The Miracle Worker.” The “miracle” is Sullivan’s feat of bringing language to a DeafBlind person—someone understood to be, as Howe described Bridgman, consigned to the “darkness and silence of the tomb.”

Clark has no patience for this sacred image of Keller’s DeafBlind epiphany. “There was already a word for water,” he said. Keller had developed dozens of home signs with her family before Sullivan arrived, for words such as “ice cream” (pretending to turn the crank of the freezer, then shivering) and “bread” (“I would imitate the acts of cutting the slices and buttering them,” Keller wrote). “What Helen learned to do was to perform a stunt,” Clark has written. “Annie was attempting the equivalent of forcing Helen Keller to utter a pentasyllabic word . . . whenever she wanted water. If you’re thirsty, say ‘ideology’ or ‘specification’ or ‘liability.’ ”

In Keller’s lifetime, other methods of DeafBlind communication arose. In public, Keller used the Tadoma method, in which she placed a thumb on the throat of her interlocutor and the rest of her fingers across that person’s lips and jaw—a kind of tactile lip-reading. There were several variations on an “alphabet glove,” printed with English letters so that a sighted person could tap out a message, but communicating letter by letter was cumbersome and slow. Today, some DeafBlind people communicate orally, many using hearing aids or cochlear implants, which usually offer only partial access to speech. Tactile sign language is also used, but issues with intelligibility remain. A 1995 study found that DeafBlind people understand as little as sixty per cent of a sentence conveyed through tactile sign language. Various systems have been devised to improve tactile communication. In the nineteen-nineties, Trine Næss, a DeafBlind Norwegian woman, standardized Haptics, a system of touch signals for common words: eleven for colors, eight for drinks. “For a time, some people were, like, ‘Do you support Haptics or P.T.?’ ” Clark told me. “But you really cannot compare the two—it’s not a Pepsi-vs.-Coke situation, but Pepsi vs. Cadillac.”

In 2005, Jelica Nuccio took over as the first DeafBlind director of Seattle’s DeafBlind Service Center (or D.B.S.C.), which offered social services to about a hundred people in the region. Thirty years earlier, a nonprofit called the Seattle Lighthouse for the Blind had established a program that employed DeafBlind people to do industrial work, and, in the decades since, the city had become a kind of DeafBlind mecca. Nuccio is fifty-seven, with long, dark hair and a bright laugh. “I was ready to move to Seattle and start a new chapter,” she said.

Nuccio had come to sign language late. As an adolescent, she attended a school in St. Louis that taught the oralist method, drilling Deaf students in gruelling exercises to learn how to read lips and produce speech. “The nuns said if you signed, you were stupid,” she said. “If you point at something in sign, you look like an animal.” She learned A.S.L. only in college, at the Rochester Institute of Technology, after her Deaf classmates mocked her speech by using a derogatory word for “oralism” in A.S.L., two horizontal forearms coming together like giant lips flapping: blah blah blah. In 1996, as Nuccio was becoming increasingly blind, she went to the Helen Keller National Center, a training facility on Long Island. But she felt that it was run like a prison. In the cafeteria, “they immediately started shovelling food at me,” she said. “They weren’t even communicating with me. I was in a feeding trough. I was, like, ‘I have degrees, people!’ ” (A spokesperson for the center told me that, for fifty-five years, “thousands of DeafBlind individuals have benefitted from HKNC’s programs” and added that its staff is “knowledgeable, helpful and kind.”)

Nuccio was disappointed by what she found at the D.B.S.C. Sighted employees and interpreters dominated life there. Blindness is enormously stigmatized in Deaf culture, and many of the DeafBlind people at the D.B.S.C., whom Nuccio called the “tunnel-vision people,” clung to their dwindling eyesight, continuing to use visual A.S.L. even as it grew difficult. The “tactile people” ate in a separate group at lunch and were treated with pity and condescension. DeafBlind people often use interpreters to interact with the hearing-sighted world, but, in Seattle, they used them in DeafBlind groups, too: each client would speak to her own interpreter, who would repeat the message to other interpreters, who would then relay it to their clients. “I didn’t understand why they would call Seattle ‘the DeafBlind mecca’ when it was run that way,” Nuccio said. “Yes, there are a lot of DeafBlind people here. But so what? Why is that a mecca?”

Nuccio hired aj granda, who is DeafBlind and had worked on and off for the D.B.S.C. The pair recalled that the interpreter program at a nearby community college had posted a sign on the wall that said “ASL Zone”: when you entered the room, you agreed to abide by the rules of Deaf space by “turning off” your voice. They decided to make the D.B.S.C. a DeafBlind-friendly zone, modelled on the same principle. But what were the rules of DeafBlind space?

The first rule that they established came to be called “air space is dead space.” DeafBlind people at the D.B.S.C. were continually left out of A.S.L. conversations among sighted people. Now, whether you were DeafBlind or not, all communication needed to happen in the realm of touch. Granda told me that their conversations with Nuccio had become so adapted to tactile reception that sighted friends could no longer follow them. To make tactile words even more expressive, the pair gradually expanded the canvas of touch to include the back, arms, lower thighs, and upper chest. Back-channelling emerged to capture what A.S.L. speakers communicate through facial expressions—a limp hand laid on the knee could signify exhaustion, and a tense grip might indicate terror. “Everything was kind of clunky, and everyone was awkward with how we were using each other’s bodies,” Nuccio said. “ASL was in the mix, and it was a mess. It was a great, messy start.”

Nuccio and granda called their method Protactile, and, within a few years, they were holding trainings. But, for the most part, the sight-reliant people were set in their ways. They often arrived, found a chair, and sat down, waiting for their interpreters. “I said, ‘If you need to know where anything is, you can ask a DeafBlind person,’ ” Nuccio said. She would take their hands, and together they’d touch the drinks and the snacks. Nuccio and granda encountered tremendous resistance among employees and clients at the D.B.S.C. “DeafBlind people are oppressed by Deaf people in the Deaf community,” granda said. “People who are oppressed tend to oppress others.” Nuccio ended up firing much of her staff, including many of her friends. But she and granda believed that they were developing a new political framework to achieve DeafBlind autonomy.

The pair hadn’t set out to alter the linguistics of A.S.L., but, as DeafBlind people in Seattle took Protactile’s methods home, words began to change in their hands. Granda said, “They realized ASL was no longer their language.” The A.S.L. word “yes,” for instance, is a fist bobbing in space, like a nodding head. But by touch it felt wrong. “We knew what it meant because we knew the ASL word, but it was weird,” Clark told me. “A head rubbing itself against a wall? It did not make natural sense in contact space.” The A.S.L. word “no,” a two-fingered pinch, was similarly off-putting. “It felt like an ostrich trying to pluck some hair off your head,” Clark said. “We never had a meeting to invent any new words. Life went on, and we had to say yes and no a thousand times every day!” In time, the community replaced these A.S.L. words with words that felt more tactilely intuitive: “yes” became an affirmative patting, and “no” felt like a hand swiftly erasing a message from a whiteboard. “Those P.T. words are so simple, duh-worthy, so elegant. And they have absolutely no relation to ASL or the ASL words ‘yes’ and ‘no,’ ” Clark said. “Not a shred in common.”

The A.S.L. word “vehicle” is made with a hand turned on its side so that the thumb is like a driver piloting a craft through the air. By touch, all you can feel is a pinky grazing your leg. Over time, the Protactile word became a flat palm driving across the lower thigh. And speakers developed ways of elaborating on these new words. “Instead of describing the size of a vehicle in terms of how big it looks,” Nuccio and granda wrote, the tactile word can describe a vehicle “in terms of how heavy it is, or how much friction it generates on the road”—the features more relevant to touch. To signify a large vehicle, the speaker presses a flat palm down hard on the receiver’s leg. For a compact car, she’d use a lighter touch.

Some worried that Protactile’s intense tactile immersion could feel inappropriate, including to DeafBlind survivors of sexual or domestic violence, an objection that its creators have had to grapple with. Granda has taught Protactile to numerous DeafBlind people whose prior traumas made them resistant to touch. “We care about survivors and want to make sure that those people feel safe,” granda said. But they argued that anyone can feel comfortable and safe in Protactile. “There is a natural form of appropriate consent built into the language that, with constant conversations, actually can bring about healing,” they said.

By the mid-twenty-tens, Protactile had evolved from a set of communication practices into a national movement. Granda and Nuccio made Braille bumper stickers, released videos, and travelled the country giving workshops and hosting “P.T. happy hours,” where locals could learn the basics. Nuccio and granda eventually drifted apart, and granda has spent time working at the Seattle Lighthouse and teaching Protactile at Seabeck, an annual DeafBlind retreat near the city. In 2014, Nuccio established an organization dedicated to Protactile training called Tactile Communications. Around the same time, Clark joined the Protactile movement, and has led trainings that have reached hundreds of people.

This past December, a half-dozen of Protactile’s most fluent speakers met up at the University of Chicago. They had come at the invitation of Terra Edwards, a linguistic anthropologist who is studying Protactile with her colleague Brentari, the sign-language linguist. By visual standards, the lab had a drab, provisional air: it was empty aside from a haphazard scattering of metal folding chairs and a table pushed against the wall. But, in DeafBlind space, this was a comfortable arrangement, ideal for generating ad-hoc clusters of tactile conversations, with no armrests or conference tables to separate people’s bodies. Nearly all of the DeafBlind people were in stocking feet. “With shoes, everything feels the same,” Hayley Broadway, who had flown in from Austin, said. “I don’t feel the ground. I can’t feel if it’s dirty or if it’s rough.” Earlier that year, Broadway had married her husband, who is also DeafBlind, in a Protactile ceremony. They walked down the aisle in an intertwined cluster of friends. For the exchange of vows, the officiant spoke in Protactile to both Broadway and her husband, forming a three-way conversation. Everyone at the wedding was barefoot, and the couple served sushi. “We just wanted finger food,” she said, “something you can eat with one hand while you could stay in communication with the other.”

Clark walked into the room wearing a burgundy shirt. He had a co-navigator with him, who joined him in interactions with the hearing-sighted world of airline attendants, cabdrivers, and cashiers. But the co-navigator trailed behind as Clark strode into the room, reaching out to explore his environment. He found Nuccio, spoke his Protactile name onto her back—two quick downward strokes—and they hugged. My interpreter put her hands on their backs, signalling her presence. This was, I realized, what it meant to be communicating in contact space: I was sitting a few feet away, but my observation was covert; it was only when I laid my hands on the group that I was actually present with them.

Clark now speaks to his partner and children in Protactile. Jaz Herbers, who retired after fifteen years working in I.T. because of his changing vision, saw early videos explaining Protactile in 2013. “I was, like, ‘That’s it!’ ” he told me. “That’s the answer to my life now.” Today, he leads Protactile trainings around the country. Rhonda Voight-Campbell, a forty-nine-year-old instructor at the Rochester Institute of Technology, attended a residential training in Protactile after she became increasingly blind, and felt the full possibilities of conversation return. “I ate at the dinner table with several DeafBlind peers in the dark,” she said. “Hands and feet patting, groping, and stomping.” Oscar Chacon, who works part time in Edwards’s lab, told me that it annoys him when hearing-sighted people, upon learning about Protactile, say they find it “inspiring.” “We’re human beings,” he said, “using language the way humans use language.”

Since the Protactile conversations that I observed all passed in a flutter of movements that I didn’t understand, Clark took a moment to demonstrate a word—“oppression.” He took two hands and pressed them down onto mine. I tried to repeat it back: Is this “oppression”? “P.T. isn’t a code where two hands pressing down equals oppression. It could also be something like this,” he said, and dragged his hand slowly down my arm. “This person is oppressed,” he said, and gripped my chest, crushing something invisible there. He cycled through a range of other movements. My interpreter became uncharacteristically overwhelmed. “I can’t think of enough English words to equal what he’s giving you,” she said. At first, I interpreted Clark’s demonstration as suggesting that Protactile lacked precision. But each variety of oppression that Clark had shown me—which my interpreter scrambled to translate as “repression,” “suppression,” and so on—intuitively connoted “oppression”: they were all forms of dragging, weighting, gripping. It’s just that they had no direct correspondence with English.

Unlike with spoken language, which can be transcribed or taped, or visual sign language, which can be filmed, there is still no way to make a tactile recording. This means that the only way to communicate in Protactile is in person. At one point, graduate students demonstrated new devices that could send taps and presses from a distance—a kind of primitive haptic FaceTime. But the DeafBlind group was unimpressed by the technology, which could transmit only slow, single taps on a limited patch of the body, and had none of the rich array of squeezes and presses that Protactile deploys. Today, many DeafBlind people stay in touch using a Braille display, which has dots that pop up and down to render text from a computer or phone. (I’m learning to use one, too.) Navigating cluttered Web pages can be nightmarish in Braille, but the DeafBlind world thrives in the plain-text realm of e-mail Listservs. In lieu of “LOL,” Protactile e-mailers type “LOY,” for “Laughing on You,” invoking the Protactile mode of laughter, a spidery tickle. Clark teaches college-level seminars entirely by e-mail. He once wrote, “Before PT came along, I had my most fun, found the most joy, experienced life the most on listservs.”

Clark has considered applying for teaching positions at universities, but told me that he wishes that they hired “environments”—groups of DeafBlind colleagues following the rules of contact space—rather than individuals. In Chicago, I noticed that the DeafBlind people carried their Protactile conversation with them like a miniature weather system as they made their way through the campus. They remained in contact with one another and explored their environment, touching walls, trees, and the raised letters on signs, sharing their impressions. At lunch, they occupied a large communal table at a café on campus. Clark felt his way to one side and ended up with his hands on the back of a hearing-sighted woman at another table. She tapped back on the communal table, trying to signal where he should go, and then continued her conversation with her lunch partner. When Clark made it back to his seat, he announced, “I found two mutes!”

In 2006, just as the Protactile movement was beginning, Terra Edwards, then a graduate student, was at Seabeck, the annual retreat near Seattle. Outside, she saw a DeafBlind person forcefully correcting her interpreter. “This was highly abnormal,” Edwards said. “I could tell that was a shift in the authority structure.” But Edwards was also interested in the correction itself. The interpreter had pointed at something in the air, and the DeafBlind person, with some degree of “angst and irritation,” told her to instead draw a diagram on her palm. “People had pretty strong opinions about whether or not you were doing it right,” Edwards said. “To me, that suggested that there was some kind of system at play.”

Edwards (and, eventually, Brentari) spent the following years filming some of Protactile’s most fluent speakers telling stories and describing objects, and found an increasingly conventionalized system, with an emerging lexicon of its own, organized by new phonological rules. When Edwards shared these rules with DeafBlind people, they knew exactly what she meant, even if they’d never had a reason to spell it out, just as English speakers are able to follow complex grammatical rules without having any idea what an indefinite clause is. By 2014, Edwards believed that, among those who had immersed themselves in Protactile, the practice was evolving into its own language. Other linguists I spoke to agreed. Molly Flaherty, a developmental psychologist and sign-language linguist at Davidson College, told me, “How amazing is it that language is something that’s flexible enough to work in yet another modality?”

In the nineteen-fifties, the linguist Noam Chomsky identified what he came to call the “poverty of the stimulus,” the idea that language learners receive vanishingly few clues for how linguistic systems work. Ann Senghas, a cognitive scientist at Barnard, told me, “Someone gives you a pie, and you have to figure out how to make it.” Chomsky concluded that our brains are endowed from birth with aspects of grammar, allowing us to reproduce language without formal instruction. More recent theories hold that we are simply incredibly good at unconscious statistical analysis of linguistic patterns. Whatever the case, the human brain is a superb language-decoding machine.

In the absence of a shared language, people will create new ones. In the seventeenth century, French colonizers brought enslaved Africans to what would eventually be called Haiti. These Africans brought their languages—Igbo, Fongbe, Bantu, and many others—with them. As they communicated, their languages converged, drawing from the varieties of French that were spoken on the island, and incorporating elements of West African grammars. In the course of the eighteenth century, a new language, today known as Haitian Creole, or Kreyòl, emerged. Michel DeGraff, a linguist at M.I.T., told me, of early speakers of Haitian Creole, “They’re not sitting down and taking language classes. They’re learning and innovating on the go.”

Starting in the seventies, when several new schools for young Deaf children were established in Nicaragua, students arrived with their own sets of home signs. But, within a few years, their signs began to evolve. Senghas told me that the process was “like going through hundreds of years of language change in just a decade.” The word for “rice” began as a pinching motion, showing the grain’s size, followed by a flicking gesture that mimicked the process of removing stones from the rice before cooking it, and another demonstrating how it’s eaten. In the eighties, the word simplified to just the flicking motion, its most distinctive element. “It’s not even the most salient thing about rice,” Senghas said. But “a system survives because it’s learnable.”

Edwards and Brentari believe that Protactile is in the very early stages of such an evolution. They found that much of Protactile’s “archival lexicon” comes from A.S.L., but the rules governing how these words are articulated have changed significantly. Edwards and Brentari have studied gestures that make up Protactile words—the equivalent of phonological units like “puh,” “buh,” “shuh”—and catalogued them: you can trace, grip-wiggle, slap, and so on. There are rules for how these movements can be combined. A single-finger tap followed by a two-finger tap never happens in Protactile, as it can be difficult to distinguish the two, whereas a single-finger tap could easily be followed by a two-finger press. These rules emerged intuitively, without conscious codification. But when they’re broken it doesn’t feel right, just as if an English speaker tried to combine a “P” and a “B” sound without a vowel separating them.

The linguists also observed new words being created. In A.S.L., “king” is made by taking the manual alphabet’s “K” and sliding it down one’s chest, like a royal sash. But the “K” was hard to recognize by touch. A new version emerged during a Protactile training in 2018. Everyone noticed that Jaz Herbers, the former I.T. specialist, had particular preferences. For instance, he positioned himself closest to the air-conditioner on a hot day, and bought a king-size bag of M&M’s. Other DeafBlind people began jokingly giving him a tactile crown. By the end of the training, he’d received his Protactile name: a one-fingered circle followed by a downward movement that evoked the crown and its weight. A few years later, Edwards noticed a group of DeafBlind people talking about getting a fast-food lunch, using the A.S.L. word “burger” followed by Herbers’s P.T. name. “Soon, any time anyone said ‘king,’ that’s the word they were using,” Edwards said. If Protactile continues to spread, there’s a chance that future speakers will trace the etymology of “king” back to Herbers, much as English speakers today owe the name for a lunch of meat between slices of bread to John Montagu, the fourth Earl of Sandwich.

Edwards and Brentari found that Protactile was doing things that other languages couldn’t. Protactile is full of a kind of tactile onomatopoeia, in which a hand resembles the feel of the thing it’s describing. In what the linguists call “proprioceptive constructions,” the speaker recruits the receiver’s body to complete the word, say, by turning her hand into a tree (five fingers as branches) or a lollipop (fist as candy). At one point, I asked Nuccio where she was from, and she told me to make my hand into a fist, which represented the globe. “You and I are in America, over here,” she said, touching my first knuckle. “And this is the ocean.” She traced a finger to my wrist to find the country where she was born, Croatia. She accomplished all of this in a series of movements that Edwards said followed consistent grammatical rules. At another point, Nuccio described how difficult her life had been when she’d worked as a technician in a genetics lab as she went blind. She had me point my finger up, and told me that it was now the flame of the Bunsen burner that she’d used in her lab. She demonstrated how to adjust the flame on one of my knuckles, and how delicate the apparatus was. I was astonished by the precision of this tactile illustration, which felt, in the moment, more vivid than any verbal description could have.

Some linguists remain skeptical that Protactile has yet emerged as an independent language. “I think it’s fascinating what’s happening,” Wendy Sandler, a sign-language linguist at the University of Haifa, told me. “But I have a lot of questions about how it’s going to develop.” She said that many of the functions of A.S.L.—for example, the way that parts of sentences are separated spatially on the body—still hadn’t made it into Protactile’s system. “Most P.T. users in the U.S. already know A.S.L. very well and can mentally ‘fill in the gaps,’ ” Sandler said. “But these gaps do not yet seem to be filled in by P.T. itself.” Another linguist told me that she believed that Protactile is more like a dialect of A.S.L., similar to how there are many dialects of American English. There is no single test for whether a form of communication has emerged as a language, and the debate is ongoing. Senghas compared A.S.L.’s influence on Protactile to the presence of French or Latin in English. “It’s got its seeds in A.S.L. in many ways, but it’s a different language,” she said. “If there’s no English, there’s no Morse code. Whereas, with Protactile, if there’s no A.S.L., there’s still P.T.”

On their last night in Chicago, the Protactile group gathered at a local’s house for a party. I was one of a handful of hearing people there, and one of only a few who didn’t know Protactile. My interpreter wanted to visit with a friend, and as soon as she left the room I felt like Clark’s kindergarten bus driver: I’d forgotten how to speak. In the kitchen, the host was blasting bass-heavy eighties hits, and I felt the vibrations in my chest, which is how many Deaf people listen to music. Elsewhere, it was quieter. I sat on the couch in a room packed with a dozen people all engaged in a silent but lively conversation that I couldn’t understand.

The party was supposed to feature a tactile game called a P.T. Hat Slam. The game never materialized, but the host had cleared out a bedroom and laid out his extensive hat collection for the guests to admire. Clark offered to give me a tour, without my interpreter’s help. As he guided my hands over the hats, I thought that I detected the presence of language: notice how this hat can fold out; watch out for the spikes on the gladiator helmet. I had no way of knowing how much was Protactile and how much was just basic gestural communication; that line is still thinner in Protactile than in established languages. Later, Clark criticized my performance during his tactile hat tour. “You didn’t know how to really feel!” he told me. “Maybe you were looking at them with your eyes. You didn’t go beyond my hand to touch, to explore. That’s one skill that has to be taught.”

Protactile continues to grow. Nuccio and Clark recently received a two-million-dollar grant to expand a Protactile interpreter-training program. There are weeklong retreats on cruise ships and at Florida resorts (“Breezin’ P.T. Weekend”), and experiments in Protactile theatre: in 2018, a Gallaudet professor staged a Protactile version of “Romeo and Juliet.” A handful of Europeans have studied Protactile and taken its techniques back to France and the Netherlands. But some linguists wonder whether Protactile will ever fully develop into its own language. Protactile lacks a dense, in-person DeafBlind community, like the residential Deaf schools that incubated the development of A.S.L. I was surprised to learn that several active members of the Protactile movement live with Deaf spouses and children who resist using Protactile with them.

Most DeafBlind people in the U.S. who have encountered Protactile understand it as a broadly “pro-tactile” philosophy, but haven’t adopted it as a new language. George Stern, a writer in West Texas, told me, “In a lot of my activities, whether it’s ballet dancing or practicing salsa, or cooking, I incorporate touch.” When Stern had a hearing-blind girlfriend, he taught her a series of tactile signals—fingers walking across the back, for example—to coördinate passing each other in their narrow kitchen. But, to Stern, who usually uses hearing aids and communicates orally, the linguistic component of Protactile still feels rarefied and out of reach. “I’m glad that there are people developing P.T. as a language where they are,” he said. “But how is it going to function where I am? I don’t live in a DeafBlind community. I live in a primarily hearing-sighted world, in an American culture that’s generally averse to touch.” Chris Woodfill, the associate executive director of the Helen Keller National Center, told me that, though he focussed on learning tactile modes of communication as his own vision declined, many of his clients communicate orally, using hearing aids and other assistive listening devices. An increasing number have received cochlear implants as children—a practice that remains controversial in the Deaf community—and never learned visual sign language. As a result, he noted, the center doesn’t push tactile communication on its clients: “We lay out the menu, and it’s à la carte.”

Language development is most productive when it’s passed through a new generation, whose infant learners refashion a language as they learn it. But most people with Usher syndrome don’t become blind until early adulthood, so few would be children when they learned Protactile. Many young children who are DeafBlind have other disabilities, such as CHARGE syndrome or congenital rubella syndrome, which can cause cognitive delays that affect communication. And many have hearing-sighted parents who don’t know A.S.L. themselves, let alone Protactile. “They understand that the tactile world is important to a DeafBlind kid,” Deanna Gagne, a researcher who is studying language acquisition in DeafBlind children, told me. “They just don’t know how to implement it.” During the pandemic, Edwards, Brentari, and Gagne received an emergency grant from the National Science Foundation to introduce Protactile to DeafBlind children isolated in their homes. Nuccio played with one DeafBlind boy for about five months. At first, he was reluctant to have his hands touched, but, over time, communication improved: Nuccio ran a toy car up and down his arm, and then used the P.T. word “car” and made the same motion, trying to connect the word with the object. Later, the boy made the same word on her arm.

The richest Protactile environments are still the ones inhabited by the movement’s leaders. At Nuccio’s training center, which she calls P.T. House, visual A.S.L. is forbidden and her dogs respond only to tactile commands. And, in their apartment in St. Paul, Clark’s family has turned the archival A.S.L. vocabulary “way, way down,” to encourage invention. The result has been an efflorescence of new words. During his bedtime ritual with his children, Clark has forced himself to discard the A.S.L. phrases he grew up with, and to come up with Protactile ones instead. To say good night, he places his hands on a child’s shoulders, and brings them together in the center of the child’s chest. “I thought I was ‘gesturing,’ but somehow still conveying the sentiments,” he said. “My A.S.L. mind hadn’t recognized those as actual words.” Recently, he told a Protactile Theory seminar that he conducts over e-mail about these new words. He acknowledged that they may forever remain home signs. But they might seep out into the community, as Clark converses with the hundreds of people he touches every year. “At any rate,” he concluded, “if you want to get rid of A.S.L. words for ‘good night,’ ‘I love you,’ and ‘sweet dreams,’ I have drafts for you!”

Source: DeafBlind Communities May Be Creating a New Language of Touch | The New Yorker