Tatum Robotics: Hands-On Innovation for the DeafBlind Community

In the modern world, we have an abundance of technology that helps us with our communication, information gathering, and entertainment needs. But most of this is inaccessible for DeafBlind individuals whose primary language is tactile sign. Tatum Robotics is advancing accessibility by developing a robot hand that can communicate with DeafBlind people through tactile sign and allow them to access the internet.

Introduction

Welcome to Third Angle. Today we're taking the hand of a robot that's changing lives.

We use a combination of our senses to communicate, above all sight and hearing. But for someone who is DeafBlind, relying on those two senses isn’t an option. In this episode, we meet Tatum Robotics, a Boston-based startup that is changing the lives of the DeafBlind community. They’ve developed the first collaborative, cloud-based robotic hand that can be a lifeline to those who cannot hear or see. This community relies on tactile sign, which ordinarily requires a human signer to be present for any communication to take place. But Tatum’s tabletop robotic hand can translate digital speech or text into tactile sign language without a signer needing to be physically present.

About Tatum Robotics

Samantha Johnson is the founder of Tatum Robotics, based at a robotics hub in Seaport, Boston. The team of five includes a couple of co-ops and some full-time people, and about once or twice a week a number of DeafBlind people come in to do testing and validation. “Usually, when we go to testing, we essentially set up sitting kitty-corner, with the robot in the middle. I try to describe it and let them feel it. That’s always the first step,” says Samantha. “They run their hands down the front, feel the buttons, understand what’s in front of them.”

How does the process work?

“I tend to start by doing sample sentences like ‘The cat walks through the door,’ so they can get an idea of the signing speed, the grammar patterns, things like that,” says Samantha. One DeafBlind woman Samantha was working with reacted emotionally to the phrase “smelly cat”. The woman told her how she was born deaf and blind in one eye, but had recently lost vision in her other eye. So now she’s completely deaf and completely blind, learning tactile signing. “I did ‘Smelly Cat,’” said Samantha, “and she basically started to cry. Friends was her favorite show growing up; she could hear it and she could read the captions. It was so exciting for her that she could get that moment back. We started doing Friends quotes and doing all those things together. It’s an experience we have frequently: these people are so deprived of entertainment and of these technologies that moments like this really show how much of an impact this could have on them going forward.”

What the hardware needs to do

The hardest part, and the most exciting part, is that all of these things need to fit together within a predefined profile. We know that the hand itself needs to be the size and shape of a human hand, which limits our hardware but also informs our software development, in terms of how we’re building out our trajectories and what needs to be integrated. It’s a fun group, because the interplay of hardware and software gives us a very interdisciplinary team. We have remote software people helping develop the linguistics algorithms and the client for the robot itself, and then here in person we have all of our hardware people developing the anthropomorphic robotics.

How it all started

What’s been fun for me is that I started this as my master’s thesis. I started because I was volunteering in the DeafBlind community during COVID, when social distancing pretty much prevented DeafBlind people from accessing communication. I know how to sign, but as I built on this project, the scope just kept increasing. I started by building a hand, then realized there was this whole linguistics element and brought on Nicole shortly after for that. Then I realized, “Oh, we need to give them a way to interact back,” so we brought in this computer vision aspect. It’s been a fun project, continuing to grow as we see these needs come out of the testing we’re doing with DeafBlind people.

About the technology

The hand features a tendon-driven system with 18 degrees of freedom, giving it the dexterity to achieve the hand shapes of American Sign Language. Starting with five degrees of freedom, the team added more only where they were needed for different hand shapes, to keep the design as lean as possible. “We’re not having any extra degrees of freedom, we’re not paying for additional motors; we’re really making sure that this is optimized specifically for signing,” said Samantha. “It’s also minimized for grip force. Most robotics these days have very limited degrees of freedom but a really high grip force to pick things up; this is almost the opposite. We have a lot of degrees of freedom and minimal grip force, so that it can’t injure those DeafBlind people. And so each finger has tendons routing through it to allow it to bend in different patterns.” Each tendon routes to a pulley, so that spinning a motor produces those finger motions. It’s difficult technology, and there’s a small market.
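To make the tendon-and-pulley mechanics concrete, here is a minimal sketch of how a tendon-driven finger might be commanded, assuming a simple pulley model in which the tendon excursion is the sum of each joint’s angle multiplied by its pulley radius, and a motor spool winds up that excursion. All radii, joint counts, and hand-shape targets below are illustrative assumptions, not Tatum Robotics’ actual parameters.

```python
# A minimal sketch of tendon-driven finger actuation, assuming a simple
# pulley model: tendon excursion = sum(joint angle x joint pulley radius),
# taken up by a motor spool. All values and handshape targets here are
# illustrative assumptions, not Tatum Robotics' actual parameters.
import math

JOINT_PULLEY_RADIUS_M = 0.004  # tendon wrap radius at each finger joint
MOTOR_SPOOL_RADIUS_M = 0.006   # take-up spool radius on the motor shaft

# Hypothetical flexion targets (radians) for the index finger's three
# joints (MCP, PIP, DIP) in two ASL fingerspelling handshapes.
HANDSHAPES = {
    "A": [1.4, 1.5, 1.2],  # fist-like: finger fully curled
    "B": [0.0, 0.0, 0.0],  # flat hand: finger straight
}

def tendon_excursion_m(joint_angles_rad):
    """Tendon length change needed to reach the given joint angles."""
    return sum(a * JOINT_PULLEY_RADIUS_M for a in joint_angles_rad)

def motor_angle_rad(joint_angles_rad):
    """Motor rotation that winds up the required tendon length."""
    return tendon_excursion_m(joint_angles_rad) / MOTOR_SPOOL_RADIUS_M

for letter, angles in HANDSHAPES.items():
    print(f"{letter}: rotate motor {math.degrees(motor_angle_rad(angles)):.0f} deg")
```

Under this model, more curl means more tendon excursion and thus more motor rotation, which is why the team can trade motor count against the set of hand shapes they need to reach.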

The ultimate goal

There are millions of DeafBlind people worldwide, but it’s always difficult to really bring assistive technology forward, says Samantha. “There have been previous projects developing fingerspelling hands or robotics for the deaf or DeafBlind community, mainly in academia, and nothing’s ever gone through commercialization. A lot of the work that we do here is really making sure that we bridge that gap. When I was doing this as my master’s thesis, I was about to graduate, and the DeafBlind people we were working with through the Deafblind Contact Centre here in Boston were really encouraging me. They said: ‘You have to start a company with this, you need to commercialize it, because you haven’t helped us yet.’ Having this be a couple-year thesis project doesn’t benefit anybody. So it’s about really making sure that this makes it into their homes and takes it to that next level, ensuring that it’s actually helping the community they’ve worked so hard for.”

Accessible design

The device’s interface is designed specifically for DeafBlind people, informed by how they struggle with appliances like microwaves. Modern microwaves are flat on the outside, so it’s hard to differentiate where the buttons are. Here, each button has a different tactile feel and texture, so people can tell what each button does. Each button opens a different application, such as stories and news. The news app is the one that inspired the whole project. “So really making sure that if there are important news updates, upcoming election information, they can access that,” said Samantha. “The weather app was actually really interesting for DeafBlind people, because they leave the house not knowing what the weather is outside. It lets them know if it’s raining out, what temperature to expect, how the temperature will change throughout the day, things like that.” The last app is for websites, so DeafBlind people can go on the internet.

How does the technology work?

The hand is held from behind, with the DeafBlind person able to feel right along the back of it with the tips of their fingers. As it signs, they can feel it bending, and they know what those different forms mean. “If we click one of the application buttons, it signs the word ‘story’, because I was going into the story app. If I enter the application, it will then ask me for a tag: what filters you’re looking for. It’s very intuitive. All the apps work the exact same way, so it’s the same walkthrough process for news. It’s really just getting them comfortable with how to interact with the system,” explained Samantha.
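As a rough illustration of that walkthrough, here is a minimal sketch of the interaction flow as a simple shared pattern: a button press enters an app, the hand signs the app’s name, prompts for a tag, then signs back the matching content. The `sign()` stub and all app and tag names are hypothetical stand-ins, not Tatum’s actual client code.

```python
# A minimal sketch of the app walkthrough described above. Every app
# follows the same flow: sign the app name, ask for a tag/filter, then
# sign back the matching content. sign() is a hypothetical stand-in for
# driving the robot hand; the apps and tags here are illustrative only.
from dataclasses import dataclass

def sign(text: str) -> None:
    """Placeholder for sending a word or phrase to the hand to sign."""
    print(f"[hand signs] {text}")

@dataclass
class App:
    name: str
    content: dict  # tag -> list of items to sign back

    def run(self, chosen_tag: str) -> None:
        sign(self.name)      # confirm which app was entered
        sign("which tag?")   # prompt the user for a filter
        for item in self.content.get(chosen_tag, ["nothing found"]):
            sign(item)

# Because all apps share one interaction pattern, stories and news
# differ only in their content, which keeps the walkthrough consistent.
story_app = App("story", {"animals": ["the cat walks through the door"]})
news_app = App("news", {"elections": ["polls open tomorrow"]})

story_app.run("animals")
news_app.run("elections")
```

The design choice the quote describes, every app working the exact same way, is what lets a user learn the interaction once and then transfer it to stories, news, weather, or websites.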

The challenges of DeafBlind communication

In this space right now, DeafBlind communication options basically come down to Braille tablets and human interpreters. Braille tablets are a way to access a written or spoken language; Braille is a medium of English, for example, here in the US, and there are a lot of technologies in that space. However, only about 10% of DeafBlind people know Braille, either because they went blind later in life or because, if they were born DeafBlind, they often also have cognitive disabilities. As a result, the large majority rely on human interpreters, which can be very expensive and have long lead times. They can’t have an interpreter with them all day, every day. And there are DeafBlind people who live alone or in homes who need access to what the weather is outside and what news is happening; especially in times of pandemic, the news is essential to move forward. There isn’t a lot of technology in this space if you can’t access Braille, which is why this is really filling that gap right now: allowing people to use their primary conversational language, tactile sign, in an independent way.

Collaboration is key

As well as testing with the local DeafBlind community, Tatum collaborated with the Canadian National Institute for the Blind and Sense International India to understand its use for other sign languages. “There are about 300 sign languages,” said Samantha. “The goal of what we’re making is a flexible platform, where the hand has as many degrees of freedom as it needs to make robust hand shapes, and not just for American Sign Language. That’s something we looked at very early: looking at, for example, French signs and making sure that this can still make those hand shapes. Or, for example, Indian Sign Language is a two-handed sign language. How do DeafBlind people receive that? And how would they use our technology? We asked these questions early on to create a platform that integrates easily with other languages.”

Creating content

Nicole Rich is the lead linguist at Tatum Robotics. “My favorite days are definitely the ones when our DeafBlind people come in, getting to actually watch them interact with the technology and interacting with them personally. Getting to know them on a personal level has been so rewarding,” she said. “I write the content for what the robot is going to be spelling. That’s a tricky spot, because I have to decide what people want to learn about, what they don’t already know, what they can learn, and what’s repetitive; it’s hard for one person to do all that. But I did a lot of research with the DeafBlind community here in Boston specifically: what do they already know? What do they learn in school, and what do they not? What do they know about celebrities that we take for granted? What do we see online that they never see, because they have to be really intentional about what information they gather?”

Misconceptions

“I think the biggest misunderstanding is that people think that tactile sign is simple. People often think that ASL is just manually coded English. That couldn’t be further from the truth. The fact that I’ve been working on ASL translation for two years proves, I think, that it isn’t; if it were simple, it’d be done by now. And even assumptions about visual sign don’t carry over to tactile sign in the same way,” explains Nicole. “You really have to have personal experience with the people who use the language as a primary language to understand it at all. A lot of people think it’s as simple as: if you know English, you can just move your hands around and get to the sign. Or that all signs look like the concept they represent in English, like the sign for ‘tree’ looking like what an actual tree looks like. That’s true for some signs, but others don’t work like that at all. One of the biggest misunderstandings we encounter with non-signers is that they assume it must be simple.”

Overcoming barriers

“One of the constant hurdles we face is showing the importance of bringing communication to the DeafBlind. When people think of the DeafBlind community, they might think there are three to five of them, but there are millions of these people who are completely isolated. And I think often we have to show that there’s this need, and also that we’re solving the need correctly,” said Samantha. “Often, people think: ‘Can’t they use Morse code or something?’ Well, I don’t know Morse code, you don’t know Morse code, so most DeafBlind people definitely don’t know Morse code either. We have to show that it’s important to support this community in a way that is accessible to them, and that we build this technology so they can use it easily. They shouldn’t need hundreds of hours of training to learn a language they don’t know. We’re really hoping to preserve the identity of the DeafBlind community and its signing culture, but in a way that’s just accessible for them.”

Tatum Robotics uses Onshape

Tatum Robotics follows an agile product development approach, which has included taking part in PTC’s Onshape startup program. The team uses Onshape for secure design, sharing, and collaboration, accelerating the development process while keeping control over access permissions. Tatum can share a design with someone and withdraw access just as quickly, which enables faster and more secure design feedback from their whole community.

Want to try Onshape for free? Here’s how

Our Onshape cloud-native architecture is the perfect solution for hardware startups looking to be agile and innovate quickly. Founders and entrepreneurs can take advantage of our Onshape startup programme, where we provide qualifying startups with free access to Onshape Professional licences and enhanced technical support. To apply, go to onshape.pro/startup and see if your startup is eligible for Onshape Professional licences for a year for free.

Credits

Thanks to Jon Hirschtick for his insight, Samantha Johnson for telling us all about Tatum’s mission, and Curt Nickisch for taking us behind the scenes at Mass Robotics.

Please rate, review and subscribe to our bi-weekly Third Angle episodes wherever you listen to your podcasts and follow PTC on LinkedIn and X for future episodes.

This is an 18Sixty production for PTC. Executive producer is Jacqui Cook. Sound design and editing by Ollie Guillou. Location recording by Curt Nickisch. And music by Rowan Bishop.

Episode guests

Samantha Johnson, Tatum Robotics Founder & CEO

More About Tatum Robotics

Jon Hirschtick, VP & Chief Evangelist at PTC

More About Onshape