Easy Way Uses Arduino to Translate Subtitles on the Fly

[Embedded YouTube video]

Using the Video Experimenter shield for Arduino, a group in Brazil developed a way to translate live closed captioning into a number of different languages. Called Easy Way Subtitles, the project uses the Video Experimenter shield to extract the closed-captioning text from the broadcast signal and hands it off to Google Translate. Viewers can then follow a live translation in their native language by pointing their web browser at Easy Way’s site. “The number of foreign students and professionals coming to Brazil increases every year,” according to the captions in the video above. “But most of them don’t speak Portuguese, so they can’t understand what’s going on on local TV.” Easy Way solves this inventively by combining older technology (closed captioning) with newer, web-based translation services. [via Twitter]
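To give a rough sense of what the Arduino side of a system like this has to do, here is a minimal, hypothetical sketch. It is not the Easy Way project’s actual firmware: captureCaptionPair() stands in for whatever routine the Video Experimenter library really provides for grabbing the two raw caption bytes off line 21 of the incoming video, and the decoding rules (odd parity in the high bit, control-code pairs starting at 0x10, printable characters from 0x20 up) are a simplified reading of the EIA-608 closed-captioning standard.

```cpp
// Illustrative only -- not the Easy Way project's actual firmware.
// captureCaptionPair() is a placeholder for the shield library's real
// routine that recovers the two raw EIA-608 bytes from line 21 of the
// video signal. Swap in the real call before using this.
bool captureCaptionPair(uint8_t *hi, uint8_t *lo) {
  (void)hi;
  (void)lo;
  return false;  // placeholder: no caption pair available
}

// Strip the odd-parity bit from a raw caption byte.
// Returns -1 on a parity error, otherwise the 7-bit value.
int stripParity(uint8_t raw) {
  uint8_t ones = 0;
  for (uint8_t i = 0; i < 8; i++) {
    if (raw & (1 << i)) ones++;
  }
  return (ones & 1) ? (raw & 0x7F) : -1;  // EIA-608 bytes use odd parity
}

void setup() {
  Serial.begin(9600);  // a host machine reads this stream and handles translation
}

void loop() {
  uint8_t rawHi, rawLo;
  if (!captureCaptionPair(&rawHi, &rawLo)) return;

  int hi = stripParity(rawHi);
  int lo = stripParity(rawLo);
  if (hi < 0 || lo < 0) return;            // drop pairs that fail parity
  if (hi >= 0x10 && hi <= 0x1F) return;    // control/formatting pair, not text
  if (hi >= 0x20) Serial.write((char)hi);  // printable caption character
  if (lo >= 0x20) Serial.write((char)lo);
}
```

Whatever ends up on that serial stream would then, as the article describes, be handed to Google Translate and served to viewers’ browsers by the web side of the project.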

18 thoughts on “Easy Way Uses Arduino to Translate Subtitles on the Fly”

  1. J says:

    Interesting…

  2. trkemp says:

    This is a great idea. Having to look in two places (TV and iPad or computer) seems like it would take away from the comprehension of the material being watched. It would be cooler if it could add the translated CC back as either CC or by superimposing the text on the video directly. I also wonder if it’s fast enough to make sense within the context of the program.

    1. Alan S. Blue says:

      I was thinking: Combine this with XBMC on a Raspberry Pi. The Pi can ‘delay’ the video (via PVR functions) long enough for the translator to keep up. So, other than the 5-second-or-so ‘tape delay’, everything ends up on the same screen at basically the right time. (Although even regular CC is pretty bad at being -exactly- right, and it’s mostly downhill from there.)

  3. FM says:

    Most CC on live feeds is pretty terrible. They send the audio portion of the program through a speech-to-text program and hope it is close enough. Plus, the CC is normally delayed by about 10 seconds, so the text never matches up and gets cut off whenever they cut to commercials.

    If you have ever watched the nightly news with CC turned on, you will see the problem a hearing impaired person would have trying to follow the news. I guess the networks never get enough complaints to bother trying to fix the problem. I guess less than 1% of the population isn’t big enough to concern them, ADA notwithstanding.

    1. Matt Richardson says:

      I’m not sure where you are on the globe and what programs you watch, but I’m in New York and most of the local and national television I watch is captioned with very good quality. Pre-taped shows have excellent captions and live shows are transcribed by humans on the fly, so there’s some delay and it’s not always 100% accurate. I wasn’t aware that channels feed programs through speech-to-text. Is that conjecture, or do you know for certain that broadcasters do that?

    2. nootropic design (@nootropicdesign) says:

      FM, I have only seen delayed CC on live broadcasts (like the news). It is because a person is transcribing as fast as they can, but it’s a bit delayed and the quality is lower. Prerecorded programming has the CC synced up almost perfectly.

    3. markneu says:

      In my days in TV production (about 10 years ago) we fed the same script to teleprompter and cc at the same time for our nightly news. So you were getting what the anchor was supposed to be saying – which usually was what they were actually saying.
      Bigger markets would use variations of court stenography equipment since they allow a much quicker rate of transcription than a normal keyboard. You can generally tell if it is a live person doing the work because of the occasional correction of a word as they backspace and retype something.
      Computer captioning, such as on YouTube, is still in its baby-step days, providing just as many laughs as nuggets of useful information. Since I lost my hearing about 6 years ago I have been hoping the algorithms would advance at a quicker pace, but they are still not quite there yet.
      This project is cool. While it is still a case of GIGO as far as the captions go, maybe if more people see the iffy state of on-the-fly systems we will get more people working to improve them.

  4. dubiaku says:

    Very interesting device. But as a frequent user of Google Translate, I doubt that anything that comes through this system will be anything other than gibberish.

    1. Swim says:

      You’d be surprised… Google Translate is actually great for translating from Portuguese into English. Not sure how good it’d be for the other languages, since Translate always first translates from the original language into English and then from English into the target language.



Matt Richardson is a San Francisco-based creative technologist and Contributing Editor at MAKE. He’s the co-author of Getting Started with Raspberry Pi and the author of Getting Started with BeagleBone.

