Day 5 – Translating video

Welcome to Day 5 of the 12 Days of AI. Today, we will be looking at video translation AI.

What is HeyGen?

HeyGen is a generative AI video platform that lets users translate videos into different languages without needing any technical expertise. It combines translation from GPT-4, lip-syncing from Wav2Lip-2, and voice cloning from ElevenLabs. Before signing up, you may want to read HeyGen’s privacy policy.
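
For the curious, the description above amounts to a pipeline: machine-translate the transcript, synthesise new speech (ideally in a cloned voice), then lip-sync the video to the new audio. The short Python sketch below illustrates only the translate-then-re-voice steps using the public OpenAI and ElevenLabs APIs. It is not HeyGen’s actual implementation: the voice ID is a placeholder, you would need your own API keys, and the model names are just reasonable defaults.

# A minimal sketch, assuming you have OpenAI and ElevenLabs API keys and a
# voice ID (the voice ID below is a placeholder). This only covers the
# translate-and-re-voice steps; lip-syncing the video is a separate step.
import os
import requests

OPENAI_KEY = os.environ["OPENAI_API_KEY"]
ELEVEN_KEY = os.environ["ELEVENLABS_API_KEY"]

def translate(text, target_language="French"):
    # Ask GPT-4 to translate the transcript and return only the translation.
    resp = requests.post(
        "https://api.openai.com/v1/chat/completions",
        headers={"Authorization": f"Bearer {OPENAI_KEY}"},
        json={
            "model": "gpt-4",
            "messages": [
                {"role": "system",
                 "content": f"Translate the user's text into {target_language}. Return only the translation."},
                {"role": "user", "content": text},
            ],
        },
        timeout=60,
    )
    resp.raise_for_status()
    return resp.json()["choices"][0]["message"]["content"]

def synthesise(text, voice_id="YOUR_VOICE_ID"):
    # Turn the translated text into speech with ElevenLabs' multilingual model.
    resp = requests.post(
        f"https://api.elevenlabs.io/v1/text-to-speech/{voice_id}",
        headers={"xi-api-key": ELEVEN_KEY},
        json={"text": text, "model_id": "eleven_multilingual_v2"},
        timeout=120,
    )
    resp.raise_for_status()
    return resp.content  # MP3 audio bytes

if __name__ == "__main__":
    audio = synthesise(translate("Welcome to the 12 Days of AI."))
    with open("welcome_fr.mp3", "wb") as f:
        f.write(audio)

In a full tool like HeyGen, the original audio is also transcribed and the video is lip-synced to the new speech (the Wav2Lip step), which is what makes the speaker appear to be talking in the new language.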

How do I access it?

Create an account with HeyGen.

Here is an example of the 12DoAI welcome video that has been translated into French using HeyGen. What do you notice about this version? If you’re a French speaker, does it sound correct?

Daily Task

Create a video translation with HeyGen in one of these two ways:

  • Upload a video that you already have into HeyGen and ask it to translate it into a different language.
  • Create a video in HeyGen using one of its avatars and then translate it into a different language.

Note: If you’re uploading a video you already have, please ensure it is not copyrighted material and that you have the permission of anyone else who appears in the recording.

Technical support

HeyGen provides support and help articles on how to create its videos. You may find some of these useful as you work through the task.

Alternatives to HeyGen:

  • Rask translates your own voice into 130+ other languages, or you can use its voice-clone tool in 28 languages
  • ElevenLabs creates “realistic” AI voices using text-to-speech

Optional reading:

Join the discussion:

Share the link to the HeyGen video you created.

  • How did you get on with the task – did you find it useful?
  • How do you think you could use HeyGen in your teaching or your own professional practice?
  • What implications does this type of AI have for language teaching?
  • Will we even need translators in the future?

Leave a reply in the comments or tweet your comment using the hashtag #12DoAI.

21 Comments on “Day 5 – Translating video”

  1. Just watched the HeyGen video example and that French voice does not match the presenter; the voice-over is the equivalent of English RP.
    Looking forward to testing this AI app later.

  2. As I have done Days 4 and 5 in one go, I found HeyGen less easy to navigate. The translation function, a feature that is free in HeyGen but not in HourOne, is quite useful even though there are always limitations to machine translation. As with HourOne, I also created a video to introduce Academic Support. I found the avatar of myself amusing and odd, and I don’t know whether I would use it before personally coming to accept it.

  3. I uploaded a video with my own voice-over and translated it into Dutch (my native language). Not only was the translation very accurate; quite freakishly, it was my own voice as it would sound in Dutch.

  4. I found HeyGen had a steeper learning curve, but I was impressed by the translated audio. I dressed my avatar in a Christmas jumper, played with many languages, and altered the speed and pitch of the audio a few times, and as a result used up all my free units, so I couldn’t create the video in the end! It looks like I’ll get more free units tomorrow.

  5. Generally HeyGen seemed to work well for the basic translations I tried, which could be very useful for sharing content with international students. I thought I’d have a bit of fun though by just applying a French-speaking voice to some English text, rather than getting it to translate. https://youtu.be/_Ajo5Wu0PMk
    I also tried ElevenLabs a few days ago after seeing a demo of it on LinkedIn. I tried creating a very simple text to speech track using a non-accented English voice. It started okay, but then I noticed some interesting changes! https://youtu.be/MKqWivb5N2k

  6. I was disappointed that I couldn’t receive the verification code from HeyGen to open the free account – despite numerous attempts – so I used Rask instead. Most of the usable video files I had were too large for the free version, and you only get 1 minute of the video as a trial. It took a couple of minutes to load the translated video as well. It was quite good and seemed to adjust the speakers’ voices to the gender and tone of the original language, which was interesting. However, it would be nice to see how this works over a longer video with more speakers who engage in rapid-fire interactions, to see how the speech synchronises or if there is any lag.

  7. I’m afraid I didn’t get on with this one at all: some issues trying to get the code to register (delays in it being sent) and then, once I did, trouble finding a video short enough that I had complete ownership over… I can definitely see from others’ examples the advantages of such a tool.

    I want to start thinking about what (apart from the proclaimed goals of the AI companies) we are being asked to do by AI companies and what they plan to do. We are being told to give them our likenesses, our voices, and our data, with no real long view of what this means for ourselves as people, for our jobs, and for the societies we have that require human interaction and ethics. Is AI being used to pick off the low-hanging fruit (certain functions and jobs) rather than being employed to tackle the bigger issues we have in our world (health disparities, disease, the climate crisis)? This all sounds a bit dramatic, but I think we also need to be critical not just of the tools but of the tech corporations behind them.

    • Hi Andrew, thank you for posting this comment. You raise some really important issues here, and I don’t think it is being dramatic. Most of these AI companies have privacy statements and some even talk about ethical considerations, but at the end of the day they are commercial companies with a bottom line of pursuing the profit motive – and I’m not so sure these things are compatible. Even if an AI company does the right thing in protecting our data, they are always going to be susceptible to takeovers and mergers, and then where does our data go? There are some worrying patterns emerging in the AI health sector – will AI in education go the same way? One solution is for universities and governments to start providing these AI tools based on more ethical data sets, but I haven’t really seen this emerging anywhere… yet.

  8. I did the task and here’s the video https://app.heygen.com/videos/f33cec906bdb4e4cbc759ec41b78b16d?subType=undefined.
    My confession is that I made the video but didn’t get to translate it, as I ran out of credit. This task was a chance for me to make a video using another platform and to develop the script using ChatGPT, thus developing my skills and knowledge in terms of Days 1 and 4!
    I thought this was a useful task. I developed the text in ChatGPT using prompts with different word counts. This alone helped me in terms of generating text and relating the amount of text to video length. Once I had explored HeyGen, making the video was OK. I’m not sure if I would buy credits to make videos in this way. I didn’t get to the translation stage – no credits – but I could see how it worked.
    This approach could be a useful support for the lecturers I work with who often have international students. The value of translating some material accurately and swiftly into a number of languages looks worth exploring.

  9. My language skills are very limited so it is hard for me to judge how good the translation is but I found the process easy though the free generation took a long time. Generally though, all our help guides are in English so I don’t currently have a use for this.

  10. I recorded a video of myself introducing myself and my pronouns, listing how I support students, and asking them to get in touch via email if they’d like help. All of that was in English, and I asked HeyGen to translate it into my native language, Greek.

    I am so impressed by HeyGen’s skills. The translation was quite accurate and I couldn’t fault it other than the occasional grammatical error. Two things I noticed, though: 1. My email and pronouns were not accurately translated, and 2. The context of what I was saying was slightly altered, as I don’t think the tool really understood specific tasks, such as referencing resources. Also, the voice generated was uncanny. It was close enough to my voice, but it wasn’t mine!

    This is a tool I could use to support students with one-to-one tutorials etc., but I’m not sure that I’d like to use it yet, or that I’d trust it to create a contextually accurate translation. Overall, I don’t feel that translation is just about the direct translation of words, and I don’t think HeyGen has it figured out yet.

  11. I recorded a short video about our course, had it translated to Finnish and sent it to a Finnish colleague. This is what she replied: “Wow! It was 100 times better than what I expected! The beginning was just a bit strange since it just started by saying “Sinä” which means “You!”, and no one would start by saying “Hi You” or anything like that in Finnish. There were just a few words missing here and there, but it was amazingly good!”

  12. I’m a linguist by training and have been fascinated by machine translation ever since it first emerged. It doesn’t surprise me that some of these are quite good. By the same token, a machine translation, however it is presented, will always contain errors unless the message is extremely simple. Any message that contains a private joke, an acronym from your place of work, a person’s name/surname, or an obscure reference will never be translated accurately.
    In a limited way, this can be good. But when companies start using things like this to promote their work or products, then it could be disastrous. There are some classic examples of machine translations gone wrong!

  13. I really enjoyed playing around on HeyGen and discovering the possibilities of creating videos which I could then translate into a variety of different languages. I thought some parts were a little bit clunky to use; however, as soon as I spent a little more time on it, things became more apparent and easier to navigate. I asked the software to translate a few template videos into Hindi and it did a pretty decent job. The flow of the speaker was really impressive. This is something you can definitely have a lot of fun experimenting with, finding new ways of using it too.

    • Thanks for sharing your experience translating into Hindi. It’s interesting to hear that the flow was well translated – this can be a hard thing to capture. Language is about so much more than the actual words we’re saying.
