MIL 101 | AI voice cloning
For more news, visit:
►https://www.ptvnews.ph/
Subscribe to our DailyMotion Channel:
►http://www.dailymotion.com/peoples-television-incorporated
Subscribe to our YouTube channel:
►http://www.youtube.com/ptvphilippines
Like our Facebook pages:
►PTV: http://facebook.com/PTVph
►Rise and Shine Pilipinas: https://www.facebook.com/riseandshinepilipinas
Follow us on Twitter:
►http://twitter.com/PTVph
Follow us on Instagram:
►https://www.instagram.com/ptvph
Watch our livestream on:
►http://ptvnews.ph/livestream/
►https://www.dailymotion.com/PTVPhilippines
Watch our News Programs every Monday to Friday:
Rise and Shine Pilipinas - 6:00 - 7:00 am | 7:30 - 8:00 am
Balitang Pambansa - 7:00 - 7:30 am | 12:00 - 12:30 pm | 6:00 - 6:30 pm | 9:30 - 10:00 pm
PTV Sports - 8:00 - 9:00 am
Bagong Pilipinas Ngayon - 12:30 - 1:00 pm
Sentro Balita - 1:00 pm - 2:00 pm
Ulat Bayan - 6:30 pm - 7:00 pm
PTV News Tonight - 10:00 pm - 10:30 pm
Saturday & Sunday:
►Sentro Balita Weekend - 1:30 - 2:00 pm
►Ulat Bayan Weekend - 6:15 pm - 7:00 pm
Transcript
00:00 This is not Morgan Freeman. This is not real.
00:04 Have you watched a video or an ad where the voice sounds like it comes from a famous celebrity?
00:12 But if you listen closely, you'll notice there's something different about the voice.
00:18 You're cooking s***, Leslie. Get out of my kitchen.
00:21 Have you ever thought, wait, maybe he didn't really say that?
00:27 Well, be careful, because it's probably AI voice cloning or voice phishing.
00:33 With the rapid development of technology, apps that can imitate anyone's voice have spread,
00:40 especially the voices of famous personalities and artists.
00:44 Some people find it amusing to use them.
00:47 But it becomes a real danger when the voices generated by these apps are used for deception
00:53 or for spreading false or fake information.
00:56 What are the risks of using an AI voice generator or voice cloning?
01:01 And how do we know whether a video or audio clip is legitimate and not made with these apps?
01:08 Let's find out here on MIL 101.
01:16 AI voice cloning apps are being used left and right.
01:21 Some use them to make a song cover sound like the voice of their favorite artist.
01:28 Others use them in videos promoting their products.
01:31 Or, if not, simply to experiment.
01:34 But there are also many dangers behind these seemingly harmless app features.
01:40 So, what is AI voice cloning?
01:44 It is a technology that can copy or imitate a person's voice using artificial intelligence, or AI.
01:52 It is useful for customer service and other applications.
01:56 But it is dangerous when used the wrong way, such as for deception or scams.
02:02 What are the possible ways it can be misused?
02:06 First, deception and scams.
02:09 Voices cloned with these apps can be used to impersonate relatives or officials
02:16 to get money or personal information from other people.
02:21 For example, someone sounding like your relative calls you and asks for help.
02:26 But the call is fake, and you have already become a victim.
02:29 This falls under what we call voice phishing.
02:34 Second, stealing personal information.
02:38 As mentioned earlier, cloned voices can be used to obtain personal and sensitive information
02:45 that can then be used against businesses or individuals,
02:48 or to take money from your financial accounts.
02:54 Third, spreading fake news.
02:58 Fake voices cloned from artists or other public personalities
03:03 can be used to spread false information or propaganda.
03:08 Sometimes they are used to make it appear that an artist or personality endorses a product they never actually endorsed.
03:16 We cannot say outright that these products are untrustworthy.
03:20 But there is a chance that the product is not safe.
03:25 What should you do if your voice is cloned?
03:29 First, report it immediately.
03:32 If you're a victim of AI voice cloning, report it right away to the authorities,
03:37 such as the NBI or the Cybercrime Division.
03:42 Inform your friends and family.
03:46 Tell them to be careful with the calls they receive.
03:50 Strengthen your privacy settings and be careful about answering calls.
03:56 Make sure your social media accounts are secure so that your information and voice are not easy to obtain.
04:03 Also, don't answer calls carelessly, especially from numbers you don't recognize.
04:09 The question is, how can we avoid falling victim to these schemes?
04:14 We need to be extra careful.
04:16 Don't immediately believe everything you see and hear online.
04:20 Also, be careful about the sites you visit.
04:23 Don't just click on links or open websites carelessly.
04:27 Research and review the clips you come across.
04:29 Find out whether the information is true.
04:33 And always check whether the voices in an audio clip are fake.
04:39 New technologies like AI voice cloning are amazing.
04:44 But they can also bring danger if used the wrong way.
04:49 Don't just listen.
04:51 We need to be observant and always protect ourselves against these kinds of fraud.
04:58 That's all we have for now on MIL 101.