We first saw deepfakes as humorous memes, then as celebrity likenesses selling tacky products. But the deepfake of the president last month calling on the armed forces to take action against a foreign country demonstrated AI-driven deepfakes’ potential for real danger.

Deepfakes have been defined as video or audio manipulated with the help of AI to create hoaxes. But they’re now also considered an advanced form of disinformation with often malicious intent.

Our podcast guest Jeffrey Dy is the government's undersecretary for Information and Communications Technology. He describes the government's rapid response to the deepfake of the President ordering the military to act. He also talks about hacking attacks on government systems that were traced to China, although not necessarily to its government.

He says that the country will need 300,000 cybersecurity professionals to handle the wave of deepfakes and other cyber threats on the horizon.

Transcript
00:00 Good day, Podmates! How is everyone?
00:03 Let me remind you that a long attention span is a gift for the smart.
00:07 Our guest today has a very difficult job.
00:11 It's Undersecretary Jeffrey Dy,
00:13 the Undersecretary at the Department of Information and Communications Technology
00:17 with a wide mandate.
00:19 But it includes cyber security
00:22 and protecting our government technology infrastructure from hacking,
00:26 which has already happened at an alarming pace.
00:29 On top of that, more recently, he has to grapple with deepfakes, a new danger,
00:34 the very convincing mimicking of voices and likenesses of public personalities,
00:39 including the President of the Philippines.
00:42 And just to disclose,
00:44 I myself have also been a victim of deepfakes along with other GMA News practitioners.
00:50 Good day to you, Undersecretary Jeffrey Dy.
00:54 Good day to you, Howie, and to all the listeners
00:57 and to all the viewers of your podcast.
01:00 You're right, there are a lot of GMA personalities and talents,
01:04 and also ABS-CBN talents.
01:06 Yeah, well, it's a global phenomenon now, Usec,
01:10 but we're small fry, our news personalities, compared to the President, of course.
01:16 So let's start off with the deepfake of the President.
01:18 This is just a couple of weeks ago.
01:21 His fake voice was calling on the armed forces to take action against a foreign country.
01:26 The platform has since deactivated this video.
01:29 Hopefully, it won't spread and it won't be watched anymore.
01:33 But of course, we know it will not be the last attempt.
01:37 Walk us back to that incident, Usec.
01:41 How did you find out about it?
01:43 And what was your reaction and the reaction of your department,
01:47 which is supposed to prevent that and address those kinds of issues?
01:51 We have tools that scour the internet for whatever is happening.
01:57 If somebody is calling to assassinate a very important personality, for example,
02:02 like our President, we actually have a monitoring system for that.
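To make the idea of such a monitoring system concrete, here is a minimal, purely illustrative Python sketch of keyword-based triage over public posts. The watchlist patterns, the post format, and the scan_posts helper are all hypothetical; the DICT's actual tooling, sources, and thresholds are not described in this interview.

import re

# Hypothetical watchlist of high-risk phrases (illustrative only).
WATCHLIST = [
    r"assassinat\w+",
    r"attack\s+the\s+(palace|president)",
    r"deepfake\s+of\s+the\s+president",
]

def flag_post(text: str) -> list[str]:
    # Return every watchlist pattern that a public post matches.
    return [p for p in WATCHLIST if re.search(p, text, re.IGNORECASE)]

def scan_posts(posts: list[dict]) -> list[dict]:
    # Keep only posts that matched something, so analysts can triage them.
    alerts = []
    for post in posts:
        hits = flag_post(post["text"])
        if hits:
            alerts.append({"url": post["url"], "matches": hits})
    return alerts

# Made-up example batch: only the first post should be flagged.
sample = [
    {"url": "https://example.com/p/1",
     "text": "There is a deepfake of the President ordering an attack"},
    {"url": "https://example.com/p/2",
     "text": "Lunch was great today"},
]
print(scan_posts(sample))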
02:07 So in this particular instance, we were in a group with the Presidential Communications Office,
02:13 NBI, and the intelligence apparatus.
02:16 And then on that weekend, we monitored a vlogger who was saying that he saw a video.
02:23 Sorry, he heard that our President is preparing the armed forces to retaliate
02:30 against our neighbor in the West Philippine Sea.
02:33 Now, we know for a fact, as senior government officials,
02:37 that any type of policy statement to that effect should at least have gone through a vetting process.
02:45 So we asked around and we realized that, no, it's fake.
02:50 It's not a 100% copy of the President's voice, and it's a bit dated,
02:58 but for those who only hear the President on the news, it really sounded like him.
03:02 So Usec, you heard about this deepfake of the President from a vlogger who believed it to be true.
03:09 He believed it.
03:11 Well, at least that's the story, right?
03:13 Because the next thing was, we preempted it from spreading.
03:20 So we asked YouTube and Facebook to stop its spread.
03:25 YouTube, in 24 hours, they killed the channel.
03:29 And every time that video resurfaces, they use AI also.
03:33 So it's AI fighting AI.
03:35 They use AI also to detect if that video resurfaces.
03:39 They don't delete it, but they stop it.
03:41 The setting becomes private so that the public can't see it.
03:46 It was not prevalent on Facebook, but we also called Meta and Meta said that they are also monitoring,
03:53 just in case the link appeared there.
03:55 In broad strokes, we were successful because in the first few days, many were looking for the video.
04:02 After it went viral on the news, they couldn't find it because it was a cat and mouse thing.
04:07 It would be released for a few minutes or even an hour,
04:11 then YouTube's AI would detect it and kill it.
04:15 They would restrict viewing by setting it to private.
04:18 So I think we were successful in making sure that it did not spread out.
04:22 Up to the point, of course, where we had already informed the public that this video existed,
04:29 so that if you do watch it, you are already forewarned that our dear President did not really say that.
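Usec Dy calls this AI fighting AI. Platform systems are proprietary, but as a rough illustration of the "recognize the same clip when it resurfaces" step, here is a minimal sketch using perceptual hashes of sampled video frames, via the third-party Pillow and ImageHash libraries. The distance threshold and match ratio are arbitrary illustrative values, not anything YouTube or Meta has disclosed.

from PIL import Image   # pip install Pillow
import imagehash        # pip install ImageHash

def fingerprint(frames: list[Image.Image]) -> list[imagehash.ImageHash]:
    # Perceptual hash of each sampled frame; near-duplicate frames
    # produce hashes with a small Hamming distance.
    return [imagehash.phash(frame) for frame in frames]

def looks_like_reupload(reference: list[imagehash.ImageHash],
                        candidate: list[imagehash.ImageHash],
                        max_distance: int = 8,
                        min_ratio: float = 0.6) -> bool:
    # Flag the candidate clip if most of its frames closely match
    # some frame of the known fake clip (thresholds are illustrative).
    if not candidate:
        return False
    matches = sum(1 for c in candidate
                  if any(c - r <= max_distance for r in reference))
    return matches / len(candidate) >= min_ratio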
04:34 Okay, well, you know, we may have just been lucky.
04:37 We dodged a bullet.
04:41 But who knows how others might have reacted if it went viral.
04:47 I mean, YouTube did not deactivate the video instantly.
04:52 They have their own investigations.
04:55 And of course, the news, the video had to reach you before you could react and you had to make this decision.
05:02 So there were at least a few minutes.
05:04 Maybe half a day at the most.
05:07 Yeah, okay. That's a pretty long time for something like that.
05:10 Kind of a declaration of war or even an order to attack, etc.
05:15 I mean, our deepfakes only show us news personalities selling pharmaceutical products online.
05:23 But this is a different story.
05:26 How worried were you when you saw it or did you think that we would be able to control this immediately?
05:32 Well, of course, the first thing that came to my mind is first we have to verify the news.
05:36 Did this really happen or not?
05:38 Even though I'm an undersecretary, there are a lot of things that happen in the halls of Malacanang that I may not be privy to.
05:44 After all, it might have happened, right?
05:46 The entire government is just one chat or phone call away.
05:49 We work together very seamlessly even at night.
05:52 So I immediately called the assistant secretary of the Presidential Communications Office, who readily denied it.
06:00 And then after that, we called the National Security Council and they said, "No, we don't think that's true."
06:05 So after we confirmed, then we put things in motion.
06:09 PCO immediately denied it using its official social media postings.
06:16 They already said, "No, don't believe this." And then we went public about it to inform the public.
06:22 But you're right, Howie.
06:23 As an important note, we take all of these things seriously.
06:27 It just so happens that this particular event had a far larger impact because it was the president.
06:34 It was the voice of the president.
06:36 And of course, the statement of the president is official.
06:40 As long as he said that, that is already a statement of policy.
06:43 So it's important that we prevent this from happening.
06:46 But let's make some things clear.
06:49 Any type of AI misinformation or disinformation, whether aimed at our friends in the media or even at our loved ones and relatives, is important.
06:59 That's all.
07:00 Because they are massive, and I think that will be your next point,
07:04 unfortunately, our resources can only focus on a few, which in this particular case was the Office of the President.
07:13 Of course, Usec, the flip side of this is, if the next time the president says something important, maybe no one will believe it.
07:25 Because it could be something very urgent that's real, but because we've experienced a deepfake, maybe it's like the boy who cried wolf.
07:35 That's one possible effect of these deepfakes.
07:38 That is correct, Howie.
07:39 This is where the importance of institutions comes in.
07:44 The institutions that hold transparency and the integrity of data and information to a very high standard.
07:52 And I am talking about proper journalism, like what you're doing.
07:57 Also, even the government journalists, the government media broadcasting networks.
08:02 The message to us Filipinos is always look at the proper sources.
08:08 For example, if you're talking about Howie Severino, then look at GMA7.
08:13 Right?
08:14 Or go to the proper URL, the proper site of his podcasts.
08:19 If you see it on Facebook, nobody should speak on behalf of Howie, except Howie himself.
08:27 So that's my message.
08:30 So if it is the Office of the President, then it is either RTVM or PTV4.
08:37 That's usually the source.
08:39 And of course, by extension, the Malacanang beat, which is ably covered by all major media networks.
08:49 You know it, we know these sources and we should focus there.
08:54 Our problem is that we focus too much on social media.
08:57 We were victims of the scrolling.
09:00 You know, you just saw your friend's post and before you knew it, you've been scrolling for two hours.
09:07 We should not be looking for the truth just by scrolling on the internet.
09:12 Okay, so that was the President's deepfake, and the same person or persons, or maybe an organization, is probably creating deepfakes of other people.
09:23 That is a crime, right?
09:26 Yeah, yeah.
09:28 So what is the liability of this person and is it possible to find out who is doing these things?
09:35 It is possible to find this out, but it would be very difficult.
09:38 So we're leaving that up to the law enforcement agencies such as the NBI, the PNP to be able to root them out.
09:46 We can determine who created the video.
09:49 The true attribution is different.
09:51 So the attribution is who posted it.
09:54 The true attribution is who is the original source.
09:58 That's the hardest part on the internet because it's copy, copy, copy.
10:02 So you won't see the original source.
10:05 But I will let the very able NBI and PNP handle that.
10:10 Now, with regard to your first question, the prosecutorial service of the Department of Justice has various tools which they can use.
10:19 So there could be an identity theft under the cybercrime law of 2012.
10:25 There could also be some other provisions in the revised penal code which they can use.
10:29 Usually, what the DOJ does is find the heaviest charge they can apply.
10:36 If it is found to be part of a larger conspiracy, then it could also be inciting to sedition or some other treasonous act.
10:50 I'm not saying that definitely; it will depend on the evidence.
10:56 And remember, we're talking about the office of the president.
11:00 Yes. Well, before we leave this particular incident, is there a suspect?
11:09 I mean, how far are we in finding this person, filing a case?
11:15 And just as a warning to others who might want to do the same thing.
11:21 I'm sorry we cannot disclose the full details of the investigation, but I believe the NBI and the PNP already have persons of interest which they are investigating.
11:31 So they already have leads. And hopefully, these are the real perpetrators.
11:36 It's a bit difficult to determine who among the many posters is the source.
11:42 Now, the good thing here is that, of course, you cannot create a YouTube account without giving out some form of information.
11:50 Some of that information might be fabricated, but still, the perpetrator made the effort of creating impersonated identities.
12:02 So at least we have leads in law enforcement.
12:05 Okay. There are a lot of scenarios about the elections. I'm sure you've been giving some thought, Usec.
12:13 What are the scenarios we see for disinformation and deepfakes that will try to influence the elections?
12:23 I'll just give you a hypothetical scary thought.
12:26 Let's say you have a candidate saying he or she is withdrawing from the race. And let's say this happens at a very ungodly hour.
12:37 Let's say midnight before the polls open.
12:45 So you have very little time to actually verify the information.
12:50 And you know how gossip travels, right? It's even faster than the news.
12:54 Actually, that's how disinformation works. Disinformation doesn't work in the long term.
13:00 It doesn't. Because in the long term, there will be analysts who will come out, who have the mental fortitude to actually say, "That's wrong."
13:09 But it works if it's fast. When we're talking about hours before the actual polls, that's how it works.
13:17 The other scenario is a deepfake of a candidate saying something against another candidate that backfires on him.
13:24 Very rude, very wrong statements that even question the ethical and moral principles of the other candidate.
13:35 But he never said it. The deepfake just made it look like he was attacking the other candidate, but he didn't do it.
13:42 Like what you said earlier, Howie: it's not true, but people believed it.
13:47 And not only that, even if he denies it, the other would say, "Of course, you're denying it because it backfired."
13:55 You're creating a scenario now, a lose-lose situation for that particular candidate.
14:02 So we're thinking in advance about this. That's why in social media, we're talking to Facebook, we're talking to YouTube.
14:09 And according to them, by May, they will be launching an AI labeling scheme.
14:15 So if the video or audio is AI generated, they said they will also use AI to fight AI.
14:21 So there will be a label there that says "Probably AI generated" or "Probably edited."
14:26 So are we expecting it anytime soon? Or did they give a date?
14:30 I think they said it's in the third to fourth week of May.
14:33 Facebook will then follow YouTube.
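The platforms have not published how their labels will work. As a toy illustration of the general idea, attaching a disclosure notice once a detector score or a creator's own declaration calls for it, here is a hedged sketch; the Upload fields, the detector score, and the thresholds are hypothetical, not YouTube's or Meta's actual scheme.

from dataclasses import dataclass
from typing import Optional

@dataclass
class Upload:
    video_id: str
    ai_score: float          # hypothetical detector output in [0, 1]
    creator_disclosed: bool  # uploader self-declared AI-generated content

def provenance_label(u: Upload) -> Optional[str]:
    # Choose a user-facing label; thresholds are illustrative only.
    if u.creator_disclosed:
        return "Made with AI (disclosed by creator)"
    if u.ai_score >= 0.9:
        return "Probably AI generated"
    if u.ai_score >= 0.6:
        return "Possibly edited or synthetic"
    return None  # no label shown

print(provenance_label(Upload("vid123", 0.93, False)))  # Probably AI generated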
14:37 Okay. Well, so far, we've been discussing deepfake videos, the likeness of people, the face of the president or other personalities with their voice saying something they didn't say.
14:53 I've read that the next generation will be audio-only deepfakes: robocalls calling our mobiles, sounding like somebody we know, sounding like somebody in the news, etc.
15:10 So this is a different kind of creature, Usec.
15:14 That's correct. Now, we're already monitoring this abroad because if it happens abroad, it's not far off that it will happen in the Philippines.
15:24 So now we're talking about cybercrime, not cybersecurity.
15:27 Crime is actually an enterprise. Usually, there's a tool that they sell.
15:34 And when it becomes easy for everyone to do that, then it becomes a common crime.
15:40 So now, it's not yet common, but it already happened in the United States. It already happened in Europe.
15:45 And just recently, it happened in Hong Kong, where in a conference call of C-level executives, they thought they were talking to their CEO.
15:56 And the CEO ordered a transfer of money in the millions of dollars.
16:02 They did it.
16:03 In that particular conference call, the CEO was not there. It was a robocall.
16:09 So yes, this already happened in other areas.
16:12 And so we are expecting that this will also happen in the Philippines.
16:17 Those tools, if criminals get hold of them, they can use them in the Philippines.
16:22 But so far, Usec, you haven't heard anything about this happening in the Philippines.
16:29 Nothing yet, but this is very concerning.
16:32 You can see it on Facebook, you can see the AI-generated jokes where our past presidents are talking with each other.
16:41 They were even singing.
16:42 Now, notice how easy it was to do.
16:45 They can do it like once every two days.
16:47 Every other day, they do that.
16:49 Plus, it can be Filipino.
16:53 Taglish.
16:54 Conversational Taglish is something AI can do.
16:57 This is what's scary, because it means your culture can be assimilated and the output becomes conversational.
17:04 Therefore, it's harder to detect.
17:07 Okay.
17:08 There are congressmen and other political personalities that are advocating now for legislation against deepfakes.
17:19 Do we need a law?
17:20 You said that this is a crime, mimicking the president and having him say false things.
17:28 This is obviously a crime.
17:30 Do we need legislation here?
17:32 Yes, we do.
17:34 Let's put this recent deepfake event into perspective.
17:40 What happened here is that we were looking for a case to file.
17:43 We were looking at the Revised Penal Code, identity theft, etc.,
17:48 because, technically speaking, no law says that this act by itself is a crime.
17:55 So, if we can criminalize that particular act by itself, you don't need to prove intent.
18:02 You don't need to prove a conspiracy, for example, a conspiracy to perform seditious acts.
18:11 The very act of creating an AI copy of somebody's voice should be prohibited, because even assuming it's not made with evil intent,
18:24 if someone gets hold of it, they can splice it and make replicas or some other product out of that copied voice and copied video.
18:37 Of course, I'm just saying that this debate should be at the forefront of the discussions with legislators vis-à-vis our right to freedom of expression.
18:51 This is my personal opinion, I'm not for censorship.
18:54 But the public should be informed.
18:56 They should know that this is AI.
18:59 So, you understand what I'm trying to say.
19:02 That's the only thing that needs to be balanced. And to be honest, I don't envy the discourse that is going to happen in Congress.
19:08 In fact, there are already a lot of bills that are already there.
19:11 Our congressmen predicted this as far back as two years ago.
19:15 There's a bill to expand the powers of MTRCB to censor even streaming media and social media against misinformation.
19:25 But now, the ante has been upped.
19:30 Because now, it's not just misinformation, it's really copying.
19:33 They take your voice and your look to make it look like you, but it's not really you.
19:41 Well, okay. Well, that's kind of reassuring that there are conversations like this in our political sphere before it gets out of hand.
19:51 You said that there's serious damage happening even in the corporate world, in other countries, other places.
19:58 But in our country, it seems like there are no such incidents.
20:01 So, there's a way of creating more guardrails before that happens.
20:06 In the meantime, I know that you, Usec, have also been preoccupied with the hacking of some government websites.
20:12 So, that's already a current problem.
20:15 In previous interviews, you've said that some of these hacking attacks, you know for certain that they came from China.
20:23 Not necessarily the Chinese government, but from Chinese hackers' networks in China.
20:30 Can you share any updates about that?
20:34 Maybe some more investigations or information about these hacking attacks on government tech infrastructure?
20:43 Yes, obviously. So, last February, we detected spyware that was stealing the access credentials of our government mail exchange systems.
20:56 This actually happened at the same time that Google and Microsoft were being attacked in the United States.
21:02 It just so happens that our government mail exchange system is housed in Google.
21:07 So, I think they used the same attack to encroach into our government mail systems.
21:12 We immediately responded to it. It was like a cat and mouse game.
21:18 Cyber security is always like that.
21:21 But we managed to get it under control.
21:24 And you're right, we call it TTP in the hacking world, Tactics, Techniques, and Procedures.
21:29 The tactics, techniques, and procedures employed seem to belong to advanced persistent threat groups associated with China.
21:38 Again, you're right, we're not saying it's the Chinese government, but they are well-known Chinese APTs, which even China, in its own databases, acknowledges.
21:48 These look like criminal groups operating within that framework.
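Attribution by TTPs boils down to comparing the techniques seen in an incident against profiles of known groups; overlap suggests, but never proves, who was behind it. Here is a minimal sketch with made-up group names and MITRE ATT&CK-style technique IDs, not actual threat intelligence about this incident.

# Hypothetical profiles: technique IDs seen in past campaigns of each group.
KNOWN_GROUPS = {
    "APT-Example-1": {"T1566.001", "T1078", "T1071.001", "T1114"},
    "APT-Example-2": {"T1190", "T1505.003", "T1003.001"},
}

def rank_by_overlap(observed: set[str]) -> list[tuple[str, float]]:
    # Rank known groups by Jaccard similarity with the observed TTPs.
    scores = []
    for group, profile in KNOWN_GROUPS.items():
        union = observed | profile
        jaccard = len(observed & profile) / len(union) if union else 0.0
        scores.append((group, round(jaccard, 2)))
    return sorted(scores, key=lambda s: s[1], reverse=True)

# Example: made-up techniques observed in a mail-credential spyware case.
print(rank_by_overlap({"T1566.001", "T1078", "T1114"}))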
21:52 You said, you clarified that you're not saying that this is the Chinese government doing these cyber attacks.
22:01 However, can we expect at least the Chinese government to help us, and help the Philippine government, track down whoever, whatever is orchestrating, engineering these attacks from China?
22:16 As in the spirit of maybe friendship or diplomacy, are they offering to cooperate, to help the Philippine government in tracking down these people?
22:30 These people who are doing it.
22:32 I think help was extended through our secretary of ICT.
22:37 The embassy did call the Secretary and offered help.
22:41 In terms of concrete help or how this can be done, we still need to work that out.
22:48 There are computer emergency response teams that are inter-country, and we usually communicate.
22:53 For example, we communicate with Japan.
22:55 We even communicate with the computer emergency response team of Taiwan, and Australia, et cetera.
23:04 We don't have that particular setup with the Chinese.
23:07 With the Chinese, what we usually do is we escalate it to the Asia-Pacific computer emergency response team consortium, and then we get the response from them.
23:17 I think you're right.
23:20 The proper way of doing this is to have this sort of mechanism where we can call each other and de-escalate.
23:28 This is a very sophisticated attack, and we wouldn't have known it without the assistance also from the private sector.
23:39 I'm very thankful also to the private sector and to our fellow nationalistic professionals.
23:44 We have a community of hackers. I know there is an underground community, but there is also a group of professionals who are very nationalistic who helped us out also.
23:52 White hat hackers.
23:55 Yeah, professional white hat hackers who help us.
23:58 There are underground people who hack the government, but there are also other people who help governments.
24:04 Good to have allies like that.
24:05 I want to ask you about TikTok, Usec, because in the United States, the public conversation, at least in some congressional and government circles, is about whether TikTok should be banned or regulated.
24:19 Will the government restrict access to TikTok on mobile phones or even ban it altogether, as has happened in some other countries?
24:36 Because supposedly, control over TikTok can be traced to Chinese companies.
24:39 Until now, it's still unclear what China's involvement in TikTok is.
24:43 Do you think TikTok is a security risk? Should there be some kind of restriction over its use?
24:52 Should the Philippine government be considering even banning TikTok?
24:55 We're one of the biggest TikTok users in the world.
24:58 Yeah. First things first, there is no discussion about banning TikTok for the entire Philippines. None.
25:07 However, there is discussion regarding the use of TikTok as an official social media platform for government agencies.
25:22 So you can allay that fear. There is no discussion yet whether we will ban TikTok for the public.
25:30 The discussion in the government is, is it enough or is it right to use TikTok as a government communications platform?
25:38 Because if you notice, most of the use of TikTok is informal.
25:41 And the problem with being informal is that sometimes when you're too relaxed and comfortable discussing,
25:47 and I know you notice this, Howie, because that's what journalists try to do, you make us comfortable.
25:52 So there's a scoop, right? We might say something we shouldn't say.
25:56 That's what happens when you're informal. So that discussion is revolving around that.
26:01 And of course, we're also talking about the ownership structures.
26:06 But our position in the DICT is that this particular problem is not only about TikTok; it extends to other applications as well.
26:13 For example, we also would like to call out Telegram, because why is it that most of the hacking incidents, the leaks, are being released on Telegram?
26:21 I'm just stating the fact. It's not Viber. It's not Messenger. Right? It's not Signal. It's always Telegram.
26:31 Look at almost all the hacks, not only here, but also worldwide.
26:35 On this particular point, we're also calling out Facebook. I've already called Claire, the Meta country director here.
26:42 And I told her that, you know, Messenger and Facebook Marketplace are also being used for prostitution.
26:48 And so we would like to control that as well. So we are calling out any type of application that is being used
26:56 either for illicit activities or for certain activities which we think are not appropriate.
27:02 So, yes, there is discussion on TikTok. I don't want to preempt the discussion.
27:08 I'm sorry, but we have to be statesmen also. And we have to respect also the opinion,
27:13 especially of the other government officials who wanted TikTok banned, at least as a government platform.
27:18 So we're having that discussion. But for the public, of course, you can still use it.
27:23 So now you know something new. That also goes for Telegram, because it's being used by hackers.
27:27 We also know that the API of Telegram is exposed. So maybe Telegram should also not be used by government officials.
27:34 That discussion is also on the table.
27:36 Well, OK, thank you for bringing up messaging platforms, because a big part of the population also uses messaging platforms to communicate.
27:44 And as you said, not all of these are equally secure, or maybe they're all equally insecure.
27:52 But I'm just wondering, are there particular messaging platforms that you would recommend,
27:59 if some people are considering the various options now? You know, there's Messenger, there's Telegram, there's WhatsApp, there's Viber.
28:05 I mean, many of us are on multiple messaging platforms.
28:09 But if there was something that we would want to keep secure or there's a messaging platform that we can trust,
28:18 would you be at liberty to say, Usec?
28:22 It's hard for me to call anyone out. But what I will just say is that, like I said, hackers are using Telegram more often. That's one.
28:31 We also know for a fact that there was a leak of keys in WhatsApp.
28:36 But that was like a year or two ago.
28:40 WhatsApp said that they have already fixed it. The same goes for Facebook Messenger.
28:48 In terms of technical capabilities, the iMessage of iPhone has very strong encryption.
28:56 And they pride themselves on not having surrendered their keys to any government agency or any government law enforcement.
29:02 So that's very reassuring. However, iMessage is only secure for iPhone-to-iPhone communication.
29:09 Because when it's iPhone to Android, the security cannot be guaranteed.
29:13 So those are some of my technical assessments for the public out there.
29:17 I want to pivot a little bit, because in a previous interview, you talked about the drastic shortage of cybersecurity professionals.
29:25 Considering the kind of threats that we're facing today and the kind of scenarios that you outlined for the future.
29:34 It's not just government that will need cybersecurity professionals. Of course, the private sector, academia, everyone will be needing them.
29:42 Is there a way to encourage more talented patriotic Filipinos to go into cybersecurity to protect the country?
29:50 So that's the type of behavior we're nudging the industry towards. For example, just last week, Friday, we had the launch of Hack4Gov,
30:01 which is actually a government-sponsored hackathon. So things have changed, we now have our own hackathon.
30:07 We're inviting college students to form teams of four. We provide a box they can safely experiment with.
30:12 And the team with the highest points, which means the number of flags they captured, wins.
30:17 And it doesn't end there. We're doing this the entire year. We're going to do it region by region.
30:24 We're going to be announcing winners per region. And then in October, it will culminate in a national hackathon
30:30 where each regional winner will participate. And the national winner will be sent to Thailand for the ASEAN hackathon.
30:37 And elsewhere, because we're also being invited to other hackathons. But we're not stopping there, Howie.
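In capture-the-flag events like this, "points" usually just means verified flag submissions. Here is a minimal, hypothetical scoreboard sketch; the challenge names, point values, and flags are made up, and this is not Hack4Gov's actual platform.

import hashlib
from collections import defaultdict

# Store only SHA-256 hashes of the flags; plaintext flags never sit
# on the scoreboard server. Challenges and point values are made up.
CHALLENGES = {
    "web-101":    {"points": 100,
                   "flag_sha256": hashlib.sha256(b"FLAG{demo_web}").hexdigest()},
    "crypto-201": {"points": 200,
                   "flag_sha256": hashlib.sha256(b"FLAG{demo_crypto}").hexdigest()},
}

scores = defaultdict(int)   # team name -> total points
solved = set()              # (team, challenge) pairs already credited

def submit(team: str, challenge: str, flag: str) -> bool:
    # Award points once per team per challenge if the flag hash matches.
    chall = CHALLENGES.get(challenge)
    if chall is None or (team, challenge) in solved:
        return False
    if hashlib.sha256(flag.encode()).hexdigest() != chall["flag_sha256"]:
        return False
    solved.add((team, challenge))
    scores[team] += chall["points"]
    return True

submit("Team Alpha", "web-101", "FLAG{demo_web}")        # correct: +100
submit("Team Alpha", "crypto-201", "FLAG{wrong_guess}")  # rejected
print(dict(scores))  # {'Team Alpha': 100}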
30:43 I just want to say that this is an achievement of this administration as well.
30:47 When we first came in, there were only about three bachelor's degrees in cybersecurity.
30:54 There are a lot of trainings in cybersecurity, but only three colleges or universities offered a bachelor's degree in cybersecurity.
31:02 The others offer computer science, but without cyber. But now, there are more than 10,
31:07 which means that there is now a growing number of higher education institutions offering a bachelor's degree in cybersecurity.
31:16 I'm not even counting the master's degrees. You know, a master's is not really technical.
31:19 It's more about going towards the philosophy of things, managing things.
31:24 So we really wanted to drill into the technical side. That's why we're counting the bachelor's degrees.
31:31 We need about 300,000 until 2028.
31:37 300,000?
31:38 Yeah.
31:39 So how far do we have to go? How many do we have right now? Any estimate?
31:43 We're still far off. I think we only have maybe around 10,000 or fewer cybersecurity professionals.
31:50 And that's not counting only the government; I'm talking about the whole country, including the private sector.
31:55 You know why we need that many people? Because cyber attacks are going to be mainstream.
32:00 In fact, we have been telling the PNP that cyber attacks should no longer be handled by a centralized or special unit,
32:07 like the PNP ACG, the Anti-Cybercrime Group.
32:10 Every precinct should know how to handle cybercrime, because it is going to be a mainstream crime.
32:18 It's not a special crime. It's going to be the next common crime.
32:22 In fact, it's already happening, right? GCash scams, phishing, etc.
32:27 So we need that many professionals, not only in government, but also in the private sector.
32:32 So, Usec, what can you share about your own professional journey? How did you end up in the DICT?
32:38 So, I'm a CISSP, a Certified Information Systems Security Professional of ISC2.
32:46 I think there are only around 200, fewer than 300, CISSPs in the Philippines. It's quite a difficult exam.
32:53 Back in the 1990s when I was in college, there was nothing like that yet, so I took Applied Physics.
32:59 But I went through the certification path, CCNP. For everybody's information, I actually came from the private sector.
33:06 I only joined government five years ago. So I was really trained in the private sector.
33:12 In my career, usually before you go into cybersecurity, you have a certain specialization.
33:18 My specialization is network engineering. So I know how to build these things, you know, BGP, OSPF, etc.,
33:27 for the private sector. And then cybersecurity and information security blew up around the mid-2000s to 2010.
33:37 I took the chance. I took some certifications, I took some trainings, and I took up a master's degree.
33:44 And here I am. But what I just want to say is that there are a lot of people who are enamored with the mystical world of cybersecurity.
33:55 So enamored, in fact, that they would like to test their skills using black hat orientation.
34:01 Let me tell you this: it is easy to use hacking tools and hack a government agency to get famous. What is harder is being ethical about it.
34:12 It's like firing a gun. It's easy to buy one illegal gun, fire it, be very good at it.
34:18 But being a law enforcer or being in the military, for example, where you can fire your weapon only at appropriate times, perhaps never in your life outside of training, that is what's harder.
34:31 So I would like to invite everyone who wanted to pursue their professional journey in cybersecurity.
34:36 Don't focus on the black hat side; it's harder to be an ethical or white hat hacker. So let's stay in the light, not in the dark.
34:45 Wise advice, wise advice, Usec. So I just want to ask, as a final question: your department, the Department of Information and Communications Technology, is one of the newer departments in government.
34:56 What are your priorities and main concerns as a department?
35:00 Our priority is to build the infrastructure. So if you look at the layers, before you get to the application layer, which is our eGov, there's the eGov app, I'm sure you're aware of it, and the national ID; those sit at the top.
35:12 In the middle of that are the data centers. How can you build the application if you don't have the middle layer, which is data?
35:20 So now you can see our priority. We want to get to the upper layer, which is how to make government efficient by providing a citizens' app so that every Filipino no longer has to line up.
35:31 They'll have an app to transact with the government. But before you get there, you need data management, data centers, and then you need connectivity. Right?
35:41 So there you see the priority. Connectivity is infrastructure. Make no mistake about it, the internet is very physical: cables, wireless access points, etc. So we're doing all of that.
35:53 Plus, of course, once all of that is built, two things remain. Training so that you can use it, so upskilling.
36:02 And security, so that you trust using that application and trust that you can use the services being offered to you on the internet.
36:13 So there, I've mentioned our five priorities: the eGov application or the citizens' app, data management, the infrastructure or connectivity, then supported by upskilling and cybersecurity.
36:27 Okay, this has been very enlightening, Usec. We want to thank you for your service and for shedding light on these very urgent problems of deepfakes and hacking, etc.
36:40 Undersecretary Jeffrey Dy of the Department of Information and Communications Technology, thank you very, very much, and more power to you.
36:48 Thank you very much, Howie, and thanks to all who are listening. Thank you very much.
36:51 Hi, I'm Howie Severino. Check out the Howie Severino Podcast. New episodes will stream every Thursday. Listen for free on Spotify, Apple Podcasts, Google Podcasts, and other platforms.
37:02 [MUSIC]
