Transcript
00:00 [MUSIC PLAYING]
00:03 Thank you so much for making the time to come here today
00:20 and join us for this conversation.
00:26 So now, more often than not, tech is seen by many people
00:32 as a great equalizer for solving some of the world's
00:37 leading challenges.
00:39 But the key to enabling this is, and remains,
00:43 access to capital.
00:47 Your Excellency, why should global investors
00:50 consider Africa as an opportunity for investment?
00:54 [INAUDIBLE]
01:00 First of all, it is long overdue.
01:03 That's why they need to really consider that.
01:07 I don't see why Africa, like any other part of the world,
01:14 where human beings, the brain, and other gifts from the time
01:29 we are all created.
01:33 I have never understood why there
01:35 should be a particular geographical place where
01:39 everything is to be underestimated or underrated.
01:45 And therefore, the investors need
01:48 to bear that in their minds.
01:53 Africa has everything the rest of the world has, even more.
02:00 But when it comes to people, I think
02:04 we should consider ourselves as equal with any other people
02:08 anywhere in the world.
02:11 But there are also resources, other resources
02:13 other than just people that we can tap into and grow and connect
02:21 with the rest of the world for our own success,
02:24 but also for the success of the rest of the world.
02:28 So the investors need to look at Africa not just
02:34 as a big market, which it is, but also as a thriving people
02:41 and society that, to a great extent, will match or even supersede
02:50 what actually exists elsewhere in the world.
02:53 So the investors need to consider that.
02:56 And of course, the next consideration for any investor
03:00 is you invest for a purpose.
03:03 And is that purpose going to be served?
03:06 Is the profit you want and the growth
03:10 you want to make out of that going to happen?
03:14 I think I can say it is guaranteed that it will happen.
03:21 Well, thank you.
03:22 And why do you think starting in Rwanda
03:26 can de-risk your investment into Africa?
03:32 Well, first of all, we are deliberate in Rwanda
03:40 in terms of creating the environment that
03:45 will enable us to achieve what we want to achieve.
03:51 In the same way, us doesn't mean just particularly Rwandans.
03:57 It means Africans.
03:59 It means people from many parts of the world
04:02 who want to invest with us or do business with us.
04:08 And that thinking, therefore, provides us
04:10 with an environment that lowers the risk anybody would face
04:18 when they have taken a chance with us
04:23 and believed in us that we can do things like that
04:30 and do as expected in terms of holding each other's hand
04:36 and moving forward.
04:39 Very good.
04:40 Thank you so much.
04:40 Do we have time for any questions from the audience,
04:43 perhaps?
04:46 Someone can help with a microphone.
04:50 We can perhaps take the first question of the session.
04:53 Your Excellency, my name is Vicky Akaniwabo.
05:00 I am the managing director of Hens Technologies Rwanda.
05:05 It's a data and AI company that is building products
05:08 for global companies.
05:10 And our current focus is legal.
05:12 I'd also like to mention that I am
05:14 a product of Carnegie Mellon University Africa.
05:17 Your Excellency, there's a global debate
05:21 around artificial intelligence on things
05:24 like ethics, safety, and for Africa,
05:29 specifically things like its preparedness or readiness
05:32 to adopt AI.
05:34 I'm curious, what is your view on artificial intelligence?
05:38 And what is your advice for us, the tech entrepreneurs,
05:42 who are trying or aspire to help Africa leapfrog and also
05:47 actually make it a leader in artificial intelligence?
05:51 Thank you.
05:52 Yeah.
05:53 To begin with, everything new that comes to society
06:00 raises certain concerns.
06:04 In fact, let's begin by looking at technology as a whole.
06:12 Technology has huge benefits; it provides advantages
06:19 in terms of productivity and different kinds of things.
06:24 And it lifts people to a much higher level, generally.
06:34 But it is important to walk that path while you are thinking
06:38 of what are the benefits in actual sense,
06:43 but what are also the dangers that
06:46 come with these new tools and thinking and so on to society?
06:55 Now, there are now even more concerns
06:58 when it comes to artificial intelligence.
07:02 But those concerns should also be looked at in a sense
07:08 that there are probably more benefits
07:12 with artificial intelligence than many other technologies.
07:20 And therefore, we need to balance the two,
07:28 the accelerated benefits in terms of what AI provides
07:33 as a new and sophisticated technology.
07:40 But again, with the same mind, what
07:44 are likely to be the dangers?
07:46 So that, again, people work towards managing that,
07:50 regulating that.
07:51 Sometimes people tend to approach things
07:57 as either/or.
08:01 But I don't think that's what we should be doing.
08:05 We should be saying, does this have benefits
08:09 to what extent as a new thing?
08:14 But does it also have some dangers?
08:17 So how do we work to mitigate that and make sure
08:20 that the benefits outweigh the dangers that
08:25 are likely to be there?
08:26 And there's no way a single entity, nation, company,
08:31 country, or individuals would manage that alone.
08:38 So there has to be collaboration.
08:40 There has to be thinking together,
08:42 because what is coming out of these new technologies
08:47 like artificial intelligence affects the whole world.
08:51 It doesn't just affect one person, one company,
08:53 one nation.
08:55 So that, therefore, I have seen there
08:59 are debates around that about regulation.
09:03 Some people say, no, no, no, if you regulate,
09:06 there are these dangers.
09:07 If you don't regulate, you know, it's like--
09:09 so we have got to find a way forward together and minimize
09:14 the dangers that people are thinking about,
09:17 but also maximize the benefits that are likely to come along
09:20 with that.
09:22 Thank you.
09:23 [APPLAUSE]
09:26 I think we have time for one more question.
09:29 Yes.
09:30 We might allow one more.
09:33 Your Excellency, my name is Eugene Nwagasore.
09:37 I co-founded the company, KwaTiPiNdo,
09:40 a cloud communication platform for businesses.
09:44 Rwanda is known to take risks, going above and beyond.
09:49 Some say we are a startup nation,
09:53 and others say that we may be punching above our weight.
09:58 You have attracted Ilkad and Zipline, to say the least.
10:05 At Pindu, we are aspiring to scale our communication tools
10:10 using AI so that communities in Africa
10:14 can engage and communicate in this digital age
10:18 using our native languages.
10:19 And that requires, really, a big undertaking on our part.
10:27 Something that you always do.
10:31 As a leader, what makes you believe in yourself
10:36 to take those risks, regardless of the challenges
10:39 they may bring with them?
10:42 Thank you.
10:43 [MUSIC PLAYING]