Renée Cummings, Assistant Professor of the Practice in Data Science, University of Virginia
00:00Thank you so very much.
00:06Let me say it's an absolute honor and a pleasure to be with you.
00:10I'm a criminologist and criminal psychologist.
00:13I'm also a therapeutic jurisprudence specialist and a terrorism specialist.
00:18And that is the work that I bring to data science.
00:21When we think about equity, when we think about access, and when we think about justice,
00:26we have got to think about designing with these concepts, using these concepts as frameworks
00:33to audit this technology.
00:36We know that AI is a revolutionary technology.
00:39We know that AI is a radical technology.
00:42But we have to understand that there is no AI without data.
00:47Data is the most critical component, the lifeblood of this technology.
00:55So when we think about data, we have got to think about the frameworks which we use
01:02to really ensure that we are developing and designing and deploying a technology that
01:09is equitable, a technology that is just, and a technology that creates access.
01:15We know that data creates unique challenges when it comes to bias, when it
01:20comes to discrimination, when it comes to systemic challenges.
01:24We also know that data, if left unmanaged, can create some extraordinary challenges.
01:35So when we think about data, I like to say we have got to bring an ethical approach to
01:40the ways in which we are doing data.
01:42We have got to think about how we use equity and access and justice so that we do not undermine
01:51data accuracy, data integrity, data fidelity, or even data validity.
02:00These are very, very critical questions.
02:02So when we think about access, we have got to think about visibility.
02:07We have got to think about voice.
02:09We have got to think about vulnerable groups and the ways in which we can ensure we are
02:14building a just future.
02:16We are co-creating a just future that everyone can be a part of.
02:21When we think about justice, we have got to think about questions around due process.
02:26One of the things that we have seen with data and with algorithms is that an algorithm has
02:31the ability to undermine in real time questions around agency, questions around access, questions
02:40around self-actualization.
02:42An algorithm has the ability to make a decision about you without you even knowing a decision
02:49was made.
02:50So for us to really bring an approach that is equitable, an approach that creates access,
02:56an approach that is justice-oriented, we have always got to think about bringing a very
03:02sophisticated level of due diligence to the ways in which we are designing, because it
03:08is about trust.
03:10It is about responsibility, and it is about ensuring that this technology benefits humanity.
03:17We also have to think about questions around due process, questions around the ways
03:24in which we need to bring a requisite level of duty of care, the ways in which we need
03:31to bring the duty to warn, and the ways in which we need to bring ethical vigilance to
03:37what we are doing.
03:38So let me just start with questions around duty of care.
03:42We know that AI has extraordinary promise.
03:47We know that AI is not only pervasive, but we know that AI has the potential to just
03:53do extraordinary things.
03:55But we know this technology also brings extraordinary risks.
04:00Where there are risks, it means that there are responsibilities, and those responsibilities
04:07require us to protect the rights of the individuals who are engaging with this technology, the
04:15rights of entire societies.
04:17We have got to ensure that the technology is not only responsible and trustworthy, but
04:23that we also understand questions around resources.
04:28Data and AI require an extraordinary amount of resources.
04:34The impact on the environment is something that many of us do not think about.
04:40We speak about building a sustainable and resilient future using this technology, but
04:46we've got to think about the ways in which the environment is also being impacted by
04:51this technology.
04:52So we have a duty of care, a responsibility as data scientists, as critical data scientists,
04:59as individuals who are designing technology.
05:02We have a responsibility to ensure that we bring a framework to audit the ways in which
05:09we are building models, the ways in which we are building algorithms.
05:15We have got to ensure that we are not only bringing this very sophisticated and heightened
05:20level of due diligence,
05:22but that we are understanding that responsibility, our duty of care.
05:28We have got to ensure that everything that we do with this technology
05:35addresses this question of due process, equity, and fairness coming together to create
05:43situations of justice.
05:45When I speak about data, I always speak about this ethical data pipeline.
05:50And some of the things that I like to look at would be questions around the provenance
05:55of data.
05:56It's so important when we're designing to ask ourselves, where did this data come from?
06:02I always speak of the anatomy of data, the genealogy of data, the pathology of data.
06:09Sometimes we're using data that is being duplicated in very weird ways.
06:14Data that's actually missing pieces, critical pieces to the stories that need to be told.
06:21So when we are using data, when we are bringing this heightened level of due diligence, when
06:27we are paying attention to this duty of care, it means as designers using data, as designers
06:34building algorithms, as designers of AI, the space in which I work, we have always got
06:41to ensure that everything that we do, we bring this heightened level of ethical vigilance.
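To make those provenance questions concrete: a first pass at the anatomy of a dataset can be automated. The sketch below, in Python with pandas, flags duplicated rows and missing values in critical fields before the data is used to build a model. It is a minimal illustration only; the function, the column names, and the toy dataset are assumptions invented for this example, not anything drawn from the talk.

```python
import pandas as pd

def basic_provenance_audit(df: pd.DataFrame, critical_columns: list[str]) -> dict:
    """A minimal sketch of the 'where did this data come from, what is
    duplicated, what is missing' questions raised in the talk."""
    return {
        # Count rows that are exact duplicates of an earlier row
        # ("data duplicated in very weird ways")
        "duplicate_rows": int(df.duplicated().sum()),
        # Missing values per critical column
        # ("missing pieces of the stories that need to be told")
        "missing_by_column": {
            col: int(df[col].isna().sum()) for col in critical_columns
        },
        "row_count": len(df),
    }

# Illustrative usage with a toy dataset (column names are assumptions)
if __name__ == "__main__":
    df = pd.DataFrame({
        "applicant_id": [1, 2, 2, 3],
        "zip_code": ["22904", None, None, "22903"],
        "outcome": ["approved", "denied", "denied", None],
    })
    print(basic_provenance_audit(df, critical_columns=["zip_code", "outcome"]))
```

A real ethical data pipeline would go further, recording where each field originated and who collected it, but even a small report like this forces the question: where did this data come from, and what is missing from it?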
06:49So when we think about access, we have got to ask ourselves, when we are designing, when
06:55we're developing, and when we are building, who are we building for?
06:59Are we using this technology to create access?
07:02Or are we using this technology in ways that are going to deny access?
07:08Ways that are going to deny access to resources?
07:12Ways that are going to deny opportunities?
07:15We've got to think about that.
07:17When we think about equity, we have got to understand that data equity is business intelligence.
07:23Data equity is economic intelligence.
07:27Data equity is also about opportunity, but it's also about empowerment.
07:33And when we think about justice, we know if we are designing, developing, procuring a
07:39technology that is unjust, a technology that is unfair, then we are not designing for the
07:47greater good.
07:48We've got to think about those questions.
07:51We've also got to think about the ways in which we engage communities.
07:55That is so important to the work that we're doing.
07:59Stakeholder engagement.
08:00Are the products, the policies, the processes we are designing
08:08community-inspired?
08:09Are they community-led?
08:11What's the level of stakeholder engagement?
08:14How involved are communities in the process?
08:17It's very, very critical, the ways in which we engage communities to design.
08:23Because we are realizing that many of our data sets, many of the algorithms that we're
08:28deploying are creating very, very critical risks.
08:32So an aspect of what we do is that ability to mitigate those risks, to manage those risks,
08:40to monitor those risks.
08:41So if we use equity and access and justice as frameworks to audit, if, when we're designing
08:49and developing and deploying and procuring, we are thinking about those vulnerability audits
08:55and those impact assessments and those risk management approaches and those crisis
09:00management approaches and the ways in which we are building our business continuity plans,
09:05if we are thinking about them within that framework of equity, access, and justice,
09:12it means we are designing for good.
09:15It means that we are designing and engaging and educating in the kinds of structures or
09:22systems that we are creating.
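One way to operationalize that auditing framework is as a structured record that travels with a model through design, deployment, and procurement, with equity, access, and justice each tracked as an explicit dimension. The sketch below is a hypothetical illustration in Python; the class and field names are invented for this example and do not represent any standard or any tool named in the talk.

```python
from dataclasses import dataclass, field

@dataclass
class AuditDimension:
    """One lens of the equity/access/justice audit framework."""
    name: str                                        # "equity", "access", or "justice"
    questions: list = field(default_factory=list)    # e.g. "Who are we building for?"
    risks_found: list = field(default_factory=list)
    mitigations: list = field(default_factory=list)

@dataclass
class ModelAudit:
    """A record that travels with a model from design through procurement."""
    model_name: str
    dimensions: dict = field(default_factory=dict)

    def add_dimension(self, dim: AuditDimension) -> None:
        self.dimensions[dim.name] = dim

# Hypothetical usage
audit = ModelAudit(model_name="loan_screening_v2")
audit.add_dimension(AuditDimension(
    name="access",
    questions=["Who are we building for?",
               "Does this deny access to resources or opportunities?"],
))
```

The point of the structure is simply that each dimension carries its questions, the risks found, and the mitigations chosen, so the audit is documented rather than implicit.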
09:25We also have got to think about, for me, the legacy.
09:31And I always say that what I have learned as a critical data scientist, what I have learned
09:35as someone involved in AI and data governance, is the question of legacy that is attached to data.
09:43When we think about data, when we think about algorithms, we have got to think about the
09:48legacies that we are deploying with an algorithm.
09:53As I said earlier, an algorithm has the ability to make a decision about you, not only in
09:59real time, but without you even knowing a decision has been made.
10:05So when you think of your own identity, when you think about the ways in which you want
10:09to engage with this technology, you have got to think about the legacy.
10:16When you are designing (and I design with data),
10:20what is the legacy you want this algorithm to leave?
10:26And it's a very, very critical question.
10:29Because an algorithm has the ability to create access or deny access, to create resources
10:35or deny resources, and to define legacies or undermine legacies.
10:42And it's something that we have got to think about.
10:45We have also got to think about the imagination and how the imagination intersects with equity
10:53and access and justice.
10:56We have got to think about a radical imagination.
10:59And I think from the few days that I have spent in Macau, I have really experienced
11:07the possibilities and the potential of a radical imagination.
11:12Because what I have seen among the fusion of cultures is an enduring spirit and this
11:19grand idea of possibilities and imagining the impossible and then designing it.
11:27So as we think about designing, using a framework of equity, access, and justice, we have got
11:37to ask ourselves, what is our relationship with our communities?
11:42What is our relationship with society?
11:45What is our responsibility as data scientists, as technologists, as designers, as policymakers,
11:53as leaders?
11:55What is our responsibility as we co-create a future together?
12:00How are we going to ensure that we are building equity and access and justice into our designs?
12:08How are we going to ensure that we are bringing that ethical vigilance that is required?
12:14Because what we have realized with data, what we have realized with AI, is that AI is not
12:19merely a technical tool; AI is a socio-technical tool because of the extraordinary impact it
12:26has, and continues to have, on society.
12:29The other challenge with AI and Gen AI is that we have yet to truly understand the extent
12:37of this technology and what this technology can do.
12:41We know that it can do great things and we are seeing that in healthcare, we are seeing
12:46that in communication, we are seeing just about every discipline and every industry
12:53re-imagining its business model using AI and using Gen AI.
12:59So we understand the extraordinary potential, but we've also got to understand the extraordinary risks.
13:07We've got to understand the responsibilities.
13:09We've got to understand our rights.
13:11We've also got to understand that data is now a question of justice.
13:17Algorithms are now a question of justice.
13:20When we are designing, we've got to think about human rights and civil rights and data
13:27rights and algorithmic rights.
13:30These are big questions.
13:31What data has allowed us to do is answer some of the world's greatest questions around the
13:37environment, around healthcare, around criminal justice.
13:41But what data has also done is create new ethical challenges.
13:47So if we want to think about technology in a responsible way, if we want to think about
13:53technology in a way that benefits humanity, if we want to think about technology for good,
14:00AI for good, and doing good data science, we have got to think about a framework that
14:07is built on equity, access, and justice.
13:12Because if we do not, then what we are going to create will be systems that not only undermine
13:20our own progress, but systems that could create situations that undermine the ways in which
13:29we move into the future together.
14:32So as you build, as you design, as you develop, as you deploy, as you procure, I ask you, when
14:40you are auditing, when you are doing those vulnerability audits and those impact assessments,
14:46to bring that question of equity, access, and justice.
14:51Because if we do not do it, then we are at risk of creating a future that may not even
14:58welcome us into that space.
15:00So as we dream and as we imagine, let us think of the impossible and let us use that framework
15:07to really deploy what is possible for all of us.
15:11And with that, I thank you.