Daniela Braga, Founder and CEO, Defined.ai, Benjamin Plummer, Chief Executive Officer, Invisible Technologies, Jonathan Ross, Founder and CEO, Groq, Rana el Kaliouby, Managing Partner, Blue Tulip Ventures; Co-chair, Fortune Brainstorm AI
Transcript
00:00I am so excited for this conversation.
00:02Thank you for being here.
00:03So AI is making us all more productive,
00:05but man, it's so expensive to make.
00:08So I thought in this conversation,
00:10we would go behind the scenes and talk about
00:12what it actually takes to make AI.
00:14So the data, the humans, and the compute.
00:18And Jonathan, I wanted to start with you.
00:20All right.
00:21People often think about Groq as a hardware company,
00:24but you actually insisted on not being
00:26on the chip panel at this event.
00:29You're more like a full-stack solution,
00:32and more like NVIDIA, actually.
00:34And I wanted to point to one particular data point.
00:37So the Groq Cloud Platform is used by
00:41more than 650,000 developers
00:44to build a wide range of AI applications.
00:46So can you explain why this software-hardware combo
00:49is so important, especially in AI inference,
00:52which is what Groq focuses on?
00:55Well, most people don't know this,
00:57but AMD's GPUs are actually faster than NVIDIA's GPUs.
01:02I actually had a VP from NVIDIA brag to me about this fact,
01:06and then said, it's actually their software
01:09that gives them the advantage.
01:11So when we started Groq,
01:12we actually spent the first six months
01:14working on the compiler,
01:16and only after we were able to take programs
01:18and lower them into something that we thought
01:21we could build and run a chip with,
01:23did we start designing the chip.
01:25And so it was a massive advantage.
01:28Follow-on question.
01:29So you're often thought of as NVIDIA's challenger,
01:33and there's also a lot of new entrants in the space
01:35like Cerebras and Etched and Amazon.
01:39What is it gonna take to win?
01:41Okay, so you never wanna directly compete.
01:45So we like to say that we do inference,
01:47and they do training.
01:48They would love to do inference, too,
01:50but we do fast inference, and others-
01:52Why is that important?
01:53Fast?
01:54Why is fast inference important?
01:55Does anyone here want slow AI?
01:57Please raise your hand.
01:58Okay, I see no hands.
02:00So just imagine if you were doing a Google search,
02:02and it took eight, 10, 40 seconds to get an answer, right?
02:08That would be intolerable,
02:09and that's where we're at with AI today.
02:11So with us, we're about 20 to 40 times faster
02:15than a GPU for these answers.
02:17It allows you to do a lot of the system two thinking
02:20and other stuff, improve the quality of results,
02:24but speed also results in increased engagement.
02:27So roughly every 100 milliseconds of speedup
02:30is about an 8% conversion rate increase on desktop
02:33and 30% on mobile, so it matters a lot.
02:36Amazing.
02:37Daniela, I wanna come to you next.
02:38So we've known each other for many, many years.
02:40It's great that you're here.
02:42Earlier this year, you presented at the United Nations,
02:45and I wanna share a quote from that conversation.
02:50The race for data without permission
02:52is only broadening cultural, language,
02:55and gender imbalances because internet content
02:58is just a reflection of our society
03:00full of biases and misinformation.
03:03Your company, Defined.ai, is the largest marketplace
03:07of ethically sourced data and models.
03:09What does ethically sourced data mean,
03:11and why is it so important?
03:12Thank you, I'm very proud of that,
03:14and actually very excited to hear the earlier presentation
03:17on ProRata AI, which is in line with our belief
03:23that just because data is public,
03:27it doesn't mean it's free.
03:29This is always what I've been saying,
03:31and now it comes to fruition.
03:33We created an ecosystem of a marketplace of training data
03:38that allows everyone to monetize their data,
03:43and we, as the brokers in that sense,
03:48we legally vet the data, we make sure we can trace it back
03:52to consent at the participant level,
03:56we apply a price, we add all the machine learning readiness,
04:00and we sell it to a willing buyer
04:03that cares about brand reputation
04:04and not being sued for copyright infringement.
04:08But it's beyond that.
04:09There's another area we have to take care of,
04:14which is, as you also mentioned,
04:18that everything on the internet is biased by definition.
04:21It's mostly white male generated.
04:24It's English language generated, mostly.
04:28It will just keep widening
04:32the divide in our society if we don't bring
04:37purposely built and ethically sourced data,
04:40with everybody paid along the chain,
04:43into the models.
04:44And finally, the part that is hidden
04:46and nobody likes to talk about
04:48is the humans in the loop component,
04:50which I guess gets to you,
04:53where a lot of this work is being done
04:56under what are called
04:59digital sweatshops,
05:02exploiting people in developing
05:04and third-world countries,
05:07exposing them to very, very low-paid jobs
05:10and very harsh conditions,
05:13especially in the content moderation world,
05:17including psychological harm.
05:19So those three pillars are how we built our world
05:23and our marketplace and how we've been operating.
05:25I definitely wanna come back to the digital sweatshop
05:28and the humans behind that in a second.
05:30But I'm an investor now in AI companies
05:33and you've actually been quite critical
05:35of the AI investment landscape.
05:37You've talked a lot about how billions and billions
05:40of dollars of funding are going in
05:42to fund the same people, the same ideas
05:44over and over again.
05:45And I mean, how do we get to this world
05:48where we are actually funding underrepresented humans,
05:52applications, problems?
05:55I did mention that too in that United Nations talk.
05:59That's pretty cool, by the way.
06:00I highly recommend it.
06:01Essentially, of course, this is a capitalist world
06:06and venture capital is focused on making money,
06:11which by the way, so far, very few AI companies make money.
06:14That's the other ironic part here.
06:18The reality is there must be an agreement,
06:22and this is why it was important to say it
06:24at the United Nations level,
06:25where investors get tax breaks
06:28to incentivize them to look into other fields.
06:32The agriculture field itself gets 1% of the investment
06:37that goes into AI, and it's what sustains humanity.
06:42So everything is very unequal in our world.
06:49I think it's at the government level.
06:52It needs to be incentivized with money,
06:55which is what people understand.
06:57Yeah.
06:58Ben, let's talk about the humans in the loop.
07:01You're the CEO of Invisible Tech
07:02and it's probably one of the best kept secrets,
07:05I think, in the AI world.
07:07You're already profitable.
07:09You started off as a workflow automation company,
07:13but you actually also do a lot of work behind the scenes
07:15with OpenAI and Cohere.
07:19What do you actually do with an OpenAI?
07:22Give us an example.
07:23Yeah, so I'd categorize what we do in two broad areas.
07:27One, loosely defined as sort of evaluations
07:31and understanding how are these models performing,
07:34where are they strong, where are they weak,
07:35where are there gaps in their capabilities
07:37so that they have a much better understanding
07:39of how these models are performing in the real world.
07:42And then the second part of that is creating
07:44really high quality data that can be used
07:48to retrain those models and actually close those gaps.
07:51And you can understand that there's a really
07:53symbiotic relationship between those two things
07:56and the faster you can crank that flywheel,
07:59the faster you can drive model performance and improvement.
08:02Now, you also recently, and I don't know
08:04if this is public or not,
08:05but you recently signed on NVIDIA.
08:08What are you doing with them
08:09and shouldn't you then work with Groq?
08:12Yeah, look, we've, over the last couple of years,
08:16worked with the majority of the frontier
08:19foundational model providers and we've really specialized
08:23in the sort of leading edge of that frontier,
08:26the most complex work,
08:28the most sort of intricate training of those models.
08:30And so any company looking to build
08:33a world-class foundational model
08:35is either a client or a potential client.
08:38And given this is so exploratory
08:41in terms of trying different techniques
08:43and still very much in the R&D phase,
08:46a lot of that is co-creating new capabilities.
08:49We're working with researchers,
08:50understanding their goals and building evaluation suites
08:55and building data that they can use
08:56to actually close the gaps in model performance.
09:00I'll come to audience questions soon.
09:03So tee up your questions, please.
09:05I wanna come back to this digital sweatshop
09:07because you employ thousands of humans around the world
09:11to basically do all this work.
09:13How do you think about what Daniela said?
09:16Yeah, it's really interesting because for us as a business,
09:20we've never really felt conflicted
09:22about needing to sort of decide
09:24what's best for our business
09:25or what's best for the people
09:27that participate in that ecosystem
09:29because we know that quality is paramount,
09:32that the highest quality data is absolutely necessary.
09:35And to get that, you need really happy,
09:39really engaged people.
09:40And so we spend a lot of time curating this community,
09:44investing in them, training them,
09:46making sure we understand
09:47we're paying well above living wages
09:50and really investing in building that out.
09:52Yes, it's a good thing to do,
09:54but it's also good for business
09:56in that these engaged people
09:57produce way higher quality results.
10:01I think probably the most surprising
10:02was when I discovered that a few of them
10:05had gotten tattoos with the Invisible logo,
10:07which was probably a little too far
10:09in the engagement side of things.
10:11But yeah, really a fantastic community.
10:14That's loyalty.
10:14That's next level loyalty.
10:16All right, we have a question here.
10:17Can you please say your name and your affiliation?
10:21Yeah, hi, Pankaj Katia, friend of Chamath,
10:24investor in Groq.
10:26Oh.
10:28My question is, having been in the semi-space
10:32for three decades,
10:36I cannot recall the last time a semi-startup made it.
10:43It's a scale business,
10:46working with the foundry and the supply chain
10:48and so on and so forth.
10:50So while I love LPUs,
10:54and NVIDIA is showing the way,
10:57Jonathan, how do you think about scaling
11:00vis-a-vis NVIDIA and AMD for that matter?
11:05Well, I don't know if you intended to tee me up perfectly,
11:08but I think you just did, so I appreciate this.
11:11So AI is a scale game.
11:14And if you can't get to scale, there's no point.
11:18One of the unique things about what we did
11:20was we actually designed our chips
11:22to use 14-nanometer, which is an old technology.
11:26It's underutilized.
11:28The fab that manufactures our chips is only about 50% used,
11:32which means that next year,
11:35if you look at our contract manufacturers,
11:37we actually have the ability to scale up
11:39to over two million of our LPUs.
11:42Physically, we can manufacture that.
11:44So what we've been doing to get to scale
11:46is we've been working with partners.
11:48So actually, I've been playing around with this token.
11:51So if some of you look at me on LinkedIn,
11:54you've probably seen this.
11:54This is my 25 million tokens per second target.
11:58Everyone at Groq carries one of these; that's scale.
12:01So when you produce output from these models,
12:05they produce tokens.
12:06And a token is, about 1.3 tokens is a word.
12:10And so 25 million is about where OpenAI
12:13and Microsoft combined were at the beginning of this year.
12:17And so our goal is to get there.
12:18We'll get there by the end of next quarter.
12:20So that'll make us hyperscaler scale.
12:22But I also happen to have this other one,
12:25which is one billion tokens.
12:28I'll even do this.
12:29One billion tokens.
12:31And so this is with Aramco Digital,
12:33and we're working together to do this.
12:34If we do that, that'll be more than
12:36all of the other cloud providers combined.
12:39And we partnered with them.
12:41They're covering our cost to deploy,
12:42and then we split the profits of that.
12:45And it's a very beneficial arrangement together.
12:49Because we're really the only ones
12:50who can get to that scale.
12:52NVIDIA has a lot of obligations with existing customers,
12:55and we can actually build as much compute
12:57as all of NVIDIA combined.
13:00Wow.
13:02I have to ask this follow-up question.
13:03You're very passionate about democratizing access to AI.
13:07Can you talk about how the scale
13:09will be a path to doing that?
13:11Yeah, because right now,
13:12if you want to get access to a GPU,
13:14you have to wait in line, and it's a long line.
13:16And it's not a very transparent line.
13:18You don't know how long you're gonna wait.
13:20So right now,
13:23Groq gives away about four times as many tokens
13:26for free every day as GCP does.
13:29And we're a startup.
13:30And so our intention is to get to a point
13:32where we make enough money where we can give access away
13:35for free to everyone in the world,
13:38just like you go into any room here,
13:40you plug into an outlet, and no one's gonna be upset.
13:43They're not gonna say you're stealing our electricity.
13:45Now, for those who are using a large amount of tokens,
13:48they will pay, but that'll cover the cost for everyone else.
13:51Amazing.
13:52Do we have more questions in the audience?
13:57I'm not seeing any right now.
13:58Okay, let's talk about jobs.
14:01Do you both see kind of an evolution,
14:04new AI jobs being created,
14:09the kind needed behind the scenes
14:11to train all these AI models
14:13and create all this ethically sourced data?
14:16Yeah, I mean, we've already started to see shifts:
14:19the demand for certain jobs is going up and new ones are being created,
14:24and the demand for other types of jobs, like translation
14:27and some of these things, is clearly going down.
14:30And that's not that different
14:32from I think a lot of sort of technology shifts.
14:35What is different about this one is A, the speed.
14:38I think these shifts have sort of happened
14:40much more quickly than is typical.
14:42And the second is this really unique attribute
14:46about AI and these technologies
14:48and their ability to democratize knowledge
14:51and skills and capabilities.
14:53And so you have people
14:54who might not have any technical knowledge
14:57who can create their own website
15:00and build with these technologies.
15:01And so I think it's gonna be really interesting
15:03to give these powerful tools to people
15:06and free them up to go back to inventing
15:09and creating new things and removing them
15:12from the sort of boring mundane work
15:14that holds most people back.
15:16Okay.
15:17I'm just gonna add two more,
15:18within the AI life cycle, more narrowly,
15:21and not more philosophically about what AI is gonna change.
15:25There's what xAI calls AI tutors,
15:30which is a glorified, but really needed, way
15:33of describing the humans in the loop
15:37in the fine-tuning process of the model.
15:40And the legal, the lawyers.
15:43The amount of legal work with AI is huge;
15:49every company now has to either hire them in house
15:53or contract consultants.
15:57It's really huge.
15:58Okay, one-word answers.
16:00What is it gonna take for AI to be faster,
16:03smarter, impactful, and equitable?
16:05Just one word that comes to mind.
16:08If you have the answer, just say it.
16:10Humans.
16:11Humans, okay.
16:14Commitment.
16:15Commitment, love it.
16:21I think more equitable investment.
16:25Equitable investment, there we go.
16:26Thank you so much to our panelists.
16:28Great conversation.
