The Fine Print | Digital Personal Data Protection Bill, 2023: Key Highlights

Transcript
00:00 Hi there, you're with BQ Prime and this is The Fine Print.
00:03 The much awaited Digital Personal Data Protection
00:06 Bill 2023--
00:07 well, I know it's quite a mouthful--
00:09 got tabled in the Lok Sabha today.
00:12 Now, this is India's fourth attempt
00:14 to give itself a privacy law, whether it justifies
00:17 the long wait and what it means for businesses.
00:20 To answer that, I have with me Supratim Chakraborty,
00:23 partner at Khaitan & Co.
00:25 Supratim, welcome to The Fine Print.
00:27 My first question, of course, is that you've
00:30 seen several iterations of the bill.
00:32 Earlier ones were too prescriptive.
00:34 Last year, the reaction was that a lot
00:37 has been left to delegated legislation and rulemaking.
00:42 Now, the new version has, I think,
00:44 addressed some concerns on that front.
00:46 Your first quick comment on what's
00:48 got tabled today in the Lok Sabha.
00:51 Right, I think you're absolutely right.
00:54 We have been seeing several iterations.
00:56 And I think whatever concerns were raised,
00:59 I think largely those have been addressed.
01:02 An attempt has been made, honestly,
01:04 to see that we set sail with the draft that is now there.
01:09 There were a few discrepancies which clearly
01:12 needed to be attended to, at least
01:14 from the business and commercial side of things, which
01:17 I think has been addressed.
01:18 As we speak, maybe we can touch upon those,
01:20 how things are slightly better on a few aspects.
01:24 Sure.
01:25 So let me first come to if there is any change to--
01:29 because the starting point of all of this, Subrata,
01:31 is whether, as a consumer or a citizen,
01:35 and as the bill puts it, a data subject
01:41 wants to or doesn't want to give consent for something,
01:44 and how that consent then translates
01:47 into processing of data.
01:49 In the new version-- there was a clear distinction
01:52 between consent and deemed consent in the earlier bill--
01:56 that, I could see, has gone away.
01:59 Your first quick thoughts on what it means for businesses
02:02 when, in an earlier example, in the 2022 version,
02:06 the bill said that, look, when you're
02:08 booking for a restaurant and you're
02:10 sharing your phone number, it's assumed
02:12 that you have given the consent to the restaurant
02:15 to process that data.
02:18 Has that approach seen any change to your mind?
02:22 See, I don't think the approach has changed.
02:24 Let me first demystify this whole issue
02:27 around deemed consent.
02:29 I think it was about the usage of the word first, which
02:32 some people did not like.
02:36 And I think they have now tried to come to a more modern usage
02:40 by saying these are certain legitimate uses.
02:43 Now, globally, we are all aware that consent is just
02:47 one ground of processing.
02:48 You definitely need other grounds of processing.
02:52 Just as an example, suppose there is an emergency.
02:56 You would not expect the patient to see the notice,
03:00 give consent, et cetera, things like those.
03:02 I'm just saying there are exceptions which are required,
03:05 and other avenues that are required.
03:07 So in this case, what we observe is, of course,
03:11 they have rebranded it.
03:14 I'm assuming this will fly.
03:17 But I think what is important is a lot of emphasis
03:20 has been laid on purpose, which I
03:24 don't know whether organizations, especially
03:27 in India today, understand the importance of.
03:30 If you see the first example that has been given under that
03:34 deemed consent, which is now rebranded, it's very interesting.
03:39 It's basically-- and if I may just refer to that,
03:43 there are two examples I would just like to refer to.
03:46 It is--
03:46 You can actually spell them out for context for our viewers.
03:52 Right.
03:52 So this is in the new Section 7(a).
03:58 And the illustration reads as follows.
04:00 X, an individual, makes a purchase at Y, a pharmacy.
04:05 She voluntarily provides Y her personal data
04:08 and requests Y to acknowledge receipt of the payment
04:10 made for the purchase by sending a message to her mobile phone.
04:14 Y may process the personal data of X
04:17 for the purpose of sending the receipt.
04:20 So this purpose limitation, that it just stops there,
04:25 is very critical.
04:26 In today's day and age, when you go to a shop,
04:28 you will see that they will tell you,
04:30 madam, I'll send you the bill on your phone.
04:33 And they will take your phone number.
04:34 And then you will see you will also
04:36 get the promotional messages, et cetera,
04:38 which you never agreed to.
04:39 So even if you have bucketized this
04:41 under deemed consent or certain legitimate uses, what
04:45 is very clear is that the purpose is still
04:48 very critical to adhere to.
04:51 That's number one.
04:52 The second important thing, I think,
04:54 which I should mention here is that even in the consent
04:58 bucket, there is an example which needs due attention.
05:02 Because a lot of businesses, especially, again, in India,
05:06 are probably not following, or it would
05:09 be very difficult for them to follow.
05:10 And they have to recalibrate.
05:12 This is on consent.
05:13 And again, this provision is Section 6(1).
05:16 The first illustration, it says, "X, an individual,
05:19 downloads Y, a telemedicine app.
05:23 Y requests the consent of X for, number one,
05:26 the processing of her personal data
05:28 for making available telemedicine services,
05:31 and number two, accessing her mobile phone contact list.
05:36 And X signifies her consent to both."
05:39 So the consent has been given for both.
05:41 "Since phone contact list is not necessary for making
05:44 available telemedicine services, her consent
05:47 shall be limited to the processing
05:49 of her personal data for making available telemedicine
05:53 services."
05:53 So in effect, it is auto-expunging of consent,
05:57 which I don't think globally we would see many examples of
06:01 in other laws.
06:02 So you have taken the consent.
06:04 But someone later can sit in judgment
06:06 of whether this consent was really
06:09 required for the purpose that you have taken the consent for.
06:12 But this is no consent.
06:14 So from the two examples, Supratim, that you've read out,
06:17 I'm drawing two takeaways.
06:19 One is that if I'm going to any establishment
06:21 and they're saying that, look, to send you the bill,
06:24 we need your phone number.
06:25 And I agree to getting that bill on my number.
06:29 They can't send me promotional messages.
06:30 They can't send me offers.
06:32 They can't send me sale or discount alerts.
06:35 That is one.
06:36 Two, even if inadvertently, out of ignorance,
06:40 out of not paying attention, a data subject or a consumer
06:43 has agreed to processing of data or to usage of that data, which
06:53 I think is, if I can call it, unreasonable,
06:57 I could go to a data protection board
06:59 and complain about that data fiduciary.
07:02 Yeah, so two things.
07:04 I think let me also explain to you the second example which
07:08 I gave in relation to consent, which you have actually
07:10 given voluntarily.
07:13 The first example, which is deemed consent,
07:15 where it was assumed that because you walked into a store,
07:18 you were about to purchase and they say, madam,
07:19 please give your mobile number.
07:20 And there was nothing, no notice shown to you,
07:22 no consent given.
07:23 But you said, OK, please send the invoice to my phone,
07:26 no problem.
07:27 And because of which, you later started
07:30 receiving promotional messages.
07:31 That cannot be done.
07:32 Now, where you are actually giving consent,
07:35 after reading, those have to be, if you
07:37 read those specific items that are there, specific, informed,
07:41 unconditional, unambiguous.
07:44 So anyways, it has to be very specifically taken from you.
07:47 Now, if by the by, even under those circumstances,
07:51 you've given a consent which is not
07:55 matching with the purpose of providing of that service,
07:58 then there is always a possibility
08:00 that that can stand auto-deleted.
08:02 I think that is something that industry needs to wake up
08:05 and understand that this is quite important,
08:08 minus everything else.
08:09 You have to do a lot of work for other things.
08:11 But this is different from global practices as well.
08:14 So Supratim, how do you reverse-engineer it
08:16 as a consumer or a customer, if I can ask you that?
08:21 Let's say I've started getting promotional messages
08:25 from somewhere.
08:26 One is directly from the establishment
08:27 where I gave my phone number.
08:29 Other is that this data sharing that
08:31 is very prevalent and rampant in the industry.
08:35 I may not have dealt with an NBFC ever,
08:37 but I don't know how that NBFC got my data.
08:40 I would have never dealt with a particular bank.
08:44 And suddenly, I'm getting calls for auto loans
08:47 having not even applied for one with any bank.
08:50 How do I, as a customer, make sure--
08:53 or rather, figure out who to hold accountable for?
08:58 No, that's a difficult one.
08:59 Honestly, I don't think that we have an answer today.
09:02 Because like you correctly mentioned,
09:04 we are giving our phone numbers to several institutions.
09:08 Now, from where this got misused or who further
09:12 divulged this to whom, because of which you are getting,
09:15 I don't think we have an answer today.
09:17 You, as a consumer, can't find out.
09:21 But then, if this bill of 33 pages
09:25 doesn't answer this question, which we on a daily basis
09:28 are pestered by--
09:30 I mean, can I not say that this is a big lacuna in the law?
09:34 It is.
09:35 But it is also a difficulty if you see.
09:38 If I ask you that what could be a possible solution to find
09:41 out, it's a practical challenge, I would say.
09:45 Exactly.
09:46 If, let's say, I'm getting--
09:47 I'm pestered by one particular, let's say,
09:51 establishment who's giving out loans, for instance.
09:54 And then I complain to that.
09:56 Or let's say, OK, let's talk about grievance redressal.
09:59 If you're moving a little bit away from consent,
10:02 if, let's say, I call or rather write to the grievance
10:06 redressal officer of this establishment
10:08 and they're not able to tell me where they got my number from,
10:13 because I have never consented to them
10:15 using my phone number or my name or my email
10:19 address in some fashion, they don't come back to me
10:21 with an answer.
10:22 Is that not something that the bill can--
10:24 Yes, there is an avenue there.
10:26 Yes, you're right.
10:27 There is an avenue there if you try
10:29 to use it in that fashion, basically.
10:33 No, so when you say avenue, you mean that there is a solution,
10:38 but it's not there in the bill, right?
10:39 It's not directly there.
10:41 But if you, like you say, that if you are questioning
10:44 and if you're not getting the answer, probably that is--
10:47 but today, the way the bill is looking at it,
10:49 it's a very linear thing.
10:51 You went to an institution.
10:53 You gave your consent, or it has not
10:55 been used in a proper manner by that organization, et cetera.
10:58 This is a slightly different situation
11:01 that we are contemplating, which is not
11:02 there in the linear situation that the bill is contemplating.
11:07 Yeah, but the end goal should be this, right?
11:09 That it's meant for individual privacy, individual privacy
11:12 of personal data.
11:14 So to my mind, that's a big gaping hole
11:17 in how we are approaching this law then.
11:21 Yeah.
11:22 OK, let me come to some of the other, let's say,
11:26 changes, if I can compare it to the 2022 version, which
11:30 is on cross-border data transfers.
11:33 Earlier, the government in the 2022 version
11:36 said that, look, we will come out with a list of countries
11:38 that we are OK sending data from India to,
11:41 and that's getting processed in those jurisdictions.
11:46 Now they've said we've come out with a negative list.
11:49 I mean, in terms of implications,
11:51 how do you think this will pan out?
11:54 So I think it is slightly better,
11:55 because otherwise you would have to really go jurisdiction
11:59 by jurisdiction and have to whitelist
12:01 a number of countries that would have taken time, et cetera.
12:03 This could be a slightly better avenue,
12:05 because otherwise it's giving a free leeway,
12:08 unless there are some embargoed jurisdictions which
12:11 are called out.
12:12 And you need to accordingly see that you follow that diktat.
12:20 OK, and in terms of the scope itself,
12:25 what I could see a little bit of a change in approach
12:29 was that if an entity outside of India
12:32 is processing my data for profiling,
12:35 this law doesn't cover them.
12:37 The extra territorial--
12:38 Yeah, so it's a very interesting one.
12:40 Even GDPR covers this, that the extra territorial applicability
12:44 is on two fronts.
12:46 One is if you're providing goods or services
12:48 to the people in that country, which is India in this case.
12:51 And secondly, if you are profiling
12:52 the people in the country.
12:54 So earlier version had this profiling aspect,
12:57 which has now been removed.
12:58 So right now, if it's only providing goods or services
13:01 to data subjects in India from outside,
13:04 this law gets applicable to that particular entity.
13:08 In practical terms, what does it mean?
13:11 See, this is, again, a very interesting one.
13:14 In all probability, if you are profiling,
13:16 your intent could be that you want
13:19 to provide goods or services.
13:20 But interestingly, till the point you actually do that,
13:24 this bill keeps you outside the purview of the law.
13:28 So yeah.
13:30 No, but I mean, I get the implications
13:33 of extra territorial versus not.
13:35 But I'm saying, what does it translate into in terms?
13:39 I mean, I'm sharing--
13:41 I mean, we're accessing websites from all over the world.
13:47 And if they are profiling Indian data subjects,
13:49 why is it that the law is not addressing those situations?
13:53 I believe the way they would have looked at it
13:56 is that if it's a business organization,
13:59 and ultimately it is for providing goods or services,
14:01 they are providing goods and services,
14:03 it is in that scenario that we will capture it under the law.
14:08 But if it is not the case, then it
14:10 would remain outside the purview.
14:12 I think that's how they would have looked at it.
14:15 Can you elaborate on what the downsides of this are?
14:17 I mean, earlier cases which would have been covered
14:20 versus now not?
14:22 See, the downside would have been
14:23 that they could be profiling for various purposes, right?
14:26 Now, which may or may not ultimately
14:30 culminate into goods or service being provided.
14:34 In such a situation, you are keeping
14:36 that outside the purview of the bill, which
14:40 could be slightly problematic, dangerous, I would say.
14:45 Because profiling is happening of the data subjects or the data
14:48 principals in India.
14:49 But this law does not apply till someone is actually
14:53 trying to provide goods or services to them.
14:57 So, OK, let's say I'm accessing a gaming website
15:00 or I'm accessing a publication, right?
15:03 There's a situation where I'm paying for that subscription
15:07 or where I'm using it for free.
15:09 Is that the difference that the law will also make now?
15:12 No, I don't think that payment or non-payment is really
15:15 the aspect here.
15:17 But providing of any good or service
15:19 is what they have to talk about as a benchmark.
15:23 No, sorry, just break it down for me.
15:25 What do you mean by providing--
15:26 I mean, I'm not, let's say, XYZ global publication, right?
15:30 And now I'm reading their content, not as a subscriber.
15:33 I'm not paying for that service.
15:35 Does that mean that they can go ahead and profile?
15:38 Right, so I think in some of the foreign jurisdictions,
15:40 there is more guidance around this.
15:42 OK, for example, in EU, when you see GDPR,
15:46 they will say, are you providing that particular service
15:48 in some of the languages which is there in EU?
15:50 Is the currency of any of the countries
15:53 there mentioned in your website?
15:56 Which clearly indicates that you want to do business there.
15:59 And therefore, that's the way you have structured
16:02 your website, et cetera.
16:04 So I think that kind of guidance can be taken.
16:08 But if your intent is really to do business,
16:11 provide goods, provide services, then this law
16:13 should be applicable to you, whether or not
16:14 you have physical presence here.
16:16 So let's say GDPR, you're saying,
16:18 sort of doesn't make this exception.
16:21 In GDPR, you have both, the goods and services
16:23 and the profiling part, both are there.
16:27 So why have we done away with it?
16:29 So this was there till the last version.
16:30 It has now been done away with, like I was telling.
16:32 No, I know.
16:33 But I'm saying, you talk to industry.
16:34 You talk to the government.
16:35 Can you sort of help us understand why--
16:38 or I mean, there's some rationale to this removal,
16:40 right?
16:41 So can you--
16:41 I believe the way it would have been looked into
16:44 is purely commercial, that if there
16:46 is a commercial exploitation, probably then we
16:49 bring it within the purview, but not otherwise.
16:52 But honestly, beyond this, presently, we're
16:55 not aware as to what could have transpired
16:58 behind this particular sudden last minute change.
17:01 Yeah, because I mean, there could be consequences, right?
17:04 You're not probably providing me a goods or a service today.
17:07 But based on what you have profiled me,
17:10 you can start targeting me based on that data from tomorrow,
17:13 right?
17:14 So it was a little puzzling to me
17:15 as to why this got taken away.
17:18 No, it is.
17:20 OK, let me come to in terms of preparedness.
17:22 And this is just the first quick sort of assessment
17:26 of, in terms of preparedness, what sort of businesses
17:29 should do.
17:31 What is the work, let's say, businesses have been doing
17:33 over the last several years
17:36 that this bill has seen or taken different avatars?
17:40 And where do you think the preparedness is now?
17:43 Anything that has to significantly change
17:45 based on the proposals that got tabled today?
17:49 See, I think-- see, there are a lot of things to be done.
17:53 For example, if you look at data subject rights or data
17:57 principal-related rights, there itself,
17:59 organizations will have to carry out a lot of activities.
18:03 If you have to--
18:04 if someone comes knocking at your door
18:06 and they say, I want to understand what kind of data
18:08 you have about me, right of erasure, right of correction,
18:11 et cetera, all of those, you need
18:13 to have that mechanism built in in order
18:15 to see that you are able to respond to those requests.
18:19 But I am going to a more fundamental aspect
18:22 of this particular bill, and especially the new avatar,
18:25 is consent.
18:28 So like I was mentioning, I think
18:29 we have to really, really look at the way our consent
18:32 language is, the way it is taken.
18:36 Withdrawing consent should be as easy as giving consent.
18:40 You have to keep evidence that you actually provided notice,
18:44 you actually took consent from a particular data principal.
18:49 Because on a bad day, you will have to prove that.
18:52 So these aspects, I don't think largely Indian organizations
18:56 are equipped with.
18:58 So forget about other more complex things
19:01 like data subject rights and how do you adhere to that.
19:04 But notice, consent, having a proper contract
19:08 with your data processor, which is specifically
19:10 called out in the law, these are things
19:13 I think which organizations should start working towards.
19:18 A number of organizations have started
19:19 moving from a bronze standard to a silver standard.
19:23 But I think now we will have to move towards a gold standard,
19:25 and especially with these [INAUDIBLE]
19:27 The other one, I think last time also we
19:29 had discussed this translation into all the Eighth
19:32 Schedule languages, 22 languages,
19:34 which is a massive thing for every organization to adhere
19:37 to.
19:37 So these are things they'll have to start working towards.
19:40 Tell me something.
19:41 So there was, of course, after the last November's version,
19:44 there was a lot of human cry from the startup community
19:48 that, look, this is all very onerous for us,
19:50 and compliance is going to be difficult.
19:54 So there is a specific exemption from certain sections
19:58 of [INAUDIBLE] sections that the bill now brings in.
20:03 Any of those seem excessive to you?
20:05 Because some of them are--
20:07 let's say the startup is sharing my personal data
20:10 with any other third entity.
20:12 They don't have to now ensure that it's accurate
20:14 and it's up to date, et cetera.
20:17 Similarly, the grievance-- or rather,
20:20 the requirement of giving notice is not there for startups.
20:25 Again, startups, no matter how big they get,
20:30 They cannot be classified as significant data fiduciary,
20:33 if I've read that correctly.
20:35 Do some of these seem excessive to you?
20:38 I think the fundamental point is that not every startup
20:41 will get classified.
20:43 There is a layering which is there.
20:45 And depending on what kind of data
20:47 they're handling, the volume that they're handling,
20:49 they get that exemption.
20:52 We have seen that a set of four people
20:54 sitting in a particular office are
20:56 handling children's psychological data
20:59 of a foreign jurisdiction.
21:01 So you can imagine, even if it is four people,
21:03 they are handling something which is extremely critical.
21:06 So I think we should look at it from the criticality
21:08 perspective, rather the criticality of data perspective,
21:11 rather than just size, et cetera.
21:15 Yes, some of this--
21:16 But is that decision-based?
21:18 Yeah, yeah.
21:20 That kind of data you're dealing with.
21:22 Yeah, so basically, that threshold is also there.
21:24 It's not that everyone is getting it.
21:27 So if you look at it, the provision
21:29 is curated in a manner where you get--
21:32 also, you have that pre-screen in order to get the exemption.
21:40 OK, so I may have missed the categorization of data.
21:43 What does it exactly say, that if you're dealing with data
21:46 in x, y, z field?
21:48 No, it's the kind and volume, I think.
21:50 If I remember this correctly, it will
21:52 be the kind and volume of data which they are handling.
21:56 This is that-- the startups.
22:00 That is also one of the thresholds, basically.
22:02 But that is for significant data fiduciary, right?
22:04 Or does that--
22:05 No, no.
22:06 Significant data fiduciary have all those other requirements,
22:09 which is, again, criticality of data, et cetera.
22:12 So that's a different one.
22:13 But even where you--
22:14 So, OK, I get what you're saying.
22:16 Because the language says that having regard
22:17 to the volume and nature of personal data being processed,
22:21 the central government may notify certain data fiduciaries
22:24 or a class of data fiduciaries, including startups.
22:27 But again, I mean, this is going to come by rules.
22:32 See, how it will come or be determined is yet to be seen.
22:35 But at least the good part is that it's not a free leeway.
22:39 Just because you are a startup, you go ahead.
22:40 Because we have seen, like I was mentioning to you,
22:43 this is actually a real example.
22:44 Because if you are handling such sensitive data,
22:49 then you have to be under the purview of the law.
22:52 OK, so you're saying then the rules should elaborate.
22:55 That let's say somebody-- a startup who's handling,
22:57 let's say, my shopping data of clothes versus somebody who's
23:02 looking at health data, for instance,
23:06 the rules should make a distinction
23:08 between these two startups.
23:10 Yeah.
23:11 OK, I missed you.
23:12 Any other sort of--
23:14 I mean, this was a quick first take.
23:16 Any other points that you think that businesses
23:19 should look at a little more closely in this new version?
23:24 I think we have covered a number of them.
23:27 I think one is that grievance redressal.
23:29 OK, I think this one, if you see the government's move right
23:32 now, whether it is social media intermediaries or otherwise,
23:36 I think they are laying a lot of emphasis on grievance redressal.
23:39 In fact, this draft calls out that you
23:41 have to first exhaust the remedy of going to this data
23:45 fiduciary and exhaust your grievance redressal process
23:49 there.
23:50 And then you go ahead and you can go to the data protection
23:53 board.
23:53 So now it's a more tiered--
23:55 It's a more tiered approach.
23:57 And also, I think it kind of puts
24:00 a lot of onus on organizations to ensure that that is
24:04 actually paid heed to.
24:05 And you listen to the grievance that is there.
24:09 Otherwise, things could escalate.
24:11 So I think that's an important one.
24:15 Otherwise, I think a lot of emphasis
24:19 has moved from data processor to data controller.
24:24 So if you look at it-- and this was a discussion we had when
24:28 the consultations happened.
24:30 Earlier, the data controller and data processor
24:32 both were responsible to report breach.
24:37 And we had this discussion that many times,
24:39 the data processor does not even have access to the data
24:42 subject.
24:42 So how can they really go and do breach reporting to them?
24:46 So now if you see, even the overall construct,
24:49 the way it is, including your penalty schedule edits,
24:52 it has shown that the responsibility, liability
24:56 is largely on the data controller.
24:58 And that also means that the contract
25:00 that you are entering into with your processor,
25:02 it should be more comprehensive, robust,
25:05 because that could determine on a bad day
25:09 who handles what kind of monetary repercussion.
25:12 So sort of translate this into an actual sort of transaction
25:15 where there would be a controller versus a processor
25:19 and, of course, the data subject?
25:21 Yeah, so let's take a simple example.
25:23 Say you have a payroll guy.
25:28 And you have told them, listen, on the first
25:30 of every month, these people should get their salary.
25:34 So I think you have clearly given the dictate to them
25:38 as to what they have to do with the data.
25:40 And they cannot do anything beyond that.
25:42 Your responsibility or liability is
25:44 to see that everything is clearly called out
25:47 in the document, what they can do, cannot do, et cetera.
25:51 And because the onus, as per the bill,
25:54 is largely on you to ensure that all this is done.
25:56 You cannot say that, oh, I don't know what they did after I
25:59 had given the data to them.
26:01 So that responsibility, liability under this bill
26:03 is remaining there.
26:04 And so your employees are then getting those payments, et
26:08 cetera.
26:08 So in this situation, you as an organization
26:13 which is the data controller, or the data fiduciary,
26:17 you would be primarily responsible.
26:21 The processor has to largely follow what you've said.
26:25 So that's the--
26:28 Shift in burden in that sense.
26:30 Yeah.
26:31 OK.
26:32 All right, Supratim, many thanks for your first quick views
26:35 on the latest version of the Digital Personal Data Protection Bill.
26:39 And of course, as we all read the fine print,
26:41 there will be many and more issues to discuss.
26:43 The government has also assured the opposition,
26:45 which raised a lot of objections today,
26:47 that there will be an actual discussion
26:49 on every aspect of the bill.
26:51 So hopefully, looking forward to that.
26:53 Supratim, many thanks for joining us here on The Fine Print.
26:56 And thank you so much for watching.
26:58 Thank you.
26:59 [MUSIC PLAYING]
