Conditional Probability & Bayes Theorem

Transcript
00:00Hello everyone. In this session we will be learning the Bayesian classifier concept and how it is
00:22estimated in machine learning.
00:23In this lesson, we will be learning about the basics of conditional probability and
00:29the Bayes theorem.
00:30This is for you to understand before we get into the Naive Bayes algorithm.
00:35It will show you what is happening behind the Naive Bayes algorithm.
00:39So first we will move on to conditional probability: an introduction and the idea behind
00:43it.
00:44Here the role of a class is to predict the values of the features for members of that
00:51class.
00:52So we take a particular class, and the role of that class is to predict those values.
00:59The idea behind the Bayesian classifier is that if an agent knows the class, it can predict
01:06the values of the other features.
01:08For example, if you know class A and how class A behaves, you will be able to predict
01:19the values that would be related to the class A category.
01:24But if the agent does not know the class, that is where Bayes' rule is considered.
01:30Bayes' rule is useful because it helps us predict the class given the features, or the
01:38features related to the particular class.
01:42Now, in a Bayesian classifier, the learning agent builds a probabilistic model of the
01:48features, and it uses the model to predict the classification of a new example.
01:55So if you already know about a particular class, you will be able to predict the
02:02feature values of that class for future predictions.
02:06But if you are not aware, if the details of the class are unknown, you use the Bayes concept,
02:12and with that you are able to predict the features of that particular new class.
02:20So this is how the Bayesian concept works.
02:22Now the definition here is: a Bayesian classifier is a probabilistic model where the classification
02:29is a latent variable that is probabilistically related to the observed variables.
02:37That is, the latent class variable is probabilistically related to the observed feature variables.
02:44Classification then becomes inference in the probabilistic model.
02:50So this is the definition of a Bayesian classifier.
02:53Now, conditional probability. You must have studied in your math
02:58classes how conditional probability works between two events.
03:05It is the probability of the occurrence of an event A given that an event B has already occurred.
03:11The conditional probability of A given B is denoted, of course, by P(A|B). The formula
03:18here is P(A|B) = P(A ∩ B) / P(B), where P(B) is not
03:26equal to 0.
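As a minimal illustration of this formula, here is a short Python sketch; the function name and the sample numbers are assumptions for illustration, not from the lecture:

```python
# Minimal sketch of the conditional-probability formula P(A|B) = P(A ∩ B) / P(B).
# The numbers below are illustrative assumptions, not from the lecture.

def conditional_probability(p_a_and_b: float, p_b: float) -> float:
    """Return P(A|B) given P(A ∩ B) and P(B); P(B) must be non-zero."""
    if p_b == 0:
        raise ValueError("P(B) must be non-zero for P(A|B) to be defined")
    return p_a_and_b / p_b

# Example: if P(A ∩ B) = 0.12 and P(B) = 0.40, then P(A|B) = 0.30.
print(conditional_probability(0.12, 0.40))  # 0.3
```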
03:28So now we have a set of independent events.
03:31For example, if there are two independent events A and B, then P(A ∩ B) is equal
03:36to P(A) · P(B). Now, if we have three events A, B, C, then pairwise independence means you
03:42check each pair: P(A ∩ B) = P(A)P(B), P(B ∩ C) = P(B)P(C), and P(C ∩ A) = P(C)P(A).
03:52Now, if the three events are mutually independent, how does the formula work?
04:00You will take P(B ∩ C) = P(B)P(C), P(C ∩ A) = P(C)P(A), and P(A ∩ B) = P(A)P(B), and in addition
04:06the intersection of all three events must factorize: P(A ∩ B ∩ C) = P(A)P(B)P(C).
04:13So this is how independent events work in the case of conditional probability.
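As a quick sanity check of that product rule, the following Python sketch verifies independence exhaustively for two fair dice (an assumed setup, not an example from the lecture):

```python
from itertools import product
from fractions import Fraction

# All 36 equally likely outcomes of rolling two fair dice.
outcomes = list(product(range(1, 7), repeat=2))

def prob(event) -> Fraction:
    """Exact probability of an event over the 36 equally likely outcomes."""
    hits = sum(1 for o in outcomes if event(o))
    return Fraction(hits, len(outcomes))

def A(o):  # first die shows an even number
    return o[0] % 2 == 0

def B(o):  # second die shows 5 or 6
    return o[1] >= 5

p_a, p_b = prob(A), prob(B)                 # 1/2 and 1/3
p_ab = prob(lambda o: A(o) and B(o))        # 1/6
print(p_ab == p_a * p_b)                    # True: A and B are independent
```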
04:21Now, this is for you to understand regarding independent and dependent events.
04:26Independent events are events that do not affect the outcome of each other.
04:32If there are two events A and B, the outcome of event A does not affect
04:39the outcome of event B. In terms of probability, two events are independent
04:45if the probability of one event occurring in no way affects the probability of the second event
04:51occurring.
04:53This is how independent events are defined.
04:57Now, dependent events: two events are dependent when the outcome of the first event influences
05:03the outcome of the second event.
05:06The probability of two dependent events both occurring is the product of the probability of X and the
05:12probability of Y after X occurs: P(X and Y) = P(X) · P(Y after X).
05:16"After X occurs" is the very important term here.
05:20So this is what dependent events look like.
05:25And this is what the conditional probability concept specifies.
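As a worked illustration of this product rule, consider the classic draw of two aces without replacement; this sketch is an assumed example, not one from the lecture:

```python
from fractions import Fraction

# Sketch of the dependent-events product rule P(X and Y) = P(X) * P(Y after X),
# using the classic "two aces without replacement" example as an assumed
# illustration (this example is not in the lecture itself).
p_x = Fraction(4, 52)          # P(first card drawn is an ace)
p_y_after_x = Fraction(3, 51)  # P(second card is an ace, after one ace is gone)
print(p_x * p_y_after_x)       # 1/221
```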
05:28Now we are going into Bayes' theorem.
05:31It is also known as Bayes' rule or Bayes' law.
05:35It is used to determine the probability of a hypothesis with prior knowledge: with
05:43knowledge that you already have, you are able to determine the probability of a hypothesis.
05:51It also rests on conditional probability: P(A|B) is equal
05:56to P(B|A) · P(A) / P(B), where P(B) is not equal to 0.
06:03So P(A|B) is the posterior probability.
06:07These are certain terminologies which you should be aware of.
06:10P(A|B) is the posterior probability: the probability of hypothesis A given the observed
06:18event B. Now, P(B|A) is the likelihood: the probability
06:25of the evidence given that the hypothesis is true,
06:30that is, the evidence observed assuming hypothesis A is true.
06:34Now, P(A) is the prior probability: the probability of the hypothesis before observing the evidence,
06:42and P(B) is the marginal probability, that is, the probability of the evidence.
06:48So these are the terminologies you should be aware of when you see these terms.
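To tie these terms together, here is a minimal Python sketch of the rule; the numbers are placeholder assumptions, not values from the lecture:

```python
# Minimal sketch of Bayes' theorem: posterior = likelihood * prior / evidence.
# Variable names mirror the terminology above; the inputs are placeholders.

def bayes_posterior(likelihood: float, prior: float, evidence: float) -> float:
    """Return P(A|B) = P(B|A) * P(A) / P(B); P(B) must be non-zero."""
    if evidence == 0:
        raise ValueError("P(B) must be non-zero")
    return likelihood * prior / evidence

# Example: P(B|A) = 0.8, P(A) = 0.1, P(B) = 0.2 gives P(A|B) = 0.4.
print(bayes_posterior(0.8, 0.1, 0.2))  # 0.4
```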
06:54Now, this is the generalization: there is a sample space divided into disjoint
07:00events B1, B2, ..., and an event A defined on that sample space.
07:05So how do you generalize P(Bi|A)? This is how the generalization formula
07:12looks.
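The formula on the slide is not reproduced in the transcript, but the standard generalized Bayes formula for disjoint events B1, ..., Bn that partition the sample space is:

```latex
P(B_i \mid A) = \frac{P(A \mid B_i)\, P(B_i)}{\sum_{j=1}^{n} P(A \mid B_j)\, P(B_j)}
```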
07:13Now, for your better understanding, I have given a problem here.
07:18There are 3 factories A, B, C of an electric bulb manufacturing company, which produce
07:24respectively 35%, 35% and 30% of the total output.
07:30Approximately 1.5%, 1% and 2% of the bulbs they respectively produce
07:36are defective.
07:38What you have to calculate is the probability that a defective bulb was manufactured in
07:45factory A. So we have 3 factories, the share of total output of each factory, and of course
07:53the rates at which defective bulbs are produced.
07:58Based on that, we set up the problem.
08:02We know P(A) = 0.35 and P(B) = 0.35.
08:09P(C) is 30%, that is 0.30.
08:16Now you need the defective rates. Writing D for a defective bulb, we have them for all 3 factories,
08:23that is A, B, C.
08:24So P(D|A) = 0.015, the probability of a defective bulb produced by factory B is P(D|B) = 0.010,
08:33and the probability of a defective bulb produced by factory C is P(D|C) = 0.020.
08:41Now, by the Bayes theorem concept, we need to find the probability that a defective bulb
08:50was manufactured in factory A, not in the other factories.
08:55So P(A|D) is given by the generalized formula above; you have all the values, you substitute
09:01them, and you get the probability, which is 0.356.
09:05So this is how the calculation takes place.
09:08This is just an example for you to understand.
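As a check on that substitution, here is a minimal Python sketch using the numbers stated above (the dictionary names are just illustrative):

```python
# Reproducing the factory example with the generalized Bayes formula.
# Priors: share of total output per factory; likelihoods: defect rates.
priors = {"A": 0.35, "B": 0.35, "C": 0.30}
defect_rate = {"A": 0.015, "B": 0.010, "C": 0.020}

# P(D): total probability that a randomly chosen bulb is defective.
p_defective = sum(priors[f] * defect_rate[f] for f in priors)

# P(A|D): probability that a defective bulb came from factory A.
p_a_given_d = defect_rate["A"] * priors["A"] / p_defective
print(round(p_a_given_d, 3))  # 0.356
```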
09:12Now we are coming to Bayes' theorem for classification.
09:19Classification is a predictive modeling problem that involves assigning a label to a given
09:26input data sample.
09:27The problem of classification predictive modeling can be framed as calculating the conditional
09:34probability of a class label given a data sample.
09:40For example, the probability of the class given the data is found as the probability
09:46of the data given the class, times the probability of the class, divided by the probability
09:53of the data: P(class|data) = P(data|class) · P(class) / P(data).
09:54So here, the probability of the class given the data tells us how strongly the class
10:01is related to the data which has been provided.
10:47This calculation can be performed for each class in the problem, and the class that is
10:53assigned the largest probability can be selected and assigned to the input data.
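Here is a minimal Python sketch of that largest-probability (maximum a posteriori) selection; the class names and the probability values are illustrative assumptions, not from the lecture:

```python
# Sketch of the MAP decision rule described above: score each class by
# P(data|class) * P(class) and pick the argmax. P(data) cancels because it
# is the same for every class. The numbers are illustrative assumptions.
priors = {"spam": 0.4, "ham": 0.6}
likelihood = {"spam": 0.05, "ham": 0.01}  # P(data | class) for one sample

scores = {c: likelihood[c] * priors[c] for c in priors}
predicted = max(scores, key=scores.get)
print(predicted)  # 'spam' (0.020 > 0.006)
```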
11:02Okay, now, when you hear all this, you know that it is a very challenging process,
11:06because you need to get the data, you need to label it with its class,
11:11and you need to understand the probability relationship between the class and the data.
11:15Only then can you do this Bayes theorem classification.
11:18In particular, you need the conditional probability of the observation given the class, that is,
11:26the probability of the data given the class.
11:30Estimating this directly is not feasible unless the number of examples is extraordinarily large.
11:36So we need a very large data set before we can apply the concept of the probability
11:45of the data given the class.
11:48So this is how Bayes theorem classification is done.
11:52You need to consider the probability of the class, the probability of the
11:56data, and the probability of the data given the class.
12:01Only with these will you be able to understand how the Bayes theorem classifier works and
12:08how it is implemented.
12:11Now, the solution to using Bayes' theorem for classification
12:16is to simplify the calculation.
12:19We know the calculations are much more complex because we need a large amount of data.
12:24So, in order to simplify the calculation, we use two classifiers.
12:31One is the Naive Bayes classifier and the second one is the Bayes optimal classifier.
12:37We will not get into the Bayes optimal classifier in detail, but in the next section
12:42we will go through the Naive Bayes classifier.
12:45The Naive Bayes classifier is based on Bayes' theorem with an assumption of independence
12:51among the predictors.
12:54That is, we assume independence among the predictors.
12:58The Bayes optimal classifier, in contrast, is a probabilistic model that makes the most
13:05likely prediction for a new example, given the training data set.
13:09We give it the training data set and we predict the most likely
13:16value for the new example.
13:20So this is how the calculation is made much easier when using Bayes' theorem, that
13:26is, via the Naive Bayes classifier and the Bayes optimal classifier.
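As a preview of the simplification, here is a minimal Python sketch of the naive independence assumption, where P(data|class) is approximated by a product of per-feature probabilities; all class names and numbers are illustrative assumptions:

```python
from math import prod

# Sketch of the naive independence assumption: P(x1,...,xn | class) is
# approximated as the product of per-feature probabilities. Feature values
# and probabilities here are illustrative, not from the lecture.
per_feature_likelihoods = {
    "spam": [0.7, 0.2, 0.9],  # P(x_i | spam) for three features
    "ham":  [0.1, 0.6, 0.4],  # P(x_i | ham)
}
priors = {"spam": 0.4, "ham": 0.6}

scores = {c: priors[c] * prod(per_feature_likelihoods[c]) for c in priors}
print(max(scores, key=scores.get))  # 'spam' (0.0504 > 0.0144)
```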
13:30With this we have completed this session.
13:32Thank you so much.