Past JEE Main Entrance Papers

Overview

Random experiment

An experiment is called random if it has more than one possible outcome and it is not possible to predict with certainty which outcome will occur. For instance, in an experiment of tossing an ordinary coin, it can be predicted with certainty that the coin will land either heads up or tails up, but it is not known for sure whether heads or tails will occur. If a die is thrown once, any of the six numbers \(1, 2, 3, 4, 5, 6\) may turn up, and it is not certain which number will come up.

  • Outcome: A possible result of a random experiment is called its outcome. For example, if the experiment consists of tossing a coin twice, some of the outcomes are \(HH, HT\), etc.
  • Sample Space: A sample space is the set of all possible outcomes of an experiment. In fact, it is the universal set S pertinent to a given experiment.

The sample space for the experiment of tossing a coin twice is given by
\(
S =\{ HH , HT , TH , TT \}
\)
The sample space for the experiment of drawing a card out of a deck is the set of all cards in the deck.
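
Small sample spaces like these can also be enumerated mechanically. Below is a minimal Python sketch (written for this note, not from any standard probability library) that builds the two-toss sample space:

```python
from itertools import product

# Sample space for tossing a coin twice: every ordered
# pair of outcomes drawn from {H, T}.
sample_space = {''.join(p) for p in product('HT', repeat=2)}
print(sample_space)  # {'HH', 'HT', 'TH', 'TT'} (in some order)
```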

Event

An event is a subset of a sample space S. For example, the event of drawing an ace from a deck is
\(A=\{\) Ace of Hearts, Ace of Clubs, Ace of Diamonds, Ace of Spades \(\}\)

Types of events

  • Impossible and Sure Events: The empty set \(\phi\) and the sample space \(S\) are also events. In fact, \(\phi\) is called an impossible event and \(S\), i.e., the whole sample space, is called a sure event.
  • Simple or Elementary Event: If an event E has only one sample point of a sample space, i.e., a single outcome of an experiment, it is called a simple or elementary event. The sample space of the experiment of tossing two coins is given by
    \(
    S =\{ HH , HT , TH , TT \}
    \)
    The event \(E_1=\{ HH \}\) containing a single outcome HH of the sample space S is a simple or elementary event. If one card is drawn from a well shuffled deck, any particular card drawn like ‘queen of Hearts’ is an elementary event.
  • Compound Event: If an event has more than one sample point, it is called a compound event. For example, in the two-coin experiment above, \(E =\{ HH , HT \}\) is a compound event.
  • Equally Likely Events: A number of simple events are said to be equally likely if there is no reason for one event to occur in preference to any other event.
  • Complementary event: Given an event A , the complement of A is the event consisting of all sample space outcomes that do not correspond to the occurrence of \(A\).
    The complement of \(A\) is denoted by \(A^{\prime}\) or \(\overline{A}\) or \(A^{C}\). It is also called the event 'not \(A\)'. Further, \(P(\overline{A})\) denotes the probability that \(A\) will not occur.
    \(
    A^{\prime}=A^{C}=\overline{A}= S - A =\{w: w \in S \text { and } w \notin A \}
    \)

Event '\(A\) or \(B\)'

If \(A\) and \(B\) are two events associated with the same sample space, then the event '\(A\) or \(B\)' is the same as the event \(A \cup B\) and contains all those elements which are either in \(A\) or in \(B\) or in both. Furthermore, \(P(A \cup B)\) denotes the probability that \(A\) or \(B\) (or both) will occur.

Event '\(A\) and \(B\)'

If \(A\) and \(B\) are two events associated with a sample space, then the event '\(A\) and \(B\)' is the same as the event \(A \cap B\) and contains all those elements which are common to both \(A\) and \(B\). Furthermore, \(P(A \cap B)\) denotes the probability that both \(A\) and \(B\) will occur simultaneously.

The Event 'A but not B' (Difference \(A - B\))

The event \(A-B\) is the set of all those elements of the sample space \(S\) which are in \(A\) but not in \(B\), i.e., \(A - B = A \cap B^{\prime}\).
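
Since events are just subsets of \(S\), the operations above map directly onto set operations. A small illustrative sketch using Python's built-in set type (the particular events chosen here are arbitrary examples):

```python
S = {1, 2, 3, 4, 5, 6}   # sample space: one throw of a die
A = {2, 4, 6}            # event 'an even number appears'
B = {4, 5, 6}            # event 'a number >= 4 appears'

print(A | B)   # 'A or B'  (union):        {2, 4, 5, 6}
print(A & B)   # 'A and B' (intersection): {4, 6}
print(A - B)   # 'A but not B':            {2}
print(S - A)   # complement of A:          {1, 3, 5}
```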

Mutually exclusive event:

Two events \(A\) and \(B\) of a sample space S are mutually exclusive if the occurrence of any one of them excludes the occurrence of the other event. Hence, the two events \(A\) and \(B\) cannot occur simultaneously, and thus \(P ( A \cap B )=0\).

Remark

Simple or elementary events of a sample space are always mutually exclusive. For example, the elementary events \(\{1\},\{2\},\{3\},\{4\},\{5\}\) and \(\{6\}\) of the experiment of throwing a die are mutually exclusive.
Consider the experiment of throwing a die once.
The events \(E =\) getting an even number and \(F =\) getting an odd number are mutually exclusive events because \(E \cap F =\phi\).
Note For a given sample space, there may be two or more mutually exclusive events.

Exhaustive events

If \(E _1, E _2, \ldots, E _n\) are \(n\) events of a sample space S and if
\(
E _1 \cup E _2 \cup E _3 \cup \ldots \cup E _n=\bigcup_{i=1}^n E _i= S
\)
then \(E _1, E _2, \ldots, E _n\) are called exhaustive events.
In other words, events \(E_1, E_2, \ldots, E_n\) of a sample space \(S\) are said to be exhaustive if at least one of them necessarily occurs whenever the experiment is performed.
Consider the example of rolling a die. We have \(S=\{1,2,3,4,5,6\}\). Define the two events
\(A\): 'a number less than or equal to 4 appears',
\(B\): 'a number greater than or equal to 4 appears'.
Now
\(A=\{1,2,3,4\}, \quad B=\{4,5,6\}\)
\(A \cup B=\{1,2,3,4,5,6\}=S\)
Such events \(A\) and \(B\) are called exhaustive events.

Mutually exclusive and exhaustive events

If \(E _1, E _2, \ldots, E _n\) are \(n\) events of a sample space S and if \(E _i \cap E _j=\phi\) for every \(i \neq j\), i.e., \(E _i\) and \(E _j\) are pairwise disjoint and \(\bigcup_{i=1}^n E _i= S\), then the events \(E _1, E _2, \ldots, E _n\) are called mutually exclusive and exhaustive events.
Consider the example of rolling a die.
We have \(\quad S=\{1,2,3,4,5,6\}\)
Let us define the three events as
\(A = \) a number which is a perfect square
\(B =\) a prime number
\(C = \) a number which is greater than or equal to 6
Now \(A=\{1,4\}, B=\{2,3,5\}, C=\{6\}\)
Note that \(A \cup B \cup C =\{1,2,3,4,5,6\}= S\). Therefore, \(A , B\) and \(C\) are exhaustive events.
Also \(A \cap B = B \cap C = C \cap A =\phi\)
Hence, the events are pairwise disjoint and thus mutually exclusive.
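
Both conditions are easy to check mechanically. A minimal Python sketch for the die example above (the variable names are my own):

```python
from itertools import combinations

S = {1, 2, 3, 4, 5, 6}
events = [{1, 4}, {2, 3, 5}, {6}]   # A, B, C from the example

# Mutually exclusive: every pair of events is disjoint.
pairwise_disjoint = all(not (E & F) for E, F in combinations(events, 2))
# Exhaustive: the union of all the events is the whole sample space.
exhaustive = set().union(*events) == S

print(pairwise_disjoint, exhaustive)  # True True
```
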
The classical approach is useful when all the outcomes of the experiment are equally likely, since we can then use logic to assign probabilities. To understand the classical method, consider the experiment of tossing a fair coin. Here there are two equally likely outcomes: head \((H)\) and tail \((T)\). When the elementary outcomes are taken as equally likely, we have a uniform probability model: if there are \(k\) elementary outcomes in \(S\), each is assigned the probability \(\frac{1}{k}\). Therefore, logic suggests that the probability of observing a head, denoted by \(P(H)\), is \(\frac{1}{2}=0.5\), and that the probability of observing a tail, denoted by \(P(T)\), is also \(\frac{1}{2}=0.5\). Notice that each probability is between 0 and 1. Further, \(H\) and \(T\) are all the outcomes of the experiment and \(P(H)+P(T)=1\).

Classical definition

If all of the outcomes of a sample space are equally likely, then the probability that an event will occur is equal to the ratio
\(
\frac{\text { Number of outcomes favourable to the event }}{\text { Total number of outcomes of the sample space }}
\)
Suppose that an event \(E\) can happen in \(m\) ways out of a total of \(n\) possible equally likely ways.
Then the classical probability of occurrence of the event is given by
\(
P ( E )=\frac{m}{n}
\)
The probability of non-occurrence of the event \(E\) is given by
\(
P (\text { not } E )=\frac{n-m}{n}=1-\frac{m}{n}=1- P ( E )
\)
Thus \(P(E) + P(\text{not } E) = 1\).
The event 'not \(E\)' is denoted by \(\overline{E}\) or \(E^{\prime}\) (the complement of \(E\))
Therefore \(P (\overline{ E })=1- P ( E )\)
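
As a concrete instance, drawing an ace from a well-shuffled deck has \(m = 4\) favourable outcomes out of \(n = 52\). A short sketch computing \(P(E)\) and \(P(\text{not } E)\) exactly (Fraction avoids rounding):

```python
from fractions import Fraction

m, n = 4, 52              # favourable outcomes, total outcomes
p_E = Fraction(m, n)      # P(E) = m/n
p_not_E = 1 - p_E         # P(not E) = 1 - P(E)

print(p_E)                 # 1/13
print(p_not_E)             # 12/13
print(p_E + p_not_E == 1)  # True: P(E) + P(not E) = 1
```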

Axiomatic approach to probability

  • Let \(S\) be the sample space of a random experiment. The probability \(P\) is a real-valued function whose domain is the power set of \(S\), i.e., \(P(S)\), and whose range is the interval \([0,1]\), i.e., \(P : P(S) \rightarrow [0,1]\), satisfying the following axioms:
  • (i) For any event \(E\), \(P(E) \geq 0\).
  • (ii) \(P(S) = 1\).
  • (iii) If \(E\) and \(F\) are mutually exclusive events, then \(P(E \cup F) = P(E) + P(F)\).
    It follows from (iii) that \(P(\phi) = 0\).
    Let \(S\) be a sample space containing elementary outcomes \(w_1, w_2, \ldots, w_n\), i.e., \(S = \{w_1, w_2, \ldots, w_n\}\)

It follows from the axiomatic definition of probability that

  • \(0 \leq P(w_i) \leq 1\) for each \(w_i \in S\)
  • \(P(w_1) + P(w_2) + \ldots + P(w_n) = 1\)
  • \(P(A) = \sum P(w_i)\) for any event \(A\) containing elementary outcomes \(w_i\)

For example, if a fair coin is tossed once
\(P ( H )= P ( T )=\frac{1}{2}\) satisfies the three axioms of probability.
Now suppose the coin is not fair and has double the chance of landing heads up as compared to tails; then \(P(H)=\frac{2}{3}\) and \(P(T)=\frac{1}{3}\).
This assignment of probabilities is also valid for \(H\) and \(T\) because it satisfies the axiomatic definition.
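
An assignment of probabilities can be verified against the axioms directly. A minimal checker for a finite sample space (the function name axioms_hold is my own, for illustration):

```python
from fractions import Fraction

def axioms_hold(assignment):
    """Check P(w_i) >= 0 for every outcome and that the P(w_i) sum to 1."""
    return all(p >= 0 for p in assignment.values()) and \
           sum(assignment.values()) == 1

fair   = {'H': Fraction(1, 2), 'T': Fraction(1, 2)}
biased = {'H': Fraction(2, 3), 'T': Fraction(1, 3)}  # heads twice as likely
bad    = {'H': Fraction(3, 4), 'T': Fraction(1, 2)}  # sums to 5/4: invalid

print(axioms_hold(fair), axioms_hold(biased), axioms_hold(bad))
# True True False
```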


Probabilities of equally likely outcomes

Let the sample space of an experiment be \(S =\{w_1, w_2, \ldots, w_n\}\) and suppose that all the outcomes are equally likely to occur, i.e., the chance of occurrence of each simple event must be the same: \(P(w_i)=p\) for all \(w_i \in S\), where \(0 \leq p \leq 1\).
Since
\(
\sum_{i=1}^{n} P(w_i) = 1
\)
i.e.,
\(
\begin{aligned}
& p+p+p+\ldots+p \ (n \text { times })=1 \\
\Rightarrow \quad & n p=1, \quad \text { i.e., } \quad p=\frac{1}{n}
\end{aligned}
\)
Let S be the sample space and E be an event, such that \(n(S)=n\) and \(n( E )=m\). If each outcome is equally likely, then it follows that
\(
P ( E )=\frac{m}{n}=\frac{\text { Number of outcomes favourable to } E }{\text { Total number of possible outcomes }}
\)
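
This counting formula applies to any experiment whose equally likely outcomes can be enumerated. As an illustration (an example of my own, not from the text above), the probability that the sum of two fair dice is 7:

```python
from fractions import Fraction
from itertools import product

S = list(product(range(1, 7), repeat=2))   # 36 equally likely outcomes
E = [w for w in S if w[0] + w[1] == 7]     # favourable outcomes: sum is 7

print(Fraction(len(E), len(S)))            # 1/6
```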

Odds Against and Odds in Favour of an Event:

Let there be \(m + n\) equally likely, mutually exclusive and exhaustive cases, out of which an event \(A\) occurs in \(m\) cases and does not occur in \(n\) cases. Then, by definition, the probability of occurrence of event \(A\) is \(P(A)=\frac{m}{m+n}\).
The probability of non-occurrence of event \(A\) is \(P(A^{\prime})=\frac{n}{m+n}\). \(\therefore P(A) : P(A^{\prime}) = m : n\)
Thus the odds in favour of the occurrence of the event \(A\) are defined by \(m : n\), i.e., \(P(A) : P(A^{\prime})\); and the odds against the occurrence of the event \(A\) are defined by \(n : m\), i.e., \(P(A^{\prime}) : P(A)\).
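
Converting between odds and probability follows directly from these definitions. A short sketch (both helper names are my own):

```python
from fractions import Fraction

def prob_from_odds(m, n):
    """P(A) = m / (m + n), given odds in favour of A are m : n."""
    return Fraction(m, m + n)

def odds_in_favour(p):
    """Odds in favour m : n recovered from P(A) = p (a reduced Fraction)."""
    return (p.numerator, p.denominator - p.numerator)

# Odds in favour of drawing an ace from a deck are 4 : 48, i.e., 1 : 12.
print(prob_from_odds(1, 12))            # 1/13
print(odds_in_favour(Fraction(1, 13)))  # (1, 12)
```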

Addition rule of probability

If \(A\) and \(B\) are any two events in a sample space \(S\), then the probability that at least one of the events \(A\) or \(B\) will occur is given by
\(
P ( A \cup B )= P ( A )+ P ( B )- P ( A \cap B )
\)
Similarly, for three events \(A\), \(B\) and \(C\), we have
\(
\begin{aligned}
& P ( A \cup B \cup C )= P ( A )+ P ( B )+ P ( C )- P ( A \cap B )- P ( A \cap C )- P ( B \cap C )+ \\
& P ( A \cap B \cap C )
\end{aligned}
\)
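
The rule can be confirmed by direct counting on a small uniform sample space, e.g. the die events used earlier (a sketch, with P defined only for this uniform model):

```python
from fractions import Fraction

S = {1, 2, 3, 4, 5, 6}
A = {2, 4, 6}   # 'an even number appears'
B = {4, 5, 6}   # 'a number >= 4 appears'

def P(E):
    """Uniform probability: favourable outcomes over total outcomes."""
    return Fraction(len(E), len(S))

lhs = P(A | B)
rhs = P(A) + P(B) - P(A & B)
print(lhs, rhs, lhs == rhs)   # 2/3 2/3 True
```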

General form of addition theorem (Principle of Inclusion-Exclusion)

For \(n\) events \(A_1, A_2, A_3, \ldots . . . A_n\) in \(S\), we have
\(
P\left(A_1 \cup A_2 \cup A_3 \cup A_4 \ldots \ldots . . \cup A_n\right)
\)
\(
=\sum_{i=1}^{n} P(A_i)-\sum_{i<j} P(A_i \cap A_j)+\sum_{i<j<k} P(A_i \cap A_j \cap A_k)-\ldots+(-1)^{n-1} P(A_1 \cap A_2 \cap A_3 \ldots \cap A_n)
\)
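
For equally likely outcomes, the alternating sum can be computed and checked against the union directly. A sketch of the general formula (written for a uniform finite sample space; the events are arbitrary examples):

```python
from fractions import Fraction
from itertools import combinations

def inclusion_exclusion(S, events):
    """P(A_1 U ... U A_n) as the alternating-sign sum over intersections."""
    total = Fraction(0)
    for k in range(1, len(events) + 1):
        for subset in combinations(events, k):
            inter = set.intersection(*subset)
            total += (-1) ** (k - 1) * Fraction(len(inter), len(S))
    return total

S = set(range(1, 7))
events = [{1, 2, 3}, {2, 4, 6}, {3, 6}]
direct = Fraction(len(set().union(*events)), len(S))
print(inclusion_exclusion(S, events), direct)   # 5/6 5/6
```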

Addition rule for mutually exclusive events

If \(A\) and \(B\) are disjoint sets, then
\(P ( A \cup B )= P ( A )+ P ( B )[\) since \(P ( A \cap B )= P (\phi)=0\), where A and B are disjoint \(]\).
The addition rule for mutually exclusive events can be extended to more than two events.

Conditional Probability

If \(A\) and \(B\) are any events in \(S\), then the conditional probability of \(B\) relative to \(A\), i.e., the probability of occurrence of \(B\) when \(A\) has occurred, is given by
\(
P(B / A)=\frac{P(B \cap A)}{P(A)}, \quad \text { provided } P(A) \neq 0
\)
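
On a finite uniform sample space this reduces to counting within the reduced sample space \(A\). A brief sketch, reusing the die events from earlier:

```python
from fractions import Fraction

S = {1, 2, 3, 4, 5, 6}
A = {2, 4, 6}   # 'an even number appears'
B = {4, 5, 6}   # 'a number >= 4 appears'

# P(B | A) = P(B and A) / P(A) = |B & A| / |A| for equally likely outcomes.
p_B_given_A = Fraction(len(B & A), len(A))
print(p_B_given_A)   # 2/3
```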

Bayes’ Theorem

It is a way of finding a probability when we know certain other probabilities.
The formula is:
\(
P(A \mid B)=\frac{P(A) P(B \mid A)}{P(B)}
\)
which tells us how often \(A\) happens given that \(B\) happens, written \(P(A \mid B)\), when we know:
  • how often \(B\) happens given that \(A\) happens, written \(P(B \mid A)\),
  • how likely \(A\) is on its own, written \(P(A)\), and
  • how likely \(B\) is on its own, written \(P(B)\).
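
A quick numeric illustration (the numbers are invented for demonstration, not from the text): suppose 1% of items are defective (\(A\)), and a test flags 90% of defective items but also 5% of good ones (\(B\) = 'the test flags the item').

```python
from fractions import Fraction

p_A = Fraction(1, 100)             # P(A): item is defective (assumed)
p_B_given_A = Fraction(90, 100)    # P(B | A) (assumed)
p_B_given_notA = Fraction(5, 100)  # P(B | not A) (assumed)

# P(B) via the two cases A and not A (total probability).
p_B = p_A * p_B_given_A + (1 - p_A) * p_B_given_notA

# Bayes' theorem: P(A | B) = P(A) P(B | A) / P(B).
p_A_given_B = p_A * p_B_given_A / p_B
print(p_A_given_B)                 # 2/13, roughly 0.154
```

So even after a positive test, the chance the item is actually defective is only about 15%, because defective items are rare to begin with.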

“A” With Three (or more) Cases

We just saw "\(A\)" with two cases (\(A\) and not \(A\)), which we took care of in the bottom line: \(P(B) = P(A)\,P(B \mid A) + P(A^{\prime})\,P(B \mid A^{\prime})\).
When “\(A\)” has 3 or more cases we include them all in the bottom line:
\(
P(A_1 \mid B)=\frac{P(A_1)\, P(B \mid A_1)}{P(A_1)\, P(B \mid A_1)+P(A_2)\, P(B \mid A_2)+P(A_3)\, P(B \mid A_3)+\ldots}
\)
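
A sketch of the three-case form, with made-up priors \(P(A_i)\) and likelihoods \(P(B \mid A_i)\) (purely illustrative values):

```python
from fractions import Fraction

# Hypothetical priors P(A_i) and likelihoods P(B | A_i) for three cases.
priors      = [Fraction(1, 2), Fraction(3, 10), Fraction(1, 5)]
likelihoods = [Fraction(1, 10), Fraction(1, 2), Fraction(9, 10)]

# Bottom line: P(B) = sum of P(A_i) P(B | A_i) over all cases.
p_B = sum(p * l for p, l in zip(priors, likelihoods))

# Posterior P(A_i | B) for each case; the posteriors sum to 1.
posteriors = [p * l / p_B for p, l in zip(priors, likelihoods)]
print(posteriors, sum(posteriors) == 1)
# [Fraction(5, 38), Fraction(15, 38), Fraction(9, 19)] True
```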
