
Binomial variables


coin flip -> P(H) = 0.6; P(T) = 0.4

X = # of heads after 10 flips of my coin

  • made up of independent trials (flips)

  • each trial can be classified as either a success or a failure

  • fixed # of trials

  • the probability of success on each trial is constant

Inference -> The act or process of deriving logical conclusions from premises known or assumed to be true.

Distribution

X = # of heads from flipping a coin 5 times

possible outcomes from 5 flips: $2 \cdot 2 \cdot 2 \cdot 2 \cdot 2 = 2^5 = 32$

$P(X=0) = \frac{1}{32} = \frac{{}_5C_0}{32} \Rightarrow {}_5C_0 = \frac{5!}{0! \cdot (5-0)!} = \frac{5!}{5!} = 1$

$P(X=1) = \frac{5}{32} = \frac{{}_5C_1}{32} \Rightarrow {}_5C_1 = \frac{5!}{1! \cdot (5-1)!} = \frac{5!}{1! \cdot 4!} = 5$

$P(X=2) = \frac{10}{32} = \frac{{}_5C_2}{32} \Rightarrow {}_5C_2 = \frac{5!}{2! \cdot (5-2)!} = \frac{5!}{2! \cdot 3!} = \frac{5 \cdot 4}{2} = 10$

$P(X=3) = \frac{10}{32} = \frac{{}_5C_3}{32} \Rightarrow {}_5C_3 = \frac{5!}{3! \cdot (5-3)!} = \frac{5!}{3! \cdot 2!} = 10$

$P(X=4) = \frac{5}{32} = \frac{{}_5C_4}{32} \Rightarrow {}_5C_4 = \frac{5!}{4! \cdot (5-4)!} = \frac{5!}{4! \cdot 1!} = 5$

$P(X=5) = \frac{1}{32} = \frac{{}_5C_5}{32} \Rightarrow {}_5C_5 = \frac{5!}{5! \cdot (5-5)!} = \frac{5!}{5!} = 1$
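A quick way to sanity-check this table is to compute the distribution directly; a minimal Python sketch (not from the original notes; `math.comb` computes $_nC_k$):

```python
from math import comb

# Distribution of X = # of heads in 5 flips of a fair coin:
# P(X = k) = C(5, k) / 2^5
for k in range(6):
    print(f"P(X={k}) = {comb(5, k)}/32 = {comb(5, k) / 32}")
```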

Example:

prob (score) = 70% or 0.7

prob (miss) = 30% or 0.3

P(Exactly 2 scores in 6 attempts) =

One particular ordering of 2 scores (S) and 4 misses (M):

$\Longrightarrow \text{SSMMMM} = 0.7 \cdot 0.7 \cdot 0.3 \cdot 0.3 \cdot 0.3 \cdot 0.3 = (0.7)^2 \cdot (0.3)^4$

Number of orderings with exactly 2 scores in 6 attempts:

$\Longrightarrow {}_6C_2 = \binom{6}{2} = \frac{6!}{2! \cdot (6-2)!} = \frac{6 \cdot 5 \cdot 4 \cdot 3 \cdot 2 \cdot 1}{(2 \cdot 1)(4 \cdot 3 \cdot 2 \cdot 1)} = 15$

Putting it together:

$\binom{6}{2} \cdot 0.7^2 \cdot 0.3^4 = 15 \cdot 0.49 \cdot 0.0081 \approx 0.0595$

Generalizing k scores in n attempts:

f = probability of making a score on a single attempt

P(Exactly $k$ scores in $n$ attempts) $= \binom{n}{k} \cdot f^k \cdot (1-f)^{n-k}$

A binomial probability problem has these features:

  • a set number of trials ($n$)

  • each trial can be classified as a "success" or "failure"

  • results from each trial are independent from each other

  • the probability of success ($p$) is the same for each trial

Here's a summary of our general strategy for binomial probability:

Using the example from Problem 1:

  • $n = 3$ free-throws

  • each free-throw is a "make" (success) or a "miss" (failure)

  • assume free-throws are independent

  • the probability she makes a free-throw is $p = 0.90$

$$\begin{aligned}P(\text{makes 2 of 3 free throws}) &= {}_3\text{C}_2 \cdot (0.90)^{2} \cdot (0.10)^1 \\ &= 3 \cdot 0.81 \cdot 0.10 \\ &= 3 \cdot 0.081 \\ &= 0.243\end{aligned}$$

In general...

$$P(\text{exactly } k \text{ successes}) = {}_n\text{C}_k \cdot p^k \cdot (1-p)^{n-k}$$
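This formula is one line of Python; a minimal sketch (the helper name `binomial_pmf` is mine, not from the notes), checked against the two worked answers above:

```python
from math import comb

def binomial_pmf(k, n, p):
    """P(exactly k successes) = nCk * p^k * (1-p)^(n-k)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

print(binomial_pmf(2, 3, 0.90))  # 0.243   (2 of 3 free throws, p = 0.90)
print(binomial_pmf(2, 6, 0.70))  # ~0.0595 (2 scores in 6 attempts, f = 0.7)
```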

Challenge Problem

Steph promises to buy Luke ice cream if he makes 3 or more of his 4 free-throws. What is the probability that he makes 3 or more of the 4 free throws?

Luke gets ice cream if he makes 3 or 4 free throws. We can find the probability of each of those outcomes and add the results together. Here's how we can think of this problem:

  • $n = 4$ trials (shots)

  • each shot is either a make or a miss

  • shots are independent

  • the probability that he makes a shot is $p = 0.20$

$$\begin{aligned}P(\text{makes 3 or more}) &= P(\text{makes 3 free-throws}) + P(\text{makes 4 free-throws}) \\ &= {}_4\text{C}_3 \cdot (0.20)^3 \cdot (0.80)^1 + {}_4\text{C}_4 \cdot (0.20)^4 \cdot (0.80)^0 \\ &= \dfrac{4!}{(4-3)! \cdot 3!} \cdot 0.008 \cdot 0.80 + \dfrac{4!}{(4-4)! \cdot 4!} \cdot 0.0016 \\ &= 4 \cdot 0.008 \cdot 0.80 + 1 \cdot 0.0016 \\ &= 0.0256 + 0.0016 \\ &= 0.0272\end{aligned}$$
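The same sum can be checked numerically by summing the pmf over the qualifying values of $k$; a minimal sketch:

```python
from math import comb

# P(X >= 3) for n = 4 shots, p = 0.20: sum the binomial pmf over k = 3, 4
p_3_or_more = sum(comb(4, k) * 0.20**k * 0.80**(4 - k) for k in (3, 4))
print(round(p_3_or_more, 4))  # 0.0272
```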


Quiz #1

Layla has a coin that has a $60\%$ chance of showing heads each time it is flipped. She is going to flip the coin $5$ times. Let $X$ represent the number of heads she gets.

What is the probability that she gets more than $3$ heads?

Finding $P(X=5)$

For each flip, we know $P(\text{heads}) = 60\%$. To find the probability that all $5$ flips are heads, we can multiply probabilities since flips are independent:

$$\begin{aligned} P(X=5) &= (0.60)(0.60)(0.60)(0.60)(0.60) \\ &= (0.60)^5 \\ &= 0.07776 \end{aligned}$$

We'll come back and use this result later. Next, we need to find $P(X=4)$ (the probability that she gets $4$ heads).

Finding $P(X=4)$

Getting $4$ heads in $5$ attempts means Layla needs to get $4$ heads and $1$ tail. For each flip, we know $P(\text{heads}) = 60\%$ and $P(\text{tails}) = 40\%$.

Let's start by finding the probability of getting $4$ heads followed by $1$ tail:

$$P(\text{HHHHT}) = (0.6)^4(0.4) = 0.05184$$

This isn't the entire probability though, because there are other ways to get $4$ heads from $5$ flips (for example, THHHH). How many different ways are there? We can use the combination formula:

$$\begin{aligned} {}_n\text{C}_k &= \dfrac{n!}{(n-k)! \cdot k!} \\ {}_5\text{C}_4 &= \dfrac{5!}{(5-4)! \cdot 4!} \\ &= \dfrac{5 \cdot 4 \cdot 3 \cdot 2 \cdot 1}{(1) \cdot 4 \cdot 3 \cdot 2 \cdot 1} \\ &= 5 \end{aligned}$$

There are $5$ ways to get $4$ heads in $5$ flips. Do they all have the same probability?

Each of the $5$ ways has the same probability that we already found:

$$\begin{aligned} P(\text{HHHHT}) &= (0.6)^4(0.4) = 0.05184 \\ P(\text{HHHTH}) &= (0.6)^4(0.4) = 0.05184 \\ P(\text{HHTHH}) &= (0.6)^4(0.4) = 0.05184 \\ P(\text{HTHHH}) &= (0.6)^4(0.4) = 0.05184 \\ P(\text{THHHH}) &= (0.6)^4(0.4) = 0.05184 \end{aligned}$$

So we can multiply this probability by $5$ since that is how many ways there are to get $4$ heads in $5$ flips:

$$\begin{aligned} P(X=4) &= 5(0.6)^4(0.4) \\ &= 5(0.05184) \\ &= 0.2592 \end{aligned}$$

Putting it all together

Let's return to our original strategy to answer the question:

$$\begin{aligned} P(X>3) &= P(4\text{ heads}) + P(5\text{ heads}) \\ &= P(X=4) + P(X=5) \\ &= 5(0.6)^4(0.4) + (0.6)^5 \\ &= 0.2592 + 0.07776 \\ &= 0.33696 \\ &\approx 0.34 \end{aligned}$$
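If SciPy is available, `scipy.stats.binom` gives this tail probability directly; `binom.sf(k, n, p)` is the survival function $P(X > k)$ (a sketch, not part of the original notes):

```python
from scipy.stats import binom

# P(X > 3) for n = 5 flips, p = 0.6
print(binom.sf(3, 5, 0.6))  # ~0.33696
```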

Quiz #2

Ira ran out of time while taking a multiple-choice test and plans to guess on the last $6$ questions. Each question has $4$ possible choices, one of which is correct. Let $X =$ the number of answers Ira correctly guesses in the last $6$ questions.

What is the probability that he answers fewer than $2$ questions correctly in the last $6$ questions?

Strategy (without a fancy calculator)

The probability that Ira gets fewer than $2$ questions correct in the $6$ questions is equivalent to the probability that he gets $0$ or $1$ question correct. So we can find those probabilities and add them together to get our answer:

$$\begin{aligned} P(X<2) &= P(0\text{ correct}) + P(1\text{ correct}) \\ &= P(X=0) + P(X=1) \end{aligned}$$

Finding $P(X=0)$

There are $4$ possible choices for each question, so we know $P(\text{correct}) = 25\%$ and $P(\text{not correct}) = 75\%$. Answering $0$ questions correctly is equivalent to answering all $6$ questions incorrectly. We can multiply probabilities since we are assuming independence:

$$\begin{aligned} P(X=0) &= (0.75)(0.75)(0.75)(0.75)(0.75)(0.75) \\ &= (0.75)^6 \\ &\approx 0.17798 \end{aligned}$$

We'll come back and use this result later. Next, we need to find $P(X=1)$ (the probability that he answers $1$ question correctly).

Finding $P(X=1)$

Answering $1$ question correctly in the last $6$ questions means Ira needs to get $1$ question correct and $5$ questions not correct. There are $4$ possible choices for each question, so we know $P(\text{correct}) = 25\%$ and $P(\text{not correct}) = 75\%$.

Since we are assuming independence, let's multiply probabilities to find the probability of getting $1$ question correct followed by $5$ questions not correct:

$$P(\text{CNNNNN}) = (0.25)(0.75)^5 \approx 0.05933$$

This isn't the entire probability though, because there are other ways to get $1$ question correct from $6$ questions (for example, NNNNNC). How many different ways are there? We can use the combination formula:

$$\begin{aligned} {}_n\text{C}_k &= \dfrac{n!}{(n-k)! \cdot k!} \\ {}_6\text{C}_1 &= \dfrac{6!}{(6-1)! \cdot 1!} \\ &= \dfrac{6 \cdot 5 \cdot 4 \cdot 3 \cdot 2 \cdot 1}{(5 \cdot 4 \cdot 3 \cdot 2 \cdot 1) \cdot 1} \\ &= 6 \end{aligned}$$

There are $6$ ways to get $1$ question correct in $6$ questions. Do they all have the same probability?

Each of the $6$ ways has the same probability that we already found:

$$\begin{aligned} P(\text{CNNNNN}) &= (0.25)(0.75)^5 \approx 0.05933 \\ P(\text{NCNNNN}) &= (0.25)(0.75)^5 \approx 0.05933 \\ P(\text{NNCNNN}) &= (0.25)(0.75)^5 \approx 0.05933 \\ P(\text{NNNCNN}) &= (0.25)(0.75)^5 \approx 0.05933 \\ P(\text{NNNNCN}) &= (0.25)(0.75)^5 \approx 0.05933 \\ P(\text{NNNNNC}) &= (0.25)(0.75)^5 \approx 0.05933 \end{aligned}$$

So we can multiply this probability by $6$ since that is how many ways there are to get $1$ question correct in $6$ questions:

$$\begin{aligned} P(X=1) &= 6(0.25)(0.75)^5 \\ &\approx 6(0.05933) \\ &\approx 0.35596 \end{aligned}$$

Putting it all together

Let's return to our original strategy to answer the question:

$$\begin{aligned} P(X<2) &= P(0\text{ correct}) + P(1\text{ correct}) \\ &= P(X=0) + P(X=1) \\ &= (0.75)^6 + 6(0.25)(0.75)^5 \\ &\approx 0.17798 + 0.35596 \\ &\approx 0.53394 \\ &\approx 0.53 \end{aligned}$$
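And the same check with the cumulative distribution, since $P(X<2) = P(X \le 1)$ (again a sketch assuming SciPy):

```python
from scipy.stats import binom

# P(X < 2) = P(X <= 1) for n = 6 guesses, p = 0.25
print(binom.cdf(1, 6, 0.25))  # ~0.53394
```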
