A formula that is used to revise probabilities based on new information.
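As a minimal numeric sketch of such a revision (the prior, sensitivity, and false-positive rate below are hypothetical values chosen for illustration):

```python
# Revising a probability with Bayes' theorem: P(A|B) = P(B|A) P(A) / P(B).
prior = 0.10              # P(A): prior probability (hypothetical)
p_b_given_a = 0.90        # P(B|A): likelihood of the new information if A holds
p_b_given_not_a = 0.20    # P(B|A'): likelihood if A does not hold

# P(B) by the law of total probability
p_b = p_b_given_a * prior + p_b_given_not_a * (1 - prior)

# Posterior (revised) probability
posterior = p_b_given_a * prior / p_b
print(round(posterior, 4))  # 0.3333
```

The prior of 0.10 is revised upward to about 0.33 once the new information is taken into account.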
A process consisting of a series of independent trials, each with two possible outcomes whose probabilities do not change from trial to trial.
A discrete distribution that describes the number of successes in a fixed number of independent trials of a Bernoulli process.
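The binomial probability of exactly k successes in n trials can be computed directly from the formula; a small sketch using only the standard library:

```python
import math

def binomial_pmf(n, k, p):
    """Probability of exactly k successes in n independent trials,
    each with success probability p."""
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

# Example: 3 successes in 5 trials with p = 0.5
print(binomial_pmf(5, 3, 0.5))  # 0.3125
```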
An objective way of assessing probabilities based on logic.
A collection of all possible outcomes of an experiment.
The probability of one event occurring given that another has taken place.
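Conditional probability follows from the joint probability of the two events; a sketch with hypothetical values:

```python
# P(A | B) = P(A and B) / P(B)
p_a_and_b = 0.12  # joint probability of A and B (hypothetical)
p_b = 0.40        # probability of B (hypothetical)

p_a_given_b = p_a_and_b / p_b
print(round(p_a_given_b, 4))  # 0.3
```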
A probability distribution with a continuous random variable.
A random variable that can assume an infinite or unlimited set of values.
A probability distribution with a discrete random variable.
A random variable that can only assume a finite or limited set of values.
The weighted average of a probability distribution, with the probabilities serving as the weights.
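The weighted average is computed by multiplying each value by its probability and summing; a sketch with a small hypothetical distribution:

```python
# Expected value of a discrete distribution: sum of value * probability.
values = [0, 1, 2, 3]
probs = [0.1, 0.4, 0.3, 0.2]  # hypothetical probabilities; must sum to 1

expected = sum(x * p for x, p in zip(values, probs))
print(round(expected, 4))  # 1.6
```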
A continuous probability distribution that is the ratio of the variances of samples from two independent normal distributions.
The situation in which the occurrence of one event has no effect on the probability of occurrence of a second event.
The set of all outcomes that are common to two events.
The probability of events occurring together (or one after the other).
A situation in which only one of two or more events can occur on any given trial or experiment.
A continuous probability distribution that describes the time between customer arrivals in a queuing situation.
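For the exponential distribution, the probability that the time between arrivals is at most t is 1 - e^(-λt); a sketch with a hypothetical arrival rate:

```python
import math

lam = 2.0  # hypothetical arrival rate: 2 customers per minute
t = 0.5    # time window: half a minute

# P(time between arrivals <= t) = 1 - e^(-lam * t)
p = 1 - math.exp(-lam * t)
print(round(p, 4))  # 0.6321
```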
A continuous bell-shaped distribution that is a function of two parameters, the mean and standard deviation of the distribution.
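The normal density can be evaluated directly from its two parameters; a sketch of the standard normal curve (mean 0, standard deviation 1):

```python
import math

def normal_pdf(x, mu, sigma):
    """Density of the normal distribution with mean mu and
    standard deviation sigma."""
    return math.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

# The density peaks at the mean and is symmetric around it.
print(round(normal_pdf(0, 0, 1), 4))  # 0.3989
```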
A method of determining probability values based on historical data or logic.
A discrete probability distribution used in queuing theory.
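The Poisson probability of k arrivals, given an average arrival rate λ, follows from the formula λ^k e^(-λ) / k!; a sketch with a hypothetical rate:

```python
import math

def poisson_pmf(k, lam):
    """Probability of exactly k arrivals when the average
    number of arrivals per period is lam."""
    return lam ** k * math.exp(-lam) / math.factorial(k)

# Probability of exactly 2 arrivals when the average rate is 3 per period
print(round(poisson_pmf(2, 3.0), 4))  # 0.224
```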
A probability value determined before new or additional information is obtained. It is sometimes called an a priori probability estimate.
A statement about the likelihood of an event occurring. It is expressed as a numerical value between 0 and 1, inclusive.
The mathematical function that describes a continuous probability distribution. It is represented by f(X).
The set of all possible values of a random variable and their associated probabilities.
A variable that assigns a number to every possible outcome of an experiment.
An objective way of determining probabilities based on observing frequencies over a number of trials.
A probability value that results from new or revised information and prior probabilities.
The square root of the variance.
A method of determining probability values based on experience or judgment.
The set of all outcomes that are contained in either of two events.
A measure of dispersion or spread of the probability distribution.
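The variance and standard deviation of a discrete distribution can be computed together, since the standard deviation is just the square root of the variance; a sketch with a small hypothetical distribution:

```python
import math

values = [1, 2, 3, 4]
probs = [0.2, 0.3, 0.3, 0.2]  # hypothetical probabilities; must sum to 1

mean = sum(x * p for x, p in zip(values, probs))
# Variance: probability-weighted average of squared deviations from the mean.
variance = sum(p * (x - mean) ** 2 for x, p in zip(values, probs))
std_dev = math.sqrt(variance)  # standard deviation = square root of variance
print(round(variance, 4), round(std_dev, 4))  # 1.05 1.0247
```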