Sampling-based approximate methods

In the previous sections, we discussed a class of approximate methods that used factor manipulation to answer approximate queries on the models. In this section, we will discuss a very different approach to approximate inference. Here, we try to estimate the original distribution by instantiating all, or a few, of the variables of the network, and then use these instantiations to answer queries on the model. Methods based on instantiations are generally known as particle-based methods, and each instantiation is known as a particle.

There are many variations in the way we select particles or create instantiations of the variables. For example, we can either create particles using a deterministic process, or we can sample particles from some distribution. We can also have different notions of a particle. We can have a full assignment to all the variables of the network, commonly known as a full particle, or we can have assignments only to a subset of the variables of the network, together with a conditional distribution over the remaining variables. These are commonly known as collapsed particles. The main problem with full particles is that each particle is able to represent only a very small part of the whole space; therefore, for a reasonable representation of the distribution, we need many more full particles than collapsed particles.

In general, sampling methods approximate the value of a query by generating some particles, computing the value (or the expectation) of the query relative to each generated particle, and then aggregating these values to get the final result.
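As a minimal sketch of this generate-and-aggregate idea, consider a hypothetical two-node Bayesian network, Rain → WetGrass (the structure and CPD values here are illustrative assumptions, not taken from the text). Each particle is a full assignment to both variables, and a marginal query is estimated as the fraction of particles that satisfy it:

```python
import random

random.seed(0)

# Illustrative CPDs for a hypothetical network Rain -> WetGrass
P_RAIN = 0.2                          # P(Rain = 1)
P_WET_GIVEN_RAIN = {1: 0.9, 0: 0.1}   # P(WetGrass = 1 | Rain)

def sample_particle():
    """Draw one full particle: an assignment to every variable,
    sampling each variable given its parents (forward sampling)."""
    rain = 1 if random.random() < P_RAIN else 0
    wet = 1 if random.random() < P_WET_GIVEN_RAIN[rain] else 0
    return {"Rain": rain, "WetGrass": wet}

def estimate(query, n_particles=10_000):
    """Estimate P(query) as the fraction of particles satisfying it."""
    particles = [sample_particle() for _ in range(n_particles)]
    return sum(query(p) for p in particles) / n_particles

# Exact answer for comparison: P(WetGrass = 1) = 0.2*0.9 + 0.8*0.1 = 0.26
p_wet = estimate(lambda p: p["WetGrass"] == 1)
```

With enough particles, the estimate concentrates around the exact value of 0.26; the number of particles needed grows with the size of the space each particle must cover, which is the weakness of full particles noted above.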

Also, concepts such as forward sampling and likelihood weighting, discussed in the next sections, only apply to Bayesian networks and not to Markov networks.
