Independencies in Markov networks

In the previous chapter, we saw how a Bayesian network structure encodes independence conditions, and how observing variables affects the flow of influence in the network. Similarly, the graph structure of a Markov network encodes independence conditions. However, in a Markov network, the flow of influence along a trail stops as soon as any node on that trail is observed. This is quite different from what we saw in Bayesian networks, where different structures respond differently to the observation of nodes.

To understand this more formally, let H be a Markov network structure and Z be a set of observed variables. Then, a path X1 - X2 - ... - Xk is active if and only if none of the Xi, for 1 <= i <= k, is in Z.
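This separation criterion is easy to check mechanically: a path between two nodes is blocked exactly when some node on it is observed, so two nodes are separated by Z precisely when deleting Z disconnects them. The following is a minimal sketch of that idea (the helper `is_separated` is hypothetical, not part of pgmpy), using networkx:

```python
# Sketch of the separation criterion in a Markov network: X and Y are
# separated by the observed set Z iff removing Z from the graph leaves
# no path between them.  `is_separated` is a hypothetical helper.
import networkx as nx

def is_separated(graph, x, y, z):
    """Return True if every trail from x to y contains a node in z,
    i.e. x and y are disconnected once z is removed."""
    h = graph.copy()
    h.remove_nodes_from(z)
    return not nx.has_path(h, x, y)

g = nx.Graph([('a', 'b'), ('b', 'c')])
print(is_separated(g, 'a', 'c', {'b'}))   # True: observing b blocks the trail
print(is_separated(g, 'a', 'c', set()))   # False: the trail a - b - c is active
```

Note that, unlike d-separation in Bayesian networks, no case analysis on the trail's structure is needed: observation only ever blocks influence in a Markov network.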

In the case of Bayesian networks, we had the concept of local independencies, where a variable is independent of all its non-descendants given its parents. We also had global conditions, which were implied by D-Separation. Similarly, in the case of Markov networks, the independence conditions that we discussed earlier are the global independencies in the network. Local independence conditions are a subset of the global conditions, but they are also very important because they allow us to focus on a much smaller part of the network.

There are two ways of looking at the local independencies in a Markov network. The first is the intuitive observation that if two nodes X and Y are directly connected, there is no way of rendering them independent. However, if they are not directly connected, there is always a way of rendering them conditionally independent of each other: observe all the variables in the network except X and Y. With every other node observed, any trail connecting X and Y must pass through at least one observed node, which renders X and Y independent of each other. This is known as pairwise independency. More formally, we can define the pairwise independencies of a Markov network H as follows:

Ip(H) = {(X _|_ Y | V - {X, Y}) : X - Y is not an edge in H}, where V denotes the set of all the variables in the network.
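Enumerating the pairwise independencies is mechanical: for every pair of non-adjacent nodes, the structure asserts their independence given all the remaining variables. Here is a small sketch (the function `pairwise_independencies` is hypothetical, not a pgmpy API) using networkx:

```python
# For every pair of non-adjacent nodes X, Y the Markov network asserts
# (X _|_ Y | V - {X, Y}).  `pairwise_independencies` is a hypothetical
# helper that enumerates these assertions.
import itertools
import networkx as nx

def pairwise_independencies(graph):
    nodes = set(graph.nodes())
    result = []
    for x, y in itertools.combinations(sorted(nodes), 2):
        if not graph.has_edge(x, y):
            result.append((x, y, nodes - {x, y}))
    return result

g = nx.Graph([('x1', 'x2'), ('x2', 'x3')])
for x, y, z in pairwise_independencies(g):
    print(f"({x} _|_ {y} | {', '.join(sorted(z))})")
# In this chain, only x1 and x3 are non-adjacent, so the single
# assertion is (x1 _|_ x3 | x2)
```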

Another way of thinking about local independencies is to shield a given node from the influence of all the other nodes by observing all of its neighboring nodes. This set of neighboring nodes is known as the Markov blanket of the node, and this type of independence is known as local independency. More formally, it can be defined as follows:

Il(H) = {(X _|_ V - {X} - MB(X) | MB(X)) : X in V}, where MB(X) denotes the Markov blanket of X, that is, the set of neighbors of X in H.
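Since the Markov blanket in a Markov network is simply the set of a node's neighbors, the local independency of any node can be read directly off the graph. A minimal sketch (the helpers below are hypothetical, not pgmpy APIs), using the same seven-node graph as the pgmpy example that follows:

```python
# In a Markov network the Markov blanket of a node is its set of
# neighbors, and the local independency of X reads
# (X _|_ V - {X} - MB(X) | MB(X)).  The helpers are hypothetical.
import networkx as nx

g = nx.Graph([('x1', 'x3'), ('x1', 'x4'), ('x2', 'x4'),
              ('x2', 'x5'), ('x3', 'x6'), ('x4', 'x6'),
              ('x4', 'x7'), ('x5', 'x7')])

def markov_blanket(graph, node):
    return set(graph.neighbors(node))

def local_independency(graph, node):
    mb = markov_blanket(graph, node)
    rest = set(graph.nodes()) - {node} - mb
    return node, rest, mb

node, rest, mb = local_independency(g, 'x1')
print(f"({node} _|_ {', '.join(sorted(rest))} | {', '.join(sorted(mb))})")
# Prints (x1 _|_ x2, x5, x6, x7 | x3, x4), matching the assertion for
# x1 in the pgmpy output shown below
```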

Like Bayesian networks, we also have the concept of an I-Map in Markov models. For a probability distribution P and a Markov network structure H, if I(H) is a subset of I(P), we say that H is an I-Map of P.
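The I-Map condition can be checked by brute force on small examples: every independence asserted by the structure must actually hold in the distribution. The sketch below (with a hypothetical helper `is_independent`) checks the single assertion (A _|_ B) made by a two-node network with no edge, against two hand-picked distributions:

```python
# Brute-force I-Map check for the two-node edgeless network H, which
# asserts (A _|_ B).  H is an I-Map of P iff P(a, b) = P(a) P(b) for
# all a, b.  `is_independent` is a hypothetical helper.
import itertools

def is_independent(p, tol=1e-9):
    """p maps pairs (a, b) to probabilities; check P(a,b) = P(a)P(b)."""
    pa = {a: sum(v for (x, _), v in p.items() if x == a)
          for a in {k[0] for k in p}}
    pb = {b: sum(v for (_, y), v in p.items() if y == b)
          for b in {k[1] for k in p}}
    return all(abs(p[a, b] - pa[a] * pb[b]) < tol for a, b in p)

# A distribution that factorizes: H is an I-Map of it.
p_indep = {(a, b): 0.25 for a, b in itertools.product([0, 1], repeat=2)}
# A deterministic distribution with A = B: H is not an I-Map of it.
p_dep = {(0, 0): 0.5, (0, 1): 0.0, (1, 0): 0.0, (1, 1): 0.5}
print(is_independent(p_indep))  # True
print(is_independent(p_dep))    # False
```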

Let's check the local independencies in the network using pgmpy:

In [1]: from pgmpy.models import MarkovModel
In [2]: mm = MarkovModel()
In [3]: mm.add_nodes_from(
   ...:     ['x1', 'x2', 'x3', 'x4', 'x5', 'x6', 'x7'])
In [4]: mm.add_edges_from(
   ...:     [('x1', 'x3'), ('x1', 'x4'), ('x2', 'x4'),
   ...:      ('x2', 'x5'), ('x3', 'x6'), ('x4', 'x6'),
   ...:      ('x4', 'x7'), ('x5', 'x7')])
In [5]: mm.get_local_independencies()
Out[5]: 
(x3 _|_ x5, x4, x7, x2 | x6, x1)
(x4 _|_ x3, x5 | x6, x7, x1, x2)
(x1 _|_ x6, x7, x5, x2 | x3, x4)
(x5 _|_ x3, x4, x6, x1 | x7, x2)
(x7 _|_ x3, x6, x1, x2 | x5, x4)
(x2 _|_ x3, x6, x7, x1 | x5, x4)
(x6 _|_ x5, x7, x1, x2 | x3, x4)

We saw three different ways of defining independencies in Markov networks. While all of them are related, they are equivalent only for positive distributions. Non-positive distributions allow deterministic dependencies between variables, and such deterministic interactions make it possible to construct networks for which the local independencies hold even though the network is not an I-Map of the distribution.
