Markov Logic Network

  1. Introduction : Markov logic combines logic and probability. Suppose we have the formula
    ∀x Smokes(x) => Cancer(x)

    which says that if a person smokes, then he has cancer. Now suppose there are 3 persons in our world: Anna, Bob, and Charles. We want to test whether our formula is satisfied, and from the evidence we find that the third person, Charles, smokes but does not have cancer. So clearly the formula is not satisfied, because of the ∀x in the formula: it must hold for every person in the world, and the evidence about Charles violates it.
    So I am tempted to think that this formula is incorrect, i.e. it lies: we can't say whether a person has cancer just because we know he smokes. Now suppose there were 1000 persons in our world: 400 of them smoke and have cancer, 599 of them don't smoke (whether they have cancer is irrelevant to this formula, as they satisfy it either way), and only 1 person smokes but doesn't have cancer. Again the formula is not satisfied in our world, but now we see that it is satisfied by most of the people (in fact 999/1000), so we can't be so rude to this formula. It is telling the truth about most of the people in the world, and this is where probability comes in. We no longer say a formula is possible or impossible; we say it is probable or improbable.
    In our case, the probability of the formula being true is very high, so if a new person arrives and we learn that he smokes, then with high confidence we can say that he has cancer. On the other hand, if among our 1000 people 500 had satisfied the formula and 500 had not, then I would have told that new person he has a 50-50 chance of cancer.
    Till now, all things we are saying are based on our intuition, we will see how it can all be done mathematically.
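    The counting intuition above can be sketched in a few lines of Python (the world below is the hypothetical 1000-person population from the text):

    ```python
    def satisfies(smokes, cancer):
        """Truth value of Smokes(x) => Cancer(x) for one person."""
        return (not smokes) or cancer

    # The 1000-person world from the text: 400 smokers with cancer,
    # 599 non-smokers, and 1 smoker without cancer.
    world = [(True, True)] * 400 + [(False, False)] * 599 + [(True, False)]

    satisfied = sum(satisfies(s, c) for s, c in world)
    print(satisfied / len(world))  # fraction of people satisfying the formula: 0.999
    ```

    The fraction of satisfying people (999/1000 here) is exactly the quantity our intuition turns into a degree of confidence.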
  2. Markov Logic to Markov Logic Network : So we discussed how we can combine logic and probability. Now we will see how these first-order logic formulae can be used to create a Markov network, which in turn will help us find the probability of a state. There are some terms I haven't introduced yet, but bear with me; I will explain them as we work through an example. So suppose our formula is satisfied by most of the people in the world, and we define something called a potential function for the formula:
    Smokes(x)   Cancer(x)   φ
        0           0       30
        0           1       30
        1           0        1
        1           1       30
    What this table says is that the states 00, 01, and 11 are 30 times more likely than the state 10. Note that the formula is satisfied in all states except 10. These numbers are quite intuitive too: most of the people satisfied this formula, which increases the likelihood of the formula being true.
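    The table can be written down directly as a dictionary, and normalizing the potentials gives the probability of each state of one (Smokes, Cancer) pair (the weight 30 is the value from the table above, not a derived quantity):

    ```python
    # Potential function for one ground clause Smokes(x) => Cancer(x):
    # states are (smokes, cancer) pairs, values are the potentials from the table.
    phi = {
        (0, 0): 30,  # doesn't smoke, no cancer : formula satisfied
        (0, 1): 30,  # doesn't smoke, cancer    : formula satisfied
        (1, 0): 1,   # smokes, no cancer        : formula violated
        (1, 1): 30,  # smokes, cancer           : formula satisfied
    }

    Z = sum(phi.values())  # normalizing constant (partition function)
    prob = {state: v / Z for state, v in phi.items()}

    # The violating state 10 is 30 times less likely than each satisfying state.
    print(prob[(1, 0)])
    ```

    Dividing by Z turns the relative weights into a proper probability distribution over the four states, which is exactly how a Markov network assigns probabilities.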
    Now let us form ground clauses, i.e. substitute the constants in our world for variables like x, y in the clauses. Let's say we have 3 constants: Anna, Bob, and Charles. Then in each clause (here we have only 1), we substitute these constants in place of the variables. Doing so, we get 3 ground clauses:
    Smokes(Anna) => Cancer(Anna), Smokes(Bob) => Cancer(Bob), and Smokes(Charles) => Cancer(Charles).
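    Grounding is mechanical enough to sketch in code: for each constant, substitute it for x in both predicates of the clause (the string representation here is just for illustration):

    ```python
    constants = ["Anna", "Bob", "Charles"]

    # Each ground clause pairs the antecedent and consequent for one constant,
    # i.e. Smokes(c) => Cancer(c) with c substituted for x.
    ground_clauses = [(f"Smokes({c})", f"Cancer({c})") for c in constants]

    for antecedent, consequent in ground_clauses:
        print(f"{antecedent} => {consequent}")
    ```

    With more variables per clause, we would substitute every combination of constants, so the number of ground clauses grows with (number of constants) raised to the number of variables.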
    Now we create a Markov network from these clauses. A Markov network is an undirected graph; in our case the nodes are the ground predicates, and there is an edge between two nodes if they appear in the same ground clause. So here we have 6 nodes, namely Smokes(Anna), Smokes(Bob), Smokes(Charles), Cancer(Anna), Cancer(Bob), and Cancer(Charles). Smokes(Anna) and Cancer(Anna) appear in the same ground clause, so they are connected, and similarly for Bob and Charles, giving a graph of three disconnected node pairs. Note that the potential function we defined earlier actually belongs to these ground clauses (and not to the ungrounded clause).
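    The construction above can be sketched by deriving the node and edge sets directly from the ground clauses (a plain set-based representation; a graph library would work just as well):

    ```python
    constants = ["Anna", "Bob", "Charles"]
    ground_clauses = [(f"Smokes({c})", f"Cancer({c})") for c in constants]

    # Nodes are the ground predicates; an (undirected) edge links two nodes
    # whenever they appear together in some ground clause.
    nodes = sorted({pred for clause in ground_clauses for pred in clause})
    edges = {frozenset(clause) for clause in ground_clauses}

    print(len(nodes), len(edges))  # 6 nodes, 3 edges: three disconnected pairs
    ```

    With only one single-variable formula, each constant yields an isolated pair; a formula mentioning two people, such as Friends(x, y), would add edges across constants and connect the components.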