Here's a quandary many of us face: when you use a case file to learn the probabilities of a Bayes net (especially in Netica), for example with the expectation maximization (EM) algorithm or a similar approach, the resulting conditional probability tables (CPTs) often contain many "holes" filled with uniform probabilities wherever a parent configuration has no supporting cases. Going in by hand to fill those holes with more appropriate values is a chore, especially when the network is not a naive Bayes structure (that is, when links point from the predictor nodes to the output node, so the output node's CPT grows exponentially with the number of predictors).
Has anyone found a way around this, such as with some smoothing, interpolation, or imputation algorithm, to fill the holes more efficiently?
Netica's Cases / Learn / Smooth CPTs function does nothing more than replace the CPT values with uniform probabilities (the same as the Tables / Uniform Probabilities menu item).
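For what it's worth, here is a minimal sketch of one of the approaches mentioned above (a back-off/smoothing heuristic), done outside Netica on an exported CPT. It assumes the CPT is a 2-D array with one row per parent configuration, treats any row that is still exactly uniform as a "hole" (a heuristic, since a genuinely learned row could also be uniform), and fills it with a blend of the uniform distribution and the average of the rows that were actually learned. The function name and the `weight` parameter are illustrative, not part of any Netica API.

```python
import numpy as np

def fill_uniform_rows(cpt, weight=1.0):
    """Replace rows of a CPT that EM left at uniform (no cases for that
    parent configuration) with a back-off distribution: the mean of the
    rows that were learned from data.

    cpt    : 2-D array, one row per parent configuration; rows sum to 1
    weight : blend factor in [0, 1]; 1.0 fully replaces a hole with the
             back-off, smaller values keep it closer to uniform
    """
    cpt = np.asarray(cpt, dtype=float)
    n_states = cpt.shape[1]
    uniform = np.full(n_states, 1.0 / n_states)

    # Heuristic hole detection: a row is a "hole" if it is numerically
    # uniform. This can misfire on a learned row that happens to be uniform.
    holes = np.all(np.isclose(cpt, uniform), axis=1)
    if holes.all():
        return cpt.copy()  # nothing was learned; leave the table as-is

    backoff = cpt[~holes].mean(axis=0)  # average of the learned rows
    filled = cpt.copy()
    filled[holes] = (1 - weight) * uniform + weight * backoff
    return filled

# Example: middle row is a hole; with weight=1.0 it becomes the mean
# of the two learned rows, [0.8, 0.2].
cpt = np.array([
    [0.9, 0.1],   # learned from data
    [0.5, 0.5],   # hole: EM saw no cases for this parent configuration
    [0.7, 0.3],   # learned from data
])
print(fill_uniform_rows(cpt))
```

A better back-off than the plain mean would weight each learned row by how many cases supported it, if those counts are available from the learning run; the blend structure stays the same.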