bnlearn (4.3)

  * the "strict" and "optimized" arguments of constraint-based algorithms are
     now deprecated and will be removed at the beginning of 2019.
  * the relevant() function is now deprecated, and it will also be removed
     at the beginning of 2019.
  * improved, and fixed a few bugs in, the functions that import and export
     bn and bn.fit objects to and from the graph package.
  * fixed a bug in averaged.network(), which could result in inconsistent bn
     objects when arcs were dropped to obtain an acyclic graph (thanks Shuonan
     Chen).
  * added a graphviz.chart() function to produce DAG-with-barchart-nodes plots.
  * fixed the counting of the number of parameters of continuous and hybrid
     networks, which did not take the residual standard errors into account
     (thanks Jeffrey Hart). 
  * improved handling of singular models in impute().
  * added an import function for pcAlgo objects from pcalg.
  * fixed bug in the sanitization of conditional Gaussian networks (thanks
     Kostas Oikonomou).
  * added a loss() function to extract the estimated loss values from the
     objects returned by bn.cv() (thanks Dejan Neskovic).
  * it is now possible to use data with missing values in bn.fit() and
     nparams().
  * added a "replace.unidentifiable" argument to bn.fit(..., method = "mle"),
     to replace parameter estimates that are NA/NaN with zeroes (for
     regression coefficients) and uniform probabilities (in conditional
     probability tables).
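     As an illustration (a sketch, assuming bnlearn is installed; it uses the
     learning.test data shipped with the package, and the subsetting is only
     there to force an unobserved parent configuration):

     ```r
     library(bnlearn)

     # drop all rows with A == "c": the parent configuration A = "c" of B is
     # then never observed, so its conditional probabilities are NaN under
     # plain maximum likelihood...
     dag = model2network("[A][B|A]")
     partial = subset(learning.test[, c("A", "B")], A != "c")

     # ...which replace.unidentifiable = TRUE turns into uniform probabilities.
     fitted = bn.fit(dag, partial, method = "mle", replace.unidentifiable = TRUE)
     coef(fitted$B)
     ```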
  * added a bf.strength() function to compute arc strengths using Bayes
     factors.
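     A minimal usage sketch (assuming bnlearn is installed; the learning.test
     data and the "bde" score are used purely for illustration):

     ```r
     library(bnlearn)

     # learn a structure, then score each arc by the Bayes factor between the
     # network including the arc and the network without it.
     dag = hc(learning.test)
     strengths = bf.strength(dag, learning.test, score = "bde")
     head(strengths)
     ```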
  * learn.{mb,nbr}() now work even if all nodes are blacklisted.
  * assigning singular models from lm() to nodes in a bn.fit object will now
     zapsmall() near-zero coefficients, standard errors and residuals to match
     the estimates produced by bn.fit().
  * bn.cv() now supports performing multiple runs with custom folds (different
     for each run).
  * improved sanitization in mutilated(), and updated its documentation.
  * removed the BibTeX file with references, which remains available at
     www.bnlearn.com.
  * implemented the stable version of the PC algorithm.
  * added a count.graph() function that implements a number of graph enumeration
     results useful for studying graphical priors.
  * fixed loss estimation in bn.cv() for non-extendable partially directed
     graphs; errors are now produced instead of meaningless results
     (thanks Derek Powell).

bnlearn (4.2)

  * added a tuning parameter for the inclusion probability to the marginal
     uniform graph prior.
  * added a Bayesian Dirichlet score using Jeffreys' prior (from Joe Suzuki).
  * allow fractional imaginary sample sizes for posterior scores.
  * allow imaginary sample sizes in (0, 1] for discrete posterior scores,
     to explore asymptotic results.
  * set the default imaginary sample size for discrete networks to 1, following
     recommendations from the literature.
  * moral(), cpdag(), skeleton() and vstructs() now accept bn.fit objects in
     addition to bn objects.
  * fixed a segfault in cpdist(..., method = "lw") caused by all weights
     being equal to NaN (thanks David Chen).
  * changed the default value of the "optimized" argument to "FALSE" in
     constraint-based algorithms.
  * changed the arguments of mmhc() and rsmax2() to improve their flexibility
     and to allow separate "optimized" values for the restrict and maximize
     phases.
  * fixed sanitization of fitted networks containing ordinal discrete
     variables (thanks David Chen).
  * improved argument sanitization in custom.fit() and model string functions.
  * added a BF() function to compute Bayes factors.
  * added a graphviz.compare() function to visually compare network structures.
  * implemented the locally averaged Bayesian Dirichlet score.
  * custom.strength() now accepts bn.kcv and bn.kcv.list objects and computes
     arc strengths from the networks learned by bn.cv() in the context of
     cross-validation.
  * fixed multiple bugs in cextend() and cpdag() that could result in the
     creation of additional v-structures.
  * implemented the Structural EM algorithm in structural.em().
  * fixed multiple bugs triggered by missing values in predict() (thanks
     Oussama Bouldjedri).
  * implemented an as.prediction() function that exports objects of class
     bn.strength to the ROCR package (contributed by Robert Ness).

bnlearn (4.1)

  * fixed memory corruption in dsep() (thanks Dominik Muller).
  * added the marginal uniform prior.
  * fixed the optimized score cache for the Castelo & Siebes and for the
     marginal uniform priors, which were affected by several subtle bugs.
  * bn.cv() now implements a "custom-folds" method that makes it possible to
     manually specify which observations belong to each fold; folds are not
     constrained to have the same size.
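     For example (a sketch, assuming bnlearn is installed; the fold boundaries
     over the 5000 observations of learning.test are arbitrary):

     ```r
     library(bnlearn)

     # three user-defined folds of unequal size.
     folds = list(1:1000, 1001:3000, 3001:5000)
     cv = bn.cv(learning.test, bn = "hc", method = "custom-folds", folds = folds)
     cv
     ```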
  * fixed checks in the C code involving R objects' classes; they failed
     when additional, optional classes were present (thanks Claudia Vitolo).
  * fixed cpdag() handling of illegal arcs that are part of shielded
     colliders (thanks Vladimir Manewitsch).
  * removed misleading warning about conflicting v-structures from cpdag().
  * rsmax2() and mmhc() now return whitelists and blacklists as they are
     at the beginning of the restrict phase (thanks Vladimir Manewitsch).
  * bn.fit() can now fit local distributions in parallel, and has been mostly
     reimplemented in C for speed (thanks Claudia Vitolo).
  * added an impute() function to impute missing values from a bn.fit object.
  * fixed loss functions for data in which observations have to be dropped
     for various nodes (thanks Manuel Gomez Olmedo).
  * added an all.equal() method to compare bn.fit objects.
  * added a "by.node" argument to score() for decomposable scores (thanks
     Behjati Shahab).
  * added warning about partially directed graphs in choose.direction() and
     improved its debugging output (thanks Wei Kong).
  * added spouses(), ancestors() and descendants().
  * fixed a segfault in predict(..., method = "lw") with discrete BNs and
     sparse CPTs that included NaNs.

bnlearn (4.0)

  * fixed memory usage in aracne(), chow.liu() and tree.bayes() (thanks
     Sunkyung Kim).
  * rework memory management using calloc() and free() to avoid memory
     leaks arising from R_alloc() and missing memory barriers.
  * fixed a coefficients indexing bug in rbn() for conditional Gaussian
     nodes (thanks Vladimir Manewitsch).
  * added a mean() function to average bn.strength objects.
  * fixed S4 method creation on package load on MacOS X (thanks Dietmar
     Janetzko).
  * fixed more corner cases in the Castelo & Siebes prior, and increased
     numeric tolerance for prior probabilities.
  * allow non-uniform priors for the "mbde" score (thanks Robert Ness)
     and for "bdes".
  * the "mode" attribute in bn.strength objects is now named "method".
  * added posterior probabilities to the predictions for all discrete
     networks (thanks ShangKun Deng).
  * added Steck's optimal ISS estimator for the BDe(u) score.
  * fixed the assignment of standard deviation in fitted CLG networks
     (thanks Rahul Swaminathan).
  * handle zero lambdas in the shrinkage Gaussian mutual information
     (thanks Piet Jones).
  * fixed segfault when computing posterior predictions from networks with
     NaNs in their conditional probability tables (thanks Giulio Caravagna).
  * fixed the assignment of LASSO models from the penalized package to
     fitted Gaussian networks (thanks Anthony Gualandri). 
  * cpdag() now preserves the directions of arcs between continuous and
     discrete nodes in conditional linear Gaussian networks, and optionally
     also takes whitelists and blacklist into account (for any network).
  * several checks are now in place to prevent the inclusion of illegal
     arcs in conditional Gaussian networks.
  * renamed the "ignore.cycles" argument to "check.cycles" in arcs<-() and
     amat<-() for consistency with other functions such as set.arc().
  * added an "undirected" argument to mmpc() and si.hiton.pc(), which can now
     learn the CPDAG of the network instead of just the skeleton.
  * added a "directed" argument to acyclic().
  * removed unsupported argument "start" from learn.nbr().
  * handle interventions correctly in boot.strength() when using the mixed
     BDe score (thanks Petros Boutselis).
  * "bdes" is now named "bds" (it is not score equivalent, so the "e" did
    not belong).

bnlearn (3.9)

  * fixed alpha threshold truncation bug in conditional independence tests
     (thanks Janko Tackmann).
  * massive cleanup of the C code handling conditional independence tests.
  * fixed variance scaling bug for the mi-cg test (thanks Nicholas Mitsakakis).
  * in the exact t-test for correlation and in Fisher's Z, assume independence
     instead of returning an error when degrees of freedom are < 1.
  * fixed segfault in cpdist(..., method = "lw") when the evidence has
     probability zero.
  * added loss functions based on MAP predictions in bn.cv().
  * removed bn.moments() and bn.var(), they were basically unmaintained and had
     numerical stability problems.
  * added support for hold-out cross-validation in bn.cv().
  * added plot() methods for comparing the results of different bn.cv() calls.
  * permutation tests now return a p-value of 1 when one of the two
     variables being tested is constant (thanks Maxime Gasse).
  * improved handling of zero prior probabilities for arcs in the Castelo &
     Siebes prior, so that hc() and tabu() do not get stuck (thanks Jim Metz).
  * added an "effective" argument to compute the effective degrees of freedom
     of the network, estimated with the number of non-zero free parameters.
  * fixed optional argument handling in rsmax2().
  * fixed more corner cases related to singular models in 
     cpdist(..., method = "lw") and predict(..., method = "bayes-lw").
  * fixed Pearson's X^2 test, in which zero cells may have been dropped too
     often in sparse contingency tables.
  * fixed floating point rounding issues in the shrinkage estimator for the
     Gaussian mutual information.

bnlearn (3.8.1)

  * fixed CPT import in read.net().
  * fixed penfit objects import from penalized (thanks John Noble).
  * fixed memory allocation corner case in BDe.

bnlearn (3.8)

  * reorder CPT dimensions as needed in custom.fit() (thanks Zheng Zhu).
  * fixed two uninitialized-memory bugs found by valgrind, one in
     predict() and one in random.graph().
  * fixed wrong check for cluster objects (thanks Vladimir Manewitsch).
  * fixed the description of the alternative hypothesis for the
     Jonckheere-Terpstra test.
  * allow undirected cycles in whitelists for structure learning algorithms
     and let the algorithm learn arc directions (thanks Vladimir Manewitsch).
  * include sanitized whitelists (as opposed to those provided by the user)
     in bn.fit objects.
  * removed predict() methods for single-node objects, use the method for
     bn.fit objects instead.
  * various fixes in the monolithic C test functions.
  * fixed indexing bug in compare() (thanks Vladimir Manewitsch).
  * fixed false positives in cycle detection when adding edges to a graph
     (thanks Vladimir Manewitsch).
  * fixed prior handling in predict() for naive Bayes and TAN classifiers
     (thanks Vinay Bhat).
  * added configs() to construct configurations of discrete variables.
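  A short sketch (assuming bnlearn is installed; the choice of variables from
     learning.test is illustrative):

     ```r
     library(bnlearn)

     # collapse two discrete variables into a single factor whose levels
     # are their joint configurations.
     cfg = configs(learning.test[, c("A", "B")])
     nlevels(cfg)
     ```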
  * added sigma() to extract standard errors from bn.fit objects.

bnlearn (3.7.1)

  * small changes to make CRAN checks happy.

bnlearn (3.7)

  * fixed the default setting for the number of particles in cpquery()
     (thanks Nishanth Upadhyaya).
  * reimplemented common test patterns in monolithic C functions to speed
     up constraint-based algorithms.
  * added support for conditional linear Gaussian (CLG) networks.
  * fixed several recursion bugs in choose.direction().
  * make read.{bif,dsc,net}() consistent with the `$<-` method for bn.fit
     objects (thanks Felix Rios).
  * support empty networks in read.{bif,dsc,net}().
  * fixed bug in hc(), triggered when using both random restarts and the
     maxp argument (thanks Irene Kaplow).
  * correctly initialize the Castelo & Siebes prior (thanks Irene Kaplow).
  * change the prior distribution for the training variable in classifiers
     from the uniform prior to the fitted distribution in the
     bn.fit.{naive,tan} object, for consistency with gRain and e1071 (thanks
     Bojan Mihaljevic).
  * note AIC and BIC scaling in the documentation (thanks Thomas Lefevre).
  * note limitations of {white,black}lists in tree.bayes() (thanks Bojan
     Mihaljevic).
  * better input sanitization in custom.fit() and bn.fit<-().
  * fixed .Call stack imbalance in random restarts (thanks James Jensen).
  * note limitations of predict()ing from bn objects (thanks Florian Sieck).

bnlearn (3.6)

  * support rectangular nodes in {graphviz,strength}.plot().
  * fixed bug in hc(), random restarts occasionally introduced cycles in
     the graph (thanks Boris Freydin).
  * handle ordinal networks in as.grain() by treating variables as categorical
     (thanks Yannis Haralambous).
  * discretize() returns unordered factors for backward compatibility.
  * added write.dot() to export network structures as DOT files.
  * added mutual information and X^2 tests with adjusted degrees of freedom.
  * default vstructs() and cpdag() to moral = FALSE (thanks Jean-Baptiste
     Denis).
  * implemented posterior predictions in predict() using likelihood weighting.
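  A usage sketch (assuming bnlearn is installed; "bayes-lw" is the method
     label predict() uses for these likelihood-weighting-based predictions):

     ```r
     library(bnlearn)

     # predict A for a few observations from its posterior distribution given
     # the other variables, approximated with likelihood weighting.
     fitted = bn.fit(hc(learning.test), learning.test)
     predict(fitted, node = "A", data = learning.test[1:5, ],
             method = "bayes-lw")
     ```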
  * prevent silent reuse of AIC penalization coefficient when computing BIC
     and vice versa (thanks María Luisa Matey).
  * added a "bn.cpdist" class and a "method" attribute to the random data
     generated by cpdist().
  * attach the weights to the return value of cpdist(..., method = "lw").
  * changed the default number of simulations in cp{query,dist}().
  * support interval and multiple-valued evidence for likelihood weighting
     in cp{query,dist}().
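     For instance (a sketch, assuming bnlearn is installed; the gaussian.test
     data and the interval bounds are purely illustrative):

     ```r
     library(bnlearn)

     # with likelihood weighting, evidence is a named list: a continuous node
     # can take an interval as evidence, a discrete node a set of levels.
     fitted = bn.fit(hc(gaussian.test), gaussian.test)
     cpquery(fitted, event = (A > 2), evidence = list(B = c(0, 6)),
             method = "lw")
     ```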
  * implemented dedup() to pre-process continuous data.
  * fixed a scalability bug in blacklist sanitization (thanks Dong Yeon Cho).
  * fixed permutation test support in relevant().
  * reimplemented the conditional.test() backend completely in C for
     speed; it is now called indep.test().

bnlearn (3.5)

  * fixed (again) function name collisions with the graph package
     (thanks Carsten Krueger).
  * fixed some variable indexing issues in likelihood weighting.
  * removed bootstrap support from arc.strength(), use boot.strength()
     instead.
  * added set.edge() and drop.edge() to work with undirected arcs.
  * boot.strength() now has a parallelized implementation.
  * added support for non-uniform graph priors (Bayesian variable
     selection, Castelo & Siebes).
  * added a threshold for the maximum number of parents in hc() and tabu().
  * changed the default value of "moral" from FALSE to TRUE in cpdag()
     and vstructs() to ensure sensible results in model averaging.
  * added more sanity checks in cp{query,dist}() expression parsing
     (thanks Ofer Mendelevitch).
  * added 'nodes' and 'by.sample' arguments to logLik() for bn.fit objects.
  * support {naive,tree}.bayes() in bn.cv() (thanks Xin Zhou).
  * fixed predict() for ordinal networks (thanks Vitalie Spinu).
  * fixed zero variance handling in unconditional Jonckheere-Terpstra
     tests due to empty rows/columns (thanks Vitalie Spinu).
  * in bn.cv(), the default loss for classifiers is now classification
     error.
  * added a nodes<-() function to re-label nodes in bn and bn.fit objects
     (based on a proof of concept by Vitalie Spinu).
  * replaced all calls to LENGTH() with length() in C code (thanks Brian
     Ripley).
  * default to an improper flat prior in predict() for classifiers for
     consistency (thanks Xin Zhou).
  * suggest the parallel package instead of snow (which still works fine).

bnlearn (3.4)

  * move the test counter into bnlearn's namespace.
  * include Tsamardinos' optimizations in mmpc(..., optimized = FALSE),
     but not backtracking, to make it comparable with other learning
     algorithms.
  * check whether the residuals and the fitted values are present
     before trying to plot a bn.fit{,.gnode} object.
  * fixed two integer overflows in factors' levels and degrees of
     freedom in large networks.
  * added {compelled,reversible}.arcs().
  * added the MSE and predictive correlation loss functions to bn.cv().
  * use the unbiased estimate of residual variance to compute the
     standard error in bn.fit(..., method = "mle") (thanks
     Jean-Baptiste Denis).
  * revised optimizations in constraint-based algorithms, removing
     most false positives by sacrificing speed.
  * fixed warning in cp{dist,query}().
  * added support for ordered factors.
  * implemented the Jonckheere-Terpstra test to support ordered
     factors in constraint-based structure learning.
  * added a plot() method for bn.strength objects containing
     bootstrapped confidence estimates; it prints their ECDF and
     the estimated significance threshold.
  * fixed dimension reduction in cpdist().
  * reimplemented Gaussian rbn() in C, it's now twice as fast.
  * improve precision and robustness of (partial) correlations.
  * remove the old network scripts for networks that are now available
     from www.bnlearn.com/bnrepository.
  * implemented likelihood weighting in cp{dist,query}().

bnlearn (3.3)

  * fixed cpdag() and cextend(), which returned an error about
     the input graph being cyclic when it included the CPDAG of
     a shielded collider (thanks Jean-Baptiste Denis).
  * do not generate observations from redundant variables (those
     not in the upper closure of event and evidence) in cpdist()
     and cpquery().
  * added Pena's relevant() nodes identification.
  * make custom.fit() robust against floating point errors
     (thanks Jean-Baptiste Denis).
  * check v-structures do not introduce directed cycles in the
     graph when applying them (thanks Jean-Baptiste Denis).
  * fixed a buffer overflow in cextend() (thanks Jean-Baptiste
     Denis).
  * added a "strict" argument to cextend().
  * removed Depends on the graph package, which is in Suggests
     once more.
  * prefer the parallel package to snow, if it is available.
  * replace NaNs in bn.fit objects with uniform conditional
     probabilities when calling as.grain(), with a warning
     instead of an error.
  * remove reserved characters from levels in write.{dsc,bif,net}().
  * fix the Gaussian mutual information test (thanks Alex Lenkoski).

bnlearn (3.2)

  * fixed outstanding typo affecting the sequential Monte Carlo
     implementation of Pearson's X^2 (thanks Maxime Gasse).
  * switch from Margaritis' set of rules to the more standard
     Meek/Spirtes set of rules, which are implemented in cpdag().
     Now the networks returned by constraint-based algorithms are
     guaranteed to be CPDAGs, which was not necessarily the case
     until now.
  * semiparametric tests now default to 100 permutations, not 5000.
  * make a local copy of rcont2() to make bnlearn compatible with
     both older and newer R versions.

bnlearn (3.1)

  * fixed all.equal(), it did not work as expected on networks
     that were identical save for the order of nodes or arcs.
  * added a "moral" argument to cpdag() and vstructs() to make
     those functions follow the different definitions of v-structure.
  * added support for graphs with 1 and 2 nodes.
  * fixed cpquery() handling of TRUE (this time for real).
  * handle more corner cases in dsep().
  * added a BIC method for bn and bn.fit objects.
  * added the semiparametric tests from Tsamardinos & Borboudakis
     (thanks Maxime Gasse).
  * added posterior probabilities to the predictions for
     {naive,tree}.bayes() models.
  * fixed buffer overflow in rbn() for discrete data.

Older Entries