- Unlock the version cap on the following requirements: networkx, pathos, torch and wrapt
- Switch to Pyvis for graph visualisation and remove dependency on Graphviz
- Support newer versions of NumPy
- Support newer versions of scikit-learn
- Remove Python 3.6 and 3.7 support
- Add Python 3.9 and 3.10 support
- Unlock SciPy version restrictions
- Fix bug: infinite loop in the latent variable (LV) inference engine
- Fix DAGLayer moving off the GPU during the optimisation step of PyTorch learning
- Fix CPD comparison of floating-point values (rounding issue)
- Fix `set_cpd` for parentless nodes whose CPDs are not MultiIndex
- Add Docker files for development in a dockerised environment
- Add expectation-maximisation (EM) algorithm to learn with latent variables
- Add a new tutorial on adding a latent variable, as well as identifying its candidate location
- Allow users to provide self-defined CPD, as per #18 and #99
- Generalise the utility function to get the Markov blanket and incorporate it within `StructureModel` (cf. #136)
- Add a link to the PyGraphviz installation guide under the installation prerequisites
- Add GPU support to the PyTorch implementation, as requested in #56 and #114 (some issues remain)
- Add an example of structure model exporting to the first CausalNex tutorial, as per #124 and #129
- Fix infinite loop when querying `InferenceEngine` after a do-intervention that splits the graph into two or more subgraphs, as per #45 and #100 (see the sketch below)
- Fix decision tree and MDLP discretisation bug when input data is shuffled
- Fix broken URLs in FAQ documentation, as per #113 and #125
- Fix integer index type checking for timeseries data, as per #74 and #86
- Fix bug where inputs to the DAGRegressor/Classifier yielded different predictions between float and int dtypes, as per #140
- Fix bug in `set_cpd()` where only pd.MultiIndex dataframes were considered, which did not account for parentless nodes, as per #146
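The do-intervention fix above is easiest to see in context. Below is a minimal, hedged sketch of the query/do-intervention workflow on a toy Bayesian Network; the variable names, data and probabilities are illustrative assumptions, not part of the release.

```python
import pandas as pd

from causalnex.structure import StructureModel
from causalnex.network import BayesianNetwork
from causalnex.inference import InferenceEngine

# Illustrative discrete data over three variables (hypothetical).
data = pd.DataFrame({
    "a": [0, 1, 1, 0, 1, 0, 1, 1],
    "b": [1, 1, 0, 0, 1, 0, 0, 1],
    "c": [1, 0, 1, 0, 1, 0, 1, 0],
})

# Hand-crafted chain structure: a -> b -> c.
sm = StructureModel()
sm.add_edges_from([("a", "b"), ("b", "c")])

bn = BayesianNetwork(sm)
bn = bn.fit_node_states(data).fit_cpds(data, method="BayesianEstimator", bayes_prior="K2")

ie = InferenceEngine(bn)
print(ie.query()["c"])                    # marginal distribution of "c"

# A do-intervention on "b" removes its incoming edge, splitting the graph into
# {a} and {b, c}; querying afterwards is the scenario addressed by #45/#100.
ie.do_intervention("b", {0: 0.5, 1: 0.5})
print(ie.query()["c"])                    # marginal of "c" under the intervention
```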
- Add supervised discretisation strategies using Decision Tree and MDLP algorithms
- Add `BayesianNetworkClassifier`, an sklearn compatible class for fitting and predicting probabilities in a BN
- Fixes cyclical import of `causalnex.plots`, as per #106
- Add utility function to extract Markov blanket from a Bayesian Network
- Support receiving a list of inputs for `InferenceEngine` with a multiprocessing option (see the sketch below)
- Added manifest files to ensure requirements and licenses are packaged
- Fix estimator issues with sklearn ("unofficial python 3.9 support", doesn't work with `discretiser` option)
- Minor bumps in dependency versions, remove prettytable as dependency
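A short, hedged sketch of the batched `InferenceEngine.query` call noted above; the data and observations are illustrative, and the multiprocessing option is left at its default since the exact keyword may vary between versions.

```python
import pandas as pd

from causalnex.structure import StructureModel
from causalnex.network import BayesianNetwork
from causalnex.inference import InferenceEngine

# Illustrative two-variable data (hypothetical).
data = pd.DataFrame({"a": [0, 1, 1, 0, 1, 0, 1, 1],
                     "b": [1, 1, 0, 0, 1, 0, 0, 1]})

sm = StructureModel()
sm.add_edge("a", "b")
bn = BayesianNetwork(sm).fit_node_states(data).fit_cpds(data)

ie = InferenceEngine(bn)

# Passing a list of observation dicts returns one set of marginals per observation.
observations = [{"a": 0}, {"a": 1}]
results = ie.query(observations)
for obs, marginals in zip(observations, results):
    print(obs, marginals["b"])
```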
- Remove Boston housing dataset from "sklearn tutorial", see #91 for more information.
- Update pylint version to 2.7
- Improve the speed and determinism of tests
- Fixed bug where the sklearn tutorial documentation wasn't rendering.
- Weaken pandas requirements to >=1.0, <2.0 (was ~=1.1).
- Removed Python 3.5 support and added Python 3.8 support.
- Updated core dependencies, supporting pandas 1.1, networkx 2.5, pgmpy 0.1.12.
- Added PyTorch to requirements (i.e. not optional anymore).
- Allows sklearn imports via `from causalnex.structure import DAGRegressor, DAGClassifier` (see the sketch below).
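A hedged sketch of the sklearn-style interface this import enables; the synthetic data, default construction and use of `coef_` are illustrative assumptions.

```python
import numpy as np

from causalnex.structure import DAGRegressor

# Illustrative regression data: y depends mostly on the first feature.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
y = 2.0 * X[:, 0] + 0.1 * rng.normal(size=200)

reg = DAGRegressor()        # default settings; see the class docstring for options
reg.fit(X, y)

print(reg.predict(X[:5]))   # standard sklearn-style predictions
print(reg.coef_)            # learned effect of each feature on the target
```

`DAGClassifier` follows the same fit/predict pattern for classification targets.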
- Added multiclass support to pytorch sklearn wrapper.
- Added multi-parameter collapsed graph as graph attribute.
- Added poisson regression support to sklearn wrapper.
- Added distribution support for structure learning:
  - Added ordinal distributed data support for pytorch NOTEARS.
  - Added categorical distributed data support for pytorch NOTEARS.
  - Added poisson distributed data support for pytorch NOTEARS.
- Added dist type schema tutorial to docs.
- Updated sklearn tutorial in docs to show new features.
- Added constructive ImportError for pygraphviz.
- Added matplotlib and ipython display convenience functions.
- Added `DAGClassifier` sklearn interface using the Pytorch NOTEARS implementation. Supports binary classification.
- Added binary distributed data support for pytorch NOTEARS.
- Added a "distribution type" schema system for pytorch NOTEARS (`pytorch.dist_type`).
- Rename "data type" to "distribution type" in internal language.
- Fixed uniform discretiser (`Discretiser(method='uniform')`) where all bins have identical widths (see the sketch below).
- Fixed and updated sklearn tutorial in docs.
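For context on the uniform discretiser fix above, here is a brief, hedged sketch of `Discretiser`; the array and bucket count are illustrative, and the exact bucket boundaries depend on the installed version.

```python
import numpy as np

from causalnex.discretiser import Discretiser

values = np.array([1.0, 2.5, 3.0, 7.5, 9.0, 10.0])

# method="uniform" splits the observed range into equal-width buckets.
disc = Discretiser(method="uniform", num_buckets=3)
disc.fit(values)
print(disc.transform(values))   # bucket index per value, e.g. [0 0 0 2 2 2]
```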
- Added DYNOTEARS (`from_numpy_dynamic`, an algorithm for structure learning on Dynamic Bayesian Networks).
- Added Pytorch implementation for NOTEARS MLP (`pytorch.from_numpy`) which is much faster and allows nonlinear modelling (see the sketch below).
- Added `DAGRegressor` sklearn interface using the Pytorch NOTEARS implementation.
- Added non-linear data generators for multiple data types.
- Added a count data type to the data generator using a zero-inflated Poisson.
- Set bounds/max class imbalance for binary features for the data generators.
- Bugfix to resolve issue when applying NOTEARS on data containing NaN.
- Bugfix for data_gen system. Fixes issues with root node initialization.
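A hedged sketch of the PyTorch NOTEARS entry point added above; the data, the `hidden_layer_units` value and the pruning threshold are illustrative assumptions (check the docstring of your installed version for the exact signature).

```python
import numpy as np

from causalnex.structure.pytorch import from_numpy

# Illustrative data with a nonlinear dependency x0 -> x1 and an independent x2.
rng = np.random.default_rng(0)
x0 = rng.normal(size=500)
x1 = np.tanh(x0) + 0.1 * rng.normal(size=500)
x2 = rng.normal(size=500)
X = np.column_stack([x0, x1, x2])

# hidden_layer_units switches on the MLP (nonlinear) variant of NOTEARS.
sm = from_numpy(X, hidden_layer_units=[8], max_iter=50)
sm.remove_edges_below_threshold(0.1)   # prune weak edges before inspecting
print(sm.edges(data=True))
```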
- Added plotting tutorial to the documentation
- Updated `viz.draw` syntax in tutorial notebooks
- Bugfix on NOTEARS lasso (`from_numpy_lasso` and `from_pandas_lasso`) where the non-negativity constraint was not being set
- Added DAG-based synthetic data generator for mixed types (binary, categorical, continuous) using a linear SEM approach.
- Unpinned some requirements:
  - support for newer versions of scikit-learn
  - classification report now returns dict in line with scikit-learn
- Plotting now backed by pygraphviz. This allows:
  - More powerful layout manager
  - Cleaner fully customisable theme
  - Out-the-box styling for different node and edge types
- Can now get subgraphs from `StructureModel` containing a specific node (see the sketch below)
- Bugfix to resolve issue when fitting CPDs with some missing states in data
- Minor documentation fixes and improvements
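A hedged sketch of the subgraph extraction mentioned above, combined with plotting; the data is illustrative, and the rendering call depends on whether your version uses the pygraphviz or the later Pyvis backend, so it is only indicated in comments.

```python
import numpy as np
import pandas as pd

from causalnex.structure.notears import from_pandas
from causalnex.plots import plot_structure

# Illustrative linear data: x drives y, z is independent noise.
rng = np.random.default_rng(0)
x = rng.normal(size=300)
df = pd.DataFrame({
    "x": x,
    "y": 2.0 * x + 0.1 * rng.normal(size=300),
    "z": rng.normal(size=300),
})

sm = from_pandas(df)
sm.remove_edges_below_threshold(0.3)

# Extract the connected subgraph that contains a specific node.
sub = sm.get_target_subgraph("y")
print(sub.edges)

viz = plot_structure(sub)
# Rendering depends on the backend: pygraphviz-backed releases expose
# viz.draw("graph.png"), while the later Pyvis backend exposes viz.show("graph.html").
```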
- Bugfix to resolve broken links in README and minor text issues.
- Bugfix to add image to readthedocs.
- Bugfix to address readthedocs issue.
The initial release of CausalNex.
CausalNex was originally designed by Paul Beaumont and Ben Horsburgh to solve challenges they faced in inferring causality in their project work. This work was later turned into a product thanks to the following contributors: Philip Pilgerstorfer, Angel Droth, Richard Oentaryo, Steve Ler, Hiep Nguyen, Gabriel Azevedo Ferreira, Zain Patel, Wesley Leong, Yetunde Dada, Viktoriia Oliinyk, Roxana Pamfil, Nisara Sriwattanaworachai, Nikolaos Tsaousis, Shuhei Ishida, Francesca Sogaro, Deepyaman Datta and Ryan Ng.
CausalNex would also not be possible without the generous sharing from leading researchers in the field of causal inference, and we are grateful to everyone who advised and supported us, filed issues or helped resolve them, asked and answered questions, or simply took part in inspiring discussions.