deepgp: Bayesian Deep Gaussian Processes using MCMC
Performs Bayesian posterior inference for deep Gaussian processes following
Sauer, Gramacy, and Higdon (2023, <doi:10.48550/arXiv.2012.08015>). See Sauer (2023,
<http://hdl.handle.net/10919/114845>) for comprehensive methodological details and
<https://bitbucket.org/gramacylab/deepgp-ex/> for a variety of coding examples.
Models are trained through
MCMC, including elliptical slice sampling of latent Gaussian layers and Metropolis-Hastings
sampling of kernel hyperparameters. The Vecchia approximation for faster computation is implemented
following Sauer, Cooper, and Gramacy (2022, <doi:10.48550/arXiv.2204.02904>). Downstream tasks include
sequential design through active learning Cohn/integrated mean squared error (ALC/IMSE; Sauer,
Gramacy, and Higdon, 2023), optimization through expected improvement (EI;
Gramacy, Sauer, and Wycoff, 2021 <doi:10.48550/arXiv.2112.07457>), and contour location through entropy
(Sauer, 2023). Models extend up to three layers deep; a one-layer model is equivalent to
typical Gaussian process regression. Incorporates OpenMP and SNOW parallelization and
utilizes C/C++ under the hood.
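A minimal sketch of the workflow described above, using the package's documented interface (`fit_two_layer`, `trim`, and the `predict` method); the toy data and MCMC settings here are illustrative choices, not package defaults:

```r
# Fit a two-layer deep GP to noisy 1-d data, discard burn-in, and predict.
library(deepgp)

# Toy data: a step-like response, the kind of non-stationarity deep GPs target
x <- matrix(seq(0, 1, length = 30), ncol = 1)
y <- as.numeric(x > 0.5) + rnorm(30, sd = 0.05)

fit <- fit_two_layer(x, y, nmcmc = 2000)  # MCMC: elliptical slice sampling of the
                                          # latent layer, Metropolis-Hastings for
                                          # kernel hyperparameters
fit <- trim(fit, burn = 1000, thin = 2)   # remove burn-in and thin the chains

xx <- matrix(seq(0, 1, length = 100), ncol = 1)
fit <- predict(fit, xx)                   # posterior moments stored in fit$mean, fit$s2
```

For larger designs, the fitting functions accept a `vecchia = TRUE` argument to enable the Vecchia approximation cited above.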
Version: 1.1.1
Depends: R (≥ 3.6)
Imports: grDevices, graphics, stats, doParallel, foreach, parallel, GpGp, Matrix, Rcpp, mvtnorm, FNN
LinkingTo: Rcpp, RcppArmadillo
Suggests: interp, knitr, rmarkdown
Published: 2023-08-07
Author: Annie S. Booth
Maintainer: Annie S. Booth <annie_booth at ncsu.edu>
License: LGPL-2 | LGPL-2.1 | LGPL-3 [expanded from: LGPL]
NeedsCompilation: yes
Materials: README
CRAN checks: deepgp results
Please use the canonical form https://CRAN.R-project.org/package=deepgp to link to this page.