A Julia implementation of Fast Automatic Relevance Determination (ARD) for sparse Bayesian regression with uncertainty quantification.
- Automatic sparsity detection in high-dimensional regression problems
- Uncertainty quantification with well-calibrated confidence intervals
- Polynomial chaos expansion support for function approximation
- Method comparison against traditional numerical approaches
- Performance optimization with timing analysis and convergence monitoring
```julia
using FastARD
using Random, Statistics, LinearAlgebra
# Test on the Ishigami function
function ishigami(x; a=7.0, b=0.1)
x1, x2, x3 = x[1], x[2], x[3]
return sin(x1) + a * sin(x2)^2 + b * x3^4 * sin(x1)
end
# Generate test data
Random.seed!(123)
n_train = 300
X_train = 2π * rand(n_train, 3) .- π
y_train = [ishigami(X_train[i, :]) for i in 1:n_train]
# Build the polynomial chaos basis (35 terms for total degree 4 in 3 variables),
# add observation noise, and evaluate the basis on a test set, producing
# Psi_train, y_train_noisy, and Psi_test.
# ... (see examples for the full implementation; an illustrative sketch follows)
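# --- Illustrative sketch only, not the packaged example code ---
# One way to assemble such a design matrix: a total-degree-4 Legendre basis in
# 3 variables has binomial(3+4, 3) = 35 terms. The helper names below
# (`legendre`, `poly_basis`, `multi_indices`) and the noise level are assumptions.
legendre(ξ, d) = d == 0 ? one(ξ) :
                 d == 1 ? ξ :
                 ((2d - 1) * ξ * legendre(ξ, d - 1) - (d - 1) * legendre(ξ, d - 2)) / d
multi_indices = [(i, j, k) for i in 0:4, j in 0:4, k in 0:4 if i + j + k <= 4]  # 35 tuples
poly_basis(X) = hcat([legendre.(X[:, 1] ./ π, i) .* legendre.(X[:, 2] ./ π, j) .*
                      legendre.(X[:, 3] ./ π, k) for (i, j, k) in multi_indices]...)
Psi_train = poly_basis(X_train)                    # 300 × 35 design matrix
y_train_noisy = y_train .+ 0.1 .* randn(n_train)   # additive noise (assumed level)
X_test = 2π * rand(200, 3) .- π                    # held-out points for prediction
Psi_test = poly_basis(X_test)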
# Fit FastARD model
model = FastARDRegressor(verbose=true, compute_score=true)
fit!(model, Psi_train, y_train_noisy)
# Analyze sparsity
active_indices, active_coefs = get_active_coefficients(model)
println("Selected $(length(active_indices)) out of 35 basis functions")
# Get predictions with uncertainty
y_pred, y_std = predict_with_uncertainty(model, Psi_test)
```

The package includes comprehensive tests demonstrating performance on the Ishigami function, a standard benchmark for:
- Sensitivity analysis and uncertainty quantification
- Polynomial chaos expansion with automatic basis selection
- Comparison against multiple advanced numerical methods
- Uncertainty calibration and coverage probability analysis (see the sketch below)
See examples/ishigami_comparison_test.jl for the complete tested implementation.
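As a rough illustration of the coverage check listed above, the sketch below continues from the quick-start snippet; `X_test`, the hypothetical basis helpers, and the 1.96 Gaussian quantile are assumptions, and the noise-free Ishigami values serve as the reference:

```julia
# Empirical coverage of nominal 95% Gaussian predictive intervals (illustrative sketch)
y_test = [ishigami(X_test[i, :]) for i in 1:size(X_test, 1)]
lower = y_pred .- 1.96 .* y_std
upper = y_pred .+ 1.96 .* y_std
coverage = mean((lower .<= y_test) .& (y_test .<= upper))
println("Empirical 95% coverage: $(round(coverage, digits=3))")
```

If the predictive intervals are well calibrated, the empirical coverage should land close to the nominal 0.95.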