Basic Examples (2)
 

Generate a Bayesian linear regression on random data:
In[1]:=
ResourceFunction["BayesianLinearRegression"][RandomReal[1, {2, 2}], \[FormalX], \[FormalX]]
Out[1]=
LogEvidence-5.80737,PriorParametersB{0,0},Lambda
1
100
,0,0,
1
100
,LambdaInverse{{100,0},{0,100}},V
1
100
,Nu
1
100
,PosteriorParametersB{0.621262,0.0167739},Lambda{{2.01,1.12038},{1.12038,0.694701}},LambdaInverse{{4.92336,-7.94014},{-7.94014,14.2449}},V0.014074,Nu
201
100
,PosteriorRegressionCoefficientDistributionMultivariateTDistribution{0.621262,0.0167739},{{0.0344734,-0.0555969},{-0.0555969,0.0997429}},
201
100
,ErrorDistributionInverseGammaDistribution
201
200
,0.00703701,PredictiveDistributionStudentTDistribution0.621262+0.0167739x.,0.0836779
5.92336-15.8803x.+14.2449
2
x.
,
201
100
,UnderlyingValueDistributionStudentTDistribution0.621262+0.0167739x.,0.0836779
4.92336-15.8803x.+14.2449
2
x.
,
201
100
,PriorRegressionCoefficientDistributionMultivariateTDistribution{0,0},{{100,0},{0,100}},
1
100
,ErrorDistributionInverseGammaDistribution
1
200
,
1
200
,PredictiveDistributionStudentTDistribution0,
101+100
2
x.
,
1
100
,UnderlyingValueDistributionStudentTDistribution0,10
1+
2
x.
,
1
100
,Basis{1,x.},IndependentVariables{x.}
———
Generate test data:
In[1]:=
data = RandomVariate[MultinormalDistribution[{{1, 0.7}, {0.7, 1}}], 20];
ListPlot[data]
Out[1]=
(scatter plot of the 20 sampled points)
Fit the data with a first-order polynomial:
In[2]:=
linearModel = ResourceFunction["BayesianLinearRegression"][data, \[FormalX], \[FormalX]];
Keys[linearModel]
Out[2]=
{LogEvidence,PriorParameters,PosteriorParameters,Posterior,Prior,Basis,IndependentVariables}
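The result is an Association, so individual properties can be read off with successive keys. A small sketch, using only keys shown above:
linearModel["PosteriorParameters", "B"] (* posterior mean of the basis coefficients *)
linearModel["LogEvidence"]              (* log marginal likelihood of the fit *)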
Show the predictive distribution of the model and the distribution of the fit line (dashed):
In[3]:=
Show[
 Plot[Evaluate@InverseCDF[linearModel["Posterior", "PredictiveDistribution"], {0.95, 0.5, 0.05}], {\[FormalX], -3, 3},
  Filling -> {1 -> {2}, 3 -> {2}},
  PlotLegends -> Table[Quantity[i, "Percent"], {i, {95, 50, 5}}]],
 Plot[Evaluate@InverseCDF[linearModel["Posterior", "UnderlyingValueDistribution"], {0.95, 0.5, 0.05}], {\[FormalX], -3, 3},
  Filling -> {1 -> {2}, 3 -> {2}}, PlotStyle -> Dashed],
 ListPlot[data, PlotStyle -> Black],
 PlotRange -> All
]
Out[3]=
(plot of the 95%, 50% and 5% predictive quantiles over the data, with the underlying-value bands dashed)
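Instead of plotting whole bands, the same quantiles can be evaluated at a single point. A minimal sketch (the location x == 1 is arbitrary):
With[{dist = linearModel["Posterior", "PredictiveDistribution"] /. \[FormalX] -> 1},
 InverseCDF[dist, {0.05, 0.95}] (* central 90% prediction interval at x == 1 *)
]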
Plot the joint distribution of the coefficients a and b in the regression equation y == a + b x (matching the basis {1, x}):
In[4]:=
With[{coefficientDist = linearModel["Posterior", "RegressionCoefficientDistribution"]},
 ContourPlot[
  Evaluate[PDF[coefficientDist, {a, b}]],
  {a, -1, 1}, {b, 0, 2},
  PlotRange -> {0, All}, PlotPoints -> 20, FrameLabel -> {"a", "b"}, ImageSize -> 400,
  PlotLabel -> "Posterior PDF of regression coefficients"
 ]
]
Out[4]=
(contour plot of the joint posterior PDF of a and b)
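Since the coefficient posterior is a MultivariateTDistribution, its summary statistics are available directly. A short sketch (Mean and Covariance require more than 1 and 2 degrees of freedom respectively, which should hold for this fit to 20 points):
With[{coefficientDist = linearModel["Posterior", "RegressionCoefficientDistribution"]},
 {Mean[coefficientDist], Covariance[coefficientDist]} (* posterior mean and covariance of {a, b} *)
]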
Plot the PDF of the posterior variance of the residuals:
In[5]:=
With[{errDist = linearModel["Posterior", "ErrorDistribution"]},
 Quiet@Plot[
  Evaluate[PDF[errDist, e]],
  {e, 0, 1},
  PlotRange -> {0, All}, AxesLabel -> {"σ²", "Density"}, ImageSize -> 400,
  Filling -> Axis,
  PlotLabel -> "Posterior PDF of residual variance"
 ]
]
Out[5]=
(plot of the inverse-gamma posterior density of the residual variance)
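The error distribution is an InverseGammaDistribution over the residual variance, so summary quantities come in closed form. A brief sketch:
With[{errDist = linearModel["Posterior", "ErrorDistribution"]},
 {Mean[errDist], Quantile[errDist, {0.05, 0.95}]} (* posterior mean and 90% interval for σ² *)
]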
Fit the data with a polynomial of arbitrary degree and compare the prediction bands and log-evidence of fits up to degree 4:
In[6]:=
Table[
 Module[{
   model = ResourceFunction["BayesianLinearRegression"][Rule @@@ data, \[FormalX]^Range[0, degree], \[FormalX]],
   predictiveDist
  },
  predictiveDist = model["Posterior", "PredictiveDistribution"];
  Show[
   Plot[Evaluate@InverseCDF[predictiveDist, {0.95, 0.5, 0.05}], {\[FormalX], -3, 3},
    Filling -> {1 -> {2}, 3 -> {2}},
    PlotLegends -> Table[Quantity[i, "Percent"], {i, {95, 50, 5}}]],
   ListPlot[data, PlotStyle -> Black],
   PlotRange -> All,
   PlotLabel -> StringForm["Degree: `1`\nLog evidence: `2`", degree, model["LogEvidence"]]
  ]
 ],
 {degree, 0, 4}
]
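To compare the candidate degrees numerically rather than visually, the log evidence alone can be tabulated; higher log evidence indicates a better trade-off between fit quality and model complexity. A minimal sketch under the same setup:
TableForm[
 Table[
  {degree, ResourceFunction["BayesianLinearRegression"][Rule @@@ data, \[FormalX]^Range[0, degree], \[FormalX]]["LogEvidence"]},
  {degree, 0, 4}],
 TableHeadings -> {None, {"Degree", "Log evidence"}}
]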