Basic Examples

Fit some random data to simple linear models with a shared slope parameter:
In[]:=
data=RandomVariate[BinormalDistribution[0.7],{2,100}];​​data[[1,All,2]]+=1.;​​data[[2,All,2]]+=-1.;​​​​fit=
[◼]
MultiNonlinearModelFit
[​​data,​​{ax+b1,ax+b2},{a,b1,b2},{x}​​]
Out[]=
FittedModel
Switch[Round[n.],1,0.865856+0.672231x,2,-1.02017+0.672231x]

In[]:=
Show[​​ListPlot[data],​​Plot[Evaluate[Table[Normal[fit],{n.,Length[data]}]],{x,-5,5}]​​]
Out[]=
-3
-2
-1
1
2
-3
-2
-1
1
2
3
Fit the same model with parameter constraints to allow for a small difference between the slopes of the lines. This works best by using a quadratic form for the difference between the slope parameters, since (a1 - a2)^2 is differentiable (unlike Abs[a1 - a2]):
In[]:=
fit=
[◼]
MultiNonlinearModelFit
[​​data,​​<|​​"Expressions"{a1x+b1,a2x+b2},​​"Constraints"
2
(a1-a2)
<
2
0.1
&&b1>0&&b2<0​​|>,​​{a1,a2,b1,b2},​​{x}​​]
Out[]=
FittedModel
Switch[Round[n.],1,0.859858+0.628578x,2,-1.01732+0.726766x]

In[]:=
Show[​​ListPlot[data],​​Plot[Evaluate[Table[Normal[fit],{n.,Length[data]}]],{x,-5,5}]​​]
Out[]=
-3
-2
-1
1
2
-3
-2
-1
1
2
3
Fit two Gaussian peaks with a shared location parameter:
In[]:=
xvals=Range[-5,5,0.1];​​gauss[x_]:=Evaluate@PDF[NormalDistribution[],x];​​​​With[{amp1=1.2,amp2=0.5,width1=1,width2=2,sharedOffset=0.5,eps=0.05},​​dat1=Table[{x,amp1gauss[(x-sharedOffset)/width1]+epsRandomVariate[NormalDistribution[]]},{x,xvals}];​​dat2=Table[{x,amp2gauss[(x-sharedOffset)/width2]+epsRandomVariate[NormalDistribution[]]},{x,xvals}]​​];​​plot=ListPlot[{dat1,dat2}]
Out[]=
-4
-2
2
4
-0.1
0.1
0.2
0.3
0.4
0.5
Fit with the models that were used to generate the data:
In[]:=
fit=
[◼]
MultiNonlinearModelFit
[​​{dat1,dat2},​​{​​amp1gauss[(x-sharedOffset)/width1],​​amp2gauss[(x-sharedOffset)/width2]​​},​​{amp1,amp2,width1,width2,sharedOffset},​​{x}​​]
Out[]=
FittedModel
SwitchRound[n.],​​1,
1.14248
-
1
1

2π
,​​2,
0.519964
-
1
21

2π


In[]:=
fit["BestFitParameters"]
Out[]=
{amp11.14248,amp20.519964,width11.01437,width21.90244,sharedOffset0.509881}
Extract the fits as a list of expressions:
In[]:=
fits=Table[Normal[fit],{n.,{1,2}}]
Out[]=
0.455784
-0.485936
2
(-0.509881+x)

,0.207436
-0.138149
2
(-0.509881+x)


Compare the fits to the data:
In[]:=
Show[​​plot,​​Plot[fits,{x,-5,5}]​​]
Out[]=
-4
-2
2
4
-0.1
0.1
0.2
0.3
0.4
0.5

Options

Weights

The Weights option can be specified in a number of ways. First generate datasets with an unequal number of points and offset them slightly:
In[]:=
gauss[x_]:=Evaluate@PDF[NormalDistribution[],x];​​With[{amp1=1.2,amp2=0.5,width1=1,width2=2,sharedOffset=0.5,eps=0.025},​​dat1=Table[​​{x,amp1gauss[(x-sharedOffset)/width1]+epsRandomVariate[NormalDistribution[]]},​​{x,-5,5,0.1}​​];​​dat2=Table[​​{x,amp2gauss[(x-sharedOffset+1)/width2]+epsRandomVariate[NormalDistribution[]]},​​{x,-5,5,0.2}​​]​​];​​fitfuns={​​amp1gauss[(x-sharedOffset)/width1],​​amp2gauss[(x-sharedOffset)/width2]​​};​​fitParams={amp1,amp2,width1,width2,sharedOffset};​​variables={x};​​plot=ListPlot[{dat1,dat2}]
Fit the data normally. In this case, each individual data point has equal weight in the fit, so the first dataset gets more weight overall since it contains more points:
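A sketch of the default call, using the `fitfuns`, `fitParams` and `variables` defined above (fitted values will vary with the random data):

```wolfram
(* default: every data point carries the same weight *)
fit = MultiNonlinearModelFit[{dat1, dat2}, fitfuns, fitParams, variables];
fit["BestFitParameters"]
```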
Assign more weight to the second dataset:
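A hedged sketch, assuming the function accepts one weight per dataset; the factor 5 is purely illustrative:

```wolfram
(* up-weight the second dataset relative to the first *)
fit = MultiNonlinearModelFit[{dat1, dat2}, fitfuns, fitParams, variables,
  Weights -> {1, 5}];
```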
Assign weights inversely proportional to the number of points in the dataset. This asserts that each dataset is equally important:
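One possible form, again assuming per-dataset weights are accepted:

```wolfram
(* each dataset contributes equally to the fit, regardless of its size *)
fit = MultiNonlinearModelFit[{dat1, dat2}, fitfuns, fitParams, variables,
  Weights -> 1/{Length[dat1], Length[dat2]}];
```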
To assign weights for each individual data point, you can pass a list of vectors matching the input data:
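For example (a sketch; the constant weight vectors stand in for any point-by-point weighting):

```wolfram
(* one weight vector per dataset, matching the number of points in each *)
fit = MultiNonlinearModelFit[{dat1, dat2}, fitfuns, fitParams, variables,
  Weights -> {ConstantArray[1., Length[dat1]], ConstantArray[2., Length[dat2]]}];
```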

DatasetIndexSymbol

Use a different symbol to index the datasets:
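A sketch, assuming the option is given as "DatasetIndexSymbol" (replacing the default formal symbol) and that the chosen symbol can then be used to extract the individual fits:

```wolfram
(* index the datasets with k instead of \[FormalN] *)
fit = MultiNonlinearModelFit[{dat1, dat2}, fitfuns, fitParams, variables,
  "DatasetIndexSymbol" -> k];
fits = Table[Normal[fit], {k, 2}]
```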