Divided differences (均差) are defined by a recursive division process. In numerical analysis they can be used to compute the coefficients of a polynomial interpolant written in Newton form. In calculus, divided differences and derivatives are jointly referred to as difference quotients, which measure the average rate of change of a function over an interval.[1][2][3]
Divided differences also underlie an algorithm: Charles Babbage's difference engine, an early mechanical computer proposed in a paper he published in 1822 and historically intended for computing tables of logarithms and trigonometric functions, was designed to use this algorithm in its operation.[4]
Given n + 1 data points
$$
(x_0, y_0), \ldots, (x_n, y_n)
$$
the forward divided differences are defined as:
$$
\begin{aligned}
{[}y_{\nu}] &= y_{\nu}, \quad \nu \in \{0,\ldots,n\} \\
{[}y_{\nu},\ldots,y_{\nu+j}] &= \frac{[y_{\nu+1},\ldots,y_{\nu+j}] - [y_{\nu},\ldots,y_{\nu+j-1}]}{x_{\nu+j} - x_{\nu}}, \quad \nu \in \{0,\ldots,n-j\},\ j \in \{1,\ldots,n\}
\end{aligned}
$$
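As an illustration (not part of the original article), the forward recursion above can be coded directly; the function name and argument layout below are illustrative choices.

```python
def divided_difference(x, y, lo, hi):
    """Forward divided difference [y_lo, ..., y_hi] over nodes x[lo..hi],
    computed directly from the recursive definition above."""
    if lo == hi:                      # zeroth order: [y_v] = y_v
        return y[lo]
    return (divided_difference(x, y, lo + 1, hi)
            - divided_difference(x, y, lo, hi - 1)) / (x[hi] - x[lo])
```

Calling `divided_difference(x, y, 0, len(x) - 1)` returns $[y_0,\ldots,y_n]$; the naive recursion recomputes shared subproblems, which the tabular scheme shown further below avoids.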
The backward divided differences are defined as:
$$
\begin{aligned}
{[}y_{\nu}] &= y_{\nu}, \quad \nu \in \{0,\ldots,n\} \\
{[}y_{\nu},\ldots,y_{\nu-j}] &= \frac{[y_{\nu},\ldots,y_{\nu-j+1}] - [y_{\nu-1},\ldots,y_{\nu-j}]}{x_{\nu} - x_{\nu-j}}, \quad \nu \in \{j,\ldots,n\},\ j \in \{1,\ldots,n\}
\end{aligned}
$$
If the data points are given as values of a function ƒ,
$$
(x_0, f(x_0)), \ldots, (x_n, f(x_n))
$$
then the divided differences can be written as:
$$
\begin{aligned}
f[x_{\nu}] &= f(x_{\nu}), \quad \nu \in \{0,\ldots,n\} \\
f[x_{\nu},\ldots,x_{\nu+j}] &= \frac{f[x_{\nu+1},\ldots,x_{\nu+j}] - f[x_{\nu},\ldots,x_{\nu+j-1}]}{x_{\nu+j} - x_{\nu}}, \quad \nu \in \{0,\ldots,n-j\},\ j \in \{1,\ldots,n\}
\end{aligned}
$$
Other notations for the divided difference of a function ƒ on the nodes $x_0, \ldots, x_n$ include:
$$
\begin{matrix}
{[}x_0,\ldots,x_n]f \\
{[}x_0,\ldots,x_n; f] \\
D[x_0,\ldots,x_n]f
\end{matrix}
$$
For ν = 0:
$$
\begin{aligned}
{[}y_0] &= y_0 \\
{[}y_0,y_1] &= \frac{y_1 - y_0}{x_1 - x_0} \\
{[}y_0,y_1,y_2] &= \frac{[y_1,y_2] - [y_0,y_1]}{x_2 - x_0} \\
{[}y_0,y_1,y_2,y_3] &= \frac{[y_1,y_2,y_3] - [y_0,y_1,y_2]}{x_3 - x_0} \\
{[}y_0,y_1,\dots,y_n] &= \frac{[y_1,y_2,\dots,y_n] - [y_0,y_1,\dots,y_{n-1}]}{x_n - x_0}
\end{aligned}
$$
To make the recursion clearer, the computation of the divided differences can be laid out in tabular form:[5]
$$
\begin{matrix}
x_0 & [y_0]=y_0 & & & \\
 & & [y_0,y_1] & & \\
x_1 & [y_1]=y_1 & & [y_0,y_1,y_2] & \\
 & & [y_1,y_2] & & [y_0,y_1,y_2,y_3] \\
x_2 & [y_2]=y_2 & & [y_1,y_2,y_3] & \\
 & & [y_2,y_3] & & \\
x_3 & [y_3]=y_3 & & &
\end{matrix}
$$
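A sketch of this tabular computation (illustrative Python, not from the source): column j of the returned list holds all j-th order divided differences, matching the layout above.

```python
def divided_difference_table(x, y):
    """Column j holds the j-th order divided differences
    [y_v, ..., y_{v+j}] for v = 0, ..., n - j."""
    n = len(x)
    table = [list(y)]                              # column 0: [y_v] = y_v
    for j in range(1, n):
        prev = table[-1]
        table.append([(prev[v + 1] - prev[v]) / (x[v + j] - x[v])
                      for v in range(n - j)])
    return table
```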
By mathematical induction one can show:[6]
$$
\begin{aligned}
{[}y_0] &= y_0 \\
{[}y_0,y_1] &= \frac{y_0}{x_0-x_1} + \frac{y_1}{x_1-x_0} \\
{[}y_0,y_1,y_2] &= \frac{y_0}{(x_0-x_1)(x_0-x_2)} + \frac{y_1}{(x_1-x_0)(x_1-x_2)} + \frac{y_2}{(x_2-x_0)(x_2-x_1)} \\
{[}y_0,y_1,\dots,y_n] &= \sum_{j=0}^{n} \frac{y_j}{\prod_{k=0,\,k\neq j}^{n}(x_j-x_k)}
\end{aligned}
$$
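The closed form just derived can also be evaluated directly. The following sketch (illustrative names, assuming pairwise distinct nodes; `math.prod` requires Python 3.8+) computes $[y_0,\ldots,y_n]$ from the symmetric sum.

```python
from math import prod

def divided_difference_closed_form(x, y):
    """[y_0, ..., y_n] = sum_j y_j / prod_{k != j} (x_j - x_k),
    assuming the nodes x are pairwise distinct."""
    n = len(x)
    return sum(y[j] / prod(x[j] - x[k] for k in range(n) if k != j)
               for j in range(n))
```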
This formula exhibits the symmetry of divided differences.[7] It follows that permuting the data points in any way leaves the value unchanged.[8]
Symmetry: if $\sigma : \{0,\dots,n\} \to \{0,\dots,n\}$ is a permutation, then
$$
f[x_0,\dots,x_n] = f[x_{\sigma(0)},\dots,x_{\sigma(n)}]
$$
Linearity:
$$
\begin{aligned}
(f+g)[x_0,\dots,x_n] &= f[x_0,\dots,x_n] + g[x_0,\dots,x_n] \\
(\lambda\cdot f)[x_0,\dots,x_n] &= \lambda\cdot f[x_0,\dots,x_n]
\end{aligned}
$$
Leibniz rule:
$$
(f\cdot g)[x_0,\dots,x_n] = f[x_0]\cdot g[x_0,\dots,x_n] + f[x_0,x_1]\cdot g[x_1,\dots,x_n] + \dots + f[x_0,\dots,x_n]\cdot g[x_n]
$$
Mean value theorem for divided differences:
$$
\exists\,\xi \in \bigl(\min\{x_0,\dots,x_n\},\ \max\{x_0,\dots,x_n\}\bigr) \quad f[x_0,\dots,x_n] = \frac{f^{(n)}(\xi)}{n!}
$$
By swapping $(x_0, y_0)$ and $(x_{n-1}, y_{n-1})$ in the n-th order divided difference, an equivalent definition is obtained:
$$
\begin{aligned}
{[}y_0,y_1,\dots,y_{n-1},y_n] &= \frac{[y_1,y_2,\dots,y_n] - [y_0,y_1,\dots,y_{n-1}]}{x_n - x_0} \\
&= \frac{[y_1,\dots,y_{n-2},y_0,y_n] - [y_{n-1},y_1,\dots,y_{n-2},y_0]}{x_n - x_{n-1}} \\
&= \frac{[y_0,\dots,y_{n-2},y_n] - [y_0,y_1,\dots,y_{n-1}]}{x_n - x_{n-1}}
\end{aligned}
$$
This definition leads to a different order of computation:
$$
\begin{aligned}
{[}y_0] &= y_0 \\
{[}y_0,y_1] &= \frac{y_1-y_0}{x_1-x_0} \\
{[}y_0,y_1,y_2] &= \frac{[y_0,y_2] - [y_0,y_1]}{x_2-x_1} \\
{[}y_0,y_1,y_2,y_3] &= \frac{[y_0,y_1,y_3] - [y_0,y_1,y_2]}{x_3-x_2} \\
{[}y_0,y_1,\dots,y_n] &= \frac{[y_0,\dots,y_{n-2},y_n] - [y_0,y_1,\dots,y_{n-1}]}{x_n-x_{n-1}}
\end{aligned}
$$
In tabular form, the computation under this definition proceeds as:[9]
$$
\begin{matrix}
x_0 & [y_0]=y_0 & & & \\
 & & [y_0,y_1] & & \\
x_1 & [y_1]=y_1 & & [y_0,y_1,y_2] & \\
 & & [y_0,y_2] & & [y_0,y_1,y_2,y_3] \\
x_2 & [y_2]=y_2 & & [y_0,y_1,y_3] & \\
 & & [y_0,y_3] & & \\
x_3 & [y_3]=y_3 & & &
\end{matrix}
$$
Illustration from Lemma V of Book III ("The System of the World") of the Principia (《自然哲學的數學原理》): six points H, I, K, L, M, N on the abscissa correspond to six values A, B, C, D, E, F; a polynomial function taking these six values at the six points is constructed, and the value R corresponding to an arbitrary point S is computed. Newton treated both the case of unit spacing and that of arbitrary spacing.
The Newton interpolation formula, named after Sir Isaac Newton, was first published as Lemma V of Book III ("The System of the World") of his Principia (1687); James Gregory in 1670 and Newton himself in 1676 had each obtained the result independently beforehand. It is generally regarded as the discrete counterpart of the continuous Taylor expansion.
Newton interpolation in terms of divided differences is:[10]
$$
\begin{aligned}
N_n(x) &= y_0 + (x-x_0)\bigl([y_0,y_1] + (x-x_1)([y_0,y_1,y_2] + \cdots)\bigr) \\
&= [y_0] + [y_0,y_1](x-x_0) + \cdots + [y_0,y_1,\ldots,y_n]\prod_{k=0}^{n-1}(x-x_k)
\end{aligned}
$$
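As a hedged sketch of how the nested (Horner-like) form above is typically evaluated (illustrative Python, not the article's own code): the leading divided differences are computed in place and then used as the Newton coefficients.

```python
def newton_interpolate(x, y, t):
    """Evaluate N_n(t): first compute the coefficients [y_0], [y_0, y_1], ...,
    [y_0, ..., y_n] in place, then apply the nested form."""
    n = len(x)
    coef = list(y)
    for j in range(1, n):                    # j-th order divided differences
        for v in range(n - 1, j - 1, -1):    # overwrite bottom-up so lower
            coef[v] = (coef[v] - coef[v - 1]) / (x[v] - x[v - j])  # orders survive
    result = coef[-1]
    for k in range(n - 2, -1, -1):           # Horner-style nesting
        result = result * (t - x[k]) + coef[k]
    return result
```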
Nodes such as $(x_{n+1}, y_{n+1})$ can be added at any stage of the computation: only the new (n+1)-th order divided difference and its basis polynomial need to be computed, whereas Lagrange interpolation would require recomputing all of the basis polynomials, as sketched below.
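A sketch of that incremental update (illustrative, not from the source): if the trailing divided differences $[y_v,\ldots,y_n]$ for $v=0,\ldots,n$ are kept, appending a node costs only one new "diagonal", whose first entry is the single new Newton coefficient.

```python
def append_node(x, diag, x_new, y_new):
    """Given nodes x[0..n] and diag[v] = [y_v, ..., y_n], return the new
    trailing diagonal after adding (x_new, y_new); its first entry
    [y_0, ..., y_{n+1}] is the only new Newton coefficient."""
    new_diag = [0.0] * (len(diag) + 1)
    new_diag[-1] = y_new                     # [y_{n+1}] = y_{n+1}
    for v in range(len(diag) - 1, -1, -1):
        new_diag[v] = (new_diag[v + 1] - diag[v]) / (x_new - x[v])
    return new_diag
```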
Using the expanded form of the divided differences:[11]
$$
N_n(x) = y_0 + y_0\frac{x-x_0}{x_0-x_1} + y_1\frac{x-x_0}{x_1-x_0} + \cdots + \sum_{j=0}^{n} y_j \frac{\prod_{k=0}^{n-1}(x-x_k)}{\prod_{k=0,\,k\neq j}^{n}(x_j-x_k)}
$$
Taking second-order Newton interpolation as an example:
$$
\begin{aligned}
N_2(x) &= y_0\left(1 + \frac{x-x_0}{x_0-x_1} + \frac{(x-x_0)(x-x_1)}{(x_0-x_1)(x_0-x_2)}\right) + y_1\left(\frac{x-x_0}{x_1-x_0} + \frac{(x-x_0)(x-x_1)}{(x_1-x_0)(x_1-x_2)}\right) + y_2\frac{(x-x_0)(x-x_1)}{(x_2-x_0)(x_2-x_1)} \\
&= y_0\frac{(x-x_1)(x-x_2)}{(x_0-x_1)(x_0-x_2)} + y_1\frac{(x-x_0)(x-x_2)}{(x_1-x_0)(x_1-x_2)} + y_2\frac{(x-x_0)(x-x_1)}{(x_2-x_0)(x_2-x_1)} \\
&= \sum_{j=0}^{2} y_j \prod_{\substack{k=0 \\ k\neq j}}^{2} \frac{x-x_k}{x_j-x_k}
\end{aligned}
$$
When the data points are equally spaced, this special case is handled with "forward differences", which are easier to compute than general divided differences.
Given n + 1 data points
$$
(x_0, y_0), \ldots, (x_n, y_n)
$$
with
$$
x_i = x_0 + ih, \quad h > 0,\ 0 \le i \le n,
$$
the forward differences are defined as:
$$
\begin{aligned}
\triangle^0 y_i &= y_i \\
\triangle^k y_i &= \triangle^{k-1} y_{i+1} - \triangle^{k-1} y_i, \quad 1 \le k \le n-i
\end{aligned}
$$
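A minimal sketch of this recursion (illustrative Python): row k of the result holds $\triangle^k y_i$ for every admissible i, mirroring the table further below.

```python
def forward_differences(y):
    """rows[k][i] = Δ^k y_i, built by repeated first differences."""
    rows = [list(y)]
    while len(rows[-1]) > 1:
        prev = rows[-1]
        rows.append([prev[i + 1] - prev[i] for i in range(len(prev) - 1)])
    return rows
```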
The divided difference corresponding to forward differences is:[12]
$$
f[x_0, x_1, \ldots, x_k] = \frac{1}{k!\,h^k}\,\Delta^{(k)}f(x_0)
$$
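For equally spaced nodes this relation yields the divided difference without any division by node distances; a one-line sketch (illustrative names):

```python
from math import factorial

def divided_difference_from_delta(delta_k_y0, k, h):
    """f[x_0, ..., x_k] = Δ^k y_0 / (k! h^k) for nodes x_i = x_0 + i*h."""
    return delta_k_y0 / (factorial(k) * h ** k)
```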
$$
\begin{matrix}
y_0 & & & \\
 & \triangle y_0 & & \\
y_1 & & \triangle^2 y_0 & \\
 & \triangle y_1 & & \triangle^3 y_0 \\
y_2 & & \triangle^2 y_1 & \\
 & \triangle y_2 & & \\
y_3 & & &
\end{matrix}
$$
The expanded form of the forward difference is a special case of the expanded form of the divided difference:[13]
$$
\triangle^k y_i = \sum_{j=0}^{k} (-1)^{k-j}\binom{k}{j}\, y_{i+j}, \quad 0 \le k \le n-i
$$
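This expansion can be checked directly with integer binomial coefficients (a sketch; `math.comb` requires Python 3.8+):

```python
from math import comb

def forward_difference_direct(y, i, k):
    """Δ^k y_i = sum_{j=0}^{k} (-1)^(k-j) C(k, j) y_{i+j}."""
    return sum((-1) ** (k - j) * comb(k, j) * y[i + j] for j in range(k + 1))
```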
Here the expression
$$
\binom{n}{k} = \frac{(n)_k}{k!} \qquad (n)_k = n(n-1)(n-2)\cdots(n-k+1)
$$
is the binomial coefficient, where $(n)_k$ is the falling factorial and the empty product $(n)_0$ is defined to be 1.
The corresponding Newton interpolation formula is:
$$
\begin{aligned}
f(x) &= y_0 + \frac{x-x_0}{h}\left(\Delta^1 y_0 + \frac{x-x_0-h}{2h}\left(\Delta^2 y_0 + \cdots\right)\right) \\
&= y_0 + \sum_{k=1}^{n} \frac{\Delta^k y_0}{k!\,h^k} \prod_{i=0}^{k-1}(x-x_0-ih) \\
&= y_0 + \sum_{k=1}^{n} \frac{\Delta^k y_0}{k!} \prod_{i=0}^{k-1}\left(\frac{x-x_0}{h}-i\right) \\
&= \sum_{k=0}^{n} \binom{\frac{x-x_0}{h}}{k}\,\Delta^k y_0
\end{aligned}
$$
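A sketch of evaluating the last (binomial) form for equally spaced nodes (illustrative names, not from the source): the generalized binomial coefficient $\binom{s}{k}$ with $s=(x-x_0)/h$ is accumulated multiplicatively.

```python
def newton_forward_interpolate(x0, h, y, t):
    """Evaluate sum_k C(s, k) Δ^k y_0 with s = (t - x0)/h."""
    # leading forward differences Δ^k y_0
    diffs, row = [y[0]], list(y)
    for _ in range(1, len(y)):
        row = [row[i + 1] - row[i] for i in range(len(row) - 1)]
        diffs.append(row[0])
    s = (t - x0) / h
    result, binom = 0.0, 1.0                 # binom = C(s, k)
    for k, dk in enumerate(diffs):
        result += binom * dk
        binom *= (s - k) / (k + 1)           # C(s, k+1) = C(s, k)·(s-k)/(k+1)
    return result
```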
Newton obtained the infinite series for ln(1 + x) in 1665 and set it out in the Method of Fluxions, written in 1671; he obtained the infinite series for arcsin(x) and arctan(x) in 1666, and in De Analysi of 1669 gave the infinite series for sin(x), cos(x), arcsin(x), and e^x. Leibniz probably also obtained the infinite series for sin(x), cos(x), and arctan(x) around 1673. Brook Taylor investigated the method of "finite differences" in his 1715 work Methodus Incrementorum Directa et Inversa,[14] where he expounded the Taylor theorem he had obtained in 1712; James Gregory in 1670 and Leibniz in 1673 had already obtained this result, and Johann Bernoulli had published it in the Acta Eruditorum in 1694.
He took the limit of Newton's divided differences as the step size tends to 0, obtaining:
$$
\begin{aligned}
f(x) &= f(a) + \lim_{h\to 0}\sum_{k=1}^{\infty} \frac{\Delta_h^k[f](a)}{k!\,h^k} \prod_{i=0}^{k-1}\bigl((x-a)-ih\bigr) \\
&= f(a) + \sum_{k=1}^{\infty} \frac{d^k}{dx^k}f(a)\,\frac{(x-a)^k}{k!}
\end{aligned}
$$
Writing the power functions in ordinary function notation, $p_n(x) = x^n$, we have:
$$
\begin{aligned}
p_j[x_0,\dots,x_n] &= 0 \qquad \forall j<n \\
p_n[x_0,\dots,x_n] &= 1 \\
p_{n+1}[x_0,\dots,x_n] &= x_0+\dots+x_n \\
p_{n+m}[x_0,\dots,x_n] &= \sum_{k_0+\cdots+k_n=m}\ \prod_{t=0}^{n} x_t^{k_t}
\end{aligned}
$$
Here the notation for the homogeneous polynomial of degree m in the n + 1 variables follows the multinomial theorem.
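A sketch of the closed form above (illustrative, not from the source): for the power function $p_{n+m}$, the divided difference over $n+1$ nodes is the sum of all monomials of total degree $m$ in those nodes (0 when the power is below n, 1 when it equals n).

```python
from itertools import combinations_with_replacement
from math import prod

def power_divided_difference(nodes, power):
    """p_power[x_0, ..., x_n]: 0 if power < n, otherwise the sum over all
    degree-(power - n) monomials in the nodes (1 when power == n)."""
    m = power - (len(nodes) - 1)
    if m < 0:
        return 0
    return sum(prod(c) for c in combinations_with_replacement(nodes, m))
```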
The Taylor series, and in principle any other series of functions, can be used to approximate divided differences. Writing the Taylor series as:
$$
f = f(0)\,p_0 + f'(0)\,p_1 + \frac{f''(0)}{2!}\,p_2 + \dots
$$
the Taylor series of the divided difference is:
$$
f[x_0,\dots,x_n] = f(0)\,p_0[x_0,\dots,x_n] + f'(0)\,p_1[x_0,\dots,x_n] + \dots + \frac{f^{(n)}(0)}{n!}\,p_n[x_0,\dots,x_n] + \dots
$$
The first $n$ terms vanish, because the order of the divided difference exceeds the degree of those polynomials; hence the Taylor series of the divided difference essentially begins with
$$
\frac{f^{(n)}(0)}{n!}
$$
By the mean value theorem for divided differences, this is also the simplest approximation of the divided difference.
Divided differences can also be expressed as
$$
f[x_0,\ldots,x_n] = \frac{1}{n!}\int_{x_0}^{x_n} f^{(n)}(t)\,B_{n-1}(t)\,dt
$$
Here $B_{n-1}$ is the B-spline of degree n − 1 on the data points $x_0, \ldots, x_n$, and $f^{(n)}$ is the n-th derivative of f. This is called the Peano form of the divided difference, and $B_{n-1}$ is its Peano kernel.
^ Frank C. Wilson; Scott Adamson. Applied Calculus. Cengage Learning. 2008: 177. ISBN 0-618-61104-5.
^ Tamara Lefcourt Ruby; James Sellers; Lisa Korf; Jeremy Van Horn; Mike Munn. Kaplan AP Calculus AB & BC 2015. Kaplan Publishing. 2014: 237. ISBN 978-1-61865-686-5.
^ Thomas Hungerford; Douglas Shaw. Contemporary Precalculus: A Graphing Approach. Cengage Learning. 2008: 211–212. ISBN 0-495-10833-2.
^ Isaacson, Walter. The Innovators. Simon & Schuster. 2014: 20. ISBN 978-1-4767-0869-0.
^
$$
\begin{array}{l}
{\begin{matrix}x_{0}&x_{0}^{2}&&\\&&x_{0}+x_{1}&&\\x_{1}&x_{1}^{2}&&1&\\&&x_{1}+x_{2}&&0\\x_{2}&x_{2}^{2}&&1&\\&&x_{2}+x_{3}&&&\\x_{3}&x_{3}^{2}&&&\end{matrix}}\\[2ex]
{\begin{matrix}x_{0}&x_{0}^{n}&\\&&\sum _{i=0}^{n-1}x_{0}^{n-1-i}x_{1}^{i}\\x_{1}&x_{1}^{n}&\end{matrix}}\\[2ex]
{\begin{matrix}x_{0}&x_{0}^{n+1}&\\&&{\frac {x_{1}^{n+1}-x_{1}x_{0}^{n}+x_{1}x_{0}^{n}-x_{0}^{n+1}}{x_{1}-x_{0}}}=x_{1}{\frac {x_{1}^{n}-x_{0}^{n}}{x_{1}-x_{0}}}+x_{0}^{n}=x_{1}\sum _{i=0}^{n-1}x_{0}^{n-1-i}x_{1}^{i}+x_{0}^{n}=\sum _{i=0}^{n}x_{0}^{n-i}x_{1}^{i}\\x_{1}&x_{1}^{n+1}&\end{matrix}}\\[2ex]
{\begin{matrix}x_{0}&x_{0}^{n+1}&&\\&&\sum _{i=0}^{n}x_{0}^{n-i}x_{1}^{i}&\\x_{1}&x_{1}^{n+1}&&{\frac {\sum _{i=0}^{n}x_{1}^{n-i}x_{2}^{i}-\sum _{i=0}^{n}x_{0}^{n-i}x_{1}^{i}}{x_{2}-x_{0}}}={\frac {\sum _{i=0}^{n-1}x_{1}^{i}(x_{2}^{n-i}-x_{0}^{n-i})}{x_{2}-x_{0}}}=\sum _{i+j+k=n-1}{x_{0}^{i}x_{1}^{j}x_{2}^{k}}\\&&\sum _{i=0}^{n}x_{1}^{n-i}x_{2}^{i}&\\x_{2}&x_{2}^{n+1}&&\end{matrix}}\\[2ex]
{\begin{matrix}x_{0}&x_{0}^{n+1}&&&\\&&\sum _{i=0}^{n}x_{0}^{n-i}x_{1}^{i}&&\\x_{1}&x_{1}^{n+1}&&\sum _{i+j+k=n-1}{x_{0}^{i}x_{1}^{j}x_{2}^{k}}&\\&&\sum _{i=0}^{n}x_{1}^{n-i}x_{2}^{i}&&{\frac {\sum _{i+j+k=n-1}{x_{1}^{i}x_{2}^{j}x_{3}^{k}}-\sum _{i+j+k=n-1}{x_{0}^{i}x_{1}^{j}x_{2}^{k}}}{x_{3}-x_{0}}}=\sum _{i+j+k+l=n-2}{x_{0}^{i}x_{1}^{j}x_{2}^{k}x_{3}^{l}}\\x_{2}&x_{2}^{n+1}&&\sum _{i+j+k=n-1}{x_{1}^{i}x_{2}^{j}x_{3}^{k}}&\\&&\sum _{i=0}^{n}x_{2}^{n-i}x_{3}^{i}&&\\x_{3}&x_{3}^{n+1}&&&\end{matrix}}\\[2ex]
{\begin{matrix}x_{0}&x_{0}^{3}&&&&\\&&x_{0}^{2}+x_{0}x_{1}+x_{1}^{2}&&\\x_{1}&x_{1}^{3}&&x_{0}+x_{1}+x_{2}&&\\&&x_{1}^{2}+x_{1}x_{2}+x_{2}^{2}&&1&\\x_{2}&x_{2}^{3}&&x_{1}+x_{2}+x_{3}&&0\\&&x_{2}^{2}+x_{2}x_{3}+x_{3}^{2}&&1&\\x_{3}&x_{3}^{3}&&x_{2}+x_{3}+x_{4}&&\\&&x_{3}^{2}+x_{3}x_{4}+x_{4}^{2}&&&\\x_{4}&x_{4}^{3}&&&&\end{matrix}}\\[2ex]
{\begin{matrix}x_{0}&x_{0}^{4}&&&&&\\&&x_{0}^{3}+x_{0}^{2}x_{1}+x_{0}x_{1}^{2}+x_{1}^{3}&&&\\x_{1}&x_{1}^{4}&&x_{0}^{2}+x_{0}x_{1}+x_{1}^{2}+x_{0}x_{2}+x_{1}x_{2}+x_{2}^{2}&&&\\&&x_{1}^{3}+x_{1}^{2}x_{2}+x_{1}x_{2}^{2}+x_{2}^{3}&&x_{0}+x_{1}+x_{2}+x_{3}&\\x_{2}&x_{2}^{4}&&x_{1}^{2}+x_{1}x_{2}+x_{2}^{2}+x_{1}x_{3}+x_{2}x_{3}+x_{3}^{2}&&1&\\&&x_{2}^{3}+x_{2}^{2}x_{3}+x_{2}x_{3}^{2}+x_{3}^{3}&&x_{1}+x_{2}+x_{3}+x_{4}&&0\\x_{3}&x_{3}^{4}&&x_{2}^{2}+x_{2}x_{3}+x_{3}^{2}+x_{2}x_{4}+x_{3}x_{4}+x_{4}^{2}&&1&\\&&x_{3}^{3}+x_{3}^{2}x_{4}+x_{3}x_{4}^{2}+x_{4}^{3}&&x_{2}+x_{3}+x_{4}+x_{5}&&\\x_{4}&x_{4}^{4}&&x_{3}^{2}+x_{3}x_{4}+x_{4}^{2}+x_{3}x_{5}+x_{4}x_{5}+x_{5}^{2}&&&\\&&x_{4}^{3}+x_{4}^{2}x_{5}+x_{4}x_{5}^{2}+x_{5}^{3}&&&&\\x_{5}&x_{5}^{4}&&&&&\end{matrix}}\\[2ex]
{\begin{matrix}x_{0}&x_{0}^{5}&&&&&&\\&&\sum _{i=0}^{4}x_{0}^{4-i}x_{1}^{i}&&&&\\x_{1}&x_{1}^{5}&&\sum _{i+j+k=3}x_{0}^{i}x_{1}^{j}x_{2}^{k}&&&&\\&&\sum _{i=0}^{4}x_{1}^{4-i}x_{2}^{i}&&\sum _{i+j+k+l=2}x_{0}^{i}x_{1}^{j}x_{2}^{k}x_{3}^{l}&&&\\x_{2}&x_{2}^{5}&&\sum _{i+j+k=3}x_{1}^{i}x_{2}^{j}x_{3}^{k}&&\sum _{i=0}^{4}x_{i}&&\\&&\sum _{i=0}^{4}x_{2}^{4-i}x_{3}^{i}&&\sum _{i+j+k+l=2}x_{1}^{i}x_{2}^{j}x_{3}^{k}x_{4}^{l}&&1&\\x_{3}&x_{3}^{5}&&\sum _{i+j+k=3}x_{2}^{i}x_{3}^{j}x_{4}^{k}&&\sum _{i=1}^{5}x_{i}&&0\\&&\sum _{i=0}^{4}x_{3}^{4-i}x_{4}^{i}&&\sum _{i+j+k+l=2}x_{2}^{i}x_{3}^{j}x_{4}^{k}x_{5}^{l}&&1&\\x_{4}&x_{4}^{5}&&\sum _{i+j+k=3}x_{3}^{i}x_{4}^{j}x_{5}^{k}&&\sum _{i=2}^{6}x_{i}&&\\&&\sum _{i=0}^{4}x_{4}^{4-i}x_{5}^{i}&&\sum _{i+j+k+l=2}x_{3}^{i}x_{4}^{j}x_{5}^{k}x_{6}^{l}&&&\\x_{5}&x_{5}^{5}&&\sum _{i+j+k=3}x_{4}^{i}x_{5}^{j}x_{6}^{k}&&&&\\&&\sum _{i=0}^{4}x_{5}^{4-i}x_{6}^{i}&&&&&\\x_{6}&x_{6}^{5}&&&&&&\end{matrix}}
\end{array}
$$
^
$$
\begin{aligned}
{[}y_{0}]&=y_{0}\\
{[}y_{0},y_{1}]&={\frac {y_{1}-y_{0}}{x_{1}-x_{0}}}={\frac {y_{0}}{x_{0}-x_{1}}}+{\frac {y_{1}}{x_{1}-x_{0}}}\\
&=\sum _{j=0}^{1}{\frac {y_{j}}{\prod _{k=0,k\neq j}^{1}(x_{j}-x_{k})}}\\
{[}y_{0},y_{1},y_{2}]&={\frac {{\cfrac {y_{1}}{x_{1}-x_{2}}}+{\cfrac {y_{2}}{x_{2}-x_{1}}}-{\cfrac {y_{0}}{x_{0}-x_{1}}}-{\cfrac {y_{1}}{x_{1}-x_{0}}}}{x_{2}-x_{0}}}\\
&={\frac {y_{0}}{(x_{0}-x_{1})(x_{0}-x_{2})}}+{\frac {y_{1}}{(x_{1}-x_{0})(x_{1}-x_{2})}}+{\frac {y_{2}}{(x_{2}-x_{0})(x_{2}-x_{1})}}\\
&=\sum _{j=0}^{2}{\frac {y_{j}}{\prod _{k=0,k\neq j}^{2}(x_{j}-x_{k})}}\\
{[}y_{0},y_{1},\dots ,y_{n}]&=\sum _{j=0}^{n}{\frac {y_{j}}{\prod _{k=0,k\neq j}^{n}(x_{j}-x_{k})}}\\
{[}y_{0},y_{1},\dots ,y_{n+1}]&={\frac {\sum _{j=1}^{n+1}{\frac {y_{j}}{\prod _{k=1,\,k\neq j}^{n+1}(x_{j}-x_{k})}}-\sum _{j=0}^{n}{\frac {y_{j}}{\prod _{k=0,\,k\neq j}^{n}(x_{j}-x_{k})}}}{x_{n+1}-x_{0}}}\\
&={\frac {{\frac {y_{n+1}}{\prod _{k=1}^{n}(x_{n+1}-x_{k})}}+\sum _{j=1}^{n}y_{j}\left({\frac {1}{\prod _{k=1,\,k\neq j}^{n+1}(x_{j}-x_{k})}}-{\frac {1}{\prod _{k=0,\,k\neq j}^{n}(x_{j}-x_{k})}}\right)-{\frac {y_{0}}{\prod _{k=1}^{n}(x_{0}-x_{k})}}}{x_{n+1}-x_{0}}}\\
&={\frac {{\frac {y_{n+1}}{\prod _{k=1}^{n}(x_{n+1}-x_{k})}}+\sum _{j=1}^{n}y_{j}\left({\frac {x_{j}-x_{0}}{\prod _{k=0,\,k\neq j}^{n+1}(x_{j}-x_{k})}}-{\frac {x_{j}-x_{n+1}}{\prod _{k=0,\,k\neq j}^{n+1}(x_{j}-x_{k})}}\right)-{\frac {y_{0}}{\prod _{k=1}^{n}(x_{0}-x_{k})}}}{x_{n+1}-x_{0}}}\\
&={\frac {{\frac {y_{n+1}}{\prod _{k=1}^{n}(x_{n+1}-x_{k})}}+\sum _{j=1}^{n}y_{j}\left({\frac {x_{n+1}-x_{0}}{\prod _{k=0,\,k\neq j}^{n+1}(x_{j}-x_{k})}}\right)-{\frac {y_{0}}{\prod _{k=1}^{n}(x_{0}-x_{k})}}}{x_{n+1}-x_{0}}}\\
&={\frac {y_{n+1}}{\prod _{k=0}^{n}(x_{n+1}-x_{k})}}+\sum _{j=1}^{n}{\frac {y_{j}}{\prod _{k=0,\,k\neq j}^{n+1}(x_{j}-x_{k})}}+{\frac {y_{0}}{\prod _{k=1}^{n+1}(x_{0}-x_{k})}}\\
&=\sum _{j=0}^{n+1}{\frac {y_{j}}{\prod _{k=0,\,k\neq j}^{n+1}(x_{j}-x_{k})}}
\end{aligned}
$$
^ Xue Yi (ed.). 《數值分析及科學計算》 (Numerical Analysis and Scientific Computing). Chapter 6, Section 2: Newton interpolation. p. 200.
^ Xue Yi (ed.). 《數值分析及科學計算》 (Numerical Analysis and Scientific Computing). Chapter 6, Section 2: Newton interpolation. p. 201.
^
$$
\begin{matrix}x_{0}&x_{0}^{3}&&&&\\&&x_{0}^{2}+x_{0}x_{1}+x_{1}^{2}&&\\x_{1}&x_{1}^{3}&&x_{0}+x_{1}+x_{2}&&\\&&x_{0}^{2}+x_{0}x_{2}+x_{2}^{2}&&1&\\x_{2}&x_{2}^{3}&&x_{0}+x_{1}+x_{3}&&0\\&&x_{0}^{2}+x_{0}x_{3}+x_{3}^{2}&&1&\\x_{3}&x_{3}^{3}&&x_{0}+x_{1}+x_{4}&&\\&&x_{0}^{2}+x_{0}x_{4}+x_{4}^{2}&&&\\x_{4}&x_{4}^{3}&&&&\end{matrix}
$$
^ The Newton Polynomial Interpolation. Retrieved 2019-04-19. (Archived from the original on 2019-04-19.)
^
$$
\begin{array}{l}
{\begin{aligned}N_{1}(x)&=[y_{0}]+[y_{0},y_{1}](x-x_{0})=y_{0}+y_{0}{\frac {x-x_{0}}{x_{0}-x_{1}}}+y_{1}{\frac {x-x_{0}}{x_{1}-x_{0}}}=y_{0}\left(1+{\frac {x-x_{0}}{x_{0}-x_{1}}}\right)+y_{1}{\frac {x-x_{0}}{x_{1}-x_{0}}}\\&=y_{0}{\frac {x-x_{1}}{x_{0}-x_{1}}}+y_{1}{\frac {x-x_{0}}{x_{1}-x_{0}}}=\sum _{j=0}^{1}y_{j}\prod _{k=0,k\neq j}^{1}{\frac {x-x_{k}}{x_{j}-x_{k}}}\end{aligned}}\\[2ex]
{\begin{aligned}N_{n}(x)&=\sum _{j=0}^{n}y_{j}\prod _{k=0,k\neq j}^{n}{\frac {x-x_{k}}{x_{j}-x_{k}}}\end{aligned}}\\[2ex]
{\begin{aligned}N_{n+1}(x)&=N_{n}(x)+[y_{0},y_{1},\ldots ,y_{n+1}]\prod _{k=0}^{n}(x-x_{k})\\
&=\sum _{j=0}^{n}y_{j}\prod _{k=0,k\neq j}^{n}{\frac {x-x_{k}}{x_{j}-x_{k}}}+\sum _{j=0}^{n+1}y_{j}{\frac {\prod _{k=0}^{n}(x-x_{k})}{\prod _{k=0,\,k\neq j}^{n+1}(x_{j}-x_{k})}}\\
&=\sum _{j=0}^{n}y_{j}\left({\frac {\prod _{k=0,k\neq j}^{n}(x-x_{k})}{\prod _{k=0,k\neq j}^{n}(x_{j}-x_{k})}}+{\frac {\prod _{k=0}^{n}(x-x_{k})}{\prod _{k=0,\,k\neq j}^{n+1}(x_{j}-x_{k})}}\right)+y_{n+1}{\frac {\prod _{k=0}^{n}(x-x_{k})}{\prod _{k=0}^{n}(x_{n+1}-x_{k})}}\\
&=\sum _{j=0}^{n}y_{j}\left({\frac {\left(\prod _{k=0,k\neq j}^{n}(x-x_{k})\right)(x_{j}-x_{n+1})}{\prod _{k=0,k\neq j}^{n+1}(x_{j}-x_{k})}}+{\frac {\left(\prod _{k=0,k\neq j}^{n}(x-x_{k})\right)(x-x_{j})}{\prod _{k=0,\,k\neq j}^{n+1}(x_{j}-x_{k})}}\right)+y_{n+1}{\frac {\prod _{k=0}^{n}(x-x_{k})}{\prod _{k=0}^{n}(x_{n+1}-x_{k})}}\\
&=\sum _{j=0}^{n}y_{j}{\frac {\left(\prod _{k=0,k\neq j}^{n}(x-x_{k})\right)(x-x_{n+1})}{\prod _{k=0,k\neq j}^{n+1}(x_{j}-x_{k})}}+y_{n+1}{\frac {\prod _{k=0}^{n}(x-x_{k})}{\prod _{k=0}^{n}(x_{n+1}-x_{k})}}\\
&=\sum _{j=0}^{n+1}y_{j}\prod _{k=0,k\neq j}^{n+1}{\frac {x-x_{k}}{x_{j}-x_{k}}}\end{aligned}}
\end{array}
$$
^ Burden, Richard L.; Faires, J. Douglas. Numerical Analysis. 9th ed. 2011: 129.
^
$$
\begin{aligned}
\triangle ^{k}y_{i}&=\sum _{j=0}^{k}{\frac {k!}{\prod _{l=0,\,l\neq j}^{k}(j-l)}}y_{i+j},\quad 0\leq k\leq n-i\\
&=\sum _{j=0}^{k}{\frac {k!}{j!\,(-1)^{k-j}(k-j)!}}y_{i+j},\quad 0\leq k\leq n-i\\
&=\sum _{j=0}^{k}(-1)^{k-j}{\binom {k}{j}}y_{i+j},\quad 0\leq k\leq n-i
\end{aligned}
$$
^ Methodus Incrementorum Directa et Inversa (archived copy at the Internet Archive)