Multivariable Optimization
April 7, 2024
$$\frac{\partial^2 f}{\partial y\,\partial x} = 1$$

$$\frac{\partial^2 f}{\partial y^2} = 6y + 6$$
Then the Hessian matrix of $f$ will be:

$$J = \begin{bmatrix} 6x + 4 & 1 \\ 1 & 6y + 6 \end{bmatrix}$$
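The code that follows locates a stationary point by applying Newton's method to the gradient equation $\nabla f = 0$. With $X_k$ denoting the $k$-th iterate (notation introduced here only for reference), each step performs the standard Newton update

$$X_{k+1} = X_k - J(X_k)^{-1}\,\nabla f(X_k),$$

realized in the loop below by solving $J(X_k)\,\Delta X = \nabla f(X_k)$ and setting $X_{k+1} = X_k - \Delta X$.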
Let’s write code to solve the problem from first principles. First, we define the function and its derivatives:
[2]: # Import important libraries.
import numpy as np
import matplotlib.pyplot as plt

# Objective function f evaluated at X = (x, y).
def f(X):
    return X[0]**3 + X[1]**3 + 2*X[0]**2 + 3*X[1]**2 + X[0]*X[1] + X[0] + X[1] + 5

# First partial derivatives of f.
def dfdx(X):
    return 3*X[0]**2 + 4*X[0] + X[1] + 1
def dfdy(X):
    return 3*X[1]**2 + 6*X[1] + X[0] + 1

# Hessian matrix of f.
def J(X):
    return np.array([[6*X[0] + 4, 1], [1, 6*X[1] + 6]])

# Gradient vector of f.
def df(X):
    return np.array([dfdx(X), dfdy(X)])
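As a quick sanity check of these definitions, they can be evaluated at a test point (the origin is used here as a hypothetical choice, not taken from the notebook):

[ ]: # Evaluate f, its gradient, and its Hessian at the origin (hypothetical test point).
print(f([0, 0]))    # 5
print(df([0, 0]))   # [1 1]
print(J([0, 0]))    # [[4 1]
                    #  [1 6]]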
# Newton iteration for a stationary point (the initial guess and tolerance are assumed values).
X0 = np.array([0.0, 0.0])
err = 1.0; tol = 1e-8
while err > tol:
    dX = np.linalg.solve(J(X0), df(X0))   # solve J(X0) dX = grad f(X0)
    X0 = X0 - dX                          # Newton update
    x0 = X0[0]; y0 = X0[1]
    err = np.linalg.norm(dX)
# Second-derivative test at the converged point (the minimum/maximum checks are assumed; det = 0 would be inconclusive).
D = np.linalg.det(J(X0)); fxx = J(X0)[0, 0]
if D > 0 and fxx > 0:
    print('The stationary value is a local minimum.')
elif D > 0 and fxx < 0:
    print('The stationary value is a local maximum.')
else:
    print('The stationary value is a saddle point.')
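As an optional cross-check (SciPy is not used in the code above), the same stationary point can be obtained by root-finding the gradient directly; the result should agree with the Newton iterate X0:

[ ]: # Cross-check with SciPy's root finder (an added verification, not part of the original code).
from scipy.optimize import root
sol = root(df, x0=[0.0, 0.0], jac=J)   # Jacobian of grad f is the Hessian J
print(sol.x)                           # stationary point; should match X0
print(df(sol.x))                       # gradient is approximately zero here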
# Surface plot of f (the plotting grid below is an assumed choice).
x = np.linspace(-3, 3, 200)
X = np.meshgrid(x, x)
Z = f(X)                                  # f broadcasts over the grid arrays
plt.rcParams['figure.figsize'] = [20, 15]
ax = plt.axes(projection="3d")
surf = ax.plot_surface(X[0], X[1], Z, cmap='viridis', alpha=0.7,
                       linewidth=0, antialiased=True, shade=True)
plt.show()