HW 2 Solution

The document contains 4 homework problems related to optimization techniques in machine learning. Problem 1 involves Lagrange multipliers with equality constraints. Problem 2 involves Lagrange multipliers with inequality constraints. Problem 3 asks to reformulate the optimization problem for soft margin support vector machines using Lagrange multipliers. Problem 4 involves calculating class probabilities for a Naive Bayes classifier given training examples.


ISE 233 HW #2

Due 11:59 PM on April 4th, 2021 (Sunday)

Problem 1 (Lagrange multiplier with equality constraints): Find the maximum and minimum of $f(x, y, z) = 4y - 2z$ subject to the constraints $2x - y - z = 2$ and $x^2 + y^2 = 1$.
Solution:
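A sketch of the standard approach, using one multiplier per equality constraint:

\[
\mathcal{L}(x, y, z, \lambda, \mu) = 4y - 2z - \lambda (2x - y - z - 2) - \mu (x^2 + y^2 - 1).
\]

Setting the partial derivatives to zero gives

\[
\frac{\partial \mathcal{L}}{\partial x} = -2\lambda - 2\mu x = 0, \qquad
\frac{\partial \mathcal{L}}{\partial y} = 4 + \lambda - 2\mu y = 0, \qquad
\frac{\partial \mathcal{L}}{\partial z} = -2 + \lambda = 0.
\]

The last equation forces $\lambda = 2$, so $x = -2/\mu$ and $y = 3/\mu$. Substituting into $x^2 + y^2 = 1$ gives $\mu^2 = 13$, hence the candidate points $(x, y) = (\mp 2/\sqrt{13}, \pm 3/\sqrt{13})$ with $z = 2x - y - 2$. Evaluating $f = 4y - 2z = 6y - 4x + 4$ at both candidates yields a maximum of $4 + 2\sqrt{13}$ and a minimum of $4 - 2\sqrt{13}$.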

Problem 2 (Lagrange multiplier with inequality constraints): Minimize $f(x, y) = x^2 + y^2$ subject to $g(x, y) = 2x + y \le 2$.
Solution:
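A sketch using the KKT conditions, writing the constraint as $2x + y - 2 \le 0$ with multiplier $\lambda \ge 0$:

\[
\mathcal{L}(x, y, \lambda) = x^2 + y^2 + \lambda (2x + y - 2),
\qquad \lambda \ge 0,
\qquad \lambda (2x + y - 2) = 0.
\]

Stationarity gives $2x + 2\lambda = 0$ and $2y + \lambda = 0$. If the constraint were active ($\lambda > 0$), then $x = -\lambda$ and $y = -\lambda/2$, and $2x + y = 2$ would force $\lambda = -4/5 < 0$, a contradiction. Hence $\lambda = 0$, the constraint is inactive, and the minimizer is the unconstrained one, $(x, y) = (0, 0)$, which is feasible since $0 \le 2$; the minimum value is $f(0, 0) = 0$.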
Problem 3 (Support Vector Machine): Recall the soft margin SVM discussed in class. The original (primal) optimization problem for the soft margin SVM is

Minimize $\frac{1}{2}\mathbf{w}^T \mathbf{w} + C \sum_i \xi_i$
Subject to $y_i(\mathbf{w}^T \mathbf{x}_i + b) \ge 1 - \xi_i$ and $\xi_i \ge 0$ for all $i$.

Please show that this problem can be reformulated as the dual problem below by using Lagrange multipliers.

Maximize $\sum_i \alpha_i - \frac{1}{2} \sum_i \sum_j \alpha_i \alpha_j y_i y_j \mathbf{x}_i^T \mathbf{x}_j$
Subject to $\sum_i \alpha_i y_i = 0$, and $0 \le \alpha_i \le C$ for all $i$.
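Solution: A sketch of the standard derivation. Introduce multipliers $\alpha_i \ge 0$ for the margin constraints and $\mu_i \ge 0$ for $\xi_i \ge 0$, and form the Lagrangian

\[
\mathcal{L}(\mathbf{w}, b, \boldsymbol{\xi}, \boldsymbol{\alpha}, \boldsymbol{\mu})
= \frac{1}{2}\mathbf{w}^T \mathbf{w} + C \sum_i \xi_i
- \sum_i \alpha_i \big[ y_i(\mathbf{w}^T \mathbf{x}_i + b) - 1 + \xi_i \big]
- \sum_i \mu_i \xi_i.
\]

Setting the derivatives with respect to the primal variables to zero gives

\[
\frac{\partial \mathcal{L}}{\partial \mathbf{w}} = 0 \;\Rightarrow\; \mathbf{w} = \sum_i \alpha_i y_i \mathbf{x}_i, \qquad
\frac{\partial \mathcal{L}}{\partial b} = 0 \;\Rightarrow\; \sum_i \alpha_i y_i = 0, \qquad
\frac{\partial \mathcal{L}}{\partial \xi_i} = 0 \;\Rightarrow\; \mu_i = C - \alpha_i.
\]

Substituting $\mathbf{w} = \sum_i \alpha_i y_i \mathbf{x}_i$ back into $\mathcal{L}$ makes the $b$ and $\xi_i$ terms vanish (using the two conditions above) and leaves exactly the dual objective $\sum_i \alpha_i - \frac{1}{2} \sum_i \sum_j \alpha_i \alpha_j y_i y_j \mathbf{x}_i^T \mathbf{x}_j$. The dual constraints follow from $\sum_i \alpha_i y_i = 0$ and from $\alpha_i \ge 0$, $\mu_i = C - \alpha_i \ge 0$, which combine into $0 \le \alpha_i \le C$.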

Problem 4 (Naïve Bayes Classifier): You are a robot in an animal shelter, and must
learn to discriminate Dogs from Cats. You choose to learn a Naïve Bayes classifier. You
are given the following examples:

Consider a new example (Sound = Bark ∧ Fur = Coarse ∧ Color = Brown). Compute the class probabilities for this example.
Solution:
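A sketch of the computation; the actual numbers must be read off the table of training examples. Under the Naive Bayes assumption the attributes are conditionally independent given the class, so

\[
P(\text{Dog} \mid \text{Bark}, \text{Coarse}, \text{Brown}) \propto
P(\text{Dog})\, P(\text{Bark} \mid \text{Dog})\, P(\text{Coarse} \mid \text{Dog})\, P(\text{Brown} \mid \text{Dog}),
\]

and analogously for Cat. Each factor is estimated as a relative frequency in the training set, e.g. $P(\text{Bark} \mid \text{Dog})$ is the fraction of Dog examples with Sound = Bark. The two unnormalized scores share the normalizer $P(\text{Bark}, \text{Coarse}, \text{Brown})$, so they can be compared directly, or divided by their sum to report the class probabilities.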
