CSE 535: Homework #1

Due: Fri, Oct 22
Gradescope Link

1. Convexity [14 points]

  1. [9 points] Show that the following sets are convex:

    • The $n$-dimensional probability simplex $\Delta_n = \{ x \in \mathbb{R}^n : x_i \ge 0 \text{ for all } i,\ \sum_{i=1}^n x_i = 1 \}$.

    • The set , where and is an positive semidefinite matrix.

      Describe what the set looks like for the case and .

    • Show that the following set is not convex:

  2. [8 points] Show that the following functions are convex, or give an example showing non-convexity. (A reminder of the definition of convexity follows this list.)

    • $f_1(x) = \max(x,0)$

    • for

    • where is an symmetric matrix.

    • .
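
    As a reminder, stated here in generic notation (which need not match the problem's own symbols): a function $f$ is convex if

    $$f(\lambda x + (1-\lambda) y) \;\le\; \lambda f(x) + (1-\lambda) f(y) \qquad \text{for all } x, y \text{ in the domain and all } \lambda \in [0,1].$$

    For $f$ twice differentiable on an open convex domain, this is equivalent to $\nabla^2 f(x) \succeq 0$ at every $x$ in the domain.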

2. Regularized gradient descent steps [12 points]

In this problem, you should think (intuitively) of the direction $g$ as equal to the gradient $\nabla f(x)$, so that each of these computations offers a different “gradient descent step.”
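
As a point of reference, and in generic notation ($g$ for the direction and $\eta > 0$ for a step size, which need not match the notation of the parts below), the classical Euclidean-regularized step is

$$\delta^\star \;=\; \arg\min_{\delta \in \mathbb{R}^n} \Big\{ \langle g, \delta \rangle + \tfrac{1}{2\eta} \|\delta\|_2^2 \Big\} \;=\; -\eta g,$$

obtained by setting the gradient of the objective, $g + \delta/\eta$, to zero; with $g = \nabla f(x)$ this recovers the usual update $x \mapsto x - \eta \nabla f(x)$. The parts below consider variants of this kind of computation.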

  1. Compute a closed form expression (and show your work!) for the minimizer of

    where and .

  2. Recall that . Compute a closed form expression (and show your work!) for the minimizer of

    where and , and .

  3. (*) Compute a closed form expression (and show your work!) for the minimizer of

    for every , where and .

3. Residual neural networks [30 points]

For this problem, recall that every matrix $A$ admits a singular value decomposition $A = U \Sigma V^\top$, where $U$ and $V$ are orthogonal matrices and $\Sigma$ is a diagonal matrix with the singular values of $A$ on the diagonal. We use $\sigma_{\min}(A)$ to denote the smallest singular value of $A$, and $\sigma_{\max}(A)$ to denote the largest singular value of $A$. Also recall the Frobenius norm $\|A\|_F = \big( \sum_{i,j} A_{ij}^2 \big)^{1/2}$.
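
Two standard facts relating these quantities, stated as general background rather than as part of the problem: for any matrix $A$,

$$\|A\|_F^2 \;=\; \operatorname{tr}(A^\top A) \;=\; \sum_i \sigma_i(A)^2, \qquad \sigma_{\max}(A) \;=\; \max_{\|x\|_2 = 1} \|Ax\|_2,$$

and consequently $\sigma_{\max}(A) \le \|A\|_F$.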

  1. [4 points] Given points and a matrix , define the function

    where .

    Define and show that

    where .

  2. [6 points] Show that for any ,

    where we used the notation

  3. [4 points] Using part 2, show that

  4. [8 points] Assume now that . Show that

    You may use the fact that for matrices , it holds that .

  5. [8 points] Show that any local minimum of within the domain is actually a global minimum of on .