Assignment Sheet 3 of MTH3330
Instructions: Your completed assignment must be submitted before 4pm on Friday
of week 9. Non-executable code will not be marked.
Assignment 3.1. (gradient descent, 23 marks)
In this task you will explore how the behaviour of the gradient descent
method on quadratic functions depends on their geometrical features.
a) Write a function s=myArmijo(f,gf,x,beta,sigma) in Matlab which
computes the Armijo step-size given a function f, its gradient ∇f, a
point x and parameters β, σ ∈ (0, 1).
b) Test your function from part a) on the data
f(x) = (1/2) (x1, x2) [20 1; 1 2] (x1, x2)^T + (1, 0) (x1, x2)^T,
x = (1, 1)^T, β = σ = 1/2.
The correct result is s = 0.03125.
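For reference, the Armijo rule chooses s = β^m for the smallest m ≥ 0 such that f(x − s∇f(x)) ≤ f(x) − σs‖∇f(x)‖² along the descent direction d = −∇f(x). A minimal Python stand-in for the requested Matlab routine (the assignment itself must be handed in as Matlab code) reproduces the check value from part b):

```python
# Illustrative Python stand-in for myArmijo(f, gf, x, beta, sigma).
# Returns s = beta^m for the smallest m >= 0 satisfying the Armijo
# condition f(x - s*g) <= f(x) - sigma*s*||g||^2 with g = grad f(x).
def my_armijo(f, gf, x, beta, sigma):
    g = gf(x)
    s = 1.0
    while f([xi - s*gi for xi, gi in zip(x, g)]) > f(x) - sigma * s * sum(gi*gi for gi in g):
        s *= beta
    return s

# Data of part b): f(x) = (1/2) x^T [20 1; 1 2] x + (1, 0) x
f = lambda x: 0.5*(20*x[0]**2 + 2*x[0]*x[1] + 2*x[1]**2) + x[0]
gf = lambda x: [20*x[0] + x[1] + 1.0, x[0] + 2*x[1]]
s = my_armijo(f, gf, [1.0, 1.0], 0.5, 0.5)
print(s)  # 0.03125
```

At x = (1, 1)^T the gradient is (22, 3)^T, and the condition first holds at m = 5, giving s = 2^−5 = 0.03125 as stated.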
c) Write a
function recx=mygradient(f,gf,x0,beta,sigma,tol)
in Matlab which carries out gradient descent with Armijo step-size rule
until the stopping criterion ‖∇f(xk)‖ ≤ tol is satisfied, and which
returns the matrix recx = [x0, . . . , xk] of all iterates.
d) Apply your function from part c) to the data
fj(x) = (1/2) (x1, x2) Qj (x1, x2)^T + (1, 0) (x1, x2)^T,
x0 = (0.2, 1)^T, β = σ = 1/2, tol = 10^−4,
with the matrices
Q1 = [2 1; 1 2], Q2 = [20 1; 1 2] and Q3 = [200 1; 1 2].
Visualise the result in a figure with three panels in the following way.
– For each j = 1, 2, 3, generate a contour plot of the function fj.
– Plot your iterates into this contour plot. Indicate their positions
by circles, which are connected by lines.
– Indicate the position of the exact minimum of fj by an x. Hint:
For symmetric matrices Q ∈ R^{n×n}, we have ∇((1/2) x^T Q x) = Qx.
– Use the command axis equal to ensure that the geometry of the
function is displayed correctly.
– Provide suitable axis labels and a title containing the relevant
information, including the number of iterations needed to achieve
the error tolerance.
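The descent loop of part c) can be sketched in Python as well (again an illustrative stand-in for the requested Matlab routine, here tested on Q1 from part d; by the hint above, the exact minimiser solves Q1 x = −(1, 0)^T, i.e. x* = (−2/3, 1/3)^T):

```python
# Illustrative Python stand-in for mygradient(f, gf, x0, beta, sigma, tol).
def my_armijo(f, gf, x, beta, sigma):
    # smallest power of beta satisfying the Armijo condition along d = -grad f(x)
    g = gf(x)
    s = 1.0
    while f([xi - s*gi for xi, gi in zip(x, g)]) > f(x) - sigma * s * sum(gi*gi for gi in g):
        s *= beta
    return s

def my_gradient(f, gf, x0, beta, sigma, tol):
    # gradient descent with Armijo step size; returns the list of all iterates
    recx = [list(x0)]
    x = list(x0)
    while sum(gi*gi for gi in gf(x))**0.5 > tol:
        s = my_armijo(f, gf, x, beta, sigma)
        x = [xi - s*gi for xi, gi in zip(x, gf(x))]
        recx.append(x)
    return recx

# Data of part d) with Q1 = [2 1; 1 2]; exact minimiser x* = (-2/3, 1/3).
Q = [[2.0, 1.0], [1.0, 2.0]]
f = lambda x: 0.5*(Q[0][0]*x[0]**2 + 2*Q[0][1]*x[0]*x[1] + Q[1][1]*x[1]**2) + x[0]
gf = lambda x: [Q[0][0]*x[0] + Q[0][1]*x[1] + 1.0, Q[1][0]*x[0] + Q[1][1]*x[1]]
recx = my_gradient(f, gf, [0.2, 1.0], 0.5, 0.5, 1e-4)
```

Since ∇fj(x) = Qj(x − x*) and the smallest eigenvalue of Q1 is 1, the stopping criterion ‖∇f(xk)‖ ≤ 10^−4 guarantees that the final iterate lies within 10^−4 of x*.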
e) Explain your results from part d). Describe briefly how the geometry of
the functions fj affects the performance of the gradient descent method.
f) Relate the results from part d) to a statement from the lectures/notes.
Assignment 3.2. (linear and nonlinear regression, 27 marks)
Sometimes we have reason to believe that given data (xi, yi), i = 1, . . . , N,
in R^2 can be represented up to a tolerable error by a function
u : R × R^m → R, (x, p) ↦ u(x, p)
with unknown parameters p ∈ R^m (see, e.g., the next exercise). In this
situation, we would like to determine the best possible choice of the
parameters p. Usually, this is achieved by solving the optimisation problem
minimise f(p) := (1/N) Σ_{i=1}^N (u(xi, p) − yi)^2, (1)
which aims at minimising the errors between the values u(xi, p) predicted by
the model function u and the given data yi. This problem is often referred
to as the regression problem.
a) In the particular case where the function u is affine linear, i.e. when
u(x, p) = p2x + p1, this problem is called the linear regression problem.
a1) Compute the gradient ∇f of the objective function f for linear
regression with pen and paper.
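For self-checking only (the task asks for the derivation itself): differentiating (1) with u(x, p) = p2x + p1 via the chain rule gives

```latex
\nabla f(p) = \frac{2}{N} \sum_{i=1}^{N} \bigl(p_1 + p_2 x_i - y_i\bigr)
\begin{pmatrix} 1 \\ x_i \end{pmatrix}.
```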
a2) Write a function p=myLinearRegression(x,y,p0) in Matlab
which computes the parameters p = (p1, p2) solving problem (1)
for given data x = (x1, . . . , xN ) and y = (y1, . . . , yN ) and given
initial guess p0 ∈ R^2. Use the result of part a1) and either your
own gradient descent algorithm from the previous exercises or the
supplied obfuscated function
gradient_descent(f,gf,x0,beta,sigma,tol),
which does exactly the same. Set β = σ = 0.5 in the Armijo line
search and terminate when ‖∇f(pk)‖ ≤ 0.005.
a3) Apply your algorithm to the supplied data vectors x and y and
the deliberately badly chosen initial guess p0 = (0, 0).
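The supplied data vectors are not reproduced here; as a plausibility check, a Python sketch of the a2) routine on synthetic data drawn from an exact line (an illustrative stand-in, not the required Matlab implementation) looks like:

```python
# Illustrative Python stand-in for myLinearRegression(x, y, p0).
# The data below is synthetic (y = 1 + 2x exactly), not the supplied vectors.
def armijo(f, gf, p, beta, sigma):
    g = gf(p)
    s = 1.0
    while f([pi - s*gi for pi, gi in zip(p, g)]) > f(p) - sigma * s * sum(gi*gi for gi in g):
        s *= beta
    return s

def my_linear_regression(x, y, p0, beta=0.5, sigma=0.5, tol=0.005):
    N = len(x)
    # objective (1): f(p) = (1/N) sum_i (p1 + p2*xi - yi)^2
    f = lambda p: sum((p[0] + p[1]*xi - yi)**2 for xi, yi in zip(x, y)) / N
    # gradient from part a1): (2/N) sum_i (p1 + p2*xi - yi) * (1, xi)^T
    def gf(p):
        r = [p[0] + p[1]*xi - yi for xi, yi in zip(x, y)]
        return [2*sum(r)/N, 2*sum(ri*xi for ri, xi in zip(r, x))/N]
    p = list(p0)
    while sum(gi*gi for gi in gf(p))**0.5 > tol:
        s = armijo(f, gf, p, beta, sigma)
        p = [pi - s*gi for pi, gi in zip(p, gf(p))]
    return p

x = [0.0, 0.25, 0.5, 0.75, 1.0]
y = [1.0 + 2.0*xi for xi in x]          # exact line, so the minimiser is (1, 2)
p = my_linear_regression(x, y, [0.0, 0.0])
```

Because the synthetic data lies exactly on a line, the computed parameters should land close to p = (1, 2) once ‖∇f(pk)‖ ≤ 0.005.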
a4) The aim of this task is to visualise in which sense the iterates of
the gradient descent algorithms provide better and better approximations
to the best affine linear function representing the data.
Create a plot with four panels for k = 0, 3, 6, 9, which display the
data set in the form of blue crosses and the graph of u(·, pk) as a
red line, where pk = (pk,1, pk,2) is the k-th iterate of your run of
the gradient descent algorithm from task a3). Provide axis labels,
and show the number of the iterate and the values of f(pk) and
‖∇f(pk)‖ in the title of each panel. You may find the commands
s=sprintf(…) and title(s) helpful.
b) In the general case, it is not clear what the gradient ∇f of the objective
function is. (In the next exercise, e.g., the function u is given in terms
of the solution of a differential equation and not explicitly available.)
The simplest solution to this problem is to approximate the gradient
by numerical differentiation (see MTH3051 for details).
b1) Write a function p=myNonlinearRegression(x,y,p0) in Matlab
which computes the parameters p ∈ Rm solving problem (1)
for given data x = (x1, . . . , xN ) and y = (y1, . . . , yN ) and given
initial guess p0 ∈ R^m. Use the supplied function
num_diff(f,x),
which approximates the derivative Df(x) ∈ R^{n×m} of an arbitrary
function f ∈ C^3(R^m, R^n) at x ∈ R^m, and your own code or the
supplied function gradient_descent(…) to achieve this aim.
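The supplied num_diff is not reproduced here; a minimal central-difference sketch of the same idea (in Python, as an illustration of what such a routine does) perturbs one coordinate at a time:

```python
# Illustrative sketch of a num_diff-style routine: approximate the Jacobian
# Df(x) of f : R^m -> R^n by central differences, one coordinate at a time.
# (The actual supplied function may differ in signature and method.)
def num_diff(f, x, h=1e-6):
    n = len(f(x))
    m = len(x)
    J = [[0.0]*m for _ in range(n)]
    for j in range(m):
        xp = list(x); xp[j] += h
        xm = list(x); xm[j] -= h
        fp, fm = f(xp), f(xm)
        for i in range(n):
            J[i][j] = (fp[i] - fm[i]) / (2*h)
    return J

# Check on f(x) = (x1^2, x1*x2), whose exact Jacobian is [[2*x1, 0], [x2, x1]].
J = num_diff(lambda x: [x[0]**2, x[0]*x[1]], [2.0, 3.0])
```

The central-difference error is O(h^2), so for smooth f the entries of J agree with the exact Jacobian [[4, 0], [3, 2]] to high accuracy.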
b2) Apply your algorithm to the supplied data vectors x and y and
the function
u(x, p) = p1 / (1 + e^(−p2(x − p3))).
Run your iteration with β = σ = 0.5 and the bad initial guess
p0 = (0, 0, 4), and terminate when ‖∇f(pk)‖ ≤ 0.005.
b3) Create a plot with four panels for k = 0, 70, 140, 210, which display
the data set in the form of blue crosses and the graph of u(·, pk) as
a red line, where pk is the k-th iterate of your run of the gradient
descent algorithm from task b2). Provide axis labels, and show
the number of the iterate and the values of f(pk) and ‖∇f(pk)‖ in
the title of each panel. (You will see how the height, the length of
the profile and the inflection point of the estimated function are
incrementally improved along the iteration.)
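The roles of the three parameters mentioned in b3) can be checked directly on the model: p1 is the height (the limit of u as x → +∞), p3 is the inflection point, where u takes the value p1/2, and p2 controls the steepness. A small Python illustration (with hypothetical parameter values, not a fit of the supplied data):

```python
import math

# The logistic model u(x, p) = p1 / (1 + exp(-p2*(x - p3))) from task b2).
u = lambda x, p: p[0] / (1.0 + math.exp(-p[1]*(x - p[2])))

p = [2.0, 5.0, 1.0]    # hypothetical parameters for illustration only
at_inflection = u(p[2], p)   # = p1/2 = 1.0, since the exponent vanishes at x = p3
at_plateau = u(100.0, p)     # approaches the height p1 = 2.0 for large x
```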