function [theta, hist] = findmin(CF, X, y, theta, alpha, num_iters)
%FINDMIN Performs gradient descent to learn theta
%   [theta, hist] = FINDMIN(CF, X, y, theta, alpha, num_iters) updates theta
%   by taking num_iters gradient steps with learning rate alpha. CF is a
%   cost-function handle of the form [J, g] = CF(theta, X, y), returning the
%   cost J and gradient g at theta. hist records theta at each iteration,
%   with row 1 holding the initial theta.

hist = zeros(num_iters + 1, length(theta));
hist(1, :) = theta';

for iter = 1:num_iters
    % Take a single gradient step on the parameter vector theta.
    %
    % Hint: while debugging, it can be useful to print out the values
    % of the cost J and gradient g computed here.
    [J, g] = CF(theta, X, y);
    theta = theta - alpha * g;
    hist(iter + 1, :) = theta';
end

end
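% Example usage (a sketch, kept as comments since function files cannot
% contain script code; the least-squares cost handle below is a
% hypothetical illustration, not part of this file):
%
%   m = size(X, 1);
%   cf = @(theta, X, y) deal( ...
%       (1 / (2 * m)) * sum((X * theta - y) .^ 2), ...   % cost J
%       (1 / m) * (X' * (X * theta - y)));               % gradient g
%   [theta, hist] = findmin(cf, X, y, zeros(size(X, 2), 1), 0.01, 400);
%
% Plotting hist column-wise then shows how each parameter converges.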