Machine Learning week 4 quiz: programming assignment-Multi-class Classification and Neural Networks
This post collects and organizes the code for Machine Learning week 4 quiz: programming assignment-Multi-class Classification and Neural Networks, shared here for reference.
1. ex3.m
%% Machine Learning Online Class - Exercise 3 | Part 1: One-vs-all

%  Instructions
%  ------------
%
%  This file contains code that helps you get started on the
%  linear exercise. You will need to complete the following functions
%  in this exercise:
%
%     lrCostFunction.m (logistic regression cost function)
%     oneVsAll.m
%     predictOneVsAll.m
%     predict.m
%
%  For this exercise, you will not need to change any code in this file,
%  or any other files other than those mentioned above.
%

%% Initialization
clear ; close all; clc

%% Setup the parameters you will use for this part of the exercise
input_layer_size = 400;   % 20x20 Input Images of Digits
num_labels = 10;          % 10 labels, from 1 to 10
                          % (note that we have mapped "0" to label 10)

%% =========== Part 1: Loading and Visualizing Data =============
%  We start the exercise by first loading and visualizing the dataset.
%  You will be working with a dataset that contains handwritten digits.
%

% Load Training Data
fprintf('Loading and Visualizing Data ...\n')

load('ex3data1.mat'); % training data stored in arrays X, y
m = size(X, 1);

% Randomly select 100 data points to display
rand_indices = randperm(m);
sel = X(rand_indices(1:100), :);

displayData(sel);

fprintf('Program paused. Press enter to continue.\n');
pause;

%% ============ Part 2: Vectorize Logistic Regression ============
%  In this part of the exercise, you will reuse your logistic regression
%  code from the last exercise. Your task here is to make sure that your
%  regularized logistic regression implementation is vectorized. After
%  that, you will implement one-vs-all classification for the handwritten
%  digit dataset.
%

fprintf('\nTraining One-vs-All Logistic Regression...\n')

lambda = 0.1;
[all_theta] = oneVsAll(X, y, num_labels, lambda);

fprintf('Program paused. Press enter to continue.\n');
pause;

%% ================ Part 3: Predict for One-Vs-All ================
%  After ...

pred = predictOneVsAll(all_theta, X);

fprintf('\nTraining Set Accuracy: %f\n', mean(double(pred == y)) * 100);
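The key trick in Part 2 is that the 10-class digit problem is reduced to 10 binary problems by relabeling on the fly. A minimal sketch of how the binary labels for one class are formed, using the X and y loaded above (the class index 3 is an arbitrary choice for illustration):

c = 3;                 % pick one of the 10 classes (arbitrary example)
binary_y = (y == c);   % logical vector: 1 for examples of class c, 0 otherwise
fprintf('%d positive examples for class %d\n', sum(binary_y), c);

Each of the 10 classifiers trained by oneVsAll sees one such binary relabeling of the same training set.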
2. ex3_nn.m

%% Machine Learning Online Class - Exercise 3 | Part 2: Neural Networks

%  Instructions
%  ------------
%
%  This file contains code that helps you get started on the
%  linear exercise. You will need to complete the following functions
%  in this exercise:
%
%     lrCostFunction.m (logistic regression cost function)
%     oneVsAll.m
%     predictOneVsAll.m
%     predict.m
%
%  For this exercise, you will not need to change any code in this file,
%  or any other files other than those mentioned above.
%

%% Initialization
clear ; close all; clc

%% Setup the parameters you will use for this exercise
input_layer_size = 400;   % 20x20 Input Images of Digits
hidden_layer_size = 25;   % 25 hidden units
num_labels = 10;          % 10 labels, from 1 to 10
                          % (note that we have mapped "0" to label 10)

%% =========== Part 1: Loading and Visualizing Data =============
%  We start the exercise by first loading and visualizing the dataset.
%  You will be working with a dataset that contains handwritten digits.
%

% Load Training Data
fprintf('Loading and Visualizing Data ...\n')

load('ex3data1.mat');
m = size(X, 1);

% Randomly select 100 data points to display
sel = randperm(size(X, 1));
sel = sel(1:100);

displayData(X(sel, :));

fprintf('Program paused. Press enter to continue.\n');
pause;

%% ================ Part 2: Loading Parameters ================
% In this part of the exercise, we load some pre-initialized
% neural network parameters.

fprintf('\nLoading Saved Neural Network Parameters ...\n')

% Load the weights into variables Theta1 and Theta2
load('ex3weights.mat');

%% ================= Part 3: Implement Predict =================
%  After training the neural network, we would like to use it to predict
%  the labels. You will now implement the "predict" function to use the
%  neural network to predict the labels of the training set. This lets
%  you compute the training set accuracy.

pred = predict(Theta1, Theta2, X);

fprintf('\nTraining Set Accuracy: %f\n', mean(double(pred == y)) * 100);

fprintf('Program paused. Press enter to continue.\n');
pause;

%  To give you an idea of the network's output, you can also run
%  through the examples one at a time to see what it is predicting.

%  Randomly permute examples
rp = randperm(m);

for i = 1:m
    % Display
    fprintf('\nDisplaying Example Image\n');
    displayData(X(rp(i), :));

    pred = predict(Theta1, Theta2, X(rp(i), :));
    fprintf('\nNeural Network Prediction: %d (digit %d)\n', pred, mod(pred, 10));

    % Pause
    fprintf('Program paused. Press enter to continue.\n');
    pause;
end
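Before implementing predict.m, it helps to confirm what ex3weights.mat provides. A quick sanity check; the expected shapes below are not printed by the script itself but follow from the architecture declared above (400 inputs, 25 hidden units, 10 output labels):

load('ex3weights.mat');   % provides Theta1 and Theta2
size(Theta1)              % expected 25 x 401: hidden_layer_size x (input_layer_size + 1)
size(Theta2)              % expected 10 x 26:  num_labels x (hidden_layer_size + 1)

The "+1" in each case is the bias column that predict.m must prepend before multiplying.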
3. lrCostFunction.m

function [J, grad] = lrCostFunction(theta, X, y, lambda)
%LRCOSTFUNCTION Compute cost and gradient for logistic regression with
%regularization
%   J = LRCOSTFUNCTION(theta, X, y, lambda) computes the cost of using
%   theta as the parameter for regularized logistic regression and the
%   gradient of the cost w.r.t. the parameters.

% Initialize some useful values
m = length(y); % number of training examples

% You need to return the following variables correctly
J = 0;                     % 1*1
grad = zeros(size(theta)); % (n+1)*1

% ====================== YOUR CODE HERE ======================
% Instructions: Compute the cost of a particular choice of theta.
%               You should set J to the cost.
%               Compute the partial derivatives and set grad to the partial
%               derivatives of the cost w.r.t. each parameter in theta
%
% Hint: The computation of the cost function and gradients can be
%       efficiently vectorized. For example, consider the computation
%
%           sigmoid(X * theta)
%
%       Each row of the resulting matrix will contain the value of the
%       prediction for that example. You can make use of this to vectorize
%       the cost function and gradient computations.
%
% Hint: When computing the gradient of the regularized cost function,
%       there are many possible vectorized solutions, but one solution
%       looks like:
%           grad = (unregularized gradient for logistic regression)
%           temp = theta;
%           temp(1) = 0;   % because we don't add anything for j = 0
%           grad = grad + YOUR_CODE_HERE (using the temp variable)
%

% Cost: unregularized cross-entropy term
h = sigmoid(X * theta);            % m*1
part1 = y .* log(h);               % m*1
part2 = (1 - y) .* log(1 - h);     % m*1
J_ori = sum(-part1 - part2) / m;   % 1*1

% Regularization term: skip theta(1), the bias parameter
sz_theta = size(theta, 1);
theta_temp = theta(2:sz_theta);
punish_J = sum(theta_temp .^ 2) * lambda / (2 * m);
J = J_ori + punish_J;

% Gradient (err rather than diff, which would shadow a builtin)
err = h - y;                       % m*1
grad_ori = (X' * err) / m;         % (n+1)*m x m*1 -> (n+1)*1
punish_theta = theta_temp * lambda / m;
punish_theta = [0; punish_theta];  % no regularization for j = 0
grad = grad_ori + punish_theta;

% =============================================================

grad = grad(:);

end
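A quick way to gain confidence in this implementation is to compare the analytic gradient against a central-difference numerical gradient on a tiny made-up problem. A minimal sketch, assuming sigmoid.m is on the path (the toy values theta_t, X_t, y_t and lambda_t are arbitrary illustrations, not part of the assignment):

theta_t = [-2; -1; 1; 2];
X_t = [ones(5, 1) reshape(1:15, 5, 3) / 10];
y_t = [1; 0; 1; 0; 1];
lambda_t = 3;

[J, grad] = lrCostFunction(theta_t, X_t, y_t, lambda_t);

% Central-difference approximation of each partial derivative
delta = 1e-4;
num_grad = zeros(size(theta_t));
for j = 1:numel(theta_t)
    e = zeros(size(theta_t));
    e(j) = delta;
    num_grad(j) = (lrCostFunction(theta_t + e, X_t, y_t, lambda_t) - ...
                   lrCostFunction(theta_t - e, X_t, y_t, lambda_t)) / (2 * delta);
end
fprintf('Cost: %f, max |analytic - numerical| gradient gap: %g\n', ...
        J, max(abs(grad - num_grad)));

If the cost and gradient are consistent, the reported gap should be on the order of 1e-9 or smaller.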
4. oneVsAll.m
function [all_theta] = oneVsAll(X, y, num_labels, lambda)
%ONEVSALL trains multiple logistic regression classifiers and returns all
%the classifiers in a matrix all_theta, where the i-th row of all_theta
%corresponds to the classifier for label i
%   [all_theta] = ONEVSALL(X, y, num_labels, lambda) trains num_labels
%   logistic regression classifiers and returns each of these classifiers
%   in a matrix all_theta, where the i-th row of all_theta corresponds
%   to the classifier for label i

% Some useful variables
m = size(X, 1);
n = size(X, 2);

% You need to return the following variables correctly
all_theta = zeros(num_labels, n + 1); % num_labels*(n+1)

% Add ones to the X data matrix
X = [ones(m, 1) X]; % m*(n+1)

% ====================== YOUR CODE HERE ======================
% Instructions: You should complete the following code to train num_labels
%               logistic regression classifiers with regularization
%               parameter lambda.
%
% Hint: theta(:) will return a column vector.
%
% Hint: You can use y == c to obtain a vector of 1's and 0's that tell us
%       whether the ground truth is true/false for this class.
%
% Note: For this assignment, we recommend using fmincg to optimize the cost
%       function. It is okay to use a for-loop (for c = 1:num_labels) to
%       loop over the different classes.
%
%       fmincg works similarly to fminunc, but is more efficient when we
%       are dealing with a large number of parameters.
%
% Example Code for fmincg:
%
%     % Set Initial theta
%     initial_theta = zeros(n + 1, 1);
%
%     % Set options for fminunc
%     options = optimset('GradObj', 'on', 'MaxIter', 50);
%
%     % Run fmincg to obtain the optimal theta
%     % This function will return theta and the cost
%     [theta] = ...
%         fmincg(@(t)(lrCostFunction(t, X, (y == c), lambda)), ...
%                initial_theta, options);
%

for c = 1:num_labels
    initial_theta = zeros(n + 1, 1);
    options = optimset('GradObj', 'on', 'MaxIter', 50);
    [theta] = fmincg(@(t)(lrCostFunction(t, X, (y == c), lambda)), ...
                     initial_theta, options);
    all_theta(c, :) = theta;
end

% =========================================================================

end
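A hypothetical usage sketch, assuming ex3data1.mat has been loaded as in ex3.m and fmincg.m is on the path. Each of the 10 rows of the returned matrix holds the parameters of one binary classifier:

load('ex3data1.mat');                 % provides X (m x 400) and y (m x 1)
all_theta = oneVsAll(X, y, 10, 0.1);  % one classifier per digit class
disp(size(all_theta));                % expected: 10 x 401 (n + 1 with the bias term)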
5. predictOneVsAll.m
function p = predictOneVsAll(all_theta, X)
%PREDICTONEVSALL Predict the label for a trained one-vs-all classifier. The
%labels are in the range 1..K, where K = size(all_theta, 1).
%   p = PREDICTONEVSALL(all_theta, X) will return a vector of predictions
%   for each example in the matrix X. Note that X contains the examples in
%   rows. all_theta is a matrix where the i-th row is a trained logistic
%   regression theta vector for the i-th class. You should set p to a vector
%   of values from 1..K (e.g., p = [1; 3; 1; 2] predicts classes 1, 3, 1, 2
%   for 4 examples)

m = size(X, 1);
num_labels = size(all_theta, 1); % K

% You need to return the following variables correctly
p = zeros(size(X, 1), 1); % m*1

% Add ones to the X data matrix
X = [ones(m, 1) X]; % m*(n+1)

% ====================== YOUR CODE HERE ======================
% Instructions: Complete the following code to make predictions using
%               your learned logistic regression parameters (one-vs-all).
%               You should set p to a vector of predictions (from 1 to
%               num_labels).
%
% Hint: This code can be done all vectorized using the max function.
%       In particular, the max function can also return the index of the
%       max element, for more information see 'help max'. If your examples
%       are in rows, then, you can use max(A, [], 2) to obtain the max
%       for each row.
%

x_theta = X * all_theta'; % m*(n+1) x (n+1)*k -> m*k
for c = 1:m
    max_value = max(x_theta(c, :));
    idx = find(x_theta(c, :) == max_value); % semicolon added: no need to echo
    p(c) = idx(1);                          % idx(1) guards against ties
end

% =========================================================================

end
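The per-row loop works, but the hint in the comments points at a one-line alternative: max with a second output returns the index of the winning class for every row at once. A minimal sketch of that replacement for the loop above (same x_theta as in the function body):

[~, p] = max(x_theta, [], 2);   % p(i) = column index of the largest score in row i

This also resolves ties deterministically, since max returns the first maximizing index.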
6. predict.m

function p = predict(Theta1, Theta2, X)
%PREDICT Predict the label of an input given a trained neural network
%   p = PREDICT(Theta1, Theta2, X) outputs the predicted label of X given the
%   trained weights of a neural network (Theta1, Theta2)

% Useful values
m = size(X, 1);
num_labels = size(Theta2, 1);

% You need to return the following variables correctly
p = zeros(size(X, 1), 1); % m*1

% ====================== YOUR CODE HERE ======================
% Instructions: Complete the following code to make predictions using
%               your learned neural network. You should set p to a
%               vector containing labels between 1 to num_labels.
%
% Hint: The max function might come in useful. In particular, the max
%       function can also return the index of the max element, for more
%       information see 'help max'. If your examples are in rows, then, you
%       can use max(A, [], 2) to obtain the max for each row.
%

X = [ones(m, 1) X]; % add bias column, m*(n+1)

x_theta1 = X * Theta1';            % m*(n+1) x (n+1)*h -> m*h (h = hidden units)
x_theta1 = sigmoid(x_theta1);
x_theta1 = [ones(m, 1) x_theta1];  % add bias column, m*(h+1); semicolon added

x_theta2 = x_theta1 * Theta2';     % m*(h+1) x (h+1)*k -> m*k (k = num_labels)
x_theta2 = sigmoid(x_theta2);

for c = 1:m
    max_value = max(x_theta2(c, :));
    idx = find(x_theta2(c, :) == max_value); % semicolon added: no need to echo
    p(c) = idx(1);                           % idx(1) guards against ties
end

% =========================================================================

end
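As in predictOneVsAll, the final loop can collapse into a single max call. A minimal vectorized sketch of the same forward pass, assuming sigmoid.m is on the path and X holds one example per row without the bias column:

a1 = [ones(m, 1) X];                      % input layer plus bias
a2 = [ones(m, 1) sigmoid(a1 * Theta1')];  % hidden layer activations plus bias
a3 = sigmoid(a2 * Theta2');               % output layer, m x num_labels
[~, p] = max(a3, [], 2);                  % predicted class for every example at once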
7. Submit results

(The screenshot of the submission output did not survive extraction.)

Summary
That is the complete walkthrough of Machine Learning week 4 quiz: programming assignment-Multi-class Classification and Neural Networks. I hope this article helps you solve the problems you run into.