Computing Cross Entropy and the derivative of Softmax
Hi everyone,
I am trying to manually code a three-layer multiclass neural net that has softmax activation in the output layer and cross-entropy loss. I think my code for the derivative of softmax is correct; currently I have
function delta_softmax = grad_softmax(z)
% Jacobian of softmax for a column vector z: d s_i / d z_j = s_i*(delta_ij - s_j)
s = ssmax(z);                        % softmax output, n x 1
delta = eye(numel(z));               % identity matrix (Kronecker delta), n x n
delta_softmax = s .* (delta - s.');  % equivalent to diag(s) - s*s.'
end
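For reference, a quick finite-difference check of the Jacobian looks like the sketch below; it assumes ssmax is the softmax helper used above and that it returns a column vector for a column input.
n = 5;
z = randn(n, 1);
J = grad_softmax(z);                 % analytic Jacobian, n x n
Jfd = zeros(n);
h = 1e-6;
for j = 1:n
    e = zeros(n, 1); e(j) = h;
    Jfd(:, j) = (ssmax(z + e) - ssmax(z - e)) / (2*h);  % central difference
end
max(abs(J(:) - Jfd(:)))              % should be on the order of 1e-9 or smaller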
However, I am having trouble converting the Python code for the cross-entropy loss to MATLAB. The Python code is
def cross_entropy(X, y):
    """
    X is the output from the fully connected layer (num_examples x num_classes)
    y is labels (num_examples x 1)
    """
    m = y.shape[0]
    p = softmax(X)
    log_likelihood = -np.log(p[range(m), y])
    loss = np.sum(log_likelihood) / m
    return loss
whereas my MATLAB code is
function loss = cross_entropy(X, y)
m = size(y, 1);          % number of examples
v = ssmax(X);            % softmax probabilities, m x num_classes
y = y + ones(m, 1);      % shift zero-based labels to MATLAB's one-based indexing
v = v(1:m, y);           % <-- picks an m x m block (all rows, columns y), not one entry per row
llhood = -log(v);
loss = sum(llhood) / m;
end
However, I run into a problem when I try to use the same indexing convention as the Python code: I am working with a large data set, so MATLAB throws a size error.
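For what it's worth, one way to mirror NumPy's p[range(m), y] indexing without building an m-by-m block is linear indexing with sub2ind. The sketch below is only an illustration; it assumes, as above, that ssmax returns the row-wise softmax probabilities (num_examples x num_classes) and that y holds zero-based class labels like the Python version.
function loss = cross_entropy_sketch(X, y)
% Average negative log-likelihood of the true classes (hypothetical helper).
m = size(y, 1);
p = ssmax(X);                               % m x num_classes probabilities
idx = sub2ind(size(p), (1:m)', y + 1);      % linear index of (row i, column y(i)+1)
loss = -sum(log(p(idx))) / m;
end
Selecting the entries through a single linear index keeps the intermediate result at m x 1 rather than m x m, which is presumably what triggers the size error on a large data set.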
Answers (1)
Greg Heath
on 6 May 2018
Search both comp.soft-sys.matlab and ANSWERS for "greg crossentropy".
Hope this helps.
Thank you for formally accepting my answer
Greg