Data Science Asked by IDK on August 21, 2020
import numpy as np

def hypothesis(W, B, X):
    Z = np.dot(W.T, X) + B
    return sigmoid(Z)

def sigmoid(Z):
    return 1 / (1 + np.exp(-Z))

def cost(A, Y):
    return (-Y * np.log(A) - (1 - Y) * np.log(1 - A)).mean()

def derivatives(W, B, X, Y):
    A = hypothesis(W, B, X)
    return np.dot(X.T, (A - Y)) / Y.shape[0], (A - Y) / Y.shape[0]

def updateParameters(W, B, X, Y, learning_rate):
    # print(W.shape, X.shape, Y.shape)
    dW, dB = derivatives(W, B, X, Y)
    W -= learning_rate * dW
    B -= learning_rate * dB
    return W, B

def LogisticRegression(X, Y):
    W = np.zeros((X.shape[0], 1))
    B = np.zeros((X.shape[0], 1))
    for i in range(0, 1000000):
        print(cost(hypothesis(W, B, X), Y))
        W, B = updateParameters(W, B, X, Y, 0.0000003)

LogisticRegression(X, Y)
I’m a beginner trying to implement logistic regression from scratch.
However, my cost always goes up. When I reduced my learning rate in case it was too high, the cost started going down, but only extremely slowly.
Could someone point out whether there is an issue in the implementation of the algorithm? Any help would be greatly appreciated.
I would refer you to this website, which provides a very extensive and detailed walk-through of the math behind logistic regression and its cost function: https://www.internalpointers.com/post/cost-function-logistic-regression
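For reference, here is that cost function and its gradients written in the notation of the code above (a sketch; I am assuming X has shape (n_features, m) with one column per example, to match Z = W.T @ X + B):

$$A = \sigma(W^\top X + B), \qquad \sigma(z) = \frac{1}{1 + e^{-z}}$$

$$J(W, B) = -\frac{1}{m} \sum_{i=1}^{m} \left[ y^{(i)} \log a^{(i)} + (1 - y^{(i)}) \log\left(1 - a^{(i)}\right) \right]$$

$$\frac{\partial J}{\partial W} = \frac{1}{m} X (A - Y)^\top, \qquad \frac{\partial J}{\partial B} = \frac{1}{m} \sum_{i=1}^{m} \left( a^{(i)} - y^{(i)} \right)$$

Note that under this convention the weight gradient is X(A - Y).T / m and the bias gradient is a single averaged scalar; both are worth comparing against the derivatives function in the question.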
And here is a complete tutorial on how to code the whole thing manually in Python:
https://towardsdatascience.com/building-a-logistic-regression-in-python-301d27367c24
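In case it helps, here is a minimal corrected sketch of the question's approach (my own reading, not the tutorial's code). It keeps the (n_features, m) column layout but makes B a scalar, averages the bias gradient instead of keeping a per-example vector, and computes the weight gradient as X @ (A - Y).T so that it is consistent with the forward pass Z = W.T @ X. The toy data at the bottom is hypothetical, just to show the cost decreasing:

import numpy as np

def sigmoid(Z):
    return 1 / (1 + np.exp(-Z))

def hypothesis(W, B, X):
    # X: (n_features, m), W: (n_features, 1), B: scalar -> A: (1, m)
    return sigmoid(np.dot(W.T, X) + B)

def cost(A, Y):
    # small epsilon guards against log(0) if a prediction saturates
    eps = 1e-15
    return (-Y * np.log(A + eps) - (1 - Y) * np.log(1 - A + eps)).mean()

def derivatives(W, B, X, Y):
    m = Y.shape[1]
    A = hypothesis(W, B, X)
    dW = np.dot(X, (A - Y).T) / m   # (n_features, 1), matches W
    dB = np.sum(A - Y) / m          # scalar, matches B
    return dW, dB

def logistic_regression(X, Y, learning_rate=0.1, iterations=1000):
    W = np.zeros((X.shape[0], 1))
    B = 0.0
    for i in range(iterations):
        dW, dB = derivatives(W, B, X, Y)
        W -= learning_rate * dW
        B -= learning_rate * dB
        if i % 100 == 0:
            print(i, cost(hypothesis(W, B, X), Y))
    return W, B

# Hypothetical toy data: 2 features, 200 examples, one column per example
rng = np.random.default_rng(0)
X = rng.normal(size=(2, 200))
Y = (X[0] + X[1] > 0).astype(float).reshape(1, 200)
W, B = logistic_regression(X, Y)

With the gradients consistent with the forward pass, a learning rate around 0.1 on roughly standardized features converges in hundreds of iterations rather than requiring millions at a tiny step size.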
Finally, unless you need to implement this manually as a learning exercise, there are good ready-to-use implementations that have been widely used and tested, and that would save you a lot of time.
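For example, scikit-learn's LogisticRegression fits the same model in a few lines (note it expects X with one row per example, i.e. shape (n_samples, n_features); the data here is the same hypothetical toy set, transposed):

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))           # one row per example
y = (X[:, 0] + X[:, 1] > 0).astype(int)

clf = LogisticRegression()
clf.fit(X, y)
print(clf.coef_, clf.intercept_)        # learned weights and bias
print(clf.score(X, y))                  # training accuracy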
Good luck!
Answered by Wajdi on August 21, 2020