Data Science: Asked by Burple on December 1, 2020
I am doing some supervised learning using neural networks, and I have a Targets array containing 1906 samples with 664 unique values; by design, each unique value occurs at least twice. Is there a smarter way to split this dataset into train and test using a leave-one-out scheme, i.e. randomly picking one sample from each class for the test set and using the rest for training, before I resort to explicitly iterating over all my values? I am using Python, numpy, sklearn and PyTorch, by the way.
Never mind! Found it!
just doing values, indices, counts = np.unique(y, return_index=True, return_counts=True)
will return, in indices, the index of the first occurrence of each unique value,
and then you can just slice this off as X_test = X[indices],
and remember to remove those rows from the training set with X_train = np.delete(X, indices, axis=0) (np.delete returns a new array rather than modifying X in place).
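For reference, here is a minimal runnable sketch of this split, assuming X is a feature array aligned with the Targets array y (the array names, shapes, and synthetic data are illustrative, not from the original post). Note that return_index gives the first occurrence of each class; the second half shows one way to pick a random sample per class instead, which is closer to what the question originally asked for.

import numpy as np

# Illustrative synthetic data matching the description in the question:
# 664 unique target values, each appearing at least twice, 1906 samples in total.
rng = np.random.default_rng(0)
y = np.concatenate([np.repeat(np.arange(664), 2), rng.integers(0, 664, size=1906 - 1328)])
X = rng.normal(size=(len(y), 10))

# Variant 1: take the FIRST occurrence of each class as the test sample.
values, first_idx, counts = np.unique(y, return_index=True, return_counts=True)
X_test, y_test = X[first_idx], y[first_idx]
X_train = np.delete(X, first_idx, axis=0)   # np.delete returns a copy
y_train = np.delete(y, first_idx)

# Variant 2: take a RANDOM occurrence of each class, by shuffling before np.unique.
perm = rng.permutation(len(y))
_, pos = np.unique(y[perm], return_index=True)
test_idx = perm[pos]                                  # one randomly chosen index per class
train_idx = np.setdiff1d(np.arange(len(y)), test_idx)
X_train, X_test = X[train_idx], X[test_idx]
y_train, y_test = y[train_idx], y[test_idx]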
Answered by Burple on December 1, 2020