Asked by Souradeep Nanda on August 24, 2021
I have been messing around in TensorFlow Playground. One of the input datasets is a spiral. No matter what input parameters I choose, and no matter how wide and deep I make the neural network, I cannot fit the spiral. How do data scientists fit data of this shape?
There are many approaches to this kind of problem. The most obvious one is to create new features. The best feature I can come up with is transforming the Cartesian coordinates into polar coordinates.
I have not found a way to do that in the Playground, so I just created a few features that should help (the sin features). After 500 iterations the loss saturates and fluctuates around 0.1. This suggests that no further improvement will come from training alone, and that I should most probably make the hidden layer wider or add another layer.
Not surprisingly, after adding just one neuron to the hidden layer you easily get down to 0.013 after 300 iterations. A similar thing happens when adding a new layer (0.017, but after a significantly longer 500 iterations, which is also no surprise, as the errors are harder to propagate through a deeper network). You can most probably play with the learning rate or use an adaptive learning rate to make it converge faster, but that is not the point here.
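A rough sketch of this kind of setup in Keras, outside the Playground; the spiral generator, the sin-feature choice, the layer width, and the optimizer below are my assumptions rather than the exact Playground configuration:

```python
import numpy as np
import tensorflow as tf

def make_spiral(n=200, noise=0.3, seed=0):
    """Two interleaved spiral arms labelled 0 and 1, roughly like the Playground dataset."""
    rng = np.random.default_rng(seed)
    t = np.linspace(0.0, 3 * np.pi, n)           # angle along each arm
    r = 5 * t / (3 * np.pi)                      # radius grows linearly with the angle
    arm0 = np.c_[r * np.cos(t), r * np.sin(t)]
    arm1 = -arm0                                 # second arm, rotated by pi
    X = np.vstack([arm0, arm1]) + noise * rng.normal(size=(2 * n, 2))
    y = np.r_[np.zeros(n), np.ones(n)]
    return X.astype("float32"), y.astype("float32")

X, y = make_spiral()

# Hand-crafted features in the spirit of the answer: the raw coordinates plus sin(x1) and sin(x2).
feats = np.c_[X, np.sin(X[:, 0]), np.sin(X[:, 1])]

model = tf.keras.Sequential([
    tf.keras.Input(shape=(feats.shape[1],)),
    tf.keras.layers.Dense(6, activation="tanh"),    # one hidden layer; the width is an assumption
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(feats, y, epochs=500, batch_size=32, verbose=0)
print(model.evaluate(feats, y, verbose=0))          # [loss, accuracy] on the training set
```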
Correct answer by Salvador Dali on August 24, 2021
Maybe you need to reset all settings, select the $x^2$ and $y^2$ features, and use only one hidden layer with 5 neurons.
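For reference, that configuration would look roughly like this in Keras; only the feature choice and layer size come from the answer, while the activation, optimizer, and training schedule are assumptions:

```python
import tensorflow as tf

# Squared coordinates as the only inputs, one hidden layer with 5 neurons.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(2,)),                     # inputs: [x**2, y**2]
    tf.keras.layers.Dense(5, activation="tanh"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
# Training, with X and y as in the earlier sketch:
# model.fit(X ** 2, y, epochs=300, batch_size=32)
```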
Answered by Jorgenio Selenio on August 24, 2021
This is an architecture proposed and tested on the TensorFlow Playground for the spiral dataset: two hidden layers with 8 neurons each, using the tanh activation function.
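A sketch of the same architecture in Keras; the output layer, loss, and optimizer are assumptions, since the answer only fixes the hidden layers and the activation:

```python
import tensorflow as tf

# Two hidden layers of 8 tanh units on the raw coordinates.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(2,)),                     # inputs: x1, x2
    tf.keras.layers.Dense(8, activation="tanh"),
    tf.keras.layers.Dense(8, activation="tanh"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
```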
Answered by Muhammad Hamza Zafar on August 24, 2021
The solution I reached after an hour of trial and error usually converges in just 100 epochs.
Yeah, I know it does not have the smoothest decision boundary out there, but it converges pretty fast.
I learned a few things from this spiral experiment:
Coincidentally, the solution I came up with is very similar to the one provided by Salvador Dali.
Kindly add a comment if you find any more intuitions or reasoning.
Answered by dracarys3 on August 24, 2021
This is an example of the vanilla TensorFlow Playground with no added features and no modifications. The run for the spiral took between 187 and roughly 300 epochs, depending on the run. I used L1 (lasso) regularization so I could drive unneeded coefficients to zero. I decreased the batch size by 1 to keep the output from overfitting. In my second example I added some noise to the dataset and then increased the L1 rate to compensate.
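A rough Keras equivalent of that recipe; the layer sizes and the L1 strength are assumptions, while the slightly reduced batch size and the added noise follow the answer:

```python
import numpy as np
import tensorflow as tf

# L1 (lasso) regularization pushes unneeded weights toward zero; the strength and
# layer sizes here are assumptions, not the exact Playground settings.
l1 = tf.keras.regularizers.l1(1e-3)
model = tf.keras.Sequential([
    tf.keras.Input(shape=(2,)),
    tf.keras.layers.Dense(8, activation="tanh", kernel_regularizer=l1),
    tf.keras.layers.Dense(8, activation="tanh", kernel_regularizer=l1),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
# With X, y as in the earlier sketch: add a little input noise and shrink the batch size.
# X_noisy = X + 0.1 * np.random.default_rng(1).normal(size=X.shape)
# model.fit(X_noisy, y, epochs=300, batch_size=9)   # one less than the Playground default of 10
```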
Answered by Jamin Quimby on August 24, 2021
By cheating... $\theta$ is $\arctan(y, x)$ and $r$ is $\sqrt{x^2 + y^2}$.
In theory, $x^2$ and $y^2$ should work, but in practice they somehow failed, even though it occasionally works.
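In code, the "cheat" features are just the polar transform of each point; a small numpy helper (the function name is mine):

```python
import numpy as np

def polar_features(X):
    """Map (x, y) points to (r, theta); X is an (n, 2) array as in the earlier sketches."""
    r = np.sqrt(X[:, 0] ** 2 + X[:, 1] ** 2)   # r = sqrt(x^2 + y^2)
    theta = np.arctan2(X[:, 1], X[:, 0])       # theta = arctan(y, x), i.e. atan2
    return np.c_[r, theta]
```

With these two columns as inputs, the two arms at a given radius differ by a roughly fixed offset in $\theta$, which is a much easier boundary for a small network to learn than the raw spiral.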
Answered by anonisnotanon on August 24, 2021
Ideally, neural networks should be able to find the function on their own, without us providing the polar features. After some experimentation I was able to reach a configuration where we do not need anything except $X_1$ and $X_2$. This net converged after about 1500 epochs, which is quite long. So the best way might still be to add additional features, but I am just trying to say that it is still possible to converge without them.
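A sketch of such a feature-free configuration in Keras; the depth, widths, learning rate, and epoch count below are assumptions in the spirit of the answer, not the author's exact Playground settings:

```python
import tensorflow as tf

# Only the raw coordinates x1, x2 as inputs; a slightly deeper tanh network
# trained for many more epochs than the feature-engineered versions.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(2,)),
    tf.keras.layers.Dense(8, activation="tanh"),
    tf.keras.layers.Dense(8, activation="tanh"),
    tf.keras.layers.Dense(6, activation="tanh"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=0.01),
              loss="binary_crossentropy", metrics=["accuracy"])
# model.fit(X, y, epochs=1500, batch_size=32, verbose=0)   # X, y as in the earlier sketch
```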
Answered by Dheeraj Pb on August 24, 2021