Neural Playground
[Interactive neural-network playground widget — controls summarized below]

Epoch counter (starts at 0) and training error readout.
Learning rate: 0.00001, 0.0001, 0.001, 0.003, 0.01, 0.03, 0.1, 0.3, 1, 3, 10
Activation function: tanh, sigmoid, relu, linear
Output resolution: Low, Regular, High
Input features: x, y, x^2, y^2, xy, sin(x), sin(y)
Layers: add/remove hidden layers and neurons (+ / -)
Clear button and dataset selection (1-4)
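The controls above suggest a TensorFlow-Playground-style demo: the checked transforms of the raw (x, y) coordinates form the network's input layer, and the chosen activation is applied in the hidden layers. A minimal sketch of that idea, assuming this is how the widget assembles its inputs (the function and dictionary names here are illustrative, not taken from the site's code):

```python
import numpy as np

# Feature transforms matching the playground's input checkboxes.
FEATURES = {
    "x":      lambda x, y: x,
    "y":      lambda x, y: y,
    "x^2":    lambda x, y: x ** 2,
    "y^2":    lambda x, y: y ** 2,
    "xy":     lambda x, y: x * y,
    "sin(x)": lambda x, y: np.sin(x),
    "sin(y)": lambda x, y: np.sin(y),
}

# Activation choices offered by the widget.
ACTIVATIONS = {
    "tanh":    np.tanh,
    "sigmoid": lambda z: 1.0 / (1.0 + np.exp(-z)),
    "relu":    lambda z: np.maximum(0.0, z),
    "linear":  lambda z: z,
}

def featurize(points, selected):
    """Stack the selected feature transforms into a design matrix.

    points: array of shape (n, 2) holding (x, y) pairs.
    selected: list of keys from FEATURES, e.g. ["x", "y", "xy"].
    """
    x, y = points[:, 0], points[:, 1]
    return np.column_stack([FEATURES[name](x, y) for name in selected])

def forward(points, selected, weights, bias, activation="tanh"):
    """One hidden-layer forward pass over the featurized inputs."""
    X = featurize(points, selected)
    return ACTIVATIONS[activation](X @ weights + bias)
```

With four features checked, `featurize` turns an (n, 2) point cloud into an (n, 4) matrix, which is the shape the hidden layer would consume; changing the activation dropdown corresponds to swapping the `activation` key.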