It's not a problem as long as accuracy trends upward and loss trends downward overall.
This might be due to the small 10% validation split, or to the known interaction between batch normalization and dropout regularization.
I have changed the LR and the learning curve now shows clearer convergence, but there is still fluctuation in validation. My guess is that this is due to variability in the dataset? I will look further into getting a smoother curve.
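One quick way to check whether the validation fluctuation is just noise around a converging trend is to smooth the logged metric before eyeballing it. A minimal sketch, assuming the per-epoch validation losses are available as a plain array (the synthetic `val_loss` below is a placeholder for the real training history):

```python
import numpy as np

def ema_smooth(values, alpha=0.3):
    """Exponential moving average: damps epoch-to-epoch noise in a metric curve."""
    smoothed = []
    last = values[0]
    for v in values:
        last = alpha * v + (1 - alpha) * last
        smoothed.append(last)
    return np.array(smoothed)

# Synthetic noisy validation-loss curve (stand-in for the real history).
rng = np.random.default_rng(0)
epochs = np.arange(50)
val_loss = np.exp(-epochs / 20) + rng.normal(0, 0.05, size=50)

smoothed = ema_smooth(val_loss)

# Step-to-step variability drops after smoothing, making the trend visible.
print(np.std(np.diff(val_loss)), np.std(np.diff(smoothed)))
```

If the smoothed curve still decreases steadily, the fluctuation is likely benign noise from the small validation split rather than a convergence problem.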