Noisy Gaussian NN – Robustness to Label Noise

Overview

This project explores how a simple 1-hidden-layer neural network handles increasing label noise when fitting a Gaussian curve.
We test three noise levels (σ = 0.05, 0.1, 0.2) to see when the network smooths effectively and when it starts to underfit.

Dataset

  • Synthetic dataset: Gaussian curve (y = exp(-x^2))
  • Noise added directly to labels using torch.normal
  • 200 evenly spaced x points in [-2, 2]
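The dataset described above can be generated in a few lines. This is a minimal sketch (the exact code lives in GaussianApproximation.ipynb); the seed and the variable names are illustrative choices, and sigma is set to the middle noise level as an example:

```python
import torch

torch.manual_seed(0)  # make the noise draw reproducible

# 200 evenly spaced x points in [-2, 2], shaped (200, 1) for the network
x = torch.linspace(-2.0, 2.0, 200).unsqueeze(1)

# Clean Gaussian curve: y = exp(-x^2)
y_clean = torch.exp(-x ** 2)

# Add noise directly to the labels with torch.normal (sigma = 0.05, 0.1, or 0.2)
sigma = 0.1
y_noisy = y_clean + torch.normal(mean=torch.zeros_like(y_clean), std=sigma)
```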

Model

  • Architecture: 1 hidden layer, 50 neurons, ReLU activation
  • Loss: MSELoss
  • Optimizer: Adam (lr=0.01)
  • Training: 2000 epochs
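Putting the bullets above together, the model and training loop look roughly like this. It is a sketch under the stated hyperparameters (50 ReLU units, MSE, Adam at lr=0.01, 2000 full-batch epochs); the data setup repeats the one from the Dataset section with sigma = 0.1 as an example:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Noisy training data (see Dataset section)
x = torch.linspace(-2.0, 2.0, 200).unsqueeze(1)
y = torch.exp(-x ** 2) + torch.normal(mean=torch.zeros_like(x), std=0.1)

# 1 hidden layer, 50 neurons, ReLU activation
model = nn.Sequential(nn.Linear(1, 50), nn.ReLU(), nn.Linear(50, 1))

loss_fn = nn.MSELoss()
optimizer = torch.optim.Adam(model.parameters(), lr=0.01)

# Full-batch training for 2000 epochs
for epoch in range(2000):
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optimizer.step()
```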

Results

  • Low noise: NN fits curve smoothly.
  • Medium noise: Slight underfitting.
  • High noise: Curve shape lost, noise dominates.
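The three regimes can be compared quantitatively by sweeping sigma and measuring how far the fitted function ends up from the clean curve. This is a hedged sketch, not the notebook's code; `train_and_eval` is a hypothetical helper introduced here for illustration:

```python
import torch
import torch.nn as nn

def train_and_eval(sigma: float, seed: int = 0) -> float:
    """Train on noisy labels; return MSE against the clean curve."""
    torch.manual_seed(seed)
    x = torch.linspace(-2.0, 2.0, 200).unsqueeze(1)
    y_clean = torch.exp(-x ** 2)
    y_noisy = y_clean + torch.normal(mean=torch.zeros_like(x), std=sigma)

    model = nn.Sequential(nn.Linear(1, 50), nn.ReLU(), nn.Linear(50, 1))
    optimizer = torch.optim.Adam(model.parameters(), lr=0.01)
    loss_fn = nn.MSELoss()
    for _ in range(2000):
        optimizer.zero_grad()
        loss_fn(model(x), y_noisy).backward()
        optimizer.step()

    # Evaluate against the noise-free target, not the noisy labels
    with torch.no_grad():
        return loss_fn(model(x), y_clean).item()

errors = {sigma: train_and_eval(sigma) for sigma in (0.05, 0.1, 0.2)}
```

A low error at σ = 0.05 and a rising error toward σ = 0.2 would reproduce the regimes listed above.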

Key Insight

More noise ≠ better regularization.
Too much noise can destroy the signal beyond recovery.

Files

  • GaussianApproximation.ipynb – Full experiment, plots, and analysis
  • README.md – This file

License

MIT License – free to use, modify, and distribute with attribution.
