---
license: mit
language:
  - en
tags:
  - neuralnetworks
  - pytorch
  - normaldistribution
  - math
  - noisydata
---

# Noisy Gaussian NN – Robustness to Label Noise

## Overview

This project explores how a simple 1-hidden-layer neural network handles increasing label noise when fitting a Gaussian curve.
We test three noise levels (σ = 0.05, 0.1, 0.2) to see when the network smooths out the noise effectively and when it starts to underfit.

## Dataset

- Synthetic dataset: Gaussian curve `y = exp(-x^2)`
- Noise added directly to the labels with `torch.normal`
- 200 evenly spaced x points in `[-2, 2]`
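
The dataset described above can be sketched in a few lines of PyTorch (a minimal sketch; `sigma` stands in for any of the three noise levels, and the seed is added here only for reproducibility):

```python
import torch

torch.manual_seed(0)  # for reproducibility; not part of the original setup

# 200 evenly spaced x points in [-2, 2], shaped (200, 1) for the network
x = torch.linspace(-2, 2, 200).unsqueeze(1)

# Noise-free Gaussian curve y = exp(-x^2)
y_clean = torch.exp(-x ** 2)

# Add Gaussian label noise; sigma is one of 0.05, 0.1, 0.2
sigma = 0.1
y_noisy = y_clean + torch.normal(mean=0.0, std=sigma, size=y_clean.shape)
```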

## Model

- Architecture: 1 hidden layer, 50 neurons, ReLU activation
- Loss: `MSELoss`
- Optimizer: Adam (`lr=0.01`)
- Training: 2000 epochs
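
The architecture and training loop above can be reproduced roughly as follows (a sketch, not the notebook's exact code; the inline data generation mirrors the dataset section with σ = 0.1):

```python
import torch
import torch.nn as nn

torch.manual_seed(0)  # for reproducibility; not part of the original setup

# Data matching the dataset section (sigma = 0.1 shown here)
x = torch.linspace(-2, 2, 200).unsqueeze(1)
y = torch.exp(-x ** 2) + torch.normal(mean=0.0, std=0.1, size=(200, 1))

# 1 hidden layer, 50 neurons, ReLU activation
model = nn.Sequential(
    nn.Linear(1, 50),
    nn.ReLU(),
    nn.Linear(50, 1),
)

loss_fn = nn.MSELoss()
optimizer = torch.optim.Adam(model.parameters(), lr=0.01)

# Full-batch training for 2000 epochs
for epoch in range(2000):
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optimizer.step()
```

With σ = 0.1 the final MSE settles near the noise variance (≈ 0.01), which is the expected floor when the network smooths the noise rather than memorizing it.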

## Results

- Low noise (σ = 0.05): the network fits the curve smoothly.
- Medium noise (σ = 0.1): slight underfitting.
- High noise (σ = 0.2): the curve shape is lost; noise dominates.

## Key Insight

More noise ≠ better regularization.
Too much noise can destroy the signal beyond recovery.

## Files

- `GaussianApproximation.ipynb` – full experiment, plots, and analysis
- `README.md` – this file

## License

MIT License – free to use, modify, and distribute with attribution.