# Flight Plan Coordinate Prediction Model (Seq2SeqCoordsTransformer)
An encoder-decoder Transformer model trained for an AI flight-planning project. It predicts normalized coordinates directly and the waypoint count via classification.
## Model Description
Seq2SeqCoordsTransformer architecture built on `torch.nn.Transformer`. It predicts normalized lat/lon coordinates autoregressively and the waypoint count (0-10) via a classification head on the encoder output.
- Embed Dim: 256, Heads: 8, Enc Layers: 4, Dec Layers: 4, Max Waypoints: 10
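
The exact implementation lives in this repository's code; the block below is only a minimal sketch of how a model matching the hyperparameters above could be assembled with `torch.nn.Transformer`. The input projections, positional embedding, and pooled-encoder count head are assumptions for illustration, not the repository's definitive implementation.

```python
import torch
import torch.nn as nn

class Seq2SeqCoordsTransformer(nn.Module):
    """Sketch: encoder-decoder Transformer that regresses normalized (lat, lon)
    pairs and classifies the waypoint count. Layer layout is assumed."""

    def __init__(self, d_model=256, nhead=8, num_encoder_layers=4,
                 num_decoder_layers=4, max_waypoints=10, max_len=64):
        super().__init__()
        self.src_proj = nn.Linear(2, d_model)   # assumed: source tokens are (lat, lon) pairs
        self.tgt_proj = nn.Linear(2, d_model)   # decoder input: previously generated coordinates
        self.pos_emb = nn.Embedding(max_len, d_model)
        self.transformer = nn.Transformer(
            d_model=d_model, nhead=nhead,
            num_encoder_layers=num_encoder_layers,
            num_decoder_layers=num_decoder_layers,
            batch_first=True,
        )
        self.coord_head = nn.Linear(d_model, 2)                  # regresses normalized lat/lon
        self.count_head = nn.Linear(d_model, max_waypoints + 1)  # classifies waypoint count 0..10

    def forward(self, src, tgt):
        # src, tgt: (batch, seq_len, 2) normalized coordinates
        src_pos = torch.arange(src.size(1), device=src.device)
        tgt_pos = torch.arange(tgt.size(1), device=tgt.device)
        memory_in = self.src_proj(src) + self.pos_emb(src_pos)
        tgt_in = self.tgt_proj(tgt) + self.pos_emb(tgt_pos)

        # Causal mask so the decoder only attends to earlier waypoints.
        tgt_mask = self.transformer.generate_square_subsequent_mask(
            tgt.size(1)).to(src.device)

        memory = self.transformer.encoder(memory_in)
        decoded = self.transformer.decoder(tgt_in, memory, tgt_mask=tgt_mask)

        coords = self.coord_head(decoded)                    # (batch, tgt_len, 2)
        count_logits = self.count_head(memory.mean(dim=1))   # pooled encoder output
        return coords, count_logits
```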
## Intended Use
Research prototype. Not for real-world navigation.
## Limitations
Accuracy depends on training data and tuning. The maximum number of waypoints is fixed at 10. The model is not certified for operational use. The architecture differs significantly from previous versions in this repository.
## How to Use
Requires loading the custom Seq2SeqCoordsTransformer class and its weights. Generation requires autoregressive decoding of the coordinates and taking the argmax of the count logits.
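
Assuming the class sketched above (or the repository's own definition) and a locally downloaded checkpoint, inference could look roughly like the following. The weights filename, the start-token convention, and the example input values are assumptions.

```python
import torch

# Assumption: a local checkpoint named "seq2seq_coords_transformer.pt";
# the actual filename in this repository may differ.
model = Seq2SeqCoordsTransformer()
model.load_state_dict(torch.load("seq2seq_coords_transformer.pt", map_location="cpu"))
model.eval()

# Example normalized origin/destination coordinates (illustrative values).
src = torch.tensor([[[0.45, 0.72], [0.51, 0.80]]])

with torch.no_grad():
    # Waypoint count from the classification head on the encoder output.
    _, count_logits = model(src, src[:, :1, :])
    n_waypoints = count_logits.argmax(dim=-1).item()

    # Autoregressive decoding: feed each predicted coordinate back in.
    decoded = src[:, :1, :]  # assumed start token: the first source coordinate
    for _ in range(n_waypoints):
        coords, _ = model(src, decoded)
        decoded = torch.cat([decoded, coords[:, -1:, :]], dim=1)

waypoints = decoded[0, 1:]  # drop the start token; still normalized lat/lon
print(waypoints)
```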
## Training Data
Trained on [frankmorales2020/flight_plan_waypoints](https://huggingface.co/datasets/frankmorales2020/flight_plan_waypoints).
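
The dataset can be pulled from the Hugging Face Hub with the `datasets` library; the split name and feature layout are assumptions to verify against the dataset card.

```python
from datasets import load_dataset

# Assumption: a "train" split exists; check the dataset card for the exact schema.
ds = load_dataset("frankmorales2020/flight_plan_waypoints", split="train")
print(ds[0])  # inspect one record's fields before feeding it to the model
```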
## Contact
Frank Morales, BEng, MEng, SMIEEE (Boeing ATF) - https://www.linkedin.com/in/frank-morales1964/