---
language:
  - en
library_name: diffusers
metrics:
  - PER
  - WER
  - SongEval
  - Audio Aesthetics
  - MuQ
  - FAD
pipeline_tag: text-to-audio
tags:
  - music
  - art
  - song-generation
  - lyrics-to-song
  - flow-matching
  - direct-preference-optimization
license: other
---

# JAM: A Tiny Flow-based Song Generator with Fine-grained Controllability and Aesthetic Alignment



## JAME Dataset

JAME (JAM Evaluation) is a comprehensive music dataset containing 250 high-quality music tracks designed for standardized evaluation of song generation models.

This dataset is part of Project Jamify developed by DeCLaRe Lab and supports research in controllable music generation.

## Dataset Overview

- **Total Tracks**: 250 carefully curated tracks (50 per genre, 5 genres total)
- **Duration**: approximately 230 seconds per track
- **Genres**: Hip-Hop/Rap, Rock/Metal, Electronic/Dance, R&B/Soul/Jazz, Country/Folk
- **Release Window**: all tracks were released after 1 May 2025 to avoid data contamination
- **Purpose**: standardized evaluation benchmark for song generation models

## Dataset Structure

jame/
├── README.md                # This file
├── metadata.jsonl           # Complete metadata for all tracks
├── spotify_urls.txt         # Plain text list of Spotify URLs
├── transcriptions/          # JSON transcription files
│   ├── Artist - Title.json
│   └── ...
└── struct/                  # JSON structure analysis files
    ├── Artist - Title.json
    └── ...

## Audio Access

Audio files are not distributed with this dataset. Users can legally access the audio through:

- **Spotify**: the provided `spotify_url` links
- **YouTube Music**: the provided `youtube_url` links
- **Other legal streaming services**: search by the artist and title information

Please ensure compliance with the services' terms of use and with copyright law when accessing audio content.
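As an illustration, the plain-text URL list can be loaded programmatically for manual lookup. This is a minimal sketch assuming `spotify_urls.txt` holds one URL per line as described above; the helper name `load_spotify_urls` is ours, not part of the dataset tooling.

```python
from pathlib import Path


def load_spotify_urls(path="spotify_urls.txt"):
    """Return the non-empty lines of the plain-text URL list.

    Assumes one Spotify URL per line (hypothetical helper, not shipped
    with the dataset).
    """
    text = Path(path).read_text(encoding="utf-8")
    return [line.strip() for line in text.splitlines() if line.strip()]
```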

## Metadata Format

Each line in `metadata.jsonl` contains a JSON object with the following fields:

{
  "file_name": "Artist - Title",
  "artist": "Artist Name",
  "title": "Song Title",
  "spotify_url": "https://open.spotify.com/track/...",
  "youtube_url": "https://music.youtube.com/watch?v=...",
  "duration": 180,
  "year": 2025,
  "genre": "Hip-Hop/Rap",
  "transcription_path": "transcriptions/Artist - Title.json",
  "struct_path": "struct/Artist - Title.json",
  "song_id": "spotify_track_id"
}

### Field Descriptions

- `file_name`: base filename without extension
- `artist`: primary artist name
- `title`: song title
- `spotify_url`: official Spotify track URL
- `youtube_url`: YouTube Music URL
- `duration`: track duration in seconds
- `year`: release year
- `genre`: music genre category
- `transcription_path`: path to the transcription JSON file
- `struct_path`: path to the structure-analysis JSON file
- `song_id`: Spotify track identifier
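The field list above can double as a lightweight validation schema. Below is a hedged sketch (the loader name and error handling are our own, not part of the dataset tooling) that parses `metadata.jsonl` and checks each record for the documented fields:

```python
import json

# Field names taken from the Metadata Format section above.
REQUIRED_FIELDS = {
    "file_name", "artist", "title", "spotify_url", "youtube_url",
    "duration", "year", "genre", "transcription_path", "struct_path",
    "song_id",
}


def load_metadata(path="metadata.jsonl"):
    """Parse a JSONL metadata file, verifying every record has the
    documented fields. Hypothetical helper for illustration only."""
    records = []
    with open(path, encoding="utf-8") as f:
        for line_no, line in enumerate(f, start=1):
            if not line.strip():
                continue  # tolerate blank lines
            record = json.loads(line)
            missing = REQUIRED_FIELDS - record.keys()
            if missing:
                raise ValueError(f"line {line_no}: missing fields {sorted(missing)}")
            records.append(record)
    return records
```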

## Genre Distribution

The 250 tracks are evenly distributed across 5 genres (50 tracks each):

- **Hip-Hop/Rap**: urban and rap music
- **Rock/Metal**: rock, alternative, and metal tracks
- **Electronic/Dance**: electronic, EDM, and dance music
- **R&B/Soul/Jazz**: R&B, soul, and jazz-influenced tracks
- **Country/Folk**: country, folk, and acoustic music
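Given parsed metadata records, the per-genre balance can be checked in a few lines. This is a sketch assuming records are dicts with the `genre` field described earlier; `genre_distribution` and `is_balanced` are illustrative names, not dataset tooling:

```python
from collections import Counter


def genre_distribution(records):
    """Count tracks per `genre` in a list of metadata dicts."""
    return Counter(r["genre"] for r in records)


def is_balanced(records, genres=5, per_genre=50):
    """Check the documented 5-genres-by-50-tracks balance."""
    counts = genre_distribution(records)
    return len(counts) == genres and all(c == per_genre for c in counts.values())
```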

## File Formats

- **Metadata**: JSONL (JSON Lines) with complete track information
- **Transcriptions**: JSON files with detailed transcription data
- **Structure**: JSON files with musical structure annotations
- **URLs**: plain-text file of Spotify URLs for easy access

## Dataset Statistics

- **Total Duration**: ~16 hours (250 tracks × ~230 seconds)
- **Completeness**: 100% metadata coverage for all 250 tracks
- **Quality**: high-quality metadata with detailed annotations
- **Coverage**: complete Spotify and YouTube URL mapping
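The total-duration figure can be recomputed from the per-track durations. A minimal sketch, assuming records carry the `duration` field (in seconds) from the metadata format; `total_hours` is our own helper name:

```python
def total_hours(records):
    """Sum per-track `duration` values (seconds) and convert to hours."""
    return sum(r["duration"] for r in records) / 3600.0
```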

## Citation

If you use the JAME dataset in your research, please cite:

@misc{liu2025jamtinyflowbasedsong,
      title={JAM: A Tiny Flow-based Song Generator with Fine-grained Controllability and Aesthetic Alignment}, 
      author={Renhang Liu and Chia-Yu Hung and Navonil Majumder and Taylor Gautreaux and Amir Ali Bagherzadeh and Chuan Li and Dorien Herremans and Soujanya Poria},
      year={2025},
      eprint={2507.20880},
      archivePrefix={arXiv},
      primaryClass={cs.SD},
      url={https://arxiv.org/abs/2507.20880}, 
}

## License and Attribution

This dataset is released under the Project Jamify License for non-commercial, academic, and entertainment purposes only.

Key restrictions:

- No copyrighted material was used in a way that would intentionally infringe on intellectual property rights
- **Commercial use** is strictly prohibited
- **Attribution required**: cite the JAM paper and keep license notices intact
- Users must ensure compliance with applicable legal and ethical standards

For complete license terms, see the Project Jamify repository.

## Contact

For questions about this dataset: