Dataset Preview

Column schema (from the dataset viewer):

  • audio — audio clip, 0.09 s to 2 s in duration
  • id — uint16, ranging 1 to 271
  • location — class label, 68 classes
  • detail — string, 51 values (null when no detail was recorded)
  • hits — uint8, ranging 1 to 174

Sample rows (id, location, detail, hits):

  • 251, Hallway, MITCampus, 1
  • 90, Outside, StreetsOfCambridge, 2
  • 32, Bedroom, null, 6
  • 190, Train, BostonTGreenline, 1
  • 5, Office, Small, 44

Author's Description

These are environmental impulse responses (IRs) measured in the real-world IR survey described in Traer and McDermott, PNAS, 2016. The survey locations were selected by tracking the movements of 7 volunteers over the course of 2 weeks of daily life. We sent each volunteer 24 text messages per day at randomized times and asked them to respond with their location at the time the text was sent. We then retraced their steps and measured the acoustic impulse responses of as many of those spaces as possible, recording 271 IRs from a total of 301 unique locations. This data set therefore reflects the diversity of acoustic distortion our volunteers encountered in the course of daily life. All recordings were made with a 1.5 meter spacing between speaker and microphone to simulate a typical conversation.
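A common use of these IRs is to add realistic reverberation to a dry recording by convolving the two signals. The sketch below illustrates this with NumPy/SciPy; the exponentially decaying noise burst is a stand-in for a real IR from the dataset's audio column, and the 0.4 s decay time is an illustrative assumption, not a measured value.

```python
import numpy as np
from scipy.signal import fftconvolve

sr = 16_000                                   # dataset sample rate after repacking

# Toy IR: exponentially decaying noise, echoing the exponential decay
# the survey reports for real-world IRs. Replace with a real IR in practice.
t = np.arange(int(0.5 * sr)) / sr             # 0.5 s impulse response
rt60 = 0.4                                    # hypothetical 60 dB decay time
ir = np.random.default_rng(0).standard_normal(t.size) * 10 ** (-3 * t / rt60)

# Dry source: a sparse click train, 1 s long
dry = np.zeros(sr)
dry[::4000] = 1.0                             # one click every 0.25 s

wet = fftconvolve(dry, ir)                    # reverberant ("wet") signal
wet /= np.max(np.abs(wet))                    # normalize to avoid clipping
```

Full convolution lengthens the signal by the IR duration, so the output has `len(dry) + len(ir) - 1` samples; trim the tail if you need the original length.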

James Traer and Josh H. McDermott, mcdermottlab.mit.edu

Repacking Notes

The following changes were made to repack for 🤗 Datasets / 🥐 Croissant:

  • Resampled audio from 32 kHz to 16 kHz. For the 32 kHz version, see benjamin-paine/mit-impulse-response-survey.
  • Mapped the beginning part of each filename to id.
  • Mapped the second part of each filename to location, and turned it into a class label (enumeration).
  • When present, mapped third (but not final) part of filename to detail.
  • Mapped final part of filename to hits.
  • Adjusted several filenames by correcting typos, homogenizing capitalization, and occasionally switching the order of location and detail.
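The repacking steps above can be sketched in Python. Note the assumptions: the underscore-separated filename layout (`id_location[_detail]_hits`) is inferred from the notes rather than documented, and `resample_poly` is just one reasonable choice for the exact 2:1 rate change.

```python
import numpy as np
from scipy.signal import resample_poly

def parse_ir_filename(stem: str) -> dict:
    """Split a hypothetical filename stem like '251_Hallway_MITCampus_1'
    into the card's fields (layout assumed, not documented)."""
    parts = stem.split("_")
    return {
        "id": int(parts[0]),                      # beginning part -> id
        "location": parts[1],                     # second part -> location
        "detail": "_".join(parts[2:-1]) or None,  # optional middle part(s)
        "hits": int(parts[-1]),                   # final part -> hits
    }

# 32 kHz -> 16 kHz is an exact 2:1 ratio, so polyphase resampling applies directly.
audio_32k = np.zeros(32_000, dtype=np.float32)    # 1 s placeholder waveform
audio_16k = resample_poly(audio_32k, up=1, down=2)
```

For a stem with no detail part, such as '32_Bedroom_6', the middle slice is empty and detail comes back as None, matching the null entries in the preview.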

License

These files are licensed under a Creative Commons Attribution license, CC-BY 4.0. Please cite the Traer and McDermott paper when using them, as in the example below.

Citation

@article{doi:10.1073/pnas.1612524113,
author = {James Traer and Josh H. McDermott},
title = {Statistics of natural reverberation enable perceptual separation of sound and space},
journal = {Proceedings of the National Academy of Sciences},
volume = {113},
number = {48},
pages = {E7856-E7865},
year = {2016},
doi = {10.1073/pnas.1612524113},
URL = {https://www.pnas.org/doi/abs/10.1073/pnas.1612524113},
eprint = {https://www.pnas.org/doi/pdf/10.1073/pnas.1612524113},
abstract = {Sounds produced in the world reflect off surrounding surfaces on their way to our ears. Known as reverberation, these reflections distort sound but provide information about the world around us. We asked whether reverberation exhibits statistical regularities that listeners use to separate its effects from those of a sound’s source. We conducted a large-scale statistical analysis of real-world acoustics, revealing strong regularities of reverberation in natural scenes. We found that human listeners can estimate the contributions of the source and the environment from reverberant sound, but that they depend critically on whether environmental acoustics conform to the observed statistical regularities. The results suggest a separation process constrained by knowledge of environmental acoustics that is internalized over development or evolution. In everyday listening, sound reaches our ears directly from a source as well as indirectly via reflections known as reverberation. Reverberation profoundly distorts the sound from a source, yet humans can both identify sound sources and distinguish environments from the resulting sound, via mechanisms that remain unclear. The core computational challenge is that the acoustic signatures of the source and environment are combined in a single signal received by the ear. Here we ask whether our recognition of sound sources and spaces reflects an ability to separate their effects and whether any such separation is enabled by statistical regularities of real-world reverberation. To first determine whether such statistical regularities exist, we measured impulse responses (IRs) of 271 spaces sampled from the distribution encountered by humans during daily life. 
The sampled spaces were diverse, but their IRs were tightly constrained, exhibiting exponential decay at frequency-dependent rates: Mid frequencies reverberated longest whereas higher and lower frequencies decayed more rapidly, presumably due to absorptive properties of materials and air. To test whether humans leverage these regularities, we manipulated IR decay characteristics in simulated reverberant audio. Listeners could discriminate sound sources and environments from these signals, but their abilities degraded when reverberation characteristics deviated from those of real-world environments. Subjectively, atypical IRs were mistaken for sound sources. The results suggest the brain separates sound into contributions from the source and the environment, constrained by a prior on natural reverberation. This separation process may contribute to robust recognition while providing information about spaces around us.}}