Correct long-form generation config parameters 'max_initial_timestamp_index' and 'prev_sot_token_id'.
Hey openai 👋,
 Your model repository seems to contain outdated generation config parameters, such as 'max_initial_timestamp_index', and is missing the 'prev_sot_token_id' parameter. These parameters need to be updated to correctly handle long-form generation, as part of https://github.com/huggingface/transformers/pull/27658. This PR makes sure that everything is up to date and can be safely merged.
 Best, the Transformers team.
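
For context, here is a minimal sketch of how the two parameters can be inspected with transformers' GenerationConfig once this PR is merged. The repo id below is a placeholder for this repository; adjust it as needed.

    # Sketch only: assumes the 'transformers' library is installed and that the
    # repo id below is replaced with this repository's actual id.
    from transformers import GenerationConfig

    gen_config = GenerationConfig.from_pretrained("openai/whisper-large-v2")

    # The two parameters touched by this PR:
    print(gen_config.max_initial_timestamp_index)           # expected: 50 after this PR
    print(getattr(gen_config, "prev_sot_token_id", None))   # expected: 50361 after this PR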
generation_config.json  (+2 -1)

@@ -144,10 +144,11 @@
     "<|yo|>": 50325,
     "<|zh|>": 50260
   },
-  "max_initial_timestamp_index": 1,
+  "max_initial_timestamp_index": 50,
   "max_length": 448,
   "no_timestamps_token_id": 50363,
   "pad_token_id": 50257,
+  "prev_sot_token_id": 50361,
   "return_timestamps": false,
   "suppress_tokens": [
     1,
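
These values matter for sequential long-form decoding in transformers: 'prev_sot_token_id' (50361, the <|startofprev|> token) lets generate() condition each 30-second chunk on the previously decoded text, and 'max_initial_timestamp_index' of 50 caps the first timestamp of a segment at 1.0 second (50 * 0.02 s). A minimal usage sketch, assuming a recent transformers release with long-form Whisper support, a 16 kHz mono waveform 'audio' as a NumPy array, and a placeholder repo id:

    # Sketch only: 'audio' is assumed to be a 1-D float array sampled at 16 kHz,
    # and the repo id is a placeholder for this repository.
    from transformers import WhisperForConditionalGeneration, WhisperProcessor

    repo_id = "openai/whisper-large-v2"  # placeholder
    processor = WhisperProcessor.from_pretrained(repo_id)
    model = WhisperForConditionalGeneration.from_pretrained(repo_id)

    # Pass the full waveform without truncation so generate() can run
    # sequential long-form decoding over audio longer than 30 seconds.
    inputs = processor(
        audio,
        sampling_rate=16_000,
        return_tensors="pt",
        truncation=False,
        padding="longest",
        return_attention_mask=True,
    )

    # condition_on_prev_tokens=True is where prev_sot_token_id is used: the
    # previous segment's text is prepended after the <|startofprev|> token.
    predicted_ids = model.generate(
        **inputs,
        condition_on_prev_tokens=True,
        return_timestamps=True,
    )
    print(processor.batch_decode(predicted_ids, skip_special_tokens=True)[0])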

