bweng committed (verified)
Commit 1b5bcb8 · 1 Parent(s): c51dbe5

Upload 4 files

pyannote-segmentation.bin ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:b049f71944a1c9d81827855f651d8146240fd0425d86602e4d322a81757e4cf3
+ size 2978996
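
The two files in this commit form an OpenVINO IR pair: `pyannote-segmentation.bin` holds the weights (stored via Git LFS, hence the pointer file above) and `pyannote-segmentation.xml` holds the network topology shown further below. As a rough illustration only, the sketch that follows loads and runs the pair with the OpenVINO Python runtime; the `openvino` package, the `"CPU"` device choice, and the file locations are assumptions, while the input name `chunks` and shape `1×1×160000` are taken from the Parameter layer in the XML.

```python
# Minimal sketch, assuming `pip install openvino` and both IR files in the
# current directory; device name "CPU" is an assumption, not part of this commit.
import numpy as np
import openvino as ov

core = ov.Core()
# read_model() pairs the .xml topology with the .bin weights of the same basename.
model = core.read_model("pyannote-segmentation.xml")
compiled = core.compile_model(model, "CPU")

# The IR declares a single input "chunks" with shape [1, 1, 160000]
# (one mono audio chunk; for pyannote models this corresponds to ~10 s at 16 kHz).
chunk = np.zeros((1, 1, 160000), dtype=np.float32)
result = compiled([chunk])
print(result[compiled.output(0)].shape)
```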
pyannote-segmentation.xml ADDED
@@ -0,0 +1,2888 @@
1
+ <?xml version="1.0"?>
2
+ <net name="main_graph" version="11">
3
+ <layers>
4
+ <layer id="0" name="chunks" type="Parameter" version="opset1">
5
+ <data shape="1,1,160000" element_type="f32" />
6
+ <output>
7
+ <port id="0" precision="FP32" names="chunks">
8
+ <dim>1</dim>
9
+ <dim>1</dim>
10
+ <dim>160000</dim>
11
+ </port>
12
+ </output>
13
+ </layer>
14
+ <layer id="1" name="Constant_7148_compressed" type="Const" version="opset1">
15
+ <data element_type="f16" shape="1, 1, 7" offset="0" size="14" />
16
+ <output>
17
+ <port id="0" precision="FP16">
18
+ <dim>1</dim>
19
+ <dim>1</dim>
20
+ <dim>7</dim>
21
+ </port>
22
+ </output>
23
+ </layer>
24
+ <layer id="2" name="Constant_7148" type="Convert" version="opset1">
25
+ <data destination_type="f32" />
26
+ <rt_info>
27
+ <attribute name="decompression" version="0" />
28
+ </rt_info>
29
+ <input>
30
+ <port id="0" precision="FP16">
31
+ <dim>1</dim>
32
+ <dim>1</dim>
33
+ <dim>7</dim>
34
+ </port>
35
+ </input>
36
+ <output>
37
+ <port id="1" precision="FP32">
38
+ <dim>1</dim>
39
+ <dim>1</dim>
40
+ <dim>7</dim>
41
+ </port>
42
+ </output>
43
+ </layer>
44
+ <layer id="3" name="Constant_7147_compressed" type="Const" version="opset1">
45
+ <data element_type="f16" shape="1, 1, 128" offset="14" size="256" />
46
+ <output>
47
+ <port id="0" precision="FP16">
48
+ <dim>1</dim>
49
+ <dim>1</dim>
50
+ <dim>128</dim>
51
+ </port>
52
+ </output>
53
+ </layer>
54
+ <layer id="4" name="Constant_7147" type="Convert" version="opset1">
55
+ <data destination_type="f32" />
56
+ <rt_info>
57
+ <attribute name="decompression" version="0" />
58
+ </rt_info>
59
+ <input>
60
+ <port id="0" precision="FP16">
61
+ <dim>1</dim>
62
+ <dim>1</dim>
63
+ <dim>128</dim>
64
+ </port>
65
+ </input>
66
+ <output>
67
+ <port id="1" precision="FP32">
68
+ <dim>1</dim>
69
+ <dim>1</dim>
70
+ <dim>128</dim>
71
+ </port>
72
+ </output>
73
+ </layer>
74
+ <layer id="5" name="Constant_7146_compressed" type="Const" version="opset1">
75
+ <data element_type="f16" shape="1, 1, 128" offset="270" size="256" />
76
+ <output>
77
+ <port id="0" precision="FP16">
78
+ <dim>1</dim>
79
+ <dim>1</dim>
80
+ <dim>128</dim>
81
+ </port>
82
+ </output>
83
+ </layer>
84
+ <layer id="6" name="Constant_7146" type="Convert" version="opset1">
85
+ <data destination_type="f32" />
86
+ <rt_info>
87
+ <attribute name="decompression" version="0" />
88
+ </rt_info>
89
+ <input>
90
+ <port id="0" precision="FP16">
91
+ <dim>1</dim>
92
+ <dim>1</dim>
93
+ <dim>128</dim>
94
+ </port>
95
+ </input>
96
+ <output>
97
+ <port id="1" precision="FP32">
98
+ <dim>1</dim>
99
+ <dim>1</dim>
100
+ <dim>128</dim>
101
+ </port>
102
+ </output>
103
+ </layer>
104
+ <layer id="7" name="Constant_3950" type="Const" version="opset1">
105
+ <data element_type="i64" shape="1" offset="526" size="8" />
106
+ <output>
107
+ <port id="0" precision="I64">
108
+ <dim>1</dim>
109
+ </port>
110
+ </output>
111
+ </layer>
112
+ <layer id="8" name="MVN_3951" type="MVN" version="opset6">
113
+ <data eps="9.9999997473787516e-06" normalize_variance="true" eps_mode="INSIDE_SQRT" />
114
+ <input>
115
+ <port id="0" precision="FP32">
116
+ <dim>1</dim>
117
+ <dim>1</dim>
118
+ <dim>160000</dim>
119
+ </port>
120
+ <port id="1" precision="I64">
121
+ <dim>1</dim>
122
+ </port>
123
+ </input>
124
+ <output>
125
+ <port id="2" precision="FP32">
126
+ <dim>1</dim>
127
+ <dim>1</dim>
128
+ <dim>160000</dim>
129
+ </port>
130
+ </output>
131
+ </layer>
132
+ <layer id="9" name="Reshape_3962_compressed" type="Const" version="opset1">
133
+ <data element_type="f16" shape="1, 1, 1" offset="534" size="2" />
134
+ <output>
135
+ <port id="0" precision="FP16">
136
+ <dim>1</dim>
137
+ <dim>1</dim>
138
+ <dim>1</dim>
139
+ </port>
140
+ </output>
141
+ </layer>
142
+ <layer id="10" name="Reshape_3962" type="Convert" version="opset1">
143
+ <data destination_type="f32" />
144
+ <rt_info>
145
+ <attribute name="decompression" version="0" />
146
+ </rt_info>
147
+ <input>
148
+ <port id="0" precision="FP16">
149
+ <dim>1</dim>
150
+ <dim>1</dim>
151
+ <dim>1</dim>
152
+ </port>
153
+ </input>
154
+ <output>
155
+ <port id="1" precision="FP32">
156
+ <dim>1</dim>
157
+ <dim>1</dim>
158
+ <dim>1</dim>
159
+ </port>
160
+ </output>
161
+ </layer>
162
+ <layer id="11" name="Multiply_3963" type="Multiply" version="opset1">
163
+ <data auto_broadcast="numpy" />
164
+ <input>
165
+ <port id="0" precision="FP32">
166
+ <dim>1</dim>
167
+ <dim>1</dim>
168
+ <dim>160000</dim>
169
+ </port>
170
+ <port id="1" precision="FP32">
171
+ <dim>1</dim>
172
+ <dim>1</dim>
173
+ <dim>1</dim>
174
+ </port>
175
+ </input>
176
+ <output>
177
+ <port id="2" precision="FP32">
178
+ <dim>1</dim>
179
+ <dim>1</dim>
180
+ <dim>160000</dim>
181
+ </port>
182
+ </output>
183
+ </layer>
184
+ <layer id="12" name="Reshape_3972_compressed" type="Const" version="opset1">
185
+ <data element_type="f16" shape="1, 1, 1" offset="536" size="2" />
186
+ <output>
187
+ <port id="0" precision="FP16">
188
+ <dim>1</dim>
189
+ <dim>1</dim>
190
+ <dim>1</dim>
191
+ </port>
192
+ </output>
193
+ </layer>
194
+ <layer id="13" name="Reshape_3972" type="Convert" version="opset1">
195
+ <data destination_type="f32" />
196
+ <rt_info>
197
+ <attribute name="decompression" version="0" />
198
+ </rt_info>
199
+ <input>
200
+ <port id="0" precision="FP16">
201
+ <dim>1</dim>
202
+ <dim>1</dim>
203
+ <dim>1</dim>
204
+ </port>
205
+ </input>
206
+ <output>
207
+ <port id="1" precision="FP32">
208
+ <dim>1</dim>
209
+ <dim>1</dim>
210
+ <dim>1</dim>
211
+ </port>
212
+ </output>
213
+ </layer>
214
+ <layer id="14" name="/sincnet/wav_norm1d/InstanceNormalization" type="Add" version="opset1">
215
+ <data auto_broadcast="numpy" />
216
+ <input>
217
+ <port id="0" precision="FP32">
218
+ <dim>1</dim>
219
+ <dim>1</dim>
220
+ <dim>160000</dim>
221
+ </port>
222
+ <port id="1" precision="FP32">
223
+ <dim>1</dim>
224
+ <dim>1</dim>
225
+ <dim>1</dim>
226
+ </port>
227
+ </input>
228
+ <output>
229
+ <port id="2" precision="FP32" names="/sincnet/wav_norm1d/InstanceNormalization_output_0">
230
+ <dim>1</dim>
231
+ <dim>1</dim>
232
+ <dim>160000</dim>
233
+ </port>
234
+ </output>
235
+ </layer>
236
+ <layer id="15" name="/sincnet/conv1d.0/Concat_2_compressed" type="Const" version="opset1">
237
+ <data element_type="f16" shape="80, 1, 251" offset="538" size="40160" />
238
+ <output>
239
+ <port id="0" precision="FP16" names="/sincnet/conv1d.0/Concat_2_output_0">
240
+ <dim>80</dim>
241
+ <dim>1</dim>
242
+ <dim>251</dim>
243
+ </port>
244
+ </output>
245
+ </layer>
246
+ <layer id="16" name="/sincnet/conv1d.0/Concat_2" type="Convert" version="opset1">
247
+ <data destination_type="f32" />
248
+ <rt_info>
249
+ <attribute name="decompression" version="0" />
250
+ </rt_info>
251
+ <input>
252
+ <port id="0" precision="FP16">
253
+ <dim>80</dim>
254
+ <dim>1</dim>
255
+ <dim>251</dim>
256
+ </port>
257
+ </input>
258
+ <output>
259
+ <port id="1" precision="FP32">
260
+ <dim>80</dim>
261
+ <dim>1</dim>
262
+ <dim>251</dim>
263
+ </port>
264
+ </output>
265
+ </layer>
266
+ <layer id="17" name="Conv_119" type="Convolution" version="opset1">
267
+ <data strides="10" dilations="1" pads_begin="0" pads_end="0" auto_pad="explicit" />
268
+ <input>
269
+ <port id="0" precision="FP32">
270
+ <dim>1</dim>
271
+ <dim>1</dim>
272
+ <dim>160000</dim>
273
+ </port>
274
+ <port id="1" precision="FP32">
275
+ <dim>80</dim>
276
+ <dim>1</dim>
277
+ <dim>251</dim>
278
+ </port>
279
+ </input>
280
+ <output>
281
+ <port id="2" precision="FP32" names="onnx::Abs_114">
282
+ <dim>1</dim>
283
+ <dim>80</dim>
284
+ <dim>15975</dim>
285
+ </port>
286
+ </output>
287
+ </layer>
288
+ <layer id="18" name="/sincnet/Abs" type="Abs" version="opset1">
289
+ <input>
290
+ <port id="0" precision="FP32">
291
+ <dim>1</dim>
292
+ <dim>80</dim>
293
+ <dim>15975</dim>
294
+ </port>
295
+ </input>
296
+ <output>
297
+ <port id="1" precision="FP32" names="/sincnet/Abs_output_0">
298
+ <dim>1</dim>
299
+ <dim>80</dim>
300
+ <dim>15975</dim>
301
+ </port>
302
+ </output>
303
+ </layer>
304
+ <layer id="19" name="/sincnet/pool1d.0/MaxPool" type="MaxPool" version="opset8">
305
+ <data strides="3" dilations="1" pads_begin="0" pads_end="0" kernel="3" rounding_type="floor" auto_pad="explicit" index_element_type="i64" axis="0" />
306
+ <input>
307
+ <port id="0" precision="FP32">
308
+ <dim>1</dim>
309
+ <dim>80</dim>
310
+ <dim>15975</dim>
311
+ </port>
312
+ </input>
313
+ <output>
314
+ <port id="1" precision="FP32" names="/sincnet/pool1d.0/MaxPool_output_0">
315
+ <dim>1</dim>
316
+ <dim>80</dim>
317
+ <dim>5325</dim>
318
+ </port>
319
+ <port id="2" precision="I64">
320
+ <dim>1</dim>
321
+ <dim>80</dim>
322
+ <dim>5325</dim>
323
+ </port>
324
+ </output>
325
+ </layer>
326
+ <layer id="20" name="Constant_4034" type="Const" version="opset1">
327
+ <data element_type="i64" shape="1" offset="526" size="8" />
328
+ <output>
329
+ <port id="0" precision="I64">
330
+ <dim>1</dim>
331
+ </port>
332
+ </output>
333
+ </layer>
334
+ <layer id="21" name="MVN_4035" type="MVN" version="opset6">
335
+ <data eps="9.9999997473787516e-06" normalize_variance="true" eps_mode="INSIDE_SQRT" />
336
+ <input>
337
+ <port id="0" precision="FP32">
338
+ <dim>1</dim>
339
+ <dim>80</dim>
340
+ <dim>5325</dim>
341
+ </port>
342
+ <port id="1" precision="I64">
343
+ <dim>1</dim>
344
+ </port>
345
+ </input>
346
+ <output>
347
+ <port id="2" precision="FP32">
348
+ <dim>1</dim>
349
+ <dim>80</dim>
350
+ <dim>5325</dim>
351
+ </port>
352
+ </output>
353
+ </layer>
354
+ <layer id="22" name="Reshape_4046_compressed" type="Const" version="opset1">
355
+ <data element_type="f16" shape="1, 80, 1" offset="40698" size="160" />
356
+ <output>
357
+ <port id="0" precision="FP16">
358
+ <dim>1</dim>
359
+ <dim>80</dim>
360
+ <dim>1</dim>
361
+ </port>
362
+ </output>
363
+ </layer>
364
+ <layer id="23" name="Reshape_4046" type="Convert" version="opset1">
365
+ <data destination_type="f32" />
366
+ <rt_info>
367
+ <attribute name="decompression" version="0" />
368
+ </rt_info>
369
+ <input>
370
+ <port id="0" precision="FP16">
371
+ <dim>1</dim>
372
+ <dim>80</dim>
373
+ <dim>1</dim>
374
+ </port>
375
+ </input>
376
+ <output>
377
+ <port id="1" precision="FP32">
378
+ <dim>1</dim>
379
+ <dim>80</dim>
380
+ <dim>1</dim>
381
+ </port>
382
+ </output>
383
+ </layer>
384
+ <layer id="24" name="Multiply_4047" type="Multiply" version="opset1">
385
+ <data auto_broadcast="numpy" />
386
+ <input>
387
+ <port id="0" precision="FP32">
388
+ <dim>1</dim>
389
+ <dim>80</dim>
390
+ <dim>5325</dim>
391
+ </port>
392
+ <port id="1" precision="FP32">
393
+ <dim>1</dim>
394
+ <dim>80</dim>
395
+ <dim>1</dim>
396
+ </port>
397
+ </input>
398
+ <output>
399
+ <port id="2" precision="FP32">
400
+ <dim>1</dim>
401
+ <dim>80</dim>
402
+ <dim>5325</dim>
403
+ </port>
404
+ </output>
405
+ </layer>
406
+ <layer id="25" name="Reshape_4056_compressed" type="Const" version="opset1">
407
+ <data element_type="f16" shape="1, 80, 1" offset="40858" size="160" />
408
+ <output>
409
+ <port id="0" precision="FP16">
410
+ <dim>1</dim>
411
+ <dim>80</dim>
412
+ <dim>1</dim>
413
+ </port>
414
+ </output>
415
+ </layer>
416
+ <layer id="26" name="Reshape_4056" type="Convert" version="opset1">
417
+ <data destination_type="f32" />
418
+ <rt_info>
419
+ <attribute name="decompression" version="0" />
420
+ </rt_info>
421
+ <input>
422
+ <port id="0" precision="FP16">
423
+ <dim>1</dim>
424
+ <dim>80</dim>
425
+ <dim>1</dim>
426
+ </port>
427
+ </input>
428
+ <output>
429
+ <port id="1" precision="FP32">
430
+ <dim>1</dim>
431
+ <dim>80</dim>
432
+ <dim>1</dim>
433
+ </port>
434
+ </output>
435
+ </layer>
436
+ <layer id="27" name="/sincnet/norm1d.0/InstanceNormalization" type="Add" version="opset1">
437
+ <data auto_broadcast="numpy" />
438
+ <input>
439
+ <port id="0" precision="FP32">
440
+ <dim>1</dim>
441
+ <dim>80</dim>
442
+ <dim>5325</dim>
443
+ </port>
444
+ <port id="1" precision="FP32">
445
+ <dim>1</dim>
446
+ <dim>80</dim>
447
+ <dim>1</dim>
448
+ </port>
449
+ </input>
450
+ <output>
451
+ <port id="2" precision="FP32" names="/sincnet/norm1d.0/InstanceNormalization_output_0">
452
+ <dim>1</dim>
453
+ <dim>80</dim>
454
+ <dim>5325</dim>
455
+ </port>
456
+ </output>
457
+ </layer>
458
+ <layer id="28" name="Constant_4058_compressed" type="Const" version="opset1">
459
+ <data element_type="f16" shape="1" offset="41018" size="2" />
460
+ <output>
461
+ <port id="0" precision="FP16">
462
+ <dim>1</dim>
463
+ </port>
464
+ </output>
465
+ </layer>
466
+ <layer id="29" name="Constant_4058" type="Convert" version="opset1">
467
+ <data destination_type="f32" />
468
+ <rt_info>
469
+ <attribute name="decompression" version="0" />
470
+ </rt_info>
471
+ <input>
472
+ <port id="0" precision="FP16">
473
+ <dim>1</dim>
474
+ </port>
475
+ </input>
476
+ <output>
477
+ <port id="1" precision="FP32">
478
+ <dim>1</dim>
479
+ </port>
480
+ </output>
481
+ </layer>
482
+ <layer id="30" name="/sincnet/LeakyRelu" type="PReLU" version="opset1">
483
+ <input>
484
+ <port id="0" precision="FP32">
485
+ <dim>1</dim>
486
+ <dim>80</dim>
487
+ <dim>5325</dim>
488
+ </port>
489
+ <port id="1" precision="FP32">
490
+ <dim>1</dim>
491
+ </port>
492
+ </input>
493
+ <output>
494
+ <port id="2" precision="FP32" names="/sincnet/LeakyRelu_output_0">
495
+ <dim>1</dim>
496
+ <dim>80</dim>
497
+ <dim>5325</dim>
498
+ </port>
499
+ </output>
500
+ </layer>
501
+ <layer id="31" name="sincnet.conv1d.1.weight_compressed" type="Const" version="opset1">
502
+ <data element_type="f16" shape="60, 80, 5" offset="41020" size="48000" />
503
+ <output>
504
+ <port id="0" precision="FP16" names="sincnet.conv1d.1.weight">
505
+ <dim>60</dim>
506
+ <dim>80</dim>
507
+ <dim>5</dim>
508
+ </port>
509
+ </output>
510
+ </layer>
511
+ <layer id="32" name="sincnet.conv1d.1.weight" type="Convert" version="opset1">
512
+ <data destination_type="f32" />
513
+ <rt_info>
514
+ <attribute name="decompression" version="0" />
515
+ </rt_info>
516
+ <input>
517
+ <port id="0" precision="FP16">
518
+ <dim>60</dim>
519
+ <dim>80</dim>
520
+ <dim>5</dim>
521
+ </port>
522
+ </input>
523
+ <output>
524
+ <port id="1" precision="FP32">
525
+ <dim>60</dim>
526
+ <dim>80</dim>
527
+ <dim>5</dim>
528
+ </port>
529
+ </output>
530
+ </layer>
531
+ <layer id="33" name="/sincnet/conv1d.1/Conv/WithoutBiases" type="Convolution" version="opset1">
532
+ <data strides="1" dilations="1" pads_begin="0" pads_end="0" auto_pad="explicit" />
533
+ <input>
534
+ <port id="0" precision="FP32">
535
+ <dim>1</dim>
536
+ <dim>80</dim>
537
+ <dim>5325</dim>
538
+ </port>
539
+ <port id="1" precision="FP32">
540
+ <dim>60</dim>
541
+ <dim>80</dim>
542
+ <dim>5</dim>
543
+ </port>
544
+ </input>
545
+ <output>
546
+ <port id="2" precision="FP32">
547
+ <dim>1</dim>
548
+ <dim>60</dim>
549
+ <dim>5321</dim>
550
+ </port>
551
+ </output>
552
+ </layer>
553
+ <layer id="34" name="Reshape_4071_compressed" type="Const" version="opset1">
554
+ <data element_type="f16" shape="1, 60, 1" offset="89020" size="120" />
555
+ <output>
556
+ <port id="0" precision="FP16">
557
+ <dim>1</dim>
558
+ <dim>60</dim>
559
+ <dim>1</dim>
560
+ </port>
561
+ </output>
562
+ </layer>
563
+ <layer id="35" name="Reshape_4071" type="Convert" version="opset1">
564
+ <data destination_type="f32" />
565
+ <rt_info>
566
+ <attribute name="decompression" version="0" />
567
+ </rt_info>
568
+ <input>
569
+ <port id="0" precision="FP16">
570
+ <dim>1</dim>
571
+ <dim>60</dim>
572
+ <dim>1</dim>
573
+ </port>
574
+ </input>
575
+ <output>
576
+ <port id="1" precision="FP32">
577
+ <dim>1</dim>
578
+ <dim>60</dim>
579
+ <dim>1</dim>
580
+ </port>
581
+ </output>
582
+ </layer>
583
+ <layer id="36" name="/sincnet/conv1d.1/Conv" type="Add" version="opset1">
584
+ <data auto_broadcast="numpy" />
585
+ <input>
586
+ <port id="0" precision="FP32">
587
+ <dim>1</dim>
588
+ <dim>60</dim>
589
+ <dim>5321</dim>
590
+ </port>
591
+ <port id="1" precision="FP32">
592
+ <dim>1</dim>
593
+ <dim>60</dim>
594
+ <dim>1</dim>
595
+ </port>
596
+ </input>
597
+ <output>
598
+ <port id="2" precision="FP32" names="/sincnet/conv1d.1/Conv_output_0">
599
+ <dim>1</dim>
600
+ <dim>60</dim>
601
+ <dim>5321</dim>
602
+ </port>
603
+ </output>
604
+ </layer>
605
+ <layer id="37" name="/sincnet/pool1d.1/MaxPool" type="MaxPool" version="opset8">
606
+ <data strides="3" dilations="1" pads_begin="0" pads_end="0" kernel="3" rounding_type="floor" auto_pad="explicit" index_element_type="i64" axis="0" />
607
+ <input>
608
+ <port id="0" precision="FP32">
609
+ <dim>1</dim>
610
+ <dim>60</dim>
611
+ <dim>5321</dim>
612
+ </port>
613
+ </input>
614
+ <output>
615
+ <port id="1" precision="FP32" names="/sincnet/pool1d.1/MaxPool_output_0">
616
+ <dim>1</dim>
617
+ <dim>60</dim>
618
+ <dim>1773</dim>
619
+ </port>
620
+ <port id="2" precision="I64">
621
+ <dim>1</dim>
622
+ <dim>60</dim>
623
+ <dim>1773</dim>
624
+ </port>
625
+ </output>
626
+ </layer>
627
+ <layer id="38" name="Constant_4074" type="Const" version="opset1">
628
+ <data element_type="i64" shape="1" offset="526" size="8" />
629
+ <output>
630
+ <port id="0" precision="I64">
631
+ <dim>1</dim>
632
+ </port>
633
+ </output>
634
+ </layer>
635
+ <layer id="39" name="MVN_4075" type="MVN" version="opset6">
636
+ <data eps="9.9999997473787516e-06" normalize_variance="true" eps_mode="INSIDE_SQRT" />
637
+ <input>
638
+ <port id="0" precision="FP32">
639
+ <dim>1</dim>
640
+ <dim>60</dim>
641
+ <dim>1773</dim>
642
+ </port>
643
+ <port id="1" precision="I64">
644
+ <dim>1</dim>
645
+ </port>
646
+ </input>
647
+ <output>
648
+ <port id="2" precision="FP32">
649
+ <dim>1</dim>
650
+ <dim>60</dim>
651
+ <dim>1773</dim>
652
+ </port>
653
+ </output>
654
+ </layer>
655
+ <layer id="40" name="Reshape_4086_compressed" type="Const" version="opset1">
656
+ <data element_type="f16" shape="1, 60, 1" offset="89140" size="120" />
657
+ <output>
658
+ <port id="0" precision="FP16">
659
+ <dim>1</dim>
660
+ <dim>60</dim>
661
+ <dim>1</dim>
662
+ </port>
663
+ </output>
664
+ </layer>
665
+ <layer id="41" name="Reshape_4086" type="Convert" version="opset1">
666
+ <data destination_type="f32" />
667
+ <rt_info>
668
+ <attribute name="decompression" version="0" />
669
+ </rt_info>
670
+ <input>
671
+ <port id="0" precision="FP16">
672
+ <dim>1</dim>
673
+ <dim>60</dim>
674
+ <dim>1</dim>
675
+ </port>
676
+ </input>
677
+ <output>
678
+ <port id="1" precision="FP32">
679
+ <dim>1</dim>
680
+ <dim>60</dim>
681
+ <dim>1</dim>
682
+ </port>
683
+ </output>
684
+ </layer>
685
+ <layer id="42" name="Multiply_4087" type="Multiply" version="opset1">
686
+ <data auto_broadcast="numpy" />
687
+ <input>
688
+ <port id="0" precision="FP32">
689
+ <dim>1</dim>
690
+ <dim>60</dim>
691
+ <dim>1773</dim>
692
+ </port>
693
+ <port id="1" precision="FP32">
694
+ <dim>1</dim>
695
+ <dim>60</dim>
696
+ <dim>1</dim>
697
+ </port>
698
+ </input>
699
+ <output>
700
+ <port id="2" precision="FP32">
701
+ <dim>1</dim>
702
+ <dim>60</dim>
703
+ <dim>1773</dim>
704
+ </port>
705
+ </output>
706
+ </layer>
707
+ <layer id="43" name="Reshape_4096_compressed" type="Const" version="opset1">
708
+ <data element_type="f16" shape="1, 60, 1" offset="89260" size="120" />
709
+ <output>
710
+ <port id="0" precision="FP16">
711
+ <dim>1</dim>
712
+ <dim>60</dim>
713
+ <dim>1</dim>
714
+ </port>
715
+ </output>
716
+ </layer>
717
+ <layer id="44" name="Reshape_4096" type="Convert" version="opset1">
718
+ <data destination_type="f32" />
719
+ <rt_info>
720
+ <attribute name="decompression" version="0" />
721
+ </rt_info>
722
+ <input>
723
+ <port id="0" precision="FP16">
724
+ <dim>1</dim>
725
+ <dim>60</dim>
726
+ <dim>1</dim>
727
+ </port>
728
+ </input>
729
+ <output>
730
+ <port id="1" precision="FP32">
731
+ <dim>1</dim>
732
+ <dim>60</dim>
733
+ <dim>1</dim>
734
+ </port>
735
+ </output>
736
+ </layer>
737
+ <layer id="45" name="/sincnet/norm1d.1/InstanceNormalization" type="Add" version="opset1">
738
+ <data auto_broadcast="numpy" />
739
+ <input>
740
+ <port id="0" precision="FP32">
741
+ <dim>1</dim>
742
+ <dim>60</dim>
743
+ <dim>1773</dim>
744
+ </port>
745
+ <port id="1" precision="FP32">
746
+ <dim>1</dim>
747
+ <dim>60</dim>
748
+ <dim>1</dim>
749
+ </port>
750
+ </input>
751
+ <output>
752
+ <port id="2" precision="FP32" names="/sincnet/norm1d.1/InstanceNormalization_output_0">
753
+ <dim>1</dim>
754
+ <dim>60</dim>
755
+ <dim>1773</dim>
756
+ </port>
757
+ </output>
758
+ </layer>
759
+ <layer id="46" name="Constant_4098_compressed" type="Const" version="opset1">
760
+ <data element_type="f16" shape="1" offset="41018" size="2" />
761
+ <output>
762
+ <port id="0" precision="FP16">
763
+ <dim>1</dim>
764
+ </port>
765
+ </output>
766
+ </layer>
767
+ <layer id="47" name="Constant_4098" type="Convert" version="opset1">
768
+ <data destination_type="f32" />
769
+ <rt_info>
770
+ <attribute name="decompression" version="0" />
771
+ </rt_info>
772
+ <input>
773
+ <port id="0" precision="FP16">
774
+ <dim>1</dim>
775
+ </port>
776
+ </input>
777
+ <output>
778
+ <port id="1" precision="FP32">
779
+ <dim>1</dim>
780
+ </port>
781
+ </output>
782
+ </layer>
783
+ <layer id="48" name="/sincnet/LeakyRelu_1" type="PReLU" version="opset1">
784
+ <input>
785
+ <port id="0" precision="FP32">
786
+ <dim>1</dim>
787
+ <dim>60</dim>
788
+ <dim>1773</dim>
789
+ </port>
790
+ <port id="1" precision="FP32">
791
+ <dim>1</dim>
792
+ </port>
793
+ </input>
794
+ <output>
795
+ <port id="2" precision="FP32" names="/sincnet/LeakyRelu_1_output_0">
796
+ <dim>1</dim>
797
+ <dim>60</dim>
798
+ <dim>1773</dim>
799
+ </port>
800
+ </output>
801
+ </layer>
802
+ <layer id="49" name="sincnet.conv1d.2.weight_compressed" type="Const" version="opset1">
803
+ <data element_type="f16" shape="60, 60, 5" offset="89380" size="36000" />
804
+ <output>
805
+ <port id="0" precision="FP16" names="sincnet.conv1d.2.weight">
806
+ <dim>60</dim>
807
+ <dim>60</dim>
808
+ <dim>5</dim>
809
+ </port>
810
+ </output>
811
+ </layer>
812
+ <layer id="50" name="sincnet.conv1d.2.weight" type="Convert" version="opset1">
813
+ <data destination_type="f32" />
814
+ <rt_info>
815
+ <attribute name="decompression" version="0" />
816
+ </rt_info>
817
+ <input>
818
+ <port id="0" precision="FP16">
819
+ <dim>60</dim>
820
+ <dim>60</dim>
821
+ <dim>5</dim>
822
+ </port>
823
+ </input>
824
+ <output>
825
+ <port id="1" precision="FP32">
826
+ <dim>60</dim>
827
+ <dim>60</dim>
828
+ <dim>5</dim>
829
+ </port>
830
+ </output>
831
+ </layer>
832
+ <layer id="51" name="/sincnet/conv1d.2/Conv/WithoutBiases" type="Convolution" version="opset1">
833
+ <data strides="1" dilations="1" pads_begin="0" pads_end="0" auto_pad="explicit" />
834
+ <input>
835
+ <port id="0" precision="FP32">
836
+ <dim>1</dim>
837
+ <dim>60</dim>
838
+ <dim>1773</dim>
839
+ </port>
840
+ <port id="1" precision="FP32">
841
+ <dim>60</dim>
842
+ <dim>60</dim>
843
+ <dim>5</dim>
844
+ </port>
845
+ </input>
846
+ <output>
847
+ <port id="2" precision="FP32">
848
+ <dim>1</dim>
849
+ <dim>60</dim>
850
+ <dim>1769</dim>
851
+ </port>
852
+ </output>
853
+ </layer>
854
+ <layer id="52" name="Reshape_4111_compressed" type="Const" version="opset1">
855
+ <data element_type="f16" shape="1, 60, 1" offset="125380" size="120" />
856
+ <output>
857
+ <port id="0" precision="FP16">
858
+ <dim>1</dim>
859
+ <dim>60</dim>
860
+ <dim>1</dim>
861
+ </port>
862
+ </output>
863
+ </layer>
864
+ <layer id="53" name="Reshape_4111" type="Convert" version="opset1">
865
+ <data destination_type="f32" />
866
+ <rt_info>
867
+ <attribute name="decompression" version="0" />
868
+ </rt_info>
869
+ <input>
870
+ <port id="0" precision="FP16">
871
+ <dim>1</dim>
872
+ <dim>60</dim>
873
+ <dim>1</dim>
874
+ </port>
875
+ </input>
876
+ <output>
877
+ <port id="1" precision="FP32">
878
+ <dim>1</dim>
879
+ <dim>60</dim>
880
+ <dim>1</dim>
881
+ </port>
882
+ </output>
883
+ </layer>
884
+ <layer id="54" name="/sincnet/conv1d.2/Conv" type="Add" version="opset1">
885
+ <data auto_broadcast="numpy" />
886
+ <input>
887
+ <port id="0" precision="FP32">
888
+ <dim>1</dim>
889
+ <dim>60</dim>
890
+ <dim>1769</dim>
891
+ </port>
892
+ <port id="1" precision="FP32">
893
+ <dim>1</dim>
894
+ <dim>60</dim>
895
+ <dim>1</dim>
896
+ </port>
897
+ </input>
898
+ <output>
899
+ <port id="2" precision="FP32" names="/sincnet/conv1d.2/Conv_output_0">
900
+ <dim>1</dim>
901
+ <dim>60</dim>
902
+ <dim>1769</dim>
903
+ </port>
904
+ </output>
905
+ </layer>
906
+ <layer id="55" name="/sincnet/pool1d.2/MaxPool" type="MaxPool" version="opset8">
907
+ <data strides="3" dilations="1" pads_begin="0" pads_end="0" kernel="3" rounding_type="floor" auto_pad="explicit" index_element_type="i64" axis="0" />
908
+ <input>
909
+ <port id="0" precision="FP32">
910
+ <dim>1</dim>
911
+ <dim>60</dim>
912
+ <dim>1769</dim>
913
+ </port>
914
+ </input>
915
+ <output>
916
+ <port id="1" precision="FP32" names="/sincnet/pool1d.2/MaxPool_output_0">
917
+ <dim>1</dim>
918
+ <dim>60</dim>
919
+ <dim>589</dim>
920
+ </port>
921
+ <port id="2" precision="I64">
922
+ <dim>1</dim>
923
+ <dim>60</dim>
924
+ <dim>589</dim>
925
+ </port>
926
+ </output>
927
+ </layer>
928
+ <layer id="56" name="Constant_4114" type="Const" version="opset1">
929
+ <data element_type="i64" shape="1" offset="526" size="8" />
930
+ <output>
931
+ <port id="0" precision="I64">
932
+ <dim>1</dim>
933
+ </port>
934
+ </output>
935
+ </layer>
936
+ <layer id="57" name="MVN_4115" type="MVN" version="opset6">
937
+ <data eps="9.9999997473787516e-06" normalize_variance="true" eps_mode="INSIDE_SQRT" />
938
+ <input>
939
+ <port id="0" precision="FP32">
940
+ <dim>1</dim>
941
+ <dim>60</dim>
942
+ <dim>589</dim>
943
+ </port>
944
+ <port id="1" precision="I64">
945
+ <dim>1</dim>
946
+ </port>
947
+ </input>
948
+ <output>
949
+ <port id="2" precision="FP32">
950
+ <dim>1</dim>
951
+ <dim>60</dim>
952
+ <dim>589</dim>
953
+ </port>
954
+ </output>
955
+ </layer>
956
+ <layer id="58" name="Reshape_4126_compressed" type="Const" version="opset1">
957
+ <data element_type="f16" shape="1, 60, 1" offset="125500" size="120" />
958
+ <output>
959
+ <port id="0" precision="FP16">
960
+ <dim>1</dim>
961
+ <dim>60</dim>
962
+ <dim>1</dim>
963
+ </port>
964
+ </output>
965
+ </layer>
966
+ <layer id="59" name="Reshape_4126" type="Convert" version="opset1">
967
+ <data destination_type="f32" />
968
+ <rt_info>
969
+ <attribute name="decompression" version="0" />
970
+ </rt_info>
971
+ <input>
972
+ <port id="0" precision="FP16">
973
+ <dim>1</dim>
974
+ <dim>60</dim>
975
+ <dim>1</dim>
976
+ </port>
977
+ </input>
978
+ <output>
979
+ <port id="1" precision="FP32">
980
+ <dim>1</dim>
981
+ <dim>60</dim>
982
+ <dim>1</dim>
983
+ </port>
984
+ </output>
985
+ </layer>
986
+ <layer id="60" name="Multiply_4127" type="Multiply" version="opset1">
987
+ <data auto_broadcast="numpy" />
988
+ <input>
989
+ <port id="0" precision="FP32">
990
+ <dim>1</dim>
991
+ <dim>60</dim>
992
+ <dim>589</dim>
993
+ </port>
994
+ <port id="1" precision="FP32">
995
+ <dim>1</dim>
996
+ <dim>60</dim>
997
+ <dim>1</dim>
998
+ </port>
999
+ </input>
1000
+ <output>
1001
+ <port id="2" precision="FP32">
1002
+ <dim>1</dim>
1003
+ <dim>60</dim>
1004
+ <dim>589</dim>
1005
+ </port>
1006
+ </output>
1007
+ </layer>
1008
+ <layer id="61" name="Reshape_4136_compressed" type="Const" version="opset1">
1009
+ <data element_type="f16" shape="1, 60, 1" offset="125620" size="120" />
1010
+ <output>
1011
+ <port id="0" precision="FP16">
1012
+ <dim>1</dim>
1013
+ <dim>60</dim>
1014
+ <dim>1</dim>
1015
+ </port>
1016
+ </output>
1017
+ </layer>
1018
+ <layer id="62" name="Reshape_4136" type="Convert" version="opset1">
1019
+ <data destination_type="f32" />
1020
+ <rt_info>
1021
+ <attribute name="decompression" version="0" />
1022
+ </rt_info>
1023
+ <input>
1024
+ <port id="0" precision="FP16">
1025
+ <dim>1</dim>
1026
+ <dim>60</dim>
1027
+ <dim>1</dim>
1028
+ </port>
1029
+ </input>
1030
+ <output>
1031
+ <port id="1" precision="FP32">
1032
+ <dim>1</dim>
1033
+ <dim>60</dim>
1034
+ <dim>1</dim>
1035
+ </port>
1036
+ </output>
1037
+ </layer>
1038
+ <layer id="63" name="/sincnet/norm1d.2/InstanceNormalization" type="Add" version="opset1">
1039
+ <data auto_broadcast="numpy" />
1040
+ <input>
1041
+ <port id="0" precision="FP32">
1042
+ <dim>1</dim>
1043
+ <dim>60</dim>
1044
+ <dim>589</dim>
1045
+ </port>
1046
+ <port id="1" precision="FP32">
1047
+ <dim>1</dim>
1048
+ <dim>60</dim>
1049
+ <dim>1</dim>
1050
+ </port>
1051
+ </input>
1052
+ <output>
1053
+ <port id="2" precision="FP32" names="/sincnet/norm1d.2/InstanceNormalization_output_0">
1054
+ <dim>1</dim>
1055
+ <dim>60</dim>
1056
+ <dim>589</dim>
1057
+ </port>
1058
+ </output>
1059
+ </layer>
1060
+ <layer id="64" name="Constant_4138_compressed" type="Const" version="opset1">
1061
+ <data element_type="f16" shape="1" offset="41018" size="2" />
1062
+ <output>
1063
+ <port id="0" precision="FP16">
1064
+ <dim>1</dim>
1065
+ </port>
1066
+ </output>
1067
+ </layer>
1068
+ <layer id="65" name="Constant_4138" type="Convert" version="opset1">
1069
+ <data destination_type="f32" />
1070
+ <rt_info>
1071
+ <attribute name="decompression" version="0" />
1072
+ </rt_info>
1073
+ <input>
1074
+ <port id="0" precision="FP16">
1075
+ <dim>1</dim>
1076
+ </port>
1077
+ </input>
1078
+ <output>
1079
+ <port id="1" precision="FP32">
1080
+ <dim>1</dim>
1081
+ </port>
1082
+ </output>
1083
+ </layer>
1084
+ <layer id="66" name="/sincnet/LeakyRelu_2" type="PReLU" version="opset1">
1085
+ <input>
1086
+ <port id="0" precision="FP32">
1087
+ <dim>1</dim>
1088
+ <dim>60</dim>
1089
+ <dim>589</dim>
1090
+ </port>
1091
+ <port id="1" precision="FP32">
1092
+ <dim>1</dim>
1093
+ </port>
1094
+ </input>
1095
+ <output>
1096
+ <port id="2" precision="FP32" names="/sincnet/LeakyRelu_2_output_0">
1097
+ <dim>1</dim>
1098
+ <dim>60</dim>
1099
+ <dim>589</dim>
1100
+ </port>
1101
+ </output>
1102
+ </layer>
1103
+ <layer id="67" name="Constant_4140" type="Const" version="opset1">
1104
+ <data element_type="i64" shape="3" offset="125740" size="24" />
1105
+ <output>
1106
+ <port id="0" precision="I64">
1107
+ <dim>3</dim>
1108
+ </port>
1109
+ </output>
1110
+ </layer>
1111
+ <layer id="68" name="/lstm/Transpose" type="Transpose" version="opset1">
1112
+ <input>
1113
+ <port id="0" precision="FP32">
1114
+ <dim>1</dim>
1115
+ <dim>60</dim>
1116
+ <dim>589</dim>
1117
+ </port>
1118
+ <port id="1" precision="I64">
1119
+ <dim>3</dim>
1120
+ </port>
1121
+ </input>
1122
+ <output>
1123
+ <port id="2" precision="FP32" names="/lstm/Transpose_output_0">
1124
+ <dim>589</dim>
1125
+ <dim>1</dim>
1126
+ <dim>60</dim>
1127
+ </port>
1128
+ </output>
1129
+ </layer>
1130
+ <layer id="69" name="Constant_4169" type="Const" version="opset1">
1131
+ <data element_type="i64" shape="3" offset="125764" size="24" />
1132
+ <output>
1133
+ <port id="0" precision="I64">
1134
+ <dim>3</dim>
1135
+ </port>
1136
+ </output>
1137
+ </layer>
1138
+ <layer id="70" name="Transpose_4170" type="Transpose" version="opset1">
1139
+ <input>
1140
+ <port id="0" precision="FP32">
1141
+ <dim>589</dim>
1142
+ <dim>1</dim>
1143
+ <dim>60</dim>
1144
+ </port>
1145
+ <port id="1" precision="I64">
1146
+ <dim>3</dim>
1147
+ </port>
1148
+ </input>
1149
+ <output>
1150
+ <port id="2" precision="FP32">
1151
+ <dim>1</dim>
1152
+ <dim>589</dim>
1153
+ <dim>60</dim>
1154
+ </port>
1155
+ </output>
1156
+ </layer>
1157
+ <layer id="71" name="/lstm/Constant_compressed" type="Const" version="opset1">
1158
+ <data element_type="f16" shape="2, 1, 128" offset="125788" size="512" />
1159
+ <output>
1160
+ <port id="0" precision="FP16" names="/lstm/Constant_output_0">
1161
+ <dim>2</dim>
1162
+ <dim>1</dim>
1163
+ <dim>128</dim>
1164
+ </port>
1165
+ </output>
1166
+ </layer>
1167
+ <layer id="72" name="/lstm/Constant" type="Convert" version="opset1">
1168
+ <data destination_type="f32" />
1169
+ <rt_info>
1170
+ <attribute name="decompression" version="0" />
1171
+ </rt_info>
1172
+ <input>
1173
+ <port id="0" precision="FP16">
1174
+ <dim>2</dim>
1175
+ <dim>1</dim>
1176
+ <dim>128</dim>
1177
+ </port>
1178
+ </input>
1179
+ <output>
1180
+ <port id="1" precision="FP32">
1181
+ <dim>2</dim>
1182
+ <dim>1</dim>
1183
+ <dim>128</dim>
1184
+ </port>
1185
+ </output>
1186
+ </layer>
1187
+ <layer id="73" name="Constant_141" type="Const" version="opset1">
1188
+ <data element_type="i64" shape="1" offset="526" size="8" />
1189
+ <rt_info>
1190
+ <attribute name="precise" version="0" />
1191
+ </rt_info>
1192
+ <output>
1193
+ <port id="0" precision="I64" names="onnx::Concat_819">
1194
+ <dim>1</dim>
1195
+ </port>
1196
+ </output>
1197
+ </layer>
1198
+ <layer id="74" name="ShapeOf_7195" type="ShapeOf" version="opset3">
1199
+ <data output_type="i64" />
1200
+ <input>
1201
+ <port id="0" precision="FP32">
1202
+ <dim>589</dim>
1203
+ <dim>1</dim>
1204
+ <dim>60</dim>
1205
+ </port>
1206
+ </input>
1207
+ <output>
1208
+ <port id="1" precision="I64">
1209
+ <dim>3</dim>
1210
+ </port>
1211
+ </output>
1212
+ </layer>
1213
+ <layer id="75" name="Constant_7196" type="Const" version="opset1">
1214
+ <data element_type="i64" shape="1" offset="126300" size="8" />
1215
+ <rt_info>
1216
+ <attribute name="precise" version="0" />
1217
+ </rt_info>
1218
+ <output>
1219
+ <port id="0" precision="I64">
1220
+ <dim>1</dim>
1221
+ </port>
1222
+ </output>
1223
+ </layer>
1224
+ <layer id="76" name="Constant_7197" type="Const" version="opset1">
1225
+ <data element_type="i64" shape="" offset="126308" size="8" />
1226
+ <rt_info>
1227
+ <attribute name="precise" version="0" />
1228
+ </rt_info>
1229
+ <output>
1230
+ <port id="0" precision="I64" />
1231
+ </output>
1232
+ </layer>
1233
+ <layer id="77" name="Gather_7198" type="Gather" version="opset8">
1234
+ <data batch_dims="0" />
1235
+ <input>
1236
+ <port id="0" precision="I64">
1237
+ <dim>3</dim>
1238
+ </port>
1239
+ <port id="1" precision="I64">
1240
+ <dim>1</dim>
1241
+ </port>
1242
+ <port id="2" precision="I64" />
1243
+ </input>
1244
+ <output>
1245
+ <port id="3" precision="I64" names="/lstm/Gather_1_output_0,/lstm/Gather_2_output_0,/lstm/Gather_3_output_0,/lstm/Gather_4_output_0,/lstm/Gather_5_output_0,/lstm/Gather_6_output_0,/lstm/Gather_7_output_0,/lstm/Gather_output_0,onnx::Concat_263,onnx::Concat_274,onnx::Concat_420,onnx::Concat_431,onnx::Concat_577,onnx::Concat_588,onnx::Concat_734,onnx::Concat_745">
1246
+ <dim>1</dim>
1247
+ </port>
1248
+ </output>
1249
+ </layer>
1250
+ <layer id="78" name="/lstm/Constant_3" type="Const" version="opset1">
1251
+ <data element_type="i64" shape="1" offset="126316" size="8" />
1252
+ <rt_info>
1253
+ <attribute name="precise" version="0" />
1254
+ </rt_info>
1255
+ <output>
1256
+ <port id="0" precision="I64" names="/lstm/Constant_3_output_0">
1257
+ <dim>1</dim>
1258
+ </port>
1259
+ </output>
1260
+ </layer>
1261
+ <layer id="79" name="/lstm/Concat" type="Concat" version="opset1">
1262
+ <data axis="0" />
1263
+ <input>
1264
+ <port id="0" precision="I64">
1265
+ <dim>1</dim>
1266
+ </port>
1267
+ <port id="1" precision="I64">
1268
+ <dim>1</dim>
1269
+ </port>
1270
+ <port id="2" precision="I64">
1271
+ <dim>1</dim>
1272
+ </port>
1273
+ </input>
1274
+ <output>
1275
+ <port id="3" precision="I64" names="/lstm/Concat_1_output_0,/lstm/Concat_2_output_0,/lstm/Concat_3_output_0,/lstm/Concat_4_output_0,/lstm/Concat_5_output_0,/lstm/Concat_6_output_0,/lstm/Concat_7_output_0,/lstm/Concat_output_0">
1276
+ <dim>3</dim>
1277
+ </port>
1278
+ </output>
1279
+ </layer>
1280
+ <layer id="80" name="/lstm/Expand" type="Broadcast" version="opset3">
1281
+ <data mode="bidirectional" />
1282
+ <input>
1283
+ <port id="0" precision="FP32">
1284
+ <dim>2</dim>
1285
+ <dim>1</dim>
1286
+ <dim>128</dim>
1287
+ </port>
1288
+ <port id="1" precision="I64">
1289
+ <dim>3</dim>
1290
+ </port>
1291
+ </input>
1292
+ <output>
1293
+ <port id="2" precision="FP32" names="/lstm/Expand_1_output_0,/lstm/Expand_2_output_0,/lstm/Expand_3_output_0,/lstm/Expand_4_output_0,/lstm/Expand_5_output_0,/lstm/Expand_6_output_0,/lstm/Expand_7_output_0,/lstm/Expand_output_0">
1294
+ <dim>2</dim>
1295
+ <dim>1</dim>
1296
+ <dim>128</dim>
1297
+ </port>
1298
+ </output>
1299
+ </layer>
1300
+ <layer id="81" name="Constant_4197" type="Const" version="opset1">
1301
+ <data element_type="i64" shape="3" offset="125764" size="24" />
1302
+ <output>
1303
+ <port id="0" precision="I64">
1304
+ <dim>3</dim>
1305
+ </port>
1306
+ </output>
1307
+ </layer>
1308
+ <layer id="82" name="Transpose_4198" type="Transpose" version="opset1">
1309
+ <input>
1310
+ <port id="0" precision="FP32">
1311
+ <dim>2</dim>
1312
+ <dim>1</dim>
1313
+ <dim>128</dim>
1314
+ </port>
1315
+ <port id="1" precision="I64">
1316
+ <dim>3</dim>
1317
+ </port>
1318
+ </input>
1319
+ <output>
1320
+ <port id="2" precision="FP32">
1321
+ <dim>1</dim>
1322
+ <dim>2</dim>
1323
+ <dim>128</dim>
1324
+ </port>
1325
+ </output>
1326
+ </layer>
1327
+ <layer id="83" name="ShapeOf_4177" type="ShapeOf" version="opset3">
1328
+ <data output_type="i64" />
1329
+ <input>
1330
+ <port id="0" precision="FP32">
1331
+ <dim>1</dim>
1332
+ <dim>589</dim>
1333
+ <dim>60</dim>
1334
+ </port>
1335
+ </input>
1336
+ <output>
1337
+ <port id="1" precision="I64">
1338
+ <dim>3</dim>
1339
+ </port>
1340
+ </output>
1341
+ </layer>
1342
+ <layer id="84" name="Constant_4181" type="Const" version="opset1">
1343
+ <data element_type="i32" shape="1" offset="126324" size="4" />
1344
+ <output>
1345
+ <port id="0" precision="I32">
1346
+ <dim>1</dim>
1347
+ </port>
1348
+ </output>
1349
+ </layer>
1350
+ <layer id="85" name="Constant_4178" type="Const" version="opset1">
1351
+ <data element_type="i32" shape="1" offset="126328" size="4" />
1352
+ <output>
1353
+ <port id="0" precision="I32">
1354
+ <dim>1</dim>
1355
+ </port>
1356
+ </output>
1357
+ </layer>
1358
+ <layer id="86" name="Gather_4182" type="Gather" version="opset8">
1359
+ <data batch_dims="0" />
1360
+ <input>
1361
+ <port id="0" precision="I64">
1362
+ <dim>3</dim>
1363
+ </port>
1364
+ <port id="1" precision="I32">
1365
+ <dim>1</dim>
1366
+ </port>
1367
+ <port id="2" precision="I32">
1368
+ <dim>1</dim>
1369
+ </port>
1370
+ </input>
1371
+ <output>
1372
+ <port id="3" precision="I64">
1373
+ <dim>1</dim>
1374
+ </port>
1375
+ </output>
1376
+ </layer>
1377
+ <layer id="87" name="Broadcast_4194" type="Broadcast" version="opset3">
1378
+ <data mode="numpy" />
1379
+ <input>
1380
+ <port id="0" precision="I64">
1381
+ <dim>1</dim>
1382
+ </port>
1383
+ <port id="1" precision="I64">
1384
+ <dim>1</dim>
1385
+ </port>
1386
+ </input>
1387
+ <output>
1388
+ <port id="2" precision="I64">
1389
+ <dim>1</dim>
1390
+ </port>
1391
+ </output>
1392
+ </layer>
1393
+ <layer id="88" name="Concat_4173_compressed" type="Const" version="opset1">
1394
+ <data element_type="f16" shape="2, 512, 60" offset="126332" size="122880" />
1395
+ <output>
1396
+ <port id="0" precision="FP16">
1397
+ <dim>2</dim>
1398
+ <dim>512</dim>
1399
+ <dim>60</dim>
1400
+ </port>
1401
+ </output>
1402
+ </layer>
1403
+ <layer id="89" name="Concat_4173" type="Convert" version="opset1">
1404
+ <data destination_type="f32" />
1405
+ <rt_info>
1406
+ <attribute name="decompression" version="0" />
1407
+ </rt_info>
1408
+ <input>
1409
+ <port id="0" precision="FP16">
1410
+ <dim>2</dim>
1411
+ <dim>512</dim>
1412
+ <dim>60</dim>
1413
+ </port>
1414
+ </input>
1415
+ <output>
1416
+ <port id="1" precision="FP32">
1417
+ <dim>2</dim>
1418
+ <dim>512</dim>
1419
+ <dim>60</dim>
1420
+ </port>
1421
+ </output>
1422
+ </layer>
1423
+ <layer id="90" name="Concat_4176_compressed" type="Const" version="opset1">
1424
+ <data element_type="f16" shape="2, 512, 128" offset="249212" size="262144" />
1425
+ <output>
1426
+ <port id="0" precision="FP16">
1427
+ <dim>2</dim>
1428
+ <dim>512</dim>
1429
+ <dim>128</dim>
1430
+ </port>
1431
+ </output>
1432
+ </layer>
1433
+ <layer id="91" name="Concat_4176" type="Convert" version="opset1">
1434
+ <data destination_type="f32" />
1435
+ <rt_info>
1436
+ <attribute name="decompression" version="0" />
1437
+ </rt_info>
1438
+ <input>
1439
+ <port id="0" precision="FP16">
1440
+ <dim>2</dim>
1441
+ <dim>512</dim>
1442
+ <dim>128</dim>
1443
+ </port>
1444
+ </input>
1445
+ <output>
1446
+ <port id="1" precision="FP32">
1447
+ <dim>2</dim>
1448
+ <dim>512</dim>
1449
+ <dim>128</dim>
1450
+ </port>
1451
+ </output>
1452
+ </layer>
1453
+ <layer id="92" name="Concat_4193_compressed" type="Const" version="opset1">
1454
+ <data element_type="f16" shape="2, 512" offset="511356" size="2048" />
1455
+ <output>
1456
+ <port id="0" precision="FP16">
1457
+ <dim>2</dim>
1458
+ <dim>512</dim>
1459
+ </port>
1460
+ </output>
1461
+ </layer>
1462
+ <layer id="93" name="Concat_4193" type="Convert" version="opset1">
1463
+ <data destination_type="f32" />
1464
+ <rt_info>
1465
+ <attribute name="decompression" version="0" />
1466
+ </rt_info>
1467
+ <input>
1468
+ <port id="0" precision="FP16">
1469
+ <dim>2</dim>
1470
+ <dim>512</dim>
1471
+ </port>
1472
+ </input>
1473
+ <output>
1474
+ <port id="1" precision="FP32">
1475
+ <dim>2</dim>
1476
+ <dim>512</dim>
1477
+ </port>
1478
+ </output>
1479
+ </layer>
1480
+ <layer id="94" name="LSTMSequence_4208" type="LSTMSequence" version="opset5">
1481
+ <data direction="bidirectional" hidden_size="128" activations="sigmoid, tanh, tanh" activations_alpha="" activations_beta="" clip="0" />
1482
+ <input>
1483
+ <port id="0" precision="FP32">
1484
+ <dim>1</dim>
1485
+ <dim>589</dim>
1486
+ <dim>60</dim>
1487
+ </port>
1488
+ <port id="1" precision="FP32">
1489
+ <dim>1</dim>
1490
+ <dim>2</dim>
1491
+ <dim>128</dim>
1492
+ </port>
1493
+ <port id="2" precision="FP32">
1494
+ <dim>1</dim>
1495
+ <dim>2</dim>
1496
+ <dim>128</dim>
1497
+ </port>
1498
+ <port id="3" precision="I64">
1499
+ <dim>1</dim>
1500
+ </port>
1501
+ <port id="4" precision="FP32">
1502
+ <dim>2</dim>
1503
+ <dim>512</dim>
1504
+ <dim>60</dim>
1505
+ </port>
1506
+ <port id="5" precision="FP32">
1507
+ <dim>2</dim>
1508
+ <dim>512</dim>
1509
+ <dim>128</dim>
1510
+ </port>
1511
+ <port id="6" precision="FP32">
1512
+ <dim>2</dim>
1513
+ <dim>512</dim>
1514
+ </port>
1515
+ </input>
1516
+ <output>
1517
+ <port id="7" precision="FP32">
1518
+ <dim>1</dim>
1519
+ <dim>2</dim>
1520
+ <dim>589</dim>
1521
+ <dim>128</dim>
1522
+ </port>
1523
+ <port id="8" precision="FP32">
1524
+ <dim>1</dim>
1525
+ <dim>2</dim>
1526
+ <dim>128</dim>
1527
+ </port>
1528
+ <port id="9" precision="FP32">
1529
+ <dim>1</dim>
1530
+ <dim>2</dim>
1531
+ <dim>128</dim>
1532
+ </port>
1533
+ </output>
1534
+ </layer>
1535
+ <layer id="95" name="Constant_6666" type="Const" version="opset1">
1536
+ <data element_type="i64" shape="4" offset="513404" size="32" />
1537
+ <output>
1538
+ <port id="0" precision="I64">
1539
+ <dim>4</dim>
1540
+ </port>
1541
+ </output>
1542
+ </layer>
1543
+ <layer id="96" name="/lstm/Transpose_1" type="Transpose" version="opset1">
1544
+ <input>
1545
+ <port id="0" precision="FP32">
1546
+ <dim>1</dim>
1547
+ <dim>2</dim>
1548
+ <dim>589</dim>
1549
+ <dim>128</dim>
1550
+ </port>
1551
+ <port id="1" precision="I64">
1552
+ <dim>4</dim>
1553
+ </port>
1554
+ </input>
1555
+ <output>
1556
+ <port id="2" precision="FP32" names="/lstm/Transpose_1_output_0">
1557
+ <dim>589</dim>
1558
+ <dim>1</dim>
1559
+ <dim>2</dim>
1560
+ <dim>128</dim>
1561
+ </port>
1562
+ </output>
1563
+ </layer>
1564
+ <layer id="97" name="/lstm/Constant_6" type="Const" version="opset1">
1565
+ <data element_type="i64" shape="3" offset="513436" size="24" />
1566
+ <rt_info>
1567
+ <attribute name="precise" version="0" />
1568
+ </rt_info>
1569
+ <output>
1570
+ <port id="0" precision="I64" names="/lstm/Constant_6_output_0">
1571
+ <dim>3</dim>
1572
+ </port>
1573
+ </output>
1574
+ </layer>
1575
+ <layer id="98" name="/lstm/Reshape" type="Reshape" version="opset1">
1576
+ <data special_zero="true" />
1577
+ <input>
1578
+ <port id="0" precision="FP32">
1579
+ <dim>589</dim>
1580
+ <dim>1</dim>
1581
+ <dim>2</dim>
1582
+ <dim>128</dim>
1583
+ </port>
1584
+ <port id="1" precision="I64">
1585
+ <dim>3</dim>
1586
+ </port>
1587
+ </input>
1588
+ <output>
1589
+ <port id="2" precision="FP32" names="/lstm/Reshape_output_0">
1590
+ <dim>589</dim>
1591
+ <dim>1</dim>
1592
+ <dim>256</dim>
1593
+ </port>
1594
+ </output>
1595
+ </layer>
1596
+ <layer id="99" name="Constant_4246" type="Const" version="opset1">
1597
+ <data element_type="i64" shape="3" offset="125764" size="24" />
1598
+ <output>
1599
+ <port id="0" precision="I64">
1600
+ <dim>3</dim>
1601
+ </port>
1602
+ </output>
1603
+ </layer>
1604
+ <layer id="100" name="Transpose_4247" type="Transpose" version="opset1">
1605
+ <input>
1606
+ <port id="0" precision="FP32">
1607
+ <dim>589</dim>
1608
+ <dim>1</dim>
1609
+ <dim>256</dim>
1610
+ </port>
1611
+ <port id="1" precision="I64">
1612
+ <dim>3</dim>
1613
+ </port>
1614
+ </input>
1615
+ <output>
1616
+ <port id="2" precision="FP32">
1617
+ <dim>1</dim>
1618
+ <dim>589</dim>
1619
+ <dim>256</dim>
1620
+ </port>
1621
+ </output>
1622
+ </layer>
1623
+ <layer id="101" name="ShapeOf_4254" type="ShapeOf" version="opset3">
1624
+ <data output_type="i64" />
1625
+ <input>
1626
+ <port id="0" precision="FP32">
1627
+ <dim>1</dim>
1628
+ <dim>589</dim>
1629
+ <dim>256</dim>
1630
+ </port>
1631
+ </input>
1632
+ <output>
1633
+ <port id="1" precision="I64">
1634
+ <dim>3</dim>
1635
+ </port>
1636
+ </output>
1637
+ </layer>
1638
+ <layer id="102" name="Constant_4258" type="Const" version="opset1">
1639
+ <data element_type="i32" shape="1" offset="126324" size="4" />
1640
+ <output>
1641
+ <port id="0" precision="I32">
1642
+ <dim>1</dim>
1643
+ </port>
1644
+ </output>
1645
+ </layer>
1646
+ <layer id="103" name="Constant_4255" type="Const" version="opset1">
1647
+ <data element_type="i32" shape="1" offset="126328" size="4" />
1648
+ <output>
1649
+ <port id="0" precision="I32">
1650
+ <dim>1</dim>
1651
+ </port>
1652
+ </output>
1653
+ </layer>
1654
+ <layer id="104" name="Gather_4259" type="Gather" version="opset8">
1655
+ <data batch_dims="0" />
1656
+ <input>
1657
+ <port id="0" precision="I64">
1658
+ <dim>3</dim>
1659
+ </port>
1660
+ <port id="1" precision="I32">
1661
+ <dim>1</dim>
1662
+ </port>
1663
+ <port id="2" precision="I32">
1664
+ <dim>1</dim>
1665
+ </port>
1666
+ </input>
1667
+ <output>
1668
+ <port id="3" precision="I64">
1669
+ <dim>1</dim>
1670
+ </port>
1671
+ </output>
1672
+ </layer>
1673
+ <layer id="105" name="Broadcast_4271" type="Broadcast" version="opset3">
1674
+ <data mode="numpy" />
1675
+ <input>
1676
+ <port id="0" precision="I64">
1677
+ <dim>1</dim>
1678
+ </port>
1679
+ <port id="1" precision="I64">
1680
+ <dim>1</dim>
1681
+ </port>
1682
+ </input>
1683
+ <output>
1684
+ <port id="2" precision="I64">
1685
+ <dim>1</dim>
1686
+ </port>
1687
+ </output>
1688
+ </layer>
1689
+ <layer id="106" name="Concat_4250_compressed" type="Const" version="opset1">
1690
+ <data element_type="f16" shape="2, 512, 256" offset="513460" size="524288" />
1691
+ <output>
1692
+ <port id="0" precision="FP16">
1693
+ <dim>2</dim>
1694
+ <dim>512</dim>
1695
+ <dim>256</dim>
1696
+ </port>
1697
+ </output>
1698
+ </layer>
1699
+ <layer id="107" name="Concat_4250" type="Convert" version="opset1">
1700
+ <data destination_type="f32" />
1701
+ <rt_info>
1702
+ <attribute name="decompression" version="0" />
1703
+ </rt_info>
1704
+ <input>
1705
+ <port id="0" precision="FP16">
1706
+ <dim>2</dim>
1707
+ <dim>512</dim>
1708
+ <dim>256</dim>
1709
+ </port>
1710
+ </input>
1711
+ <output>
1712
+ <port id="1" precision="FP32">
1713
+ <dim>2</dim>
1714
+ <dim>512</dim>
1715
+ <dim>256</dim>
1716
+ </port>
1717
+ </output>
1718
+ </layer>
1719
+ <layer id="108" name="Concat_4253_compressed" type="Const" version="opset1">
1720
+ <data element_type="f16" shape="2, 512, 128" offset="1037748" size="262144" />
1721
+ <output>
1722
+ <port id="0" precision="FP16">
1723
+ <dim>2</dim>
1724
+ <dim>512</dim>
1725
+ <dim>128</dim>
1726
+ </port>
1727
+ </output>
1728
+ </layer>
1729
+ <layer id="109" name="Concat_4253" type="Convert" version="opset1">
1730
+ <data destination_type="f32" />
1731
+ <rt_info>
1732
+ <attribute name="decompression" version="0" />
1733
+ </rt_info>
1734
+ <input>
1735
+ <port id="0" precision="FP16">
1736
+ <dim>2</dim>
1737
+ <dim>512</dim>
1738
+ <dim>128</dim>
1739
+ </port>
1740
+ </input>
1741
+ <output>
1742
+ <port id="1" precision="FP32">
1743
+ <dim>2</dim>
1744
+ <dim>512</dim>
1745
+ <dim>128</dim>
1746
+ </port>
1747
+ </output>
1748
+ </layer>
1749
+ <layer id="110" name="Concat_4270_compressed" type="Const" version="opset1">
1750
+ <data element_type="f16" shape="2, 512" offset="1299892" size="2048" />
1751
+ <output>
1752
+ <port id="0" precision="FP16">
1753
+ <dim>2</dim>
1754
+ <dim>512</dim>
1755
+ </port>
1756
+ </output>
1757
+ </layer>
1758
+ <layer id="111" name="Concat_4270" type="Convert" version="opset1">
1759
+ <data destination_type="f32" />
1760
+ <rt_info>
1761
+ <attribute name="decompression" version="0" />
1762
+ </rt_info>
1763
+ <input>
1764
+ <port id="0" precision="FP16">
1765
+ <dim>2</dim>
1766
+ <dim>512</dim>
1767
+ </port>
1768
+ </input>
1769
+ <output>
1770
+ <port id="1" precision="FP32">
1771
+ <dim>2</dim>
1772
+ <dim>512</dim>
1773
+ </port>
1774
+ </output>
1775
+ </layer>
1776
+ <layer id="112" name="LSTMSequence_4285" type="LSTMSequence" version="opset5">
1777
+ <data direction="bidirectional" hidden_size="128" activations="sigmoid, tanh, tanh" activations_alpha="" activations_beta="" clip="0" />
1778
+ <input>
1779
+ <port id="0" precision="FP32">
1780
+ <dim>1</dim>
1781
+ <dim>589</dim>
1782
+ <dim>256</dim>
1783
+ </port>
1784
+ <port id="1" precision="FP32">
1785
+ <dim>1</dim>
1786
+ <dim>2</dim>
1787
+ <dim>128</dim>
1788
+ </port>
1789
+ <port id="2" precision="FP32">
1790
+ <dim>1</dim>
1791
+ <dim>2</dim>
1792
+ <dim>128</dim>
1793
+ </port>
1794
+ <port id="3" precision="I64">
1795
+ <dim>1</dim>
1796
+ </port>
1797
+ <port id="4" precision="FP32">
1798
+ <dim>2</dim>
1799
+ <dim>512</dim>
1800
+ <dim>256</dim>
1801
+ </port>
1802
+ <port id="5" precision="FP32">
1803
+ <dim>2</dim>
1804
+ <dim>512</dim>
1805
+ <dim>128</dim>
1806
+ </port>
1807
+ <port id="6" precision="FP32">
1808
+ <dim>2</dim>
1809
+ <dim>512</dim>
1810
+ </port>
1811
+ </input>
1812
+ <output>
1813
+ <port id="7" precision="FP32">
1814
+ <dim>1</dim>
1815
+ <dim>2</dim>
1816
+ <dim>589</dim>
1817
+ <dim>128</dim>
1818
+ </port>
1819
+ <port id="8" precision="FP32">
1820
+ <dim>1</dim>
1821
+ <dim>2</dim>
1822
+ <dim>128</dim>
1823
+ </port>
1824
+ <port id="9" precision="FP32">
1825
+ <dim>1</dim>
1826
+ <dim>2</dim>
1827
+ <dim>128</dim>
1828
+ </port>
1829
+ </output>
1830
+ </layer>
1831
+ <layer id="113" name="Constant_6668" type="Const" version="opset1">
1832
+ <data element_type="i64" shape="4" offset="513404" size="32" />
1833
+ <output>
1834
+ <port id="0" precision="I64">
1835
+ <dim>4</dim>
1836
+ </port>
1837
+ </output>
1838
+ </layer>
1839
+ <layer id="114" name="/lstm/Transpose_2" type="Transpose" version="opset1">
1840
+ <input>
1841
+ <port id="0" precision="FP32">
1842
+ <dim>1</dim>
1843
+ <dim>2</dim>
1844
+ <dim>589</dim>
1845
+ <dim>128</dim>
1846
+ </port>
1847
+ <port id="1" precision="I64">
1848
+ <dim>4</dim>
1849
+ </port>
1850
+ </input>
1851
+ <output>
1852
+ <port id="2" precision="FP32" names="/lstm/Transpose_2_output_0">
1853
+ <dim>589</dim>
1854
+ <dim>1</dim>
1855
+ <dim>2</dim>
1856
+ <dim>128</dim>
1857
+ </port>
1858
+ </output>
1859
+ </layer>
1860
+ <layer id="115" name="/lstm/Constant_13" type="Const" version="opset1">
1861
+ <data element_type="i64" shape="3" offset="513436" size="24" />
1862
+ <rt_info>
1863
+ <attribute name="precise" version="0" />
1864
+ </rt_info>
1865
+ <output>
1866
+ <port id="0" precision="I64" names="/lstm/Constant_13_output_0">
1867
+ <dim>3</dim>
1868
+ </port>
1869
+ </output>
1870
+ </layer>
1871
+ <layer id="116" name="/lstm/Reshape_1" type="Reshape" version="opset1">
1872
+ <data special_zero="true" />
1873
+ <input>
1874
+ <port id="0" precision="FP32">
1875
+ <dim>589</dim>
1876
+ <dim>1</dim>
1877
+ <dim>2</dim>
1878
+ <dim>128</dim>
1879
+ </port>
1880
+ <port id="1" precision="I64">
1881
+ <dim>3</dim>
1882
+ </port>
1883
+ </input>
1884
+ <output>
1885
+ <port id="2" precision="FP32" names="/lstm/Reshape_1_output_0">
1886
+ <dim>589</dim>
1887
+ <dim>1</dim>
1888
+ <dim>256</dim>
1889
+ </port>
1890
+ </output>
1891
+ </layer>
1892
+ <layer id="117" name="Constant_4323" type="Const" version="opset1">
1893
+ <data element_type="i64" shape="3" offset="125764" size="24" />
1894
+ <output>
1895
+ <port id="0" precision="I64">
1896
+ <dim>3</dim>
1897
+ </port>
1898
+ </output>
1899
+ </layer>
1900
+ <layer id="118" name="Transpose_4324" type="Transpose" version="opset1">
1901
+ <input>
1902
+ <port id="0" precision="FP32">
1903
+ <dim>589</dim>
1904
+ <dim>1</dim>
1905
+ <dim>256</dim>
1906
+ </port>
1907
+ <port id="1" precision="I64">
1908
+ <dim>3</dim>
1909
+ </port>
1910
+ </input>
1911
+ <output>
1912
+ <port id="2" precision="FP32">
1913
+ <dim>1</dim>
1914
+ <dim>589</dim>
1915
+ <dim>256</dim>
1916
+ </port>
1917
+ </output>
1918
+ </layer>
1919
+ <layer id="119" name="Concat_4327_compressed" type="Const" version="opset1">
1920
+ <data element_type="f16" shape="2, 512, 256" offset="1301940" size="524288" />
1921
+ <output>
1922
+ <port id="0" precision="FP16">
1923
+ <dim>2</dim>
1924
+ <dim>512</dim>
1925
+ <dim>256</dim>
1926
+ </port>
1927
+ </output>
1928
+ </layer>
1929
+ <layer id="120" name="Concat_4327" type="Convert" version="opset1">
1930
+ <data destination_type="f32" />
1931
+ <rt_info>
1932
+ <attribute name="decompression" version="0" />
1933
+ </rt_info>
1934
+ <input>
1935
+ <port id="0" precision="FP16">
1936
+ <dim>2</dim>
1937
+ <dim>512</dim>
1938
+ <dim>256</dim>
1939
+ </port>
1940
+ </input>
1941
+ <output>
1942
+ <port id="1" precision="FP32">
1943
+ <dim>2</dim>
1944
+ <dim>512</dim>
1945
+ <dim>256</dim>
1946
+ </port>
1947
+ </output>
1948
+ </layer>
1949
+ <layer id="121" name="Concat_4330_compressed" type="Const" version="opset1">
1950
+ <data element_type="f16" shape="2, 512, 128" offset="1826228" size="262144" />
1951
+ <output>
1952
+ <port id="0" precision="FP16">
1953
+ <dim>2</dim>
1954
+ <dim>512</dim>
1955
+ <dim>128</dim>
1956
+ </port>
1957
+ </output>
1958
+ </layer>
1959
+ <layer id="122" name="Concat_4330" type="Convert" version="opset1">
1960
+ <data destination_type="f32" />
1961
+ <rt_info>
1962
+ <attribute name="decompression" version="0" />
1963
+ </rt_info>
1964
+ <input>
1965
+ <port id="0" precision="FP16">
1966
+ <dim>2</dim>
1967
+ <dim>512</dim>
1968
+ <dim>128</dim>
1969
+ </port>
1970
+ </input>
1971
+ <output>
1972
+ <port id="1" precision="FP32">
1973
+ <dim>2</dim>
1974
+ <dim>512</dim>
1975
+ <dim>128</dim>
1976
+ </port>
1977
+ </output>
1978
+ </layer>
1979
+ <layer id="123" name="Concat_4347_compressed" type="Const" version="opset1">
1980
+ <data element_type="f16" shape="2, 512" offset="2088372" size="2048" />
1981
+ <output>
1982
+ <port id="0" precision="FP16">
1983
+ <dim>2</dim>
1984
+ <dim>512</dim>
1985
+ </port>
1986
+ </output>
1987
+ </layer>
1988
+ <layer id="124" name="Concat_4347" type="Convert" version="opset1">
1989
+ <data destination_type="f32" />
1990
+ <rt_info>
1991
+ <attribute name="decompression" version="0" />
1992
+ </rt_info>
1993
+ <input>
1994
+ <port id="0" precision="FP16">
1995
+ <dim>2</dim>
1996
+ <dim>512</dim>
1997
+ </port>
1998
+ </input>
1999
+ <output>
2000
+ <port id="1" precision="FP32">
2001
+ <dim>2</dim>
2002
+ <dim>512</dim>
2003
+ </port>
2004
+ </output>
2005
+ </layer>
2006
+ <layer id="125" name="LSTMSequence_4362" type="LSTMSequence" version="opset5">
2007
+ <data direction="bidirectional" hidden_size="128" activations="sigmoid, tanh, tanh" activations_alpha="" activations_beta="" clip="0" />
2008
+ <input>
2009
+ <port id="0" precision="FP32">
2010
+ <dim>1</dim>
2011
+ <dim>589</dim>
2012
+ <dim>256</dim>
2013
+ </port>
2014
+ <port id="1" precision="FP32">
2015
+ <dim>1</dim>
2016
+ <dim>2</dim>
2017
+ <dim>128</dim>
2018
+ </port>
2019
+ <port id="2" precision="FP32">
2020
+ <dim>1</dim>
2021
+ <dim>2</dim>
2022
+ <dim>128</dim>
2023
+ </port>
2024
+ <port id="3" precision="I64">
2025
+ <dim>1</dim>
2026
+ </port>
2027
+ <port id="4" precision="FP32">
2028
+ <dim>2</dim>
2029
+ <dim>512</dim>
2030
+ <dim>256</dim>
2031
+ </port>
2032
+ <port id="5" precision="FP32">
2033
+ <dim>2</dim>
2034
+ <dim>512</dim>
2035
+ <dim>128</dim>
2036
+ </port>
2037
+ <port id="6" precision="FP32">
2038
+ <dim>2</dim>
2039
+ <dim>512</dim>
2040
+ </port>
2041
+ </input>
2042
+ <output>
2043
+ <port id="7" precision="FP32">
2044
+ <dim>1</dim>
2045
+ <dim>2</dim>
2046
+ <dim>589</dim>
2047
+ <dim>128</dim>
2048
+ </port>
2049
+ <port id="8" precision="FP32">
2050
+ <dim>1</dim>
2051
+ <dim>2</dim>
2052
+ <dim>128</dim>
2053
+ </port>
2054
+ <port id="9" precision="FP32">
2055
+ <dim>1</dim>
2056
+ <dim>2</dim>
2057
+ <dim>128</dim>
2058
+ </port>
2059
+ </output>
2060
+ </layer>
2061
+ <layer id="126" name="Constant_6670" type="Const" version="opset1">
2062
+ <data element_type="i64" shape="4" offset="513404" size="32" />
2063
+ <output>
2064
+ <port id="0" precision="I64">
2065
+ <dim>4</dim>
2066
+ </port>
2067
+ </output>
2068
+ </layer>
2069
+ <layer id="127" name="/lstm/Transpose_3" type="Transpose" version="opset1">
2070
+ <input>
2071
+ <port id="0" precision="FP32">
2072
+ <dim>1</dim>
2073
+ <dim>2</dim>
2074
+ <dim>589</dim>
2075
+ <dim>128</dim>
2076
+ </port>
2077
+ <port id="1" precision="I64">
2078
+ <dim>4</dim>
2079
+ </port>
2080
+ </input>
2081
+ <output>
2082
+ <port id="2" precision="FP32" names="/lstm/Transpose_3_output_0">
2083
+ <dim>589</dim>
2084
+ <dim>1</dim>
2085
+ <dim>2</dim>
2086
+ <dim>128</dim>
2087
+ </port>
2088
+ </output>
2089
+ </layer>
2090
+ <layer id="128" name="/lstm/Constant_20" type="Const" version="opset1">
2091
+ <data element_type="i64" shape="3" offset="513436" size="24" />
2092
+ <rt_info>
2093
+ <attribute name="precise" version="0" />
2094
+ </rt_info>
2095
+ <output>
2096
+ <port id="0" precision="I64" names="/lstm/Constant_20_output_0">
2097
+ <dim>3</dim>
2098
+ </port>
2099
+ </output>
2100
+ </layer>
2101
+ <layer id="129" name="/lstm/Reshape_2" type="Reshape" version="opset1">
2102
+ <data special_zero="true" />
2103
+ <input>
2104
+ <port id="0" precision="FP32">
2105
+ <dim>589</dim>
2106
+ <dim>1</dim>
2107
+ <dim>2</dim>
2108
+ <dim>128</dim>
2109
+ </port>
2110
+ <port id="1" precision="I64">
2111
+ <dim>3</dim>
2112
+ </port>
2113
+ </input>
2114
+ <output>
2115
+ <port id="2" precision="FP32" names="/lstm/Reshape_2_output_0">
2116
+ <dim>589</dim>
2117
+ <dim>1</dim>
2118
+ <dim>256</dim>
2119
+ </port>
2120
+ </output>
2121
+ </layer>
2122
+ <layer id="130" name="Constant_4400" type="Const" version="opset1">
2123
+ <data element_type="i64" shape="3" offset="125764" size="24" />
2124
+ <output>
2125
+ <port id="0" precision="I64">
2126
+ <dim>3</dim>
2127
+ </port>
2128
+ </output>
2129
+ </layer>
2130
+ <layer id="131" name="Transpose_4401" type="Transpose" version="opset1">
2131
+ <input>
2132
+ <port id="0" precision="FP32">
2133
+ <dim>589</dim>
2134
+ <dim>1</dim>
2135
+ <dim>256</dim>
2136
+ </port>
2137
+ <port id="1" precision="I64">
2138
+ <dim>3</dim>
2139
+ </port>
2140
+ </input>
2141
+ <output>
2142
+ <port id="2" precision="FP32">
2143
+ <dim>1</dim>
2144
+ <dim>589</dim>
2145
+ <dim>256</dim>
2146
+ </port>
2147
+ </output>
2148
+ </layer>
2149
+ <layer id="132" name="Concat_4404_compressed" type="Const" version="opset1">
2150
+ <data element_type="f16" shape="2, 512, 256" offset="2090420" size="524288" />
2151
+ <output>
2152
+ <port id="0" precision="FP16">
2153
+ <dim>2</dim>
2154
+ <dim>512</dim>
2155
+ <dim>256</dim>
2156
+ </port>
2157
+ </output>
2158
+ </layer>
2159
+ <layer id="133" name="Concat_4404" type="Convert" version="opset1">
2160
+ <data destination_type="f32" />
2161
+ <rt_info>
2162
+ <attribute name="decompression" version="0" />
2163
+ </rt_info>
2164
+ <input>
2165
+ <port id="0" precision="FP16">
2166
+ <dim>2</dim>
2167
+ <dim>512</dim>
2168
+ <dim>256</dim>
2169
+ </port>
2170
+ </input>
2171
+ <output>
2172
+ <port id="1" precision="FP32">
2173
+ <dim>2</dim>
2174
+ <dim>512</dim>
2175
+ <dim>256</dim>
2176
+ </port>
2177
+ </output>
2178
+ </layer>
2179
+ <layer id="134" name="Concat_4407_compressed" type="Const" version="opset1">
2180
+ <data element_type="f16" shape="2, 512, 128" offset="2614708" size="262144" />
2181
+ <output>
2182
+ <port id="0" precision="FP16">
2183
+ <dim>2</dim>
2184
+ <dim>512</dim>
2185
+ <dim>128</dim>
2186
+ </port>
2187
+ </output>
2188
+ </layer>
2189
+ <layer id="135" name="Concat_4407" type="Convert" version="opset1">
2190
+ <data destination_type="f32" />
2191
+ <rt_info>
2192
+ <attribute name="decompression" version="0" />
2193
+ </rt_info>
2194
+ <input>
2195
+ <port id="0" precision="FP16">
2196
+ <dim>2</dim>
2197
+ <dim>512</dim>
2198
+ <dim>128</dim>
2199
+ </port>
2200
+ </input>
2201
+ <output>
2202
+ <port id="1" precision="FP32">
2203
+ <dim>2</dim>
2204
+ <dim>512</dim>
2205
+ <dim>128</dim>
2206
+ </port>
2207
+ </output>
2208
+ </layer>
2209
+ <layer id="136" name="Concat_4424_compressed" type="Const" version="opset1">
2210
+ <data element_type="f16" shape="2, 512" offset="2876852" size="2048" />
2211
+ <output>
2212
+ <port id="0" precision="FP16">
2213
+ <dim>2</dim>
2214
+ <dim>512</dim>
2215
+ </port>
2216
+ </output>
2217
+ </layer>
2218
+ <layer id="137" name="Concat_4424" type="Convert" version="opset1">
2219
+ <data destination_type="f32" />
2220
+ <rt_info>
2221
+ <attribute name="decompression" version="0" />
2222
+ </rt_info>
2223
+ <input>
2224
+ <port id="0" precision="FP16">
2225
+ <dim>2</dim>
2226
+ <dim>512</dim>
2227
+ </port>
2228
+ </input>
2229
+ <output>
2230
+ <port id="1" precision="FP32">
2231
+ <dim>2</dim>
2232
+ <dim>512</dim>
2233
+ </port>
2234
+ </output>
2235
+ </layer>
2236
+ <layer id="138" name="LSTMSequence_4439" type="LSTMSequence" version="opset5">
2237
+ <data direction="bidirectional" hidden_size="128" activations="sigmoid, tanh, tanh" activations_alpha="" activations_beta="" clip="0" />
2238
+ <input>
2239
+ <port id="0" precision="FP32">
2240
+ <dim>1</dim>
2241
+ <dim>589</dim>
2242
+ <dim>256</dim>
2243
+ </port>
2244
+ <port id="1" precision="FP32">
2245
+ <dim>1</dim>
2246
+ <dim>2</dim>
2247
+ <dim>128</dim>
2248
+ </port>
2249
+ <port id="2" precision="FP32">
2250
+ <dim>1</dim>
2251
+ <dim>2</dim>
2252
+ <dim>128</dim>
2253
+ </port>
2254
+ <port id="3" precision="I64">
2255
+ <dim>1</dim>
2256
+ </port>
2257
+ <port id="4" precision="FP32">
2258
+ <dim>2</dim>
2259
+ <dim>512</dim>
2260
+ <dim>256</dim>
2261
+ </port>
2262
+ <port id="5" precision="FP32">
2263
+ <dim>2</dim>
2264
+ <dim>512</dim>
2265
+ <dim>128</dim>
2266
+ </port>
2267
+ <port id="6" precision="FP32">
2268
+ <dim>2</dim>
2269
+ <dim>512</dim>
2270
+ </port>
2271
+ </input>
2272
+ <output>
2273
+ <port id="7" precision="FP32">
2274
+ <dim>1</dim>
2275
+ <dim>2</dim>
2276
+ <dim>589</dim>
2277
+ <dim>128</dim>
2278
+ </port>
2279
+ <port id="8" precision="FP32">
2280
+ <dim>1</dim>
2281
+ <dim>2</dim>
2282
+ <dim>128</dim>
2283
+ </port>
2284
+ <port id="9" precision="FP32">
2285
+ <dim>1</dim>
2286
+ <dim>2</dim>
2287
+ <dim>128</dim>
2288
+ </port>
2289
+ </output>
2290
+ </layer>
2291
+ <layer id="139" name="Constant_6672" type="Const" version="opset1">
2292
+ <data element_type="i64" shape="4" offset="513404" size="32" />
2293
+ <output>
2294
+ <port id="0" precision="I64">
2295
+ <dim>4</dim>
2296
+ </port>
2297
+ </output>
2298
+ </layer>
2299
+ <layer id="140" name="/lstm/Transpose_4" type="Transpose" version="opset1">
2300
+ <input>
2301
+ <port id="0" precision="FP32">
2302
+ <dim>1</dim>
2303
+ <dim>2</dim>
2304
+ <dim>589</dim>
2305
+ <dim>128</dim>
2306
+ </port>
2307
+ <port id="1" precision="I64">
2308
+ <dim>4</dim>
2309
+ </port>
2310
+ </input>
2311
+ <output>
2312
+ <port id="2" precision="FP32" names="/lstm/Transpose_4_output_0">
2313
+ <dim>589</dim>
2314
+ <dim>1</dim>
2315
+ <dim>2</dim>
2316
+ <dim>128</dim>
2317
+ </port>
2318
+ </output>
2319
+ </layer>
2320
+ <layer id="141" name="/lstm/Constant_27" type="Const" version="opset1">
2321
+ <data element_type="i64" shape="3" offset="513436" size="24" />
2322
+ <rt_info>
2323
+ <attribute name="precise" version="0" />
2324
+ </rt_info>
2325
+ <output>
2326
+ <port id="0" precision="I64" names="/lstm/Constant_27_output_0">
2327
+ <dim>3</dim>
2328
+ </port>
2329
+ </output>
2330
+ </layer>
2331
+ <layer id="142" name="/lstm/Reshape_3" type="Reshape" version="opset1">
2332
+ <data special_zero="true" />
2333
+ <input>
2334
+ <port id="0" precision="FP32">
2335
+ <dim>589</dim>
2336
+ <dim>1</dim>
2337
+ <dim>2</dim>
2338
+ <dim>128</dim>
2339
+ </port>
2340
+ <port id="1" precision="I64">
2341
+ <dim>3</dim>
2342
+ </port>
2343
+ </input>
2344
+ <output>
2345
+ <port id="2" precision="FP32" names="/lstm/Reshape_3_output_0">
2346
+ <dim>589</dim>
2347
+ <dim>1</dim>
2348
+ <dim>256</dim>
2349
+ </port>
2350
+ </output>
2351
+ </layer>
2352
+ <layer id="143" name="Constant_4450" type="Const" version="opset1">
2353
+ <data element_type="i64" shape="3" offset="125764" size="24" />
2354
+ <output>
2355
+ <port id="0" precision="I64">
2356
+ <dim>3</dim>
2357
+ </port>
2358
+ </output>
2359
+ </layer>
2360
+ <layer id="144" name="/lstm/Transpose_5" type="Transpose" version="opset1">
2361
+ <input>
2362
+ <port id="0" precision="FP32">
2363
+ <dim>589</dim>
2364
+ <dim>1</dim>
2365
+ <dim>256</dim>
2366
+ </port>
2367
+ <port id="1" precision="I64">
2368
+ <dim>3</dim>
2369
+ </port>
2370
+ </input>
2371
+ <output>
2372
+ <port id="2" precision="FP32" names="/lstm/Transpose_5_output_0">
2373
+ <dim>1</dim>
2374
+ <dim>589</dim>
2375
+ <dim>256</dim>
2376
+ </port>
2377
+ </output>
2378
+ </layer>
2379
+ <layer id="145" name="Transpose_7057_compressed" type="Const" version="opset1">
2380
+ <data element_type="f16" shape="128, 256" offset="2878900" size="65536" />
2381
+ <output>
2382
+ <port id="0" precision="FP16">
2383
+ <dim>128</dim>
2384
+ <dim>256</dim>
2385
+ </port>
2386
+ </output>
2387
+ </layer>
2388
+ <layer id="146" name="Transpose_7057" type="Convert" version="opset1">
2389
+ <data destination_type="f32" />
2390
+ <rt_info>
2391
+ <attribute name="decompression" version="0" />
2392
+ </rt_info>
2393
+ <input>
2394
+ <port id="0" precision="FP16">
2395
+ <dim>128</dim>
2396
+ <dim>256</dim>
2397
+ </port>
2398
+ </input>
2399
+ <output>
2400
+ <port id="1" precision="FP32">
2401
+ <dim>128</dim>
2402
+ <dim>256</dim>
2403
+ </port>
2404
+ </output>
2405
+ </layer>
2406
+ <layer id="147" name="/linear.0/MatMul" type="MatMul" version="opset1">
2407
+ <data transpose_a="false" transpose_b="true" />
2408
+ <input>
2409
+ <port id="0" precision="FP32">
2410
+ <dim>1</dim>
2411
+ <dim>589</dim>
2412
+ <dim>256</dim>
2413
+ </port>
2414
+ <port id="1" precision="FP32">
2415
+ <dim>128</dim>
2416
+ <dim>256</dim>
2417
+ </port>
2418
+ </input>
2419
+ <output>
2420
+ <port id="2" precision="FP32" names="/linear.0/MatMul_output_0">
2421
+ <dim>1</dim>
2422
+ <dim>589</dim>
2423
+ <dim>128</dim>
2424
+ </port>
2425
+ </output>
2426
+ </layer>
2427
+ <layer id="148" name="/linear.0/Add" type="Add" version="opset1">
2428
+ <data auto_broadcast="numpy" />
2429
+ <input>
2430
+ <port id="0" precision="FP32">
2431
+ <dim>1</dim>
2432
+ <dim>1</dim>
2433
+ <dim>128</dim>
2434
+ </port>
2435
+ <port id="1" precision="FP32">
2436
+ <dim>1</dim>
2437
+ <dim>589</dim>
2438
+ <dim>128</dim>
2439
+ </port>
2440
+ </input>
2441
+ <output>
2442
+ <port id="2" precision="FP32" names="/linear.0/Add_output_0">
2443
+ <dim>1</dim>
2444
+ <dim>589</dim>
2445
+ <dim>128</dim>
2446
+ </port>
2447
+ </output>
2448
+ </layer>
2449
+ <layer id="149" name="Constant_4454_compressed" type="Const" version="opset1">
2450
+ <data element_type="f16" shape="1" offset="41018" size="2" />
2451
+ <output>
2452
+ <port id="0" precision="FP16">
2453
+ <dim>1</dim>
2454
+ </port>
2455
+ </output>
2456
+ </layer>
2457
+ <layer id="150" name="Constant_4454" type="Convert" version="opset1">
2458
+ <data destination_type="f32" />
2459
+ <rt_info>
2460
+ <attribute name="decompression" version="0" />
2461
+ </rt_info>
2462
+ <input>
2463
+ <port id="0" precision="FP16">
2464
+ <dim>1</dim>
2465
+ </port>
2466
+ </input>
2467
+ <output>
2468
+ <port id="1" precision="FP32">
2469
+ <dim>1</dim>
2470
+ </port>
2471
+ </output>
2472
+ </layer>
2473
+ <layer id="151" name="/LeakyRelu" type="PReLU" version="opset1">
2474
+ <input>
2475
+ <port id="0" precision="FP32">
2476
+ <dim>1</dim>
2477
+ <dim>589</dim>
2478
+ <dim>128</dim>
2479
+ </port>
2480
+ <port id="1" precision="FP32">
2481
+ <dim>1</dim>
2482
+ </port>
2483
+ </input>
2484
+ <output>
2485
+ <port id="2" precision="FP32" names="/LeakyRelu_output_0">
2486
+ <dim>1</dim>
2487
+ <dim>589</dim>
2488
+ <dim>128</dim>
2489
+ </port>
2490
+ </output>
2491
+ </layer>
2492
+ <layer id="152" name="Transpose_7076_compressed" type="Const" version="opset1">
2493
+ <data element_type="f16" shape="128, 128" offset="2944436" size="32768" />
2494
+ <output>
2495
+ <port id="0" precision="FP16">
2496
+ <dim>128</dim>
2497
+ <dim>128</dim>
2498
+ </port>
2499
+ </output>
2500
+ </layer>
2501
+ <layer id="153" name="Transpose_7076" type="Convert" version="opset1">
2502
+ <data destination_type="f32" />
2503
+ <rt_info>
2504
+ <attribute name="decompression" version="0" />
2505
+ </rt_info>
2506
+ <input>
2507
+ <port id="0" precision="FP16">
2508
+ <dim>128</dim>
2509
+ <dim>128</dim>
2510
+ </port>
2511
+ </input>
2512
+ <output>
2513
+ <port id="1" precision="FP32">
2514
+ <dim>128</dim>
2515
+ <dim>128</dim>
2516
+ </port>
2517
+ </output>
2518
+ </layer>
2519
+ <layer id="154" name="/linear.1/MatMul" type="MatMul" version="opset1">
2520
+ <data transpose_a="false" transpose_b="true" />
2521
+ <input>
2522
+ <port id="0" precision="FP32">
2523
+ <dim>1</dim>
2524
+ <dim>589</dim>
2525
+ <dim>128</dim>
2526
+ </port>
2527
+ <port id="1" precision="FP32">
2528
+ <dim>128</dim>
2529
+ <dim>128</dim>
2530
+ </port>
2531
+ </input>
2532
+ <output>
2533
+ <port id="2" precision="FP32" names="/linear.1/MatMul_output_0">
2534
+ <dim>1</dim>
2535
+ <dim>589</dim>
2536
+ <dim>128</dim>
2537
+ </port>
2538
+ </output>
2539
+ </layer>
2540
+ <layer id="155" name="/linear.1/Add" type="Add" version="opset1">
2541
+ <data auto_broadcast="numpy" />
2542
+ <input>
2543
+ <port id="0" precision="FP32">
2544
+ <dim>1</dim>
2545
+ <dim>1</dim>
2546
+ <dim>128</dim>
2547
+ </port>
2548
+ <port id="1" precision="FP32">
2549
+ <dim>1</dim>
2550
+ <dim>589</dim>
2551
+ <dim>128</dim>
2552
+ </port>
2553
+ </input>
2554
+ <output>
2555
+ <port id="2" precision="FP32" names="/linear.1/Add_output_0">
2556
+ <dim>1</dim>
2557
+ <dim>589</dim>
2558
+ <dim>128</dim>
2559
+ </port>
2560
+ </output>
2561
+ </layer>
2562
+ <layer id="156" name="Constant_4458_compressed" type="Const" version="opset1">
2563
+ <data element_type="f16" shape="1" offset="41018" size="2" />
2564
+ <output>
2565
+ <port id="0" precision="FP16">
2566
+ <dim>1</dim>
2567
+ </port>
2568
+ </output>
2569
+ </layer>
2570
+ <layer id="157" name="Constant_4458" type="Convert" version="opset1">
2571
+ <data destination_type="f32" />
2572
+ <rt_info>
2573
+ <attribute name="decompression" version="0" />
2574
+ </rt_info>
2575
+ <input>
2576
+ <port id="0" precision="FP16">
2577
+ <dim>1</dim>
2578
+ </port>
2579
+ </input>
2580
+ <output>
2581
+ <port id="1" precision="FP32">
2582
+ <dim>1</dim>
2583
+ </port>
2584
+ </output>
2585
+ </layer>
2586
+ <layer id="158" name="/LeakyRelu_1" type="PReLU" version="opset1">
2587
+ <input>
2588
+ <port id="0" precision="FP32">
2589
+ <dim>1</dim>
2590
+ <dim>589</dim>
2591
+ <dim>128</dim>
2592
+ </port>
2593
+ <port id="1" precision="FP32">
2594
+ <dim>1</dim>
2595
+ </port>
2596
+ </input>
2597
+ <output>
2598
+ <port id="2" precision="FP32" names="/LeakyRelu_1_output_0">
2599
+ <dim>1</dim>
2600
+ <dim>589</dim>
2601
+ <dim>128</dim>
2602
+ </port>
2603
+ </output>
2604
+ </layer>
2605
+ <layer id="159" name="Transpose_7095_compressed" type="Const" version="opset1">
2606
+ <data element_type="f16" shape="7, 128" offset="2977204" size="1792" />
2607
+ <output>
2608
+ <port id="0" precision="FP16">
2609
+ <dim>7</dim>
2610
+ <dim>128</dim>
2611
+ </port>
2612
+ </output>
2613
+ </layer>
2614
+ <layer id="160" name="Transpose_7095" type="Convert" version="opset1">
2615
+ <data destination_type="f32" />
2616
+ <rt_info>
2617
+ <attribute name="decompression" version="0" />
2618
+ </rt_info>
2619
+ <input>
2620
+ <port id="0" precision="FP16">
2621
+ <dim>7</dim>
2622
+ <dim>128</dim>
2623
+ </port>
2624
+ </input>
2625
+ <output>
2626
+ <port id="1" precision="FP32">
2627
+ <dim>7</dim>
2628
+ <dim>128</dim>
2629
+ </port>
2630
+ </output>
2631
+ </layer>
2632
+ <layer id="161" name="/classifier/MatMul" type="MatMul" version="opset1">
2633
+ <data transpose_a="false" transpose_b="true" />
2634
+ <input>
2635
+ <port id="0" precision="FP32">
2636
+ <dim>1</dim>
2637
+ <dim>589</dim>
2638
+ <dim>128</dim>
2639
+ </port>
2640
+ <port id="1" precision="FP32">
2641
+ <dim>7</dim>
2642
+ <dim>128</dim>
2643
+ </port>
2644
+ </input>
2645
+ <output>
2646
+ <port id="2" precision="FP32" names="/classifier/MatMul_output_0">
2647
+ <dim>1</dim>
2648
+ <dim>589</dim>
2649
+ <dim>7</dim>
2650
+ </port>
2651
+ </output>
2652
+ </layer>
2653
+ <layer id="162" name="/classifier/Add" type="Add" version="opset1">
2654
+ <data auto_broadcast="numpy" />
2655
+ <input>
2656
+ <port id="0" precision="FP32">
2657
+ <dim>1</dim>
2658
+ <dim>1</dim>
2659
+ <dim>7</dim>
2660
+ </port>
2661
+ <port id="1" precision="FP32">
2662
+ <dim>1</dim>
2663
+ <dim>589</dim>
2664
+ <dim>7</dim>
2665
+ </port>
2666
+ </input>
2667
+ <output>
2668
+ <port id="2" precision="FP32" names="/classifier/Add_output_0">
2669
+ <dim>1</dim>
2670
+ <dim>589</dim>
2671
+ <dim>7</dim>
2672
+ </port>
2673
+ </output>
2674
+ </layer>
2675
+ <layer id="163" name="outputs" type="LogSoftmax" version="opset5">
2676
+ <data axis="-1" />
2677
+ <input>
2678
+ <port id="0" precision="FP32">
2679
+ <dim>1</dim>
2680
+ <dim>589</dim>
2681
+ <dim>7</dim>
2682
+ </port>
2683
+ </input>
2684
+ <output>
2685
+ <port id="1" precision="FP32" names="outputs">
2686
+ <dim>1</dim>
2687
+ <dim>589</dim>
2688
+ <dim>7</dim>
2689
+ </port>
2690
+ </output>
2691
+ </layer>
2692
+ <layer id="164" name="outputs/sink_port_0" type="Result" version="opset1" output_names="outputs">
2693
+ <input>
2694
+ <port id="0" precision="FP32">
2695
+ <dim>1</dim>
2696
+ <dim>589</dim>
2697
+ <dim>7</dim>
2698
+ </port>
2699
+ </input>
2700
+ </layer>
2701
+ </layers>
2702
+ <edges>
2703
+ <edge from-layer="0" from-port="0" to-layer="8" to-port="0" />
2704
+ <edge from-layer="1" from-port="0" to-layer="2" to-port="0" />
2705
+ <edge from-layer="2" from-port="1" to-layer="162" to-port="0" />
2706
+ <edge from-layer="3" from-port="0" to-layer="4" to-port="0" />
2707
+ <edge from-layer="4" from-port="1" to-layer="155" to-port="0" />
2708
+ <edge from-layer="5" from-port="0" to-layer="6" to-port="0" />
2709
+ <edge from-layer="6" from-port="1" to-layer="148" to-port="0" />
2710
+ <edge from-layer="7" from-port="0" to-layer="8" to-port="1" />
2711
+ <edge from-layer="8" from-port="2" to-layer="11" to-port="0" />
2712
+ <edge from-layer="9" from-port="0" to-layer="10" to-port="0" />
2713
+ <edge from-layer="10" from-port="1" to-layer="11" to-port="1" />
2714
+ <edge from-layer="11" from-port="2" to-layer="14" to-port="0" />
2715
+ <edge from-layer="12" from-port="0" to-layer="13" to-port="0" />
2716
+ <edge from-layer="13" from-port="1" to-layer="14" to-port="1" />
2717
+ <edge from-layer="14" from-port="2" to-layer="17" to-port="0" />
2718
+ <edge from-layer="15" from-port="0" to-layer="16" to-port="0" />
2719
+ <edge from-layer="16" from-port="1" to-layer="17" to-port="1" />
2720
+ <edge from-layer="17" from-port="2" to-layer="18" to-port="0" />
2721
+ <edge from-layer="18" from-port="1" to-layer="19" to-port="0" />
2722
+ <edge from-layer="19" from-port="1" to-layer="21" to-port="0" />
2723
+ <edge from-layer="20" from-port="0" to-layer="21" to-port="1" />
2724
+ <edge from-layer="21" from-port="2" to-layer="24" to-port="0" />
2725
+ <edge from-layer="22" from-port="0" to-layer="23" to-port="0" />
2726
+ <edge from-layer="23" from-port="1" to-layer="24" to-port="1" />
2727
+ <edge from-layer="24" from-port="2" to-layer="27" to-port="0" />
2728
+ <edge from-layer="25" from-port="0" to-layer="26" to-port="0" />
2729
+ <edge from-layer="26" from-port="1" to-layer="27" to-port="1" />
2730
+ <edge from-layer="27" from-port="2" to-layer="30" to-port="0" />
2731
+ <edge from-layer="28" from-port="0" to-layer="29" to-port="0" />
2732
+ <edge from-layer="29" from-port="1" to-layer="30" to-port="1" />
2733
+ <edge from-layer="30" from-port="2" to-layer="33" to-port="0" />
2734
+ <edge from-layer="31" from-port="0" to-layer="32" to-port="0" />
2735
+ <edge from-layer="32" from-port="1" to-layer="33" to-port="1" />
2736
+ <edge from-layer="33" from-port="2" to-layer="36" to-port="0" />
2737
+ <edge from-layer="34" from-port="0" to-layer="35" to-port="0" />
2738
+ <edge from-layer="35" from-port="1" to-layer="36" to-port="1" />
2739
+ <edge from-layer="36" from-port="2" to-layer="37" to-port="0" />
2740
+ <edge from-layer="37" from-port="1" to-layer="39" to-port="0" />
2741
+ <edge from-layer="38" from-port="0" to-layer="39" to-port="1" />
2742
+ <edge from-layer="39" from-port="2" to-layer="42" to-port="0" />
2743
+ <edge from-layer="40" from-port="0" to-layer="41" to-port="0" />
2744
+ <edge from-layer="41" from-port="1" to-layer="42" to-port="1" />
2745
+ <edge from-layer="42" from-port="2" to-layer="45" to-port="0" />
2746
+ <edge from-layer="43" from-port="0" to-layer="44" to-port="0" />
2747
+ <edge from-layer="44" from-port="1" to-layer="45" to-port="1" />
2748
+ <edge from-layer="45" from-port="2" to-layer="48" to-port="0" />
2749
+ <edge from-layer="46" from-port="0" to-layer="47" to-port="0" />
2750
+ <edge from-layer="47" from-port="1" to-layer="48" to-port="1" />
2751
+ <edge from-layer="48" from-port="2" to-layer="51" to-port="0" />
2752
+ <edge from-layer="49" from-port="0" to-layer="50" to-port="0" />
2753
+ <edge from-layer="50" from-port="1" to-layer="51" to-port="1" />
2754
+ <edge from-layer="51" from-port="2" to-layer="54" to-port="0" />
2755
+ <edge from-layer="52" from-port="0" to-layer="53" to-port="0" />
2756
+ <edge from-layer="53" from-port="1" to-layer="54" to-port="1" />
2757
+ <edge from-layer="54" from-port="2" to-layer="55" to-port="0" />
2758
+ <edge from-layer="55" from-port="1" to-layer="57" to-port="0" />
2759
+ <edge from-layer="56" from-port="0" to-layer="57" to-port="1" />
2760
+ <edge from-layer="57" from-port="2" to-layer="60" to-port="0" />
2761
+ <edge from-layer="58" from-port="0" to-layer="59" to-port="0" />
2762
+ <edge from-layer="59" from-port="1" to-layer="60" to-port="1" />
2763
+ <edge from-layer="60" from-port="2" to-layer="63" to-port="0" />
2764
+ <edge from-layer="61" from-port="0" to-layer="62" to-port="0" />
2765
+ <edge from-layer="62" from-port="1" to-layer="63" to-port="1" />
2766
+ <edge from-layer="63" from-port="2" to-layer="66" to-port="0" />
2767
+ <edge from-layer="64" from-port="0" to-layer="65" to-port="0" />
2768
+ <edge from-layer="65" from-port="1" to-layer="66" to-port="1" />
2769
+ <edge from-layer="66" from-port="2" to-layer="68" to-port="0" />
2770
+ <edge from-layer="67" from-port="0" to-layer="68" to-port="1" />
2771
+ <edge from-layer="68" from-port="2" to-layer="74" to-port="0" />
2772
+ <edge from-layer="68" from-port="2" to-layer="70" to-port="0" />
2773
+ <edge from-layer="69" from-port="0" to-layer="70" to-port="1" />
2774
+ <edge from-layer="70" from-port="2" to-layer="83" to-port="0" />
2775
+ <edge from-layer="70" from-port="2" to-layer="94" to-port="0" />
2776
+ <edge from-layer="71" from-port="0" to-layer="72" to-port="0" />
2777
+ <edge from-layer="72" from-port="1" to-layer="80" to-port="0" />
2778
+ <edge from-layer="73" from-port="0" to-layer="79" to-port="0" />
2779
+ <edge from-layer="74" from-port="1" to-layer="77" to-port="0" />
2780
+ <edge from-layer="75" from-port="0" to-layer="77" to-port="1" />
2781
+ <edge from-layer="76" from-port="0" to-layer="77" to-port="2" />
2782
+ <edge from-layer="77" from-port="3" to-layer="79" to-port="1" />
2783
+ <edge from-layer="77" from-port="3" to-layer="105" to-port="1" />
2784
+ <edge from-layer="77" from-port="3" to-layer="87" to-port="1" />
2785
+ <edge from-layer="78" from-port="0" to-layer="79" to-port="2" />
2786
+ <edge from-layer="79" from-port="3" to-layer="80" to-port="1" />
2787
+ <edge from-layer="80" from-port="2" to-layer="82" to-port="0" />
2788
+ <edge from-layer="81" from-port="0" to-layer="82" to-port="1" />
2789
+ <edge from-layer="82" from-port="2" to-layer="125" to-port="2" />
2790
+ <edge from-layer="82" from-port="2" to-layer="138" to-port="1" />
2791
+ <edge from-layer="82" from-port="2" to-layer="138" to-port="2" />
2792
+ <edge from-layer="82" from-port="2" to-layer="94" to-port="1" />
2793
+ <edge from-layer="82" from-port="2" to-layer="94" to-port="2" />
2794
+ <edge from-layer="82" from-port="2" to-layer="112" to-port="1" />
2795
+ <edge from-layer="82" from-port="2" to-layer="112" to-port="2" />
2796
+ <edge from-layer="82" from-port="2" to-layer="125" to-port="1" />
2797
+ <edge from-layer="83" from-port="1" to-layer="86" to-port="0" />
2798
+ <edge from-layer="84" from-port="0" to-layer="86" to-port="1" />
2799
+ <edge from-layer="85" from-port="0" to-layer="86" to-port="2" />
2800
+ <edge from-layer="86" from-port="3" to-layer="87" to-port="0" />
2801
+ <edge from-layer="87" from-port="2" to-layer="94" to-port="3" />
2802
+ <edge from-layer="88" from-port="0" to-layer="89" to-port="0" />
2803
+ <edge from-layer="89" from-port="1" to-layer="94" to-port="4" />
2804
+ <edge from-layer="90" from-port="0" to-layer="91" to-port="0" />
2805
+ <edge from-layer="91" from-port="1" to-layer="94" to-port="5" />
2806
+ <edge from-layer="92" from-port="0" to-layer="93" to-port="0" />
2807
+ <edge from-layer="93" from-port="1" to-layer="94" to-port="6" />
2808
+ <edge from-layer="94" from-port="7" to-layer="96" to-port="0" />
2809
+ <edge from-layer="95" from-port="0" to-layer="96" to-port="1" />
2810
+ <edge from-layer="96" from-port="2" to-layer="98" to-port="0" />
2811
+ <edge from-layer="97" from-port="0" to-layer="98" to-port="1" />
2812
+ <edge from-layer="98" from-port="2" to-layer="100" to-port="0" />
2813
+ <edge from-layer="99" from-port="0" to-layer="100" to-port="1" />
2814
+ <edge from-layer="100" from-port="2" to-layer="101" to-port="0" />
2815
+ <edge from-layer="100" from-port="2" to-layer="112" to-port="0" />
2816
+ <edge from-layer="101" from-port="1" to-layer="104" to-port="0" />
2817
+ <edge from-layer="102" from-port="0" to-layer="104" to-port="1" />
2818
+ <edge from-layer="103" from-port="0" to-layer="104" to-port="2" />
2819
+ <edge from-layer="104" from-port="3" to-layer="105" to-port="0" />
2820
+ <edge from-layer="105" from-port="2" to-layer="112" to-port="3" />
2821
+ <edge from-layer="105" from-port="2" to-layer="125" to-port="3" />
2822
+ <edge from-layer="105" from-port="2" to-layer="138" to-port="3" />
2823
+ <edge from-layer="106" from-port="0" to-layer="107" to-port="0" />
2824
+ <edge from-layer="107" from-port="1" to-layer="112" to-port="4" />
2825
+ <edge from-layer="108" from-port="0" to-layer="109" to-port="0" />
2826
+ <edge from-layer="109" from-port="1" to-layer="112" to-port="5" />
2827
+ <edge from-layer="110" from-port="0" to-layer="111" to-port="0" />
2828
+ <edge from-layer="111" from-port="1" to-layer="112" to-port="6" />
2829
+ <edge from-layer="112" from-port="7" to-layer="114" to-port="0" />
2830
+ <edge from-layer="113" from-port="0" to-layer="114" to-port="1" />
2831
+ <edge from-layer="114" from-port="2" to-layer="116" to-port="0" />
2832
+ <edge from-layer="115" from-port="0" to-layer="116" to-port="1" />
2833
+ <edge from-layer="116" from-port="2" to-layer="118" to-port="0" />
2834
+ <edge from-layer="117" from-port="0" to-layer="118" to-port="1" />
2835
+ <edge from-layer="118" from-port="2" to-layer="125" to-port="0" />
2836
+ <edge from-layer="119" from-port="0" to-layer="120" to-port="0" />
2837
+ <edge from-layer="120" from-port="1" to-layer="125" to-port="4" />
2838
+ <edge from-layer="121" from-port="0" to-layer="122" to-port="0" />
2839
+ <edge from-layer="122" from-port="1" to-layer="125" to-port="5" />
2840
+ <edge from-layer="123" from-port="0" to-layer="124" to-port="0" />
2841
+ <edge from-layer="124" from-port="1" to-layer="125" to-port="6" />
2842
+ <edge from-layer="125" from-port="7" to-layer="127" to-port="0" />
2843
+ <edge from-layer="126" from-port="0" to-layer="127" to-port="1" />
2844
+ <edge from-layer="127" from-port="2" to-layer="129" to-port="0" />
2845
+ <edge from-layer="128" from-port="0" to-layer="129" to-port="1" />
2846
+ <edge from-layer="129" from-port="2" to-layer="131" to-port="0" />
2847
+ <edge from-layer="130" from-port="0" to-layer="131" to-port="1" />
2848
+ <edge from-layer="131" from-port="2" to-layer="138" to-port="0" />
2849
+ <edge from-layer="132" from-port="0" to-layer="133" to-port="0" />
2850
+ <edge from-layer="133" from-port="1" to-layer="138" to-port="4" />
2851
+ <edge from-layer="134" from-port="0" to-layer="135" to-port="0" />
2852
+ <edge from-layer="135" from-port="1" to-layer="138" to-port="5" />
2853
+ <edge from-layer="136" from-port="0" to-layer="137" to-port="0" />
2854
+ <edge from-layer="137" from-port="1" to-layer="138" to-port="6" />
2855
+ <edge from-layer="138" from-port="7" to-layer="140" to-port="0" />
2856
+ <edge from-layer="139" from-port="0" to-layer="140" to-port="1" />
2857
+ <edge from-layer="140" from-port="2" to-layer="142" to-port="0" />
2858
+ <edge from-layer="141" from-port="0" to-layer="142" to-port="1" />
2859
+ <edge from-layer="142" from-port="2" to-layer="144" to-port="0" />
2860
+ <edge from-layer="143" from-port="0" to-layer="144" to-port="1" />
2861
+ <edge from-layer="144" from-port="2" to-layer="147" to-port="0" />
2862
+ <edge from-layer="145" from-port="0" to-layer="146" to-port="0" />
2863
+ <edge from-layer="146" from-port="1" to-layer="147" to-port="1" />
2864
+ <edge from-layer="147" from-port="2" to-layer="148" to-port="1" />
2865
+ <edge from-layer="148" from-port="2" to-layer="151" to-port="0" />
2866
+ <edge from-layer="149" from-port="0" to-layer="150" to-port="0" />
2867
+ <edge from-layer="150" from-port="1" to-layer="151" to-port="1" />
2868
+ <edge from-layer="151" from-port="2" to-layer="154" to-port="0" />
2869
+ <edge from-layer="152" from-port="0" to-layer="153" to-port="0" />
2870
+ <edge from-layer="153" from-port="1" to-layer="154" to-port="1" />
2871
+ <edge from-layer="154" from-port="2" to-layer="155" to-port="1" />
2872
+ <edge from-layer="155" from-port="2" to-layer="158" to-port="0" />
2873
+ <edge from-layer="156" from-port="0" to-layer="157" to-port="0" />
2874
+ <edge from-layer="157" from-port="1" to-layer="158" to-port="1" />
2875
+ <edge from-layer="158" from-port="2" to-layer="161" to-port="0" />
2876
+ <edge from-layer="159" from-port="0" to-layer="160" to-port="0" />
2877
+ <edge from-layer="160" from-port="1" to-layer="161" to-port="1" />
2878
+ <edge from-layer="161" from-port="2" to-layer="162" to-port="1" />
2879
+ <edge from-layer="162" from-port="2" to-layer="163" to-port="0" />
2880
+ <edge from-layer="163" from-port="1" to-layer="164" to-port="0" />
2881
+ </edges>
2882
+ <rt_info>
2883
+ <Runtime_version value="2025.2.0-19136-649eec15cfe-releases/2025/2" />
2884
+ <conversion_parameters>
2885
+ <is_python_object value="False" />
2886
+ </conversion_parameters>
2887
+ </rt_info>
2888
+ </net>
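
A hedged usage sketch (not part of the uploaded files): assuming the `openvino` Python package is installed and the .xml/.bin pair from this upload sits in the working directory, the IR above can be loaded and exercised roughly as below. The graph itself declares a single Result named "outputs" of shape 1x589x7 behind a LogSoftmax, so the sketch reads the static input shape from the model rather than assuming it, and converts the log-probabilities back to per-frame probabilities.

    # Illustrative sketch only; file names match this upload, everything else is an assumption.
    import numpy as np
    import openvino as ov

    core = ov.Core()
    # read_model() resolves the weights from pyannote-segmentation.bin next to the .xml
    model = core.read_model("pyannote-segmentation.xml")
    compiled = core.compile_model(model, "CPU")

    in_port = compiled.input(0)
    out_port = compiled.output(0)   # "outputs": (1, 589, 7) frame-wise log-probabilities

    # Random waveform stand-in with the IR's static input shape (replace with real audio).
    dummy = np.random.randn(*tuple(in_port.shape)).astype(np.float32)

    log_probs = compiled([dummy])[out_port]
    probs = np.exp(log_probs)       # undo the LogSoftmax to get per-frame class probabilities
    print(probs.shape)              # (1, 589, 7) for this graph
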
pyannote-wespeaker.bin ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:b6038fcfa967d8141dd561ea32c96c17a889df9c82c2edb91759c95183f662f9
3
+ size 13260230
pyannote-wespeaker.xml ADDED
The diff for this file is too large to render. See raw diff
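
Since the pyannote-wespeaker.xml diff is too large to render here, its input and output shapes are not visible on this page. As a minimal sketch under the same assumptions as above (openvino installed, files from this upload in the working directory), the IR can be queried directly to inspect its declared ports before wiring it into a pipeline:

    # Illustrative sketch only; the port names and shapes are whatever the IR declares.
    import openvino as ov

    core = ov.Core()
    model = core.read_model("pyannote-wespeaker.xml")  # weights resolved from pyannote-wespeaker.bin

    for port in model.inputs:
        print("input :", port.any_name, port.get_partial_shape())
    for port in model.outputs:
        print("output:", port.any_name, port.get_partial_shape())
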