Unnamed: 0
stringlengths
5
37
Abstract Algebra
float64
0.21
0.82
Anatomy
float64
0.22
0.91
College Chemistry
float64
0.27
0.67
Computer Security
float64
0.3
0.89
Econometrics
float64
0.31
0.81
Global Facts
float64
0.25
0.8
Jurisprudence
float64
0.25
0.91
Philosophy
float64
0.33
0.9
Professional Medicine
float64
0.45
0.96
Us Foreign Policy
float64
0.26
0.96
Astronomy
float64
0.34
0.97
Business Ethics
float64
0.24
0.89
Clinical Knowledge
float64
0.26
0.93
College Biology
float64
0.24
0.97
College Computer Science
float64
0.35
0.84
College Mathematics
float64
0.22
0.8
College Medicine
float64
0.26
0.84
College Physics
float64
0.2
0.86
Conceptual Physics
float64
0.32
0.95
Electrical Engineering
float64
0.29
0.86
Elementary Mathematics
float64
0.25
0.94
Formal Logic
float64
0.27
0.79
High School Biology
float64
0.34
0.96
High School Chemistry
float64
0.31
0.83
High School Computer Science
float64
0.22
0.96
High School European History
float64
0.24
0.92
High School Geography
float64
0.23
0.96
High School Government
float64
0.36
0.99
High School Macroeconomics
float64
0.34
0.91
High School Mathematics
float64
0.09
0.73
High School Microeconomics
float64
0.29
0.97
High School Physics
float64
0.26
0.83
High School Psychology
float64
0.26
0.97
High School Statistics
float64
0.26
0.88
High School US History
float64
0.25
0.96
High School World History
float64
0.25
0.96
Human Aging
float64
0.28
0.89
Human Sexuality
float64
0.27
0.94
International Law
float64
0.31
0.96
Logical Fallacies
float64
0.26
0.93
Machine Learning
float64
0.29
0.84
Management
float64
0.27
0.94
Marketing
float64
0.27
0.96
Medical Genetics
float64
0.28
0.98
Moral Disputes
float64
0.27
0.89
Moral Scenarios
float64
0.23
0.9
Nutrition
float64
0.34
0.93
Prehistory
float64
0.32
0.95
Professional Accounting
float64
0.22
0.84
Professional Law
float64
0.24
0.75
Professional Psychology
float64
0.23
0.92
Public Relations
float64
0.35
0.86
Security Studies
float64
0.41
0.89
Sociology
float64
0.38
0.96
Virology
float64
0.39
0.6
World Religions
float64
0.23
0.92
Claude 3.5 Sonnet (20241022)
0.78
0.859
0.63
0.87
0.807
0.8
0.898
0.891
0.949
0.96
0.974
0.83
0.928
0.965
0.83
0.66
0.832
0.775
0.906
0.848
0.918
0.786
0.952
0.823
0.93
0.897
0.939
0.984
0.903
0.63
0.958
0.728
0.967
0.852
0.956
0.958
0.861
0.939
0.959
0.914
0.839
0.932
0.953
0.96
0.87
0.888
0.922
0.941
0.84
0.747
0.922
0.8
0.882
0.955
0.584
0.901
Gemini 1.5 Pro (002)
0.82
0.83
0.67
0.85
0.693
0.77
0.898
0.887
0.938
0.94
0.934
0.84
0.906
0.965
0.82
0.8
0.844
0.863
0.945
0.855
0.942
0.754
0.958
0.833
0.96
0.885
0.944
0.99
0.913
0.73
0.95
0.834
0.967
0.884
0.926
0.937
0.839
0.878
0.917
0.902
0.83
0.903
0.962
0.92
0.855
0.792
0.886
0.926
0.787
0.708
0.912
0.809
0.857
0.95
0.566
0.889
Claude 3.5 Sonnet (20240620)
0.75
0.844
0.59
0.89
0.807
0.72
0.889
0.891
0.945
0.96
0.961
0.85
0.913
0.965
0.84
0.58
0.827
0.696
0.885
0.828
0.892
0.698
0.942
0.813
0.94
0.897
0.949
0.979
0.9
0.633
0.945
0.742
0.96
0.847
0.936
0.954
0.888
0.939
0.959
0.926
0.786
0.942
0.949
0.98
0.887
0.882
0.912
0.951
0.816
0.754
0.922
0.855
0.878
0.96
0.602
0.924
Claude 3 Opus (20240229)
0.64
0.8
0.6
0.85
0.789
0.66
0.88
0.9
0.934
0.96
0.967
0.86
0.879
0.972
0.81
0.55
0.832
0.716
0.881
0.814
0.862
0.698
0.935
0.768
0.92
0.885
0.955
0.979
0.872
0.585
0.933
0.709
0.941
0.796
0.941
0.941
0.861
0.908
0.901
0.896
0.741
0.942
0.944
0.93
0.882
0.826
0.925
0.941
0.791
0.715
0.904
0.827
0.886
0.94
0.578
0.901
Llama 3.1 Instruct Turbo (405B)
0.7
0.822
0.6
0.81
0.746
0.71
0.87
0.878
0.934
0.94
0.921
0.81
0.879
0.965
0.78
0.66
0.78
0.696
0.877
0.821
0.828
0.698
0.932
0.803
0.95
0.885
0.955
0.984
0.892
0.637
0.954
0.735
0.949
0.838
0.926
0.941
0.843
0.855
0.95
0.92
0.795
0.893
0.962
0.93
0.87
0.876
0.928
0.929
0.727
0.704
0.861
0.818
0.857
0.94
0.572
0.906
GPT-4o (2024-08-06)
0.58
0.911
0.6
0.85
0.711
0.69
0.907
0.894
0.96
0.95
0.947
0.89
0.894
0.944
0.79
0.51
0.844
0.686
0.923
0.793
0.775
0.675
0.945
0.793
0.92
0.915
0.955
0.984
0.903
0.481
0.971
0.728
0.954
0.769
0.951
0.941
0.848
0.901
0.942
0.902
0.777
0.913
0.94
0.98
0.87
0.802
0.905
0.935
0.801
0.742
0.899
0.782
0.833
0.945
0.578
0.883
GPT-4o (2024-05-13)
0.66
0.911
0.61
0.85
0.693
0.64
0.898
0.9
0.949
0.96
0.941
0.85
0.894
0.938
0.77
0.47
0.838
0.686
0.911
0.807
0.741
0.683
0.948
0.778
0.93
0.903
0.955
0.979
0.91
0.481
0.958
0.735
0.956
0.787
0.956
0.945
0.843
0.908
0.934
0.883
0.768
0.942
0.936
0.96
0.882
0.841
0.899
0.938
0.77
0.723
0.905
0.809
0.837
0.94
0.596
0.889
Qwen2.5 Instruct Turbo (72B)
0.68
0.822
0.62
0.86
0.728
0.61
0.87
0.839
0.908
0.96
0.934
0.85
0.872
0.944
0.77
0.64
0.803
0.588
0.885
0.8
0.87
0.73
0.932
0.773
0.94
0.885
0.929
0.99
0.887
0.663
0.945
0.722
0.95
0.796
0.926
0.92
0.816
0.878
0.893
0.89
0.777
0.913
0.953
0.92
0.838
0.787
0.886
0.91
0.702
0.677
0.864
0.782
0.849
0.925
0.584
0.901
Gemini 1.5 Pro (001)
0.75
0.83
0.62
0.83
0.728
0.66
0.889
0.871
0.886
0.93
0.914
0.8
0.853
0.903
0.79
0.79
0.792
0.745
0.949
0.745
0.939
0.706
0.926
0.719
0.93
0.867
0.96
0.974
0.89
0.685
0.941
0.762
0.941
0.856
0.922
0.924
0.807
0.374
0.917
0.896
0.652
0.922
0.932
0.91
0.818
0.739
0.879
0.87
0.674
0.666
0.894
0.818
0.873
0.92
0.554
0.854
GPT-4 (0613)
0.63
0.8
0.55
0.86
0.684
0.62
0.889
0.859
0.934
0.95
0.934
0.79
0.845
0.924
0.76
0.54
0.763
0.627
0.868
0.786
0.807
0.643
0.948
0.709
0.87
0.879
0.96
0.99
0.872
0.544
0.937
0.596
0.949
0.792
0.946
0.945
0.834
0.908
0.917
0.871
0.759
0.932
0.962
0.94
0.867
0.902
0.892
0.926
0.702
0.727
0.891
0.745
0.861
0.93
0.596
0.877
Qwen2 Instruct (72B)
0.67
0.793
0.65
0.85
0.737
0.58
0.87
0.859
0.886
0.94
0.934
0.82
0.868
0.931
0.74
0.52
0.798
0.598
0.872
0.793
0.825
0.667
0.945
0.768
0.91
0.873
0.939
0.979
0.879
0.596
0.92
0.722
0.936
0.75
0.941
0.932
0.816
0.893
0.893
0.914
0.768
0.903
0.953
0.9
0.853
0.815
0.902
0.914
0.709
0.662
0.886
0.745
0.837
0.935
0.56
0.848
Palmyra-X-004
0.75
0.822
0.52
0.82
0.684
0.62
0.843
0.83
0.908
0.92
0.928
0.76
0.879
0.889
0.78
0.55
0.809
0.647
0.885
0.793
0.841
0.579
0.932
0.754
0.95
0.861
0.909
0.959
0.849
0.563
0.895
0.702
0.941
0.727
0.922
0.911
0.816
0.924
0.901
0.877
0.679
0.903
0.932
0.87
0.824
0.825
0.869
0.917
0.638
0.664
0.845
0.791
0.849
0.915
0.584
0.842
GPT-4 Turbo (2024-04-09)
0.56
0.822
0.53
0.83
0.675
0.58
0.88
0.868
0.915
0.96
0.941
0.82
0.83
0.931
0.72
0.55
0.792
0.539
0.894
0.752
0.72
0.706
0.939
0.724
0.91
0.867
0.949
0.974
0.862
0.515
0.945
0.603
0.95
0.759
0.951
0.941
0.848
0.901
0.942
0.871
0.741
0.883
0.949
0.92
0.861
0.803
0.892
0.92
0.681
0.671
0.873
0.755
0.8
0.915
0.602
0.848
Gemini 1.5 Pro (0409 preview)
0.6
0.77
0.58
0.81
0.737
0.66
0.87
0.846
0.868
0.94
0.914
0.8
0.868
0.938
0.78
0.59
0.786
0.804
0.915
0.772
0.884
0.643
0.939
0.788
0.89
0.861
0.934
0.979
0.854
0.622
0.895
0.702
0.941
0.81
0.912
0.924
0.816
0.397
0.917
0.859
0.67
0.874
0.953
0.91
0.827
0.696
0.846
0.886
0.638
0.642
0.866
0.755
0.849
0.925
0.584
0.877
Llama 3.2 Vision Instruct Turbo (90B)
0.52
0.8
0.57
0.81
0.684
0.6
0.88
0.839
0.934
0.93
0.921
0.76
0.845
0.938
0.73
0.5
0.751
0.539
0.826
0.759
0.688
0.683
0.923
0.754
0.92
0.861
0.934
0.979
0.841
0.407
0.92
0.636
0.949
0.764
0.946
0.941
0.834
0.87
0.934
0.834
0.688
0.913
0.944
0.92
0.85
0.841
0.889
0.886
0.66
0.676
0.843
0.718
0.853
0.92
0.584
0.901
Llama 3.1 Instruct Turbo (70B)
0.55
0.8
0.59
0.8
0.675
0.61
0.889
0.833
0.93
0.93
0.908
0.72
0.845
0.938
0.71
0.49
0.763
0.559
0.834
0.745
0.701
0.675
0.923
0.754
0.92
0.873
0.939
0.984
0.841
0.404
0.916
0.629
0.949
0.755
0.941
0.937
0.821
0.855
0.926
0.84
0.696
0.913
0.936
0.93
0.841
0.834
0.889
0.88
0.656
0.674
0.846
0.709
0.849
0.92
0.578
0.895
Mistral Large 2 (2407)
0.7
0.785
0.52
0.81
0.693
0.56
0.861
0.826
0.908
0.9
0.921
0.79
0.864
0.896
0.75
0.59
0.803
0.559
0.864
0.793
0.799
0.579
0.932
0.734
0.94
0.861
0.934
0.969
0.831
0.37
0.895
0.543
0.939
0.648
0.922
0.92
0.807
0.924
0.926
0.847
0.661
0.883
0.94
0.9
0.832
0.839
0.827
0.92
0.645
0.645
0.861
0.764
0.865
0.91
0.59
0.865
GPT-4 Turbo (1106 preview)
0.53
0.807
0.47
0.86
0.675
0.58
0.889
0.852
0.945
0.96
0.941
0.78
0.864
0.931
0.69
0.4
0.792
0.402
0.894
0.772
0.638
0.651
0.939
0.675
0.9
0.897
0.934
0.979
0.867
0.093
0.945
0.543
0.954
0.662
0.936
0.958
0.857
0.908
0.926
0.865
0.723
0.913
0.932
0.93
0.858
0.816
0.879
0.917
0.684
0.707
0.887
0.782
0.841
0.925
0.59
0.854
Llama 3 (70B)
0.43
0.785
0.56
0.85
0.693
0.49
0.861
0.865
0.882
0.94
0.921
0.83
0.845
0.944
0.7
0.56
0.798
0.529
0.838
0.766
0.632
0.651
0.9
0.734
0.87
0.873
0.939
0.979
0.833
0.493
0.887
0.576
0.939
0.736
0.951
0.941
0.825
0.878
0.901
0.865
0.714
0.913
0.94
0.89
0.85
0.598
0.876
0.91
0.649
0.615
0.871
0.727
0.833
0.93
0.59
0.906
Yi Large (Preview)
0.6
0.83
0.52
0.86
0.728
0.52
0.852
0.842
0.857
0.85
0.914
0.8
0.857
0.931
0.71
0.36
0.827
0.569
0.864
0.779
0.685
0.603
0.935
0.739
0.84
0.879
0.934
0.969
0.833
0.485
0.903
0.589
0.919
0.769
0.922
0.928
0.803
0.901
0.917
0.865
0.616
0.903
0.927
0.83
0.838
0.831
0.846
0.892
0.649
0.634
0.853
0.827
0.82
0.881
0.59
0.871
Palmyra X V3 (72B)
0.53
0.733
0.59
0.78
0.649
0.53
0.88
0.836
0.846
0.96
0.862
0.83
0.804
0.951
0.65
0.51
0.769
0.549
0.809
0.772
0.661
0.659
0.932
0.749
0.88
0.861
0.924
0.979
0.841
0.493
0.937
0.563
0.938
0.731
0.936
0.911
0.794
0.924
0.909
0.877
0.625
0.903
0.94
0.83
0.832
0.562
0.856
0.87
0.706
0.632
0.858
0.773
0.833
0.91
0.572
0.877
PaLM-2 (Unicorn)
0.51
0.733
0.59
0.77
0.649
0.53
0.88
0.836
0.849
0.96
0.862
0.83
0.804
0.951
0.65
0.51
0.769
0.549
0.809
0.772
0.661
0.659
0.932
0.749
0.9
0.861
0.924
0.979
0.841
0.493
0.937
0.563
0.938
0.731
0.936
0.911
0.794
0.924
0.909
0.877
0.625
0.903
0.94
0.83
0.832
0.562
0.856
0.87
0.706
0.633
0.858
0.773
0.829
0.91
0.572
0.877
Jamba 1.5 Large
0.53
0.793
0.55
0.8
0.614
0.54
0.87
0.849
0.875
0.92
0.882
0.77
0.849
0.91
0.71
0.46
0.723
0.51
0.779
0.793
0.656
0.619
0.906
0.709
0.9
0.861
0.939
0.969
0.833
0.544
0.903
0.603
0.93
0.657
0.931
0.911
0.812
0.832
0.884
0.859
0.688
0.864
0.94
0.89
0.85
0.686
0.869
0.892
0.642
0.608
0.842
0.755
0.771
0.93
0.554
0.865
Gemini 1.5 Flash (001)
0.58
0.8
0.6
0.79
0.614
0.53
0.889
0.791
0.813
0.93
0.882
0.81
0.834
0.924
0.66
0.54
0.751
0.696
0.851
0.8
0.754
0.627
0.929
0.714
0.9
0.824
0.919
0.974
0.851
0.511
0.899
0.702
0.949
0.736
0.907
0.907
0.798
0.374
0.901
0.853
0.571
0.864
0.94
0.86
0.812
0.637
0.82
0.867
0.66
0.605
0.828
0.764
0.808
0.915
0.566
0.883
Mixtral (8x22B)
0.48
0.741
0.57
0.84
0.667
0.56
0.852
0.842
0.871
0.95
0.882
0.74
0.819
0.903
0.7
0.48
0.769
0.569
0.796
0.766
0.622
0.627
0.91
0.69
0.87
0.848
0.924
0.974
0.803
0.463
0.866
0.556
0.928
0.685
0.912
0.895
0.78
0.885
0.917
0.877
0.661
0.883
0.915
0.85
0.832
0.646
0.866
0.87
0.621
0.605
0.845
0.755
0.865
0.92
0.596
0.901
Gemini 1.5 Flash (0514 preview)
0.56
0.807
0.6
0.77
0.64
0.55
0.889
0.807
0.831
0.93
0.868
0.82
0.838
0.91
0.64
0.53
0.757
0.667
0.855
0.814
0.778
0.611
0.926
0.739
0.9
0.836
0.919
0.969
0.854
0.5
0.899
0.682
0.947
0.755
0.912
0.907
0.785
0.374
0.876
0.853
0.563
0.854
0.936
0.86
0.818
0.631
0.801
0.867
0.656
0.601
0.825
0.773
0.812
0.9
0.566
0.871
Phi-3 (14B)
0.5
0.719
0.56
0.79
0.614
0.5
0.88
0.804
0.868
0.95
0.849
0.8
0.826
0.889
0.64
0.53
0.757
0.529
0.809
0.683
0.709
0.587
0.903
0.734
0.83
0.885
0.924
0.969
0.831
0.504
0.908
0.616
0.923
0.699
0.912
0.903
0.78
0.863
0.934
0.828
0.696
0.864
0.919
0.91
0.769
0.639
0.837
0.867
0.66
0.628
0.835
0.755
0.829
0.891
0.554
0.865
Qwen1.5 (72B)
0.44
0.733
0.5
0.81
0.544
0.56
0.824
0.83
0.824
0.94
0.868
0.79
0.834
0.91
0.68
0.56
0.78
0.559
0.821
0.779
0.696
0.556
0.887
0.675
0.84
0.861
0.939
0.99
0.813
0.5
0.832
0.563
0.928
0.671
0.926
0.899
0.812
0.878
0.909
0.853
0.67
0.854
0.949
0.87
0.832
0.669
0.859
0.88
0.631
0.609
0.809
0.755
0.824
0.9
0.584
0.883
Qwen1.5 Chat (110B)
0.57
0.696
0.62
0.82
0.64
0.51
0.833
0.823
0.86
0.87
0.901
0.8
0.766
0.882
0.64
0.55
0.746
0.51
0.838
0.752
0.669
0.643
0.894
0.67
0.81
0.885
0.919
0.984
0.767
0.478
0.87
0.55
0.921
0.639
0.912
0.903
0.798
0.855
0.876
0.828
0.634
0.835
0.919
0.85
0.777
0.783
0.804
0.867
0.635
0.624
0.82
0.773
0.735
0.866
0.542
0.871
GPT-4o mini (2024-07-18)
0.42
0.77
0.51
0.85
0.649
0.45
0.87
0.772
0.864
0.91
0.849
0.79
0.845
0.91
0.67
0.47
0.763
0.559
0.791
0.731
0.651
0.556
0.919
0.729
0.88
0.885
0.929
0.959
0.851
0.419
0.899
0.589
0.928
0.704
0.907
0.903
0.812
0.863
0.926
0.871
0.616
0.845
0.927
0.89
0.806
0.485
0.827
0.833
0.667
0.592
0.833
0.791
0.788
0.9
0.536
0.86
Yi (34B)
0.4
0.748
0.52
0.83
0.588
0.53
0.898
0.82
0.82
0.91
0.901
0.75
0.8
0.889
0.63
0.47
0.705
0.5
0.77
0.779
0.656
0.548
0.894
0.635
0.82
0.867
0.909
0.974
0.803
0.467
0.857
0.517
0.917
0.671
0.907
0.907
0.789
0.87
0.909
0.883
0.58
0.893
0.936
0.87
0.827
0.606
0.869
0.877
0.674
0.598
0.835
0.745
0.833
0.9
0.572
0.877
Claude 3 Sonnet (20240229)
0.39
0.711
0.5
0.79
0.64
0.53
0.861
0.852
0.868
0.94
0.855
0.82
0.804
0.91
0.66
0.41
0.74
0.559
0.774
0.703
0.635
0.579
0.887
0.606
0.84
0.848
0.909
0.959
0.787
0.422
0.853
0.477
0.932
0.694
0.931
0.895
0.794
0.809
0.909
0.853
0.643
0.922
0.85
0.79
0.829
0.626
0.82
0.864
0.656
0.645
0.814
0.782
0.865
0.905
0.578
0.871
Gemma 2 (27B)
0.4
0.77
0.53
0.84
0.667
0.43
0.861
0.849
0.835
0.95
0.829
0.78
0.808
0.917
0.61
0.43
0.78
0.5
0.834
0.738
0.558
0.516
0.91
0.704
0.71
0.824
0.934
0.979
0.808
0.47
0.887
0.589
0.932
0.713
0.917
0.89
0.821
0.84
0.843
0.865
0.625
0.864
0.94
0.87
0.838
0.394
0.824
0.877
0.578
0.611
0.84
0.745
0.808
0.9
0.56
0.924
Phi-3 (7B)
0.44
0.726
0.54
0.77
0.596
0.52
0.843
0.82
0.824
0.95
0.849
0.77
0.83
0.847
0.62
0.48
0.746
0.559
0.779
0.69
0.619
0.595
0.919
0.68
0.75
0.873
0.899
0.969
0.81
0.496
0.899
0.576
0.925
0.694
0.907
0.848
0.785
0.817
0.851
0.81
0.652
0.903
0.897
0.84
0.795
0.711
0.833
0.858
0.631
0.6
0.835
0.727
0.804
0.886
0.548
0.825
Qwen1.5 (32B)
0.4
0.644
0.5
0.77
0.561
0.47
0.843
0.826
0.809
0.91
0.855
0.77
0.781
0.868
0.59
0.51
0.763
0.51
0.766
0.731
0.685
0.524
0.894
0.645
0.85
0.861
0.919
0.974
0.782
0.5
0.832
0.53
0.91
0.694
0.902
0.869
0.785
0.847
0.884
0.822
0.616
0.874
0.936
0.85
0.795
0.545
0.81
0.83
0.571
0.573
0.75
0.664
0.829
0.881
0.578
0.854
DBRX Instruct
0.34
0.667
0.51
0.83
0.605
0.46
0.843
0.804
0.82
0.93
0.836
0.78
0.789
0.896
0.67
0.39
0.74
0.539
0.74
0.71
0.563
0.563
0.868
0.655
0.8
0.842
0.899
0.953
0.785
0.444
0.832
0.477
0.908
0.657
0.907
0.903
0.767
0.878
0.884
0.847
0.625
0.854
0.94
0.85
0.815
0.465
0.814
0.84
0.585
0.579
0.801
0.691
0.804
0.896
0.566
0.871
Gemini 1.5 Flash (002)
0.63
0.793
0.56
0.72
0.675
0.47
0.852
0.797
0.647
0.81
0.895
0.27
0.792
0.618
0.7
0.52
0.78
0.637
0.851
0.772
0.704
0.595
0.855
0.749
0.9
0.83
0.904
0.959
0.838
0.5
0.903
0.616
0.897
0.727
0.897
0.869
0.74
0.847
0.752
0.859
0.616
0.893
0.953
0.89
0.572
0.676
0.588
0.762
0.621
0.603
0.806
0.7
0.547
0.851
0.524
0.865
Claude 3 Haiku (20240307)
0.42
0.711
0.52
0.79
0.632
0.47
0.861
0.814
0.857
0.95
0.901
0.78
0.789
0.882
0.66
0.49
0.734
0.48
0.715
0.69
0.558
0.579
0.855
0.631
0.78
0.818
0.864
0.948
0.728
0.37
0.853
0.47
0.916
0.671
0.892
0.878
0.767
0.824
0.901
0.791
0.589
0.874
0.91
0.8
0.809
0.502
0.83
0.824
0.539
0.577
0.802
0.755
0.808
0.9
0.542
0.871
Claude 2.1
0.4
0.726
0.49
0.81
0.596
0.55
0.87
0.794
0.798
0.92
0.855
0.73
0.785
0.882
0.69
0.37
0.734
0.5
0.766
0.724
0.521
0.5
0.884
0.586
0.81
0.848
0.899
0.959
0.772
0.352
0.832
0.477
0.899
0.694
0.848
0.903
0.776
0.847
0.901
0.834
0.482
0.825
0.923
0.81
0.809
0.52
0.781
0.821
0.613
0.607
0.797
0.773
0.812
0.886
0.554
0.854
Qwen2.5 Instruct Turbo (7B)
0.49
0.689
0.51
0.79
0.64
0.42
0.796
0.746
0.779
0.86
0.836
0.82
0.785
0.799
0.67
0.44
0.676
0.51
0.736
0.717
0.643
0.587
0.832
0.645
0.82
0.855
0.848
0.912
0.782
0.526
0.887
0.576
0.879
0.708
0.868
0.878
0.744
0.794
0.86
0.773
0.554
0.845
0.919
0.85
0.757
0.511
0.778
0.836
0.589
0.504
0.757
0.709
0.682
0.861
0.578
0.83
DeepSeek LLM Chat (67B)
0.44
0.667
0.51
0.79
0.553
0.46
0.852
0.801
0.765
0.91
0.822
0.86
0.785
0.854
0.54
0.42
0.699
0.363
0.723
0.669
0.548
0.548
0.839
0.567
0.74
0.83
0.889
0.964
0.746
0.374
0.824
0.457
0.903
0.671
0.907
0.911
0.825
0.84
0.851
0.847
0.563
0.903
0.923
0.73
0.798
0.544
0.781
0.858
0.571
0.591
0.809
0.7
0.796
0.876
0.554
0.865
Gemma 2 (9B)
0.4
0.704
0.53
0.81
0.579
0.53
0.833
0.772
0.779
0.9
0.789
0.77
0.777
0.882
0.55
0.42
0.705
0.5
0.732
0.724
0.577
0.492
0.894
0.655
0.77
0.794
0.894
0.953
0.774
0.485
0.819
0.464
0.91
0.676
0.892
0.865
0.753
0.809
0.835
0.816
0.509
0.874
0.919
0.84
0.777
0.295
0.775
0.812
0.514
0.55
0.788
0.736
0.78
0.9
0.53
0.86
Mixtral (8x7B 32K seqlen)
0.38
0.696
0.52
0.81
0.605
0.46
0.833
0.797
0.794
0.93
0.829
0.72
0.785
0.868
0.67
0.46
0.699
0.51
0.681
0.676
0.476
0.532
0.835
0.631
0.71
0.818
0.854
0.933
0.715
0.381
0.777
0.503
0.895
0.63
0.858
0.886
0.771
0.87
0.86
0.767
0.509
0.845
0.923
0.76
0.801
0.444
0.83
0.849
0.578
0.537
0.779
0.682
0.792
0.871
0.506
0.871
Gemini 1.0 Pro (001)
0.34
0.652
0.44
0.84
0.553
0.49
0.861
0.762
0.728
0.89
0.796
0.69
0.758
0.847
0.55
0.51
0.659
0.333
0.706
0.69
0.476
0.468
0.861
0.567
0.79
0.824
0.874
0.933
0.731
0.393
0.803
0.437
0.877
0.579
0.873
0.865
0.776
0.618
0.876
0.804
0.527
0.845
0.91
0.8
0.772
0.46
0.788
0.802
0.525
0.557
0.752
0.691
0.804
0.9
0.536
0.86
Jamba 1.5 Mini
0.33
0.711
0.46
0.73
0.491
0.43
0.88
0.752
0.779
0.9
0.822
0.76
0.74
0.806
0.61
0.46
0.659
0.48
0.677
0.683
0.553
0.452
0.819
0.562
0.79
0.836
0.879
0.943
0.756
0.389
0.748
0.411
0.883
0.556
0.887
0.84
0.749
0.809
0.893
0.81
0.509
0.825
0.915
0.69
0.783
0.269
0.801
0.824
0.535
0.533
0.76
0.727
0.755
0.876
0.578
0.842
Llama 2 (70B)
0.31
0.607
0.52
0.77
0.43
0.47
0.824
0.791
0.75
0.92
0.829
0.73
0.717
0.84
0.59
0.39
0.653
0.363
0.668
0.634
0.421
0.468
0.816
0.512
0.76
0.812
0.879
0.933
0.731
0.359
0.769
0.464
0.886
0.597
0.931
0.882
0.794
0.84
0.868
0.791
0.491
0.845
0.889
0.72
0.775
0.45
0.758
0.84
0.571
0.541
0.76
0.745
0.796
0.9
0.53
0.854
Command R Plus
0.21
0.644
0.55
0.74
0.561
0.5
0.806
0.695
0.732
0.89
0.783
0.77
0.743
0.833
0.59
0.37
0.688
0.52
0.591
0.71
0.474
0.484
0.81
0.576
0.77
0.824
0.874
0.922
0.69
0.37
0.807
0.391
0.873
0.611
0.848
0.827
0.74
0.786
0.835
0.791
0.518
0.835
0.927
0.77
0.714
0.585
0.742
0.821
0.514
0.557
0.735
0.709
0.751
0.876
0.56
0.842
PaLM-2 (Bison)
0.39
0.644
0.52
0.74
0.518
0.38
0.769
0.736
0.728
0.87
0.803
0.76
0.725
0.826
0.51
0.35
0.613
0.51
0.694
0.69
0.487
0.5
0.829
0.596
0.74
0.848
0.848
0.927
0.692
0.378
0.803
0.331
0.894
0.588
0.877
0.869
0.691
0.84
0.835
0.853
0.563
0.893
0.893
0.75
0.766
0.369
0.709
0.812
0.578
0.489
0.761
0.691
0.812
0.92
0.494
0.883
GPT-3.5 Turbo (0613)
0.38
0.659
0.5
0.81
0.5
0.37
0.806
0.759
0.794
0.88
0.763
0.75
0.777
0.785
0.58
0.33
0.659
0.461
0.613
0.648
0.5
0.397
0.839
0.557
0.76
0.8
0.904
0.922
0.685
0.337
0.79
0.397
0.87
0.597
0.843
0.857
0.762
0.786
0.843
0.791
0.455
0.845
0.91
0.8
0.757
0.404
0.758
0.787
0.535
0.529
0.732
0.745
0.8
0.871
0.542
0.836
Claude Instant 1.2
0.37
0.637
0.51
0.76
0.614
0.38
0.833
0.756
0.75
0.9
0.743
0.7
0.709
0.778
0.6
0.4
0.682
0.49
0.613
0.641
0.45
0.444
0.823
0.537
0.75
0.818
0.894
0.902
0.695
0.333
0.773
0.45
0.861
0.556
0.873
0.878
0.726
0.794
0.851
0.81
0.67
0.835
0.885
0.71
0.757
0.488
0.735
0.762
0.582
0.57
0.724
0.627
0.784
0.841
0.548
0.784
Mistral Large (2402)
0.45
0.674
0.38
0.8
0.64
0.34
0.815
0.794
0.471
0.92
0.842
0.67
0.751
0.826
0.57
0.22
0.711
0.373
0.574
0.545
0.508
0.532
0.861
0.522
0.7
0.624
0.909
0.964
0.731
0.211
0.828
0.437
0.91
0.574
0.824
0.886
0.803
0.847
0.868
0.81
0.563
0.854
0.897
0.74
0.829
0.579
0.791
0.904
0.404
0.319
0.809
0.709
0.824
0.93
0.554
0.883
Mistral Small (2402)
0.26
0.674
0.43
0.77
0.614
0.45
0.833
0.765
0.746
0.89
0.77
0.71
0.766
0.806
0.55
0.29
0.653
0.402
0.685
0.628
0.415
0.516
0.865
0.542
0.78
0.818
0.874
0.948
0.738
0.215
0.79
0.364
0.888
0.551
0.809
0.857
0.744
0.824
0.826
0.804
0.563
0.786
0.906
0.75
0.743
0.575
0.761
0.802
0.486
0.509
0.768
0.773
0.788
0.871
0.542
0.848
Qwen1.5 (14B)
0.4
0.637
0.46
0.84
0.561
0.49
0.769
0.717
0.724
0.87
0.724
0.75
0.736
0.785
0.56
0.42
0.694
0.48
0.694
0.683
0.603
0.492
0.848
0.626
0.77
0.848
0.869
0.881
0.728
0.444
0.744
0.464
0.857
0.62
0.819
0.84
0.74
0.756
0.826
0.736
0.509
0.816
0.893
0.76
0.728
0.368
0.742
0.71
0.482
0.486
0.699
0.655
0.8
0.841
0.458
0.842
Arctic Instruct
0.35
0.652
0.46
0.84
0.5
0.39
0.741
0.752
0.721
0.88
0.763
0.69
0.781
0.764
0.57
0.37
0.699
0.461
0.634
0.662
0.481
0.444
0.823
0.591
0.75
0.752
0.864
0.912
0.682
0.363
0.761
0.411
0.868
0.532
0.833
0.827
0.704
0.847
0.826
0.779
0.473
0.796
0.902
0.76
0.749
0.28
0.725
0.79
0.525
0.511
0.724
0.664
0.78
0.891
0.536
0.854
GPT-3.5 Turbo (0125)
0.31
0.696
0.49
0.78
0.474
0.39
0.806
0.746
0.772
0.89
0.75
0.75
0.755
0.792
0.58
0.35
0.63
0.471
0.634
0.669
0.534
0.444
0.797
0.557
0.7
0.782
0.864
0.922
0.672
0.307
0.727
0.364
0.862
0.491
0.838
0.819
0.749
0.779
0.81
0.779
0.455
0.835
0.91
0.73
0.734
0.355
0.748
0.735
0.493
0.487
0.722
0.727
0.751
0.861
0.536
0.842
Llama 3 (8B)
0.33
0.696
0.48
0.8
0.518
0.34
0.741
0.743
0.728
0.88
0.711
0.65
0.751
0.764
0.58
0.35
0.624
0.451
0.557
0.669
0.426
0.468
0.781
0.547
0.67
0.745
0.828
0.881
0.638
0.415
0.731
0.417
0.85
0.556
0.843
0.823
0.713
0.748
0.843
0.755
0.545
0.874
0.885
0.83
0.717
0.416
0.761
0.738
0.482
0.468
0.711
0.736
0.771
0.866
0.566
0.819
Gemma (7B)
0.28
0.563
0.48
0.75
0.474
0.42
0.769
0.727
0.658
0.87
0.717
0.65
0.698
0.792
0.53
0.34
0.74
0.412
0.621
0.628
0.516
0.508
0.79
0.507
0.67
0.758
0.838
0.891
0.638
0.4
0.689
0.417
0.837
0.574
0.848
0.857
0.709
0.733
0.835
0.742
0.554
0.864
0.885
0.7
0.72
0.377
0.778
0.756
0.496
0.487
0.712
0.682
0.735
0.841
0.548
0.842
Jamba Instruct
0.36
0.615
0.44
0.76
0.439
0.4
0.796
0.749
0.654
0.91
0.73
0.6
0.702
0.771
0.48
0.39
0.601
0.422
0.677
0.621
0.497
0.444
0.784
0.498
0.68
0.812
0.859
0.881
0.703
0.341
0.718
0.358
0.859
0.472
0.853
0.797
0.704
0.794
0.835
0.706
0.536
0.786
0.885
0.67
0.717
0.465
0.745
0.796
0.511
0.498
0.716
0.682
0.743
0.891
0.53
0.813
Mistral NeMo (2402)
0.29
0.607
0.47
0.81
0.561
0.4
0.796
0.733
0.761
0.89
0.691
0.49
0.736
0.771
0.51
0.39
0.682
0.373
0.647
0.531
0.439
0.405
0.803
0.483
0.75
0.776
0.833
0.912
0.672
0.311
0.739
0.404
0.872
0.565
0.858
0.848
0.709
0.702
0.769
0.791
0.402
0.796
0.889
0.78
0.691
0.381
0.709
0.765
0.528
0.502
0.588
0.718
0.771
0.726
0.56
0.789
Command R
0.33
0.615
0.45
0.78
0.456
0.42
0.796
0.685
0.658
0.82
0.743
0.63
0.751
0.771
0.48
0.37
0.659
0.382
0.528
0.593
0.437
0.405
0.745
0.542
0.69
0.83
0.848
0.891
0.633
0.326
0.702
0.364
0.848
0.528
0.853
0.84
0.731
0.763
0.802
0.798
0.446
0.796
0.872
0.81
0.668
0.451
0.703
0.728
0.507
0.526
0.681
0.7
0.714
0.866
0.542
0.813
Yi (6B)
0.3
0.6
0.4
0.73
0.351
0.43
0.796
0.678
0.658
0.87
0.684
0.67
0.66
0.681
0.48
0.37
0.624
0.422
0.621
0.662
0.452
0.452
0.774
0.517
0.68
0.752
0.838
0.907
0.641
0.33
0.748
0.371
0.831
0.528
0.789
0.785
0.686
0.763
0.769
0.779
0.411
0.806
0.893
0.77
0.731
0.335
0.739
0.713
0.486
0.484
0.668
0.718
0.735
0.831
0.452
0.836
Qwen1.5 (7B)
0.39
0.526
0.41
0.76
0.447
0.4
0.778
0.691
0.581
0.84
0.671
0.69
0.691
0.667
0.56
0.37
0.618
0.471
0.579
0.572
0.5
0.397
0.761
0.547
0.73
0.733
0.818
0.839
0.597
0.367
0.639
0.364
0.817
0.523
0.789
0.789
0.632
0.695
0.76
0.706
0.411
0.816
0.863
0.69
0.685
0.372
0.696
0.688
0.457
0.452
0.603
0.627
0.727
0.836
0.488
0.778
Mistral Instruct v0.3 (7B)
0.27
0.585
0.37
0.7
0.421
0.33
0.713
0.659
0.632
0.79
0.638
0.57
0.687
0.694
0.45
0.3
0.601
0.343
0.549
0.572
0.402
0.397
0.645
0.458
0.66
0.733
0.773
0.881
0.562
0.293
0.613
0.258
0.802
0.486
0.789
0.759
0.668
0.702
0.76
0.712
0.455
0.767
0.842
0.75
0.673
0.393
0.676
0.673
0.454
0.392
0.641
0.636
0.682
0.806
0.47
0.825
Phi-2
0.31
0.437
0.43
0.73
0.342
0.35
0.694
0.598
0.482
0.78
0.605
0.59
0.619
0.66
0.46
0.4
0.595
0.382
0.519
0.545
0.463
0.389
0.716
0.483
0.66
0.648
0.758
0.798
0.585
0.341
0.609
0.364
0.806
0.509
0.681
0.73
0.641
0.733
0.752
0.767
0.5
0.748
0.833
0.62
0.702
0.231
0.627
0.605
0.426
0.426
0.572
0.673
0.702
0.816
0.47
0.702
Mistral v0.1 (7B)
0.25
0.467
0.37
0.69
0.351
0.29
0.667
0.63
0.592
0.79
0.599
0.56
0.653
0.667
0.4
0.26
0.52
0.314
0.451
0.538
0.32
0.365
0.506
0.448
0.56
0.715
0.702
0.845
0.569
0.274
0.655
0.311
0.783
0.389
0.755
0.726
0.596
0.702
0.76
0.693
0.438
0.709
0.833
0.68
0.665
0.33
0.657
0.642
0.447
0.424
0.578
0.6
0.731
0.831
0.44
0.789
Llama 3.2 Vision Instruct Turbo (11B)
0.28
0.533
0.39
0.71
0.395
0.25
0.722
0.646
0.474
0.78
0.671
0.64
0.638
0.764
0.39
0.29
0.671
0.333
0.536
0.51
0.458
0.46
0.487
0.552
0.56
0.43
0.823
0.865
0.664
0.285
0.714
0.377
0.815
0.375
0.402
0.502
0.61
0.763
0.711
0.742
0.375
0.728
0.838
0.7
0.636
0.328
0.752
0.744
0.323
0.292
0.649
0.645
0.567
0.627
0.446
0.696
Llama 3.1 Instruct Turbo (8B)
0.26
0.459
0.39
0.71
0.351
0.26
0.731
0.64
0.467
0.79
0.645
0.65
0.615
0.75
0.4
0.3
0.642
0.363
0.528
0.441
0.429
0.444
0.423
0.552
0.59
0.412
0.818
0.865
0.674
0.307
0.71
0.391
0.802
0.347
0.436
0.515
0.61
0.733
0.694
0.742
0.384
0.709
0.833
0.66
0.592
0.368
0.712
0.728
0.312
0.293
0.649
0.664
0.576
0.701
0.446
0.789
Llama 2 (13B)
0.27
0.496
0.39
0.69
0.307
0.38
0.704
0.672
0.526
0.83
0.546
0.55
0.592
0.597
0.4
0.29
0.532
0.235
0.413
0.49
0.307
0.381
0.674
0.488
0.57
0.655
0.712
0.798
0.505
0.274
0.597
0.311
0.75
0.417
0.765
0.705
0.641
0.618
0.752
0.687
0.286
0.738
0.786
0.57
0.63
0.407
0.627
0.654
0.429
0.422
0.567
0.6
0.608
0.761
0.476
0.76
OLMo 1.7 (7B)
0.33
0.496
0.43
0.65
0.404
0.34
0.565
0.592
0.526
0.76
0.526
0.59
0.57
0.535
0.48
0.38
0.538
0.333
0.434
0.517
0.307
0.325
0.632
0.414
0.59
0.63
0.687
0.751
0.521
0.319
0.563
0.358
0.73
0.454
0.623
0.713
0.61
0.595
0.612
0.607
0.375
0.689
0.769
0.56
0.546
0.335
0.608
0.593
0.429
0.39
0.526
0.6
0.522
0.751
0.452
0.731
Llama 2 (7B)
0.29
0.452
0.27
0.59
0.316
0.29
0.519
0.592
0.526
0.64
0.408
0.48
0.453
0.41
0.38
0.28
0.462
0.196
0.434
0.407
0.254
0.27
0.51
0.35
0.38
0.606
0.51
0.674
0.438
0.307
0.429
0.298
0.631
0.259
0.554
0.662
0.547
0.557
0.628
0.466
0.402
0.563
0.697
0.53
0.488
0.238
0.497
0.503
0.355
0.36
0.459
0.509
0.433
0.617
0.392
0.713
OLMo (7B)
0.26
0.222
0.38
0.3
0.325
0.32
0.25
0.325
0.449
0.26
0.342
0.24
0.26
0.236
0.35
0.27
0.26
0.294
0.319
0.29
0.254
0.278
0.335
0.31
0.22
0.236
0.227
0.358
0.336
0.267
0.294
0.325
0.259
0.454
0.25
0.253
0.278
0.267
0.306
0.264
0.286
0.272
0.269
0.28
0.272
0.265
0.34
0.318
0.223
0.238
0.232
0.345
0.408
0.383
0.416
0.234

No dataset card yet

Downloads last month
9