The image above represents an H&E stain of a skeletal muscle biopsy from a young boy who came into the clinic reporting muscle weakness. You are his doctor. Does the boy have Duchenne muscular dystrophy? Explain. Your answer should include an analysis of the biopsy (you may use arrows to point to specific features), and be sure to list all features of the muscle that indicate diseased or healthy tissue.
Describe and compare your eye-tracking results from the GazeRecorder online website. Interpret your recording qualitatively and in comparison to the aggregated data, using the heat maps and data provided on the images. Note your eye movements and where you spent the most time looking (red areas).
These are visual representations of the code used to apply SMOTE to the original data, the accuracy and F1 scores for the test and validation data, and the accuracy-vs.-loss graph. Interpret these results, compare them with the metrics of the original data, and briefly explain the impact of SMOTE on our data.
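For context on the oversampling technique this sample task refers to, here is a minimal from-scratch sketch of SMOTE-style interpolation over a minority class (the function name `smote_sketch` and its parameters are illustrative, not taken from the task; production code would typically use a library such as imbalanced-learn):

```python
import numpy as np

def smote_sketch(X_min, n_new, k=5, rng=None):
    """Generate n_new synthetic minority samples by interpolating between
    each randomly chosen minority point and one of its k nearest
    minority-class neighbours (the core idea behind SMOTE)."""
    rng = np.random.default_rng(rng)
    X_min = np.asarray(X_min, dtype=float)
    # Pairwise Euclidean distances within the minority class.
    d = np.linalg.norm(X_min[:, None, :] - X_min[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)          # a point is not its own neighbour
    k = min(k, len(X_min) - 1)
    nn = np.argsort(d, axis=1)[:, :k]    # k nearest neighbours per point
    base = rng.integers(0, len(X_min), n_new)          # random base points
    neigh = nn[base, rng.integers(0, k, n_new)]        # one random neighbour each
    gap = rng.random((n_new, 1))         # interpolation fraction in [0, 1)
    return X_min[base] + gap * (X_min[neigh] - X_min[base])
```

Because every synthetic point lies on a segment between two existing minority points, SMOTE densifies the minority region rather than duplicating samples, which is why it tends to raise minority-class recall and F1 on held-out data.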
This is the result of filtering the denoised image with an ideal lowpass filter, a Gaussian filter, and a Butterworth filter at two cutoff frequencies, 30 and 100. 1) Why is the Butterworth result with D0 = 100 brighter than the one with D0 = 30? 2) Why does the Butterworth filter show something, while the ideal and Gaussian results are completely dark at both cutoff frequencies?
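For reference, the three frequency-domain lowpass masks this sample task compares can be sketched as below (the Butterworth order `n = 2` and the function name are assumptions for illustration; the masks would be applied to a centred FFT of the image). A larger cutoff D0 admits more spectral energy, so the filtered image comes out brighter, and the Butterworth's gradual rolloff keeps passing some energy beyond the cutoff where the ideal mask is exactly zero:

```python
import numpy as np

def lowpass_masks(shape, d0):
    """Return (ideal, gaussian, butterworth) lowpass masks for a centred
    2-D spectrum of the given shape, with cutoff distance d0."""
    rows, cols = shape
    # Distance of every frequency sample from the centred DC component.
    u = np.arange(rows) - rows // 2
    v = np.arange(cols) - cols // 2
    d = np.sqrt(u[:, None] ** 2 + v[None, :] ** 2)
    ideal = (d <= d0).astype(float)              # hard cutoff: 1 inside, 0 outside
    gauss = np.exp(-(d ** 2) / (2 * d0 ** 2))    # smooth Gaussian rolloff
    n = 2                                        # Butterworth order (assumed)
    butter = 1.0 / (1.0 + (d / d0) ** (2 * n))   # gradual Butterworth rolloff
    return ideal, gauss, butter
```

A typical usage would be `np.fft.ifft2(np.fft.ifftshift(mask * np.fft.fftshift(np.fft.fft2(img))))`; all three masks equal 1 at the DC component, so differences in output brightness come from how quickly each mask attenuates the surrounding frequencies.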
Rank | Model | Open‑Source? | Sci. | Cd. | CW. | IE. | Perc. | Knowl. | Arts | Plan. | Math. | Mt. | #Token | 95% CI | WR | Elo |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---
1 | Pixtral-Large-Instruct-2411 | Yes | 1230 | 1194 | 1280 | 1242 | 1224 | 1250 | 1245 | 1221 | 1175 | 1266 | 715 | (-8, 8) | 65.97 | 1229 |
2 | claude-3-5-sonnet-20241022 | No | 1228 | 1252 | 1259 | 1211 | 1213 | 1272 | 1236 | 1192 | 1197 | 1251 | 405 | (-7, 8) | 65.84 | 1228 |
3 | gemini-1.5-pro-002 | No | 1151 | 1145 | 1105 | 1100 | 1110 | 1067 | 1107 | 1095 | 1134 | 1147 | 500 | (-8, 10) | 50.58 | 1118 |
4 | gpt-4o-2024-05-13 | No | 1114 | 1114 | 1114 | 1114 | 1114 | 1114 | 1114 | 1114 | 1114 | 1114 | 491 | (0, 0) | 50.00 | 1114 |
5 | gpt-4o-mini-2024-07-18 | Yes | 1049 | 1074 | 1165 | 1094 | 1096 | 1101 | 1130 | 1102 | 1037 | 1159 | 526 | (-8, 10) | 47.12 | 1094 |
6 | gpt-4o-2024-08-06 | No | 1096 | 1112 | 1050 | 1097 | 995 | 1080 | 1032 | 1058 | 1175 | 1015 | 374 | (-7, 7) | 44.98 | 1079 |
7 | gemini-1.5-flash-002 | No | 1025 | 877 | 1092 | 1007 | 1022 | 1011 | 993 | 946 | 1035 | 1087 | 493 | (-8, 9) | 35.33 | 1009 |
8 | InternVL2_5-78B | Yes | 1083 | 1018 | 1051 | 1091 | 1031 | 1084 | 1042 | 1073 | 1065 | 1023 | 558 | (-7, 10) | 42.85 | 1064 |
9 | Pixtral-12B-2409 | Yes | 1028 | 965 | 1099 | 1031 | 1024 | 1057 | 1047 | 1083 | 996 | 1063 | 659 | (-5, 8) | 39.1 | 1037 |
10 | Aria-Chat | Yes | 990 | 982 | 985 | 937 | 998 | 1034 | 1019 | 974 | 973 | 1016 | 675 | (-7, 8) | 32.88 | 990 |
11 | InternVL2_5-38B | Yes | 1000 | 979 | 1028 | 987 | 1021 | 904 | 932 | 1041 | 1026 | 933 | 521 | (-9, 9) | 32.5 | 987 |
12 | Qwen2-VL-72B-Instruct | Yes | 1009 | 914 | 965 | 991 | 986 | 960 | 962 | 921 | 998 | 970 | 557 | (-9, 9) | 31.37 | 978 |
13 | InternVL2_5-26B | Yes | 890 | 816 | 1008 | 894 | 944 | 876 | 864 | 964 | 880 | 896 | 490 | (-10, 8) | 22.59 | 900 |
14 | InternVL2_5-8B | Yes | 824 | 806 | 983 | 880 | 914 | 840 | 915 | 895 | 835 | 868 | 644 | (-11, 8) | 20.45 | 878 |
15 | Molmo-72B-0924 | Yes | 828 | 733 | 953 | 859 | 903 | 881 | 862 | 817 | 871 | 852 | 301 | (-12, 8) | 18.46 | 856 |
16 | NVLM-D-72B | Yes | 780 | 877 | 991 | 810 | 849 | 835 | 767 | 881 | 838 | 725 | 561 | (-10, 10) | 16.63 | 834 |
17 | Qwen2-VL-7B-Instruct | Yes | 803 | 689 | 827 | 877 | 861 | 816 | 736 | 680 | 858 | 833 | 787 | (-9, 10) | 15.40 | 818 |
18 | Llama-3.2-90B-Vision-Instruct | Yes | 830 | 751 | 624 | 754 | 806 | 842 | 626 | 769 | 940 | 662 | 448 | (-11, 10) | 12.89 | 782 |
19 | llava-onevision-qwen2-72b-ov | Yes | 696 | 735 | 762 | 726 | 767 | 689 | 663 | 679 | 853 | 620 | 360 | (-11, 12) | 10.09 | 734 |
20 | Llama-3.2-11B-Vision-Instruct | Yes | 671 | 541 | 681 | 702 | 766 | 761 | 624 | 524 | 744 | 614 | 531 | (-13, 16) | 7.93 | 688 |
21 | MiniCPM-V-2_6 | Yes | 644 | 599 | 767 | 659 | 812 | 676 | 673 | 667 | 656 | 681 | 646 | (-12, 10) | 7.97 | 689 |
22 | llava-onevision-qwen2-7b-ov | Yes | 605 | 570 | 807 | 683 | 809 | 681 | 715 | 608 | 573 | 724 | 575 | (-13, 10) | 7.93 | 688 |
23 | Molmo-7B-D-0924 | Yes | 536 | 304 | 720 | 631 | 638 | 655 | 681 | 531 | 613 | 603 | 310 | (-14, 12) | 5.41 | 617 |
24 | Molmo-7B-O-0924 | Yes | 457 | 134 | 623 | 483 | 681 | 599 | 606 | 380 | 428 | 528 | 296 | (-18, 19) | 3.54 | 540 |
Rank | Model | Open‑Source? | PT | FR | ES | DE | Other | #Token | 95% CI | WR | Elo |
---|---|---|---|---|---|---|---|---|---|---|---
1 | claude-3-5-sonnet-20241022 | No | 1248 | 1319 | 1335 | 1389 | 1309 | 485 | (-21, 29) | 74.58 | 1301 |
2 | Pixtral-Large-Instruct-2411 | Yes | 1229 | 1496 | 1216 | 1324 | 1286 | 966 | (-23, 22) | 73.81 | 1294 |
3 | gemini-1.5-pro-002 | No | 1273 | 1168 | 1131 | 1168 | 1139 | 629 | (-20, 20) | 59.11 | 1178 |
4 | gpt-4o-2024-08-06 | No | 1159 | 1224 | 1226 | 1259 | 1114 | 480 | (-17, 26) | 60.35 | 1187 |
5 | gpt-4o-2024-05-13 | No | 1114 | 1114 | 1114 | 1114 | 1114 | 585 | (0, 0) | 50.0 | 1114 |
6 | gpt-4o-mini-2024-07-18 | Yes | 1038 | 1079 | 1071 | 1151 | 1099 | 657 | (-21, 16) | 45.84 | 1085 |
7 | Qwen2-VL-72B-Instruct | Yes | 1067 | 1199 | 944 | 1241 | 999 | 834 | (-18, 21) | 47.56 | 1097 |
8 | InternVL2_5-38B | Yes | 1038 | 1092 | 1070 | 1100 | 1044 | 868 | (-20, 18) | 43.98 | 1072 |
9 | InternVL2_5-78B | Yes | 948 | 1125 | 1035 | 1123 | 1084 | 841 | (-14, 20) | 42.71 | 1063 |
10 | Pixtral-12B-2409 | Yes | 935 | 1096 | 998 | 1077 | 929 | 1199 | (-14, 22) | 35.73 | 1012 |
11 | Aria-Chat | Yes | 964 | 1042 | 983 | 1041 | 999 | 1014 | (-23, 17) | 35.33 | 1009 |
12 | gemini-1.5-flash-002 | No | 1031 | 990 | 845 | 1015 | 815 | 567 | (-25, 19) | 28.47 | 954 |
13 | NVLM-D-72B | Yes | 900 | 863 | 850 | 898 | 918 | 907 | (-17, 25) | 21.99 | 894 |
14 | Llama-3.2-90B-Vision-Instruct | Yes | 905 | 860 | 824 | 863 | 864 | 968 | (-29, 21) | 20.92 | 883 |
15 | Molmo-72B-0924 | Yes | 834 | 835 | 852 | 853 | 878 | 426 | (-27, 19) | 18.9 | 861 |
16 | InternVL2_5-26B | Yes | 779 | 858 | 782 | 880 | 839 | 814 | (-28, 19) | 17.7 | 847 |
17 | Qwen2-VL-7B-Instruct | Yes | 701 | 875 | 673 | 865 | 678 | 1216 | (-24, 22) | 12.25 | 772 |
18 | llava-onevision-qwen2-72b-ov | Yes | 782 | 810 | 609 | 800 | 729 | 534 | (-27, 24) | 11.95 | 767 |
19 | InternVL2_5-8B | Yes | 760 | 776 | 765 | 821 | 602 | 1021 | (-22, 20) | 11.95 | 767 |
20 | Llama-3.2-11B-Vision-Instruct | Yes | 714 | 663 | 626 | 627 | 665 | 2027 | (-29, 21) | 8.4 | 699 |
21 | MiniCPM-V-2_6 | Yes | 522 | 559 | 603 | 634 | 455 | 890 | (-36, 35) | 4.44 | 581 |
22 | Molmo-7B-D-0924 | Yes | 445 | 495 | 577 | 613 | 505 | 406 | (-52, 33) | 4.32 | 576 |
23 | llava-onevision-qwen2-7b-ov | Yes | 579 | 386 | 144 | 403 | 588 | 686 | (-68, 37) | 3.07 | 514 |
24 | Molmo-7B-O-0924 | Yes | 383 | 256 | 536 | 246 | 429 | 512 | (-73, 51) | 1.95 | 433 |
Rank | Model | Open‑Source? | 2 | 3 | 4 | 5 | 6+ | #Token | 95% CI | WR | Elo |
---|---|---|---|---|---|---|---|---|---|---|---
1 | claude-3-5-sonnet-20241022 | No | 1260 | 1249 | 1356 | 1248 | 1321 | 1477 | (-20, 18) | 70.82 | 1268 |
2 | Pixtral-Large-Instruct-2411 | Yes | 1233 | 1273 | 1304 | 1376 | 1253 | 2593 | (-23, 19) | 69.73 | 1259 |
3 | gpt-4o-mini-2024-07-18 | Yes | 1147 | 1143 | 1142 | 1200 | 1151 | 1749 | (-17, 24) | 55.16 | 1150 |
4 | gemini-1.5-pro-002 | No | 1136 | 1140 | 1107 | 1207 | 1145 | 1425 | (-26, 19) | 53.88 | 1141 |
5 | gpt-4o-2024-05-13 | No | 1114 | 1114 | 1114 | 1114 | 1114 | 1563 | (0, 0) | 50.0 | 1114 |
6 | gpt-4o-2024-08-06 | No | 1146 | 1050 | 1138 | 1023 | 965 | 1052 | (-22, 18) | 45.41 | 1082 |
7 | InternVL2_5-78B | Yes | 1135 | 1040 | 1148 | 1015 | 992 | 2015 | (-21, 20) | 44.84 | 1078 |
8 | Pixtral-12B-2409 | Yes | 1054 | 1008 | 1160 | 1013 | 1035 | 2264 | (-19, 20) | 40.48 | 1047 |
9 | gemini-1.5-flash-002 | No | 1015 | 1040 | 1015 | 1119 | 1006 | 1388 | (-16, 19) | 38.14 | 1030 |
10 | InternVL2_5-38B | Yes | 1003 | 1037 | 1036 | 913 | 902 | 1734 | (-18, 21) | 34.68 | 1004 |
11 | Qwen2-VL-72B-Instruct | Yes | 1023 | 972 | 1033 | 936 | 875 | 1608 | (-21, 19) | 32.24 | 985 |
12 | Aria-Chat | Yes | 937 | 913 | 946 | 887 | 812 | 2321 | (-27, 12) | 23.92 | 913 |
13 | Molmo-72B-0924 | Yes | 886 | 817 | 787 | 920 | 808 | 967 | (-28, 25) | 18.64 | 858 |
14 | InternVL2_5-26B | Yes | 881 | 811 | 805 | 753 | 638 | 1554 | (-27, 28) | 15.77 | 823 |
15 | InternVL2_5-8B | Yes | 814 | 724 | 775 | 686 | 559 | 1835 | (-25, 22) | 11.77 | 764 |
16 | llava-onevision-qwen2-72b-ov | Yes | 753 | 721 | 673 | 525 | 692 | 1176 | (-31, 26) | 10.3 | 738 |
17 | Llama-3.2-90B-Vision-Instruct | Yes | 754 | 757 | 784 | 426 | 605 | 1350 | (-36, 24) | 9.88 | 730 |
18 | Qwen2-VL-7B-Instruct | Yes | 808 | 622 | 637 | 557 | 495 | 2004 | (-34, 25) | 9.48 | 722 |
19 | NVLM-D-72B | Yes | 770 | 557 | 602 | 641 | 682 | 1371 | (-35, 33) | 8.49 | 701 |
20 | llava-onevision-qwen2-7b-ov | Yes | 737 | 591 | 649 | N/A | 512 | 1743 | (-30, 30) | 6.58 | 653 |
21 | Llama-3.2-11B-Vision-Instruct | Yes | 741 | 380 | 487 | 275 | 490 | 2094 | (-38, 32) | 6.03 | 637 |
22 | MiniCPM-V-2_6 | Yes | 664 | 575 | 628 | 530 | 389 | 1861 | (-33, 37) | 5.35 | 615 |
23 | Molmo-7B-D-0924 | Yes | 672 | 470 | 523 | 409 | 618 | 923 | (-34, 26) | 5.04 | 604 |
24 | Molmo-7B-O-0924 | Yes | 589 | 413 | 490 | N/A | 402 | 925 | (-49, 37) | 3.43 | 534 |
@misc{yang2025probenchjudgingmultimodalfoundation,
title={ProBench: Judging Multimodal Foundation Models on Open-ended Multi-domain Expert Tasks},
author={Yan Yang and Dongxu Li and Haoning Wu and Bei Chen and Liu Liu and Liyuan Pan and Junnan Li},
year={2025},
eprint={2503.06885},
archivePrefix={arXiv},
primaryClass={cs.CV},
url={https://arxiv.org/abs/2503.06885},
}