19 models, 5 job postings from hh.ru.
Each model receives a résumé plus a job posting and returns a JSON with a score. Maximum 100 points.
| # | Model | Quality | Parse | Schema | Consist. | Accur. | $/call | $/1000 | Latency | WSM (F1) | Pen. (F2) | WPM (F3) | TOP. (F4) |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 1 | DeepSeek R1 | 93.6 | 38 | 20 | 15 | 20.6 | 0.00619 | 6.19 | 2008 ms | 0.742 | 86.9 | 0.711 | 0.319 |
| 2 | Llama 4 Maverick | 91.4 | 40 | 20 | 15 | 16.4 | 0.00060 | 0.60 | 564 ms | 0.765 | 91.4 | 0.775 | 0.390 |
| 3 | Qwen3 235B | 91.0 | 40 | 20 | 15 | 16.0 | 0.00448 | 4.48 | 896 ms | 0.688 | 85.7 | 0.677 | 0.300 |
| 4 | GPT-4.1-mini | 90.6 | 40 | 20 | 15 | 15.6 | 0.00177 | 1.77 | 449 ms | 0.709 | 89.4 | 0.714 | 0.311 |
| 5 | DeepSeek V3.1 | 90.4 | 36 | 20 | 15 | 19.4 | 0.00075 | 0.75 | 1642 ms | 0.732 | 90.4 | 0.742 | 0.357 |
| 6 | GPT-4o-mini | 89.8 | 40 | 20 | 15 | 14.8 | 0.00065 | 0.65 | 696 ms | 0.722 | 89.8 | 0.732 | 0.368 |
| 7 | Claude 3.5 Haiku | 89.8 | 40 | 20 | 15 | 14.8 | 0.00559 | 5.59 | 1687 ms | 0.651 | 83.6 | 0.637 | 0.288 |
| 8 | Qwen3 8B | 89.2 | 40 | 20 | 15 | 14.2 | 0.00055 | 0.55 | 1265 ms | 0.713 | 89.2 | 0.722 | 0.386 |
| 9 | o4-mini | 89.2 | 40 | 20 | 15 | 14.2 | 0.00846 | 8.46 | 404 ms | 0.622 | 81.2 | 0.597 | 0.280 |
| 10 | Grok 3 Mini | 88.6 | 40 | 20 | 15 | 13.6 | 0.00180 | 1.80 | 446 ms | 0.659 | 87.3 | 0.666 | 0.293 |
| 11 | Grok 4.1 Fast | 87.2 | 40 | 20 | 15 | 12.2 | 0.00133 | 1.33 | 588 ms | 0.634 | 87.2 | 0.644 | 0.290 |
| 12 | Llama 4 Scout | 84.4 | 30 | 20 | 15 | 19.4 | 0.00032 | 0.32 | 202 ms | 0.611 | 84.4 | 0.612 | 0.488 |
| 13 | Mistral Small 3.1 | 83.0 | 30 | 20 | 15 | 18.0 | 0.00015 | 0.15 | 2890 ms | 0.602 | 83.0 | 0.591 | 0.855 |
| 14 | Mistral Small 3.2 | 83.0 | 30 | 20 | 15 | 18.0 | 0.00036 | 0.36 | 491 ms | 0.573 | 83.0 | 0.572 | 0.440 |
| 15 | Gemini 2.5 Flash | 81.6 | 30 | 20 | 15 | 16.6 | 0.00149 | 1.49 | 621 ms | 0.491 | 81.1 | 0.500 | 0.229 |
| 16 | Claude 4.5 Haiku | 79.0 | 30 | 20 | 15 | 14.0 | 0.00639 | 6.39 | 5173 ms | 0.378 | 72.2 | 0.387 | 0.176 |
| 17 | Gemini Flash Lite | 77.8 | 30 | 20 | 15 | 12.8 | 0.00034 | 0.34 | 699 ms | 0.446 | 77.8 | 0.428 | 0.432 |
| 18 | Gemini 2.5 Pro | 75.8 | 30 | 20 | 15 | 10.8 | 0.03385 | 33.85 | 2456 ms | 0.244 | 61.7 | 0.166 | 0.135 |
| 19 | DeepSeek V3.2 | 66.0 | 28 | 16 | 12 | 10.0 | 0.00122 | 1.22 | 2506 ms | 0.110 | 66.0 | 0.021 | 0.105 |
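The Quality column is simply the sum of the four rubric columns (Parse + Schema + Consist. + Accur.); given the stated 100-point cap, the per-column maxima work out to 40/20/15/25. A minimal sanity check against three rows of the table above:

```python
# Quality = Parse + Schema + Consistency + Accuracy.
# Values taken directly from the leaderboard rows.
rows = {
    "DeepSeek R1":      (38, 20, 15, 20.6),   # table total: 93.6
    "Llama 4 Maverick": (40, 20, 15, 16.4),   # table total: 91.4
    "DeepSeek V3.2":    (28, 16, 12, 10.0),   # table total: 66.0
}

for model, parts in rows.items():
    # round() absorbs floating-point noise in the sum
    total = round(sum(parts), 1)
    print(f"{model}: {total}")
```

Every row in the table satisfies this decomposition, so the rubric columns fully determine the ranking.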
Each model generates an informal cover letter. Maximum 100 points, minus penalties for artifacts.
| # | Model | Total | Style | Relev. | Gram. | Penalty | $/call | $/1000 | Latency | WSM (F1) | Pen. (F2) | WPM (F3) | TOP. (F4) |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 1 | Gemini 2.5 Pro | 98.8 | 45 | 35 | 18.8 | — | 0.02801 | 28.01 | 4128 ms | 0.815 | 85.4 | 0.438 | 0.336 |
| 2 | Grok 4.1 Fast | 96.6 | 42.5 | 34.1 | 20 | — | 0.00128 | 1.28 | 550 ms | 0.867 | 96.6 | 0.865 | 0.354 |
| 3 | Claude 4.5 Haiku | 95.6 | 41.8 | 35 | 18.8 | — | 0.00672 | 6.72 | 1831 ms | 0.785 | 88.4 | 0.736 | 0.317 |
| 4 | Gemini 2.5 Flash | 95.0 | 40 | 35 | 20 | — | 0.00147 | 1.47 | 554 ms | 0.822 | 94.4 | 0.822 | 0.336 |
| 5 | Grok 3 Mini | 94.6 | 39.6 | 35 | 20 | — | 0.00163 | 1.63 | 547 ms | 0.809 | 93.6 | 0.808 | 0.330 |
| 6 | DeepSeek V3.1 | 93.9 | 41 | 34.1 | 18.8 | — | 0.00078 | 0.78 | 2874 ms | 0.817 | 93.9 | 0.825 | 0.367 |
| 7 | DeepSeek R1 | 93.8 | 40 | 35 | 18.8 | — | 0.00483 | 4.83 | 1675 ms | 0.752 | 88.0 | 0.725 | 0.305 |
| 8 | GPT-4.1-mini | 92.8 | 39.2 | 33.6 | 20 | — | 0.00176 | 1.76 | 373 ms | 0.761 | 91.4 | 0.762 | 0.313 |
| 9 | Qwen3 235B | 92.7 | 38.9 | 35 | 18.8 | — | 0.00384 | 3.84 | 769 ms | 0.732 | 87.9 | 0.717 | 0.298 |
| 10 | DeepSeek V3.2 | 92.3 | 38.5 | 35 | 18.8 | — | 0.00124 | 1.24 | 1818 ms | 0.761 | 92.3 | 0.767 | 0.322 |
| 11 | Gemini Flash Lite | 91.2 | 38.5 | 32.7 | 20 | — | 0.00034 | 0.34 | 792 ms | 0.778 | 91.2 | 0.788 | 0.509 |
| 12 | o4-mini | 90.2 | 42.8 | 31.8 | 15.6 | — | 0.01129 | 11.29 | 535 ms | 0.633 | 80.8 | 0.578 | 0.271 |
| 13 | Qwen3 8B | 88.9 | 37.4 | 32.7 | 18.8 | — | 0.00056 | 0.56 | 757 ms | 0.704 | 88.9 | 0.714 | 0.371 |
| 14 | Llama 4 Maverick | 88.7 | 36 | 32.7 | 20 | — | 0.00063 | 0.63 | 666 ms | 0.695 | 88.7 | 0.705 | 0.351 |
| 15 | Claude 3.5 Haiku | 86.3 | 36 | 32.7 | 17.6 | — | 0.00503 | 5.03 | 1992 ms | 0.564 | 80.4 | 0.560 | 0.239 |
| 16 | GPT-4o-mini | 82.7 | 34.2 | 28.5 | 20 | — | 0.00067 | 0.67 | 564 ms | 0.544 | 82.7 | 0.549 | 0.293 |
| 17 | Mistral Small 3.1 | 77.3 | 36.7 | 31.8 | 18.8 | −10 | 0.00015 | 0.15 | 803 ms | 0.461 | 77.3 | 0.426 | 0.751 |
| 18 | Mistral Small 3.2 | 76.9 | 34.2 | 32.7 | 20 | −10 | 0.00037 | 0.37 | 765 ms | 0.420 | 76.9 | 0.400 | 0.390 |
| 19 | Llama 4 Scout | 66.4 | 19.8 | 29 | 17.6 | — | 0.00044 | 0.44 | 419 ms | 0.154 | 66.4 | 0.042 | 0.295 |
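In this table the total decomposes the same way: Style + Relevance + Grammar, minus any artifact penalty; from the 100-point cap the per-column maxima come out to 45/35/20. A quick check against one clean row and one penalized row:

```python
# Total = Style + Relevance + Grammar - penalty.
# Values taken directly from the cover-letter leaderboard rows;
# "—" in the penalty column means a penalty of 0.
rows = {
    "Gemini 2.5 Pro":    (45.0, 35.0, 18.8, 0),    # table total: 98.8
    "Mistral Small 3.1": (36.7, 31.8, 18.8, 10),   # table total: 77.3
}

for model, (style, relevance, grammar, penalty) in rows.items():
    total = round(style + relevance + grammar - penalty, 1)
    print(f"{model}: {total}")
```

The two Mistral models are the only ones that lost points to artifact penalties; without the −10, Mistral Small 3.1 would have placed mid-table at 87.3.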