developer_uid: chai_backend_admin
submission_id: qwen-qwen3-235b-a22b-i_47730_v22
model_name: qwen-qwen3-235b-a22b-i_47730_v22
model_group: Qwen/Qwen3-235B-A22B-Ins
status: torndown
timestamp: 2026-02-03T00:17:51+00:00
num_battles: 12283
num_wins: 6296
celo_rating: 1315.18
family_friendly_score: 0.6138
family_friendly_standard_error: 0.006885485603790048
submission_type: basic
model_repo: Qwen/Qwen3-235B-A22B-Instruct-2507
model_architecture: Qwen3MoeForCausalLM
model_num_parameters: 18790207488.0
best_of: 4
max_input_tokens: 2048
max_output_tokens: 80
reward_model: default
display_name: qwen-qwen3-235b-a22b-i_47730_v22
ineligible_reason: max_output_tokens!=64
is_internal_developer: True
language_model: Qwen/Qwen3-235B-A22B-Instruct-2507
model_size: 19B
ranking_group: single
us_pacific_date: 2026-01-30
win_ratio: 0.5125783603354229
generation_params: {'temperature': 1.0, 'top_p': 1.0, 'min_p': 0.0, 'top_k': 40, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['</s>', '<|assistant|>', '<|user|>', '####', '<|im_end|>', '</think>'], 'max_input_tokens': 2048, 'best_of': 4, 'max_output_tokens': 80}
formatter: {'memory_template': "<|im_start|>system\n{bot_name}'s persona: {memory}<|im_end|>\n", 'prompt_template': '', 'bot_template': '<|im_start|>assistant\n{bot_name}: {message}<|im_end|>\n', 'user_template': '<|im_start|>user\n{message}<|im_end|>\n', 'response_template': '<|im_start|>assistant\n{bot_name}:', 'truncate_by_message': True}
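The formatter and generation_params entries above fully determine how a conversation is turned into a single prompt string and how sampling is configured. The following is a minimal sketch (not the production serving code) of how those templates and parameters could be combined into a vLLM request; the bot name, persona, and messages are placeholders, the prompt_template and truncate_by_message handling are ignored, and best_of=4 is omitted because its support varies across vLLM versions.

```python
# Sketch only: assemble a chat prompt from the formatter templates above and
# build sampling settings mirroring generation_params.
from vllm import SamplingParams  # assumes vLLM is installed

memory_template = "<|im_start|>system\n{bot_name}'s persona: {memory}<|im_end|>\n"
user_template = "<|im_start|>user\n{message}<|im_end|>\n"
bot_template = "<|im_start|>assistant\n{bot_name}: {message}<|im_end|>\n"
response_template = "<|im_start|>assistant\n{bot_name}:"

def build_prompt(bot_name, memory, turns):
    """System persona, alternating turns, then the open assistant header
    the model is asked to complete."""
    prompt = memory_template.format(bot_name=bot_name, memory=memory)
    for role, message in turns:
        if role == "user":
            prompt += user_template.format(message=message)
        else:
            prompt += bot_template.format(bot_name=bot_name, message=message)
    return prompt + response_template.format(bot_name=bot_name)

prompt = build_prompt(
    bot_name="Nova",                       # placeholder bot name
    memory="A cheerful starship engineer.",  # placeholder persona
    turns=[("user", "Can you fix the warp core?")],
)

params = SamplingParams(
    temperature=1.0,
    top_p=1.0,
    top_k=40,
    min_p=0.0,
    presence_penalty=0.0,
    frequency_penalty=0.0,
    max_tokens=80,  # max_output_tokens
    stop=["</s>", "<|assistant|>", "<|user|>", "####", "<|im_end|>", "</think>"],
)
print(prompt)
```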
Shutdown handler not registered because Python interpreter is not running in the main thread
run pipeline %s
run pipeline stage %s
Running pipeline stage VLLMUploader
Starting job with name qwen-qwen3-235b-a22b-i-47730-v22-uploader
Waiting for job on qwen-qwen3-235b-a22b-i-47730-v22-uploader to finish
qwen-qwen3-235b-a22b-i-47730-v22-uploader: Using quantization_mode: w4a16
qwen-qwen3-235b-a22b-i-47730-v22-uploader: Checking if ChaiML/Qwen3-235B-A22B-Instruct-2507-W4A16 already exists in ChaiML
qwen-qwen3-235b-a22b-i-47730-v22-uploader: Model already exists. Downloading to /dev/shm/model_output...
qwen-qwen3-235b-a22b-i-47730-v22-uploader: Downloading snapshot of ChaiML/Qwen3-235B-A22B-Instruct-2507-W4A16...
qwen-qwen3-235b-a22b-i-47730-v22-uploader: Fetching 39 files: 100%|██████████| 39/39 [00:53<00:00, 1.36s/it]
qwen-qwen3-235b-a22b-i-47730-v22-uploader: Downloaded in 53.149s
qwen-qwen3-235b-a22b-i-47730-v22-uploader: Processed model Qwen/Qwen3-235B-A22B-Instruct-2507 in 53.686s
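The download step above pulls the pre-quantized W4A16 snapshot into shared memory. A minimal sketch of how such a fetch might look with huggingface_hub is shown below; the actual uploader implementation is not part of this log, and access to the ChaiML repo is assumed to be configured in the environment.

```python
# Sketch only: fetch the quantized snapshot into /dev/shm, mirroring the
# "Downloading snapshot of ChaiML/Qwen3-235B-A22B-Instruct-2507-W4A16" step.
import time
from huggingface_hub import snapshot_download

start = time.time()
local_path = snapshot_download(
    repo_id="ChaiML/Qwen3-235B-A22B-Instruct-2507-W4A16",
    local_dir="/dev/shm/model_output",
)
print(f"Downloaded in {time.time() - start:.3f}s -> {local_path}")
```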
qwen-qwen3-235b-a22b-i-47730-v22-uploader: cp /dev/shm/model_output/tokenizer_config.json s3://guanaco-vllm-models/qwen-qwen3-235b-a22b-i-47730-v22/tokenizer_config.json
qwen-qwen3-235b-a22b-i-47730-v22-uploader: cp /dev/shm/model_output/config.json s3://guanaco-vllm-models/qwen-qwen3-235b-a22b-i-47730-v22/config.json
qwen-qwen3-235b-a22b-i-47730-v22-uploader: cp /dev/shm/model_output/quantization_config.json s3://guanaco-vllm-models/qwen-qwen3-235b-a22b-i-47730-v22/quantization_config.json
qwen-qwen3-235b-a22b-i-47730-v22-uploader: cp /dev/shm/model_output/merges.txt s3://guanaco-vllm-models/qwen-qwen3-235b-a22b-i-47730-v22/merges.txt
qwen-qwen3-235b-a22b-i-47730-v22-uploader: cp /dev/shm/model_output/vocab.json s3://guanaco-vllm-models/qwen-qwen3-235b-a22b-i-47730-v22/vocab.json
qwen-qwen3-235b-a22b-i-47730-v22-uploader: cp /dev/shm/model_output/model.safetensors.index.json s3://guanaco-vllm-models/qwen-qwen3-235b-a22b-i-47730-v22/model.safetensors.index.json
qwen-qwen3-235b-a22b-i-47730-v22-uploader: cp /dev/shm/model_output/tokenizer.json s3://guanaco-vllm-models/qwen-qwen3-235b-a22b-i-47730-v22/tokenizer.json
qwen-qwen3-235b-a22b-i-47730-v22-uploader: cp /dev/shm/model_output/model-00027-of-00027.safetensors s3://guanaco-vllm-models/qwen-qwen3-235b-a22b-i-47730-v22/model-00027-of-00027.safetensors
qwen-qwen3-235b-a22b-i-47730-v22-uploader: cp /dev/shm/model_output/model-00022-of-00027.safetensors s3://guanaco-vllm-models/qwen-qwen3-235b-a22b-i-47730-v22/model-00022-of-00027.safetensors
qwen-qwen3-235b-a22b-i-47730-v22-uploader: cp /dev/shm/model_output/model-00020-of-00027.safetensors s3://guanaco-vllm-models/qwen-qwen3-235b-a22b-i-47730-v22/model-00020-of-00027.safetensors
qwen-qwen3-235b-a22b-i-47730-v22-uploader: cp /dev/shm/model_output/model-00004-of-00027.safetensors s3://guanaco-vllm-models/qwen-qwen3-235b-a22b-i-47730-v22/model-00004-of-00027.safetensors
qwen-qwen3-235b-a22b-i-47730-v22-uploader: cp /dev/shm/model_output/model-00007-of-00027.safetensors s3://guanaco-vllm-models/qwen-qwen3-235b-a22b-i-47730-v22/model-00007-of-00027.safetensors
qwen-qwen3-235b-a22b-i-47730-v22-uploader: cp /dev/shm/model_output/model-00006-of-00027.safetensors s3://guanaco-vllm-models/qwen-qwen3-235b-a22b-i-47730-v22/model-00006-of-00027.safetensors
qwen-qwen3-235b-a22b-i-47730-v22-uploader: cp /dev/shm/model_output/model-00008-of-00027.safetensors s3://guanaco-vllm-models/qwen-qwen3-235b-a22b-i-47730-v22/model-00008-of-00027.safetensors
qwen-qwen3-235b-a22b-i-47730-v22-uploader: cp /dev/shm/model_output/model-00014-of-00027.safetensors s3://guanaco-vllm-models/qwen-qwen3-235b-a22b-i-47730-v22/model-00014-of-00027.safetensors
qwen-qwen3-235b-a22b-i-47730-v22-uploader: cp /dev/shm/model_output/model-00011-of-00027.safetensors s3://guanaco-vllm-models/qwen-qwen3-235b-a22b-i-47730-v22/model-00011-of-00027.safetensors
qwen-qwen3-235b-a22b-i-47730-v22-uploader: cp /dev/shm/model_output/model-00001-of-00027.safetensors s3://guanaco-vllm-models/qwen-qwen3-235b-a22b-i-47730-v22/model-00001-of-00027.safetensors
qwen-qwen3-235b-a22b-i-47730-v22-uploader: cp /dev/shm/model_output/model-00012-of-00027.safetensors s3://guanaco-vllm-models/qwen-qwen3-235b-a22b-i-47730-v22/model-00012-of-00027.safetensors
qwen-qwen3-235b-a22b-i-47730-v22-uploader: cp /dev/shm/model_output/model-00018-of-00027.safetensors s3://guanaco-vllm-models/qwen-qwen3-235b-a22b-i-47730-v22/model-00018-of-00027.safetensors
qwen-qwen3-235b-a22b-i-47730-v22-uploader: cp /dev/shm/model_output/model-00016-of-00027.safetensors s3://guanaco-vllm-models/qwen-qwen3-235b-a22b-i-47730-v22/model-00016-of-00027.safetensors
qwen-qwen3-235b-a22b-i-47730-v22-uploader: cp /dev/shm/model_output/model-00017-of-00027.safetensors s3://guanaco-vllm-models/qwen-qwen3-235b-a22b-i-47730-v22/model-00017-of-00027.safetensors
qwen-qwen3-235b-a22b-i-47730-v22-uploader: cp /dev/shm/model_output/model-00003-of-00027.safetensors s3://guanaco-vllm-models/qwen-qwen3-235b-a22b-i-47730-v22/model-00003-of-00027.safetensors
qwen-qwen3-235b-a22b-i-47730-v22-uploader: cp /dev/shm/model_output/model-00005-of-00027.safetensors s3://guanaco-vllm-models/qwen-qwen3-235b-a22b-i-47730-v22/model-00005-of-00027.safetensors
qwen-qwen3-235b-a22b-i-47730-v22-uploader: cp /dev/shm/model_output/model-00013-of-00027.safetensors s3://guanaco-vllm-models/qwen-qwen3-235b-a22b-i-47730-v22/model-00013-of-00027.safetensors
qwen-qwen3-235b-a22b-i-47730-v22-uploader: cp /dev/shm/model_output/model-00015-of-00027.safetensors s3://guanaco-vllm-models/qwen-qwen3-235b-a22b-i-47730-v22/model-00015-of-00027.safetensors
qwen-qwen3-235b-a22b-i-47730-v22-uploader: cp /dev/shm/model_output/model-00026-of-00027.safetensors s3://guanaco-vllm-models/qwen-qwen3-235b-a22b-i-47730-v22/model-00026-of-00027.safetensors
qwen-qwen3-235b-a22b-i-47730-v22-uploader: cp /dev/shm/model_output/model-00025-of-00027.safetensors s3://guanaco-vllm-models/qwen-qwen3-235b-a22b-i-47730-v22/model-00025-of-00027.safetensors
qwen-qwen3-235b-a22b-i-47730-v22-uploader: cp /dev/shm/model_output/model-00009-of-00027.safetensors s3://guanaco-vllm-models/qwen-qwen3-235b-a22b-i-47730-v22/model-00009-of-00027.safetensors
qwen-qwen3-235b-a22b-i-47730-v22-uploader: cp /dev/shm/model_output/model-00002-of-00027.safetensors s3://guanaco-vllm-models/qwen-qwen3-235b-a22b-i-47730-v22/model-00002-of-00027.safetensors
qwen-qwen3-235b-a22b-i-47730-v22-uploader: cp /dev/shm/model_output/model-00021-of-00027.safetensors s3://guanaco-vllm-models/qwen-qwen3-235b-a22b-i-47730-v22/model-00021-of-00027.safetensors
qwen-qwen3-235b-a22b-i-47730-v22-uploader: cp /dev/shm/model_output/model-00019-of-00027.safetensors s3://guanaco-vllm-models/qwen-qwen3-235b-a22b-i-47730-v22/model-00019-of-00027.safetensors
qwen-qwen3-235b-a22b-i-47730-v22-uploader: cp /dev/shm/model_output/model-00024-of-00027.safetensors s3://guanaco-vllm-models/qwen-qwen3-235b-a22b-i-47730-v22/model-00024-of-00027.safetensors
Job qwen-qwen3-235b-a22b-i-47730-v22-uploader completed after 286.07s with status: succeeded
Stopping job with name qwen-qwen3-235b-a22b-i-47730-v22-uploader
Pipeline stage VLLMUploader completed in 287.30s
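The per-file `cp ... s3://guanaco-vllm-models/...` lines in the uploader job copy each artifact from shared memory to the model bucket. A hedged sketch of that step with boto3 follows; the real uploader may instead shell out to an S3 CLI, and the credentials for the bucket are assumed to be available in the environment.

```python
# Sketch only: upload every file under /dev/shm/model_output to the bucket
# prefix used in the log lines above.
from pathlib import Path
import boto3

BUCKET = "guanaco-vllm-models"
PREFIX = "qwen-qwen3-235b-a22b-i-47730-v22"

s3 = boto3.client("s3")
for path in sorted(Path("/dev/shm/model_output").glob("*")):
    if path.is_file():
        key = f"{PREFIX}/{path.name}"
        print(f"cp {path} s3://{BUCKET}/{key}")
        s3.upload_file(str(path), BUCKET, key)
```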
run pipeline stage %s
Running pipeline stage VLLMTemplater
Pipeline stage VLLMTemplater completed in 0.23s
run pipeline stage %s
Running pipeline stage VLLMDeployer
Creating inference service qwen-qwen3-235b-a22b-i-47730-v22
Waiting for inference service qwen-qwen3-235b-a22b-i-47730-v22 to be ready
Inference service qwen-qwen3-235b-a22b-i-47730-v22 ready after 487.26848888397217s
Pipeline stage VLLMDeployer completed in 488.77s
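Readiness here is reported by the deployer after roughly 487 s. The sketch below is a hypothetical way to poll the predictor yourself, reusing the OpenAI-compatible /v1/completions route and the 12 s connect timeout visible in the StressChecker error message later in the log; the readiness criterion used by the real VLLMDeployer stage is not shown here, and the served model name in the payload is an assumption.

```python
# Sketch only: poll the predictor until it answers a tiny completion request.
import time
import requests

URL = ("http://qwen-qwen3-235b-a22b-i-47730-v22-predictor"
       ".tenant-chaiml-guanaco.kchai-coreweave-us-east-04a.chaiverse.com"
       "/v1/completions")

def wait_until_ready(timeout_s=900, poll_every_s=10):
    start = time.time()
    while time.time() - start < timeout_s:
        try:
            resp = requests.post(
                URL,
                json={"model": "qwen-qwen3-235b-a22b-i-47730-v22",
                      "prompt": "Hello", "max_tokens": 1},
                timeout=(12.0, 30.0),  # (connect, read); connect matches the 12 s seen below
            )
            if resp.ok:
                return time.time() - start
        except requests.RequestException:
            pass  # not ready yet, keep polling
        time.sleep(poll_every_s)
    raise TimeoutError("inference service never became ready")
```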
run pipeline stage %s
Running pipeline stage StressChecker
Received healthy response to inference request in 2.2167727947235107s
Received healthy response to inference request in 2.2064404487609863s
Received healthy response to inference request in 2.0955796241760254s
Received healthy response to inference request in 2.1493546962738037s
Received healthy response to inference request in 3.198137044906616s
Received healthy response to inference request in 2.32611346244812s
{"detail":"HTTPConnectionPool(host='qwen-qwen3-235b-a22b-i-47730-v22-predictor.tenant-chaiml-guanaco.kchai-coreweave-us-east-04a.chaiverse.com', port=80): Max retries exceeded with url: /v1/completions (Caused by ConnectTimeoutError(<urllib3.connection.HTTPConnection object at 0x767e7208f590>, 'Connection to qwen-qwen3-235b-a22b-i-47730-v22-predictor.tenant-chaiml-guanaco.kchai-coreweave-us-east-04a.chaiverse.com timed out. (connect timeout=12.0)'))"}
Received unhealthy response to inference request!
Received healthy response to inference request in 2.6202521324157715s
Received healthy response to inference request in 2.3362503051757812s
Received healthy response to inference request in 2.292360544204712s
Received healthy response to inference request in 1.9207420349121094s
Received healthy response to inference request in 1.9212324619293213s
Received healthy response to inference request in 1.806534767150879s
Received healthy response to inference request in 2.127309799194336s
Received healthy response to inference request in 1.9154808521270752s
Received healthy response to inference request in 2.0886592864990234s
Received healthy response to inference request in 1.952221155166626s
Received healthy response to inference request in 1.8973450660705566s
Received healthy response to inference request in 1.9922454357147217s
Received healthy response to inference request in 1.8959803581237793s
Received healthy response to inference request in 2.0243022441864014s
Received healthy response to inference request in 1.9683454036712646s
Received healthy response to inference request in 2.0438835620880127s
Received healthy response to inference request in 1.850700855255127s
Received healthy response to inference request in 1.9828898906707764s
Received healthy response to inference request in 2.2737977504730225s
Received healthy response to inference request in 2.3590400218963623s
Received healthy response to inference request in 1.820929765701294s
Received healthy response to inference request in 1.9133217334747314s
Received healthy response to inference request in 2.013676643371582s
30 requests
1 failed requests
5th percentile: 1.8343267560005188
10th percentile: 1.8914524078369142
20th percentile: 1.9150490283966064
30th percentile: 1.9429245471954346
40th percentile: 1.9885032176971436
50th percentile: 2.034092903137207
60th percentile: 2.1082716941833497
70th percentile: 2.2095401525497436
80th percentile: 2.2991111278533936
90th percentile: 2.3851612329483034
95th percentile: 2.9380888342857343
99th percentile: 9.783292655944832
mean time: 2.456096808115641
%s, retrying in %s seconds...
Received healthy response to inference request in 2.4954912662506104s
Received healthy response to inference request in 1.7648544311523438s
Received healthy response to inference request in 2.1997036933898926s
Received healthy response to inference request in 1.8748435974121094s
Received healthy response to inference request in 2.444223403930664s
Received healthy response to inference request in 1.9313852787017822s
Received healthy response to inference request in 2.036457061767578s
Received healthy response to inference request in 1.8326270580291748s
Received healthy response to inference request in 2.0356171131134033s
Received healthy response to inference request in 2.197659969329834s
Received healthy response to inference request in 2.29994535446167s
Received healthy response to inference request in 1.8718678951263428s
Received healthy response to inference request in 1.857536792755127s
Received healthy response to inference request in 2.184986114501953s
Received healthy response to inference request in 1.8594448566436768s
Received healthy response to inference request in 2.4448750019073486s
Received healthy response to inference request in 1.7895143032073975s
Received healthy response to inference request in 1.9977915287017822s
Received healthy response to inference request in 2.240633487701416s
Received healthy response to inference request in 1.9765636920928955s
Received healthy response to inference request in 1.9585771560668945s
Received healthy response to inference request in 2.346254825592041s
Received healthy response to inference request in 2.1683735847473145s
Received healthy response to inference request in 1.9833273887634277s
Received healthy response to inference request in 1.8964917659759521s
Received healthy response to inference request in 2.047121047973633s
Received healthy response to inference request in 2.0276095867156982s
Received healthy response to inference request in 2.033090829849243s
Received healthy response to inference request in 2.3642666339874268s
Received healthy response to inference request in 2.2605838775634766s
30 requests
0 failed requests
5th percentile: 1.8089150428771972
10th percentile: 1.8550458192825316
20th percentile: 1.874248456954956
30th percentile: 1.9504195928573609
40th percentile: 1.9920058727264405
50th percentile: 2.0343539714813232
60th percentile: 2.0956220626831055
70th percentile: 2.1982730865478515
80th percentile: 2.2684561729431154
90th percentile: 2.3722623109817507
95th percentile: 2.4445817828178407
99th percentile: 2.4808125495910645
mean time: 2.0807239532470705
Pipeline stage StressChecker completed in 144.72s
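The summary printed after each StressChecker batch (request and failure counts, percentiles, mean over 30 requests) can be reproduced from the raw per-request latencies. The sketch below assumes linear interpolation for the percentiles and simply counts failed requests separately; how a failed request's duration enters the real percentile calculation is not clear from this log.

```python
# Sketch only: recompute the kind of latency summary printed above from a
# list of per-request durations in seconds.
import numpy as np

def summarize(latencies_s, n_failed=0):
    lat = np.asarray(latencies_s, dtype=float)
    print(f"{lat.size + n_failed} requests")
    print(f"{n_failed} failed requests")
    for p in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99):
        print(f"{p}th percentile: {np.percentile(lat, p)}")
    print(f"mean time: {lat.mean()}")

# Example with a few of the healthy durations from the second batch above.
summarize([2.4955, 1.7649, 2.1997, 1.8748, 2.4442, 1.9314])
```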
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyTriggerPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage OfflineFamilyFriendlyTriggerPipeline completed in 1.34s
Shutdown handler de-registered
qwen-qwen3-235b-a22b-i_47730_v22 status is now deployed due to DeploymentManager action
Shutdown handler registered
run pipeline %s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyScorer
Evaluating %s Family Friendly Score with %s threads
Generating Leaderboard row for %s
Generated Leaderboard row for %s
Pipeline stage OfflineFamilyFriendlyScorer completed in 2915.39s
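The family_friendly_score and family_friendly_standard_error in the header are consistent with the score being the mean of roughly 5,000 binary judgments, since sqrt(0.6138 * (1 - 0.6138) / 5000) ≈ 0.00689. That sample size is backed out from the reported numbers, not stated anywhere in this log; the snippet below only illustrates the binomial-proportion standard error check.

```python
# Sketch only: sanity-check the reported standard error against a
# binomial-proportion SE, sqrt(p * (1 - p) / n). n = 5000 is an assumption
# inferred from the reported values, not taken from the log.
import math

p = 0.6138
n = 5000
se = math.sqrt(p * (1 - p) / n)
print(se)  # ~0.00689, close to the reported 0.006885485603790048
```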
Shutdown handler de-registered
qwen-qwen3-235b-a22b-i_47730_v22 status is now inactive due to auto deactivation of underperforming models
qwen-qwen3-235b-a22b-i_47730_v22 status is now torndown due to DeploymentManager action