developer_uid: zonemercy
submission_id: chaiml-mega-v1-top2-q35_93469_v2
model_name: chaiml-mega-v1-top2-q35_93469_v2
model_group: ChaiML/mega-v1-top2-q35b
status: torndown
timestamp: 2026-03-27T04:41:25+00:00
num_battles: 12121
num_wins: 5963
celo_rating: 8485.88
family_friendly_score: 0.0
family_friendly_standard_error: 0.0
submission_type: basic
model_repo: ChaiML/mega-v1-top2-q35b-lr5e6ep2g8
model_architecture: Qwen3_5MoeForConditionalGeneration
model_num_parameters: 33753909248.0
best_of: 8
max_input_tokens: 2048
max_output_tokens: 80
reward_model: default
display_name: chaiml-mega-v1-top2-q35_93469_v2
ineligible_reason: max_output_tokens!=64
is_internal_developer: True
language_model: ChaiML/mega-v1-top2-q35b-lr5e6ep2g8
model_size: 34B
ranking_group: single
us_pacific_date: 2026-03-23
win_ratio: 0.4919561092319116
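The win_ratio above is just num_wins divided by num_battles from the fields earlier in this record; a quick sanity check:

```python
# Sanity-check the reported win_ratio against the raw battle counts above.
num_battles = 12121
num_wins = 5963
win_ratio = num_wins / num_battles  # matches the reported 0.4919561092319116
```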
generation_params: {'temperature': 1.0, 'top_p': 1.0, 'min_p': 0.0, 'top_k': 40, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['<|user|>', '<|assistant|>', '####', '<|im_end|>', '</s>'], 'max_input_tokens': 2048, 'best_of': 8, 'max_output_tokens': 80}
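Since the pipeline below deploys via a VLLMDeployer stage, these generation_params presumably map onto vLLM's sampling parameters. A sketch of that mapping as a plain dict (field names `stop` and `max_tokens` are vLLM's names for `stopping_words` and `max_output_tokens`; `max_input_tokens` is a serving-side limit, not a sampler field):

```python
# generation_params from the submission, keyed by vLLM SamplingParams names.
sampling_params = {
    "temperature": 1.0,
    "top_p": 1.0,            # nucleus filtering effectively disabled
    "min_p": 0.0,
    "top_k": 40,             # sample only from the 40 most likely tokens
    "presence_penalty": 0.0,
    "frequency_penalty": 0.0,
    "stop": ["<|user|>", "<|assistant|>", "####", "<|im_end|>", "</s>"],
    "best_of": 8,            # generate 8 candidates, keep the best-scoring one
    "max_tokens": 80,        # max_output_tokens
}
```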
formatter: {'memory_template': "<|im_start|>system\n{bot_name}'s persona: {memory}<|im_end|>\n", 'prompt_template': '', 'bot_template': '<|im_start|>assistant\n{bot_name}: {message}<|im_end|>\n', 'user_template': '<|im_start|>user\n{message}<|im_end|>\n', 'response_template': '<|im_start|>assistant\n{bot_name}:', 'truncate_by_message': True}
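The formatter above assembles a ChatML-style prompt from those templates. A minimal sketch of the assembly, using hypothetical placeholder values for the persona and messages:

```python
# Templates copied from the formatter config above.
memory_template = "<|im_start|>system\n{bot_name}'s persona: {memory}<|im_end|>\n"
user_template = "<|im_start|>user\n{message}<|im_end|>\n"
response_template = "<|im_start|>assistant\n{bot_name}:"

# Hypothetical example values, for illustration only.
prompt = (
    memory_template.format(bot_name="Aria", memory="a friendly pirate")
    + user_template.format(message="Ahoy!")
    + response_template.format(bot_name="Aria")  # model continues from here
)
```

Generation stops at one of the stopping_words (e.g. `<|im_end|>`), so the model's reply is whatever it emits after the final `{bot_name}:`.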
Resubmit model
Shutdown handler not registered because Python interpreter is not running in the main thread
run pipeline %s
run pipeline stage %s
Running pipeline stage VLLMUploader
Starting job with name chaiml-mega-v1-top2-q35-93469-v2-uploader
Waiting for job on chaiml-mega-v1-top2-q35-93469-v2-uploader to finish
chaiml-mega-v1-top2-q35-93469-v2-uploader: Using quantization_mode: none
chaiml-mega-v1-top2-q35-93469-v2-uploader: Downloading snapshot of ChaiML/mega-v1-top2-q35b-lr5e6ep2g8...
chaiml-mega-v1-top2-q35-93469-v2-uploader: Downloaded in 36.666s
2026-03-24T02:52:39.076876+00:00 monitor updated for chaiml-mega-v1-top2-q35_93469_v2
chaiml-mega-v1-top2-q35-93469-v2-uploader: Processed model ChaiML/mega-v1-top2-q35b-lr5e6ep2g8 in 62.268s
chaiml-mega-v1-top2-q35-93469-v2-uploader: creating bucket guanaco-vllm-models
chaiml-mega-v1-top2-q35-93469-v2-uploader: /usr/lib/python3/dist-packages/S3/BaseUtils.py:56: SyntaxWarning: invalid escape sequence '\.'
chaiml-mega-v1-top2-q35-93469-v2-uploader: RE_S3_DATESTRING = re.compile('\.[0-9]*(?:[Z\\-\\+]*?)')
chaiml-mega-v1-top2-q35-93469-v2-uploader: /usr/lib/python3/dist-packages/S3/BaseUtils.py:57: SyntaxWarning: invalid escape sequence '\s'
chaiml-mega-v1-top2-q35-93469-v2-uploader: RE_XML_NAMESPACE = re.compile(b'^(<?[^>]+?>\s*|\s*)(<\w+) xmlns=[\'"](https?://[^\'"]+)[\'"]', re.MULTILINE)
chaiml-mega-v1-top2-q35-93469-v2-uploader: /usr/lib/python3/dist-packages/S3/Utils.py:240: SyntaxWarning: invalid escape sequence '\.'
chaiml-mega-v1-top2-q35-93469-v2-uploader: invalid = re.search("([^a-z0-9\.-])", bucket, re.UNICODE)
chaiml-mega-v1-top2-q35-93469-v2-uploader: /usr/lib/python3/dist-packages/S3/Utils.py:244: SyntaxWarning: invalid escape sequence '\.'
chaiml-mega-v1-top2-q35-93469-v2-uploader: invalid = re.search("([^A-Za-z0-9\._-])", bucket, re.UNICODE)
chaiml-mega-v1-top2-q35-93469-v2-uploader: /usr/lib/python3/dist-packages/S3/Utils.py:255: SyntaxWarning: invalid escape sequence '\.'
chaiml-mega-v1-top2-q35-93469-v2-uploader: if re.search("-\.", bucket, re.UNICODE):
chaiml-mega-v1-top2-q35-93469-v2-uploader: /usr/lib/python3/dist-packages/S3/Utils.py:257: SyntaxWarning: invalid escape sequence '\.'
chaiml-mega-v1-top2-q35-93469-v2-uploader: if re.search("\.\.", bucket, re.UNICODE):
chaiml-mega-v1-top2-q35-93469-v2-uploader: /usr/lib/python3/dist-packages/S3/S3Uri.py:155: SyntaxWarning: invalid escape sequence '\w'
chaiml-mega-v1-top2-q35-93469-v2-uploader: _re = re.compile("^(\w+://)?(.*)", re.UNICODE)
chaiml-mega-v1-top2-q35-93469-v2-uploader: /usr/lib/python3/dist-packages/S3/FileLists.py:480: SyntaxWarning: invalid escape sequence '\*'
chaiml-mega-v1-top2-q35-93469-v2-uploader: wildcard_split_result = re.split("\*|\?", uri_str, maxsplit=1)
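The SyntaxWarning lines above come from the system's s3cmd install: since Python 3.12, invalid escape sequences like `\.` in ordinary (non-raw) string literals emit a SyntaxWarning. The patterns still work by historical accident; the fix is to use raw string literals, e.g. for the first warned pattern:

```python
import re

# Raw-string version of the pattern from S3/BaseUtils.py:56; the r-prefix
# leaves backslashes alone, so '\.' is a regex escape, not a string escape,
# and no SyntaxWarning is emitted.
RE_S3_DATESTRING = re.compile(r'\.[0-9]*(?:[Z\-\+]*?)')
```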
chaiml-mega-v1-top2-q35-93469-v2-uploader: Bucket 's3://guanaco-vllm-models/' created
chaiml-mega-v1-top2-q35-93469-v2-uploader: uploading /dev/shm/model_output to s3://guanaco-vllm-models/chaiml-mega-v1-top2-q35-93469-v2/default
chaiml-mega-v1-top2-q35-93469-v2-uploader: cp /dev/shm/model_output/.gitattributes s3://guanaco-vllm-models/chaiml-mega-v1-top2-q35-93469-v2/default/.gitattributes
chaiml-mega-v1-top2-q35-93469-v2-uploader: cp /dev/shm/model_output/config.json s3://guanaco-vllm-models/chaiml-mega-v1-top2-q35-93469-v2/default/config.json
chaiml-mega-v1-top2-q35-93469-v2-uploader: cp /dev/shm/model_output/processor_config.json s3://guanaco-vllm-models/chaiml-mega-v1-top2-q35-93469-v2/default/processor_config.json
chaiml-mega-v1-top2-q35-93469-v2-uploader: cp /dev/shm/model_output/special_tokens_map.json s3://guanaco-vllm-models/chaiml-mega-v1-top2-q35-93469-v2/default/special_tokens_map.json
chaiml-mega-v1-top2-q35-93469-v2-uploader: cp /dev/shm/model_output/tokenizer_config.json s3://guanaco-vllm-models/chaiml-mega-v1-top2-q35-93469-v2/default/tokenizer_config.json
chaiml-mega-v1-top2-q35-93469-v2-uploader: cp /dev/shm/model_output/chat_template.jinja s3://guanaco-vllm-models/chaiml-mega-v1-top2-q35-93469-v2/default/chat_template.jinja
chaiml-mega-v1-top2-q35-93469-v2-uploader: cp /dev/shm/model_output/generation_config.json s3://guanaco-vllm-models/chaiml-mega-v1-top2-q35-93469-v2/default/generation_config.json
chaiml-mega-v1-top2-q35-93469-v2-uploader: cp /dev/shm/model_output/trainer_state.json s3://guanaco-vllm-models/chaiml-mega-v1-top2-q35-93469-v2/default/trainer_state.json
chaiml-mega-v1-top2-q35-93469-v2-uploader: cp /dev/shm/model_output/preprocessor_config.json s3://guanaco-vllm-models/chaiml-mega-v1-top2-q35-93469-v2/default/preprocessor_config.json
chaiml-mega-v1-top2-q35-93469-v2-uploader: cp /dev/shm/model_output/README.md s3://guanaco-vllm-models/chaiml-mega-v1-top2-q35-93469-v2/default/README.md
chaiml-mega-v1-top2-q35-93469-v2-uploader: cp /dev/shm/model_output/added_tokens.json s3://guanaco-vllm-models/chaiml-mega-v1-top2-q35-93469-v2/default/added_tokens.json
chaiml-mega-v1-top2-q35-93469-v2-uploader: cp /dev/shm/model_output/training_args.bin s3://guanaco-vllm-models/chaiml-mega-v1-top2-q35-93469-v2/default/training_args.bin
chaiml-mega-v1-top2-q35-93469-v2-uploader: cp /dev/shm/model_output/args.json s3://guanaco-vllm-models/chaiml-mega-v1-top2-q35-93469-v2/default/args.json
chaiml-mega-v1-top2-q35-93469-v2-uploader: cp /dev/shm/model_output/model.safetensors.index.json s3://guanaco-vllm-models/chaiml-mega-v1-top2-q35-93469-v2/default/model.safetensors.index.json
chaiml-mega-v1-top2-q35-93469-v2-uploader: cp /dev/shm/model_output/merges.txt s3://guanaco-vllm-models/chaiml-mega-v1-top2-q35-93469-v2/default/merges.txt
chaiml-mega-v1-top2-q35-93469-v2-uploader: cp /dev/shm/model_output/vocab.json s3://guanaco-vllm-models/chaiml-mega-v1-top2-q35-93469-v2/default/vocab.json
chaiml-mega-v1-top2-q35-93469-v2-uploader: cp /dev/shm/model_output/tokenizer.json s3://guanaco-vllm-models/chaiml-mega-v1-top2-q35-93469-v2/default/tokenizer.json
chaiml-mega-v1-top2-q35-93469-v2-uploader: cp /dev/shm/model_output/model-00002-of-00002.safetensors s3://guanaco-vllm-models/chaiml-mega-v1-top2-q35-93469-v2/default/model-00002-of-00002.safetensors
2026-03-24T02:53:39.167546+00:00 monitor updated for chaiml-mega-v1-top2-q35_93469_v2
chaiml-mega-v1-top2-q35-93469-v2-uploader: cp /dev/shm/model_output/model-00001-of-00002.safetensors s3://guanaco-vllm-models/chaiml-mega-v1-top2-q35-93469-v2/default/model-00001-of-00002.safetensors
Job chaiml-mega-v1-top2-q35-93469-v2-uploader completed after 163.55s with status: succeeded
Stopping job with name chaiml-mega-v1-top2-q35-93469-v2-uploader
Pipeline stage VLLMUploader completed in 164.13s
run pipeline stage %s
Running pipeline stage VLLMTemplater
Pipeline stage VLLMTemplater completed in 3.74s
run pipeline stage %s
Running pipeline stage VLLMDeployer
Creating inference service chaiml-mega-v1-top2-q35-93469-v2
Waiting for inference service chaiml-mega-v1-top2-q35-93469-v2 to be ready
2026-03-24T02:54:39.256133+00:00 monitor updated for chaiml-mega-v1-top2-q35_93469_v2
2026-03-24T02:55:39.361985+00:00 monitor updated for chaiml-mega-v1-top2-q35_93469_v2
2026-03-24T02:56:39.448420+00:00 monitor updated for chaiml-mega-v1-top2-q35_93469_v2
Inference service chaiml-mega-v1-top2-q35-93469-v2 ready after 191.05227756500244s
Pipeline stage VLLMDeployer completed in 191.58s
run pipeline stage %s
Running pipeline stage StressChecker
2026-03-24T02:57:39.541056+00:00 monitor updated for chaiml-mega-v1-top2-q35_93469_v2
HTTPConnectionPool(host='guanaco-submitter-v2.guanaco-backend.kchai-google-us-east4.chaiverse.com', port=80): Read timed out. (read timeout=20)
Received unhealthy response to inference request!
HTTPConnectionPool(host='guanaco-submitter-v2.guanaco-backend.kchai-google-us-east4.chaiverse.com', port=80): Read timed out. (read timeout=20)
Received unhealthy response to inference request!
HTTPConnectionPool(host='guanaco-submitter-v2.guanaco-backend.kchai-google-us-east4.chaiverse.com', port=80): Read timed out. (read timeout=20)
Received unhealthy response to inference request!
2026-03-24T02:58:39.701327+00:00 monitor updated for chaiml-mega-v1-top2-q35_93469_v2
HTTPConnectionPool(host='guanaco-submitter-v2.guanaco-backend.kchai-google-us-east4.chaiverse.com', port=80): Read timed out. (read timeout=20)
Received unhealthy response to inference request!
Received healthy response to inference request in 3.5559048652648926s
HTTPConnectionPool(host='guanaco-submitter-v2.guanaco-backend.kchai-google-us-east4.chaiverse.com', port=80): Read timed out. (read timeout=20)
Received unhealthy response to inference request!
2026-03-24T02:59:40.069633+00:00 monitor updated for chaiml-mega-v1-top2-q35_93469_v2
HTTPConnectionPool(host='guanaco-submitter-v2.guanaco-backend.kchai-google-us-east4.chaiverse.com', port=80): Read timed out. (read timeout=20)
Received unhealthy response to inference request!
Received healthy response to inference request in 7.560349941253662s
Received healthy response to inference request in 1.7347676753997803s
Received healthy response to inference request in 2.1203830242156982s
HTTPConnectionPool(host='guanaco-submitter-v2.guanaco-backend.kchai-google-us-east4.chaiverse.com', port=80): Read timed out. (read timeout=20)
Received unhealthy response to inference request!
Received healthy response to inference request in 1.142197847366333s
Received healthy response to inference request in 1.049851417541504s
HTTPConnectionPool(host='guanaco-submitter-v2.guanaco-backend.kchai-google-us-east4.chaiverse.com', port=80): Read timed out. (read timeout=20)
Received unhealthy response to inference request!
2026-03-24T03:00:40.192445+00:00 monitor updated for chaiml-mega-v1-top2-q35_93469_v2
HTTPConnectionPool(host='guanaco-submitter-v2.guanaco-backend.kchai-google-us-east4.chaiverse.com', port=80): Read timed out. (read timeout=20)
Received unhealthy response to inference request!
Received healthy response to inference request in 1.1172494888305664s
Received healthy response to inference request in 1.127962350845337s
HTTPConnectionPool(host='guanaco-submitter-v2.guanaco-backend.kchai-google-us-east4.chaiverse.com', port=80): Read timed out. (read timeout=20)
Received unhealthy response to inference request!
Received healthy response to inference request in 1.6713569164276123s
Received healthy response to inference request in 1.840402603149414s
Received healthy response to inference request in 3.490910768508911s
Received healthy response to inference request in 1.4992592334747314s
Received healthy response to inference request in 1.0484216213226318s
Received healthy response to inference request in 2.3074891567230225s
Received healthy response to inference request in 1.6461338996887207s
Received healthy response to inference request in 4.758827209472656s
Received healthy response to inference request in 1.168794870376587s
2026-03-24T03:01:40.289099+00:00 monitor updated for chaiml-mega-v1-top2-q35_93469_v2
Received healthy response to inference request in 1.1803185939788818s
Received healthy response to inference request in 1.7672762870788574s
Received healthy response to inference request in 1.1040804386138916s
30 requests
10 failed requests
5th percentile: 1.0742544770240783
10th percentile: 1.115932583808899
20th percentile: 1.163475465774536
30th percentile: 1.6020714998245238
40th percentile: 1.7542728424072267
50th percentile: 2.2139360904693604
60th percentile: 4.0370738029479964
70th percentile: 20.099514055252076
80th percentile: 20.116757822036742
90th percentile: 20.11868212223053
95th percentile: 20.123214399814607
99th percentile: 20.13084926843643
mean time: 8.135250560442607
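The jump from ~4s at the 60th percentile to ~20.1s at the 70th reflects the 10 failed requests: each hit the 20-second read timeout, so a third of the sample sits just above 20s and drags the mean to ~8.1s. A sketch of percentile computation with linear interpolation (the method numpy.percentile uses by default), on an illustrative latency sample rather than the actual 30 measurements:

```python
def percentile(samples, p):
    """Percentile p (0-100) of samples, with linear interpolation."""
    xs = sorted(samples)
    k = (len(xs) - 1) * p / 100          # fractional rank
    lo, hi = int(k), min(int(k) + 1, len(xs) - 1)
    return xs[lo] + (xs[hi] - xs[lo]) * (k - lo)

# Toy sample: 6 healthy latencies plus 4 requests stuck at the 20s timeout.
latencies = [1.0, 1.1, 1.2, 1.3, 1.4, 1.5, 20.1, 20.1, 20.1, 20.1]
percentile(latencies, 50)   # 1.45 — median, unaffected by the timeouts
percentile(latencies, 70)   # 20.1 — the tail is all timed-out requests
```

This is why the retried run below, with 0 failed requests, shows every percentile between 1.0s and 1.6s.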
%s, retrying in %s seconds...
Received healthy response to inference request in 1.162334680557251s
Received healthy response to inference request in 1.1253604888916016s
Received healthy response to inference request in 1.0533506870269775s
Received healthy response to inference request in 1.195281744003296s
Received healthy response to inference request in 1.2796852588653564s
Received healthy response to inference request in 1.0998117923736572s
Received healthy response to inference request in 1.2153520584106445s
Received healthy response to inference request in 1.3624608516693115s
Received healthy response to inference request in 1.181861400604248s
Received healthy response to inference request in 1.0557835102081299s
Received healthy response to inference request in 1.1634366512298584s
Received healthy response to inference request in 1.1781425476074219s
Received healthy response to inference request in 1.3431639671325684s
Received healthy response to inference request in 1.257885217666626s
Received healthy response to inference request in 1.1373093128204346s
Received healthy response to inference request in 1.4097514152526855s
Received healthy response to inference request in 1.081315040588379s
Received healthy response to inference request in 1.468458652496338s
Received healthy response to inference request in 1.1372432708740234s
Received healthy response to inference request in 1.0830073356628418s
Received healthy response to inference request in 1.1212425231933594s
Received healthy response to inference request in 1.6327295303344727s
Received healthy response to inference request in 1.0290515422821045s
Received healthy response to inference request in 1.267120599746704s
Received healthy response to inference request in 1.1416015625s
Received healthy response to inference request in 1.114983320236206s
Received healthy response to inference request in 1.1519391536712646s
Received healthy response to inference request in 1.1073050498962402s
Received healthy response to inference request in 1.42193603515625s
Received healthy response to inference request in 1.1188628673553467s
30 requests
0 failed requests
5th percentile: 1.054445457458496
10th percentile: 1.078761887550354
20th percentile: 1.1058063983917237
30th percentile: 1.1205286264419556
40th percentile: 1.1372828960418702
50th percentile: 1.1571369171142578
60th percentile: 1.1796300888061524
70th percentile: 1.2281120061874389
80th percentile: 1.292381000518799
90th percentile: 1.410969877243042
95th percentile: 1.4475234746932981
99th percentile: 1.5850909757614138
mean time: 1.2032589356104533
Pipeline stage StressChecker completed in 285.42s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyTriggerPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage OfflineFamilyFriendlyTriggerPipeline completed in 1.53s
Shutdown handler de-registered
chaiml-mega-v1-top2-q35_93469_v2 status is now deployed due to DeploymentManager action
chaiml-mega-v1-top2-q35_93469_v2 status is now inactive due to auto deactivation removed underperforming models
chaiml-mega-v1-top2-q35_93469_v2 status is now torndown due to DeploymentManager action