developer_uid: chai_evaluation_service
submission_id: evelyn777-chai-sft-3b-v3_v3
model_name: evelyn777-chai-sft-3b-v3_v3
model_group: evelyn777/chai-sft-3b-v3
status: inactive
timestamp: 2026-02-08T08:17:05+00:00
num_battles: 14094
num_wins: 5053
celo_rating: 1200.94
family_friendly_score: 0.0
family_friendly_standard_error: 0.0
submission_type: basic
model_repo: evelyn777/chai-sft-3b-v3
model_architecture: Qwen2ForCausalLM
model_num_parameters: 3397011456.0
best_of: 8
max_input_tokens: 2048
max_output_tokens: 64
reward_model: default
display_name: evelyn777-chai-sft-3b-v3_v3
is_internal_developer: True
language_model: evelyn777/chai-sft-3b-v3
model_size: 3B
ranking_group: single
us_pacific_date: 2026-02-08
win_ratio: 0.35852135660564777
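The win_ratio above is directly derivable from the num_wins and num_battles fields; a quick sanity check in Python:

```python
# Sanity check: the reported win_ratio equals num_wins / num_battles.
num_battles = 14094
num_wins = 5053

win_ratio = num_wins / num_battles
print(win_ratio)  # close to the reported 0.35852135660564777
```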
generation_params: {'temperature': 0.6, 'top_p': 0.9, 'min_p': 0.0, 'top_k': 40, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['\n'], 'max_input_tokens': 2048, 'best_of': 8, 'max_output_tokens': 64}
formatter: {'memory_template': '<|im_start|>system\n{memory}<|im_end|>\n', 'prompt_template': '<|im_start|>user\n{prompt}<|im_end|>\n', 'bot_template': '<|im_start|>assistant\n{bot_name}: {message}<|im_end|>\n', 'user_template': '<|im_start|>user\n{user_name}: {message}<|im_end|>\n', 'response_template': '<|im_start|>assistant\n{bot_name}:', 'truncate_by_message': True}
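The formatter dict above is a set of Python str.format templates for assembling a ChatML-style prompt. A minimal sketch of how a conversation might be rendered with them; build_prompt and the sample messages are illustrative, not the service's actual code:

```python
# Illustrative sketch only: renders memory, then alternating turns, then the
# open-ended response stub, using the templates from the log above.
FORMATTER = {
    "memory_template": "<|im_start|>system\n{memory}<|im_end|>\n",
    "prompt_template": "<|im_start|>user\n{prompt}<|im_end|>\n",
    "bot_template": "<|im_start|>assistant\n{bot_name}: {message}<|im_end|>\n",
    "user_template": "<|im_start|>user\n{user_name}: {message}<|im_end|>\n",
    "response_template": "<|im_start|>assistant\n{bot_name}:",
}

def build_prompt(memory, turns, bot_name):
    """Render memory, each (speaker, message) turn, then the response stub."""
    parts = [FORMATTER["memory_template"].format(memory=memory)]
    for speaker, message in turns:
        if speaker == bot_name:
            parts.append(FORMATTER["bot_template"].format(bot_name=bot_name, message=message))
        else:
            parts.append(FORMATTER["user_template"].format(user_name=speaker, message=message))
    parts.append(FORMATTER["response_template"].format(bot_name=bot_name))
    return "".join(parts)

prompt = build_prompt("You are Evelyn.", [("Alice", "Hi!"), ("Evelyn", "Hello!")], "Evelyn")
print(prompt)
```

Truncation behavior ('truncate_by_message': True, max_input_tokens 2048) is not modeled here; the real pipeline presumably drops whole messages from the front of the history when the rendered prompt exceeds the token budget.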
Resubmit model
Shutdown handler not registered because Python interpreter is not running in the main thread
run pipeline %s
run pipeline stage %s
Running pipeline stage VLLMUploader
Starting job with name evelyn777-chai-sft-3b-v3-v3-uploader
Waiting for job on evelyn777-chai-sft-3b-v3-v3-uploader to finish
evelyn777-chai-sft-3b-v3-v3-uploader: Using quantization_mode: none
evelyn777-chai-sft-3b-v3-v3-uploader: Downloading snapshot of evelyn777/chai-sft-3b-v3...
evelyn777-chai-sft-3b-v3-v3-uploader: Fetching 13 files: 100%|██████████| 13/13 [00:04<00:00, 3.01it/s]
evelyn777-chai-sft-3b-v3-v3-uploader: Downloaded in 4.537s
evelyn777-chai-sft-3b-v3-v3-uploader: Processed model evelyn777/chai-sft-3b-v3 in 6.957s
evelyn777-chai-sft-3b-v3-v3-uploader: creating bucket guanaco-vllm-models
evelyn777-chai-sft-3b-v3-v3-uploader: /usr/lib/python3/dist-packages/S3/BaseUtils.py:56: SyntaxWarning: invalid escape sequence '\.'
evelyn777-chai-sft-3b-v3-v3-uploader: RE_S3_DATESTRING = re.compile('\.[0-9]*(?:[Z\\-\\+]*?)')
evelyn777-chai-sft-3b-v3-v3-uploader: /usr/lib/python3/dist-packages/S3/BaseUtils.py:57: SyntaxWarning: invalid escape sequence '\s'
evelyn777-chai-sft-3b-v3-v3-uploader: RE_XML_NAMESPACE = re.compile(b'^(<?[^>]+?>\s*|\s*)(<\w+) xmlns=[\'"](https?://[^\'"]+)[\'"]', re.MULTILINE)
evelyn777-chai-sft-3b-v3-v3-uploader: /usr/lib/python3/dist-packages/S3/Utils.py:240: SyntaxWarning: invalid escape sequence '\.'
evelyn777-chai-sft-3b-v3-v3-uploader: invalid = re.search("([^a-z0-9\.-])", bucket, re.UNICODE)
evelyn777-chai-sft-3b-v3-v3-uploader: /usr/lib/python3/dist-packages/S3/Utils.py:244: SyntaxWarning: invalid escape sequence '\.'
evelyn777-chai-sft-3b-v3-v3-uploader: invalid = re.search("([^A-Za-z0-9\._-])", bucket, re.UNICODE)
evelyn777-chai-sft-3b-v3-v3-uploader: /usr/lib/python3/dist-packages/S3/Utils.py:255: SyntaxWarning: invalid escape sequence '\.'
evelyn777-chai-sft-3b-v3-v3-uploader: if re.search("-\.", bucket, re.UNICODE):
evelyn777-chai-sft-3b-v3-v3-uploader: /usr/lib/python3/dist-packages/S3/Utils.py:257: SyntaxWarning: invalid escape sequence '\.'
evelyn777-chai-sft-3b-v3-v3-uploader: if re.search("\.\.", bucket, re.UNICODE):
evelyn777-chai-sft-3b-v3-v3-uploader: /usr/lib/python3/dist-packages/S3/S3Uri.py:155: SyntaxWarning: invalid escape sequence '\w'
evelyn777-chai-sft-3b-v3-v3-uploader: _re = re.compile("^(\w+://)?(.*)", re.UNICODE)
evelyn777-chai-sft-3b-v3-v3-uploader: /usr/lib/python3/dist-packages/S3/FileLists.py:480: SyntaxWarning: invalid escape sequence '\*'
evelyn777-chai-sft-3b-v3-v3-uploader: wildcard_split_result = re.split("\*|\?", uri_str, maxsplit=1)
evelyn777-chai-sft-3b-v3-v3-uploader: Bucket 's3://guanaco-vllm-models/' created
evelyn777-chai-sft-3b-v3-v3-uploader: uploading /dev/shm/model_output to s3://guanaco-vllm-models/evelyn777-chai-sft-3b-v3-v3
evelyn777-chai-sft-3b-v3-v3-uploader: cp /dev/shm/model_output/.gitattributes s3://guanaco-vllm-models/evelyn777-chai-sft-3b-v3-v3/.gitattributes
evelyn777-chai-sft-3b-v3-v3-uploader: cp /dev/shm/model_output/config.json s3://guanaco-vllm-models/evelyn777-chai-sft-3b-v3-v3/config.json
evelyn777-chai-sft-3b-v3-v3-uploader: cp /dev/shm/model_output/tokenizer_config.json s3://guanaco-vllm-models/evelyn777-chai-sft-3b-v3-v3/tokenizer_config.json
evelyn777-chai-sft-3b-v3-v3-uploader: cp /dev/shm/model_output/generation_config.json s3://guanaco-vllm-models/evelyn777-chai-sft-3b-v3-v3/generation_config.json
evelyn777-chai-sft-3b-v3-v3-uploader: cp /dev/shm/model_output/added_tokens.json s3://guanaco-vllm-models/evelyn777-chai-sft-3b-v3-v3/added_tokens.json
evelyn777-chai-sft-3b-v3-v3-uploader: cp /dev/shm/model_output/special_tokens_map.json s3://guanaco-vllm-models/evelyn777-chai-sft-3b-v3-v3/special_tokens_map.json
evelyn777-chai-sft-3b-v3-v3-uploader: cp /dev/shm/model_output/chat_template.jinja s3://guanaco-vllm-models/evelyn777-chai-sft-3b-v3-v3/chat_template.jinja
evelyn777-chai-sft-3b-v3-v3-uploader: cp /dev/shm/model_output/model.safetensors.index.json s3://guanaco-vllm-models/evelyn777-chai-sft-3b-v3-v3/model.safetensors.index.json
evelyn777-chai-sft-3b-v3-v3-uploader: cp /dev/shm/model_output/merges.txt s3://guanaco-vllm-models/evelyn777-chai-sft-3b-v3-v3/merges.txt
evelyn777-chai-sft-3b-v3-v3-uploader: cp /dev/shm/model_output/vocab.json s3://guanaco-vllm-models/evelyn777-chai-sft-3b-v3-v3/vocab.json
evelyn777-chai-sft-3b-v3-v3-uploader: cp /dev/shm/model_output/tokenizer.json s3://guanaco-vllm-models/evelyn777-chai-sft-3b-v3-v3/tokenizer.json
evelyn777-chai-sft-3b-v3-v3-uploader: cp /dev/shm/model_output/model-00002-of-00002.safetensors s3://guanaco-vllm-models/evelyn777-chai-sft-3b-v3-v3/model-00002-of-00002.safetensors
evelyn777-chai-sft-3b-v3-v3-uploader: cp /dev/shm/model_output/model-00001-of-00002.safetensors s3://guanaco-vllm-models/evelyn777-chai-sft-3b-v3-v3/model-00001-of-00002.safetensors
Job evelyn777-chai-sft-3b-v3-v3-uploader completed after 82.94s with status: succeeded
Stopping job with name evelyn777-chai-sft-3b-v3-v3-uploader
Pipeline stage VLLMUploader completed in 83.45s
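The uploader's cp lines above follow a simple pattern: each file under /dev/shm/model_output maps to the same basename under the bucket prefix. A sketch of how such an upload plan could be generated; plan_upload is a hypothetical helper, and the real job shells out to an s3cmd-style client rather than building pairs in Python:

```python
# Hypothetical sketch of the upload plan behind the "cp ..." log lines:
# one (source, destination) pair per file, preserving the basename.
from pathlib import PurePosixPath

def plan_upload(local_dir, filenames, bucket, prefix):
    """Return (source, destination) pairs mirroring the log's cp lines."""
    plan = []
    for name in filenames:
        src = str(PurePosixPath(local_dir) / name)
        dst = f"s3://{bucket}/{prefix}/{name}"
        plan.append((src, dst))
    return plan

plan = plan_upload("/dev/shm/model_output", ["config.json", "tokenizer.json"],
                   "guanaco-vllm-models", "evelyn777-chai-sft-3b-v3-v3")
for src, dst in plan:
    print(f"cp {src} {dst}")
```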
run pipeline stage %s
Running pipeline stage VLLMTemplater
Pipeline stage VLLMTemplater completed in 0.14s
run pipeline stage %s
Running pipeline stage VLLMDeployer
Creating inference service evelyn777-chai-sft-3b-v3-v3
Waiting for inference service evelyn777-chai-sft-3b-v3-v3 to be ready
HTTP Request: %s %s "%s %d %s"
Inference service evelyn777-chai-sft-3b-v3-v3 ready after 171.1568901538849s
Pipeline stage VLLMDeployer completed in 171.62s
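"Waiting for inference service ... to be ready" is a poll-until-ready loop. A generic sketch under the assumption that the readiness probe is a callable; wait_until_ready and the fake probe below are illustrative, not the deployer's actual code:

```python
import time

def wait_until_ready(is_ready, timeout_s=600.0, poll_interval_s=1.0,
                     sleep=time.sleep, clock=time.monotonic):
    """Poll is_ready() until it returns True or timeout_s elapses.

    Returns the elapsed time in seconds; raises TimeoutError on timeout.
    sleep/clock are injectable so the loop can be tested without waiting.
    """
    start = clock()
    while True:
        if is_ready():
            return clock() - start
        if clock() - start >= timeout_s:
            raise TimeoutError("service not ready within timeout")
        sleep(poll_interval_s)

# Demo with a fake probe that succeeds on the third poll.
attempts = iter([False, False, True])
elapsed = wait_until_ready(lambda: next(attempts),
                           timeout_s=5.0, poll_interval_s=0.0, sleep=lambda _: None)
print(f"ready after {elapsed:.3f}s")
```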
run pipeline stage %s
Running pipeline stage StressChecker
Received healthy response to inference request in 0.7930333614349365s
Received healthy response to inference request in 0.6300990581512451s
Received healthy response to inference request in 0.7192819118499756s
Received healthy response to inference request in 0.9923932552337646s
Received healthy response to inference request in 0.7731397151947021s
Received healthy response to inference request in 0.7313644886016846s
Received healthy response to inference request in 1.0309324264526367s
Received healthy response to inference request in 0.692903995513916s
Received healthy response to inference request in 0.6972875595092773s
Received healthy response to inference request in 0.7532310485839844s
Received healthy response to inference request in 0.834846019744873s
Received healthy response to inference request in 0.9520473480224609s
Received healthy response to inference request in 0.8828980922698975s
Received healthy response to inference request in 1.116898536682129s
Received healthy response to inference request in 0.7593066692352295s
Received healthy response to inference request in 0.7577967643737793s
Received healthy response to inference request in 1.034749984741211s
Received healthy response to inference request in 0.6571853160858154s
Received healthy response to inference request in 1.0288589000701904s
Received healthy response to inference request in 0.773512601852417s
Received healthy response to inference request in 1.0881998538970947s
Received healthy response to inference request in 0.7055597305297852s
Received healthy response to inference request in 0.7269148826599121s
Received healthy response to inference request in 0.6606357097625732s
Received healthy response to inference request in 0.8371646404266357s
Received healthy response to inference request in 0.540266752243042s
Received healthy response to inference request in 1.063345193862915s
Received healthy response to inference request in 0.7417852878570557s
Received healthy response to inference request in 1.1524550914764404s
Received healthy response to inference request in 0.6154327392578125s
30 requests
0 failed requests
5th percentile: 0.6220325827598572
10th percentile: 0.6544766902923584
20th percentile: 0.696410846710205
30th percentile: 0.7246249914169312
40th percentile: 0.7486527442932129
50th percentile: 0.7662231922149658
60th percentile: 0.809758424758911
70th percentile: 0.9036428689956663
80th percentile: 1.0292736053466798
90th percentile: 1.065830659866333
95th percentile: 1.1039841294288635
99th percentile: 1.1421436905860902
mean time: 0.824784231185913
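The summary statistics above are a mean plus percentiles over the 30 per-request latencies. The exact interpolation method StressChecker uses is not shown in the log; the sketch below assumes the common "linear" definition (numpy.percentile's default) and runs on a short sample of the latencies rather than the full 30:

```python
# Sketch of the summary stats: mean plus linearly interpolated percentiles.
# The interpolation method is an assumption; the service's code is not shown.
from statistics import fmean

def percentile(values, p):
    """p-th percentile with linear interpolation between order statistics."""
    xs = sorted(values)
    k = (len(xs) - 1) * (p / 100.0)
    lo = int(k)
    hi = min(lo + 1, len(xs) - 1)
    return xs[lo] + (xs[hi] - xs[lo]) * (k - lo)

latencies = [0.79, 0.63, 0.72, 0.99, 0.77]  # sample values, not the full 30
print(f"mean time: {fmean(latencies):.3f}")
print(f"50th percentile: {percentile(latencies, 50):.3f}")
```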
Pipeline stage StressChecker completed in 27.88s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyTriggerPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage OfflineFamilyFriendlyTriggerPipeline completed in 0.90s
Shutdown handler de-registered
evelyn777-chai-sft-3b-v3_v3 status is now deployed due to DeploymentManager action
evelyn777-chai-sft-3b-v3_v3 status is now inactive due to auto deactivation removed underperforming models