developer_uid: chai_evaluation_service
submission_id: evelyn777-chai-sft-3b_v2
model_name: evelyn777-chai-sft-3b_v2
model_group: evelyn777/chai-sft-3b
status: inactive
timestamp: 2026-02-07T21:56:55+00:00
num_battles: 11151
num_wins: 3819
celo_rating: 1192.39
family_friendly_score: 0.0
family_friendly_standard_error: 0.0
submission_type: basic
model_repo: evelyn777/chai-sft-3b
model_architecture: Qwen2ForCausalLM
model_num_parameters: 3397011456.0
best_of: 8
max_input_tokens: 2048
max_output_tokens: 64
reward_model: default
display_name: evelyn777-chai-sft-3b_v2
is_internal_developer: True
language_model: evelyn777/chai-sft-3b
model_size: 3B
ranking_group: single
us_pacific_date: 2026-02-07
win_ratio: 0.3424804950228679
generation_params: {'temperature': 0.7, 'top_p': 0.9, 'min_p': 0.0, 'top_k': 40, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['\n'], 'max_input_tokens': 2048, 'best_of': 8, 'max_output_tokens': 64}
formatter: {'memory_template': '<|im_start|>system\n{memory}<|im_end|>\n', 'prompt_template': '<|im_start|>user\n{prompt}<|im_end|>\n', 'bot_template': '<|im_start|>assistant\n{bot_name}: {message}<|im_end|>\n', 'user_template': '<|im_start|>user\n{user_name}: {message}<|im_end|>\n', 'response_template': '<|im_start|>assistant\n{bot_name}:', 'truncate_by_message': True}
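The generation_params and formatter above fully determine how each request is sampled and how a conversation is rendered into a prompt. A minimal sketch of how they could be applied, assuming vLLM's SamplingParams (on a version that still accepts best_of) and plain str.format rendering of the templates; build_prompt is a hypothetical helper, not part of this pipeline:

from vllm import SamplingParams

# Sampling settings copied from generation_params above.
sampling = SamplingParams(
    temperature=0.7,
    top_p=0.9,
    min_p=0.0,
    top_k=40,
    presence_penalty=0.0,
    frequency_penalty=0.0,
    stop=["\n"],       # stopping_words
    max_tokens=64,     # max_output_tokens
    best_of=8,
    n=1,
)

# Hypothetical prompt builder: renders a conversation with the formatter templates.
def build_prompt(memory, turns, bot_name):
    parts = ['<|im_start|>system\n{memory}<|im_end|>\n'.format(memory=memory)]
    for role, name, message in turns:  # role is "user" or "bot"
        if role == "user":
            parts.append('<|im_start|>user\n{user_name}: {message}<|im_end|>\n'
                         .format(user_name=name, message=message))
        else:
            parts.append('<|im_start|>assistant\n{bot_name}: {message}<|im_end|>\n'
                         .format(bot_name=name, message=message))
    # response_template primes the model to answer as the bot; generation stops at '\n'.
    parts.append('<|im_start|>assistant\n{bot_name}:'.format(bot_name=bot_name))
    return "".join(parts)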
Resubmit model
Shutdown handler not registered because Python interpreter is not running in the main thread
run pipeline %s
run pipeline stage VLLMUploader
Running pipeline stage VLLMUploader
Starting job with name evelyn777-chai-sft-3b-v2-uploader
Waiting for job on evelyn777-chai-sft-3b-v2-uploader to finish
evelyn777-chai-sft-3b-v2-uploader: Using quantization_mode: none
evelyn777-chai-sft-3b-v2-uploader: Downloading snapshot of evelyn777/chai-sft-3b...
evelyn777-chai-sft-3b-v2-uploader: Fetching 13 files: 100%|██████████| 13/13 [00:04<00:00, 2.82it/s]
evelyn777-chai-sft-3b-v2-uploader: Downloaded in 4.904s
evelyn777-chai-sft-3b-v2-uploader: Processed model evelyn777/chai-sft-3b in 7.444s
evelyn777-chai-sft-3b-v2-uploader: creating bucket guanaco-vllm-models
evelyn777-chai-sft-3b-v2-uploader: /usr/lib/python3/dist-packages/S3/BaseUtils.py:56: SyntaxWarning: invalid escape sequence '\.'
evelyn777-chai-sft-3b-v2-uploader: RE_S3_DATESTRING = re.compile('\.[0-9]*(?:[Z\\-\\+]*?)')
evelyn777-chai-sft-3b-v2-uploader: /usr/lib/python3/dist-packages/S3/BaseUtils.py:57: SyntaxWarning: invalid escape sequence '\s'
evelyn777-chai-sft-3b-v2-uploader: RE_XML_NAMESPACE = re.compile(b'^(<?[^>]+?>\s*|\s*)(<\w+) xmlns=[\'"](https?://[^\'"]+)[\'"]', re.MULTILINE)
evelyn777-chai-sft-3b-v2-uploader: /usr/lib/python3/dist-packages/S3/Utils.py:240: SyntaxWarning: invalid escape sequence '\.'
evelyn777-chai-sft-3b-v2-uploader: invalid = re.search("([^a-z0-9\.-])", bucket, re.UNICODE)
evelyn777-chai-sft-3b-v2-uploader: /usr/lib/python3/dist-packages/S3/Utils.py:244: SyntaxWarning: invalid escape sequence '\.'
evelyn777-chai-sft-3b-v2-uploader: invalid = re.search("([^A-Za-z0-9\._-])", bucket, re.UNICODE)
evelyn777-chai-sft-3b-v2-uploader: /usr/lib/python3/dist-packages/S3/Utils.py:255: SyntaxWarning: invalid escape sequence '\.'
evelyn777-chai-sft-3b-v2-uploader: if re.search("-\.", bucket, re.UNICODE):
evelyn777-chai-sft-3b-v2-uploader: /usr/lib/python3/dist-packages/S3/Utils.py:257: SyntaxWarning: invalid escape sequence '\.'
evelyn777-chai-sft-3b-v2-uploader: if re.search("\.\.", bucket, re.UNICODE):
evelyn777-chai-sft-3b-v2-uploader: /usr/lib/python3/dist-packages/S3/S3Uri.py:155: SyntaxWarning: invalid escape sequence '\w'
evelyn777-chai-sft-3b-v2-uploader: _re = re.compile("^(\w+://)?(.*)", re.UNICODE)
evelyn777-chai-sft-3b-v2-uploader: /usr/lib/python3/dist-packages/S3/FileLists.py:480: SyntaxWarning: invalid escape sequence '\*'
evelyn777-chai-sft-3b-v2-uploader: wildcard_split_result = re.split("\*|\?", uri_str, maxsplit=1)
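The SyntaxWarning lines above come from the system-packaged S3 modules used by the uploader: their regex patterns contain backslash escapes such as '\.' and '\w' inside ordinary string literals, which Python 3.12+ reports as invalid escape sequences. They are harmless for the upload; the conventional fix, sketched here for the first two flagged patterns (bucket name used only for illustration), is to use raw string literals:

import re

# Raw strings keep the backslashes literal, so no SyntaxWarning is emitted.
RE_S3_DATESTRING = re.compile(r'\.[0-9]*(?:[Z\-\+]*?)')
invalid = re.search(r"([^a-z0-9\.-])", "guanaco-vllm-models", re.UNICODE)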
evelyn777-chai-sft-3b-v2-uploader: Bucket 's3://guanaco-vllm-models/' created
evelyn777-chai-sft-3b-v2-uploader: uploading /dev/shm/model_output to s3://guanaco-vllm-models/evelyn777-chai-sft-3b-v2
evelyn777-chai-sft-3b-v2-uploader: cp /dev/shm/model_output/.gitattributes s3://guanaco-vllm-models/evelyn777-chai-sft-3b-v2/.gitattributes
evelyn777-chai-sft-3b-v2-uploader: cp /dev/shm/model_output/added_tokens.json s3://guanaco-vllm-models/evelyn777-chai-sft-3b-v2/added_tokens.json
evelyn777-chai-sft-3b-v2-uploader: cp /dev/shm/model_output/special_tokens_map.json s3://guanaco-vllm-models/evelyn777-chai-sft-3b-v2/special_tokens_map.json
evelyn777-chai-sft-3b-v2-uploader: cp /dev/shm/model_output/chat_template.jinja s3://guanaco-vllm-models/evelyn777-chai-sft-3b-v2/chat_template.jinja
evelyn777-chai-sft-3b-v2-uploader: cp /dev/shm/model_output/generation_config.json s3://guanaco-vllm-models/evelyn777-chai-sft-3b-v2/generation_config.json
evelyn777-chai-sft-3b-v2-uploader: cp /dev/shm/model_output/tokenizer_config.json s3://guanaco-vllm-models/evelyn777-chai-sft-3b-v2/tokenizer_config.json
evelyn777-chai-sft-3b-v2-uploader: cp /dev/shm/model_output/config.json s3://guanaco-vllm-models/evelyn777-chai-sft-3b-v2/config.json
evelyn777-chai-sft-3b-v2-uploader: cp /dev/shm/model_output/model.safetensors.index.json s3://guanaco-vllm-models/evelyn777-chai-sft-3b-v2/model.safetensors.index.json
evelyn777-chai-sft-3b-v2-uploader: cp /dev/shm/model_output/merges.txt s3://guanaco-vllm-models/evelyn777-chai-sft-3b-v2/merges.txt
evelyn777-chai-sft-3b-v2-uploader: cp /dev/shm/model_output/vocab.json s3://guanaco-vllm-models/evelyn777-chai-sft-3b-v2/vocab.json
evelyn777-chai-sft-3b-v2-uploader: cp /dev/shm/model_output/tokenizer.json s3://guanaco-vllm-models/evelyn777-chai-sft-3b-v2/tokenizer.json
evelyn777-chai-sft-3b-v2-uploader: cp /dev/shm/model_output/model-00002-of-00002.safetensors s3://guanaco-vllm-models/evelyn777-chai-sft-3b-v2/model-00002-of-00002.safetensors
evelyn777-chai-sft-3b-v2-uploader: cp /dev/shm/model_output/model-00001-of-00002.safetensors s3://guanaco-vllm-models/evelyn777-chai-sft-3b-v2/model-00001-of-00002.safetensors
Job evelyn777-chai-sft-3b-v2-uploader completed after 93.84s with status: succeeded
Stopping job with name evelyn777-chai-sft-3b-v2-uploader
Pipeline stage VLLMUploader completed in 94.30s
run pipeline stage VLLMTemplater
Running pipeline stage VLLMTemplater
Pipeline stage VLLMTemplater completed in 0.14s
run pipeline stage VLLMDeployer
Running pipeline stage VLLMDeployer
Creating inference service evelyn777-chai-sft-3b-v2
Waiting for inference service evelyn777-chai-sft-3b-v2 to be ready
Inference service evelyn777-chai-sft-3b-v2 ready after 170.85302448272705s
Pipeline stage VLLMDeployer completed in 171.36s
run pipeline stage StressChecker
Running pipeline stage StressChecker
Received healthy response to inference request in 0.9836118221282959s
Received healthy response to inference request in 1.23453688621521s
Received healthy response to inference request in 1.0308358669281006s
Received healthy response to inference request in 0.6913602352142334s
Received healthy response to inference request in 0.7395434379577637s
Received healthy response to inference request in 1.2768912315368652s
Received healthy response to inference request in 0.48784756660461426s
Received healthy response to inference request in 0.6291403770446777s
Received healthy response to inference request in 1.4478776454925537s
Received healthy response to inference request in 0.8960428237915039s
Received healthy response to inference request in 0.9096367359161377s
Received healthy response to inference request in 0.6548962593078613s
Received healthy response to inference request in 0.8218514919281006s
Received healthy response to inference request in 0.5235426425933838s
Received healthy response to inference request in 0.8900716304779053s
Received healthy response to inference request in 0.6731574535369873s
Received healthy response to inference request in 0.7439813613891602s
Received healthy response to inference request in 0.8154728412628174s
Received healthy response to inference request in 0.9945738315582275s
Received healthy response to inference request in 0.8465209007263184s
Received healthy response to inference request in 0.6850967407226562s
Received healthy response to inference request in 0.712655782699585s
Received healthy response to inference request in 0.68497633934021s
Received healthy response to inference request in 0.7045159339904785s
Received healthy response to inference request in 0.7271921634674072s
Received healthy response to inference request in 1.0272648334503174s
Received healthy response to inference request in 0.9516003131866455s
Received healthy response to inference request in 0.6390888690948486s
Received healthy response to inference request in 0.831669807434082s
Received healthy response to inference request in 0.6308588981628418s
30 requests
0 failed requests
5th percentile: 0.5710616230964661
10th percentile: 0.6306870460510254
20th percentile: 0.6695052146911621
30th percentile: 0.6894811868667603
40th percentile: 0.7213776111602783
50th percentile: 0.7797271013259888
60th percentile: 0.8376102447509766
70th percentile: 0.900120997428894
80th percentile: 0.9858042240142822
90th percentile: 1.0512059688568118
95th percentile: 1.2578317761421203
99th percentile: 1.398291585445404
mean time: 0.8295437574386597
Pipeline stage StressChecker completed in 27.29s
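The StressChecker summary is plain order statistics over the 30 logged latencies. A minimal sketch of the same computation, assuming numpy's default linearly interpolated percentiles and the request timings above rounded to four decimals:

import numpy as np

# Per-request latencies (seconds) from the 30 healthy responses logged above, rounded.
latencies = np.array([
    0.9836, 1.2345, 1.0308, 0.6914, 0.7395, 1.2769, 0.4878, 0.6291,
    1.4479, 0.8960, 0.9096, 0.6549, 0.8219, 0.5235, 0.8901, 0.6732,
    0.7440, 0.8155, 0.9946, 0.8465, 0.6851, 0.7127, 0.6850, 0.7045,
    0.7272, 1.0273, 0.9516, 0.6391, 0.8317, 0.6309,
])

print(f"{len(latencies)} requests")
for p in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99):
    print(f"{p}th percentile: {np.percentile(latencies, p):.4f}")
print(f"mean time: {latencies.mean():.4f}")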
run pipeline stage OfflineFamilyFriendlyTriggerPipeline
Running pipeline stage OfflineFamilyFriendlyTriggerPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage OfflineFamilyFriendlyTriggerPipeline completed in 0.72s
Shutdown handler de-registered
evelyn777-chai-sft-3b_v2 status is now deployed due to DeploymentManager action
evelyn777-chai-sft-3b_v2 status is now inactive due to auto-deactivation of underperforming models