developer_uid: junhua024
submission_id: junhua024-chai-16-full-12_429_v3
model_name: junhua024-chai-16-full-12_429_v3
model_group: junhua024/chai_16_full_1
status: torndown
timestamp: 2025-07-19T21:14:12+00:00
num_battles: 7629
num_wins: 3781
celo_rating: 1276.25
family_friendly_score: 0.5626
family_friendly_standard_error: 0.0070154292812343285
submission_type: basic
model_repo: junhua024/chai_16_full_12_o_ffn_1925
model_architecture: MistralForCausalLM
model_num_parameters: 12772070400.0
best_of: 8
max_input_tokens: 1024
max_output_tokens: 64
reward_model: default
latencies: [{'batch_size': 1, 'throughput': 0.5943696810913318, 'latency_mean': 1.6822781491279601, 'latency_p50': 1.6899476051330566, 'latency_p90': 1.8612717628479003}, {'batch_size': 3, 'throughput': 1.0705634885243458, 'latency_mean': 2.789328808784485, 'latency_p50': 2.7958768606185913, 'latency_p90': 3.0954867124557492}, {'batch_size': 5, 'throughput': 1.267574026133155, 'latency_mean': 3.9201651513576508, 'latency_p50': 3.9249722957611084, 'latency_p90': 4.413367700576782}, {'batch_size': 6, 'throughput': 1.3258896086696172, 'latency_mean': 4.500145738124847, 'latency_p50': 4.510457277297974, 'latency_p90': 5.016635179519653}, {'batch_size': 8, 'throughput': 1.3901199181198507, 'latency_mean': 5.712511178255081, 'latency_p50': 5.752899408340454, 'latency_p90': 6.3997688531875605}, {'batch_size': 10, 'throughput': 1.4205417593546166, 'latency_mean': 6.975988306999207, 'latency_p50': 7.062240123748779, 'latency_p90': 7.842541337013245}]
gpu_counts: {'NVIDIA RTX A5000': 1}
display_name: junhua024-chai-16-full-12_429_v3
is_internal_developer: False
language_model: junhua024/chai_16_full_12_o_ffn_1925
model_size: 13B
ranking_group: single
throughput_3p7s: 1.25
us_pacific_date: 2025-07-19
win_ratio: 0.4956088609254162
generation_params: {'temperature': 1.0, 'top_p': 0.88, 'min_p': 0.0, 'top_k': 10, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['\n'], 'max_input_tokens': 1024, 'best_of': 8, 'max_output_tokens': 64}
formatter: {'memory_template': "{bot_name}'s Persona: {memory}\n####\n", 'prompt_template': '{prompt}\n<START>\n', 'bot_template': '{bot_name}: {message}\n', 'user_template': '{user_name}: {message}\n', 'response_template': '{bot_name}:', 'truncate_by_message': False}
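The latencies reported above pair a mean latency with a throughput for each batch size. As a rough consistency check, the throughput at a 3.7 s latency can be estimated by linear interpolation over those pairs; this is a hypothetical sketch (the exact method behind the reported throughput_3p7s is not stated in the log, and simple interpolation lands near, but not exactly at, the reported 1.25):

```python
# Hedged sketch: linearly interpolate throughput at a 3.7 s mean latency
# from the (latency_mean, throughput) pairs in the latencies field above.
# The actual formula behind throughput_3p7s is an assumption.

def interp_throughput(points, target_latency):
    """Linear interpolation over (latency, throughput) pairs sorted by latency."""
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        if x0 <= target_latency <= x1:
            t = (target_latency - x0) / (x1 - x0)
            return y0 + t * (y1 - y0)
    raise ValueError("target latency outside measured range")

points = [
    (1.6823, 0.5944),  # batch_size 1
    (2.7893, 1.0706),  # batch_size 3
    (3.9202, 1.2676),  # batch_size 5
    (4.5001, 1.3259),  # batch_size 6
    (5.7125, 1.3901),  # batch_size 8
    (6.9760, 1.4205),  # batch_size 10
]

print(round(interp_throughput(points, 3.7), 2))  # → 1.23, close to the reported 1.25
```

The small gap from the reported 1.25 suggests the pipeline may interpolate on a different latency statistic or fit a curve rather than a line.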
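The summary statistics above are internally consistent: win_ratio is num_wins / num_battles, and the family-friendly standard error is what a binomial proportion would give over roughly 5,000 samples. The 5,000-sample figure is an inference, not something the record states:

```python
import math

# Consistency check on the reported statistics. The n = 5000 sample count for
# the family-friendly score is a guess inferred from the standard error.
num_battles = 7629
num_wins = 3781
win_ratio = num_wins / num_battles
print(win_ratio)  # → 0.4956088609254162, matching the reported win_ratio

# Binomial standard error sqrt(p * (1 - p) / n) under the assumed n.
p, n = 0.5626, 5000
se = math.sqrt(p * (1 - p) / n)
print(round(se, 6))  # → 0.007015, matching the reported standard error
```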
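The generation_params above combine top_k=10 and top_p=0.88 (nucleus) filtering before sampling. A minimal generic sketch of that filtering step, on a toy distribution (this illustrates the standard technique, not the serving stack's actual code):

```python
# Generic top-k then top-p (nucleus) filtering sketch, for illustration only.

def top_k_top_p_filter(probs, top_k=10, top_p=0.88):
    """Keep the top_k most likely tokens, then the smallest prefix of them
    whose cumulative probability reaches top_p; renormalize the survivors."""
    ranked = sorted(probs.items(), key=lambda kv: kv[1], reverse=True)[:top_k]
    kept, cum = [], 0.0
    for tok, p in ranked:
        kept.append((tok, p))
        cum += p
        if cum >= top_p:
            break
    total = sum(p for _, p in kept)
    return {tok: p / total for tok, p in kept}

toy = {"a": 0.5, "b": 0.3, "c": 0.15, "d": 0.05}
# "a" + "b" = 0.80 < 0.88, so "c" is also kept; "d" is pruned.
print(top_k_top_p_filter(toy))
```

With best_of=8, the deployment would draw eight such samples and keep the one preferred by the reward model.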
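The formatter templates above describe how a conversation is flattened into a single prompt string. A sketch of how those templates could be assembled, with made-up conversation content (the assembly order shown — persona, prompt, turns, response stub — is an assumption based on the template names):

```python
# Sketch of prompt assembly from the formatter templates above.
# Bot name, user name, and messages are invented for illustration.

formatter = {
    "memory_template": "{bot_name}'s Persona: {memory}\n####\n",
    "prompt_template": "{prompt}\n<START>\n",
    "bot_template": "{bot_name}: {message}\n",
    "user_template": "{user_name}: {message}\n",
    "response_template": "{bot_name}:",
}

def build_prompt(bot_name, user_name, memory, prompt, turns):
    parts = [
        formatter["memory_template"].format(bot_name=bot_name, memory=memory),
        formatter["prompt_template"].format(prompt=prompt),
    ]
    for speaker, message in turns:
        if speaker == "bot":
            parts.append(formatter["bot_template"].format(bot_name=bot_name, message=message))
        else:
            parts.append(formatter["user_template"].format(user_name=user_name, message=message))
    # The response template leaves the prompt ending at "{bot_name}:" so the
    # model continues as the bot; generation stops at "\n" per stopping_words.
    parts.append(formatter["response_template"].format(bot_name=bot_name))
    return "".join(parts)

text = build_prompt("Aria", "User", "a friendly guide", "First meeting",
                    [("user", "Hi there!"), ("bot", "Hello!")])
print(text)
```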
Shutdown handler not registered because Python interpreter is not running in the main thread
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLizer
Starting job with name junhua024-chai-16-full-12-429-v3-mkmlizer
Waiting for job on junhua024-chai-16-full-12-429-v3-mkmlizer to finish
junhua024-chai-16-full-12-429-v3-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
junhua024-chai-16-full-12-429-v3-mkmlizer: ║ ║
junhua024-chai-16-full-12-429-v3-mkmlizer: ║ ██████ ██████ █████ ████ ████ ║
junhua024-chai-16-full-12-429-v3-mkmlizer: ║ ░░██████ ██████ ░░███ ███░ ░░███ ║
junhua024-chai-16-full-12-429-v3-mkmlizer: ║ ░███░█████░███ ░███ ███ ░███ ║
junhua024-chai-16-full-12-429-v3-mkmlizer: ║ ░███░░███ ░███ ░███████ ░███ ║
junhua024-chai-16-full-12-429-v3-mkmlizer: ║ ░███ ░░░ ░███ ░███░░███ ░███ ║
junhua024-chai-16-full-12-429-v3-mkmlizer: ║ ░███ ░███ ░███ ░░███ ░███ ║
junhua024-chai-16-full-12-429-v3-mkmlizer: ║ █████ █████ █████ ░░████ █████ ║
junhua024-chai-16-full-12-429-v3-mkmlizer: ║ ░░░░░ ░░░░░ ░░░░░ ░░░░ ░░░░░ ║
junhua024-chai-16-full-12-429-v3-mkmlizer: ║ ║
junhua024-chai-16-full-12-429-v3-mkmlizer: ║ Version: 0.29.15 ║
junhua024-chai-16-full-12-429-v3-mkmlizer: ║ Features: FLYWHEEL, CUDA ║
junhua024-chai-16-full-12-429-v3-mkmlizer: ║ Copyright 2023-2025 MK ONE TECHNOLOGIES Inc. ║
junhua024-chai-16-full-12-429-v3-mkmlizer: ║ https://mk1.ai ║
junhua024-chai-16-full-12-429-v3-mkmlizer: ║ ║
junhua024-chai-16-full-12-429-v3-mkmlizer: ║ The license key for the current software has been verified as ║
junhua024-chai-16-full-12-429-v3-mkmlizer: ║ belonging to: ║
junhua024-chai-16-full-12-429-v3-mkmlizer: ║ ║
junhua024-chai-16-full-12-429-v3-mkmlizer: ║ Chai Research Corp. ║
junhua024-chai-16-full-12-429-v3-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
junhua024-chai-16-full-12-429-v3-mkmlizer: ║ Expiration: 2028-03-31 23:59:59 ║
junhua024-chai-16-full-12-429-v3-mkmlizer: ║ ║
junhua024-chai-16-full-12-429-v3-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
junhua024-chai-16-full-12-429-v3-mkmlizer: Xet Storage is enabled for this repo, but the 'hf_xet' package is not installed. Falling back to regular HTTP download. For better performance, install the package with: `pip install huggingface_hub[hf_xet]` or `pip install hf_xet`
junhua024-chai-16-full-12-429-v3-mkmlizer: Downloaded to shared memory in 149.718s
junhua024-chai-16-full-12-429-v3-mkmlizer: Checking if junhua024/chai_16_full_12_o_ffn_1925 already exists in ChaiML
junhua024-chai-16-full-12-429-v3-mkmlizer: quantizing model to /dev/shm/model_cache, profile:s0, folder:/tmp/tmpf_13ykc9, device:0
junhua024-chai-16-full-12-429-v3-mkmlizer: Saving flywheel model at /dev/shm/model_cache
Failed to get response for submission chaiml-nis-qwen32b-sim_98336_v34: HTTPConnectionPool(host='chaiml-nis-qwen32b-sim-98336-v34-predictor.tenant-chaiml-guanaco.k.chaiverse.com', port=80): Read timed out. (read timeout=12.0)
junhua024-chai-16-full-12-429-v3-mkmlizer: quantized model in 32.241s
junhua024-chai-16-full-12-429-v3-mkmlizer: Processed model junhua024/chai_16_full_12_o_ffn_1925 in 182.049s
junhua024-chai-16-full-12-429-v3-mkmlizer: creating bucket guanaco-mkml-models
junhua024-chai-16-full-12-429-v3-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
junhua024-chai-16-full-12-429-v3-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/junhua024-chai-16-full-12-429-v3/nvidia
junhua024-chai-16-full-12-429-v3-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/junhua024-chai-16-full-12-429-v3/nvidia/config.json
junhua024-chai-16-full-12-429-v3-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/junhua024-chai-16-full-12-429-v3/nvidia/special_tokens_map.json
junhua024-chai-16-full-12-429-v3-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/junhua024-chai-16-full-12-429-v3/nvidia/tokenizer_config.json
junhua024-chai-16-full-12-429-v3-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/junhua024-chai-16-full-12-429-v3/nvidia/tokenizer.json
junhua024-chai-16-full-12-429-v3-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/junhua024-chai-16-full-12-429-v3/nvidia/flywheel_model.0.safetensors
Job junhua024-chai-16-full-12-429-v3-mkmlizer completed after 211.21s with status: succeeded
Stopping job with name junhua024-chai-16-full-12-429-v3-mkmlizer
Pipeline stage MKMLizer completed in 211.76s
run pipeline stage %s
Running pipeline stage MKMLTemplater
Pipeline stage MKMLTemplater completed in 0.15s
run pipeline stage %s
Running pipeline stage MKMLDeployer
Creating inference service junhua024-chai-16-full-12-429-v3
Waiting for inference service junhua024-chai-16-full-12-429-v3 to be ready
Retrying (%r) after connection broken by '%r': %s
Inference service junhua024-chai-16-full-12-429-v3 ready after 331.46095967292786s
Pipeline stage MKMLDeployer completed in 331.90s
run pipeline stage %s
Running pipeline stage StressChecker
Received healthy response to inference request in 2.4987235069274902s
Received healthy response to inference request in 1.6631414890289307s
Received healthy response to inference request in 1.6433894634246826s
Received healthy response to inference request in 1.542820930480957s
Received healthy response to inference request in 2.034693956375122s
5 requests
0 failed requests
5th percentile: 1.5629346370697021
10th percentile: 1.5830483436584473
20th percentile: 1.6232757568359375
30th percentile: 1.6473398685455323
40th percentile: 1.6552406787872314
50th percentile: 1.6631414890289307
60th percentile: 1.8117624759674071
70th percentile: 1.9603834629058836
80th percentile: 2.127499866485596
90th percentile: 2.313111686706543
95th percentile: 2.4059175968170163
99th percentile: 2.4801623249053955
mean time: 1.8765538692474366
Pipeline stage StressChecker completed in 11.09s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyTriggerPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage OfflineFamilyFriendlyTriggerPipeline completed in 0.65s
run pipeline stage %s
Running pipeline stage TriggerMKMLProfilingPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage TriggerMKMLProfilingPipeline completed in 0.72s
Shutdown handler de-registered
junhua024-chai-16-full-12_429_v3 status is now deployed due to DeploymentManager action
Shutdown handler registered
run pipeline %s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyScorer
Evaluating %s Family Friendly Score with %s threads
%s, retrying in %s seconds...
Evaluating %s Family Friendly Score with %s threads
Pipeline stage OfflineFamilyFriendlyScorer completed in 4892.01s
Shutdown handler de-registered
junhua024-chai-16-full-12_429_v3 status is now inactive due to auto-deactivation of underperforming models
junhua024-chai-16-full-12_429_v3 status is now torndown due to DeploymentManager action