developer_uid: azuruce
submission_id: chaiml-mistral-nemo-simp_7503_v4
model_name: chaiml-mistral-nemo-simp_7503_v4
model_group: ChaiML/mistral_nemo_simp
status: inactive
timestamp: 2024-12-15T23:18:29+00:00
num_battles: 8458
num_wins: 3161
celo_rating: 1141.0
family_friendly_score: 0.5462
family_friendly_standard_error: 0.007040817566163747
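If the family-friendly score is an empirical proportion over a set of scored samples, the reported standard error would follow the usual binomial formula. A minimal sketch, assuming that interpretation (the back-solved sample count `n` is not reported above and is purely illustrative):

```python
import math

# Values reported in this submission's metadata.
p = 0.5462                              # family_friendly_score
se_reported = 0.007040817566163747      # family_friendly_standard_error

# Assumption: the score is a proportion, so SE = sqrt(p * (1 - p) / n).
# Back-solving gives an implied sample count of roughly 5,000.
n_implied = p * (1 - p) / se_reported ** 2
print(f"implied sample count: {n_implied:.0f}")

# Forward check with the implied n.
se_check = math.sqrt(p * (1 - p) / round(n_implied))
print(f"recomputed standard error: {se_check:.6f}")
```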
submission_type: basic
model_repo: ChaiML/mistral_nemo_simpo_baseline_albert_20241213_v2_mad-checkpoint-1872
model_architecture: MistralForCausalLM
model_num_parameters: 12772070400.0
best_of: 4
max_input_tokens: 1024
max_output_tokens: 64
latencies:

| batch_size | throughput (req/s) | latency_mean (s) | latency_p50 (s) | latency_p90 (s) |
|---:|---:|---:|---:|---:|
| 1 | 0.6410 | 1.5600 | 1.5684 | 1.7086 |
| 4 | 1.4900 | 2.6825 | 2.6911 | 3.0189 |
| 5 | 1.6411 | 3.0376 | 3.0229 | 3.4532 |
| 8 | 1.9030 | 4.1805 | 4.1874 | 4.6698 |
| 10 | 2.0072 | 4.9427 | 4.9360 | 5.7072 |
| 12 | 2.0517 | 5.8101 | 5.8184 | 6.5262 |
| 15 | 2.1113 | 7.0326 | 7.0426 | 8.0201 |
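The reported throughput tracks batch_size / latency_mean at each batch size to within a few percent. A minimal sketch checking that relationship against the latency profile above (values rounded to four decimals):

```python
# (batch_size, reported throughput, latency_mean) from the latency profile above.
latency_profile = [
    (1, 0.6410, 1.5600),
    (4, 1.4900, 2.6825),
    (5, 1.6411, 3.0376),
    (8, 1.9030, 4.1805),
    (10, 2.0072, 4.9427),
    (12, 2.0517, 5.8101),
    (15, 2.1113, 7.0326),
]

# Assumption: throughput ~= batch_size / mean latency.
for batch_size, reported, latency_mean in latency_profile:
    estimate = batch_size / latency_mean
    print(f"batch {batch_size:>2}: reported {reported:.3f} req/s, "
          f"batch_size / latency_mean = {estimate:.3f} req/s")
```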
gpu_counts: {'NVIDIA RTX A5000': 1}
display_name: chaiml-mistral-nemo-simp_7503_v4
is_internal_developer: True
language_model: ChaiML/mistral_nemo_simpo_baseline_albert_20241213_v2_mad-checkpoint-1872
model_size: 13B
ranking_group: single
throughput_3p7s: 1.82
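throughput_3p7s appears to be a throughput figure at a 3.7-second latency operating point. A hedged sketch, assuming it is obtained by linear interpolation over the latency profile above (the interpolation method is an assumption; it lands near, but not exactly on, the reported 1.82):

```python
import numpy as np

# (latency_mean, throughput) pairs from the latency profile above.
latency_mean = [1.5600, 2.6825, 3.0376, 4.1805, 4.9427, 5.8101, 7.0326]
throughput   = [0.6410, 1.4900, 1.6411, 1.9030, 2.0072, 2.0517, 2.1113]

# Assumption: throughput_3p7s is the throughput interpolated at 3.7 s mean latency.
estimate = np.interp(3.7, latency_mean, throughput)
print(f"interpolated throughput at 3.7 s: {estimate:.2f} req/s")  # ~1.79 vs reported 1.82
```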
us_pacific_date: 2024-12-15
win_ratio: 0.3737290139512887
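win_ratio is simply num_wins / num_battles. A minimal sketch recomputing it, plus an illustrative Elo expected-score helper (the 1230-rated opponent is hypothetical; opponent ratings are not reported here):

```python
num_battles = 8458
num_wins = 3161

win_ratio = num_wins / num_battles
print(f"win_ratio = {win_ratio:.16f}")  # 0.3737290139512887

def elo_expected_score(rating: float, opponent_rating: float) -> float:
    """Standard Elo expected score for `rating` against `opponent_rating`."""
    return 1.0 / (1.0 + 10 ** ((opponent_rating - rating) / 400))

# Illustration only: a 1141-rated model vs a hypothetical 1230-rated opponent.
print(f"expected score vs 1230: {elo_expected_score(1141.0, 1230.0):.3f}")
```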
generation_params: {'temperature': 1.0, 'top_p': 1.0, 'min_p': 0.0, 'top_k': 100, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['\n', '</s>', '####', 'Bot:', 'User:', 'You:', '<|im_end|>', '<|eot_id|>'], 'max_input_tokens': 1024, 'best_of': 4, 'max_output_tokens': 64}
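The submission is served through MK1's flywheel engine (see the MKMLizer log below), so the following is purely illustrative: a minimal sketch mapping the same generation_params onto vLLM's SamplingParams. best_of and max_input_tokens are assumed to be handled by the serving/reranking layer rather than passed as sampling parameters:

```python
from vllm import LLM, SamplingParams

# Sampling settings copied from generation_params above.
sampling_params = SamplingParams(
    temperature=1.0,
    top_p=1.0,
    min_p=0.0,
    top_k=100,
    presence_penalty=0.0,
    frequency_penalty=0.0,
    stop=["\n", "</s>", "####", "Bot:", "User:", "You:", "<|im_end|>", "<|eot_id|>"],
    max_tokens=64,  # max_output_tokens
)

# Requires a GPU with enough memory for the 12B checkpoint.
llm = LLM(model="ChaiML/mistral_nemo_simpo_baseline_albert_20241213_v2_mad-checkpoint-1872")
outputs = llm.generate(["Bot:"], sampling_params)
print(outputs[0].outputs[0].text)
```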
formatter: {'memory_template': '', 'prompt_template': '', 'bot_template': '{bot_name}: {message}\n', 'user_template': '{user_name}: {message}\n', 'response_template': '{bot_name}:', 'truncate_by_message': False}
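The formatter is a set of plain string templates. A minimal sketch of how a prompt might be assembled from them (the render_prompt helper and the example conversation are hypothetical, not platform code):

```python
# Templates copied from the formatter above.
formatter = {
    "bot_template": "{bot_name}: {message}\n",
    "user_template": "{user_name}: {message}\n",
    "response_template": "{bot_name}:",
}

def render_prompt(bot_name: str, user_name: str, turns: list[tuple[str, str]]) -> str:
    """Hypothetical helper: render (speaker, message) turns into one prompt string."""
    parts = []
    for speaker, message in turns:
        template = formatter["bot_template"] if speaker == "bot" else formatter["user_template"]
        parts.append(template.format(bot_name=bot_name, user_name=user_name, message=message))
    # The response template leaves the bot's next line open for the model to complete.
    parts.append(formatter["response_template"].format(bot_name=bot_name))
    return "".join(parts)

print(render_prompt("Bot", "User", [
    ("user", "Hi there!"),
    ("bot", "Hello! How can I help?"),
    ("user", "Tell me a story."),
]))
```

With max_input_tokens set to 1024 and truncate_by_message False, the rendered string would presumably be truncated at the token level rather than per message before generation.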
Resubmit model
Shutdown handler not registered because Python interpreter is not running in the main thread
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLizer
Starting job with name chaiml-mistral-nemo-simp-7503-v4-mkmlizer
Waiting for job on chaiml-mistral-nemo-simp-7503-v4-mkmlizer to finish
chaiml-mistral-nemo-simp-7503-v4-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
chaiml-mistral-nemo-simp-7503-v4-mkmlizer: ║ _____ __ __ ║
chaiml-mistral-nemo-simp-7503-v4-mkmlizer: ║ / _/ /_ ___ __/ / ___ ___ / / ║
chaiml-mistral-nemo-simp-7503-v4-mkmlizer: ║ / _/ / // / |/|/ / _ \/ -_) -_) / ║
chaiml-mistral-nemo-simp-7503-v4-mkmlizer: ║ /_//_/\_, /|__,__/_//_/\__/\__/_/ ║
chaiml-mistral-nemo-simp-7503-v4-mkmlizer: ║ /___/ ║
chaiml-mistral-nemo-simp-7503-v4-mkmlizer: ║ ║
chaiml-mistral-nemo-simp-7503-v4-mkmlizer: ║ Version: 0.11.12 ║
chaiml-mistral-nemo-simp-7503-v4-mkmlizer: ║ Copyright 2023 MK ONE TECHNOLOGIES Inc. ║
chaiml-mistral-nemo-simp-7503-v4-mkmlizer: ║ https://mk1.ai ║
chaiml-mistral-nemo-simp-7503-v4-mkmlizer: ║ ║
chaiml-mistral-nemo-simp-7503-v4-mkmlizer: ║ The license key for the current software has been verified as ║
chaiml-mistral-nemo-simp-7503-v4-mkmlizer: ║ belonging to: ║
chaiml-mistral-nemo-simp-7503-v4-mkmlizer: ║ ║
chaiml-mistral-nemo-simp-7503-v4-mkmlizer: ║ Chai Research Corp. ║
chaiml-mistral-nemo-simp-7503-v4-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
chaiml-mistral-nemo-simp-7503-v4-mkmlizer: ║ Expiration: 2025-01-15 23:59:59 ║
chaiml-mistral-nemo-simp-7503-v4-mkmlizer: ║ ║
chaiml-mistral-nemo-simp-7503-v4-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
chaiml-mistral-nemo-simp-7503-v4-mkmlizer: Downloaded to shared memory in 31.787s
chaiml-mistral-nemo-simp-7503-v4-mkmlizer: quantizing model to /dev/shm/model_cache, profile:s0, folder:/tmp/tmp934sxqz9, device:0
chaiml-mistral-nemo-simp-7503-v4-mkmlizer: Saving flywheel model at /dev/shm/model_cache
chaiml-mistral-nemo-simp-7503-v4-mkmlizer: quantized model in 36.115s
chaiml-mistral-nemo-simp-7503-v4-mkmlizer: Processed model ChaiML/mistral_nemo_simpo_baseline_albert_20241213_v2_mad-checkpoint-1872 in 67.902s
chaiml-mistral-nemo-simp-7503-v4-mkmlizer: creating bucket guanaco-mkml-models
chaiml-mistral-nemo-simp-7503-v4-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
chaiml-mistral-nemo-simp-7503-v4-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/chaiml-mistral-nemo-simp-7503-v4
chaiml-mistral-nemo-simp-7503-v4-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/chaiml-mistral-nemo-simp-7503-v4/config.json
chaiml-mistral-nemo-simp-7503-v4-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/chaiml-mistral-nemo-simp-7503-v4/special_tokens_map.json
chaiml-mistral-nemo-simp-7503-v4-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/chaiml-mistral-nemo-simp-7503-v4/tokenizer_config.json
chaiml-mistral-nemo-simp-7503-v4-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/chaiml-mistral-nemo-simp-7503-v4/tokenizer.json
chaiml-mistral-nemo-simp-7503-v4-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/chaiml-mistral-nemo-simp-7503-v4/flywheel_model.0.safetensors
chaiml-mistral-nemo-simp-7503-v4-mkmlizer: Loading 0: 100%|█████████▉| 362/363 [00:15<00:00, 28.12it/s]
Job chaiml-mistral-nemo-simp-7503-v4-mkmlizer completed after 94.64s with status: succeeded
Stopping job with name chaiml-mistral-nemo-simp-7503-v4-mkmlizer
Pipeline stage MKMLizer completed in 95.27s
run pipeline stage %s
Running pipeline stage MKMLTemplater
Pipeline stage MKMLTemplater completed in 0.18s
run pipeline stage %s
Running pipeline stage MKMLDeployer
Creating inference service chaiml-mistral-nemo-simp-7503-v4
Waiting for inference service chaiml-mistral-nemo-simp-7503-v4 to be ready
Failed to get response for submission chaiml-mistral-nemo-simp_3795_v1: ('http://chaiml-llama-8b-multihea-7878-v5-predictor.tenant-chaiml-guanaco.k2.chaiverse.com/v1/models/GPT-J-6B-lit-v2:predict', 'read tcp 127.0.0.1:43286->127.0.0.1:8080: read: connection reset by peer\n')
Connection pool is full, discarding connection: %s. Connection pool size: %s
Connection pool is full, discarding connection: %s. Connection pool size: %s
Connection pool is full, discarding connection: %s. Connection pool size: %s
Connection pool is full, discarding connection: %s. Connection pool size: %s
Inference service chaiml-mistral-nemo-simp-7503-v4 ready after 220.9531970024109s
Pipeline stage MKMLDeployer completed in 221.55s
run pipeline stage %s
Running pipeline stage StressChecker
Received healthy response to inference request in 1.7595014572143555s
Received healthy response to inference request in 1.7047083377838135s
Received healthy response to inference request in 1.4233620166778564s
Received healthy response to inference request in 1.3898320198059082s
Received healthy response to inference request in 1.3690919876098633s
5 requests
0 failed requests
5th percentile: 1.3732399940490723
10th percentile: 1.3773880004882812
20th percentile: 1.3856840133666992
30th percentile: 1.3965380191802979
40th percentile: 1.4099500179290771
50th percentile: 1.4233620166778564
60th percentile: 1.5359005451202392
70th percentile: 1.648439073562622
80th percentile: 1.7156669616699218
90th percentile: 1.7375842094421388
95th percentile: 1.7485428333282471
99th percentile: 1.7573097324371338
mean time: 1.5292991638183593
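The stress-check statistics above can be reproduced from the five healthy response times with numpy's default linear-interpolation percentile. A minimal sketch:

```python
import numpy as np

# The five healthy response times reported above, in seconds.
times = np.array([
    1.7595014572143555,
    1.7047083377838135,
    1.4233620166778564,
    1.3898320198059082,
    1.3690919876098633,
])

for q in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99):
    print(f"{q}th percentile: {np.percentile(times, q)}")
print(f"mean time: {times.mean()}")
```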
Pipeline stage StressChecker completed in 9.19s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyTriggerPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage OfflineFamilyFriendlyTriggerPipeline completed in 2.40s
run pipeline stage %s
Running pipeline stage TriggerMKMLProfilingPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage TriggerMKMLProfilingPipeline completed in 2.33s
Shutdown handler de-registered
chaiml-mistral-nemo-simp_7503_v4 status is now deployed due to DeploymentManager action
Shutdown handler registered
run pipeline %s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyScorer
Evaluating %s Family Friendly Score with %s threads
Pipeline stage OfflineFamilyFriendlyScorer completed in 2395.55s
Shutdown handler de-registered
chaiml-mistral-nemo-simp_7503_v4 status is now inactive due to auto deactivation of underperforming models