submission_id: zonemercy-lexical-viral-_3982_v1
developer_uid: chai_backend_admin
best_of: 4
celo_rating: 1277.64
display_name: tempv1-6
family_friendly_score: 0.5938
family_friendly_standard_error: 0.006945524602216884
formatter: {'memory_template': '', 'prompt_template': '', 'bot_template': '{bot_name}: {message}\n', 'user_template': '{user_name}: {message}\n', 'response_template': '{bot_name}:', 'truncate_by_message': False}
generation_params: {'temperature': 0.9, 'top_p': 1.0, 'min_p': 0.05, 'top_k': 80, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['\n', '</s>', '####', 'Bot:', 'User:', 'You:', '<|im_end|>', '<|eot_id|>'], 'max_input_tokens': 1024, 'best_of': 4, 'max_output_tokens': 64}
gpu_counts: {'NVIDIA RTX A6000': 1}
is_internal_developer: True
language_model: zonemercy/Lexical-Viral-v6ava-22b11e5r256
latencies: [{'batch_size': 1, 'throughput': 0.38182026225427557, 'latency_mean': 2.6189553999900816, 'latency_p50': 2.6312382221221924, 'latency_p90': 2.8808804512023927}, {'batch_size': 3, 'throughput': 0.8015609729085281, 'latency_mean': 3.7315286457538606, 'latency_p50': 3.75112521648407, 'latency_p90': 4.070001816749572}, {'batch_size': 5, 'throughput': 1.0662643675525973, 'latency_mean': 4.661459897756576, 'latency_p50': 4.679679870605469, 'latency_p90': 5.269766783714294}, {'batch_size': 6, 'throughput': 1.135903287764841, 'latency_mean': 5.240935279130936, 'latency_p50': 5.274003505706787, 'latency_p90': 5.8925337314605715}, {'batch_size': 10, 'throughput': 1.3560500933731563, 'latency_mean': 7.311697232723236, 'latency_p50': 7.301459193229675, 'latency_p90': 8.230113911628724}]
max_input_tokens: 1024
max_output_tokens: 64
model_architecture: MistralForCausalLM
model_group: zonemercy/Lexical-Viral-
model_name: tempv1-6
model_num_parameters: 22247282688.0
model_repo: zonemercy/Lexical-Viral-v6ava-22b11e5r256
model_size: 22B
num_battles: 13320
num_wins: 7238
ranking_group: single
status: inactive
submission_type: basic
throughput_3p7s: 0.8
timestamp: 2024-11-13T11:58:16+00:00
us_pacific_date: 2024-11-13
win_ratio: 0.5433933933933934
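For context on the formatter field above: it defines per-message templates with no memory or prompt block. A minimal sketch of how such templates could be assembled into a model input, assuming plain str.format substitution; the helper name build_prompt and the turns structure are illustrative, not taken from the pipeline code:

# Sketch only: applies bot_template/user_template per message and appends
# response_template as the generation cue, so the model continues as the bot.
FORMATTER = {
    "memory_template": "",
    "prompt_template": "",
    "bot_template": "{bot_name}: {message}\n",
    "user_template": "{user_name}: {message}\n",
    "response_template": "{bot_name}:",
}

def build_prompt(turns, bot_name="Bot", user_name="User"):
    """turns: list of (speaker, message) pairs, speaker in {"user", "bot"}."""
    parts = [FORMATTER["memory_template"], FORMATTER["prompt_template"]]
    for speaker, message in turns:
        tpl = FORMATTER["bot_template"] if speaker == "bot" else FORMATTER["user_template"]
        parts.append(tpl.format(bot_name=bot_name, user_name=user_name, message=message))
    parts.append(FORMATTER["response_template"].format(bot_name=bot_name))
    return "".join(parts)

print(build_prompt([("user", "hi"), ("bot", "hey!"), ("user", "how are you?")]))
# User: hi
# Bot: hey!
# User: how are you?
# Bot:

The stopping_words in generation_params ('\n', 'Bot:', 'User:', ...) then truncate the sampled continuation to a single reply.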
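The generation_params map onto standard sampling-parameter names. As an illustration only (the log does not say which inference engine serves the model), the same settings expressed with vLLM's SamplingParams:

# Illustration: generation_params from this submission mapped onto vLLM names.
# The actual serving stack behind the MKML pipeline is not shown in this log.
from vllm import SamplingParams

sampling = SamplingParams(
    n=1,
    best_of=4,               # sample 4 candidates, return the best-scoring one
    temperature=0.9,
    top_p=1.0,
    min_p=0.05,
    top_k=80,
    presence_penalty=0.0,
    frequency_penalty=0.0,
    stop=["\n", "</s>", "####", "Bot:", "User:", "You:", "<|im_end|>", "<|eot_id|>"],
    max_tokens=64,           # max_output_tokens; max_input_tokens=1024 caps the prompt separately
)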
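Two derived fields can be sanity-checked against the raw numbers: win_ratio from num_wins and num_battles, and throughput_3p7s by interpolating the latencies table at a 3.7 s mean latency (the latter derivation is an assumption about how the field is computed, not something stated here):

import numpy as np

# win_ratio = num_wins / num_battles
print(7238 / 13320)          # 0.5433933933933934, matches win_ratio

# throughput at a 3.7 s mean latency, linearly interpolated from the latencies field
latency_mean = [2.6190, 3.7315, 4.6615, 5.2409, 7.3117]   # batch sizes 1, 3, 5, 6, 10
throughput   = [0.3818, 0.8016, 1.0663, 1.1359, 1.3560]
print(float(np.interp(3.7, latency_mean, throughput)))     # ~0.79, i.e. throughput_3p7s ≈ 0.8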
Shutdown handler not registered because Python interpreter is not running in the main thread
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLizer
Starting job with name zonemercy-lexical-viral-3982-v1-mkmlizer
Waiting for job on zonemercy-lexical-viral-3982-v1-mkmlizer to finish
zonemercy-lexical-viral-3982-v1-mkmlizer: Downloaded to shared memory in 86.226s
zonemercy-lexical-viral-3982-v1-mkmlizer: quantizing model to /dev/shm/model_cache, profile:s0, folder:/tmp/tmpnkz515le, device:0
zonemercy-lexical-viral-3982-v1-mkmlizer: Saving flywheel model at /dev/shm/model_cache
zonemercy-lexical-viral-3982-v1-mkmlizer: quantized model in 44.453s
zonemercy-lexical-viral-3982-v1-mkmlizer: Processed model zonemercy/Lexical-Viral-v6ava-22b11e5r256 in 130.680s
zonemercy-lexical-viral-3982-v1-mkmlizer: creating bucket guanaco-mkml-models
zonemercy-lexical-viral-3982-v1-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/zonemercy-lexical-viral-3982-v1/special_tokens_map.json
zonemercy-lexical-viral-3982-v1-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/zonemercy-lexical-viral-3982-v1/config.json
zonemercy-lexical-viral-3982-v1-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/zonemercy-lexical-viral-3982-v1/tokenizer_config.json
zonemercy-lexical-viral-3982-v1-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/zonemercy-lexical-viral-3982-v1/tokenizer.json
zonemercy-lexical-viral-3982-v1-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/zonemercy-lexical-viral-3982-v1/flywheel_model.0.safetensors
zonemercy-lexical-viral-3982-v1-mkmlizer: Loading 0: progressed from 0/507 to 503/507 items in ~32 s (per-step tqdm progress output elided)
Job zonemercy-lexical-viral-3982-v1-mkmlizer completed after 164.21s with status: succeeded
Stopping job with name zonemercy-lexical-viral-3982-v1-mkmlizer
Pipeline stage MKMLizer completed in 164.80s
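The cp lines above push the quantized artifacts from /dev/shm/model_cache into the guanaco-mkml-models bucket. A hedged sketch of the same upload with boto3 (the tool actually invoked by the mkmlizer is not shown in the log; paths, bucket, and keys are copied from it):

import boto3

# Mirrors the "cp /dev/shm/model_cache/<file> s3://guanaco-mkml-models/<submission>/<file>" log lines.
s3 = boto3.client("s3")
prefix = "zonemercy-lexical-viral-3982-v1"
for name in ("special_tokens_map.json", "config.json", "tokenizer_config.json",
             "tokenizer.json", "flywheel_model.0.safetensors"):
    s3.upload_file(f"/dev/shm/model_cache/{name}", "guanaco-mkml-models", f"{prefix}/{name}")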
run pipeline stage %s
Running pipeline stage MKMLTemplater
Pipeline stage MKMLTemplater completed in 0.22s
run pipeline stage %s
Running pipeline stage MKMLDeployer
Creating inference service zonemercy-lexical-viral-3982-v1
Waiting for inference service zonemercy-lexical-viral-3982-v1 to be ready
Connection pool is full, discarding connection: %s. Connection pool size: %s (message repeated 10 times)
Failed to get response for submission rica40325-10-14dpo_v2: ('http://rica40325-10-14dpo-v2-predictor.tenant-chaiml-guanaco.k.chaiverse.com/v1/models/GPT-J-6B-lit-v2:predict', '')
Inference service zonemercy-lexical-viral-3982-v1 ready after 191.08638525009155s
Pipeline stage MKMLDeployer completed in 192.12s
run pipeline stage %s
Running pipeline stage StressChecker
Received healthy response to inference request in 2.8159751892089844s
Received healthy response to inference request in 2.8951728343963623s
Received healthy response to inference request in 2.3269264698028564s
Received healthy response to inference request in 2.37074613571167s
Received healthy response to inference request in 2.4628238677978516s
5 requests
0 failed requests
5th percentile: 2.335690402984619
10th percentile: 2.3444543361663817
20th percentile: 2.3619822025299073
30th percentile: 2.3891616821289063
40th percentile: 2.4259927749633787
50th percentile: 2.4628238677978516
60th percentile: 2.6040843963623046
70th percentile: 2.7453449249267576
80th percentile: 2.83181471824646
90th percentile: 2.863493776321411
95th percentile: 2.8793333053588865
99th percentile: 2.892004928588867
mean time: 2.574328899383545
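The percentile and mean figures above follow directly from the five response times; a short sketch reproducing them with numpy's default linear interpolation:

import numpy as np

times = [2.8159751892089844, 2.8951728343963623, 2.3269264698028564,
         2.37074613571167, 2.4628238677978516]

for p in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99):
    print(f"{p}th percentile: {np.percentile(times, p)}")
print("mean time:", np.mean(times))
# e.g. 5th percentile: 2.335690402984619 and mean time: 2.574328899383545, matching the log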
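Each 'Received healthy response' line is a timed call against the submission's predictor endpoint. A heavily hedged sketch of such a check, assuming the predictor URL pattern visible in the 'Failed to get response' lines elsewhere in this log and a KServe V1-style :predict body; the payload shape is hypothetical, since the StressChecker's actual request is not shown:

import time, requests

# URL pattern assumed from other log lines; payload shape is a guess for illustration.
URL = ("http://zonemercy-lexical-viral-3982-v1-predictor"
       ".tenant-chaiml-guanaco.k.chaiverse.com/v1/models/GPT-J-6B-lit-v2:predict")

start = time.time()
resp = requests.post(URL, json={"instances": [{"text": "User: hi\nBot:"}]}, timeout=30)
resp.raise_for_status()
print(f"Received healthy response to inference request in {time.time() - start}s")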
Pipeline stage StressChecker completed in 14.68s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyTriggerPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
Failed to get response for submission sao10k-mn-12b-lyra-v4a1_v12: ('http://sao10k-mn-12b-lyra-v4a1-v12-predictor.tenant-chaiml-guanaco.k.chaiverse.com/v1/models/GPT-J-6B-lit-v2:predict', 'read tcp 127.0.0.1:38658->127.0.0.1:8080: read: connection reset by peer\n')
triggered trigger_guanaco_pipeline args=%s
Pipeline stage OfflineFamilyFriendlyTriggerPipeline completed in 2.72s
run pipeline stage %s
Running pipeline stage TriggerMKMLProfilingPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage TriggerMKMLProfilingPipeline completed in 2.44s
Shutdown handler de-registered
zonemercy-lexical-viral-_3982_v1 status is now deployed due to DeploymentManager action
Shutdown handler registered
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeleter
Skipping teardown as no inference service was successfully deployed
Pipeline stage MKMLProfilerDeleter completed in 0.11s
run pipeline stage %s
Running pipeline stage MKMLProfilerTemplater
Pipeline stage MKMLProfilerTemplater completed in 0.11s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeployer
Creating inference service zonemercy-lexical-viral-3982-v1-profiler
Waiting for inference service zonemercy-lexical-viral-3982-v1-profiler to be ready
Shutdown handler registered
run pipeline %s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyScorer
Evaluating %s Family Friendly Score with %s threads
Pipeline stage OfflineFamilyFriendlyScorer completed in 3165.25s
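For reference, the family_friendly_score and family_friendly_standard_error reported at the top are mutually consistent with a binomial standard error over roughly 5,000 judgments. A sketch of that check; the sample size and the SE formula are assumptions, not stated anywhere in this log:

# Assumed: se = sqrt(p * (1 - p) / n) for n independent binary family-friendly judgments.
p = 0.5938                     # family_friendly_score
se = 0.006945524602216884      # family_friendly_standard_error
n = p * (1 - p) / se ** 2
print(round(n))                # ~5000 implied judgments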
Shutdown handler de-registered
zonemercy-lexical-viral-_3982_v1 status is now inactive due to auto deactivation of underperforming models