submission_id: rinen0721-mistral12b-1123_v1
developer_uid: rinen0721
best_of: 8
celo_rating: 1239.81
display_name: rinen0721-mistral12b-1123_v1
family_friendly_score: 0.58
family_friendly_standard_error: 0.006979971346646059
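The reported standard error is consistent with a binomial standard error, sqrt(p * (1 - p) / n), at p = 0.58 with roughly n = 5000 scored samples; the sample count is an inference from the numbers and is not stated anywhere in this log. A minimal check:

```python
import math

p = 0.58   # family_friendly_score (up to float-representation noise)
n = 5000   # hypothetical sample count, inferred from the reported error, not logged
se = math.sqrt(p * (1 - p) / n)  # ~0.00698, matching family_friendly_standard_error
```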
formatter: {'memory_template': "{bot_name}'s Persona: {memory}\n####\n", 'prompt_template': '{prompt}\n<START>\n', 'bot_template': '{bot_name}: {message}\n', 'user_template': '{user_name}: {message}\n', 'response_template': '{bot_name}:', 'truncate_by_message': False}
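The formatter above is a set of plain `str.format` templates. A minimal sketch of how a full prompt could be assembled from them (the assembly order memory, then prompt, then chat history, then response cue is an assumption, and `render` is a hypothetical helper, not part of the pipeline):

```python
formatter = {
    "memory_template": "{bot_name}'s Persona: {memory}\n####\n",
    "prompt_template": "{prompt}\n<START>\n",
    "bot_template": "{bot_name}: {message}\n",
    "user_template": "{user_name}: {message}\n",
    "response_template": "{bot_name}:",
}

def render(bot_name, memory, prompt, turns, user_name="User"):
    """Assemble persona, scenario, chat history, and the response cue."""
    out = formatter["memory_template"].format(bot_name=bot_name, memory=memory)
    out += formatter["prompt_template"].format(prompt=prompt)
    for speaker, message in turns:
        if speaker == "bot":
            out += formatter["bot_template"].format(bot_name=bot_name, message=message)
        else:
            out += formatter["user_template"].format(user_name=user_name, message=message)
    # Ends with "{bot_name}:" so the model completes the bot's next line.
    return out + formatter["response_template"].format(bot_name=bot_name)
```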
generation_params: {'temperature': 1.0, 'top_p': 1.0, 'min_p': 0.0, 'top_k': 40, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['\n'], 'max_input_tokens': 1024, 'best_of': 8, 'max_output_tokens': 64}
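For reference, the settings above (temperature 1.0, top_k 40, top_p 1.0, min_p 0.0) correspond to standard filtered token sampling; best_of 8 presumably means eight completions are generated and one is selected, which is outside this sketch. A self-contained illustration of what the sampling knobs do (a sketch only, not the actual serving code; `sample_token` is hypothetical):

```python
import math
import random

def sample_token(logits, temperature=1.0, top_k=40, top_p=1.0, min_p=0.0):
    """Illustrative sampling: temperature-scaled softmax, then top-k,
    min-p, and nucleus (top-p) filtering before drawing a token id."""
    scaled = [l / max(temperature, 1e-8) for l in logits]
    # Numerically stable softmax over the scaled logits.
    m = max(scaled)
    exps = [math.exp(l - m) for l in scaled]
    total = sum(exps)
    probs = sorted(((i, e / total) for i, e in enumerate(exps)),
                   key=lambda p: p[1], reverse=True)
    # top-k: keep only the k most likely tokens.
    probs = probs[:top_k]
    # min-p: drop tokens whose probability is below min_p * max probability.
    cutoff = min_p * probs[0][1]
    probs = [p for p in probs if p[1] >= cutoff]
    # top-p: keep the smallest prefix whose cumulative mass reaches top_p.
    kept, mass = [], 0.0
    for p in probs:
        kept.append(p)
        mass += p[1]
        if mass >= top_p:
            break
    # Renormalize over the kept tokens and sample one.
    z = sum(p for _, p in kept)
    r = random.random() * z
    for i, p in kept:
        r -= p
        if r <= 0:
            return i
    return kept[-1][0]
```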
gpu_counts: {'NVIDIA RTX A5000': 1}
is_internal_developer: False
language_model: rinen0721/Mistral12b-1123
latencies: [{'batch_size': 1, 'throughput': 0.6076222011659029, 'latency_mean': 1.6457013976573944, 'latency_p50': 1.646143913269043, 'latency_p90': 1.8031798124313354}, {'batch_size': 3, 'throughput': 1.1233458100581717, 'latency_mean': 2.6675399148464205, 'latency_p50': 2.664899706840515, 'latency_p90': 2.942652726173401}, {'batch_size': 5, 'throughput': 1.3399134843769065, 'latency_mean': 3.717546898126602, 'latency_p50': 3.7414592504501343, 'latency_p90': 4.116902875900268}, {'batch_size': 6, 'throughput': 1.4134175737446621, 'latency_mean': 4.223429511785508, 'latency_p50': 4.1876161098480225, 'latency_p90': 4.789455199241638}, {'batch_size': 8, 'throughput': 1.468580398687423, 'latency_mean': 5.417654523849487, 'latency_p50': 5.468093395233154, 'latency_p90': 6.0564316511154175}, {'batch_size': 10, 'throughput': 1.5189799529332506, 'latency_mean': 6.524665600061416, 'latency_p50': 6.519066095352173, 'latency_p90': 7.413895440101624}]
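As a consistency check on the profile above: at every batch size, the measured throughput is within about 1% of batch_size / latency_mean, the relation you would expect if both were measured over the same requests (this is an observation about the logged numbers, not part of the pipeline):

```python
# (batch_size, throughput, latency_mean) triples copied from the latency profile above
latency_profile = [
    (1, 0.6076222011659029, 1.6457013976573944),
    (3, 1.1233458100581717, 2.6675399148464205),
    (5, 1.3399134843769065, 3.717546898126602),
    (6, 1.4134175737446621, 4.223429511785508),
    (8, 1.468580398687423, 5.417654523849487),
    (10, 1.5189799529332506, 6.524665600061416),
]
for batch_size, throughput, latency_mean in latency_profile:
    # Measured throughput closely tracks batch_size / latency_mean.
    assert abs(batch_size / latency_mean - throughput) / throughput < 0.02
```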
max_input_tokens: 1024
max_output_tokens: 64
model_architecture: MistralForCausalLM
model_group: rinen0721/Mistral12b-112
model_name: rinen0721-mistral12b-1123_v1
model_num_parameters: 12772070400.0
model_repo: rinen0721/Mistral12b-1123
model_size: 13B
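The `model_size: 13B` label matches the parameter count rounded to the nearest billion (the rounding rule is an inference from the two fields, not stated in the log):

```python
model_num_parameters = 12772070400  # from the model_num_parameters field
model_size = f"{round(model_num_parameters / 1e9)}B"  # -> "13B"
```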
num_battles: 14055
num_wins: 6535
ranking_group: single
status: inactive
submission_type: basic
throughput_3p7s: 1.34
timestamp: 2024-11-27T01:26:09+00:00
us_pacific_date: 2024-11-26
win_ratio: 0.46495908929206686
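The win ratio is simply wins over battles, reproducible from the two counters above:

```python
num_wins = 6535
num_battles = 14055
win_ratio = num_wins / num_battles  # ~0.46496, matching the win_ratio field
```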
Resubmit model
Shutdown handler not registered because Python interpreter is not running in the main thread
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLizer
Starting job with name rinen0721-mistral12b-1123-v1-mkmlizer
Waiting for job on rinen0721-mistral12b-1123-v1-mkmlizer to finish
rinen0721-mistral12b-1123-v1-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
rinen0721-mistral12b-1123-v1-mkmlizer: ║            ["flywheel" ASCII-art banner]                            ║
rinen0721-mistral12b-1123-v1-mkmlizer: ║ ║
rinen0721-mistral12b-1123-v1-mkmlizer: ║ Version: 0.11.12 ║
rinen0721-mistral12b-1123-v1-mkmlizer: ║ Copyright 2023 MK ONE TECHNOLOGIES Inc. ║
rinen0721-mistral12b-1123-v1-mkmlizer: ║ https://mk1.ai ║
rinen0721-mistral12b-1123-v1-mkmlizer: ║ ║
rinen0721-mistral12b-1123-v1-mkmlizer: ║ The license key for the current software has been verified as ║
rinen0721-mistral12b-1123-v1-mkmlizer: ║ belonging to: ║
rinen0721-mistral12b-1123-v1-mkmlizer: ║ ║
rinen0721-mistral12b-1123-v1-mkmlizer: ║ Chai Research Corp. ║
rinen0721-mistral12b-1123-v1-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
rinen0721-mistral12b-1123-v1-mkmlizer: ║ Expiration: 2025-01-15 23:59:59 ║
rinen0721-mistral12b-1123-v1-mkmlizer: ║ ║
rinen0721-mistral12b-1123-v1-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
rinen0721-mistral12b-1123-v1-mkmlizer: Downloaded to shared memory in 48.872s
rinen0721-mistral12b-1123-v1-mkmlizer: quantizing model to /dev/shm/model_cache, profile:s0, folder:/tmp/tmpefq06uwt, device:0
rinen0721-mistral12b-1123-v1-mkmlizer: Saving flywheel model at /dev/shm/model_cache
rinen0721-mistral12b-1123-v1-mkmlizer: quantized model in 36.270s
rinen0721-mistral12b-1123-v1-mkmlizer: Processed model rinen0721/Mistral12b-1123 in 85.142s
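The stage timings are internally consistent: the 85.142 s total for processing the model is the 48.872 s download plus the 36.270 s quantization:

```python
download_s = 48.872  # "Downloaded to shared memory in 48.872s"
quantize_s = 36.270  # "quantized model in 36.270s"
total_s = download_s + quantize_s  # ~85.142, matching "Processed model ... in 85.142s"
```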
rinen0721-mistral12b-1123-v1-mkmlizer: creating bucket guanaco-mkml-models
rinen0721-mistral12b-1123-v1-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
rinen0721-mistral12b-1123-v1-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/rinen0721-mistral12b-1123-v1
rinen0721-mistral12b-1123-v1-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/rinen0721-mistral12b-1123-v1/config.json
rinen0721-mistral12b-1123-v1-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/rinen0721-mistral12b-1123-v1/special_tokens_map.json
rinen0721-mistral12b-1123-v1-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/rinen0721-mistral12b-1123-v1/tokenizer.json
rinen0721-mistral12b-1123-v1-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/rinen0721-mistral12b-1123-v1/flywheel_model.0.safetensors
rinen0721-mistral12b-1123-v1-mkmlizer: Loading 0: 100%|█████████▉| 362/363 [00:15<00:00, 28.83it/s]
Job rinen0721-mistral12b-1123-v1-mkmlizer completed after 104.12s with status: succeeded
Stopping job with name rinen0721-mistral12b-1123-v1-mkmlizer
Pipeline stage MKMLizer completed in 104.62s
run pipeline stage %s
Running pipeline stage MKMLTemplater
Pipeline stage MKMLTemplater completed in 0.16s
run pipeline stage %s
Running pipeline stage MKMLDeployer
Creating inference service rinen0721-mistral12b-1123-v1
Waiting for inference service rinen0721-mistral12b-1123-v1 to be ready
Failed to get response for submission blend_mokul_2024-11-14: ('http://chaiml-small-anthropic-a-1700-v2-predictor.tenant-chaiml-guanaco.k.chaiverse.com/v1/models/GPT-J-6B-lit-v2:predict', 'read tcp 127.0.0.1:37406->127.0.0.1:8080: read: connection reset by peer\n')
Inference service rinen0721-mistral12b-1123-v1 ready after 130.4570028781891s
Pipeline stage MKMLDeployer completed in 131.33s
run pipeline stage %s
Running pipeline stage StressChecker
Received healthy response to inference request in 2.107992172241211s
Received healthy response to inference request in 1.6506385803222656s
Received healthy response to inference request in 1.695770502090454s
Received healthy response to inference request in 1.4245572090148926s
Received healthy response to inference request in 1.9702038764953613s
5 requests
0 failed requests
5th percentile: 1.4697734832763671
10th percentile: 1.5149897575378417
20th percentile: 1.605422306060791
30th percentile: 1.6596649646759034
40th percentile: 1.6777177333831788
50th percentile: 1.695770502090454
60th percentile: 1.805543851852417
70th percentile: 1.9153172016143798
80th percentile: 1.9977615356445313
90th percentile: 2.052876853942871
95th percentile: 2.080434513092041
99th percentile: 2.102480640411377
mean time: 1.769832468032837
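The percentile figures above can be reproduced from the five logged response times with linear-interpolation percentiles (that this is the method used is an inference, verified numerically; `percentile` is a hypothetical reimplementation, not the StressChecker's code):

```python
def percentile(values, q):
    """Percentile with linear interpolation between closest ranks."""
    xs = sorted(values)
    pos = q / 100 * (len(xs) - 1)
    lo = int(pos)
    frac = pos - lo
    if lo + 1 < len(xs):
        return xs[lo] + frac * (xs[lo + 1] - xs[lo])
    return xs[lo]

# The five healthy response times logged above, in seconds.
times = [2.107992172241211, 1.6506385803222656, 1.695770502090454,
         1.4245572090148926, 1.9702038764953613]
```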
Pipeline stage StressChecker completed in 10.14s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyTriggerPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage OfflineFamilyFriendlyTriggerPipeline completed in 2.25s
run pipeline stage %s
Running pipeline stage TriggerMKMLProfilingPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
triggered trigger_guanaco_pipeline args=%s
Pipeline stage TriggerMKMLProfilingPipeline completed in 2.11s
Shutdown handler de-registered
rinen0721-mistral12b-1123_v1 status is now deployed due to DeploymentManager action
Shutdown handler registered
run pipeline %s
run pipeline stage %s
Running pipeline stage OfflineFamilyFriendlyScorer
Evaluating %s Family Friendly Score with %s threads
Pipeline stage OfflineFamilyFriendlyScorer completed in 2884.53s
Shutdown handler de-registered
rinen0721-mistral12b-1123_v1 status is now inactive due to auto-deactivation (removed underperforming models)