submission_id: jellywibble-mistralsmall_8515_v1
developer_uid: Jellywibble
best_of: 8
celo_rating: 1230.66
display_name: mistral-small-15kctx
family_friendly_score: 0.0
formatter: {'memory_template': '', 'prompt_template': '', 'bot_template': '{bot_name}: {message}\n', 'user_template': '{user_name}: {message}\n', 'response_template': '{bot_name}:', 'truncate_by_message': False}
generation_params: {'temperature': 1.0, 'top_p': 1.0, 'min_p': 0.0, 'top_k': 40, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['\n'], 'max_input_tokens': 1024, 'best_of': 8, 'max_output_tokens': 64}
gpu_counts: {'NVIDIA RTX A6000': 1}
is_internal_developer: True
language_model: Jellywibble/MistralSmall1500CTX
latencies: [{'batch_size': 1, 'throughput': 0.3817385015722982, 'latency_mean': 2.61953467130661, 'latency_p50': 2.6151455640792847, 'latency_p90': 2.900964379310608}, {'batch_size': 2, 'throughput': 0.609336547327894, 'latency_mean': 3.2782436382770537, 'latency_p50': 3.2897515296936035, 'latency_p90': 3.626428413391113}, {'batch_size': 3, 'throughput': 0.7564298311615077, 'latency_mean': 3.947181990146637, 'latency_p50': 3.9754353761672974, 'latency_p90': 4.344059753417969}, {'batch_size': 4, 'throughput': 0.8789057065886203, 'latency_mean': 4.529485340118408, 'latency_p50': 4.534647345542908, 'latency_p90': 4.997293019294738}, {'batch_size': 5, 'throughput': 0.972513373170675, 'latency_mean': 5.114478387832642, 'latency_p50': 5.139166831970215, 'latency_p90': 5.778734588623047}]
max_input_tokens: 1024
max_output_tokens: 64
model_architecture: MistralForCausalLM
model_group: Jellywibble/MistralSmall
model_name: mistral-small-15kctx
model_num_parameters: 22247282688.0
model_repo: Jellywibble/MistralSmall1500CTX
model_size: 22B
num_battles: 254014
num_wins: 120851
ranking_group: single
status: torndown
submission_type: basic
throughput_3p7s: 0.71
timestamp: 2024-09-22T01:21:15+00:00
us_pacific_date: 2024-09-21
win_ratio: 0.47576511530860505
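As an illustration (not part of the submission record), here is a minimal Python sketch of how the formatter and generation_params fields above fit together: each chat turn is rendered with the bot/user templates, the response_template primes the bot's next reply, and sampling draws best_of=8 candidates of at most 64 tokens, stopping at a newline. The conversation and helper names below are hypothetical.

```python
# Minimal sketch, assuming only the formatter/generation_params shown above;
# the conversation and helper names are hypothetical, not pipeline code.
formatter = {
    "bot_template": "{bot_name}: {message}\n",
    "user_template": "{user_name}: {message}\n",
    "response_template": "{bot_name}:",
}

generation_params = {
    "temperature": 1.0, "top_p": 1.0, "min_p": 0.0, "top_k": 40,
    "presence_penalty": 0.0, "frequency_penalty": 0.0,
    "stopping_words": ["\n"], "best_of": 8,
    "max_input_tokens": 1024, "max_output_tokens": 64,
}

def render_prompt(turns, bot_name="Bot", user_name="User"):
    """Render chat turns with the templates, then prime the bot's next line."""
    parts = []
    for speaker, message in turns:
        tpl = formatter["bot_template"] if speaker == "bot" else formatter["user_template"]
        parts.append(tpl.format(bot_name=bot_name, user_name=user_name, message=message))
    parts.append(formatter["response_template"].format(bot_name=bot_name))
    return "".join(parts)

print(render_prompt([("user", "Hi!"), ("bot", "Hello there."), ("user", "Tell me a story.")]))
# User: Hi!
# Bot: Hello there.
# User: Tell me a story.
# Bot:
```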
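A quick sanity check on the derived fields, as a sketch: win_ratio is simply num_wins / num_battles, and throughput_3p7s is close to what linear interpolation of the latencies table gives at a 3.7 s mean latency (the reported 0.71 presumably comes from the fuller profiler sweep later in this log, so the exact figure differs slightly).

```python
# Sketch only: reproduce the derived metrics from the fields above.
num_battles, num_wins = 254014, 120851
print(num_wins / num_battles)            # 0.47576511530860505 == win_ratio

# Interpolate throughput at a 3.7 s mean latency from the latencies table.
points = [(2.6195, 0.3817), (3.2782, 0.6093), (3.9472, 0.7564),
          (4.5295, 0.8789), (5.1145, 0.9725)]   # (latency_mean, throughput)
target = 3.7
for (l0, t0), (l1, t1) in zip(points, points[1:]):
    if l0 <= target <= l1:
        est = t0 + (target - l0) / (l1 - l0) * (t1 - t0)
        print(round(est, 2))              # ~0.70, vs the reported throughput_3p7s of 0.71
```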
Shutdown handler not registered because Python interpreter is not running in the main thread
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLizer
Starting job with name jellywibble-mistralsmall-8515-v1-mkmlizer
Waiting for job on jellywibble-mistralsmall-8515-v1-mkmlizer to finish
jellywibble-mistralsmall-8515-v1-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
jellywibble-mistralsmall-8515-v1-mkmlizer: ║ _____ __ __ ║
jellywibble-mistralsmall-8515-v1-mkmlizer: ║ / _/ /_ ___ __/ / ___ ___ / / ║
jellywibble-mistralsmall-8515-v1-mkmlizer: ║ / _/ / // / |/|/ / _ \/ -_) -_) / ║
jellywibble-mistralsmall-8515-v1-mkmlizer: ║ /_//_/\_, /|__,__/_//_/\__/\__/_/ ║
jellywibble-mistralsmall-8515-v1-mkmlizer: ║ /___/ ║
jellywibble-mistralsmall-8515-v1-mkmlizer: ║ ║
jellywibble-mistralsmall-8515-v1-mkmlizer: ║ Version: 0.10.1 ║
jellywibble-mistralsmall-8515-v1-mkmlizer: ║ Copyright 2023 MK ONE TECHNOLOGIES Inc. ║
jellywibble-mistralsmall-8515-v1-mkmlizer: ║ https://mk1.ai ║
jellywibble-mistralsmall-8515-v1-mkmlizer: ║ ║
jellywibble-mistralsmall-8515-v1-mkmlizer: ║ The license key for the current software has been verified as ║
jellywibble-mistralsmall-8515-v1-mkmlizer: ║ belonging to: ║
jellywibble-mistralsmall-8515-v1-mkmlizer: ║ ║
jellywibble-mistralsmall-8515-v1-mkmlizer: ║ Chai Research Corp. ║
jellywibble-mistralsmall-8515-v1-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
jellywibble-mistralsmall-8515-v1-mkmlizer: ║ Expiration: 2024-10-15 23:59:59 ║
jellywibble-mistralsmall-8515-v1-mkmlizer: ║ ║
jellywibble-mistralsmall-8515-v1-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
Connection pool is full, discarding connection: %s. Connection pool size: %s
jellywibble-mistralsmall-8515-v1-mkmlizer: Downloaded to shared memory in 87.679s
jellywibble-mistralsmall-8515-v1-mkmlizer: quantizing model to /dev/shm/model_cache, profile:s0, folder:/tmp/tmplj0kd_ul, device:0
jellywibble-mistralsmall-8515-v1-mkmlizer: Saving flywheel model at /dev/shm/model_cache
Connection pool is full, discarding connection: %s. Connection pool size: %s
jellywibble-mistralsmall-8515-v1-mkmlizer: quantized model in 43.160s
jellywibble-mistralsmall-8515-v1-mkmlizer: Processed model Jellywibble/MistralSmall1500CTX in 130.840s
jellywibble-mistralsmall-8515-v1-mkmlizer: creating bucket guanaco-mkml-models
jellywibble-mistralsmall-8515-v1-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
jellywibble-mistralsmall-8515-v1-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/jellywibble-mistralsmall-8515-v1
jellywibble-mistralsmall-8515-v1-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/jellywibble-mistralsmall-8515-v1/config.json
jellywibble-mistralsmall-8515-v1-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/jellywibble-mistralsmall-8515-v1/special_tokens_map.json
jellywibble-mistralsmall-8515-v1-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/jellywibble-mistralsmall-8515-v1/tokenizer_config.json
jellywibble-mistralsmall-8515-v1-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/jellywibble-mistralsmall-8515-v1/tokenizer.json
jellywibble-mistralsmall-8515-v1-mkmlizer: cp /dev/shm/model_cache/flywheel_model.1.safetensors s3://guanaco-mkml-models/jellywibble-mistralsmall-8515-v1/flywheel_model.1.safetensors
jellywibble-mistralsmall-8515-v1-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/jellywibble-mistralsmall-8515-v1/flywheel_model.0.safetensors
jellywibble-mistralsmall-8515-v1-mkmlizer: Loading 0: 0%| | 0/507 [00:00<?, ?it/s] ... Loading 0: 99%|█████████▉| 503/507 [00:30<00:00, 31.37it/s]
Job jellywibble-mistralsmall-8515-v1-mkmlizer completed after 156.24s with status: succeeded
Stopping job with name jellywibble-mistralsmall-8515-v1-mkmlizer
Pipeline stage MKMLizer completed in 157.84s
run pipeline stage %s
Running pipeline stage MKMLTemplater
Pipeline stage MKMLTemplater completed in 0.09s
run pipeline stage %s
Running pipeline stage MKMLDeployer
Creating inference service jellywibble-mistralsmall-8515-v1
Waiting for inference service jellywibble-mistralsmall-8515-v1 to be ready
Connection pool is full, discarding connection: %s. Connection pool size: %s
Inference service jellywibble-mistralsmall-8515-v1 ready after 201.3305859565735s
Pipeline stage MKMLDeployer completed in 201.96s
run pipeline stage %s
Running pipeline stage StressChecker
Received healthy response to inference request in 2.860029935836792s
Connection pool is full, discarding connection: %s. Connection pool size: %s
Received healthy response to inference request in 2.469403028488159s
Received healthy response to inference request in 2.85426664352417s
Received healthy response to inference request in 3.321934461593628s
Received healthy response to inference request in 3.3121232986450195s
5 requests
0 failed requests
5th percentile: 2.5463757514953613
10th percentile: 2.6233484745025635
20th percentile: 2.7772939205169678
30th percentile: 2.8554193019866942
40th percentile: 2.8577246189117433
50th percentile: 2.860029935836792
60th percentile: 3.040867280960083
70th percentile: 3.221704626083374
80th percentile: 3.314085531234741
90th percentile: 3.3180099964141845
95th percentile: 3.3199722290039064
99th percentile: 3.321542015075684
mean time: 2.963551473617554
Pipeline stage StressChecker completed in 15.69s
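The StressChecker statistics above are consistent with taking linearly interpolated percentiles over the five healthy response times; a minimal NumPy sketch that reproduces them:

```python
import numpy as np

# The five healthy response times reported by the StressChecker, in seconds.
times = [2.860029935836792, 2.469403028488159, 2.85426664352417,
         3.321934461593628, 3.3121232986450195]

for p in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99):
    print(f"{p}th percentile: {np.percentile(times, p)}")   # matches the values above
print("mean time:", np.mean(times))                          # 2.9635514736175537
```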
run pipeline stage %s
Running pipeline stage TriggerMKMLProfilingPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
Pipeline stage TriggerMKMLProfilingPipeline completed in 5.06s
Shutdown handler de-registered
jellywibble-mistralsmall_8515_v1 status is now deployed due to DeploymentManager action
Shutdown handler registered
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeleter
Skipping teardown as no inference service was successfully deployed
Pipeline stage MKMLProfilerDeleter completed in 0.13s
run pipeline stage %s
Running pipeline stage MKMLProfilerTemplater
Pipeline stage MKMLProfilerTemplater completed in 0.11s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeployer
Creating inference service jellywibble-mistralsmall-8515-v1-profiler
Waiting for inference service jellywibble-mistralsmall-8515-v1-profiler to be ready
Inference service jellywibble-mistralsmall-8515-v1-profiler ready after 190.51777386665344s
Pipeline stage MKMLProfilerDeployer completed in 190.90s
run pipeline stage %s
Running pipeline stage MKMLProfilerRunner
kubectl cp /code/guanaco/guanaco_inference_services/src/inference_scripts tenant-chaiml-guanaco/jellywibble-mistralsbd5f63e0c9765876a74bf9fc02495c9e-deplobxfwd:/code/chaiverse_profiler_1726968689 --namespace tenant-chaiml-guanaco
kubectl exec -it jellywibble-mistralsbd5f63e0c9765876a74bf9fc02495c9e-deplobxfwd --namespace tenant-chaiml-guanaco -- sh -c 'cd /code/chaiverse_profiler_1726968689 && python profiles.py profile --best_of_n 8 --auto_batch 5 --batches 1,5,10,15,20,25,30,35,40,45,50,55,60,65,70,75,80,85,90,95,100,105,110,115,120,125,130,135,140,145,150,155,160,165,170,175,180,185,190,195 --samples 200 --input_tokens 1024 --output_tokens 64 --summary /code/chaiverse_profiler_1726968689/summary.json'
kubectl exec -it jellywibble-mistralsbd5f63e0c9765876a74bf9fc02495c9e-deplobxfwd --namespace tenant-chaiml-guanaco -- bash -c 'cat /code/chaiverse_profiler_1726968689/summary.json'
Pipeline stage MKMLProfilerRunner completed in 1556.73s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeleter
Checking if service jellywibble-mistralsmall-8515-v1-profiler is running
Tearing down inference service jellywibble-mistralsmall-8515-v1-profiler
Service jellywibble-mistralsmall-8515-v1-profiler has been torndown
Pipeline stage MKMLProfilerDeleter completed in 1.88s
Shutdown handler de-registered
jellywibble-mistralsmall_8515_v1 status is now inactive due to auto deactivation of underperforming models
Deleting key jic062-dpo-v1-6-nemo-v1/tokenizer_config.json from bucket guanaco-mkml-models
Deleting key jellywibble-mistralsmall-v1/tokenizer_config.json from bucket guanaco-mkml-models
jellywibble-mistralsmall_4077_v2 status is now torndown due to DeploymentManager action
Deleting key riverise-mistral-0920-7872-v1/tokenizer.json from bucket guanaco-mkml-models
jellywibble-mistralsmall_8515_v1 status is now torndown due to DeploymentManager action