submission_id: jic062-nemo-v1-6_v3
developer_uid: chace9580
best_of: 8
celo_rating: 1270.92
display_name: jic062-nemo-v1-6_v3
family_friendly_score: 0.0
formatter: {'memory_template': '[INST]system\n{memory}[/INST]\n', 'prompt_template': '[INST]user\n{prompt}[/INST]\n', 'bot_template': '[INST]assistant\n{bot_name}: {message}[/INST]\n', 'user_template': '[INST]user\n{user_name}: {message}[/INST]\n', 'response_template': '[INST]assistant\n{bot_name}:', 'truncate_by_message': False}
generation_params: {'temperature': 1.0, 'top_p': 0.9, 'min_p': 0.05, 'top_k': 80, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['\n', '/s', '[/INST]'], 'max_input_tokens': 1024, 'best_of': 8, 'max_output_tokens': 64}
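The formatter and generation_params fields above define how conversations are serialized into a Mistral-style [INST] prompt. A minimal sketch of how these templates might compose, assuming a hypothetical `render_prompt` helper (not part of the platform API):

```python
# Hypothetical sketch: composing the formatter templates above into a
# single Mistral-style prompt. render_prompt is illustrative only.
formatter = {
    'memory_template': '[INST]system\n{memory}[/INST]\n',
    'prompt_template': '[INST]user\n{prompt}[/INST]\n',
    'bot_template': '[INST]assistant\n{bot_name}: {message}[/INST]\n',
    'user_template': '[INST]user\n{user_name}: {message}[/INST]\n',
    'response_template': '[INST]assistant\n{bot_name}:',
}

def render_prompt(memory, prompt, turns, bot_name):
    """Render memory, prompt, and chat turns, ending with the open
    response template that the model is expected to complete."""
    parts = [formatter['memory_template'].format(memory=memory),
             formatter['prompt_template'].format(prompt=prompt)]
    for speaker, name, message in turns:
        if speaker == 'bot':
            parts.append(formatter['bot_template'].format(
                bot_name=name, message=message))
        else:
            parts.append(formatter['user_template'].format(
                user_name=name, message=message))
    # Leave the assistant turn open so generation continues from here.
    parts.append(formatter['response_template'].format(bot_name=bot_name))
    return ''.join(parts)

text = render_prompt('You are Nemo.', 'Greet the user.',
                     [('user', 'Alice', 'Hi!')], 'Nemo')
```

The stopping_words list ('\n', '/s', '[/INST]') then cuts generation off at the end of the single assistant line the open response_template invites.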
gpu_counts: {'NVIDIA RTX A5000': 1}
ineligible_reason: num_battles<5000
is_internal_developer: False
language_model: jic062/Nemo-v1.6
latencies: [{'batch_size': 1, 'throughput': 0.6198987606512594, 'latency_mean': 1.613106838464737, 'latency_p50': 1.6010977029800415, 'latency_p90': 1.7877466917037963}, {'batch_size': 3, 'throughput': 1.1032224977961183, 'latency_mean': 2.716047092676163, 'latency_p50': 2.702701210975647, 'latency_p90': 3.039145064353943}, {'batch_size': 5, 'throughput': 1.2513459571444299, 'latency_mean': 3.968260405063629, 'latency_p50': 3.9835492372512817, 'latency_p90': 4.457217788696289}, {'batch_size': 6, 'throughput': 1.2718122275088122, 'latency_mean': 4.697573640346527, 'latency_p50': 4.715938925743103, 'latency_p90': 5.327931141853332}, {'batch_size': 8, 'throughput': 1.2662590527827973, 'latency_mean': 6.277941770553589, 'latency_p50': 6.297783851623535, 'latency_p90': 7.144484543800354}, {'batch_size': 10, 'throughput': 1.2207944087849858, 'latency_mean': 8.148595954179763, 'latency_p50': 8.125438451766968, 'latency_p90': 9.325861096382141}]
max_input_tokens: 1024
max_output_tokens: 64
model_architecture: MistralForCausalLM
model_group: jic062/Nemo-v1.6
model_name: jic062-nemo-v1-6_v3
model_num_parameters: 12772070400.0
model_repo: jic062/Nemo-v1.6
model_size: 13B
num_battles: 2602
num_wins: 1421
ranking_group: single
status: torndown
submission_type: basic
throughput_3p7s: 1.24
timestamp: 2024-09-23T23:38:14+00:00
us_pacific_date: 2024-09-23
win_ratio: 0.5461183704842429
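The win_ratio above is simply num_wins divided by num_battles, and the ineligible_reason ("num_battles<5000") implies a minimum battle count before a submission is ranked. A quick check from the reported figures:

```python
# Figures reported in the metadata above.
num_battles = 2602
num_wins = 1421

# Matches the reported win_ratio of 0.5461183704842429.
win_ratio = num_wins / num_battles

# The 5000-battle threshold is inferred from ineligible_reason;
# this submission falls well short of it.
eligible = num_battles >= 5000
```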
Shutdown handler not registered because Python interpreter is not running in the main thread
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLizer
Starting job with name jic062-nemo-v1-6-v3-mkmlizer
Waiting for job on jic062-nemo-v1-6-v3-mkmlizer to finish
jic062-nemo-v1-6-v3-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
jic062-nemo-v1-6-v3-mkmlizer: ║ _____ __ __ ║
jic062-nemo-v1-6-v3-mkmlizer: ║ / _/ /_ ___ __/ / ___ ___ / / ║
jic062-nemo-v1-6-v3-mkmlizer: ║ / _/ / // / |/|/ / _ \/ -_) -_) / ║
jic062-nemo-v1-6-v3-mkmlizer: ║ /_//_/\_, /|__,__/_//_/\__/\__/_/ ║
jic062-nemo-v1-6-v3-mkmlizer: ║ /___/ ║
jic062-nemo-v1-6-v3-mkmlizer: ║ ║
jic062-nemo-v1-6-v3-mkmlizer: ║ Version: 0.10.1 ║
jic062-nemo-v1-6-v3-mkmlizer: ║ Copyright 2023 MK ONE TECHNOLOGIES Inc. ║
jic062-nemo-v1-6-v3-mkmlizer: ║ https://mk1.ai ║
jic062-nemo-v1-6-v3-mkmlizer: ║ ║
jic062-nemo-v1-6-v3-mkmlizer: ║ The license key for the current software has been verified as ║
jic062-nemo-v1-6-v3-mkmlizer: ║ belonging to: ║
jic062-nemo-v1-6-v3-mkmlizer: ║ ║
jic062-nemo-v1-6-v3-mkmlizer: ║ Chai Research Corp. ║
jic062-nemo-v1-6-v3-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
jic062-nemo-v1-6-v3-mkmlizer: ║ Expiration: 2024-10-15 23:59:59 ║
jic062-nemo-v1-6-v3-mkmlizer: ║ ║
jic062-nemo-v1-6-v3-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
jic062-nemo-v1-6-v3-mkmlizer: Downloaded to shared memory in 48.188s
jic062-nemo-v1-6-v3-mkmlizer: quantizing model to /dev/shm/model_cache, profile:s0, folder:/tmp/tmpfffk5r_k, device:0
jic062-nemo-v1-6-v3-mkmlizer: Saving flywheel model at /dev/shm/model_cache
jic062-nemo-v1-6-v3-mkmlizer: quantized model in 35.690s
jic062-nemo-v1-6-v3-mkmlizer: Processed model jic062/Nemo-v1.6 in 83.878s
jic062-nemo-v1-6-v3-mkmlizer: creating bucket guanaco-mkml-models
jic062-nemo-v1-6-v3-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
jic062-nemo-v1-6-v3-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/jic062-nemo-v1-6-v3
jic062-nemo-v1-6-v3-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/jic062-nemo-v1-6-v3/config.json
jic062-nemo-v1-6-v3-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/jic062-nemo-v1-6-v3/special_tokens_map.json
jic062-nemo-v1-6-v3-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/jic062-nemo-v1-6-v3/tokenizer_config.json
jic062-nemo-v1-6-v3-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/jic062-nemo-v1-6-v3/tokenizer.json
jic062-nemo-v1-6-v3-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/jic062-nemo-v1-6-v3/flywheel_model.0.safetensors
jic062-nemo-v1-6-v3-mkmlizer: Loading 0: 0%| | 0/363 [00:00<?, ?it/s] ... Loading 0: 99%|█████████▉| 359/363 [00:14<00:00, 27.16it/s]
Job jic062-nemo-v1-6-v3-mkmlizer completed after 104.66s with status: succeeded
Stopping job with name jic062-nemo-v1-6-v3-mkmlizer
Pipeline stage MKMLizer completed in 106.06s
run pipeline stage %s
Running pipeline stage MKMLTemplater
Pipeline stage MKMLTemplater completed in 0.31s
run pipeline stage %s
Running pipeline stage MKMLDeployer
Creating inference service jic062-nemo-v1-6-v3
Waiting for inference service jic062-nemo-v1-6-v3 to be ready
Connection pool is full, discarding connection: %s. Connection pool size: %s
Failed to get response for submission zonemercy-vingt-deux-v0-1e5_v27: ('http://zonemercy-vingt-deux-v0-1e5-v27-predictor.tenant-chaiml-guanaco.k.chaiverse.com/v1/models/GPT-J-6B-lit-v2:predict', 'read tcp 127.0.0.1:48144->127.0.0.1:8080: read: connection reset by peer\n')
Inference service jic062-nemo-v1-6-v3 ready after 202.36628603935242s
Pipeline stage MKMLDeployer completed in 202.91s
run pipeline stage %s
Running pipeline stage StressChecker
Received healthy response to inference request in 3.1687123775482178s
Received healthy response to inference request in 4.012082099914551s
Received healthy response to inference request in 2.704888343811035s
Received healthy response to inference request in 1.902796983718872s
Received healthy response to inference request in 2.1450154781341553s
5 requests
0 failed requests
5th percentile: 1.9512406826019286
10th percentile: 1.9996843814849854
20th percentile: 2.0965717792510987
30th percentile: 2.2569900512695313
40th percentile: 2.480939197540283
50th percentile: 2.704888343811035
60th percentile: 2.890417957305908
70th percentile: 3.075947570800781
80th percentile: 3.3373863220214846
90th percentile: 3.6747342109680177
95th percentile: 3.843408155441284
99th percentile: 3.9783473110198972
mean time: 2.786699056625366
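The StressChecker percentiles above are consistent with linear interpolation over the five sorted response times (the default method of numpy.percentile). A sketch reproducing the reported figures without any dependencies:

```python
# The five healthy response times logged above, sorted ascending.
times = sorted([3.1687123775482178, 4.012082099914551,
                2.704888343811035, 1.902796983718872,
                2.1450154781341553])

def percentile(data, p):
    """Linear-interpolation percentile over sorted data, 0 <= p <= 100."""
    k = (len(data) - 1) * p / 100.0
    lo = int(k)
    hi = min(lo + 1, len(data) - 1)
    return data[lo] + (data[hi] - data[lo]) * (k - lo)

p5 = percentile(times, 5)       # ~1.9512, as reported
p50 = percentile(times, 50)     # 2.7049, the median of five samples
mean = sum(times) / len(times)  # ~2.7867, as reported
```

With only five samples, everything from the 5th to the 99th percentile is interpolation between adjacent observations, so the tails here carry little statistical weight.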
Pipeline stage StressChecker completed in 15.46s
run pipeline stage %s
Running pipeline stage TriggerMKMLProfilingPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
Pipeline stage TriggerMKMLProfilingPipeline completed in 4.17s
Shutdown handler de-registered
jic062-nemo-v1-6_v3 status is now deployed due to DeploymentManager action
Shutdown handler registered
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeleter
Skipping teardown as no inference service was successfully deployed
Pipeline stage MKMLProfilerDeleter completed in 0.12s
run pipeline stage %s
Running pipeline stage MKMLProfilerTemplater
Pipeline stage MKMLProfilerTemplater completed in 0.11s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeployer
Creating inference service jic062-nemo-v1-6-v3-profiler
Waiting for inference service jic062-nemo-v1-6-v3-profiler to be ready
Inference service jic062-nemo-v1-6-v3-profiler ready after 200.4603488445282s
Pipeline stage MKMLProfilerDeployer completed in 200.81s
run pipeline stage %s
Running pipeline stage MKMLProfilerRunner
kubectl cp /code/guanaco/guanaco_inference_services/src/inference_scripts tenant-chaiml-guanaco/jic062-nemo-v1-6-v3-profiler-predictor-00001-deployment-cbkcfnd:/code/chaiverse_profiler_1727135270 --namespace tenant-chaiml-guanaco
kubectl exec -it jic062-nemo-v1-6-v3-profiler-predictor-00001-deployment-cbkcfnd --namespace tenant-chaiml-guanaco -- sh -c 'cd /code/chaiverse_profiler_1727135270 && python profiles.py profile --best_of_n 8 --auto_batch 5 --batches 1,5,10,15,20,25,30,35,40,45,50,55,60,65,70,75,80,85,90,95,100,105,110,115,120,125,130,135,140,145,150,155,160,165,170,175,180,185,190,195 --samples 200 --input_tokens 1024 --output_tokens 64 --summary /code/chaiverse_profiler_1727135270/summary.json'
kubectl exec -it jic062-nemo-v1-6-v3-profiler-predictor-00001-deployment-cbkcfnd --namespace tenant-chaiml-guanaco -- bash -c 'cat /code/chaiverse_profiler_1727135270/summary.json'
Pipeline stage MKMLProfilerRunner completed in 1149.24s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeleter
Checking if service jic062-nemo-v1-6-v3-profiler is running
Tearing down inference service jic062-nemo-v1-6-v3-profiler
Service jic062-nemo-v1-6-v3-profiler has been torndown
Pipeline stage MKMLProfilerDeleter completed in 2.12s
Shutdown handler de-registered
jic062-nemo-v1-6_v3 status is now inactive due to auto deactivation removed underperforming models
99th percentile: 2.210818748474121
Shutdown handler not registered because Python interpreter is not running in the main thread
admin requested tearing down of jic062-nemo-v1-6_v3
mean time: 1.8350937843322754
run pipeline %s
Shutdown handler not registered because Python interpreter is not running in the main thread
run pipeline stage %s
Pipeline stage StressChecker completed in 11.17s
run pipeline %s
Running pipeline stage MKMLDeleter
run pipeline stage %s
run pipeline stage %s
Running pipeline stage TriggerMKMLProfilingPipeline
Checking if service jic062-dpo-v1-4-nemo-v4 is running
Running pipeline stage MKMLDeleter
run_pipeline:run_in_cloud %s
Checking if service jic062-nemo-v1-6-v3 is running
starting trigger_guanaco_pipeline args=%s
Tearing down inference service jic062-nemo-v1-6-v3
Tearing down inference service jic062-dpo-v1-4-nemo-v4
Service jic062-nemo-v1-6-v3 has been torndown
Service jic062-dpo-v1-4-nemo-v4 has been torndown
Pipeline stage MKMLDeleter completed in 2.17s
run pipeline stage %s
Pipeline stage MKMLDeleter completed in 2.64s
Running pipeline stage MKMLModelDeleter
run pipeline stage %s
Cleaning model data from S3
Running pipeline stage MKMLModelDeleter
Cleaning model data from model cache
Cleaning model data from S3
Cleaning model data from model cache
Pipeline stage TriggerMKMLProfilingPipeline completed in 5.01s
Shutdown handler de-registered
Deleting key jic062-nemo-v1-6-v3/config.json from bucket guanaco-mkml-models
Deleting key jic062-dpo-v1-4-nemo-v4/config.json from bucket guanaco-mkml-models
sao10k-mn-12b-vespa-x2_v1 status is now deployed due to DeploymentManager action
Deleting key jic062-nemo-v1-6-v3/flywheel_model.0.safetensors from bucket guanaco-mkml-models
Deleting key jic062-dpo-v1-4-nemo-v4/flywheel_model.0.safetensors from bucket guanaco-mkml-models
Deleting key jic062-nemo-v1-6-v3/special_tokens_map.json from bucket guanaco-mkml-models
Deleting key jic062-dpo-v1-4-nemo-v4/special_tokens_map.json from bucket guanaco-mkml-models
Deleting key jic062-nemo-v1-6-v3/tokenizer.json from bucket guanaco-mkml-models
Deleting key jic062-dpo-v1-4-nemo-v4/tokenizer.json from bucket guanaco-mkml-models
Deleting key jic062-nemo-v1-6-v3/tokenizer_config.json from bucket guanaco-mkml-models
Deleting key jic062-dpo-v1-4-nemo-v4/tokenizer_config.json from bucket guanaco-mkml-models
Pipeline stage MKMLModelDeleter completed in 4.55s
Pipeline stage MKMLModelDeleter completed in 4.40s
Shutdown handler de-registered
Shutdown handler de-registered
jic062-nemo-v1-6_v3 status is now torndown due to DeploymentManager action
jic062-dpo-v1-4-nemo_v4 status is now torndown due to DeploymentManager action