developer_uid: rica40325
submission_id: rica40325-feedback-1000_v1
model_name: rica40325-feedback-1000_v1
model_group: rica40325/feedback-1000
status: torndown
timestamp: 2024-09-07T12:45:52+00:00
num_battles: 14206
num_wins: 7323
celo_rating: 1251.86
family_friendly_score: 0.0
submission_type: basic
model_repo: rica40325/feedback-1000
model_architecture: LlamaForCausalLM
model_num_parameters: 8030261248.0
best_of: 16
max_input_tokens: 512
max_output_tokens: 64
latencies: [
  {'batch_size': 1, 'throughput': 0.8938243511157433, 'latency_mean': 1.1186929655075073, 'latency_p50': 1.1110073328018188, 'latency_p90': 1.2396543979644776},
  {'batch_size': 4, 'throughput': 1.8175949186884242, 'latency_mean': 2.1981750547885897, 'latency_p50': 2.190255045890808, 'latency_p90': 2.488230562210083},
  {'batch_size': 5, 'throughput': 1.9126906984336671, 'latency_mean': 2.5956434392929078, 'latency_p50': 2.581117033958435, 'latency_p90': 2.905148434638977},
  {'batch_size': 8, 'throughput': 2.0127753551189085, 'latency_mean': 3.939420201778412, 'latency_p50': 3.961013913154602, 'latency_p90': 4.447454237937928},
  {'batch_size': 10, 'throughput': 2.0337526195092304, 'latency_mean': 4.8746510446071625, 'latency_p50': 4.935366988182068, 'latency_p90': 5.550662803649902},
  {'batch_size': 12, 'throughput': 2.0298417817441794, 'latency_mean': 5.831540428400039, 'latency_p50': 5.832401394844055, 'latency_p90': 6.712045645713806},
  {'batch_size': 15, 'throughput': 2.030122019882892, 'latency_mean': 7.236456040143967, 'latency_p50': 7.391851544380188, 'latency_p90': 7.998731589317321}]
gpu_counts: {'NVIDIA RTX A5000': 1}
display_name: rica40325-feedback-1000_v1
is_internal_developer: False
language_model: rica40325/feedback-1000
model_size: 8B
ranking_group: single
throughput_3p7s: 2.02
us_pacific_date: 2024-09-07
win_ratio: 0.5154864141911868
generation_params: {'temperature': 1.0, 'top_p': 1.0, 'min_p': 0.0, 'top_k': 40, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['\n'], 'max_input_tokens': 512, 'best_of': 16, 'max_output_tokens': 64}
formatter: {'memory_template': "{bot_name}'s Persona: {memory}\n####\n", 'prompt_template': '{prompt}\n<START>\n', 'bot_template': '{bot_name}: {message}\n', 'user_template': '{user_name}: {message}\n', 'response_template': '{bot_name}:', 'truncate_by_message': False}
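The formatter and generation_params above determine how a conversation is rendered into a prompt and how the model samples its reply. The following is a minimal illustrative sketch, not the platform's actual code: the function name, the example conversation, and the field handling are assumptions; only the template strings and parameter values are taken from the fields above.

# Illustrative sketch (assumptions noted above): apply the submission's formatter
# templates to build a prompt, and keep the submitted sampling parameters alongside it.
GENERATION_PARAMS = {
    'temperature': 1.0, 'top_p': 1.0, 'min_p': 0.0, 'top_k': 40,
    'presence_penalty': 0.0, 'frequency_penalty': 0.0,
    'stopping_words': ['\n'],   # generation stops at the first newline
    'max_input_tokens': 512,    # prompt budget; truncation is by tokens (truncate_by_message: False)
    'best_of': 16,              # 16 candidates sampled per request, one kept
    'max_output_tokens': 64,
}

FORMATTER = {
    'memory_template': "{bot_name}'s Persona: {memory}\n####\n",
    'prompt_template': '{prompt}\n<START>\n',
    'bot_template': '{bot_name}: {message}\n',
    'user_template': '{user_name}: {message}\n',
    'response_template': '{bot_name}:',
}

def build_prompt(memory, prompt, messages, bot_name, user_name):
    """Concatenate persona, scenario prompt, and chat history, then cue the bot's next reply."""
    parts = [
        FORMATTER['memory_template'].format(bot_name=bot_name, memory=memory),
        FORMATTER['prompt_template'].format(prompt=prompt),
    ]
    for speaker, message in messages:
        template = FORMATTER['bot_template'] if speaker == 'bot' else FORMATTER['user_template']
        parts.append(template.format(bot_name=bot_name, user_name=user_name, message=message))
    parts.append(FORMATTER['response_template'].format(bot_name=bot_name))
    return ''.join(parts)

if __name__ == '__main__':
    text = build_prompt(
        memory='A cheerful travel guide.',
        prompt='You are chatting with a tourist.',
        messages=[('user', 'Any tips for Kyoto?')],
        bot_name='Guide',
        user_name='Traveler',
    )
    print(text)
    print(GENERATION_PARAMS)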
Resubmit model
Shutdown handler not registered because Python interpreter is not running in the main thread
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLizer
Starting job with name rica40325-feedback-1000-v1-mkmlizer
Waiting for job on rica40325-feedback-1000-v1-mkmlizer to finish
Connection pool is full, discarding connection: %s. Connection pool size: %s
rica40325-feedback-1000-v1-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
rica40325-feedback-1000-v1-mkmlizer: ║ [flywheel ASCII banner art] ║
rica40325-feedback-1000-v1-mkmlizer: ║ ║
rica40325-feedback-1000-v1-mkmlizer: ║ Version: 0.10.1 ║
rica40325-feedback-1000-v1-mkmlizer: ║ Copyright 2023 MK ONE TECHNOLOGIES Inc. ║
rica40325-feedback-1000-v1-mkmlizer: ║ https://mk1.ai ║
rica40325-feedback-1000-v1-mkmlizer: ║ ║
rica40325-feedback-1000-v1-mkmlizer: ║ The license key for the current software has been verified as ║
rica40325-feedback-1000-v1-mkmlizer: ║ belonging to: ║
rica40325-feedback-1000-v1-mkmlizer: ║ ║
rica40325-feedback-1000-v1-mkmlizer: ║ Chai Research Corp. ║
rica40325-feedback-1000-v1-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
rica40325-feedback-1000-v1-mkmlizer: ║ Expiration: 2024-10-15 23:59:59 ║
rica40325-feedback-1000-v1-mkmlizer: ║ ║
rica40325-feedback-1000-v1-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
rica40325-feedback-1000-v1-mkmlizer: Downloaded to shared memory in 66.714s
rica40325-feedback-1000-v1-mkmlizer: quantizing model to /dev/shm/model_cache, profile:s0, folder:/tmp/tmpsjsiy6vu, device:0
rica40325-feedback-1000-v1-mkmlizer: Saving flywheel model at /dev/shm/model_cache
rica40325-feedback-1000-v1-mkmlizer: quantized model in 28.799s
rica40325-feedback-1000-v1-mkmlizer: Processed model rica40325/feedback-1000 in 95.513s
rica40325-feedback-1000-v1-mkmlizer: creating bucket guanaco-mkml-models
rica40325-feedback-1000-v1-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
rica40325-feedback-1000-v1-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/rica40325-feedback-1000-v1
rica40325-feedback-1000-v1-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/rica40325-feedback-1000-v1/config.json
rica40325-feedback-1000-v1-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/rica40325-feedback-1000-v1/special_tokens_map.json
rica40325-feedback-1000-v1-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/rica40325-feedback-1000-v1/tokenizer_config.json
rica40325-feedback-1000-v1-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/rica40325-feedback-1000-v1/tokenizer.json
rica40325-feedback-1000-v1-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/rica40325-feedback-1000-v1/flywheel_model.0.safetensors
rica40325-feedback-1000-v1-mkmlizer: Loading 0: 99%|█████████▉| 289/291 [00:14<00:00, 3.25it/s]
Job rica40325-feedback-1000-v1-mkmlizer completed after 171.82s with status: succeeded
Stopping job with name rica40325-feedback-1000-v1-mkmlizer
Pipeline stage MKMLizer completed in 172.88s
run pipeline stage %s
Running pipeline stage MKMLTemplater
Pipeline stage MKMLTemplater completed in 0.18s
run pipeline stage %s
Running pipeline stage MKMLDeployer
Creating inference service rica40325-feedback-1000-v1
Waiting for inference service rica40325-feedback-1000-v1 to be ready
Connection pool is full, discarding connection: %s. Connection pool size: %s
Connection pool is full, discarding connection: %s. Connection pool size: %s
Inference service rica40325-feedback-1000-v1 ready after 150.82141423225403s
Pipeline stage MKMLDeployer completed in 151.25s
run pipeline stage %s
Running pipeline stage StressChecker
{"detail":"('http://chaiml-llama-8b-pairwis-8189-v19-predictor.tenant-chaiml-guanaco.k.chaiverse.com/v1/models/GPT-J-6B-lit-v2:predict', 'read tcp 127.0.0.1:49580->127.0.0.1:8080: read: connection reset by peer\\n')"}
Received unhealthy response to inference request!
Received healthy response to inference request in 2.052536725997925s
Received healthy response to inference request in 1.5141582489013672s
Received healthy response to inference request in 1.4948756694793701s
Received healthy response to inference request in 1.6679532527923584s
5 requests
1 failed requests
5th percentile: 1.4987321853637696
10th percentile: 1.502588701248169
20th percentile: 1.5103017330169677
30th percentile: 1.5449172496795653
40th percentile: 1.6064352512359619
50th percentile: 1.6679532527923584
60th percentile: 1.7099687099456786
70th percentile: 1.751984167098999
80th percentile: 1.8289008617401123
90th percentile: 1.9407187938690185
95th percentile: 1.9966277599334716
99th percentile: 2.041354932785034
mean time: 1.700503158569336
%s, retrying in %s seconds...
Received healthy response to inference request in 1.8581492900848389s
Received healthy response to inference request in 1.9045372009277344s
Received healthy response to inference request in 1.8373053073883057s
Received healthy response to inference request in 1.8218724727630615s
Received healthy response to inference request in 2.2648799419403076s
5 requests
0 failed requests
5th percentile: 1.8249590396881104
10th percentile: 1.8280456066131592
20th percentile: 1.8342187404632568
30th percentile: 1.8414741039276123
40th percentile: 1.8498116970062255
50th percentile: 1.8581492900848389
60th percentile: 1.876704454421997
70th percentile: 1.8952596187591553
80th percentile: 1.9766057491302491
90th percentile: 2.120742845535278
95th percentile: 2.192811393737793
99th percentile: 2.2504662322998046
mean time: 1.9373488426208496
Pipeline stage StressChecker completed in 20.12s
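For reference, a minimal sketch of how per-request latencies like the StressChecker readings above can be reduced to the reported percentile and mean figures, assuming linear-interpolation percentiles (numpy's default). The sample values below are placeholders, not the exact measurements from this run.

# Sketch: summarize a list of per-request latencies into percentiles and a mean,
# mirroring the statistics printed by the stress check above.
import numpy as np

latencies_s = [1.858, 1.905, 1.837, 1.822, 2.265]   # seconds, one entry per request

for pct in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99):
    print(f"{pct}th percentile: {np.percentile(latencies_s, pct)}")
print(f"mean time: {np.mean(latencies_s)}")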
run pipeline stage %s
Running pipeline stage TriggerMKMLProfilingPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
Pipeline stage TriggerMKMLProfilingPipeline completed in 5.40s
Shutdown handler de-registered
rica40325-feedback-1000_v1 status is now deployed due to DeploymentManager action
Shutdown handler registered
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeleter
Skipping teardown as no inference service was successfully deployed
Pipeline stage MKMLProfilerDeleter completed in 0.12s
run pipeline stage %s
Running pipeline stage MKMLProfilerTemplater
Pipeline stage MKMLProfilerTemplater completed in 0.10s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeployer
Creating inference service rica40325-feedback-1000-v1-profiler
Waiting for inference service rica40325-feedback-1000-v1-profiler to be ready
Inference service rica40325-feedback-1000-v1-profiler ready after 150.37871718406677s
Pipeline stage MKMLProfilerDeployer completed in 150.74s
run pipeline stage %s
Running pipeline stage MKMLProfilerRunner
kubectl cp /code/guanaco/guanaco_inference_services/src/inference_scripts tenant-chaiml-guanaco/rica40325-feedback-1000-v1-profiler-predictor-00001-deployw2t5c:/code/chaiverse_profiler_1725713692 --namespace tenant-chaiml-guanaco
kubectl exec -it rica40325-feedback-1000-v1-profiler-predictor-00001-deployw2t5c --namespace tenant-chaiml-guanaco -- sh -c 'cd /code/chaiverse_profiler_1725713692 && python profiles.py profile --best_of_n 16 --auto_batch 5 --batches 1,5,10,15,20,25,30,35,40,45,50,55,60,65,70,75,80,85,90,95,100,105,110,115,120,125,130,135,140,145,150,155,160,165,170,175,180,185,190,195 --samples 200 --input_tokens 512 --output_tokens 64 --summary /code/chaiverse_profiler_1725713692/summary.json'
kubectl exec -it rica40325-feedback-1000-v1-profiler-predictor-00001-deployw2t5c --namespace tenant-chaiml-guanaco -- bash -c 'cat /code/chaiverse_profiler_1725713692/summary.json'
Pipeline stage MKMLProfilerRunner completed in 840.28s
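The profiling run writes per-batch-size throughput/latency results to summary.json, which is what populates the latencies field at the top of this page. A headline figure such as throughput_3p7s (reported as 2.02 above) can be estimated from that sweep by interpolating throughput at a 3.7 s mean-latency budget; the bracketing-and-interpolation approach below is an assumption about how such a metric could be derived, not the platform's documented method.

# Hedged sketch: estimate throughput at a target mean latency (e.g. 3.7 s) by linear
# interpolation between the two profiled batch sizes whose mean latencies bracket it.
from bisect import bisect_left

def throughput_at_latency(points, target_latency_s):
    """points: list of (latency_mean_s, throughput) tuples sorted by latency."""
    latencies = [p[0] for p in points]
    i = bisect_left(latencies, target_latency_s)
    if i == 0:
        return points[0][1]
    if i == len(points):
        return points[-1][1]
    (l0, t0), (l1, t1) = points[i - 1], points[i]
    frac = (target_latency_s - l0) / (l1 - l0)
    return t0 + frac * (t1 - t0)

# Mean latency / throughput pairs rounded from the latencies field above (batch sizes 1..15).
sweep = [
    (1.119, 0.894), (2.198, 1.818), (2.596, 1.913),
    (3.939, 2.013), (4.875, 2.034), (5.832, 2.030), (7.236, 2.030),
]
print(round(throughput_at_latency(sweep, 3.7), 2))  # ~2.0, close to the reported 2.02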
run pipeline stage %s
Running pipeline stage MKMLProfilerDeleter
Checking if service rica40325-feedback-1000-v1-profiler is running
Tearing down inference service rica40325-feedback-1000-v1-profiler
Service rica40325-feedback-1000-v1-profiler has been torndown
Pipeline stage MKMLProfilerDeleter completed in 1.62s
Shutdown handler de-registered
rica40325-feedback-1000_v1 status is now inactive due to auto deactivation (removal of underperforming models)
rica40325-feedback-1000_v1 status is now torndown due to DeploymentManager action