developer_uid: chai_backend_admin
submission_id: zonemercy-lexical-nemo-_1518_v26
model_name: 0906stv1-1
model_group: zonemercy/Lexical-Nemo-v
status: torndown
timestamp: 2024-09-06T14:44:00+00:00
num_battles: 10467
num_wins: 5670
celo_rating: 1259.76
family_friendly_score: 0.0
submission_type: basic
model_repo: zonemercy/Lexical-Nemo-v4-1k1e5
model_architecture: MistralForCausalLM
model_num_parameters: 12772070400.0
best_of: 2
max_input_tokens: 1024
max_output_tokens: 128
latencies: [{'batch_size': 1, 'throughput': 0.3624442106608688, 'latency_mean': 2.7589481568336485, 'latency_p50': 2.776124596595764, 'latency_p90': 2.910985541343689}, {'batch_size': 3, 'throughput': 0.8443088973512051, 'latency_mean': 3.5414422500133513, 'latency_p50': 3.543583631515503, 'latency_p90': 3.725886416435242}, {'batch_size': 5, 'throughput': 1.1819180551601614, 'latency_mean': 4.215242636203766, 'latency_p50': 4.227721095085144, 'latency_p90': 4.5131807088851925}, {'batch_size': 6, 'throughput': 1.3000490188961602, 'latency_mean': 4.566681246757508, 'latency_p50': 4.563777923583984, 'latency_p90': 4.912664365768433}, {'batch_size': 8, 'throughput': 1.500603083156626, 'latency_mean': 5.303667550086975, 'latency_p50': 5.293783903121948, 'latency_p90': 5.791717553138732}, {'batch_size': 10, 'throughput': 1.6137513941571593, 'latency_mean': 6.157537810802459, 'latency_p50': 6.157510757446289, 'latency_p90': 6.677109551429749}]
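The latency table above is consistent with throughput being roughly batch_size / latency_mean, with batching amortizing per-request overhead. A minimal sanity-check sketch (pure Python; the rows are copied from the latencies entry above, the tolerance is an assumption since the reported throughput is measured rather than derived):

```python
# Sanity-check: reported throughput tracks batch_size / latency_mean.
# Rows copied from the latencies entry above (subset of fields).
latencies = [
    {"batch_size": 1, "throughput": 0.3624442106608688, "latency_mean": 2.7589481568336485},
    {"batch_size": 3, "throughput": 0.8443088973512051, "latency_mean": 3.5414422500133513},
    {"batch_size": 5, "throughput": 1.1819180551601614, "latency_mean": 4.215242636203766},
    {"batch_size": 10, "throughput": 1.6137513941571593, "latency_mean": 6.157537810802459},
]

for row in latencies:
    est = row["batch_size"] / row["latency_mean"]
    # The estimate drifts slightly high as batch size grows;
    # allow ~3% relative error.
    assert abs(est - row["throughput"]) / row["throughput"] < 0.03
```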
gpu_counts: {'NVIDIA RTX A5000': 1}
display_name: 0906stv1-1
ineligible_reason: max_output_tokens!=64
is_internal_developer: True
language_model: zonemercy/Lexical-Nemo-v4-1k1e5
model_size: 13B
ranking_group: single
throughput_3p7s: 0.94
us_pacific_date: 2024-09-06
win_ratio: 0.5417024935511608
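The win_ratio follows directly from the battle counts above (num_wins / num_battles); a one-line check:

```python
num_battles = 10467
num_wins = 5670

win_ratio = num_wins / num_battles
# Matches the reported win_ratio to full float precision.
assert abs(win_ratio - 0.5417024935511608) < 1e-12
```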
generation_params: {'temperature': 1.0, 'top_p': 1.0, 'min_p': 0.05, 'top_k': 100, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['</s>', 'Bot:', 'User:', 'You:'], 'max_input_tokens': 1024, 'best_of': 2, 'max_output_tokens': 128}
formatter: {'memory_template': "Write a story which is engaging, emotionally nuanced, and successfully captures User's attention in an unexpected setting.\nIt adheres closely to the previous portion of the story while offering a fresh and compelling take on the scenario.\nBot's Name: {bot_name}\nBot's Persona: {memory}\n####\n", 'prompt_template': '', 'bot_template': 'Bot: {message}\n', 'user_template': 'User: {message}\n', 'response_template': 'Bot:[Start story]', 'truncate_by_message': False}
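A minimal sketch of how a formatter like the one above could assemble a prompt. The template strings are copied from the log; the renderer itself (`render_prompt` and its turn format) is a hypothetical illustration, not the platform's actual implementation:

```python
# Hypothetical renderer for the formatter above; template strings are from
# the log, the assembly logic is an illustrative assumption.
formatter = {
    "memory_template": (
        "Write a story which is engaging, emotionally nuanced, and successfully "
        "captures User's attention in an unexpected setting.\n"
        "It adheres closely to the previous portion of the story while offering "
        "a fresh and compelling take on the scenario.\n"
        "Bot's Name: {bot_name}\nBot's Persona: {memory}\n####\n"
    ),
    "bot_template": "Bot: {message}\n",
    "user_template": "User: {message}\n",
    "response_template": "Bot:[Start story]",
}

def render_prompt(bot_name, memory, turns):
    """turns: list of ('user' | 'bot', message) pairs, oldest first."""
    parts = [formatter["memory_template"].format(bot_name=bot_name, memory=memory)]
    for role, message in turns:
        parts.append(formatter[f"{role}_template"].format(message=message))
    parts.append(formatter["response_template"])  # primes the model's reply
    return "".join(parts)

prompt = render_prompt("Nemo", "A wandering storyteller.", [("user", "Tell me a story.")])
assert prompt.endswith("User: Tell me a story.\nBot:[Start story]")
```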
Resubmit model
Shutdown handler not registered because Python interpreter is not running in the main thread
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLizer
Starting job with name zonemercy-lexical-nemo-1518-v26-mkmlizer
Waiting for job on zonemercy-lexical-nemo-1518-v26-mkmlizer to finish
Failed to get response for submission zonemercy-base-story-v1_v4: ('http://zonemercy-base-story-v1-v4-predictor.tenant-chaiml-guanaco.k.chaiverse.com/v1/models/GPT-J-6B-lit-v2:predict', '{"error":"ValueError : [TypeError(\\"\'numpy.int64\' object is not iterable\\"), TypeError(\'vars() argument must have __dict__ attribute\')]"}')
zonemercy-lexical-nemo-1518-v26-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
zonemercy-lexical-nemo-1518-v26-mkmlizer: ║ _____ __ __ ║
zonemercy-lexical-nemo-1518-v26-mkmlizer: ║ / _/ /_ ___ __/ / ___ ___ / / ║
zonemercy-lexical-nemo-1518-v26-mkmlizer: ║ / _/ / // / |/|/ / _ \/ -_) -_) / ║
zonemercy-lexical-nemo-1518-v26-mkmlizer: ║ /_//_/\_, /|__,__/_//_/\__/\__/_/ ║
zonemercy-lexical-nemo-1518-v26-mkmlizer: ║ /___/ ║
zonemercy-lexical-nemo-1518-v26-mkmlizer: ║ ║
zonemercy-lexical-nemo-1518-v26-mkmlizer: ║ Version: 0.10.1 ║
zonemercy-lexical-nemo-1518-v26-mkmlizer: ║ Copyright 2023 MK ONE TECHNOLOGIES Inc. ║
zonemercy-lexical-nemo-1518-v26-mkmlizer: ║ https://mk1.ai ║
zonemercy-lexical-nemo-1518-v26-mkmlizer: ║ ║
zonemercy-lexical-nemo-1518-v26-mkmlizer: ║ The license key for the current software has been verified as ║
zonemercy-lexical-nemo-1518-v26-mkmlizer: ║ belonging to: ║
zonemercy-lexical-nemo-1518-v26-mkmlizer: ║ ║
zonemercy-lexical-nemo-1518-v26-mkmlizer: ║ Chai Research Corp. ║
zonemercy-lexical-nemo-1518-v26-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
zonemercy-lexical-nemo-1518-v26-mkmlizer: ║ Expiration: 2024-10-15 23:59:59 ║
zonemercy-lexical-nemo-1518-v26-mkmlizer: ║ ║
zonemercy-lexical-nemo-1518-v26-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
Connection pool is full, discarding connection: %s. Connection pool size: %s
Failed to get response for submission zonemercy-base-story-v1_v3: ('http://zonemercy-base-story-v1-v3-predictor.tenant-chaiml-guanaco.k.chaiverse.com/v1/models/GPT-J-6B-lit-v2:predict', 'upstream connect error or disconnect/reset before headers. reset reason: connection timeout')
Failed to get response for submission zonemercy-base-story-v1_v3: ('http://zonemercy-base-story-v1-v3-predictor.tenant-chaiml-guanaco.k.chaiverse.com/v1/models/GPT-J-6B-lit-v2:predict', '{"error":"ValueError : [TypeError(\\"\'numpy.int64\' object is not iterable\\"), TypeError(\'vars() argument must have __dict__ attribute\')]"}')
zonemercy-lexical-nemo-1518-v26-mkmlizer: Downloaded to shared memory in 64.579s
zonemercy-lexical-nemo-1518-v26-mkmlizer: quantizing model to /dev/shm/model_cache, profile:s0, folder:/tmp/tmpfohl2379, device:0
zonemercy-lexical-nemo-1518-v26-mkmlizer: Saving flywheel model at /dev/shm/model_cache
zonemercy-lexical-nemo-1518-v26-mkmlizer: quantized model in 41.166s
zonemercy-lexical-nemo-1518-v26-mkmlizer: Processed model zonemercy/Lexical-Nemo-v4-1k1e5 in 105.746s
zonemercy-lexical-nemo-1518-v26-mkmlizer: creating bucket guanaco-mkml-models
zonemercy-lexical-nemo-1518-v26-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
zonemercy-lexical-nemo-1518-v26-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/zonemercy-lexical-nemo-1518-v26
zonemercy-lexical-nemo-1518-v26-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/zonemercy-lexical-nemo-1518-v26/config.json
zonemercy-lexical-nemo-1518-v26-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/zonemercy-lexical-nemo-1518-v26/tokenizer.json
zonemercy-lexical-nemo-1518-v26-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/zonemercy-lexical-nemo-1518-v26/flywheel_model.0.safetensors
zonemercy-lexical-nemo-1518-v26-mkmlizer: Loading 0: 0%| | 0/363 [00:00<?, ?it/s] ... 98%|█████████▊| 357/363 [00:19<00:01, 4.31it/s] (per-step progress-bar output condensed)
Job zonemercy-lexical-nemo-1518-v26-mkmlizer completed after 125.08s with status: succeeded
Stopping job with name zonemercy-lexical-nemo-1518-v26-mkmlizer
Pipeline stage MKMLizer completed in 126.69s
run pipeline stage %s
Running pipeline stage MKMLTemplater
Pipeline stage MKMLTemplater completed in 0.09s
run pipeline stage %s
Running pipeline stage MKMLDeployer
Creating inference service zonemercy-lexical-nemo-1518-v26
Waiting for inference service zonemercy-lexical-nemo-1518-v26 to be ready
Failed to get response for submission blend_sehof_2024-08-22: ('http://mistralai-mixtral-8x7b-3473-v130-predictor.tenant-chaiml-guanaco.k.chaiverse.com/v1/models/GPT-J-6B-lit-v2:predict', '{"error":"ValueError : [TypeError(\\"\'numpy.int64\' object is not iterable\\"), TypeError(\'vars() argument must have __dict__ attribute\')]"}')
Inference service zonemercy-lexical-nemo-1518-v26 ready after 150.5693004131317s
Pipeline stage MKMLDeployer completed in 151.33s
run pipeline stage %s
Running pipeline stage StressChecker
Received healthy response to inference request in 3.14083194732666s
Received healthy response to inference request in 3.022113084793091s
Received healthy response to inference request in 3.4921038150787354s
Received healthy response to inference request in 3.0138914585113525s
Received healthy response to inference request in 3.1913154125213623s
5 requests
0 failed requests
5th percentile: 3.0155357837677004
10th percentile: 3.0171801090240478
20th percentile: 3.020468759536743
30th percentile: 3.0458568572998046
40th percentile: 3.0933444023132326
50th percentile: 3.14083194732666
60th percentile: 3.161025333404541
70th percentile: 3.181218719482422
80th percentile: 3.251473093032837
90th percentile: 3.3717884540557863
95th percentile: 3.431946134567261
99th percentile: 3.4800722789764404
mean time: 3.1720511436462404
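The StressChecker statistics above can be reproduced from the five healthy response times using linear interpolation between order statistics (the interpolation method is inferred from the numbers, since the log does not state it). A pure-Python sketch:

```python
# Recompute StressChecker stats from the five healthy response times above,
# using linear interpolation between order statistics.
samples = [
    3.14083194732666,
    3.022113084793091,
    3.4921038150787354,
    3.0138914585113525,
    3.1913154125213623,
]

def percentile(xs, q):
    """q-th percentile (0..100) with linear interpolation."""
    xs = sorted(xs)
    pos = q / 100 * (len(xs) - 1)
    lo = int(pos)
    hi = min(lo + 1, len(xs) - 1)
    return xs[lo] + (pos - lo) * (xs[hi] - xs[lo])

assert abs(percentile(samples, 50) - 3.14083194732666) < 1e-9   # reported p50
assert abs(percentile(samples, 90) - 3.3717884540557863) < 1e-9  # reported p90
assert abs(sum(samples) / len(samples) - 3.1720511436462404) < 1e-9  # mean time
```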
Pipeline stage StressChecker completed in 16.73s
run pipeline stage %s
Running pipeline stage TriggerMKMLProfilingPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
Pipeline stage TriggerMKMLProfilingPipeline completed in 5.46s
Shutdown handler de-registered
zonemercy-lexical-nemo-_1518_v26 status is now deployed due to DeploymentManager action
Shutdown handler registered
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeleter
Skipping teardown as no inference service was successfully deployed
Pipeline stage MKMLProfilerDeleter completed in 0.13s
run pipeline stage %s
Running pipeline stage MKMLProfilerTemplater
Pipeline stage MKMLProfilerTemplater completed in 0.13s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeployer
Creating inference service zonemercy-lexical-nemo-1518-v26-profiler
Waiting for inference service zonemercy-lexical-nemo-1518-v26-profiler to be ready
Inference service zonemercy-lexical-nemo-1518-v26-profiler ready after 150.41004729270935s
Pipeline stage MKMLProfilerDeployer completed in 150.80s
run pipeline stage %s
Running pipeline stage MKMLProfilerRunner
kubectl cp /code/guanaco/guanaco_inference_services/src/inference_scripts tenant-chaiml-guanaco/zonemercy-lexical-ne9226cf35caaddaf863555421d4ddc827-deplohnwz2:/code/chaiverse_profiler_1725634343 --namespace tenant-chaiml-guanaco
kubectl exec -it zonemercy-lexical-ne9226cf35caaddaf863555421d4ddc827-deplohnwz2 --namespace tenant-chaiml-guanaco -- sh -c 'cd /code/chaiverse_profiler_1725634343 && python profiles.py profile --best_of_n 2 --auto_batch 5 --batches 1,5,10,15,20,25,30,35,40,45,50,55,60,65,70,75,80,85,90,95,100,105,110,115,120,125,130,135,140,145,150,155,160,165,170,175,180,185,190,195 --samples 200 --input_tokens 1024 --output_tokens 128 --summary /code/chaiverse_profiler_1725634343/summary.json'
kubectl exec -it zonemercy-lexical-ne9226cf35caaddaf863555421d4ddc827-deplohnwz2 --namespace tenant-chaiml-guanaco -- bash -c 'cat /code/chaiverse_profiler_1725634343/summary.json'
Pipeline stage MKMLProfilerRunner completed in 1376.07s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeleter
Checking if service zonemercy-lexical-nemo-1518-v26-profiler is running
Tearing down inference service zonemercy-lexical-nemo-1518-v26-profiler
Service zonemercy-lexical-nemo-1518-v26-profiler has been torndown
Pipeline stage MKMLProfilerDeleter completed in 3.35s
Shutdown handler de-registered
zonemercy-lexical-nemo-_1518_v26 status is now inactive due to auto deactivation (removal of underperforming models)
zonemercy-lexical-nemo-_1518_v26 status is now torndown due to DeploymentManager action