submission_id: zonemercy-lexical-nemo-_1518_v27
developer_uid: chai_backend_admin
best_of: 2
celo_rating: 1222.91
display_name: 0906stv1-3
family_friendly_score: 0.0
formatter: {'memory_template': "Write a slice of story that takes place over the course of a single day in Bot's life. Use stream-of-consciousness narration to explore the character's thoughts and perceptions. Include poetic, impressionistic descriptions of the character's surroundings and sensations. Weave in memories and reflections that provide insight into the Bot's past and inner life. The scene should feel like part of a lived-in world, with the scene naturally existing in a wider story.\nBot's Name: {bot_name}\nBot's Persona: {memory}\n####\n", 'prompt_template': '', 'bot_template': 'Bot: {message}\n', 'user_template': 'User: {message}\n', 'response_template': 'Bot[Start story]:', 'truncate_by_message': False}
generation_params: {'temperature': 1.0, 'top_p': 1.0, 'min_p': 0.05, 'top_k': 100, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['</s>', 'Bot:', 'User:', 'You:'], 'max_input_tokens': 1024, 'best_of': 2, 'max_output_tokens': 128}
gpu_counts: {'NVIDIA RTX A5000': 1}
ineligible_reason: max_output_tokens!=64
is_internal_developer: True
language_model: zonemercy/Lexical-Nemo-v4-1k1e5
latencies: [{'batch_size': 1, 'throughput': 0.35950240741800793, 'latency_mean': 2.7815301883220673, 'latency_p50': 2.790284752845764, 'latency_p90': 2.938419961929321}, {'batch_size': 3, 'throughput': 0.8434470687396841, 'latency_mean': 3.5425951611995696, 'latency_p50': 3.5448864698410034, 'latency_p90': 3.7449988603591917}, {'batch_size': 5, 'throughput': 1.174805042433325, 'latency_mean': 4.23940673828125, 'latency_p50': 4.208611607551575, 'latency_p90': 4.535007524490356}, {'batch_size': 6, 'throughput': 1.2907604249919815, 'latency_mean': 4.602039135694504, 'latency_p50': 4.583564519882202, 'latency_p90': 5.044256401062012}, {'batch_size': 8, 'throughput': 1.4796362264379035, 'latency_mean': 5.382944295406341, 'latency_p50': 5.356655120849609, 'latency_p90': 5.8989616394042965}, {'batch_size': 10, 'throughput': 1.5875634103572935, 'latency_mean': 6.244426683187485, 'latency_p50': 6.228735685348511, 'latency_p90': 6.795530939102173}]
max_input_tokens: 1024
max_output_tokens: 128
model_architecture: MistralForCausalLM
model_group: zonemercy/Lexical-Nemo-v
model_name: 0906stv1-3
model_num_parameters: 12772070400.0
model_repo: zonemercy/Lexical-Nemo-v4-1k1e5
model_size: 13B
num_battles: 10388
num_wins: 5068
ranking_group: single
status: torndown
submission_type: basic
throughput_3p7s: 0.93
timestamp: 2024-09-06T14:44:20+00:00
us_pacific_date: 2024-09-06
win_ratio: 0.48787061994609165
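Several of the derived fields in the record above can be sanity-checked directly from the raw values. A minimal sketch, assuming `win_ratio = num_wins / num_battles`, that `ineligible_reason: max_output_tokens!=64` encodes an eligibility rule requiring 64 output tokens, and that `throughput_3p7s` is throughput linearly interpolated at a 3.7 s mean latency from the batch-size sweep (the last two are inferences from the field names, not confirmed definitions):

```python
# Sanity-check derived metadata fields against the raw values in this record.
num_battles, num_wins = 10388, 5068
assert abs(num_wins / num_battles - 0.48787061994609165) < 1e-12  # win_ratio

# ineligible_reason is "max_output_tokens!=64"; this submission uses 128,
# which is consistent with that rule (assumed, inferred from the field name).
max_output_tokens = 128
assert max_output_tokens != 64

# throughput_3p7s = 0.93: plausibly throughput interpolated at 3.7 s mean
# latency between the batch_size=3 and batch_size=5 sweep points (an
# assumption; the linear estimate lands near, not exactly on, the report).
(l0, t0) = (3.5425951611995696, 0.8434470687396841)  # batch_size=3
(l1, t1) = (4.23940673828125, 1.174805042433325)     # batch_size=5
est = t0 + (3.7 - l0) / (l1 - l0) * (t1 - t0)
assert 0.90 < est < 0.95  # reported value: 0.93
```

The interpolation lands around 0.92, close to but not exactly the reported 0.93, so the exact derivation of `throughput_3p7s` may differ slightly.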
Shutdown handler not registered because Python interpreter is not running in the main thread
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLizer
Starting job with name zonemercy-lexical-nemo-1518-v27-mkmlizer
Waiting for job on zonemercy-lexical-nemo-1518-v27-mkmlizer to finish
Failed to get response for submission zonemercy-base-story-v1_v3: ('http://zonemercy-base-story-v1-v3-predictor.tenant-chaiml-guanaco.k.chaiverse.com/v1/models/GPT-J-6B-lit-v2:predict', '{"error":"ValueError : [TypeError(\\"\'numpy.int64\' object is not iterable\\"), TypeError(\'vars() argument must have __dict__ attribute\')]"}')
Failed to get response for submission zonemercy-base-story-v1_v3: ('http://zonemercy-base-story-v1-v3-predictor.tenant-chaiml-guanaco.k.chaiverse.com/v1/models/GPT-J-6B-lit-v2:predict', 'upstream connect error or disconnect/reset before headers. reset reason: connection timeout')
zonemercy-lexical-nemo-1518-v27-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
zonemercy-lexical-nemo-1518-v27-mkmlizer: ║ _____ __ __ ║
zonemercy-lexical-nemo-1518-v27-mkmlizer: ║ / _/ /_ ___ __/ / ___ ___ / / ║
zonemercy-lexical-nemo-1518-v27-mkmlizer: ║ / _/ / // / |/|/ / _ \/ -_) -_) / ║
zonemercy-lexical-nemo-1518-v27-mkmlizer: ║ /_//_/\_, /|__,__/_//_/\__/\__/_/ ║
zonemercy-lexical-nemo-1518-v27-mkmlizer: ║ /___/ ║
zonemercy-lexical-nemo-1518-v27-mkmlizer: ║ ║
zonemercy-lexical-nemo-1518-v27-mkmlizer: ║ Version: 0.10.1 ║
zonemercy-lexical-nemo-1518-v27-mkmlizer: ║ Copyright 2023 MK ONE TECHNOLOGIES Inc. ║
zonemercy-lexical-nemo-1518-v27-mkmlizer: ║ https://mk1.ai ║
zonemercy-lexical-nemo-1518-v27-mkmlizer: ║ ║
zonemercy-lexical-nemo-1518-v27-mkmlizer: ║ The license key for the current software has been verified as ║
zonemercy-lexical-nemo-1518-v27-mkmlizer: ║ belonging to: ║
zonemercy-lexical-nemo-1518-v27-mkmlizer: ║ ║
zonemercy-lexical-nemo-1518-v27-mkmlizer: ║ Chai Research Corp. ║
zonemercy-lexical-nemo-1518-v27-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
zonemercy-lexical-nemo-1518-v27-mkmlizer: ║ Expiration: 2024-10-15 23:59:59 ║
zonemercy-lexical-nemo-1518-v27-mkmlizer: ║ ║
zonemercy-lexical-nemo-1518-v27-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
zonemercy-lexical-nemo-1518-v27-mkmlizer: Downloaded to shared memory in 54.567s
zonemercy-lexical-nemo-1518-v27-mkmlizer: quantizing model to /dev/shm/model_cache, profile:s0, folder:/tmp/tmpc2ws6e97, device:0
zonemercy-lexical-nemo-1518-v27-mkmlizer: Saving flywheel model at /dev/shm/model_cache
Failed to get response for submission zonemercy-base-story-v1_v4: ('http://zonemercy-base-story-v1-v4-predictor.tenant-chaiml-guanaco.k.chaiverse.com/v1/models/GPT-J-6B-lit-v2:predict', '{"error":"ValueError : [TypeError(\\"\'numpy.int64\' object is not iterable\\"), TypeError(\'vars() argument must have __dict__ attribute\')]"}')
zonemercy-lexical-nemo-1518-v27-mkmlizer: quantized model in 39.722s
zonemercy-lexical-nemo-1518-v27-mkmlizer: Processed model zonemercy/Lexical-Nemo-v4-1k1e5 in 94.289s
zonemercy-lexical-nemo-1518-v27-mkmlizer: creating bucket guanaco-mkml-models
zonemercy-lexical-nemo-1518-v27-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
zonemercy-lexical-nemo-1518-v27-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/zonemercy-lexical-nemo-1518-v27
zonemercy-lexical-nemo-1518-v27-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/zonemercy-lexical-nemo-1518-v27/config.json
zonemercy-lexical-nemo-1518-v27-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/zonemercy-lexical-nemo-1518-v27/special_tokens_map.json
zonemercy-lexical-nemo-1518-v27-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/zonemercy-lexical-nemo-1518-v27/tokenizer_config.json
zonemercy-lexical-nemo-1518-v27-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/zonemercy-lexical-nemo-1518-v27/tokenizer.json
zonemercy-lexical-nemo-1518-v27-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/zonemercy-lexical-nemo-1518-v27/flywheel_model.0.safetensors
zonemercy-lexical-nemo-1518-v27-mkmlizer: Loading 0: 0%| | 0/363 [00:00<?, ?it/s] … Loading 0: 98%|█████████▊| 357/363 [00:19<00:01, 5.09it/s]
Job zonemercy-lexical-nemo-1518-v27-mkmlizer completed after 115.79s with status: succeeded
Stopping job with name zonemercy-lexical-nemo-1518-v27-mkmlizer
Pipeline stage MKMLizer completed in 116.58s
run pipeline stage %s
Running pipeline stage MKMLTemplater
Pipeline stage MKMLTemplater completed in 0.08s
run pipeline stage %s
Running pipeline stage MKMLDeployer
Creating inference service zonemercy-lexical-nemo-1518-v27
Waiting for inference service zonemercy-lexical-nemo-1518-v27 to be ready
Connection pool is full, discarding connection: %s. Connection pool size: %s
Inference service zonemercy-lexical-nemo-1518-v27 ready after 151.61339473724365s
Pipeline stage MKMLDeployer completed in 152.59s
run pipeline stage %s
Running pipeline stage StressChecker
Received healthy response to inference request in 3.134650707244873s
Received healthy response to inference request in 2.7204642295837402s
Received healthy response to inference request in 3.313580274581909s
Received healthy response to inference request in 3.314502239227295s
Received healthy response to inference request in 2.6686222553253174s
5 requests
0 failed requests
5th percentile: 2.6789906501770018
10th percentile: 2.6893590450286866
20th percentile: 2.710095834732056
30th percentile: 2.803301525115967
40th percentile: 2.96897611618042
50th percentile: 3.134650707244873
60th percentile: 3.2062225341796875
70th percentile: 3.277794361114502
80th percentile: 3.3137646675109864
90th percentile: 3.3141334533691404
95th percentile: 3.3143178462982177
99th percentile: 3.3144653606414796
mean time: 3.030363941192627
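The percentile figures above can be reproduced from the five raw response times with linearly interpolated percentiles (the same scheme as numpy.percentile's default "linear" method; a sketch, assuming that is how StressChecker computes them):

```python
# Reproduce the StressChecker percentiles from the five raw latencies above.
times = sorted([3.134650707244873, 2.7204642295837402, 3.313580274581909,
                3.314502239227295, 2.6686222553253174])

def percentile(xs, q):
    """Linearly interpolated percentile over sorted data (numpy 'linear')."""
    pos = (len(xs) - 1) * q / 100.0
    lo = int(pos)
    hi = min(lo + 1, len(xs) - 1)
    return xs[lo] + (pos - lo) * (xs[hi] - xs[lo])

assert abs(percentile(times, 5) - 2.6789906501770018) < 1e-9   # 5th pctile
assert abs(percentile(times, 50) - 3.134650707244873) < 1e-9   # median
assert abs(percentile(times, 90) - 3.3141334533691404) < 1e-9  # 90th pctile
assert abs(sum(times) / len(times) - 3.030363941192627) < 1e-9  # mean time
```

With only five samples, the 95th and 99th percentiles are interpolated within the gap between the two slowest requests, which is why they sit so close to the maximum of 3.3145 s.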
Pipeline stage StressChecker completed in 15.92s
run pipeline stage %s
Running pipeline stage TriggerMKMLProfilingPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
Pipeline stage TriggerMKMLProfilingPipeline completed in 3.97s
Shutdown handler de-registered
zonemercy-lexical-nemo-_1518_v27 status is now deployed due to DeploymentManager action
Shutdown handler registered
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeleter
Skipping teardown as no inference service was successfully deployed
Pipeline stage MKMLProfilerDeleter completed in 0.13s
run pipeline stage %s
Running pipeline stage MKMLProfilerTemplater
Pipeline stage MKMLProfilerTemplater completed in 0.11s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeployer
Creating inference service zonemercy-lexical-nemo-1518-v27-profiler
Waiting for inference service zonemercy-lexical-nemo-1518-v27-profiler to be ready
Inference service zonemercy-lexical-nemo-1518-v27-profiler ready after 140.36696410179138s
Pipeline stage MKMLProfilerDeployer completed in 140.82s
run pipeline stage %s
Running pipeline stage MKMLProfilerRunner
kubectl cp /code/guanaco/guanaco_inference_services/src/inference_scripts tenant-chaiml-guanaco/zonemercy-lexical-ne512f3c241c847b258541d7ae1d5dfa7f-deplog5v9m:/code/chaiverse_profiler_1725634331 --namespace tenant-chaiml-guanaco
kubectl exec -it zonemercy-lexical-ne512f3c241c847b258541d7ae1d5dfa7f-deplog5v9m --namespace tenant-chaiml-guanaco -- sh -c 'cd /code/chaiverse_profiler_1725634331 && python profiles.py profile --best_of_n 2 --auto_batch 5 --batches 1,5,10,15,20,25,30,35,40,45,50,55,60,65,70,75,80,85,90,95,100,105,110,115,120,125,130,135,140,145,150,155,160,165,170,175,180,185,190,195 --samples 200 --input_tokens 1024 --output_tokens 128 --summary /code/chaiverse_profiler_1725634331/summary.json'
kubectl exec -it zonemercy-lexical-ne512f3c241c847b258541d7ae1d5dfa7f-deplog5v9m --namespace tenant-chaiml-guanaco -- bash -c 'cat /code/chaiverse_profiler_1725634331/summary.json'
Pipeline stage MKMLProfilerRunner completed in 1387.57s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeleter
Checking if service zonemercy-lexical-nemo-1518-v27-profiler is running
Tearing down inference service zonemercy-lexical-nemo-1518-v27-profiler
Service zonemercy-lexical-nemo-1518-v27-profiler has been torndown
Pipeline stage MKMLProfilerDeleter completed in 1.63s
Shutdown handler de-registered
zonemercy-lexical-nemo-_1518_v27 status is now inactive due to auto deactivation of underperforming models
Shutdown handler de-registered
zonemercy-lexical-nemo-_1518_v25 status is now torndown due to DeploymentManager action