Running pipeline stage MKMLizer
Starting job with name trace2333-fd-llama3-v4-v3-mkmlizer
Waiting for job on trace2333-fd-llama3-v4-v3-mkmlizer to finish
trace2333-fd-llama3-v4-v3-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
trace2333-fd-llama3-v4-v3-mkmlizer: ║                        _____ __ __                                   ║
trace2333-fd-llama3-v4-v3-mkmlizer: ║                       / _/ /_ ___      __/ /  ___ ___ / /            ║
trace2333-fd-llama3-v4-v3-mkmlizer: ║                      / _/ / // / |/|/ / _ \/ -_) -_) /               ║
trace2333-fd-llama3-v4-v3-mkmlizer: ║                     /_//_/\_, /|__,__/_//_/\__/\__/_/                ║
trace2333-fd-llama3-v4-v3-mkmlizer: ║                          /___/                                       ║
trace2333-fd-llama3-v4-v3-mkmlizer: ║                                                                      ║
trace2333-fd-llama3-v4-v3-mkmlizer: ║     Version: 0.9.9                                                   ║
trace2333-fd-llama3-v4-v3-mkmlizer: ║     Copyright 2023 MK ONE TECHNOLOGIES Inc.                          ║
trace2333-fd-llama3-v4-v3-mkmlizer: ║     https://mk1.ai                                                   ║
trace2333-fd-llama3-v4-v3-mkmlizer: ║                                                                      ║
trace2333-fd-llama3-v4-v3-mkmlizer: ║     The license key for the current software has been verified as    ║
trace2333-fd-llama3-v4-v3-mkmlizer: ║     belonging to:                                                    ║
trace2333-fd-llama3-v4-v3-mkmlizer: ║                                                                      ║
trace2333-fd-llama3-v4-v3-mkmlizer: ║     Chai Research Corp.                                              ║
trace2333-fd-llama3-v4-v3-mkmlizer: ║     Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f                 ║
trace2333-fd-llama3-v4-v3-mkmlizer: ║     Expiration: 2024-10-15 23:59:59                                  ║
trace2333-fd-llama3-v4-v3-mkmlizer: ║                                                                      ║
trace2333-fd-llama3-v4-v3-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
trace2333-fd-llama3-v4-v3-mkmlizer: Downloaded to shared memory in 46.002s
trace2333-fd-llama3-v4-v3-mkmlizer: quantizing model to /dev/shm/model_cache, profile:s0, folder:/tmp/tmpauuhj7qu, device:0
trace2333-fd-llama3-v4-v3-mkmlizer: Saving flywheel model at /dev/shm/model_cache
trace2333-fd-llama3-v4-v3-mkmlizer: quantized model in 28.675s
trace2333-fd-llama3-v4-v3-mkmlizer: Processed model Trace2333/fd_llama3_v4 in 74.677s
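For reference, the "Downloaded to shared memory" step above amounts to fetching the Trace2333/fd_llama3_v4 repository from Hugging Face into tmpfs. A minimal sketch using huggingface_hub (an assumption; the mkmlizer's actual download code is not shown in this log):

    # Sketch of the download-to-shared-memory step, assuming the weights
    # come from the Hugging Face Hub. /dev/shm is a tmpfs, so the files
    # land in RAM rather than on disk.
    import time
    from huggingface_hub import snapshot_download

    start = time.time()
    path = snapshot_download(
        repo_id="Trace2333/fd_llama3_v4",
        local_dir="/dev/shm/model_cache",
    )
    print(f"Downloaded to shared memory in {time.time() - start:.3f}s")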
trace2333-fd-llama3-v4-v3-mkmlizer: creating bucket guanaco-mkml-models
trace2333-fd-llama3-v4-v3-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
trace2333-fd-llama3-v4-v3-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/trace2333-fd-llama3-v4-v3
trace2333-fd-llama3-v4-v3-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/trace2333-fd-llama3-v4-v3/tokenizer.json
trace2333-fd-llama3-v4-v3-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/trace2333-fd-llama3-v4-v3/flywheel_model.0.safetensors
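The bucket-creation and cp lines above copy the quantized artifacts to S3. A hedged boto3 equivalent (the log does not identify which S3 client the pipeline actually uses):

    # Rough boto3 equivalent of "creating bucket" plus the two cp lines.
    import boto3

    s3 = boto3.client("s3")
    s3.create_bucket(Bucket="guanaco-mkml-models")
    for name in ("tokenizer.json", "flywheel_model.0.safetensors"):
        s3.upload_file(
            Filename=f"/dev/shm/model_cache/{name}",
            Bucket="guanaco-mkml-models",
            Key=f"trace2333-fd-llama3-v4-v3/{name}",
        )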
trace2333-fd-llama3-v4-v3-mkmlizer: loading reward model from Jellywibble/gpt2_xl_pairwise_89m_step_347634
trace2333-fd-llama3-v4-v3-mkmlizer:
Loading 0:   0%|          | 0/291 [00:00<?, ?it/s]
Loading 0:  98%|█████████▊| 286/291 [00:14<00:01,  2.61it/s]
Loading 0:  99%|█████████▉| 289/291 [00:14<00:00,  3.25it/s]
trace2333-fd-llama3-v4-v3-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py:957: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
trace2333-fd-llama3-v4-v3-mkmlizer: warnings.warn(
trace2333-fd-llama3-v4-v3-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py:785: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
trace2333-fd-llama3-v4-v3-mkmlizer: warnings.warn(
trace2333-fd-llama3-v4-v3-mkmlizer: Saving model to /tmp/reward_cache/reward.tensors
trace2333-fd-llama3-v4-v3-mkmlizer: Saving duration: 1.386s
trace2333-fd-llama3-v4-v3-mkmlizer: Processed model Jellywibble/gpt2_xl_pairwise_89m_step_347634 in 10.869s
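The reward-model step loads Jellywibble/gpt2_xl_pairwise_89m_step_347634 and serializes it to /tmp/reward_cache/reward.tensors. A sketch assuming reward.tensors is a safetensors dump of the model's state dict (the format is not documented in this log); note that tokenizer.save_pretrained emits exactly the files copied to S3 below:

    # Assumed reconstruction of the reward-model caching step.
    import os
    from safetensors.torch import save_file
    from transformers import AutoModelForSequenceClassification, AutoTokenizer

    repo = "Jellywibble/gpt2_xl_pairwise_89m_step_347634"
    model = AutoModelForSequenceClassification.from_pretrained(repo)
    tokenizer = AutoTokenizer.from_pretrained(repo)

    os.makedirs("/tmp/reward_cache", exist_ok=True)
    tokenizer.save_pretrained("/tmp/reward_cache")  # vocab.json, merges.txt, ...
    save_file(model.state_dict(), "/tmp/reward_cache/reward.tensors")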
trace2333-fd-llama3-v4-v3-mkmlizer: creating bucket guanaco-reward-models
trace2333-fd-llama3-v4-v3-mkmlizer: Bucket 's3://guanaco-reward-models/' created
trace2333-fd-llama3-v4-v3-mkmlizer: uploading /tmp/reward_cache to s3://guanaco-reward-models/trace2333-fd-llama3-v4-v3_reward
trace2333-fd-llama3-v4-v3-mkmlizer: cp /tmp/reward_cache/special_tokens_map.json s3://guanaco-reward-models/trace2333-fd-llama3-v4-v3_reward/special_tokens_map.json
trace2333-fd-llama3-v4-v3-mkmlizer: cp /tmp/reward_cache/tokenizer_config.json s3://guanaco-reward-models/trace2333-fd-llama3-v4-v3_reward/tokenizer_config.json
trace2333-fd-llama3-v4-v3-mkmlizer: cp /tmp/reward_cache/merges.txt s3://guanaco-reward-models/trace2333-fd-llama3-v4-v3_reward/merges.txt
trace2333-fd-llama3-v4-v3-mkmlizer: cp /tmp/reward_cache/vocab.json s3://guanaco-reward-models/trace2333-fd-llama3-v4-v3_reward/vocab.json
trace2333-fd-llama3-v4-v3-mkmlizer: cp /tmp/reward_cache/tokenizer.json s3://guanaco-reward-models/trace2333-fd-llama3-v4-v3_reward/tokenizer.json
trace2333-fd-llama3-v4-v3-mkmlizer: cp /tmp/reward_cache/reward.tensors s3://guanaco-reward-models/trace2333-fd-llama3-v4-v3_reward/reward.tensors
Job trace2333-fd-llama3-v4-v3-mkmlizer completed after 115.05s with status: succeeded
Stopping job with name trace2333-fd-llama3-v4-v3-mkmlizer
Pipeline stage MKMLizer completed in 115.99s
Running pipeline stage MKMLKubeTemplater
Pipeline stage MKMLKubeTemplater completed in 0.09s
Running pipeline stage ISVCDeployer
Creating inference service trace2333-fd-llama3-v4-v3
Waiting for inference service trace2333-fd-llama3-v4-v3 to be ready
Inference service trace2333-fd-llama3-v4-v3 ready after 191.40394616127014s
Pipeline stage ISVCDeployer completed in 193.26s
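The ISVCDeployer stage blocks until the inference service, apparently a KServe InferenceService (ISVC), reports Ready. A hedged polling sketch with the kubernetes Python client; the namespace and polling interval are assumptions, and the log does not show how the pipeline itself waits:

    # Poll a KServe InferenceService's Ready condition (assumed setup).
    import time
    from kubernetes import client, config

    config.load_kube_config()
    api = client.CustomObjectsApi()

    def is_ready(name: str, namespace: str = "default") -> bool:
        isvc = api.get_namespaced_custom_object(
            group="serving.kserve.io", version="v1beta1",
            namespace=namespace, plural="inferenceservices", name=name,
        )
        conditions = isvc.get("status", {}).get("conditions", [])
        return any(c["type"] == "Ready" and c["status"] == "True"
                   for c in conditions)

    start = time.time()
    while not is_ready("trace2333-fd-llama3-v4-v3"):
        time.sleep(5)
    print(f"Inference service ready after {time.time() - start}s")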
Running pipeline stage StressChecker
Received healthy response to inference request in 2.301894426345825s
Received healthy response to inference request in 1.4948816299438477s
Received healthy response to inference request in 1.44254469871521s
Received healthy response to inference request in 1.417111873626709s
Received healthy response to inference request in 1.3761625289916992s
5 requests
0 failed requests
5th percentile: 1.3843523979187011
10th percentile: 1.392542266845703
20th percentile: 1.408922004699707
30th percentile: 1.422198438644409
40th percentile: 1.4323715686798095
50th percentile: 1.44254469871521
60th percentile: 1.463479471206665
70th percentile: 1.48441424369812
80th percentile: 1.6562841892242433
90th percentile: 1.9790893077850342
95th percentile: 2.1404918670654296
99th percentile: 2.2696139144897463
mean time: 1.6065190315246582
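The StressChecker percentiles match numpy's default linear interpolation over the five response times; the snippet below reproduces every figure above, including the mean:

    # Reproduce the StressChecker statistics from the five latencies.
    import numpy as np

    times = [2.301894426345825, 1.4948816299438477, 1.44254469871521,
             1.417111873626709, 1.3761625289916992]
    for p in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99):
        print(f"{p}th percentile: {np.percentile(times, p)}")
    print(f"mean time: {np.mean(times)}")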
Pipeline stage StressChecker completed in 8.69s
trace2333-fd-llama3-v4_v3 status is now deployed due to DeploymentManager action
trace2333-fd-llama3-v4_v3 status is now inactive due to auto deactivation of underperforming models
trace2333-fd-llama3-v4_v3 status is now torndown due to DeploymentManager action