Running pipeline stage MKMLizer
Starting job with name zonemercy-cogent-nemo-v1-4327-v5-mkmlizer
Waiting for job on zonemercy-cogent-nemo-v1-4327-v5-mkmlizer to finish
zonemercy-cogent-nemo-v1-4327-v5-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
zonemercy-cogent-nemo-v1-4327-v5-mkmlizer: ║ _____ __ __ ║
zonemercy-cogent-nemo-v1-4327-v5-mkmlizer: ║ / _/ /_ ___ __/ / ___ ___ / / ║
zonemercy-cogent-nemo-v1-4327-v5-mkmlizer: ║ / _/ / // / |/|/ / _ \/ -_) -_) / ║
zonemercy-cogent-nemo-v1-4327-v5-mkmlizer: ║ /_//_/\_, /|__,__/_//_/\__/\__/_/ ║
zonemercy-cogent-nemo-v1-4327-v5-mkmlizer: ║ /___/ ║
zonemercy-cogent-nemo-v1-4327-v5-mkmlizer: ║ ║
zonemercy-cogent-nemo-v1-4327-v5-mkmlizer: ║ Version: 0.9.9 ║
zonemercy-cogent-nemo-v1-4327-v5-mkmlizer: ║ Copyright 2023 MK ONE TECHNOLOGIES Inc. ║
zonemercy-cogent-nemo-v1-4327-v5-mkmlizer: ║ https://mk1.ai ║
zonemercy-cogent-nemo-v1-4327-v5-mkmlizer: ║ ║
zonemercy-cogent-nemo-v1-4327-v5-mkmlizer: ║ The license key for the current software has been verified as ║
zonemercy-cogent-nemo-v1-4327-v5-mkmlizer: ║ belonging to: ║
zonemercy-cogent-nemo-v1-4327-v5-mkmlizer: ║ ║
zonemercy-cogent-nemo-v1-4327-v5-mkmlizer: ║ Chai Research Corp. ║
zonemercy-cogent-nemo-v1-4327-v5-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
zonemercy-cogent-nemo-v1-4327-v5-mkmlizer: ║ Expiration: 2024-10-15 23:59:59 ║
zonemercy-cogent-nemo-v1-4327-v5-mkmlizer: ║ ║
zonemercy-cogent-nemo-v1-4327-v5-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
zonemercy-cogent-nemo-v1-4327-v5-mkmlizer: Downloaded to shared memory in 111.503s
zonemercy-cogent-nemo-v1-4327-v5-mkmlizer: quantizing model to /dev/shm/model_cache, profile:s0, folder:/tmp/tmpe_2ygk6j, device:0
zonemercy-cogent-nemo-v1-4327-v5-mkmlizer: Saving flywheel model at /dev/shm/model_cache
zonemercy-cogent-nemo-v1-4327-v5-mkmlizer: quantized model in 42.607s
zonemercy-cogent-nemo-v1-4327-v5-mkmlizer: Processed model zonemercy/Cogent-Nemo-v1-1k1e5-ep1 in 154.110s
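For reference, the two stage timings reported above account for the whole per-model processing time; a quick check with the values copied from the log:

```python
# Timings copied from the mkmlizer log above (seconds).
download_s = 111.503  # "Downloaded to shared memory in 111.503s"
quantize_s = 42.607   # "quantized model in 42.607s"
total_s = 154.110     # "Processed model zonemercy/Cogent-Nemo-v1-1k1e5-ep1 in 154.110s"

# Download plus quantization accounts for the full reported total.
print(f"{download_s + quantize_s:.3f}s of {total_s:.3f}s")  # 154.110s of 154.110s
```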
zonemercy-cogent-nemo-v1-4327-v5-mkmlizer: creating bucket guanaco-mkml-models
zonemercy-cogent-nemo-v1-4327-v5-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
zonemercy-cogent-nemo-v1-4327-v5-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/zonemercy-cogent-nemo-v1-4327-v5
zonemercy-cogent-nemo-v1-4327-v5-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/zonemercy-cogent-nemo-v1-4327-v5/special_tokens_map.json
zonemercy-cogent-nemo-v1-4327-v5-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/zonemercy-cogent-nemo-v1-4327-v5/config.json
zonemercy-cogent-nemo-v1-4327-v5-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/zonemercy-cogent-nemo-v1-4327-v5/tokenizer_config.json
zonemercy-cogent-nemo-v1-4327-v5-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/zonemercy-cogent-nemo-v1-4327-v5/tokenizer.json
zonemercy-cogent-nemo-v1-4327-v5-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/zonemercy-cogent-nemo-v1-4327-v5/flywheel_model.0.safetensors
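The cp lines above mirror the quantized cache into s3://guanaco-mkml-models/zonemercy-cogent-nemo-v1-4327-v5. The log does not show which upload tool performs them; below is a minimal boto3 sketch of the same copies, assuming S3 credentials are already configured in the environment.

```python
import os
import boto3

# Sketch only: bucket, prefix, and file names are taken from the log above;
# boto3 is an assumption, since the pipeline's actual upload tool is not named.
s3 = boto3.client("s3")
local_dir = "/dev/shm/model_cache"
bucket = "guanaco-mkml-models"
prefix = "zonemercy-cogent-nemo-v1-4327-v5"

for name in ("special_tokens_map.json", "config.json", "tokenizer_config.json",
             "tokenizer.json", "flywheel_model.0.safetensors"):
    s3.upload_file(os.path.join(local_dir, name), bucket, f"{prefix}/{name}")
```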
zonemercy-cogent-nemo-v1-4327-v5-mkmlizer: loading reward model from ChaiML/gpt2_xl_pairwise_89m_step_347634
zonemercy-cogent-nemo-v1-4327-v5-mkmlizer:
Loading 0:   0%|          | 0/363 [00:00<?, ?it/s]
Loading 0: 100%|█████████▉| 362/363 [00:21<00:00,  6.33it/s]
zonemercy-cogent-nemo-v1-4327-v5-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py:957: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
zonemercy-cogent-nemo-v1-4327-v5-mkmlizer: warnings.warn(
zonemercy-cogent-nemo-v1-4327-v5-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py:785: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
zonemercy-cogent-nemo-v1-4327-v5-mkmlizer: warnings.warn(
zonemercy-cogent-nemo-v1-4327-v5-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py:469: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
zonemercy-cogent-nemo-v1-4327-v5-mkmlizer: warnings.warn(
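The three FutureWarnings above come from passing `use_auth_token` to Transformers' auto classes; current releases expect `token` instead. A minimal sketch of the updated call, using a placeholder HF_TOKEN environment variable (AutoModel here only illustrates the argument change, not the pipeline's actual loading code):

```python
import os
from transformers import AutoTokenizer, AutoModel

# Placeholder credential; not taken from the log.
hf_token = os.environ.get("HF_TOKEN")

# Passing `token=` instead of the deprecated `use_auth_token=` avoids the
# FutureWarning shown in the log.
tokenizer = AutoTokenizer.from_pretrained(
    "ChaiML/gpt2_xl_pairwise_89m_step_347634", token=hf_token
)
model = AutoModel.from_pretrained(
    "ChaiML/gpt2_xl_pairwise_89m_step_347634", token=hf_token
)
```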
zonemercy-cogent-nemo-v1-4327-v5-mkmlizer:
Downloading shards:   0%|          | 0/2 [00:00<?, ?it/s]
Downloading shards: 100%|██████████| 2/2 [00:13<00:00,  6.67s/it]
zonemercy-cogent-nemo-v1-4327-v5-mkmlizer:
Loading checkpoint shards:   0%|          | 0/2 [00:00<?, ?it/s]
Loading checkpoint shards: 100%|██████████| 2/2 [00:00<00:00,  3.61it/s]
zonemercy-cogent-nemo-v1-4327-v5-mkmlizer: Saving model to /tmp/reward_cache/reward.tensors
zonemercy-cogent-nemo-v1-4327-v5-mkmlizer: Saving duration: 1.417s
zonemercy-cogent-nemo-v1-4327-v5-mkmlizer: Processed model ChaiML/gpt2_xl_pairwise_89m_step_347634 in 18.359s
zonemercy-cogent-nemo-v1-4327-v5-mkmlizer: creating bucket guanaco-reward-models
zonemercy-cogent-nemo-v1-4327-v5-mkmlizer: Bucket 's3://guanaco-reward-models/' created
zonemercy-cogent-nemo-v1-4327-v5-mkmlizer: uploading /tmp/reward_cache to s3://guanaco-reward-models/zonemercy-cogent-nemo-v1-4327-v5_reward
zonemercy-cogent-nemo-v1-4327-v5-mkmlizer: cp /tmp/reward_cache/config.json s3://guanaco-reward-models/zonemercy-cogent-nemo-v1-4327-v5_reward/config.json
zonemercy-cogent-nemo-v1-4327-v5-mkmlizer: cp /tmp/reward_cache/tokenizer_config.json s3://guanaco-reward-models/zonemercy-cogent-nemo-v1-4327-v5_reward/tokenizer_config.json
zonemercy-cogent-nemo-v1-4327-v5-mkmlizer: cp /tmp/reward_cache/special_tokens_map.json s3://guanaco-reward-models/zonemercy-cogent-nemo-v1-4327-v5_reward/special_tokens_map.json
zonemercy-cogent-nemo-v1-4327-v5-mkmlizer: cp /tmp/reward_cache/merges.txt s3://guanaco-reward-models/zonemercy-cogent-nemo-v1-4327-v5_reward/merges.txt
zonemercy-cogent-nemo-v1-4327-v5-mkmlizer: cp /tmp/reward_cache/vocab.json s3://guanaco-reward-models/zonemercy-cogent-nemo-v1-4327-v5_reward/vocab.json
zonemercy-cogent-nemo-v1-4327-v5-mkmlizer: cp /tmp/reward_cache/tokenizer.json s3://guanaco-reward-models/zonemercy-cogent-nemo-v1-4327-v5_reward/tokenizer.json
zonemercy-cogent-nemo-v1-4327-v5-mkmlizer: cp /tmp/reward_cache/reward.tensors s3://guanaco-reward-models/zonemercy-cogent-nemo-v1-4327-v5_reward/reward.tensors
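As a sanity check (not part of the logged pipeline), the uploaded reward artifacts can be listed back from the bucket with boto3, assuming the same credentials:

```python
import boto3

# Bucket and prefix taken from the cp lines above; boto3 is an assumption.
s3 = boto3.client("s3")
resp = s3.list_objects_v2(
    Bucket="guanaco-reward-models",
    Prefix="zonemercy-cogent-nemo-v1-4327-v5_reward/",
)
for obj in resp.get("Contents", []):
    print(obj["Key"], obj["Size"])
```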
Job zonemercy-cogent-nemo-v1-4327-v5-mkmlizer completed after 218.62s with status: succeeded
Stopping job with name zonemercy-cogent-nemo-v1-4327-v5-mkmlizer
Pipeline stage MKMLizer completed in 219.14s
Running pipeline stage MKMLKubeTemplater
Pipeline stage MKMLKubeTemplater completed in 0.11s
Running pipeline stage ISVCDeployer
Creating inference service zonemercy-cogent-nemo-v1-4327-v5
Waiting for inference service zonemercy-cogent-nemo-v1-4327-v5 to be ready
Inference service zonemercy-cogent-nemo-v1-4327-v5 ready after 211.31155729293823s
Pipeline stage ISVCDeployer completed in 212.29s
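"ISVC" suggests a KServe InferenceService; the sketch below shows how the readiness wait could be reproduced with the Kubernetes Python client. The group/version/plural follow the standard KServe CRD, and the namespace is a placeholder; the log reveals neither.

```python
import time
from kubernetes import client, config

config.load_kube_config()
api = client.CustomObjectsApi()

def wait_for_isvc(name, namespace="default", timeout_s=600):
    """Poll the InferenceService until its Ready condition is True."""
    deadline = time.time() + timeout_s
    while time.time() < deadline:
        isvc = api.get_namespaced_custom_object(
            group="serving.kserve.io", version="v1beta1",
            namespace=namespace, plural="inferenceservices", name=name,
        )
        conditions = isvc.get("status", {}).get("conditions", [])
        if any(c.get("type") == "Ready" and c.get("status") == "True"
               for c in conditions):
            return True
        time.sleep(5)
    return False

wait_for_isvc("zonemercy-cogent-nemo-v1-4327-v5")
```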
Running pipeline stage StressChecker
Received healthy response to inference request in 2.5534472465515137s
Received healthy response to inference request in 1.747460126876831s
Received healthy response to inference request in 1.7521710395812988s
Received healthy response to inference request in 1.723402976989746s
Received healthy response to inference request in 1.7710633277893066s
5 requests
0 failed requests
5th percentile: 1.728214406967163
10th percentile: 1.7330258369445801
20th percentile: 1.7426486968994142
30th percentile: 1.7484023094177246
40th percentile: 1.7502866744995118
50th percentile: 1.7521710395812988
60th percentile: 1.759727954864502
70th percentile: 1.767284870147705
80th percentile: 1.927540111541748
90th percentile: 2.2404936790466308
95th percentile: 2.396970462799072
99th percentile: 2.5221518898010253
mean time: 1.9095089435577393
Pipeline stage StressChecker completed in 10.42s
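The percentile and mean figures above are reproducible from the five logged response times with numpy's default linear interpolation (whether StressChecker computes them exactly this way is an assumption):

```python
import numpy as np

# The five healthy response times logged by the stress check, in seconds.
latencies = [
    2.5534472465515137,
    1.747460126876831,
    1.7521710395812988,
    1.723402976989746,
    1.7710633277893066,
]

# These reproduce the logged figures, e.g. 90th percentile ~= 2.2405s
# and mean ~= 1.9095s.
for p in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99):
    print(f"{p}th percentile: {np.percentile(latencies, p)}")
print(f"mean time: {np.mean(latencies)}")
```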
zonemercy-cogent-nemo-v1_4327_v5 status is now deployed due to DeploymentManager action
zonemercy-cogent-nemo-v1_4327_v5 status is now inactive due to auto deactivation of underperforming models
zonemercy-cogent-nemo-v1_4327_v5 status is now torndown due to DeploymentManager action