Running pipeline stage MKMLizer
Starting job with name riverise-my-second-model-v1-mkmlizer
Waiting for job on riverise-my-second-model-v1-mkmlizer to finish
Stopping job with name riverise-my-second-model-v1-mkmlizer
%s, retrying in %s seconds...
Starting job with name riverise-my-second-model-v1-mkmlizer
Waiting for job on riverise-my-second-model-v1-mkmlizer to finish
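The start / wait / stop / retry sequence above can be captured by a simple retry loop with a fixed delay. A minimal sketch, assuming hypothetical `start_job`, `wait_for_job`, and `stop_job` helpers (none of these names come from the log):

```python
import time

def run_job_with_retries(start_job, wait_for_job, stop_job,
                         name, retries=3, delay=5):
    """Start a job and wait for it to finish; on failure, stop it and
    retry after a fixed delay, mirroring the
    '%s, retrying in %s seconds...' pattern in the log.
    All three callables are hypothetical stand-ins."""
    for attempt in range(retries):
        print(f"Starting job with name {name}")
        start_job(name)
        try:
            print(f"Waiting for job on {name} to finish")
            return wait_for_job(name)
        except Exception as err:
            print(f"Stopping job with name {name}")
            stop_job(name)
            if attempt == retries - 1:
                raise
            print(f"{err}, retrying in {delay} seconds...")
            time.sleep(delay)
```

With `delay=0` this can be exercised against a job that fails once and then succeeds, which reproduces the stop-and-restart pair seen above.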
Connection pool is full, discarding connection: %s. Connection pool size: %s
riverise-my-second-model-v1-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
riverise-my-second-model-v1-mkmlizer: ║ _____ __ __ ║
riverise-my-second-model-v1-mkmlizer: ║ / _/ /_ ___ __/ / ___ ___ / / ║
riverise-my-second-model-v1-mkmlizer: ║ / _/ / // / |/|/ / _ \/ -_) -_) / ║
riverise-my-second-model-v1-mkmlizer: ║ /_//_/\_, /|__,__/_//_/\__/\__/_/ ║
riverise-my-second-model-v1-mkmlizer: ║ /___/ ║
riverise-my-second-model-v1-mkmlizer: ║ ║
riverise-my-second-model-v1-mkmlizer: ║ Version: 0.10.1 ║
riverise-my-second-model-v1-mkmlizer: ║ Copyright 2023 MK ONE TECHNOLOGIES Inc. ║
riverise-my-second-model-v1-mkmlizer: ║ https://mk1.ai ║
riverise-my-second-model-v1-mkmlizer: ║ ║
riverise-my-second-model-v1-mkmlizer: ║ The license key for the current software has been verified as ║
riverise-my-second-model-v1-mkmlizer: ║ belonging to: ║
riverise-my-second-model-v1-mkmlizer: ║ ║
riverise-my-second-model-v1-mkmlizer: ║ Chai Research Corp. ║
riverise-my-second-model-v1-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
riverise-my-second-model-v1-mkmlizer: ║ Expiration: 2024-10-15 23:59:59 ║
riverise-my-second-model-v1-mkmlizer: ║ ║
riverise-my-second-model-v1-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
riverise-my-second-model-v1-mkmlizer: Downloaded to shared memory in 33.639s
riverise-my-second-model-v1-mkmlizer: quantizing model to /dev/shm/model_cache, profile:s0, folder:/tmp/tmpg8ip6z6p, device:0
riverise-my-second-model-v1-mkmlizer: Saving flywheel model at /dev/shm/model_cache
riverise-my-second-model-v1-mkmlizer: quantized model in 26.042s
riverise-my-second-model-v1-mkmlizer: Processed model Riverise/my-second-model in 59.681s
riverise-my-second-model-v1-mkmlizer: creating bucket guanaco-mkml-models
riverise-my-second-model-v1-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
riverise-my-second-model-v1-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/riverise-my-second-model-v1
riverise-my-second-model-v1-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/riverise-my-second-model-v1/config.json
riverise-my-second-model-v1-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/riverise-my-second-model-v1/special_tokens_map.json
riverise-my-second-model-v1-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/riverise-my-second-model-v1/tokenizer_config.json
riverise-my-second-model-v1-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/riverise-my-second-model-v1/tokenizer.json
riverise-my-second-model-v1-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/riverise-my-second-model-v1/flywheel_model.0.safetensors
riverise-my-second-model-v1-mkmlizer:
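The upload step copies each file in the shared-memory cache to a matching key under the bucket prefix. A sketch of that local-path-to-S3-URI mapping, using paths taken from the log (the helper function itself is illustrative, not part of the tool):

```python
import os

def s3_copy_pairs(cache_dir, bucket_prefix, filenames):
    """Map each cached file to its destination S3 URI, mirroring the
    'cp <local> <s3-uri>' lines emitted during the upload step."""
    prefix = bucket_prefix.rstrip("/")
    return [(os.path.join(cache_dir, name), f"{prefix}/{name}")
            for name in filenames]

pairs = s3_copy_pairs(
    "/dev/shm/model_cache",
    "s3://guanaco-mkml-models/riverise-my-second-model-v1",
    ["config.json", "special_tokens_map.json", "tokenizer_config.json",
     "tokenizer.json", "flywheel_model.0.safetensors"],
)
```

Each pair corresponds to one `cp` line above; the actual transfer would be done by whatever S3 client the pipeline uses.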
Loading 0: 0%| | 0/291 [00:00<?, ?it/s]
Loading 0: 97%|█████████▋| 282/291 [00:05<00:00, 93.66it/s]
Job riverise-my-second-model-v1-mkmlizer completed after 85.17s with status: succeeded
Stopping job with name riverise-my-second-model-v1-mkmlizer
Pipeline stage MKMLizer completed in 87.09s
Running pipeline stage MKMLKubeTemplater
Pipeline stage MKMLKubeTemplater completed in 0.09s
Running pipeline stage ISVCDeployer
Creating inference service riverise-my-second-model-v1
Waiting for inference service riverise-my-second-model-v1 to be ready
Inference service riverise-my-second-model-v1 ready after 181.76s
Pipeline stage ISVCDeployer completed in 182.64s
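The "waiting for inference service to be ready" step amounts to polling a readiness probe until it succeeds or a timeout elapses. A minimal sketch, assuming a hypothetical `is_ready` probe (the injectable `clock`/`sleep` parameters are there only to make the loop testable):

```python
import time

def wait_until_ready(is_ready, timeout=600.0, interval=5.0,
                     clock=time.monotonic, sleep=time.sleep):
    """Poll a readiness probe every `interval` seconds until it returns
    True, returning the elapsed wait; raise if `timeout` elapses first.
    `is_ready` is a hypothetical stand-in for the real health check."""
    start = clock()
    while True:
        if is_ready():
            return clock() - start
        if clock() - start >= timeout:
            raise TimeoutError("inference service not ready in time")
        sleep(interval)
```

The returned elapsed time corresponds to the "ready after ...s" figure reported above.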
Running pipeline stage StressChecker
Received healthy response to inference request in 4.379s
Received healthy response to inference request in 1.219s
Received healthy response to inference request in 1.204s
Received healthy response to inference request in 2.544s
Received healthy response to inference request in 1.421s
5 requests
0 failed requests
5th percentile: 1.207s
10th percentile: 1.210s
20th percentile: 1.216s
30th percentile: 1.259s
40th percentile: 1.340s
50th percentile: 1.421s
60th percentile: 1.870s
70th percentile: 2.319s
80th percentile: 2.911s
90th percentile: 3.645s
95th percentile: 4.012s
99th percentile: 4.306s
mean time: 2.153s
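The stress-check statistics are consistent with linear-interpolation percentiles (numpy's default "linear" method) computed over the five response times. A pure-Python sketch that reproduces them:

```python
def percentile(values, p):
    """Percentile by linear interpolation between closest ranks
    (the same method as numpy.percentile's default)."""
    xs = sorted(values)
    k = (len(xs) - 1) * p / 100.0
    lo = int(k)
    hi = min(lo + 1, len(xs) - 1)
    return xs[lo] + (xs[hi] - xs[lo]) * (k - lo)

# Response times (seconds) of the five stress-check requests above.
times = [4.379415273666382, 1.2190375328063965, 1.2043561935424805,
         2.5437023639678955, 1.420515537261963]
```

With these inputs, `percentile(times, 50)` gives the median (~1.4205s), `percentile(times, 95)` gives ~4.012s, and `sum(times) / len(times)` gives the reported mean of ~2.1534s.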
Pipeline stage StressChecker completed in 11.65s
riverise-my-second-model_v1 status is now deployed due to DeploymentManager action
riverise-my-second-model_v1 status is now inactive due to auto deactivation (removal of underperforming models)
Shutdown handler not registered because Python interpreter is not running in the main thread
admin requested tearing down of riverise-my-second-model_v1
run pipeline %s
Shutdown handler not registered because Python interpreter is not running in the main thread
run pipeline stage %s
run pipeline %s
Running pipeline stage MKMLDeleter
run pipeline stage %s
Running pipeline stage MKMLDeleter
%s, retrying in %s seconds...
%s, retrying in %s seconds...
clean up pipeline due to error=%s
Shutdown handler de-registered
riverise-my-second-model_v1 status is now torndown due to DeploymentManager action