Running pipeline stage MKMLizer
Starting job with name riverise-my-first-model-v8-mkmlizer
Waiting for job on riverise-my-first-model-v8-mkmlizer to finish
Stopping job with name riverise-my-first-model-v8-mkmlizer
%s, retrying in %s seconds...
Starting job with name riverise-my-first-model-v8-mkmlizer
Waiting for job on riverise-my-first-model-v8-mkmlizer to finish
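The unformatted `%s, retrying in %s seconds...` line above is the pipeline's retry message, logged between stopping the first mkmlizer job and resubmitting it. A minimal sketch of that start/wait/stop-and-retry loop, assuming hypothetical `start_job`, `wait_for_job`, and `stop_job` callables (the real client behind these log lines is not shown):

```python
import logging
import time

def run_job_with_retries(start_job, wait_for_job, stop_job,
                         name, retries=3, delay=30):
    """Start a job and wait for it, stopping and resubmitting on transient errors."""
    for _ in range(retries):
        start_job(name)                    # "Starting job with name ..."
        try:
            return wait_for_job(name)      # "Waiting for job on ... to finish"
        except Exception as err:
            stop_job(name)                 # "Stopping job with name ..."
            logging.warning("%s, retrying in %s seconds...", err, delay)
            time.sleep(delay)
    raise RuntimeError(f"job {name} did not finish after {retries} attempts")
```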
riverise-my-first-model-v8-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
riverise-my-first-model-v8-mkmlizer: ║ _____ __ __ ║
riverise-my-first-model-v8-mkmlizer: ║ / _/ /_ ___ __/ / ___ ___ / / ║
riverise-my-first-model-v8-mkmlizer: ║ / _/ / // / |/|/ / _ \/ -_) -_) / ║
riverise-my-first-model-v8-mkmlizer: ║ /_//_/\_, /|__,__/_//_/\__/\__/_/ ║
riverise-my-first-model-v8-mkmlizer: ║ /___/ ║
riverise-my-first-model-v8-mkmlizer: ║ ║
riverise-my-first-model-v8-mkmlizer: ║ Version: 0.10.1 ║
riverise-my-first-model-v8-mkmlizer: ║ Copyright 2023 MK ONE TECHNOLOGIES Inc. ║
riverise-my-first-model-v8-mkmlizer: ║ https://mk1.ai ║
riverise-my-first-model-v8-mkmlizer: ║ ║
riverise-my-first-model-v8-mkmlizer: ║ The license key for the current software has been verified as ║
riverise-my-first-model-v8-mkmlizer: ║ belonging to: ║
riverise-my-first-model-v8-mkmlizer: ║ ║
riverise-my-first-model-v8-mkmlizer: ║ Chai Research Corp. ║
riverise-my-first-model-v8-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
riverise-my-first-model-v8-mkmlizer: ║ Expiration: 2024-10-15 23:59:59 ║
riverise-my-first-model-v8-mkmlizer: ║ ║
riverise-my-first-model-v8-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
riverise-my-first-model-v8-mkmlizer: Downloaded to shared memory in 22.015s
riverise-my-first-model-v8-mkmlizer: quantizing model to /dev/shm/model_cache, profile:s0, folder:/tmp/tmp50l2i0y7, device:0
riverise-my-first-model-v8-mkmlizer: Saving flywheel model at /dev/shm/model_cache
riverise-my-first-model-v8-mkmlizer: quantized model in 25.516s
riverise-my-first-model-v8-mkmlizer: Processed model Riverise/my-first-model in 47.530s
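The mkmlizer reports a per-phase timing for the download (22.015 s), the quantization pass (25.516 s), and the whole job (47.530 s). A rough sketch of how such timings could be collected around each phase; `download` and `quantize` here are placeholders, not the actual mkmlizer internals:

```python
import time
from contextlib import contextmanager

@contextmanager
def timed(label):
    """Print how long the wrapped phase took, mirroring the log's format."""
    start = time.perf_counter()
    yield
    print(f"{label} in {time.perf_counter() - start:.3f}s")

def process_model(download, quantize, repo="Riverise/my-first-model"):
    # download and quantize stand in for the real implementation
    with timed(f"Processed model {repo}"):
        with timed("Downloaded to shared memory"):
            tmp_folder = download(repo)
        with timed("quantized model"):
            quantize(tmp_folder, "/dev/shm/model_cache", profile="s0", device=0)
```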
riverise-my-first-model-v8-mkmlizer: creating bucket guanaco-mkml-models
riverise-my-first-model-v8-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
riverise-my-first-model-v8-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/riverise-my-first-model-v8
riverise-my-first-model-v8-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/riverise-my-first-model-v8/config.json
riverise-my-first-model-v8-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/riverise-my-first-model-v8/special_tokens_map.json
riverise-my-first-model-v8-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/riverise-my-first-model-v8/tokenizer_config.json
riverise-my-first-model-v8-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/riverise-my-first-model-v8/tokenizer.json
riverise-my-first-model-v8-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/riverise-my-first-model-v8/flywheel_model.0.safetensors
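Each `cp` line corresponds to one file in the model cache being copied into the model bucket. A sketch of the same upload using boto3 (an assumption; the log does not show which S3 client the pipeline actually uses):

```python
import os
import boto3
from botocore.exceptions import ClientError

def upload_model_cache(local_dir="/dev/shm/model_cache",
                       bucket="guanaco-mkml-models",
                       prefix="riverise-my-first-model-v8"):
    """Upload every file in the model cache to s3://<bucket>/<prefix>/."""
    s3 = boto3.client("s3")
    try:
        s3.head_bucket(Bucket=bucket)      # "creating bucket ..." only if missing
    except ClientError:
        s3.create_bucket(Bucket=bucket)
    for name in sorted(os.listdir(local_dir)):
        src = os.path.join(local_dir, name)
        if os.path.isfile(src):
            print(f"cp {src} s3://{bucket}/{prefix}/{name}")
            s3.upload_file(src, bucket, f"{prefix}/{name}")
```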
riverise-my-first-model-v8-mkmlizer:
Loading 0: 0%| | 0/291 [00:00<?, ?it/s]
Loading 0: 1%|▏ | 4/291 [00:00<00:07, 38.27it/s]
Loading 0: 5%|▌ | 16/291 [00:00<00:03, 71.66it/s]
Loading 0: 11%|█ | 31/291 [00:00<00:02, 92.68it/s]
Loading 0: 15%|█▍ | 43/291 [00:00<00:02, 92.29it/s]
Loading 0: 20%|█▉ | 58/291 [00:00<00:02, 101.15it/s]
Loading 0: 24%|██▍ | 70/291 [00:00<00:02, 97.59it/s]
Loading 0: 29%|██▊ | 83/291 [00:01<00:07, 28.41it/s]
Loading 0: 31%|███ | 90/291 [00:01<00:06, 32.02it/s]
Loading 0: 35%|███▌ | 103/291 [00:02<00:04, 41.39it/s]
Loading 0: 40%|███▉ | 115/291 [00:02<00:03, 49.32it/s]
Loading 0: 43%|████▎ | 124/291 [00:02<00:03, 55.36it/s]
Loading 0: 48%|████▊ | 139/291 [00:02<00:02, 68.03it/s]
Loading 0: 52%|█████▏ | 151/291 [00:02<00:01, 74.02it/s]
Loading 0: 57%|█████▋ | 166/291 [00:02<00:01, 83.93it/s]
Loading 0: 61%|██████ | 177/291 [00:02<00:01, 89.49it/s]
Loading 0: 65%|██████▍ | 188/291 [00:04<00:03, 25.95it/s]
Loading 0: 69%|██████▉ | 202/291 [00:04<00:02, 34.49it/s]
Loading 0: 74%|███████▎ | 214/291 [00:04<00:01, 42.10it/s]
Loading 0: 79%|███████▊ | 229/291 [00:04<00:01, 52.20it/s]
Loading 0: 83%|████████▎ | 241/291 [00:04<00:00, 58.75it/s]
Loading 0: 88%|████████▊ | 256/291 [00:04<00:00, 70.16it/s]
Loading 0: 92%|█████████▏| 268/291 [00:04<00:00, 73.85it/s]
Loading 0: 97%|█████████▋| 283/291 [00:05<00:00, 82.56it/s]
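The `Loading 0:` lines are a single tqdm-style progress bar (flattened into the log) over the 291 tensors in the quantized checkpoint. A minimal sketch of producing that kind of bar while reading the shard; using the safetensors library here is an assumption based on the `flywheel_model.0.safetensors` filename:

```python
from safetensors import safe_open
from tqdm import tqdm

def load_shard(path="/dev/shm/model_cache/flywheel_model.0.safetensors"):
    """Read every tensor in the shard, reporting progress as 'Loading 0: ...'."""
    tensors = {}
    with safe_open(path, framework="pt", device="cpu") as f:
        for key in tqdm(list(f.keys()), desc="Loading 0"):
            tensors[key] = f.get_tensor(key)
    return tensors
```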
Job riverise-my-first-model-v8-mkmlizer completed after 74.01s with status: succeeded
Stopping job with name riverise-my-first-model-v8-mkmlizer
Pipeline stage MKMLizer completed in 75.39s
Running pipeline stage MKMLKubeTemplater
Pipeline stage MKMLKubeTemplater completed in 0.09s
Running pipeline stage ISVCDeployer
Creating inference service riverise-my-first-model-v8
Waiting for inference service riverise-my-first-model-v8 to be ready
Connection pool is full, discarding connection: %s. Connection pool size: %s
Inference service riverise-my-first-model-v8 ready after 181.4129354953766s
Pipeline stage ISVCDeployer completed in 182.05s
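Deployment blocks until the new inference service reports ready (here after roughly 181 s); the urllib3 "Connection pool is full" warning above is emitted repeatedly while the service is polled. A generic readiness poll might look like the sketch below; the health URL and the use of `requests` are assumptions, not taken from the log:

```python
import time
import requests

def wait_until_ready(url, timeout=600, interval=5):
    """Poll a health endpoint until it answers HTTP 200 or the timeout expires."""
    start = time.time()
    while time.time() - start < timeout:
        try:
            if requests.get(url, timeout=5).status_code == 200:
                print(f"Inference service ready after {time.time() - start:.1f}s")
                return
        except requests.RequestException:
            pass                           # not reachable yet; keep polling
        time.sleep(interval)
    raise TimeoutError(f"service at {url} was not ready within {timeout}s")
```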
Running pipeline stage StressChecker
Received healthy response to inference request in 1.9636814594268799s
Received healthy response to inference request in 1.6188766956329346s
Received healthy response to inference request in 1.8606417179107666s
Received healthy response to inference request in 1.5701227188110352s
Received healthy response to inference request in 1.4936089515686035s
5 requests
0 failed requests
5th percentile: 1.50891170501709
10th percentile: 1.5242144584655761
20th percentile: 1.5548199653625487
30th percentile: 1.579873514175415
40th percentile: 1.599375104904175
50th percentile: 1.6188766956329346
60th percentile: 1.7155827045440675
70th percentile: 1.8122887134552002
80th percentile: 1.8812496662139893
90th percentile: 1.9224655628204346
95th percentile: 1.9430735111236572
99th percentile: 1.9595598697662353
mean time: 1.701386308670044
Pipeline stage StressChecker completed in 9.71s
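The StressChecker summary is just order statistics over the five healthy response times; numpy's default linear-interpolation percentiles reproduce the reported figures, as this sketch shows:

```python
import numpy as np

# the five healthy response times reported above, in seconds
latencies = np.array([
    1.9636814594268799,
    1.6188766956329346,
    1.8606417179107666,
    1.5701227188110352,
    1.4936089515686035,
])

print(f"{len(latencies)} requests")
for p in (5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99):
    print(f"{p}th percentile: {np.percentile(latencies, p)}")
print(f"mean time: {latencies.mean()}")
```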
riverise-my-first-model_v8 status is now deployed due to DeploymentManager action
Connection pool is full, discarding connection: %s. Connection pool size: %s
riverise-my-first-model_v8 status is now inactive due to auto deactivation of underperforming models
riverise-my-first-model_v8 status is now torndown due to DeploymentManager action