Running pipeline stage MKMLizer
Starting job with name jic062-instruct-v12-mkmlizer
Waiting for job on jic062-instruct-v12-mkmlizer to finish
jic062-instruct-v12-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
jic062-instruct-v12-mkmlizer: ║ _____ __ __ ║
jic062-instruct-v12-mkmlizer: ║ / _/ /_ ___ __/ / ___ ___ / / ║
jic062-instruct-v12-mkmlizer: ║ / _/ / // / |/|/ / _ \/ -_) -_) / ║
jic062-instruct-v12-mkmlizer: ║ /_//_/\_, /|__,__/_//_/\__/\__/_/ ║
jic062-instruct-v12-mkmlizer: ║ /___/ ║
jic062-instruct-v12-mkmlizer: ║ ║
jic062-instruct-v12-mkmlizer: ║ Version: 0.9.7 ║
jic062-instruct-v12-mkmlizer: ║ Copyright 2023 MK ONE TECHNOLOGIES Inc. ║
jic062-instruct-v12-mkmlizer: ║ https://mk1.ai ║
jic062-instruct-v12-mkmlizer: ║ ║
jic062-instruct-v12-mkmlizer: ║ The license key for the current software has been verified as ║
jic062-instruct-v12-mkmlizer: ║ belonging to: ║
jic062-instruct-v12-mkmlizer: ║ ║
jic062-instruct-v12-mkmlizer: ║ Chai Research Corp. ║
jic062-instruct-v12-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
jic062-instruct-v12-mkmlizer: ║ Expiration: 2024-10-15 23:59:59 ║
jic062-instruct-v12-mkmlizer: ║ ║
jic062-instruct-v12-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
jic062-instruct-v12-mkmlizer: Downloaded to shared memory in 31.104s
jic062-instruct-v12-mkmlizer: quantizing model to /dev/shm/model_cache, profile:s0, folder:/tmp/tmpxw6q5m6c, device:0
jic062-instruct-v12-mkmlizer: Saving flywheel model at /dev/shm/model_cache
jic062-instruct-v12-mkmlizer: creating bucket guanaco-mkml-models
jic062-instruct-v12-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
jic062-instruct-v12-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/jic062-instruct-v12
jic062-instruct-v12-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/jic062-instruct-v12/config.json
jic062-instruct-v12-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/jic062-instruct-v12/special_tokens_map.json
jic062-instruct-v12-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/jic062-instruct-v12/tokenizer_config.json
jic062-instruct-v12-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/jic062-instruct-v12/tokenizer.json
jic062-instruct-v12-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/jic062-instruct-v12/flywheel_model.0.safetensors
jic062-instruct-v12-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py:785: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
jic062-instruct-v12-mkmlizer: warnings.warn(
jic062-instruct-v12-mkmlizer: /opt/conda/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py:469: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
jic062-instruct-v12-mkmlizer: warnings.warn(
jic062-instruct-v12-mkmlizer: Saving model to /tmp/reward_cache/reward.tensors
jic062-instruct-v12-mkmlizer: Saving duration: 1.410s
jic062-instruct-v12-mkmlizer: Processed model Jellywibble/gpt2_xl_pairwise_89m_step_347634 in 11.544s
jic062-instruct-v12-mkmlizer: creating bucket guanaco-reward-models
jic062-instruct-v12-mkmlizer: Bucket 's3://guanaco-reward-models/' created
jic062-instruct-v12-mkmlizer: uploading /tmp/reward_cache to s3://guanaco-reward-models/jic062-instruct-v12_reward
jic062-instruct-v12-mkmlizer: cp /tmp/reward_cache/config.json s3://guanaco-reward-models/jic062-instruct-v12_reward/config.json
jic062-instruct-v12-mkmlizer: cp /tmp/reward_cache/special_tokens_map.json s3://guanaco-reward-models/jic062-instruct-v12_reward/special_tokens_map.json
jic062-instruct-v12-mkmlizer: cp /tmp/reward_cache/tokenizer_config.json s3://guanaco-reward-models/jic062-instruct-v12_reward/tokenizer_config.json
jic062-instruct-v12-mkmlizer: cp /tmp/reward_cache/merges.txt s3://guanaco-reward-models/jic062-instruct-v12_reward/merges.txt
jic062-instruct-v12-mkmlizer: cp /tmp/reward_cache/vocab.json s3://guanaco-reward-models/jic062-instruct-v12_reward/vocab.json
jic062-instruct-v12-mkmlizer: cp /tmp/reward_cache/tokenizer.json s3://guanaco-reward-models/jic062-instruct-v12_reward/tokenizer.json
jic062-instruct-v12-mkmlizer: cp /tmp/reward_cache/reward.tensors s3://guanaco-reward-models/jic062-instruct-v12_reward/reward.tensors
Job jic062-instruct-v12-mkmlizer completed after 106.07s with status: succeeded
Stopping job with name jic062-instruct-v12-mkmlizer
Pipeline stage MKMLizer completed in 107.62s
Running pipeline stage MKMLKubeTemplater
Pipeline stage MKMLKubeTemplater completed in 0.10s
Running pipeline stage ISVCDeployer
Creating inference service jic062-instruct-v12
Waiting for inference service jic062-instruct-v12 to be ready
Inference service jic062-instruct-v12 ready after 121.320077419281s
Pipeline stage ISVCDeployer completed in 123.02s
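The ISVCDeployer step above creates the inference service, then blocks until readiness and reports the elapsed time (about 121.3 s here). A generic polling loop in that spirit might look like the following sketch; `check` is a stand-in for whatever readiness probe the deployer actually uses (e.g. a KServe status or health-endpoint check), not the pipeline's real code:

```python
import time


def wait_for_ready(check, timeout_s=600.0, interval_s=5.0,
                   clock=time.monotonic, sleep=time.sleep):
    """Poll check() until it returns True; return the elapsed seconds.

    check      -- caller-supplied readiness probe returning bool
    timeout_s  -- raise TimeoutError if not ready within this many seconds
    interval_s -- delay between probes
    """
    start = clock()
    while not check():
        if clock() - start > timeout_s:
            raise TimeoutError("inference service not ready within timeout")
        sleep(interval_s)
    return clock() - start
```

With `check` wired to the service's health endpoint, this would produce the "Waiting for inference service ... to be ready" / "ready after N s" pair of log lines above.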
Running pipeline stage StressChecker
Received healthy response to inference request in 2.399700880050659s
Received healthy response to inference request in 1.4939696788787842s
Received healthy response to inference request in 1.4657089710235596s
Received healthy response to inference request in 1.4300644397735596s
Received healthy response to inference request in 1.4821171760559082s
5 requests
0 failed requests
5th percentile: 1.4371933460235595
10th percentile: 1.4443222522735595
20th percentile: 1.4585800647735596
30th percentile: 1.4689906120300293
40th percentile: 1.4755538940429687
50th percentile: 1.4821171760559082
60th percentile: 1.4868581771850586
70th percentile: 1.491599178314209
80th percentile: 1.6751159191131593
90th percentile: 2.0374083995819094
95th percentile: 2.218554639816284
99th percentile: 2.363471632003784
mean time: 1.6543122291564942
Pipeline stage StressChecker completed in 9.07s
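The percentile figures reported by the StressChecker are consistent with the standard linear-interpolation quantile (numpy's default "linear" method) applied to the five sorted response times. A minimal sketch reproducing them, assuming that method (not necessarily the StressChecker's actual implementation):

```python
def percentile(samples, p):
    """Linear-interpolation percentile over a small sample (numpy's default method)."""
    xs = sorted(samples)
    k = (len(xs) - 1) * p / 100.0   # fractional rank
    lo = int(k)
    hi = min(lo + 1, len(xs) - 1)
    return xs[lo] + (k - lo) * (xs[hi] - xs[lo])


# The five healthy-response times from the log, in seconds.
times = [2.399700880050659, 1.4939696788787842, 1.4657089710235596,
         1.4300644397735596, 1.4821171760559082]

print(percentile(times, 50))    # ~1.4821 s, matching the logged 50th percentile
print(percentile(times, 95))    # ~2.2186 s, matching the logged 95th percentile
print(sum(times) / len(times))  # ~1.6543 s, matching the logged mean time
```

Note that with only 5 samples the upper percentiles are dominated by the single 2.40 s outlier, which is why the 80th-99th percentile values jump well above the median.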
jic062-instruct_v12 status is now deployed due to DeploymentManager action
jic062-instruct_v12 status is now inactive due to auto deactivation of underperforming models
admin requested tearing down of jic062-instruct_v12
Running pipeline stage ISVCDeleter
Checking if service jic062-instruct-v12 is running
Tearing down inference service jic062-instruct-v12
Service jic062-instruct-v12 has been torn down
Pipeline stage ISVCDeleter completed in 4.50s
Running pipeline stage MKMLModelDeleter
Cleaning model data from S3
Cleaning model data from model cache
Deleting key jic062-instruct-v12/config.json from bucket guanaco-mkml-models
Deleting key jic062-instruct-v12/flywheel_model.0.safetensors from bucket guanaco-mkml-models
Deleting key jic062-instruct-v12/special_tokens_map.json from bucket guanaco-mkml-models
Deleting key jic062-instruct-v12/tokenizer.json from bucket guanaco-mkml-models
Deleting key jic062-instruct-v12/tokenizer_config.json from bucket guanaco-mkml-models
Cleaning model data from model cache
Deleting key jic062-instruct-v12_reward/config.json from bucket guanaco-reward-models
Deleting key jic062-instruct-v12_reward/merges.txt from bucket guanaco-reward-models
Deleting key jic062-instruct-v12_reward/reward.tensors from bucket guanaco-reward-models
Deleting key jic062-instruct-v12_reward/special_tokens_map.json from bucket guanaco-reward-models
Deleting key jic062-instruct-v12_reward/tokenizer.json from bucket guanaco-reward-models
Deleting key jic062-instruct-v12_reward/tokenizer_config.json from bucket guanaco-reward-models
Deleting key jic062-instruct-v12_reward/vocab.json from bucket guanaco-reward-models
Pipeline stage MKMLModelDeleter completed in 5.29s
jic062-instruct_v12 status is now torndown due to DeploymentManager action