developer_uid: chace9580
submission_id: jic062-instruct-g32-lr25_7175_v1
model_name: jic062-instruct-g32-lr25_7175_v1
model_group: jic062/instruct_g32_lr25
status: torndown
timestamp: 2024-08-15T20:35:16+00:00
num_battles: 18989
num_wins: 8702
celo_rating: 1218.81
family_friendly_score: 0.0
submission_type: basic
model_repo: jic062/instruct_g32_lr25_dp7_c2
model_architecture: LlamaForCausalLM
model_num_parameters: 8030261248
best_of: 16
max_input_tokens: 512
max_output_tokens: 64
display_name: jic062-instruct-g32-lr25_7175_v1
is_internal_developer: False
language_model: jic062/instruct_g32_lr25_dp7_c2
model_size: 8B
ranking_group: single
us_pacific_date: 2024-08-15
win_ratio: 0.4582653114961293
generation_params: {'temperature': 1.0, 'top_p': 1.0, 'min_p': 0.0, 'top_k': 40, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['\n', '<|end_of_text|>', '|eot_id|'], 'max_input_tokens': 512, 'best_of': 16, 'max_output_tokens': 64}
formatter: {'memory_template': "<|begin_of_text|><|start_header_id|>system<|end_header_id|>\n\n{bot_name}'s Persona: {memory}\n\n", 'prompt_template': '{prompt}<|eot_id|>', 'bot_template': '<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}: {message}<|eot_id|>', 'user_template': '<|start_header_id|>user<|end_header_id|>\n\n{user_name}: {message}<|eot_id|>', 'response_template': '<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}:', 'truncate_by_message': False}
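The formatter above is a Llama-3-style chat template split into five pieces, and generation_params carries the sampling settings used at serve time. As a rough illustration (plain Python; the bot/user names and messages are hypothetical, and the real serving stack may assemble prompts differently), the sketch below shows how these templates combine into a single prompt alongside the reported parameters:

# Illustrative only: assembles a prompt from the formatter templates listed above.
# Bot/user names and messages are hypothetical placeholders.
memory_template = ("<|begin_of_text|><|start_header_id|>system<|end_header_id|>\n\n"
                   "{bot_name}'s Persona: {memory}\n\n")
prompt_template = "{prompt}<|eot_id|>"
user_template = "<|start_header_id|>user<|end_header_id|>\n\n{user_name}: {message}<|eot_id|>"
bot_template = "<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}: {message}<|eot_id|>"
response_template = "<|start_header_id|>assistant<|end_header_id|>\n\n{bot_name}:"

def build_prompt(bot_name, memory, prompt, turns):
    """turns: list of (role, name, message) tuples in chat order."""
    parts = [memory_template.format(bot_name=bot_name, memory=memory),
             prompt_template.format(prompt=prompt)]
    for role, name, message in turns:
        if role == "user":
            parts.append(user_template.format(user_name=name, message=message))
        else:
            parts.append(bot_template.format(bot_name=name, message=message))
    parts.append(response_template.format(bot_name=bot_name))
    return "".join(parts)

# Sampling settings as reported for this submission.
generation_params = {
    "temperature": 1.0, "top_p": 1.0, "min_p": 0.0, "top_k": 40,
    "presence_penalty": 0.0, "frequency_penalty": 0.0,
    "stopping_words": ["\n", "<|end_of_text|>", "|eot_id|"],
    "max_input_tokens": 512, "best_of": 16, "max_output_tokens": 64,
}

print(build_prompt("Ava", "A friendly assistant.", "Stay in character.",
                   [("user", "Alex", "Hi there!")]))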
Running pipeline stage MKMLizer
Starting job with name jic062-instruct-g32-lr25-7175-v1-mkmlizer
Waiting for job on jic062-instruct-g32-lr25-7175-v1-mkmlizer to finish
jic062-instruct-g32-lr25-7175-v1-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
jic062-instruct-g32-lr25-7175-v1-mkmlizer: ║ _____ __ __ ║
jic062-instruct-g32-lr25-7175-v1-mkmlizer: ║ / _/ /_ ___ __/ / ___ ___ / / ║
jic062-instruct-g32-lr25-7175-v1-mkmlizer: ║ / _/ / // / |/|/ / _ \/ -_) -_) / ║
jic062-instruct-g32-lr25-7175-v1-mkmlizer: ║ /_//_/\_, /|__,__/_//_/\__/\__/_/ ║
jic062-instruct-g32-lr25-7175-v1-mkmlizer: ║ /___/ ║
jic062-instruct-g32-lr25-7175-v1-mkmlizer: ║ ║
jic062-instruct-g32-lr25-7175-v1-mkmlizer: ║ Version: 0.9.11 ║
jic062-instruct-g32-lr25-7175-v1-mkmlizer: ║ Copyright 2023 MK ONE TECHNOLOGIES Inc. ║
jic062-instruct-g32-lr25-7175-v1-mkmlizer: ║ https://mk1.ai ║
jic062-instruct-g32-lr25-7175-v1-mkmlizer: ║ ║
jic062-instruct-g32-lr25-7175-v1-mkmlizer: ║ The license key for the current software has been verified as ║
jic062-instruct-g32-lr25-7175-v1-mkmlizer: ║ belonging to: ║
jic062-instruct-g32-lr25-7175-v1-mkmlizer: ║ ║
jic062-instruct-g32-lr25-7175-v1-mkmlizer: ║ Chai Research Corp. ║
jic062-instruct-g32-lr25-7175-v1-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
jic062-instruct-g32-lr25-7175-v1-mkmlizer: ║ Expiration: 2024-10-15 23:59:59 ║
jic062-instruct-g32-lr25-7175-v1-mkmlizer: ║ ║
jic062-instruct-g32-lr25-7175-v1-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
jic062-instruct-g32-lr25-7175-v1-mkmlizer: Downloaded to shared memory in 45.300s
jic062-instruct-g32-lr25-7175-v1-mkmlizer: quantizing model to /dev/shm/model_cache, profile:s0, folder:/tmp/tmpqbh96sub, device:0
jic062-instruct-g32-lr25-7175-v1-mkmlizer: Saving flywheel model at /dev/shm/model_cache
jic062-instruct-g32-lr25-7175-v1-mkmlizer: quantized model in 26.828s
jic062-instruct-g32-lr25-7175-v1-mkmlizer: Processed model jic062/instruct_g32_lr25_dp7_c2 in 72.128s
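The first half of the MKMLizer stage downloads the source repo into shared memory and quantizes it to the flywheel format. The quantizer itself is internal to MKML and not reproduced here; as a minimal sketch of just the download step (assuming the huggingface_hub client and the /dev/shm target shown in the log):

# Minimal sketch, not the MKMLizer itself: fetch the source repo into shared
# memory ahead of quantization. The "s0" profile and flywheel format are
# internal to MKML and are not reproduced here.
from huggingface_hub import snapshot_download

local_path = snapshot_download(
    repo_id="jic062/instruct_g32_lr25_dp7_c2",  # model_repo from the metadata above
    local_dir="/dev/shm/model_cache",           # shared-memory cache, as in the log
)
print(f"Downloaded to {local_path}")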
jic062-instruct-g32-lr25-7175-v1-mkmlizer: creating bucket guanaco-mkml-models
jic062-instruct-g32-lr25-7175-v1-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
jic062-instruct-g32-lr25-7175-v1-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/jic062-instruct-g32-lr25-7175-v1
jic062-instruct-g32-lr25-7175-v1-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/jic062-instruct-g32-lr25-7175-v1/config.json
jic062-instruct-g32-lr25-7175-v1-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/jic062-instruct-g32-lr25-7175-v1/special_tokens_map.json
jic062-instruct-g32-lr25-7175-v1-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/jic062-instruct-g32-lr25-7175-v1/tokenizer_config.json
jic062-instruct-g32-lr25-7175-v1-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/jic062-instruct-g32-lr25-7175-v1/tokenizer.json
jic062-instruct-g32-lr25-7175-v1-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/jic062-instruct-g32-lr25-7175-v1/flywheel_model.0.safetensors
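The cp lines above stage each artifact from the shared-memory cache into the guanaco-mkml-models bucket. A rough boto3 equivalent (an illustration under assumed AWS credentials, not the pipeline's actual upload code):

# Illustration only: mirrors the cp commands above using boto3.
import os
import boto3

s3 = boto3.client("s3")
bucket = "guanaco-mkml-models"
prefix = "jic062-instruct-g32-lr25-7175-v1"
cache_dir = "/dev/shm/model_cache"

for fname in ["config.json", "special_tokens_map.json", "tokenizer_config.json",
              "tokenizer.json", "flywheel_model.0.safetensors"]:
    s3.upload_file(os.path.join(cache_dir, fname), bucket, f"{prefix}/{fname}")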
jic062-instruct-g32-lr25-7175-v1-mkmlizer: Loading 0: 98%|█████████▊| 286/291 [00:05<00:00, 65.36it/s]
Job jic062-instruct-g32-lr25-7175-v1-mkmlizer completed after 104.89s with status: succeeded
Stopping job with name jic062-instruct-g32-lr25-7175-v1-mkmlizer
Pipeline stage MKMLizer completed in 106.09s
Running pipeline stage MKMLKubeTemplater
Pipeline stage MKMLKubeTemplater completed in 0.12s
Running pipeline stage ISVCDeployer
Creating inference service jic062-instruct-g32-lr25-7175-v1
Waiting for inference service jic062-instruct-g32-lr25-7175-v1 to be ready
Inference service jic062-instruct-g32-lr25-7175-v1 ready after 231.49431371688843s
Pipeline stage ISVCDeployer completed in 233.05s
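The ISVCDeployer stage blocks until the KServe inference service reports ready, which here took roughly 231 s. A simple readiness poll in the same spirit (hypothetical health URL and timings; the real deployer checks InferenceService status through the Kubernetes API rather than plain HTTP):

# Hypothetical readiness poll; the URL below is a placeholder, not the real endpoint.
import time
import requests

def wait_until_ready(health_url, timeout_s=600, interval_s=5):
    deadline = time.time() + timeout_s
    while time.time() < deadline:
        try:
            if requests.get(health_url, timeout=5).status_code == 200:
                return True
        except requests.RequestException:
            pass  # service not up yet; keep polling
        time.sleep(interval_s)
    return False

wait_until_ready("http://jic062-instruct-g32-lr25-7175-v1.example/healthz")  # placeholder URL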
Running pipeline stage StressChecker
Received healthy response to inference request in 3.96878981590271s
Received healthy response to inference request in 2.4097609519958496s
Received healthy response to inference request in 1.5456063747406006s
Received healthy response to inference request in 1.9307401180267334s
Received healthy response to inference request in 1.6546964645385742s
5 requests
0 failed requests
5th percentile: 1.5674243927001954
10th percentile: 1.58924241065979
20th percentile: 1.6328784465789794
30th percentile: 1.709905195236206
40th percentile: 1.8203226566314696
50th percentile: 1.9307401180267334
60th percentile: 2.12234845161438
70th percentile: 2.313956785202026
80th percentile: 2.721566724777222
90th percentile: 3.345178270339966
95th percentile: 3.6569840431213376
99th percentile: 3.9064286613464354
mean time: 2.3019187450408936
Pipeline stage StressChecker completed in 12.18s
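The StressChecker statistics above, and the win_ratio in the metadata, follow directly from the reported raw numbers; numpy's default linear-interpolation percentile reproduces the logged values:

# Reproduces the reported StressChecker statistics and the metadata win_ratio.
import numpy as np

latencies = np.array([3.96878981590271, 2.4097609519958496, 1.5456063747406006,
                      1.9307401180267334, 1.6546964645385742])

for p in [5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99]:
    print(f"{p}th percentile: {np.percentile(latencies, p)}")  # matches the log
print("mean time:", latencies.mean())                          # 2.3019187450408936

print("win_ratio:", 8702 / 18989)                              # 0.4582653114961293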
jic062-instruct-g32-lr25_7175_v1 status is now deployed due to DeploymentManager action
jic062-instruct-g32-lr25_7175_v1 status is now inactive due to auto deactivation of underperforming models
jic062-instruct-g32-lr25_7175_v1 status is now torndown due to DeploymentManager action